2018 New Frontiers in Research Fund—Exploration stream competition: Co-Chairs’ Report
This report from the co-chairs of the New Frontiers in Research Fund (NFRF) Multidisciplinary Review Panel, with regard to the inaugural 2018 NFRF—Exploration stream competition, is submitted for consideration by the Canada Research Coordinating Committee.
Université de Montréal
Université du Québec à Montréal
Phillip F. Gardiner
University of Manitoba
Université du Québec à Rimouski
University of Manitoba
University of Alberta
Marcelo L. Urquia
Manitoba Centre for Health Policy
University of Toronto
University of Ottawa
University of Calgary
The multidisciplinary review panel met for 2 full days (March 26-27, 2019) to review and discuss the 321 top-ranked applications of the 1,315 eligible applications submitted to the competition. The top-ranked applications were identified based on rankings provided by reviewers in advance of the meeting. During the meeting, applications were discussed individually within 1 of the 5 sub-committees. Conflicts of interest were managed according to the tri-agency policies used for peer review processes.
After all applications were discussed, members were asked to individually identify those they felt were strongest; each member could select up to 50% of the applications that met the threshold of excellence. Members’ votes were consolidated and a ranked list was produced for each sub-committee. The co-chairs of the 5 sub-committees (15 in total) then met to review and discuss a consolidated ranked list and finalize the recommendation of the multidisciplinary review panel. After discussion, and taking into consideration the input members had provided, the co-chairs recommended 157 applications for funding (approximately 31 from each sub-committee), for a total budget of $38.43 million over 2 years. The co-chairs unanimously agreed not to fund beyond the established threshold, which reflected the most meritorious applications.
At the end of the competition process, a policy discussion was held in each meeting room, and an additional discussion was held afterward with all co-chairs. In general, members were very positive about the program and the overall competition process, while recognizing that its unique character and short timelines created many challenges for applicants, staff and members alike. Suggestions for future competitions are listed below.
- Individuals with experience and expertise in conducting, contributing to and/or assessing multidisciplinary projects were selected to be members of the multidisciplinary panel. The breadth of topics in submitted applications, spanning the mandates of the 3 agencies, represented a new level of “multidisciplinary” review for many members. The level and quality of discussions were rich and members appreciated the experience, noting that they felt it was the right approach for this program.
- Some members suggested adding an external expert review to the process for future competitions, while acknowledging that expert reviews may result in the elimination of riskier proposals.
- The importance of applications being written in plain language for a diverse audience was strongly noted. Applications that were dense or jargon-filled were difficult to read and understand.
- The committees would benefit from the addition of more early career researchers.
High Risk, High Reward, Feasibility
- The tension between high risk and feasibility was discussed. Members are accustomed to considering the proposed project (methodology, etc.) and feasibility as components of an overall rating for a research proposal. Guidelines should be reviewed to ensure clarity, with high risk referring to the proposed approach, and feasibility relating to the ability to undertake the work (including a reasonable methodology, sufficient experience on the team, access to critical equipment/infrastructure, etc.).
- Members thought that some applicants seemed to confound novelty with risk. Although novel research is typically riskier, a novel approach is not necessarily high risk. It was suggested that the difference be clarified in program literature by explaining, for example, that a high risk project: (1) could fail; or (2) proposes something so innovative that it is unknown where it might lead. It was also recommended that the program literature make note that there is an expectation that some projects will fail, to encourage more researchers to conceive and propose high risk projects.
- The relatively low number of applications from the social sciences and humanities was noted by members. It was suggested that the program literature be reviewed to ensure that social science and humanities researchers see where they fit in the program (e.g., defining high reward as outcomes with “profound significance”).
- The terms in French do not carry the same connotation as the terms in English. “Rendement” (the French rendering of “high reward”) could be perceived as having a financial connotation, which may deter applicants from applying if the potential impact of their research does not have economic implications.
- The inaugural Exploration stream competition defined interdisciplinarity using the new Canadian Research and Development Classification codes to define disciplines. In some cases, because of the flexibility of the definition, this approach allowed applicants to propose research that crossed disciplines with an established tradition of collaborating. Notably, the applications that were discussed and successful were those that pushed the boundaries of interdisciplinarity.
- Some members suggested that the definition of interdisciplinarity be changed to require that the proposal crosses agency boundaries; others disagreed. Members supported the notion that applicants provide an explanation for fit to the program.
Equity, Diversity and Inclusion
- The challenges associated with the review of the equity, diversity and inclusion (EDI) criterion were discussed. While many members felt that the emphasis placed on EDI was appropriate and that, to further highlight its importance, this section should be the first one in the application, other members found it challenging to evaluate applications against this particular criterion.
- The consequence of a fail evaluation on EDI was discussed. Again, while some members showed strong support, and some suggested that the review should have been more stringent, others questioned whether a fail rating should result in an application being deemed not fundable. One suggestion to address this was to move from a pass/fail to a rating scale for this criterion.
- Opinions were also different regarding the appropriate process for reviewing EDI. Some members believed that they did not have the right expertise to review this criterion and suggested that it be delegated to the tri-agency internal review committee or a committee of individuals who have more expertise in assessing EDI than panel members.
- In addition to best practices in managing EDI in the research environment of a project, it was suggested that applicants be required to identify specific EDI challenges in their situation, on the assumption that no situation is ideal, and present a plan to address them. This would allow applicants to address their particular context and identify the concrete measures they have taken, or will undertake, to improve it.
- Members noted that a number of applications that involved Indigenous research were not identified as such by the applicants using the tick-box.
- Members felt that either a separate committee of experts in Indigenous research should review all relevant applications before the multidisciplinary review panel meeting, or that several such experts should be included on all sub-committees.
- Members supported the idea of triaging applications at earlier stages to limit the workload for applicants and members.
- The length of the applications was felt to be appropriate, though there were some suggestions to be more prescriptive regarding the length of individual sections and allow for more information about the team members.