Summary Report of Stakeholder Input received for the Peer Review Expert Panel

CIHR Evaluation Unit
Performance and Accountability Branch

November 2016

Stakeholder Engagement Approaches

CIHR and its Institutes engaged stakeholders to provide input on the Peer Review Expert Panel’s six questions through three approaches:

  1. Web submission form: On behalf of its Institutes, CIHR launched a web submission form to collect online feedback from its stakeholder community. The form included a small number of demographic profile questions and provided unlimited open-text space for respondents to give their views on the six questions being addressed by the Peer Review Expert Panel. Respondents also had the option of uploading supplemental documentation.
  2. Institute-led stakeholder consultation: CIHR’s Institutes collected feedback from their stakeholder communities on the six questions posed to the Peer Review Expert Panel (PREP), using a variety of approaches including the web submission form, targeted focus groups, emails, and discussions during other meetings. Each Institute summarized the information gathered in an individual report. The views in the reports reflect those of the stakeholders consulted, not those of CIHR.
  3. Submitted briefs by stakeholder organizations: National-level organizations which represent constituencies of CIHR stakeholders were invited to submit briefs that provided their organizational view on the Panel questions.

This report synthesizes feedback from the web submission form and the Institute-led stakeholder consultations. The submitted briefs are provided verbatim as an appendix to this report, as are the individual reports that summarize the Institute-led consultations.

The web submission form data and the reports of the Institute-led consultations were coded according to major themes through a content analysis. The main themes presented below are those raised by roughly a third or more of web submission respondents (i.e., 30+ individual responses) or appearing in more than half of the Institute-led consultation reports (i.e., in 7-12 reports).
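To make the selection rule concrete, the following minimal sketch (in Python) shows how themes clearing either threshold would be retained. The theme names and counts are hypothetical placeholders for illustration, not CIHR’s actual coding data.

    # Minimal sketch of the theme-selection rule described above.
    # Theme names and counts are hypothetical, not CIHR data.
    INSTITUTE_REPORTS = 12

    # theme -> (web respondents raising it, Institute reports raising it)
    theme_counts = {
        "reviewer capacity": (41, 10),
        "face-to-face review": (38, 9),
        "application matching": (12, 8),
        "CV format": (9, 3),
    }

    web_threshold = 30                             # 30+ of the 102 web responses (roughly a third)
    report_threshold = INSTITUTE_REPORTS // 2 + 1  # more than half of 12 reports = 7+

    major_themes = [
        theme
        for theme, (web_n, report_n) in theme_counts.items()
        if web_n >= web_threshold or report_n >= report_threshold
    ]

    print(major_themes)  # themes retained as "major" for the summary below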

Participants

The web submission form process generated 102 submissions as follows:

  • 11 of CIHR’s 13 Institutes had respondents affiliated with them (all except for the Institute of Aboriginal Peoples’ Health and the Institute of Gender and Health);
  • 77.5% (n=79) of responses were from people who identified with the Biomedical pillar;
  • 66.7% (n=68) were male;
  • 49.0% (n=50) were senior investigators; 32.4% (n=33) were mid-career; 12.7% (n=13) were early career; the balance identified as “other”;
  • 71.6% (n=75) had applied to at least one of the 2014-2015 Foundation Grant, 2015-2016 Foundation Grant, or 2015 Project Grant competitions;
  • 65.8% (n=48) of those who had applied indicated that they were unsuccessful; and
  • 32.7% (n=33) had served as a reviewer or virtual Chair for at least one of the 2014-2015 Foundation Grant, 2015-2016 Foundation Grant, or 2015 Project Grant competitions.

Twelve of CIHR’s 13 Institutes submitted reports summarizing the results of their stakeholder consultation processes. Institutes engaged a variety of stakeholders, including representatives from their research communities such as holders of CIHR grants (e.g., large multi-year team grants or leaders of research centres) or awards (e.g., Chair awards). Some of the stakeholders consulted had applied, reviewed, or served as a Chair in the new funding programs. Some Institutes also engaged their former Advisory Board members.

Summary of Major Themes

  1. Respondents did not feel that the implementation of the Reforms had been successful or that it had addressed the challenges it was designed to address. Some felt that it was too soon to assess the success of the design overall; however, they believed the potential benefits of the design had been undermined by significant implementation issues (e.g., matching reviewers to applications; changes in the delivery approach and in application requirements/rules; the Canadian Common CV; delays in implementing the College of Reviewers).
  2. Respondents felt strongly that changes to CIHR’s investigator-initiated programs and peer review processes had resulted in several significant outcomes:
    1. CIHR’s credibility for funding excellence and delivering high-quality peer review processes had been jeopardized;
    2. resources had become concentrated in certain areas of the research community (e.g., larger labs; the biomedical pillar; more senior/established researchers);
    3. innovative research received less support; and
    4. the Foundation and Project Grant Programs showed potential biases against female applicants and new/early-career investigators.
  3. Respondents perceived that the changes had negatively affected CIHR’s peer review system due to:
    1. insufficient reviewer capacity (e.g., the loss of expert peer review and discipline-based peer review committees; a lack of expertise for interdisciplinary applications; insufficient numbers of reviewers; increased peer reviewer fatigue; difficulty attracting and developing a pool of reviewers with appropriate expertise, given the small pool of reviewers in Canada);
    2. issues with the peer review process itself (e.g., the reviewer-matching algorithm and review criteria; ranking applications rather than rating them);
    3. low-quality online reviews (attributed by some to the anonymity of the system and the online review process) and a lack of accountability amongst reviewers; and
    4. a lack of reviewer incentives and recognition, including institutional recognition.
  4. Respondents called for face-to-face peer review (or an effective means of real-time discussion) to be reinstated in order to keep pace with international standards of peer review: this model serves as a form of quality control, helps to correct biases, ensures equity across genders and career stages, enhances peer reviewer accountability, and upholds peer review standards. The NIH and other peer review systems (e.g., the Australian National Health and Medical Research Council; the Wellcome Trust) were mentioned as examples to follow.
  5. Respondents stated that CIHR should examine success rates, taking into consideration the following: career stage; gender; pillar; the risk/innovation of the project (e.g., the Reuter ranking on innovation); reviewer statistics (average review time, number of reviewers per application, number of reviewers who decline, and reasons for declining to review); reviewer information (field, province, and institution); and previous success rates, including whether the application was resubmitted using previous feedback. A sketch of such a breakdown follows this list.
  6. Respondents expressed frustration with CIHR’s research community engagement process and felt that their concerns had not been heard, taken seriously, or subsequently addressed by CIHR. It was suggested that CIHR collect feedback from stakeholders (e.g., through surveys and interviews with the research community, applicants, reviewers, and panel/virtual Chairs).
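As an illustration of the success-rate breakdown called for in item 5, the following sketch (in Python, using pandas) computes success rates within subgroups. The column names and records are invented for illustration and are not drawn from CIHR’s competition data.

    # Illustrative sketch of a subgroup success-rate breakdown (item 5).
    # Column names and records are hypothetical, not CIHR competition data.
    import pandas as pd

    applications = pd.DataFrame({
        "career_stage": ["early", "early", "mid", "senior", "senior", "senior"],
        "gender":       ["F",     "M",     "F",   "M",      "F",      "M"],
        "pillar":       ["bio",   "bio",   "soc", "bio",    "clin",   "bio"],
        "funded":       [0,       1,       0,     1,        0,        1],
    })

    # Success rate = share of funded applications within each subgroup.
    by_stage_and_gender = (
        applications.groupby(["career_stage", "gender"])["funded"]
        .mean()
        .rename("success_rate")
    )
    print(by_stage_and_gender)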