Setting the Context: Presentation to the University Delegates

June 2013

Summary Table

  1. CIHR: Reforms of open programs and peer review
  2. Setting the Context
  3. Introduction
  4. Common Themes
  5. Why are you changing the programs and the peer review system at the same time? This is a lot of change at once and is risky.
  6. Research is done in the four pillars in very different ways. Why are you using the same adjudication criteria for all pillars?
  7. What does enhanced institution support really mean?
  8. How will you ensure that one pillar is not negatively impacted by the changes?
  9. Who is eligible for the first Foundation Scheme Pilot?
  10. My grant is ending during the transition period and I will have a gap in funding. What are my options?
  11. Why are you only holding one OOGP competition in 2014?
  12. Will there be enough reviewers to adjudicate the new schemes?
  13. Will there be enough reviewers to adjudicate the new schemes? (continued)
  14. How will you ensure quality of review without face to face meetings?
  15. How will you integrate results from all reviews and how will the face to face meetings work?
  16. CCV was a disaster. How will CIHR ensure that the technology is in place to support the change?
  17. I have requested specific changes to the design and they have not been made. Why is CIHR not listening?

[Slide 1]

CIHR: Reforms of open programs and peer review
Setting the Context

University Delegates Face-to-Face Meeting
June 27, 2013

[Slide 2]

Setting the Context

[Slide 3]

  • Since the release of the Designing for the Future document, a number of presentations have been made and discussions have been held:
    • Over 60 Town halls and discussions with institution administrators
    • Numerous discussions with professional societies, other funders (e.g., NAPHRO members, VHOs), national roundtables
    • Numerous working sessions with advisory groups:
      • Advisory Working Group of Senior Researchers
      • Humanities and Social Sciences Working Group
      • Engineering and Natural Sciences Advisory Group
      • CAURA Working Group
  • The tone of the conversation has changed – the focus now is much more on the transition and how the changes will impact individuals in specific circumstances.
  • Some common themes have emerged through these discussions...

[Slide 4]

Common Themes

  1. Why is CIHR changing the programs and the peer review system at the same time? This is a lot of change at once and is risky.
  2. Research is done in the four pillars in very different ways. Why are you using the same adjudication criteria for all pillars?
  3. What does enhanced institution support really mean?
  4. How will you ensure that one pillar is not negatively impacted by the changes?
  5. Who is eligible for the first Foundation Scheme Pilot?
  6. My grant is ending during the transition period and I will have a gap in funding. What are my options?
  7. Why are you holding only one OOGP competition in 2014?
  8. Will there be enough reviewers to adjudicate applications to the new schemes?
  9. How will you ensure quality of review without face to face meetings?
  10. How will you integrate results from all reviews and how will the face to face meetings work?
  11. CCV was a disaster. How will CIHR ensure that the technology is in place to support the change?
  12. I have requested specific changes to the design and they have not been made. Why is CIHR not listening?

[Slide 5]

1. Why are you changing the programs and the peer review system at the same time? This is a lot of change at once and is risky.

  • Science Council fully acknowledges the risks that the implementation of these changes presents.
  • We have deliberated at length on the advantages and disadvantages of making changes to our program architecture, and the process we use to adjudicate these programs, at the same time.
  • After hearing lessons learned from other funders around the world and reflecting on our own experiences, we believe that to achieve the intended benefits, it is critical to design and roll out these changes together.
  • In fact, we feel the risk is higher if we separate the two – a mistake many other funders have made over the years.
  • The transition plan that has been developed is a component of our overall risk mitigation plan. It includes a number of pilots, which allow us to phase in the design of several elements over time.
  • We are committed to monitoring each step of the transition plan very closely and making adjustments to improve success.
  • This may mean keeping existing programs in place longer than planned and/or adjusting the launch dates of the new schemes.

Pilots will be discussed in more detail later on the agenda.

[Slide 6]

2. Research is done in the four pillars in very different ways. Why are you using the same adjudication criteria for all pillars?

  • At the beginning of this process, Science Council believed that the adjudication criteria and the content for an application in both the Foundation and Project Schemes would have to be tailored for each pillar.
  • After numerous discussions with advisory groups about the adjudication criteria and analysis of all the current open programs, it was determined that the criteria should be the same across all pillars.
  • All pillars value:
    • the quality and feasibility of an idea in the project scheme and
    • the caliber of the applicant and quality of a program in the Foundation Scheme
  • With application-focused review, each application will be reviewed by an appropriate expert who is able to apply the standards and metrics that are relevant to a specific area of research.
  • Interpretation guidelines for each criterion are under development to ensure that all reviewers have a common understanding of the criteria and are able to apply their knowledge appropriately and consistently.

The interpretation guidelines will be discussed in more detail later on the agenda.

[Slide 7]

3. What does enhanced institution support really mean?

  • In the original design documents, there was a desire for institutions to commit to additional or enhanced support for successful Foundation grant holders.
  • It was seen as an opportunity for CIHR and institutions to work together and improve the success of long-term programs of research.
  • Today, when institutions sign a grant application, they are committing to providing physical, organizational, policy and procedural infrastructure for the conduct of research.
  • This commitment is included in the terms and conditions of the Institutional Agreements.
  • After discussions with institution administrators from across the country, the current thinking is that this signature commits the institution to provide the support a Foundation grant holder requires to be successful, and that no additional commitments are required.
  • We will continue to work closely with institutions to ensure that they have a sound understanding of the objectives of the program and the eligibility criteria so that they can ensure the best applicants are applying from their institutions.

[Slide 8]

4. How will you ensure that one pillar is not negatively impacted by the changes?

  • One of the key indicators of success for these new programs is that systemic barriers are removed. This means that:
    • All types of research, within CIHR’s broad mandate, are eligible within the Foundation and Project Schemes, and
    • Excellence continues to get funded.
  • The funding landscape in Canada (and the world) remains competitive – it will continue to be as difficult as it has been over the last few years to get CIHR funding.
  • Science Council and Governing Council have debated the advantages and disadvantages of creating special funding envelopes for pillars.
  • The final decision is that the investigator-initiated programs should be open and should be adjudicated based on excellence.
  • The application pressure and success rates of each of the pillars will, however, be monitored after each competition to determine if any future interventions are required.

[Slide 9]

5. Who is eligible for the first Foundation Scheme Pilot?

An investigator is eligible to apply to the Fall 2014 Foundation “live pilot” competition if the individual is the Nominated Principal Investigator or Co-Principal Investigator of:
  • An OOGP grant with a grant term expiry date of March 31, 2015; or
  • An OOGP grant with a grant term expiry date of September 30, 2015; or
  • Any other Open grant with a grant term expiry date no earlier than October 1, 2014 and no later than September 30, 2015. This includes Partnerships for Health System Improvement, Knowledge Synthesis, Knowledge to Action, Proof-of-Principle Program Phase I and II, Industry-partnered Collaborative Research and OOGP grants with atypical expiry dates.

Figure 1


[Slide 10]

6. My grant is ending during the transition period and I will have a gap in funding. What are my options?

  • CIHR recognizes that the transition plan will leave a number of researchers (estimated to be ~225 over the transition period) with a gap in funding.
  • To ensure that these researchers are not disadvantaged by the transition plan, a number of options are being provided:
    • Apply early to the OOGP without penalty (researchers would keep their existing grant if unsuccessful);
    • Request a change to their grant term end date without an increase in budget;
    • Participate in the Foundation Scheme pilots.
  • These researchers are encouraged to discuss their situation with their institution to make sure that they are aware of all possible options.

[Slide 11]

7. Why are you only holding one OOGP competition in 2014?

  • In 2014, CIHR is holding two competitions – a transitional OOGP and the first Foundation Scheme pilot.
  • The decision to hold only one OOGP competition was driven primarily by available uncommitted budget.
  • The budget envelope for these two competitions is the same envelope that would normally fund two OOGP competitions.
  • With approximately 250 Foundation grants to fund, it is not feasible to also fund 800 OOGP grants.

[Slide 12]

8. Will there be enough reviewers to adjudicate the new schemes?

  • CIHR has analyzed the expertise that has been accessed over the last 5 years for peer review:
    • 5873 individuals reviewed at least once in the last 5 years.
    • CIHR annually recruits about 600 new reviewers to meet its regular needs, without a formal recruitment strategy.
    • Only 10% of CIHR reviewers are from outside Canada.
    • Reviewers currently engaged in Open programs are generally older than the Grantee and Applicant groups.

[Slide 13]

  • There is generally an alignment of the distribution of primary themes and research areas between the reviewer base and the application base.

Figure 4


[Slide 14]

9. How will you ensure quality of review without face to face meetings?

  • Quality of review continues to be an important consideration for CIHR.
  • An audit is currently underway to establish a baseline for quality of review in the OOGP.
  • CIHR believes the proposed changes will improve the quality of peer review overall, but is committed to ensuring that the quality of review remains at least at current levels; this will be monitored closely over time.
  • A number of design elements included in the new adjudication process are designed to increase the quality of review.
  • Structured Application & Review Criteria
    • Application information is presented in a consistent way for all applications
    • Application information is directly mapped to the adjudication criteria
    • Review criteria are consistently and appropriately applied
    • Transparency of the review process is increased
  • Application-Focused Review
    • Avoid “force fitting” applications into standing committee structure
    • Assign appropriate expertise to each application
    • Each application receives 5 expert reviews
  • Remote (virtual) Screening/Review
    • Facilitates access to expertise, including international
    • Provides forum for discussion of discrepancies in ranking (reviewer names, rankings and comments are shared with other reviewers)
    • Minimizes group dynamics and committee culture biases

[Slide 15]

10. How will you integrate results from all reviews and how will the face to face meetings work?

  • The details of the adjudication process were not well described in the Designing for the Future document as they were still under development.
  • More thinking has been done about the overall process and discussions have now been held with a number of advisory groups.
  • A draft process has been developed and will continue to be refined and tested over the coming months.

The adjudication process will be discussed in more detail later on the agenda.

[Slide 16]

11. CCV was a disaster. How will CIHR ensure that the technology is in place to support the change?

  • The implementation of the CCV was difficult for researchers and for CIHR.
  • This experience has resulted in a number of changes at CIHR to address both CCV and future technology implementations.
  • Some of these changes include:
    • Ensuring that the user community is involved in identifying and validating the requirements up front (e.g., working with Chairs and SOs on the pilots)
    • Ensuring that comprehensive user testing is completed before any roll-out of new technology
    • Ensuring appropriate training and support are provided
    • Surveying applicants and reviewers to seek input about the changes
    • Ensuring strong project management, governance and risk mitigation
  • Each of the pilots will provide us with an opportunity to make adjustments to both processes and technology.
  • CIHR has contracted external expertise to assess what additional technology changes are required to support the implementation of the reforms.

[Slide 17]

12. I have requested specific changes to the design and they have not been made. Why is CIHR not listening?

  • Over the last two years, CIHR has heard from a wide set of stakeholders.
  • Input has been carefully considered by the Reforms Task Force and by Science Council.
  • If there is one thing that we have learned through this process – it is that there is not one common voice in Canada’s health research community. The landscape is extremely diverse.
  • CIHR has been listening carefully and has changed elements of the design as a result of feedback.
  • Unfortunately, not every suggested change was considered appropriate or feasible, and there are elements of the design that Science Council believes are critical to achieving the desired outcomes and a successful implementation.