Lessons Learned in the Transition from Peer Review to Peer Learning

Ryan K. Lee, MD, MBA
Vice Chair, Quality & Safety, Radiology, Einstein Healthcare Network
April 6, 2021

A team from the Einstein Healthcare Network saw participation soar when it transitioned from a peer review process to a peer learning process implemented across a network that included an academic tertiary care center, two community hospitals, and multiple outpatient sites. That success in itself yielded some unintended consequences that were addressed in further iterations, according to a helpful article by Lee et al in the Journal of the American College of Radiology, “Transitioning to Peer Learning: Lessons Learned.”

The authors acknowledge that while the traditional peer review process of scoring a peer’s interpretation has become the de facto quality marker for credentialing bodies, such programs are less successful at producing meaningful learning opportunities. Hence, Einstein embarked on a trial comparing radiologist response to its traditional program with response to a new process designed to “improve the practice of radiology through education,” the authors wrote.

What they discovered was that the changes required in mindset, workflow, and logistics with the peer learning approach were so substantial that they imperiled implementation. Nonetheless, when the dust settled, radiologist approval was such that there is now no turning back. 

Creating a Peer Learning Process

The transition from RADPEER to peer learning entailed developing both a new rating system and considerable tweaking of its PACS and workflow management software to accommodate the new approach. The rating system embedded into the reading platform includes the following options:

  • Great call
  • Learning opportunity
    • Perception
    • Cognition
    • Communication
    • Reporting
    • Other

The great call option was created to recognize “exceptional identification of subtle findings” or “instances of impressive cognition.” If learning opportunity was selected, the reader was prompted to indicate whether or not it was clinically significant. A relatively low minimum of two reviews per month was required so as not to overburden radiologists with an untested workflow that could present unanticipated problems.
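For readers who want a concrete picture of how such a rating scheme might be captured in software, the sketch below models the options described above. The class and field names (PeerLearningEntry, accession_number, and so on) are illustrative assumptions, not details of Einstein's actual PACS integration.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class Rating(Enum):
    GREAT_CALL = "great call"
    LEARNING_OPPORTUNITY = "learning opportunity"

class LearningCategory(Enum):
    PERCEPTION = "perception"
    COGNITION = "cognition"
    COMMUNICATION = "communication"
    REPORTING = "reporting"
    OTHER = "other"

@dataclass
class PeerLearningEntry:
    """One peer learning submission attached to a reviewed study (hypothetical model)."""
    accession_number: str                           # study being reviewed (illustrative field)
    rating: Rating
    category: Optional[LearningCategory] = None     # required only for learning opportunities
    clinically_significant: Optional[bool] = None   # prompted at submission during the trial; later deferred to the PRC

    def __post_init__(self):
        # Mirror the workflow described above: a learning opportunity must carry a category.
        if self.rating is Rating.LEARNING_OPPORTUNITY and self.category is None:
            raise ValueError("A learning opportunity requires a category.")
```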

The reviewed cases were collected and sent to the Peer Review Committee (PRC), which was expanded to include at least one representative of each subspecialty to support the education priority. The PRC also was tasked with choosing cases that would be reviewed at the morbidity and mortality (M&M) conferences. Rather than focus on one subspecialty, the decision was made to represent all subspecialties at each conference.

Trial Outcome

To compare radiologist feedback, the team at Einstein set data and experience from the test year of 2019 against the baseline year of 2018, when the peer review program was still in place. At year’s end, the team found that the number of significant peer learning events had increased under the new peer learning process.

In 2018, just five significant discrepancies were identified. During the peer learning trial, 416 total peer learning events were identified, of which 45% were perception errors; 35.1% were great calls; 8.4% were cognition errors; 6.7% were reporting errors; and 0.5% were communication errors. In total, 16.64 peer learning events per radiologist occurred during the intervention period.
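As a rough sanity check, the snippet below converts the reported percentage breakdown into approximate case counts out of the 416 events; the percentages are taken directly from the summary above, and the rounding is ours.

```python
# Approximate case counts implied by the reported percentage breakdown.
total_events = 416
breakdown = {
    "perception errors": 0.45,
    "great calls": 0.351,
    "cognition errors": 0.084,
    "reporting errors": 0.067,
    "communication errors": 0.005,
}

for label, share in breakdown.items():
    print(f"{label}: ~{round(total_events * share)} cases ({share:.1%})")

# The itemized shares sum to roughly 95.7%; the remainder presumably falls
# into categories (such as "other") not broken out in the summary.
print(f"itemized total: {sum(breakdown.values()):.1%}")
```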

Most of the 22 participating radiologists (16, or 73%) agreed or strongly agreed that the peer learning system contributed more to their development as a radiologist than the traditional peer review system; 5 were neutral; and 1 radiologist disagreed. When it came to ease of use, 41% (9 radiologists) agreed or strongly agreed that the baseline peer review process was easier to use than the peer learning process; 7 radiologists were neutral; 5 radiologists disagreed; and 1 radiologist strongly disagreed. In general, 77% (17) of the radiologists agreed or strongly agreed that peer learning provided more opportunity to learn, and 5 radiologists were neutral.
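For reference, the headline percentages follow directly from dividing the respondent counts by the 22 participants, as this short check shows (the statement labels are our paraphrases of the survey items):

```python
# Survey responses among the 22 participating radiologists, as reported above.
n_respondents = 22
agree_or_strongly_agree = {
    "peer learning contributed more to development": 16,
    "baseline peer review was easier to use": 9,
    "peer learning provided more opportunity to learn": 17,
}

for statement, count in agree_or_strongly_agree.items():
    print(f"{statement}: {count}/{n_respondents} = {count / n_respondents:.0%}")
```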

Tweaks, Adjustments, Acceptance

Recognizing that engagement had increased dramatically, the Einstein team continued the tweaks and adjustments in response to lessons learned as the trial progressed. One of the early adjustments was removing from the clinical workflow the responsibility, and the mouse click, of rating a learning opportunity as clinically significant or not, and giving that decision to the PRC, deemed a more appropriate place for it.

The survey made it clear that the software interface for peer learning was perceived as more cumbersome than that for peer review, likely due to the increased granularity of information collected and the associated increase in mouse clicks. In response, fields were deleted when the information sought could be found elsewhere, such as whether a study was read by a radiologist or a resident.

A bigger problem was that the radiologists feared becoming victims of the program’s success. “In theory, the baseline peer review process was designed for the purpose of assessing radiologist performance, because error rates for significant discrepancies were routinely generated,” wrote Lee et al. “In reality, the small numbers of significant discrepancies generated demonstrated that this process was more to document compliance, and its utility in assessing true performance was poor.

“The new peer learning system in generating significantly more discrepancies resulted in angst that the data could more robustly be used for performance review. This fear was mitigated by the department chair emphasizing that the new peer learning process was meant to improve the educational value of peer review and would not be used in generating error rates for performance review. Ultimately, buy-in of senior leadership was critical to the acceptance of peer learning as a legitimate educational tool.”

Addressing Regulatory Concerns

Another hurdle that had to be cleared was concern that in shifting the focus from clinical assessment to clinical improvement, the new program would not meet Joint Commission (JC) requirements for ongoing professional practice evaluation (OPPE). After a thorough review of JC requirements, the team and health system believe they can satisfy OPPE requirements using peer learning instead of peer review because the JC allows the use of quantitative or qualitative data, provided the data satisfies the appropriate hospital committees. If quantitative data is desired by the hospital, separate data can be collected, such as report turnaround times for each radiologist.
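As one illustration of what such separate quantitative data might look like, the sketch below computes a per-radiologist median report turnaround time from study and report timestamps; the records and field layout are hypothetical and not drawn from Einstein's systems.

```python
from collections import defaultdict
from datetime import datetime
from statistics import median

# Hypothetical records: (radiologist, study completed, report finalized).
reports = [
    ("Radiologist A", datetime(2021, 4, 6, 8, 15), datetime(2021, 4, 6, 9, 5)),
    ("Radiologist A", datetime(2021, 4, 6, 10, 0), datetime(2021, 4, 6, 10, 40)),
    ("Radiologist B", datetime(2021, 4, 6, 8, 30), datetime(2021, 4, 6, 9, 50)),
]

# Group turnaround times (in minutes) by radiologist.
turnaround_minutes = defaultdict(list)
for radiologist, completed, finalized in reports:
    turnaround_minutes[radiologist].append((finalized - completed).total_seconds() / 60)

# Median turnaround per radiologist -- one possible OPPE-style quantitative metric.
for radiologist, minutes in turnaround_minutes.items():
    print(f"{radiologist}: median turnaround {median(minutes):.0f} min over {len(minutes)} reports")
```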

A limiting factor in implementing and improving the new peer learning solution was the availability of an overworked IT support team, in part due to COVID-19. Reducing the number of mouse clicks compared with the baseline solution entailed a great deal of back-end coding, and while some progress has been made, more work lies ahead to fully optimize the workflow based on lessons learned.

All of these challenges posed serious threats to the implementation of the peer learning program, but each was met by accounting for the viewpoints of each stakeholder: the radiologist as reviewer and reviewee, department leadership, the informatics team, and regulators.

“Considering these varied viewpoints was key in our initial transition to the peer learning system as well as its subsequent optimization to its current form,” concluded Lee et al.  “As is the case for any quality project, the current iteration is not meant to be final but part of a process that will continue to evolve as we continue to learn more.”

To learn more from the Einstein team’s experience, read their entire report.
