The road to making peer review an added value—not an added burden

By Hans Lugnegård, Product Manager Radiology IT, Sectra

Most radiology departments currently use some form of formalized peer review, and those that do not soon will as a result of upcoming legislation. However, many radiologists report that previous attempts at peer review failed to accomplish the goals of identifying and meeting learning objectives, enhancing performance and improving patient outcomes. Instead, peer review is perceived to increase workload, lower morale and take time away from other value-adding activities[1].

In this article, I will share my thoughts on the characteristics required for a peer review solution to gain support and compliance among radiologists and to actually achieve the goals of peer review. The article will cover the following areas:

  1. Integration with the PACS image reading workflow
  2. The value of instant feedback
  3. Confidentiality, credibility and bias
  4. A plan for the future – Integration with pathology workflow and utilization of machine learning

Integration with the PACS image reading workflow

Feedback is already a natural part of most radiologists’ daily routine. Radiologists frequently ask colleagues to look at images and regularly participate in multi-disciplinary team meetings, to name only two examples. The question is: how can a peer review solution increase and support this kind of feedback while also supporting quality goals and learning objectives in a more structured way? I strongly believe that the key is to not look at peer review as a separate process and tool, but rather as a natural part of the PACS image review workflow, from both a tool and a process perspective.

Peer review and feedback must be effortless; if not, they will never be used or add value. The workflow has to be integrated into the reading process on an ad hoc basis – for example, when reading priors – or on a formal basis via randomized worklists. Switching between applications not only decreases the chances of adoption, it also adds mouse mileage and is time-consuming. In addition, there is a high risk that the feedback will remain one-directional, since the author is unlikely to log on to a separate system only to look for comments on his or her reports.

The statistical output needed for deeper analysis should be stored in a data warehouse, where statistical applications can be built for quality assurance monitoring.
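To make the two formal mechanisms above concrete – randomized review worklists and aggregated statistics for quality assurance – here is a minimal sketch in Python. All names (`Study`, `build_review_worklist`, `summarize_scores`) are hypothetical illustrations, not part of any actual PACS product; the sketch assumes a simple flat list of studies and a fractional sampling rate.

```python
import random
from collections import Counter
from dataclasses import dataclass

@dataclass
class Study:
    study_id: str
    author: str    # radiologist who signed the original report
    modality: str

def build_review_worklist(studies, reviewer, sample_rate=0.05, seed=None):
    """Randomly sample a fraction of studies for peer review.

    The reviewer's own reports are excluded to avoid self-review;
    sampling is random so reviewed cases are not cherry-picked.
    """
    rng = random.Random(seed)
    eligible = [s for s in studies if s.author != reviewer]
    n = max(1, round(len(eligible) * sample_rate))
    return rng.sample(eligible, min(n, len(eligible)))

def summarize_scores(reviews):
    """Aggregate (author, score) review results per author,
    as a warehouse-style feed for quality assurance reporting."""
    by_author = {}
    for author, score in reviews:
        by_author.setdefault(author, Counter())[score] += 1
    return by_author
```

In a real deployment the sampling would of course be driven by the worklist engine and the aggregates pushed to the data warehouse, but the principle – unbiased selection plus per-author aggregation – is the same.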

Key advice:

Implement a solution that is intuitive and easy to use and can be seamlessly integrated into your existing workflows.

The value of instant feedback

I already mentioned that feedback happens even without any tools. To me, it is crucial that the value generated by this kind of instant feedback is taken into account when creating a peer review solution. Remember, the goal of peer review is to improve the quality of care, not simply to meet demands for statistical reporting and look for errors. The value created for the individual radiologist by receiving instant feedback from peers should not be underestimated. True value is created when a radiologist learns from the feedback. A peer review solution must facilitate instant feedback to the author, but it must also inform the reviewer that the feedback was received, to motivate further feedback. The solution must notify the user of any feedback pending action and support bidirectional communication when needed.
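The bidirectional loop described above – feedback delivered, author notified, reviewer told it was seen, replies possible in both directions – can be sketched as a small state machine. The class and status names below are hypothetical, invented only to illustrate the idea; they do not describe any actual system.

```python
from dataclasses import dataclass, field
from enum import Enum

class Status(Enum):
    PENDING = "pending"            # feedback delivered, awaiting the author
    ACKNOWLEDGED = "acknowledged"  # author has responded; reviewer is informed
    CLOSED = "closed"              # thread resolved

@dataclass
class FeedbackThread:
    study_id: str
    reviewer: str
    author: str
    messages: list = field(default_factory=list)
    status: Status = Status.PENDING

    def post(self, sender, text):
        """Append a message; the author's first reply acknowledges
        the feedback, which is what motivates further reviewing."""
        self.messages.append((sender, text))
        if sender == self.author and self.status is Status.PENDING:
            self.status = Status.ACKNOWLEDGED

    def pending_for(self, user):
        """True if the latest message awaits this user's attention,
        i.e. what a worklist notification would be driven by."""
        if not self.messages or self.status is Status.CLOSED:
            return False
        return self.messages[-1][0] != user
```

The key design point is that `pending_for` is symmetric: the same mechanism that tells the author "you have feedback" also tells the reviewer "your feedback was answered", so neither party has to go looking in a separate system.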

Key advice:

Make sure your peer review solution supports and enhances instant feedback with bidirectional communication options.

Confidentiality—staying away from the blame game

There is an obvious risk that structured peer review may shift focus from the actual content of the report being evaluated to the radiologist behind the report or the review. Naturally, this should not be allowed to affect the review process. To avoid bias, a peer review solution should support anonymous feedback as well as the possibility of bringing any disagreements before a board for arbitration. This is essential for the credibility of the system.

Key advice:

Your peer review solution must be multifaceted, enabling both “open” feedback, especially in ad hoc scenarios, and anonymous feedback.

A plan for the future—integration with pathology workflow and utilization of machine learning

In my opinion, the primary need within peer review is to actually design tools and processes that work and make feedback effortless. However, once this is in place, some very exciting opportunities lie ahead. At Sectra, we look at deep learning as a way to improve decision support and workflow efficiency. Peer review is one area where this technology could have a significant impact. Deep learning could provide learning tools and enable automatic assignment to reviewers and authors based on reading history and performance.

Another interesting area is integrated diagnostics. With pathology now going digital, new opportunities for better collaboration between radiology and pathology are arising. I believe we will see the results of integrating pathology with radiology workflows in the future in the form of increased efficiency and quality of cancer care.

Key advice:

The future holds plenty of exciting opportunities. Ensure your vendor has the ability to look beyond isolated peer review functionality to really take advantage of these opportunities.

Summary

I think we can be certain that peer review (or, if you like, peer feedback) will become a natural part of every radiologist’s work. Whether it will be an added burden or an added value depends on system design and user adoption. I for one believe that, when correctly designed and integrated into existing workflows, peer review can enable individuals to become better radiologists by learning from their peers and their mistakes, and by focusing training efforts on the areas where they are most needed. It can also enable organizations to identify quality issues and even serve as a competitive advantage in private healthcare markets.

References

[1] An example of a survey presenting such views is found in N.H. Strickland, Clinical Radiology 70 (2015), 1158–1164. The survey relates to the RADPEER program.