Is recertification essential or a waste of time? Either way, the process isn’t trusted, and it needs greater depth and validation to truly measure quality.
If you are like me, you may not be too excited come board recertification time. Opinions run the gamut - some touting recertification as essential, others claiming it’s a waste of time. Recently, there has been open criticism in the press that the recertification process is self-serving rather than what it is intended to be: a quality measure. At the same time, some have criticized test takers for getting around the test’s intent.
Those inside and outside the profession are unlikely to trust or endorse the system unless we can agree on one that works.
So why isn’t the system trusted?
Among those who don’t trust the system, I’ve heard several criticisms. For one thing, some say, a large group of physicians isn’t subject to the process at all; they were grandfathered out of it. And they are the ones who developed the process. If you have to pass a test to keep practicing and the physician next to you does not, it is hard to see the system as fair. It stands to reason that you either know and can practice radiology or you can’t, regardless of when you trained.
According to these critics, there is no good argument for why one generation of physicians is subject to this process and another is not - other than favoritism. That’s one reason for cynicism about the process.
Meanwhile, on the other side, some of the specialty boards have harshly criticized the use of so-called board “recalls” (the sharing of old test questions), despite the fact that this is a long-standing practice. The boards maintain this is about fairness and imply that using recalls is cheating. Recall users counter that some of those calling them cheaters never took recertification exams at all, and that others likely used recalls themselves.
Moreover, recalls are sometimes the only means of guessing what part of a vast body of information might be tested. So we have an entire generation of specialists - radiologists among them - with no recertification requirement at all, practicing alongside a growing number of physicians who must be retested regularly yet may not know what they will be tested on.
To critics outside the process, the untested group gets a “free pass,” while the group that uses recalls to pass is cheating. For those being tested, that is a frustrating situation, and one that has alienated many physicians. So that explains part of the frustration.
An additional kicker, critics say, is that there is little data showing the validity of the recertification system. Does it really demonstrate competence? The lack of data further undermines confidence. Finally, to those outside the medical world, there is little publicly available information regarding physician quality, so it is hard for anyone to trust our process.
None of that changes the fact that we must have some process to validate competence, and that the health reimbursement system and the public in general increasingly demand a system that demonstrates quality. The question is: what system?
Taking a step back, the goal is to ensure quality in the profession and to be able to demonstrate it. Recertification exams may very well be valid, but they need to be uniformly and fairly applied. They may not assess the full breadth of medical knowledge needed to practice, and our tools need to do that. So exams should be part of the solution, but a larger report card is needed. Society is already pushing toward this, and that too can be problematic. Do we want our report card to be the accumulated opinions offered on the internet? Probably not.
What could be valid is a combination of patient satisfaction data, professional credentialing documentation, standardized exam data, and quality metrics, including peer review. And specialty boards need to be the repositories for, supervisors of, and controllers of that information. In short, the system needs greater depth and validation to be reliable and trusted. We may need to take the hard step of revising the process by which we attest to the competence of our colleagues, or risk having no trusted system at all.