
Rethinking Radiology Board Recertification


Is recertification essential or a waste of time? Either way, the process isn't trusted, and it needs greater depth and validation to truly measure quality.

If you are like me, you may not be too excited when board recertification time comes around. Opinions run the gamut: some tout recertification as essential, while others call it a waste of time. Recently, the recertification process has been openly criticized in the press as self-serving rather than what it is intended to be: a quality measure. At the same time, some have criticized test takers for getting around the test's intent.

Those inside and outside the profession are unlikely to trust or endorse the system unless we can agree on one that works.

So why isn't the system trusted?

Among those who don't trust the system, I've heard several criticisms. For one thing, some say, a large group of physicians isn't subject to the process at all; they were grandfathered out of it. And they are the ones who developed the process. If you have to pass a test and the physician next to you does not, it is hard to see the system as fair. It stands to reason that you either know and can practice radiology or you can't, regardless of when you trained.

According to these critics, there is no good argument for why one generation of physicians is subject to this process and another is not - other than favoritism. That is one reason for cynicism about the process.

Meanwhile, on the other side, some of the specialty boards have harshly criticized the use of so-called board "recalls" (the sharing of old test questions), even though this is a long-standing practice. The boards maintain this is about fairness and imply that using recalls is cheating. The recall users note that some of those calling them cheaters never took recertification exams at all, and that others likely used recalls themselves.

Moreover, recalls are sometimes the only means of guessing which part of a vast body of information might be tested. So we have a whole generation of specialists, including radiologists, who have no recertification requirement at all, practicing next to a growing number of physicians who must be retested regularly but say they may not know what they will be tested on.

To critics outside of the process, the untested group gets a "free pass," and the group that uses recalls to pass the exam is labeled as cheaters. For those being tested, that is a frustrating situation, and one that has alienated many physicians.

An additional kicker, critics say, is that there is little data showing the validity of the recertification system. Does it really demonstrate competence? A lack of data further undermines confidence. Finally, there is little information available to the public regarding physician quality, so it is hard for anyone outside the medical world to trust our process.

None of that changes the fact that we must have some process to validate competence, and that the health reimbursement system and the public in general increasingly demand a system that demonstrates quality. The question is what system?

Taking a step back, the goal is to ensure quality in the profession and to be able to demonstrate it. Recertification exams may very well be valid, but they need to be uniformly and fairly applied. They may not assess the full breadth of medical knowledge needed to practice, and our tools need to do that. Therefore, the exams should be part of the solution, but a larger report card is needed. Society is already pushing toward this, and that too can be problematic. Do we want our report card to be an accumulation of opinions offered on the internet? Probably not.

What could be valid is a combination of patient satisfaction data, professional credentialing documentation, standardized exam data, and quality metrics including peer review. And the specialty boards need to be the repositories for, supervisors of, and controllers of that information. In short, the system needs greater depth and validation to be reliable and trusted. We may need to take the hard step of revising the process by which we attest to the competence of our colleagues, or risk having no trusted system at all.
