Should it really take 14 years to become a doctor? Brian Palmer reminds us how this all got started:
An American physician spends an average of 14 years training for the job: four years of college, four years of medical school, and residencies and fellowships that last between three and eight years. This medical education system wasn’t handed down to us by God or Galen—it was the result of a reform movement that began in the late 19th century and was largely finished more than 100 years ago. That was the last time we seriously considered the structure of medical education in the United States.
The circumstances were vastly different at that time. Until the Civil War, private, for-profit medical schools with virtually no admissions requirements subjected farm boys to two four-month sessions of lectures and sent them off to treat the sick. (The second session was an exact duplicate of the first.) The system produced too many doctors with not enough training. Abraham Flexner, the education reformer who wrote an influential report on medical education in 1910, put a fine point on the problem: “There has been an enormous over-production of uneducated and ill trained medical practitioners,” he wrote. (Emphasis added.) “Taking the United States as a whole, physicians are four or five times as numerous in proportion to population as in older countries like Germany.”
In other words, our current medical education system was originally designed to reduce the total number of people entering the profession. The academic medical schools that sprang up around the country—such as the Johns Hopkins School of Medicine in 1893—made college education a prerequisite. Medical school expanded from eight months to three years and solidified at four years in the 1890s. Postgraduate training programs were implemented, beginning with a one-year internship. These were brilliant reforms at the time.
Over the past century, there have been additions to, but few subtractions from, the training process. Residency and fellowship programs became longer and longer … and longer. The path to some specialties is now almost comically arduous. Many hand surgeons, for example, complete five years in general surgery, followed by three years in plastic surgery, followed by another year of specialized hand surgery training. To be a competitive candidate for a hand surgery fellowship, an applicant is also strongly advised to spend two additional years on research at some point during the process.
One crazy idea comes from the outcomes movement:
American medical schools and residency programs have traditionally relied on the “tea steeping” method: They expose students to information for a prescribed amount of time, and assume they’re ready at the end of it. Years can be added if a student demonstrates gross incompetence in exams, but there’s no opportunity for exceptional students to accelerate the process. Offering that chance makes educators uncomfortable—both because it relies heavily on imperfect examinations and because it partially undermines the traditional process—but it’s time to experiment.
“Experiment” is the key word. The fundamental problem here is that the argument between traditionalists and reformers is essentially theoretical—we are in an evidence vacuum. It’s ironic, because in virtually every other aspect of medicine, tradition and intuition were discarded decades ago. Researchers rigorously test when it is best to start someone infected with HIV on antiretrovirals, or a patient with high cholesterol on statins. But doctors have very rarely examined their own training. When Emanuel and Fuchs published their proposal two years ago, they could find just a single study comparing the competence of physicians from the traditional four-plus-four medical education system with that of doctors from shortened programs.
There is no reason not to do this important research.