A few osteopathic medical schools fare well in the latest U.S. News & World Report rankings of “Best Medical Schools.” In primary care, which is considered one of the two major ranking categories, the Michigan State University College of Osteopathic Medicine in East Lansing stands at No. 14 and the University of North Texas Health Science Center Texas College of Osteopathic Medicine (UNTHSC/TCOM) in Fort Worth at No. 20. In addition, the West Virginia School of Osteopathic Medicine (WVSOM) in Lewisburg ranks 9th in rural medicine and 12th in family medicine.
What’s more, three osteopathic medical schools make the Top 20 in geriatrics. UNTHSC/TCOM ranks 16th, while the University of Medicine and Dentistry-School of Osteopathic Medicine (UMDNJ-SOM) in Stratford ties for 19th place with the University of New England College of Osteopathic Medicine in Biddeford, Maine.
Many other osteopathic medical schools, however, are poorly ranked or unranked, and medical school rankings can vary mysteriously from year to year.
Many osteopathic medical school representatives agree that the U.S. News ranking methodology is flawed, but they differ on how to respond: school administrators are split on whether the ranking system is a challenge to surmount or a travesty to shun.
Compounding the problem, U.S. News rankings are sometimes taken out of context by organizations that adapt the statistics for their own purposes. The popular physician rating site Vitals.com, for example, draws from the rankings to assign a quality score to medical schools. Vitals gives the vast majority of osteopathic medical schools just one out of four stars.
Call to boycott?
Joining five other osteopathic medical schools, the Ohio University Heritage College of Osteopathic Medicine (OU-HCOM) in Athens opted out last year of participating in U.S. News’ annual data and opinion surveys, which form the basis of the med school rankings. OU-HCOM’s director of institutional assessment and planning, Alex Westerfelt, PhD, hopes to spur other colleges in the profession to discontinue participation in the U.S. News ranking system, which he asserts is subjective, unscientific and strongly biased against osteopathic medical education, despite the high rankings of a handful of DO schools.
“The rankings are a very successful marketing effort but are not a valid research effort,” Dr. Westerfelt contends. “We have to stop letting a business venture define quality in medical schools.”
Dr. Westerfelt takes particular issue with the primary care rankings, for which, he insists, U.S. News “uses inferior methodology that no peer-reviewed journal would ever consider.” Given their strength in producing primary care physicians, osteopathic medical schools have been woefully underrepresented among the top-ranked institutions, he says.
The U.S. News ranking system gives each medical school a composite score of up to 100 points based on four broad components:
- Quality assessment, weighted at 40%.
- Proportion of graduates entering primary care, 30%.
- Ratio of full-time faculty members to students, 15%.
- Selectivity of admissions, 15%.
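The weighted components above can be illustrated with a small arithmetic sketch. The weights come from the article, but the component values in the example are invented for illustration, and U.S. News does not publish the exact normalization it applies before weighting:

```python
# Sketch of how four weighted components could combine into a 100-point
# composite score. Weights are from the article; component values are
# invented, and the real scaling of each component is not public.

WEIGHTS = {
    "quality_assessment": 0.40,    # peer and residency-director surveys
    "primary_care_output": 0.30,   # share of graduates entering primary care
    "faculty_student_ratio": 0.15,
    "selectivity": 0.15,           # MCAT scores, GPA, acceptance rate
}

def composite_score(components):
    """Weighted sum of component scores, each assumed pre-scaled to 0-100."""
    return sum(WEIGHTS[name] * value for name, value in components.items())

# Hypothetical school: strong primary care output, but rated lower by
# survey respondents -- the pattern Dr. Westerfelt describes.
example = {
    "quality_assessment": 55.0,
    "primary_care_output": 90.0,
    "faculty_student_ratio": 70.0,
    "selectivity": 60.0,
}
print(round(composite_score(example), 1))  # 0.4*55 + 0.3*90 + 0.15*70 + 0.15*60 = 68.5
```

The sketch shows why a 40% survey component can outweigh strong primary care output: even a 90-point primary care score contributes only 27 points, while a mediocre 55-point survey rating contributes 22.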
To come up with its quality assessment, U.S. News sends a survey to medical school deans, associate deans for academic affairs, and heads of internal medicine departments, asking them to rate each of approximately 160 allopathic and osteopathic medical schools on a scale of 1 (marginal) to 5 (outstanding), with “don’t know” also available as an option. In addition, the publisher sends a similar survey to a sample of MD residency directors in internal medicine, family medicine and pediatrics. Because osteopathic medical schools are much fewer in number and less well-known than allopathic medical schools, especially in regions of the country that do not have an osteopathic medical college or many practicing DOs, osteopathic medical education faces an unfair disadvantage in these opinion surveys, Dr. Westerfelt says.
“Both pools of evaluators on average rate the quality of osteopathic medical schools lower than that of allopathic medical schools,” Dr. Westerfelt says. “Those opinions count for the bulk of a school’s score and subsequent ranking.”
To calculate the percentage of a medical school’s graduates going into primary care, U.S. News uses the number of graduates entering residencies in family medicine, internal medicine and pediatrics. Although U.S. News reports in a separate list that the Top 5 producers of primary care residents are osteopathic medical schools, even this component of the ranking system doesn’t give osteopathic medical education its expected edge because the 30% weight is not sufficient to compensate for the more heavily weighted quality assessment, Dr. Westerfelt says.
Moreover, the primary care proportion fails to present an accurate picture of those graduates who will actually practice as primary care physicians after serving their residencies, according to Dr. Westerfelt. The ranking system does not take into account that many residents in internal medicine go on to subspecialize. The fact that far more MD graduates than DO graduates pursue internal medicine, as opposed to family medicine, residencies suggests that the U.S. News primary care percentages for leading allopathic medical schools are inflated, he points out.
The student selectivity component of the U.S. News primary care formula weights average scores on the Medical College Admission Test at 9.75%, mean grade point average at 4.5%, and average acceptance rate at 0.75%. Osteopathic medical schools have traditionally admitted students with somewhat lower MCAT scores and GPAs than do allopathic medical schools, so it is not surprising that DO schools lose further ground with this portion of the ranking formula, according to Dr. Westerfelt.
“Research has shown that the higher the MCAT scores, the more likely students will go into subspecialties,” notes Bruce D. Dubin, DO, JD, the dean of the Rocky Vista University College of Osteopathic Medicine in Parker, Colo. Osteopathic medical schools place more emphasis on candidates’ expressed commitment to primary care and to osteopathic principles and practice, Dr. Dubin says.
Osteopathic medical colleges also come up short on faculty-student ratio because DO schools use a higher proportion of part-time faculty members, who are not counted in this ratio.
By participating in this ranking system, “osteopathic medical schools have abdicated the right to define quality,” Dr. Westerfelt argues. “The U.S. News ratings are determined by uninformed opinions and by inputs, such as MCAT scores and GPAs, rather than by outputs—the quality and the number of our graduates who become primary care physicians.”
The president of the American Association of Colleges of Osteopathic Medicine (AACOM), Stephen C. Shannon, DO, MPH, agrees that the U.S. News rankings “give biased and erroneous information.” He notes that AACOM’s member colleges have discussed refusing as a group to cooperate with the publisher’s annual surveys, but at this time the majority of osteopathic medical schools consider participating in the country’s most visible medical school ranking system to be in their best interests. Nevertheless, the debate over participation continues.
Should osteopathic medical colleges ever decide collectively to eschew the U.S. News rankings, they would not be the first professional group to do so. In 1994, all U.S. dental schools stopped participating in the dental program rankings. And four years ago, dozens of independent liberal arts colleges ceased participation in the reputation survey portion of the U.S. News “Best Colleges” rankings.
Like other well-rated osteopathic medical schools, WVSOM touts its high U.S. News rankings on its website. “Our potential applicants look at these rankings,” says the school’s president, Michael Adelman, DO, JD. Although he is proud that WVSOM is recognized for excellence in rural medicine and family medicine, he does question the methodology used for primary care.
“It is ironic that we are in the Top 10 in rural medicine and the Top 15 in family medicine, but we’ve fallen to 94th in primary care,” Dr. Adelman observes. “That doesn’t make sense to me.”
Dr. Adelman would prefer to see a ranking system for primary care that assesses how well a medical school fulfills its mission. If a school’s mission is to provide primary care physicians to underserved areas, then the rating should be based on the number and quality of graduates practicing in primary care specialties in such areas, he says.
Dr. Shannon agrees that a better gauge of a medical college’s caliber would be the extent to which the school carries out its mission. An article in the June 2010 issue of Annals of Internal Medicine proposes an alternative ranking system based on a “social mission score” that reflects the percentages of graduates who are primary care physicians, who practice in health professional shortage areas, and who are underrepresented minorities. Such a system merits further consideration, Dr. Shannon believes.
Osteopathic medical schools, in fact, do not rank highly overall in the system described in Annals because of the weight accorded underrepresented minority enrollment. “Our schools overall have had difficulty recruiting minority students,” Dr. Shannon says.
The most objective educational ranking system Dr. Westerfelt has seen is a collaboration of the National Academies and the National Research Council to evaluate and rank PhD programs. The resulting document is more than 200 pages long—more than most prospective students will wade through, he concedes. However, an independent organization, PhDs.org, has adapted this dense report into a more accessible and flexible online format that allows users to rank programs according to their own priorities.
Public relations value
The director of communications for the Lake Erie College of Osteopathic Medicine (LECOM) in Erie, Pa., Pierre Bellicini agrees that the U.S. News ranking system for medical schools is flawed. But he notes that after much effort on the part of LECOM and some other osteopathic medical schools, the publisher has been willing to adjust its surveys over the years. For example, when U.S. News first published its medical school rankings, institutions were ranked almost entirely according to their strength in research. Although U.S. News still lists “Best Research” medical schools, among which osteopathic medical colleges as a group rank poorly, the addition of primary care rankings and specialty rankings marks a change that has enhanced the visibility and recognition of osteopathic medical education, Bellicini says.
“Being among U.S. News’ Top 50 in primary care is one of the few ways for osteopathic medical schools to get noticed nationally,” Bellicini notes. The media attention garnered by highly ranked DO schools benefits the entire profession, he says.
Bellicini points out that medical school applicants today look at several other sources of information besides the U.S. News rankings, including school websites and Facebook pages, so osteopathic medical schools needn’t be overly concerned about low rankings.
No perfect methodology
“All in all, I think that the rating system U.S. News uses is good, but there is definitely room for improvement,” says Thomas A. Cavalieri, DO, the dean of UMDNJ-SOM. “It is difficult to come up with any evaluation system that would meet everyone’s needs. There is no perfect way to compare schools.”
Osteopathic medical schools are at a disadvantage because respondents to the U.S. News primary care survey “simply aren’t aware of our colleges,” Dr. Cavalieri notes. This could be ameliorated by surveying chairs of family medicine in addition to chairs of internal medicine, he suggests. “If U.S. News would send the surveys to chairs of family medicine, osteopathic medical schools would do much better,” Dr. Cavalieri says. “The chairs of family medicine at allopathic medical schools are very aware of the strengths of osteopathic medical schools in family medicine.”
“We stand behind the validity of our research,” says Robert J. Morse, the director of data research for U.S. News & World Report, while noting that he is open to suggestions such as Dr. Cavalieri’s. The technique of surveying a large number of people on their opinions and quantifying the results is common research practice, he says.
Dr. Cavalieri asserts that the osteopathic medical profession needs to do a better job of promoting itself, both to the broader medical community and the public. Greater visibility would improve DO schools’ standings in the rankings, he says.
Because of the osteopathic medical profession’s long struggle for parity with allopathic medicine, Dr. Cavalieri believes it would be detrimental for the profession to separate itself from the country’s best-known ranking system for medical schools.
One alternative to the U.S. News rankings, The Princeton Review’s book Best 168 Medical Schools, lists osteopathic medical schools in a separate section titled “Osteopathic Profiles” and encourages nontraditional students to consider osteopathic medicine. In Dr. Cavalieri’s view, this is far worse for colleges of osteopathic medicine than the U.S. News rankings, especially given that the “Osteopathic Profiles” follow the “Naturopathic Profiles.”
Dr. Westerfelt remains convinced that the U.S. News rankings only harm the osteopathic medical profession. He plans to write a letter in the fall—when U.S. News begins collecting data for the next ranking list—to encourage osteopathic medical school administrators to stop participating in the annual surveys. “This is a race we shouldn’t even waste time trying to win,” he says.
But many allopathic medical schools also feel this way, and have for years. “The annual U.S. News & World Report rankings of U.S. medical schools are ill-conceived; are unscientific; are conducted poorly; ignore the value of school accreditation; judge medical school quality from a narrow, elitist perspective; do not consider social and professional outcomes in program quality calculations; and fail to meet basic standards of journalist ethics,” stated an article in the October 2001 issue of Academic Medicine, the peer-reviewed journal of the Association of American Medical Colleges. “The U.S. medical education community, higher education scholars, the journalism profession, and the public should ignore this annual marketing shell game.”