Saturday, October 26, 2013
We need more primary care physicians. I have written about this often, and cited extensive references that support this contention, most recently in The role of Primary Care in improving health: In the US and around the world, October 13, 2013. Yet, although most studies from the US and around the world suggest that the optimum percentage of primary care doctors is 40-60%, the ratio in the US is under 30% and falling. A clear reason for this is the relative lack of interest of US medical students in entering primary care at the rates needed to maintain, not to mention increase, our current primary care ratio. In addition, the ratio of primary care to other specialty residency positions is too low. Here we confront the fact that the large majority of medical students completing Internal Medicine residencies enter subspecialty fellowships rather than practicing General Internal Medicine. At the Graduate Medical Education level, a simple way of estimating the future production of primary care doctors would be to add the number of residency positions in Internal Medicine (IM), Pediatrics (PD), Family Medicine (FM), and combined Internal Medicine-Pediatrics (IMPD) and subtract the number of fellowship positions they might enter. Even this overestimates the number of general internists, however, since it does not account for doctors who practice as "hospitalists" after completing their residency, a role that does not currently require a fellowship (as does, say, cardiology). Current estimates are that 50% or more of IM graduates who do not pursue fellowship training become hospitalists.
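The back-of-envelope logic above can be sketched in a few lines. To be clear, every number here is a hypothetical placeholder, not actual GME data, and the assumption that all fellows are drawn from the IM pool is a deliberate oversimplification:

```python
# Rough sketch of the estimation logic described above.
# All position counts are HYPOTHETICAL placeholders, not actual GME data.
residency_slots = {"IM": 9000, "PD": 2800, "FM": 3000, "IMPD": 400}
fellowship_slots = 5500        # hypothetical subspecialty fellowship positions
hospitalist_fraction = 0.50    # estimate cited above: ~50% of non-fellowship IM grads

total = sum(residency_slots.values())
non_fellowship = total - fellowship_slots
# crude simplification: assume all fellows are drawn from the IM pool
im_non_fellowship = max(residency_slots["IM"] - fellowship_slots, 0)
hospitalists = hospitalist_fraction * im_non_fellowship

primary_care_output = non_fellowship - hospitalists
print(f"Estimated primary care output per year: {primary_care_output:.0f}")
```

With these illustrative inputs, barely half of the "primary care" residency positions translate into practicing primary care physicians, which is the point of the paragraph above.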
Thus, we welcome the research report from the Association of American Medical Colleges (AAMC), "The role of medical school culture in primary care career choice", by Erikson et al., which appears in the December 2013 issue of AAMC's journal Academic Medicine. The authors surveyed all 4th-year medical students from a random sample of 20 medical schools to assess both student-level and school-level characteristics associated with a greater likelihood of entering primary care. The first, and arguably most important, finding was that only 13% of these final-year medical students were planning on primary care careers. This is despite the fact that 40% were planning to enter the "primary care" residencies of IM, PD, FM, and IMPD, with most of the fall-off in internal medicine and the least in family medicine. This finding strongly supports my assertions above, and makes clear that the historically AAMC-encouraged practice of medical schools reporting "primary care" rates by entry into residencies in those fields is not valid. Even more importantly, it shows the extent of our problem: a 13% production rate will not get us from 30% to 40% or 50% primary care no matter how long we wait; obviously, it will take us in the other direction.
The primary outcome variable of the study was entry into primary care, and it specifically looked at two school-level characteristics (as perceived by students and reported in the survey): badmouthing of primary care (faculty, residents, or other students saying it is a fallback, or something that is a "waste of a mind") and having a greater-than-average number of positive primary care experiences. Both were associated with primary care choice (in the case of badmouthing, students from schools with higher-than-average reported rates were less likely to be planning primary care careers, although individual students who were planning such careers reported hearing more badmouthing), but, after controlling for individual student and school characteristics, these accounted for only 8% of the difference in primary care choice. Characteristics of the student (demographics such as sex, minority status, or rural origin; academic performance, defined as the score on Step 1 of the USMLE; as well as income expectations and a feeling of personal "fit" with primary care) and of the school (research emphasis, private vs. public, selectivity) accounted for the rest. Interestingly, debt was not a significant factor in this study.
I would argue that many of these individual and school characteristics are highly correlated. A school that prides itself on being selective (taking students with high scores) and on producing subspecialists and research scientists does not have to badmouth primary care; the institutional culture intrinsically marginalizes it. On the other side, the students selected at those schools are more likely to have the characteristics (particularly high socioeconomic status and urban or suburban origin) not associated with primary care choice. It is worth noting that the measure of academic performance in this study was USMLE Step 1, usually taken after the first 2 years and focused more on the basic science material covered in those years, rather than USMLE Step 2, which covers more clinical material (perhaps because not all of the 4th-year students studied had taken Step 2 yet). This biases the assessment of academic qualification: many studies have demonstrated high levels of association of pre-medical grades and scores on the Medical College Admission Test (MCAT) with pre-clinical medical school course grades and USMLE Step 1 scores, but not with performance in any clinical activity, not to mention primary care. Most students improve their scores from Step 1 to Step 2, but the improvement appears particularly large for those entering FM and primary care; a quick look at our KU students applying to our family medicine program shows an average increase of nearly 30 points.
So the problem is in the overall culture of medical schools: in their self-perception of their role (creating research scientists vs. clinicians, creating subspecialists vs. primary care doctors) and in their belief that taking the students with the highest grades is equivalent to taking the best students. This culture, simply put, is bad, defined as "having undesirable outcomes for the production of the doctors America needs", and must change. Erikson and colleagues acknowledge that schools could do a better job of recruiting rural students, offer more opportunities to engage in public health and community outreach activities, and provide more experiences in primary care, all of which were somewhat associated with primary care career choice. These are tepid recommendations but, coming from the AAMC, reasonably significant ones. I say we need an immediate change in every single medical school: recruit at least half of every class from students whose demographic and personal characteristics are strongly associated with primary care choice, and present a curriculum that has much less emphasis on "basic science" and more on clinical material, especially public health, community health, and primary care. One of the primary bases for assessing the quality of a medical school should be its rate of primary care production, and this is going to require a major qualitative shift in schools' practices and in the beliefs of many of their faculty and leaders.
I am NOT saying that we don't need subspecialists or research scientists. We do. I AM saying that the emphasis on production of these doctors compared to primary care doctors is out of whack, not just a little but tremendously so, and can only be addressed by a major sea change in attitudes and practices in all of our medical schools. I do not expect all schools to produce the same percentage of primary care physicians. Some might be at 70%, while others are "only" at 30%, but ALL need a huge increase, by whatever means it takes. Even if we produce 50% primary care physicians on average from all schools, it will be a generation before they are 50% of the workforce. At less than that it will take longer, and at less than 30% we will not even maintain where we are.
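The "it will be a generation" claim follows from simple workforce turnover, and a minimal cohort-replacement sketch makes it concrete. The career length and starting share here are illustrative assumptions, not data from any study:

```python
# A minimal cohort-replacement sketch: even if half of every graduating
# class entered primary care, the workforce share rises only as older
# cohorts retire. Career length and starting share are ASSUMPTIONS.
career_years = 35      # assumed average career length
share = 0.30           # assumed current primary care share of the workforce
grad_share = 0.50      # primary care share of each new graduating class

for year in range(1, 101):
    # each year roughly 1/career_years of the workforce retires and is replaced
    share += (grad_share - share) / career_years
    if share >= 0.40:
        print(f"Roughly {year} years just to climb from 30% to 40%")
        break
```

Under these assumptions it takes over two decades merely to reach 40%, and reaching 45% takes longer than a full career; the steady state only approaches the graduating-class share asymptotically.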
13% is not just “insufficient”, it is a scandalous abrogation of the responsibility of medical schools to provide for the health care of the American people. They should be ashamed, should be shamed, and must change.
Sunday, October 20, 2013
A great deal of discussion followed the publication of the NY Times' June 13 article "The $2.7 trillion medical bill" (including my own blog piece The high cost of US health care: it's not the colonoscopies, it's the profit, July 28, 2013). The article began with the cost of colonoscopies and went on to address many of the sources of the high cost of US health care. A more recent piece in the New England Journal of Medicine, "The thousand-dollar Pap smear" by Cheryl Bettigole, reads like a follow-up. In some ways it is, but it also raises a number of other points that should be addressed.
Dr. Bettigole begins by describing a call from one of her patients, who complained she had been charged over $600 for her Pap smear, shocking both of them. She goes on to describe the tremendous role that Pap smears have had (at least in developed countries) in almost eliminating the scourge of advanced cancer of the cervix; indeed, this test remains the best example we have of an effective screening test for any cancer. It is – or should be – also very cost-effective, especially as shown by studies that assume a $20-$30 cost for the test. So how did it get to be so expensive? Is it really? Is it necessary? A portion of the increased cost comes from the use of a more effective (and more expensive) method of preserving and analyzing the specimen ("liquid-based"), but most of it comes from the inclusion of a bundle of other tests. These include tests for human papillomavirus (HPV), recommended only for some women and less frequently than routine Paps, and tests for sexually transmitted infections (STIs), which may or may not be indicated based on the patient's history and symptoms but are quite different from cervical cancer screening. Why? Because the laboratory, which makes money on this, often "bundles" these into an easy-to-order "panel" of tests (rarely accompanied by the price!), and busy clinicians check them off.
The insidiousness of this kind of effort to order more tests than planned (and, often, to find an unanticipated – and frequently unimportant – abnormality that requires more tests to follow up) is common, not solely for Pap smears but for many other lab tests, and it contributes to the increased cost of health care for society, insurers, and individual people. Dr. Bettigole's other point, however, is the role of the provider in contributing to this unnecessary cost by not ordering tests more carefully, and with attention to cost. She writes: "When I was in training, our attendings would ask a standard quiz question: 'What is the biggest driver of health care costs in the hospital?' Answer: the physician's pen. A mouse or a keyboard, rather than a pen, now drives the spending, but we physicians and our staff are responsible for ordering these unnecessary tests and hence responsible for the huge bills our patients are receiving."
It is a good point, and we should all be careful to order tests (and treatments) cost-effectively and teach our students the same. But this is not the whole answer; we need systems that encourage this sort of test ordering, and that make it more difficult to do things that are not cost-effective. It is parallel to encouraging our patients (and ourselves) to adopt healthful behaviors – a good idea, but not the answer for improving the health of a society so heavily geared toward encouraging poor behaviors (drinking and smoking and guns and overeating and eating empty calories, etc.). The idea that the "problem" is individuals' bad behaviors appears in many places in society, and frequently in medicine. This is not only victim blaming but an impractical approach to problem solving. In industry, a strategy called "Six Sigma" has been widely adopted; its goal is to make bad outcomes resulting from individual error occur with a frequency approaching zero. The model is commercial aviation and its drive to eliminate crashes, which works because systems are put into place that make things work, rather than saying to each pilot, "Be careful! Remember to push the joystick in the right direction!"
At a recent Family Medicine conference, the excellent film "Escape Fire" was shown. It addresses many issues of problems with the health care system, including delivery systems, an emphasis on high-tech “rescue” care rather than prevention, and profit seeking by insurers, providers, and drug and device makers. A part of it also features Safeway's program for employee wellness. For some reason, the leaders of the ensuing discussion chose that as the first question: "does your employer encourage wellness?"
After a while I observed that this was not the main point of the film, which mostly addressed the need for system change. A student said he agreed with most of what I said but that there should be some "individual accountability". I should have asked specifically what he meant, but I did observe that individuals already bear the ultimate accountability -- they get sick and die sooner. Of course, we should encourage our patients to eat right and exercise and not smoke and drive carefully, and we should do so ourselves. However, like trying to get all airplane pilots to push the stick the right way, getting each individual to always do the right thing is not the way to go. Few of us never drive too fast! Yet over the last 30 years there has been a tremendous decrease in traffic-related deaths, essentially all of it from safer roads and more safely designed cars, and virtually none of it from people driving more carefully.
In occupational medicine behavior change is considered a weak third option after architecture and engineering. If there is a big window next to the factory floor where it is sometimes slippery, that is an architectural flaw; it shouldn't be there. But, if it is, you can put a heavy mesh screen over it so if people do slip, they don't go through -- engineering. Telling everyone to always be careful is good advice, but not a very effective solution. And yet, in our practices, with patients, with doctors, with social problems, we (as a culture) do it all the time.
Of course, an additional consideration in solely emphasizing individual behavior change is that we are wont to do it mostly with people whose “bad” behaviors are different from our own, and people who seem to be different from ourselves. We may overeat and need to go on a diet, but they are massively obese and at fault. We may drink sometimes, maybe too much, but they are alcoholics, or drug addicts. We could do a little more exercise, but they don’t care at all for exercise and its health benefits. We sometimes indulge in a piece of cake or a donut or two, but they only eat crap. We are sometimes in a hurry and not as careful as we should be, but they are maniacs on the road.
And, of course, they often look different from us, of a different race or culture. And very often they are poorer than we are (especially when we are physicians), confronting, on a daily basis, challenges we don't. Do they live in a "food desert" where the nearest grocery is too far to walk and they don't have access to a car? Is it unsafe to walk, whether for food or for exercise? Do they have a job, or any chance of getting one? Or are they "lucky" enough to have 3 jobs, and no time to "work out"? Judging others is a popular pastime, but it is not only often done without adequate understanding, it is rarely useful. We can and should encourage healthful behaviors and try to identify obstacles and help people overcome them, but we must focus primarily on the systems changes that make health possible in a more efficient and effective way than expecting everyone to change their behavior.
We can. The airlines have done it. The car industry (dragged kicking and screaming) has done it. The health care system can as well.
I am indebted to many wise comments made by many family medicine chairs on the ADFM listserve. The opinions and conclusions, however, are entirely my own.
Sunday, October 13, 2013
At the Family Medicine Midwest conference held recently in Milwaukee, the first day’s plenary speaker was Richard Roberts, MD, from the University of Wisconsin. Dr. Roberts has a distinguished history as a health services researcher and leader in Family Medicine, having been president of both the American Academy of Family Physicians (AAFP) and the World Organization of Family Doctors (WONCA). He has extensive experience in international health, and is knowledgeable about the health systems – and their results – in countries around the world. And he continues to practice family medicine.
Dr. Roberts presented the classic data on the "ecology of medical care" (the setting in which health care takes place), compiled first by Kerr White in 1961 and replicated by the Graham Center of the AAFP with remarkably similar results. In a community of 1000 adults, in any month about 800 have a health problem or injury, 217 seek attention from a doctor, 8 are hospitalized, 5 see subspecialists, and 1 or fewer is admitted to an academic medical center teaching hospital -- which, of course, is where we train most medical students and residents, and where they get a skewed view of the prevalence of disease. They begin to see unusual or even rare things as common, and develop habits of ordering tests that are perhaps appropriate in that setting but represent dramatic overuse in ambulatory practice.
In 2005, there were 34 million hospital admissions in the US, but almost 1 billion office visits. Of those, about 53% were to primary care physicians. While much is made of the increase in emergency department usage, from 1995 to 2005 ER visits were up 8% while primary care visits increased 22%. As Roberts notes (medical students should cover their ears!) primary care doctors comprise about ¼ of the physician workforce but see more than ½ of all patient visits and earn about ½ the income of subspecialists (and this is average; a much smaller fraction of the income of the most highly paid subspecialists).
Internationally, the same trends are noted. Countries with a higher "primary care score" (which largely measures the percentage of the medical workforce in primary care) had lower rates of premature death than those with low scores in 1970, and over the last 4 decades, while the rate has gone down in both groups, the gap between them has widened. In an unintended "natural experiment", the Asian economic boom of the early 1990s allowed Indonesia to greatly increase health spending, mostly in primary care; that nation saw a 70% improvement in health status across its provinces. With the collapse of that "bubble" in the late 1990s, spending on primary care went down, but not spending on hospital care in the big cities. This was a result not of Indonesian government decisions, but rather of the international community, through organizations such as the World Bank, saying "your economy is worse, but you need health care – here's money … to build hospitals". Health status then dropped in most provinces. Not the best use of resources!
In the 1990s, Shi studied socioeconomic, environmental, and health system characteristics of US states and their relationship to health status (mortality, lifespan, deaths due to heart disease and cancer, neonatal mortality, and low birthweight). Access to primary care was the strongest predictor of greater lifespan and was second (to living in an urban area) for lower mortality rates, even ahead of education. Number of specialists and number of hospital beds were far down the list -- indeed, they were negative predictors! None of the changes in the health system since that time is likely to alter this; indeed, the increases in specialists, technology, and hospitalizations have probably strengthened these relationships.
What is it about primary care? Why does it make so much difference? Starfield's work identified the fact that nations and regions with high levels of primary care have greater self-reported health status and fewer health disparities, and that the presence of primary care tends to mitigate the negative impact of income inequality. This group also demonstrated that an increase of 1 primary care physician per 10,000 population (a 20% increase) decreases mortality by 40 per 100,000 (5% fewer deaths), and 1 more family physician per 10,000 (a 33% increase) decreases mortality by 70 per 100,000 (9% fewer deaths), while an increase of 1 specialist per 10,000 (an 8% increase) increases mortality by 16 per 100,000 (2% more deaths). Dr. Roberts notes 4 features of systems with lower primary care to specialist ratios that might explain this: 1) when there are too many specialists and not enough primary care doctors, specialists may try to manage conditions outside their specialty in which they are not knowledgeable; 2) prevention and early detection save more lives and extend life more than intervention late in the disease process; 3) there is excessive utilization of procedures when there are too many specialists (supply drives demand rather than vice versa), and these often carry risks; 4) the more "handoffs" there are between doctors caring for a patient, the more that care begins to resemble an elementary school game of "telephone", where the final message heard is very different from the one that began the communication.
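The paired figures above are internally consistent, which is worth checking: each absolute change per 100,000 and its stated percentage imply a baseline mortality rate. The roughly 800-per-100,000 baseline below is inferred from that arithmetic, not stated in the study:

```python
# Cross-checking the paired mortality figures cited above. Each absolute
# change per 100,000 and its stated percentage imply a baseline rate;
# the ~800/100,000 baseline is INFERRED here, not stated in the source.
pairs = [
    ("+1 primary care physician/10,000", 40, 0.05),  # 40 fewer deaths = 5% fewer
    ("+1 family physician/10,000",       70, 0.09),  # 70 fewer deaths = 9% fewer
    ("+1 specialist/10,000",             16, 0.02),  # 16 more deaths  = 2% more
]
for label, deaths_per_100k, pct_change in pairs:
    implied_baseline = deaths_per_100k / pct_change
    print(f"{label}: implied baseline ~{implied_baseline:.0f} deaths per 100,000")
```

All three pairs imply a baseline of roughly 780-800 deaths per 100,000, so the figures hang together.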
The fact that family physicians specifically seem to improve population health status more than primary care physicians taken as a whole is apparent in the data, but the reason has not yet been identified by studies. Dr. Roberts postulates that it has to do with caring for multiple family members, and using that information to improve their care, such as when a mother’s issues are addressed at a visit ostensibly limited to caring for her child. Primary care (and possibly especially family physicians) acts to achieve all aspects of what has been identified as the “Triple Aim” of health care: greater access, lower cost and higher quality.
Primary care doctors, and especially family physicians, are doctors of “first and last resort”. They care for pregnant women and deliver their babies and care for their children as well as the other adults in the family. They tend to the “grandparents”, older adults, and manage the often complex interplay of multiple chronic diseases. They provide acute care and preventive care and are aware of the individual’s beliefs and preferences and those of the family, and the dynamics that exist between them. They care for people at the end of life, right through the end, not just until “there is no more to do”, and they remain there for the survivors.
The US could do a lot better. We need a health system that is more grounded in primary care, and we need a health system that provides access to everyone. What we don't need is folks in Congress so committed to preventing that access that they will shut the government down! Another conference speaker, Dr. Cynthia Haq of the University of Wisconsin, quoted the Ethiopian Minister of Health, with whom she had recently met. "Only in the United States," the Minister said, "could there be discussion about whether access to health care was a human right or not."
Oh, my. He’s right. I sure wish he were not.
White KL, Williams TF, Greenberg BG. The ecology of medical care. N Engl J Med 1961;265:885-892.
Green LA, et al. The ecology of medical care revisited. N Engl J Med 2001;344:2021-2025. DOI: 10.1056/NEJM200106283442611.
Shi L. Primary care, specialty care, and life chances. Int J Health Serv 1994;24(3):431-458.
Starfield B, Shi L, Macinko J. Contribution of primary care to health systems and health. Milbank Q 2005;83(3):457-502. DOI: 10.1111/j.1468-0009.2005.00409.x.
Shi L, et al. The relationship between primary care, income inequality, and mortality in US states, 1980-1995. J Am Board Fam Pract 2003;16(5):412-422. DOI: 10.3122/jabfm.16.5.412.
Sunday, October 6, 2013
If you live in a sparsely populated area, you may find it difficult to obtain medical care because doctors and hospitals are far away. The issue of geographic isolation is independent of insurance status; it is a problem that plagues Canada, where everyone has health insurance through its single payer system (coincidentally called “Medicare”), but most of the people are concentrated within a short distance of the US border, and there are vast stretches of empty (or, more to the point in this case, almost empty) land. The situation is exacerbated further by the fact that many people living in rural areas work in jobs that have a higher risk of injury which might need care (e.g., farming, ranching, logging), and by the fact that a greater percentage of people living in rural areas are older, and thus more likely to have chronic disease. However, hospitals serving rural areas are small, and may not bring in enough revenue to support their fixed costs, so hundreds of rural hospitals closed in the 1980s and 1990s.
In response, Congress created the Critical Access Hospital (CAH) designation in 1997, allowing hospitals that meet certain criteria (initially, being more than 35 miles from another hospital) to receive increased reimbursement from Medicare at 101% of their costs. This was very successful, not only permitting the survival of many existing rural hospitals but also enabling the creation of new ones, particularly when states were allowed to add other criteria to the designation, creating "Necessary Provider" Critical Access Hospitals (NP-CAHs). The existence of these hospitals has been seen as a top priority for many rural communities, and for the states in which they are located. However, a recent report (OEI-05-12-00080) by the Department of Health and Human Services' Office of the Inspector General (HHS-OIG) suggests that a stricter application of the distance criterion (even 15 miles, not 35) would mean that many of these hospitals would no longer receive 101% of their costs, and that this would save Medicare $449 million. The report provides a sample map of Missouri showing which hospitals would be affected.
As reported by Mike Shields of the Kansas Health Institute (KHI) in "Inspector general's report has rural hospitals worried", this has the National Rural Health Association raising the alarm. It is of special interest in the middle of the country: Kansas, where former governor and current HHS Secretary Kathleen Sebelius certified 31 additional hospitals under the NP-CAH criteria, leads the nation with 83 CAHs; Iowa is second with 82. According to the OIG, "There are more than 1,300 CAHs in the United States. CAHs are located in every State except Connecticut, Delaware, Maryland, New Jersey, and Rhode Island. CAHs provided care for approximately 2.3 million beneficiaries in 2011. Medicare and beneficiaries paid approximately $8.5 billion for this care." So it is not surprising that these hospitals, their trade association, and the states in which they are located are very concerned; many of these hospitals would likely close if they didn't receive the extra payments from Medicare. The question is: would stricter criteria be a good idea?
Essentially, the key part of that question is not whether it would save money for Medicare; clearly it would. The question is: "would it harm the access of rural people to necessary medical care?" I don't know the answer to that; or, rather, I know the answer is that it would, but I don't know by how much. Could people drive 15 miles farther to the next hospital? Probably. Many of them are already driving a number of miles. Would this be inconvenient? Probably. After all, a large percentage of the users of these hospitals (most of which are also the locations of the doctors' or other health providers' practices) are older -- thus Medicare's interest in them. Would people be less likely to get necessary preventive and treatment care for non-emergencies? Possibly. Distance is a big issue, especially if you have to be picked up and driven by someone else. Would there be disparities in which rural residents see decreased access? Almost certainly. Rural people with high incomes often go to larger facilities in bigger cities, or to "destination" centers like the Mayo Clinic, for their regular care. Obviously, the poor will have less access.
But would it save sufficient money to justify this? What is the cost/benefit of saving $449 million for Medicare against the -- what? Lives? Convenience? of a bunch of rural Americans? Very hard to measure, although I again (see "Why poor people choose ERs: we need a system designed to meet everyone's needs", August 4, 2013) call attention to the fact that "convenience" is a loaded word that does not convey the full impact of time, transportation, and competing demands on the lives of the most needy. It is probably a matter of priorities and, of course, of who you are. Are you among the majority of people, including Medicare recipients, who live in major metropolitan areas, and for whom the sheer distance to a hospital is not among the many problems you have accessing care (although transportation might well be), or among the 20% or so who live in these rural areas?
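It helps to put the projected savings in proportion, using only figures already cited above (the OIG's $449 million, $8.5 billion in CAH payments, and 2.3 million beneficiaries in 2011); the per-beneficiary and percentage figures below are simple derived arithmetic, not numbers from the report:

```python
# Putting the OIG's projected savings in context, using figures cited
# above. The derived percentages are arithmetic, not from the report.
savings = 449e6             # projected Medicare savings
total_cah_payments = 8.5e9  # paid for CAH care in 2011
beneficiaries = 2.3e6       # beneficiaries served by CAHs in 2011

share_of_payments = savings / total_cah_payments
per_beneficiary = savings / beneficiaries
print(f"Savings ~{share_of_payments:.1%} of CAH payments, "
      f"about ${per_beneficiary:.0f} per beneficiary per year")
```

By this rough accounting, the proposed cut amounts to about 5% of what Medicare pays these hospitals, or on the order of $200 per rural beneficiary per year, which frames the "is it worth it?" question that follows.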
One additional point that can be brought up on either side of the argument is that CAHs are often critical in other ways, such as their economic impact on their communities. Many are among the largest employers in their towns. They are a source of civic pride. One I know about is Kiowa County Hospital in Greensburg, Kansas, a town of 1,500 in the southwestern part of the state. On May 4, 2007, most of Greensburg was leveled by a tornado. Because I drive through it a few times a year, I have watched its rebuilding and taken pictures of it. For several years, the hospital was located in Quonset huts on the north side of US Highway 54. In rebuilding, Greensburg, with the support of many organizations, sought to make itself in many ways a model of what a small town could be, including in ways that encourage health, such as having schools and public buildings downtown and walkable, rather than on cheaper land on the outskirts, requiring a car. And they rebuilt the hospital, which is now "the first LEED Platinum Certified Critical Access Hospital in the United States."
So what? I mean, it’s nice that Greensburg rebuilt in an environmentally positive and health-oriented way, and that Kiowa County Hospital is LEED platinum. Yes, it’s nice that rural communities take pride in their local hospitals, and that they provide jobs for the people who live there. It’s nice that the folks who live in these parts of the country don’t have to drive quite so far to get medical care. But is that a reason for Medicare to spend all that money to subsidize them, to keep them open?
I think so. I think that, from a health point of view, minimizing the already-long distances many rural Americans have to travel to access care is a good thing. I think that having institutions that provide jobs and stabilize communities and possibly even keep towns alive is a good thing. You can say "only 20% of Americans live in rural areas", but that is 20% of Americans. My concern is not nostalgia for a pastoral way of life I have never known, but rather a conviction that these communities and the people who live in them need support as much as poor and middle-class people in cities and suburbs do. I note the irony that Kansas' two Republican senators are very strong advocates for rural hospitals while supporting their party's policies of cutting services for the needy, and that its Republican governor (and former senator) is a leading advocate for "let's do whatever we can to help the Koch brothers by cutting taxes on fossil fuel producers". Meanwhile, we have spent, and continue to spend, billions upon billions of dollars subsidizing bankers, financiers, and the wealthiest American individuals, companies, and businesses.
Spending a little bit on keeping rural hospitals alive seems like a whole lot better thing to do.