Sunday, June 13, 2021

Culture and Medical Culture: Understanding to increase benefit and reduce harm

Culture is often understood, at least by the culture that is in the majority in a given place, as a characteristic of others. That is, we are “regular”; they have a culture. The greater the numerical disproportion between the dominant group and others, and the less diverse a community is, the more this – incorrect – assumption prevails. In the 19th century, before the work of Bronislaw Malinowski and Margaret Mead, who actually spent time in the places and cultures they were studying, cultural analysis of the world by anthropologists was often done “offline” by what have become known as “armchair anthropologists”. All European, they ranked cultures from least to most civilized, and guess what: European, and especially Western European, cultures were always at the top!

It should be needless to say that this was wrong. In addition to all the examples that can be given of non-European cultures that were far more advanced (think of the Arab world for mathematics and science, or China for all kinds of things), all cultures are different. They do not just have “strengths” and “weaknesses”, or areas in which one is “better”, but differences which have developed to serve the needs that existed where they lived. Weather, for a start, makes a difference in the types of crops grown or how housing is designed. In addition, of course, different cultures share many similarities. This allows, for example, for religious ecumenism, in which folks of different religions can come together based upon the values that they share. In the US today, we have seen great advances in understanding not only that differences between cultures do not mean one is better than another, but also that similarities between people usually exceed differences. Recently, we have seen great strides against racism, sexism, jingoism, and all the other “isms” that promote hatred instead of understanding. Unfortunately, however, we also see a backlash from people who feel threatened by the idea that other people, whom they have disparaged and discounted, are indeed their equals. This has gone beyond attitudes; it has led not only to violence, but to legislation enshrining prejudice, hatred, and discrimination. I hope this will get better, but it might get worse first.

One way that we have tried to address this issue in medical education has been discussions among small groups of students about how they see common phenomena in the world, in their communities, in families, and in relationships. The more diverse a class is, the richer these discussions become, and the more the students learn that what they think of as “regular” is in fact just as much a cultural belief as that of other people. Of course, this also can reveal assumptions that they may make about what is “normal” that are not normal for others, particularly regarding financial and socioeconomic issues. Or, for instance, whether the police are seen as your protectors or your persecutors.

This becomes an important entry point for examining medical culture, which certainly exists and carries its own beliefs and prejudices, as do most professions. These beliefs are no more, or less, “true” than sociocultural beliefs. Because medicine involves not only extensive interaction with people who are not immersed in its culture but, even more, extensive power over the lives and health of those people, it is important to come to grips with what you (and your teachers) believe simply because everyone in the profession believes it, as opposed to what is based in evidence. This is more difficult because a big part of the socialization into a profession such as medicine is for a novice from outside that culture to learn the jargon, way of thinking, and indeed prejudices that characterize it, and this can have negative as well as positive results.

For example, our medical students usually enter perfectly capable of speaking English (and perhaps other languages) and conversing with others and communicating ideas and information. As part of becoming doctors, they learn new language, new terms, new acronyms, new meanings, and eagerly repeat them as evidence of their acculturation. Unfortunately, this can become an obstacle to communication with their patients, who do not speak this language. One example: a couple of sentences ago, I used “positive” and “negative” in their usual English senses of “good” and “bad”. However, when doing medical tests (lab, imaging, biopsies) a positive result is usually bad, and a negative result is good. But when a doctor, or student, informs a patient that their results are negative, it is common for the patient to react with fear, since this sounds like a bad thing. We urge them to say “normal”. Whew, that’s a relief!

Some other issues of medical culture are addressed in an Op-Ed by Robert Pearl in the Los Angeles Times of May 16, 2021, “How doctor culture sinks US health care”. A big part of Dr. Pearl’s critique is the distinct bias, not only in physician attitudes but in medical journal articles, towards intervention and procedures rather than prevention. This, he notes correctly, is very much tied to money, since physicians and hospitals and health systems (which are increasingly the physicians’ employers) stand to make much more money from them. Medical journals are more likely to print articles with positive (there is that word again!) results, demonstrating that a procedure had benefit, than negative results, demonstrating that, compared to an alternative that was easier, cheaper, less interventive, and less dangerous – or to doing nothing at all – it had no better outcomes. Of course, anyone can see that knowing that doing something is not worthwhile is at least as important as knowing that something works well.

However, the inclination (or perhaps prejudice) among most physicians is to do something, to intervene; aside from making money, it makes them feel that they have skills, are justified, are important. Unfortunately, this is also an attitude quite prevalent among their patients, who want something done to help their problem – to cure their disease, or increase their lifespan, or improve the quality of that life, and in particular to ease their pain. But doing something does not always improve things, and can definitely increase the risk of harm. We need to know what works (and what doesn’t), and in what circumstances, and what the dangers are, and what the alternatives are, and their potential benefits and risks, and then have discussions together about what, in the specific circumstances a specific person is in, would be the best choice for them.

This effort is likely to overlap with more traditional sociocultural and religious beliefs, which can have an influence on what a person thinks would be best for them. Communication around this requires care, and a real effort on the part of the medical professional to understand and to make their own thoughts clear and clearly expressed. This is even more complicated because, as is the case, physicians are from a pretty narrow slice of the American population, racially, culturally, and economically (and, again, a good argument for increasing its diversity). As in all situations where there is a power differential (and in medical care, the greater power lies with the physicians and health systems), it is incumbent on those with greater power to make the effort to understand those with less. And, at least as important, not to make decisions for and about people based only on your understanding – or worse, assumptions – about what they want or who they are, because of race, religion, gender, national origin, etc. Doctors, even when they are well-meaning (and not all of them always are), too often allow themselves to fall victim to the ecological fallacy, and confuse “condition X is more common in population Y” with “this patient is a member of group Y, so probably has condition X”.

It is, of course, also very important to recognize that not all interventions and procedures are a bad idea; indeed, they are often the best treatment. And, also, that not everything sold as “preventive” is really so; plenty of tests and treatments called preventive are not proven to prevent anything. It is not easy to overcome prejudices and beliefs.

But understanding that we all have culture, and trying to not be bound by it and doing our best to understand that of others, is a good start.

Monday, May 17, 2021

COVID, Vaccine, Racism, and Masks: Changing for better or for worse?

We are not yet out of the COVID pandemic. Not in the world, where hot spots linked to both bad luck and irresponsible, arrogant governments continue to rage in such places as Brazil and India, as well as in some countries in Europe and in other parts of the world. Not in the US, despite the CDC indicating that it is now ok for those who are vaccinated to stop wearing masks. People are still dying in large numbers, people are still being infected, and this map of the US does not show all light-colored, low-risk counties (the gray counties, and the state of Alabama, reflect nonreporting).

But things are getting better. We no longer have an ignorant evildoer in the White House who is intent on ensuring that as many Americans die as possible (while he himself got every available treatment when he was sick, and indeed got vaccinated), but there are still plenty of Americans who are loyal and faithful to him (or to the myth of him), and to the wacko conspiracy theorists on the dark part of the web. Many of these folks are vaccine-hesitant or vaccine-resistant.

A lot of the coverage of these attitudes has been in minority, and particularly Black, communities. There is definitely resistance there, and to a large degree it is based on the experiences of those communities with the health care system. Some of it is rooted in the oft-cited historical examples of exploitation and experimentation on Black people in this racist nation. These include (probably most famous) the Tuskegee study of the “natural history” of syphilis in Black men, which continued even after a cure (penicillin) was discovered; the men were not treated. They also include the work of the “father of gynecology”, Dr. J. Marion Sims, whose discoveries were based on non-consented (of course) surgery on enslaved Black women, a foretaste of the Nazi doctors. They include, more recently, the fact that the HeLa cell line, derived from a cancer in a Black woman, Henrietta Lacks, has been the basis of many discoveries to treat cancer, but she was not cured, and neither she nor her family received recognition (or money, despite much going to researchers) for it.

Many Black Americans know these stories, and it is probably true that, to some degree, they influence the reluctance of some people to put their fates in the hands of the healthcare system and to trust the government, including the CDC. But I would posit that a larger reason is not these near-mythical (although very real) abuses, but the more mundane, everyday abuses that people, especially poor and minority people, have experienced at the hands of that system. It is no fun to be sick, no pleasure to need treatment, and the arrogance of much of our health-system culture toward its patients is very often confusing, frustrating, unpleasant, and demeaning even to well-insured and well-to-do White people. How much greater are all of these indignities – and too often abuses – if you are uninsured or poorly insured? If the massive health care institutions that have been built in pursuit of profit do not really want to care for you, or to offer you the special magic bullets available to those with more money? How much greater than even that is it when you are Black, and endure the overwhelming manifestations of American structural racism as delivered by the healthcare system? When the presumption is that you are more likely to be the cause of your own problems, because of your behaviors or your self-abuse; that you are guilty of not-always-clearly-stated malfeasances; that you are stupid and ignorant and not to be trusted?

Sure, Tuskegee and Sims and HeLa came from the same racism, but you don’t need to know about them at all to know that the health system is stacked against you, and your family, and your friends, and your community, and is not always to be trusted. So maybe you’re not sure about getting that vaccine. And many Black Americans, and other minorities, are not sure. But they aren’t the majority of the “vaccine skeptical”, despite getting more attention. The “traditional” anti-vax community is largely White, educated, upper-income, and centered on the coasts, especially the West Coast. It is represented by a legion of celebrities and other famous people, including Robert Kennedy, Jr., Jenny McCarthy, Mayim Bialik, even (until recently) Oprah Winfrey. In the COVID pandemic, they have been joined by another, much larger cohort, overall less educated and wealthy but just as White – the Trumpers. Following the incorrect and dangerous information that continues to spew from their leader, they now believe that the vaccine is ineffective, probably bad for you, and likely a result of a conspiracy of foreigners. You can look to the Dark Web or QAnon to find this information, or just turn your dial to FoxNews. And while there are organized efforts by Black leaders, including celebrities, to urge folks to receive the vaccine, there is no comparable campaign among the heroes of the White right.

Many articles have appeared documenting the degree to which our emergence from the COVID shutdown is going to depend upon adequate numbers of people receiving the immunizations. Health authorities, nationally and in state after state, have dropped the age criteria for being immunized, so that at this point in many jurisdictions, only infants are ineligible. And folks are still being immunized; while the rate of immunization has decreased from the peak, it is still being actively pursued by many people. But what is scary is that there are so many who will not receive it. And they won’t wear signs.

Were you worried about a third wave of COVID? Or is it now the fourth? I forget. Anyway, good idea to worry. The CDC has indicated that those people who are fully vaccinated can now go without masks in public. There is actually good scientific evidence for this, and it would certainly help people to feel like a corner has been turned. Except – they won’t wear signs. As reported by the Associated Press on May 15, 2021, while many small businesses are embracing the new opening and taking down their signs requiring masks, not wanting to “have to play police” any more, many large chains are continuing their mask policies.

“As many business owners pointed out, there is no easy way to determine who has been vaccinated and who hasn't. And the new guidelines, issued Thursday by the Centers for Disease Control and Prevention, essentially work on the honor system, leaving it up to people to do the right thing.”

Right. The honor system. The folks who stormed the US Capitol, who stood with guns outside the Michigan State Capitol to oppose masking (in the state which now has the highest COVID rates in the US), who believe that COVID is a hoax, that Trump won the election, that Nancy Pelosi is the devil, and that Jews have space lasers, will be honorable enough to wear masks when they haven’t been vaccinated.

Good luck with that.

Monday, April 5, 2021

We need more primary care to serve our people: Why do the medical schools lie?

Every year the nation’s medical schools graduate thousands of people with MD and DO degrees. But this is just the start of becoming a practicing physician; they now need to complete residency programs in a specialty area, ranging from 3 to as many as 8 years, to become family physicians, surgeons, radiologists, dermatologists, orthopedists, etc. Indeed, for many physicians this “postgraduate” training (meaning post-medical school, since medical school itself is post-graduate, requiring a bachelor’s degree for entrance) can have two components as well. First there is the primary residency program, say an internal medicine residency of 3 years, and then there is subspecialty training, usually called “fellowship”, where that internist becomes a cardiologist, or endocrinologist, or pulmonary medicine physician. While the internist who completes a 3-year residency may practice general internal medicine and thus become a primary care physician for adults, those subspecialists do not. A similar process exists for pediatrics. Family physicians completing their 3-year residencies can also do fellowships in a limited number of areas, and some limit their practices to sports medicine or geriatrics or adolescent medicine, but most add these skills to their primary care practice. And, of course, geriatrics and adolescent medicine are, like general internal medicine or general pediatrics, primary care for a particular population.

This is important. Primary care doctors provide care for their patients that is comprehensive and unrestricted, other than by age for pediatrics, internal medicine, and geriatrics. They meet the World Health Organization (WHO) criteria for primary care, providing continuous, comprehensive, community-and-family-centered care. Distilled down, this means that primary care physicians see their patients for everything, whatever concerns them, referring when needed. They are the doctors for their people, not for a particular disease or set of diseases. The lack of sufficient numbers of primary care doctors has significant negative impact on the health of our people. Of course, it falls hardest on those who are always most disadvantaged – the poor, members of minority groups, and rural residents. But it also has negative impact upon the health of privileged people who see lots of subspecialists, in two ways. One is that the specialist may be expert in their field, but miss appropriate treatments, and especially preventive measures, outside it. The other is that many specialties and subspecialties rely on and extensively use care that is very high-tech and expensive, which can lead to people getting tests and treatments that are not only costly but may not be of any benefit, and indeed may lead to harm.


So, when a medical school claims that it is good at producing primary care physicians, this is serious, and should be accurate. But it usually is not, because schools want to look as good as possible, so they establish criteria that make them look good, counting a wide variety of specialties that their graduates might enter as “primary care”. The biggest “offender” in this regard is counting all graduates entering internal medicine residency programs as entering primary care. As described above, some of these end up doing fellowships to become subspecialists and do not practice primary care; indeed, “some” is an understatement, as it is about 80%. In addition, about half the rest end up practicing as “hospitalists”, taking care of hospitalized patients only, rather than practicing primary care. So an approximation would be to assume that about 10% of those entering internal medicine residencies will practice primary care. In pediatrics, continuing as a general pediatrician is much more common; the appropriate multiplier is probably 60%, and for family medicine as much as 95%. There are also residency programs in a combination of medicine and pediatrics (Med/Peds) which can produce primary care doctors, and whose graduates are less likely to pursue subspecialty training; however, they are very likely to choose only one of those areas (adult medicine or pediatrics) and also to become hospitalists. A back-of-the-envelope calculation using these multipliers is sketched below.
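To make the arithmetic concrete, here is a minimal sketch, in Python, of that kind of adjustment. The match counts are hypothetical, and the retention multipliers are just the rough approximations above (about 10% for internal medicine, 60% for pediatrics, 95% for family medicine); this is my illustration, not the Graham Center’s actual methodology.

```python
# Estimated fraction of residency entrants who end up actually practicing
# primary care, per the rough approximations in the text (assumptions).
PRIMARY_CARE_MULTIPLIERS = {
    "internal_medicine": 0.10,  # ~80% subspecialize; ~half of the rest become hospitalists
    "pediatrics": 0.60,
    "family_medicine": 0.95,
}

def estimated_primary_care_output(match_counts):
    """Estimate how many graduates will actually practice primary care,
    given raw residency match counts by specialty."""
    return sum(
        count * PRIMARY_CARE_MULTIPLIERS.get(specialty, 0.0)
        for specialty, count in match_counts.items()
    )

# A hypothetical graduating class
matches = {"internal_medicine": 40, "pediatrics": 15, "family_medicine": 10}

reported = sum(matches.values())                   # what a school might report: 65
adjusted = estimated_primary_care_output(matches)  # 4 + 9 + 9.5 = 22.5
print(f"Reported 'primary care': {reported}; estimated actual: {adjusted:.1f}")
```

The gap between the reported 65 and the estimated 22.5 is exactly the kind of inflation described above.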

In addition, some (or many) schools include in the primary care numbers specialties that are simply not primary care at all. Most commonly, they include emergency medicine and obstetrics/gynecology. Emergency medicine does indeed provide first-contact care, but it does not provide continuity. Obstetrics/gynecology can provide some aspects of primary care (and indeed OBGyns may be the only doctors some young women see) but it is limited in that it is not comprehensive; women are more than their reproductive tracts, and they can have a variety of conditions OBGYN does not care for (diabetes, hypertension, heart disease, depression, arthritis, asthma and other lung problems, substance abuse, etc., to name a few). Perhaps the most egregious abuse is counting all students who enter internal medicine “transitional” or “preliminary” years. Such one-year programs, which have replaced the old “rotating internships”, are required for many specialties such as neurology, anesthesiology, radiology, ophthalmology, dermatology, and others, whose practitioners do not do primary care at all.

If we want to know how well a school is doing in graduating students who actually practice primary care at the end of their residency and fellowship training, these inflated numbers do not inform us. Fortunately, one of the most popular sources of information on medical (and other) schools, US News, has worked with the Robert Graham Center, the policy center of the American Academy of Family Physicians (AAFP), to develop and publish a metric that does show which schools actually produce primary care physicians, available at https://www.usnews.com/best-graduate-schools/top-medical-schools/graduates-practicing-primary-care-rankings. The top of this list is dominated by schools of osteopathic medicine, which consistently graduate higher numbers of primary care physicians, and, among the allopathic schools, the mainly public schools that have been doing well in this area for a long time. The private, largely northeastern, schools that usually top the rankings are nowhere to be found.

It is important to look at this list, not the list of “Top Primary Care Schools”, to get accurate data on production of primary care physicians. The metric on percent of students going into primary care has also been fixed in the “Top Primary Care” rankings, so it is better, but it still only accounts for 40% of that ranking. “Peer Assessment” (subjective rankings) accounts for 30%, half from medical school deans and other leaders, and half from residency directors. The other 30% is half “faculty resources” (largely faculty ratio), which may be skewed to the advantage of research-intensive schools because it includes faculty who are mostly in laboratories and not teaching, and half “student selectivity” (based on student grades and MCAT scores), which is actually negatively associated with entry into primary care. This doesn’t mean the students that enter primary care are not as smart; it means that the cachet of attending a research-intensive school makes the competition greater. Unsurprisingly, adding these other criteria does affect the rankings; Harvard, for example, is now #8 in “best primary care schools”, although it ranks #141 of 159 schools in percent of graduates practicing primary care. (In contrast, the University of Kansas, which ranks #9 in primary care, just below Harvard, ranks #17 in graduates practicing primary care, at 37.8%.) Reputation affects peer assessments in at least 3 ways. One is the spillover effect – well, it’s Harvard, and good in everything, so it must be good in primary care. A second is the ignorance of non-primary care deans and residency directors about what kinds of doctors the school produces. Finally, the fact that “good in primary care” can mean things other than what specialties the graduates enter can have an effect; there are schools in which the family medicine and other primary care faculty are well-known for their research and leadership in national organizations, but which do not graduate very many students into primary care disciplines. A toy version of this weighting appears below.
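To see how the weighting can swamp actual primary care production, here is a toy calculation. The component scores are invented; only the weights (40% primary care production, 30% peer assessment split between deans and residency directors, 15% faculty resources, 15% student selectivity) come from the ranking as described above.

```python
# Weights from the "Top Primary Care" ranking as described in the text.
WEIGHTS = {
    "graduates_practicing_primary_care": 0.40,
    "peer_assessment_deans": 0.15,
    "peer_assessment_residency_directors": 0.15,
    "faculty_resources": 0.15,
    "student_selectivity": 0.15,
}

def composite_score(components):
    """Weighted sum of normalized (0-100) component scores."""
    return sum(WEIGHTS[name] * components[name] for name in WEIGHTS)

# A hypothetical research-intensive school: weak actual primary care
# output, but stellar reputation, faculty resources, and selectivity.
school = {
    "graduates_practicing_primary_care": 20.0,
    "peer_assessment_deans": 95.0,
    "peer_assessment_residency_directors": 90.0,
    "faculty_resources": 90.0,
    "student_selectivity": 98.0,
}
print(f"Composite score: {composite_score(school):.1f}")  # about 64 out of 100
```

Sixty percent of the score has nothing to do with whether graduates actually practice primary care, so a school sending only 20% of its graduates into primary care can still land near the top.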

The fact remains, though, that the US is very short of the primary care doctors it needs to provide quality health care to the American people. The way to begin to change that is to stop deceiving ourselves. Then we can start the process of producing a higher percentage, in every school.

Sunday, March 21, 2021

"Values" based care: Public Health, Primary Care, and Medicare for All

Recently, a class of undergraduate freshmen I teach debated the issue “Health Care is a Human Right”. Although we later determined that most of them personally supported that statement, the “no” team did a good job marshalling the arguments of opponents, often citing those of libertarian think tanks such as the Cato Institute (originally founded as the Charles Koch Foundation, in case that helps), which identifies itself as promoting “free markets and individual liberty”. This includes identifying health care as a “commodity” and opposing health care as a right (and thus universal health coverage) as an infringement upon individual liberty. Essentially these two concepts boil down to the idea that the individual is free to decide what kind of health care they want, or don’t want, and what kind of insurance coverage they want, or don’t want, and can use their money (or not) to purchase this commodity (health care) as opposed to another (I don’t know, say a bass boat).

The hole in this argument is wide enough, though, to drive a bass boat through. It is that not everyone has enough disposable income to have the financial option of making such decisions. An old point about commodities, from the days when most Americans bought American-brand cars, is that some folks can buy Cadillacs and others Chevrolets. But of course, even if we update this to Beemers and Kias, there are a huge number of people who are buying used cars – often old “junkers” – to try to get to work and shopping. And there are those who can’t afford to buy, insure, and run any car at all and are reliant on public transportation. If there is any public transportation where they live. People without a lot of money (often despite working multiple jobs, even those making quite a bit more than the federal minimum wage of $7.25/hour, last raised in 2009, when $7.25 was worth about what $9 is today) make regular trade-offs on what they will spend their money on. Rent? Food? Clothes for the kids? Heat? Electric bill? Gas for the car to get to work, if they have a car? Health and medical care are rarely right up there at the top unless they are actively ill. Indeed, often even chronic diseases don’t get adequately managed, with medications for common conditions such as diabetes and hypertension stretched out. Such a family – and to a greater or lesser extent, this is probably true of the majority of families – is trying to figure out how to juggle absolute necessities, not luxury goods. The students arguing the “anti” position gamely tried to respond to such concerns, but learned that, outside the walls of conservative think tanks, Congress, state legislatures, and country clubs, there is a limit to the effectiveness of continually repeating “individual liberty” and “commodities”.

Paying for the cost of health care is a real juggling act for the government as well, although for a different reason from the one facing the families above. It is a balance between wanting to spend less money and continuing to support the profits of health care corporations such as insurance companies, hospital systems, and drug makers. The rational solution to this problem is to decide that it is not the government’s business to guarantee the often obscene profits of such private corporations, but rather to spend the money on whatever maximally increases the level of health of the American people.

This should include at least two major changes: first, a national health insurance plan (such as “Medicare for All”, recently reintroduced with major improvements by Reps. Pramila Jayapal and Debbie Dingell) that ensures that everyone is covered – everyone, all in one plan, no exceptions by age, disease, etc. – and second, a massive and continuing re-investment in public health, the need for which should have been made clear by the COVID pandemic. Historically in the US, in Democratic and Republican administrations alike, funding for public health is about 1% of the health budget, with the rest going to individual medical care. When we have a crisis, we bemoan the lack of public health infrastructure for a while, but then it recedes. Yet this is the most important component of keeping us healthy. Fighting an active enemy (like COVID) can garner support, while maintaining programs of prevention absent an obvious crisis gets less. How often do we wake up and say “I’m glad I don’t have cholera today because we have clean water and sewage systems”? And, yet, recently folks in Mississippi and Texas could count themselves lucky that their lack of water did not come with cholera or another infectious disease.

Instead of such wholesale reimagining we have had programs like “value-based care” for Medicare, adopted with the ACA (“Obamacare”) in 2010. When this was first rolled out, I was enthusiastic because I misunderstood it – I thought it was about providing care based upon values, presumably decent human values. Sadly, I was wrong. It was about spending less money. Did it work? To do what? If the goal was to spend less, yes, to some degree (see Austin Frakt in the NY Times Upshot, Oct 9, 2019, “more singles than home runs”). One of the big goals was to substitute “value” for “volume”. Paying for volume, the number of patients seen, was the accepted way to pay doctors. But what does paying for value mean? This whole issue is reviewed by Dr. Don McCanne in his “Quote of the Day” for March 17, 2021, “Policy community hung up on ‘volume to value’”. Dr. McCanne reviews the recent article “The Future of Value-Based Payment: A Roadmap to 2030” from the University of Pennsylvania, but in his comments he notes that

“All health care has “volume” – time, effort and resources devoted to health care. Volume varies tremendously depending on the clinical situation. Think of management of a common cold as opposed to management of severe multiple injuries in an accident. Can payment schemes ignore volume? Of course not. Volume is built into the problem.”

Here is a volume/value solution that I have discussed before but will now say clearly: revise the way that physicians (and other providers) are paid so that family physicians and other primary care doctors make at least as much as those providing subspecialty care. This is the third step, to add to universal health coverage and investment in public health. When I go to a shoulder orthopedist for the pain in my shoulder, that is the ONLY PROBLEM they deal with. Not BP, not abdominal pain, not my cold -- not even the arthritis in my knees. My PCP would deal with every problem on my – and all their patients’ – problem list (to a greater or lesser extent, depending upon severity and acuity), and thus rarely has enough time for any one person. If you go to the cardiologist and mention that you have knee pain, they say "I don't do knees; here is a referral to the orthopedist". And when you go to the orthopedist, they make a recommendation, and you come back to the cardiologist, who says "I don't do knees; whatever they said". So, for the subspecialist, referral is a time saver.

But if you come to a PC doc and say your knee hurts, they make some diagnostic and treatment suggestions. After examining your knee, maybe ordering imaging and lab, and thinking about it, if they think it might need surgery, they might refer you to the orthopedist. Then you go, and the ortho says "maybe surgery", so you come back and ask your PC doc’s opinion, and they read the whole consult and review the films and think about it and discuss it with you. Result: referral for a PC doc makes MORE work.

And they get paid less.

PC docs need more time with everyone, and thus fewer patients each day/week/year. How much money should they make? I don't care, pick a number, but they should be able to earn it by seeing no more than half the number of visits that they currently do. People's complaint is ALWAYS about not having enough time with the doctor.

So, increase funding for public health, develop a universal single-payer health insurance system, and pay PC docs at least as much per hour or patient as the highest-paid subspecialist in the outpatient setting. 

Now we begin to have “value”!

Saturday, March 6, 2021

DTC Advertising on TV illustrates the corruption and inequity of the US medical care system

I don’t watch a lot of live daytime TV. (In fact, I don’t watch a lot of live TV at any time.) I do see it several times a week for the 45 minutes I spend on the elliptical at the gym, which requires that the two TVs be tuned to sports. In practice, that means ESPN and ESPN2 and I try to position myself between them so I can look at whichever has the least boring talk. Daytime is not a time for actual sports; it is all sports talk. And mostly I listen to music in my headphones.

But I did notice that a whole lot of the commercials were medically related and aimed at my demographic (i.e., “old”). Since I was recently in the hospital for several days and had the TV on different stations, I can attest that this pattern is not limited to sports shows; it is absolutely as ubiquitous on CNN, MSNBC, etc. (I can’t personally attest to FoxNews, but I bet it is also true there). Sure, there are a few non-medically related commercials aimed at my demographic (e.g., reverse mortgages) and once in a while even something of more general generational interest – I particularly liked a half-hour infomercial for the NuWave Bravo air fryer/convection oven. But the medically related dominate.

Such commercials include those related to insurance (Medicare supplements and in particular Medicare Advantage – which seems a bit odd, since open enrollment doesn’t start until October), those advertising medical treatments for diseases of the age-challenged, and lots of commercials for incredibly expensive recombinant-DNA drugs (anything ending in “-ab”) for relatively uncommon diseases. This last group is in the traditional (if your idea of “tradition” is 40 years) mode of direct-to-consumer (DTC) advertising first legalized under the Reagan administration: “ask your doctor if this is right for you”. The implication is that it probably is, although the litany of unpleasant side effects that always starts with bloating, moves to serious infections, and ends in death should give us pause. It is certainly right for the drug manufacturer, who – shock! – makes HUGE HUGE HUGE amounts of money on these things. Some of these drugs cost $100,000 a year. There are neurologic drugs that cost $30,000 a MONTH. Or more. Thus it is worth advertising to lots of people in the hope that even a small percentage will “bite”.

NOTE: Most doctors hate this DTC advertising. Yes, it is in part because of how irritating and time consuming it is for patients to come in and keep asking about whether they should be on these drugs. But more important, it is because they are trying to take care of you, to best manage your condition using drugs (when drugs are appropriate) that are the most effective, have the fewest and least dangerous side effects, and are affordable for you. Why on earth would anyone ever think that a multi-billion-dollar multinational drug company has your health interests at heart more than your doctor does? If you think that, 1) you’re wrong, and 2) you’ve been watching too much TV.

In addition to the insurance and drug commercials, there are multitudes of miscellaneous others, especially for devices. You can get an app for your phone that will check your EKG. You can get endless numbers of devices that will monitor your blood sugar, many of which apparently come with the added benefit of turning you into a totally fit mountain biker! Indeed, if you have always wanted to play the flute, direct a play, or kayak whitewater, these devices are for you! Also, the you-know-what. And let’s not forget the “mobility devices” like scooters, available at “no cost to you” (although such ads seem a little less common since some of the big operators have been imprisoned). These commercials provide a regular source of income for actors in their 50s playing people in their 70s. Even the ones in scooters look not very old, not very sick, and not very obese. Of course, sometimes they feature a real person in their 70s, if they’re famous. Joe Namath is, I think, 77.

There seem to be endless ways for the companies to make money off of your Medicare benefit. Medicare Advantage deserves a little more discussion, since these plans often seem to (and may) offer you actual advantage. Medicare Advantage (Medicare Part C) takes the money that traditional Medicare would pay, usually has you put in more, and basically puts you in an HMO. You get the benefits of an HMO – coverage for things traditional Medicare doesn’t include, like perhaps vision, hearing, or dental. Also the disadvantages, like a limited choice of doctors, hospitals, etc., especially if you’re not in your usual geographic area. People with traditional Medicare do not get bills from doctors who are out of network (unless, and this is rare, the doctors have opted out of Medicare altogether), but Medicare Advantage patients do. So, if you never leave your home area and are happy with the hospitals and doctors in the network, it may be a good choice for you, just as an HMO may be for anyone (disclaimer: if you do choose one, it probably shouldn’t be the one advertising on TV!).

That’s about you, though. Societally, it is much worse. Medicare Advantage plans, compared to traditional Medicare, have way higher overhead costs (about 12-18% vs. about 2%). They also get paid extra by the federal government. Why? Well, you’d have to ask Reagan and his GOP successors; essentially it is part of an effort to privatize as much as possible. And, like most efforts to privatize, it actually costs more. And we all, as taxpayers, pay for it. The sketch below shows what that overhead gap means in dollars.
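To put those percentages in concrete terms, here is a rough illustration. The overhead rates are the ones just cited; the per-beneficiary spending figure is invented for the example.

```python
# Hypothetical annual spending per beneficiary (an assumption, chosen
# only to make the overhead percentages concrete).
annual_spending = 12_000

traditional = annual_spending * 0.02        # ~2% overhead: about $240
advantage_low = annual_spending * 0.12      # 12% overhead: about $1,440
advantage_high = annual_spending * 0.18     # 18% overhead: about $2,160

print(f"Traditional Medicare overhead: ${traditional:,.0f}")
print(f"Medicare Advantage overhead:   ${advantage_low:,.0f} to ${advantage_high:,.0f}")
```

On the same dollar of spending, that is roughly six to nine times as much going to administration and profit rather than to care.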

This is not a benign process, for Medicare, for drugs, for devices. They are selling us very expensive stuff and making a huge profit, while millions of Americans have no health insurance and millions more are grossly underinsured. Discussions about national health insurance proposals often focus on cost, but the stuff being sold in these commercials is part of the structure that causes amazingly inflated costs, making our health system the most expensive (2-3x as much per capita as other industrialized countries) while maintaining among the worst health outcomes of that group of nations. And the burden is not spread equally; those most in need, those who are the poorest, disproportionately minority group members, are the hardest hit. It is inequitable, discriminatory, and immoral.

The bottom line is that all of these commercials propagate a system that is not only vastly inequitable, but medically inappropriate: a system in which the goal is not to maximize the health of the population, in any fashion (certainly not an equitable one), but rather to maximize the profit of the companies that are advertising, to take money from the rest of the economy and accrue it to themselves. This, of course, is the nature of modern capitalism, but that doesn’t make it good. You have to decide whether your health, and the health of your friends, family, community, and nation, is a core public benefit or a product to be sold, caveat emptor.

The bottom line for the individual is the same as it is for all commercials. They are NOT about YOU. They are about MAKING MONEY for the company sponsoring the commercial. That is all. You are nothing but the vehicle, or perhaps more technically, “sucker”, who will channel that money their way. If you read the fine print it is scary, and that is only the stuff that they are legally required to tell you. Anything that they are not legally required to tell you will not be there. Do not trust them at all. Do not watch them. They are dangerous.

Although I am going to look and see how Consumer Reports rates the NuWave Bravo…
