Friday, September 17, 2021

Should hospitals and doctors make value judgements about who deserves treatment?

I heard on NPR’s “Here and Now” (Sept 9, 2021) that Jimmy Kimmel, the late-night TV host, had expressed anger and frustration with people continuing to refuse vaccination for COVID-19. He noted that many hospitals no longer had Intensive Care Unit (ICU) beds available and were going to have to triage who was admitted to them. According to the host, Robin Young, Kimmel said the decision was easy: you have a heart attack, you’re in; you have COVID and didn’t get vaccinated, you’re out. (His monologue is summarized by The Hill, among other sources.) Kimmel is not the only one to express outrage at the unvaccinated -- “shock jock” Howard Stern has responded to those who would cite their freedom to not be vaccinated with “F—k their freedom; I want my freedom to live!”— and he is also not the only one to have called for such “ICU triage”.

Daniel Wikler, a professor of medical ethics at the Harvard School of Public Health, was Ms. Young’s guest, and he said that, while he understood the anger that Kimmel and others were expressing, and empathized with it, he did not believe that it was the business of doctors or hospitals to make such decisions. It was the tradition and history of medicine, he said, to treat the illness of the patient if it was treatable, not to decide that someone had done something to themselves that made them undeserving of treatment. As an example, he noted a skier who might ignore all warnings, ski down the back of the hill, and get injured. There are lots of other potential examples, and they are valid.

I agree with Dr. Wikler on both points. First, I understand and empathize with Mr. Kimmel and others who are furious that those who have refused vaccination not only threaten the health of the rest of us but also end up utilizing a huge amount of health resources and services, which can limit access to those services for others in need and in any case costs huge amounts of time and effort by health professionals, as well as money. But I also agree that doctors and hospitals have no business refusing to care for these people, and that a core ethical value in medical care has been to provide care, if you are able, to treat the illness of the patient, not to judge whether they are worthy of care because of their previous actions. One of the most dramatic and important examples is medical facilities in war zones, which are obligated by the Geneva Convention to treat all injured on the basis of need, not which side they fought on. To treat one’s own soldiers and not injured enemy soldiers who are prisoners is a war crime.

Many of those people who have the heart attacks that Mr. Kimmel thinks should get them into the ICU smoked cigarettes, or ate a very poor diet, or did not exercise, or all of these. While I’m sure that there are some people who are judgmental and smug enough to believe that they should suffer the results of their own life decisions and not receive care, this is not the approach that doctors and hospitals take.

There are certainly many people whose illnesses are at least partly a result of other poor decisions, including use of alcohol – both heavy lifetime use and even the one episode that led to the car accident that has them in the emergency room – or other drugs. In addition, while less common than from alcohol, illness and death related to illegal drugs such as opiates, opioids, and stimulants are still very common; we have all heard of the “opioid epidemic”. And there are infinite possibilities for blame when you go beyond “sins of commission” – things you did that were bad for you – and enter the realm of “sins of omission” – things that you didn’t do that, at least in the view of the one making the judgement, would have been good for you (e.g., diet and exercise).

Back to domestic hospital use, I would like to discuss two examples from my own experience. Suicide attempts are definitely self-inflicted, but the motivation to act is often transient, and many people who attempt suicide and survive do not attempt it again. Guns are very lethal, however, with well over 90% of suicide attempts by gun being “successful”; drugs are less so. My son killed himself with a gun, but if his attempt had been with a less lethal method, I certainly would have wanted him treated.

On our inpatient services, residents and I have cared for many people who are repeatedly admitted with the effects of their use of alcohol or other drugs. One person I remember well. Regularly admitted for the toxic effects of alcohol overdose, on treatment and release he always pledged to get treatment for his disease, most strongly motivated by caring for his daughter, but he never followed through. After many admissions, some residents thought it wasteful to continue to treat him and argued against it. My position was that not only is recovery a difficult process, often with many failed attempts, but that our role was to treat his medical condition and refer him for treatment for his alcoholism. We could make the judgement that he was at fault, and each of us might have our own opinion about whether he “deserved” treatment, but that was irrelevant to our obligation to take care of him. To decide otherwise would be a slippery slope indeed. And I would be remiss not to point out that the most common reason people are “triaged” to not receive care, at least in the US, is financial: they do not have money or good insurance. That is totally immoral and unacceptable.

There are some differences with those who refused to be vaccinated against COVID or to wear masks or distance, but these are variations on a theme. Yes, they put others as well as themselves and their families at risk, but so do those who drink and drive or use other drugs, or who do many other things. It is our job to take care of them to the best of our ability. To do otherwise is to risk great hypocrisy, thinking that those who do the dangerous things we ourselves do are less culpable than those who do dangerous things we do not do and decry. I call it the “Jesse Helms fallacy” after the powerful former North Carolina senator who both opposed treatment for people with HIV/AIDS, who he said were suffering God’s punishment for their homosexuality, and also smoked like a chimney and fought for the tobacco industry. When he developed heart disease, he sought and received treatment, despite being largely personally responsible for it.

That so many are refusing vaccination that there are no beds in ICUs in many states, so that people are dying of otherwise treatable conditions (as a man from Alabama did from heart disease after being unable to get a bed in 43 hospitals in 3 states, and as is occurring across the poorly-vaccinated South), is shameful, discouraging, and incredibly dangerous. These people are misguided, stupid, and many are even evil. But we also hear of those who (because they are dying, to be sure) regret their decisions. We can feel some sense of self-righteousness when we hear about anti-vax personalities who have died. If we are in institutions where there are not enough beds and patients have to be triaged, that triage must be on the basis of their condition and our ability to help them. The social/political fight cannot be waged at the bedside of an individual patient.

As much as we might be tempted to do so.

Tuesday, September 7, 2021

Twenty years after 9/11: a health worker perspective

This is a guest post on the 20th anniversary of September 11, 2001, by Seiji Yamada, MD, a family physician at the University of Hawai'i John A. Burns School of Medicine.

All of us who are old enough recall what we were doing when we heard of the attacks of September 11, 2001. Since I live in Hawaiʻi, I was awakened by a friend living on the East Coast. He called to tell me to turn on my TV. When I did so, I saw the two towers of the World Trade Center on fire. I then watched the towers collapse.

On the following day, the University of Hawaiʻi Department of Family Practice (before the name was changed to Family Medicine) held a debriefing session with all staff, residents, and faculty in attendance. We came to some conclusions that we wrote about in the medical school newsletter:

We are humans before we are healthcare workers; our humanity is still a core component of our effectiveness as healers. Thus, our presence and genuineness, in the form of compassion and, when appropriate, openness about our own feelings, are therapeutic. When we can share some of our feelings about a recent disaster, it encourages a healing partnership by making the relationship less hierarchical. . . .

We must seek productive ways that translate our responses to distant suffering into a medicine more responsive to the suffering before us.  In this way, we can strive to incorporate social justice, equality, and compassion into both the practice of medicine and into the political response to acts of jarring violence.  We suggest that we should feel, think, and act not as members of a particular ethnic group, religion, or nation - but, rather, as humans.[1]

One participant, a Muslim and Arab woman, was silent through most of the session, but at the end she related that she had first wanted to hear what others had to say. She told us that she had grown up with, and constantly lived with, anti-Muslim, anti-Arab sentiments being expressed around her – such that she often found it most prudent to hide her ethnicity.

We wondered what the future would hold.  Would this tragedy make Americans ponder why their country is hated by many around the world?  Or would the U.S. hunker down like Israel and embody the national security state, arms pointed in every direction?  The fearful consensus was, as has been borne out, that this trial would only serve to strengthen the impetus to meet force with force.

Indeed, 9/11 was followed by much flag-waving and George W. Bush’s declaration of a “War on Terror.” Since the mastermind of the September 11 attacks, Osama Bin Laden (a Saudi), and the training camps of Al-Qaeda were in Afghanistan, the U.S. military began to plan for an assault on Afghanistan.

Richard Horton, the editor of The Lancet, wrote in a commentary published on October 6, 2001, that “The war against terrorism, announced by President Bush and endorsed by western political leaders in the immediate aftermath of the Sept 11 assault on America, will fail.” He suggested instead that “health, development, and human rights” be the objectives of a public health approach to Afghanistan.[2]

The U.S. started bombing Afghanistan on October 7, 2001.

I attended the American Public Health Association annual meeting in Atlanta in late October 2001. Against the backdrop of daily bombing runs projected on the megascreen of the CNN Center, I thought that I might find fellow health workers opposed to the war. After all, UN agencies such as the World Food Program and UNICEF had been drawing attention to the humanitarian crisis in Afghanistan that pre-dated 9/11. Severe drought and twenty years of war in Afghanistan had led to conditions bordering on widespread famine. Shouldn’t public health workers, who are concerned about the health and well-being of people, oppose the U.S. war on Afghanistan?

I buttonholed Victor Sidel, grand old man of social medicine, and invited him to chat over a coffee. His take on bombing Afghanistan was, “The U.S. has to do something.  It can’t stand by and do nothing.” He criticized what he saw as my pacifist stance.[3]

It has taken nearly 20 years for the U.S. to leave Afghanistan. September 11 also served as one of the pretexts for the Iraq War of 2003-2011. All told, the first ten years of the “War on Terror” took on the order of 1.3 million lives.[4]

Since September 2001, we have endured twenty years of U.S. invasions of Afghanistan, Iraq, and wherever else the U.S. deploys its Special Forces, whether it is Africa or the Philippines. Twenty years of drone attacks, which reached their height under “Hope and Change” Obama, who devoted his Tuesday mornings to choosing the week’s targets for extrajudicial assassination (“Sorry about the wedding party collateral damage”). Twenty years of torture chambers at Guantanamo and Abu Ghraib and Bagram Air Base and those hidden black sites around the world (“Yeah, Gina Haspel, you sure did a bang-up job running that black site in Thailand - we’re going to give you the top job of CIA Director”). Oh, Julian Assange, Chelsea Manning, Edward Snowden, do you think you’re going to let the people know what’s really going on? Well, for your troubles, you’re going to be psychologically tortured and placed in solitary confinement or exiled.

One economic sector saw its stock prices jump upward after 9/11: that of the arms manufacturers. As soon as the generals who oversaw the destruction of Afghanistan and Iraq and Libya retired from the U.S. military, they moved straight onto the boards of the weapons manufacturers. Lloyd Austin went from being commander of CENTCOM to the board of Raytheon. Meanwhile, the other pillar of the U.S. economy was the gambling house of debt financialization. When the casinos (i.e., the investment banks and their insurers) couldn’t cover their own debts and crashed the world economy, the U.S. taxpayers (via Congress) bailed out the banks, and workers had their houses foreclosed on. Subsequently, the Affordable Care Act (ACA, or ‘Obamacare’), touted as expanding the social good of health care to more people, essentially turned it over to the insurance and pharmaceutical industries.

However much the fabric of U.S. society has deteriorated in the twenty years since 9/11, it does not compare with the deliberate kinetic destruction wrought on the health services, access to water and food, infrastructure, and economies of Afghanistan and Iraq. Prior to the Gulf War (1990-1991, waged by George H.W. Bush), Iraq had been a thriving society, a leader in science and medicine in the Arab world.[5] Now, subsequent to the U.S. invasion (2003-2011, started by George W. Bush and Dick Cheney) and the war against ISIS (2013-2017), Iraq is a shambles. And thanks to Donald Trump’s utter incompetence, George W. Bush is now looked upon as a statesman. We are reminded that the U.S. destruction of the Middle East has been going on for much longer than the past twenty years. As Noam Chomsky often says, massive reparations are in order.

As noted by Chris Hedges, as the U.S. leaves, Afghanistan is, as it was when the U.S. invaded, in the midst of a humanitarian crisis:

Things are already dire. There are some 14 million Afghans, one in three, who lack sufficient food. There are two million Afghan children who are malnourished. There are 3.5 million people in Afghanistan who have been displaced from their homes. The war has wrecked infrastructure. A drought destroyed 40 percent of the nation’s crops last year. The assault on the Afghan economy is already seeing food prices skyrocket. The sanctions and severance of aid will force civil servants to go without salaries and the health service, already chronically short of medicine and equipment, will collapse.[6]

As Hedges points out, the response of the civilized world is to freeze the assets of the Afghan central bank and deny the new government access to loans or grants.

In retrospect, it is obvious how the desire for revenge in the immediate aftermath of 9/11 has led us to where we are now. What if, instead, the pain engendered by 9/11 had encouraged us to recognize the pain of others - those who suffer from hunger, poverty, ill health, and exploitation? What if narratives and images of death and destruction had prompted us health workers to demand an end to war?[7]

What if we had sought instead to alleviate social ills and sought to ensure clean water, good nutrition, education, and health? Might we not all be better for it now?



[1] Yamada S, Maskarinec G, Bohnert P, Chen TH.  In the aftermath:  reactions to September 11, 2001.  News from the John A. Burns School of Medicine 2001 Winter;2:1-2. https://www.researchgate.net/publication/354116332_In_the_aftermath_-_reactions_to_September_11_2001

[2] Horton R. Public health: a neglected counterterrorist measure. Lancet 2001;358:1112-1113.

[3] Yamada S. On The Responsibility of Health Workers to Oppose the War. ZNet. Nov. 2, 2001. https://www.researchgate.net/publication/354116411_On_The_Responsibility_of_Health_Workers_to_Oppose_the_Afghanistan_War

[4] International Physicians for the Prevention of Nuclear War. Body count: casualty figures after 10 years of the “War on Terror”: Iraq, Afghanistan, Pakistan. March 2015. https://www.psr.org/wp-content/uploads/2018/05/body-count.pdf

[5] Yamada S. Health workers and the Afghanistan-Pakistan War. ZNet.  December 14, 2009. Reprinted at Medicine and Social Justice. January 11, 2010. https://medicinesocialjustice.blogspot.com/2010/01/health-workers-and-afghanistan-pakistan.html

[6] Hedges C. The Empire does not forgive. ScheerPost. August 30, 2021. https://scheerpost.com/2021/08/30/hedges-the-empire-does-not-forgive/

[7] Yamada S, Smith Fawzi MC, Maskarinec GG, Farmer PE.  Casualties:  narrative and images of the war on Iraq.  Int J Health Services, 2006;36(2):401-15. http://web.mit.edu/humancostiraq/further-reading/casualties.pdf

Tuesday, August 3, 2021

COVID is still with us. Vaccine resistance is real, and it is dangerous

That there are a lot of people who are vaccine-resistant will be news to no one at this point. I wrote about this on May 17, 2021 (COVID, Vaccine, Racism, and Masks: Changing for better or for worse?), and while many people who were initially resistant have now changed their minds and have gotten (or in many cases, are trying to get) vaccinated, there is a hard-core residual group. There are a variety of reasons that people have, and sometimes articulate, for not getting vaccinated. These include a lack of knowledge (hard, I know, given the amount of discussion) and a mindset that disbelieves those who are in power (also in some ways understandable; they do lie a lot). This is complicated for some parts of the population, particularly members of minority groups like African-Americans and Native Americans, by the fact that US history is loaded with stories of exploitation and oppression and infection, from the passing out of measles- and smallpox-infected blankets to Indians to the Tuskegee experiments that denied treatment for syphilis to Black men, and many other crimes.

But at this point, there are no responsible people at all who are urging folks to not get vaccinated. Virtually all doctors, epidemiologists, and scientists have been urging vaccination (with some effectively evil exceptions, such as Joseph Mercola, DO, “The Most Influential Spreader of Coronavirus Misinformation Online”). Conversely, anyone who is urging people not to get vaccinated (which unfortunately includes many politicians, Fox News personalities, and other “influencers”) is not responsible. COVID is real, it is infectious, the Delta variant is more infectious, it makes people really sick, and it kills people. A lot of people. Irrespective of whether they believe it is “real” or “dangerous” or not. Indeed, most of the people dying now (over 95%) in the US are those who have not been vaccinated. There is definite evidence that people who are vaccinated CAN get infected, and maybe even a few of them will die, but this is also definitely being overplayed by the media. Most of the media does not lie, exactly, but most people don’t go beyond the headlines, and a very large number have no concept of relative risk or odds (I would assume, therefore, that poker players have been vaccinated!). You can cross the street very carefully, on a green light, at a corner, looking both ways, and still be hit and killed by a car driven by a lunatic speeding around the corner. But it is pretty unlikely compared to, say, running across a busy 6-lane highway without looking. Most folks can get that kind of relative risk, and it is not really unlike the risk of being vaccinated or not. This graphic illustrates that risk:

Another way to express it, with a more traditional line graph, by county:
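Since the point above turns on relative risk, a concept many readers may not routinely calculate, here is a minimal sketch of the arithmetic in Python. The numbers are purely hypothetical, chosen only to illustrate the idea of a risk ratio; they are not drawn from the graphics referenced above or from any actual study.

```python
# A minimal, illustrative relative-risk calculation.
# The numbers below are purely hypothetical, chosen only to show the arithmetic;
# they are not taken from this post's graphics or from any study.

def relative_risk(events_exposed, total_exposed, events_unexposed, total_unexposed):
    """Risk ratio: incidence in the exposed group divided by incidence in the unexposed group."""
    risk_exposed = events_exposed / total_exposed
    risk_unexposed = events_unexposed / total_unexposed
    return risk_exposed / risk_unexposed

# Hypothetical example: deaths per 100,000 among unvaccinated vs. vaccinated people.
rr = relative_risk(events_exposed=150, total_exposed=100_000,      # unvaccinated (hypothetical)
                   events_unexposed=10, total_unexposed=100_000)   # vaccinated (hypothetical)

print(f"Relative risk (unvaccinated vs. vaccinated): {rr:.0f}x")
# -> Relative risk (unvaccinated vs. vaccinated): 15x
```

The same arithmetic underlies the street-crossing analogy: both choices carry some risk, but it is the ratio between them that matters.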

A recent article in the NY Times, “Workplace vaccine mandates reveal a divide among workers”, describes another way that workers are being divided: not just by whether they are willing to be vaccinated, but by whether their employers are willing to mandate (and pay for) vaccination for them. Apparently, they are for the white-collar workers they want back in the office (not clear why) but not necessarily for the blue-collar workers, those who actually do work that cannot be ‘phoned (or Zoomed) in’ but requires their physical presence, those people who have often been called essential workers. Walmart, for example, ‘announced mandatory inoculation for employees at its headquarters and for managers who travel domestically. For a sense of scale, about 17,000 of Walmart’s 1.6 million employees are expected to work in new headquarters in Bentonville, Ark.’ The argument for this seems to be that companies like Walmart need lots of workers and don’t want to alienate those who don’t want to be vaccinated by mandating it. But somehow it is ok for office workers. I am not sure that I understand this, but it must have something to do with pay: if you get paid a reasonable living wage you are more likely to be willing to get a mandatory vaccine, and you should anyway. It seems to me that if Walmart and other companies wish to increase the demand for their jobs, the better way to do it is: pay more!

We read that ‘Trump's COVID-19 testing czar warns the unvaccinated: “You’re going to get the delta variant”’ (the Daily Kos article’s author notes that they didn’t remember that #TFG had a testing czar. Neither did I.) Of course, a lot of people regret having not been vaccinated once they end up sick, hospitalized, and on a ventilator, like conservative radio talk show host Phil Valentine and others covered in a recent article in Rolling Stone. The NY Times also had an article titled “They Spurned the Vaccine. Now They Want You to Know They Regret It”. I heard from a friend about a hospital-based health professional (not a physician) they knew who had reportedly been vaccinated but ended up on a ventilator, which was very concerning. A few weeks later, we called to follow up. Guess what? It turns out they weren’t vaccinated; they and their whole family were lying. Embarrassed. Reassuring, in one sense, but also very worrying that folks are lying.

I hope that others who have reservations will take their advice and get vaccinated. The regret is, of course, a bit disingenuous; if they had known they were going to get sick and maybe die, they would have gotten vaccinated. If we had known that the roulette wheel was going to come up black, we would never have bet on red! You can’t be sure what will happen to you, but the important point is that you have to do it because you might get sick, and, at least as important, because you can infect others.

But there are many people who steadfastly believe that being vaccinated is not for them, and some who believe that it is not for anyone. They are getting sick, and will continue to get sick, and infect others, and cause others to die. They will not wear masks, and they will not wear signs. When they say that vaccine mandates, or mask mandates, are oppression, they are being, frankly, ridiculous.

The pandemic is not over. Delta is very serious. Get vaccinated. Wear a mask indoors.

I cannot resist including two other posts that make that point:

Saturday, July 10, 2021

Drug and device makers: Obscene profits and kickbacks -- a big part of why our "healthcare" costs so much

Drug companies are greedy parasites. This is well-known to everyone. Even the shock troops of the Republican right know about and decry it.

The drug companies start with the capitalist model – sell something at a profit – and take it to an extreme: make as much profit as possible, no matter who it hurts (as long as it doesn’t hurt them).

They have a great business plan: people need their medicine, they (or mostly their insurance companies, including government insurers Medicare and Medicaid) are willing to pay for it, even if truculently, and they’ll pay whatever is asked.

While we are all shocked periodically by stories of individual dramatic greed, such as Mylan and its CEO Heather Bresch jacking up the price of EpiPen® to over $600 (Epi-Pen® and Predatory Pricing: You thought our health system was designed for people’s health?, Sept 3, 2016) or Martin Shkreli and his company Turing brazenly raising the price of Daraprim® from $13.50 to $750 a pill (Drug prices and corporate greed: there may be limits to our gullibility, Sept 27, 2015), the fact is that this is the standard business model of pharmaceutical companies. Think of colchicine, an anti-gout drug used by the ancient Egyptians since at least 1500 BC, having its price jacked up from $4/month to $5/pill (VISA and colchicine: maybe the banks and Pharma really ARE in it for the money!).

Or, to get to one of the biggest scandals of all, the price of insulin! Should there be anyone who does not know, insulin was one of the most important drugs ever discovered. Unlike the newest drugs that I have recently discussed, which sell for thousands (or tens of thousands) of dollars a month, insulin is not for a niche market. It is for diabetes, one of the most common diseases. For people with Type 1 diabetes, who produce no insulin of their own, it is simply an absolute requirement for life. For the larger group of people with Type 2 diabetes, it is often a critical part of controlling their disease. It was discovered by the Canadians Banting and Best in the 1920s, who sold the patent to the University of Toronto for -- $1! (Not sure if that was a US or Canadian dollar.) This drug – the impact of which is HUGE – now costs over $100/month for the cheapest generic forms if you have a coupon, and, depending upon the formulation, can be hundreds!

So, we have predatory pricing on products people absolutely need for their lives. Check.

We have outrage among millions of people of all political stripes. Check.

We have the ability to control these prices, starting with the largest payer in the US, Medicare. Check.

We have done what we can, as a government and a society, to address this issue. Um, uncheck. Nope.

In fact, when Congress passed the Medicare drug benefit (Part D, in the MMA) in 2003, it specifically forbade Medicare from negotiating drug prices. Good deal for the drug companies! Guaranteed payment and no ability for the purchaser to negotiate the price! Where can you or I get that deal? Nowhere, of course. We do not have huge piles of $$ for paying Congressmen. Every time the drug manufacturers suggest that they need their profit to pay for Research and Development (R&D), we need to note how much MORE they spend on Marketing (including lobbying and contributions). It is also worth noting that, contrary to their propaganda, MOST drugs are not developed in the US. The US accounts for the largest plurality, but Japan and Western Europe also contribute a lot. (See, e.g., Light and Lexchin, Pharmaceuticals as a market for “lemons”: Theory and practice, and Pharmaceutical research and development: what do we get for all that money?; also, for another take, US Pharmaceutical Innovation in an International Context.) More important, perhaps, is that most of the basic (high-cost, high-risk) research is funded by you, the taxpayer, through National Institutes of Health (NIH) grants; drug companies most often buy the patents only after the original research shows promise.
https://truecostofhealthcare.org/wp-content/uploads/2019/03/PharmaPG2018.png

Of course, in recent years, there has been a change. In labeling. Some expenditures that used to be included as marketing are now included as R&D. Not that the amount that they spend on lobbying, political contributions, marketing to physicians, and direct marketing to consumers has changed. And of course we have the FDA approving incredibly expensive and profitable drugs against the recommendations of its scientific panels (FDA approves Alzheimer's drug against the recommendation of its scientific panel. Be very concerned, June 21, 2021). This is corruption; this is your government selling you down the river to increase the profits of an industry that everyone, justifiably, hates.

And then, of course, we need to consider the device makers. You know, the companies that make the stuff that is stuck into your body by surgeons, from artificial hips to cardiac defibrillators. Just to make sure that they do not continue to fly under the radar, getting cover from the excesses of the drug manufacturers. This is important; unlike with drugs, if it turns out that your implant is not the best choice, or is not working well, or is really inferior, or is harming you, it is not as easy to just stop it and put a new one in. That artificial hip? They opened you up, cut out the top part of your thigh bone (femur) and part of the socket where it inserts into the pelvis (the acetabulum), and put in this new hardware. Replacing it is a big deal. So getting the best one available is important. Usually, this decision is left up to the surgeon. So how do you get surgeons to use your device if you are a device manufacturer? Marketing! Telling them how great it is! Showing them testimonials from other surgeons! Oh, yes, and, of course, good old kickbacks, paying them for using your product! This is reported on in a recent piece in Medscape, “Device Makers Have Funneled Billions to Orthopedic Surgeons Who Use Their Products”, June 17, 2021. That is how you wanted your surgeon to choose the new part that is going to replace part of your hip, or spine, or knee, right? Bribery?

Of course, it is not listed as bribery on the companies’ balance sheets, and I’m sure that the individual surgeons do not report it on their income tax under “kickbacks”. Back in 2012, surgeon and writer Atul Gawande, in an article in the New Yorker called “Big Med” (discussed by me in Quality and price for everyone: Bigger may be better in some ways, but not all, Aug 24, 2012), reported on an effort in a Harvard-affiliated hospital to standardize the hardware used by orthopedic surgeons. Rather than having each surgeon pick their own favorite and having the hospital stock several – or sometimes more than 10 – versions of, say, an artificial hip, a committee researched the quality, the hospital stocked 1 or 2 of the best, and each surgeon had to use them. This improved outcomes. There was no suggestion that the surgeons were, at that time, getting kickbacks, but it is certain that the manufacturers whose products were not chosen were not happy about it.

It is not hard to see why the American people distrust the healthcare system, and the hugely profitable drug and device industries that supply it, and the healthcare “providers” – hospitals – that deliver it, and, sadly, with news like these kickbacks, even the doctors and other clinicians caring for them. And I haven’t even gotten into (today) the insurance companies! As in so much else, it often appears that both of our major political parties are the parties of Corporate America, although one is at least making some efforts to limit those corporations. The other is, like the drug companies, totally shameless.

It is not hard, but it is sad. And worse, destructive to our health, as individuals and as a society. 

Monday, June 21, 2021

FDA approves Alzheimer's drug against the recommendation of its scientific panel. Be very concerned.

Early in June, an article in the NY Times discussed the possible approval of aducanumab, a recombinant DNA (the “-ab” is always a clue!) drug intended to treat Alzheimer’s disease. The FDA approved the drug a few days later, going against the recommendations of its advisory committee of scientific experts, and generating this “Quotation of the Day” in the Times from one of its members, G. Caleb Alexander: “There’s no way to recover the opportunity to understand whether or not the product really works in the post-approval setting.” Almost immediately, three members of the advisory committee, Joel Perlmutter of Washington University in St. Louis, David Knopman of the Mayo Clinic, and Aaron Kesselheim of Harvard, resigned in protest of the decision. Dr. Kesselheim, along with his colleague, Dr. Jerry Avorn, presents a strong indictment of the FDA in an Op-Ed guest essay in the Times, and they are not alone. Most neurologists, including those that I know who are experts on and leading researchers in Alzheimer’s, echo these concerns.

This is pretty unusual. Not just the resignations, but the reason for them – the decision by the FDA to approve a new drug based on evidence of effectiveness so weak that the scientific advisory panel recommended against it. It raises a number of questions, the foremost one of which is “why?” Also: Is this a precedent, and will it happen again, or more regularly? What was the reason that the advisory committee recommended against approval? Who were the people at the FDA who overruled them, and what were their reasons?

First, let’s start with cui bono? – who benefits. This is certainly Biogen, the company that developed aducanumab and will market it under the trade name Aduhelm. It is estimated that it will cost $56,000 a year. This is not a record; there are other recombinant DNA drugs – including several for neurologic conditions – that cost even more. In fact, as indicated in a recent study by the American Academy of Neurology, “Medicare paid 50% more for neurology drugs over 5 years while claims rose only 8%”. Still, it is a tidy chunk of change, and since Alzheimer’s is a far more common disease than most of the rare ones that are ostensibly treated by more expensive drugs, Biogen expects to make a bundle. And, because only the very, very rich could afford this much, most of it will be paid by you. That is, by insurance companies that collect your premiums, and especially by Medicare, the insurer for the majority of Alzheimer’s patients, which is funded by your tax dollars. This is described in another article, with the subhead: ‘Despite scant evidence that it works, the drug, Aduhelm, is predicted to generate billions of dollars in revenue, much of it from Medicare.’ If people are not insured, or rich, they can forget it. Which, in this case, might be just as well.

Making a lot of money, as much as they possibly can wring out of patients and insurers, is the core business of pharmaceutical companies (and most companies, although pharmaceutical companies have been particularly good at making outrageous profits, always ranking as the #1 industry for profit). It is not, despite their ads (and they spend much more on marketing than on research and development) about improving your health.  You are just the coincident vehicle for generating their profits. Their drugs do not have to actually help you get better; as long as they don’t harm you too much – and, of course, as long as the FDA approves them – they are golden. This is why they spend so much on marketing, and lobbying, and specifically lobbying the FDA. Indeed, the “golden parachute” of many FDA staffers is to retire from the agency and get a job lobbying for a drug company. Sigh. So that one is obvious. Corrupt and despicable, yes, worthy of complete anger and condemnation, yes. But obvious. Not, heretofore, however, predictable.

There is another stakeholder group involved: Alzheimer’s advocacy groups. The FDA still has an acting chief, Janet Woodcock, and another article notes that these groups supported her becoming permanent, having backed Woodcock’s nomination back in February when the application for the drug, aducanumab by Biogen, was pending; its approval was a sign that they backed the right candidate. Wow. Shouldn’t we be paying attention to them? After all, they are not the drug manufacturers who will be making a mint. And Alzheimer’s is a terrible disease, and we need effective treatments, right?

Not so fast. Yes, Alzheimer’s is a terrible disease. Those who have it suffer greatly, at least until it is so advanced that they no longer recognize what is going on. And their loved ones continue to suffer, more and more. A drug that would cure it, or mitigate it, or make it progress more slowly would be wonderful (although it shouldn’t cost $56,000 a year!). But is aducanumab that drug? Not according to the scientific panel, who would know. But the advocacy groups are pushing for it anyway. Why? Well, they may not be making most of the money, but they have to justify their existence. And they almost certainly are getting donations from those drug makers. And maybe, even, they care so deeply about the disease that their hope and optimism overcomes appropriate caution. It wouldn’t be the first time that this has happened (e.g., the continued promotion by breast cancer advocacy groups of decreasing the age and increasing the frequency of screening even when the science showed the opposite).

It also wouldn’t be the first time that those advocating for victims of terrible disease pushed strongly for approval before studies were completed. One meaningful and important example is the efforts of groups such as ACT-UP to get early approval for anti-retroviral drugs, as people were dying in droves from AIDS.  But there are differences. One is the disease; Alzheimer’s is not killing people quickly as did AIDS, and no one is claiming that aducanumab or any other drug will change its eventual downhill course. Another is health equity. In the political and social landscape of the 1980s, AIDS was a disease primarily affecting gay men and IV drug users, definitely not the mainstream. Leaders such as Ronald Reagan refused to offer support. And, perhaps most importantly, the anti-retrovirals were showing a definite positive effect in studies, and the calls were to speed up the approval process. In the current case, the trials are complete and the evidence showing a positive effect is not sufficient.

This is in no small part due to the fact that the “positive effect” the studies show involves changes in biomarkers, not changes in people’s lives. That is, they look at lab tests rather than whether people die less soon or suffer less. Yes, there is evidence, as there is in many diseases, that these intermediate markers are related to long-term outcomes, but the problem is that the further out they get, the more it becomes like a game of “telephone” (well, our drug affects A, and A is related to B, and B may be related to long-term outcomes). We need studies that look at patient-oriented, not disease-oriented or laboratory-test-oriented, effects.

Sometimes an intermediate marker improves but the patient does not, or gets worse. It could be from a side effect of the drug (drug safety), but it can also be from the desired positive effect of the drug! For a time, diabetes groups pushed to lower the target hemoglobin A1c (HbA1c) – a measure of long-term glucose level – to be 5 rather than 6, because people with diabetes with lower HbA1c levels had lower levels of diabetes complications. Makes sense. But when the average blood sugar over several months is lower, it increases the risk of significant hypoglycemia (low blood sugar), which can be more dangerous than higher sugar. Indeed, if you pass out from low blood sugar, fall and break your hip, and die, the lower rate of complications from your diabetes in the long term is irrelevant. There is an old medical joke about Harvard doctors being very insistent that their residents keep patients’ lab values in the normal range, so that even when the patient died, they died in “perfect Harvard balance”.

This is not what we want. We want diseases to be cured, or ameliorated; for lives to be lengthened and improved in quality. We certainly do not want drug companies to make billions off of people’s suffering. When the FDA approves a drug over the recommendations of its scientific panel, it should be of great concern to all of us. 

And don't forget cui bono?

Sunday, June 13, 2021

Culture and Medical Culture: Understanding to increase benefit and reduce harm

Culture is often understood, at least by the culture that is in the majority in a given place, as a characteristic of others. That is, we are “regular”; they have a culture. The greater the disproportion between the dominant group and others in terms of numbers, and the less diverse a community is, the more this – incorrect – assumption prevails. In the 19th century, before the work of Bronislaw Malinowski and Margaret Mead, who actually spent time in the places and cultures they were studying, cultural analysis of the world by anthropologists was often done “offline” by what have become known as “armchair anthropologists”. All European, they ranked cultures from least to most civilized, and guess what: European, and especially Western European, cultures were always at the top!

It should be needless to say that this was wrong. In addition to all the examples that can be given of non-European cultures that were far more advanced in particular areas (think of the Arab world for mathematics and science, or China for all kinds of things), all cultures are different. They do not just have “strengths” and “weaknesses”, or areas in which one is “better”, but differences which have developed to serve the needs that existed where they lived. Weather, for a start, makes a difference in the types of crops grown or how housing is designed. In addition, of course, different cultures share many similarities. This allows for, for example, religious ecumenism, in which folks of different religions can come together based upon the values that they share. In the US today, we have seen great advances in understanding not only that differences between cultures do not mean one is better than another, but also that similarities between people usually exceed differences. Recently, we have seen great strides against racism, sexism, jingoism, and all the other “isms” that promote hatred instead of understanding. Unfortunately, however, we also see a backlash from people who feel threatened by the idea that other people, whom they have disparaged and discounted, are indeed their equals. This has gone beyond attitudes; it has led not only to violence, but to legislation enshrining prejudice, hatred, and discrimination. I hope this will get better, but it might get worse first.

One way that we have tried to address this issue in medical education has been discussions among small groups of students about how they see common phenomena in the world, in their communities, in families, and in relationships. The more diverse a class is, the richer these discussions become and the more the students learn that what they think of as “regular” is in fact just as much a cultural belief as that of other people. Of course, this also can reveal assumptions that they may make about what is “normal” that are not normal for others, particularly regarding financial and socioeconomic issues. Or, for instance, whether the police are seen as your protectors or your persecutors.

This becomes an important entry point for examining medical culture, which certainly exists and carries its own beliefs and prejudices, as do most professions. These beliefs are no more, or less, “true” than sociocultural beliefs. Because medicine involves not only extensive interaction with other people who are not immersed in the culture but, even more, extensive power over the lives and health of those people, coming to grips with what you (and your teachers) believe because, well, we all believe it, rather than what is based in evidence, is important. This is more difficult because a big part of the socialization to a profession such as medicine is for a novice who is from outside that culture to learn the jargon, way of thinking, and indeed prejudices that characterize it, and this can have negative as well as positive results.

For example, our medical students usually enter perfectly capable of speaking English (and perhaps other languages) and conversing with others and communicating ideas and information. As part of becoming doctors, they learn new language, new terms, new acronyms, new meanings, and eagerly repeat them as evidence of their acculturation. Unfortunately, this can become an obstacle to communication with their patients, who do not speak this language. One example: a couple of sentences ago, I used “positive” and “negative” in their usual English senses of “good” and “bad”. However, when doing medical tests (lab, imaging, biopsies) a positive result is usually bad, and a negative result is good. But when a doctor, or student, informs a patient that their results are negative, it is common for the patient to react with fear, since this sounds like a bad thing. We urge them to say “normal”. Whew, that’s a relief!

Some other issues of medical culture are addressed in an Op-Ed by Robert Pearl in the Los Angeles Times of May 16, 2021, “How doctor culture sinks US health care”. A big part of Dr. Pearl’s critique is the distinct bias, not only in physician attitudes but in medical journal articles, toward intervention and procedures rather than prevention. This, he notes correctly, is very much tied to money, since physicians and hospitals and health systems (which are increasingly the physicians’ employers) stand to make much more money from them. Medical journals are more likely to print articles with positive (there is that word again!) results, demonstrating that a procedure had benefit, than negative results, demonstrating that, compared to something easier, cheaper, less interventive, and less dangerous – or to nothing at all – it had no better outcomes. Of course, anyone can see that knowing this information, that doing something is not worthwhile, is at least as important as knowing that something works well.

However, the inclination (or perhaps prejudice) among most physicians is to do something, to intervene; aside from making money, it makes them feel that they have skills, are justified, are important. Unfortunately, this is also an attitude quite prevalent among their patients, who want something done to help their problem – to cure their disease, or increase their lifespan, or improve the quality of that life, and in particular to ease their pain. But doing something does not always improve things, and it can definitely increase the risk of harm. We need to know what works (and what doesn’t), and in what circumstances, and what the dangers are, and what the alternatives are, and their potential benefits and risks, and then have discussions together about what, in the specific circumstance a specific person is in, would be the best choice for them.

This effort is likely to overlap with more traditional sociocultural and religious beliefs, which can have an influence on what a person thinks would be best for them. Communication around this requires care, and a real effort on the part of the medical professional to understand and to make their own thoughts clear and clearly expressed. This is even more complicated when, as is the case, physicians are from a pretty narrow slice of the American population, racially, culturally, and economically (and, again, a good argument for increasing its diversity). As in all situations where there is a power differential (and in medical care, the greater power lies with the physicians and health systems), it is incumbent on those with greater power to make the effort to understand those with less. And, at least as important, to not make decisions for and about people based only on your understanding – or worse, assumptions – about what they want or who they are because of race, religion, gender, national origin, etc. Doctors, even when they are well-meaning (and all of them are not always) too often allow themselves to fall victim to the ecological fallacy, and confuse “condition X is more common in population Y” with “the patient is a member of group Y so probably has condition X”.

It is, of course, also very important to recognize that all interventions and procedures are not a bad idea; indeed, they are often the best treatment. And, also, that not everything sold as “preventive” is really so; plenty of tests and treatments called preventive are not proven to prevent anything. It is not easy to overcome prejudices and beliefs.

But understanding that we all have culture, and trying to not be bound by it and doing our best to understand that of others, is a good start.
