Sunday, January 25, 2015

Our health is going downhill: poor public health and poor attention to the social determinants

“Our health is going downhill,” shouts a headline in the Kansas City Star of January 4, 2015. The local angle of this article, by Alan Bavley, was the poor performance of Kansas and Missouri, the two states served by the Star, in the 2014 edition of America’s Health Rankings, published by the United Health Foundation, the nation’s longest-running ranking of public health status, issued annually since 1990. Bavley emphasizes that both states have dropped significantly in those rankings: Kansas was 12th in 1990 and is now 27th; Missouri was 24th in 1990 and is now 36th.

This leads to a lengthy discussion of why both states have dropped, attributed mainly to a lack of investment in public health, and of a geographic disparity, with states on the coasts doing better overall than those in the Midwest: “What explains this dramatic difference between the coasts and the Midwest is broad investments on the coasts in things that make communities healthy,” Bavley quotes Patrick Remington of the University of Wisconsin. What this misses, however, is the even worse news that is hidden by “rankings” data. In rankings of states there will always be a #1 (in this case, Hawaii) and a #50 (you guessed it, Mississippi), but this hides the fact that, overall, states have gotten worse over this 25-year period. The graphs in the print edition of the Star (not included in the on-line edition) show the decline in the rankings noted above for the two states over time. On the America’s Health Rankings website, however, one can not only look at the map showing relative state rankings but also click on each state and see how its absolute health ratings have changed over time.

Hawaii, ranked #1 in 2014 (Vermont is ranked #1 for the whole 25-year period), has nonetheless had its health status drop quite dramatically since 1990, while Mississippi, #50, has actually improved slightly. Locally, Kansas’ health status has dropped significantly, consistent with its slippage in the rankings, but Missouri’s, after a big dip in the intervening years, is about the same as it was in the mid-1990s, despite its lower ranking. How can this happen? How can Missouri drop 12 places in the rankings while its health status stays about the same, even as the top-ranked states get worse? The only explanation is that the gap was even greater in the past, and that some states in the middle, such as Illinois (#30) and Pennsylvania (#28), have gotten better while Missouri has stayed the same. Hawaii has dropped from a rating of +0.7 to +0.3, while Mississippi has gone from -0.4 to -0.3. Dr. Remington’s comments may be accurate, but they were more accurate in 1990; since then, states have seen a race to the middle, if not the bottom, in terms of public health.
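To make the distinction between relative rank and absolute score concrete, here is a minimal sketch in Python with made-up numbers (not the actual America’s Health Rankings data): a state’s rank can slip even while its own score stays flat, so long as states near it in the middle of the pack improve.

```python
# Hypothetical scores only; illustrates rank (relative) vs. rating (absolute).

def rank_of(state, scores):
    """Rank states by score (highest score = rank 1) and return this state's rank."""
    ordered = sorted(scores, key=scores.get, reverse=True)
    return ordered.index(state) + 1

scores_1990 = {"A": 0.70, "B": 0.20, "Missouri-like": 0.15, "C": 0.10, "E": -0.40}
scores_2014 = {"A": 0.30, "B": 0.25, "Missouri-like": 0.15, "C": 0.20, "E": -0.30}

for year, scores in (("1990", scores_1990), ("2014", scores_2014)):
    print(year, "rank:", rank_of("Missouri-like", scores),
          "score:", scores["Missouri-like"])
# 1990 rank: 3 score: 0.15
# 2014 rank: 4 score: 0.15   <- same absolute score, lower rank
```

Note that in this toy example the top state (“A”) falls from 0.70 to 0.30 and the bottom one (“E”) improves from -0.40 to -0.30 while both keep their ranks, the same pattern the Hawaii and Mississippi data show.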

The rankings above are the “all outcomes” rankings from the United Health Foundation studies. They are composed of several subcategories. One component lowering these overall outcomes is the obesity rate, which has risen nationally from 11.6% in 1990 to 29.4% in 2014 (!), as well as in every individual state. Diabetes has risen nationally from 4.4% to 9.6%. Physical inactivity has stayed relatively constant, but distressingly high, at nearly 75%. On the other hand, the last measure, smoking, has gone down nationally from 29.5% to 17.6%, but has tended to stay flat over many years in lower-ranked states such as Mississippi, Missouri, and even Kansas. The study ranks senior health separately, but this tracks pretty well with overall health: Hawaii is the best, Kansas is #25, Missouri is #42, and Kentucky replaces Mississippi (#47) as the worst. The study also examines rankings for a variety of other characteristics, some of which differ between the overall population and seniors. They include chronic drinking (seniors), binge drinking (all adults), depression (seniors), etc., as well as societal measures that might impact or “confound” health status, including education level, percent of “able-bodied” (no disability) adults, and percent of children in poverty.

The study also provides information on health disparities in obesity levels among different sub-populations, broken down by education, race/ethnicity, age, gender, urbanicity, and income. Two non-surprises: the South and South Central regions do the worst, and the problem is greater for those with lower education, non-white race/ethnicity, and lower income; urban status and age have less impact. In terms of the educational impact on health disparity (the difference in health status between the most and least educated), things change: Hawaii is still #1, but Mississippi is #2, while California is #50! Unfortunately, for many of the states with both low overall health status and low disparity, this means that even the better-educated have poor health status.

So what do we learn? Yes, as Dr. Remington points out, some parts of the country generally do better than others (although identifying these as the Northeast, West, and North Central regions is more accurate than saying “the coasts”), and the South and South Central regions tend to be worse. Yes, as Mr. Bavley highlights, both Kansas and Missouri have slipped significantly in the relative rankings. But we also see the whole country getting worse, particularly with regard to conditions such as obesity and diabetes. And we see the most dramatic drops in certain states, not only Kansas but Wisconsin (down from +0.38 to barely positive at all, +0.07). The people interviewed for the Bavley article in Kansas and Missouri, as noted above, cite inadequate, and decreasing, spending on public health as the reason.

That is certainly one of the big reasons, along with a consumer society that encourages consumption of high-calorie, low-nutrition foods. And a car-based society that makes exercise a specialty activity, more available to some than to others, rather than a part of daily life. And a terrible economy in which a shocking number of people don’t have jobs and others have to hold down two or more to make ends meet, leaving little time for exercise. The other huge reason is those “social determinants of health”: the impact of poverty, racism, poor education, and inadequate housing and food. The social structure and social support for the most needy in the US have never been adequate, and they are eroding, more in some states than in others, sometimes on purpose (because of political beliefs) and sometimes through a (possibly) more benign neglect.

Some of this is the chronic problem of public health: its successes are the absence of disease, and thus less obvious. It is easier to feel grateful for treatment of a disease we have contracted than, say (as I have often said before), to be grateful each morning that we don’t have cholera because we have clean water. It is, perhaps for some, easier to think we don’t need to vaccinate our children when the diseases that vaccines prevent are no longer in evidence. But that is a fatally flawed analysis. When a good outcome has resulted from effective preventive efforts, the answer is to keep up those efforts, whether vaccination or public health.

And cutting back on our social safety net is a good prescription for worse health.

Sunday, January 18, 2015

Free speech, religious belief, and facts: how do they affect health?

The massacre at the French magazine Charlie Hebdo was shocking and horrible, as are the massacres and atrocities that occur regularly with less immediacy to those in the West, such as those committed by Boko Haram in Nigeria. The most positive result was the massive outpouring of support for free speech, for being able to say and print what you want even if it offends people. And, I would add, particularly if it offends the powerful, which Charlie Hebdo also did. More than a million people were in the streets of Paris saying “Je suis Charlie” (“I am Charlie”), with more than 40 heads of state in attendance, even if they didn’t actually lead the march but were photographed together on a protected side street. And even if many of them sponsor severe repression of free speech in their home countries.

The inclusion of Israeli Prime Minister Benjamin Netanyahu was particularly problematic given the violently repressive policies of his government, but since the companion attack was on a kosher supermarket where four Jews were killed, the symbolism was important, even if it was a lightning rod for (largely justified) criticism of Israeli government policy. Less appreciated was the message from Netanyahu that French Jews should all come to Israel; more appreciated were the sentiments of French Prime Minister Manuel Valls that ‘France Without Jews Is Not France’, and of the demonstrators, most of whom were not Jewish, who carried signs that said “Je suis juif” (“I am Jewish”).

But the necessary condemnation of terror, and moves to avert it, along with the necessary condemnation of anti-Semitism and of the conflation of Jews with the actions of the government of Israel (or of Islam with the actions of Islamic terrorists), does not solve the problem of communication: that people see “truth” so differently. I don’t know that I can offer much more insight into the conflict between seeing truth through the lens of religious doctrine (and, of course, some people’s and groups’ interpretations of religious doctrine) and a “liberal” concept of the value of free speech. I was interested in the perspective, expressed on NPR’s Fresh Air, of Maajid Nawaz, a British Muslim who became a radical Islamist at 16, served 4 years in an Egyptian jail where his reading changed his outlook, and later founded Quilliam, an anti-jihadist think tank in London. Asked by host Terry Gross how he saw himself as the same person, given his loss of relationships, including family and friends, since his “conversion,” Nawaz spoke about commitment to justice. He said it was the blatantly unjust treatment of Muslims that motivated him to fight as an Islamist, and the same commitment to justice that now makes him oppose terrorism. Ideologically, I think that this is a good start.

Most countries, including France and the US, have a mixed relationship with free speech. In the US (which I know much better), many people support free speech only for positions that they agree with, or at most for positions that they can tolerate listening to. True support for free speech, of course, means support for speech you abhor, hate, despise, think dangerous. That is not the same as supporting action (“your free speech stops just short of my nose”), but it certainly includes free assembly and demonstrations to express views. If one’s religious views include opposing anyone’s right to criticize your religion (or, even more, as illustrated by the Inquisition or ISIL’s massacres of Yazidis, anyone’s right not to adopt your religion), you are clearly endorsing a society antithetical to free speech. And, of course, with the grossly immoral series of US Supreme Court decisions holding that money is speech and that corporations are people who can exercise that “speech,” the entire concept of free speech in our country is perverted.

Closer to home, and closer to the usual themes of this blog, health and social justice, we see again how beliefs not only threaten free speech but threaten our ability to act as an honorable and just society, because groups of people see things so differently. The reasons given are many: our social isolation from groups of people unlike us (residential segregation by race and class and age and educational level), and our ability to receive “customized” news, where what we watch on TV or find on the Internet is whatever agrees with what we already believe. When people hold views based on their faith, it may be difficult, or even unreasonable, to expect them to change; this is what “faith” is. However, when people hold views that are not religious and are demonstrably wrong in the face of the facts, and those beliefs are held as firmly as religious ones, and those beliefs threaten the core well-being of other parts of our society, we would hope that they could change.

I have often written about the Social Determinants of Health. These are the conditions of people’s lives that make them more vulnerable to illness and less able to prevent it, whether through health screening or through living in places and circumstances in which prevention is possible: not near areas of high pollution, not in poor-quality, cold housing, not without housing at all; with shelter, decent food, and the opportunity for education for themselves and their children. All the things that characterize their lives before their access, or lack of access, to the health system even comes into play. If we are to improve the health of the American people, we must not only provide equitable access to health care geographically, financially, and socially (with language access and caring and actual interest in people’s health) but also address those social determinants that disadvantage so many in the pursuit of their health.
 
And then I read the results of a survey by the Pew Research Center that says a majority of well-to-do Americans think that poor people “have it easy.” It was widely reported, including by the Washington Post, which leads with “There is little empathy at the top,” and CNN, which reports that “54% of those with the greatest financial security believe that ‘poor people today have it easy because they can get government benefits without doing anything in return’…Only 36% of the wealthiest say ‘poor people have hard lives because government benefits don't go far enough to help them live decently.’” I want to say this is unbelievable, but I have to believe it is true that they think this. I am, nonetheless, aghast that they could think it. What world do they live in? Is it really true that their only contact with poor people is on TV news, Fox News at that? Have it easy?

Would they want to test that? Live like poor people for a while? Even knowing that, unlike real poor people, they could return to their comfort in a month or a week, would they be able to tolerate it? Not being able to pay their bills, not having heat, not having decent or sufficient food, not being able to afford the doctor, not being able to take time off work without losing pay to see one even if they had health insurance? I think, I know, that if they did, they would feel differently about it being easy to be poor. But while there is great value in “walking a mile in someone else’s shoes,” there is a way to know what is going on without even doing that. It is called opening your eyes and looking at the facts.

Even when the facts are uncomfortable, even when they challenge your beliefs or, more importantly, your sense of self-entitled comfort. Refusing to look is part of no one’s religion. Looking is the responsibility of free people.

Sunday, January 11, 2015

Belief vs. "truth": how people often make medical decisions

In a fascinating article in the “Medicine and Society” section of the New England Journal of Medicine, “Beyond belief—how people feel about taking medications for heart disease,”[1] Lisa Rosenbaum discusses some of the reasons that people do not take medicines prescribed for them by doctors, for any condition, not just heart disease. These reasons go beyond the obvious ones of personally experiencing side effects and not being able to afford the drugs; indeed, she starts out discussing the fact that people don’t take aspirin, a very cheap drug, even after having been diagnosed with coronary heart disease, for which the evidence of benefit is very strong.

Rosenbaum addresses a number of reasons, beginning with simple belief. A friend tells her that “My parents [whom Rosenbaum describes as “brilliant and worldly”] are totally against taking any medication”. Another person she meets, prescribed a “statin” (an anti-cholesterol drug), has no intention of taking it and indeed expresses disdain that is “raw and bitter” (the disdain, not the pill). For him, it is tied to the suffering he saw his sister endure when taking toxic anti-cancer drugs. Her hairdresser suggests another reason: taking medication means acknowledging that you are sick, and people don’t want to acknowledge that. He says that he gives his grandmother her nightly medication by telling her they are vitamins—after all, vitamins are to make you healthier, not treat your sickness.

Rosenbaum tells more stories, relating more reasons, but most come down to a belief, almost to an unchangeable worldview. Some of the issues seem to be semantic. People do not want to take “chemicals”, but will take vitamins. Connotation, and the “frame” that people put around words and concepts (sickness, drugs, natural, chemical, etc.) are very important. Of course, they’re all chemicals, and of course anything (“natural” or produced in a laboratory) that can have a biologic effect (good or bad) can have other effects (good or bad).  People sometimes cite the side effects of drugs even when they haven’t experienced them but have read or heard about them, and credit them with more importance than the beneficial effects. While some people have always made decisions based on creating a parallel to what happened to someone they know, the Internet has probably magnified the universe of people they “know” and stories that they “hear”.

Perhaps the scariest reason Rosenbaum points out is that the success of medical treatment has, in some cases, led people to minimize the seriousness of the disease. As a cardiologist, she points to acute myocardial infarction (heart attack), which used to require 4-6 weeks of hospitalization and now often has people out of the hospital in 24 hours. She talks to a person who contrasts it to the flu, which “can knock you down for days or a week or two, [while] the heart attack, once they do the thing, you’re in good shape.” And yet, once “they do the thing,” whatever it is (stents or clot lysis, though presumably not bypass, which still requires a longer hospitalization), and you feel better, you still have the disease; only the use of certain drugs, along with diet and lifestyle changes, can modify its trajectory. But the latter are hard, and maybe we don’t want to take drugs. Because, you know, we are feeling better.

I admit to initially feeling anger, hostility, as I read the “reasons” that these people would not take medicine, feeling that they were being stupid. I don’t mean that I was angry that they don’t take medicine; that is their decision. Moreover, there are lots of important reasons to be wary of taking medicines that go beyond personal experience with side effects. Not the least of these is the fact that they are heavily marketed by drug manufacturers, who are in business solely to make a profit and regularly invent new “diseases” that “need” treatment in order to market their drugs and make money. In addition, there is “indication creep,” which I have discussed before (The cost of health care: Prevention and Indication “creep”, drugs, and the Sanders plan, June 25, 2011, particularly citing a piece by Djulbegovic and Paul, “From efficacy to effectiveness in the face of uncertainty: indication creep and prevention creep”).[2] This means that a drug found to be effective and relatively safe for a certain condition, at a certain severity level, in certain people, starts to be used by physicians (often encouraged by the manufacturers) in other people, with less severe forms of the condition, and sometimes for other indications for which efficacy has not been proven: for example, starting cholesterol drugs at levels below those at which treatment has been shown to reduce mortality, or putting younger (or older) people on treatments shown to benefit only older (or younger) people, or only men or women.

Indeed, this appeals to another system of beliefs common in people (including doctors): that if a little is good, more is better. If reducing cholesterol in people whose level is above “X” is good, why not in people whose cholesterol is a little below “X”? If getting your average blood sugar below “Y” is good, why not a little lower still? If aspirin is good prevention and reduces death in men who have coronary heart disease, why not use it in men who don’t, but otherwise look a lot like men who do? This sort of belief may lead to behavior opposite to that described by Rosenbaum (taking medication that is not of value rather than not taking medication that is likely to be of value), but it stems from the same root: making decisions based on beliefs rather than evidence. And it is not uncommon to see both behaviors manifested in the same people: someone who would “never” take “artificial chemicals” (regulated drugs) into their body but who ingests large amounts of unregulated chemicals (labeled as “natural”). The apparent contradiction is non-rational to me but makes sense to them.

I often—maybe usually—agree with those who say “less is better”, such as Ezekiel Emanuel in his New York Times op-ed “Skip your annual physical”.[3] But I hope that I do this when, as in the case of the annual physical, the evidence does not demonstrate benefit, and the cost is high, as it is for many heavily-marketed drugs. And, of course, my anger subsides as I realize that I often feel the same things, and maybe even sometimes act on them. I don’t want to be a sick person, certainly not one with a chronic disease (it’s bad enough to have the flu!) and taking a medicine for a condition labels me as such. I don’t want to take medicines just because they “might” help (prescription or over-the-counter, made by traditional pharmaceutical manufacturers or “natural” companies) if there is not good evidence, and I don’t want to experience unpleasant side effects. But I do take the medicines that have been shown to benefit people like me, with the same or similar risk factors, and even put up with some side effects (e.g., mild myopathy from the statin).

I am not going to change anyone’s worldview, any more than Dr. Rosenbaum is likely to change that of her friend’s “brilliant and worldly” parents. And I am certainly not going to become an advocate for treating for the sake of treatment, or a flak for drug companies. But if there is strong evidence that taking a drug (in the lowest effective dose) for a condition that I in fact have (denial or not) is likely to have a “patient-important” outcome (meaning a lower risk of premature death or a better quality of life), and I personally do not experience serious side effects, I will take the drug.

The key issue here is not to decide to do, or not do, something (have a physical, take a drug) because of a general belief that such things are good or bad for you, but rather to evaluate the evidence of how it might benefit or harm you, and to make a decision that balances these, filtered through your own value system: how much you value the potential benefit or harm that might come.

To me, this is a rational approach.





[1] Rosenbaum L, “Beyond belief—how people feel about taking medications for heart disease”, N Engl J Med 2015;372(2):183-87.
[2] Djulbegovic B, Paul A, “From efficacy to effectiveness in the face of uncertainty: indication creep and prevention creep”, JAMA 2011;305(19):2005-6.
[3] Emanuel E, “Skip your annual physical”, New York Times, January 9, 2015.

Thursday, January 1, 2015

Direct Primary Care, Scope of Practice, and the Health of the People

One of the relatively new and growing movements in family medicine is “direct primary care,” or DPC. The term seems to have a lot of different meanings, depending upon who is talking about it (or, often, it is talked about in very vague terms, as are many things we want people to think about only in positive ways; if we get too specific, people can criticize!). In general, however, it is about primary care doctors taking direct payment from patients for their services rather than getting reimbursed by insurers (including Medicare and Medicaid). This is touted as a panacea for doctors tired of “bureaucracy” (often referring to the “government,” though insurance companies are certainly at least as painful); of too many forms to fill out and rules to follow and loss of autonomy. The primary care doctor provides the services that s/he is capable of and the patient pays, just like in the old days (maybe barter is included, though I don’t know about paying in chickens; on a visit to the vet the other day I saw an old sign on the wall advertising a vet’s services, accepting both cash and barter, but no poultry).

There is a certain attraction to the simplicity of this arrangement. The doctor provides the services that s/he can provide (presumably not including most laboratory tests or medicines or immunizations) for a fee that is collected in cash. The patient can even apply to their insurance company for reimbursement. Voilà! Everyone is happy! The patient gets the service, the doctor does what s/he likes to do, and is freed from bureaucratic regulations and thus can operate his/her business more efficiently and with lower overhead, presumably (this is not always explicit) passing the savings on to the patient. But there are a few concerns.

The first, obviously, involves people who are too poor to pay. This may not concern some DPC doctors, but it does concern others, and it should concern our society as a whole. We know these people; we see them regularly in our student-run free clinic (except there they do not pay anything). I have pointed out that this need not be a problem; one of the advantages of not taking insurance is that the doctor is free to charge different people different amounts. The Centers for Medicare and Medicaid Services (CMS) requires physicians who accept Medicare not to charge anyone less than the amount they charge Medicare (not the amount Medicare actually pays). Not accepting Medicare means a doctor could charge a well-heeled person $100, and a poorer one $25 for the same service. Or $5. Or a chicken. Or nothing. And those people with Medicare (or another insurer) could still submit a request for reimbursement for what they actually paid. I don’t know whether they would be reimbursed. And it might be tough for the senior who can barely accomplish their basic functions to submit directly to Medicare. It all depends, as I pointed out to a colleague considering such a practice, on how much you want to make. If you are willing to make less, you can charge people less. I have no idea how many of the physicians currently practicing or planning to practice DPC are charging such a sliding scale, or taking all comers, or are willing to earn less. But it is at least theoretically possible to do this.

A second concern is: what is the scope of care provided by the DPC provider? Discussions of DPC sometimes seem to focus on treating colds, high blood pressure, sprains, etc., all the things that are currently taken care of by the increasingly common Urgent Care Centers in drug stores and big-box stores. Many of these are problems that do not require seeing a provider at all (your mother can tell you to drink plenty of fluids, rest, and eat chicken soup; perhaps a better use for that chicken than paying the doctor!). Otherwise, it is not clear what advantages DPC offers over Urgent Care Centers, except that the latter are often staffed by Nurse Practitioners, not physicians. If you care. If the services being offered are within the scope of practice of the provider, what difference does it make? And the Urgent Care Center will take your insurance, not a small matter when it comes to the cost of immunizations, for example.

Clearly, this DPC model cannot work for problems that need to be cared for in the hospital or that require facilities. The doctor cannot be DPC-only for the outpatient practice while billing insurance for inpatient care, so s/he won’t provide it. Or, probably, deliver babies. Or provide anything beyond the simplest of office-based procedures, including the critical ones of long-acting reversible contraception (LARC), IUDs and implants, which have up-front costs too high for any but quite well-to-do patients. Again, it is getting hard to see the benefit of DPC over Urgent Care, except, possibly, the continuity of care with the same provider. Unless, of course, you need something that cannot be done in the office. Metaphors abound; one DPC provider is quoted as saying, “you don’t use auto insurance to buy your gas; why should you use health insurance to buy primary care?” I leave that question up to you, including whether the metaphor is apt. It clearly, however, minimizes the scope of what primary care doctors can do.

This is a potential challenge for family medicine and other primary care providers, especially as family medicine moves into its “Health is Primary: Family Medicine for America’s Health”[1] campaign. For a long time, other specialists have derided primary care for only taking care of simple problems. Many, including me, have argued the contrary, that primary care is difficult and complex (see, for example, my 2009 blog post “Uncomplicated Primary Care” and my recent Graham Center One-Pager “Accounting for Complexity: Aligning Current Payment Models with the Breadth of Care by Different Specialties”[2]), but quotes like the one above seem to indicate a retrenchment away from “full-scope” practice. Obviously, like DPC, “full-scope” can be defined in various ways, but it usually means things like caring for people in the hospital (another thing I have argued is a strength of US family medicine), delivering babies, caring for children, doing a variety of procedures, and even caring for people in intensive care. At the recent North American Primary Care Research Group (NAPCRG) meeting, several papers from the American Board of Family Medicine (ABFM) and the Graham Center indicated that in most cases a greater scope of practice among family physicians led to lower cost. The ABFM developed a 0-30 scale for scope of practice and found significantly lower costs for patients cared for by FPs with scores of 15-16 than for those cared for by FPs with scores of 12-13 (a relatively small difference in scores), presumably because those with a narrower scope of practice refer more to higher-cost specialists. The interesting exception was integrated systems (like Kaiser), where the scores for FPs were low (~11.5) but costs were also low, a result of the other surrounding services available to patients in those systems. These would not be characteristic of small DPC practices.

Finally, there is the concern about “who is health care for?” Much of the interest in DPC among residents, it seems, is to make their own lives less stressed, less busy, less frustrating. Not bad things. But the ultimate and only real measure of whether our society should embrace such a trend is whether it enhances the health of our people. All our people. Rich and poor. Rural and urban. White, Black, Asian, Hispanic. Over 150 years ago, Rudolf Virchow (the Father of Social Medicine) wrote “Medical education does not exist to provide students with a way of making a living, but to ensure the health of the community.… If medicine is really to accomplish its great task, it must intervene in political and social life.”

I hope that we still believe this to be true.

Happy New Year!




[1] Phillips RL, et al., “Health is Primary: Family Medicine for America’s Health”, Ann Fam Med 2014;12(Suppl 1):S1-S12.
[2] Freeman J, Petterson S, Bazemore A, “Accounting for Complexity: Aligning current payment models with the breadth of care by different specialties”, Am Fam Physician 2014;90(11):790.

Thursday, November 27, 2014

Giving Thanks in a scary world

Let us give thanks.

Let us give thanks that we are not the parents of Michael Brown. One of the more thoughtful and moving pieces on this subject among the thousands to appear is by Charles Blow, “Fury After Ferguson.”

Let us give thanks, if we do not live in Missouri, that we won’t see the St. Louis County District Attorney running for Governor. Or, if we do, that we can vote against him.

Let us give thanks that we are not in prison, victims of the four-decade-old policy of mass incarceration in the US, addressed as a major public health epidemic by the New York Times in “Mass Imprisonment and Public Health,” which details the reasons why

…people in prison are among the unhealthiest members of society. Most come from impoverished communities where chronic and infectious diseases, drug abuse and other physical and mental stressors are present at much higher rates than in the general population. Health care in those communities also tends to be poor or nonexistent.

The experience of being locked up — which often involves dangerous overcrowding and inconsistent or inadequate health care — exacerbates these problems, or creates new ones. Worse, the criminal justice system has to absorb more of the mentally ill and the addicted. The collapse of institutional psychiatric care and the surge of punitive drug laws have sent millions of people to prison, where they rarely if ever get the care they need. Severe mental illness is two to four times as common in prison as on the outside, while more than two-thirds of inmates have a substance abuse problem, compared with about 9 percent of the general public.
Common prison-management tactics can also turn even relatively healthy inmates against themselves. Studies have found that people held in solitary confinement are up to seven times more likely than other inmates to harm themselves or attempt suicide.

The report also highlights the “contagious” health effects of incarceration on the already unstable communities most of the 700,000 inmates released each year will return to. When swaths of young, mostly minority men are put behind bars, families are ripped apart, children grow up fatherless, and poverty and homelessness increase. Today 2.7 million children have a parent in prison, which increases their own risk of incarceration down the road.

Oh, yes. Or their children.

Most of us are not in prison. Some of us are. It is simply not ok. And it is not ok to be selfish, arrogant, so-greedy-it-is-not-to-be-believed multi-billionaires. Be successful, yes. Be rich, yes. But do not be so obscenely wealthy that it requires the destruction of the lives of millions of others.

Blow notes that
Even long-suffering people will not suffer forever. Patience expires. The heart can be broken only so many times before peace is broken. And the absence of peace doesn’t predicate the presence of violence. It does, however, demand the troubling of the comfortable.

Nick Hanauer, a multi-billionaire, is less sanguine. He warns his fellow 0.01%ers in a post on Politico.com, “The Pitchforks Are Coming… For Us Plutocrats.” It’s a nice thought, that they would get what is coming to them, but I am less than confident that he is correct. It is a nice thought for Thanksgiving, though.

If we have jobs, let us be thankful. If, even better, they are good jobs, let us be more thankful.

If we have family, let us be thankful. If we have lost family, let us be thankful for the time that we had them. If we can still imagine a world with peace and justice, let us be thankful, although it may be just in our imagination.

And then, let us take a deep breath and realize that it is not just going to come, that we are going to have to work for it. Hard, and tirelessly.

Happy Thanksgiving.

Monday, November 17, 2014

Racism, classism, and who we take into medical school: Who will care for the people?

I work in a medical school. I see and teach medical students. They are a smart group. When measured by grades and scores on standardized exams, they are even smarter. Some of them, but not nearly enough, are members of socioeconomic and ethnic groups, or come from geographic areas, under-represented in medicine. Sometimes these students struggle with grades in medical school. Occasionally this elicits comments, sometimes smug, sometimes rueful, that this is the result of affirmative action, as if that were a negative thing. Given the alternative, the default of taking all people who look alike, who come from the same background, who want to do the same things (in brief, to stereotype: white 22-year-old men from economically privileged and professional families, many of them medical, who want to be subspecialists in the suburbs), this is pretty scary.

It is affectively, intellectually, and morally scary, yes, to think that we could accept this kind of regression to an archaic, not to say racist and classist, past in which becoming a doctor was a privilege limited to only a few. It is also scary in very practical terms, because the people who need health care the most are those least likely to be served by the “default” group. Indeed, in fulfilling their personal goals, the result will be to “serve” already overserved communities, largely in specialties already in oversupply. There are good data showing that students from rural areas are more likely to serve rural communities, that students from underrepresented minority groups are more likely to serve members of those groups, and that students from less-privileged backgrounds are more likely to serve needier communities. And all these groups are more likely to enter primary care specialties, those in shortest supply. This is what we want. But they represent a small percentage of our medical students. Why? Because we still, despite all the data showing what predicts service to the people most in need, stay wedded to incorrect and outdated ideas of what makes an applicant “qualified” for medical school, ideas that overwhelmingly bring us the same old same old.

Many (although clearly, given the above, not most) medical students, from all backgrounds, have some difficulty with the first two years of medical school despite being not only smart but also well educated at top small liberal arts colleges. There is a relationship here: these colleges emphasize thinking and creativity and problem solving, exactly the skills needed to be an effective physician. They teach largely in small and interactive classes, fostering self-confidence and independence and thoughtfulness and sometimes non-conformity, exactly the temperament needed for an effective physician. They grade largely on the basis of essay tests, requiring integration of information, literacy, and a demonstrated ability to think, not on multiple-choice tests; just what we want from physicians. Unfortunately, this is not the best preparation for the first two years of medical school, which overwhelmingly consist of large lectures characterized by the presentation of a huge number of facts and designed to reward memorization of those facts on massive multiple-choice tests. Good preparation for this: being a science major at a large university whose courses overwhelmingly consisted of large lectures characterized by a litany of factoids and which rewarded successful regurgitation of those factoids on massive multiple-choice tests. QED.

Not, of course, the best preparation for being a curious, open-minded, thinking, problem-solving doctor. But this is what we get. Yes, it is certainly true that some of our students from large universities, or from professional or high-socioeconomic-status families, or majority ethnic groups, or suburbs, or all of these, are incredibly committed to making a difference. Many want to enter primary care; many more want to serve humanity’s neediest, in our country and abroad. They are humble, and caring, and smart. We are lucky to have them in our schools and entering medicine. But they, along with those who are from less-well-off families, and ethnic minority groups, and rural communities, remain a minority among all the sameness, and remain in more or less the same proportions over time. We continue to do the same thing, and have the audacity to wonder why we do not get different results. This is Einstein’s definition of insanity.

On November 16, 2014, Nicholas Kristof published his column “When Whites Just Don’t Get It, Part IV” in the New York Times. He discusses the continuing racism in this country, the legacy of slavery, the fact that “For example, counties in America that had a higher proportion of slaves in 1860 are still more unequal today, according to a scholarly paper published in 2010.” And, of course, he discusses the responses he received (from white people) to Parts I-III: it is all in the past, stop beating that drum, it is not my fault, I work hard and don’t get the special privileges that “they” do, why don’t they take personal responsibility, and our President is Black, isn’t that proof that the problem is gone? I won’t begin to get into the question of how many of the vicious attacks on our President are in fact a result of his being Black; rather, while I observe that his election says “Yes, we have made incredible progress,” I note that this does not eliminate “Yes, we still have lots of racism, and it has major negative effects on people as individuals and on society as a whole.”

Kristof talks about the fact that he and his Times colleague, Charles Blow, are both promoting books. He notes that while he (Kristof) is white and from a middle-class background, Blow is black and was raised largely in poverty by a single mother. But he also makes clear that this doesn’t prove that the playing field is even, but rather that Blow was very talented, very hard working, and also lucky. That some members of minority groups, or people with very disadvantaged backgrounds (or both), succeed is a testimony to them, to their drive and intelligence and talent and luck, and to the support that they have had from others, such as family or friends, which, while obviously not financial, was significant. It absolutely doesn’t prove that those from such backgrounds who have not succeeded are at fault. Indeed, the converse is true: how many of those who are from well-to-do, educated, privileged, and white backgrounds, who have had all the financial and educational supports all their lives, and who are now in medical school or doctors or professors or leaders of industry would have gotten there if they had started as far down the ladder as, say, Charles Blow, or some of our medical students? Some, for sure, but not most. They are folks born on second, or even third, base, who make it home and look at those who started at home plate and made it around all the bases, and ask, “why can’t they all do that?” Most of you, had you started in the same place, would, like most of those who actually did start there, never have had a prayer.

It is common for classes of medical students to develop a “personality”, more self-centered or more volunteering, more intellectually curious or more grinding, more open or more closed. I suspect that this probably has to do with a few highly visible people, because most of the students don’t vary that much. I have heard faculty complain about the inappropriate behavior, the lack of professionalism (especially when they get to the parts of school that involve caring for patients), the sense of “entitlement” that many students have. But this is not true (overwhelmingly) of those who are the first in their families to go to college, who are grateful for the opportunity and hard-working, and committed to making a difference in the world. If we think that entitled, unprofessional students are not desirable, why are we accepting those who fit that mold?


We can do better. We can scale up programs to accept caring, humble, committed, smart people instead of self-centered, arrogant, and entitled ones. Indeed, if we hope to improve the health of our people, we must.

Sunday, November 9, 2014

Uber, pricey doughnuts, and health care: serving the needs of the people or the interests of the rich and powerful?

Two articles in the Sunday Review of the New York Times on November 9, 2014, that are not explicitly about health care seem to me to be very much related to the health care system in the US. “Republicans and the puzzle of Uber,” by Josh Barro, discusses the conflicting interests that affect policy making, particularly at the state level, and that create an ideological challenge for that party. On one side, the libertarian wing of the party lauds “the smartphone based car service” Uber as a wonderful example of deregulation, of opening the market to new ideas that nimbly serve the consumer and meet a real need. On the other side are the existing large and small businesses whose owners not only vote Republican but contribute money to Republican coffers, and who want their interests protected. In the case of Uber, it is licensed taxi owners, but as Mr. Barro makes clear, this extends to many other businesses whose profit margins are protected by legal regulations.

Examples that Mr. Barro cites include everything from the licensing of interior designers, auctioneers, and ballroom dance studio owners in Florida (run by Republicans) to limiting the sale of coffins to funeral homes (in Oklahoma, also very “red”). He notes that this also occurs with very large businesses at the federal level, citing the controversy about the Export-Import Bank, which protects big companies in the US but is seen as anti-competitive by some in Congress. Other examples that he does not mention include opposition by local restaurants to the presence of food trucks, and “blue laws” in some states requiring car dealerships to be closed on Sunday (hey, if it were legal someone would open, and then I’d have to open too to stay competitive, and I don’t want to work Sundays!).

What does this have to do with the health system? A lot, in a lot of areas, but one of great interest to me is the recent initiative, “Family Medicine for America’s Health,” begun by a collaboration of all of the major family medicine organizations and now including osteopathic groups. This effort, with the tag line “Health is Primary,” is good and important, calling attention to the fact (and it is fact) that the creation of a cost-effective health system that delivers high-quality care depends upon a strong primary care base (discussed, with evidence presented, many times in this blog). It also emphasizes that family doctors are the central specialty in primary care, given the near abandonment of general medicine by internal medicine graduates. The argument is articulately made in a recent article in the Annals of Family Medicine, “Health Is Primary: Family Medicine for America’s Health” (ironically called, internally, the “über article,” as it will be succeeded by other articles addressing components of the problem).

However, there has been less-than-sweeping coverage in the media, and a less-than-enthusiastic reception by other groups in the medical establishment. A generally positive article in Kaiser Health News by Lisa Gillespie on October 24, 2014, “Family doctors push for a bigger piece of the health care pie,” quotes Atul Grover MD, chief public policy officer of the Association of American Medical Colleges (AAMC), who says that “while primary care is important, taking funding away from specialty training isn't necessarily a solution because an aging population will need more specialty care.” This may or may not be true; we need as much training in each specialty as we actually need, not more or less, and it is almost certainly true that we need more in primary care and less in some others. But it reflects Grover’s (and the AAMC’s) role in representing the interests of our academic health centers and all of their components, even when this may not be in the best interests of the health of the American people. Just like the Republican Party, the AAMC has constituents that reflect different interests.

Thus, there is some irony to another quotation from Grover, that “It’s always a question of what motivates groups to do these kind of campaigns — is it looking out for patients or your own interests, and generally it’s a combination of both,” because this is exactly the position the AAMC is in. However, it is a real caution for the family medicine organizations who are working on “Family Medicine for America’s Health”: to the extent that this campaign keeps to the high ground of America’s health (as it generally is, notably in the Annals article) it deserves strong support. To the extent that the self-interest of family doctors is, or is seen to be, the major driver of the campaign, we risk being lumped with other “special interests”: we could become the funeral homes in Oklahoma selling coffins, or at least the AAMC.

The other NY Times article on November 9, 2014, is from Margaret Sullivan, the Times’ “Public Editor.” “Pricey doughnuts, pricier homes, priced-out readers” addresses common complaints from readers that the Times, not only in its advertising but in its articles, seems to be addressing an incredibly wealthy crowd. Anyone who reads the paper is struck by the inaccessibility of the homes featured, often costing not just millions but tens of millions of dollars, the ubiquity of ads for $10,000+ watches, and the articles as well as ads for the highest-end consumer items ($160 flashlights and doughnuts costing $20 for a half-dozen). Sullivan notes that these may seem “aimed at hedge fund managers, if not Russian oligarchs.” She quotes Times executive editor Dean Baquet who, adding insult to injury, says of Times readers, “I think we have as many college professors as Wall St. bankers.” This is a double insult: first, there are way more college professors than Wall St. bankers, and second, the idea that college professors are the economic “low end” is amazing.

Ms. Sullivan’s article cites mixed reviews of the extent to which the Times covers poverty (the Pew Research Center says 1% of page 1 articles), but it is clear that appealing to the middle class is missing from the Times. Baquet talks about “balance” as if it were reasonable to balance coverage of issues relevant to the 0.01% with those of the 1%, or even only the wealthiest 10% of Americans, with only an occasional piece addressing the world the rest of the nation lives in. This, of course, is what parallels the health care system.

Our hospitals seek to attract well-off and well-insured clients, “balancing” them with poor people. But there are way more poor people, and they tend to be sicker and need more care, so justice and equity demand that much, much more care and attention be allocated to them than to the wealthy. If the Times makes money from advertisers who want to reach the wealthiest customers, our hospitals are interested in pleasing their wealthiest customers (oh, I mean patients) in hopes of getting big donations. And those donations are almost never used to provide necessary health care for the sickest and poorest, but rather to open new units (adorned with the donors’ names) to recruit yet more well-off patients. Both our health care institutions and the NY Times are about augmenting their income rather than meeting people’s needs.

Ms. Sullivan ends with “In the end, the upscale doughnut and the penthouse apartment — lofty as they may be — have nothing to do with The Times’s highest purpose.”  Good for her. Maybe Mr. Baquet will get the message, but I doubt it. At bottom, however, if the “balance” of whose interests are addressed by New York Times articles seems off, or offends you, or doesn’t meet your needs, you can read your local paper.

If the balance of who our health care system cares for is way off, we have to work to change it.

Sunday, October 19, 2014

Ebola, fate, and appropriately assessing risk.


There has been a lot written about Ebola lately, and lots of talk about it, and fear about it, in the halls of the hospital and clinics where I work, and, I would guess, lots of other places as well. I don’t have any expertise in Ebola, and don’t claim to know what should have been done, or what “we” should be doing going forward, but it is clear that there have been mistakes, or at least major miscalculations, made by the WHO, the CDC, and other government agencies. Some of this may be the result of cuts in funding over many years, some the result of an emphasis on bioterrorism rather than on infectious agents that get transmitted the regular way (“Failures of Competence,” Joe Nocera, October 18, 2014), but the episode has been both sobering and a vehicle for opponents of the administration to attack it. Of course, the attacks may be justified, but there is no reason to think a previous administration would have done better.

First, it is critical to point out that the real problem, suffering, and crisis is in West Africa, in Liberia, Sierra Leone, and Guinea, countries with little infrastructure and few resources and, in the case of the first two, relatively recent histories of devastating civil wars. The concern about Ebola in the US (so far two home-grown cases, both in nurses who cared for the Liberian man in Texas) needs to be seen in that context. The countries that are severely affected need major resources, both human and financial, and need them fast. An excellent video analysis of the issues was recently done by Laurie Garrett in a webinar called “The Ebola crisis: the best and worst case scenarios from here.” Thankfully, there is some recognition of the need in these countries even among those who are seeking to make political hay of this crisis; my own Senator, Jerry Moran, has joined others, particularly other Republicans, in calling for closing off flights from the affected countries, but proposes to exempt health workers. Of course, this misses the fact that it is easier to screen people coming directly from those countries than people who first travel to other countries and then fly to the US.

What does interest me about the whole discussion in this country around Ebola is the degree to which it illustrates two common flaws in the way people think about problems. One, obviously at play in the case of Ebola, is the fear of the new, unfamiliar, and scary, especially when hyped by the media. Thus, for example, the reluctance of patients to come to our clinic, and of staff (encouraged by their families) to come to work there, during the period that a patient at our hospital was being ruled out for Ebola (he didn’t end up having it), even though the clinic is in a separate building and the patient was three layers of isolation deep. This fear is stoked by events such as the revelation that the second nurse to come down with Ebola had been allowed to fly from Texas to Ohio and back on a commercial airliner (although it could just as well be cited as evidence that anyone, working in a hospital or not, might be at risk).

The second is that people often find it easier to worry about, to get worked up about, problems that they are at low risk for but that they cannot do anything about, even when they are not doing what they could do to prevent problems for which they are at much greater risk. I have written in the past about a prototypical patient obsessed by breast cancer, a condition for which she was in fact at no increased risk, who was not doing anything about problems she could act on such as uncontrolled hypertension, cigarette smoking, and unprotected sex with multiple partners. Indeed, my point was that if she was worried about those last three, people would expect her to do something about them since she could – she could take blood pressure medication, stop or cut down on her smoking, and use protection when having sex. But those might be hard. Worrying about breast cancer, something that there was nothing she herself could do to prevent (doctors could order mammograms, at too young an age and far too frequently, but she wouldn’t have to do anything) was, in this sense, easier. Both of these logical flaws were highlighted by comments from the chief medical officer of my hospital, early in the isolation of the possible Ebola patient: “If 20,000 people were dying of Ebola there would be riots in the streets. But every year an average of 22,000 Americans die of influenza, and people still don’t get their flu shots”.
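To put rough numbers on that comparison, here is a back-of-the-envelope sketch. The flu figure is the 22,000 annual deaths quoted by the CMO above, the Ebola figure is the two US-acquired cases mentioned earlier, and the US population is my own approximate assumption; the point is only the orders of magnitude.

```python
# Rough risk comparison; population figure is an assumption (~2014 estimate).
US_POPULATION = 316_000_000        # assumed, approximate
FLU_DEATHS_PER_YEAR = 22_000       # average cited by the hospital's CMO
US_ACQUIRED_EBOLA_CASES = 2        # the two Texas nurses

flu_death_risk = FLU_DEATHS_PER_YEAR / US_POPULATION
ebola_case_risk = US_ACQUIRED_EBOLA_CASES / US_POPULATION

print(f"Annual flu death risk:   about 1 in {round(1 / flu_death_risk):,}")
print(f"US-acquired Ebola cases: about 1 in {round(1 / ebola_case_risk):,}")
# Annual flu death risk:   about 1 in 14,364
# US-acquired Ebola cases: about 1 in 158,000,000
```

Even with these crude numbers, the risk people could act on (and largely don't, by skipping the flu shot) dwarfs the one they were panicking about.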

This selective concern is a form of determinism, the topic addressed by Konika Banerjee and Paul Bloom in “Does everything happen for a reason?” in the NY Times, October 19, 2014. They discuss the idea of fate: that things that happen were destined to happen, that experiences of adversity which coincidentally lead to positive outcomes (the man hospitalized for injuries from the 2013 Boston Marathon bombing who falls in love with and marries his nurse) are “meant to be.” They note that while this is most common in people who are religious and believe that God determines everything, it is still a very common belief among atheists. It is an attractive idea, but it is a misreading of chance. That is, people pay attention when things seem to happen fortuitously, or when a coincidence facilitates something they wanted to happen. (E.g., yesterday morning I heard part of a 1981 song on the oldies station but couldn’t remember its name; amazingly, when I turned the station back on in the evening, it was playing again! Fated? No, just Rick Springfield’s “Jessie’s Girl.”) We forget how often things do not happen, but remember when they do. Banerjee and Bloom write:
Not everyone would go as far as the atheist Richard Dawkins, who has written that the universe exhibits “precisely the properties we should expect if there is, at bottom, no design, no purpose, no evil, and no good, nothing but blind, pitiless indifference.”
Deists, like Thomas Jefferson (as I heard yesterday, also on the radio, but NPR this time), believed that the world was so ordered that there must be a creator, although they rejected the detailed instructions that many of their contemporaries took from the Bible or other religious texts. However, the random nature of events, as suggested by Dawkins or by Stephen Jay Gould (“Full House: The Spread of Excellence from Plato to Darwin”), accounts for these outcomes just as well.

The same issue of the Times contains a more medically related piece, “Why doctors need stories”, by Peter D. Kramer. I like stories, and I use them a lot (see above for a couple); they make things come alive, tie abstract events to actual lives, and create, in the experience of individual people, examples of phenomena that are harder to understand when we look only at populations. But they can be misused. My story about the woman who was more worried about breast cancer than about her smoking or high blood pressure is meant to be an example of how people can choose which facts they believe and which they ignore. It doesn’t prove anything, certainly not that most people act this way, or don’t. It does (I hope) get your attention. We have to be careful how stories are used; “I knew someone who had an abnormal Pap smear and she didn’t do anything and it went away”, while consistent with our most current knowledge about the early course of abnormal Pap smears in young women, is not a valid argument for you to do nothing. Stories tell what they tell; the lessons learned and conclusions reached are up to us.

Banerjee and Bloom end their piece:
If there is such a thing as divine justice or karmic retribution, the world we live in is not the place to find it. Instead, the events of human life unfold in a fair and just manner only when individuals and society work hard to make this happen. We should resist our natural urge to think otherwise.

And, no matter what we think about Ebola, we should get our flu shots.


Thursday, September 25, 2014

Medical futility and the responsibility of physicians...and patients

Ethics is a difficult area. Medical decision making is a difficult area. Both are fraught with ambiguities, conflicting priorities, and differing values. Priorities shift with time, as our moral compass tacks back and forth, seeking to compensate for current problems and deficiencies, and sometimes overreaching, requiring new shifts in direction. We will probably never get it exactly right, but we need to keep working in the right direction.

Barron Lerner makes an excellent start in his NY Times Op-Ed from September 18, 2014, “When Medicine Is Futile”. He talks about his father, a physician leader in “the medical futility movement, which argued that doctors should be able to withhold interventions that they believed would merely prolong the dying process...”. The father was an infectious disease specialist whose early career was marked by the miracle of penicillin and drugs to treat tuberculosis. Later, however, his practice, like that of most infectious disease physicians in the US, increasingly consisted of being called to consult on infections in hospitalized patients who were otherwise severely ill, often in intensive care units, with terminal diseases and frequently advanced dementia, “…connected to machines and tubes he knew would not help them.”

The younger Dr. Lerner, the author of this piece and a professor of medicine at a major medical school, notes that his father “…placed some of the blame for the situation at the feet of bioethics and patients’ rights, two movements that I, as a young physician, had fiercely advocated,” which put them in conflict. And yet, from a longer perspective, he can see a great deal of merit in his father’s concerns. As he points out, the patients’ rights movement and, to a lesser extent, bioethics
…did not account for one thing: Patients often demanded interventions that had little or no chance of succeeding. And physicians, with ethicists and lawyers looking over their shoulders, and, at times, with substantial money to be made, provided them.

The stimulus for this article is a recent report from the Institute of Medicine (IOM) of the National Academy of Sciences, “Dying in America,” which “…argues that we subject dying patients to too many treatments, denying them a peaceful death.” This report begins the process of reconciling the physician’s responsibility to at least provide accurate information about treatments and the likelihood of their success with the autonomy of patients to make their own decisions about the treatment they want. This is a welcome effort at reconciling two apparently or potentially conflicting ethical principles, especially given, as Dr. Lerner points out, that “Physicians declaring things to be ‘futile’ sounded too much like the old system of medical paternalism, in which doctors had made life-and-death decisions for patients by themselves.” But any meaningful discussion of this requires consideration of four powerful issues that always bear on it.

First, there is trust. Can patients and their families trust that doctors are advising them in their best interests? This is a particular problem for many disenfranchised people who are not like the majority of doctors in background, ethnicity, and certainly income. For the poor and for members of minority groups that have in fact been victims of outrageous abuses in the past, there may remain a suspicion of any suggestion that further interventions would be futile. People may think, “You are only saying this because I (or my family member) is …, and you would recommend the intervention for a member of your own family, or for someone more like you.”

Second is the issue, touched on above, of the “likelihood of success”. Success at what? For patients and/or their families to make intelligent decisions, the parameters of success, or of similar words like “help”, “improve”, and “make better”, need to be clearly defined in words and concepts that lay people can understand and that physicians are willing to use. I have written in the past about a woman who was dying in the intensive care unit (ICU) of the hospital where I worked; her daughter worked with me. I came to visit and met her five children, who were trying to decide whether to approve an intervention, in their unconscious, uncomprehending, and terminally ill mother, that they had been told “would help”. As I was not the treating physician, I went to the ICU doctor and asked, on their behalf, what the intervention was and how it was expected to help. I then cautiously returned to the family to explore their understanding of “help”. They thought it meant it would make her “better”. What, I asked, did “better” mean to them? When they appeared confused, I went further, asking if it meant that she would wake up, be able to talk to them, be able to go home, perhaps be able to say goodbye? Yes, they said, that is what it meant to them. I tried to say gently that this would not happen, that the intervention might, perhaps, correct a laboratory value, but would not produce any of the outcomes they hoped for. They opted not to have it.

These are not easy discussions, which leads to the third issue. Most people do not have the training and background to understand the ramifications of the decisions they are asked to make under the rubric of “patient autonomy”. I feel that I am knowledgeable about many areas, particularly medical ones, but it would be ridiculous to ask me how a bridge should be built. You wouldn’t want to drive over it, and I wouldn’t either. Simply citing “patient autonomy” and presenting incomprehensible data and decisions, about whether to do something whose very name means nothing to a regular person, is not ethical; it is equivalent to abandonment. Sometimes a doctor, nurse, or other health professional with patience can spend the time and effort necessary to help a patient or their family really understand what is going on and what the likely outcomes of any intervention, or, equally validly, of non-intervention, might be, so that they can make an informed decision based on that information and on their own values and priorities. Most of the time, though, health care providers are too busy, may not have the inclination, and are not paid well, or at all, for that time.

Which leads to the final issue, money. The quote from Dr. Lerner above, regarding procedures that will not be effective, notes that there is at times substantial money to be made by providing them, and that money is a profoundly important issue, potentially corrupting any discussion of ethics versus futility. Dr. Lerner notes that the IOM report “…advocates that Medicare and other insurers pay physicians to talk to their patients about end-of-life care”. This is a great idea, but it doesn’t currently happen often, and even the proposal that physicians do it (not to mention be paid for it) was grossly misrepresented as “death panels” when it was proposed as part of the Affordable Care Act (ACA). Even if this time were paid for, the payment would never approach the amount of money that comes from doing procedures, certainly not what a busy surgeon, for example, might make by operating on someone during that time.

All people know, intellectually, that everyone dies. What may be harder for many to accept is that they must die, particularly when their time comes. It may be even harder for family members, who are not the actual patient, to accept; they may demand that “everything” be done. For the elder Dr. Lerner, as described by his son, “Infections were the way that such frail individuals were supposed to die, the ‘final straw’ in the deterioration of so many of the body’s vital organs and functions.” Yet somehow they had become things that needed to be treated.

Everyone dies; what we can hope for is a death unaccompanied by pain and unpleasantness. Infections like pneumonia, which should be treated in an otherwise healthy person in whom a return to health is likely, probably should not be treated in a person who is terminally ill, bed-bound, and demented. They are nature’s exit door. The same could be said for starvation, a relatively benign way to go, and almost always better than the alternatives of feeding tubes or intravenous nutrition, which carry high risks of aspiration, infection, and discomfort.

There must be a real understanding that patient autonomy does not include the right to demand any treatment. We would not assume that a person could demand a Corvette or a lifetime pension, and yet the latter would probably do more to improve health than any medical intervention, and the former would cost less than many.

Dealing with and overcoming the barriers presented by the first three issues will be difficult but can be done. To really do so means eliminating the fourth issue, the perverse economic conflict of interest that can cloud judgment, decrease trust, and pollute the entire process.

Wednesday, September 17, 2014

Suicide in doctors and others: remembering and preventing it if we can

Recently, SASS-MoKan, our local suicide survivors’ support group, held its annual Remembrance Walk. I have written about these walks in the past, and have noted that my personal interest in the issue of suicide stems from the fact that my older son, Matt, committed suicide in 2002, just after he turned 24. At the time, everything seemed to be going fine in his life, and it came as a real surprise to his family, his close friends, everyone.

The Remembrance Walk is a lovely event. Following a lap around a good-sized park, there is a ceremony. All the survivors stand on the grass in a circle, and the names of all those they have lost are read; at the end, a flight of doves is released. It is caring, and it is supportive. Of course, the ceremony also brings out the pain and sadness of our losses.

Matt died nearly 12 years ago, but I know others who have lost children very recently. I can tell them that the acute, agonizing pain that feels as if it will never ease does ease, though they cannot and should not be expected to believe it yet. It becomes less acute, less sharp, less all-consuming with time, but it never goes away.

Recently, an Op-Ed in the New York Times by Pranay Sinha asked “Why do doctors commit suicide?”. Because I am a doctor, and one whom many of my colleagues know has had a suicide in his family, several people shared the piece with me, although I’d already seen it. The article provides the perhaps shocking information that the suicide rate among physicians is twice the national average. Beyond this, however, it focuses on residents, doctors in training, and the enormous stresses they are under, without further discussing why doctors in general have such high suicide rates. This is not surprising, as Dr. Sinha himself is a first-year resident (at Yale) and is obviously acutely aware of the stresses and strains of residency.

I know that this is true; I was a resident (a long time ago, in the last millennium) and I work every day with residents. It is a hard job; although these people will become, in a few years, full-fledged physicians who will range from very well paid to extremely well paid, as residents they work for about the US average wage, up to 80 hours a week, often with life and death in their hands. That this is fewer hours than residents used to work is good, but it is of limited comfort to them. The point is that residency is a big stress, and it occurs at an age when many people, especially males, are already at risk. An important issue addressed in this piece, and in several of the letters written in response to it, is the idea that doctors are supposed to be all-knowing and infallible, unwilling to admit mistakes or weakness. The op-ed and the letters make clear how unreasonable and burdensome this is for all doctors, and perhaps even more so for these young doctors, who are even more aware, inside, of what they don’t know and are terrified of showing it.

This is National Suicide Prevention Month, which is why the Remembrance Walk is held in September, and why such articles are appearing. NPR recently ran a program on the rise in suicides among middle-aged men, which have increased by 50% since 1999; older men have always had the highest suicide rate, followed by adolescents and young men, but this increase among men aged 45-65 is new. One suggestion is that it reflects the economic downturn, which hit poor people well before the “official” recession; it is lower-income men who have the highest rates in this age group. Robin Williams’ recent suicide was widely covered; I doubt that National Suicide Prevention Month entered into his decision. Interestingly to me, many of the commentators focused on the “how surprising, he was so funny, he made us laugh, who would have guessed” angle, while others discussed how he, like many comedians, needed public attention but was lost when alone. I suspect Williams was bipolar; certainly many of his activities suggested an ongoing depression, while his outer persona was so often manic.

What many of the articles, including the Op-Ed, do not discuss, however, is the fact that most suicides in the US are caused by depression, a disease. It may be unipolar (“just” depression) or bipolar (“manic depression”), but it is potentially fatal. The stresses of being a doctor, or of being a resident, are tremendous and can push people who are depressed “over the edge” to suicide, but it is important to remember that most people undergoing the same stresses do not kill themselves. The underlying condition usually needs to be present for the precipitating cause to be fatal. I have previously cited what I consider one of the best discussions of this issue, “The trap of meaning: a public health tragedy” by CG Lyketsos and MS Chisolm in JAMA.[1]

National Suicide Prevention Month, and the activities associated with it, are critically important for raising awareness about a condition that kills 40,000 Americans a year but is often kept secret, out of shame. When I became a member of the club no one wants to join, that of “suicide survivors” (the term refers to the family and friends of someone who has completed suicide, rather than to those who have personally survived a suicide attempt), I found out that many people I knew were also members. They had parents, children, siblings, and close friends who had committed suicide, but I hadn’t known. We often don’t talk about our pain, but to many people the sadness of losing a child seems more acceptable to talk about if the cause was a medical disease, a car accident, or a homicide than if it was suicide. But the loss is the loss; the pain is the pain.

I don’t have the optimism that some do about being able to prevent suicide in specific cases. I do believe that early diagnosis and treatment help; I believe that being aware of the warning signs is important; and I will never know how many times having those who loved him around might have kept my son from killing himself before he finally did. I can only hope that more awareness and discussion of this condition will make a difference for some, and perhaps many.





[1] Lyketsos CG, Chisolm MS. The trap of meaning: a public health tragedy. JAMA. 2009 Jul 22;302(4):432-3. doi: 10.1001/jama.2009.1059.
