Messing up medicine

[Warning: None of what follows should be taken as medically authoritative.  It is merely a layman’s interpretation of widely available information.]

Baby Boomers should be worried about health care.  Not only are we already consuming a hefty share of medical services, but proposals to create “Medicare for All” would certainly mean fewer options for us.  Having grown up in an era that offers every consumer unlimited choice of just about everything, we might find that hard to take.  Don’t expect AARP to come out for Bernie Sanders.  His plan would affect the trailing Boomers the most.  Employer-provided insurance would be out–no more negotiating for the ‘best package’–while the taxes needed to finance single-payer care would hit those most likely to be saving for retirement.

But there is a cause for concern about medical care that does not lead inevitably to a debate about “socialized medicine.”  Or it needn’t, at least.  Some of the greatest historical advances in human health have little to do with access to the best doctors or private hospital rooms.  Just since 1850, life expectancy in the urban West has nearly doubled, thanks largely to vaccination against infectious disease and the advent of antimicrobial drugs.*  Arguably, the most significant of all these breakthroughs have occurred during the lifetime of the Boomers themselves.  If we’d like to keep this salutary trend going, it’s time to appreciate these improvements and resist the temptation to undermine them.  When it comes to disease, other people have always been part of the problem; now it’s clear that they must be part of the solution as well.  

[Figure: survival curves, UK]
*This improvement is not just the result of reduced child mortality, although that undoubtedly played a major role at first.

At the turn of the 20th century–yes, I know, not quite the rock ‘n’ roll era–enormous strides against infectious disease had already been made through experimentation, inside the lab and outside it.  The principal causes of death, especially for poor people, were widespread conditions such as tuberculosis, whooping cough, scarlet fever, diphtheria, and pneumonia.  As early as 1717, a British aristocrat named Mary Wortley Montagu had sworn by the practice of inoculation, which she encountered on a visit to Turkey, as a means of limiting the impact of smallpox, the most feared killer around the world.  Many of her readers reacted with horror at the idea of deliberately introducing the disease agent in order to stimulate a protective response.  When a smallpox outbreak struck Boston in 1721, however, the clergyman Cotton Mather deployed his scientific knowledge to advocate inoculation.  At the very end of the century, Edward Jenner showed that vaccination–using a related virus found in cows (hence the name, from the Latin vacca)–would work equally well with less risk.  But George Washington was already on board; in fact, he had made inoculation with smallpox itself mandatory among soldiers in the Continental Army.  Today, soldiers receive multiple vaccinations at basic training.

In short, Enlightenment science, spurred by the needs of the state, developed existing knowledge into a new technology.  The decline of smallpox “towards controlled extinction,” especially among children, was one of the 19th century’s great medical successes.  By the late 1800s, after Louis Pasteur and Robert Koch had established that specific microbes cause specific diseases–Koch identified the tuberculosis bacillus, while Pasteur developed a vaccine against rabies–the hunt was on for ways to extend protection to mass populations.  The concept of “herd immunity,” unflattering as it sounds, dates from this era of discovery–and it has proven value.  With preventive vaccination and the gradual improvement in living conditions, the threat from contagion, which once made living in modern cities more precarious than staying on the farm, declined dramatically. (The exception is tuberculosis, and the reason is alarming.  We’ll get to that.)

[Figure: all-cause mortality and top 10 causes of death, USA]

The next big advance came in the realm of treating infection once people were sick.  Until the 1940s, the so-called sulfa drugs, introduced in the mid-1930s, were the best antimicrobial agents available; they inhibited the growth of bacteria but did not kill them.  Meanwhile, by an amazing but inspired ‘accident’ in 1928, Alexander Fleming, a Scottish researcher working in London, had realized that a substance produced by the Penicillium mold could destroy bacteria.  The collaboration that followed established the outlines of the government-backed drug development and testing system that still exists.  Once again, military exigencies provided a key impetus.  Between 1940 and 1945, production of penicillin moved from Oxford laboratories to Pfizer manufacturing facilities; output increased from tiny samples to 100 billion units per month.  The same approach, mobilizing the resources of private drug companies with the help of government, still underlies the creation of new bactericides and vaccines.  Some antiviral immunizations given today may even prevent cancer from developing in the future.  It looks as if achieving ever-better health for all, even in peacetime, depends upon continuing this practice.

Unfortunately, there are counter-currents.  Leaving aside the eternal debate about who pays and how much, medicine in the late Boomer era faces a distressing malaise: a combination of complacency and distrust of expertise.  Modern anti-vaccine movements and the emergence of drug-resistant ‘superbugs’ are symptoms of this condition, and they ought to worry any US citizen who grew up in the age of open-heart surgery and the near-eradication of polio.  It’s not just the Boomers’ fault, of course.  It turns out that Anglo-American culture harbors a suspicion of any measure that appears to impinge on individual freedom.  Moreover, the internet and social media can intensify not only the general desire for carefree health but also the omnipresent anxiety of parents about their children’s well-being.  It’s pointless to complain about these attitudes–I share them both–but Boomers owe it to themselves and the next generation to promote the habits they were raised with.  One of these is using your head.

Opposition to vaccination has existed almost from the moment it became an option, but it increased when governments adopted it as a public health measure.  Individuals as well-respected for their advocacy as Frederick Douglass regarded mandatory vaccination as a threat to liberty, and the idea continues to be espoused by somewhat less famous defenders of freedom.  In 1885, one anti-vaccination protest in England attracted up to 100,000 marchers.  Ironically, the nation that produced Edward Jenner also sent William Tebb, a popular anti-vaccine activist, across the Atlantic, and in America the closing decades of the 19th century saw at least three organizations dedicated to ending vaccination programs.  Interestingly, the British Parliament relented in 1898, allowing conscientious objectors to escape penalties for refusing vaccination, whereas the US Supreme Court upheld states’ power to enact compulsory vaccination laws in 1905.

[Image from an anti-vaccination publication of 1892.  The Historical Medical Library of The College of Physicians of Philadelphia.]

So, the recent campaigns against routine vaccination are only the latest in a long history of populist appeals to a fearful and suspicious public.  The receptive audience comes from across the political and religious spectrum.  Orthodox Jews and Silicon Valley tech-types seem equally open to propaganda about government coercion and harmful side-effects, while remaining oblivious to the long-standing evidence that vaccines have prevented epidemic diseases from killing and disabling millions around the world.  Predictably, such resistance from “anti-vaxxers” has prompted some breathless journalism on the other side.  The internet creates a carnival “house of mirrors” effect in which discussion of data is quickly submerged in clashing accounts of non-events or forthcoming revelations.  (In this arena, there are some crusaders in search of a cause.  Note RFK Jr’s latest charge, facilitated by Twitter.)  Yet the most damaging claims about the ill effects of vaccines–precisely because they purported to be based on experimental results–appeared in Andrew Wakefield’s published study linking MMR shots with autism.  (Wakefield, another Brit, by the way, subsequently crossed the pond to start a new and exciting life.)  Although the study was retracted and its conclusions dismissed as “fraudulent” in 2010, it has taken serious researchers another decade to dispel the myth it created.  Meanwhile, these concerns no doubt contributed to the renewed outbreaks of measles in California and elsewhere, demonstrating the real consequences of lowering “herd immunity.”

Admittedly, sifting through the arguments and evidence is tedious and, for the layperson, possibly misleading.  But the call to action about “vaccine hesitancy” is not alarmist.  Violent opposition–the murder of immunization workers in Pakistan or Nigeria–is only the most sensational form of resistance.  Influenza, for instance, still poses a threat to the global population.  The deadliest pandemic of the disease, in the wake of World War I, killed some 40 million people around the world, and the virus that causes it is constantly mutating into new forms.  Annual flu shots are highly recommended to protect the elderly, but in 1918-1919 the virus struck young adults hardest.  Today, fewer than half of American adults overall get the vaccine, and the rate of immunization for those aged 18-49 is less than half that for the Medicare-eligible (about 26% vs. 60% in 2017-18).

But, if it’s “only the flu,” drugs can always fix it, right?  Not so fast.  The Western world’s record of producing “miracle cures” is not guaranteed to continue.  With the globalization of everyday life comes not only the ready transmission of illnesses between continents but also a delicate equilibrium of political and business interests.  It’s not just about the cost of your drug plan; it’s about whether those drugs will work if you get them.  The phenomenon of drug resistance has been recognized for a long time, but in the past decade the threat has become more visible.  According to a recent study by experts, averting a pandemic requires taking action on several fronts: developing new drugs, reducing the use of antibiotics, improving the diagnosis of infections, and even practicing better personal hygiene.  Above all, they note, “we need to avoid short-termism,” the idea that the problem is susceptible to a quick, permanent solution (71).  By both temperament and experience, we Boomers are prone to exactly that mind-set.

The fundamental issue arises from the ability of bacteria to adapt in a frighteningly short time–a new generation every 20 minutes.  Once a particular microbe has been exposed to an antibiotic, it will evolve.  If the new version is not wiped out by drugs, it will continue to reproduce, passing on its defenses–one study showed that, in less than two weeks, exposure to an unfamiliar drug can produce a strain that survives concentrations of the antibiotic 3,000 times what is normally lethal (36).  Methicillin-resistant Staphylococcus aureus (MRSA) is now a well-known denizen of the nation’s hospitals and has proved difficult to control.  Other bugs show striking resistance to even the strongest drugs.  Against one class of microbes, the enterobacteria, most antibiotics are effective less than 10% of the time.  The ones that work can come with toxic side-effects and will probably become obsolete in their turn.  Seemingly minor scratches can prove just as lethal as in pre-penicillin days.

Between 700,000 and 1.5 million people worldwide die each year from drug-resistant infections.  But the toll is properly measured not just in lives but in other costs as well–for instance, higher health expenditures and lost economic productivity–that run into tens of billions of dollars per year in the US alone.  The example of tuberculosis, which infected up to 70% of the urban population a century ago, is instructive.  The disease had already been linked to a bacillus, but the only treatment then was confinement in a sanatorium; if that helped at all, it was probably because it withdrew the patient to a relatively clean environment.  In the 1920s, a vaccine appeared, but it is seldom used in the US.  Instead, treatment is based on testing–every Boomer, one hopes, recalls the four-dot skin puncture–and months of antibiotics.  But the most common drugs are now losing their effectiveness.  Here and abroad, the existing drugs are sometimes improperly prescribed and monitored.  As a result, TB is still a “public health crisis” despite being labeled “curable and preventable.”

The same sad situation applies to other scourges–Ebola, malaria, dengue fever: there is no proven vaccine, and drug resistance is growing.  (Anthrax is a better bet, if you’re curious.  It has a three-dose vaccine, and infection can be treated with as little as 60 days of Cipro.)  In our newly interconnected world, geographical location matters a lot less than it used to.  Always a tricky proposition, complete quarantine is now virtually impossible.  Even if you’re not planning a trip to Central Africa, where many of these diseases abound, an infected person may be boarding a plane headed in the opposite direction.  And if mere sympathy requires further bolstering from self-interest, consider that many of the procedures common to “modern” medicine–transplants, joint replacements, cancer therapy–leave the patient susceptible to infection.  New and better antibiotics will be needed just to make these treatments as safe as possible.

So, I have a confession to make.  When I suggested that this topic has nothing to do with current events, I fibbed a little.  “Big Pharma” has become a target of politicians and activists across the spectrum.  No one wants to pay an astronomical price for a drug, especially one that people really need.  But only the pharmaceutical industry possesses the equipment and expertise to create miracle cures, and the process is neither cheap nor guaranteed to produce a money-spinner.  After spending 10 to 15 years in the pipeline, a new invention may have a limited market.  With antibiotics, there is an economic paradox at play.  If a new drug works against “superbugs,” physicians will be reluctant to prescribe it because they will want to save it for the worst cases.  Sales volume will be low, profits likewise.  The most potent treatments will likely be very expensive.  Every day we debate the morality of denying medical care on account of cost.  If the decision to withhold an experimental treatment or last-chance operation seems cruel–and it is, of course–imagine the outrage if simple lack of money prevented the cure of a bug that used to be knocked off by a week’s worth of pills.

Two significant questions arise from this summary, and both are controversial.  Who decides whether or not a drug is effective?  Who controls the use of antibiotics?  When you push your doctor to give you an antibiotic for a cold caused by a virus, you are doing your own little bit to spread drug resistance.  Still, it seems absurd to go on suffering if a pill might alleviate some of those miserable symptoms.  When drug companies hand out free samples or agricultural feedlots discharge tons of antibiotic-laden waste into the water, they make their own contribution.  But Americans have little appetite for government regulation that might actually end or restrict these practices.  In the country that invented the medical-industrial complex, citizens are still encouraged to speak, join groups, and vote.  We Boomers do all these things, and it’s time to pay attention.

Making the hard decisions is called politics, and it can take more than a tweet.  Meanwhile, as your mother might have said, “An ounce of prevention….”  Not getting sick in the first place is the best advice, for all of us.  Since infections spread through coughing–or even exhaling hard–cover up.  Set a good example.  Get your shots.  Wash your hands; teach your (grand)children to do the same.  Contrary to urban myth, pee is not really sterile, and in fact most surfaces are not either.  We all know that “employees must wash hands,” but that won’t make much difference if the rest of us don’t.  Above all, give yourself time in the loo.  Let’s not jeopardize one of humankind’s greatest achievements just because the second half is about to start.

**

News Flash, 6 April 2019:  Drug resistance is not just for microbes any more.  “Pretty much unbeatable” is not a phrase we want to hear in this connection.
