[Note: If you have a layperson’s or a colloquial understanding of insurance industry jargon, you might want to read this prior to reading the post below.]
The government shutdown has once again brought the PPACA – commonly known as Obamacare – into the glaring national spotlight. As I noted earlier this week, this seems as good a time as any to take a look at our healthcare system and how it may or may not be transformed by the President’s signature achievement. As I said on Monday, I plan on making the following arguments:
1. Despite what you may have heard from the left, the health-insurance crisis that led to the passage of Obamacare was not the nation’s poor being uninsured. Despite what you may have heard from the right, the nation’s healthcare system was not sustainable, because the way that system is primarily funded has been radically changing into something quite different from what it had been for decades.
2. Despite its good intentions, Obamacare will not solve the actual crisis it was originally meant to solve. This is because it was largely diverted to tackle the problems that were easy to sell politically, rather than the larger problems that were more difficult to discuss.
3. Our healthcare system is going to change radically over the next 10-20 years — and it will change regardless of whether Obamacare stands or is overturned. Despite what politicians of all stripes may say, there are only a limited number of directions we can go — and though each direction has benefits, each will also require sacrifices of some sort that we, collectively, would prefer not to make.
4. We would be better off acknowledging the change that’s coming now and beginning to discuss which sacrifices we as a nation wish to make. Otherwise, those changes will happen without our input, and we may very much dislike what we end up with.
I’ll be discussing those various choices in my next post, as well as the pros and cons each carries. Before we make guesses at the future, however, it is necessary to examine the past. It’s important to be aware of how our current healthcare system evolved into what it is today, and how that evolution is inexorably tied to our predominant health-insurance scheme. It’s also necessary to take a look at other health insurance schemes that were attempted in this country and note why they failed. We should acknowledge the tremendous successes our current system has produced, some of which will suffer as that same system changes. Lastly, we need to recognize that our health insurance and healthcare system has not only been in crisis for quite some time, it’s actually been changing into something both wholly new and unsustainable.
The seeds of our system’s breakdown are woven into the very fabric of what has made it so very successful for so very long. Because of this, we need to take a good look in the rear-view mirror before we hit the accelerator.
Health insurance’s roots in the United States actually go back to the mid-1800s. The two earliest types of policies, accident insurance and sickness insurance, were actually closer to what we now think of as workers’ compensation. Each primarily existed to partially reimburse wages lost due to injuries and illness. There were many entrepreneurial attempts to form a stable insurer around these policies; however, each of them eventually crumbled under the weight of the Law of Adverse Selection. Most men who were healthy and worked in relatively safe occupations chose not to spend their wages on what they saw as an unnecessary and frivolous expense. The few who did purchase annual policies usually let them slip into cancellation at renewal. The eventual pool of risk, therefore, largely consisted of working-class, high-risk, and unhealthy men. Eventually, losses drove premiums up to the point that they were unaffordable to the insurers’ target demographic, and the pools collapsed. Much of the demand for these products disappeared after 1917, when the Supreme Court’s ruling in New York Central Railroad v. White cleared the way for forced participation in government workers’ compensation schemes. It would be more than a decade after that ruling before what we might call modern health insurance (i.e., insurance designed primarily to pay hospital bills) was first successfully attempted by Justin Ford Kimball, a vice president at Baylor University who oversaw its medical school.
Kimball was a career school administrator, and had no special training in actuarial mathematics. Because of this, it is likely that he “accidentally discovered” his workaround to the Law of Adverse Selection without being aware that the law even existed. Kimball created his insurance plan as a way to reduce the medical school’s bad debt, racked up by Baylor teachers who often used its facilities without being able to pay the subsequent bills. At first blush, Kimball’s scheme was similar to all of those that came before it: Kimball figured that if he could get healthy teachers to pay a small amount each month (50¢, in fact), they could collectively fund up to 21 days of hospital care per teacher over time. Kimball, however, had one advantage his entrepreneurial predecessors did not: he was his potential clients’ employer, and could therefore mandate their participation. Kimball’s plan was so successful that within a few years other schools in the Dallas area were joining his fund, which became known as the Blue Cross fund. In 1939, lumber and mining employers in the Pacific Northwest copied Kimball’s model when they formed the Blue Shield fund. Soon, existing life insurers were creating similar schemes, marketed to employers.
Unlike every private health insurance scheme that had come before, these plans thrived over time. And they thrived for the same reason the government-mandated workers’ compensation schemes did: forced participation of healthy insureds kept the Law of Adverse Selection at bay. Interestingly, doctors and small-government advocates at the time were largely opposed to employer-based schemes for reasons that seem positively oracular in retrospect: they feared that private compulsory health insurance would eventually lead to public compulsory schemes, and that eventually patients’ choices themselves might be regulated by the government.
Employer-based health insurance schemes grew in popularity throughout the 1940s and early 1950s, but they truly exploded after 1954. That was the year the Internal Revenue Code first made employee and employer contributions to health insurance tax-exempt. Employers throughout the United States began using health insurance schemes as a low-cost way to compensate employees tax-free. Within two years of the tax-code change, almost half of all Americans were covered by private health insurance schemes; by the early 1960s, over three quarters were covered. As America’s middle class burgeoned, it did so fully insured.
And as we reach this moment of bourgeois triumph in our history review, we need to pause and survey the early employer-based health insurance landscape. If you are relatively young, you would be excused for reading the paragraph above and asking, “Wait – health insurance was an inexpensive way to compensate employees?” It was. Premiums back then were around 12% of what they are today, and that’s after taking both inflation and average cost-of-living indexes into account. There are a number of reasons for this cost discrepancy, but by far the largest is this: the definition of “healthcare” is substantially different today than it was back then.
If you were to go back in time sixty or seventy years to discuss healthcare, you’d find that people would most likely look at you funny. (And not just because your jeans look like they’re about to fall off and you’re wearing your baseball cap all wrong. Also: Get off my lawn!) A huge number of things that you include in your definition of “healthcare” were not included in theirs, such as pre-natal care, treatment to extend the life of the terminally ill for short periods, preventative care, physical therapy, or anything at all having to do with mental health.
More than that, however, the expectations of what healthcare could address were significantly lower. Hospitals today are places where sick people go to receive medical treatment with the expectation of being made healthy again, but this is a very recent development. Throughout most of US history, hospitals were more akin to today’s hospice centers. They existed largely as places where families with enough funds could put sick and dying members of the household — temporarily or permanently — out of the way, for the comfort of the sick as well as the family. Simply put, at the outset of the health insurance industry, premiums were incredibly cheap because medical intervention was rare – even for the sick. This is a larger point than it first appears, because the advent of compulsory healthcare coverage and the American medical treatment and technology boom that began in the mid-20th century are inescapably bound together.
As it turned out, insuring the vast majority of Americans provided an astounding benefit to the healthcare industry: an enormous, well-funded customer base that cared more about treatment than cost. Towns that previously could support only a handful of doctors could suddenly support numerous clinics, scores of private practices, and perhaps even a hospital to boot. Health insurance is also responsible for the proliferation of specialized medicine, because it created a large market of patients with both the means and the willingness to fund it.
At the same time, the vast number of insureds made it possible for private companies to engage in robust, profitable medical research.
Over $100 billion is spent in the United States each year on medical research. The vast preponderance of this research will hit dead ends, primarily providing data for other researchers on what doesn’t work and why. Despite this, research continues (and continues quite profitably, thank you very much) because when that rare breakthrough occurs, it’s a virtual goldmine. But here’s the thing: it’s only a goldmine because it’s assumed that the vast majority of people who could use whatever drug, treatment, or device results will be able to purchase it. And the overwhelming majority of them can only afford that treatment because of health insurance.
Take any ailment being researched today: cancers, AIDS, heart disease, you name it. Let’s say that creating a breakthrough, working treatment for any of these conditions costs the research industry an average of $75 billion over time to perfect. If 75% of the country is able to pay for that treatment as it is introduced into the market, you can see the appeal to venture capitalists of rolling those investment dice. (Especially since it’s hard to imagine a more motivated customer base than someone who needs your product to live.) But what if you knew that less than 1% of the population would ever be able to afford your new cancer/AIDS/artificial-heart treatment, were you able to make it work? You’d never come close to getting your $75 billion investment back, let alone make a profit. Why would you invest that kind of money?
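For those who like to see the arithmetic spelled out, here is a minimal back-of-envelope sketch of that bet in Python. The population size, disease incidence, and price per course of treatment are hypothetical placeholders chosen purely for illustration; only the $75 billion development cost and the 75%-versus-1% comparison come from the scenario above.

```python
# Back-of-envelope sketch of the investment logic above.
# The population, incidence, and price figures are hypothetical
# placeholders; only the $75B cost and the 75% vs. <1% split
# come from the scenario in the text.

US_POPULATION = 320_000_000        # rough US population (assumption)
LIFETIME_INCIDENCE = 0.05          # hypothetical: 5% will ever need the treatment
PRICE_PER_COURSE = 100_000         # hypothetical price of a full course of treatment
R_AND_D_COST = 75_000_000_000      # the post's assumed development cost

def potential_revenue(share_able_to_pay: float) -> float:
    """Gross revenue if only patients who can pay (via insurance) buy the treatment."""
    paying_patients = US_POPULATION * LIFETIME_INCIDENCE * share_able_to_pay
    return paying_patients * PRICE_PER_COURSE

for share in (0.75, 0.01):
    revenue = potential_revenue(share)
    verdict = "worth the bet" if revenue > R_AND_D_COST else "never recouped"
    print(f"{share:.0%} able to pay -> ${revenue / 1e9:,.0f}B gross vs "
          f"${R_AND_D_COST / 1e9:,.0f}B to develop ({verdict})")
```

With those made-up numbers, the first scenario repays the investment many times over while the second never comes close, which is exactly the calculation the paragraph above asks you to imagine.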
Its predominant insurance scheme also gave the US a decided market advantage in medical research over other industrialized countries. While the US was going with employer-based compulsory schemes, the populations of the United Kingdom, Japan, the Scandinavian countries, and others were moving to government-based compulsory schemes. One of the ways those countries were able to make their schemes work without bankrupting their respective governments was by defining certain expenses as not being part of the healthcare system. The US had no such centralized restrictions, and because of this the amount of research here dwarfed research everywhere else. And this isn’t just to the US’s benefit; all countries have benefited from that research.
I cannot stress this point enough, because it will be very important when we get to our next discussion about what the future holds for the US healthcare system. It is a common argument these days that covering those too poor to afford the escalating cost of health insurance puts upward premium pressure on the rest of the country. And this is true enough, at least to a point. But if this is the argument on which you’re banking your future healthcare system, you also need to be aware of what you’re giving up: a robust research incentive that has driven the most significant medical advances of the past half-century. The smaller you make that pool of potential patients, the less likely people are to invest those billions into the system. And make no mistake: that pool is already shrinking at an exponential rate.
Which brings us back to our history lesson.
As the ability for people to get care grew, so did the demand for that care. In fact, it grew too quickly. The Nixon administration was the first to say publicly what actuaries had begun to observe a decade earlier: fueled by the access provided by employer-based insurance, the cost of healthcare wasn’t simply growing faster than inflation, it was growing exponentially. The system, declared the administration, was not sustainable, and something needed to be done. (Every subsequent presidential administration, as well as every major-party presidential candidate over the past 30 years, has repeated this declaration.) Nixon and insurance actuaries knew the lines on the graph had to cross; the only question was when. As it turns out, they began to cross in the 1990s.
If you’re of a certain age, you will immediately recognize the pattern. For years, your employer paid all of your health insurance premiums. Then one year, they couldn’t afford the price increase and took money out of your paycheck to cover 10% of the latest bump. Before too long, they were splitting each premium increase with you; then they were passing each increase, in its entirety, onto your shoulders. Eventually, health insurance became something you could pay for mostly or entirely by yourself through the company health plan, or something you could simply choose to go without.
And just like that, the Law of Adverse Selection caught up with the American healthcare system.
As participation became voluntary and the cost of premiums was pushed onto employees, the predictable happened. Those who were young, healthy, and least likely to need medical intervention chose to pocket the premium. Employees who were older, less healthy, and highly likely to need medical intervention chose to remain, regardless of expense. Of course, the effects began to slowly snowball. Reduced participation and adverse selection would negatively affect renewal rates. (In the late 2000s it was not unusual for me to deliver the bad news of 40% rate increases to my clients.) As those increases hit, more and more employees made the financial decision to save their money and roll the dice. Sometimes, of course, they would opt back in later. Perhaps they were about to go through a pregnancy, or perhaps they began to have chronic back pain, or perhaps they became freaked out because they began to notice blood in their urine. More and more, employer-based health insurance schemes are becoming a place where only the upper-middle class and those likely to need expensive care choose to participate. When President Obama took his oath of office in 2009, participation in private health insurance schemes had already fallen to levels this country hadn’t seen since the mid-1950s, when the IRS tax exemption was first introduced. Predictably, medical research dollars in the United States are now also in decline.
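For the curious, here is a minimal sketch of that spiral as a toy simulation in Python. Every number in it is a made-up illustration (the claims distribution, the insurer’s 15% pricing load, the flat $2,000 a person is assumed to value coverage at); the point is only to show how voluntary participation and rising premiums feed on each other once the healthiest start walking away.

```python
# Toy simulation of the adverse-selection spiral described above.
# All figures are hypothetical; the mechanism is the point.

LOAD = 1.15                # insurer prices at average expected claims plus a 15% load (assumption)
VALUE_OF_COVERAGE = 2_000  # hypothetical dollar value a person puts on simply being covered

# Hypothetical pool: 100 employees whose expected annual claims range from $500 to $15,350.
pool = [500 + i * 150 for i in range(100)]
premium = LOAD * sum(pool) / len(pool)

for year in range(1, 8):
    # With participation voluntary, a person stays only if coverage feels "worth it":
    # expected claims plus peace of mind must exceed the premium they now pay themselves.
    pool = [claims for claims in pool if claims + VALUE_OF_COVERAGE > premium]
    if not pool:
        print(f"Year {year}: no one is left in the pool; the plan collapses")
        break
    premium = LOAD * sum(pool) / len(pool)
    print(f"Year {year}: {len(pool):3d} participants remain, premium rises to ${premium:,.0f}")
```

Each round, the healthiest remaining members find the new premium a bad deal and drop out, the average cost of those left rises, and the next renewal is worse; with these particular toy numbers the pool empties in about six years.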
And herein lies the truth of the healthcare crisis you heard so much about four years ago. The crisis was never that the nation’s most impoverished were uninsured and couldn’t get access to quality healthcare; the nation’s most impoverished have always been uninsured and unable to access quality healthcare. This is not to say that the poor being uninsured and lacking access to quality healthcare isn’t an important issue, or that this state of affairs isn’t morally questionable – it absolutely is. It’s just that it’s absurd to call conditions we have happily lived under since the country was founded a “crisis.” The real crisis that had insurers and healthcare policy wonks freaking out was this:
Thanks to the Law of Adverse Selection, we have been rapidly approaching a scenario where the middle class cannot afford health insurance or quality healthcare. Healthcare reform wasn’t implemented to make sure the guy living under the bridge in a cardboard box could get health insurance ten years from now. It was needed to make sure that you could.
If that seems heartless, think of the nation’s healthcare system as a luxury cruise ship. For years we’ve been tanning on the Lido deck, blissfully unaware that there are quite a few fellow passengers who have fallen overboard and are drowning. Now, however, the engine room has caught fire and the ship is starting to sink, and for the first time we’ve glanced over the railing and said, “Huh. People are drowning. That’s not good.” We can argue about the moral imperative of pulling them out of the sea vs. the cost savings of letting them drown (or how they’ll never learn to swim if you throw them a line, or whatever your conservative metaphor is), but before we can save anyone, we really, really need to make sure the ship doesn’t go the way of the Titanic.
At any rate, that’s where we’ve been, and that’s where we are now.
Next up in our discussion: the various directions we might choose to go now (with or without Obamacare), the advantages and disadvantages of each, and why anyone on the right or left who tells you we can have everything we’ve ever wanted in a healthcare system without making difficult and possibly heartbreaking sacrifices is either yanking your chain or doesn’t know what they’re talking about.