Looking back: How US healthcare history shapes what lies ahead
A look at how historical trends affect product adoption in healthcare.

John Payne November 6, 2017

This is the first in a three-part series looking at forces that will affect healthcare user experience (HcUX) and the work of healthcare designers in 2018.

Any innovator looking to bring new products and services to a market needs to understand how current attitudes and beliefs will affect willingness to adopt. In politics, this idea is called the Overton Window, or “the range of ideas or opinions that the public is willing to accept.” It is generally expressed on a spectrum from more to less conservative. Ideas inside this window of acceptability are mainstream, and those outside can be seen as fringe. One of the functions of a political campaign—or a designer trying to craft an innovative product or service—is to align to this window, or attempt to alter the window’s boundaries to better suit their goals.

In the United States, our government-subsidized employer-based health insurance system is a great example of a mainstream idea, firmly established in the window of acceptability. This system has been in place for generations; it has become the mental model of how healthcare works. The average US healthcare consumer isn’t accustomed to shopping for coverage or evaluating different products and services to address their ailments. They are unaware of the costs incurred—for themselves or their insurer—until after a doctor visit. They accept the reality that personal choices are limited to options available in the plan offered by an employer. The 75-year history of our insurance system has led Americans to think of it as traditional, and proposed changes—even those that benefit them directly—often fall outside the Overton Window.

Looked at through the lens of history, however, the health insurance system we now take for granted in the US came about through an unlikely confluence of tax regulation, Depression-era business ideas, and wartime economic policy.

A look back

At the beginning of the last century, healthcare wasn’t the big industry it is today. The average US citizen spent just $5 a year on healthcare services. It wasn’t a significant part of their lives. But thankfully, medical technology advanced, bringing with it better health, longevity, and even improved peace of mind.

As medical technology advanced, the industry grew. Healthcare business leaders, eager to bring their innovations to the public, sought to capitalize on their success. To ensure that American people would benefit from each new discovery, the industry needed to scale. The best way to grow was to find new revenue streams.

“Before the birth of modern medicine, hospitals were poorhouses where the indigent went to die. Then came the advent of effective medicines… Healthcare became much more effective, and much more expensive. Clean hospitals, educated doctors, and real pharmacological research cost money.”

— “Accidents Of History Created U.S. Health System,” NPR

The search for sustainable growth meant that the industry needed a steady stream of customers—or at least a steady income it could rely on to develop ever-better treatments. Three key milestones heralded the government-subsidized, employer-based health insurance system we have today: first, a new business model that smoothed out revenue streams; second, a mechanism for widespread adoption; and third, government endorsement that incentivized this new model.

1. Prepaid healthcare

In 1929, Blue Cross was invented by administrators at Baylor University Hospital in Dallas, and the first “health insurance” was offered as an experiment. The idea was to entice a larger pool of people to access healthcare services more frequently. This new model began by offering public school teachers the opportunity to pay a small amount every month in return for routine wellness care and coverage for the expensive bill that might result from an unexpected health problem.

2. Employee benefits

During World War II, there was a severe labor shortage as many workers joined the military. Fearing skyrocketing wages and uncontrolled inflation, President Roosevelt signed an executive order that froze pay in place. Many companies then turned to innovative new benefits like health insurance as a way to entice workers to join their ranks.

3. Tax exemptions

Shortly after FDR’s wage freeze, the Internal Revenue Service made employer-based health insurance exempt from taxation to the employee, endorsing this new model with government subsidies and providing a convenient tax shelter for the increasing rolls of US taxpayers. As more and more people took advantage of this model, the percentage of Americans with health insurance rose from 9% to 50% between 1940 and 1950.

The ACA has brought healthcare coverage to an all-time high, ensuring that over 90% of Americans have health coverage. But if you were starting from scratch, would you design a system that puts a burden on employers to administer health benefits and endows them with quasi-parental rights, allowing them to choose what healthcare options are available to their employees? In an era where changing jobs every few years is the norm, is it still to anyone’s benefit to require a change of healthcare coverage each time employment changes? Unfortunately, when alternative payment models spur innovation in healthcare delivery, new approaches like digital therapeutics and telemedicine challenge the public’s mental model of how healthcare ought to work. When healthcare reform like the Affordable Care Act seeks to give our healthcare system a 21st-century update by expanding coverage, requiring insurers to cover pre-existing conditions, and providing an alternative to employer-based care, the Overton Window is the benchmark by which these reforms are measured.

The average American still sees health insurance as a financial benefit of employment and a way to keep more of their hard-earned wages. Through their choices of health plan, employers are the decision-makers about which healthcare innovations their employees are exposed to. Even for a genuinely effective new product or service, it can be an uphill battle to overcome the public’s existing mental model and achieve adoption in this convoluted system.

How does this history lesson help us design today’s products and services? To define what lies ahead, it is critical to understand how and why healthcare has evolved into the complex system we have today. To bring this back to the concerns of the healthcare user experience designer: if we want to create successful products and services that truly fit the needs of the healthcare consumer, it is imperative to understand how generations of shared history shape what the patient, provider, and employer will deem desirable, or even acceptable, now and in the future.

This is the first essay in a three-part series looking at forces that will affect the work of healthcare designers in 2018. My next post will outline the current landscape and what it might mean for patients, providers, and the designers who focus on HcUX.

