The role of government in the U.S. health care system was contentious long before the recent emergence of Medicare-For-All proposals among Democratic presidential candidates. Advocates of so-called free-market health care have long described government intervention as “un-American” and “socialist.” Their argument can perhaps best be summarized in the phrase: “get government out and let markets work in health care.”

Yet a closer look at the development of the U.S. health care system paints a starkly different picture. Publicly owned hospitals – that is, hospitals run by local, state, and federal governments – have played an important and substantive role throughout the country’s history. Government has always been extensively involved in the provision of health care in one form or another. And, to the surprise of many today, events could have taken a very different turn: the U.S. could even have ended up with a British-style, government-run health care system. Instead, the country went a different route, and public hospitals have been closing in large numbers since the 1960s. How come?

In my recent academic paper on the subject, I analyzed the creation and closure of public hospitals in California, the state with one of the most extensive public hospital systems in the nation. My findings indicate that when state and federal governments extended health coverage through programs like Medicaid and Medicare, all but the most well-resourced local governments began closing their hospitals.

These findings bear implications for policy debates today. Advocates of any large-scale health reform effort, such as Medicare-For-All, should be mindful of the unintended side effects it may trigger.