An Oregon Study Casts Doubt On Whether Health Insurance Improves Health
By Michael Barone
Does having health insurance make people healthier? It's widely assumed that it does.
Obamacare advocates repeatedly said that its expansion of Medicaid would save thousands of lives a year. Obamacare critics seldom challenged the idea that increased insurance coverage would improve at least some people's health.