Megan McArdle asks how many people die from lack of health insurance, and finds that it's hard to say, but that the number is probably lower than many of the figures that get thrown around:
Friday, February 12, 2010 :::
To give you an example of what I mean, one of the two studies that went into the most commonly cited number--the roughly 20,000 a year figure from the Institute of Medicine and the Urban Institute--found that the highest mortality was not associated with being uninsured, but with being on a government health care program (the other study excluded those patients). This was true even after they'd run all their controls. Given that the bulk of the coverage expansion in both the Senate and the House plans comes from Medicaid expansion, this is a little disturbing.

You could raise this with friends who support the Democrats' bills, but if you want to be intellectually honest, you should continue.
But how likely is it that Medicaid is killing people? Possible, I suppose, but not really all that likely. Medicaid and Medicare patients, too, are not like the broader population. The authors in fact recognized this in their paper, pointing out that these patients have higher rates of disability--but then failed to address the obvious question this raised about their data on the uninsured.

Her main point, in fact, is that almost all* the confounding factors, including those that no researcher thinks to correct for, will bias the number upward, because uninsured people are disproportionately higher risk in a lot of other ways.
To my mind probably the single most solid piece of evidence is this: turning 65--i.e., going on Medicare--doesn't reduce your risk of dying. If lack of insurance leads to death, then that should show up as a discontinuity in the mortality rate around the age of 65. It doesn't. There are some caveats--if the effects are sufficiently long term, then they're hard to measure, because of course as elderly people age, their mortality rate starts rising dramatically. But still, there should be some kink in the curve, and in the best data we have, it just isn't there.

Life expectancy in the US, incidentally, does improve relative to other countries as age increases, though it does so smoothly; life expectancy for octogenarians is higher in the U.S. than in any other country on the planet. I don't know whether I've presented the following thought experiment here before, but it probably won't hurt too badly if I'm repeating myself.
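The "kink in the curve" test can be sketched as a toy regression-discontinuity check. Everything below is invented for illustration--a stylized Gompertz-type mortality curve with made-up parameters, not real data:

```python
import math

# Hypothetical Gompertz-style mortality curves; all parameters are
# invented for illustration, not fitted to real data.
def smooth_rate(age):
    """Annual death rate rising smoothly with age -- no effect at 65."""
    return 1e-4 * math.exp(0.085 * age)

def protective_rate(age):
    """Same curve, but with a 10% drop at 65 (as if gaining Medicare
    coverage immediately cut mortality)."""
    r = smooth_rate(age)
    return r * 0.9 if age >= 65 else r

def discontinuity(rate, at=65.0, h=0.1):
    """Crude discontinuity check: compare the rate just above the
    cutoff to the smooth trend extrapolated from just below it."""
    below = rate(at - h)
    trend = below * math.exp(0.085 * 2 * h)  # extrapolate the smooth slope
    above = rate(at + h)
    return (above - trend) / trend           # relative jump at the cutoff

print(f"smooth curve jump:     {discontinuity(smooth_rate):+.3f}")
print(f"protective curve jump: {discontinuity(protective_rate):+.3f}")
```

On the smooth curve the estimated jump at 65 is zero; a genuine protective effect of going on Medicare would show up as a negative jump. The point in the text is that real mortality data look like the first case, not the second.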
Consider two countries next to each other on an island. One is wealthy and spends much of its money on the best health care in the world; it also spends some of that money on decent education and safety, and has at least some cultural norms favoring attention to the healthfulness of food and to exercise. The other country is considerably poorer, comparable to the wealthier developing countries, and has a health care system to match, along with poorer education and somewhat poorer eating habits. A researcher decides to look at "regional data"--aggregating for the entire island. What would be observed?
Overall spending will be high, pulled up by that first country. So will overall wealth; note that a country with two-thirds the per-capita GDP of a rich country is, by international standards, still a rich country, and this first country is among the very richest. Overall life expectancy will be sort of middling, but as you look at older and older cohorts, they consist more and more of residents of the first country, so the aggregate result looks better and better; the island might (hypothetically) rank around 35th in the world in overall life expectancy, but 12th at age 65 and 1st at age 80. If you look at life expectancy at a young age, it will look much worse than would be extrapolated from the overall level of spending.
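The composition effect behind those hypothetical rankings can be worked out with a few lines of arithmetic. All the population shares, survival rates, and life expectancies below are invented for the thought experiment, not real statistics:

```python
# Hypothetical island: each country gets a population share at birth,
# survival fractions to 65 and 80, and remaining life expectancy at
# birth / 65 / 80.  Every number here is made up for illustration.
countries = {
    "rich": dict(pop=0.5, s65=0.85, s80=0.55, e0=80.0, e65=20.0, e80=7.0 + 3.0),
    "poor": dict(pop=0.5, s65=0.60, s80=0.25, e0=70.0, e65=15.0, e80=7.0),
}

def aggregate(age_key, surv_key=None):
    """Population-weighted average; at older ages the weights are the
    survivors, so the rich country's share of the cohort grows."""
    weights = {k: c["pop"] * (c[surv_key] if surv_key else 1.0)
               for k, c in countries.items()}
    total = sum(weights.values())
    return sum(weights[k] * countries[k][age_key] for k in countries) / total

rich_share_at_80 = (countries["rich"]["pop"] * countries["rich"]["s80"]) / sum(
    c["pop"] * c["s80"] for c in countries.values())

print(f"aggregate life expectancy at birth: {aggregate('e0'):.1f}")
print(f"aggregate remaining years at 65:    {aggregate('e65', 's65'):.1f}")
print(f"aggregate remaining years at 80:    {aggregate('e80', 's80'):.1f}")
print(f"rich country's share of 80-year-olds: {rich_share_at_80:.0%}")
```

Aggregate life expectancy at birth sits exactly midway between the two countries, but by age 80 the surviving cohort is mostly residents of the rich country, so the aggregate figure creeps toward the rich country's own number--the pattern the thought experiment describes.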
There are a lot of situations in which aggregate statistics obscure important heterogeneity, especially in economics; I think it's rarely as significant in interpreting data, though, as in U.S. health care statistics.
* I don't have data on this, but I suspect age would be an exception; I have the impression, especially in the handful of states like New York with "community rating", that young people are more likely to be uninsured than old people, especially adjusting for other factors. In fact, with "community rating", adverse selection is going to be stronger, and might bias the numbers downward (though not necessarily by more than other confounding factors bias it upward). I wonder whether a study of community rating states versus others would show something useful.
I should add that, in a lot of insurance markets, it turns out that adverse selection is overwhelmed by the tendency for cautious people to place a higher value on insurance, just as they do on other means of reducing their risk exposure, so that it's quite possible that the selection effect in general causes the insured to fare better than the uninsured. I would still think this should be less true in community rated states than others, though.
Labels: Health Care
::: posted by dWj at 3:29 PM