Should America just get rid of health insurers?
Americans don't seem to like insurance companies much, even when they're happy with their health coverage itself. Several Democratic health-care proposals would end private health insurance entirely, an idea that some Americans seem to applaud until they're told it would entail higher taxes and a government-run system, at which point they sometimes, though not always, become less sanguine.