By Nathan Rothwell
As the remaining provisions of the Affordable Care Act are phased in over the next several years, the United States may be forced to rethink how health insurance is written and delivered.
The United States is unique among industrialized Western nations in that most of its citizens obtain coverage on the private market, primarily through their employers. According to recent statistics, 58% of working Americans under the age of 65 get their health insurance through their employers. Although that share has declined steadily in recent years, the decades-old pattern still holds: a majority of working-age Americans rely on their employers for coverage.
A quick history lesson explains why. Health insurance policies first appeared in the United States in the early 20th century, but World War II brought a dramatic rise in employer-sponsored health plans. Wage and price freezes had been imposed to tightly regulate the American wartime economy, making it difficult for businesses to attract new workers to replace those who had gone off to war. Fringe benefits such as sick leave and employer-sponsored health insurance, however, were not subject to the wage freeze, so employers could offer these benefits in lieu of additional pay.
By 1958, this system had allowed 75% of Americans to obtain some form of health insurance. For better or worse, the practice of obtaining health insurance as an employment benefit has persisted to the present day. While the system is often lauded as a motivator for Americans to find and keep jobs, it does nothing to address the needs of those who are unable to work, namely the sick and the elderly. Even poor Americans who are able to work often either cannot obtain insurance through an employer or are required to contribute toward group insurance premiums, which can be quite high for those with chronic health conditions.