If you want to understand how to fix today’s health insurance system, you’d be smart to look first at how it was born. How did Americans end up with a system in which employers pay for our health insurance? After all, they don’t pay for our groceries or our gas.
It turns out there never was any central logic at work. The evolution of the American health care system began in the 1920s, when choices boiled down to which crazy cure you preferred.
Dr. John Brinkley, for instance, was a huge hit on American radio with his health advice shows. For whatever problem folks had, Brinkley had one fabulous solution: transplant a goat gland into your body. He pitched it as perfect for everything from dementia to impotence to flatulence. But if, somehow, a goat gland didn’t cure your ills, you could always use Bonnore’s Electro Magnetic Bathing Fluid or Clark Stanley’s Snake Oil Liniment.
In that era, most medical care in the U.S. was basically medieval — a bunch of potions that did nothing. Luckily, though, they were cheap potions. Health care was a trivial part of the average person’s annual budget. In 1900, the average American spent $5 a year on health care ($100 in today’s money). No one had health insurance, because you don’t need insurance for something that costs $5 a year.
The First Health Insurance
Before the birth of modern medicine, hospitals were poorhouses where the indigent went to die. Then came the advent of effective medicines, especially antibiotics, along with a revolution in medical schools.
Suddenly, says economic historian Melissa Thomasson, “hospitals are marketing themselves as places to have babies.” Thomasson, a professor at Miami University in Ohio, says that in the early part of the 20th century, hospitals were able to focus on happy outcomes.
Health care became much more effective, and much more expensive. Clean hospitals, educated doctors and real pharmacological research cost money. People proved willing to pay for care when they were really sick, but it wasn’t yet common to go for checkups or survivable illnesses.
By the late 1920s, hospitals noticed most of their beds were going empty every night. They wanted to get people who weren’t deathly ill to start coming in.
An official at Baylor University Hospital in Dallas noticed that Americans, on average, were spending more on cosmetics than on medical care. “We spend a dollar or so at a time for cosmetics and do not notice the high cost,” he said. “The ribbon-counter clerk can pay 50 cents, 75 cents or $1 a month, yet it would take about 20 years to set aside [money for] a large hospital bill.”
The Baylor hospital started looking for a way to get regular folks in Dallas to pay for health care the same way they paid for lipstick — a tiny bit each month. Hospital officials started small, offering a deal to a group of public school teachers in Dallas. They offered a plan for the teachers to pay 50 cents each month in exchange for Baylor picking up the tab on hospital visits.
When the Great Depression hit, almost every hospital in the country saw its patient load disappear. The Baylor idea became hugely popular. It eventually got a name: Blue Cross.
“When I actually started studying this stuff, I got interested because I wondered why we have an employer-based system,” Thomasson says. “It comes right out of Blue Cross.” The genius of that approach, she says, was marketing it to groups of workers.
The Modern System Is Born
Soon, Blue Cross coverage was available in almost every state, though not many people bought in. The modern system of getting benefits through a job required another catalyst: World War II. Thomasson says that if the Great Depression inadvertently inspired the spread of employer-based health insurance, World War II accidentally spread the idea everywhere.
“The war economy is an entirely different ballgame,” Thomasson says. The government rationed goods even as factories ramped up production and needed to attract workers. But wartime wage controls meant factory owners couldn’t simply offer higher pay to lure employees. So, she explains, the owners turned to fringe benefits, offering more and more generous health plans.
The next big step in the evolution of health care was also an accident. In 1943, the Internal Revenue Service ruled that employer-based health care should be tax free. A second law, in 1954, made the tax advantages even more attractive.
Thomasson cites the huge impact of those measures on plan participation. “You start from 9 percent of the population in 1940 to 63 percent in 1953,” she says. “Everybody starts getting in on it. It just grows by gangbusters. By the 1960s, 70 percent [of the population] is covered by some kind of private, voluntary health insurance plan.”
Thus employer-based insurance, which started with Blue Cross selling coverage to Texas teachers and spread because of government wage controls and tax breaks, became our system. By the mid-1960s, Thomasson says, Americans started to see that system — in which people with good jobs get health care through work and almost everyone else looks to government — as if it were the natural order of things.
But to Thomasson and other economic historians, there’s nothing natural or inevitable about it. Instead, they see it as the profound result of historical accidents.
The federal government had no permissible federal interest in denying benefits to same-sex couples under the Defense of Marriage Act (DOMA), the 1st U.S. Circuit Court of Appeals in Boston ruled Thursday.
Congress’ effort to “put a thumb on the scales and influence a state’s decision as to how to shape its own marriage laws” should subject the justifications of the law to greater scrutiny, the three-judge panel ruled. Tradition alone, the court said, wasn’t enough of a reason to deny same-sex couples federal benefits.
“For 150 years, this desire to maintain tradition would alone have been justification enough for almost any statute,” the ruling stated. “But Supreme Court decisions in the last fifty years call for closer scrutiny of government action touching upon minority group interests and of federal action in areas of traditional state concern.”
WASHINGTON — The world’s air has reached what scientists call a troubling new milestone for carbon dioxide, the main global warming pollutant.
Monitoring stations across the Arctic this spring are measuring more than 400 parts per million of the heat-trapping gas in the atmosphere. The number isn’t quite a surprise, because it’s been rising at an accelerating pace. Years ago, it passed the 350 ppm mark that many scientists say is the highest safe level for carbon dioxide. It now stands globally at 395 ppm.
So far, only the Arctic has reached that 400 level, but the rest of the world will follow soon.
“The fact that it’s 400 is significant,” said Jim Butler, global monitoring director at the National Oceanic and Atmospheric Administration’s Earth System Research Lab in Boulder, Colo. “It’s just a reminder to everybody that we haven’t fixed this and we’re still in trouble.”
Carbon dioxide is the chief greenhouse gas and stays in the atmosphere for 100 years. Some carbon dioxide is natural, mainly from decomposing dead plants and animals. Before the Industrial Age, levels were around 275 parts per million.
This forgotten crisis is called biodiversity loss.
Biodiversity is a contraction of two words—biological, referring to life; and diversity, meaning variety.
Biodiversity is the variety of life on Earth. It encompasses all life forms, from the smallest microorganism to the biggest whale. Biodiversity is the web of life that includes the full-range of ecosystems, the species that live in them, and the genetic variety of those species produced by nature or shaped by humans.
[Conservative policymakers] agreed on a new tax plan that will sharply cut income taxes for wealthy state residents while at the same time raising taxes on the poor. The result, predictably, will be a shortfall in state revenue that will undoubtedly force additional cuts to state services.
The Center on Budget and Policy Priorities provides the analysis, but you don’t have to trust the left-leaning think tank for the spin. A newly formed group of retired Kansas Republican legislators is also declaring that enough is enough. The bottom line is this: If you’re wealthy enough and smart enough to structure your business affairs correctly, you can avoid both corporate taxes and income taxes. But if you’re poor, you will have to choose between the Earned Income Tax Credit and a state-funded rebate on sales taxes charged on groceries. One or the other! Not both! Because if there is a tax loophole that favors working-class Americans, we’d better close it!
“The study concludes that media sources have a significant impact on the number of questions that people were able to answer correctly,” wrote Cassino and his colleagues. “The largest effect is that of Fox News: all else being equal, someone who watched only Fox News would be expected to answer just 1.04 domestic questions correctly—a figure which is significantly worse than if they had reported watching no media at all. On the other hand, if they listened only to NPR, they would be expected to answer 1.51 questions correctly.”
This should come as no surprise if you follow Fox. Consider some recent history. Fox and Friends host Steve Doocy invented a quotation from President Obama out of thin air. He falsely claimed that Obama had said he and Michelle were not born with silver spoons in their mouths “unlike some people,” in reference to Mitt Romney’s privileged upbringing. In fact, Obama did not say “unlike some people,” and he has been using the silver spoon line for years. Several other news outlets repeated Doocy’s assertion as fact, and Doocy initially avoided correcting the record after it was revealed he was wrong. Eventually he admitted that he “seemed to misquote” Obama, instead of stating that he did, in fact, misquote him. And he did not apologize for the error.