As the nation grapples with a jobs crisis and unemployment hovers near 9 percent, it is easy for policy makers to forget the plight of those who work but earn very little. There are about 4.4 million workers earning the minimum wage or less, according to government statistics. This amounts to about 6 percent of workers paid by the hour. They need a raise.
Today, a worker laboring 40 hours a week nonstop throughout the year for the federal minimum wage could barely keep a family of two above the federal poverty line. Though it rose to $7.25 an hour in 2009, up $2.10 since 2006, the minimum wage is still lower than it was 30 years ago, after accounting for inflation. It amounts to about $1.50 an hour less, in today’s money, than it did in 1968, when Martin Luther King Jr. and Robert Kennedy were killed, Richard Nixon was elected president and the economy was less than a third of its present size.
The minimum wage has many opponents among big business and Congressional Republicans. In Nevada, the Las Vegas Chamber of Commerce is pushing to repeal the state’s minimum wage, a whopping $8.25 an hour. Representative Darrell Issa, the California Republican, has proposed a bill in the House that would effectively cut the minimum wage in states where it was higher than the federal threshold by allowing employers to count health benefits toward wages.
Opponents argue that raising the minimum wage would inevitably lead to higher unemployment, prompting companies to cut jobs and decamp to cheaper labor markets. It is particularly bad, the argument goes, to raise it in a weak labor market. Yet with unemployment likely to remain painfully high for years to come, this argument amounts to a promise that the working poor will remain poor for a long time.
What’s more, we know now that the argument is grossly overstated. Over the past 15 years, states and cities around the country have rushed ahead of the federal government to impose higher minimum wages. Economists analyzing the impact of the increases on jobs have concluded that moderate increases have no discernible impact on joblessness. Employers did not rush off to cheaper labor markets in the suburbs or across state lines for a simple reason: that costs money too.
The most recent research, by John Schmitt and David Rosnick at the Center for Economic and Policy Research, found that San Francisco’s minimum wage jump to $8.50 in 2004 — well above the state minimum of $6.75 — improved low-wage workers’ incomes and did not kill jobs. An even bigger jump in Santa Fe, N.M., the same year — from $5.15 to $8.50 — had a similar effect.
Despite evidence to the contrary, businesses and Republicans may keep pushing against the minimum wage — using the jobs crisis now to clinch their argument. They should be disregarded, because their argument is wrong and the United States is too rich to tolerate such an underclass.
By: Editorial, The New York Times, March 25, 2011
Florida Gov. Rick Scott is one of the most entertainingly shameless figures in American political life. In the 1990s, Scott headed Columbia/HCA Healthcare, the largest for-profit hospital chain in America. While Scott ran the company, it got involved in a bit — okay, a lot — of fraud. As Forbes reported, the company “increased Medicare billings by exaggerating the seriousness of the illnesses they were treating. It also granted doctors partnerships in company hospitals as a kickback for the doctors referring patients to HCA. In addition, it gave doctors ‘loans’ that were never expected to be paid back, free rent, free office furniture, and free drugs from hospital pharmacies.”
The scale of the fraud was so immense that Columbia/HCA Healthcare ended up paying more than $2 billion (PDF) back to the federal government in the single largest fraud case in history. (The previous record holder? Drexel Burnham.) Scott resigned shortly before the judgment came down.
Today, Scott is enjoying a second act as governor of Florida. And, as Suzy Khimm reports, he doesn’t seem all that chastened. Before running for office, he turned his $62 million stake in Solantic, the urgent-care clinic chain he founded after resigning from Columbia/HCA Healthcare, over to a trust in his wife’s name. Solantic doesn’t take traditional Medicaid, but it does work with the private HMOs that, under a 2005 pilot program, were allowed to contract with Medicaid. And Scott is now pushing a bill that would expand that program across the state, making those HMOs — the ones Solantic works with — the norm for Medicaid.
Asked about the apparent conflict of interest, Scott said, “If you look at everything that I want to accomplish in health care in Florida is basically what I’ve believed all my life. I believe in the principle that if you have more competition it will drive down the prices.” And I believe him. But he could have sold his stake in Solantic when he got into government. Since he didn’t, the fact remains that Scott is pushing a policy his family stands to profit from immensely. Which is, for Scott, real progress. In the 1990s, he made his money off single-payer health-care programs by cheating them. Today, he’s making his money off single-payer health-care programs by running them. No matter how you look at it, it’s a step up.
By: Ezra Klein, The Washington Post, March 25, 2011
The latest technique used by conservatives to silence liberal academics is to demand copies of e-mails and other documents. Attorney General Kenneth Cuccinelli of Virginia tried it last year with a climate-change scientist, and now the Wisconsin Republican Party is doing it to a distinguished historian who dared to criticize the state’s new union-busting law. These demands not only abuse academic freedom, but make the instigators look like petty and medieval inquisitors.
The historian, William Cronon, is the Frederick Jackson Turner and Vilas research professor of history, geography and environmental studies at the University of Wisconsin, and was recently elected president of the American Historical Association. Earlier this month, he was asked to write an Op-Ed article for The Times on the historical context of Gov. Scott Walker’s effort to strip public-employee unions of bargaining rights. While researching the subject, he posted on his blog several critical observations about the powerful network of conservatives working to undermine union rights and disenfranchise Democratic voters in many states.
In particular, he pointed to the American Legislative Exchange Council, a conservative group backed by business interests that circulates draft legislation in every state capital, much of it similar to the Wisconsin law, and all of it unmatched by the left. Two days later, the state Republican Party filed a freedom-of-information request with the university, demanding all of his e-mails containing the words “Republican,” “Scott Walker,” “union,” “rally,” and other such incendiary terms. (The Op-Ed article appeared five days after that.)
The party refuses to say why it wants the messages; Mr. Cronon believes it is hoping to find that he is supporting the recall of Republican state senators, which would be against university policy and which he denies. This is a clear attempt to punish a critic and make other academics think twice before using the freedom of the American university to conduct legitimate research.
Professors are not just ordinary state employees. As J. Harvie Wilkinson III, a conservative federal judge on the Fourth Circuit Court of Appeals, noted in a similar case, state university faculty members are “employed professionally to test ideas and propose solutions, to deepen knowledge and refresh perspectives.” A political fishing expedition through a professor’s files would make it substantially harder to conduct research and communicate openly with colleagues. And it makes the Republican Party appear both vengeful and ridiculous.
By: The New York Times, Editorial, March 25, 2011
One hundred years ago, during the last great American conniption over immigration, the United States government went to unheard-of effort and expense to peer deep into the bubbling melting pot to find out, as this paper put it, “just what is being melted.”
A commission led by Senator William Dillingham, a Republican of Vermont, spent four years and $1 million on the project. Hundreds of researchers crisscrossed the country bearing notebooks and the latest scientific doctrines about race, psychology and anatomy.
They studied immigrants in mining and manufacturing, in prisons and on farms, in charity wards, hospitals and brothels. They drew maps and compared skulls. By 1911, they published the findings in 41 volumes, including a “Dictionary of Races or Peoples,” cataloging the world not by country but by racial pedigree, Abyssinians to Zyrians.
Forty-one volumes, all of it garbage.
The Dillingham Commission is remembered today, if it is remembered at all, as a relic of the age of eugenics, the idea that humanity can be improved through careful breeding, that inferior races muddy the gene pool. In this case, it was the swelling multitudes from southern and eastern Europe — Italians, Russians, Jews, others — who kept America’s Anglo-Saxons up at night.
I pored over the brittle pages of the report recently at the New York Public Library (they are available online). It was a cold plunge back to a time before white people existed — as a generic category, that is. Europeans were a motley lot then. Caucasians could be Aryan, Semitic or Euskaric; Aryans could be Teutonic, Celtic, Slavonic, Iranic or something else. And that was before you got down to Ruthenians and Russians, Dalmatians and Greeks, French and Italians. Subdivisions had subdivisions. And race and physiognomy controlled intelligence and character.
“Ruthenians are still more broadheaded than the Great Russians,” we learn. “This is taken to indicate a greater Tartar (Mongolian) admixture than is found among the latter, probably as does also the smaller nose, more scanty beard, and somewhat darker complexion.” Bohemians “are the most nearly like Western Europeans of all the Slavs.” “Their weight of brain is said to be greater than that of any other people in Europe.”
See if you can identify these types:
A) “cool, deliberate, patient, practical,” “capable of great progress in the political and social organization of modern civilization.”
B) “excitable, impulsive, highly imaginative,” but “having little adaptability to highly organized society.”
C) possessing a “sound, reliable temperament, rugged build and a dense, weather-resistant wiry coat.”
A) is a northern Italian. B) is a southern Italian. C) is a giant schnauzer, according to the American Kennel Club. I threw that in, just for comparison.
The commission had many recommendations: bar the Japanese; set country quotas; enact literacy tests; impose stiff fees to keep out the poor.
These poison seeds bore fruit by the early 1920s, with literacy tests, new restrictions on Asians and permanent quotas by country, all to preserve the Anglo-Saxon national identity that was thought to have existed before 1910.
It’s hard not to feel some gratitude when reading the Dillingham reports. Whatever else our government does wrong, at least it no longer says of Africans: “They are alike in inhabiting hot countries and in belonging to the lowest division of mankind from an evolutionary standpoint.”
But other passages prompt the chill of recognition. Dillingham’s spirit lives on today in Congress and the states, in lawmakers who rail against immigrants as a class of criminals, an invading army spreading disease and social ruin.
Who brandish unlawful status as proof of immigrants’ moral deficiency rather than the bankruptcy of our laws. Who condemn “illegals” but refuse to let anyone become legal. And who forget what generations of assimilation and intermarriage have shown: that today’s scary aliens invariably have American grandchildren who know little and care less about the old country.
It’s no longer acceptable to mention race, but fretting about newcomers’ education, poverty and assimilability is an effective substitute. After 100 years, we’re a better country, but still frightened by old shadows.
By: Lawrence Downes, Editorial Observer, The New York Times, March 25, 2011