mykeystrokes.com

"Do or Do not. There is no try."

“Forgetting What Religion Is About”: When Did ‘Dependence’ Become A Dirty Word?

Too many Americans—including Christians—are afraid that helping the poor will create ‘dependency.’ They’re forgetting that’s what religion is all about.

Not long ago, I preached a Lenten sermon in which I made a lone reference to food stamps as being one of the ways we “love our neighbors as ourselves.” Judging from the reactions of a few congregants, you might have thought it was all I preached about. They went out of their way to tell me how such programs “breed” complacency, laziness, and—wait for it—dependency.

It reminded me of Rep. Paul Ryan, who’s always carrying on about America’s “culture of dependency,” and just released a major budget proposal that would slash food stamps and other government measures that relieve the misery of the poorest Americans.

When did “dependence” become such a dirty word? We list our children on our income tax forms as “dependents” without stigmatizing them by such a designation. So why does “dependent” become an accusation when applied to other people’s children when they are in need of food stamp (SNAP) assistance, a free-school-lunch program, or housing assistance to rescue them from being homeless? Why is it wrong for someone blind, disabled, or elderly and frail to be “dependent” upon the society in which he or she lives for the basic necessities, when it is impossible for that person to provide for him- or herself?

And besides, it’s far from clear that a “culture of dependency” is what America has—in fact, we have something like the opposite. Independence may well be the modern-day Golden Calf that far too many of us bow down to and worship. Independence is bound up in our national identity, both personal and corporate. After all, next to our Constitution, it is the Declaration of Independence to which we most often appeal. The rugged individualism which in many ways helped make our nation what it is may also be what is causing us to lose our sense of the common good.

The establishment of a social safety net is the most profoundly religious action a government can take. An underlying principle of the Judeo-Christian faith—indeed of most faith communities—is that God will judge humankind by the way we care for the most vulnerable in our midst. Think of all the people in the world we generally revere: Dr. Martin Luther King, Jr., Gandhi, Clara Barton, Nelson Mandela, Dorothy Day, Albert Schweitzer, Dag Hammarskjold, Mother Teresa. All of them, in one way or another, reached out to the poor, the disenfranchised, and the marginalized, seeking to ease their pain and help bear their burdens.

When a government sets out to seek the common good, it realizes that there will be some among us who are less able to meet all their needs, chief among them housing, food and safety. And it’s not just a few of us who find ourselves in need at some point: as Mark Rank wrote on the New York Times’ Opinionator, “nearly 40 percent of Americans between the ages of 25 and 60 will experience at least one year below the official poverty line during that period ($23,492 for a family of four), and 54 percent will spend a year in poverty or near poverty (below 150 percent of the poverty line).”

Are there undeserving, even fraudulent people receiving welfare/food/housing assistance? Undoubtedly. But as a citizen of this great nation, I am willing to fund the undeserving few who slip by unnoticed and game the system, in order to provide for the many who are truly in need. Many of our national and state legislators seem to want to use the excuse of the undeserving few to gut the social safety net altogether, and by so doing, punish the many who are in real need.

In fact, most of the people who avail themselves of the government’s (in other words, our) social safety net are indeed dependent. Some of them will remain so: children (45 percent), the disabled, and the elderly (20 percent). Many more will remain so until we get serious about offering them the kind of assistance which might lift them out of poverty, like raising the minimum wage.

In 2012, 47 percent of people who received food stamp assistance were in families where at least one person was working. These so-called “working poor” are not lying around in Paul Ryan’s imagined hammock of ease, living off others’ hard work and generally having a grand time of it. They are working one or more jobs, and because of part-time work or low wages and extreme needs, are still not able to provide adequate food and shelter for themselves and their families. Politicians who claim to be “helping” poor people by depriving them of aid are either ignorant or cruel.

For Christians are called to care for our neighbors. Telling the Good Samaritan story, Jesus teaches that all people are our neighbors. And as for a few “getting away with murder,” Jesus reminds his followers that it rains on the just and the unjust alike, and that God will sort it all out in the end. Jews, Muslims, Christians, and followers of nearly every religion believe in helping those in need. So do most humanists and atheists. We are called to respect the dignity of every human being. And yet, we witness professed Christians like Paul Ryan putting forward budgets that would eviscerate our common safety net.

It’s time religious people stood up and laid claim to their desire and responsibility to care for the poor. It’s time to withdraw the stigma and condemnation from those who by necessity must be “dependent” on the rest of us. It should be our joy to serve them.


By: V. Gene Robinson, Senior Fellow at the Center for American Progress, Washington, DC, and the Retired IX Episcopal Bishop of New Hampshire; Published in The Daily Beast, April 4, 2014

April 7, 2014 | Poor and Low Income, Poverty, Religion

“Blowing Away The Smoke”: A Democrat-Sponsored Tax Cut Calls The GOP’s Anti-Poverty Bluff

For months now, as congressional Republicans have blocked repeated attempts to extend benefits to the long-term unemployed, as they’ve fought to deny low-income Americans access to health insurance, as they’ve advocated to cut tens of billions from the food stamp program, as they’ve resisted proposals to raise the minimum wage, they have simultaneously professed their commitment to American workers and the poor.

Senator Patty Murray put forth a new test of that commitment on Wednesday, by introducing legislation to expand the Earned Income Tax Credit. The EITC is already one of the largest and most effective anti-poverty programs, rewarding low-wage earners for their work and lightening their tax burden. It’s also one of the very few specific anti-poverty policies Republicans have praised in recent months.

Murray’s bill, the “21st Century Worker Tax Cut Act,” would increase the maximum credit for childless adults and create a new tax deduction for families with two working parents. It’s intended to complement the Democrats’ campaign for a higher minimum wage, and to force Republicans to take a real stand on help for American workers. Given their recent nods towards the EITC, one might reasonably expect Republicans to consider Murray’s proposal seriously. (President Obama also proposed an EITC expansion in his budget for 2015.) Even the tax loopholes Murray proposes closing in order to pay for the expansion have already been singled out for elimination by the Republican chairman of the House Ways and Means Committee, Dave Camp. But these are not reasonable times.

The Republicans’ recent expressions of support for expanding the EITC have always seemed more opportunistic than sincere. Rather than actively working to extend the credit to more Americans, the GOP instead uses the EITC as “a protective shield against populist attacks,” as Jonathan Chait put it; specifically, as a counterpoint to calls from the left to raise the minimum wage.

“The minimum wage makes it more expensive for employers to hire low-skilled workers, but the EITC, on the other hand, gives workers a boost—without hurting their prospects,” Representative Paul Ryan said of the EITC in a January speech at the Brookings Institution. “It gives families flexibility—it helps them take ownership of their lives.”

Conservative pundits and academics have taken a similar line. Two economists at the American Enterprise Institute argued last year that “expanding the earned income tax credit is a much more efficient way to fight poverty than increasing the minimum wage.” Steve Moore of the Heritage Foundation argued in favor of a higher EITC in January, as did former Bush advisor Glenn Hubbard. Another former Bush advisor, Harvard economist Gregory Mankiw, wrote recently that the EITC was “distinctly better” than raising the minimum wage because the costs are borne by taxpayers rather than employers.

In his own much-hyped poverty speech in January, Senator Marco Rubio advocated for replacing the EITC with a “federal wage enhancement” subsidy. The vague contours of the alternative he proposed suggested that what he had in mind was nearly identical to the EITC, but with more support for people without kids.

Rubio was right to point out that one of the major shortcomings of the current EITC is that it offers minimal assistance to childless workers. As the program operates now, people without children who are under 25 are ineligible, and the maximum credit for those between 25 and 64 is $487. Families with children receive more substantial benefits. In 2011, their average credit was $2,905.

Murray’s bill addresses Rubio’s professed concern for childless workers by lowering the eligibility age to 21 and raising the maximum credit for childless workers to about $1,400. Those changes would benefit thirteen million people, according to a Treasury Department estimate. The legislation also increases support for families with two working parents by allowing a secondary earner to deduct twenty percent of their income from their federal taxes. This could offset childcare, transportation, and other costs associated with entering the workforce, thus encouraging more stay-at-home parents to find jobs. More than seven million families would benefit from this new deduction, according to the Joint Committee on Taxation.

The bill also doubles the penalties for taxpayers who fail to comply with the Internal Revenue Service’s “due diligence” requirement, a reform that addresses Republican concerns about the costs of improper claims.

If Republicans really wanted to use the EITC as a vehicle for boosting low wages, this legislation provides an excellent starting point for negotiation. But they’re unlikely to engage with it seriously, because their lauding of the EITC was never serious to begin with. For example, Rubio’s proposal to expand the credit for childless workers would have been accomplished by taking money away from workers with kids, instead of by increasing the size of the program overall.

Republicans will face a tricky situation if Harry Reid brings Murray’s bill up for a vote in the Senate. “If Republicans aren’t interested in supporting this bill, they’ll need to explain why they are rejecting the alternative that they have often pointed to in order to justify opposing raising the minimum wage,” a senior Democratic aide told The Nation.

If recent votes on unemployment insurance are any indication, Republicans are far more likely to risk hypocrisy and find reasons to kill the bill than do any real governing, even on policies they profess to support. If a vote doesn’t accomplish much for low-wage workers, it may at least blow away some of the smoke from the GOP’s show.


By: Zoe Carpenter, The Nation, March 26, 2014

March 28, 2014 | Earned Income Tax Credit, GOP, Poor and Low Income

“Paul Ryan Is Victim-Blaming Men Now”: No, Men Don’t Lack A “Culture Of Work”, They Lack Decent Jobs

Last week Paul Ryan provoked an outcry when he claimed that poverty in America was in large part a product of a “tailspin of culture, in our inner cities in particular, of men not working, just generations of men not even thinking of working, or learning the value and the culture of work.” Ever since the heyday of Ronald Reagan, the phrase “inner city” has been criticized as a GOP dog whistle for “black people,” so Ryan has rightly faced a backlash for his comments. (While claiming they were “inarticulate,” he insists his comments had “nothing to do with race whatsoever.”)

But another aspect of this much-remarked-on incident has drawn no notice: his focus on inner city men. Ryan’s comments seem to be based on an unstated assumption that what he calls the “culture of work” is especially relevant to men.

That assumption in turn is a product of an increasingly anachronistic and indeed reactionary world view, in which working for money is the epitome of what it means to be a man. More precisely, to be a man, on this view, is to work a “real job” — that is, a job that at least pays enough to allow him to be the provider, the breadwinner, for his family.

Ryan’s inner city men, who have never “learned the value and the culture of work,” are therefore not merely failing, but failing specifically as men, by failing to provide for their families.

The problem with this neat little morality tale is captured by what ought to be some startling statistics. Note that another unstated assumption behind comments such as Ryan’s is that the American economy actually produces enough decent-paying jobs to allow a reasonable number of Americans to have such jobs, as long as they embrace “the culture of work.”

To say this isn’t the case is an understatement. What is a “good” job, financially speaking? One which pays $50,000 per year? $40,000? $30,000? The latter figure, which works out to take-home pay of less than $2,000 per month and is only about twice the minimum wage (which has itself declined sharply in real terms since the 1960s), is an extremely generous definition of a decent-paying job.

But let’s use it anyway, to determine how many Americans of working age have such jobs. If we make a couple more unrealistically optimistic assumptions — that nobody under 18 or over 69 is working, and that no one has more than one job — the answer is: three out of 10.

Nearly 70 percent of American working-age adults do not have jobs that pay at least $30,000 per year, because there are only three such jobs for every 10 American adults between the ages of 18 and 69. In other words, the vast majority of working age Americans cannot possibly acquire decent-paying jobs, even if one defines a decent-paying job extremely broadly, because there aren’t nearly enough such jobs, not because people fail to embrace “the culture of work.”

Here’s another statistic that those who embrace the culture of math will find relevant to Ryan’s claims that inner city men in particular are poor because they have a bad attitude toward gainful employment: the labor force participation rate. This is the percentage of non-institutionalized adults who are either employed or actively seeking work.

The year Paul Ryan’s father reached working age (1948), 86 percent of American men, but only 32 percent of American women, were participating in the labor force. (A large portion of women who worked outside the home were poor, usually non-white, domestic workers. It was fairly unusual for a white middle-class woman over 30 to work for income.)

Since then, the labor force participation rate among men has declined by 18 percent, while the rate among women has nearly doubled. Another consequence of this social shift is that most men make less money than they did 40 years ago, even though the country as a whole is vastly wealthier: for 60 percent of men, real wages are actually lower now than they were in 1973.

Republicans love to talk about the wisdom of the free market in general and the irresistible laws of supply and demand in particular, but Ryan (who is currently touted as his party’s economic whiz kid) seems to be failing Econ 101. Poverty in America has nothing to do with the shiftless “inner city” men haunting Paul Ryan’s all-too-vivid imagination, and everything to do with the fact that seven out of 10 American adults of working age can’t get a decent-paying job, because those jobs don’t exist.

In a culture in which it’s now assumed that every non-elderly adult who isn’t a full-time student or the primary caretaker of small children should be working for wages, this fact has especially devastating consequences for precisely those men whose plight Ryan addressed in such an “inarticulate” way.


By: Paul Campos, The Week, March 19, 2014

March 21, 2014 | Jobs, Paul Ryan, Poverty

“Don’t Buy It”: The “Paid-What-You’re-Worth” Myth

It’s often assumed that people are paid what they’re worth. According to this logic, minimum wage workers aren’t worth more than the $7.25 an hour they now receive. If they were worth more, they’d earn more. Any attempt to force employers to pay them more will only kill jobs.

According to this same logic, CEOs of big companies are worth their giant compensation packages, now averaging 300 times the pay of the typical American worker. They must be worth it or they wouldn’t be paid this much. Any attempt to limit their pay is fruitless because their pay will only take some other form.

“Paid-what-you’re-worth” is a dangerous myth.

Fifty years ago, when General Motors was the largest employer in America, the typical GM worker got paid $35 an hour in today’s dollars. Today, America’s largest employer is Walmart, and the typical Walmart worker earns $8.80 an hour.

Does this mean the typical GM employee a half-century ago was worth four times what today’s typical Walmart employee is worth? Not at all. Yes, that GM worker helped produce cars rather than retail sales. But he wasn’t much better educated or even that much more productive. He often hadn’t graduated from high school. And he worked on a slow-moving assembly line. Today’s Walmart worker is surrounded by digital gadgets — mobile inventory controls, instant checkout devices, retail search engines — making him or her quite productive.

The real difference is the GM worker a half-century ago had a strong union behind him that summoned the collective bargaining power of all autoworkers to get a substantial share of company revenues for its members. And because more than a third of workers across America belonged to a labor union, the bargains those unions struck with employers raised the wages and benefits of non-unionized workers as well. Non-union firms knew they’d be unionized if they didn’t come close to matching the union contracts.

Today’s Walmart workers don’t have a union to negotiate a better deal. They’re on their own. And because fewer than 7 percent of today’s private-sector workers are unionized, non-union employers across America don’t have to match union contracts. This puts unionized firms at a competitive disadvantage. The result has been a race to the bottom.

By the same token, today’s CEOs don’t rake in 300 times the pay of average workers because they’re “worth” it. They get these humongous pay packages because they appoint the compensation committees on their boards that decide executive pay. Or their boards don’t want to be seen by investors as having hired a “second-string” CEO who’s paid less than the CEOs of their major competitors. Either way, the result has been a race to the top.

If you still believe people are paid what they’re worth, take a look at Wall Street bonuses. Last year’s average bonus was up 15 percent over the year before, to more than $164,000. It was the largest average Wall Street bonus since the 2008 financial crisis and the third highest on record, according to New York’s state comptroller. Remember, we’re talking bonuses, above and beyond salaries.

All told, the Street paid out a whopping $26.7 billion in bonuses last year.

Are Wall Street bankers really worth it? Not if you figure in the hidden subsidy flowing to the big Wall Street banks that ever since the bailout of 2008 have been considered too big to fail.

People who park their savings in these banks accept a lower interest rate on deposits or loans to them than they require from America’s smaller banks. That’s because smaller banks are riskier places to park money. Unlike the big banks, the smaller ones won’t be bailed out if they get into trouble.

This hidden subsidy gives Wall Street banks a competitive advantage over the smaller banks, which means Wall Street makes more money. And as their profits grow, the big banks keep getting bigger.

How large is this hidden subsidy? Two researchers, Kenichi Ueda of the International Monetary Fund and Beatrice Weder di Mauro of the University of Mainz, have calculated it’s about eight tenths of a percentage point.

This may not sound like much, but multiply it by the total amount of money parked in the ten biggest Wall Street banks and you get a huge amount — roughly $83 billion a year.

Recall that the Street paid out $26.7 billion in bonuses last year. You don’t have to be a rocket scientist or even a Wall Street banker to see that the hidden subsidy the Wall Street banks enjoy because they’re too big to fail is about three times what Wall Street paid out in bonuses.

Without the subsidy, no bonus pool.
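
For readers who want to check the back-of-the-envelope math, here is a minimal sketch in Python using only the figures cited above (the roughly 0.8-percentage-point funding advantage, the $83 billion subsidy estimate, and the $26.7 billion bonus pool). The implied size of the banks’ funding base is an inference from those numbers, not a figure reported in the piece.

```python
# Rough check of the back-of-the-envelope arithmetic described above.
# Only the subsidy rate, the $83 billion estimate, and the bonus pool
# come from the article; the funding base is inferred from them.

subsidy_rate = 0.008      # ~0.8 percentage points (Ueda and Weder di Mauro)
annual_subsidy = 83e9     # ~$83 billion a year, as cited above
bonus_pool = 26.7e9       # Wall Street bonuses paid out last year

# Funding base implied by the two figures above (an inference, not a reported number)
implied_funding_base = annual_subsidy / subsidy_rate
print(f"Implied money parked in the ten biggest banks: ~${implied_funding_base / 1e12:.1f} trillion")

# How the hidden subsidy compares with the bonus pool
print(f"Subsidy is roughly {annual_subsidy / bonus_pool:.1f} times the bonus pool")
```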

By the way, the lion’s share of that subsidy ($64 billion a year) goes to the top five banks — JPMorgan, Bank of America, Citigroup, Wells Fargo, and Goldman Sachs. This amount just about equals these banks’ typical annual profits. In other words, take away the subsidy and not only does the bonus pool disappear, but so do all the profits.

The reason Wall Street bankers got fat paychecks plus a total of $26.7 billion in bonuses last year wasn’t because they worked so much harder or were so much more clever or insightful than most other Americans. They cleaned up because they happen to work in institutions — big Wall Street banks — that hold a privileged place in the American political economy.

And why, exactly, do these institutions continue to have such privileges? Why hasn’t Congress used the antitrust laws to cut them down to size so they’re not too big to fail, or at least taxed away their hidden subsidy (which, after all, results from their taxpayer-financed bailout)?

Perhaps it’s because Wall Street also accounts for a large proportion of campaign donations to major candidates of both parties for Congress and the presidency.

America’s low-wage workers don’t have privileged positions. They work very hard — many holding down two or more jobs. But they can’t afford to make major campaign contributions and they have no political clout.

According to the Institute for Policy Studies, the $26.7 billion of bonuses Wall Street banks paid out last year would be enough to more than double the pay of every one of America’s 1,085,000 full-time minimum wage workers.

The remainder of the $83 billion of hidden subsidy going to those same banks would almost be enough to double what the government now provides low-wage workers in the form of wage subsidies under the Earned Income Tax Credit.
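
A similar quick sketch of the Institute for Policy Studies comparison, assuming a standard 40-hour week and 52-week year for “full-time”; the bonus pool, the worker count, the $7.25 minimum wage, and the $83 billion subsidy estimate all come from the piece itself.

```python
# Rough check of the minimum-wage comparison described above.
# The 40-hour week and 52-week year are assumptions; the other
# figures are taken from the article.

bonus_pool = 26.7e9             # Wall Street bonuses last year
min_wage_workers = 1_085_000    # full-time minimum wage workers
min_wage = 7.25                 # federal minimum wage, per hour

bonus_per_worker = bonus_pool / min_wage_workers
full_time_annual_pay = min_wage * 40 * 52   # assumed full-time schedule

print(f"Bonus pool spread per minimum wage worker: ~${bonus_per_worker:,.0f}")
print(f"Annual pay at the full-time minimum wage:  ~${full_time_annual_pay:,.0f}")

# What is left of the estimated $83 billion subsidy after the bonus pool
print(f"Remaining subsidy: ~${(83e9 - bonus_pool) / 1e9:.1f} billion")
```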

But I don’t expect Congress to make these sorts of adjustments any time soon.

The “paid-what-you’re-worth” argument is fundamentally misleading because it ignores power, overlooks institutions, and disregards politics. As such, it lures the unsuspecting into thinking nothing whatever should be done to change what people are paid, because nothing can be done.

Don’t buy it.


By: Robert Reich, The Robert Reich Blog, March 13, 2014

March 15, 2014 | Big Banks, Unions, Wall Street

“Killing Germs, Not Jobs”: A New Report Confirms That Business Fears About Paid Sick Day Laws Are Unfounded

Every time the idea of implementing a paid sick days law – which requires that workers earn paid time off to use when they fall ill – gets floated somewhere, the same thing occurs: Businesses and conservative lawmakers cry bloody murder about the effect the law will supposedly have on small businesses and job creators. Every mom and pop store will have to close, they say! Job creators will flee elsewhere to escape the job-killing mandate! Oh, the humanity! (Check out the Cry Wolf Project for some choice quotes.)

Reality, though, stubbornly refuses to conform to the script. For instance, when San Francisco adopted a paid sick days law in 2007, its job growth actually outperformed surrounding counties that did not have a similar law. (This isn’t to imply that having paid sick leave caused any job growth, just that it didn’t hurt either.) And a new report from the Center on Economic and Policy Research shows that Connecticut experienced much the same thing after becoming the first state to adopt a paid sick days law 18 months ago.

Gathered via both surveys and site visits, the Center’s data show businesses faced extremely modest costs – if any – due to the sick days law. As the Center’s Eileen Appelbaum, Ruth Milkman, Luke Elliott and Teresa Kroeger wrote:

Most employers reported a modest effect or no effect of the law on their costs or business operations; and they typically found that the administrative burden was minimal. … Despite strong business opposition to the law prior to its passage, a year and a half after its implementation, more than three-quarters of surveyed employers expressed support for the earned paid sick leave law.

Not only that, but the data show that “in the period since [Connecticut’s law] took effect, employment levels rose in key sectors covered by the law, such as hospitality and health services, while employment fell in manufacturing, which is exempt from the law.” Some job killer! Business warnings about employees abusing their sick leave also failed to come true.

On an economic level, this actually makes perfect sense. Sick employees coming to work and infecting others reduces productivity, as does the constant turnover if workers have to quit to recover from an illness or are fired for missing time while sick. In addition, most workers already have paid sick leave, so the disruptive power of applying it to the usually low-income, service sector workers who don’t is low. San Francisco, New York, Seattle, Jersey City and Washington, D.C. all have some form of paid sick leave requirement, and all of them continue to have functioning economies. Plus, paid sick day laws have the added benefit of cutting down on the transmission of diseases, including those of the decidedly deadly variety.

This report is actually the second knock this week to the notion that business regulation automatically increases costs and kills jobs. A Bloomberg News report yesterday noted that in the 15 years since Washington state voted to gradually increase its minimum wage, its job growth has outpaced the national average, with jobs even growing in the sectors thought particularly susceptible to a minimum wage hike, such as food services. Even the recent Congressional Budget Office report, which projected that a national minimum wage increase would cause some workers to drop out of the labor force or reduce their hours, showed benefits that vastly outweigh any cost.

The moral of the story is this: The Econ 101 notion of more regulations or higher mandatory wages automatically translating into fewer jobs and higher business costs doesn’t actually hold true out in the real world. Paid sick days laws actually kill germs, not jobs.

By: Pat Garofalo, Washington Whispers, U.S. News and World Report, March 6, 2014

March 9, 2014 | Businesses, Jobs