“Jobs And Skills And Zombies”: The Skills Gap, An Idea That Should Have Been Killed By Evidence But Refuses To Die
A few months ago, Jamie Dimon, the chief executive of JPMorgan Chase, and Marlene Seltzer, the chief executive of Jobs for the Future, published an article in Politico titled “Closing the Skills Gap.” They began portentously: “Today, nearly 11 million Americans are unemployed. Yet, at the same time, 4 million jobs sit unfilled” — supposedly demonstrating “the gulf between the skills job seekers currently have and the skills employers need.”
Actually, in an ever-changing economy there are always some positions unfilled even while some workers are unemployed, and the current ratio of vacancies to unemployed workers is far below normal. Meanwhile, multiple careful studies have found no support for claims that inadequate worker skills explain high unemployment.
But the belief that America suffers from a severe “skills gap” is one of those things that everyone important knows must be true, because everyone they know says it’s true. It’s a prime example of a zombie idea — an idea that should have been killed by evidence, but refuses to die.
And it does a lot of harm. Before we get there, however, what do we actually know about skills and jobs?
Think about what we would expect to find if there really were a skills shortage. Above all, we should see workers with the right skills doing well, while only those without those skills are doing badly. We don’t.
Yes, workers with a lot of formal education have lower unemployment than those with less, but that’s always true, in good times and bad. The crucial point is that unemployment remains much higher among workers at all education levels than it was before the financial crisis. The same is true across occupations: workers in every major category are doing worse than they were in 2007.
Some employers do complain that they’re finding it hard to find workers with the skills they need. But show us the money: If employers are really crying out for certain skills, they should be willing to offer higher wages to attract workers with those skills. In reality, however, it’s very hard to find groups of workers getting big wage increases, and the cases you can find don’t fit the conventional wisdom at all. It’s good, for example, that workers who know how to operate a sewing machine are seeing significant wage gains, but I very much doubt that these are the skills people who make a lot of noise about the alleged gap have in mind.
And it’s not just the evidence on unemployment and wages that refutes the skills-gap story. Careful surveys of employers — like those recently conducted by researchers at both M.I.T. and the Boston Consulting Group — similarly find, as the consulting group declared, that “worries of a skills gap crisis are overblown.”
The one piece of evidence you might cite in favor of the skills-gap story is the sharp rise in long-term unemployment, which could be evidence that many workers don’t have what employers want. But it isn’t. At this point, we know a lot about the long-term unemployed, and they’re pretty much indistinguishable in skills from laid-off workers who quickly find new jobs. So what’s their problem? It’s the very fact of being out of work, which makes employers unwilling even to look at their qualifications.
So how does the myth of a skills shortage not only persist, but remain part of what “everyone knows”? Well, there was a nice illustration of the process last fall, when some news media reported that 92 percent of top executives said that there was, indeed, a skills gap. The basis for this claim? A telephone survey in which executives were asked, “Which of the following do you feel best describes the ‘gap’ in the U.S. workforce skills gap?” followed by a list of alternatives. Given the loaded question, it’s actually amazing that 8 percent of the respondents were willing to declare that there was no gap.
The point is that influential people move in circles in which repeating the skills-gap story — or, better yet, writing about skill gaps in media outlets like Politico — is a badge of seriousness, an assertion of tribal identity. And the zombie shambles on.
Unfortunately, the skills myth — like the myth of a looming debt crisis — is having dire effects on real-world policy. Instead of focusing on the way disastrously wrongheaded fiscal policy and inadequate action by the Federal Reserve have crippled the economy and demanding action, important people piously wring their hands about the failings of American workers.
Moreover, by blaming workers for their own plight, the skills myth shifts attention away from the spectacle of soaring profits and bonuses even as employment and wages stagnate. Of course, that may be another reason corporate executives like the myth so much.
So we need to kill this zombie, if we can, and stop making excuses for an economy that punishes workers.
By: Paul Krugman, Op-Ed Columnist, The New York Times, March 30, 2014
“Aligned Agendas”: The Tea Party and Wall Street Might Not Be Best Friends Forever, But They Are For Now
“Our problem today was not caused by a lack of business and banking regulations,” argued Ron Paul in his 2009 manifesto End the Fed, which outlined a theory of the financial crisis that implicated only government policy and the Federal Reserve, while mocking the idea that Wall Street’s financial engineering and derivatives played any role. “The only regulations lacking were the ones that should have been placed on the government officials who ran roughshod over the people and the Constitution.”
There seems to be some confusion about the relationship between the Tea Party and Wall Street. New York magazine’s Jonathan Chait says the two “are friends after all,” while the Washington Examiner’s Tim Carney insists that the Tea Party has loosened the business lobby’s “grip on the GOP.” So let’s make this clear: The Tea Party agenda is currently aligned with the Wall Street agenda.
The Tea Party’s theory of the financial crisis has absolved Wall Street completely. Instead, the crisis is interpreted according to two pillars of reactionary thought: that the government is a fundamentally corrupt enterprise trying to give undeserving people free stuff, and that hard money should rule the day. This will have major consequences for the future of reform, should the GOP take the Senate this fall.
On the Hill, it’s hard to find where the Tea Party and Wall Street disagree. Tea Party senators like Mike Lee, Rand Paul, and Ted Cruz, plus conservative senators like David Vitter, have rallied around a one-line bill repealing the entirety of Dodd-Frank and replacing it with nothing. In the House, Republicans are attacking new derivatives regulations, all the activities of the Consumer Financial Protection Bureau, the existence of the Volcker Rule, and the ability of the FDIC to wind down a major financial institution, while relentlessly attacking strong regulators and cutting regulatory funding. This is Wall Street’s wet dream of a policy agenda.
Note the lack of any Republican counter-proposal or framework. The few ideas that have been floated, such as David Camp’s bank tax or Vitter’s higher capital requirements, have gotten no additional support from the right. House Republicans attacked Camp’s plan publicly, and Vitter’s bill lost one of its only two other Republican supporters immediately after it was announced. Why this lack of an agenda? Because the Tea Party thinks that Wall Street has done nothing wrong.
The story of the crisis, according to the right, goes like this: The Community Reinvestment Act and other government regulations forced banks into making subprime loans, and the “affordability goals” of government-sponsored enterprises accounted for the rest of the subprime lending that crashed the economy. The Federal Reserve pumped up a credit bubble, as it always does when it tries to push against recessions. In other words, the financial crisis of 2008 was entirely a government creation, and could have been solved by just putting all the financial firms into bankruptcy. There’s no such problem as “shadow banking,” and to whatever extent Wall Street misbehaved, it was only the result of the moral hazard created by the assumption that there would be bailouts. Or as Senator Marco Rubio said in his 2013 State of the Union response, we suffered “a housing crisis created by reckless government policies.”
This narrative is an easy one to believe for people who distrust government, but it’s far from the facts. The CRA didn’t even cover the fly-by-night institutions that made the vast majority of subprime loans. The GSEs lost market share during the housing bubble, and subprime loans account for less than 5 percent of their losses. Low interest rates likely account for only a quarter of housing price shifts, and even then, they were likely offsetting capital coming into the country from abroad.
The mainstream account of the crisis, as Dean Starkman pointed out in The New Republic, is that we’re all to blame—or, as Georgetown law professor Adam Levitin wrote in his recent survey of the crisis, that it was a “perfect storm.” Starkman argues that the Everyone-Is-To-Blame narrative is partially responsible for the lack of serious homeowner help in the Home Affordable Modification Program. As he demonstrates in his piece, “there’s a big and growing body of documentation about what happened as the financial system became incentivized to sell as many loans as possible on the most burdensome possible terms.”
The lack of any Republican policy on financial reform is the result of several factors. Mitt Romney thought it would be a liability to put forward his own agenda in 2012. By voting nearly unanimously against Dodd-Frank, Republicans were able to make this moderate, lukewarm response to the crisis look like a partisan takeover of finance (financial reform is hard and may not work, so all the better to have Democrats own the issue so they can be clubbed with it later). Rather than wage total war against Dodd-Frank through partisan outfits, the smartest minds on the right are weakening the law through law firms and K Street. And the conservative infrastructure has been solely focused on privatizing the GSEs completely.
This lack of policy has allowed the far right and Austrian School acolytes to occupy the intellectual space in the party. The GOP is the minority party for now, but all it takes is a few Senate seats changing hands before the Tea Party narrative becomes the prevailing one on the Hill—and nothing would delight Wall Street more.
By: Mike Konczal, The New Republic, March 21, 2014
“The Wall Street That Cried Wolf”: Banks Complain About Onerous New Regulations While Reaping Record Profits
The headlines have been nothing short of dazzling: “Bank of America profits soar”; “Citigroup’s profits surge”; “Bank boom continues: Goldman Sachs profit doubles.” In fact, the six biggest Wall Street banks – Bank of America, Citigroup, Goldman Sachs, JP Morgan Chase, Wells Fargo and Morgan Stanley – all beat their profit expectations in the most recent quarter, according to results announced over the last week. JP Morgan Chase is even on pace to make $25 billion (yes, billion with a b) this year.
If you’re thinking that these numbers don’t at all square with the ominous warnings of bank executives and lobbyists, who have been saying non-stop that new regulations meant to safeguard the financial system and prevent a repeat of the 2008 financial crisis are going to irreparably harm their ability to do business, you’re right. But that hasn’t stopped the banks’ griping.
The latest iteration of this argument played out after regulators recently announced new rules regarding bank capital – the financial cushion banks must keep on hand to guard against a downturn. Failed presidential candidate turned bank lobbyist Tim Pawlenty, for instance, said that the new rules “will make it harder for banks to lend and keep the economic recovery going.” JP Morgan Chase CEO Jamie Dimon, who has been scaremongering for years about various regulations, warned that the new rules would put U.S. banks at a competitive disadvantage with foreign lenders.
But this same dynamic has been playing out since the Dodd-Frank financial reform law was signed by President Obama in 2010. Banks and their allies complain about onerous new regulations, while at the same time reaping record profits.
And as the New Yorker’s John Cassidy explained, those profits are due to many of the same practices that helped cause the 2008 debacle in the first place: “an emphasis on trading rather than lending, a high degree of leverage, and implicit subsidies from the taxpayer.” That would seem to make the case that new regulations, rather than going too far, have not gone far enough.
Perhaps that’s why banks haven’t been crowing about their new avalanche of profits, and Dimon is even warning about an upcoming profit squeeze. As the Financial Times’ U.S. banking editor Tom Braithwaite explains:
In the next 12 months the Fed will hit the banks with a new flurry of measures. … Those are coming, they are serious and the banks fear them. There is an outside chance that lawmakers will go even further, such as by restoring the split between investment banking and commercial banking known as Glass-Steagall. There is still plenty to play for in deciding how painful the next round of regulations will be.
But, with every earnings season, warnings of calamity look more and more hollow.
One of the major knocks against Dodd-Frank – beyond the obvious one that it left the biggest banks even bigger than they were before the financial crisis – is that it left too much discretion to regulators to write new rules. Corporations and trade organizations familiar with how the agency rule-writing process works are almost inevitably going to have the upper hand in such a system. And there are still so many rules left to be written – some 60 percent, according to the law firm Davis Polk – that Wall Street will have ample opportunity to water the law down to meaninglessness.
But it’s hard to keep saying with a straight face that new regulations will spell doom for the industry when the new rules that are in place so far, which were accompanied by similarly dire warnings, have done nothing to even dent Wall Street’s bottom line. In fact, the huge pile of profits may be the best thing that could have happened for those trying to bring a modicum of sanity back to Wall Street regulation.
By: Pat Garofalo, U. S. News and World Report, July 18, 2013
“The Story Of Our Time”: The Most Crucial Thing To Understand Is That The Economy Is Not Like An Individual Family
Those of us who have spent years arguing against premature fiscal austerity have just had a good two weeks. Academic studies that supposedly justified austerity have lost credibility; hard-liners in the European Commission and elsewhere have softened their rhetoric. The tone of the conversation has definitely changed.
My sense, however, is that many people still don’t understand what this is all about. So this seems like a good time to offer a sort of refresher on the nature of our economic woes, and why this remains a very bad time for spending cuts.
Let’s start with what may be the most crucial thing to understand: the economy is not like an individual family.
Families earn what they can, and spend as much as they think prudent; spending and earning opportunities are two different things. In the economy as a whole, however, income and spending are interdependent: my spending is your income, and your spending is my income. If both of us slash spending at the same time, both of our incomes will fall too.
And that’s what happened after the financial crisis of 2008. Many people suddenly cut spending, either because they chose to or because their creditors forced them to; meanwhile, not many people were able or willing to spend more. The result was a plunge in incomes that also caused a plunge in employment, creating the depression that persists to this day.
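To make the logic concrete, here is a minimal sketch of the idea in code. Nothing below comes from the column itself: the function, its name, and the 0.9 and 0.8 spending rates are assumptions invented purely to illustrate the standard textbook multiplier.

```python
# Toy Keynesian multiplier model: total income Y solves Y = A + c*Y,
# where A is "autonomous" spending (done regardless of income) and c is
# the fraction of each dollar of income that gets re-spent. Every name
# and number here is an illustrative assumption, not from the column.

def equilibrium_income(autonomous_spending: float, propensity_to_spend: float) -> float:
    """Solve Y = A + c*Y for Y, i.e. Y = A / (1 - c)."""
    if not 0 <= propensity_to_spend < 1:
        raise ValueError("propensity_to_spend must be in [0, 1)")
    return autonomous_spending / (1 - propensity_to_spend)

# Everyone re-spends 90 cents of each dollar earned:
print(equilibrium_income(100, 0.9))  # ~1000.0 (up to float rounding)

# Everyone tries to tighten their belt, re-spending only 80 cents:
print(equilibrium_income(100, 0.8))  # ~500.0: total income falls by half
```

The second call is the whole point: when everyone cuts spending at once, total income falls far more than any one person's cutback, because each dollar not spent is a dollar someone else never earns.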
Why did spending plunge? Mainly because of a burst housing bubble and an overhang of private-sector debt — but if you ask me, people talk too much about what went wrong during the boom years and not enough about what we should be doing now. For no matter how lurid the excesses of the past, there’s no good reason that we should pay for them with year after year of mass unemployment.
So what could we do to reduce unemployment? The answer is, this is a time for above-normal government spending, to sustain the economy until the private sector is willing to spend again. The crucial point is that under current conditions, the government is not, repeat not, in competition with the private sector. Government spending doesn’t divert resources away from private uses; it puts unemployed resources to work. Government borrowing doesn’t crowd out private investment; it mobilizes funds that would otherwise go unused.
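Continuing the same illustrative toy model (again, a hedged sketch under invented assumptions, not anything from the column), one more line shows the claim about government spending in miniature: when private re-spending retrenches, added autonomous spending can fill the gap.

```python
# Same toy model as above, repeated so this snippet runs on its own.
# All figures are assumptions chosen only for illustration.
def equilibrium_income(a: float, c: float) -> float:
    return a / (1 - c)  # solves Y = A + c*Y

print(equilibrium_income(100, 0.9))  # ~1000.0: the pre-slump baseline
print(equilibrium_income(100, 0.8))  # ~500.0: private re-spending retrenches
print(equilibrium_income(200, 0.8))  # ~1000.0: extra autonomous spending fills the gap
```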
Now, just to be clear, this is not a case for more government spending and larger budget deficits under all circumstances — and the claim that people like me always want bigger deficits is just false. For the economy isn’t always like this — in fact, situations like the one we’re in are fairly rare. By all means let’s try to reduce deficits and bring down government indebtedness once normal conditions return and the economy is no longer depressed. But right now we’re still dealing with the aftermath of a once-in-three-generations financial crisis. This is no time for austerity.
O.K., I’ve just given you a story, but why should you believe it? There are, after all, people who insist that the real problem is on the economy’s supply side: that workers lack the skills they need, or that unemployment insurance has destroyed the incentive to work, or that the looming menace of universal health care is preventing hiring, or whatever. How do we know that they’re wrong?
Well, I could go on at length on this topic, but just look at the predictions the two sides in this debate have made. People like me predicted right from the start that large budget deficits would have little effect on interest rates, that large-scale “money printing” by the Fed (not a good description of actual Fed policy, but never mind) wouldn’t be inflationary, that austerity policies would lead to terrible economic downturns. The other side jeered, insisting that interest rates would skyrocket and that austerity would actually lead to economic expansion. Ask bond traders, or the suffering populations of Spain, Portugal and so on, how it actually turned out.
Is the story really that simple, and would it really be that easy to end the scourge of unemployment? Yes — but powerful people don’t want to believe it. Some of them have a visceral sense that suffering is good, that we must pay a price for past sins (even if the sinners then and the sufferers now are very different groups of people). Some of them see the crisis as an opportunity to dismantle the social safety net. And just about everyone in the policy elite takes cues from a wealthy minority that isn’t actually feeling much pain.
What has happened now, however, is that the drive for austerity has lost its intellectual fig leaf, and stands exposed as the expression of prejudice, opportunism and class interest it always was. And maybe, just maybe, that sudden exposure will give us a chance to start doing something about the depression we’re in.
By: Paul Krugman, Op-Ed Columnist, The New York Times, April 28, 2013
In retrospect, George W. Bush’s legacy doesn’t look as bad as it did when he left office. It looks worse.
I join the nation in congratulating Bush on the opening of his presidential library in Dallas. Like many people, I find it much easier to honor, respect and even like the man — now that he’s no longer in the White House.
But anyone tempted to get sentimental should remember the actual record of the man who called himself The Decider. Begin with the indelible stain that one of his worst decisions left on our country’s honor: torture.
Hiding behind the euphemism “enhanced interrogation techniques,” Bush made torture official U.S. policy. Just about every objective observer has agreed with this stark conclusion. The most recent assessment came this month in a 576-page report from a task force of the bipartisan Constitution Project, which stated that “it is indisputable that the United States engaged in the practice of torture.”
We knew about the torture before Bush left office — at least, we knew about the waterboarding of three “high-value” detainees involved in planning the 9/11 attacks. But the Constitution Project task force — which included such figures as Asa Hutchinson, who served in high-ranking posts in the Bush administration, and William Sessions, who was FBI director under three presidents — concluded that other forms of torture were used “in many instances” in a manner that was “directly counter to values of the Constitution and our nation.”
Bush administration apologists argue that even waterboarding does not necessarily constitute torture and that other coercive — and excruciatingly painful — interrogation methods, such as putting subjects in “stress positions” or exposing them to extreme temperatures, certainly do not. The task force strongly disagreed, citing U.S. laws and court rulings, international treaties and common decency.
The Senate intelligence committee has produced, but refuses to make public, a 6,000-page report on the CIA’s use of torture and the network of clandestine “black site” prisons the agency established under Bush. One of President Obama’s worst decisions upon taking office in 2009, in my view, was to decline to convene some kind of blue-ribbon “truth commission” to bring all the abuses to light.
It may be years before all the facts are known. But the decision to commit torture looks ever more shameful with the passage of time.
Bush’s decision to invade and conquer Iraq also looks, in hindsight, like an even bigger strategic error. Saddam Hussein’s purported weapons of mass destruction have yet to be found; nearly 5,000 Americans and untold Iraqis sacrificed their lives to eliminate a threat that did not exist.
We knew this, of course, when Obama became president. It’s one of the main reasons he was elected. We knew, too, that Bush’s decision to turn to Iraq diverted focus and resources from Afghanistan. But I don’t think anyone fully grasped that giving the Taliban a long, healing respite would eventually make Afghanistan this country’s longest or second-longest war, depending on what date you choose as the beginning of hostilities in Vietnam.
And it’s clear that the Bush administration did not foresee how the Iraq experience would constrain future presidents in their use of military force. Syria is a good example. Like Saddam, Bashar al-Assad is a ruthless dictator who does not hesitate to massacre his own people. But unlike Saddam, Assad does have weapons of mass destruction. And unlike Saddam, Assad has alliances with the terrorist group Hezbollah and the nuclear-mad mullahs in Iran.
I do not advocate U.S. intervention in Syria, because I fear we might make things worse rather than better. But I wonder how I might feel — and what options Obama might have — if we had not squandered so much blood and treasure in Iraq.
Bush didn’t pay for his wars. The bills he racked up for military adventures, prescription-drug benefits, the bank bailout and other impulse purchases helped create the fiscal and financial crises he bequeathed to Obama. His profligacy also robbed the Republican Party establishment of small-government credibility, thus helping give birth to the tea party movement. Thanks a lot for that.
As I’ve written before, Bush did an enormous amount of good by making it possible for AIDS sufferers in Africa to receive antiretroviral drug therapy. This literally saved millions of lives and should weigh heavily on one side of the scale when we assess The Decider’s presidency. But the pile on the other side just keeps getting bigger.
By: Eugene Robinson, Opinion Writer, The Washington Post, April 25, 2013