“The Tragic History Of Race Wars”: 70 Years After A Flash Of Soundless Light Blasted Away 60,000 Lives
He wanted to start a race war.
That, you will recall, was what authorities say white supremacist Dylann Roof had in mind when he shot up a storied African-American church in June. It might have surprised him to learn that we’ve already had a race war.
No, that’s not how one typically thinks of World War II, but it takes only a cursory consideration of that war’s causes and effects to make the case. Germany killed 6 million Jews and rampaged through Poland and the Soviet Union because it considered Jews and Slavs subhuman. The Japanese stormed through China and other Asian outposts in the conviction that they were a superior people and that Americans, as a decadent and mongrel people, could do nothing about it.
Meantime, this country was busy imprisoning 120,000 of its citizens of Japanese ancestry in concentration camps and plunging into a war against racial hatred with a Jim Crow military. The American war effort was undermined repeatedly by race riots — whites attacking blacks at a shipyard in Mobile, white servicemen beating up Mexican-Americans in Los Angeles, to name two examples.
So no, it is not a stretch to call that war a race war.
It ended on August 15, 1945. V-J — Victory over Japan — Day was when the surrender was announced, the day of blissfully drunken revelry from Times Square in New York to Market Street in San Francisco. But for all practical purposes, the war had actually ended nine days before — 70 years ago Thursday — in a noiseless flash of light over the Japanese city of Hiroshima. One person who survived — as at least 60,000 people would not — described it as a “sheet of sun.”
The destruction of Hiroshima by an atomic bomb — Nagasaki followed three days later — did not just end the war. It also ushered in a new era: the nuclear age. To those of us who were children then, nuclear power was what turned Peter Parker into a human spider and that lizard into Godzilla.
It was also what air-raid sirens were screaming about when the teacher told you to get down under your desk, hands clasped behind your neck. We called them “drop drills.” No one ever explained to us how putting an inch of laminated particle board between you and a nuclear explosion might save you. None of us ever thought to ask. We simply accepted it, went to school alongside this most terrifying legacy of the great race war, and thought nothing of it.
The world has seen plenty of race wars — meaning tribalistic violence — before and since 1945. Ask the Armenians, the Tutsis, the Darfurians. Ask the Congolese, the Cambodians, the Herero. Ask the Cherokee. The childish urge of the human species to divide itself and destroy itself has splashed oceans of blood across the history of the world.
The difference 70 years ago was the scope of the thing — and that spectacular ending. For the first time, our species now had the ability to destroy itself. We were still driven by the same childish urge. Only now, we were children playing with matches.
This is the fearsome reality that has shadowed my generation down seven decades, from schoolchildren doing drop drills to grandparents watching grandchildren play in the park. And the idea that we might someday forge peace among the warring factions of the planet, find a way to help our kind overcome tribal hatred before it’s too late, has perhaps come to seem idealistic, visionary, naïve, a tired ’60s holdover, a song John Lennon once sang that’s nice to listen to but not at all realistic.
Maybe it’s all those things.
Though 70 years after a flash of soundless light blasted away 60,000 lives, you have to wonder what better options we’ve got. But then, I’m biased.
You see, I have grandchildren playing in the park.
By: Leonard Pitts, Jr., Columnist, The Miami Herald, August 3, 2015
“Rock Bottom Economics”: The Inflation And Rising Interest Rates That Never Showed Up
Six years ago the Federal Reserve hit rock bottom. It had been cutting the federal funds rate, the interest rate it uses to steer the economy, more or less frantically in an unsuccessful attempt to get ahead of the recession and financial crisis. But it eventually reached the point where it could cut no more, because interest rates can’t go below zero. On Dec. 16, 2008, the Fed set its interest target between 0 and 0.25 percent, where it remains to this day.
The fact that we’ve spent six years at the so-called zero lower bound is amazing and depressing. What’s even more amazing and depressing, if you ask me, is how slow our economic discourse has been to catch up with the new reality. Everything changes when the economy is at rock bottom — or, to use the term of art, in a liquidity trap (don’t ask). But for the longest time, nobody with the power to shape policy would believe it.
What do I mean by saying that everything changes? As I wrote way back when, in a rock-bottom economy “the usual rules of economic policy no longer apply: virtue becomes vice, caution is risky and prudence is folly.” Government spending doesn’t compete with private investment — it actually promotes business spending. Central bankers, who normally cultivate an image as stern inflation-fighters, need to do the exact opposite, convincing markets and investors that they will push inflation up. “Structural reform,” which usually means making it easier to cut wages, is more likely to destroy jobs than create them.
This may all sound wild and radical, but it isn’t. In fact, it’s what mainstream economic analysis says will happen once interest rates hit zero. And it’s also what history tells us. If you paid attention to the lessons of post-bubble Japan, or for that matter the U.S. economy in the 1930s, you were more or less ready for the looking-glass world of economic policy we’ve lived in since 2008.
But as I said, nobody would believe it. By and large, policy makers and Very Serious People in general went with gut feelings rather than careful economic analysis. Yes, they sometimes found credentialed economists to back their positions, but they used these economists the way a drunkard uses a lamppost: for support, not for illumination. And what the guts of these serious people have told them, year after year, is to fear — and do — exactly the wrong things.
Thus we were told again and again that budget deficits were our most pressing economic problem, that interest rates would soar any day now unless we imposed harsh fiscal austerity. I could have told you that this was foolish, and in fact I did, and sure enough, the predicted interest rate spike never happened — but demands that we cut government spending now, now, now have cost millions of jobs and deeply damaged our infrastructure.
We were also told repeatedly that printing money — not what the Fed was actually doing, but never mind — would lead to “currency debasement and inflation.” The Fed, to its credit, stood up to this pressure, but other central banks didn’t. The European Central Bank, in particular, raised rates in 2011 to head off a nonexistent inflationary threat. It eventually reversed course but has never gotten things back on track. At this point European inflation is far below the official target of 2 percent, and the Continent is flirting with outright deflation.
But are these bad calls just water under the bridge? Isn’t the era of rock-bottom economics just about over? Don’t count on it.
It’s true that with the U.S. unemployment rate dropping, most analysts expect the Fed to raise interest rates sometime next year. But inflation is low, wages are weak, and the Fed seems to realize that raising rates too soon would be disastrous. Meanwhile, Europe looks further than ever from economic liftoff, while Japan is still struggling to escape from deflation. Oh, and China, which is starting to remind some of us of Japan in the late 1980s, could join the rock-bottom club sooner than you think.
So the counterintuitive realities of economic policy at the zero lower bound are likely to remain relevant for a long time to come, which makes it crucial that influential people understand those realities. Unfortunately, too many still don’t; one of the most striking aspects of economic debate in recent years has been the extent to which those whose economic doctrines have failed the reality test refuse to admit error, let alone learn from it. The intellectual leaders of the new majority in Congress still insist that we’re living in an Ayn Rand novel; German officials still insist that the problem is that debtors haven’t suffered enough.
This bodes ill for the future. What people in power don’t know, or worse what they think they know but isn’t so, can very definitely hurt us.
By: Paul Krugman, Op-Ed Columnist, The New York Times, November 23, 2014
“The Millionaire’s Club Expands”: The Wealthiest 10 Percent Of Americans Own 75 Percent Of The Personal Wealth
The millionaire’s club isn’t what it used to be.
Time was that “being a millionaire” was a mark of unimaginable success. You’d joined the financial elite. People didn’t much discuss whether you arrived by wealth or income, because it didn’t matter much. The millionaire’s club was so small that the path to membership wasn’t worth discussing.
No more.
Millionaires aren’t as common as water, but there are plenty of them. A new study puts the worldwide total at 35 million in 2014, with about 40 percent (14 million) of them American. That’s nearly 6 percent of the U.S. adult population (241 million in 2014), or roughly one in 17. Rarefied, yes; exclusive, no. After the United States, Japan has the largest concentration of millionaires with 8 percent of the world total, followed by France (7 percent), Germany (6 percent) and the United Kingdom (6 percent). At 3 percent, China ranks eighth.
The figures come from a study by Credit Suisse Research, which has been estimating worldwide personal wealth since 2010. The numbers reflect net worth, not annual income. The wealth totals add the value of people’s homes, businesses and financial assets (stocks, bonds) and subtract their loans. Doubtless, the number of millionaires would be much smaller if the calculations were based on income. In the study, an American with a $300,000 mortgage-free home and $700,000 in retirement accounts and financial investments qualifies as a millionaire.
On this basis, the study put global personal wealth in mid-2014 at $263 trillion, up from $117 trillion in 2000. Wealth in the United States reached $84 trillion, almost a third of the total. All of Europe, with a larger population, was virtually the same. Median wealth in the United States — meaning half of Americans were above the cutoff and half below — was $53,000, dominated by homes for many middle-class families. Japan’s total wealth was $23 trillion, but with a more equal distribution and a smaller population, its median was more than twice the American at $113,000. China’s wealth was $21 trillion and its median $7,000.
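To make the bookkeeping concrete, here is a minimal sketch, in Python, of the two calculations the study's definitions rest on: net worth (assets minus loans) and median wealth (the midpoint value). The household figures and the small sample below are purely illustrative assumptions, not data from the Credit Suisse report.

```python
# Illustrative sketch of the wealth arithmetic described above.
# The example household mirrors the article's hypothetical case
# ($300,000 mortgage-free home plus $700,000 in financial assets);
# the five-household sample is invented solely to show how a median works.

from statistics import median


def net_worth(home: float, financial_assets: float,
              business: float = 0.0, loans: float = 0.0) -> float:
    """Net worth = homes + businesses + financial assets, minus loans."""
    return home + financial_assets + business - loans


# The article's example household: paid-off home plus retirement savings.
example = net_worth(home=300_000, financial_assets=700_000)
print(f"Example household net worth: ${example:,.0f}")
print("Counts as a millionaire?", example >= 1_000_000)

# Median wealth: half of households fall above this value, half below.
hypothetical_households = [12_000, 40_000, 53_000, 160_000, 900_000]
print(f"Median wealth of the sample: ${median(hypothetical_households):,.0f}")
```

The sketch simply underlines that the millionaire threshold is a balance-sheet test, not an income test, which is why a paid-off house plus a retirement account can clear it.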
Credit Suisse did a special analysis of wealth inequality and, not surprisingly, found plenty of it. For starters, the analysis reminded readers that wealth inequality (basically, the ownership of stocks and bonds) is typically much greater than income inequality (basically, wages, salaries, dividends and interest).
In the United States, the wealthiest 10 percent of Americans own about 75 percent of the personal wealth, a share that’s unchanged since 2000; the income share of the top 10 percent is slightly less than 50 percent. But the study also found that wealth inequality is high in virtually all societies. Although the United States is at the upper end of the range, the low end is still stratospheric.
Consider.
In 2014, the wealthiest 10 percent owned 62 percent of the personal wealth in Germany; 69 percent in Sweden; 49 percent in Japan; 64 percent in China; 51 percent in Australia; 54 percent in the United Kingdom; 53 percent in France; 72 percent in Switzerland; and 68 percent in Denmark. These steep levels, the report noted, defied large cross-country differences in tax and inheritance policies.
There is, however, one country where wealth inequality is “so far above the others that it deserves to be placed in a separate category.” This is Russia. In 2014, the wealthiest 10 percent owned 85 percent of personal wealth. They aren’t oligarchs for nothing.
By: Robert Samuelson, The Washington Post, October 22, 2014
“The Forever Slump”: The Debate Between The ‘Too-Muchers’ And The ‘Not-Enoughers’
It’s hard to believe, but almost six years have passed since the fall of Lehman Brothers ushered in the worst economic crisis since the 1930s. Many people, myself included, would like to move on to other subjects. But we can’t, because the crisis is by no means over. Recovery is far from complete, and the wrong policies could still turn economic weakness into a more or less permanent depression.
In fact, that’s what seems to be happening in Europe as we speak. And the rest of us should learn from Europe’s experience.
Before I get to the latest bad news, let’s talk about the great policy argument that has raged for more than five years. It’s easy to get bogged down in the details, but basically it has been a debate between the too-muchers and the not-enoughers.
The too-muchers have warned incessantly that the things governments and central banks are doing to limit the depth of the slump are setting the stage for something even worse. Deficit spending, they suggested, could provoke a Greek-style crisis any day now — within two years, declared Alan Simpson and Erskine Bowles some three and a half years ago. Asset purchases by the Federal Reserve would “risk currency debasement and inflation,” declared a who’s who of Republican economists, investors, and pundits in a 2010 open letter to Ben Bernanke.
The not-enoughers — a group that includes yours truly — have argued all along that the clear and present danger is Japanification rather than Hellenization. That is, they have warned that inadequate fiscal stimulus and a premature turn to austerity could lead to a lost decade or more of economic depression, that the Fed should be doing even more to boost the economy, that deflation, not inflation, was the great risk facing the Western world.
To say the obvious, none of the predictions and warnings of the too-muchers have come to pass. America never experienced a Greek-type crisis of soaring borrowing costs. In fact, even within Europe the debt crisis largely faded away once the European Central Bank began doing its job as lender of last resort. Meanwhile, inflation has stayed low.
However, while the not-enoughers were right to dismiss warnings about interest rates and inflation, our concerns about actual deflation haven’t yet come to pass. This has provoked a fair bit of rethinking about the inflation process (if there has been any rethinking on the other side of this argument, I haven’t seen it), but not-enoughers continue to worry about the risks of a Japan-type quasi-permanent slump.
Which brings me to Europe’s woes.
On the whole, the too-muchers have had much more influence in Europe than in the United States, while the not-enoughers have had no influence at all. European officials eagerly embraced now-discredited doctrines that allegedly justified fiscal austerity even in depressed economies (although America has de facto done a lot of austerity, too, thanks to the sequester and cuts at the state and local level). And the European Central Bank, or E.C.B., not only failed to match the Fed’s asset purchases, it actually raised interest rates back in 2011 to head off the imaginary risk of inflation.
The E.C.B. reversed course when Europe slid back into recession, and, as I’ve already mentioned, under Mario Draghi’s leadership, it did a lot to alleviate the European debt crisis. But this wasn’t enough. The European economy did start growing again last year, but not enough to make more than a small dent in the unemployment rate.
And now growth has stalled, while inflation has fallen far below the E.C.B.’s target of 2 percent, and prices are actually falling in debtor nations. It’s really a dismal picture. Mr. Draghi & Co. need to do whatever they can to try to turn things around, but given the political and institutional constraints they face, Europe will arguably be lucky if all it experiences is one lost decade.
Things don’t look that dire in America, where job creation seems finally to have picked up and the threat of deflation has receded, at least for now. But all it would take is a few bad shocks and/or policy missteps to send us down the same path.
The good news is that Janet Yellen, the Fed chairwoman, understands the danger; she has made it clear that she would rather take the chance of a temporary rise in the inflation rate than risk hitting the brakes too soon, the way the E.C.B. did in 2011. The bad news is that she and her colleagues are under a lot of pressure to do the wrong thing from the too-muchers, who seem to have learned nothing from being wrong year after year, and are still agitating for higher rates.
There’s an old joke about the man who decides to cheer up, because things could be worse — and sure enough, things get worse. That’s more or less what happened to Europe, and we shouldn’t let it happen here.
By: Paul Krugman, Op-Ed Columnist, The New York Times, August 14, 2014
“The Timidity Trap”: The Best Lack All Conviction, While The Worst Are Full Of Passionate Intensity
There don’t seem to be any major economic crises underway right this moment, and policy makers in many places are patting themselves on the back. In Europe, for example, they’re crowing about Spain’s recovery: the country seems set to grow at least twice as fast this year as previously forecast.
Unfortunately, that means growth of 1 percent, versus 0.5 percent, in a deeply depressed economy with 55 percent youth unemployment. The fact that this can be considered good news just goes to show how accustomed we’ve grown to terrible economic conditions. We’re doing worse than anyone could have imagined a few years ago, yet people seem increasingly to be accepting this miserable situation as the new normal.
How did this happen? There were multiple reasons, of course. But I’ve been thinking about this question a lot lately, in part because I’ve been asked to discuss a new assessment of Japan’s efforts to break out of its deflation trap. And I’d argue that an important source of failure was what I’ve taken to calling the timidity trap — the consistent tendency of policy makers who have the right ideas in principle to go for half-measures in practice, and the way this timidity ends up backfiring, politically and even economically.
In other words, Yeats had it right: the best lack all conviction, while the worst are full of passionate intensity.
About the worst: If you’ve been following economic debates these past few years, you know that both America and Europe have powerful pain caucuses — influential groups fiercely opposed to any policy that might put the unemployed back to work. There are some important differences between the U.S. and European pain caucuses, but both now have truly impressive track records of being always wrong, never in doubt.
Thus, in America, we have a faction both on Wall Street and in Congress that has spent five years and more issuing lurid warnings about runaway inflation and soaring interest rates. You might think that the failure of any of these dire predictions to come true would inspire some second thoughts, but, after all these years, the same people are still being invited to testify, and are still saying the same things.
Meanwhile, in Europe, four years have passed since the Continent turned to harsh austerity programs. The architects of these programs told us not to worry about adverse impacts on jobs and growth — the economic effects would be positive, because austerity would inspire confidence. Needless to say, the confidence fairy never appeared, and the economic and social price has been immense. But no matter: all the serious people say that the beatings must continue until morale improves.
So what has been the response of the good guys?
For there are good guys out there, people who haven’t bought into the notion that nothing can or should be done about mass unemployment. The Obama administration’s heart — or, at any rate, its economic model — is in the right place. The Federal Reserve has pushed back against the springtime-for-Weimar, inflation-is-coming crowd. The International Monetary Fund has put out research debunking claims that austerity is painless. But these good guys never seem willing to go all-in on their beliefs.
The classic example is the Obama stimulus, which was obviously underpowered given the economy’s dire straits. That’s not 20/20 hindsight. Some of us warned right from the beginning that the plan would be inadequate — and that because it was being oversold, the persistence of high unemployment would end up discrediting the whole idea of stimulus in the public mind. And so it proved.
What’s not as well known is that the Fed has, in its own way, done the same thing. From the start, monetary officials ruled out the kinds of monetary policies most likely to work — in particular, anything that might signal a willingness to tolerate somewhat higher inflation, at least temporarily. As a result, the policies they have followed have fallen short of hopes, and ended up leaving the impression that nothing much can be done.
And the same may be true even in Japan — the case that motivated this article. Japan has made a radical break with past policies, finally adopting the kind of aggressive monetary stimulus Western economists have been urging for 15 years and more. Yet there’s still a diffidence about the whole business, a tendency to set things like inflation targets lower than the situation really demands. And this increases the risk that Japan will fail to achieve “liftoff” — that the boost it gets from the new policies won’t be enough to really break free from deflation.
You might ask why the good guys have been so timid, the bad guys so self-confident. I suspect that the answer has a lot to do with class interests. But that will have to be a subject for another column.
By: Paul Krugman, Op-Ed Columnist, The New York Times, March 20, 2014