“Nutballs And Buffoons”: The GOP’s Next Internal Debate
This morning, Jeb Bush said some somewhat surprising things in a meeting with reporters, at least for a Republican. He noted that neither Ronald Reagan nor his father could be elected in today’s GOP, and said in essence that Mitt Romney had moved too far to the right on immigration. He also said some of the things you’d expect a Republican to say, like that the blame for the current partisan atmosphere lies with President Obama, because he didn’t do enough to seek common ground with Republicans. Anyone who has been watching politics for the last three and a half years knows how utterly insane this is, but in case you missed this tidbit, a bunch of influential congressional Republicans got together on the night of Obama’s inauguration to lay out a plan for how they would obstruct everything they could and sabotage his presidency.
The question of what Jeb is up to sheds some light on where his party is going to find itself this coming fall, should it lose the presidential election. The simplest explanation for his willingness to tenderly criticize other Republicans is that he is realistic about the country’s lack of yearning for more Bushes in the White House, so he feels free to state the blindingly obvious about his party’s gallop to the right. The alternative answer, which Jonathan Chait suggests, is that Jeb “is clearly engaged in an effort to position himself as the next leader of the Republican Party.” Chait explains:
To understand what Bush is saying, you need to anticipate how the party might diagnose the causes of a loss in 2012, and then you can see how he is setting himself up as the cure. Bush has been publicly urging Republicans to moderate their tone toward Latinos and to embrace immigration reform. Here is the one issue where Republicans, should they lose, will almost surely conclude that they need to moderate their party stance. The Latino vote is both growing in size and seems to be tilting ever more strongly toward the Democrats, a combination that will rapidly make the electoral map virtually unwinnable. Indeed, the body language of the Romney campaign suggests it already regrets the hard-line stances on immigration it adopted during the primary…
If you try to imagine the Republican consensus after a potential losing election, it will look like this [a moderation in tone, without a moderation in substance]. It will recognize that its harsh partisan rhetoric turned off voters, and will urgently want to woo Latinos, while holding on to as much as possible of the party’s domestic policy agenda. And oh, by the way, the party will be casting about for somebody to lead it.
Chait may indeed be right about what Jeb is thinking. But it’s important to remember that if Romney loses, there will be a vigorous debate within the GOP about why he lost, and the outcome of that debate is not completely certain. Many Republican leaders will certainly argue that the rhetoric got out of hand, and they’ll be right. But lots of other Republicans, including the remnants of the Tea Party and the people who represent them, will argue that there was only one reason Romney lost: he was too liberal. They will push for more hardline positions, more uncompromising obstruction, and more conservative candidates, at all levels but especially when it comes to the 2016 presidential race.
You might say, well, that happened in 2012, didn’t it? And the establishment’s candidate eventually prevailed. That’s true enough, but Mitt Romney had the good fortune to run against a remarkable collection of nutballs and buffoons. It isn’t as though defeating Michele Bachmann, Rick Perry, Herman Cain, Newt Gingrich, and Rick Santorum makes you some kind of giant-killer. After a few months of those primaries, he came out looking like the closest thing the party had to a candidate who was in possession of all his faculties.
In every presidential election in the last half-century with the exception of 2000, Republicans have nominated the person who was “next in line,” almost always someone who had run for president before and come in second. But the closest thing to a next in line for 2016 will be Santorum, and the party couldn’t possibly be dumb enough to nominate him. There will likely be some candidates more acceptable to the establishment, and some who appeal more to the base. But the former group will still feel enormous pressure to move as far right as possible to placate those base voters. In other words, it’s possible Jeb Bush will wind up as the leader of the GOP. But if he does, it won’t be because he’s a moderate. It’ll be because, like Romney, he can give the base the wingnuttery it demands, while winking to the establishment that he’s not as crazy as he sounds.
By: Paul Waldman, Contributing Editor, The American Prospect, June 11, 2012
“Government Is The Solution”: Healing The Economy For The Common Good
Why don’t Democrats just say it? They really believe in active government and think it does good and valuable things. One of those valuable things is that government creates jobs — yes, really — and also the conditions under which more jobs can be created.
You probably read that and thought: But don’t Democrats and liberals say this all the time? Actually, the answer is no. It’s Republicans and conservatives who usually say that Democrats and liberals believe in government. Progressive politicians often respond by apologizing for their view of government, or qualifying it, or shifting as fast as the speed of light from mumbled support for government to robust affirmations of their faith in the private sector.
This is beginning to change, but not fast enough. And the events of recent weeks suggest that if progressives do not speak out plainly on behalf of government, they will be disadvantaged throughout the election-year debate. Gov. Scott Walker’s victory in the Wisconsin recall election had many causes, including his overwhelming financial edge. But he was also helped by the continuing power of the conservative anti-government idea in our discourse. An energetic argument on one side will be defeated only by an energetic argument on the other.
The case for government’s role in our country’s growth and financial success goes back to the very beginning. One of the reasons I wrote my book “Our Divided Political Heart” was to show that, from Alexander Hamilton and Henry Clay forward, farsighted American leaders understood that action by the federal government was essential to ensuring the country’s prosperity, developing our economy, promoting the arts and sciences and building large projects: the roads and canals, and later, under Abraham Lincoln, the institutions of higher learning, that bound a growing nation together.
Both Clay and Lincoln battled those who used states’ rights slogans to crimp federal authority and who tried to use the Constitution to handcuff anyone who would use the federal government creatively. Both read the Constitution’s commerce clause as Franklin Roosevelt and progressives who followed him did, as permitting federal action to serve the common good. A belief in government’s constructive capacities is not some recent ultra-liberal invention.
Decades of anti-government rhetoric have made liberals wary of claiming their legacy as supporters of the state’s positive role. That’s why they have had so much trouble making the case for President Obama’s stimulus program passed by Congress in 2009. It ought to be perfectly obvious: When the private sector is no longer investing, the economy will spin downward unless the government takes on the task of investing. And such investments — in transportation and clean energy, refurbished schools and the education of the next generation — can prime future growth.
Yet the drumbeat of propaganda against government has made it impossible for the plain truth about the stimulus to break through. It was thus salutary that Douglas Elmendorf, the widely respected director of the Congressional Budget Office, told a congressional hearing last week that 80 percent of economic experts surveyed by the University of Chicago’s Booth School of Business agreed that the stimulus got the unemployment rate lower at the end of 2010 than it would have been otherwise. Only 4 percent disagreed. The stimulus, CBO concluded, added as many as 3.3 million jobs during the second quarter of 2010, and it may have kept us from lapsing back into recession.
So when conservatives say, as they regularly do, that “government doesn’t create jobs,” the riposte should be quick and emphatic: “Yes it has, and yes, it does!”
Indeed, our unemployment rate is higher today than it should be because conservatives blocked additional federal spending to prevent layoffs by state and local governments — and because progressives, including Obama, took too long to propose more federal help. Obama’s jobs program would be a step in the right direction, and he’s right to tout it now. But he should have pushed for a bigger stimulus from the beginning. The anti-government disposition has so much power that Democrats and moderate Republicans allowed themselves to be intimidated into keeping it too small.
Let’s turn Ronald Reagan’s declaration on its head: Opposition to government isn’t the solution. Opposition to government was and remains the problem. It is past time that we affirm government’s ability to heal the economy, and its responsibility for doing so.
By: E. J. Dionne, Jr., Opinion Writer, The Washington Post, June 10, 2012
“A Cautionary Tale”: Remember When Breaking The Law Used To Mean Something?
The big piece today is in the Washington Post, where Carl Bernstein and Bob Woodward share a byline for the first time in 36 years. It’s about President Nixon and Watergate 40 years after the fact, and how the whole situation was much worse than was thought back then:
Ervin’s answer to his own question hints at the magnitude of Watergate: “To destroy, insofar as the presidential election of 1972 was concerned, the integrity of the process by which the President of the United States is nominated and elected.” Yet Watergate was far more than that. At its most virulent, Watergate was a brazen and daring assault, led by Nixon himself, against the heart of American democracy: the Constitution, our system of free elections, the rule of law.
Today, much more than when we first covered this story as young Washington Post reporters, an abundant record provides unambiguous answers and evidence about Watergate and its meaning. This record has expanded continuously over the decades with the transcription of hundreds of hours of Nixon’s secret tapes, adding detail and context to the hearings in the Senate and House of Representatives; the trials and guilty pleas of some 40 Nixon aides and associates who went to jail; and the memoirs of Nixon and his deputies. Such documentation makes it possible to trace the president’s personal dominance over a massive campaign of political espionage, sabotage and other illegal activities against his real or perceived opponents.
The article is full of great quotes from the Nixon tapes as he became increasingly paranoid and irrational, going on profanity-laced tirades against journalists, the antiwar movement, and “the Jews,” among others. But what is perhaps most notable about the article is the implicit frame it presents. The sense I get from it is that Woodward and Bernstein are presenting a cautionary tale, a kind of story to tell young politicians before you tuck them into bed. “Be careful, kids, or this is where you’ll end up.”
The trouble with this is that recent cases of elite lawbreaking, reaching up to and including top officials, are still almost too common to count. Just for the most obvious example, consider the fact that George Bush has admitted to ordering the waterboarding of Khalid Sheik Mohammed. There’s a ginned-up controversy about whether or not that was against the law, but don’t take my word for it; listen to the chief law enforcement officer of the United States:
In his confirmation hearing before the Senate Judiciary Committee, Holder declared that the interrogation practice known as waterboarding amounts to torture, departing from the interpretation of his Bush administration predecessors.
And finally, from the UN Convention Against Torture, Article II, signed by President Reagan:
1. Each State Party shall take effective legislative, administrative, judicial or other measures to prevent acts of torture in any territory under its jurisdiction.
2. No exceptional circumstances whatsoever, whether a state of war or a threat of war, internal political instability or any other public emergency, may be invoked as a justification of torture.
Nixon was not the last of the presidential lawbreakers. Far from it.
By: Ryan Cooper, Washington Monthly Political Animal, June 6, 2012
“Say It Ain’t So”: Face It Republicans, Reagan Was A Keynesian
There’s no question that America’s recovery from the financial crisis has been disappointing. In fact, I’ve been arguing that the era since 2007 is best viewed as a “depression,” an extended period of economic weakness and high unemployment that, like the Great Depression of the 1930s, persists despite episodes during which the economy grows. And Republicans are, of course, trying — with considerable success — to turn this dismal state of affairs to their political advantage.
They love, in particular, to contrast President Obama’s record with that of Ronald Reagan, who, by this point in his presidency, was indeed presiding over a strong economic recovery. You might think that the more relevant comparison is with George W. Bush, who, at this stage of his administration, was — unlike Mr. Obama — still presiding over a large loss in private-sector jobs. And, as I’ll explain shortly, the economic slump Reagan faced was very different from our current depression, and much easier to deal with. Still, the Reagan-Obama comparison is revealing in some ways. So let’s look at that comparison, shall we?
For the truth is that on at least one dimension, government spending, there was a large difference between the two presidencies, with total government spending adjusted for inflation and population growth rising much faster under one than under the other. I find it especially instructive to look at spending levels three years into each man’s administration — that is, in the first quarter of 1984 in Reagan’s case, and in the first quarter of 2012 in Mr. Obama’s — compared with four years earlier, which in each case more or less corresponds to the start of an economic crisis. Under one president, real per capita government spending at that point was 14.4 percent higher than four years previously; under the other, less than half as much, just 6.4 percent.
O.K., by now many readers have probably figured out the trick here: Reagan, not Obama, was the big spender. While there was a brief burst of government spending early in the Obama administration — mainly for emergency aid programs like unemployment insurance and food stamps — that burst is long past. Indeed, at this point, government spending is falling fast, with real per capita spending falling over the past year at a rate not seen since the demobilization that followed the Korean War.
Why was government spending much stronger under Reagan than in the current slump? “Weaponized Keynesianism” — Reagan’s big military buildup — played some role. But the big difference was real per capita spending at the state and local level, which continued to rise under Reagan but has fallen significantly this time around.
And this, in turn, reflects a changed political environment. For one thing, states and local governments used to benefit from revenue-sharing — automatic aid from the federal government, a program that Reagan eventually killed but only after the slump was past. More important, in the 1980s, anti-tax dogma hadn’t taken hold to the same extent it has today, so state and local governments were much more willing than they are now to cover temporary deficits with temporary tax increases, thereby avoiding sharp spending cuts.
In short, if you want to see government responding to economic hard times with the “tax and spend” policies conservatives always denounce, you should look to the Reagan era — not the Obama years.
So does the Reagan-era economic recovery demonstrate the superiority of Keynesian economics? Not exactly. For, as I said, the truth is that the slump of the 1980s — which was more or less deliberately caused by the Federal Reserve, as a way to bring down inflation — was very different from our current depression, which was brought on by private-sector excess: above all, the surge in household debt during the Bush years. The Reagan slump could be and was brought to a rapid end when the Fed decided to relent and cut interest rates, sparking a giant housing boom. That option isn’t available now because rates are already close to zero.
As many economists have pointed out, America is currently suffering from a classic case of debt deflation: all across the economy people are trying to pay down debt by slashing spending, but, in so doing, they are causing a depression that makes their debt problems even worse. This is exactly the situation in which government spending should temporarily rise to offset the slump in private spending and give the private sector time to repair its finances. Yet that’s not happening.
The point, then, is that we’d be in much better shape if we were following Reagan-style Keynesianism. Reagan may have preached small government, but in practice he presided over a lot of spending growth — and right now that’s exactly what America needs.
By: Paul Krugman, Op-Ed Columnist, The New York Times, June 7, 2012
“Untempered Individualism”: Conservatives Used To Care About Community
To secure his standing as the presumptive Republican presidential nominee, Mitt Romney has disowned every sliver of moderation in his record. He’s moved to the right on tax cuts and twisted himself into a pretzel over the health-care plan he championed in Massachusetts — because conservatives are no longer allowed to acknowledge that government can improve citizens’ lives.
Romney is simply following the lead of Republicans in Congress who have abandoned American conservatism’s most attractive features: prudence, caution and a sense that change should be gradual. But most important of all, conservatism used to care passionately about fostering community, and it no longer does. This commitment now lies buried beneath slogans that lift up the heroic and disconnected individual — or the “job creator” — with little concern for the rest.
Today’s conservatism is about low taxes, fewer regulations, less government — and little else. Anyone who dares to define it differently faces political extinction. Sen. Richard Lugar of Indiana was considered a solid conservative, until conservatives decided that anyone who seeks bipartisan consensus on anything is a sellout. Even Orrin Hatch of Utah, one of the longest-serving Republican senators, is facing a primary challenge. His flaw? He occasionally collaborated with the late Democratic senator Edward M. Kennedy on providing health insurance coverage for children and encouraging young Americans to join national service programs. In the eyes of Hatch’s onetime allies, these commitments make him an ultra-leftist.
I have long admired the conservative tradition and for years have written about it with great respect. But the new conservatism, for all its claims of representing the values that inspired our founders, breaks with the country’s deepest traditions. The United States rose to power and wealth on the basis of a balance between the public and the private spheres, between government and the marketplace, and between our love of individualism and our quest for community.
Conservatism today places individualism on a pedestal, but it originally arose in revolt against that idea. As the conservative thinker Robert A. Nisbet noted in 1968, conservatism represented a “reaction to the individualistic Enlightenment.” It “stressed the small social groups of society” and regarded such clusters of humanity — not individuals — as society’s “irreducible unit.”
True, conservatives continue to preach the importance of the family as a communal unit. But for Nisbet and many other conservatives of his era, the movement was about something larger. It “insisted upon the primacy of society to the individual — historically, logically and ethically.”
Because of the depth of our commitment to individual liberty, Americans never fully adopted this all-encompassing view of community. But we never fully rejected it, either. And therein lies the genius of the American tradition: We were born with a divided political heart. From the beginning, we have been torn by a deep but healthy tension between individualism and community. We are communitarian individualists or individualistic communitarians, but we have rarely been comfortable with being all one or all the other.
The great American conservative William F. Buckley Jr. certainly understood this. In his book “Gratitude: Reflections on What We Owe to Our Country,” he quotes approvingly John Stuart Mill’s insistence that “everyone who receives the protection of society owes a return for the benefit.” With liberty comes responsibility to the community.
Before the Civil War, conservatives such as Alexander Hamilton and Henry Clay believed in an active federal government that served the common good. This included a commitment to internal improvements (what we now, less elegantly, call infrastructure), public schooling, and the encouragement of manufacturing and science. Clay, an unapologetic supporter of national economic planning, called his program “the American System,” explicitly distinguishing his idea from the British laissez-faire system. (The Club for Growth would not have been pleased.)
Abraham Lincoln, for whom Clay was a hero, built upon this tradition, laying the foundation for our public universities by backing the establishment of land-grant colleges.
Civil War pensions — the first great social insurance program and a central Republican cause — were supporting about 28 percent of men 65 and over by 1910. In 1894, the program’s most expensive year, the pensions accounted for 37 percent of federal spending. Sounds like a massive entitlement program, doesn’t it?
And the first American version of socialized medicine was signed into law in 1798 by that great conservative president, John Adams. The Marine Hospital Service funded hospitals across the country to treat sailors who were sick or injured on the job. There is no record of a mass campaign to repeal AdamsCare.
Mr. Conservative himself, Robert A. Taft, a Republican senator from Ohio and Senate majority leader, urged federal support for decent housing for all Americans in the 1940s. Dwight Eisenhower created the interstate highway system and established the federal student loan program in the 1950s.
More recently, Ronald Reagan never tried to dismantle the New Deal and acknowledged, sometimes with wry humor, the need for tax increases. He was acutely alive to the communal side of conservatism. Nearly all of the pictures in his 1984 “Morning in America” commercial — one of the most famous political ads in our history — invoked community: a father and son working together, tidy neighborhoods, a wedding, young campers earnestly saluting the flag. Reagan spoke regularly not only of the power of the market and the dangers of Soviet communism, but also of the centrality of families and neighborhoods.
George W. Bush, who promoted “compassionate conservatism,” built on old progressive programs with his No Child Left Behind law, using federal aid to education as a lever for reform. And he added a prescription-drug benefit to the Medicare program that Lyndon B. Johnson pushed into law.
In other words, until recently conservatives operated within America’s long consensus that accepted a market economy as well as a robust role for a government that served the common good. American politics is now roiled because this consensus is under the fiercest attack it has faced in more than 100 years.
For most of the 20th century, conservatives and progressives alternated in power, each trying to correct the mistakes of the other. Neither scared the wits out of the other (although campaign rhetoric sometimes suggested otherwise), and this equilibrium allowed both sides to compromise and move forward. It didn’t mean that politics was devoid of philosophical conflicts, of course. The clashes over McCarthyism, the civil rights revolution, the Vietnam War, Watergate and the Great Inflation of the late 1970s remind us that our consensus went only so far. Conservatives challenged aspects of the New Deal-era worldview from the late 1960s on, dethroning a liberal triumphalism that long refused to take conservatism seriously. Over time, even progressives came to appreciate some essential instincts that conservatives brought to the debate.
So why has this consensus unraveled?
Modern conservatism’s rejection of its communal roots is a relatively recent development. It can be traced to a simultaneous reaction against Bush’s failures and Barack Obama’s rise.
Bush’s unpopularity at the end of his term encouraged conservatives, including the fledgling tea party movement, to distance themselves from his legacy. They declared that Bush’s shortcomings stemmed from his embrace of “big government” and “big spending” — even if much of the spending was in Iraq and Afghanistan. They recoiled from his “compassionate conservatism,” deciding, as right-wing columnist Michelle Malkin put it, that “‘compassionate conservatism’ and fiscal conservatism were never compatible.”
That would be true, of course, only if “fiscal conservatism” were confined to reductions in government and not viewed instead as an effort to keep revenue and spending in line with each other, which was how older conservatives had defined the term.
Obama, in the meantime, pitched communal themes from the moment he took office, declaring in his inaugural address that America is “bigger than the sum of our individual ambitions.” The more he emphasized a better balance between the individual and the community, the less interested conservatives became in anything that smacked of such equilibrium.
That’s why today’s conservatives can’t do business with liberals or even moderates who are still working within the American tradition defined by balance. It’s why they can’t agree even to budget deals that tilt heavily, but not entirely, toward spending cuts; only sharp reductions in taxes and government will do. It’s why they cannot accept (as Romney and the Heritage Foundation once did) energetic efforts by the government to expand access to health insurance. It’s why, even after a catastrophic financial crisis, they continue to resist new rules aimed not at overturning capitalism but at making it more stable.
For much of our history, Americans — even in our most quarrelsome moments — have avoided the kind of polarized politics we have now. We did so because we understood that it is when we balance our individualism with a sense of communal obligation that we are most ourselves as Americans. The 20th century was built on this balance, and we will once again prove the prophets of U.S. decline wrong if we can refresh and build upon that tradition. But doing so will require conservatives to abandon untempered individualism, which betrays what conservatism has been and should be.
By: E. J. Dionne, Jr., Opinion Writer, The Washington Post, May 24, 2012