“The Roots And Lessons Of Memorial Day”: A Reminder That Politics Can Have Dire Consequences
Memorial Day is a peculiarly appropriate holiday for our times. Its origins lie in the Civil War, which resulted from the failure of a deeply polarized political system to settle the question of slavery.
Reading the history of the period leading up to the war is jarring because its political conflicts bear eerie similarities to our own — in the sharp regional differences over how the federal government’s powers should be regarded; in the way in which advocates of slavery relied on “constitutional” claims to justify its survival and spread; in the refusal of pro-slavery forces to accept the outcome of the 1860 election; and in the fierce disagreements over how the very words “morality,” “patriotism” and “freedom” should be defined.
Our nation argued over what the Founders really intended and over the Supreme Court’s authority to impose a particular political view — in the case of the Dred Scott decision, it was the pro-slavery view — and to override growing popular opposition to slavery’s expansion. Religious people sundered their ties with each other over the political implications of faith and biblical teachings. And, yes, we struggled over race and racism.
We are not on the verge of a new civil war, and no single issue in our moment matches slavery either in its morally evocative power or as a dividing line splitting the nation into two distinct social systems. But Memorial Day might encourage us to re-engage with the story of the pre-Civil War period (the late David M. Potter’s Pulitzer Prize-winning history of the era, “The Impending Crisis,” has helpfully been reissued) for clues from the past as to how we might understand the present.
The holiday itself and how it was transformed over the years also carry political lessons for us now.
Memorial Day, as veterans are always the first to remind us, is not the same as Veterans Day. Memorial Day honors the war dead; Veterans Day honors all vets. Memorial Day started as Decoration Day on May 5, 1868, initiated by the Grand Army of the Republic, the vast and politically influential organization of Union veterans. The idea was to decorate the graves of the Union dead with flowers. Students of the holiday believe that Gen. John A. Logan, the commander in chief of the GAR (and the Republican vice presidential nominee in 1884), eventually set May 30 as its date because that would be when flowers were in bloom across the country.
The South, of course, saluted the Confederate war dead. A group of women in Columbus, Miss., for example, decorated the graves of the Southern dead at the Battle of Shiloh on April 25, 1866. This and other comparable ceremonies led to a vigorous competition over where the holiday originated.
It was only after World War I that Memorial Day was established as a holiday commemorating the fallen in all American wars. And it was not until 1966 that President Lyndon Johnson declared Waterloo, N.Y., as the official birthplace of Memorial Day, although that has not stopped the disputes over where it began.
Seen one way, the Memorial Day story traces a heartening journey: a nation whose Civil War took the lives of an estimated 750,000 Americans (more than 2 percent of the U.S. population then) could and did gradually come back together. A holiday that was initially a remembrance of those who died because the nation was so riven is now a unifying anniversary whose origins are largely forgotten.
Marking Memorial Day, moreover, may now be more of a moral imperative than it ever was. As a nation, we rely entirely on a military made up of volunteers. We are calling on a very small percentage of our fellow citizens to risk and give their lives on behalf of us all. We should recognize how much we have asked of so few, particularly in the years since 2001.
But it would be a mistake to ignore the roots of Memorial Day in our Civil War. Memorial Day is a call to political responsibility, even more so in some ways than the Fourth of July. The graves that Logan asked his contemporaries to decorate were a reminder that politics can have dire consequences. Distorting political reality (the pro-secession forces, for example, wrongly insisting that the resolutely moderate Abraham Lincoln was a radical) makes resolving differences impossible. As we honor our war dead, let us pause to consider how we are discharging our obligations to their legacy.
By: E. J. Dionne, Jr., Opinion Writer, The Washington Post, May 25, 2014
“A Challenge To Conservative Principles”: Humankind Is Better Off Than It Has Ever Been, And It’s Thanks To Government
There has never been a better time to be a human being than in March 2014. People live longer, wealthier, happier lives than they ever have. Each of the Four Horsemen — disease, famine, war, and death — is being beaten back.
This isn’t just my opinion. The data is incontrovertible. Life expectancy is the highest it’s ever been, and getting higher. Global GDP has never been higher. The number of humans in poverty has never been lower. Wars between nations are almost extinct, and wars in general are getting less deadly.
The notion of human progress isn’t a grand theory anymore; it’s a fact. So why do so many people insist on telling you it’s impossible?
Almost everywhere you turn, some pundit or “literary intellectual” is aching to tell you the “hard, eternal truths” about the way the world works. Progress is a false idol, they’ll say — and worse, an American one. The harsh reality is that nothing ever changes; the sad truth of the human condition is pain and misery.
These people position themselves as besieged truth tellers, braving the wrath of the masses to challenge our dominant, rose-tinted national narrative. In reality, they’re just saying what most people think. A reasonably large majority of Americans think the country’s “best years” are behind it. Post-Great Recession, doom-and-gloom is in.
But while pessimism may be the conventional wisdom nowadays, its intellectual avatars have never been more anemic. Take British philosopher John Gray. Gray has made debunking the notion of “progress” his life’s work, having written two whole books on the matter in addition to innumerable columns and magazine articles. His review of Steven Pinker’s The Better Angels of Our Nature, a book that carefully assembles immense amounts of statistical evidence showing that war and violence claim fewer lives than ever, does not dispute a single bit of Pinker’s data. Incredibly, Gray thinks pointing out that some Enlightenment thinkers disagreed with each other constitutes a devastating rebuttal to Pinker’s detailed empirical argument. The review’s shallowness is emblematic of the general tenor of Gray’s sad crusade.
It’s not just John Gray. Given the enormous amounts of data on the optimists’ side, pessimists have little more than handwaving left to them. The pessimists babble on about “permanent human nature” and “timeless verities.” The optimists cite U.N. life expectancy statistics and U.S. government crime data. Having no answer to books like Pinker’s, Charles Kenny’s Getting Better, or Angus Deaton’s The Great Escape, the pessimists resort to empty pieties.
The irony here is obvious. The pessimists accuse optimists of falling prey to seductive ideological thinking: “the worship of Progress,” as Christian conservative Rod Dreher puts it. Yet the only people being seduced are the pessimists, clutching the pillars of their ideological house while its foundation shatters.
Today’s optimists notice clear evidence that humanity’s lot is getting better — a point that does not require assuming that it must get better as a consequence of some inevitable historical law. Opponents respond by asserting the world simply cannot be getting better, as their own pessimistic theory of history says it’s impossible. The critics of blind faith have put out their own eyes.
The reason that purportedly hard-boiled realists adhere to this absurd pessimistic ideology is plain. Their own political views depend crucially on the idea that nothing about the world can be improved. The clear evidence that human inventions — government, the market, medicine, international institutions, etc. — have improved the world points to devastating truths that adherents of pessimistic ideologies are loath to admit.
The two ideologies I have in mind have been at odds of late: American conservatism and foreign policy “realism.” Yet popular versions of both rely on the notion of an unchanging, conflict-filled political landscape.
For many conservatives, the idea of “progress” constitutes liberalism’s fatal conceit. Russell Kirk put it most eloquently: “Man being imperfect, no perfect social order ever can be created.” Bill Kristol, living proof that movement conservatism has been immune from the happy trends improving the world, is more blunt. “Progressivism is a touchingly simple-minded faith,” he says. “The higher the number of the century, the better things should be. But progressivism happens not to be true.”
Kristol’s understanding of progressivism is wanting, to say the least. But the reason he needs to stamp his feet and deny the evidence of progress is that hard evidence of human improvement challenges his conservative first principles. Improvements in human welfare have come from government — most notably through public health programs, like the campaign against leaded gasoline, but also through institutions like the welfare state and mixed-market economies. It’s no surprise that the wealthiest, healthiest, and happiest countries are all welfare state democracies.
But more fundamentally, human progress runs against the conservative assumption that human nature does not permit fundamental victories over evils like war. Government will always fail, as Kirk suggests, because human nature will frustrate any attempt to eradicate suffering.
But as it turns out, human nature itself is shaped crucially by the institutions we find ourselves surrounded by — including government. The newest research on humanity’s basic psychology, lucidly explained in recent books by neuroscientist Joshua Greene and primatologist Frans de Waal, finds that human “nature” is malleable. We’re naturally inclined toward both conflict and cooperation, and thus have the potential for both great good and great evil. The crucial deciding factor is the circumstances we find ourselves in. The reality of human progress, then, suggests that the political and social arrangements we’ve created are bringing out our better angels. This is a truth the conservative view of human nature cannot abide.
Foreign policy realists are also concerned with human nature, but nowadays they tend to rely more on arguments about “the international system.” For them, global harmony is impossible because nations can never trust each other. Without a world government, no one can really ensure that another country’s army won’t come calling on your doorstep. States are driven to conflict by the need to secure themselves against an ever-present risk to their security.
The decline in violence constitutes an existential threat to this worldview. There is strong evidence that international institutions, trade interdependence, and the spread of democracy have all contributed to war’s decline. If that’s true, then it really does seem like the globe isn’t destined for conflict forever. Neither human nature nor the international system makes war inevitable.
Now, there are real grounds to worry about the future of human progress. Most notably, climate change has the potential to wipe out much of what we’ve accomplished. The reality of human progress isn’t an argument against heading off ecological disaster.
But that crisis hasn’t happened yet. You can simultaneously celebrate the fact that humanity is better off than it has ever been and argue that we need to take drastic action if we want to make sure that progress doesn’t stop with our generation.
So there’s no reason not to sing progress’ praises. Today’s world is much more Lego Movie than True Detective: everything really is kind of awesome, and time is not a damn flat circle.
By: Zack Beauchamp, The Week, March 13, 2014
“Atheists In Tornadoes And Foxholes”: If You Believe Only When There’s An Enemy Army Or A Tornado, You Don’t Believe
If you’ve watched the endless interviews with survivors of natural disasters, you may have noticed that the news media representatives, faced with someone who may be too shocked or nervous before the cameras to offer sufficiently compelling testimony, often do some gentle prompting. “When you saw your home destroyed, were you just devastated?” “You’ve never seen anything like this before, have you?” “Your whole life changed in that moment, didn’t it?” Not everyone who survived a disaster is YouTube clip-ready, so some need to be coached. One such interview, conducted after the tornado tore through Moore, Oklahoma, got some attention. Interviewing a woman as the two stood before the tangled pile of debris that used to be her home and discussed her family’s narrow escape, CNN’s Wolf Blitzer said, “You guys did a great job. I guess you got to thank the Lord. Right?” When she hesitated, Blitzer pressed on. “Do you thank the Lord for that split-second decision?” She paused for a moment before responding, “I’m actually an atheist.” Awkward laughs ensued.
Blitzer’s assumption was understandable; most Americans profess a faith in God, and there is an awful lot of Lord-thanking after a natural disaster. Atheists find this puzzling, to say the least; if God deserves your thanks and praise for being so merciful as to allow you to live through the tornado, maybe He could have been kind enough not to destroy your home and kill 24 of your neighbors in the first place. But at times of crisis, everyone looks for comfort where they can find it.
It’s often said that there are no atheists in foxholes, and I suppose Wolf Blitzer thought the same would be true of tornadoes. But when you stop to think about that old expression, you realize how insulting it is, not just to those who don’t believe in an almighty but also to those who do. It says that the primary basis for religious faith is fear of death, and one’s beliefs are so superficial that they are a function only of the proximity of danger. If you believe only because there’s an enemy army or a tornado bearing down on you, you don’t believe.
Wolf Blitzer will no doubt be more careful next time. And perhaps he’ll learn that those who hold to no religion are a fast-growing group, as many as one in six Americans in most polls, so there’s at least a fair chance that the next disaster survivor he interviews will also be an atheist. Some of those secular folks are becoming more open about it as their numbers increase; for instance, when it was Arizona state representative Juan Mendez’s turn last week to open the legislative session with a prayer, he instead chose an eloquent invocation of “my secular humanist tradition,” including a quote from Carl Sagan. Afterward, Mendez said, “I hope today marks the beginning of a new era in which Arizona’s non-believers can feel as welcome and valued here as believers.”
It’s a nice thought, but it may take a while. There are signs of progress, though. Last week, Pope Francis made news around the world when in a homily, he delivered to his flock the shocking news that atheists are capable of doing good. They may not get to heaven, but on this planet they are not necessarily gripped by evil. This was certainly a step in the direction of mutual understanding that his predecessor was not inclined to make; Pope Benedict was aggressively hostile to those who don’t believe in God, essentially blaming the crimes of the Third Reich on atheism.
But I was surely not the only atheist who was a little underwhelmed by Francis’ generosity of spirit. Atheists are capable of goodness? How kind of him to say. If you heard a man say, “You may not believe it, but women can be intelligent,” you probably wouldn’t respond, “What an admirable statement of his commitment to equality—thanks, Mr. Feminist!” But the bar is pretty low for religious leaders; we expect them to hold that all who do not share their particular beliefs are doomed to an eternity of the cruelest punishments the divine mind can devise. We speak of religious “tolerance” as the most we can expect when it comes to the treatment of other people’s religions. But we “tolerate” not that which we love or respect but that which is unpleasant, painful, or worthy of mild contempt. We tolerate things which we’d just as soon see disappear. You tolerate a hangnail.
Nevertheless, we can give the Pope credit for making a start, even if in public life the most vapid expressions of faith will continue to be the norm. Singers will thank the Lord for delivering unto them a Grammy, smiting the hopes of the other nominees, who are plainly vile in His sight. Football players will gather to pray before a last-second field goal, in the hopes that God will alter his divine plan in their favor and push the ball through the goalposts. And presidents Democratic and Republican will end every speech with “And may God bless the United States of America.” As The Atlantic’s James Fallows has noted many times, this utterly content-free bit of religiosity means nothing more than “This speech is now over.”
I don’t know if hearing that at the end of a speech makes anyone feel more reassured or hopeful about our country’s future. Perhaps it does. But that woman Wolf Blitzer interviewed? The group Atheists Unite put out a call to help her family rebuild their house, setting a goal of raising $50,000. They’re already approaching $100,000. She no doubt feels thankful, but she’ll be thanking her fellow human beings.
By: Paul Waldman, Contributing Editor, The American Prospect, May 27, 2013
“If War Is Hell, What Is Perpetual War?”: The Question That Lindsey Graham Should Be Asked Every Day
I’ve been staring at Sen. Lindsey Graham’s comments yesterday from Fox News Sunday, when he criticized the president’s big counter-terrorism speech, and wondering what it would take to satisfy him that it’s time to declare the Global War On Terrorism over:
At a time we need resolved the most, we are sounding retreat. Our enemies are emboldened all over the planet. Al Qaeda in Iraq is coming back with vengeance, in Libya together. Our friends are uncertain. Syria is falling apart. We are talking about helping the rebels but doing nothing about it. Iran is marching toward a nuclear weapon….
At the end of the day, this is the most tone deaf president I’ve ever — could imagine and making such a speech at a time when our homeland is trying to be — attacked literally every day.
So are the only alternatives for the United States a world free of threats or perpetual war? That would seem to be Graham’s essential argument. And what a forfeiture of national sovereignty he calls for, if we are prohibited from adjusting our national security strategy and returning to a normal constitutional regime so long as one “emboldened” enemy or “uncertain” friend might notice!
The habit, carried over from the Cold War, of waging undeclared wars fought under hazy international and domestic auspices is dangerous enough. The idea that anything other than a permanent war footing invites disaster is an extension of the Cold War “Peace Through Strength” doctrine that in fact rules out peace.
If, as Sherman rightly said, “War is hell,” then what kind of existence do advocates of perpetual war propose for us? It’s a question that Lindsey Graham should be asked to ponder every time he objects to even the smallest steps away from fear and hysteria.
By: Ed Kilgore, Contributing Writer, Washington Monthly Political Animal, May 27, 2013
Marco Rubio’s Foreign Policy: Blind, Irrational, And Dangerous
In a speech at the University of Louisville this week, Sen. Marco Rubio (R-Fla.) warned against U.S. “retreat” from the world, which he claimed would result in a vacuum filled by “chaos” and “tyranny.”
These remarks have been interpreted as a rebuke to the foreign policy views of Rubio’s colleague and possible 2016 rival, Sen. Rand Paul (R-Ky.). But they are more than just an example of intra-party feuding. These statements reflect the seriously flawed assumptions of Rubio and other hawkish interventionists about what American engagement in the world requires, and they reveal just how alarmist and outdated Rubio’s worldview is. And because Rubio’s worldview continues to be the one that prevails among Republican leaders, it merits closer inspection.
“This is what will replace us on the global stage: chaos and tyranny,” Rubio warned. On one level, this is rather crude fear-mongering, but there is more to Rubio’s argument than that. When he warned that “chaos” and “tyrannical governments” will fill a void left by U.S. “retreat,” Rubio was showing his continued reliance on the arguments of Robert Kagan, whose book, The World America Made, Rubio referred to frequently in his foreign policy address at the Brookings Institution last year.
It has become a common hawkish refrain that the U.S. cannot withdraw from any conflict or reduce its commitments anywhere in the world without inviting either chaos or risking the increased influence of authoritarian major powers or both. Kagan has been one of the strongest proponents of this view, and Rubio appears to have adopted most of Kagan’s arguments. This view both overstates the importance of an extremely activist U.S. foreign policy for international stability and underestimates the ability of rising democratic powers to assume regional responsibilities.
The idea that U.S. preeminence in the world must necessarily be “replaced” by the global dominance of authoritarian governments hasn’t made any sense in over 20 years. Today, major authoritarian powers are significantly less powerful and less ambitious in their foreign policy goals than America’s 20th century rivals. Today, many of the world’s rising powers are democratic and have no interest in falling in line behind Chinese or Russian “leadership.” So the implication in Rubio’s speech that there is a danger of another state becoming the world’s predominant military power is sheer alarmism designed to justify an exorbitant military budget that is larger in real terms than it was at the height of the Reagan-era build-up. The fear of being surpassed militarily by another major power has rarely been more unfounded, and the danger to the U.S. from pursuing a less activist role abroad has rarely been smaller. Rubio’s vision of America’s role takes none of this into account.
Another flaw in Rubio’s thinking: His definition of what constitutes engagement with and “retreat” from the world is heavily skewed by his apparent conviction that the U.S. should regularly entangle itself in the internal conflicts of other countries. According to that definition, failing to intervene or to become more involved in the conflict in Syria, for example, is viewed as equivalent to “disengagement.” Rubio wanted a larger, faster intervention in Libya, and he wants greater U.S. involvement in Syria as well. While he said that the U.S. shouldn’t be involved in “every civil war and every conflict,” Rubio’s record to date shows that he has yet to see a high-profile foreign conflict in which he didn’t want the U.S. heavily involved.
There is no danger that the U.S. will cease to engage with the rest of the world. But there are very real dangers that U.S. foreign policy will remain overly militarized and excessively confrontational toward other states. Rubio’s foreign policy would require more of both. The greatest damage the U.S. could do to international peace and stability would be to keep resorting to force to handle crises and disputes as often in this decade as it did in the last. Support for “retreat” is the last thing that Americans need to worry about from their policymakers and political leaders, many of whom remain only too eager to find reasons to sound the attack.
By: Daniel Larison, Contributing Editor at The American Conservative, The Week, March 29, 2013