“Mr. States Rights In A Political Pickle”: How The Constitution May Screw Rand Paul For 2016
Rand Paul has a little-discussed problem. Yes, he’s riding a wave. Yet another new poll brings happy tidings, putting him at the top of the GOP heap in both Iowa and New Hampshire (although still well behind “undecided”). He keeps doing these clever things that titillate the Beltway sages, like teaming up with Democratic Sen. Cory Booker (ooh, he’s black!) on sentencing reform. All this, you know. He’s a shrewdie, we have to give him that.
But here’s what you maybe don’t know. Paul is up for reelection in 2016. One assumes that he would want to hold on to his Senate seat. If he ran for president, he would hardly be the first person hoping to appear on a national ticket while simultaneously seeking reelection, although the other examples from the last 30 years have all been vice-presidential candidates: Paul Ryan in 2012, Joe Biden in 2008, Joe Lieberman in 2000, and… trivia question, who’s the fourth?
For those candidates, it wasn’t a problem. But it is for Paul, because under Kentucky law, he cannot run for two offices at the same time. The law has been on the books in the Bluegrass State for a long time. Paul quietly asked that it be changed, and the GOP-controlled state senate acquiesced this past session. But the Democrats hold the majority in the lower house, and they let the bill expire without voting on it. Unless the Kentucky state house’s Democratic majority is possessed of a shockingly benevolent character unlike every other legislative majority I’ve ever encountered, I reckon it won’t be rushing to pass it.
Paul has said that he’d just ignore the law.
We should pause to appreciate that: Rand Paul, of all people, arguing that states don’t have the authority to dictate the rules for federal elections. Yes, Mr. States’ Rights insists that this is the province of the federal government!
It gets even better. The tradition that states set the rules of their own elections, as they always have, was not handed to us by a bunch of pinko mid-century judges but, lo and behold, by the Framers themselves. I give you Article I, Section 4 of the Constitution: “The Times, Places and Manner of holding Elections for Senators and Representatives, shall be prescribed in each State by the Legislature thereof; but the Congress may at any time by Law make or alter such Regulations, except as to the Places of choosing Senators.” So not only is Mr. States’ Rights backing the federal jackboot as long as it’s kicking on his behalf, but Mr. Tea Party Strict Constitutionalist is challenging the Constitution!
Here’s what the Supreme Court has had to say on the matter. There are two cases that are most relevant, U.S. Term Limits Inc. v. Thornton and Cook v. Gralike. In those cases, the court held that Arkansas and Missouri’s respective term-limit laws added extra qualifications to seek office that weren’t found in Article I, Sections 2 and 3 of the Constitution (the sections that state the qualifications for candidates for the House of Representatives and the Senate). That is, the court protected candidates who had served X number of terms and were thus, under those states’ laws, prohibited from seeking office again. You can’t do that, said the court to states; you’re in essence adding an extra-constitutional “qualification” for office (that a candidate can’t have served more than three terms). Sen. Paul can argue that Kentucky’s law imposes an extra-constitutional qualification on him—that if he wants to run for president, the state has added the “qualification” that he not also run for Senate.
I’m no lawyer, but that sounds like a reach to me. A term-limits law is a clear imposition of an added qualification. But a law requiring that a person seek only one office at a time seems to me like a perfectly reasonable thing for a state to decide, under the word “manner” in the relevant constitutional passage, if it wants to. States have had these laws for a long time. Florida has one, too, and Marco Rubio—also up for reelection in 2016 and also considering a White House run—has defended it and said of running for the presidency: “I think, by and large, when you choose to do something as big as that, you’ve really got to be focused on that and not have an exit strategy.”
Paul said in June: “Can you really have equal application of federal law if someone like Paul Ryan or Joe Lieberman can run for two offices, but in Kentucky you would be disallowed? It seems like it might not be equal application of the law to do that. But that means involving a court, and I don’t think we’ve made a decision on that. I think the easier way is to clarify the law.” Touching. I doubt Paul worries too much about the “equal application of federal law” for pregnant women who live in states where they’ve found ways to shut down every federally legal abortion clinic. And of course, historically speaking, there are the black Kentuckians and Southerners generally who weren’t soaking up much equal application of federal law until the passage of the Civil Rights Act that Paul so famously told Rachel Maddow in 2010 he would have opposed.
Paul is going to be in a political pickle over this. Remember, a presidential candidate has never done this in modern history, just vice-presidential ones (trivia answer: Lloyd Bentsen in 1988). Vice president—who really cares. But president? Even if he prevailed in court, can a person really run for president of the United States while also seeking another office? Rubio sounds right here to me. This is the presidency. It just seems cheesy. Plain and simple, Paul should have to choose.
By: Michael Tomasky, The Daily Beast, July 18, 2014
“Gun Laws And What The Second Amendment Intended”: When The NRA Didn’t Support Everything That Goes ‘Bang’!
As school shootings erupt with sickening regularity, Americans once again are debating gun laws. Quickly talk turns to the Second Amendment.
But what does it mean? History offers some surprises: It turns out in each era, the meaning is set not by some pristine constitutional text, but by the push and pull, the rough and tumble of public debate and political activism. And gun rights have always coexisted with responsibility.
At 27 words long, the provision is the shortest sentence in the U.S. Constitution. It reads: “A well regulated militia, being necessary to the security of a free state, the right of the people to keep and bear arms, shall not be infringed.”
Modern readers squint at its stray commas and confusing wording. The framers believed in freedom to punctuate.
It turns out that to the framers, the amendment principally focused on those “well regulated militias.” These militias were not like anything we know now: Every adult man (eventually, every white man) served throughout his adult life. He was actually required to own a gun, and to bring it from home.
Think of the minutemen at Lexington and Concord, who did battle with the British army. These squads of citizen soldiers were seen as a bulwark against tyranny. When the Constitution was being debated, many Americans feared the new central government could crush the 13 state militias. Hence, the Second Amendment. It protected an individual right – to fulfill the public responsibility of militia service.
What about today’s gun-rights debates? Surprisingly, there is not a single word about an individual right to a gun for self-defense in the notes from the Constitutional Convention; nor with scattered exceptions in the transcripts of the ratification debates in the states; nor on the floor of the U.S. House of Representatives as it marked up the Second Amendment, where every single speaker talked about the militia. James Madison’s original proposal even included a conscientious objector clause: “No person religiously scrupulous of bearing arms shall be compelled to render military service in person.”
To be clear, there were plenty of guns in the founding era. Americans felt they had the right to protect themselves, especially in the home, a right passed down from England through common law. But there were plenty of gun laws, too. Boston made it illegal to keep a loaded gun in a home, due to safety concerns. Laws governed the location of guns and gunpowder storage. New York, Boston and all cities in Pennsylvania prohibited the firing of guns within city limits. States imposed curbs on gun ownership. People deemed dangerous were barred from owning weapons. Pennsylvania disarmed Tory sympathizers.
That balance continued throughout our history, even in the Wild West. A historic photo of Dodge City, Kansas, the legendary frontier town, shows a sign planted in the middle of its main street: “The Carrying of Fire Arms Strictly Prohibited.” Few thought the Constitution had much to say about it.
Through much of history, this balance evoked little controversy. Even the National Rifle Association embraced it. Today the NRA is known for harsh anti-government rhetoric, but it was started to train former Union soldiers in marksmanship. In the 1930s, the group testified for the first federal gun law. In 1968, its American Rifleman magazine told its readers the NRA “does not necessarily approve of everything that goes ‘Bang!’”
Of course, over the past three decades, the NRA shifted sharply. At the group’s 1977 annual meeting, still remembered as the “Revolt at Cincinnati,” moderate leaders were voted out and the organization was recast as a constitutional crusade.
Together with even more intense advocates, such as the Second Amendment Foundation, of Bellevue, Washington, they are quick to decry any gun laws as an assault on a core, sacred constitutional right. They waged a relentless constitutional campaign to change the way we see the amendment.
Remarkably, the first time the Supreme Court ruled that the Second Amendment recognizes an individual right to gun ownership was in 2008. The decision, District of Columbia v. Heller, rang loudly. But a close read shows that Justice Antonin Scalia and his colleagues make the familiar point that gun rights and responsibilities go together. The court said that, like all constitutional rights, there could be limits. “Nothing in our opinion should be taken to cast doubt on longstanding prohibitions on the possession of firearms by felons and the mentally ill, or laws forbidding the carrying of firearms in sensitive places such as schools and government buildings, or laws imposing conditions and qualifications on the commercial sale of arms,” Scalia wrote.
That’s how judges have interpreted this constitutional right. Dozens of courts have examined gun laws since 2008. Overwhelmingly they have upheld them, despite the claims of gun-rights attorneys. Yes, there is an individual right to gun ownership — but with rights come responsibilities. Society, too, has a right to safety, and there is a compelling public interest in laws to keep guns out of the hands of dangerous people.
To be sure, the final scope of the constitutional provision has not been determined. The Supreme Court has not spoken again. It is infallible because it is final, as Justice Robert Jackson once wrote, not final because it is infallible. But the greatest controversy revolves around issues such as the rules for carrying a gun outside the home.
So what does the Second Amendment really mean? From the debate over the Constitution to today’s gun fights, the answer is really up to us, to the people. That answer changes over time. But one thing has remained surprisingly constant: Americans cherish freedom, but believe passionately that rights demand responsibilities. It’s hard to think of an area where that insight matters more than when it comes to ensuring that lethal weapons do not fall into the wrong hands.
By: Michael Waldman, President of the Brennan Center for Justice at New York University School of Law; The National Memo, July 14, 2014
“What Did The Framers Really Mean?”: It Wasn’t To Trump The Public Good
Three days after the publication of Michael Waldman’s new book, “The Second Amendment: A Biography,” Elliot Rodger, 22, went on a killing spree, stabbing three people and then shooting another eight, killing four of them, including himself. This was only the latest mass shooting in recent memory, going back to Columbine.
In his rigorous, scholarly, but accessible book, Waldman notes such horrific events but doesn’t dwell on them. He is after something else. He wants to understand how it came to be that the Second Amendment, long assumed to mean one thing, has come to mean something else entirely. To put it another way: Why are we, as a society, willing to put up with mass shootings as the price we must pay for the right to carry a gun?
The Second Amendment begins, “A well-regulated Militia, being necessary to the security of a free State,” and that’s where Waldman, the president of the Brennan Center for Justice at the New York University School of Law, begins, too. He has gone back into the framers’ original arguments and made two essential discoveries, one surprising and the other not surprising at all.
The surprising discovery is that of all the amendments that comprise the Bill of Rights, the Second was probably the least debated. What we know is that the founders were deeply opposed to a standing army, which they viewed as the first step toward tyranny. Instead, their assumption was that the male citizenry would all belong to local militias. As Waldman writes, “They were not allowed to have a musket; they were required to. More than a right, being armed was a duty.”
Thus the unsurprising discovery: Virtually every reference to “the right of the people to keep and bear Arms” — the second part of the Second Amendment — was in reference to military defense. Waldman notes the House debate over the Second Amendment in the summer of 1789: “Twelve congressmen joined the debate. None mentioned a private right to bear arms for self-defense, hunting or for any purpose other than joining the militia.”
In time, of course, the militia idea died out, replaced by a professionalized armed service. Most gun regulation took place at the state and city level. The judiciary mostly stayed out of the way. In 1939, the Supreme Court upheld the nation’s first national gun law, the National Firearms Act, which put onerous limits on sawed-off shotguns and machine guns — precisely because the guns had no “reasonable relation” to “a well-regulated militia.”
But then, in 1977, there was a coup at the National Rifle Association, which was taken over by Second Amendment fundamentalists. Over the course of the next 30 years, they set out to do nothing less than change the meaning of the Second Amendment, so that its final phrase — “shall not be infringed” — referred to an individual right to keep and bear arms, rather than a collective right for the common defense.
Waldman is scornful of much of this effort. Time and again, he finds the proponents of this new view taking the founders’ words completely out of context, sometimes laughably so. They embrace Thomas Jefferson because he once wrote to George Washington, “One loves to possess arms.” In fact, says Waldman, Jefferson was referring to some old letter he needed “so he could issue a rebuttal in case he got attacked for a decision he made as secretary of state.”
Still, as Waldman notes, the effort was wildly successful. In 1972, the Republican platform favored gun control. By 1980, the Republican platform opposed gun registration. That year, the N.R.A. gave its first-ever presidential endorsement to Ronald Reagan.
The critical modern event, however, was the Supreme Court’s 2008 Heller decision, which tossed aside two centuries of settled law, and ruled that a gun-control law in Washington, D.C., was unconstitutional under the Second Amendment. The author of the majority opinion was Antonin Scalia, who fancies himself the leading “originalist” on the court — meaning he believes, as Waldman puts it, “that the only legitimate way to interpret the Constitution is to ask what the framers and their generation intended in 1789.”
Waldman is persuasive that a truly originalist decision would have tied the right to keep and bear arms to a well-regulated militia. But the right to own guns had by then become conservative dogma, and it was inevitable that the five conservative members of the Supreme Court would vote that way.
“When the militias evaporated,” concludes Waldman, “so did the original meaning of the Second Amendment.” But, he adds, “What we did not have was a regime of judicially enforced individual rights, able to trump the public good.”
Sadly, that is what we have now, as we saw over the weekend. Elliot Rodger’s individual right to bear arms trumped the public good. Eight people were shot as a result.
By: Joe Nocera, Opinion Writer, The New York Times, May 26, 2014
“The Presidency Comes With Executive Power, Deal With It”: Obama’s Just Doing What He’s Empowered To Do
In his State of the Union address, President Barack Obama vowed to act on his own if Congress did not do its part. Republicans duly took the bait. “We don’t have a monarchy in this country,” said Representative Steve Scalise of Louisiana. “The abuse of power by the administration has only become more brazen,” said Senator Ted Cruz.
Obama has unsheathed the sword of executive power, and yet rather than use it to smite his foes, he seems intent on clipping hedges. He says he will raise the minimum wage for a few thousand employees of federal contractors, tinker with the pension system, trim red tape, cajole business leaders to fund pre-kindergarten education, and do something unspecified to help stop gun violence.
Obama begged Congress for help far more often than he vowed to go it alone. Obama’s significant acts of executive power—the Libya intervention, the refusal to defend DOMA before the Supreme Court, non-enforcement of the immigration law against certain groups, climate regulation, NSA surveillance, recess appointments, executive privilege, and so on—lie in the past.
So we have a paradox. In his first term, Obama humbly beseeched Congress for help and sang the virtues of bipartisanship while resorting to unilateral action whenever he needed to. Today, he announces his defiance of Congress yet seems uninterested in using his newly acknowledged executive powers to, for example, shut Guantanamo Bay or raise the debt ceiling on his own.
Be that as it may, it is worth understanding what is at stake in these debates. We all learned in school that the founders feared executive power and so gave policy-making authority to Congress. In fact, the founders feared a too-powerful Congress as well, and they sought to create a strong executive. But the idea that Congress makes law and the president executes it—and any deviation from this pattern is tyranny—is burned into our political culture.
This system of separation of powers was cumbersome from the start. The country did well in its first few decades probably because state governments led the way, and state government structure was far less rigid than federal structure, which finally collapsed with the Civil War. When the communications and transportation revolutions created national markets and new opportunities and threats in foreign relations, it was finally clear that the federal separation-of-powers system could not manage policy at a national level.
The problem was that Congress was an enormously clumsy institution. Its numerous members fiercely advanced their deeply parochial interests. Policies of great importance for one section of the country, or one group of people, could not be embodied in legislation unless logrolling could be arranged, which was slow, difficult, and vulnerable to corruption. As a public, deliberative body, Congress could not react swiftly to changing events, nor act secretly when secrecy was called for.
No one held a constitutional convention to replace the eighteenth-century constitution with a twentieth-century one. Instead, political elites acting through the party system adjusted the government structure on their own. Congress created gigantic regulatory agencies and tasked the president to lead them. Congress also acquiesced as presidents asserted authority over foreign policy. The Supreme Court initially balked at the legislative delegations but eventually was bullied into submission; it hardly ever objected to the president’s dominance over foreign affairs.
This was not a smooth process. The rise of executive power sometimes hurt important interests and always rubbed against the republican sensibilities that Americans inherited from the founders. From time to time, Congress reaped political benefits from thwarting the president. But today Congress reacts rather than leads. It investigates allegations of corruption in the executive branch. It holds hearings to torment executive officials. It certainly doesn’t always give the executive the budget he wants, or pass every new law that he believes he needs. But existing laws and customs almost always give the president the power he needs to govern. And when they don’t, Congress will sooner or later give him the power he wants. Witness the Dodd-Frank Act and the Affordable Care Act—two massive expansions of executive power.
In monarchies, the official position was that the king made policy but everyone understood that his ministers did. In our system, the official story is that Congress makes policy and the president implements it—such is the inertia of history. But the reality is that the president both makes policy and implements it, subject to vague parameters set down by Congress and to its carping from the sidelines. Presidents can defy the official story and assert the reality if they want. That is what the George W. Bush administration did, to its eventual sorrow. In hindsight, the broad assertions of executive power by Bush administration lawyers in signing statements, executive orders, and secret memos were naïve. They described, with only some exaggeration, the actual workings of the government, but their account conflicted with the official narrative and thus played into the hands of critics, who could invoke tyranny, dictatorship, and that old standby, the “imperial presidency.”
Democratic presidents have been shrewder. Bill Clinton and Obama have been just as muscular in their use of executive power as Ronald Reagan and Bush, but they resisted the temptation to brandish the orb and scepter. Whereas Republican presidents cite their constitutional powers as often as they can, Democratic presidents avoid doing so except as a last resort, preferring instead to rely on statutes, torturing them when necessary to extract the needed interpretation. Thus did Obama’s lawyers claim that the military intervention in Libya did not violate the War Powers Act because the U.S. bombing campaign did not amount to “hostilities” (the word in the statute). A more honest legal theory—one that does not require such a strained interpretation of a word—is that the War Powers Act infringes on the president’s military powers, but a theory like that would have provoked howls of protest.
In most cases, lawyers do not need to resort to such measures because Congress has already granted authority. The president’s power to raise the minimum wage comes from the Federal Property and Administrative Services Act of 1949, which, in typically broad language, permits the president to set contract terms with federal contractors so as to promote “efficiency.” Far from being a bold assertion of executive power, this is the type of humdrum presidential action that takes place every day.
Congress gave the president the power to determine contract terms because Congress did not want to—practically speaking could not—negotiate those terms itself every time the U.S. government entered a contract. This principle explains why Congress gives the executive branch enormous discretion to determine health, education, environmental, and financial policy. Congress directed the financial regulators to implement the Volcker Rule, but it would be entirely up to those regulators to make the rule meaningful or toothless. Nor can Congress block Obama’s decision to effectively implement the Dream Act—which was not passed by Congress—by not enforcing immigration laws against those who would have benefited from the act.
Meanwhile, the founders’ anxieties about executive tyranny have proven erroneous. The president is kept in check by elections, the party system, the press, popular opinion, courts, a political culture that is deeply suspicious of his motives, term limits, and the sheer vastness of the bureaucracy which he can only barely control. He does not always do the right thing, of course, but presidents generally govern from the middle of the political spectrum.
Obama’s assertion of unilateral executive authority is just routine stuff. He follows in the footsteps of his predecessors on a path set out by Congress. And well should he. If you want a functioning government—one that protects citizens from criminals, terrorists, the climatic effects of greenhouse gas emissions, poor health, financial manias, and the like—then you want a government led by the president.
By: Eric Posner, The New Republic, February 3, 2014
“Jelly Belly Flag Wavers”: Remembering Why The Right Doesn’t Own The Stars and Stripes
Like many men who volunteered for the U.S. Army in World War II, my late father never boasted about his years in uniform. A patriot to his core, he nevertheless despised what he called the “jelly-bellied flag flappers.” But in the decade or so before he passed away, he began to sport a small, eagle-shaped pin on his lapel, known as a “ruptured duck.” Displaying the mark of his military service said that this lifelong liberal loved his country as much as any conservative — and had proved it.
Are such gestures still necessary today? For decades right-wingers have sought to establish a near-monopoly on patriotic expression, all too often with the dumb collusion of some of their adversaries on the left. But on July 4, when we celebrate the nation’s revolutionary founding, I always find myself pondering just how fraudulent and full of irony this right-wing tactic is. It is only our collective ignorance of our own history that permits conservatives to assert their exclusive franchise on the flag, the Declaration of Independence, and the whole panoply of national symbols, without provoking brutal mockery.
But we need not play their style of politics to argue that the left is equally entitled to a share of America’s heritage — indeed, in the light of history, perhaps more entitled than its rivals. So let’s begin, in honor of the holiday, at the official beginning.
Although “right” and “left” didn’t define political combat at that time on these shores, there isn’t much doubt that behind the American Revolution, and in particular the Declaration of Independence, was not only a colonial elite but a cabal of left-wing radicals as well.
What other description would have fitted such figures as Samuel Adams and Thomas Paine, who declared their contempt for monarchy and aristocracy? Their wealthier, more cautious colleagues in the Continental Congress regarded Adams as a reckless adventurer “of bankrupt fortune,” and Paine as a rabble-rousing scribbler. Popular democracy was itself a wildly radical doctrine in the colonial era, tamed in the writing of the Constitution by the new nation’s land-owning elites and slaveholders.
The right-wingers of the Revolutionary era were Tories — colonists who remained loyal to the British crown, fearful of change and, in their assistance to the occupying army of George III, the precise opposite of patriots. Only from the perspective of two centuries of ideological shift can the republican faith of the Founding Fathers be described as “conservative.”
The Civil War, too, was a struggle between left and right, between patriots and … well, in those days the Confederate leaders were deemed traitors (an epithet now usually avoided out of a decent concern for Southern sensibilities). Academics will argue forever about that war’s underlying economic and social causes, but it was the contemporary left that sought to abolish slavery and preserve the Union, while the right fought to preserve slavery and dissolve the Union. Today, reverence for the Confederacy remains the emotional province of extremely right-wing Southern politicians and intellectuals (as well as the Ku Klux Klan and neo-Nazi skinheads, and not a few members of the Tea Party). These disreputable figures denigrate Lincoln, our greatest president, and wax nostalgic for the plantation culture.
At the risk of offending every furious diehard who still waves the Stars and Bars, it is fair to wonder what, exactly, is patriotic about that?
Yet another inglorious episode in the annals of conservatism preceded the global war against fascism. The so-called America First movement that opposed U.S. intervention against Hitler camouflaged itself with red, white and blue but proved to be a haven for foreign agents who were plotting against the United States. While Communists and some other radicals also initially opposed American entry into World War II for their own reasons, the broad-based left of the New Deal coalition understood the Axis threat very early. Most conservatives honorably joined the war effort after Pearl Harbor, but more than a few on the right continued to promote defeatism and appeasement even then. And with all due respect to neoconservatives and other late-arriving right-wingers, the historical roots of postwar conservatism — the “Old Right” of Joe McCarthy and Pat Buchanan, the Buckleys and the Kochs — can be traced to those prewar sympathizers of the Axis.
The criminal excesses of the Cold War in Vietnam and elsewhere, so eagerly indulged by the right to this day, alienated many Americans on the left from their country for a time. Conservatives seized the opportunity presented by flag-burning protests and other adolescent displays to marginalize their ideological opponents as un-American, although only a tiny minority dove off that deep end. But how many conservatives like Dick Cheney and Rush Limbaugh beat the Vietnam draft while liberals like John Kerry, Al Gore, and Wesley Clark all served? And who truly protected this country’s best interests back then — the politicians who dispatched 50,000 young Americans to their deaths in the rice paddies, or those who dissented?
It is a lesson we didn’t learn in time to save us from another debacle in Iraq, when dissent was again vilified – and again proved more sane and patriotic than the bloodlust of the chicken-hawks.
Yet somehow our wingers always manage to wrap themselves in Old Glory, as if it belongs to them alone. But on this holiday, and every day, it assuredly does not.
By: Joe Conason, The National Memo, July 2, 2013