“If You Vote For A Republican…Beware”: Republican Governors Show Their True Colors Turning Down Billions In Medicaid Expansion
In a 2012 decision, the Supreme Court ruled that states could choose whether or not to accept the Medicaid expansion. In a purely political but predictable move, Republican governor after Republican governor said no to the expansion for their states, even though their community hospitals are bursting at the seams.
Why would any elected official turn down free health care dollars for their citizens? These 24 Republican governors would rather say no to billions of federal dollars that would provide healthcare coverage for millions of destitute people than take funds from the Obama administration. They claim their states could not afford the expansion. The truth is that the federal government pays 100 percent of the cost for the first three years and at least 90 percent thereafter. Hate truly is stronger than compassion in the GOP, and it is costing the party its logic, reason and good business sense. When you turn down health care for millions of citizens, billions of dollars and job creation out of spite, you are not representing the best interests of your constituents.
Many Republicans say they don’t think the government should be involved in keeping its citizens healthy through a government-provided healthcare system. My question is: why is it OK for great government health care to be provided to these elected Republicans, but not OK to provide it to the American people?
Rick Perry, governor of Texas, turned down the Medicaid expansion that would have created 200,000 new jobs in addition to insuring millions of people. As a result of his selfish ideology, Texas will lose more than $9 billion.
In Florida, the healthcare company Columbia/HCA was fined $1.7 billion for Medicare fraud committed while Rick Scott, prior to becoming governor, was its CEO. Now Scott doesn’t want to let Florida’s citizens receive the benefits of the Medicaid expansion. Florida will lose $5 billion.
If Louisiana accepted the ACA provisions and expanded Medicaid, 240,000 people would be eligible for affordable care, yet Governor Bobby Jindal refused.
Gov. Mary Fallin and legislative leaders also rejected the expansion of Medicaid coverage for approximately 175,000 uninsured Oklahomans, leaving the state with no viable overall healthcare plan.
In Pennsylvania, Gov. Corbett’s decision not to accept the expansion will leave $500 million in federal funds on the table in 2014. Those funds could provide health care for 500,000 people, give a financial boost to hospitals and local healthcare providers, and create upwards of 35,000 jobs.
Likewise, Governor Christie of New Jersey vetoed a bill that would have permanently established the Medicaid expansion.
By 2022, Georgia, Missouri, North Carolina and Virginia will all lose more than $2 billion each.
Expanding Medicaid coverage costs less than 1 percent of state budgets on average, while refusing the funds is leading to state budget shortfalls and health facility closures.
While the Republicans are quick to send our military into harm’s way, they are less eager to take care of them when they return home.
About 1.3 million veterans are uninsured nationwide. According to a report by Pew, approximately 258,600 of these veterans are living below the poverty line in states refusing to expand Medicaid. Without veterans’ benefits and with incomes too low to qualify for subsidies on the state exchanges, these veterans are left without affordable coverage options.
State governors owe their citizens the best health care available, whether those citizens are veterans, indigent or simply sick. But that isn’t what these Republican governors are doing. They are placing their political ideology over their citizens’ health.
The states with the most uninsured and the poorest people are the same states refusing to take federal funds to help their people. Instead of embracing the Medicaid expansion, they are shunning it as if it were a plague. While taxpayers in all states fund the Medicaid expansion, only people in half the states, those with Democratic governors, are reaping the advantages of those tax dollars, jobs and medical benefits.
Democratic Maryland Gov. Martin O’Malley, on the day after the Affordable Care Act of 2010 was signed into law, appointed a task force to prepare his state to accept more Medicaid money and establish rules on how it would be spent. Its program will offer 300 insurance options provided by 12 private insurance companies and nine managed-care systems. These aren’t government programs but private ones — just like the coverage carried today by millions of Americans.
Governor Beshear (D) of Kentucky has made the Medicaid expansion a key component of his administration. He quickly embraced the ACA, realizing that Kentucky’s 640,000 uninsured residents would be able to get insurance through the Medicaid expansion and coverage through the health benefit exchanges.
Every American citizen of voting age has the right to vote for a Democrat, a Republican or someone from one of the smaller parties. But if you vote for a Republican… beware of what you might lose as a result.
By: Gerry Myers, CEO, President and Co-founder of Advisory Link; The Huffington Post Blog, May 26, 2014
“Actions Speak For Themselves”: Talking About Race Is No Black-And-White Matter
When Sen. Jay Rockefeller (D-W.Va.) remarked last week that some of the opposition to President Obama’s Affordable Care Act is “maybe he’s of the wrong color,” he was just saying out loud what many people believe. And no, he wasn’t calling Sen. Ron Johnson (R-Wis.) a “racist.”
Believing that some of the Republican and tea party opposition to Obama has to do with his race is not, I repeat not, the same as saying that anyone who disagrees with the nation’s first black president is racist.
Speaking Wednesday at a sparsely attended Senate commerce committee hearing, Rockefeller said this subject is “not something you’re meant to talk about in public.” He’s retiring from the Senate at the end of the year and, well, he’s a Rockefeller, so I imagine he feels free to talk about anything he likes.
Johnson was the only Republican senator in the room when Rockefeller made the remark. He took umbrage, telling Rockefeller, “I found it very offensive that you would basically imply that I’m a racist because I oppose this health-care law.” He later added, “I was called a racist. I think most people would lose their temper, Mr. Chairman.”
But Rockefeller didn’t call him a racist. Nor did he “play the race card,” as Johnson accused him of doing.
My purpose here is not to convince everyone that Rockefeller is right about the massive GOP resistance to Obama — although I certainly agree with him — but rather to consider the things we say when we want to avoid talking about race. “You called me a racist” and “You played the race card” have become all-purpose conversation stoppers.
Whenever I write about race, some readers react with one or the other of these end-of-discussion criticisms. Some people believe, or pretend to believe, that mentioning race in almost any context is “playing the race card.” Nearly 400 years of history — since the first Africans landed at Jamestown in 1619 — amply demonstrate that this view is either Pollyannaish or deeply cynical. We will never get to the point where race is irrelevant if we do not talk about the ways in which it still matters.
As for the “called-me-a-racist” charge, I go out of my way not to do that. All right, I did make an exception for Cliven Bundy and Donald Sterling — I wrote that they were not “the last two racists in America” — but I think most people would agree that I was on solid ground. Their own words and actions proved the point.
In general, I try to focus on what a person does or says rather than speculate on what he or she “is.” How can I really know what’s in another person’s heart?
Is it true, as Dallas Mavericks owner Mark Cuban opined, that everyone is a little bit racist? Beats me. I know that psychologists, sociologists and anthropologists have written sheaves of peer-reviewed papers about implicit or unconscious bias, and I have no reason to doubt this research. But no generalized finding says anything definitive about a given individual.
In the end, all we can do is look at what the individual does, listen to what he or she says and then draw conclusions about those words and deeds.
I’m reminded of a tea party rally at the Capitol four years ago when Congress was about to pass the Affordable Care Act. I can’t say that the demonstrators who hissed and spat at members of the Congressional Black Caucus were racists — but I saw them committing racist acts. I can’t say that the people holding “Take Back Our Country” signs were racists — but I know this rallying cry arose after the first African American family moved into the White House.
I believe Rockefeller was justified in looking at the vehemence and implacability of Republican opposition to the Affordable Care Act and asking whether the president’s race is a factor. I believe there are enough words and deeds on the record to justify Rockefeller’s subsequent comment that race “is a part of American life . . . and it’s a part — just a part — of why they oppose absolutely everything that this president does.”
Sen. Tim Scott of South Carolina, the only black Republican in Congress, said it was “ridiculous” to think GOP opposition to the health-care reforms had anything to do with race.
Referring to Rockefeller, Scott added: “I can’t judge another man’s heart.” On this, at least, we agree.
By: Eugene Robinson, Opinion Writer, The Washington Post, May 26, 2014
“Rand Paul Is A Deeply Cynical Politician”: It’s Hard To Spot The Conviction But The Hypocrisy Is Evident
When Washingtonians refer to Rand Paul as a different breed of politician than his father, they generally mean it in a good way. The implication is that he is more pragmatic and tactical (probably more tactful, too); that his worldview has broader appeal. Whereas Ron Paul is way too much of a crank to ever have a shot at winning the GOP presidential nomination, Rand increasingly looks like a contender.
But whatever you might think about the elder Paul, you can say this for him: He is not cynical. He is a conviction politician, however repugnant some of us may find his convictions. The younger Paul? Well, he certainly styles himself a man of conviction. But at this point in his presidential quest, it’s getting hard to say for sure.
Take this story on Rand Paul’s “evolving” foreign policy views in Saturday’s New York Times. The premise of the piece is that Paul is being somewhat unfairly attacked by the hawkish wing of his party, whose members often fail to see the distinction between his father’s isolationism and his more nuanced brand of non-interventionism. As evidence, the piece adduces this rather eye-catching data point:
Morton Klein, president of the Zionist Organization of America and a close associate of [GOP mega-donor Sheldon] Adelson’s, said that when he pressed Mr. Paul to explain his position on aid to Israel in a recent meeting in the senator’s Washington office, Mr. Klein left reassured. “He said if there was a vote and for any reason it seemed like it was actually going to be close, he would vote for it,” Mr. Klein said.
So, if Klein’s account is correct (and the Times presumably ran it by Senator Paul), what we have is as follows: Paul’s public position is that we should cut off all foreign aid, including aid to Israel, which he dubbed “welfare” back in 2011. But if Paul were ever in a position to end aid to Israel—which is to say, the only time his personal position would really matter—he would abandon that position, and instead vote to ensure that the aid continues.
I’m not sure I can think of a more irresponsible position. If Rand Paul thinks aid to Israel is truly important, then it’s deeply cynical to bad-mouth that aid simply because bad-mouthing appeals to the type of voter he’s courting. And if he thinks aid to Israel is irredeemably wasteful, then it’s deeply cynical to fink out when given the opportunity to roll it back. Either way, it’s hard to spot the conviction here.
In fairness, Paul did try to resolve this tension at another event, telling the board of the Republican Jewish Coalition that, in the Times’ paraphrasing, “while he would eventually like to terminate all foreign aid, he knew that would not be realistic now.” The most charitable interpretation of this riff is that Paul would like to cut off aid as soon as possible, but realizes you can’t do it abruptly without triggering major blowback among U.S. allies that would damage our standing around the world. That would indeed speak to his pragmatism.
But this interpretation seems like a stretch given that Paul’s comments appear to have been a lot less coherent than that, or at least less specific. “You could see he was a work in progress,” former George W. Bush spokesman Ari Fleischer, who attended the meeting, told the Times. Instead, it’s hard to avoid the impression that Paul is simply trying to reassure neoconservatives that he’ll be with them on the issue they care about most, but without junking a big source of his political appeal. That’s not an “evolution.” It’s hypocrisy.
By: Noam Scheiber, The New Republic, May 26, 2014
“The Longest War”: Afghanistan, The Soon To Be Forgotten War
President Obama made a surprise visit to Afghanistan yesterday, telling American troops that while “Afghanistan is still a very dangerous place,” they can take pride in what they’ve accomplished. “More Afghans have hope in their future, and so much of that is because of you.” As we honor the service members who gave their lives in all of America’s wars, it’s a good time to ask how we’ll look at the longest one we’ve ever fought. By the time we wind down our mission there at the end of this year, the Afghanistan war will have lasted over 13 years.
Here’s a prediction, one I make with no pleasure: when we pull most of our troops out of the country later this year, most Americans will quickly try to forget Afghanistan even exists.
Consider this: How much have you thought about Iraq lately? When the last U.S. troops left there in December 2011 after nearly nine years of war, the public was relieved that we could finally wash our hands of what was probably the worst foreign policy disaster in American history, with over 4,000 Americans dead (not to mention hundreds of thousands of Iraqis) and a couple of trillion dollars spent, all for a war sold on false pretenses. But unless you’ve been paying attention to the stories on the inside pages, you may not have noticed that Iraq is not exactly the thriving, peaceful democracy we hoped we would leave behind. The country is beset by factional violence; according to the United Nations, 7,818 Iraqi civilians were killed in attacks in 2013. No country in the world saw more terrorism.
I’m not arguing that there’s much we can do about it now, or that we should have stayed. But as far as Americans are concerned, Iraq’s problems are now Iraq’s to solve, and most of us would rather just not think about it.
We’ll be keeping troops in Afghanistan after the end of this year, to do targeted counterterrorism and training of Afghan forces. The number hasn’t yet been determined, but it will be small enough that we can say we’re no longer at war there. And for all we know today, things could turn out great. Perhaps the Afghan government will manage to clear itself of the corruption with which it has been infected, and perhaps the country will not be riven by factional violence. Perhaps we will leave behind a state with enough strength and legitimacy to hold the country together. But if those things don’t happen, most Americans won’t want to hear about it.
Afghanistan will get put in the same corner of our minds we now place Iraq. So many misguided decisions from those at the top, so much sacrifice from those on the ground, and for what? The answer is too painful to contemplate, so we’ll prefer to thank the veterans for their service and not spend too much time thinking about the larger questions of what the war meant.
By: Paul Waldman, The Plum Line, The Washington Post, May 26, 2014
“What Did The Framers Really Mean?”: It Wasn’t To Trump The Public Good
Three days after the publication of Michael Waldman’s new book, “The Second Amendment: A Biography,” Elliot Rodger, 22, went on a killing spree, stabbing three people and then shooting another eight, killing four of them, including himself. This was only the latest in the string of mass shootings going back to Columbine.
In his rigorous, scholarly, but accessible book, Waldman notes such horrific events but doesn’t dwell on them. He is after something else. He wants to understand how it came to be that the Second Amendment, long assumed to mean one thing, has come to mean something else entirely. To put it another way: Why are we, as a society, willing to put up with mass shootings as the price we must pay for the right to carry a gun?
The Second Amendment begins, “A well-regulated Militia, being necessary to the security of a free State,” and that’s where Waldman, the president of the Brennan Center for Justice at the New York University School of Law, begins, too. He has gone back into the framers’ original arguments and made two essential discoveries, one surprising and the other not surprising at all.
The surprising discovery is that of all the amendments that comprise the Bill of Rights, the Second was probably the least debated. What we know is that the founders were deeply opposed to a standing army, which they viewed as the first step toward tyranny. Instead, their assumption was that the male citizenry would all belong to local militias. As Waldman writes, “They were not allowed to have a musket; they were required to. More than a right, being armed was a duty.”
Thus the unsurprising discovery: Virtually every reference to “the right of the people to keep and bear Arms” — the second part of the Second Amendment — was in reference to military defense. Waldman notes the House debate over the Second Amendment in the summer of 1789: “Twelve congressmen joined the debate. None mentioned a private right to bear arms for self-defense, hunting or for any purpose other than joining the militia.”
In time, of course, the militia idea died out, replaced by a professionalized armed service. Most gun regulation took place at the state and city level. The judiciary mostly stayed out of the way. In 1939, the Supreme Court upheld the nation’s first national gun law, the National Firearms Act, which put onerous limits on sawed-off shotguns and machine guns — precisely because the guns had no “reasonable relation” to “a well-regulated militia.”
But then, in 1977, there was a coup at the National Rifle Association, which was taken over by Second Amendment fundamentalists. Over the course of the next 30 years, they set out to do nothing less than change the meaning of the Second Amendment, so that its final phrase — “shall not be infringed” — referred to an individual right to keep and bear arms, rather than a collective right for the common defense.
Waldman is scornful of much of this effort. Time and again, he finds the proponents of this new view taking the founders’ words completely out of context, sometimes laughably so. They embrace Thomas Jefferson because he once wrote to George Washington, “One loves to possess arms.” In fact, says Waldman, Jefferson was referring to some old letter he needed “so he could issue a rebuttal in case he got attacked for a decision he made as secretary of state.”
Still, as Waldman notes, the effort was wildly successful. In 1972, the Republican platform favored gun control. By 1980, the Republican platform opposed gun registration. That year, the N.R.A. gave its first-ever presidential endorsement to Ronald Reagan.
The critical modern event, however, was the Supreme Court’s 2008 Heller decision, which tossed aside two centuries of settled law, and ruled that a gun-control law in Washington, D.C., was unconstitutional under the Second Amendment. The author of the majority opinion was Antonin Scalia, who fancies himself the leading “originalist” on the court — meaning he believes, as Waldman puts it, “that the only legitimate way to interpret the Constitution is to ask what the framers and their generation intended in 1789.”
Waldman is persuasive that a truly originalist decision would have tied the right to keep and bear arms to a well-regulated militia. But the right to own guns had by then become conservative dogma, and it was inevitable that the five conservative members of the Supreme Court would vote that way.
“When the militias evaporated,” concludes Waldman, “so did the original meaning of the Second Amendment.” But, he adds, “What we did not have was a regime of judicially enforced individual rights, able to trump the public good.”
Sadly, that is what we have now, as we saw over the weekend. Elliot Rodger’s individual right to bear arms trumped the public good. Eight people were shot as a result.
By: Joe Nocera, Opinion Writer, The New York Times, May 26, 2014