“Profits Before Patients”: National Drug Shortages Are Threatening Cancer Patients’ Lives
Millions of Americans battling cancer are facing obstacles to recovery that have nothing to do with the disease’s toll on their bodies. According to a new study, national shortages of cancer drugs are threatening the health of the people who rely on them to stay alive.
According to the survey, presented at an oncology conference in Chicago on Monday, about 83 percent of cancer specialists have experienced a drug shortage at their clinics in the past six months. Of those doctors, 92 percent said the shortage had some effect on their patients’ care.
A little over a third of the doctors facing a shortage ended up switching their patients from a cheaper, generic version of a drug to a more expensive brand-name version. Given that cancer care is already exorbitantly expensive — Americans battling cancer are twice as likely to wind up bankrupt as those who don’t have the disease — that could represent a serious financial strain on those patients.
But cancer patients are facing much more than potential financial hardship. Thanks to the shortages, some cancer specialists can’t find the drugs their patients need at any price. When that happens, doctors are forced to make some painful choices. Nearly 80 percent reported that they switched patients to a different, and potentially less effective, chemotherapy regimen. Some have been forced to give cancers more time to spread further by delaying patients’ treatment or reducing their doses. And 37 percent of the study’s participants even had to choose between their patients, deciding which ones could receive life-saving medication and which ones would have to go without.
William Li, the executive director of a foundation that sponsors research into blood vessel growth, told USA Today that some hospitals are forced to hold lotteries to decide which patients will be able to receive the cancer drugs that are in short supply. “It baffles the mind that this is happening in a modern society,” Li said, pointing out that the FDA should do more to avert drug shortages.
Currently, drug manufacturers can alert the FDA when they suspect an impending shortage, and the federal agency can take steps to try to mitigate the effect on the market, like approving the same kind of drug from a different manufacturer. But so far, that hasn’t been enough to avert the situation. Largely due to manufacturing errors in drug-production facilities across the country, the U.S. faces limited supplies of everything from ADHD medications to painkillers — and cancer patients end up being hit the hardest.
Much of the blame may lie with powerful pharmaceutical companies. One of the co-authors of the new study, Keerthi Gogineni, noted that cancer doctors are concerned drug manufacturers may be prioritizing the most profitable medications over the most life-saving ones. “Some manufacturers have diverted existing production capacity from less profitable agents to more expensive agents,” Gogineni explained. Similarly, a group of over 100 doctors recently criticized Big Pharma for “causing harm to patients” by continuing to sell cancer drugs at unsustainably high prices.
By: Tara Culp-Ressler, Think Progress, June 3, 2013
“A Very Sweet Deal”: Prescription Drug Price-Gouging Enabled By Congress
Republicans and Democrats don’t agree on much. But one thing they would agree on if they knew the facts is that because of the cozy relationship big drug companies have with our lawmakers in Washington, Americans pay far more for their medications than people anywhere else on the planet.
As a consequence, our health insurance premiums are much higher than they should be. And our Medicare program is costing both taxpayers and beneficiaries billions of dollars more than necessary.
Americans who are uninsured are at an even greater disadvantage: many of them have no choice but to put their health at risk because they can’t afford the medications their doctors prescribe for them.
Drug makers have so much influence in Washington that they’ve been able to kill numerous proposals over the years that would enable the U.S. government to regulate drug prices like most other countries do. Between 1988 and 2012, the pharmaceutical industry spent more on lobbying than any other special interest, forking over a total of $2.6 billion on lobbying activities, according to OpenSecrets.org. That’s far more than even banks and oil and gas companies spent.
That money helped them get a very sweet deal when members of Congress were drafting legislation that would eventually be the Medicare Part D prescription drug program. Drug makers were able to get their friends in Congress to insert language in the Part D legislation that prohibits the federal government from seeking the best prices from pharmaceutical companies.
According to a recent analysis by Health Care for America Now (HCAN), an advocacy group, the 11 largest drug companies reported $711.4 billion in profits over the 10 years ending in 2012, much of it coming from the Medicare program. They reaped $76.3 billion in profits in 2006 alone, 34 percent more than in 2005, the year before the Part D program went into effect.
“Americans pay significantly more than any other country for the exact same drugs,” said HCAN Executive Director Ethan Rome.
How much more do we pay than residents of other countries? Here are a few examples of what we pay on average for six brand name drugs compared to what residents of other countries pay, according to the International Federation of Health Plans:
— Celebrex (for pain) – U.S.: $162; Canada: $53
— Cymbalta (for depression and anxiety) – U.S.: $176; France: $47
— Lipitor (for high cholesterol) – U.S.: $124; New Zealand: $6
— Nasonex (for nasal allergies) – U.S.: $108; U.K.: $12
— Vytorin (for high cholesterol) – U.S.: $123; Argentina: $31
— Nexium (for acid reflux) – U.S.: $123; Spain: $18
The Congressional Budget Office says that if Medicare could get the same bulk purchasing discounts on prescription drugs as state Medicaid programs already get, the federal government would save at least $137 billion over 10 years.
In his proposed budget for 2014, President Obama is asking Congress to require drug companies to sell their medications to Medicare at the best price they offer private insurance companies, which is what they are required to do for Medicaid.
On April 16, several members of Congress, led by Sen. Jay Rockefeller (D-W.Va.) and Rep. Henry Waxman (D-Calif.), introduced legislation to require drug companies to provide rebates to the federal government on drugs used by people who are eligible for both Medicare and Medicaid. One of the cosponsors was Independent Sen. Angus King, the former governor of Maine. The lawmakers noted that with the exception of Medicare Part D, all large purchasers of prescription drugs negotiate better prices. Their bill, they say, would correct excessive payments to drug companies, while saving taxpayers and the federal government billions of dollars.
As you can imagine, the drug companies don’t like what President Obama and the lawmakers are proposing. You can expect them to mount a multi-million dollar PR and lobbying campaign over the coming months to protect both their sweet deal with Medicare and their Wall Street-pleasing profits.
By: Wendell Potter, Guest Contributor, Politix, April 23, 2013
“Perpetrating A Healthcare Fraud”: Professors Go Unpunished In Glaxo $3 Billion Guilty Plea Over Paxil
The head of the UCLA hospital, Dr. David Feinberg, and twenty-one other academics are going unpunished despite their role in perpetrating a healthcare fraud that has resulted in the largest fine ever paid by a pharmaceutical company in US history.
On July 3 GlaxoSmithKline pleaded guilty to criminal charges and agreed to pay $3 billion in fines for promoting its bestselling antidepressants for unapproved uses. The heart of the case was an article in a medical journal purporting to document the safety and efficacy of Paxil in treating depression in children. The article listed more than twenty researchers as authors, including UCLA’s Feinberg, but the Department of Justice found that Glaxo had paid for the drafting of the fraudulent article to which the researchers had attached their names.
The study, which, according to The Chronicle of Higher Education, had been criticized because it “dangerously misrepresented data” and had “hidden information indicating that the drug promoted suicidal behavior among teenagers,” was published in 2001 in The Journal of the American Academy of Child and Adolescent Psychiatry. The lead “author” was Martin B. Keller, at the time a professor of psychiatry at Brown University. He retired this month. The article had been exposed as fraudulent in a 2007 BBC documentary and in the 2008 book Side Effects: A Prosecutor, a Whistleblower, and a Bestselling Antidepressant on Trial, by Alison Bass. Glaxo’s guilty plea, according to the Chronicle, included an admission that “the article constituted scientific fraud.”
Paxil went on sale in the US in 1993 and, according to Bass, prescriptions for children “soared” after the study appeared, even though research showed Paxil was not more effective than a placebo. But in 2004, the Chronicle reports, British regulators warned against prescribing Paxil to children, after a study reported that children taking Paxil were nearly three times more likely to consider or attempt suicide. Then the US FDA issued a similar warning. Paxil sales totaled more than $11 billion between 1997 and 2005.
Brown University officials said they had no plans to take action against Keller. At UCLA, Dale Triber Tate, a spokesperson for the medical center and Dr. Feinberg, had no comment. The journal that published the fraudulent research has failed to retract it, and editor-in-chief Andres S. Martin, a professor of psychiatry at Yale, told the Chronicle he had no comment on the options the journal might take.
Feinberg and Keller were among twenty-two people listed as “authors” on the fraudulent article. Others included Karen D. Wagner, now professor and vice chair of psychiatry at the University of Texas Medical Branch at Galveston; Boris Birmaher and Neal D. Ryan, professors of psychiatry at the University of Pittsburgh; Graham J. Emslie, professor of psychiatry at the University of Texas Southwestern Medical Center at Dallas; and Michael A. Strober, professor of psychiatry at UCLA.
Although Glaxo pleaded guilty and paid $3 billion in fines, none of the academics have been disciplined by their universities for their roles in perpetrating research fraud. Moreover, according to the Chronicle, several continue to receive federal grants from the National Institutes of Health.
By: Jon Wiener, The Nation, August 7, 2012
“Pay-For-Delay”: Ending Drug Companies’ Deals
An upcoming report by the Federal Trade Commission shows that brand-name pharmaceutical makers continue to cut questionable deals with generic manufacturers that delay the introduction of cheaper drugs onto the market.
Such pay-for-delay arrangements hurt consumers and increase costs for federal programs such as Medicare and Medicaid, according to the report, a copy of which was obtained by the editorial board. These deals are not illegal, but they should be.
Pharmaceutical companies rightly enjoy strong protections for products that often take years and billions of dollars to develop. These protections were so strong at one point that they discouraged would-be competitors from jumping in. The Hatch-Waxman Act of 1984 meant to address this problem by allowing generics to market “bio-equivalent” drugs as long as they did not infringe on the brand-name drug’s patent; the generic could also proceed if it proved the brand-name patent was invalid. The goal was to enhance competition and lower drug prices. That goal is thwarted when brand-name manufacturers engage in the popular practice of paying generic-drug makers to keep their products off the market.
In 2004, the FTC did not identify a single settlement in a patent litigation matter involving drug makers that raised pay-for-delay concerns. In its new report, the agency points to 28 cases that bear the telltale signs of pay-for-delay, including “compensation to the generic manufacturer and a restriction on the generic manufacturer’s ability to market its product.”
Sens. Charles E. Grassley (R-Iowa) and Herb Kohl (D-Wis.) have introduced the Preserve Access to Affordable Generics Act to close the pay-for-delay loophole. The bill would make such schemes presumptively illegal and empower the FTC to challenge suspicious arrangements in federal court. The most recent version gives companies a chance to preserve certain deals if “clear and convincing evidence” proves that their “pro-competitive benefits outweigh the anti-competitive harms.” The Obama administration estimates that eliminating pay-for-delay could save the government $8.8 billion over 10 years; the Congressional Budget Office offers a dramatically more conservative savings estimate of roughly $3 billion over the same period.
The legislation should appeal to the deficit-reduction “supercommittee,” which has been tasked with identifying ways to cut the federal deficit.
By: Editorial Board Opinion, The Washington Post, October 24, 2011
Corporate Dysmorphia: Why “Business Needs Certainty” Is Destructive
If you read the business and even the political press, you’ve doubtless encountered the claim that the economy is a mess because the threat to reregulate in the wake of a global-economy-wrecking financial crisis is creating “uncertainty.” That is touted as the reason why corporations are sitting on their hands and not doing much in the way of hiring and investing.
This is propaganda that needs to be laughed out of the room.
I approach this issue as a business practitioner. I have spent decades advising major financial institutions, private equity and hedge funds, and very wealthy individuals (Forbes 400 level) on enterprises they own. I’ve run a profit center in a major financial firm and have also operated a consulting business for over 20 years. So I’ve had extensive exposure to the dysfunction I am about to describe.
Commerce is all about making decisions and committing resources in the hope of earning profit when the managers cannot know the future. “Uncertainty” is used casually by the media, but when trying to confront the vagaries of what might happen, analysts distinguish risk from “uncertainty”, which for them has a very specific meaning. “Risk” is what Donald Rumsfeld characterized as a known unknown. You can still estimate the range of likely outcomes and make a good stab at estimating probabilities within that range. For instance, if you open an ice cream store in a resort area, you can make a very good estimate of what the fixed costs and the margins on sales will be. It is much harder to predict how much ice cream you will actually sell. That in turn depends largely on foot traffic, which is largely a function of the weather (and you can look at past weather patterns to get a rough idea) and of how many people visit that town (which is likely a function of the economy and how that particular resort area does in a weak economy).
Uncertainty, by contrast, is unknown unknowns: the sort of risk you can’t estimate in advance. So businesses also have to be good at adapting when Shit Happens. Sometimes the Shit Happening is favorable, but either way businesses need to be able to exploit opportunities (like an exceptionally hot summer producing off-the-charts demand for ice cream) or cope with disasters (like the Fukushima meltdown disrupting global supply chains). That implies having some slack or extra resources at your disposal, or being able to get ready access to them at not too catastrophic a cost.
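The risk half of that distinction — a known unknown you can still put probabilities on — can be made concrete with a toy calculation. The sketch below runs a tiny Monte Carlo for the hypothetical ice cream store; every number in it (fixed costs, margin, weather odds, sales volumes) is invented for illustration and is not drawn from the article.

```python
import random

def simulate_season(n_trials=10_000, seed=42):
    """Toy Monte Carlo of the ice cream store. Fixed costs and per-sale
    margins are 'known knowns'; weather-driven foot traffic is a 'known
    unknown' we can still assign rough probabilities to from past seasons.
    All figures are invented for illustration."""
    rng = random.Random(seed)
    fixed_costs = 30_000       # hypothetical rent, wages, insurance for the season
    margin_per_cone = 2.0      # hypothetical revenue minus ingredient cost, per sale
    # Historical weather patterns let us put rough odds on each kind of summer:
    # (label, probability, cones sold)
    seasons = [("hot", 0.3, 25_000), ("average", 0.5, 18_000), ("cool", 0.2, 10_000)]
    profits = []
    for _ in range(n_trials):
        r = rng.random()
        cumulative = 0.0
        for _, prob, cones in seasons:
            cumulative += prob
            if r <= cumulative:
                profits.append(cones * margin_per_cone - fixed_costs)
                break
    return sum(profits) / len(profits), min(profits), max(profits)

mean, worst, best = simulate_season()
print(f"expected profit ~ ${mean:,.0f}; range ${worst:,.0f} to ${best:,.0f}")
```

The point of the exercise: because the weather distribution is estimable from history, so is the profit range. A true unknown unknown — a Fukushima-scale disruption — has no row in that table to simulate, which is exactly what makes it uncertainty rather than risk.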
So why aren’t businesses investing or hiring? “Uncertainty” as far as regulations are concerned is not a major driver. Surveys show that the “uncertainty” bandied about in the press really translates into “the economy stinks, I’m not in a business that benefits from a bad economy, and I’m not going to take a chance when I have no idea when things might turn around.”
The “certainty” they are looking for is concrete evidence that prevailing conditions have really turned. But with so many people unemployed, growth flagging in advanced economies, China and other emerging economies putting on the brakes as their inflation rates become too high, and a very real risk of another financial crisis kicking off in the Eurozone, there isn’t any reason to hope for things to magically get better on their own any time soon. In fact, if you look at the discussion above, we actually have a very high degree of certainty, just of the wrong sort: namely, that growth will be low to negative for easily the next two years, and quite possibly for a Japan-style extended period.
So why this finger-pointing at intrusive regulations, particularly since they are mysteriously absent? For instance, Dodd-Frank is being watered down in the process of detailed rulemaking, and the famed Obamacare actually enriches Big Pharma and the health insurers.
The problem with the “blame the government” canard is that it does not stand up to scrutiny. The pattern businesses are trying to blame on the authorities — that they aren’t hiring and investing due to intrusive interference — was in fact deeply entrenched before the crisis and was rampant during the corporate-friendly Bush era. I wrote about it back in 2005 for the Conference Board’s magazine.
In simple form, this pattern resulted from the toxic combination of short-termism among investors and an irrational focus on unaudited corporate quarterly earnings announcements and stock-price-related executive pay, which became a fixture in the early 1990s. I called the pattern “corporate dysmorphia”, since like body builders preparing for contests, major corporations go to unnatural extremes to make themselves look good for their quarterly announcements.
An extract from the article:
Corporations deeply and sincerely embrace practices that, like the use of steroids, pump up their performance at the expense of their well-being…
Despite the cliché “employees are our most important asset,” many companies are doing everything in their power to live without them, and to pay the ones they have minimally. This practice may sound like prudent business, but in fact it is a reversal of the insight by Henry Ford that built the middle class and set the foundation for America’s prosperity in the twentieth century: that by paying workers well, companies created a virtuous circle, since better-paid staff would consume more goods, enabling companies to hire yet more worker/consumers.
Instead, the Wal-Mart logic increasingly prevails: Pay workers as little as they will accept, skimp on benefits, and wring as much production out of them as possible (sometimes illegally, such as having them clock out and work unpaid hours). The argument is that this pattern is good for the laboring classes, since Wal-Mart can sell goods at lower prices, providing savings to lower-income consumers like, for instance, its employees. The logic is specious: Wal-Mart’s workers spend most of their income on goods and services they can’t buy at Wal-Mart, such as housing, health care, transportation, and gas, so whatever gains they recoup from Wal-Mart’s low prices are more than offset by the rock-bottom pay.
Defenders may argue that in a global economy, Americans must accept competitive (read: lower) wages. But critics such as William Greider and Thomas Frank argue that America has become hostage to a free-trade ideology, while its trading partners have chosen to operate under systems of managed trade. There’s little question that other advanced economies do a better job of both protecting their labor markets and producing a better balance of trade—in most cases, a surplus.
The dangers of the U.S. approach are systemic. Real wages have been stagnant since the mid-1970s, but consumer spending keeps climbing. As of June, household savings were .02 percent of income (note the placement of the decimal point), and Americans are carrying historically high levels of debt. According to the Federal Reserve, consumer debt service is 13 percent of income. The Economist noted, “Household savings have dwindled to negligible levels as Americans have run down assets and taken on debt to keep the spending binge going.” As with their employers, consumers are keeping up the appearance of wealth while their personal financial health decays.
Part of the problem is that companies have not recycled the fruits of their growth back to their workers as they did in the past. In all previous postwar economic recoveries, the lion’s share of the increase in national income went to labor compensation (meaning increases in hiring, wages, and benefits) rather than corporate profits, according to the National Bureau of Economic Analysis. In the current upturn, not only is the proportion going to workers far lower than ever before—it is the first time that the share of GDP growth going to corporate coffers has exceeded the labor share.
And businesses weren’t using their high profits to invest either:
Companies typically invest in times like these, when profits are high and interest rates low. Yet a recent JP Morgan report notes that, since 2002, American companies have incurred an average net financial surplus of 1.7 percent of GDP, which contrasts with an average deficit of 1.2 percent of GDP for the preceding forty years. While firms in aggregate have occasionally run a surplus, “. . . the recent level of saving by corporates is unprecedented. . . .It is important to stress that the present situation is in some sense unnatural. A more normal situation would be for the global corporate sector—in both the G6 and emerging economies—to be borrowing, and for households in the G6 economies to be saving more, ahead of the deterioration in demographics.”
The problem is that the “certainty” language reveals what the real game is: certainty in top executive pay at the expense of the health of the enterprise and, ultimately, the economy as a whole. Cutting costs is an easy way to produce profits, since the certainty of a good return on your “investment” is high. By contrast, doing what capitalists of legend are supposed to do, finding ways to serve customers better by producing better or novel products, is much harder and involves taking real chances, with very real odds of disappointing results. Even though we like to celebrate Apple, all too many companies have shunned that path in favor of easier ways to burnish their bottom lines. And it has become even more extreme: companies have managed to achieve record profits in a verging-on-recession setting.
Indeed, the bigger problem they face is that they have played their cost-focused business paradigm out. You can’t grow an economy on cost cutting unless you have offsetting factors in play, such as an export led growth strategy, or an ever rising fiscal deficit, or a falling household saving rate that has not yet reached zero, or some basis for an investment spending boom. But if you go down the list, and check off each item for the US, you will see they have exhausted the possibilities. The only one that could in theory operate is having consumers go back on a borrowing spree. But with unemployment as high as it is and many families desperately trying to recover from losses in the biggest item on their personal balance sheet, their home, that seems highly unlikely. Game over for the cost cutting strategy.
And contrary to their assertions, just as they’ve managed to pursue self-limiting, risk-avoidant corporate strategies on a large scale, so too have they sought to use government and regulation to shield themselves from risk.
Businesses have had at least 25 to 30 years of near-complete certainty — certainty that they will pay lower and lower taxes, that they will face less and less regulation, that they can outsource to their hearts’ content (which, even when it does produce savings, comes at a loss of control, increased business-system rigidity, and loss of critical know-how). They have also been certain that unions will be weak to powerless, that states and municipalities will give them huge subsidies to relocate, that boards of directors will put top executives on the up escalator for more and more compensation because director pay benefits from this cozy collusion, that the financial markets will always look to short-term earnings no matter how dodgy the accounting, that the accounting firms will provide plenty of cover, and that the SEC will never investigate anything more serious than insider trading (Enron being the exception that proved the rule).
So this haranguing about certainty simply reveals how warped big commerce has become in the US. Top management of supposedly capitalist enterprises want a high degree of certainty in their own profits and pay. Rather than earn their returns the old-fashioned way, by serving customers well, by innovating, by expanding into new markets, their ‘certainty’ amounts to being paid handsomely for doing things that carry no risk. But since risk and uncertainty are inherent to the human condition, what they have instead engaged in is a massive scheme of risk transfer: increasing rewards to themselves to the long-term detriment of their enterprises and ultimately society as a whole.
By: Yves Smith, Salon, August 14, 2011