Last week, Gabrielle Levy, a sophomore at Dartmouth College, outlined her take on Big Pharma for the college's newspaper The Dartmouth: Drugs are expensive, sure, but the research is worth it, and demonizing the pharmaceutical industry plants a seed of anti-science beliefs in the general American populace. Levy's points are a much-needed reminder of the fiscal value of rigorous research in a political climate rightfully preoccupied with corruption and the profit-seeking behaviors of leviathan corporations. But they miss the concerns of consumers and patients entirely.
Let's make one thing very clear: Biotechnology and pharmaceutical research are immensely costly endeavors. In the 2014-2015 fiscal year, 16 of the 50 corporations that spent the most on research and development (R&D) were pharmaceutical companies, according to a 2016 study published in the Journal of Translational Medicine. As Levy points out, the Tufts Center for the Study of Drug Development estimates the cost of developing a drug ultimately approved by the Food and Drug Administration (FDA) to be around $2.6 billion. That process can take years, as a drug moves through preclinical studies and three phases of clinical trials before it is deemed safe enough to go to market.

But the pharmaceutical industry is not what it once was. The days of a corporation investing millions or billions in the development of its own proprietary small molecules are over. Now, basic research conducted in academic institutions worldwide plays a significant role, both financially and through patient outcomes, in the development of new drugs. A 2019 review in Expert Opinion on Drug Discovery found that between 1998 and 2007, 58 percent of drugs approved by the FDA were solely the product of pharmaceutical companies. In stark contrast, every single drug approved between 2010 and 2016 received funding, at least in part, from the National Institutes of Health (NIH), which is itself funded by the federal government. Levy cites a different number, claiming that up to 75 percent of new drugs are funded solely by pharmaceutical companies, but I was unable to substantiate that figure.
As the source of funding shifts, the nature of these companies' research, and their corporate pay structures, changes as well. According to Forbes, in 2013 GlaxoSmithKline, a British firm consistently among the 10 largest pharmaceutical companies by revenue, introduced Wall Street-style bonuses for its researchers, offering up to $15 million to lead researchers whose drugs were approved and predicted to succeed on the market.
These bonuses are largely an acknowledgment that rewarding successful research is a powerful way to motivate new discoveries.
Today, companies are moving away from the traditional goal of unearthing new small molecules on their own and toward biologics and gene therapies primarily discovered and developed in academic settings. The bottom line is that as biotechnology and academia become more intertwined than ever before, new discoveries have the potential to offer massive gains to public health. Important work, however, is expensive.
In that regard, Levy speaks the truth. Anyone who believes the government ought to mindlessly break up pharmaceutical conglomerates and slash prices of newly approved drugs is ignoring the concerted efforts and expenses of hundreds, or sometimes thousands, of scientists helping to bring those drugs to the market. That said, public concerns are not conjured from thin air, and the monopolistic nature of the pharmaceutical industry means that price-gouging does occur.
Profit-seeking behaviors are more concerning now than ever as they are brought to national attention through thousands of litigation efforts seeking recompense from companies like Purdue Pharma. These lawsuits are intended to compensate states and families harmed by corporations that marketed opioids to patients and doctors while knowing that the drugs, such as Purdue's OxyContin, had a high potential for abuse.
There is a silver lining, though. A 2014 study in PLOS ONE, an open-access, peer-reviewed journal publishing research from any scientific discipline, indicated that just a 20 percent cap on the prices of new drugs could offer an additional 10 percent consumer surplus and a 23 percent increase in the number of customers using the drugs. That increase in volume would almost entirely offset the lower prices, netting pharmaceutical companies only about 1 percent less in profits. This indicates that drastic reductions in the price of drugs are not necessary to dramatically change the landscape of pharmaceutical access. Managing prices with the proposed 20 percent cap not only reduces costs for consumers, but also has the potential to improve public health by giving nearly a quarter more consumers access to a drug they otherwise could not afford. The biggest gains would be seen when those additional consumers gain access to life-saving medications like epinephrine, insulin or antibiotics.
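A rough back-of-the-envelope check shows why a steep price cut can coexist with a tiny profit loss. This sketch is purely illustrative; it assumes, simplistically, that profit scales with revenue and that per-unit costs are constant, neither of which the study necessarily assumes:

```python
# Illustrative sketch only: assumes profit scales with revenue
# and per-unit costs are constant (simplifying assumptions).

baseline_price = 1.00      # normalized pre-cap drug price
capped_price = 0.80        # price under the proposed 20 percent cap
baseline_customers = 1.00  # normalized pre-cap customer base
new_customers = 1.23       # 23 percent more customers at the lower price

baseline_revenue = baseline_price * baseline_customers
capped_revenue = capped_price * new_customers

change = (capped_revenue - baseline_revenue) / baseline_revenue
print(f"Revenue change: {change:.1%}")  # roughly -1.6 percent
```

The 23 percent jump in customers recovers nearly all of the revenue lost to the 20 percent price cut, landing in the neighborhood of the study's roughly 1 percent profit loss.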
Clearly, there is room for improvement in the pricing of new drugs, to the benefit of both public health and patients' wallets. Imposing price caps comes at a small cost to pharmaceutical companies: a mere 1 percent profit loss. To a company netting tens of billions in profits annually, even that 1 percent can sound like a nightmare, but as Big Pharma and academia become intertwined, much of the research-funding burden is lifted off pharmaceutical companies and taken on by federal agencies like the NIH. Even still, these companies will likely continue their practice of awarding large cash bonuses. But ultimately, what is the cost of a life? To me, it's whatever it takes. If it takes a seven- or eight-figure bonus to compensate a C-suite executive or lead scientist managing years of critical research and billions of dollars in funding to create a product capable of sparing even a few lives, then maybe that's a necessary cost of driving continuous innovation.