Over the past century, modern medicine has been revolutionised by major advances in pharmaceutical drugs, surgery, diagnostics and most other areas of healthcare. In conjunction with better access to care, better nutrition, sanitation and housing, these advances have seen global life expectancy rise from 31 in 1900 to 72 today; in Ireland it has risen from 54 to 82 since 1911. Yet there are good reasons to worry that the pharmaceutical innovation which has underpinned so much of this progress is both stalling in its ingenuity and becoming too expensive for all but the richest patients.
To understand why this system is not working, it's useful to understand how new drugs are discovered. The search for new drugs starts in the pre-clinical stage. This is where targets, or possible methods of treating an illness, are identified, such as a weakness in cancer cells or a reaction that might lower blood pressure. Molecules thought capable of hitting these targets are synthesised, honed and tested in labs to estimate their likely success, before promising drugs are tested in animal models. Those that still show promise (a tiny proportion of all molecules examined) are tested in humans. About one in every 10 drugs tested in humans is approved for use. The time, difficulty and expense involved in this work are huge, with pharmaceutical companies spending about €1.8 billion for every drug approved.
Pre-clinical research
However, almost all the relevant costs of pre-clinical research have fallen dramatically over the past 60 years: each chemist can synthesise almost 1,000 times as many molecules per year, we can sequence DNA a billion times faster than we could in the 1970s, and we can calculate the dimensions of proteins and compare them against far bigger databases than ever before, making it easier both to identify targets and to identify the molecules that can treat them. Despite this, the research costs associated with discovering a new drug are increasing at an alarming, unsustainable rate. In one study, Dr Jack Scannell compared the amount of money pharmaceutical companies spend on researching new drugs with the number of drugs approved for sale in the United States, and showed that, adjusting for inflation, it costs 80 times more to find a drug today than it did in 1950, with the cost of development doubling every nine years.
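Those two figures are roughly consistent with each other, as a quick back-of-the-envelope calculation shows (the exact study window used below is an illustrative assumption, not a figure from the article):

```python
# Back-of-the-envelope check: if the inflation-adjusted cost of finding a
# drug doubles every 9 years, how much higher is it after roughly six
# decades? (The ~57-year window is an illustrative assumption.)
doubling_period_years = 9
elapsed_years = 57  # roughly 1950 to the late 2000s

multiple = 2 ** (elapsed_years / doubling_period_years)
print(f"cost multiple: {multiple:.0f}x")  # roughly 80x, matching the article's figure
```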
The reasons for these higher costs are multifaceted. Firstly, a new drug needs to be better than existing treatments; as the quality of treatments rises, overcoming this barrier becomes more difficult. Secondly, regulatory standards are much higher today than they were in the 1950s. Companies must now spend far more time and money demonstrating that their drug is safe. This regulation is intended to reduce negative outcomes for patients and is often a good thing. However, there's a risk that it can make it prohibitively difficult to introduce drugs that people need, thus harming patients in less obvious ways.
Finally, if two similar drugs come on to the market at the same time, the drug that gets approval first tends to be far more profitable than the drug that comes out second, as physicians become accustomed to prescribing the former. Companies sometimes spend huge amounts of money to speed up their research and development process even slightly, in order to have the first product on the market.
Little research
A second problem is that large pharmaceutical companies are undertaking less and less research. A recent study of the 18 largest companies in the US pharmaceutical sector found that between 2005 and 2016 they gave more money to their shareholders through share buy-backs and dividends than they invested in research ($516 billion versus $465 billion). Similar research has shown that the world's 10 largest companies spend about 1.5 times as much on marketing as they do on research. This means that most of the money is not spent researching future drugs, but is instead spent on adverts or given to shareholders.
As well as underinvesting in research and earning increasingly poor returns on that investment, there is also some evidence to suggest that pharmaceutical companies are not investing in the areas of greatest need. The industry spends more every year on treating male baldness than on researching treatments for HIV, malaria and TB combined. Those illnesses kill almost three million people, mostly poor people, every year.
It is not hard to imagine why the economics of pharmaceutical research fails these people. More surprisingly, many wealthy people are failed too. Areas where scientific research risks are high, such as Alzheimer’s disease, receive very limited investment, because it’s thought unlikely that companies will recoup their investment here – a drug needs to be successful to earn money.
Pharmaceutical patents last about 20 years on average, but for many of those years the drug is still being researched. The shorter the research period, the longer the period of patent-protected sales. As it takes longer to demonstrate the success of products that prevent rather than treat illness, these receive less attention.
This lack of innovation and these rising costs are easy to see in the health system, where spending on drugs is increasing far faster than health budgets as a whole. In the coming decades, demographic changes will put huge strain on the health system, with the number of people over 70 rising far faster than the number of taxpayers. To sustain high-quality care, we need better research into preventative treatments and to keep drug costs down. The research system should also find solutions for the world's poorest, as well as for those afflicted by illnesses where the science is difficult. This means re-examining how we incentivise and undertake pharmaceutical research, and whether we should reform the patent system underpinning it or have governments fund more of the research directly.
– Anthony McDonnell is a senior health economist at the University of Oxford