Copyright STAT

Around 1990, a bright, young Harvard academic became interested in the possibility that a relatively unknown peptide might slow gastric emptying and reduce hunger — a potential boon to the treatment of diabetes. Although he was employed as a full-time faculty researcher and clinician at a major teaching hospital and his lab was funded by the National Institutes of Health, he chose to pursue this particular line of research privately with support from a large pharmaceutical company, which required him to keep the work secret and not publish his findings or present them at scientific meetings. Despite the exciting promise that emerged from these studies, the company that funded the work did not allow the researcher and his colleagues to disseminate their findings, and the drugmaker failed to follow up on the enormous potential of what was to become the GLP-1 class of drugs, losing billions of dollars in potential revenue.

We will never know how much these studies might have accelerated the creation of one of the most important therapeutic approaches of the last several decades. The 2024 Lasker Awards were given for the development of this class of medications, much of it accomplished by other Harvard researchers at the Massachusetts General Hospital through federal grants. The unpublished findings of the other early pharma-funded research were not mentioned, nor was Jeffrey Flier, the investigator who believed that such corporate contracts were a good way to support his innovative studies. He remained a proponent of drug company-sponsored university-based research and later became the dean of Harvard Medical School.

This story offers an important lesson during this fraught time in research, as threats to NIH funding have dramatically upended life for researchers working to understand the mechanisms of diseases or discover better ways to treat them: Pharmaceutical industry support cannot replace public funding for research.
The 2025 Nobel Prize in economics was recently awarded to three researchers who studied the origins of innovation and progress; their work emphasized the vital roles played by university scientists and the free flow of ideas.

At my institution, the Trump administration ignored routine federal procedures and froze additional billions of dollars of research funding, supposedly to combat antisemitism. The bulk of those funding cuts have hit our schools of medicine and public health in Boston, even though the student protest-related events that supposedly triggered the government’s action occurred on our arts and sciences campus in Cambridge 3 miles away. Similar off-target penalties have been threatened to punish Columbia and other research-intensive universities. Harvard and Columbia leaders’ initial sluggish and half-hearted responses to those problems have long since been addressed; if sluggish and half-hearted administrative responses were a reason to terminate nearly all public funding, many schools would have had to dip into their endowments years ago.

Despite the mismatch between the problem and the government’s scorched-earth response, the effects on frontline researchers here have been devastating. Many laboratory experiments and clinical trials were stopped dead in their tracks, with consequences that will be difficult to repair. Established world-class programs working productively in areas from basic science to drug development struggle to survive, and many of these researchers are unsure about their salaries in the months to come. Unlike our faculty in departments like anthropology or economics or history, most of us in the university’s clinical departments depend on “soft money” to be paid — either external research support or our clinical work — and we have to generate those dollars ourselves. “You eat only what you kill,” as one helpful mentor explained early in my career.
One solution that has been proposed to these dramatic shortfalls is to replace federal funding with support from the pharmaceutical industry or other corporate entities. After all, the argument goes, drugmakers already fund the work of many university scientists, and we share a common goal: to advance biomedical knowledge and thus the discovery of important new medications. Some academic medical centers are now promoting conferences on how to attract more pharma dollars to pick up the pieces of defunded federal research. Young and senior scientists alike who were funded by NIH grants are now looking for jobs with drug companies simply to make certain they will still have a steady salary a year from now.

There is a long tradition of pharma-funded research in medical schools and their affiliated hospitals, and much important work has come from such collaborations. But before we can take any comfort in additional privatization of medical research, we have to confront some major issues. First, such a transition just won’t scale. Before the current withholding of funds and threatened future cuts, the NIH was spending $48 billion each year on biomedical research. The amount available to universities from pharma is smaller; despite the industry claim that it is the wellspring of pharmaceutical innovation, most major companies spend a far smaller share of revenues on innovative research than on promotion and marketing, stock buyback programs, shareholder dividends, and executive compensation.

More important, drugmakers’ external research expenditures are necessarily oriented to the development and testing of patentable products rather than to the more basic research from which future products will flow. This should be neither surprising nor appalling. A company’s main responsibility is to produce a return on the investment of its shareholders.
Conducting or supporting studies of the basic mechanisms of disease in order to produce results that are not patentable — in the so-called pre-competitive space — is simply not their responsibility. By contrast, work from my group and others has shown that the majority of important new breakthrough drugs had their origins in research that was publicly supported. Detailed studies of this process have shown its importance in treatments for conditions from viral infections to cancer.

One goal of publicly funded research is to publicize its findings and disseminate them throughout the field; that’s how we do our work, get promoted, and acquire new grants. The opposite is of necessity true of many privately supported product-driven studies. As a result, companies that fund research based in universities or their affiliated medical centers sometimes require nondisclosure agreements that can impede the free publication of study results — either because the sponsor wants to maintain a competitive advantage, or because it does not want widespread discussion of findings that might cast its product in an unfavorable light. This is not the best way to advance science.

In the future, much important work will still be funded by well-designed, industry-academic collaborations, as long as the basic principles of open scientific communication are not compromised. But if public grantmaking through the NIH is not fully protected, a huge gap will remain even if we look to industry to make up for the draconian cuts we face. That will apply particularly to the early-stage, non-ownable basic science from which truly innovative new treatments emerge. Greater reliance on corporate largesse can never be a satisfactory alternative to a healthy and adequately budgeted source of peer-reviewed public support.
This is a hard time to be making that case, as my colleagues doing vital and vulnerable health-related research increasingly seek help from pharmaceutical companies to address threats to their funding from the NIH. But we need to understand the important limits of that industry support, and hasten the day that the nation can again be confident about the public funding that until now has made the U.S. the engine of worldwide biomedical discovery and innovation.