How the United States Gave Up Being a Science Superpower

Original link: https://steveblank.com/2025/05/13/how-the-united-states-became-a-science-superpower-and-how-quickly-it-could-crumble/

The United States' historical dominance in science, built on a strong public-private partnership, is now threatened by sweeping funding cuts initiated under the Trump administration. Reductions in NIH grants, research-agency staffing and university reimbursements risk undermining the infrastructure that drives American innovation. Born out of wartime R&D, this system fostered a unique ecosystem: federally funded university research drives technological progress, patents and start-ups. Unlike the United Kingdom's centralized model, the decentralized US approach prizes academic freedom and competition, which produced breakthroughs and commercialization and created industries such as Silicon Valley and biotechnology. The cuts endanger the indirect-cost reimbursement system, which is essential for maintaining university research facilities and administrative support. They may also trigger a talent drain, as countries such as China actively recruit US-trained scientists. If the United States dismantles its research infrastructure, future breakthroughs will happen elsewhere, harming its economy and national security. Policymakers must recognize that R&D investment is critical to growth and job creation.

A Hacker News thread discussing Steve Blank's article "How the United States Gave Up Being a Science Superpower" sparked debate about the decline of US scientific pre-eminence. Commenters attributed the decline to multiple factors, including over-specialization, heavy student debt that discourages creative exploration, and backlash against COVID-19 policies. Some argued that the attacks on scientific research are politically motivated, drawing comparisons to historical episodes such as Germany's destruction of its research-university system. Others pointed to administrative bloat and misplaced priorities within universities, advocating a return to core academic missions and research focus. The discussion touched on potential solutions, such as capping indirect costs and fostering cooperation between the private sector and government, citing the Manhattan Project as an example. There were concerns that weakening the NSF and NIST would leave gaps in standards and research. Political influence and perceived bias in academia were recurring themes.

Original text

US global dominance in science was no accident, but a product of a far-seeing partnership between public and private sectors to boost innovation and economic growth.

Since 20 January, US science has been upended by severe cutbacks from the administration of US President Donald Trump. A series of dramatic reductions in grants and budgets — including the US National Institutes of Health (NIH) slashing reimbursements of indirect research costs to universities from around 50% to 15% — and deep cuts to staffing at research agencies have sent shock waves throughout the academic community.

These cutbacks put the entire US research enterprise at risk. For more than eight decades, the United States has stood unrivalled as the world’s leader in scientific discovery and technological innovation. Collectively, US universities spin off more than 1,100 science-based start-up companies each year, leading to countless products that have saved and improved millions of lives, including heart and cancer drugs, and the mRNA-based vaccines that helped to bring the world out of the COVID-19 pandemic.

These breakthroughs were made possible mostly by a robust partnership between the US government and universities. This system emerged as an expedient wartime design to fund weapons research and development (R&D) in universities. It has fuelled US innovation, national security and economic growth.

But, today, this engine is being sabotaged in the Trump administration’s attempt to purge research programmes in areas it doesn’t support, such as climate change and diversity, equity and inclusion, and to rein in campus protests. But the broader cuts are also dismantling the very infrastructure that made the United States a scientific superpower. At best, US research is at risk from friendly fire; at worst, it’s political short-sightedness.

Researchers mustn’t be complacent. They must communicate the difference between eliminating ideologically objectionable programmes and undermining the entire research ecosystem. Here’s why the US research system is uniquely valuable, and what stands to be lost.

Unique innovation model

The backbone of US innovation is a close partnership between government, universities and industry. It is a well-calibrated ecosystem: federally funded research at universities drives scientific advancement, which in turn spins off technology, patents and companies. This system emerged in the wake of the Second World War, rooted in the vision of US presidential science adviser Vannevar Bush and a far-sighted Congress, which recognized that US economic and military strength hinge on investment in science (see ‘Two systems’).

Two systems: how US and UK science diverged

When Winston Churchill became UK prime minister in 1940, he had at his side his science adviser, physicist Frederick Lindemann. The country’s wartime technical priorities focused on defence and intelligence — such as electronics-based weapons, radar-based air defence and plans for nuclear weapons. Their code-breaking organization at Bletchley Park, UK, was reading secret German messages using the earliest computers ever built.

Under Churchill, Lindemann influenced which projects received funding and which were sidelined. His top-down, centralized approach, with weapons development primarily in government research laboratories, shaped UK innovation during the Second World War — and led to its demise post-war.

Meanwhile, in the United States, Vannevar Bush, a former dean of engineering at the Massachusetts Institute of Technology (MIT) in Cambridge, became science adviser to US president Franklin Roosevelt in June 1940. Bush told him that the war would be won or lost on the basis of advanced technology. He convinced Roosevelt that, although the army and navy should keep making conventional weapons (planes, ships, tanks), scientists could develop more-advanced weapons and deliver them faster. He argued that scientists could be productive only if they worked in civilian-run weapons laboratories based at universities and led by academics. Roosevelt agreed.

In 1941, Bush convinced the president that academics should also be allowed to acquire and deploy weapons, which were manufactured in volume by US corporations. To manage this, Bush created the US Office of Scientific Research and Development. Each division was run by an academic hand-picked by Bush. And they were located in universities, including MIT, Harvard University, Johns Hopkins University, the California Institute of Technology, Columbia University and the University of Chicago.

Nearly 10,000 scientists, engineers, academics and their graduate students received draft deferments to work in these university labs. Their work led to developments in a wide range of technologies, including electronics, radar, rockets, napalm and the bazooka, penicillin and cures for malaria, as well as chemical and nuclear weapons.

The inflow of government money — US$9 billion (in 2025 dollars) between 1941 and 1945 — changed US universities, and the world. Before the war, academic research was funded mostly by non-profit organizations and industry. Now, US universities were getting more money than they had ever seen. They were full partners in wartime research, not just talent pools.

Wartime Britain had different constraints. First, England was being bombed daily and blockaded by submarines, so focusing on a smaller set of projects made sense. Second, the country was teetering on bankruptcy. It couldn’t afford the big investments that the United States made. Many areas of innovation — such as early computing and nuclear research — went underfunded. And when Churchill was voted out of office in 1945, with him went Lindemann and the coordination of UK science and engineering. Post-war austerity led to cuts to all government labs and curtailed innovation.

The differing economic realities of the United States and United Kingdom also shaped their innovation systems. The United States had an enormous industrial base, abundant capital and a large domestic market, which enabled large-scale investment in research and development. In the United Kingdom, key industries were nationalized, which reduced competition and slowed technological progress.

Although UK universities such as Cambridge and Oxford remained leaders in theoretical science, they struggled to commercialize their breakthroughs. For instance, pioneering work on computing at Bletchley Park didn’t turn into a thriving UK computing industry — unlike in the United States. Without government support, UK post-war innovation never took off.

Meanwhile, US universities and companies realized that the wartime government funding for research had been an amazing accelerator for science and engineering. Everyone agreed it should continue.

In 1950, Congress set up the US National Science Foundation to fund all basic science in the United States (except for life sciences, a role that the US National Institutes of Health would assume). The US Atomic Energy Commission spun off the Manhattan Project and the military took back advanced weapons development. In 1958, the US Defense Advanced Research Projects Agency and NASA would also form as federal research agencies. And decades of economic boom followed.

It need not have been this way. Before the Second World War, the United Kingdom led the world in many scientific domains, but its focus on centralized government laboratories rather than university partnerships stifled post-war commercialization. By contrast, the United States channelled wartime research funds into universities, enabling breakthroughs that were scaled up by private industry to drive the nation’s post-war economic boom. This partnership became the foundation of Silicon Valley and the aerospace, nuclear and biotechnology industries.

The US government remains the largest source of academic R&D funding globally — with a budget of US$201.9 billion for federal R&D in the financial year 2025. Out of this pot, more than two dozen research agencies direct grants to US universities, totalling $59.7 billion in 2023, with the NIH and the US National Science Foundation (NSF) receiving the most.

The agencies do this for a reason: they want professors at universities to do research for them. In exchange, the agencies get basic research from universities that moves science forward, or applied research that creates prototypes of potential products. By partnering with universities, the agencies get more value for money and quicker innovation than if they did all the research themselves.

This is because universities can leverage their investments from the government with other funds that they draw in. For example, in 2023, US universities received $27.7 billion from charitable donations, $6.2 billion in industrial collaborations, $6.7 billion from non-profit organizations, $5.4 billion from state and local government and $3.1 billion from other sources — boosting the $59.7 billion up to $108.8 billion (see ‘US research ecosystem’). This external money goes mostly to creating research labs and buildings that, as any campus visitor has seen, are often named after their donors.

US research ecosystem (chart). Source: US Natl Center for Science and Engineering Statistics; US Congress; US Natl Venture Capital Assoc; AUTM; Small Business Administration
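To make the arithmetic behind these figures explicit, here is a minimal sketch in Python that simply adds up the 2023 funding sources quoted above; the numbers come from the text of this article, not from an independent dataset.

```python
# Illustrative only: 2023 funding sources for US university research
# (US$ billions), using the figures quoted in the text above.
funding_sources = {
    "federal research agencies": 59.7,
    "charitable donations": 27.7,
    "non-profit organizations": 6.7,
    "industrial collaborations": 6.2,
    "state and local government": 5.4,
    "other sources": 3.1,
}

total = sum(funding_sources.values())
print(f"Total (2023): ${total:.1f} billion")  # -> Total (2023): $108.8 billion
```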

Thus, federal funding for science research in the United States is decentralized. It supports mostly curiosity-driven basic science, but also prizes innovation and commercial applicability. Academic freedom is valued and competition for grants is managed through peer review. Other nations, including China and those in Europe, tend to have more-centralized and bureaucratic approaches.

But what makes the US ecosystem so powerful is what then happens to the university research: it’s the engine for creating start-ups and jobs. In 2023, US universities licensed 3,000 patents, 3,200 copyrights and 1,600 other licences to technology start-ups and existing companies. Universities also spin off more than 1,100 science-based start-ups each year, which lead to countless products.

Since the 1980 Bayh–Dole Act, US universities have been able to retain ownership of inventions that were developed using federally funded research (see go.nature.com/4cesprf). Before this law, any patents resulting from government-funded research were owned by the government, so they often went unused.

Closing the loop, these technology start-ups also get a yearly $4-billion injection in seed-funding grants from the same government research agencies. Venture capital adds a whopping $171 billion to scale those investments.

It all adds up to a virtuous circle of discovery and innovation.

Facilities costs

A crucial but under-appreciated component of this US research ecosystem is the indirect-cost reimbursement system, which allows universities to maintain the facilities and administrative support necessary for cutting-edge research. Critics often misunderstand the function of these funds, assuming that universities can spend this money on other areas, such as diversity, equity and inclusion programmes. In reality, they fund essential infrastructure: laboratory space, compliance with safety regulations, data storage and administrative support that allows principal investigators to focus on science rather than paperwork. Without this support, universities cannot sustain world-class research.

Reimbursing universities for indirect costs began during the Second World War, and it was as ground-breaking as the weapons development itself. Unlike in a typical fixed-price contract, the government set no requirements for university researchers to meet and no specifications for their research to follow. It asked them to do research and, if the research looked like it might solve a military problem, to build a prototype they could test. In return, the government paid the researchers for their direct and indirect research costs.

Two scientists demonstrate Robert Van de Graaff's 1,500,000-volt generator. Vannevar Bush (right) led the US Office of Scientific Research and Development during the Second World War. Credit: Bettmann/Getty

At first, the government reimbursed universities for indirect costs at a flat rate of 25% of direct costs. Unlike businesses, universities had no profit margin, so indirect-cost recovery was their only way to pay for and maintain their research infrastructure. By the end of the war, some universities had agreed on a 50% rate. The rate is applied to direct costs, so that a principal investigator will be able to spend two-thirds of a grant on direct research costs and the rest will go to the university for indirect costs. (A common misconception is that indirect-cost rates are a percentage of the total grant, for example a 50% rate meaning that half of the award goes to overheads.)
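Because the rate-on-direct-costs convention is so often misread, a short worked example may help. This is a minimal sketch in Python using a hypothetical $1.5-million award at a 50% rate; the award size is an assumption chosen for illustration, not a figure from any real grant.

```python
# Illustrative only: how an indirect-cost rate applied to DIRECT costs splits an
# award. The $1.5 million total is a hypothetical example.
total_award = 1_500_000
rate = 0.50                              # negotiated rate, applied to direct costs

direct = total_award / (1 + rate)        # $1,000,000 -- two-thirds of the award
indirect = direct * rate                 # $500,000   -- one-third of the award
print(f"Direct: ${direct:,.0f}  Indirect: ${indirect:,.0f}")

# The common misconception would instead take 50% of the TOTAL award as overhead:
misread_overhead = total_award * rate    # $750,000 -- overstates the overhead share
```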

After the Second World War, the US Office of Naval Research (ONR) began negotiating indirect-cost rates with universities on the basis of actual institutional expenses. Universities had to justify their overhead costs (administration, facilities, utilities) to receive full reimbursement. The ONR formalized financial auditing processes to ensure that institutions reported indirect costs accurately. This led to the practice of negotiating indirect-cost rates, which is still used today.

Since then, the reimbursement process has been tweaked to prevent gaming the system, but has remained essentially the same. Universities negotiate their indirect-cost rates with either the US Department of Health and Human Services (HHS) or the ONR. Most research-intensive universities receive rates of 50–60% for on-campus research. Private foundations often have a lower rate (10–20%), but tend to have wider criteria for what can be considered a direct cost.

In 2017, the first Trump administration attempted to impose a 10% cap on indirect costs for NIH research. Some in the administration viewed such costs as a form of bureaucratic bloat and argued that research universities were profiting from inflated overhead rates.

Congress rejected this and later added language in the annual funding bill that essentially froze most rates at their 2017 levels. This provision is embodied in section 224 of the Consolidated Appropriations Act of 2024, which has been extended twice and is still in effect.

In February, however, the NIH slashed its indirect reimbursement rate to an arbitrary 15% (see go.nature.com/4cgsndz). That policy is currently being challenged in court.

If the policy is ultimately allowed to proceed, the consequences will be immediate. Billions of dollars of support for research universities will be gone. In anticipation, some research universities are already scaling back their budgets, halting lab expansions and reducing graduate-student funding. This will mean fewer start-ups being founded, with effects on products, services, jobs, taxes and exports.
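To give a rough sense of scale, the following is a purely illustrative Python sketch of what dropping from a typical negotiated rate to 15% could mean for a single institution; the assumed $200 million per year in NIH direct research costs is hypothetical, not a figure taken from this article.

```python
# Illustrative only: annual indirect-cost recovery for a hypothetical university
# with $200 million per year in NIH direct research costs (an assumed figure).
annual_direct_costs = 200_000_000
negotiated_rate = 0.55                   # within the typical 50-60% on-campus range
capped_rate = 0.15                       # the rate the NIH imposed in February

recovery_before = annual_direct_costs * negotiated_rate   # $110 million
recovery_after = annual_direct_costs * capped_rate        # $30 million
shortfall = recovery_before - recovery_after               # $80 million per year
print(f"Annual shortfall under the 15% cap: ${shortfall / 1e6:.0f} million")
```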

Race for talent

The ripple effects of Trump’s cuts to US academia are spreading, and one area in which there will be immediate ramifications is the loss of scientific talent. The United States has historically been the top destination for international researchers, thanks to its well-funded universities, innovation-driven economy and opportunities for commercialization.

US-trained scientists — many of whom have historically stayed in the country to launch start-ups or contribute to corporate R&D — are being actively recruited by foreign institutions, particularly in China, which has ramped up its science investments. China has expanded its Thousand Talents Program, which offers substantial financial incentives to researchers willing to relocate. France and other European nations are beginning to design packages to attract top US researchers.

Erosion of the US scientific workforce will have long-term consequences for its ability to innovate. If the country dismantles its research infrastructure, future transformative breakthroughs — whether in quantum computing, cancer treatment, autonomy or artificial intelligence — will happen elsewhere. The United States runs the risk of becoming dependent on foreign scientific leadership for its own economic and national-security needs.

History suggests that, once a nation loses its research leadership, regaining it is difficult. The United Kingdom never reclaimed its pre-war dominance in technological innovation. If current trends continue, the same fate might await the United States.

University research is not merely an academic concern — it is an economic and strategic imperative. Policymakers must recognize that federal R&D investments are not costs but catalysts for growth, job creation and national security.

Policymakers need to reaffirm the United States’ commitment to scientific leadership. If the country fails to act now, the consequences will be felt for generations. The question is no longer whether the United States can afford to invest in research. It is whether it can afford not to.
