
Why the federal government must put more money toward basic science

Development of vaccines to combat COVID-19 moved more quickly than research typically does for most health conditions. (Getty Images)


A consensus is forming in Washington that the federal government is not doing enough to help American innovation. New research suggests that federal underinvestment is contributing to sluggish productivity and eroding America’s global competitiveness. Current public spending on research and development (R&D) stands at roughly $130 billion, dwarfed by the private sector’s more than $450 billion. This is a complete reversal of the pattern of the decades following World War II, when the federal government led the way in R&D spending.

Large increases in federal R&D may now be imminent. Spurred by a pandemic disaster and fierce global competition with China — and buoyed by the success of the government-led effort to develop COVID-19 vaccines — policymakers across the political spectrum are calling for more R&D funding. It’s not just how much money the federal government spends on R&D that matters, however, but what it spends its money on. Lawmakers looking to boost innovation must make sufficient allocations for basic science, which historically has helped make the U.S. a world leader in science and technology. 

In 2020, lawmakers in Congress unveiled a bipartisan, bicameral bill, the Endless Frontier Act, that would allocate $100 billion over five years to create a technology directorate within the National Science Foundation (NSF). A new version of the bill was recently reintroduced and is being considered on the Hill. Additionally, the Biden administration’s massive infrastructure plan would allocate $250 billion for research, including $50 billion to restructure NSF along the same lines as the Endless Frontier Act.

These proposals rightly highlight the need for more federal R&D. But inadequate government support is only one of many problems plaguing American science and innovation. Others include declining research productivity, growing bureaucratization, and concerns about the reliability of many published scientific findings. Less discussed, but no less significant, is the bias against basic science.

Basic science can be understood as research aimed at discovery rather than immediate applications or commercial products. Although such research never enjoyed anything close to a majority of U.S. R&D spending, it now accounts for a far smaller share than it did at its peak in the mid-20th century. This is largely because industry has overtaken government as the leading funder, and only about 5 percent of private R&D funding goes to basic science. The vast majority goes to applied research and development, which is more likely to pay off in the short term.

Yet, many of the most significant technologies of the past century — computers, radar, X-rays, lasers, nuclear energy and mRNA vaccines — are traceable to basic scientific discoveries, sometimes made decades before their applications. Such practical uses are often not foreseen by the researchers who first made the discoveries. Even when they are, bringing ideas to market typically requires the collaborative efforts of countless scientists, engineers and entrepreneurs.

Because many purely scientific discoveries never translate into new technologies, let alone within timeframes meaningful to industry, the private sector is reluctant to take on the risks of funding this kind of research. This leaves the federal government as science’s biggest patron. Yet, today, only about a quarter of federal research dollars goes to basic science, with applied research and development taking the rest. Current proposals to strengthen the American R&D system would worsen this imbalance.

Advocates of increasing federal R&D spending like to point out that government investments in mission-directed research during World War II helped spark the post-war boom. They want to replicate these efforts by directing research dollars toward green energy or other “focus areas” of emerging technology, as both the Biden infrastructure plan and the Endless Frontier Act would do. But these arguments ignore why the government’s wartime research was successful in the first place.

Each of the major wartime inventions (the atomic bomb, radar, computers) was possible only because of basic scientific discoveries made before the war. To be sure, the government’s research programs played a key role. But without prior advances in atomic physics, electromagnetism, and mathematical logic in the 19th and early 20th centuries, none of the war’s iconic inventions would have been conceivable.

So, it may be true that the government’s mission-directed research had long-term effects on the U.S. economy. But these programs were successful, in part, because of the long-term effects of basic scientific discoveries. This suggests that if we want to recover the kind of economic growth we had during the post-war years, merely increasing federal funding is not enough. We also need to prioritize the kind of basic scientific research on which major technological breakthroughs often depend — as seen most recently with the mRNA research used for COVID-19 vaccines.

Unless they make sufficient allocations for basic science, lawmakers’ current proposals to increase federal R&D funding may fail to achieve their stated aim: stimulating U.S. innovation and maintaining America’s global lead in science and technology.

M. Anthony Mills is a resident scholar at the American Enterprise Institute and a senior fellow at the Pepperdine School of Public Policy.


