I recently had an interesting conversation with an executive engineer from a spaceflight company. Our conversation began with satellites and rockets but soon came around to the Higgs boson discovery at CERN. With a concerned expression, he bluntly told me he did not understand why anybody would spend billions of dollars just to see a tiny particle that exists only for an instant.
I was taken aback. Here was a man who had spent his whole life as an engineer—surely he should appreciate the value of scientific discovery, especially something as fundamental as the explanation for mass. Apparently, he did not.
That attitude seems increasingly common today. Traditionally, the goal of science has been elucidating nature and discovering its laws. However, the public now seems to view science's primary goal as developing technology and creating products.[1] Earlier this year Steven Weinberg, a Nobel laureate in physics, wrote about that change of attitude in both politicians and the public.[2] During a congressional briefing in June, Craig Mello, winner of the 2006 Nobel Prize in Physiology or Medicine, also pointed out the trend.[3] And in July Lawrence Krauss, a prominent theoretical physicist, remarked that the public values science because of 'the practical applications.'[4]
If Weinberg, Mello, and Krauss are right—if the public truly thinks that science is all about innovation—what will it mean for the discipline? What will be the ramifications?
Basic research could lose funding
Science-policy analysts usually talk about two types of research: basic and applied. Basic research is conducted to investigate nature, whereas applied research is conducted to develop a specific technology or product. Although the line separating the two is often fuzzy, companies and governments usually differentiate between them for budgeting purposes. Historically, strong funding has been allocated to both. Under the mindset that science is about 'making the world a better place,' however, inquiry is valuable only if it eventually leads to new technologies. When funding is limited, as in recent years, it will flow to research that yields direct, timely results. What manufacturer would have paid Albert Einstein to develop special relativity?
Basic research, although it can be an expensive and uncertain undertaking, is the foundation for all future applied R&D. If we do not conduct basic research today, there will be no applied research tomorrow. And basic research needs funding.
Leslie Tolbert, senior vice president for research at the University of Arizona, is concerned that applied research is already receiving funding priority. In her testimony before the US House Committee on Science, Space, and Technology, she warned, 'In recent years, federal financial support for research has not kept pace with what is needed. . . . Simultaneously, private-sector companies that do research and development (R&D) increasingly have tightened their focus to more applied research and development, leaving the universities and national labs most of the fundamental (or 'basic') research.'[5]
Fundamental theories may not be pursued
Science is all about fundamental theories: explaining how the natural world works. One observes the facts, develops a hypothesis, makes predictions based on that hypothesis, and tests the predictions. If the predictions are correct, then the hypothesis is probably right. But in a world where science is all about deliverables, such theorizing would have no place.
Imagine ancient researchers observing falling objects and wanting to harness the phenomenon to develop new technology, say a catapult. They note that objects tend to fall at the same rate, regardless of weight. They develop not only experiments but also equations that accurately predict an object's trajectory. If you asked them why all objects fall in that manner, they would probably say they don't know. All that matters for the catapult is that its makers be able to predict, not explain.
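To make the distinction concrete, here is a minimal sketch of the purely predictive rule such catapult builders would need. It assumes a launch speed $v_0$, a launch angle $\theta$, a uniform downward acceleration $g$, and no air resistance:

$$x(t) = v_0 t \cos\theta, \qquad y(t) = v_0 t \sin\theta - \tfrac{1}{2} g t^2, \qquad R = \frac{v_0^2 \sin 2\theta}{g},$$

where $R$ is the range on level ground. The equations pinpoint where the projectile lands, yet they say nothing about why every object accelerates at the same rate $g$; that explanation had to wait for Newton.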
The preference for application is a problem because explanation, not mere prediction, is fundamental to science. Why do objects fall to the ground? Why do some compounds react with others? How do atoms hold themselves together? Prediction is important, but the real breakthroughs come with revolutions in explanation.
Consider Max Planck. He theorized the existence of quantized energy to explain an oddity in blackbody radiation, an oddity that the classical theory of electromagnetic waves failed to account for. From Planck's equation, scientists developed quantum theory, which has radically reshaped our understanding of the world.
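For readers who want the equation behind that story: Planck's hypothesis, that radiation of frequency $\nu$ exchanges energy only in packets of size $E = h\nu$, yields his law for the spectral radiance of a blackbody at temperature $T$,

$$B_\nu(T) = \frac{2h\nu^3}{c^2}\,\frac{1}{e^{h\nu/k_B T} - 1}.$$

Unlike the classical Rayleigh–Jeans result, which it reproduces at low frequencies, Planck's expression remains finite at high frequencies and so matches the observed spectrum. He introduced the constant $h$ simply to fit the data; the deeper explanation came later.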
Niels Bohr, Erwin Schrödinger, and other quantum pioneers would not have spent decades developing something as elegant and comprehensive as quantum theory if all they cared about had been creating products. Planck had no idea that his theory would underlie a new age of electronic computing; he was just looking for an explanation. To develop something like quantum theory, somebody needs to be looking for a fundamental theory. If we aren't looking for the laws of nature, we won't find them.
Certain disciplines could disappear
Theoretical astrophysics, cosmology, and high-energy particle physics can tell us a lot about the universe, but they don't necessarily lead to better smartphones or more innovative medical procedures. Rather, researchers in those fields are motivated by a desire to understand the universal principles that govern the world.[6] Occasionally, they may come up with spinoff technologies from the machines used in their experiments, but funding the experiments for the technology alone would be horribly inefficient.
Science in the US is already seeing funding cuts, as Weinberg reported earlier this year:
In the past few years funding has dropped for astrophysics at NASA. In 2010 the National Research Council carried out a survey of opportunities for astronomy in the next ten years, setting priorities for new observatories that would be based in space. . . . No funds are in the budget for any of these.[7]
Things are not much better in particle physics. The US canceled funding for its next particle accelerator almost two decades ago.[8]
Where real discoveries come from
To see the true foundation of discovery, we must consider the history of science.[9] Isaac Newton, possibly the greatest scientist the world has ever known, pursued the link between mathematics and the physical world. He did that not to create machines and develop technology but to 'afford some light either to this or some truer method of [natural] philosophy,'[10] as stated in his Principia.
The scientists who make the real breakthroughs are still the ones who are looking for the laws of nature. John Mather, who won the Nobel Prize in Physics in 2006 for his measurements of the cosmic microwave background radiation, said he was compelled by 'one of the most exciting endeavors of the twentieth century: the quest to understand how the universe began and how it has evolved since. That humans could even contemplate supplying answers to such questions filled me with awe.'[11] Craig Mello, credited with the discovery of RNA interference, bluntly detailed his group's motivation for biology research: 'We're not trying to make a drug. We're trying to discover how the darn cell actually works.'[3]
Even in corporate-sponsored labs, the motivation behind major discoveries was inquisitiveness rather than innovation. Bell Labs, which once was a powerhouse of research and discovery, produced dozens of devices that remain familiar today. Nevertheless, as Jon Gertner noted in his history of the labs, 'The teams at Bell Labs that invented the laser, transistor and solar cell were not seeking profits. They were seeking understanding. Yet in the process they created not only new products but entirely new—and lucrative—industries.'[12]
Why public perception matters
Fortunately, the mindset that 'science is about technological advancement' has not yet infiltrated the scientific community. Most scientists still understand that science is really about discovery. It is vital, however, that we pay attention to what the public thinks. The public's perception of science will define the direction of the discipline, for two reasons.
First, public perception determines where the money goes. Corporations and governments fund research based on what they think its purpose is. We see that happening today. Research Councils UK lists its priorities for scientific research in terms of economic and social benefits.[13] Likewise, China recently announced that its focus for scientific research will be on 'translating research into technologies that can power economic growth and address pressing national needs.'[14] Any purely basic research will be hard to defend or initiate under such paradigms.
Consider, too, Weinberg's chilling example of the cancellation of the Superconducting Super Collider: 'During the debate over the SSC, I was on the Larry King radio show with a congressman who opposed it. He said that he wasn't against spending on science, but that we had to set priorities. I explained that the SSC was going to help us learn the laws of nature, and I asked if that didn't deserve a high priority. I remember every word of his answer. It was "No."'[15] If the sponsors believe that science's value lies in technological innovation, they will fund only research that yields products and technology.
The second reason public perception is so important is that the applied-first mindset will eventually seep into science itself. Whereas today's scientists might still believe in the quest for understanding, tomorrow's scientists will grow up in a world that tells them that science is about delivering more tangible benefits to society.
Science for its own sake
In light of recent trends, Weinberg predicted in May 2012, 'In the next decade we may see the search for the laws of nature slow to a halt, not to be resumed again in our lifetimes.'[16] His prediction is sobering. Yet despite all the frightening possibilities, it is not too late. In 2010, 82% of surveyed Americans agreed that the federal government should support scientific research 'even if it brings no immediate benefits.'[17]
In an age of satellites, laptops, and smartphones, one can easily be swept away by the amazing technology that results from scientific research. We must not lose sight, however, of science's foundation: inquisitiveness about creation. Science teachers, researchers, and science writers must take advantage of that foundation and explain to the public the value in the quest for understanding, the thrill of discovery, and the incredible privilege of searching for the secrets of the universe.
So how did I respond to the spaceflight engineer who said the Large Hadron Collider was a waste of money? I replied that the goal of the experiment was not just to see a flash of a particle. It was to explain mass—one of the most fundamental yet mysterious aspects of our world. Whether or not the Higgs discovery yields technological advances, we are a step closer to understanding the vast and mysterious workings of the cosmos. I do not know if I convinced him, but I hope I gave him a taste of what the scientific endeavor really is.
Even if scientists do not develop products or help the economy, it is a glorious honor to discern the order of the universe. Let's not lose sight of that, and let's pass that inquisitiveness on to the next generation. The universe is a beautiful place, and there is still so much to learn.
References
- 1. M. Smith, Yahoo Voices, 26 September 2008.
- 2. S. Weinberg, New York Review of Books, 10 May 2012.
- 3. C. Mello, 'Silencing human disease with RNA interference,' Congressional Biomedical Research Caucus 2012 Briefing Series, Washington, DC, 20 June 2012.
- 4. C. Santa Maria, Huffington Post, 18 July 2012.
- 5. L. Tolbert, testimony before the House Committee on Science and Technology, Subcommittee on Research and Science Education, Washington, DC, 27 June 2012, p. 2.
- 6. Ref. 2, p. 3.
- 7. Ref. 2, p. 4.
- 8. Ref. 2, p. 2.
- 9. For a fascinating discussion of this historical foundation, see N. R. Pearcey, C. B. Thaxton, The Soul of Science, Crossway Books, Irvine, CA (1994).
- 10. G. Gamow, The Great Physicists from Galileo to Einstein, Harper & Brothers, New York (1961), p. 54.
- 11. J. Boslough, J. Mather, The Very First Light: The True Inside Story of the Scientific Journey Back to the Dawn of the Universe, Basic Books, New York (2008), p. 5.
- 12. J. Gertner, New York Times, 25 February 2012.
- 13. Department for Business, Innovation and Skills, The Allocation of Science and Research Funding, London, December 2010, p. 9.
- 14. J. Qiu, Nature 470, 15 (2011).
- 15. Ref. 2, p. 3.
- 16. Ref. 2, p. 4.
- 17. National Science Board, Science and Engineering Indicators 2012, National Science Foundation, Arlington, VA (2012), chap. 7, p. 7.
Allen Scheie is a physics and philosophy major at Grove City College in Grove City, Pennsylvania. As a participant in the AIP Mather Public Policy Intern Program, he recently worked on the US House of Representatives' Committee on Science, Space, and Technology.