Despite the role it has played in human evolution, science has often been considered a dangerous and profane activity or, at best, an obscure practice whose validity resides mainly in the possible usefulness of its objectives. The need to justify science through its applications dates back to the dawn of the scientific enterprise, as shown by the following dialogue from Plato's The Republic, circa 360 BCE.
GLAUCON: To know something about the seasons, the months, and the years is of use for military purposes, as well as for agriculture and for navigation.
SOCRATES: It amuses me to see how afraid you are, lest the common herd of people should accuse you of recommending useless studies.
Things have not changed. A crucial part of any research project is the ability to demonstrate the practical usefulness of the proposed investigation.
Science does not escape cost–benefit analysis. Since its main product, knowledge, is not easy to evaluate, resources are allocated mainly on the basis of derived results in the form of technology and commercial applications. When those cannot be foreseen within a reasonable time, funding may be at risk. Basic science is particularly penalized by that attitude: Its results arrive only in the long run and can be predicted only in general terms.
Moreover, chance often plays an important role on the path to fundamental results. A significant example comes from Enrico Fermi's Nobel Prize–winning work on slow neutrons. One morning in the autumn of 1934, Fermi was examining the effect of placing a piece of lead in front of the incident neutrons. He later said, “I tried every excuse to postpone putting the piece of lead in its place…. With no advance warning, no conscious prior reasoning, I immediately took some odd piece of paraffin and placed it where the piece of lead was to have been.” 1 Neutrons, slowed down by collisions with the light hydrogen nuclei in the paraffin, remained in the vicinity of target nuclei long enough for their absorption probability to increase, and thus became extremely effective at inducing nuclear transmutations.
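A brief textbook aside makes the mechanism quantitative (this sketch is mine, not part of Fermi's quoted account): an elastic collision removes the most energy when the target nucleus is light, and slowed neutrons are captured more readily because many absorption cross sections grow as the inverse of the neutron speed.

```latex
% Minimum energy retained by a neutron of energy E after a head-on
% elastic collision with a nucleus of mass number A:
E'_{\min} = \left(\frac{A-1}{A+1}\right)^{2} E
% Hydrogen (A = 1):  E'_min is essentially zero -- one collision can stop the neutron almost completely.
% Lead (A about 207): E'_min is about 0.98 E   -- lead barely slows neutrons at all.
% At low neutron speed v, the absorption cross section of many nuclei follows the 1/v law:
\sigma_{\mathrm{abs}} \propto \frac{1}{v}
```

Paraffin, rich in hydrogen, therefore slows neutrons in a handful of collisions, whereas the lead block leaves their energy essentially untouched.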
Undoubtedly, investors prefer short-term, fairly certain results to long-term, uncertain returns. That preference might foster the contradictory belief that technology can be separated from science and that short-term results can come without long-term dreams and efforts. Failing to recognize that today's fundamental research is the basis of tomorrow's technological applications might produce a sort of local-minimum effect. The term comes from optimization, in which candidate solutions correspond to valleys in a landscape: Without enough energy to overcome the barriers separating adjacent valleys, a long-range search for the deepest valley (the true solution) is not possible, and the process gets stuck in the nearest local minimum. Short-term research can refine existing knowledge, but it will hardly generate radically new ideas and products. Penalizing fundamental science will affect not only academic careers and scientists' egos; in the long run it might significantly slow the economic system itself. Quantum computing, for example, which is expected to boost computing technology and related commercial applications, rests on decades of basic research in quantum mechanics. 2
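The analogy can be made concrete with a toy optimization run. The following is a minimal sketch, not anything from the article; the landscape, parameters, and function names are hypothetical and chosen only for illustration. A Metropolis-style random search explores a double-well "landscape": with zero temperature the search is purely greedy and stays trapped in the shallow valley where it starts, while a search given enough "thermal energy" can climb the barrier and reach the deeper valley.

```python
import math
import random

def landscape(x):
    # Quartic double well: shallow local minimum at x = -1 (value 7),
    # deeper global minimum near x = 2.3 (value about -2.9).
    return x**4 - 2*x**3 - 4*x**2 + 2*x + 10

def search(temperature, steps=20000, start=-1.0, step_size=0.1, seed=0):
    """Metropolis-style random search. Uphill moves are accepted with
    probability exp(-rise / temperature); temperature = 0 is purely greedy."""
    rng = random.Random(seed)
    x, fx = start, landscape(start)
    best_x, best_f = x, fx
    for _ in range(steps):
        x_new = x + rng.uniform(-step_size, step_size)
        f_new = landscape(x_new)
        rise = f_new - fx
        if rise <= 0 or (temperature > 0 and rng.random() < math.exp(-rise / temperature)):
            x, fx = x_new, f_new
            if fx < best_f:
                best_x, best_f = x, fx
    return best_x, best_f

# Without enough "energy" the search never leaves the valley it started in;
# with it, the search can cross the barrier and find the deeper valley.
print("greedy (T = 0):  ", search(temperature=0.0))
print("annealed (T = 2):", search(temperature=2.0))
```

Running the sketch, the greedy search ends near the shallow valley's floor, while the annealed one typically crosses the barrier and reports the deeper minimum.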
Those responsible for science funding should consider that the mechanisms underlying scientific research often follow different paths and timescales from those of economics. The process that leads from the opening question of any scientific investigation, “How does it happen?”, to the more pragmatic “How does it cash out?” cannot be shortened at will, or it may end up collapsing in on itself.