Opinion
Will innovation flourish in the future?
by Jerome I. Friedman

artwork by Anthony Robinson

Science and technology grew exponentially during the 20th century. But will the conditions necessary for creating the kinds of innovations that shape our lives be sustained in the future?

By definition, the word innovate means to bring in something new, to make changes in something established. Clearly, there is a continuum of innovation that ranges from breakthroughs that change the underpinnings of our society to new methods or tools to solve particular problems. The major innovations of the future, those that will shape society, will require a foundation of strong basic research. Innovation is the key to the future, but basic research is the key to future innovation. And today, the future of basic research appears vulnerable.

Although applied research and invention play important roles in innovation, they do not generally produce the major conceptual breakthroughs necessary for creating radically new technologies. The limitation of focused or problem-oriented research becomes apparent in the following observation: If you know what you are looking for, you are limited by what you know. As inventive as Thomas Edison was, he could not have created the transistor—perhaps the most important invention of the 20th century. To elucidate this point, it is useful to trace the transistor’s development.

  • In the late 19th century, scientists studied the atomic spectra of various elements.

  • In 1885, Johann Balmer discovered his formula for the spectral lines of the hydrogen atom (see the formulas after this list); the Lyman, Pfund, Brackett, and Paschen spectral series followed.

  • In 1900, Max Planck proposed the concept of the quantum in the emission of energy; and in 1905, Albert Einstein developed the idea of the quantum of energy in the radiation field (the photon).

  • In 1911, Ernest Rutherford discovered the atomic nucleus in alpha-particle scattering experiments, establishing the “planetary model” of the atom.

  • Two years later, Niels Bohr developed a semiclassical model of the hydrogen atom based on a quantization of the electron orbit; it accounted for the observed discrete spectra of hydrogen and established a new model for the atom’s stability.

  • In 1925 and 1926, Werner Heisenberg and Erwin Schrödinger developed quantum mechanics.

  • In 1928, Felix Bloch applied the full machinery of quantum mechanics to the problem of conduction in solids, spearheading the development of the modern theory of solids.

  • In 1929, Walter Schottky and others found electron “holes” in the valence-band structure of semiconductors, uncovering the mechanism of semiconductor behavior.

  • In 1933, solid-state diodes were used as receiving rectifiers.

  • In the late 1930s and early 1940s, investigators began doping silicon and germanium to create new semiconductors.

  • In 1947, John Bardeen and Walter Brattain took out a patent for the transistor, and William Shockley applied for a patent for the transistor effect and a transistor amplifier.

  • In 1951, semiconductors entered the world market. Four years later, transistors had replaced nearly all vacuum tubes.

  • In 1959, Robert Noyce and Jack Kilby invented the integrated circuit.
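
As a brief aside for concreteness (these are the standard textbook forms, not formulas from the original timeline), the thread from Balmer’s spectroscopy to Bohr’s model can be written in two lines:

    \frac{1}{\lambda} = R_H \left( \frac{1}{2^2} - \frac{1}{n^2} \right), \quad n = 3, 4, 5, \ldots  \qquad \text{(Balmer series)}

    E_n = -\frac{13.6\ \mathrm{eV}}{n^2}  \qquad \text{(Bohr energy levels)}

Setting the energy of a photon emitted in a drop from level n to level 2 equal to hc/\lambda recovers Balmer’s formula, with R_H \approx 1.097 \times 10^{7}\ \mathrm{m}^{-1}; this is precisely the sense in which Bohr’s model accounted for the observed discrete spectra.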

This example demonstrates how basic research established the foundations of the technological revolution created by the invention of the transistor. Brattain said it clearly: “The transistor came about because fundamental knowledge had developed to a stage where human minds could understand phenomena that had been observed for a long time. In the case of a device with such important consequences to technology, it is noteworthy that a breakthrough came from work dedicated to the understanding of fundamental physical phenomena, rather than the cut-and-try method of producing a useful device.” Ironically, quantum mechanics—an abstruse conceptual framework in physics that was developed to explain the structure of the atom—came to underlie some of our most important technologies. It has contributed to the development of the Internet, computers, lasers, consumer electronics, atomic clocks, and superconductors, to mention a few.

Symbiotic research
In addition to basic research, applied research and product development played crucial roles in the transistor’s development. New technologies clearly cannot be created without a synthesis of all three. And often the boundaries between these types of research get blurred. Sometimes applied research leads to important basic knowledge, and technologies developed for basic research lead to broader applications. Accelerators, for example, were invented to study the interactions of subatomic particles; now various types are used for such diverse applications as cancer therapy, studying the structure of viruses, designing new drugs, and fabricating semiconductors and microchips.

Other examples include the Global Positioning System, nuclear medicine, and diagnostic tools such as magnetic resonance imaging. The World Wide Web provides an especially interesting example. Built on top of the Internet, it was developed at CERN (the European Organization for Nuclear Research) to enable high-energy physicists worldwide to exchange data and programs and to work together more effectively. The rapidly developing Web is changing the way we communicate, teach, and do business and is promoting economic growth in many parts of the world.

It is also accelerating advances in scientific knowledge and innovation, and it has dramatically changed the scientific landscape. The Web spreads scientific information much faster than printed scientific journals do, and this speeds the flow of work. For example, the human genome is available online to any molecular biologist with a computer connected to the Internet. The Web is also having an effect on economic growth through its impact on scientific research and innovation. This raises the question of the relationship between research and the gross domestic product.

Economists have studied the impact of research on various measures of wealth or well-being, which reflect the economic impact of the innovations derived from research. They have estimated that one-half to two-thirds of the economic growth of developed nations is knowledge-based. Recent studies have estimated that the average annual rate of return on R&D investment ranges from 28% to 50%, depending on the assumptions used. Although there is uncertainty in these numbers, there is general agreement that the impact is huge and that past investment in research has paid for itself many times over.
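
To make “paid for itself many times over” concrete, consider a back-of-the-envelope compounding sketch (illustrative only; the ten-year horizon and simple annual compounding are assumptions, not part of the studies cited above):

    # Compound $1 of R&D investment at the quoted annual rates of return.
    # The 10-year horizon and simple annual compounding are illustrative
    # assumptions, not taken from the studies cited in the text.
    for rate in (0.28, 0.50):
        value = (1 + rate) ** 10
        print(f"At {rate:.0%} per year, $1 grows to ${value:.1f} after 10 years")

At 28% per year, $1 grows to roughly $11.80 in a decade; at 50%, to roughly $57.70. That is what “many times over” looks like when compounded.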

In the United States and in other countries, university research has generated technology-based industries and a large number of jobs. A 1997 study found that Massachusetts Institute of Technology alumni, faculty, and staff had founded more than 4,000 companies during the preceding four decades; these companies employ more than 1.1 million people and have annual world sales of $232 billion. Most of these companies are knowledge-based. This emphasizes the necessity of keeping research universities strong to maintain a high level of innovation. They provide the scientific workforce of the future; they are the source of most of the research that drives major innovation; and young people with new ideas start many new companies after leaving the university.

Protecting innovation
Creativity is the basis of all innovation, and although it is doubtful that it can be taught, creativity should be nurtured in those who have it. Innovation ultimately depends on a scientifically and technologically creative workforce. Thus, in addition to strong research universities, there should be pre-university schools of excellence that bring together the best young minds to introduce them early to science and give them opportunities for creative work. Corporations and government research agencies should support special educational projects, such as science fairs for young students. Many outstanding young scientists participated in science fairs as high school students.

Here are some other suggestions for enhancing innovation:

  • Young people should be given good support and freedom in their research. They are the greatest source of scientific creativity because they are not as committed to existing scientific orthodoxy, and they have the energy and enthusiasm to push new ideas. As the zoologist Konrad Lorenz once said, “The best morning exercise for a researcher is to cast off one favorite hypothesis every day before breakfast.” The young do this better than anyone else.

  • We should willingly take risks in supporting new projects. The tendency is to play it safe when funding is low, but we need to remember that the greatest risks have the greatest payoffs. In addition, individuals or small groups should be given sufficient latitude to develop new ideas, which take time and are often only accepted with difficulty by others.

  • People who innovate should get recognition and appropriate compensation for what they do, especially young people.

  • We should not allow institutional boundaries to impede interdisciplinary research. Some of the most important innovations of the future can be expected from such collaborations. Excessive bureaucracy is distracting, time-consuming, and destructive to creativity.

The science and technology communities must also address another set of issues. As science and technology advance, we see growing public concern about their social and cultural consequences. There are fears about whether future developments in robotics, genetic engineering, and nanotechnology, for example, will enhance the welfare of humankind or prove to be a Faustian bargain. Such fears are causing a technological backlash, especially in developed nations. The science and technology communities must engage in these discussions, be completely open to listening to such concerns, and assess and address them. If we do not listen and respond, we will lose the public as partners.

Basic vulnerability
All of these recommendations for protecting and enhancing future innovation assume an appropriately funded research environment. But who will support basic research in the future? Industry, which previously supported a significant amount of it, no longer does so because global competition has put enormous economic pressure on corporations.

Private industry makes R&D investments that are expected to pay off in 5 to 7 years, but it won’t make the 20- to 30-year investments necessary to create entirely new industries. Such long-term investments in R&D have been cut as firms have merged and downsized. Companies that once did long-term R&D, such as AT&T and IBM, have seen their industries become highly competitive. To compete, they have largely withdrawn from supporting basic research.
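
The 5-to-7-year horizon follows from ordinary discounted-cash-flow reasoning: to a firm, a payoff decades away is worth almost nothing today. A minimal sketch in Python, assuming a 12% corporate discount rate (the rate is an illustrative assumption, not a figure from the article):

    # Present value of $1 received some years from now, discounted at an
    # assumed corporate hurdle rate of 12% per year (illustrative only).
    RATE = 0.12

    def present_value(payoff: float, years: int, rate: float = RATE) -> float:
        """Discount a future payoff back to today's dollars."""
        return payoff / (1 + rate) ** years

    for years in (6, 25):
        print(f"$1 received in {years} years is worth ${present_value(1.0, years):.2f} today")

On these assumptions, $1 of payoff six years out is worth about $0.51 today, while $1 twenty-five years out is worth about $0.06; long-horizon research is exactly what a cost-cutting firm drops first.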

Patents are a strong indicator of innovation. A 1997 study funded by the National Science Foundation found strong evidence that publicly financed scientific research plays a large role in the breakthroughs of industrial innovation in the United States. It reported that 73% of the main science papers cited by American industrial patents over a recent two-year period involved domestic and foreign research financed by government or nonprofit agencies. Such publicly financed science, the study concluded, has become a fundamental pillar of industrial advance. This shows the close connection between national science budgets and the economy, and it points to the importance of establishing good bridges between universities, government, and industrial laboratories.

Of all the types of research, basic research is the most vulnerable. It is a risky activity that seeks scientific knowledge for its own sake without thought of practical ends, and neither its outcome nor its applications can be predicted in advance. Even great scientists have fallen woefully short in making such predictions. Ernest Rutherford, discoverer of the atomic nucleus, said in 1933, “Anyone who expects a source of power from the transformation of the atom is talking moonshine.” Nine years later, Enrico Fermi produced the first self-sustaining nuclear chain reaction.

In addition, there are often long delays before applications arise from basic research, as occurred with the transistor. Because of these factors, the public and many political leaders do not fully understand the importance of basic research. With the exception of biomedical research, basic research generally does not rank high among a nation’s priorities. The public and political leaders may acknowledge that it is important to understand how nature works in all domains and at all levels; but given the competing needs of society, that argument alone is not persuasive enough to convince them to make the needed investment in basic research. They want to hear about applications, economic growth, and competitiveness.

We can make such arguments; but if they want examples, we can only talk about the past because we cannot make specific promises about the future. We can tell them, however, that throughout history, advances in scientific knowledge have resulted in revolutions in technology that have improved the standard of living and changed our way of life. Although direct benefits from basic research generally require several decades, they do come. Electricity and magnetism were laboratory curiosities in the early 1800s and did not become a factor in people's lives until more than half a century later. And there are many other examples.

Future innovations
It is clear to me that under the right conditions, future technologies will be created that we cannot even imagine. Think of someone in the year 1900 trying to imagine what would exist in the year 2000. The developments so familiar to us today would be inconceivable to this individual then. Even developments of current technology are difficult to foresee. Who, in 1987, would have been able to predict the World Wide Web, which started in 1990?

Nonetheless, we can safely say that there certainly will be profound innovations in many current technologies, including biotechnology, energy production, computation, artificial intelligence, robotics, miniaturization, communication, sensors, and materials. Not all human problems can be fixed by technology, because some are political in nature; but major technological innovations could significantly alleviate many of them.

The challenges faced by science and technology today are crucial for the future of humankind. They include:

  • Improving the general health of the world population.

  • Understanding ecological and environmental issues and providing guidance to policy makers.

  • Providing sufficient food for the world's rapidly growing population.

  • Developing alternative sources of energy and substitutes for increasingly scarce natural resources.

  • Providing new technologies to enhance the quality of life of our citizens while extending those benefits to regions and groups that have not yet shared in them.

To achieve these goals, we must provide sufficient support for continued progress in basic science, applied science, and engineering. We have to expand our base of knowledge and provide our young people with an education that will enable them to utilize and further expand this knowledge and produce the innovations we need for the future.

Jerome I. Friedman is a professor of physics at the Massachusetts Institute of Technology in Cambridge and shared the Nobel Prize in Physics in 1990. This article has been adapted from his keynote address at a conference titled "Infrastructure for e-Business, e-Education, e-Science, and e-Medicine" that was held at the Scuola Superiore G. Reiss Romoli in L'Aquila, Italy, July 29-August 4, 2002.
