The history of science is filled with questions about the nature of matter, its constituent elements, how properties emerge from the elements’ arrangements, and how the arrangements can guide or be guided by energy flows. The answers to those questions have progressed from philosophical proposals of atomic theory to practical demonstrations of atoms’ existence to modern quantum theory. And, importantly, the answers have been based on experimental measurements.
Recently, electron-microscopy techniques have given resounding answers to such questions as the following: Can we see atoms? What do they do? How do their interactions give rise to properties, forms, and functions? The fields of condensed-matter physics and materials science are now transitioning toward more directly practical goals—namely, understanding why atoms do what they do and controlling their behavior.
The origins of modern atomistic theory can be traced to ancient Greece, where the concept of indestructible and indivisible atoms was developed. Solids were assumed to be formed by atoms with multiple hooks and openings to ensure strong bonding (see figure 1), and liquids by slippery atoms that could easily move with respect to each other. Although the theory was simplistic from the point of view of modern science, Democritus correctly ascribed the properties of matter to the interactions between individual components. He rather adroitly noted that our macroscopic world is built up of fundamental building blocks: “By convention sweet and by convention bitter, by convention hot, by convention cold, by convention colour; but in reality atoms and void.”1
In Democritus’s time, however, atomic theory remained but one of many competing worldviews, and it was developed on a philosophical rather than experimental basis. Some Persian scientists hinted at the atomic model in their works from the 12th to 14th centuries, the golden age that also gave the world much of the basis for algebra, medicine, chemistry, astronomy, and geography. (See the box for an example.) Still, the theory lacked an experimental foundation.
[Box: verse by Jalāl ad-Dīn ar-Rūmī, translation by Mahshid Ahmadi.]
From hypothesis to visualization
The birth of modern atomic theory dates to around 1800 with the work of John Dalton. It was based on experimental observations, including the constant ratios of elements in compounds and the physical properties of gases. Skepticism remained strong for several decades after Dalton published his findings; in 1871, for example, Edmund Mills scathingly concluded that “the atomic theory has no experimental basis, is untrue to nature generally, and consists in the main of a materialistic fallacy.”2
As Freeman Dyson famously said, though, “Science originated from the fusion of two old traditions, the tradition of philosophical thinking that began in ancient Greece and the tradition of skilled crafts that began even earlier and flourished in medieval Europe. Philosophy supplied the concepts for science, and skilled crafts provided the tools.”3 Indeed, the tools of science ultimately settled the debate. Albert Einstein’s interpretation of the experimental observation of microscopic Brownian motion was a critical step in verifying the atomistic conjecture.
Early 20th-century physics brought forth both conclusive evidence of matter’s atomistic structure and insight into the atom’s internal structure. A high point of that legendary time was the demonstration of x-ray scattering from periodic crystalline structures for which father-and-son collaborators William and Lawrence Bragg received the 1915 Nobel Prize in Physics. The structures’ ideal periodicity allowed for the representation of solids in reciprocal space and molded the mindsets of subsequent generations of physicists.
With the existence of atoms established, at least indirectly, a question arose: Can atoms be seen one at a time? That question was answered around the middle of the 20th century, when the first images of atomic species were obtained in a field ion microscope that detected the ion-emission patterns from an atomically sharp tip.4 The same operating principle—applying a strong electric field to a sharp tip until it emits electrons or ions by field emission, field ionization, or tunneling—was behind the development of atom-probe tomography, scanning tunneling microscopy (STM), and electron microscopy (see figure 2).
Imagineers of atomic assembly
Progress in both experimental and theoretical atomic physics has stimulated exploration of the possibility of direct atomic visualization and fabrication. In his famous 1959 lecture “There’s Plenty of Room at the Bottom,” Richard Feynman pointed out both the need to make the electron microscope much more powerful and the vast potential for processing and storing information if single atoms could be controlled.5 More recently, many have begun to appreciate the enormous potential that atomic-scale control can bring to information processing. Quantum information science seeks to leverage the quantized nature of matter and energy and the related phenomena of entanglement and superposition to solve previously intractable computational problems. Individual atoms can host quantum bits that, if properly arranged and encoded, could receive, process, and transmit quantum information in a coordinated and massively parallelized fashion.
One of Feynman’s last quotations, “What I cannot create, I do not understand,” clearly sets forth what may be the next big challenge for understanding the atomic world: deliberately creating structures, atom by atom, that exhibit predefined functionalities. In the 1980s Eric Drexler put forward a similar concept of atomic-scale machines based on sufficiently complex molecular structures.6 Perhaps building on the work of John von Neumann, the idea has firmly entered the world of popular science fiction, including Drexler’s apocalyptic gray goo, Alastair Reynolds’s nano-assemblers, and the television show The Expanse’s mysterious protomolecule. Despite appearing physically feasible, however, practical realization of such devices remains uncertain. Following Dyson’s framework, the philosophy backing molecular machines and atom-by-atom assembly is in place, yet scientists still lack the necessary craft and tools.
Enter the scanning probe
The emergence of scanning probe microscopy in the 1980s, beginning with the introduction of STM by Gerd Binnig and Heinrich Rohrer in 1981, provided a major boost for the field of nanoscale imaging and atomic-scale assembly. The technique brought new visual insights to controversies in surface science, and it heralded the advent of tools capable of imaging atomic structures in real space using desktop-scale instrumentation.
The fundamental operating principle of STM is based on the quantum mechanical phenomenon of electron tunneling. An extremely sharp tip is brought near a surface, and an applied voltage causes electrons to tunnel through the gap, thereby producing a measurable current that reflects the surface’s shape and electronic properties. It effectively, if indirectly, puts quantum physics at one’s fingertips. The development a few years later of atomic force microscopy (AFM), which uses a tip mounted on a bendable cantilever, and of related methods for probing magnetic, electrical, transport, and electromechanical phenomena has opened the nanoworld for exploration.7
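To get a feel for why tunneling makes STM so exquisitely sensitive to topography, consider the simplest one-dimensional barrier picture, in which the current falls off exponentially with tip–sample separation. The short sketch below (a textbook estimate, not a model of any particular instrument) evaluates that dependence for an assumed effective barrier of about 4.5 eV.

```python
import math

# One-dimensional tunneling estimate: I ~ exp(-2*kappa*d). This is a
# standard textbook approximation, used here only for orders of magnitude.
HBAR = 1.054571817e-34   # J*s
M_E = 9.1093837015e-31   # kg
EV = 1.602176634e-19     # J

def decay_constant(barrier_eV=4.5):
    """kappa = sqrt(2*m*phi)/hbar for an assumed effective barrier height phi."""
    return math.sqrt(2 * M_E * barrier_eV * EV) / HBAR  # in 1/m

kappa = decay_constant()
for gap_pm in (400, 500, 600):  # tip-sample gaps in picometers
    d = gap_pm * 1e-12
    rel = math.exp(-2 * kappa * d)  # current relative to the zero-gap prefactor
    print(f"gap {gap_pm} pm: relative current ~ {rel:.1e}")

# Widening the gap by 100 pm (1 angstrom) cuts the current by roughly an
# order of magnitude, which is why STM resolves single-atom corrugation.
```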
In 1989 Don Eigler demonstrated direct atomic manipulation using an STM probe by forming the letters I, B, and M in xenon atoms on a nickel surface. His work had a profound impact on both the research community and the general population because it showed for the first time the ability to not only visualize but also control matter on the single-atom level—a direct response to Feynman’s challenge.
For more than 20 years following Eigler’s experiments, the field remained narrow because of the practical barriers to constructing and operating low-temperature STM machines and the lack of immediate practical applications. But quantum computing and quantum information systems are now at the forefront of scientific inquiry, and STM-based atom-by-atom manipulation is one of the few approaches that can create atomically precise structures. The Kane quantum computing architecture, for example, relies on single atoms precisely positioned inside isotopically purified silicon. Exciting progress has been made by several groups toward the fabrication and production of such devices, in particular one developed by Michelle Simmons and coworkers at the University of New South Wales in Sydney, Australia, that uses single phosphorus atoms.
As impressive as the results described above are, atomic manipulation still takes place on a surface inside an ultrahigh-vacuum chamber. In the real world, atmospheric molecules and surface contamination would quickly overwhelm single-atom devices. The obvious answer is to encapsulate the resulting structures, but that process presents its own difficulties—it would necessitate complex surface chemistries and integration strategies. Hence, the question remains: Is it possible to visualize all the atoms in a material, probe their dynamics and functionality, and arrange them in desired patterns?
Scanning the beam
The key limitation of STM is the use of very low-energy electrons that are geometrically confined by the tip to length scales well below their characteristic wavelength. The alternative approach is to reduce the electrons’ wavelength and use them to visualize matter, akin to optical imaging. Transmission electron microscopy (TEM) was invented by Max Knoll and Ernst Ruska in the 1930s; Ruska shared the 1986 Nobel Prize in Physics for the work. In the technique, a relatively large area of a sample is illuminated by a beam of electrons with near-parallel trajectories. A series of electromagnetic magnifying lenses enlarges the transmitted waves to form an image at a phosphor detector screen.
Scanning transmission electron microscopy (STEM) is closely related to TEM, and a single microscope can typically operate in both modes. The invention of STEM and scanning electron microscopy (SEM) can largely be attributed to Manfred von Ardenne’s work in the 1930s; the modern form of STEM was optimized by Albert Crewe in the 1970s.
STEM can be thought of as an upside-down and highly focused version of TEM. The magnifying optics are primarily located before the sample, and they project an atomic-sized beam of electrons—the probe—onto a sample. An image is formed by recording the scattered intensity of the beam as it scans across a sample. The principal benefit of STEM over TEM for imaging is that the electrons scattered at high angles give an image that depends mainly on the atomic number Z. Thus a so-called Z-contrast image can be approximately interpreted as directly mapping nuclear positions in the samples. Several technological advances enabled the modern STEM instrument; see reference 8 for a review. Chief among them is aberration correction.
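A rough sense of why high-angle scattering yields Z contrast: in the Rutherford limit the high-angle cross section scales nearly as Z squared, and measured high-angle annular dark-field intensities typically follow a somewhat weaker power law. The snippet below is purely an illustrative estimate under an assumed exponent of 1.7; real values depend on detector geometry and sample thickness.

```python
# Illustrative only: assumed power-law exponent for high-angle annular
# dark-field (HAADF) single-atom contrast; real exponents depend on the
# detector's inner and outer collection angles.
EXPONENT = 1.7

def relative_haadf(z, z_ref=14):
    """Estimated single-atom HAADF intensity relative to silicon (Z = 14)."""
    return (z / z_ref) ** EXPONENT

for name, z in [("carbon", 6), ("silicon", 14), ("platinum", 78)]:
    print(f"{name:9s} (Z={z:2d}): ~{relative_haadf(z):5.1f}x silicon")

# Under this assumption a platinum atom scatters roughly 20 times more
# strongly than silicon, which is why heavy dopant atoms stand out so
# clearly in Z-contrast images.
```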
The question of what imaging resolution is ultimately achievable is one that is still debated today. Following the wisdom of optical microscopy, it seems natural that the illumination wavelength should be smaller than the size of the object to be resolved. Thus, the short de Broglie wavelength of high-energy electron beams—typically a few picometers—and the ability to accurately focus those beams using electric or magnetic fields position the electron microscope as a promising instrument to directly image single atoms. (Interestingly, Ruska and Knoll appear to have been unaware of the electron’s wavelike nature at the time of their invention.)
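To put numbers on those few picometers, the relativistic de Broglie wavelength at common accelerating voltages can be worked out directly. The sketch below uses standard constants and is included only to illustrate the scale.

```python
import math

# Relativistic de Broglie wavelength of an electron accelerated through a
# potential V: lambda = h*c / sqrt(E_k * (E_k + 2*m_e*c^2)).
HC_EV_NM = 1239.841984   # h*c in eV*nm
ME_C2_EV = 510998.95     # electron rest energy in eV

def wavelength_pm(kinetic_energy_eV):
    pc = math.sqrt(kinetic_energy_eV * (kinetic_energy_eV + 2 * ME_C2_EV))
    return 1e3 * HC_EV_NM / pc   # convert nm to pm

for kv in (60, 100, 200, 300):
    print(f"{kv:3d} kV beam: wavelength ~ {wavelength_pm(kv * 1e3):.2f} pm")

# Roughly 4.9 pm at 60 kV down to 2.0 pm at 300 kV -- far smaller than
# typical interatomic spacings of 100-300 pm.
```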
In practice, however, a modern electron microscope’s lenses will always suffer from aberrations, and those imperfections are the principal factor limiting the device’s resolution. In the 1930s and 1940s, Otto Scherzer demonstrated that aberrations are unavoidable. But he also indicated several methods by which they could be mitigated. The most promising method used a series of electromagnetic fields with different symmetries to shape and modify the beam. Consequently, contemporary aberration correctors are complicated systems that add extra elements to the microscope column. The addition of such a device is why the column in the opening image is so tall.
Building an aberration corrector proved to be so complicated that for many years it was feared to be impossible. The lenses must each be precisely aligned and dynamically adjusted to compensate for varying conditions while also remaining extremely stable. During the imaging of single atoms, even a small instability from a stray field, a noisy power supply, or air-pressure variation could be disastrous. The sheer number of variables makes it difficult for a human to keep track of all the elements, so quantitative computer control and alignment are essential.
Aberration correctors were successfully developed in the 1990s and early 2000s. The devices have revolutionized the field of electron microscopy, and imaging of single atoms is now almost routine. In recognition of that advance, the Kavli Prize was jointly awarded in 2020 to two endeavors—one led by Ondrej Krivanek for STEM9 and another by Knut Urban, Harald Rose, and Maximilian Haider for TEM.10
In addition to providing structural information about nuclear positions, beam electrons transmitted through a sample also interact with the sample’s electrons. After the beam and sample electrons exchange energy, a magnetic prism in an electron spectrometer can disperse the outgoing beam onto a position-sensitive detector to give an electron-energy-loss spectrum (EELS), which provides information about the composition, bonding, and electronic structure of the material.
The energy resolution of EELS is primarily limited by the energy spread of the electron beam. The spread can be reduced by removing electrons with too much or too little energy before they reach the sample. The removal process, known as electron monochromation, has been used since the early days of electron microscopy and has achieved impressive results. But it reduces the number of electrons in the beam, and because of significant experimental challenges, early implementations usually degraded the signal’s spatial resolution.
A new generation of electron monochromators, in particular those pioneered by Krivanek and coworkers,11,12 has mitigated those issues. When paired with aberration correctors, the devices enable microanalysis at unprecedented energy and spatial resolutions. Given that EELS reflects a material’s vibrational and electronic properties, monochromation improvements are beginning to open a new vista of biological, chemical, and physical applications. Advanced measurements of atomic-scale structure and function are possible and continuously improving.
With those capabilities an aberration-corrected STEM device is essentially “a synchrotron in a microscope,” as STEM pioneer Mick Brown eloquently described it in his 1997 paper of that name. In the years following its publication, the ability to perform atomic-resolution spectroscopy and obtain spectra from even single atoms was experimentally demonstrated.13
From imaging to knowledge
Advances in STEM resolution, functionality, and sensitivity over the past decade or so have transformed the technique from a mere imaging system to a quantitative tool. It can characterize atomic structures with picometer-level precision, watch structural evolution under external stimuli, and provide information on local functionalities using EELS. Developments in detector technology now allow recording of a diffraction pattern at every probe position.14 It has thus become possible to record scattering information at atomic resolution to generate multidimensional data sets featuring both real- and reciprocal-space information.
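As a concrete picture of what such a multidimensional data set looks like, the sketch below treats a four-dimensional STEM acquisition as a NumPy array, one diffraction pattern per probe position, and forms a virtual annular dark-field image by summing the counts inside an annular mask. The array size and angular cutoffs are placeholders, not parameters from any real experiment.

```python
import numpy as np

# Hypothetical 4D-STEM data set: one 128x128-pixel diffraction pattern
# recorded at each of 64x64 probe positions (random counts stand in for data).
scan_y, scan_x, det_y, det_x = 64, 64, 128, 128
rng = np.random.default_rng(0)
data = rng.poisson(1.0, size=(scan_y, scan_x, det_y, det_x)).astype(float)

# Build an annular mask on the detector; the inner and outer radii in pixels
# are stand-ins for the collection angles of a physical annular detector.
yy, xx = np.indices((det_y, det_x))
r = np.hypot(yy - det_y / 2, xx - det_x / 2)
annulus = (r >= 30) & (r < 60)

# Virtual annular dark-field image: integrate each pattern over the mask.
virtual_adf = data[..., annulus].sum(axis=-1)
print(virtual_adf.shape)  # (64, 64): one intensity value per probe position
```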
The new data streams present challenges for recording and interpretation. Unlike bulk-scattering methods, in which information is averaged over mesoscopic volumes, STEM obtains distinct data from multiple spatially separate locations. It therefore requires mathematical tools capable of interpreting and compressing the information and relating it to macroscopic properties and functionalities. Although still relatively uncommon in condensed-matter physics, such approaches are regularly used in other fields, such as astronomy. If fully adopted, they can provide a wealth of information on a solid’s chemical and physical functionalities, ranging from defect equilibria and solid-state reactions to the nature of ferroic, charge-ordering, and magnetic distortions. A variety of long-standing questions, including ones on the nature of ferroelectric relaxor and morphotropic materials, nanoscale phase separation, and dynamic phenomena, might now be open for exploration.
Advances in quantitative spectroscopy have opened the door to exploration and discovery of quantum phenomena through spectroscopic signatures in electron-energy-loss spectra, multidimensional scattering data sets, and structural images. Correlating and condensing the large, varied data streams into compressible, interpretable information necessitates linking materials functionalities to reduced descriptors. It also requires the inversion of experimental data, along with their associated uncertainties, to recover the physical functionality of interest.
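One widely used way to condense such data streams into reduced descriptors, offered here purely as an illustration rather than as the method the field has settled on, is principal-component analysis of a spectrum image: the spectrum at each probe position is treated as a vector, and a handful of components capture most of the variance.

```python
import numpy as np
from sklearn.decomposition import PCA

# Hypothetical EELS spectrum image: ny x nx probe positions, each with an
# n_energy-channel spectrum (random numbers stand in for real data).
ny, nx, n_energy = 50, 50, 512
rng = np.random.default_rng(1)
spectra = rng.random((ny, nx, n_energy))

# Unfold to (n_positions, n_channels) and extract a few principal components.
unfolded = spectra.reshape(ny * nx, n_energy)
pca = PCA(n_components=4)
scores = pca.fit_transform(unfolded)         # per-position component weights
component_maps = scores.reshape(ny, nx, 4)   # spatial map of each component
print(component_maps.shape, pca.explained_variance_ratio_)
```

The same unfold-and-decompose pattern applies to multidimensional scattering data; only the shape of the per-position measurement changes.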
Once such data-analysis methods become available, researchers will be able to explore the atomic-level origins of materials functionality. Of course, for many phenomena, such analysis is nontrivial. In phonon and plasmon measurements, for example, the relevant quasiparticles extend over regions considerably larger than the beam, so the interactions behind the resulting image are, at the beam’s scale, nonlocal. Similarly, in multidimensional STEM, the measurement process will be strongly affected by the beam shape and aberrations.
All of those problems are surmountable. Still, the logic of Feynman’s quote points to a further step—namely, moving from understanding preexisting atomic configurations to intentionally building them atom by atom.
From lab to fab
Prior to the advent of aberration correction, the preponderant way to achieve better resolution in TEM and STEM was to increase the accelerating voltage used in the microscope, thereby giving a shorter electron wavelength. The problem with that approach is that the amount of kinetic energy that can be directly transferred to a nucleus in a single collision increases, which increases the damage done to the sample. By providing an alternative, the aberration corrector has made STEM a technique of choice for materials science, condensed-matter physics, and high-resolution spectroscopy. And, importantly, it has set the community on the pathway to routine visualization of single atoms under a variety of conditions.
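The voltage–damage tradeoff can be made quantitative with the standard expression for the maximum energy a beam electron can transfer to a nucleus in a single elastic collision. The sketch below evaluates it for a carbon nucleus at two common accelerating voltages; it is a back-of-envelope illustration, not a damage model.

```python
ME_C2 = 0.510998950e6     # electron rest energy, eV
AMU_C2 = 931.49410242e6   # atomic mass unit rest energy, eV

def max_transfer_eV(beam_energy_eV, mass_amu):
    """Maximum energy transferred to a nucleus of mass M in one head-on
    elastic collision: E_max = 2*E*(E + 2*m_e*c^2) / (M*c^2)."""
    return 2 * beam_energy_eV * (beam_energy_eV + 2 * ME_C2) / (mass_amu * AMU_C2)

for kv in (60, 200):
    e_max = max_transfer_eV(kv * 1e3, mass_amu=12.011)  # carbon
    print(f"{kv} kV beam -> up to ~{e_max:.0f} eV to a carbon nucleus")

# At 200 kV the maximum transfer (~44 eV) exceeds the roughly 20 eV needed
# to knock a carbon atom out of graphene; at 60 kV (~12 eV) it does not,
# which is why low-voltage operation is gentler on light-element materials.
```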
Over the past few years, advances in low-voltage aberration-corrected microscopy have led to many studies of beam-sensitive materials. They also opened for exploration the regime in which beam-induced material changes are minor and localized, often even on the atomic or single-chemical-bond level. In many cases, the changes occur sufficiently slowly that both the initial and the final state of the system can be visualized. Rather usefully, the rate of induced changes can be controlled by adjusting beam parameters, such as voltage and current. Those capabilities have led researchers to actively pursue direct atomic fabrication: The electron beam, in conjunction with image- or spectrum-based feedback, is proposed as a means to manipulate atoms and create atomic-scale structures;15 see figure 3 for an example.
STEM- and STM-based atomic manipulation strategies each offer benefits. The electron beam in STEM can induce changes inside a material, whereas STM interacts with only the topmost layer of atoms and therefore necessitates clean, atomically flat surfaces. STEM also provides a more direct picture of atomic structure than STM does because it is sensitive to atomic nuclei, whereas STM provides maps of electron density. STM-based atomic fabrication requires surface-science strategies to passivate, depassivate, and protect surfaces. STEM can offer greater levels of environmental control around samples—gases and even liquids can be introduced to induce and control a range of material transformations. In practice, most STEM samples must be relatively thin films, typically less than 100 nm. That is the perfect experimental space, however, to investigate and exploit two-dimensional materials, such as graphene or ultrathin suspended layers of three-dimensional materials.
Unlike STM, STEM provides high-resolution imaging and spectroscopy over a wide range of temperatures and thus permits the use of temperature as a knob to allow or forbid certain transformations. Recent advances in STEM should allow one to operate anywhere from a few kelvin, where quantum phenomena can be investigated, to over 1000 K, where defects and dopants can readily diffuse or be more easily moved by the beam.
To date, four distinct classes of manipulation have been demonstrated: control of single vacancies, atoms, and multiatom complexes in 2D materials; control of single heavy atoms inside 3D materials; phase changes, which are characterized by the ordering of vacancies or localized amorphous–crystalline transitions; and controlled addition or removal of material at local sites. Interesting opportunities may emerge in the context of stacked and twisted 2D materials (see the article by Pulickel Ajayan, Philip Kim, and Kaustav Banerjee, Physics Today, September 2016, page 38), where local beam-induced changes can give rise to moiré materials and open new vistas for the physics of proximity effects.
Remarkably, electron-beam modifications can be performed at length scales ranging from nanometers to angstroms, which span the range covered by conventional lithographic and fabrication methods and single-atom manipulation. Some modifications are analogous to those possible in larger-scale electron-beam fabrication or conventional lithography; the beam-directed repositioning of atoms is perhaps most comparable to using STM to move atoms16,17 and assemble multi-atomic structures.18
A lab in a beam
In the near future, researchers may be able to modify materials atom by atom, explore and define their quantum properties, and realize a so-called quantum lab in a beam. That capability will represent a convergence of nanoscience techniques brought together primarily by STEM advances. The new capabilities will enable visualization of important electronic, magnetic, and optical properties with near-atomic resolution, and they will increase control of reactions, local environments, and chemistries. Combining those emerging capabilities with advances in machine learning that provide real-time feedback and analytics will allow for the extraction of physical functionalities from the collection of atomic variables—a revolution for nano- and atomic-scale science.
As lab-in-a-beam capabilities become more widespread, routine, and understood, they may even grow to include fabrication. “Fab in a beam” capacity could become a critical component in the development and process flow of quantum information science devices and applications (see figure 4).
Progress will require extensive integration across several disciplines. Although the realization of quantum devices and the exploration of quantum phenomena in atomically engineered systems are immediate targets for electron-beam manipulation, creation of such devices will require integration between STEM and semiconductor workflows.
As surface-chemistry control becomes more important, sample-preparation requirements will become more demanding. Many of the relevant technologies and limitations are well known and understood in related fields, but they have yet to be transferred to the electron-microscopy world. That transfer can be integrated with surface-science methods to deliver and control dopants. Ultimately, true atomic-scale fabrication may require combining and leveraging the different strengths of all three approaches: STM, STEM, and traditional nanofabrication.
Data and information-support infrastructures will also be necessary. Just as computer control was essential for aberration correction, it will also likely be essential for lab-in-a-beam capabilities, including such developments as real-time beam control with automatic drift correction, low-dose imaging based on compressed sensing and nonlinear scans, and real-time image analysis and feedback based on deep learning. Data-transfer rates, the availability of central and graphics processing units, and real-time feedback then become key considerations for further instrumental design.
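As one small example of the kind of real-time analysis involved, drift between successive frames can be estimated from the peak of their cross-correlation. The sketch below does so with plain NumPy FFTs; it is an illustrative integer-pixel approach, not the control code of any particular microscope.

```python
import numpy as np

def estimate_drift(reference, frame):
    """Estimate the integer-pixel (dy, dx) shift of `frame` relative to
    `reference` from the peak of their FFT-based cross-correlation."""
    xcorr = np.fft.ifft2(np.conj(np.fft.fft2(reference)) * np.fft.fft2(frame))
    peak = np.array(np.unravel_index(np.argmax(np.abs(xcorr)), xcorr.shape),
                    dtype=float)
    dims = np.array(xcorr.shape)
    peak[peak > dims / 2] -= dims[peak > dims / 2]  # wrap to signed shifts
    return peak

# Quick self-test with a synthetic frame shifted by a known amount.
rng = np.random.default_rng(2)
reference = rng.random((256, 256))
drifted = np.roll(reference, shift=(5, -3), axis=(0, 1))
print(estimate_drift(reference, drifted))  # ~[ 5. -3.]
```

In practice the estimated shift would be fed back to the scan coils or stage between frames; subpixel refinements and noise-robust variants build on the same idea.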
It is also interesting to speculate about whether electron-beam fabrication can be scaled up for practical applications. Such systems appear to have much lower intrinsic latencies than scanning probe manipulations. But even at tens or hundreds of manipulations per second per beam, they do not scale easily to industrial production.
At the same time, one doesn’t need to make very many elements in a quantum system to have a real impact. Several high-quality, easily accessible elements might be enough for many applications. In some cases, only about 50 error-free qubits would be expected to compete with the fastest classical computers. Similar to how enzyme-catalyzed chain reactions enable the duplication of biological signals, a combination of the atomic fabrication of seed elements and chemistry-based duplication may open the way to mass production.
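A back-of-envelope calculation makes both the limitation and the opportunity concrete; the rates and structure sizes below are assumptions chosen only to illustrate the scales involved.

```python
# Assumed, purely illustrative numbers.
manipulations_per_second = 100   # optimistic single-beam placement rate
seconds_per_day = 86_400

def build_time_days(n_atoms):
    return n_atoms / manipulations_per_second / seconds_per_day

# A small quantum register with a few hundred placed dopant atoms is quick...
print(f"500 atoms: {build_time_days(500) * seconds_per_day:.0f} s")
# ...but anything approaching the ~1e12 features of a modern chip is not.
print(f"1e12 atoms: {build_time_days(1e12):.0f} days "
      f"(~{build_time_days(1e12) / 365:.0f} years)")
```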
Equally important is the development of fundamental theory for beam–solid interactions. Although the theory for electron scattering that underpins STEM image and EELS formation is well developed, beam-induced changes in solids remain relatively underexplored. An electron with precisely known energy can be delivered to a selected part of an atomic lattice with atomic-scale horizontal precision—although presently without equivalent vertical resolution—yet the types of changes it will induce are still unclear. The multistage process includes energy transfer between the electron and nuclei and, potentially, dynamic evolution of localized bonding, delocalized conductive subsystems, and core electronic excitations. The underlying mechanisms are difficult to model because they can span multiple orders of magnitude in energy and time. A lab in a beam would not only produce atomic-scale devices but also provide the ideal test bed to explore those mechanisms and learn how to make new quantum systems.
The opportunity to create quantum structures atom by atom, visualize them, and explore their functionality with the lab in a beam makes the field an exciting one to pursue. The more precisely we can build, the deeper our understanding can become.
This work was supported by the US Department of Energy, Office of Basic Energy Sciences, Materials Sciences and Engineering Division. It was performed at the Center for Nanophase Materials Sciences, which is a US Department of Energy, Office of Science user facility at Oak Ridge National Laboratory. Discussions with Nader Engheta are gratefully acknowledged.
References
Sergei Kalinin is a professor in the department of materials science and engineering at the University of Tennessee in Knoxville. Starting in 2023, he will be the Weston Fulton Professor. Stephen Jesse and Andrew Lupini are researchers at the Center for Nanophase Materials Sciences at Oak Ridge National Laboratory in Tennessee.