Electrons in metals and semiconductors are often naively described as little balls bouncing around, much like atoms or molecules in a dilute gas. That description, sketched in figure 1, originally came from Lev Landau, who reduced the complex many-body problem to a Fermi gas of nearly free electrons. But his simplification is counterintuitive, because Landau theory also implies that electron gases in normal metals should be exceedingly viscous because of pervasive electron–electron (e–e) collisions in solids. Indeed, the theory predicts that the viscosity diverges with decreasing temperature T, and simple estimates show that as T drops to that of liquid helium, electron gases in metals should be more viscous than honey.

Figure 1.

Diffusive transport versus viscous-electron flow. (a) In the single-particle, diffusive model, electrons (red circles) move as independent particles, undergoing collisions with impurities, phonons (yellow stars), and boundaries. (b) In the hydrodynamic regime, in which the electron–electron mean free path is the shortest length scale in a material, frequent interactions among electrons can give rise to collective, viscous behavior, dubbed Poiseuille flow. The velocity in the flow direction, vx, is a parabolic function of the transverse coordinate y. (Images from Marco Polini.)

One might therefore expect researchers to rely on hydrodynamics, with its difficult-to-solve Navier–Stokes equation, to describe the resistivity of metals. And yet that approach is not routine. The nearly free electron model works well because interactions between electrons in metals operate so differently from those between molecules in a gas. Whereas molecules scatter only when they come into direct contact, electrons travel a mean free path lee between effective electron–electron collisions that is much longer—typically microns at liquid-helium T—and it grows longer still at lower T.

The large mean free path leaves plenty of time and space for impurities and thermal vibrations (phonons) to destroy any nascent collective response of electrons that would otherwise produce viscous flow. To understand why the destruction happens, imagine a highly viscous classical gas moving through a large tube, its flow experiencing dissipation, convective nonlinearities, and other hydrodynamic behavior. Now fill the tube with sand, so that the intergranular gaps are smaller than the molecules’ mean free path. The flow through the porous sand would then no longer be viscous. Rather, it would be diffusive, each particle moving independently of the others.

Something similar happens in normal metals: Impurities and phonons act like those grains of sand, which are packed densely enough to eliminate any sign of the electrons’ collective behavior. In theory, it should be possible to recover the intrinsic hydrodynamic behavior of electrons, unmasked by impurities or phonons, if a metal is ultraclean and cooled to a low enough T to avoid phonon scattering. (Think of that recovery as equivalent to the removal of sand.) But in practice, little experimental progress has been made in reaching that hydrodynamic regime, despite efforts over many decades. Fortunately, the availability of new high-quality electronic materials—graphene, in particular—has recently improved the situation.

In 1963 Soviet theorist Radii Gurzhi asked how a viscous electron flow could reveal itself in an experiment.1 He imagined a metallic system in which lee was the shortest length scale—much shorter than both the sample size W and the mean free path l for collisions that do not conserve the electrons’ momentum, such as those with phonons and crystal defects. Under that assumption, frequent collisions between electrons can establish the collective flow illustrated in figure 1b, because the electrons’ total momentum and energy are not lost to the outside world.

Gurzhi found that the resistance R of such an imaginary metal would have to decrease with increasing T. That’s a shocking result, because the standard definition of a metal is that its R increases with T. Nonetheless, the theoretical prediction was unambiguous and could be traced to the fact that the electron viscosity ν in metals decreases with T. Intuitively, it also makes sense: As a system warms, it becomes less viscous, which allows easier passage of a fluid. The anomaly is usually referred to as the Gurzhi effect, and it implies that if a metal enters the hydrodynamic regime—where lee ≪ W and lee ≪ l—it should exhibit a T dependence that is the opposite of metallic and more like that of semiconductors, whose R decreases with T.

Unfortunately, finding a system that satisfies those conditions turned out to be nearly impossible. One usually thinks of large, clean crystals at cryogenic T, which would mitigate the effect of phonons and thus increase l. Indeed, clean, three-dimensional metals at low T exhibit values of l that are nearly a centimeter. However, lee also rapidly increases with decreasing T because of what’s known as Pauli blocking. (See box 1 for details on the fundamental properties of electron systems.)

Box 1.
A primer on Fermi liquids

Statistical mechanics tells us that the ground state of a system of noninteracting electrons is a Fermi sphere—that is, all the states with wavenumber |k| smaller than a maximum, dubbed the Fermi wave number kF, are occupied, and states with |k|>kF are empty. The occupation number nk of a state with momentum k is therefore a step function, changing discontinuously from 1 to 0 as |k| crosses kF. The energy of the state at kF is the Fermi energy EF, and the related temperature scale TF=EF/kB is the Fermi temperature. At finite T, such a step is smeared around |k|=kF into a smooth Fermi–Dirac distribution function.

In a series of brilliant papers in 1957, Lev Landau showed that when electron–electron interactions are taken into account, they do not modify that single-particle picture much. In a Fermi liquid at T = 0, nk still displays a finite jump in amplitude when |k| crosses kF. Due to electron–electron interactions, bare electrons become “dressed” electrons, known as quasiparticles. In a Fermi liquid, scattering between quasiparticles is heavily constrained by the Pauli exclusion principle; transitions can occur only between initially occupied states and empty final states. At finite T, only partially occupied states in a window of width kBT around EF can participate in the scattering. That “Pauli blocking” is at the heart of the existence of Fermi liquids and is responsible for the 1/T² divergence of the electron mean free path in the limit T ≪ TF.

Fermi statistics greatly limits the available phase space for e–e collisions when T is well below the Fermi temperature TF. As a result, lee diverges as (TF/T)² with decreasing T. That low-T regime is precisely where Landau quasiparticles are long lived and the single-particle model of electrical conductivity is justified.

The only way to reach the hydrodynamic regime is to work at elevated T, such that the Fermi sphere becomes “softer” and Pauli blocking less obstructive to e–e scattering. At those higher T, phonons become the main hindrance and limit l to the electron–phonon scattering length, lep. The resulting condition, lee ≪ lep, required to observe viscous behavior is extremely difficult to satisfy because lep often decreases faster with increasing T than does lee. (For 3D metals, lep usually varies as T⁻³, whereas lee varies as T⁻².) That scaling narrows the materials systems one could use and the T interval in which electron hydrodynamics could possibly be observed.
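To get a feel for how narrow—indeed, how often nonexistent—that window is in a conventional 3D metal, one can simply evaluate the competing power laws. The short Python sketch below does so with purely illustrative prefactors (they set plausible orders of magnitude, not values for any real material) and checks where lee is simultaneously shorter than lep and shorter than a 100-µm sample.

import numpy as np

# Toy estimate of the hydrodynamic window in a clean 3D metal.
# The prefactors A, B and the sample size W are illustrative assumptions.
T = np.linspace(1, 300, 600)   # temperature, K
A = 2.5                        # m K^2:  l_ee = A / T^2  (Pauli blocking)
B = 10.0                       # m K^3:  l_ep = B / T^3  (phonon-limited path, ~1 cm at 10 K)
W = 1e-4                       # m: sample size, 100 um

l_ee = A / T**2                # electron-electron mean free path
l_ep = B / T**3                # electron-phonon mean free path

# Hydrodynamics needs l_ee to be the shortest length scale.
ok = (l_ee < l_ep) & (l_ee < W)
print("hydrodynamic window:",
      f"{T[ok].min():.0f}-{T[ok].max():.0f} K" if ok.any() else "none in 1-300 K")

With these numbers, lee drops below the sample size only above roughly 150 K, where phonons have long since taken over, so the script finds no window at all—precisely the impasse described above.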

An elegant attempt to break the impasse2 was undertaken in the 1990s. Researchers applied a high electrical current that raised the electron temperature of a two-dimensional electron system (2DES) in a semiconductor heterostructure and thereby shortened lee. Even so, the crystal lattice remained close to liquid-helium T, which kept electron–phonon scattering low. Measuring the differential resistance revealed a small but distinct bump as a function of applied current, a feature the researchers interpreted as plausible evidence for the Gurzhi effect.

Gurzhi and coworkers immediately disagreed with that interpretation,3 pointing out that peculiarities of e–e scattering in 2D materials demand an even more stringent condition than in 3D metals—namely, lee ≪ W(T/TF)—which had not been achieved in the experiment. Their rejection left the field in limbo: For a half century after Gurzhi’s theory was postulated, no electronic system had been found to exhibit unambiguous signs of hydrodynamic behavior.

Despite having a Nobel Prize behind it, graphene did not initially look like a promising candidate for studies of electron hydrodynamics. It was filled with impurities, with a mean free path barely exceeding 100 nm (see the article by Andre Geim and Allan MacDonald, Physics Today, August 2007, page 35, and Physics Today, December 2010, page 14). But that changed around 2011, when researchers found that encapsulating graphene in hexagonal boron nitride dramatically improved its electronic quality. The encapsulation shielded graphene from outside impurities and flattened the crystal by suppressing scattering at microscopic corrugations.

Today, graphene is one of the highest-quality electronic materials ever produced: Its low-T mean free path is currently limited only by the device size W, at least up to 10 μm, and exceeds a micron even at room T. More importantly, graphene is extremely stiff, a feature that suppresses phonon scattering and increases lep. And unlike what happens in 3D metals, electron–phonon scattering in 2D graphene increases slowly with T; lep ∝ T⁻¹, with a small proportionality coefficient that reflects the stiffness. As noted earlier, e–e scattering rises much faster, with lee ∝ T⁻².

Therefore, above a certain T, lee is expected to become the shortest scattering length in graphene. Moreover, graphene’s TF is typically greater than 1000 K. That’s neither too low—as in semiconductor 2DESs, where the Fermi surface is largely destroyed at room T—nor so high that the more stringent 2D condition lee ≪ W(T/TF) becomes impossible to meet. In short, it is hard to imagine a better material than graphene for studying viscous electron flows.
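The same kind of back-of-the-envelope comparison, now with graphene’s scalings, shows why the material is so favorable. In the Python sketch below, the lee prefactor, the phonon-limited path, the device width, and TF are representative assumptions for encapsulated graphene at a density near 10¹² cm⁻², not values taken from a specific device.

import numpy as np

# Rough, illustrative numbers for encapsulated graphene (assumptions, not fits).
T_F   = 1300.0                   # K, Fermi temperature at n ~ 10^12 cm^-2
W     = 1e-6                     # m, device width
alpha = 4e-9                     # m, prefactor giving l_ee ~ 0.3 um near 150 K

T    = np.linspace(50, 400, 701)         # temperature, K
l_ee = alpha * (T_F / T)**2              # e-e mean free path, falls as T^-2
l_ep = 3e-4 / T                          # phonon-limited path, falls only as T^-1 (~1 um at 300 K)

# Viscous flow needs l_ee below both l_ep and the 2D bound W*(T/T_F).
ok = (l_ee < l_ep) & (l_ee < W * T / T_F)
print("conditions met for T in",
      f"{T[ok].min():.0f}-{T[ok].max():.0f} K" if ok.any() else "no range")

With these inputs the conditions are met from roughly 200 K upward; the exact numbers depend on the assumed prefactors, but the point is that a broad window opens at experimentally convenient temperatures.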

Despite that expectation, when the first contemporary experiments began probing the phenomenon in 2015, graphene’s resistance showed no sign of the Gurzhi effect at any T. In hindsight, one can understand why the viscous effects did not show up straightforwardly. The kinematic viscosity ν enters the Navier–Stokes equation as a coefficient in front of the second spatial derivative of the velocity v(x,y) (see box 2). In standard resistance measurements on a long strip of uniform width, only vx(y)—the y dependence of the flow velocity in the x direction—is nonzero. Unless significant momentum losses occur at the strip boundaries, that y dependence tends to be weak. The result is a fairly uniform flow profile. And without a significant velocity gradient, the viscosity term contributes little to the solution of the Navier–Stokes equation and, hence, to the resistance R.

Box 2.
The Navier–Stokes equation in condensed matter

The motion of water in the oceans, turbulent air currents, and the Marangoni flows that produce “tears of wine” inside a glass are a few examples of phenomena governed by the Navier–Stokes equation. The equation is essentially Newton’s second law for each fluid element—a small volume of liquid or gas subjected to external forces. Yet no complete mathematical theory of its solutions exists: Proving the existence and smoothness of solutions in three dimensions remains one of the seven Millennium Prize Problems.

To describe a steady-state flow of electrons, the simplest, linearized form of the Navier–Stokes equation is normally used:4–6

Dν²∇²J(r) − J(r) = (σ0/e)∇ϕ(r),

in which J(r) = nv(r) is the current density, n is the electron density, ϕ(r) is the electric potential, σ0 is the diffusive conductivity, and e is the electron charge. The length over which the flow’s momentum diffuses is given by Dν = √(ντ), where τ is a time scale that describes momentum dissipation from the scattering of electrons with impurities and phonons. In the limit where Dν goes to 0, the linearized Navier–Stokes equation yields Ohm’s law locally: eJ(r) = σ0E(r), where E(r) = −∇ϕ(r) is the electric field. To find an electron flow pattern, the Navier–Stokes equation needs to be solved together with the continuity equation, ∇·J(r) = 0, and the boundary conditions.
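To see how the equation interpolates between ohmic and viscous behavior, it helps to solve it in the simplest geometry: a long channel of width W driven by a uniform field along x. With no-slip edges (an idealized boundary condition; real edges are partly slippery), the solution is Jx(y) = J0[1 − cosh(y/Dν)/cosh(W/2Dν)], where J0 is the ohmic value. The Python sketch below evaluates that profile and the resulting channel resistance relative to Ohm’s law.

import numpy as np

def channel_profile(y, W, D_nu, J0=1.0):
    """J_x(y) solving D_nu^2 J'' - J = -J0 with no-slip edges J(+/-W/2) = 0.
    (No-slip boundaries are an assumption; real edges are partly slippery.)"""
    return J0 * (1.0 - np.cosh(y / D_nu) / np.cosh(W / (2.0 * D_nu)))

W = 1.0                                    # channel width (arbitrary units)
y = np.linspace(-W / 2, W / 2, 2001)       # uniform grid across the channel

for D_nu in (0.05 * W, 0.5 * W, 5.0 * W):  # small D_nu -> ohmic, large D_nu -> Poiseuille
    J = channel_profile(y, W, D_nu)
    R_rel = 1.0 / J.mean()                 # resistance relative to the ohmic value (J0 = 1)
    print(f"D_nu/W = {D_nu/W:4.2f}:  R/R_ohm = {R_rel:7.2f}")

For Dν ≪ W the profile is flat and R is essentially ohmic; for Dν ≫ W it becomes the parabolic Poiseuille profile of figure 1b and R grows roughly as 12Dν²/W², that is, proportionally to ν. Since ν falls with rising T, that limit is exactly the Gurzhi effect discussed earlier.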

That insight offered a tip for how to proceed: To maximize hydrodynamic effects in an experiment, it is essential to create a current flow that is as inhomogeneous as possible.4

One geometry that provides large velocity gradients is a narrow current injector, shown schematically in figure 2. According to the Navier–Stokes equation, the electric potential changes its sign at a characteristic distance on the order of Dν = √(lee l)/2 from the injector.4–6 One can measure that local potential in the so-called vicinity geometry—that is, by placing a voltage probe sufficiently close to the injector. The corresponding resistance RV—the local voltage divided by the injected current—has the normal, positive sign for noninteracting electrons in both the diffusive and ballistic transport regimes. Negative RV, by contrast, is a smoking gun for viscous flow.4
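Plugging in experiment-scale numbers shows how close to the injector such a probe has to be. The values of lee and l in the Python snippet below are assumed, representative magnitudes rather than parameters of a particular device.

import math

l_ee = 0.3e-6      # m, electron-electron mean free path (assumed)
l    = 3.0e-6      # m, momentum-relaxing mean free path from phonons and impurities (assumed)

D_nu = math.sqrt(l_ee * l) / 2.0   # momentum-diffusion length from the expression above
print(f"D_nu ~ {D_nu * 1e6:.2f} um: the vicinity probe must sit well within a micron of the injector")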

Figure 2.

Negative resistance and current whirlpools. (a) In a so-called vicinity-resistance measurement, current I is injected into a two-dimensional device of width W through a narrow lead, and a potential drop ΔV is measured between a voltage probe placed a short distance L from the injector and a faraway lead. (b) In this micrograph of a real device, graphene (white) is tipped with electrical contacts (magenta), and current and voltage probes can sample any of several positions during an experiment. (c) This color map shows the calculated distribution of electrical potential in the absence of viscosity. The voltage and resistance are positive (red), and arrows reveal the steady-state current pattern. (d) In the case of viscous flow, lobes of negative voltage (blue), and thus negative resistance, emerge near the current injector I. The finite viscosity induces whirlpools in the current flow. (Adapted from ref. 4.)

However, one must be careful: As T increases, an initial sign change indicates only that ballistic transport is strongly affected by e–e interactions; the hydrodynamic regime develops later, at higher T, when collisions among electrons become more frequent.7 The observation of negative RV in graphene and its comparison with the behavior expected from Navier–Stokes theory allowed the first measurements of an electron fluid’s viscosity. At liquid-nitrogen T, ν turns out to be about 100 times that of honey. Reassuringly, the result agrees with many-body theory.4
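That number can be rationalized with a one-line kinetic-theory estimate, ν ≈ vF lee/4, the 2D form commonly used in this literature; the value of lee below is an assumed, representative one for graphene near liquid-nitrogen temperature.

v_F  = 1e6         # m/s, graphene Fermi velocity
l_ee = 0.4e-6      # m, assumed e-e mean free path near liquid-nitrogen temperature

nu_electron = v_F * l_ee / 4.0     # kinematic viscosity of the electron fluid, m^2/s
nu_honey    = 1e-3                 # m^2/s, order-of-magnitude kinematic viscosity of honey
print(f"nu ~ {nu_electron:.2f} m^2/s, roughly {nu_electron / nu_honey:.0f}x that of honey")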

Navier–Stokes theory also predicts another spectacular viscosity-induced effect on the conductivity of metals.4–6 The negative region of electric potential near the injector is predicted to develop into a whirlpool of electrical current. Whirlpools are familiar in the laminar flow of ordinary fluids, and in the vicinity geometry4,6 of figure 2a they are theoretically expected to appear near a narrow injector; only their size, which is on the order of Dν, depends on the actual value of ν.

In other geometries that create a nonuniform flow,5 current whirlpools generally disappear if Dν becomes smaller than the characteristic device size W, even though the negative-potential anomaly persists.

In 1908 Martin Knudsen observed that the speed of gas flowing through a small aperture suddenly increased when he increased the gas’s density. The experiment implies that a higher viscosity boosts the gas flow, which is a counterintuitive result. The effect is well understood today as the transition from Knudsen flow to Poiseuille flow, or in the language of metal physics, from ballistic-electron transport to viscous-electron transport. The phenomenon observed by Knudsen can be viewed as the analogue of the Gurzhi effect for gases rather than electrons.

An experiment similar to Knudsen’s was recently performed on graphene.8 As shown in figures 3 and 4, a narrow aperture of width w connects two wider regions, a geometry known as a point contact (PC). PCs in the ballistic regime at low T were first made and studied by Yuri Sharvin in the 1960s. He found that even in the ideal case—without any disorder or scattering—a PC exhibits a finite electrical conductance, whose value is set by the number of electron-wave modes that fit inside the aperture.
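Sharvin’s mode-counting argument is easy to reproduce numerically. The Python sketch below does it for a graphene point contact, using the standard fourfold spin–valley degeneracy; the density and aperture width are chosen to match the scales quoted in figure 4, so the result should be read as an order-of-magnitude ballistic limit rather than a prediction for a specific device.

import math

e, h = 1.602e-19, 6.626e-34    # electron charge (C) and Planck's constant (J s)
n = 1e16                       # m^-2, carrier density (10^12 cm^-2)
w = 0.5e-6                     # m, aperture width

k_F = math.sqrt(math.pi * n)          # graphene: n = k_F^2 / pi for fourfold degeneracy
N   = int(k_F * w / math.pi)          # number of transverse modes that fit in the aperture
G_sharvin = 4 * e**2 / h * N          # fourfold (spin x valley) degeneracy
print(f"{N} modes -> ballistic (Sharvin) resistance ~ {1.0 / G_sharvin:.0f} ohm")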

Figure 3.

Electron flow through a constriction. A narrow aperture of width w ≪ W separates two wide leads. (a) In ballistic transport—Knudsen flow in the language of gas dynamics—electrons move independently. With no scattering between them, the resistance to their flow (blue) through the constriction had been expected to be a minimum. (b) In a viscous electron fluid, however, Poiseuille flow corresponds to yet lower resistance. An individual electron (red), initially directed toward the boundary, isn’t expected to contribute to the conductance. But collisions with other electrons effectively drag it toward the constriction, and the collective motion decreases the resistance. The quantity Dν is the length scale over which momentum diffuses as a result of electron–electron collisions. (Images from Marco Polini.)

Figure 4.

The Gurzhi effect. (a) A graphene device has a series of point contacts of different widths w; the contacts link several boxes (turquoise), each connected to separate electrodes (yellow). (b) The resistance of one of those point contacts (w = 0.5 μm) is plotted as a function of temperature T for three electron densities n. The horizontal lines indicate the ideal, ballistic limit. But as T increases, the resistance drops below the expected minimum and follows a nonmonotonic dependence on T—the Gurzhi effect. (c) Black dots represent viscosity8 measured as a function of T for n = 10¹² cm⁻². The experimental dots closely agree with many-body theory calculations (red line). For comparison, note the y-axis scale: The viscosity of honey is about 10⁻³ m²/s. (Adapted from ref. 8.)

Until recently, researchers had tacitly accepted that Sharvin’s conductance was the highest possible value; the absence of disorder seemed to imply the best-case scenario for unimpeded electron transport. But that turned out to be wrong. Figure 4b shows that when T is increased and a system enters the hydrodynamic regime, the resistance measured in a graphene PC drops below the ideal ballistic limit. For the experiment in the figure, the drop was caused by the transition from ballistic to viscous electron transport. It was also accompanied by a semiconductor-like T dependence—the first unambiguous manifestation of the Gurzhi effect.

How is it possible for viscosity to lower the electrical resistance? After all, basic physics tells us that greater electron scattering should increase the resistance—a trend known as Matthiessen’s rule. As the system makes the transition from the low-T regime, where Sharvin’s description applies, to the higher-T hydrodynamic regime, electron viscosity sets up a funnel-like current pattern through the aperture, akin to what happened in Knudsen’s experiment.

Imagine an electron moving toward the PC, as in figure 3. In the ballistic regime, it hits the wall and stops contributing to the conductance. But in the hydrodynamic regime, the same electron is dragged by electron collisions toward the opening and forced to funnel through it. That funneling is what raises the conductance above Sharvin’s ballistic limit. Mathematically, the superballistic flow happens because conductivities are added—the so-called anti-Matthiessen’s rule9 described in box 3. By comparing experimental results and theory, the two of us and our colleagues were able to accurately measure graphene’s viscosity as a function of electron concentration and T.

Box 3.
Anti-Matthiessen’s rule

Formulated in 1864, Matthiessen’s rule states that if several independent scattering processes exist in a system, the total resistance R is the sum of the resistances due to each process. Deviations from the rule occur in metals but are generally tiny. The occurrence of an anti-Matthiessen’s rule, in which conductances G rather than resistances are added, is exceptionally rare. One possible scenario was proposed for the case of strange metals.14–16

A viscous electron flow through a point contact (PC) is another exception. Two relevant time scales exist in that situation. The first is the single-particle flight time across the constriction, τ1 = (2/π)(w/vF), where w is the size of the constricting aperture and vF is the Fermi velocity. The second is the time scale over which momentum diffuses over the same distance, τ2 = (π/32)(w²/ν), where ν is the viscosity. The total PC resistance9 is given by

R_PC = m/[ne²(τ1 + τ2)] = (G1 + G2)⁻¹,

where m is the effective electron mass, G1 = ne²τ1/m is the Sharvin conductance, G2 = ne²τ2/m is the contribution to the conductance from electron–electron interactions, and n is the electron concentration. Three years ago, experiments confirmed the validity of that anti-Matthiessen equation.8
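Evaluating that formula with representative numbers makes the superballistic drop explicit. In the Python sketch below, the device parameters mirror the scales of figure 4 and the effective mass is taken as ħkF/vF, appropriate for graphene; the viscosity values in the loop are assumed inputs, so the printed resistances are illustrative rather than fits to the data.

import math

e, hbar = 1.602e-19, 1.055e-34
v_F = 1e6                       # m/s, graphene Fermi velocity
n   = 1e16                      # m^-2, carrier density (10^12 cm^-2)
w   = 0.5e-6                    # m, aperture width
k_F = math.sqrt(math.pi * n)
m   = hbar * k_F / v_F          # effective mass of graphene carriers at this density

def R_pc(nu):
    """Point-contact resistance m / [n e^2 (tau_1 + tau_2)] from box 3."""
    tau1 = (2.0 / math.pi) * w / v_F        # ballistic (Sharvin) time scale
    tau2 = (math.pi / 32.0) * w**2 / nu     # viscous, momentum-diffusion time scale
    return m / (n * e**2 * (tau1 + tau2))

R_ballistic = m / (n * e**2 * (2.0 / math.pi) * w / v_F)   # Sharvin limit alone
for nu in (1.0, 0.1, 0.02):     # assumed viscosities in m^2/s, from very cold to warm
    print(f"nu = {nu:5.2f} m^2/s:  R_PC = {R_pc(nu):4.0f} ohm  (ballistic limit {R_ballistic:.0f} ohm)")

As ν shrinks with rising temperature, τ2 grows, the viscous contribution G2 to the conductance increases, and R_PC sinks ever further below the ballistic value of roughly 230 Ω obtained with these inputs—the behavior seen in figure 4b.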

Another knob that can be turned to explore viscous flow is the magnetic field B. In traditional metallic systems, B causes the Hall effect—a potential drop perpendicular to both the current flow and the magnetic field. How is the Hall effect influenced by electron viscosity? The magnetic field breaks time-reversal symmetry and produces a new kinematic coefficient, νH, in the Navier–Stokes equation. The coefficient, known as the Hall viscosity, is odd under reversal of B and is dissipationless. It gives rise to an extra term in the Navier–Stokes equation that is proportional to νH, acts against the Lorentz force, and suppresses the resulting potential drop.
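For readers who want to see where νH sits in the equations, one common way to write the steady-state, linearized momentum balance for a 2D electron fluid of carriers with charge e in a perpendicular field B = Bẑ is sketched below. Sign conventions for e, B, and νH differ across the literature, so treat this as a schematic form rather than the exact equation used in reference 10:

ν∇²v + νH∇²(ẑ × v) − v/τ + (e/m)(E + v × B) = 0.

The νH term has the same spatial structure as the ordinary viscous term but acts on the velocity rotated by 90 degrees, so it can partially offset the Lorentz force without dissipating energy—hence the suppressed Hall voltage.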

The suppression of the Hall effect is local and extends only over distances on the order of Dν, typically about 0.5–0.6 µm. By placing voltage probes close to a narrow current injector, we measured a local Hall effect.10 For graphene in the hydrodynamic regime, it was found to be notably smaller than the standard Hall effect measured simultaneously at some distance from the current contact.

Now that we know how to force hydrodynamics to show up in experiments, we expect to soon observe viscous phenomena in many systems, including 2DESs in semiconductors, graphite, bismuth, topological insulators, and Weyl metals. Evidence already exists for viscous flow in delafossites,11 and local (vicinity and PC) geometries should help make those observations. Materials in which electrons and holes coexist and interact strongly present another interesting challenge.12,13 

Let’s also not forget about materials that defy the Fermi-liquid paradigm. They are called strange metals14,15 and have Planckian transport scattering times on the order of ħ/(kBT) down to the lowest T. Those metals are also expected to exhibit viscous electron motion, albeit with a tiny viscosity conjectured to be close to a universal lower bound predicted by string-theory methods. Experimental evidence of the lower bound has been reported in ultrahot nuclear matter, such as quark–gluon plasmas, and in ultracold atomic Fermi gases, but not in condensed-matter physics.

Yet another enticing project would be to extend existing hydrodynamic studies into the regime where nonlinear terms in the Navier–Stokes equation can no longer be ignored. In classical fluids, those terms are responsible for nonlinear phenomena such as turbulence. Similar physics is expected to occur in electron fluids, but studying such fluids would require materials with smaller ν and longer τ compared with the 2DESs studied so far.

For all those new ventures, one should use not only electrical probes but also the visualization tools that are now available. Scanning probe microscopes that can sense voltages or magnetic fields are one example. They can image local distributions of electrical current at submicron scales and reveal electron hydrodynamics at an entirely new, more spectacular level. Watch out for beautiful images of electron whirlpools and viscous flows coming soon.

The European Union’s Horizon 2020 research and innovation program (Graphene Flagship) supported this work. We are grateful to everyone who contributed to the research—particularly Denis Bandurin, Alexey Berdyugin, Roshan Kumar, Leonid Levitov, Francesco Pellegrino, Leonid Ponomarenko, Alessandro Principi, Andrea Tomadin, and Iacopo Torre. Author Marco Polini dedicates this article to Rachele.

1. R. N. Gurzhi, J. Exp. Theor. Phys. 17, 521 (1963).
2. M. J. M. de Jong, L. W. Molenkamp, Phys. Rev. B 51, 13389 (1995).
3. R. N. Gurzhi, A. N. Kalinenko, A. I. Kopeliovich, Phys. Rev. Lett. 74, 3872 (1995).
4. D. A. Bandurin et al., Science 351, 1055 (2016).
5. L. Levitov, G. Falkovich, Nat. Phys. 12, 672 (2016).
6. F. M. D. Pellegrino et al., Phys. Rev. B 94, 155414 (2016).
7. D. A. Bandurin et al., Nat. Commun. 9, 4533 (2018).
8. R. K. Kumar et al., Nat. Phys. 13, 1182 (2017).
9. H. Guo et al., Proc. Natl. Acad. Sci. USA 114, 3068 (2017).
10. A. I. Berdyugin et al., Science 364, 162 (2019).
11. P. J. W. Moll et al., Science 351, 1061 (2016).
12. J. Crossno et al., Science 351, 1058 (2016).
13. P. Gallagher et al., Science 364, 158 (2019).
14. For a popular introduction to strange metals, see J. Zaanen, SciPost Phys. 6, 061 (2019).
15. S. A. Hartnoll, A. Lucas, S. Sachdev, Holographic Quantum Matter, MIT Press (2018).
16. P. A. Casey, P. W. Anderson, Phys. Rev. Lett. 106, 097002 (2011).
17. A. K. Geim, A. H. MacDonald, Physics Today 60(8), 35 (2007).
18. M. Wilson, Physics Today 63(12), 14 (2010).

Marco Polini is a professor at the University of Pisa in Italy, a professor at the University of Manchester in the UK, and an external collaborator of the Italian Institute of Technology in Genoa. Andre Geim is the Regius Professor at the University of Manchester.