All-optical spatial frequency filtering has a long history, although many of its applications have now been replaced by digital alternatives. Optical approaches are attractive in that they minimize energy requirements and allow images to be manipulated in real time, but they are relatively bulky compared to the compact electronic devices that are now ubiquitous. With emerging interest in nanophotonic approaches to all-optical information processing, optical techniques for enhancing images and visualizing phase are again attracting significant attention. Metasurfaces have been demonstrated as tailored alternatives to conventional spatial filters, and the spatial frequency sensitivity of metasurfaces and thin-film devices also has the potential to form the basis for ultracompact approaches to image processing. Significant challenges, however, remain before this promise can be realized. This review summarizes the current status of research in this rapidly growing field, places it in the context of the history of all-optical spatial filtering, and assesses prospects for future directions.
I. INTRODUCTION
The ability to manipulate spatial information and images is ubiquitous in modern technology with applications as diverse as cellular microscopy, image enhancement, and object identification.1,2 The traditional purpose of image processing was to improve the esthetic appeal or level of information available to the human eye. With advances in technology, however, it is playing an increasing role in automation, including machine vision. The input into any analog or digital image processing system is an optical field, and the output is a machine-readable intensity that has been captured by a camera and possibly also digitally modified. In the case where the optical field is modified in real-time using optical components, we refer to this as all-optical image processing, whereas computational processing of the intensity of the field captured using a digital camera is referred to as digital or electronic image processing.
Building on the Abbe theory of image formation, researchers in the 1950s and 1960s made considerable progress in demonstrating, both theoretically and experimentally, coherent and incoherent all-optical image processing. Inspired by electronic signal processing, research focused on edge enhancement, object and defect identification, motion analysis, and the visualization of phase.3–7
With the exception of imaging techniques that require access to polarization or phase, today most image processing applications are performed digitally. Efficient algorithms have been developed, including the Cooley–Tukey and other implementations of the fast Fourier transform, as well as the Hough, Hotelling, and other transforms,8 which have been implemented in large systems and portable devices. Furthermore, specialized chips that enable electronic edge detection in amplitude images with no additional computation have been demonstrated.9,10 The phase of a field or phase gradients cannot, however, be directly sensed by a conventional photodetector, and alternative optical or computational strategies are required to extract this information from a field.11–14
With access to advances in digital image processing, reliance on all-optical methods has declined. The enormous increase in the amount of data acquired for applications such as remote sensing, LIDAR, autonomous and semi-autonomous vehicles, and face recognition, however, presents challenges for maintaining the speed of operations, energy use, data storage, and transmission, particularly from remote locations. Furthermore, despite significant advances in intensity-based phase imaging methods,13,14 these generally require significant computational post-processing. It is, therefore, timely to revisit all-optical image processing, which can be performed in real time with no requirement for additional energy. Furthermore, advances in flat optics—metasurfaces—provide a platform for undertaking these operations on a chip, complementing the dramatic reductions in the size of electronic systems over the past few decades.
Spatial filtering systems require a certain volume of space to accommodate lenses, the filter itself, and the propagation distances to transform from the image/object plane to the Fourier plane. One exciting opportunity afforded by meta-optics and thin films is the prospect of performing direct manipulation of the angular spectrum in the image (or object) plane itself. Meta-optic designs have also been put forward to compress the propagation distances required in conventional Fourier plane imaging.15
Advances in nanofabrication methods and simulation tools over the past decade have led to many proposals for, and demonstrations of, meta-optical and other nanophotonic systems. One of us16 suggested a resonant metasurface as an image-plane spatial high- and low-pass filter. Silva et al.17 explored the potential use of three-dimensional metamaterials for performing mathematical operations on images and also suggested the use of thin metasurfaces as highly tailorable spatial filters permitting manipulation of both the amplitude and phase of each Fourier component.
Here we discuss these and subsequent advances, placing them in the context of earlier research in all-optical spatial filtering and assessing the potential for nanostructured, thin-film meta-optics to revolutionize the field. This review seeks to complement recent review articles addressing broader imaging-related applications of meta-optics18 as well as those surveying recent advances in spatial analog computing with meta-optics.19,20
II. SPATIAL FILTERING
The theoretical framework underpinning spatial filtering is well known and included here only to support subsequent discussions involving meta-optical devices. Interested readers are referred to the classic texts in the field, including those by Goodman21 and VanderLugt22 and foundational references such as Cutrona et al.4 For simplicity, we exclude birefringent materials or polarization effects introduced by lenses and other elements, and confine the discussion to coherent systems.
A. Theory
Consider a monochromatic coherent scalar optical field with free space wavelength λ that could, for example, contain information about an object. Any such field at a point in space (x, y, z) can be decomposed into its angular spectrum via a Fourier transform representation,

\[
U(x, y, z) = \frac{1}{4\pi^2} \iint A(k_x, k_y)\, e^{i(k_x x + k_y y + k_z z)}\, \mathrm{d}k_x\, \mathrm{d}k_y, \tag{1}
\]

where \(k_z = \sqrt{k^2 - k_x^2 - k_y^2}\). In the above, \(k = 2\pi/\lambda\) is the wavenumber of the incident light. The function \(A(k_x, k_y)\) is the angular spectrum of the field, given by the Fourier transform of the field in a plane designated z = 0,

\[
A(k_x, k_y) = \iint U(x, y, 0)\, e^{-i(k_x x + k_y y)}\, \mathrm{d}x\, \mathrm{d}y. \tag{2}
\]
The integral in Eq. (1) represents a plane wave decomposition of the field. If kz is real, there is a one-to-one correspondence between the transverse spatial frequencies (kx, ky) of each plane wave component and its direction of propagation, defined by the polar and azimuthal angles of that component with respect to the optical axis, as discussed further in Sec. III A. If kz is imaginary, the plane wave is said to be evanescent and decays exponentially with propagation distance z. Here we focus exclusively on forward propagating fields with real and positive values of kz.
If this field passes through a lens with focal length f, placed at z = f, and is focused [Fig. 1(a)], the field in the focal plane (X, Y) at z = 2f can be expressed in terms of the Fourier transform of the field. Specifically, if we assume the spatial extent of the input field is much smaller than the lens aperture so that diffraction at the aperture can be ignored, the output field is given by21

\[
U(X, Y) = U_c\, A\!\left(\frac{kX}{f}, \frac{kY}{f}\right). \tag{3}
\]

In this expression, \(U_c\) is a constant amplitude, and f is the focal length of the lens. Since

\[
k_x = \frac{kX}{f}, \qquad k_y = \frac{kY}{f}, \tag{4}
\]

a field is produced in the Fourier plane that is proportional to the scaled Fourier transform of the input field at z = 0. Hence, the use of a lens permits direct access to the Fourier transform of the field in the spatial domain, and the introduction of filters with spatially varying absorption, thickness, or refractive index provides a method to modify the angular spectrum components of the incident field. Introducing an otherwise transparent filter with an absorbing "dot" on the optical axis, for example, removes the components of the field with zero (or low) spatial frequencies. The introduction of a second lens, placed at z = 3f, transforms back to the image space, forming an image that is modified in appearance at z = 4f; this type of setup is therefore also referred to as a 4f system. If the low spatial frequencies are removed, the resulting image is edge enhanced.

Although the concept of spatial filtering had been understood since the work of Abbe and Porter,23,24 experimental demonstrations required a further half century. Elias et al.25,26 discussed the potential of spatial filtering for various all-optical processes, including object identification and image enhancement. Early published research with experimental demonstrations of spatial filtering includes work by O'Neill,3 who highlighted the link between optical and electronic filtering. This work demonstrated all-optical edge enhancement, the removal of granular features from photographs, and the detection and recognition of periodic and other signals in the presence of noise. In 1964, VanderLugt27 presented theoretical analysis and experimental demonstrations of complex spatial filtering. A summary of early work on optical spatial filtering is given in the review by Birch.28
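The central-stop (dark field) filtering described above can be illustrated numerically. The following Python/NumPy sketch emulates the two lens transforms with discrete Fourier transforms; the grid size, test object, and stop radius are illustrative assumptions rather than values from any of the cited experiments.

```python
import numpy as np

# Synthetic amplitude object: a uniform bright square on a dark background.
N = 64
u_in = np.zeros((N, N))
u_in[20:44, 20:44] = 1.0

# Lens 1: transform to the Fourier plane (emulated by a discrete FFT).
spectrum = np.fft.fftshift(np.fft.fft2(u_in))

# Central stop: block spatial frequencies within a small radius of the axis.
ky, kx = np.mgrid[-N // 2:N // 2, -N // 2:N // 2]
stop = (kx**2 + ky**2) > 8**2            # 1 outside the stop, 0 inside
filtered = spectrum * stop

# Lens 2: transform back to the image plane.
u_out = np.fft.ifft2(np.fft.ifftshift(filtered))
intensity = np.abs(u_out)**2
```

The uniform interior of the square, carried by low spatial frequencies, is strongly suppressed, while the discontinuous edges survive, producing the edge enhanced image discussed in the text.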
B. Applications
The introduction of a spatial filter into the Fourier plane of an imaging system leads to a modification of the angular spectrum. Given the one-to-one mapping between spatial frequency and position in the Fourier plane, we can describe its influence on the output field via a complex-valued transmittance, the optical transfer function given by

\[
H(k_x, k_y) = M(k_x, k_y)\, e^{i\Phi(k_x, k_y)}, \tag{5}
\]

where \(M(k_x, k_y)\) is the modulation transfer function and \(\Phi(k_x, k_y)\) is the phase-transfer function that describes the phase introduced by a filter. The amplitude and phase of the angular spectrum of the output field are modified as

\[
A_{\text{out}}(k_x, k_y) = H(k_x, k_y)\, A_{\text{in}}(k_x, k_y). \tag{6}
\]

According to the convolution theorem, the output field can alternatively be written as the convolution of the input field and the inverse Fourier transform h(x, y) of the transfer function,

\[
U_{\text{out}}(x, y) = U_{\text{in}}(x, y) * h(x, y). \tag{7}
\]
Hence, by tailoring the transfer function in the spatial frequency domain or h(x, y) in the spatial domain, there is the potential to implement a large range of operations on an input image field. To fully realize this functionality, however, requires independent control of both the real and imaginary parts of the transfer function for all values of (kx, ky), with transmittance amplitudes lying within a circle of radius 1 (Fig. 2). In practice, applications reported to date have been confined to subsets of this complex plane.
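The equivalence between Fourier-domain filtering and spatial-domain convolution stated above can be verified directly with discrete transforms, noting that the discrete analog is a circular convolution. This is a generic numerical check, not tied to any particular filter from the literature.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 128
u = rng.standard_normal(N)                                 # input field samples (1D for brevity)
H = rng.standard_normal(N) + 1j * rng.standard_normal(N)   # arbitrary transfer function

# Filtering in the spatial frequency domain ...
out_freq = np.fft.ifft(np.fft.fft(u) * H)

# ... equals (circular) convolution with h = inverse FFT of H in the spatial domain.
h = np.fft.ifft(H)
out_conv = np.array([sum(u[m] * h[(n - m) % N] for m in range(N)) for n in range(N)])

assert np.allclose(out_freq, out_conv)
```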
Early demonstrations of spatial filtering were restricted to binary filters (see, for example, Brown5), where light in the Fourier plane was either blocked or perfectly transmitted, i.e., the transfer function takes values of either 0 or 1. Such binary filters are used in Schlieren methods for visualizing phase by inserting a "knife edge" filter in the Fourier plane that blocks negative spatial frequencies. By using a central stop to block low spatial frequencies, the image background corresponding to low spatial frequencies is eliminated, leading to an edge enhanced, "dark field" image. Alternatively, a "low pass" filter can be used to block high spatial frequency noise to improve the quality of a beam. Binary filters can also be used to selectively detect, block, or identify periodic features in an image. The next level in complexity involves filters with varying gray levels to perform filtering operations while minimizing artifacts associated with the abrupt variations in transmission of the binary filters. The spatial filter transfer function hence takes values lying on the positive real axis in the complex plane, as shown in Fig. 2.
Pure phase filters have no influence on the amplitude of the transmitted function but vary the phase; their transmittance lies on the unit circle of Fig. 2. Traditionally, these were created by varying the thickness of a filter composed of a homogeneous material or varying the refractive index in a transparent film of uniform thickness, but devices such as a spatial light modulator can also be used to dynamically modulate the phase. One of the most prominent examples of phase filtering is that of Zernike phase contrast microscopy,11,12 where low and high spatial frequencies are advanced or retarded by \(+\pi/2\) and \(-\pi/2\) in the Fourier plane, introducing intensity contrast into a phase image. This is particularly important in imaging transparent objects such as live cells. Since a phase shift of π is introduced between low and high spatial frequencies, this type of filter is often referred to as being "real valued." A further example of "real valued" filters, where the transmission varies over negative and positive real values, is that of differentiation using a transfer function directly proportional to the spatial frequency, \(H(k_x, k_y) \propto i k_x\). Similarly, a system with a transfer function \(H(k_x, k_y) \propto -(k_x^2 + k_y^2)\) produces an output proportional to the Laplace operator \(\nabla^2\). This is directly apparent when the first- or second-order spatial derivatives are applied to an input field's Fourier representation as provided in Eq. (1), and the definition for the optical transfer function as given in Eq. (5) is considered. These operations enable, for instance, the detection of edges in amplitude and phase images as well as the visualization of phase gradients in optical wavefields, which are otherwise invisible to conventional camera technology.
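As a numerical sketch of the differentiation operations just described: applying a transfer function proportional to i kx to the spectrum of a sampled Gaussian reproduces its analytic first derivative, and a transfer function proportional to -kx^2 reproduces its second derivative. The grid parameters below are arbitrary illustrative choices.

```python
import numpy as np

# Sample a Gaussian on a window wide enough that wrap-around is negligible.
N, L = 256, 20.0
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
u = np.exp(-x**2 / 2)

# Spatial frequencies matching numpy's FFT ordering.
kx = 2 * np.pi * np.fft.fftfreq(N, d=L / N)

# First derivative: transfer function i*kx.
du = np.fft.ifft(1j * kx * np.fft.fft(u)).real
# Second derivative (1D Laplacian): transfer function -kx**2.
d2u = np.fft.ifft(-(kx**2) * np.fft.fft(u)).real

# Compare with the analytic derivatives of exp(-x^2/2).
assert np.allclose(du, -x * u, atol=1e-8)
assert np.allclose(d2u, (x**2 - 1) * u, atol=1e-6)
```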
Another important application is that of identifying a known object in an image. Convolution is an operation that is widely used to discriminate objects from a background, and this operation can be performed in the Fourier plane through the design of tailored filters. More generally optical systems can be implemented to perform a range of integral transforms.4
Despite the flexibility demonstrated by spatial filtering systems, the increasing availability of digital systems to manipulate and enhance images led to a decrease in the use of all-optical approaches. Methods requiring access to phase, however, cannot be simply replicated digitally, and other mechanisms are required to process this information.
C. Metasurfaces for Fourier plane spatial filtering
Metasurfaces are nanostructured two-dimensional surfaces that can be tailored to manipulate light-matter interactions to produce a specific near- or far-field optical response. Their principle of operation tends to fall into one of two categories. In the first, the phase and amplitude of light scattered by individual optical resonators are controlled by exploiting the variations in amplitude and phase that accompany resonance. The second approach involves the Pancharatnam–Berry (P-B) phase,31 which exploits the coupling between polarization and phase.
In the first case, the resonances of each unit cell depend on the geometry of the cell and the optical properties of the materials of which the resonator is composed, as well as those of the substrate, superstrate, and any other proximate materials. Given the significant variations in the amplitude and phase of light scattered at a wavelength close to resonance, it is apparent that metasurfaces can be used to tune the phase of light either transmitted through or reflected from the device. If the properties of the resonators are tuned point-by-point across the surface, by modifying the geometry, for example, it is possible to create flat analogues of conventional optical elements using meta-optics, where phase is accumulated through resonance rather than propagation.32,33 The concept of geometric phase has also been used to design metasurface devices. In this case, light that is right- or left-circularly polarized experiences a phase shift correlated with the orientation of a dielectric or metallic nanorod or nanoslot in a film. This has been used to demonstrate optical routing or the so-called photonic spin-Hall effect, where the direction of propagation of the scattered free or surface waves depends on the helicity of the incident field.34 By spatially varying the orientation of the nanorods or apertures, a spatial variation in the transmitted phase can be imposed.35
The flexibility afforded by metasurfaces has enabled image processing to be demonstrated by independently controlling the amplitude and phase of light transmitted through, or reflected from, a device at a particular point29 [Fig. 3(a)]. This permits broader access to arbitrary points within the green circle of Fig. 2. Using the concept of a "reflectarray," the authors of Ref. 29 were able to design and experimentally demonstrate devices that could perform differentiation and integration of an optical field. In particular, they demonstrated a device with a one-dimensional spatial variation in response, linear in position, that differentiated the incident field, and another, with a response approximately inversely proportional to position, designed to perform integration. The concept was further developed using a "dendritic" unit cell structure rather than blocks, while retaining the reflection geometry.36
A dielectric metasurface utilizing the P-B effect has also been used to demonstrate Fourier plane spatial filtering [Fig. 3(b)].30 With a P-B phase linear in position on the metasurface, transmitted circularly polarized light is filtered to produce a spatially shifted copy of the input field, with the shift direction dependent on the helicity of the incident light. If linearly polarized light is used and the orthogonally polarized output analyzed, the result is the difference between two oppositely shifted copies of the incident field, i.e., the spatial derivative of the field. In Ref. 30 the filter consisted of form-birefringent nanostructured glass fabricated using femtosecond pulse irradiation. One significant advantage of geometric phase approaches, rather than resonant nanostructures, is that the effect is relatively broadband, and this research was able to demonstrate edge enhanced images at wavelengths across the visible spectrum. The concept has been extended to spiral phase contrast imaging,37 where a spiral phase ramp introduced into the Fourier plane was shown to produce two-dimensional edge contrast (dark field) images of phase objects.
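A scalar toy model captures the shifted-copies mechanism behind this differentiation: subtracting two copies of a field displaced by ±Δ approximates the spatial derivative up to a factor of 2Δ. The Gaussian profile and shift magnitude below are arbitrary assumptions for illustration, not parameters of the device in Ref. 30.

```python
import numpy as np

# Analyzer output modeled as the difference of two copies of the field
# shifted by +/- delta; for small delta this approximates 2*delta*du/dx.
delta = 1e-3
f = lambda x: np.exp(-x**2)            # example input field profile

x = np.linspace(-3, 3, 101)
difference = f(x + delta) - f(x - delta)
approx_derivative = difference / (2 * delta)

# The analytic derivative of exp(-x^2) is -2*x*exp(-x^2).
assert np.allclose(approx_derivative, -2 * x * f(x), atol=1e-4)
```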
All of the approaches discussed in this section are based on modifying the Fourier content of an image by introducing spatially varying metasurfaces as spatial filters in optical systems. In Sec. III we focus on devices that modify the Fourier content of a wavefield through sensitivity to the angle of the incident light, and therefore, the transverse components of spatial frequency.
III. IMAGE AND OBJECT PLANE FOURIER FILTERING
Given the obvious success of spatial filtering, it is intriguing to consider the prospect of direct filtering of spatial frequencies in thin filtering devices, which would eliminate the additional lenses and space required to perform traditional spatial filtering. This image or object plane Fourier filtering approach was first put forward by Peri and Friesem38 and Case,39 who utilized thick volumetric phase gratings with a strong angular selectivity to directly filter spatial frequencies without requiring access to the Fourier plane. Using this approach, image deblurring38 and edge enhancement39 were demonstrated. Case also pointed out another key attribute of this direct filtering approach: it need not be located immediately in the object (or image) plane but, depending on the numerical aperture of the system, could be located some distance from the sample (or its image). Image plane filtering using holographic gratings in reflection was subsequently demonstrated by Molesini.40 The concept of object plane filtering was later extended to the use of Fabry–Pérot etalons41 and standard interference transmission (spectral) filters.42
A. Spatial frequency sensitive surfaces and films
As mentioned above, an arbitrary optical field can be decomposed into a superposition of plane waves via Fourier methods. The Fourier transform is expressed in terms of the transverse components of the wavevector of each plane wave, and there is a one-to-one correspondence between spatial frequency and propagation angle,

\[
k_x = k \sin\theta \cos\phi, \qquad k_y = k \sin\theta \sin\phi, \tag{8}
\]

where θ and \(\phi\) are, respectively, the polar and azimuthal angles of propagation with respect to the optical axis. Hence, if an optical device has a transmission or reflection that is suitably sensitive to the angle of incidence, it will provide direct filtering in the Fourier domain. This would remove the requirement for additional optics and associated propagation distances to access the Fourier plane. Angle-sensitive reflectance from, and transmittance through, materials is a phenomenon well known in a range of structures including crystals, waveguides, multilayer thin films, and diffraction gratings. This phenomenon can be harnessed to manipulate optical images in object, rather than Fourier, space.
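The angle-to-frequency mapping above can be made concrete in a few lines; the wavelength below is an illustrative choice.

```python
import numpy as np

def transverse_frequencies(wavelength, theta, phi):
    """Transverse wavevector components of a plane wave propagating at
    polar angle theta and azimuthal angle phi (radians)."""
    k = 2 * np.pi / wavelength
    kx = k * np.sin(theta) * np.cos(phi)
    ky = k * np.sin(theta) * np.sin(phi)
    return kx, ky

wavelength = 633e-9                      # illustrative HeNe wavelength
k = 2 * np.pi / wavelength
kx, ky = transverse_frequencies(wavelength, np.deg2rad(30), 0.0)

# A 30 degree polar angle corresponds to a transverse frequency of k/2 ...
assert np.isclose(np.hypot(kx, ky), k / 2)
# ... and kz stays real (a propagating wave) while kx^2 + ky^2 <= k^2.
kz = np.sqrt(k**2 - kx**2 - ky**2)
assert kz > 0
```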
B. Thin film devices
As evidenced by the earlier work discussed above, any uniform surface or slab exhibiting a transmission or reflection that is a function of angle of incidence has the capacity to perform spatial frequency (Fourier) filtering of an image.
While in Sec. III C we will discuss approaches that incorporate one or more layers textured on the nanoscale, here we exclusively discuss devices composed of stacks of uniform, non-structured optical films with subwavelength thickness or flat surfaces. A major difference between these two broad approaches lies in the fabrication schemes required for their experimental implementation. Devices containing nanostructured surfaces usually require complex fabrication techniques including lithography.43,44 Uniform thin-film structures, on the other hand, are most commonly fabricated through electron beam or thermal evaporation, plasma or ion-beam sputtering, or other vapor deposition techniques. These methods inherently enable large area, and often low-cost, fabrication usually unavailable via nanolithography techniques.45 One group of thin-film devices that has recently gained attention for analog optical computation consists of variations of phase-shifted Bragg gratings (PSBGs).48–50 Optical Bragg gratings are periodic modulations of the refractive index inside a transparent material such that near-complete reflection occurs at a particular design wavelength; they are commonly integrated into optical fibers.51,52 PSBGs incorporate an additional defect layer between two consecutive Bragg gratings, introducing a phase shift. This design implements a Fabry–Pérot-type cavity between the two Bragg gratings that produces ultra-narrow transmission bands.53 PSBGs can perform analog optical computation in the spatial domain due to a steep change in the reflection or transmission condition with the angle of incidence.
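The narrowband response of a phase-shifted Bragg grating can be reproduced with a standard normal-incidence transfer-matrix calculation. The indices, pair count, and design wavelength below are illustrative assumptions, not parameters from Refs. 48–50: a plain quarter-wave mirror reflects strongly at the design wavelength, while inserting a half-wave defect layer opens a narrow transmission resonance there.

```python
import numpy as np

def layer_matrix(n, d, wavelength):
    """Characteristic matrix of a homogeneous layer at normal incidence."""
    delta = 2 * np.pi * n * d / wavelength
    return np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                     [1j * n * np.sin(delta), np.cos(delta)]])

def transmittance(layers, wavelength, n_in=1.0, n_out=1.0):
    """Power transmittance of a lossless stack given as (index, thickness) pairs."""
    M = np.eye(2, dtype=complex)
    for n, d in layers:
        M = M @ layer_matrix(n, d, wavelength)
    B, C = M @ np.array([1.0, n_out])
    r = (n_in * B - C) / (n_in * B + C)
    return 1 - abs(r)**2                 # no absorption: T = 1 - R

lam0 = 1.55e-6                           # design wavelength (illustrative)
nH, nL = 2.0, 1.5                        # illustrative layer indices
qw = lambda n: lam0 / (4 * n)            # quarter-wave thickness

pair = [(nH, qw(nH)), (nL, qw(nL))]
mirror = pair * 8                        # plain quarter-wave Bragg mirror
# Same mirrors with a half-wave defect layer between them (the "phase shift").
psbg = pair * 8 + [(nH, 2 * qw(nH))] + [(nL, qw(nL)), (nH, qw(nH))] * 8

T_mirror = transmittance(mirror, lam0)
T_psbg = transmittance(psbg, lam0)
```

At the design wavelength the plain mirror transmits only a few percent, whereas the defect structure transmits almost perfectly, and a small detuning suppresses the resonance, illustrating the sharp spectral (and hence angular) sensitivity exploited in the cited proposals.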
Two numerical studies have exploited this to demonstrate first-order differentiation in reflection for a wavefront incident at a device-specific angle θ0,48 as well as second-order differentiation (Laplace operator) for normally incident light.49 In a subsequent theoretical study, the potential for PSBGs to perform optical integration of transmitted wavefields was demonstrated.50 A common constraint of such thin-film devices with large numbers of stacked layers is that the quality of the computation, as well as its energy efficiency, depends on the total number of layers in the system, as pointed out by Bykov et al.49 Case-dependent compromises are required to balance these effects. A more general approach that employs an optimization algorithm to tailor the optical response of alternating thin-film stacks to a desired transfer function has been demonstrated theoretically.54 Using the example of stacks of alternating Si and SiO2 thin films, the approach was used to design a device that performs two-dimensional second-order spatial differentiation in reflection. This work discussed requirements for the total number of layers in the system based on the width of the spatial frequency spectrum the devices are designed to process. It should be noted that this and other approaches requiring large numbers of thin-film layers commonly result in devices thicker than the wavelength of the light for which they are designed.
Another approach to thin-film-enabled analog optical computation employs dielectric slab waveguides. This method exploits the angular sensitivity in the vicinity of modal coupling into dielectric slab waveguides to engineer optical transfer functions. Two theoretical studies exploited this to demonstrate spatial differentiation in reflection55 as well as spatial integration in transmission56 through investigations of Si on SiO2 slab waveguide devices. The authors of both studies also investigated a more compact version of this approach using graphene with an engineered surface conductivity instead of dielectric layers forming the waveguide. Experimentally, these designs would generally require a coupling mechanism, for instance, prism coupling in the Kretschmann or Otto configurations. It should be noted that several approaches reviewed below in Sec. III C also exploit the angular sensitivity of the excitation of thin-film slab waveguide modes but incorporate grating-enabled coupling mechanisms.
Despite the extensive theoretical research into the various thin-film approaches discussed above, they have yet to be experimentally demonstrated. This can be attributed to the fact that the required precision in film thicknesses and refractive indices implies complex fabrication schemes, in particular for devices incorporating a large number of thin films.
As a less complex approach to analog optical computation, the excitation of surface plasmon polaritons (SPPs) on a single optically thin metallic film has also been investigated. The approach is based on the reflection of a wavefield from a single film at an angle that ensures matching of the relevant components of the wavevectors of the incident light wave and of the SPP on the air-metal boundary, which, in turn, depends on the material and thickness of the film. This requires a wavevector matching mechanism, usually enabled through use of a glass prism in the Kretschmann configuration. In an initial theoretical study, also incorporating a gain medium, this concept was shown to enable all-optical computation of the first derivative in reflection.57 Furthermore, the study investigated how this structure can be used to perform optical integration using an additional pump beam incident on the gain medium from the reverse side of the film. Vohnsen and Valente58 reported the first experimental implementation of the approach, using a thin gold film on a glass prism to perform wavefront sensing via spatial differentiation. An implementation of the concept based on a thin silver film on a glass prism was also used to perform one-dimensional experimental edge detection on amplitude and phase images [Fig. 4(a)].46 Although the thin films employed in this experiment had a subwavelength thickness of approximately 50 nm, the requirement for a glass prism as the wavevector matching mechanism renders integration of this approach into compact optical systems unfeasible. To overcome this limitation, and to enable differentiation in both spatial dimensions, it was suggested46 that the prism be replaced by a grating coupler. This approach was investigated in a theoretical study for illumination at terahertz wavelengths59 that demonstrated second-order differentiation for normally incident waves using a silicon grating between a graphene film and a gold substrate.
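The Kretschmann matching condition just described can be sketched from the SPP dispersion relation. The gold permittivity and prism index below are typical tabulated values near a HeNe wavelength, assumed here purely for illustration.

```python
import numpy as np

# Surface plasmon on a metal-air boundary (air: eps_d = 1).
eps_m = -11.7 + 1.2j       # assumed gold permittivity near 633 nm
eps_d = 1.0
n_prism = 1.52             # assumed BK7-like prism index

# SPP dispersion relation: k_spp = k0 * sqrt(eps_m*eps_d/(eps_m + eps_d)).
n_spp = np.sqrt(eps_m * eps_d / (eps_m + eps_d)).real

# Kretschmann matching: n_prism * sin(theta) = Re(k_spp)/k0.
theta_spp = np.degrees(np.arcsin(n_spp / n_prism))
```

With these assumed values the coupling angle comes out in the mid-40-degree range, consistent with the steep, angle-selective reflectance dip that the differentiation schemes above exploit.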
Nanostructured surfaces are discussed in Sec. III C below.
Various studies have investigated optical effects in reflection from the flat boundary between air and a semi-infinite bulk material for spatial optical computation. The Brewster effect, which is observed for light with TM polarization reflected from a dielectric optical interface, was exploited in Ref. 62. The study numerically demonstrated first-order differentiation for light reflected at the Brewster angle associated with the air-material boundary. Several other studies utilize variations of optical effects that are observed when linearly polarized light is reflected from a dielectric interface and analyzed with a perpendicular polarizer. These approaches exploit the interaction between the spin angular momentum (SAM) and the orbital angular momentum (OAM) of light, as reviewed, for example, in Ref. 63. The spin-Hall effect (SHE) of light was also experimentally demonstrated to enable analog optical computation [Fig. 5(a)].60 The approach exploits the fact that linearly polarized light refracted at a surface is spatially separated into its right- and left-circularly polarized components,63 which can be employed for spatial differentiation using an additional polarizing element. Based on the SHE, one-dimensional edge detection in an amplitude and a phase image reflected from an air-glass and from an air-gold interface was demonstrated in this study. It was, furthermore, experimentally demonstrated that utilizing spin–orbit interaction in this setup enables one-dimensional spatial differentiation along a direction adjustable by the incident polarization.64 Application of this single-interface approach to phase visualization was demonstrated experimentally in Ref. 65. Using a spatial light modulator, phase-object models of scaled epithelial cells with sizes of several hundred micrometers were converted into pseudo-3D intensity images through this method [Fig. 9(a)].
Another study theoretically and experimentally investigated the relationship between systems that alter the topological charge of a light field and their ability to perform two-dimensional spatial optical computation under certain polarization requirements [Fig. 5(b)].61 The authors emphasize that various photonic systems enable imposing a non-trivial topological charge on the incident beam, including total internal reflection at planar dielectric interfaces, lossy metallic reflectors, and photonic bandgaps. Using the specific example of a circularly polarized input field reflected from an air-glass interface and subsequently analyzed using a linear polarizer, the study demonstrates two-dimensional spatial differentiation of an amplitude image.
While the multilayer thin-film structures discussed at the beginning of this section commonly require large numbers of layers, an approach based on a simple three-layer metal-insulator-metal (MIM) absorber structure has recently been demonstrated to enable all-optical spatial frequency filtering in reflection [Fig. 4(b)].47 A MIM absorber consists of a dielectric spacer layer sandwiched between a reflective substrate layer and a thin, partly absorbing cover layer. For a spacer thickness of approximately a quarter wavelength, destructive interference leads to near-perfect absorption of normally incident light. The condition for absorption in a MIM absorber, however, changes with the angle of incidence, thereby enabling two-dimensional spatial frequency filtering of reflected wavefields. While the structure also enables low-pass spatial frequency filtering, the authors focus on high-pass filtering using a Au-SiO2-Au absorber to demonstrate two-dimensional edge detection in a micrometer-sized amplitude image [Fig. 4(b)] as well as conversion of micrometer-sized phase gradients in a wavefield into intensity modulations [Fig. 9(b)]. One drawback of this device is that its nonlinear optical transfer function does not perform first-order differentiation, which contributes to ringing effects in processed amplitude and phase images. The device has a total thickness of less than half a wavelength and thus carries potential for integration into advanced optical imaging systems. Furthermore, the use of nanophotonic all-optical spatial frequency filters for biological imaging was emphasized in this work, and the first experimental demonstration of this kind showed an increase in contrast in images of pond algae.47
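The angular sensitivity of such an absorber can be appreciated from the spacer's interference condition alone. The sketch below assumes a SiO2-like spacer index, neglects the reflection phases at the metal boundaries, and ignores dispersion; it is a simplified picture of why reflectance grows with angle at the design wavelength, not the full MIM response of Ref. 47.

```python
import numpy as np

# Quarter-wave condition: absorption when the round trip in the spacer gives
# destructive interference, i.e. 2*n*d*cos(theta_t) = lam/2 (metal reflection
# phases neglected in this simplified picture).
n_spacer = 1.46                       # assumed SiO2-like spacer index
lam0 = 800e-9                         # design wavelength (illustrative)
d = lam0 / (4 * n_spacer)             # spacer tuned for normal incidence

def resonant_wavelength(theta_deg):
    """Wavelength absorbed at a given external angle of incidence."""
    theta_t = np.arcsin(np.sin(np.radians(theta_deg)) / n_spacer)  # Snell's law
    return 4 * n_spacer * d * np.cos(theta_t)

# At normal incidence the device absorbs lam0; at oblique incidence the
# resonance blue-shifts, so reflectance at lam0 grows with angle (high-pass).
assert np.isclose(resonant_wavelength(0.0), lam0)
assert resonant_wavelength(40.0) < resonant_wavelength(20.0) < lam0
```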
C. Gratings, photonic crystals, and metasurfaces
This section discusses the application of nanophotonic approaches incorporating one or several layers textured on the nanoscale, including gratings, photonic crystals, and metasurfaces, to all-optical image processing. Grating structures, in particular resonant waveguide gratings (RWGs), also referred to as guided-mode diffraction gratings, have been investigated in various studies as object-plane spatial frequency filters. RWGs describe a group of nanostructures incorporating subwavelength gratings that support leaky guided modes along the grating. A detailed review of the properties of resonant waveguide gratings is provided in Ref. 68. While RWGs encompass a variety of configurations, a common design consists of a shallow grating on a slab waveguide. This enables coupling of incident light into resonant modes supported by the structure. Through engineering of the angular dispersion of this coupling mechanism, these structures can be used for analog optical computation in reflection and transmission mode. RWGs should be distinguished from high-contrast gratings (HCGs), which exploit the excitation of vertical Bloch modes in deep grating grooves rather than guided modes traveling horizontally.
An initial theoretical study established a general framework describing the transformation of a two-dimensional beam diffracted by a RWG and provided conditions under which the structure implements an optical differentiator or integrator.69 Using this model, the authors numerically demonstrated differentiation of a Gaussian beam upon transmission through an AlA grating on a silica waveguide layer.
Following this proposal, various RWG structures have been experimentally investigated and applied to performing spatial differentiation in one direction. In Ref. 66 an ERP40 electron-beam resist grating on a TiO2 thin film was investigated for operation at visible wavelengths [Fig. 6(a)]. The authors demonstrated differentiation of a Gaussian beam transmitted through the device. In another study, the concept was implemented for operation at near-infrared wavelengths via a silicon grating on a quartz layer.70 The authors of this study, furthermore, applied the structure to perform edge detection of transmitted amplitude images. An extension of the concept that enables wavelength-multiplexed spatial differentiation along one spatial direction was numerically investigated in Ref. 67 [Fig. 6(b)]. The proposed structure is designed for operation at telecommunication wavelengths and comprises a silica slab between a silicon grating and a gold backreflector. This enables storing several images at different wavelengths in the input field, which are then diffracted at different angles and thereby demultiplexed in the output field. In a specific numerical example, the authors demonstrated simultaneous edge enhancement of four images stored within a spectral bandwidth of approximately 260 nm. It should be noted that both of the above implementations of RWG-type devices rely on operation at a device-specific oblique angle of incidence. Furthermore, these devices are designed for operation along only one spatial direction. This was addressed in a numerical study by using a two-dimensional rather than one-dimensional grating structure.71 This investigation demonstrated two-dimensional edge detection in transmitted amplitude images via a silicon nitride grating embedded in a SiO2 thin film.
The authors acknowledge the anisotropy and non-linearity of the optical transfer function as limiting factors that prevent the device from performing uniform edge detection as well as “clean” first- or second-order differentiation. In a recent study, RWG-type structures consisting of a subwavelength silver grating on a thin film of TiO2 were demonstrated as high-pass spatial frequency filters for biological phase-imaging of transparent objects, including human cancer cells,72 as discussed in more detail in Sec. III D 1.
Guided mode structures have furthermore been engineered to take advantage of inherently incidence-angle-dependent Fano-type resonances and leaky modes in photonic crystals to enable analog optical computation. A theoretical study investigated arrays of resonant particles with a superimposed periodic spatial- or dielectric modulation.73 This enables interference between particle resonances and guided surface modes resulting in narrow Fano-type resonances. Through the example of an array of split-ring resonators with spatially modulated dielectric constant of the gap material, the authors tailored the angular response of the device to perform first- and second-order differentiation of transmitted wavefields in one dimension [Fig. 7(a)]. The authors showed that an extended version of the design enabled two-dimensional operation, and they numerically demonstrated edge detection of amplitude images.
In a subsequent study it was demonstrated that arrays of dielectric nanobeams supported by a thin dielectric film with suitably chosen parameters can also be engineered to support narrow Fano resonances that permit analog optical computation [Fig. 7(b)].74 The authors numerically demonstrated first- and second-order differentiation along one spatial dimension and experimentally implemented a device performing second-order differentiation at near-infrared wavelengths. Their design for a device performing first-order differentiation strategically included a thin high-index layer to create up-down asymmetry, permitting their design to have an asymmetric transfer function. Photonic crystal slabs were also proposed as a platform for analog optical computation in a numerical study by Guo et al. [Fig. 7(c)].75 It was demonstrated that the photonic band structure of a photonic crystal slab, consisting of a hole array and a dielectric layer interrupted by an air gap, can be engineered to perform the Laplace operator on a transmitted wavefield. The authors exploit this in a numerical example to demonstrate edge detection on a transmitted amplitude image using unpolarized light.
In an extensive experimental study Valentine et al. demonstrated a two-dimensional dielectric metasurface that performs the Laplace operator on a transmitted wavefield at long-wavelength visible and near-infrared wavelengths.77 The structure consists of an array of Si “nanoposts” on a silica slab. By tailoring the geometry such that leaky modes excited in the silica slab interfere with directly transmitted light to form Fano-type resonances, a device with a quadratic optical transfer function is obtained. The structure was applied to experimental edge detection of amplitude images and contrast enhancement in images of biological samples, including onion epidermis, pumpkin stems, and pig motor nerves. Furthermore, the authors integrated the device with a metalens to form a compact, compound optical device that can be placed directly on an image sensor to perform edge detection.
While the previously discussed approaches involved tailoring guided mode features, the excitation of Mie-resonances and subradiant modes of resonant nanoparticles have also been proposed as a further physical mechanism for all-optical spatial frequency filtering. Using Mie-resonances of a dielectric metasurface consisting of a hexagonal lattice of Si nanodisks, two-dimensional edge detection was demonstrated in transmission.78 Operation of this metasurface was experimentally demonstrated at near-IR wavelengths between 1400 and 1600 nm.
The electric or magnetic dipole resonance of a nanoparticle or nanoparticle ensemble usually dominates its electromagnetic response. However, weaker and spectrally narrower higher-order modes that carry zero net electric dipole moment can be excited on nanoparticles and nanoparticle ensembles under certain conditions.79 These resonance features are commonly referred to as subradiant modes, dark modes, or bound states in the continuum (BICs).80 Subradiant modes have received considerable scientific attention owing to their unusual optical properties: they exhibit longer lifetimes than dipole resonances, stemming from reduced radiative damping, and they do not couple to linearly polarized, normally incident light.81,82 Off-normally incident light, however, introduces a phase-shift across the unit cell that can result in the excitation of subradiant modes of the antenna elements or other structures in the cell. This angle-dependent excitation can be exploited as a platform for analog optical computation. Roberts et al. used quasi-analytic models to theoretically investigate the excitation of subradiant modes of optical metasurfaces and their application to edge detection of amplitude images (Fig. 8) and to converting phase gradients to intensity.76,83 In the case of arrays of annular apertures in a perfectly conducting film, the transmission could be calculated using a mode-matching technique. For a unit cell consisting of arrangements of nanorods, on the other hand, an electrostatic model predicted the performance well. A variant of the mode-matching technique in the monomodal approximation was also used to investigate how different arrangements of nanorod trimers per unit cell enable modal hybridization and support the excitation of dark modes under off-normal illumination.
Interference between this dark mode and the broader dipole mode of the structure can furthermore result in the appearance of ultra-narrow Fano-resonances that have an angular sensitivity potentially useful for image processing applications.83
A plasmonic nanorod ensemble that has been investigated in several studies as a platform for analog optical computation through subradiant mode excitation is the dolmen-shaped rod arrangement and its variations.84–88 Initial theoretical work demonstrated that a plasmonic dolmen structure comprising two parallel rods (input arms) and a third perpendicular rod (the signal arm) centered above permits sensing of phase differences.84 The approach exploits the fact that phase-differences in the localized surface plasmon oscillations between the input arms translate into excitation of a dipole mode on the signal arm. With incident light linearly polarized along the input arms and a second, perpendicular polarizer analyzing the output field, this structure enables sensing of phase-gradients along one spatial direction in the input field.85 The potential of this device for phase imaging is discussed further below. Furthermore, a version of this structure that enables sensing of phase-gradients along two spatial directions was subsequently proposed and experimentally demonstrated for operation at visible wavelengths.86 A related structure has also been numerically engineered to all-optically compute the second-order spatial derivative of a transmitted wavefield in one spatial direction.87
Another nanoplasmonic structure that has drawn attention as a platform for spatial frequency filtering based on subradiant mode excitation is an array of plasmonic radial rod trimers.81 It has been numerically and experimentally demonstrated that such arrays act as two-dimensional high-pass spatial frequency filters in reflection at visible wavelengths.89 The authors use the structure to demonstrate spatial frequency filtering through the excitation of electric and magnetic subradiant modes and discuss inter-unit cell coupling as a reason for the emergence of complex optical transfer functions. Recent results have further experimentally confirmed edge detection in an image reflected from an array of radial silver rod trimers and numerically simulated examples of phase-gradient to intensity conversion upon reflection.90 The sensitivity of the optical transfer function to the geometric parameters of the trimers, however, imposes stringent requirements on the fabrication process and generally results in non-isotropic and non-linear transfer functions. This can serve as a platform to engineer metasurfaces with highly specialized optical transfer functions but also imposes limits on the ability to perform high-resolution image processing and phase visualization.
D. Key considerations
The theory of Sec. II ignored polarization effects for clarity, but these need to be considered in almost all implementations of image processing involving spatially dispersive meta-optics or thin film devices. Under these circumstances, the theory needs to be adapted to incorporate a vector field input that is decomposed into, for example, p- and s-polarized components, and the transfer function becomes a tensor with elements describing the response to p- and s-polarized light,

$$\hat{H}(u, v) = \begin{pmatrix} H_{pp}(u, v) & H_{ps}(u, v) \\ H_{sp}(u, v) & H_{ss}(u, v) \end{pmatrix},$$
where the diagonal components lead to co-polarized transmitted or reflected field components and the off-diagonal terms describe cross-polarization.76 In some applications minimizing polarization sensitivity is desirable, whereas in others polarization can be used to tune the response of the device. It has also been shown that devices without depth asymmetry cannot produce asymmetric optical transfer functions73 unless polarization effects are used.88 Asymmetry is important if meta-surfaces are ultimately required to produce any arbitrary optical transfer function.
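A minimal sketch of how such a tensorial transfer function acts on a two-component field, in one spatial dimension for brevity. The diagonal differentiating response and vanishing cross-polarization terms chosen below are illustrative assumptions, not the response of any specific device:

```python
import numpy as np

def apply_tensor_otf(Ep, Es, H):
    """Filter a (p, s) field pair with a 2x2 matrix H of functions of u."""
    n = Ep.size
    u = np.fft.fftfreq(n)
    Fp, Fs = np.fft.fft(Ep), np.fft.fft(Es)
    Gp = H[0][0](u) * Fp + H[0][1](u) * Fs  # co-polarized + cross terms
    Gs = H[1][0](u) * Fp + H[1][1](u) * Fs
    return np.fft.ifft(Gp), np.fft.ifft(Gs)

# Example: co-polarized differentiation (H_pp, H_ss ∝ i*u), no cross-coupling.
H = [[lambda u: 1j * u, lambda u: 0 * u],
     [lambda u: 0 * u, lambda u: 1j * u]]
x = np.linspace(-1, 1, 256, endpoint=False)
Ep = np.exp(-x**2 / 0.1)   # Gaussian input in the p component
Es = np.zeros_like(x)
outp, outs = apply_tensor_otf(Ep, Es, H)
# outs stays zero because the off-diagonal (cross-polarization) terms vanish;
# outp is proportional to the derivative of the Gaussian, zero at its center.
```

Making the off-diagonal entries nonzero would model cross-polarized conversion, the channel exploited by several of the polarizer-analyzer schemes discussed above.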
In most research discussed above, the metasurfaces also require monochromatic plane waves incident on the sample to perform as intended. This is because the meta-optic devices are wavelength sensitive and require a well-defined direction for the incident light, since this defines the zero spatial frequency. Devices that operate at multiple wavelengths, and particularly those corresponding to a red-green-blue color gamut, may be required for image processing where color rendering is important. The effects of spatial, rather than temporal, coherence are discussed in more detail below.
Another issue that has attracted little attention is the effective numerical aperture of the meta-device. The numerical aperture affects the range of spatial frequencies intercepted by the device. Devices or systems that do not capture all the spatial frequencies in the incident beam produce artifacts in the resulting image. Ringing artifacts at sharp edges and corners caused by the finite numerical aperture of real optical systems are explained by the Gibbs-Wilbraham phenomenon.91 This phenomenon refers to the overshoot observed at jumps in the band-limited version of a signal, which translates to ringing artifacts in optical images reproduced with limited spatial frequency bandwidth. An additional contribution to such ringing artifacts arises from non-linearities in the optical transfer function of the device. Ringing artifacts are evident in most experimental works on meta-surface spatial filtering and sometimes also appear in numerical studies.47,74,77 Finally, the spatial frequency range over which a particular meta-optical device can perform a useful filtering operation dictates the required scaling of the input field. Experimentally, this commonly requires additional bulk-optical components that magnify or demagnify the input field to match its spatial frequency spectrum to the filter function of the meta-optical device, as for example in Refs. 46, 47, 61, and 74. Many nanophotonic systems, such as those incorporating resonant gratings66 or coupled nanorods,85,92 can be approximated by a linear transfer function leading to all-optical first-order differentiation of the input signal. In the case of Ref. 66 the reported numerical aperture (NA) was of the order of 0.04 for a demonstration of one-dimensional first-order differentiation using a tilted dielectric grating, and Eftekhari et al.85,92 were able to show an approximately linear response up to a NA of almost 0.5.
Dielectric metasurfaces utilizing Fano resonances and exhibiting an approximately quadratic transfer function producing second-order differentiation have been demonstrated up to NAs of 0.32 (Ref. 77) and 0.35 (Ref. 74). These low-loss systems also demonstrate efficiencies in excess of 90%. A more detailed discussion of requirements for analog optical computing can be found in Ref. 20.
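The Gibbs-Wilbraham ringing described above can be reproduced in a few lines. The hard spectral cutoff below is an illustrative stand-in for a finite numerical aperture:

```python
import numpy as np

# Band-limiting a sharp edge produces an intensity overshoot near the
# discontinuity, mimicking the ringing artifacts of a finite NA.
n = 1024
signal = np.zeros(n)
signal[n // 4: 3 * n // 4] = 1.0        # sharp-edged "object"

F = np.fft.fft(signal)
u = np.fft.fftfreq(n)
F[np.abs(u) > 0.02] = 0.0               # hard band limit (finite NA)
band_limited = np.fft.ifft(F).real

overshoot = band_limited.max() - signal.max()
# The band-limited signal overshoots the original step by roughly 9% of the
# jump height (the classic Gibbs figure), with ripples decaying away from
# the edge.
```

Tightening or relaxing the cutoff changes the ripple period but not the fractional overshoot, which is why ringing persists even in high-NA systems.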
Despite these considerations, many of which provide interesting avenues for further research, meta-surfaces have distinct advantages, not only because of their compact size. Those devices that operate directly in Fourier space can be placed almost anywhere in the optical system between the sample and the detector. This advantage provides flexibility in the design of the optical system incorporating the metasurface.
To date, most researchers have focused on designing devices for performing mathematical operations, notably first- and second-order differentiation, which highlight edges in an image. This may have some advantage, but operations such as edge detection are very efficiently performed using computers, for example via application of the Sobel, Prewitt, or Roberts operators.93 On the other hand, one application where these devices have great potential is phase imaging, since a conventional camera cannot detect phase variations in an incident beam, only the intensity.
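For comparison with the digital alternative, a minimal Sobel edge detector requires only two 3x3 convolutions approximating the horizontal and vertical intensity gradients:

```python
import numpy as np

def sobel_edges(img):
    """Return the Sobel gradient-magnitude map of a 2D intensity image."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    pad = np.pad(img, 1, mode="edge")
    gx = np.zeros(img.shape)
    gy = np.zeros(img.shape)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            window = pad[i:i + 3, j:j + 3]
            gx[i, j] = np.sum(window * kx)
            gy[i, j] = np.sum(window * ky)
    return np.hypot(gx, gy)            # gradient magnitude

img = np.zeros((32, 32))
img[8:24, 8:24] = 1.0                  # bright square on dark background
edges = sobel_edges(img)
# The response is zero in the uniform interior and background and nonzero
# only along the square's boundary.
```

This is the operation a camera-plus-processor pipeline performs after detection; the meta-devices reviewed here perform the equivalent filtering before the light reaches the sensor.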
Since neither the application of spatially dispersive devices to phase imaging nor the influence of spatial coherence on their performance has been extensively investigated elsewhere, we include further discussion of each of these aspects below.
1. Phase-visualization
It is well known that spatial frequency filtering is a powerful technique to visualize phase gradients in a light field. Such filtering was developed for x-rays to enable phase imaging in the absence of regular optical elements.94,95 The advent of metasurfaces enabling all-optical spatial frequency filtering provides the opportunity to convert phase gradients to intensity variations that can be imaged with a conventional camera (Fig. 9). In the following we briefly review the fundamentals of phase visualization based on spatial frequency filtering and draw conclusions about the desirable attributes of suitable metasurfaces. Here we consider a pure phase field $U(x, y) = U_0 e^{i\phi(x, y)}$ with a spatially varying phase modulation $\phi(x, y)$ but constant amplitude $U_0$. The correspondence between first- and second-order spatial differentiation and a linear or quadratic optical transfer function, respectively, was summarized above in Sec. II. Here we demonstrate the extraction of phase information from a wavefield through spatial differentiation and conversion to an intensity modulation. First-order spatial differentiation of U(x, y) along one coordinate yields an intensity distribution given by

$$I_x(x, y) = \left|\frac{\partial U}{\partial x}\right|^2 = U_0^2 \left(\frac{\partial \phi}{\partial x}\right)^2.$$
The equivalent holds for the perpendicular y-direction. First-order spatial differentiation of the input field thus results in an intensity distribution proportional to the square of the phase-gradients $\partial \phi / \partial x$ and $\partial \phi / \partial y$.
While first-order spatial differentiation, i.e., processing by a linear optical transfer function, results in a simple relationship between input phase and output intensity, nonlinear transfer functions generally produce a complex output intensity that does not allow unambiguous visualization of the initial phase map. Non-linear transfer functions correspond to non-integer spatial derivatives as elucidated by fractional calculus.96 We demonstrate the effects of non-linearity using the example of second-order differentiation with the Laplace operator $\nabla^2 = \partial^2/\partial x^2 + \partial^2/\partial y^2$, i.e., a system with quadratic optical transfer function,

$$I(x, y) = \left|\nabla^2 U\right|^2 = U_0^2 \left[\left(\nabla^2 \phi\right)^2 + \left(\left(\frac{\partial \phi}{\partial x}\right)^2 + \left(\frac{\partial \phi}{\partial y}\right)^2\right)^2\right].$$
It is apparent that both first- and second-order derivatives of the phase contribute to the resulting intensity distribution. Hence, for the purpose of phase extraction, a linear optical transfer function is generally required. However, for a phase variation that can be approximated by a Taylor expansion as a linear deviation from a constant phase background, with $\phi(x, y) \approx \phi_0 + \frac{\partial \phi}{\partial x} x + \frac{\partial \phi}{\partial y} y$ and hence $\nabla^2 \phi = 0$, applying the Laplace operator yields

$$I(x, y) = U_0^2 \left[\left(\frac{\partial \phi}{\partial x}\right)^2 + \left(\frac{\partial \phi}{\partial y}\right)^2\right]^2.$$
In this case an intensity variation that is a function of only the phase-gradients in both spatial directions is produced. The Laplace operator is therefore a useful operation for two-dimensional detection of linear phase-gradients, although artifacts may appear for non-linear variations in phase.
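These relations are easy to verify numerically. The sketch below assumes an ideal spectral differentiator and a linear phase ramp chosen to be periodic on the grid:

```python
import numpy as np

# First-order differentiation of a pure phase field converts a phase
# gradient into intensity. For a linear ramp phi = a*x the output intensity
# is uniform and equal to U0^2 * a^2.
n = 512
x = np.linspace(0.0, 1.0, n, endpoint=False)
U0 = 1.0
a = 2.0 * np.pi * 10.0          # phase gradient d(phi)/dx (periodic choice)
U = U0 * np.exp(1j * a * x)     # pure phase field, constant amplitude

u = np.fft.fftfreq(n, d=x[1] - x[0])
dU = np.fft.ifft(2j * np.pi * u * np.fft.fft(U))  # spectral first derivative
I = np.abs(dU)**2               # uniform, equal to U0^2 * a^2
```

Replacing the factor 2πiu by its square would implement the Laplace-operator case and, for this linear ramp, reproduce the quartic dependence on the gradient derived above.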
Real optical systems will be able to perform a differentiation operation only within a limited spatial frequency range and commonly exhibit a constant offset in their optical transfer function. Of particular interest for phase-imaging are systems exhibiting a linear optical transfer function with a constant offset.65,72,88 Note that, as discussed previously, this transfer function is asymmetric about zero spatial frequency. This produces pseudo-3D intensity images of phase objects similar to those observed in differential interference contrast (DIC) microscopy.97 With appropriate choice of polarizer and analyzer settings, the dolmen structure discussed in Sec. III C exhibits such a property, and its experimentally verified transfer functions were theoretically shown to permit the generation of phase contrast images with this property [Fig. 9(c)].88 Most importantly, the study showed that the polarizer settings permit the introduction of a tunable asymmetry in the optical transfer function, which is important in generating such pseudo-3D phase images of transparent objects. In another recent study introduced above in Sec. III C, RWG-type structures were used to extend phase-imaging capability to a glass coverslip platform.72 The study experimentally demonstrated the visualization of human cancer cells placed directly on the device at visible wavelengths. To obtain the pseudo-3D images produced through this method, the authors used a small tilt in the illumination angle to introduce an asymmetry in the optical transfer function of the device.
In summary, we have discussed the ability of nonlocal optical systems to perform phase visualization and pointed out the requirements on the optical transfer function of a system as well as on the nature of the phase-modulation under consideration. The above discussion may inform the design of nanophotonic structures aiming to perform phase-visualization, for example, as part of biological imaging systems.
2. Coherence
An issue with conventional spatial filtering is the effect of incoherence on the resulting image. In the discussion so far, we have considered a monochromatic plane wave incident on the object to be imaged. Such a wave is highly coherent, both spatially and temporally. However, many light sources are distributed in space, resulting in many spatial frequencies, i.e., directions of propagation, before interacting with the sample. Such distributions lead to a “smearing” out of the effects of the spatial filters we have previously considered.
The effects of partial coherence or even total incoherence can be understood using the mutual optical intensity.98 A monochromatic, one-dimensional, partially coherent (scalar) complex field is represented by the mutual intensity

$$G(x_1, x_2) = \langle E^*(x_1) E(x_2) \rangle,$$
where the asterisk denotes complex conjugation and the brackets represent averaging over all realizations of the field. In this notation the intensity of the field is given by $I(x) = G(x, x)$. A field with mutual intensity $G_0$ incident on an object with scalar transmission function O(x) is transmitted with mutual intensity

$$G(x_1, x_2) = O^*(x_1) O(x_2) G_0(x_1, x_2).$$
After being transmitted through the sample, the “processed” field leaving a meta-device is still related to that incident by the optical transfer function H(u), with a mutual intensity $G_{\mathrm{out}}(x_1, x_2)$. On Fourier transforming, we have the mutual intensity as a function of spatial frequency,

$$\hat{G}(u, v) = \iint G(x_1, x_2)\, e^{2\pi i (u x_1 - v x_2)}\, \mathrm{d}x_1\, \mathrm{d}x_2.$$
This result allows us to express the Fourier transform of the mutual intensity leaving the meta-device in terms of its optical transfer function and the Fourier transform of the incident mutual intensity,

$$\hat{G}_{\mathrm{out}}(u, v) = H^*(u) H(v) \hat{G}(u, v),$$
where we write u and v as the spatial frequencies conjugate to x1 and x2, respectively. This equation provides a formal method for propagating partially coherent fields through an optical system. An example calculated using this method is shown in Fig. 10, where sinusoidal amplitude gratings with different periods are illuminated with a partially coherent beam based on the Gaussian-Schell model99

$$G_0(x_1, x_2) = \exp\!\left(-\frac{x_1^2 + x_2^2}{4 w^2}\right) \exp\!\left(-\frac{(x_1 - x_2)^2}{2 \sigma^2}\right),$$
which describes a beam with a mutual coherence length σ and spatial width w. The illuminated object is a one-dimensional transmission grating with a sinusoidal profile, and the meta-device has a linear transfer function $H(u) \propto u$, which takes the first derivative of the optical field. For large coherence lengths, the meta-device yields images of the grating with twice the period, since the output is the modulus squared of the grating gradient. As the coherence length decreases toward the period of the grating, the effect of the meta-device is diminished, as observed by the loss of the gradient, or period doubling, eventually leading to simply an intensity image of the grating itself. For partially coherent illumination (the green curves of Fig. 10), a more complex intensity distribution could introduce artifacts in 2D images.
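This propagation recipe can be sketched numerically. The grid, beam parameters, sinusoidal grating, and ideal differentiating transfer function H(u) = 2πiu below are illustrative assumptions rather than the exact parameters of Fig. 10:

```python
import numpy as np

# Propagate a partially coherent field through a differentiating meta-device
# via the mutual intensity G(x1, x2) = <E*(x1) E(x2)>.
n = 256
x = np.linspace(-4.0, 4.0, n, endpoint=False)
w, sigma = 2.0, 5.0                     # beam width; (large) coherence length
X1, X2 = np.meshgrid(x, x, indexing="ij")

# Gaussian-Schell mutual intensity of the illumination.
G = np.exp(-(X1**2 + X2**2) / (4 * w**2)) * np.exp(-(X1 - X2)**2 / (2 * sigma**2))

# Sinusoidal amplitude grating: G -> O*(x1) O(x2) G(x1, x2) (O is real here).
O = 0.5 * (1.0 + np.cos(2.0 * np.pi * x))
G = O[:, None] * O[None, :] * G

# Apply the transfer function to each field argument: conjugated filtering
# along x1 and plain filtering along x2.
u = np.fft.fftfreq(n, d=x[1] - x[0])
H = 2j * np.pi * u
G = np.conj(np.fft.ifft(H[:, None] * np.fft.fft(np.conj(G), axis=0), axis=0))
G = np.fft.ifft(H[None, :] * np.fft.fft(G, axis=1), axis=1)

# The observable intensity is the diagonal of the processed mutual intensity.
I = np.real(np.diag(G))
# For this nearly coherent beam the output approximates the squared gradient
# of the grating, vanishing at the grating maxima: the period doubling
# described above.
```

Shrinking sigma toward the grating period washes out the period doubling, reproducing the trend shown in Fig. 10.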
Coherence effects were considered in the early days of optical spatial filtering, where it was shown that additional information obtained from the experiment could enable spatial filtering with incoherent light.100–103 In particular Lohmann104 noted that incoherent spatial filtering involves real, positive functions, and he devised a method for mimicking a complex spatial filter function by imposing a spatial frequency “carrier” offset into the incoherent light beam. Such a procedure was demonstrated by Stoner.101 With meta-devices, incoherent spatial filtering was achieved using a hybrid imaging approach based on a metasurface coupled to a wavelength-sensitive (color) imaging detector.105 The device has significantly different optical transfer functions at two different wavelengths that are resolved by a color detector and processed.
IV. OUTLOOK
The recent surge of scientific interest in meta-optical systems for spatial information processing has resulted in a broad range of available concepts and structures for further exploration and applications. However, theoretical demonstrations of the general prospects of meta-optical systems for information processing, as, for example, in Refs. 73, 76, 106, and 107, currently outweigh experimental implementations. Closing this gap provides intriguing avenues for future research. Furthermore, there is a broad spectrum of potential extensions of existing research as well as scientific and technological fields that could benefit from the application of meta-optical information processing systems. Below we provide an overview of these prospects.
The sensing, quantification, and correction of phase aberrations in optical wavefronts is of fundamental importance to several scientific and technical disciplines, including ocular diagnostics108 as well as astronomy.109,110 Current approaches in these fields usually involve Shack–Hartmann (SH)-type sensors or rely on interferometry. A major drawback of these methods is their requirement for post-processing (SH) and bulk-optical components (interferometry). Object-plane image processing solutions as discussed in this review have potential to not only visualize phase variations but also underpin new ultra-compact wavefront sensing solutions, as evidenced by Vohnsen and Valente.58
Biological cells commonly generate insufficient amplitude contrast to visualize details of their spatial structure or internal features. These are, for example, important indicators for medical conditions such as sickle cell disease or malaria.111 Chemical staining and fluorescent labeling are frequently used methods to enhance image contrast in biological samples. Owing to their invasive nature, however, these techniques inevitably alter, and in some cases destroy, the cell environment.112,113 Relevant information about the spatial structure of a cell is nonetheless also contained in the phase of a wavefield once it has passed through the cell. Conventional bulk-optical phase imaging methods such as Zernike phase contrast11,12 or differential interference contrast (DIC) imaging97 exploit this by converting these phase-modulations into detectable intensity variations. Indirect methods, including those based on the transport of intensity equation (TIE), have been demonstrated but require computational post-processing.114 Nanophotonic, all-optical phase-imaging systems, on the other hand, carry enormous potential to simultaneously eliminate the requirements for bulk-optical components and computational post-processing. Initial demonstrations have already highlighted the role nanophotonic systems could play in future biological imaging systems.47,65,77 In addition to qualitative phase imaging, meta-optical systems could also enable precise quantification of phase excursions. Such quantitative phase imaging (QPI) approaches are an important tool for research investigating physiological processes within living cells and also underpin new approaches to 3D imaging.115
The spatial filtering property of meta-devices is a consequence of their ability to detect the direction of propagation of the incident light. In this regard, such devices also have potential for determining distances to objects, for example in stereoscopic imaging, which exploits the small change in the apparent direction of an object when viewed from two nearby locations. Fast and accurate distance sensing is important in the development of autonomous vehicles, which currently use a range of technologies including LIDAR.116 The passive nature of meta-devices and their rapid processing of optical information may prove useful in these applications in the future.
Object classification within images in the terahertz117 and visible118 regions of the spectrum has been recently demonstrated by propagating an optical field through several spatially separated diffractive surfaces cascaded along the direction of propagation. The properties of the flat surfaces are tailored to produce a specific input–output relationship accommodating transformation by the free-space regions between them. Such approaches have significant potential for performing all-optical image processing of fields, including many of the applications discussed above.
Meta-optical information processing systems carry significant potential to be part of future ultra-compact imaging systems in mobile devices. Proof-of-concept demonstrations of such meta-optical systems to date usually require additional bulk optical components, including lenses, microscope objectives, and polarizers, as reviewed above. Through compound optical devices that also incorporate flat optical components, such as metasurface lenses, imaging-on-a-chip could become possible, as evidenced by initial demonstrations.77 Most importantly, this could facilitate access to mobile medical imaging devices in remote regions of developing countries where conventional diagnostic tools are often unavailable. This lack of medical diagnostic equipment is regarded as a major cause of the high morbidity rate of diseases such as malaria in developing countries.119 In particular, the integration of meta-optical systems with the computing and connectivity features of smartphones represents a promising pathway toward the next generation of mobile medical imaging systems.120–122 In addition, such compound meta-optical systems could also find applications in situations where large amounts of spatial information need to be processed in real time under stringent constraints on component size, energy consumption, and processing speed, such as in small-scale geosatellites.123
In this review we have focused on all-optical information processing in the spatial domain. However, meta-optical systems have been investigated as temporal information processors as well.124,125 Simultaneous processing in the spatial and temporal domain carries potential to further increase the information throughput of meta-optical systems and has for instance been investigated using resonant grating structures.126 Extension of the approaches discussed in this review to spatiotemporal operation will therefore be a promising pathway for further research.
V. CONCLUSION
In this article we have summarized the fundamentals of all-optical spatial filtering for information processing purposes, placed early work into context, and provided a detailed review of studies published throughout the recent surge in scientific interest in this topic. The authors believe that the development of compact, meta-optical information processing systems will be of fundamental importance for the advancement of any technological area involving optical imaging and processing of spatial information in general.
AUTHOR CONTRIBUTIONS
L.W., T.J.D., and A.R. contributed equally to this manuscript. All authors reviewed the final manuscript.
ACKNOWLEDGMENTS
The authors acknowledge funding through the Australian Research Council (ARC) Discovery Projects Scheme (Project No. DP160100983) and the Center of Excellence Scheme (Project No. CE200100010) through the ARC Center of Excellence for Transformative Meta-Optical Systems.
DATA AVAILABILITY
Data sharing is not applicable to this article as no new data were created or analyzed in this study.