High intensity focused ultrasound (FUS) is a noninvasive technique for treatment of tissues that can lie deep within the body. There is a need for methods to rapidly and quantitatively map FUS pressure beams for quality assurance and accelerate development of FUS systems and techniques. However, conventional ultrasound pressure beam mapping instruments, including hydrophones and optical techniques, are slow, not portable, and expensive, and most cannot map beams at actual therapeutic pressure levels. Here, a rapid projection imaging method to quantitatively map FUS pressure beams based on continuous-wave background-oriented schlieren (CW-BOS) imaging is reported. The method requires only a water tank, a background pattern, and a camera and uses a multi-layer deep neural network to reconstruct two-dimensional root-mean-square (RMS) projected pressure maps that resolve the ultrasound propagation dimension and one lateral dimension. In this work, the method was applied to collect beam maps over a 3 × 1 cm2 field-of-view with 0.425 mm resolution for focal pressures up to 9 MPa. Results at two frequencies and comparisons to hydrophone measurements show that CW-BOS imaging produces high-resolution quantitative RMS projected FUS pressure maps in under 10 s, the technique is linear and robust to beam rotations and translations, and it can map aberrated beams.

Focused ultrasound (FUS) is a noninvasive therapeutic modality that has a broad range of established and emerging applications, including tumor,1 fibroid,2 and desmoid3 ablation, drug delivery,4 brain surgery,5,6 blood-brain barrier opening,7 and neuromodulation.8 The acoustic waves used in FUS can be focused to a particular location deep in the body with a spatial resolution on the order of the wavelength of the ultrasound (approximately 1.5 mm at 1 MHz) to thermally or mechanically treat tissues at the focus without affecting intervening tissues. To maximize the therapeutic benefit of FUS, one must know, with high spatial accuracy and precision, how much acoustic energy is delivered and where. Furthermore, for therapeutic efficacy and safety, it is necessary to assess whether the FUS system output changes between treatments9 and to check for system failures, which could dangerously alter energy delivery. Experts have recommended that rigorous quantitative pressure beam mapping be performed on clinical systems 2–3 times monthly.10 For these reasons, the ability to quantitatively map the acoustic pressure beam in two or three spatial dimensions in the clinic is essential, and it is important for the safety and reproducibility of FUS treatments that instruments for rapid field characterization become available. FUS pressure beam mapping is also essential for research and development of new FUS technologies and techniques, such as new therapeutic transducers,11 methods to propagate FUS through the skull and other bones,12–14 acoustic lenses,15–17 and FUS-transparent magnetic resonance imaging (MRI) radio frequency (RF) coils.18 Finally, FUS pressure beam mapping is important for focused imaging transducers, whose output must be characterized to ensure they meet safety guidelines set by the U.S. Food and Drug Administration.

The most widely used FUS pressure beam mapping instruments are hydrophones. They are ill-suited to rapidly mapping the pressure beams produced by FUS transducers because (as illustrated in Fig. 1) they only sample one spatial location at a time, and a three-dimensional (3D) motion stage must be used to move them through a tank to produce a spatially resolved map. This results in long measurement times that even with variable density sampling schemes can take tens of minutes for two-dimensional (2D) planes and several hours for 3D volumes, and one-dimensional (1D) and 2D hydrophone scans require careful alignment with the ultrasound focus. The long measurement times limit hydrophones' usefulness in measuring beams at multiple power levels or across ranges of experimental variables. Common polyvinylidene fluoride (PVDF) hydrophones are not prohibitively expensive but can only measure subtherapeutic pressure levels since they are easily damaged by cavitation. To overcome their speed and power limitations, hydrophone measurements can be made at low pressures over a 2D surface close to a transducer and then serve as source definitions for 3D computational modeling (holography) at different power levels or medium configurations,19,20 but these methods still require a large number of hydrophone measurements with a motion stage. Specially designed robust membrane21 and fiber optic hydrophones22 can withstand higher pressures, but they are more expensive (>$10 000) and less sensitive than PVDF hydrophones, and bandwidth limitations at high pressures can be a problem. Any instrument that sits in the focus will experience damage due to cavitation and require periodic repair and recalibration, and hydrophone systems with scanning tanks and translation stages lack the portability and ease of setup needed for clinical quality assurance measurements.

FIG. 1.

3D ultrasonic pressure beam mapping using a hydrophone probe. The hydrophone probe only samples one spatial location at a time; therefore, it must be translated by a motion stage to obtain a spatially resolved map, as illustrated by the images on the left. The proposed CW-BOS method produces root-mean-square (RMS) projected pressure beam maps. To obtain the same maps using a hydrophone, 3D hydrophone measurements are integrated along the line-of-sight dimension to obtain projected pressure waveforms, and then the RMS amplitude of the projected waveform is calculated, as illustrated by the images on the right.

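The line-of-sight projection and RMS reduction illustrated in Fig. 1 can be sketched in a few lines of NumPy; the array layout and function name below are illustrative, not taken from the paper:

```python
import numpy as np

def rms_projected_pressure(p_xyzt, dy):
    """Collapse a 4D hydrophone dataset p(x, y, z, t) into a 2D RMS
    projected pressure map, as illustrated in Fig. 1.

    p_xyzt : ndarray, shape (nx, ny, nz, nt), pressure samples
    dy     : hydrophone step along the line-of-sight (y) dimension, m
    """
    # Integrate along the line-of-sight dimension (Riemann sum) to get
    # the projected pressure waveform P_proj(x, z, t).
    p_proj = p_xyzt.sum(axis=1) * dy
    # RMS amplitude of the projected waveform over time.
    return np.sqrt(np.mean(p_proj**2, axis=-1))

# Example: a sinusoidal field that is uniform over a 1 m line of sight
# has RMS projected pressure = amplitude / sqrt(2).
t = np.linspace(0, 1, 1000, endpoint=False)          # one period
p = np.ones((4, 8, 6, 1)) * np.sin(2 * np.pi * t)    # (nx, ny, nz, nt)
rms_map = rms_projected_pressure(p, dy=0.125)        # shape (4, 6)
```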

Ultrasound pressure beams can also be mapped based on the deflection of light due to the acousto-optic effect, which relates a spatial pressure distribution to the index of refraction. Optical ultrasound pressure beam mapping methods, such as photographic and laser schlieren methods, have been used for more than 50 years to visualize ultrasound pressure fields in two dimensions,23–31 and laser-based tomographic schlieren methods have been developed for temporally resolved 3D ultrasound pressure beam mapping.27,32–38 The laser-based systems are founded on the same physical principle as the technique proposed here and are capable of impressive spatiotemporal resolution and sensitivity. However, they are limited to small fields-of-view (FOVs) and are prohibitively expensive (>$10 000). Furthermore, to perform 3D mapping, the laser-based systems typically require that the transducer itself be rotated, which makes them incompatible with in situ clinical transducers and limits their research utility.

Unlike conventional schlieren systems that require elaborate optical setups involving pulsed light sources, collimating lenses, and filters, background-oriented schlieren (BOS)39 imaging uses only a camera to image a background pattern through a nonuniform refractive index field. The background pattern is distorted by the nonuniform refractive index field, and cross correlation of images acquired with and without the refractive index field in place produces index of refraction maps. In essence, BOS trades the sophisticated optical setup of conventional schlieren for a more sophisticated computation, which is much less expensive and easily replicated. The method has been used tomographically outside of acoustics to map static refractive index fields in 3D,39–45 and it has been used to visualize FUS beams qualitatively in two dimensions.46 However, image formation in BOS imaging of FUS pressure beams differs from conventional BOS, in which the nonuniform refractive index field is static during image acquisition. Because the refractive index is proportional to pressure,47 it changes dynamically during a typical camera exposure time, so the background image is blurred rather than coherently displaced, and cross correlation cannot be directly applied to extract refractive index or pressure maps. To freeze time to a fixed phase in the ultrasound cycle, tomographic BOS FUS pressure beam mapping has been performed with a strobed light source,48,49 but these methods have not yet been validated beyond qualitative comparisons to hydrophone measurements, and a strobed light source again complicates the setup and limits signal-to-noise ratio.
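For contrast with the continuous-wave case, displacement recovery in conventional static-field BOS can be sketched as a circular cross-correlation; this integer-pixel toy version is only meant to illustrate why a coherent displacement is recoverable when the field is frozen:

```python
import numpy as np

def xcorr_shift(ref, img):
    """Estimate the integer-pixel displacement of a distorted background
    image relative to a reference by locating the circular
    cross-correlation peak (conventional static-field BOS)."""
    # Correlation theorem: the peak of IFFT(F(img) * conj(F(ref)))
    # sits at the displacement of img relative to ref.
    c = np.fft.ifft2(np.fft.fft2(img) * np.conj(np.fft.fft2(ref))).real
    iy, ix = np.unravel_index(np.argmax(c), c.shape)
    n = np.array(c.shape)
    # Wrap circular lags into the range [-N/2, N/2).
    wrapped = ((np.array([iy, ix]) + n // 2) % n) - n // 2
    return tuple(int(v) for v in wrapped)

# A background pattern shifted by (3, -2) pixels is recovered exactly.
rng = np.random.default_rng(0)
ref = rng.random((64, 64))
img = np.roll(ref, (3, -2), axis=(0, 1))
shift = xcorr_shift(ref, img)   # -> (3, -2)
```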

Here, we describe a rapid projection imaging method to quantitatively map FUS pressure beams based on continuous-wave background-oriented schlieren (CW-BOS) imaging. It requires only a water tank, a background pattern displayed on one side of the tank, and a camera to photograph the pattern through the other side of the tank [Fig. 2(a)]. The proposed method leverages the recent availability of tablet personal computers (PCs) with high-resolution displays and consumer-grade digital single-lens reflex (DSLR) cameras with high pixel density that can resolve the submillimeter blurring of the BOS background pattern at a distance, as well as deep learning techniques that can solve the difficult inverse problem of relating blurred photographs to projected pressure amplitudes. It can be implemented in a small and portable package with no moving parts to rapidly map FUS and focused imaging transducer root-mean-square (RMS) projected pressure beams in two dimensions, and there are no parts to experience wear from the FUS beam. As illustrated in Fig. 2(b), the background images are bed-of-nails patterns, where each dot is blurred by the ultrasound beam in a distinctive pattern that can be interpreted as a histogram of local image displacement over time and is related to the RMS projected pressure field along the line-of-sight dimension at the position of the dot. An acquisition is performed by taking photographs of blurred background patterns with FUS switched on and processing them through a deep neural network that relates the blurring pattern at each dot to a RMS projected pressure amplitude. The reconstruction is trained from simulated photographs, generated using acoustic simulations of similar FUS transducers. Figure 1 defines the RMS projected pressure measurement mathematically and, for illustration, shows how the RMS projected pressure amplitude would be obtained using hydrophone measurements with a motion stage.
The proposed method produces a RMS projected pressure map in seconds. It was evaluated in experiments using FUS transducers operating at 1.16 and 2.25 MHz for peak negative pressures (PNPs) ranging from 1 to 4 megapascals (MPa) and associated peak positive pressures (PPPs) ranging from 1.5 to 9 MPa, which overlaps with the range of pressures used in FUS thermal treatments.50–53 As will be shown, the RMS projected pressure follows the contours of the main- and sidelobes of the FUS pressure beam and is clearly disturbed when the beam is aberrated. The method was implemented and compared to fiber-optic hydrophone RMS projected pressure measurements to evaluate its feasibility, accuracy, and robustness.

FIG. 2.

The proposed CW-BOS projected pressure mapping method displays a bed-of-nails background pattern on one side of a water tank and photographs it from the other side of the tank. When FUS is switched on, the nails blur in a distinctive pattern that can be related to the projected pressure at each spatial location in the photo. (a) A 2D CW-BOS system comprises a glass tank filled with water that is acoustically coupled to the transducer, a tablet displaying a background pattern, and a camera to photograph the blurred pattern. (b) Acquired photographs without and with FUS and a hydrophone-measured RMS projected pressure map of the same FUS beam. In the photographs, the blurred nails are narrower and elongated along the beam propagation direction in the focus, while the nails are blurred diagonally on either side of the focus. Sound propagates from bottom to top in these images.


Figure 3 shows the hardware setup for CW-BOS FUS projected pressure beam mapping, which was built around an ultra-clear rimless water tank (Fragtastic Reef, Mankato, MN) made of 5 mm-thick aquarium-grade glass. The size of the glass tank was 31 × 19 × 19 cm3 (width × depth × height), and it was filled with degassed deionized water. To suppress reflections, an acoustic absorber (Aptflex F48, Precision Acoustics Ltd., Dorchester, Dorset, UK) was placed against the tank wall opposite the FUS transducer. Two FUS transducers were used in this study: a 6.32 cm-diameter 1.16 MHz transducer with focal length 6.3 cm and f-number 1 (H101, Sonic Concepts, Bothell, WA), and a 1.91 cm-diameter 2.25 MHz transducer with focal length 5.1 cm and f-number 2.67 (Valpey Fisher IL0206HP, Hopkinton, MA).

FIG. 3.

(a) Photograph of the CW-BOS measurement setup used in this study. The camera is in the lower right and was centered on the nominal FUS beam focus. The iPad (Apple Inc, Cupertino, CA) was placed against the opposite side of the water tank from the camera, the FUS transducer was mounted on the right side of the tank, and a blue acoustic absorber was mounted on the left side opposite the transducer to suppress reflections. (b) Electrical diagram of the setup, including a top-down depiction of the tank. An Arduino was used to open the camera shutter a fixed time period after triggering the waveform generator so that photos were taken once the FUS beam had reached a steady state. The experiment was coordinated by a MATLAB script (The MathWorks, Natick, MA), which set the waveform generator parameters and initiated an acquisition via the Arduino.


A 10.5 in. iPad Pro (Apple Inc, Cupertino, CA) was placed against one of the long sides of the tank and displayed bed-of-nails background images using a PYTHON script running in the Pythonista app (OMZ Software, Berlin, Germany). The experiment computer told the iPad which background image to display via Transmission Control Protocol/Internet Protocol (TCP/IP) commands sent over Wi-Fi. An EOS 80D 24.2 megapixel DSLR camera with an EF-S 17–55 mm f/2.8 IS USM lens (Canon Inc., Tokyo, Japan) was placed so that its body was 20 cm from the outer wall of the tank opposite the iPad and its aperture was 12 cm from the outer wall of the tank. The camera was connected to the experiment computer via a universal serial bus (USB) connection. The EOS Utility software (Canon Inc., Tokyo, Japan) was used to control the camera's settings and download photos from it. An Arduino Leonardo R3 microcontroller board (Arduino, Scarmagno, Italy; hereafter, Arduino) was used to open the camera shutter a fixed delay period after triggering the waveform generator so that photos were taken once the FUS beam had reached a steady state (timing described further below). The Arduino controlled the shutter via an analog switch (CD74HC4066E, Texas Instruments, Dallas, TX), which electrically closed the two switches (focus and shutter) of a modified wired manual shutter release connected to the camera's N3 connector, and it sent a transistor-transistor logic (TTL) trigger to the external trigger port of the FUS waveform generator (Keysight 33500B series, Santa Rosa, CA) to initiate the ultrasound pulse. The waveform generator's parameters were set using TCP/IP commands sent from the computer via an ethernet connection, and its output was connected to an amplifier with 55 dB gain (E&I Ltd., Rochester, NY) to drive the transducer.
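The waveform generator configuration step can be sketched as SCPI strings sent over a raw TCP socket; the specific commands below follow typical Keysight 33500B syntax and, along with the instrument address, are assumptions rather than details from the paper:

```python
import socket

def scpi_commands(freq_hz, vpp, n_cycles):
    """Build SCPI strings for a single externally triggered burst.
    The command names follow typical Keysight 33500B syntax and are
    assumptions, not taken from the paper."""
    return [
        "SOUR1:FUNC SIN",
        f"SOUR1:FREQ {freq_hz}",
        f"SOUR1:VOLT {vpp} VPP",
        "SOUR1:BURS:STAT ON",
        "SOUR1:BURS:MODE TRIG",
        f"SOUR1:BURS:NCYC {n_cycles}",
        "TRIG1:SOUR EXT",        # fire on the Arduino's TTL pulse
        "OUTP1 ON",
    ]

def send_commands(host, cmds, port=5025):
    """Send newline-terminated SCPI commands over a raw TCP socket
    (port 5025 is the conventional SCPI-over-LAN port)."""
    with socket.create_connection((host, port), timeout=5) as s:
        for c in cmds:
            s.sendall((c + "\n").encode())

cmds = scpi_commands(1.16e6, 0.200, 100_000)
# send_commands("192.168.0.10", cmds)   # hypothetical instrument address
```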

Acquisitions were initiated by the experiment computer. When instructed by the computer, the Arduino opened the camera shutter, waited 50 ms, and then sent a TTL pulse to the waveform generator to generate a 100 000-cycle, 86 ms pulse at 1.16 MHz or a 150 000-cycle, 67 ms pulse at 2.25 MHz. The 50 ms delay was set based on an observed 75–100 ms delay between when the Arduino commanded the shutter to open and when it actually opened, and the pulse durations were set long enough to accommodate some variability in the shutter delay while ensuring that photos were captured when the pressure field was at steady state; using a shorter delay caused the shutter to sometimes open after the pulse had already shut off. The camera settings were: image size 4000 × 6000 pixels, International Organization for Standardization (ISO) 640, shutter speed 1/800 s (1.25 ms), and f-number f/5. The photographs were saved on the computer in the RAW image format. The shutter speed corresponded to 1450 cycles for the 1.16 MHz transducer and 2813 cycles for the 2.25 MHz transducer. During the experiments, the whole measurement setup was covered by a black cloth to suppress ambient light; therefore, the iPad provided the only illumination.
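The quoted cycle counts and pulse durations follow directly from the shutter speed and drive frequencies:

```python
def cycles_in_exposure(f0_hz, shutter_s):
    """Acoustic cycles captured during one camera exposure."""
    return f0_hz * shutter_s

def pulse_duration_ms(n_cycles, f0_hz):
    """Duration of an n-cycle burst in milliseconds."""
    return 1e3 * n_cycles / f0_hz

shutter = 1 / 800                              # 1.25 ms exposure
c1 = cycles_in_exposure(1.16e6, shutter)       # 1450 cycles
c2 = cycles_in_exposure(2.25e6, shutter)       # 2812.5 (~2813) cycles
d1 = pulse_duration_ms(100_000, 1.16e6)        # ~86 ms
d2 = pulse_duration_ms(150_000, 2.25e6)        # ~67 ms
```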

The background images displayed by the iPad were bed-of-nails patterns comprising black dots/nails on a regular grid with a white background. The size of each dot was 2 pixels × 2 pixels. The distance between consecutive dots in each direction was 16 pixels, which corresponded to a physical distance of 1.7 mm and was set based on the maximum expected displacement in the experiments. To obtain a high-resolution projected pressure map, the background image was moved in intervals of 4 pixels in the x and z dimensions as illustrated in Fig. 4, and a total of 16 photos were acquired, 1 for each image position. A single high-resolution (0.425 × 0.425 mm2) RMS projected pressure map was calculated from the 16 photos as described below.
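The 16-photo interleaving can be sketched as follows; the indexing convention (which pattern offset maps to which fine-grid row and column) is an illustrative assumption:

```python
import numpy as np

# Background-pattern geometry from the paper.
DOT_PITCH_PX = 16        # display pixels between dots
SHIFT_STEP_PX = 4        # pattern shift between photos
PX_TO_MM = 1.7 / 16      # 16 display pixels span 1.7 mm

# The 16 (x, z) pattern offsets, one per photo.
offsets = [(ix * SHIFT_STEP_PX, iz * SHIFT_STEP_PX)
           for ix in range(4) for iz in range(4)]

def tile_maps(per_photo_maps):
    """Interleave 16 coarse per-photo maps (one RMS value per dot) into
    a single map with 4x finer sampling in each dimension.
    per_photo_maps : dict mapping (ox, oz) pixel offsets -> 2D arrays
                     of shape (n_dots_z, n_dots_x)."""
    nz, nx = next(iter(per_photo_maps.values())).shape
    fine = np.zeros((4 * nz, 4 * nx))
    for (ox, oz), m in per_photo_maps.items():
        fine[oz // SHIFT_STEP_PX::4, ox // SHIFT_STEP_PX::4] = m
    return fine

# Demo: tile 16 constant maps (value = photo index) into one fine map.
maps = {off: np.full((2, 3), float(i)) for i, off in enumerate(offsets)}
fine_map = tile_maps(maps)
resolution_mm = SHIFT_STEP_PX * PX_TO_MM   # 0.425 mm
```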

FIG. 4.

To measure high-resolution RMS projected pressure maps, multiple CW-BOS photos were collected over a range of grid translations in the x and z directions. The reconstructed RMS projected pressure values were then tiled into the final map.


The following acquisitions were performed to evaluate the proposed CW-BOS RMS projected pressure beam mapping method:

  • Two FUS frequencies. FUS frequencies range from hundreds of kHz to several MHz. To demonstrate the CW-BOS method at different frequencies, mapping was performed with the 1.16 and 2.25 MHz transducers and compared to fiber-optic hydrophone measurements. The waveform generator voltages (before the 55 dB power amplifier gain) were 200 mV peak-to-peak (mVpp; 1.16 MHz) and 100 mVpp (2.25 MHz). The acquisitions were repeated five times for averaging.

  • Pressure amplitude. To evaluate the method across a range of pressure amplitudes, photos were acquired with the 1.16 MHz transducer with waveform generator voltages between 0 and 200 mVpp in 25 mVpp steps. The acquisitions were repeated five times for averaging.

  • Signal-to-noise ratio (SNR) and number of averages. To investigate how the number of signal averages (NSA) affects the SNR of CW-BOS maps, an acquisition was repeated ten times with the 1.16 MHz transducer and waveform generator voltages of 50, 100, 150, and 200 mVpp. Each acquisition was reconstructed separately to a RMS projected pressure map, which was then averaged with other reconstructions at the same generator voltage.

  • Rotation and translation. User error could introduce rotations and displacements between the camera and the FUS beam; thus, CW-BOS should be robust to a reasonable range of such errors. To verify robustness to beam rotations, five averages were acquired with the 1.16 MHz transducer rotated by 15° and 30° about the line-of-sight axis with a waveform generator voltage of 200 mVpp. To verify robustness to beam translations, five more averages were acquired with the camera translated +2.5 cm and −2.5 cm along the z dimension with a waveform generator voltage of 150 mVpp.

  • Aberrations. An important potential application of the method is to detect aberrations and measure aberrated beams in research and the clinic. To verify the ability of CW-BOS to detect aberrated beams, an acoustic aberrator67 made from silicone (Elite Double 8, Zhermack, Badia Polesine, Italy) was constructed and placed against the bottom half or the left half of the transducer to aberrate the pressure field in the x-z plane or the y-z plane, respectively. This material was chosen based on previous experience showing that it can be molded into the transducer's shape and used to aberrate beams without heating the transducer, and that it stays on the transducer surface when submerged in water. Five averages were acquired for each of these two cases with the 1.16 MHz transducer and a waveform generator voltage of 200 mVpp.

The acquired photos were segmented into small rectangular patches around each dot using MATLAB's bwmorph() function (The MathWorks, Natick, MA); the RMS projected pressure was then calculated by the neural network for each dot as described below, and those values were tiled into the final reconstructed beam map. With the camera placed a total distance of 31 cm from the iPad, each rectangular patch comprised between 42 × 42 and 46 × 46 pixels and was upsampled to 54 × 54 pixels for reconstruction. To avoid optical color dispersion, only the green channel of the photos was used for reconstruction; it has the largest weight among the red, green, and blue channels in the luma calculation (0.299R + 0.587G + 0.114B) of the Rec. ITU-R BT.601-7 video standard.54 The total scan time for five averages was 3–5 min and was dominated by the time taken to transfer data from the camera to the PC. Without these delays, the total scan time was approximately 8 s (16 photos × 100 ms of FUS on-time × 5 averages).
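The per-dot patch extraction can be sketched as below; the nearest-neighbour upsampling and the patch-centering convention are simplifying assumptions, since the paper does not specify the interpolation used:

```python
import numpy as np

def dot_patch(rgb_photo, center, half=23, out_size=54):
    """Cut a square patch around one blurred dot and resample it to the
    54 x 54 size used by the reconstructor. Only the green channel is
    kept to avoid chromatic dispersion.

    rgb_photo : ndarray (rows, cols, 3)
    center    : (row, col) of the dot in the photo
    """
    g = rgb_photo[..., 1]                       # green channel only
    r, c = center
    patch = g[r - half:r + half, c - half:c + half]
    # Nearest-neighbour upsampling to out_size x out_size.
    idx = (np.arange(out_size) * patch.shape[0] / out_size).astype(int)
    return patch[np.ix_(idx, idx)]

photo = np.random.default_rng(1).random((200, 200, 3))
patch54 = dot_patch(photo, (100, 100))
```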

For validation, pressure maps were also measured using calibrated fiber-optic hydrophone sensors (136–10 T and 132–03, Precision Acoustics Ltd., Dorchester, Dorset, UK) connected to a fiber-optic hydrophone system control unit.55 The hydrophone measurements were performed in the same water tank as CW-BOS, using a Picoscope (model 5242B, Pico Technology, St Neots, UK) to record the output of the system control unit to the computer and a 3D motion stage (Image Guided Therapy, Bordeaux, France) to position the probe. The motion stage and the Picoscope were both controlled by the experiment computer via USB, and the motion stage was fully stopped at each spatial location prior to recording. A 100-cycle pulse was generated by the waveform generator for both the 1.16 and 2.25 MHz measurements to provide enough cycles that the waves from each part of the transducer had reached the measurement volume. The measured voltage waveforms were converted to pressure by dividing them by the probe sensitivities given below. As illustrated in Fig. 1, to calculate reference RMS projected pressure maps from the hydrophone data, the synchronized hydrophone measurements were integrated along the line-of-sight (y) dimension to calculate the projected pressure waveforms, and the RMS projected pressure was then calculated using five cycles, which were extracted from the hydrophone measurements at a time when the amplitudes at all measurement positions in the 3D volume were stable. The 1.16 MHz maps were measured over a 10 × 10 × 30 (x × y × z) mm3 volume with a waveform generator voltage of 200 mVpp and step sizes of 0.25 mm in x, y, and z. The 132–03 probe was used for the 1.16 MHz measurements and had a sensitivity of 162.27 mV/MPa and a background noise standard deviation of 5.4 mV. The 2.25 MHz map was measured over a 10 × 10 × 47.5 mm3 volume with a waveform generator voltage of 100 mVpp and step sizes of 0.25 mm in x and y and 0.5 mm in z. The 136–10 T probe was used for the 2.25 MHz measurements and had a sensitivity of 118.05 mV/MPa and a background noise standard deviation of 3.5 mV.
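The voltage-to-pressure conversion is a single division by the probe sensitivity:

```python
def voltage_to_pressure_mpa(v_mv, sensitivity_mv_per_mpa):
    """Convert a hydrophone voltage (mV) to pressure (MPa) by dividing
    by the probe sensitivity."""
    return v_mv / sensitivity_mv_per_mpa

# Probe sensitivities from the paper.
SENS_132_03 = 162.27     # mV/MPa, probe used at 1.16 MHz
SENS_136_10T = 118.05    # mV/MPa, probe used at 2.25 MHz

p_mpa = voltage_to_pressure_mpa(324.54, SENS_132_03)   # ~2.0 MPa
```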

1D hydrophone measurements were also taken along z with a 0.25 mm step size through the 1.16 MHz transducer's focus with waveform generator voltages of 50, 100, 150, and 200 mVpp for comparison with CW-BOS acquisitions across pressure amplitudes.

Figure 2(a) shows that when a background image displayed by the tablet is photographed through the water tank by the camera, the image is distorted due to the refraction of light rays as they travel through the water from the tablet to the camera lens. The refraction angle in each of the photographed dimensions (x and z) is determined by the refractive index of the water,

$$\theta_{x,z}(x,z,t) \;=\; \frac{1}{n_0}\int \frac{\partial n(x,y,z,t)}{\partial (x,z)}\,dy, \tag{1}$$

where n0 is the ambient refractive index of water, n is the 3D refractive index field, and y is the projected (line-of-sight) dimension. The 3D refractive index field (n) is proportional to the acoustic pressure p (Ref. 47),

$$n(x,y,z,t) \;=\; n_0 + \frac{\partial n}{\partial p}\,p(x,y,z,t), \tag{2}$$

where $\partial n/\partial p = 1.4636 \times 10^{-11}\ \mathrm{Pa}^{-1}$ is the adiabatic piezo-optic coefficient.56,57 Assuming the light ray is deflected as it passes through the FUS beam's refractive index field and then continues across a distance D in the y dimension before being recorded by the camera, the image displacement at a location (x,z) in the photograph of the tablet's image is obtained by substituting Eq. (2) into Eq. (1) and scaling by D,

$$\Delta_{x,z}(x,z,t) \;=\; K\,\frac{\partial}{\partial (x,z)}\int p(x,y,z,t)\,dy, \tag{3}$$

where $K = (D/n_0)(\partial n/\partial p)$. In this equation, the integral of the 3D pressure field along the projected dimension (y) is the projected pressure field $P_{\mathrm{proj}}(x,z,t)$. This forward model was used to calculate simulated CW-BOS histograms to train the neural network reconstructor as described below.
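Equation (3) can be evaluated numerically with finite differences; in this sketch, the ambient refractive index n0 ≈ 1.333 is an assumed value for water, and the Gaussian projected-pressure field is purely illustrative:

```python
import numpy as np

# Constants for the Eq. (3) scale factor K = (D/n0)(dn/dp).
N0 = 1.333          # ambient refractive index of water (assumed)
DNDP = 1.4636e-11   # adiabatic piezo-optic coefficient, 1/Pa
D = 0.085           # focus-to-tablet distance for this setup, m
K = (D / N0) * DNDP

def bos_displacement(p_proj, dx, dz):
    """Image displacements from Eq. (3): K times the transverse
    gradients of the projected pressure field at one time instant.
    p_proj : 2D array, projected pressure in Pa*m (rows = z, cols = x)
    dx, dz : grid spacings, m
    Returns (disp_x, disp_z) in metres at the tablet plane."""
    gz, gx = np.gradient(p_proj, dz, dx)
    return K * gx, K * gz

# Illustrative Gaussian "beam": 1 MPa peak integrated over a 1 cm path.
x = np.linspace(-5e-3, 5e-3, 101)
z = np.linspace(-5e-3, 5e-3, 101)
X, Z = np.meshgrid(x, z)
p_proj = 1e6 * 1e-2 * np.exp(-(X**2 + Z**2) / (1e-3) ** 2)
disp_x, disp_z = bos_displacement(p_proj, x[1] - x[0], z[1] - z[0])
```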

To generate the FUS-blurred background images used to train the reconstructor, spatially and temporally resolved steady-state FUS pressure fields with nonlinearity were simulated using a modified angular spectrum method58 with frequency domain attenuation and dispersion, absorbing boundary layers, and an adaptive propagation step size. An operator splitting term was used to separate the terms in the retarded-time formulation of the nonlinear angular spectrum equation.59 The attenuation and dispersion term was solved directly in the frequency domain using a filtering approach. The nonlinear term was solved in the time domain using a Rusanov scheme to accurately capture the shock front in a flux-conservative fashion.60 Whereas the dispersion modeling performed by this general simulation tool was not necessary in water, the attenuation modeling was important for correctly describing the shock wave physics, particularly at high frequencies; without it, simulated shock rise times would be significantly longer than in reality. A dispersionless boundary with a quartic increase in attenuation was implemented at the boundaries over a region of 10λ. This absorbing boundary layer was placed 20λ away from the outermost edges of the active transducer area so that it would not affect the wave propagation in the transducer footprint. This simulation method has been validated experimentally.61 The simulations used a speed of sound in water of c0 = 1500 m/s, λ/8 grid spacing in the dimensions transverse to beam propagation (0.16 mm at 1.16 MHz and 0.08 mm at 2.25 MHz), λ/4 grid spacing in the axial/propagation dimension (0.32 mm at 1.16 MHz and 0.17 mm at 2.25 MHz), a nonlinearity coefficient of β = 3.5, an equilibrium density ρ0 = 1000 kg/m3, and a dwell time (temporal step size) of 1/(40f0) (21.5 ns at 1.16 MHz and 11.1 ns at 2.25 MHz).
The beams were simulated over a 9.5 × 9.5 × 9.5 cm3 volume for the larger 1.16 MHz transducer and a 2.86 × 2.86 × 7.6 cm3 volume for the smaller 2.25 MHz transducer. The simulated transducers generated Hamming-windowed 48-cycle pulses, and the middle cycle was saved at each spatial location, representing the steady state. A total of 34 simulations were run for the 1.16 MHz transducer for focal PNPs between 1 and 11.5 MPa, associated focal PPPs between 1.2 and 21.5 MPa, and f-numbers of 0.5 (focal length = 3.2 cm) and 1 (focal length = 6.3 cm). A total of 36 simulations were run for the 2.25 MHz transducer for focal PNPs between 0.9 and 9.7 MPa, associated focal PPPs between 1.3 and 9.7 MPa, and f-numbers of 1.335 (focal length = 2.5 cm) and 2.67 (focal length = 5.1 cm).
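The quoted grid spacings and dwell times follow from the simulation rules (λ/8 transverse, λ/4 axial, 1/(40 f0) temporal):

```python
C0 = 1500.0   # speed of sound in water, m/s

def sim_grid(f0_hz):
    """Discretization used for the training simulations: lambda/8
    transverse, lambda/4 axial, and a 1/(40*f0) dwell time."""
    lam = C0 / f0_hz
    return {"transverse_m": lam / 8,
            "axial_m": lam / 4,
            "dwell_s": 1.0 / (40 * f0_hz)}

g1 = sim_grid(1.16e6)   # ~0.16 mm, ~0.32 mm, ~21.5 ns
g2 = sim_grid(2.25e6)   # ~0.08 mm, ~0.17 mm, ~11.1 ns
```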

The training data for the reconstructor comprised simulated CW-BOS histograms paired with their RMS projected pressure amplitudes. First, projected pressure waveforms were calculated by integrating the beams along the y dimension, and RMS projected pressure amplitudes were calculated from those waveforms. Then, the associated histograms were calculated as illustrated in Fig. 5. To calculate the histogram for each simulated (x, z) location, image displacements were calculated from the projected pressure waveforms using Eq. (3), which required finite differencing the y-projected pressure fields in the x and z dimensions and scaling the result by the distance D between the focus and the tablet screen (D = 8.5 cm for our hardware setup). Then, for each time instant, a distorted image was computed by shifting the spatial location's dot by the calculated image displacements in each direction, and the distorted images were averaged over one ultrasound period to obtain a final simulated BOS histogram image. The simulated histograms were convolved with a point spread function measured from an undistorted (no-FUS) photograph taken with our system. The histograms were individually normalized to zero mean and unit standard deviation, and the RMS projected pressures were collectively normalized. The histogram dimensions were 54 × 54 pixels, which corresponded to a spatial area of size 1.7 × 1.7 mm2 on the screen of the tablet, with a pixel width of 0.024 mm. The 1.16 MHz training data comprised a total of N = 923 174 examples, and the 2.25 MHz training data comprised a total of N = 1 777 987 examples.
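The histogram simulation loop can be sketched as below; displacements are rounded to whole pixels here for simplicity, whereas subpixel shifting would be needed in practice:

```python
import numpy as np

def simulate_histogram(dot, disp_x_px, disp_z_px):
    """Simulate a CW-BOS blur 'histogram' (Fig. 5): shift the unblurred
    dot by each time point's (x, z) displacement and average over one
    cycle. Displacements are rounded to whole pixels.
    dot : 2D array containing the undistorted dot
    disp_x_px, disp_z_px : per-time-step displacements in pixels"""
    acc = np.zeros_like(dot, dtype=float)
    for ux, uz in zip(disp_x_px, disp_z_px):
        acc += np.roll(dot, (int(np.round(uz)), int(np.round(ux))),
                       axis=(0, 1))
    return acc / len(disp_x_px)

# One cycle of sinusoidal displacement traces out a blur pattern.
dot = np.zeros((54, 54))
dot[26:28, 26:28] = 1.0
t = np.linspace(0, 1, 64, endpoint=False)
hist = simulate_histogram(dot, 8 * np.sin(2 * np.pi * t),
                          4 * np.cos(2 * np.pi * t))
```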

FIG. 5.

Simulated histogram calculation procedure. The background image displacements at position (x_i, z_j) in a simulated FUS pressure field were calculated using the projected pressure waveforms and applied to the unblurred dot to shift it in x and z for each simulated time point t_k. Then, the displaced dot patterns were averaged over one ultrasound cycle to obtain the 2D histogram.


Prior to inputting them to the reconstructor network, the training histograms were compressed to a dimensionality smaller than their number of pixels by projecting them to a subspace derived by singular value decomposition (SVD) truncation.62,63 For each FUS frequency, a dictionary was formed from all the training data by reshaping the histograms to length-M row vectors d ∈ ℝ^(1×M), where M = 54 × 54 = 2916, and stacking them into a dictionary matrix D ∈ ℝ^(N×M), where N is the number of training examples. The matrix D was decomposed by SVD into the product of three matrices, D = USVᵀ, where U ∈ ℝ^(N×M) is an orthonormal matrix containing the left singular vectors, S ∈ ℝ^(M×M) is a diagonal matrix containing the singular values, and V ∈ ℝ^(M×M) is an orthonormal matrix of right singular vectors. A lower-dimensional compressed subspace was obtained by truncating the SVD to its first K singular values (K = 137 at 1.16 MHz and K = 97 at 2.25 MHz, where K was set to retain 99.999% of the sum of squared singular values compared to no truncation), and the columns of the resulting truncated right singular vector matrix V_K ∈ ℝ^(M×K) spanned this lower-dimensional subspace. Thereafter, each training and experimental BOS histogram d was projected to the lower-dimensional subspace by multiplying it with the matrix V_K to obtain its compressed coefficients c = dV_K. The coefficient vector c was the input to the neural network, which output the RMS projected pressure value corresponding to that histogram, as illustrated in Fig. 6 and described further below. The matrices V_K were stored and used to compress experimentally measured histograms prior to reconstruction of their RMS projected pressures.
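The SVD truncation and projection steps above can be sketched directly in NumPy; here random data stand in for the real training histograms, and the energy threshold follows the 99.999% criterion described in the text.

```python
import numpy as np

# D: dictionary of N training histograms, each flattened to length M = 54 * 54
rng = np.random.default_rng(0)
D = rng.standard_normal((1000, 54 * 54))          # stand-in for real training data

U, s, Vt = np.linalg.svd(D, full_matrices=False)  # D = U @ diag(s) @ Vt
energy = np.cumsum(s**2) / np.sum(s**2)
K = int(np.searchsorted(energy, 0.99999) + 1)     # retain 99.999% of sum-of-squares
VK = Vt[:K].T                                     # M x K truncated right singular vectors

d = D[0]             # one histogram, as a row vector
c = d @ VK           # its K compressed coefficients: the network input
d_approx = c @ VK.T  # its reconstruction within the compressed subspace
```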

FIG. 6.

Schematic representation of the neural network-based pressure reconstruction for each photographed histogram. Each histogram in a photo is segmented into a 54 × 54 sub-image and projected to the compressed subspace, and its K (K ≪ M) coefficients in that subspace are input to a deep neural network with a fully connected input layer and three fully connected hidden layers with hyperbolic tangent activations. These feed a single-node layer that outputs the final estimated RMS projected pressure value, which is tiled into the final beam map.


To reconstruct RMS projected pressure maps from a set of photographs, each histogram is projected into the compressed SVD subspace as shown in Fig. 6, and the resulting length-K coefficient vector c is input to a deep neural network with a fully connected input layer and three fully connected hidden layers (FC1–FC3 in Fig. 6). The input and hidden layers each have K nodes, and each is followed by a hyperbolic tangent activation function. The output layer has a single node with a linear activation function, whose output is the RMS projected pressure for the input histogram coefficients.
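A minimal NumPy sketch of this forward pass follows; the weights here are random, untrained stand-ins used only to exercise the layer shapes, whereas the real network is trained as described below.

```python
import numpy as np

def reconstruct_rms(c, params):
    """Forward pass of the reconstructor in Fig. 6: a K-node input layer
    and three K-node hidden layers (FC1-FC3), each followed by a tanh
    activation, feeding a single linear output node."""
    h = c
    for W, b in params[:-1]:
        h = np.tanh(h @ W + b)      # fully connected layer + tanh
    W, b = params[-1]
    return float(h @ W + b)         # linear output: RMS projected pressure

K = 137  # compressed subspace size at 1.16 MHz
rng = np.random.default_rng(0)
sizes = [K, K, K, K, K, 1]          # coefficients -> input, FC1-FC3 -> output
params = [(0.1 * rng.standard_normal((m, n)), np.zeros(n))
          for m, n in zip(sizes[:-1], sizes[1:])]
p_rms = reconstruct_rms(rng.standard_normal(K), params)
```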

A network was trained for each frequency using the simulated histograms and their associated RMS projected pressures in Keras64 on the TensorFlow deep learning framework65 using two NVIDIA graphics processing units (NVIDIA, Santa Clara, CA) for 20 epochs (approximately 1 h) on Vanderbilt University's parallel computing cluster (Advanced Computing Center for Research and Education, Vanderbilt University, Nashville, TN). Each epoch comprised 400 training steps. The RMSProp optimization algorithm (Ref. 66) was used with mini-batches of size 1024, a learning rate of 0.00005, momentum 0.0, and decay 0.9. The mean squared error was used as the loss function for training. An additional L1-norm penalty (λ1 = 0.00002) was used to promote sparsity of the weights in the input layer, and an L2-norm penalty (λ2 = 0.0002) was used to prevent overfitting in each layer. Keras's real-time augmentation was used to rotate the training histograms by 0°–30° to achieve robustness to transducer rotations. The final network weights occupied 662 kB of disk space for the 1.16 MHz transducer and 230 kB for the 2.25 MHz transducer; the sizes differed due to the different sizes of the SVD-compressed subspaces for each frequency. Given all the acquired photos, the beam map reconstructions each took approximately 20 s of computation on a desktop computer with a 4.2 GHz Intel Core i7 central processing unit (CPU) and 32 GB of 2400 MHz DDR4 random access memory (RAM; iMac, Apple Inc., Cupertino, CA). Note that while hydrophone measurements were used to evaluate the proposed method, they are not required by it and were not used to calibrate or train it. MATLAB and Python codes for network training and reconstruction are available at https://github.com/wgrissom/zebrography.
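The training objective described above can be written out explicitly. The sketch below assumes the network is stored as a list of (weights, bias) pairs; in Keras the L1 and L2 penalties are attached per layer as regularizers, and they are collapsed into one function here purely for illustration.

```python
import numpy as np

def training_loss(pred, target, params, lam1=2e-5, lam2=2e-4):
    """Mean squared error plus the penalties described above: an L1
    penalty (lambda1) on the input-layer weights to promote sparsity,
    and an L2 penalty (lambda2) on the weights of every layer."""
    mse = np.mean((pred - target) ** 2)
    l1 = lam1 * np.abs(params[0][0]).sum()              # input layer only
    l2 = lam2 * sum((W ** 2).sum() for W, _ in params)  # all layers
    return mse + l1 + l2

# tiny example with two (weights, bias) layers of ones
params = [(np.ones((2, 2)), np.zeros(2)), (np.ones((2, 1)), np.zeros(1))]
loss = training_loss(np.array([1.0]), np.array([0.0]), params)
```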

Figure 7 shows a comparison of simulated and hydrophone-measured 1.16 MHz PNP maps in x-y and x-z planes that cut through the transducer's focus. The hydrophone measurements were taken with a waveform generator voltage of 200 mVpp, and the simulated transducer's source amplitude was adjusted to match the hydrophone-measured focal PNP (4.38 MPa). Table I lists amplitudes and main-lobe widths for the simulated and measured beams, which correspond closely, supporting the use of the simulations to train the CW-BOS reconstruction network.

FIG. 7.

Simulated and optical hydrophone-measured 2D PNP maps at 1.16 MHz. (Top) x-y maps cutting through the peak of the focus in z. (Bottom) x-z maps cutting through the peak of the focus in y.

TABLE I.

Quantitative comparisons of simulated and hydrophone-measured 2D pressure fields at 1.16 MHz.

                                      Simulation    Hydrophone
Focal PNP (MPa)                           4.38          4.39
Focal PPP (MPa)                           8.04          8.03
PNP main-lobe FWHM^a (x axis, mm)          2.3           2.2
PNP main-lobe FWHM^a (y axis, mm)          2.2           2.2
PNP main-lobe FWHM^a (z axis, mm)         16.2          14.6

^a Full width at half-maximum (FWHM).

The hydrophone-measured PNPs produced at 1.16 and 2.25 MHz were 4.5 and 1.4 MPa, respectively, and the PPPs were 8.0 and 2.7 MPa, respectively. Figure 8(a) shows the hydrophone-measured RMS projected pressure map (left), the reconstructed CW-BOS RMS projected pressure map (middle), and their difference (right) for the 1.16 MHz transducer. The amplitudes and shapes of the hydrophone and CW-BOS beams matched closely, with a root-mean-squared error (RMSE) of 316 Pa m calculated over the whole 2D map, or 5.1% of the hydrophone-measured peak amplitude, and the difference in spatial peak projected pressure was 239 Pa m (3.8% of the hydrophone-measured spatial peak). The main-lobe full widths at half-maximum (FWHMs) were 1.5 mm (hydrophone) versus 1.5 mm (CW-BOS) in the x dimension (perpendicular to the beam axis) and 11.4 mm (hydrophone) versus 11.3 mm (CW-BOS) in the z dimension (parallel to the beam axis). Figure 8(b) shows the hydrophone (left) and CW-BOS (middle) RMS projected pressure maps and their difference (right) for the 2.25 MHz transducer. The RMSE between the two was 209 Pa m, or 8.6% of the hydrophone-measured peak amplitude, and the difference in spatial peak projected pressure was 29 Pa m (1.2% of the hydrophone-measured spatial peak). The main-lobe FWHMs in x were 2.1 mm (hydrophone) versus 1.8 mm (CW-BOS), and the main-lobe full widths at 75% of maximum in z were 31.2 mm (hydrophone) versus 32.2 mm (CW-BOS). Table II summarizes these comparisons. The total hydrophone scan times to generate these maps were approximately 6.7 h for the 1.16 MHz transducer (200 000 spatial locations) and 5.5 h for the 2.25 MHz transducer (160 000 spatial locations).
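The error metrics used in these comparisons are straightforward to compute; the following sketch evaluates them on toy 2 × 2 maps, with the map values being arbitrary placeholders rather than measured data.

```python
import numpy as np

def map_errors(hydrophone, cwbos):
    """Error metrics used above: RMSE over the whole 2D map, RMSE as a
    percent of the hydrophone-measured peak amplitude, and the
    difference in spatial peak amplitude (inputs and outputs in Pa m)."""
    rmse = np.sqrt(np.mean((hydrophone - cwbos) ** 2))
    peak = hydrophone.max()
    return rmse, 100.0 * rmse / peak, abs(cwbos.max() - peak)

# toy maps just to exercise the function
rmse, pct, dpeak = map_errors(np.array([[0.0, 4.0], [0.0, 0.0]]),
                              np.array([[0.0, 3.0], [0.0, 0.0]]))
```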

FIG. 8.

Comparison of RMS projected pressure maps measured using an optical hydrophone and CW-BOS for two transducers at different frequencies. The beam propagation direction is from bottom to top. (a) Measured maps and their difference for the 1.16 MHz transducer. (b) Measured maps and their difference for the 2.25 MHz transducer.

TABLE II.

Quantitative comparisons of hydrophone and CW-BOS measured RMS projected pressure maps.

                                                              1.16 MHz                  2.25 MHz
                                                        Hydrophone    CW-BOS      Hydrophone    CW-BOS
RMSE (Pa m)                                                    209 (combined)            209 (combined)
Percentage RMSE^a (%)                                          5.1                       8.6
RMS projected pressure amplitude in the focus (Pa m)          6003      6242            2424      2453
Main-lobe full width at 50% maximum (x axis, mm)               1.5       1.5             2.1       1.8
Main-lobe full width at 50% (or 75%) maximum (z axis, mm)  11.4 (50%)  11.3 (50%)   31.2 (75%)  32.2 (75%)

^a RMSE as a percent of the hydrophone-measured peak amplitude.

Figure 9(a) shows reconstructed RMS projected pressure maps across a range of waveform generator driving voltage amplitudes. Figure 9(b) plots the mean across five averages of the RMS projected pressure in the focus [indicated by the arrow in Fig. 9(a)] at each amplitude, along with optical hydrophone measurements taken at waveform generator voltages of 50, 100, 150, and 200 mVpp. The error bars represent the standard deviation of the values over the five averages. The fitted slopes of the RMS projected pressure amplitudes were 32.2 Pa m/mVpp (hydrophone) versus 32.3 Pa m/mVpp (CW-BOS), and the Pearson's r-value between the hydrophone and CW-BOS measurements was 0.996. Figure 9(c) plots pressure waveforms measured by the optical hydrophone at the focus. For waveform generator amplitudes of 50, 100, 150, and 200 mVpp, the focal PNPs measured from those waveforms were 1.5, 2.5, 3.6, and 4.5 MPa, respectively, and the PPPs were 1.7, 3.7, 6.4, and 9.2 MPa, respectively.
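The linearity analysis reduces to a least-squares line fit and a Pearson correlation. The sketch below uses hypothetical pressure values generated from the reported hydrophone slope of 32.2 Pa m/mVpp as stand-ins for the Fig. 9(b) data.

```python
import numpy as np

# hypothetical voltage/pressure pairs following the fitted hydrophone slope
volts = np.array([50.0, 100.0, 150.0, 200.0])  # waveform generator mVpp
p_rms = 32.2 * volts                            # focal RMS projected pressure, Pa m

slope, intercept = np.polyfit(volts, p_rms, 1)  # least-squares line, Pa m per mVpp
r = np.corrcoef(volts, p_rms)[0, 1]             # Pearson's r
```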

FIG. 9.

CW-BOS pressure maps versus waveform generator driving voltage amplitude. (a) Reconstructed RMS projected pressure maps for the 1.16 MHz transducer. (b) RMS projected pressure at the focus measured by the hydrophone and CW-BOS, where the CW-BOS values were averaged across five repeated measurements, and the error bars represent standard deviation across the measurements. (c) Plots of hydrophone-measured waveforms at the focus for four waveform generator voltage amplitudes.


Figure 10 shows that noise is reduced by averaging reconstructions from repeated CW-BOS acquisitions. Figure 10(a) shows reconstructed CW-BOS RMS projected pressure maps resulting from one, two, four, and eight averages for a waveform generator voltage of 200 mVpp. Eight is the largest number of averages shown in Fig. 10(a); that map was indistinguishable from the map resulting from the full ten averages (not shown). The maps became smoother as noise was reduced with increased averaging. Figure 10(b) plots the RMS error between maps reconstructed from successive numbers of averages (i + 1 versus i, for one to ten averages) in a 5.6 × 20 mm2 region centered on the focus for each waveform generator voltage amplitude. The differences stopped changing significantly after five averages. Figure 10(c) plots the SNR around the focus of reconstructed 200 mVpp maps with 1–10 averages. Here, SNR was calculated as the ratio of the signal amplitude in the middle of the focus to the standard deviation in background regions without significant projected pressure. With one acquisition the SNR was 50; five averages improved it by 30 to 80 (∼60%), and ten averages improved it by only another 10 (∼10%). For this reason, five averages were used for the other experiments. The SNR of the hydrophone RMS projected pressure map of the 1.16 MHz transducer at 200 mVpp was 407, supporting the use of hydrophone measurements as references for CW-BOS validation.
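The SNR definition and the effect of averaging can be sketched on synthetic data; the 1D "beam," noise level, and mask below are hypothetical values chosen only to mimic the reported single-acquisition SNR of about 50.

```python
import numpy as np

def snr(beam_map, focus_idx, bg_mask):
    """SNR as defined above: signal amplitude at the focus divided by the
    standard deviation in background regions with no significant
    projected pressure."""
    return beam_map[focus_idx] / beam_map[bg_mask].std()

rng = np.random.default_rng(1)
x = np.arange(64)
beam = 6000.0 * np.exp(-((x - 32) ** 2) / 18.0)       # synthetic focus, Pa m
stack = beam + 120.0 * rng.standard_normal((10, 64))  # 10 repeated acquisitions
bg = np.abs(x - 32) > 12                              # background mask

snr_1 = snr(stack[0], 32, bg)                # single acquisition
snr_5 = snr(stack[:5].mean(axis=0), 32, bg)  # 5 averages: noise std / sqrt(5)
```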

FIG. 10.

SNR and reconstruction from different numbers of averages. (a) Reconstructed RMS projected pressure maps across numbers of averages in powers of two with a waveform generator voltage of 200 mVpp at 1.16 MHz. (b) Mean-squared error between maps resulting from i + 1 versus i averages in a 5.6 × 20 mm2 region around the focus for four waveform generator voltage amplitudes. (c) SNR around the focus with a waveform generator voltage of 200 mVpp and 1–10 averages.


Neural networks were trained without and with rotated beams. Figure 11(a) shows CW-BOS RMS projected pressure maps imaged with a 200 mVpp waveform generator voltage with no transducer rotation (the reference case) and with 15° and 30° rotations about the y (line-of-sight) axis, reconstructed using the network that was not trained with rotated beams. Figure 11(a) shows that the intensity of the rotated pressure fields differs from the reference: the RMS projected pressure amplitudes in the focus were 6007 Pa m (0°), 5704 Pa m (15°; 5.0% difference with respect to 0°), and 5184 Pa m (30°; 13.7% difference with respect to 0°). Figure 11(b) shows reconstructions using the network that was trained with rotated beams, and the shape and intensity of the rotated pressure fields are much closer to the reference: the RMS projected pressure amplitudes in the focus were 6036 Pa m (0°), 6292 Pa m (15°; 4.2% difference with respect to 0°), and 6266 Pa m (30°; 3.8% difference with respect to 0°). FWHMs measured perpendicular to the beam axis were 1.5 mm (0°), 1.7 mm (15°), and 1.4 mm (30°), and FWHMs measured parallel to the beam axis were 11.5 mm (0°), 12.1 mm (15°), and 11.2 mm (30°). Figure 11(c) further shows CW-BOS RMS projected pressure maps measured with the camera translated ±2.5 cm along the z dimension with a 150 mVpp waveform generator voltage. The intensity and shape of the pressure fields were again nearly unchanged compared to the reference: the projected pressure amplitudes around the focus were 4419 Pa m (no translation), 4587 Pa m (−2.5 cm; 3.8% difference), and 4378 Pa m (+2.5 cm; 0.9% difference).
FWHMs in the x dimension (perpendicular to the beam axis) were 1.5 mm (no translation) and 1.6 mm (−2.5 and +2.5 cm), and FWHMs in the z dimension (parallel to the beam axis) were 12.2 mm (no translation), 12.7 mm (−2.5 cm), and 14.0 mm (+2.5 cm). These comparisons are summarized in Table III.

FIG. 11.

Rotational and translational invariance. [(a),(b)] RMS projected pressure maps obtained by rotating the 1.16 MHz transducer 0°, 15°, and 30° and reconstructing maps using networks trained without (a) and with (b) rotated beams. (c) Reconstructed RMS projected pressure map obtained with the camera focus centered on the focus and shifted ±2.5 cm along the z axis.

TABLE III.

Quantitative comparisons of CW-BOS pressure maps with the transducer rotated or the camera translated.

                                                            Rotations                     Translations
                                                       0°      15°      30°       0 cm    −2.5 cm    2.5 cm
RMS projected pressure amplitude in the focus (Pa m)  6036    6292     6266       4419     4587      4378
Percent error in the focus                              -     4.2%     3.8%         -      3.8%      0.9%
FWHM (x axis, mm)                                      1.5     1.7      1.4        1.5      1.6       1.6
FWHM (z axis, mm)                                     11.5    12.1     11.2       12.2     12.7      14.0

Figure 12(a) shows the silicone acoustic aberrator that was placed against the bottom half of the 1.16 MHz transducer to aberrate the pressure field in the x-z plane. CW-BOS RMS projected beam maps and hydrophone beam maps measured with this aberrator configuration, and their difference, are shown in Fig. 12(b). The beams' intensities and shapes agree qualitatively, and the RMSE between them was 257 Pa m (10.8% of the hydrophone-measured peak amplitude at x = −2.7 mm, z = −9.5 mm). The difference in the peak projected pressure is 141 Pa m (6.0% of the hydrophone-measured peak amplitude). Figure 12(c) further shows the aberrator placed against the left half of the transducer to aberrate the pressure field in the y-z plane, and Fig. 12(d) shows the beam maps measured with this configuration. The maps show similar features, with an RMSE of 380 Pa m (19.3% of the hydrophone-measured peak amplitude at x = −0.1 mm, z = 5.1 mm). The difference in the peak RMS projected pressure is 236 Pa m (11.7% of the hydrophone-measured peak amplitude).

FIG. 12.

Aberrated beam mapping. (a) An aberrator made from silicone was placed in front of the bottom half of the 1.16 MHz transducer. (b) Optical hydrophone, CW-BOS RMS projected beam maps and the difference map between them measured with the bottom half of the transducer blocked. (c) The aberrator positioned to block the left half of the transducer. (d) Optical hydrophone, CW-BOS RMS projected beam maps and the difference maps between them measured with the left half of the transducer blocked.


Quantitative and fast mapping of FUS pressure fields is essential for treatment planning, safety, dosimetry, quality assurance, and technical research.10 We have proposed and demonstrated a rapid and inexpensive optical projection imaging method that quantitatively maps FUS pressure fields in two dimensions and requires only a water tank, a tablet to display background patterns, a camera, and a PC to reconstruct RMS projected pressure maps. While there are therapeutic ultrasound applications, such as histotripsy, which use pressures higher than the PNPs in our study, thermal ablation is performed using pressures within the range studied,50–52 and we have observed heating in mouse models using the 1.16 MHz FUS system used in this study at pressures within the range we evaluated.53 The method could also be used to map the beams of focused imaging transducers, which are capable of generating similar pressure amplitudes to the FUS transducers evaluated here.

Unlike previous optical beam mapping methods that used strobed light sources to "freeze" the ultrasound beam at different phases so that it could be reconstructed algebraically,48 we simplified the hardware setup by allowing the beam to run continuously during the acquisition, which caused a blur rather than a coherent shift of the background pattern. We then used a deep neural network to solve the difficult inverse problem of reconstructing RMS projected pressure amplitudes from the blurred image at each location in the photographs. We described a complete 2D BOS hardware system and acquisition protocol, described a forward model for image formation, and established a reconstruction method. It is important to note that the reconstruction network operates on only one spatial location at a time and makes no assumptions about the spatial smoothness or structure of the beam in the imaged 2D plane, yet it produced beam maps that closely matched optical hydrophone measurements. In this way, the technique maintains generality for important applications where the beam structure would be difficult to predict, such as when mapping aberrated fields or when the beam rotates or moves, as demonstrated here.

CW-BOS is an inexpensive (under $2000) and rapid 2D FUS beam mapping tool based on a consumer-grade tablet and camera with no moving parts or parts that can experience wear from the FUS beam. To make it portable, the tank could be sealed, the tablet and camera could be rigidly attached to it, and FUS could be coupled into it via a Mylar membrane.

The result would be a fully portable push-button beam mapping method, which could be set up and performed more easily and faster than quality assurance (QA) tests, such as MRI temperature measurements in a phantom, without requiring MRI scan time or an MRI-compatible FUS system. Bowl-shaped transducers could be coupled into the tank via a coupling bag.

Our total CW-BOS scan times were 3–5 min, which were dominated by the time taken for data transfer from the camera to the PC, and comprised less than 8 s of FUS on-time. We expect that with optimization, the total scan duration could be reduced to less than 10 s. Reconstruction in MATLAB (The MathWorks, Natick, MA) and Python then took another 20 s, which could be further optimized and does not require a high-end computer. Overall, the method achieved an approximate 2000× speedup compared to the time required to obtain the same information (RMS projected pressure over a 3 × 1 cm2 FOV with 0.425 × 0.425 mm2 spatial resolution) using a 3D hydrophone scan. While linear hydrophone scans along the axial and focal plane dimensions may suffice for QA, those scans are still slower, involve more delicate and less portable hardware than CW-BOS imaging, and require careful alignment with the ultrasound focus. Overall, the most likely applications for CW-BOS projected pressure mapping will be those that are not well-served by hydrophones and in which speed, simplicity, and low cost are of highest importance, such as measuring changes in overall transducer output and detecting beam aberrations.

There are several possible ways to improve or extend the proposed technique. One potential source of error is the assumed parallel ray geometry between the background pattern and the camera; it may be possible to generate more accurate training histograms using ray tracing and accounting for refraction at the air–glass and glass–water interfaces.68,69 The dominant noise source in our measurements is likely electronic noise in the camera's detector, but this needs to be characterized, and a propagation-of-error analysis should be performed to understand noise in the final maps and how best to mitigate it. Another possible source of error would be a mismatch between the assumed and actual distances between the expected transducer focus and the background pattern. The reconstructions are trained using histograms calculated for a specific distance between the midpoint of the nonuniform index-of-refraction field and the background pattern, which in our experimental setup corresponded to the distance between the expected focus and the background pattern. This distance is a parameter that could be optimized: moving the beam closer to the background pattern would scale the blurring patterns by a factor less than one, which would enable a larger FOV at the cost of reduced sensitivity, while moving the beam farther from the background pattern would scale the blurring patterns by a factor greater than one, which would increase sensitivity at the cost of a reduced FOV. A larger convolutional neural network that operates on entire photos rather than individual segmented histograms may achieve improved accuracy by learning spatial relationships between blurring patterns and FUS beam features, and it could enable the use of a single, dense background pattern to reduce acquisition times. However, this would require a much larger training corpus to maintain generality, as well as more computation and memory, both for training and reconstruction.
RMS projected pressure was used as the reconstructed quantity in this work since it is proportional to transducer output, but it may be possible to extend the method to also reconstruct peak positive and negative pressures or even a full cycle of the projected pressure waveform at each spatial location, which would be needed to reconstruct 3D beam maps that contain the same information as 3D hydrophone scans. A 3D system would also require the optics and the transducer to be rotated with respect to each other. We have previously developed a qualitative back projection–based 3D CW-BOS beam mapping system70,71 and will extend the proposed 2D quantitative method to 3D beam mapping in future work. Finally, in this work we trained the reconstruction network from acoustic simulations of transducers at the same frequencies as those used in the experiments but with two different focal lengths and across a range of peak pressures. This approach resulted in RMS projected pressure reconstructions that closely matched hydrophone measurements even when an aberrator was placed on the surface of the 1.16 MHz transducer. However, further research is needed to optimize the training data for different applications. For example, it may not be necessary to train for multiple focal lengths when the reconstruction network will be used with just one transducer, but care must be taken to maintain accuracy when the beam is disturbed by an aberration. For such an application, it may be better to perform simulations of the same transducer with different sub-elements on its surface virtually switched off to mimic aberrations. This process would ideally be informed by the types of aberrations (e.g., bones versus failed transducer elements, air bubbles in the beam path, or system calibration errors9) that are most likely to be encountered for a given clinical or research application. 
If the reconstruction network will be used for just a small range of driving voltages, it may be possible to also reduce the range of simulated pressures. It may also not be necessary to train the network for each transducer frequency, or it may be possible to train a single network across a range of frequencies without compromising reconstruction accuracy. At the same time, the storage required for each network was small (<1 MB); therefore, many networks could be trained and stored, and the most appropriate one could be selected for each mapping task. While the proposed CW-BOS hardware is not currently compatible with very large-aperture transcranial FUS transducers whose foci do not extend beyond their shell, it may be possible to map these systems by projecting background patterns onto the transducer surface. These and related questions will be addressed in future work.

This work was supported by the National Institutes of Health (NIH) Grant No. R21 EB 024199. The authors would like to thank Charlotte Sappo for help with the shutter switch and Marshall (Tony) Phipps and Sumeeth Jonathan for help using the optical hydrophone and FUS amplifier.

1. Y.-F. Zhou, "High intensity focused ultrasound in clinical tumor ablation," World J. Clin. Oncol. 2(1), 8–27 (2011).
2. E. A. Stewart, W. M. Gedroyc, C. M. Tempany, B. J. Quade, Y. Inbar, Y. Ehrenstein, A. Shushan, J. T. Hindley, R. D. Goldin, and M. David, "Focused ultrasound treatment of uterine fibroid tumors: Safety and feasibility of a noninvasive thermoablative technique," Am. J. Obstet. Gynecol. 189, 48–54 (2003).
3. P. Ghanouni, A. Dobrotwir, A. Bazzocchi, M. Bucknor, R. Bitton, J. Rosenberg, K. Telischak, M. Busacca, S. Ferrari, U. Albisinni, S. Walters, G. Gold, K. Ganjoo, A. Napoli, K. B. Pauly, and R. Avedian, "Magnetic resonance-guided focused ultrasound treatment of extra-abdominal desmoid tumors: A retrospective multicenter study," Eur. Radiol. 27, 732–740 (2017).
4. H. Grüll and S. Langereis, "Hyperthermia-triggered drug delivery from temperature-sensitive liposomes using MRI-guided high intensity focused ultrasound," J. Control Release 161(2), 317–327 (2012).
5. W. J. Elias, D. Huss, T. Voss, J. Loomba, M. Khaled, E. Zadicario, R. C. Frysinger, S. A. Sperling, S. Wylie, S. J. Monteith, J. Druzgal, B. B. Shah, M. Harrison, and M. Wintermark, "A pilot study of focused ultrasound thalamotomy for essential tremor," New Engl. J. Med. 369(7), 640–648 (2013).
6. P. Ghanouni, K. Butts Pauly, W. J. Elias, J. Henderson, J. Sheehan, S. Monteith, and M. Wintermark, "Transcranial MRI-guided focused ultrasound: A review of the technologic and neurologic applications," Am. J. Roentgenol. 205(1), 150–159 (2015).
7. N. Lipsman, Y. Meng, A. J. Bethune, Y. Huang, B. Lam, M. Masellis, N. Herrmann, C. Heyn, I. Aubert, A. Boutet, G. S. Smith, K. Hynynen, and S. E. Black, "Blood–brain barrier opening in Alzheimer's disease using MR-guided focused ultrasound," Nat. Commun. 9, 2336 (2018).
8. J. Blackmore, S. Shrivastava, J. Sallet, C. R. Butler, and R. O. Cleveland, "Ultrasound neuromodulation: A review of results, mechanisms and safety," Ultrasound Med. Biol. 45(7), 1509–1536 (2019).
9. N. McDannold and K. Hynynen, "Quality assurance and system stability of a clinical MRI-guided focused ultrasound system: Four-year experience," Med. Phys. 33(11), 4307–4313 (2006).
10. G. Ter Haar, "Safety first: Progress in calibrating high-intensity focused ultrasound treatments," Imag. Med. 5(6), 567–575 (2013).
11. S. H. Wong, M. Kupnik, R. D. Watkins, K. Butts-Pauly, and B. T. Khuri-Yakub, "Capacitive micromachined ultrasonic transducers for therapeutic ultrasound applications," IEEE Trans. Biomed. Eng. 57(1), 114–123 (2009).
12. G. Clement and K. Hynynen, "A non-invasive method for focusing ultrasound through the human skull," Phys. Med. Biol. 47(8), 1219–1236 (2002).
13. J. Aarnio, G. T. Clement, and K. Hynynen, "A new ultrasound method for determining the acoustic phase shifts caused by the skull bone," Ultrasound Med. Biol. 31(6), 771–780 (2005).
14. A. Kyriakou, E. Neufeld, B. Werner, M. M. Paulides, G. Szekely, and N. Kuster, "A review of numerical and experimental compensation techniques for skull-induced phase aberrations in transcranial focused ultrasound," Int. J. Hyperth. 30(1), 36–46 (2014).
15. R. J. Lalonde, A. Worthington, and J. W. Hunt, "Field conjugate acoustic lenses for ultrasound hyperthermia," IEEE Trans. Ultrason. Ferroelectr. Freq. Control 40(5), 592–602 (1993).
16. H. W. Baac, J. G. Ok, A. Maxwell, K.-T. Lee, Y.-C. Chen, A. J. Hart, Z. Xu, E. Yoon, and L. J. Guo, "Carbon-nanotube optoacoustic lens for focused ultrasound generation and high-precision targeted therapy," Sci. Rep. 2, 989 (2012).
17. C. Chang, K. Firouzi, K. K. Park, A. F. Sarioglu, A. Nikoozadeh, H.-S. Yoon, S. Vaithilingam, T. Carver, and B. T. Khuri-Yakub, "Acoustic lens for capacitive micromachined ultrasonic transducers," J. Micromech. Microeng. 24(8), 085007 (2014).
18. J. R. Corea, A. M. Flynn, B. Lechêne, G. Scott, G. D. Reed, P. J. Shin, M. Lustig, and A. C. Arias, "Screen-printed flexible MRI receive coils," Nat. Commun. 7, 10839 (2016).
19. M. S. Canney, M. R. Bailey, L. A. Crum, V. A. Khokhlova
, and
O. A.
Sapozhnikov
, “
Acoustic characterization of high intensity focused ultrasound fields: A combined measurement and modeling approach
,”
J. Acoust. Soc. Am.
124
(
4
),
2406
2420
(
2008
).
20.
W.
Kreider
,
P. V.
Yuldashev
,
O. A.
Sapozhnikov
,
N.
Farr
,
A.
Partanen
,
M. R.
Bailey
, and
V. A.
Khokhlova
, “
Characterization of a multi-element clinical HIFU system using acoustic holography and nonlinear modeling
,”
IEEE Trans. Ultrason. Ferroelectr. Freq. Control
60
(
8
),
1683
1698
(
2013
).
21.
O. V.
Bessonova
and
V.
Wilkens
, “
Membrane hydrophone measurement and numerical simulation of HIFU fields up to developed shock regimes
,”
IEEE Trans. Ultrason. Ferroelectr. Freq. Control
60
(
2
),
290
300
(
2013
).
22.
Y.
Zhou
,
L.
Zhai
,
R.
Simmons
, and
P.
Zhong
, “
Measurement of high intensity focused ultrasound fields by a fiber optic probe hydrophone
,”
J. Acoust. Soc. Am.
120
(
2
),
676
685
(
2006
).
23.
N. R.
Bronson
, “
An inexpensive schlieren apparatus
,”
Ultrasonics
7
(
1
),
67
70
(
1969
).
24.
D. R.
Newman
, “
Ultrasonic schlieren system using a pulsed gas laser
,”
IEEE Trans. Sonics Ultrason.
20
(
3
),
282
284
(
1973
).
25.
S. W.
Smith
and
F. L.
Thurstone
, “
Schlieren study of pulsed ultrasound transmission through human skull
,”
J. Clin. Ultrasound
2
(
1
),
55
59
(
1974
).
26.
V. M.
Baborovsky
, “
Visualisation of ultrasound in solids
,”
Phys. Technol.
10
(
4
),
171
(
1979
).
27.
E. K.
Reichel
,
S. C.
Schneider
, and
B. G.
Zagar
, “
Characterization of ultrasonic transducers using the schlieren–technique
,” in
2005 IEEE Instrumentation and Measurement Technology Conference Proceedings
(
2005
), Vol.
3
, pp.
1956
1960
.
28.
C. I.
Zanelli
and
S. M.
Howard
, “
Schlieren metrology for high frequency medical ultrasound
,”
Ultrasonics
44
,
e105
e107
(
2006
).
29.
D. A.
Christensen
and
A.
Chao
, “
A pulsed schlieren system for visualizing beams from phased-array HIFU applicators
,” in
AIP Conference Proceedings
(
2007
), Vol.
911
, pp.
15
19
.
30.
L. D.
Johns
,
S. J.
Straub
, and
S. M.
Howard
, “
Analysis of effective radiating area, power, intensity, and field characteristics of ultrasound transducers
,”
Arch. Phys. Med. Rehabil.
88
(
1
),
124
129
(
2007
).
31.
T.
Neumann
and
H.
Ermert
, “
Schlieren visualization of ultrasonic wave fields with high spatial resolution
,”
Ultrasonics
44
,
e1561
e1566
(
2006
).
32.
R.
Reibold
and
W.
Molkenstruck
, “
Light diffraction tomography applied to the investigation of ultrasonic fields. Part I: Continuous waves
,”
Acta Acust. Acust.
56
(
3
),
180
192
(
1984
).
33.
A.
Hanafy
and
C.
Zanelli
, “
Quantitative real-time pulsed schlieren imaging of ultrasonic waves
,” in
Proceedings of the IEEE Ultrasonics Symposium
(
1991
), pp.
1223
1227
.
34.
T. A.
Pitts
,
J. F.
Greenleaf
,
J.-Y.
Lu
, and
R. R.
Kinnick
, “
Tomographic Schlieren imaging for measurement of beam pressure and intensity
,” in
Proceedings of the IEEE Ultrasonics Symposium
(
1994
), Vol.
3
, pp.
1665
1668
.
35.
T.
Charlebois
and
R.
Pelton
, “
Quantitative 2D and 3D schlieren imaging for acoustic power and intensity measurements
,”
Med. Electron.
1995
,
789
792
(
1995
).
36.
B.
Schneider
and
K. K.
Shung
, “
Quantitative analysis of pulsed ultrasonic beam patterns using a schlieren system
,”
IEEE Trans. Ultrason. Ferroelectr. Freq. Control
43
(
6
),
1181
1186
(
1996
).
37.
R.
Reibold
, “
Light diffraction tomography applied to the investigation of ultrasonic fields. Part II: Standing waves
,”
Acta Acust. Acust.
63
(
4
),
283
289
(
1987
).
38.
T.
Nakamura
,
R.
Iwasaki
,
S.
Yoshizawa
, and
S.-I.
Umemura
, “
Quantitative measurement of ultrasonic pressure field using combination of optical phase contrast and nonlinear acoustic holography methods
,”
Jpn. J. Appl. Phys.
57
,
07LB13
(
2018
).
39.
G. E. A.
Meier
, “
Computerized background-oriented schlieren
,”
Exp. Fluids
33
(
1
),
181
187
(
2002
).
40.
H.
Richard
and
M.
Raffel
, “
Principle and applications of the background oriented schlieren (BOS) method
,”
Meas. Sci. Technol.
12
(
9
),
1576
(
2001
).
41.
L.
Venkatakrishnan
and
G. E. A.
Meier
, “
Density measurements using the background oriented schlieren technique
,”
Exp. Fluids
37
(
2
),
237
247
(
2004
).
42.
E.
Goldhahn
and
J.
Seume
, “
The background oriented schlieren technique: Sensitivity, accuracy, resolution and application to a three-dimensional density field
,”
Exp. Fluids
43
(
2-3
),
241
249
(
2007
).
43.
B.
Atcheson
,
I.
Ihrke
,
W.
Heidrich
,
A.
Tevs
,
D.
Bradley
,
M.
Magnor
, and
H.-P.
Seidel
, “
Time-resolved 3D capture of non-stationary gas flows
,” in
ACM Transactions on Graphics
(
2008
), Vol.
27
, p.
1
32.
44.
B.
Atcheson
,
W.
Heidrich
, and
I.
Ihrke
, “
An evaluation of optical flow algorithms for background oriented schlieren imaging
,”
Exp. Fluids
46
(
3
),
467
476
(
2009
).
45.
T. J.
Tipnis
,
M. V.
Finnis
,
K.
Knowles
, and
D.
Bray
, “
Density measurements for rectangular free jets using background-oriented schlieren
,”
Aeronaut. J.
117
(
1194
),
771
785
(
2013
).
46.
I.
Butterworth
and
A.
Shaw
, “
Realtime acousto-optical QA methods for high intensity fields
,” in
Proceedings of the 39th Annual Symposium of the Ultrasonic Industry Association
(
2010
),
IEEE
, pp.
1
5
.
47.
R. M.
Waxler
and
C. E.
Weir
, “
Effect of pressure and temperature on the refractive indices of benzene, carbon tetrachloride, and water
,”
J. Res. Natl. Bur. Stand. Sect. A, Phys. Chem.
67A
(
2
),
163
171
(
1963
).
48.
A.
Pulkkinen
,
J. J.
Leskinen
, and
A.
Tiihonen
, “
Ultrasound field characterization using synthetic schlieren tomography
,”
J. Acoust. Soc. Am.
141
(
6
),
4600
4609
(
2017
).
49.
E.
Koponen
,
J. J.
Leskinen
,
T.
Tarvainen
, and
A.
Pulkkinen
, “
Acoustic pressure field estimation methods for synthetic schlieren tomography
,”
J. Acoust. Soc. Am.
145
,
2470
(
2019
).
50.
N.
Ellens
and
K.
Hynynen
, “
Simulation study of the effects of near- and far-field heating during focused ultrasound uterine fibroid ablation using an electronically focused phased array: A theoretical analysis of patient safety
,”
Med. Phys.
41
(
7
),
072902
(
2014
).
51.
A. W.
Wong
,
B. Z.
Fite
,
Y.
Liu
,
A.
Kheirolomoom
,
J. W.
Seo
,
K. D.
Watson
,
L. M.
Mahakian
,
S. M.
Tam
,
H.
Zhang
,
J.
Foiret
,
A. D.
Borowsky
, and
K. W.
Ferrara
, “
Ultrasound ablation enhances drug accumulation and survival in mammary carcinoma models
,”
J. Clin. Invest.
126
(
1
),
99
111
(
2016
).
52.
P. J.
White
,
P.
von Pattenberg
, and
G. T.
Clement
, “
A nonlinear method for high-intensity focused ultrasound (HIFU) aberration reduction
,” in
IEEE Ultrasonics Symposium Proceedings
, pp.
2059
2061
(
2008
).
53.
M. E.
Poorman
,
V. L.
Chaplin
,
K.
Wilkens
,
M. D.
Dockery
,
T. D.
Giorgio
,
W. A.
Grissom
, and
C. F.
Caskey
, “
Open-source, small-animal magnetic resonance-guided focused ultrasound system
,”
J. Ther. Ultrasound
4
(
1
),
22
(
2016
).
54.
Recommendation ITU-R.BT, “Studio encoding parameters of digital television for standard 4:3 and wide-screen 16:9 aspect ratios” (
2011
).
55.
P.
Morris
,
A.
Hurrell
,
A.
Shaw
,
E.
Zhang
, and
P.
Beard
, “
A Fabry–Pérot fiber-optic ultrasonic hydrophone for the simultaneous measurement of temperature and acoustic pressure
,”
J. Acoust. Soc. Am.
125
(
6
),
3611
3622
(
2009
).
56.
C. V.
Raman
and
K. S.
Venkataraman
, “
Determination of the adiabatic piezo-optic coefficient of liquids
,”
Proc. R. Soc. London, Ser. A
171
(
945
),
137
147
(
1939
).
57.
W. A.
Riley
and
W. R.
Klein
, “
Piezo-optic coefficients of liquids
,”
J. Acoust. Soc. Am.
42
(
6
),
1258
1261
(
1967
).
58.
B. B.
Tripathi
,
D.
Espíndola
, and
G. F.
Pinton
, “
Piecewise parabolic method for propagation of shear shock waves in relaxing soft solids: One-dimensional case
,”
Int. J. Numer. Methods Biomed. Eng.
35
(
5
),
e3187
(
2019
).
59.
F.
Dagrau
,
M.
Rénier
,
R.
Marchiano
, and
F.
Coulouvrat
, “
Acoustic shock wave propagation in a heterogeneous medium: A numerical simulation beyond the parabolic approximation
,”
J. Acoust. Soc. Am.
130
(
1
),
20
32
(
2011
).
60.
V.
Rusanov
, “
On difference schemes of third order accuracy for nonlinear hyperbolic systems
,”
J. Comp. Phys.
5
(
3
),
507
516
(
1970
).
61.
G. F.
Pinton
, “Numerical methods for nonlinear wave propagation in ultrasound
,” Ph.D. thesis,
Duke University
,
2007
.
62.
D. F.
McGivney
,
E.
Pierre
,
D.
Ma
,
Y.
Jiang
,
H.
Saybasili
,
V.
Gulani
, and
M. A.
Griswold
, “
SVD compression for magnetic resonance fingerprinting in the time domain
,”
IEEE Trans. Med. Imag.
33
(
12
),
2311
2322
(
2014
).
63.
M.
Turk
and
A.
Pentland
, “
Eigenfaces for recognition
,”
J. Cognit. Neurosci.
3
(
1
),
71
86
(
1991
).
64.
F.
Chollet
, Keras, https://keras.io (
2015
).
65.
M.
Abadi
,
P.
Barham
,
J.
Chen
,
Z.
Chen
,
A.
Davis
,
J.
Dean
,
M.
Devin
,
S.
Ghemawat
,
G.
Irving
, and
M.
Isard
, Savannah, GA, “
Tensorflow: A system for large-scale machine learning
,” in
12th USENIX Symposium on Operating Systems Design and Implementation
(OSDI) (
2016
), pp.
265
283
.
66.
T.
Tieleman
and
G.
Hinton
, “
Lecture 6.5-rmsprop: Divide the gradient by a running average of its recent magnitude
,”
COURSERA: Neural Networks Mach. Learn.
4
(
2
),
26
31
(
2012
).
67.
G.
Maimbourg
,
A.
Houdouin
,
T.
Deffieux
,
M.
Tanter
, and
J.-F.
Aubry
, “
3D-printed adaptive acoustic lens as a disruptive technology for transcranial ultrasound therapy using single-element transducers
,”
Phys. Med. Biol.
63
(
2
),
025026
(
2018
).
68.
A. S.
Glassner
,
An Introduction to Ray Tracing
(
Elsevier Science & Technology
,
San Francisco, CA
,
1989
).
69.
B. A.
Barsky
,
D. R.
Horn
,
S. A.
Klein
,
J. A.
Pang
, and
M.
Yu
, “
Camera models and optical systems used in computer graphics: Part I, object-based techniques
,” in
International Conference on Computational Science and its Applications
(
Springer
,
Berlin, Heidelberg
,
2003
), pp.
246
255
.
70.
M.
Kremer
,
C. F.
Caskey
, and
W. A.
Grissom
, “
Background-oriented schlieren imaging and tomography for rapid measurement of FUS pressure fields: Initial results
,” in
Proceedings of the 4th Int. Symp. on Focused Ultrasound
(
2014
).
71.
M.
Kremer
,
C. F.
Caskey
, and
W. A.
Grissom
, “
Rapid characterization of focused ultrasound pressure fields using background-oriented schlieren tomography
,” in
Proceedings BMES
(
2015
).