In this paper, an active ultrasound-based touchscreen technology is presented for real-time monitoring of multiple contacts on a 20 × 19 cm glass panel. Both fundamental Lamb wave modes are generated by sending a linear chirp between 50 and 100 kHz to a single lead zirconate titanate (PZT) piezoelectric ceramic. Measurement is performed using four PZT elements, and real-time localization and detection are performed on an i.MX 8M Nano system on module. Results show that Lamb waves can be used for real-time multi-touch detection and localization on both sides of the panel. Moreover, the prototype developed allows for relative pressure measurement.

Originally, the focus in the development of human–machine interfaces was primarily on devising touch technologies for slim, flat glass surfaces for touchscreen applications (Walker, 2012). This scope has since expanded to encompass curved and thicker conductive materials, paving the way for a broader spectrum of interactive products. Although projected capacitive sensing is the predominant technology for touchscreens up to 32 in. diagonal, infrared technology is preferred for larger sizes (Dai and Chung, 2014). However, capacitive sensors face challenges with conductive or thick surfaces and become increasingly costly with larger sizes or complex curvatures. Infrared sensors, conversely, are not well suited for curved surfaces and can be disrupted by ambient sunlight (Walker, 2012).

An alternative approach involves the use of vibration-based systems, which utilize the interaction of an object with mechanical waves propagating in a rigid surface to accurately determine the object's location. Mechanical vibrations are not limited by the material's curvature, thickness, or type, allowing them to propagate through virtually any material. Vibration-based systems are divided into two categories: (1) passive systems, which utilize vibration waves generated by an object impacting the surface; and (2) active systems, which emit a specific vibration wave through the surface and detect the wave resulting from an interaction with an object in contact with the surface. Although passive systems are simpler and more energy efficient, they require a significant force upon impact and cannot track or locate multiple objects simultaneously (Reis et al., 2010). Active systems, on the other hand, can track and detect multiple contacts, as well as measure contact pressure.

The initial active acoustic systems used high-frequency (>1 MHz) surface acoustic waves that travel linearly along the boundary of a surface (Adler and Desmares, 1987). Interaction with soft objects attenuates specific ultrasound pathways, facilitating the detection of touch points. This technology necessitates a network of ultrasound emitters and receivers encircling the touch surface to establish a matrix of transmitting–receiving paths, resulting in a bulky setup. Other systems presented in the literature relied on low-order A and S Lamb wave modes produced by a compact transducer array, which are reflected by objects that touch the surface (Liu et al., 2010). Although such systems are more compact than surface acoustic wave devices, their embedded applications suffer from the dispersive nature of Lamb waves and from reflections at surface boundaries, which require complex signal processing to accurately locate touch points (Firouzi et al., 2016).

Currently, correlation-based imaging algorithms (Quaegebeur et al., 2016) and lookup table algorithms (Ing et al., 2005) are the most promising for locating multiple touch points. However, these classes of algorithms require parallel, and sometimes recursive, processing to produce images with a sufficient signal-to-noise ratio for robust touch-point identification. Such techniques have been implemented for multi-touch imaging or for real-time imaging using embedded electronics (Kang et al., 2022). To date, no active ultrasound-based embedded technology has demonstrated real-time multi-touch capability. Therefore, this article introduces an ultrasound-based touchscreen technology that enables real-time multi-touch and pressure measurement. In Sec. 2, the configuration, pre-processing, imaging, and post-processing algorithms for touch event localization are presented. Section 2.4 presents the electronics and the prototype developed in more detail. Finally, the results are discussed in Sec. 3, followed by a conclusion in Sec. 4.

The entire process of touch localization can be segmented into three primary steps. Initially, there is the acquisition and pre-processing of the radio frequency signals. Subsequently, the imaging process employs a designated imaging algorithm to reconstruct a two-dimensional (2D) heatmap based on these radio frequency signal data. Last, a post-processing procedure is implemented to condense this 2D image into the x and y coordinates of touch events and to export their amplitude. A block diagram illustrating the complete algorithm is shown in Fig. 1.

Fig. 1.

Main steps of the touch localization algorithm: signal acquisition and pre-processing, ultrasound image processing, and post-processing for generating touch events.


The measurement and pre-processing block depicted in Fig. 1 is performed in parallel for all acquisition channels. In order to reduce the effect of crosstalk on imaging and to automate the detection of touch events, reference signals s0 devoid of touch interactions are measured and averaged using a moving average over 100 iterations. Following this brief calibration phase, the measured real-time signals s are continuously compared to s0, and the resulting residual signals sres are used to identify the occurrence of touch events. These residual signals are then time-windowed for dimensional reduction, and their fast Fourier transforms are computed over Nf frequency bins between 50 and 100 kHz. Finally, the frequency spectra of all channels are concatenated into a single complex vector M.
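As an illustration, a minimal NumPy sketch of this pre-processing chain is given below. The window length, number of retained bins, and the exponential form of the baseline update are assumptions chosen for readability, not the prototype's actual firmware parameters.

import numpy as np

FS = 250_000              # receiver sampling frequency (Hz), as in Sec. 2.4
F_MIN, F_MAX = 50e3, 100e3
N_F = 64                  # number of retained frequency bins (illustrative)

def update_baseline(s0, s, n_avg=100):
    # Running update of the touch-free reference s0 (stand-in for the
    # 100-iteration moving average used during calibration).
    return s0 + (s - s0) / n_avg

def preprocess_channel(s, s0, n_window, n_f=N_F):
    # Residual computation, time windowing, FFT, and 50-100 kHz band selection.
    s_res = s - s0                          # residual signal sres
    s_res = s_res[:n_window]                # time window for dimensional reduction
    spectrum = np.fft.rfft(s_res)
    freqs = np.fft.rfftfreq(n_window, d=1.0 / FS)
    band = (freqs >= F_MIN) & (freqs <= F_MAX)
    idx = np.linspace(0, band.sum() - 1, n_f).astype(int)
    return spectrum[band][idx]              # n_f complex bins in the band of interest

def build_measurement_vector(signals, baselines, n_window):
    # Concatenate the spectra of the four receivers into a single vector M.
    return np.concatenate([preprocess_channel(s, s0, n_window)
                           for s, s0 in zip(signals, baselines)])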

The direct formulation of the measurement vector M can be expressed as:
M = B · A, (1)
where A is a P × 1 amplitude vector, each element of which is a scalar representing the touch intensity at a specific pixel (P being the total number of pixels); M is a complex vector of size N × 1 corresponding to the actual touch signal, where N is the total number of samples in all channels, meaning that the different reception channels are concatenated one after another to form a single vector; and B is the signal database matrix of size N × P that contains reference touch signals for all possible touch pixels (i.e., all possible positions of a reflector on the structure). The superposition principle implied by this linear formulation assumes that the fingertip contacts on the plate can be treated as single scatterers, neglecting multiple reflections between contact points. This signal database is acquired once for a given system and is equivalent to measuring the transfer function between a touch at each pixel and the measured frequency signals for all receivers. For the recording of the signal baseline B, a collaborative robotic arm (Universal Robots UR5e) with a force sensor is used to simulate touches with a force of 8 N on 10 000 evenly spaced points on the surface using a silicone tip. This procedure sets the system's imaging resolution at 2 mm in the x direction and 1.9 mm in the y direction. The reconstruction of the unknown pixel amplitude vector A can be obtained by inverting the matrix B,
A = |BP · M|, (2)
where BP is the Moore-Penrose pseudo-inverse of B and |·| is the norm operator. It is well known that this pseudo-inverse formulation, herein named PINV, is highly sensitive to measurement noise (Bertero et al., 2021). Therefore, the authors propose a frequency-formulated pseudo-inverse recursive imaging algorithm (PINV-REC, see Algorithm 1) to reduce noise sensitivity and increase contrast in the ultrasound images obtained during the imaging phase. The iterative algorithm is similar to the classical deconvolution algorithm CLEAN (Högbom, 1974; Sijtsma, 2007) in the sense that, at each consecutive iteration, the measurement vector is updated by subtracting the contribution associated with the pixels of highest intensity.
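For reference, before detailing Algorithm 1, a minimal sketch of the plain PINV reconstruction of Eq. (2) is given below; the function and variable names are illustrative.

import numpy as np

def pinv_imaging(B, M):
    # Plain pseudo-inverse (PINV) imaging as in Eq. (2).
    # B : (N, P) complex signal database, M : (N,) complex measurement vector.
    B_pinv = np.linalg.pinv(B)      # Moore-Penrose pseudo-inverse (P x N)
    return np.abs(B_pinv @ M)       # pixel amplitude vector A (P,)

In practice, B_pinv is computed once offline and stored (see Sec. 2.4), since recomputing the pseudo-inverse of a matrix with 10 000 columns at every frame would be prohibitive.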

In Algorithm 1, Ak and Mk represent the image and measurement vectors updated at iteration k. The maximum number of iterations was empirically determined through experimental validation. The authors observed that a total of 10 iterations sufficed to identify all touches on the plate while still allowing real-time imaging using the electronics detailed in Sec. 2.4. For each iteration k, pixels p with an amplitude Ak[p] greater than an empirically determined threshold c1 are backpropagated, and their contribution is subtracted from the measurement Mk.

Algorithm 1.

PINV-REC algorithm.

Data: APinv-rec ← 0, max_it, P, c1, M0 ← M, B
Result: Updated APinv-rec
for k := 0 to max_it do
   Ak ← |BP · Mk|
   for p := 0 to P − 1 do
      if Ak[p] > c1 then
         Mk ← Mk − Ak[p] · B[:, p]
         APinv-rec[p] ← APinv-rec[p] + Ak[p]
      else
         continue
      end
   end
end
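A compact NumPy sketch of Algorithm 1 is shown below. It follows the structure above, with the back-propagation of a pixel's contribution written as a scaled column of B; the early exit when no pixel exceeds c1 and the variable names are assumptions made for readability.

import numpy as np

def pinv_rec(B, M, c1, max_it=10):
    # Recursive pseudo-inverse imaging (PINV-REC) following Algorithm 1.
    # B : (N, P) complex signal database, M : (N,) complex measurement vector,
    # c1 : amplitude threshold, max_it : maximum number of iterations (10 herein).
    B_pinv = np.linalg.pinv(B)              # computed once offline in the prototype
    A_rec = np.zeros(B.shape[1])
    Mk = np.array(M, dtype=complex)
    for _ in range(max_it):
        Ak = np.abs(B_pinv @ Mk)            # pseudo-inverse image at iteration k
        strong = np.flatnonzero(Ak > c1)    # pixels above the threshold c1
        if strong.size == 0:                # nothing left to back-propagate
            break
        # Subtract the contribution of the identified pixels from the measurement
        Mk = Mk - B[:, strong] @ Ak[strong]
        A_rec[strong] += Ak[strong]
    return A_rec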
For comparison purposes, another classical frequency-formulated imaging algorithm is used to reconstruct the touch heatmaps. The Generalized Cross-Correlation with Phase Transform (GCC-PHAT), which assumes sparse reflectors, is expressed following Knapp and Carter (1976) as
A[p] = | Σn=1…Nf (Bn[p]* Mn) / |Bn[p]* Mn| |, (3)
where Nf is the number of frequency bins and (·)* is the conjugate transpose operator. This algorithm has a low computational complexity and is therefore a good candidate for real-time ultrasound imaging (Quaegebeur and Masson, 2012; Bilodeau et al., 2023).
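A short sketch of this phase-transform imaging is given below, under the assumption that the normalization of Eq. (3) is applied independently to each concatenated frequency bin; the exact grouping of bins per channel in the prototype may differ.

import numpy as np

def gcc_phat_imaging(B, M, eps=1e-12):
    # GCC-PHAT heatmap in the spirit of Eq. (3).
    # B : (N, P) complex reference spectra, M : (N,) complex measurement vector.
    cross = np.conj(B) * M[:, None]       # per-bin cross-spectra for every pixel
    cross /= np.abs(cross) + eps          # phase transform: keep phase, drop amplitude
    return np.abs(cross.sum(axis=0))      # coherent sum over all bins and channels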

To extract touch coordinates from the 2D heatmaps, a straightforward centroid calculation is carried out on a 3 × 3 pixel grid surrounding the peak values that exceed a predefined threshold. The coordinates of the centroid are then determined, and the sum of the pixel intensities within the 3 × 3 pixel grid linked to each touch event provides an estimate of the touch intensity, related to the pressure exerted by the finger on the surface.
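One possible implementation of this post-processing step is sketched below; the peak picking via a 3 × 3 local-maximum filter (and the SciPy dependency) is an assumption, while the thresholding and centroid computation follow the description above.

import numpy as np
from scipy.ndimage import maximum_filter

def extract_touch_events(img, threshold):
    # Return (x, y, intensity) tuples from a 2D heatmap. Peaks are local maxima
    # above the threshold; coordinates are the centroid of the surrounding 3x3
    # patch and the intensity is the summed patch amplitude.
    peaks = (img == maximum_filter(img, size=3)) & (img > threshold)
    events = []
    for iy, ix in zip(*np.nonzero(peaks)):
        y0, y1 = max(iy - 1, 0), min(iy + 2, img.shape[0])
        x0, x1 = max(ix - 1, 0), min(ix + 2, img.shape[1])
        patch = img[y0:y1, x0:x1]
        ys, xs = np.mgrid[y0:y1, x0:x1]
        total = patch.sum()
        events.append((float((xs * patch).sum() / total),   # x centroid (pixels)
                       float((ys * patch).sum() / total),   # y centroid (pixels)
                       float(total)))                       # relative touch intensity
    return events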

The entire touch localization algorithm described in Fig. 1 is tested using dedicated embedded electronics. The prototype and hardware used for the experimental validation are presented in Fig. 2.

Fig. 2.

Instrumented glass plate with its aluminum frame (left); flexible PCB with the single emitter and four receivers connected to the embedded system (SOM i.MX 8M Nano) (right).


As shown in Fig. 2, the ultrasound array is soldered on a flexible printed circuit board (PCB) and is composed of five lead zirconate titanate (type PZT5A) piezoelectric ceramic discs of 6 mm diameter and 0.2 mm thickness purchased from Zhejiang Jiakang Electronics Co., Ltd. One transducer is used for actuation, while the remaining four are designated for reception. This configuration of transducers was determined through experimental trials and represents the optimal balance, yielding robust multi-touch detection with minimal computational complexity. The sensor is bonded to the 20 cm × 20 cm glass plate using epoxy, and the surface of this plate is the region of interest. The plate thickness is set at 5 mm such that the wavelengths of the A0 and S0 Lamb modes range from 1.9 to 3 cm and from 5.5 to 11 cm, respectively, in the frequency bandwidth of interest (50–100 kHz). Since the A1 Lamb mode cutoff frequency is 425 kHz, only the A0 and S0 modes are generated in this configuration. To reduce direct paths between the four supports of the glass plate, foam dampeners were added at each clamp. All computation and signal storage is performed using the system on module (SOM) i.MX 8M Nano, which is composed of a 1.8 GHz quad-core A53 processor and a 400 MHz ARM M4 microcontroller. The SOM is connected to the dedicated PCB, which also contains a pre-amplification stage between the sensor and the SOM and is powered using an external 12 V power supply.

The M4 microcontroller is used to control the ultrasound generation and acquisition sequence. The emission signal is a linear chirp between 50 and 100 kHz with a duration of 1 ms and an amplitude of 3 V peak. This emission is repeated every 10 ms; the emission sampling frequency is set to 500 kHz, whereas the four acquisition channels record at a sampling frequency of 250 kHz per channel. The M4 is also used for touch event detection. As long as the real-time measured signals are similar to those acquired in the initialization phase without contacts on the surface, the M4 continuously performs the generation and acquisition sequence. When the root mean square of all measured samples over all receivers exceeds an empirically defined threshold, a touch event is identified and the imaging sequence starts on the A53 processor.
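This detection criterion amounts to a simple root-mean-square test, sketched below; the use of a residual relative to the calibration baseline is one possible reading of the detection step, and the array layout and threshold are illustrative assumptions.

import numpy as np

def touch_event_detected(signals, baselines, rms_threshold):
    # Trigger test run between acquisitions on the microcontroller side.
    # signals, baselines : (4, n_samples) arrays for the four receivers.
    # Returns True when the RMS of the residual exceeds the empirical threshold,
    # which hands the frame over to the imaging sequence on the A53.
    residual = signals - baselines
    rms = np.sqrt(np.mean(residual ** 2))    # RMS over all samples and receivers
    return rms > rms_threshold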

The preprocessed inverted signal baseline BP used for imaging is measured once and stored in the A53 eMMC memory. When the system is powered on, the content of the eMMC memory is transferred to the A53 RAM and used for the calculations.

In order to compare the performance of the three imaging algorithms presented in Sec. 2 under the exact same conditions, different datasets were acquired and exported. The algorithms were then applied offline, and the results are reported in Sec. 3.1. Following this, real-time online imaging performance is presented in Sec. 3.2 for a varying number of touches.

Touch datasets with 1, 2, 3, and 4 simultaneous touches (human fingers) on the plate were acquired, and the images were processed using the three algorithms defined previously (PINV, PINV-REC, GCC-PHAT). The 2D normalized images are presented in Fig. 3, and the touch events obtained using the dedicated procedure are given by the centers of the plotted circles.

Fig. 3.

Ultrasound images obtained with the GCC-PHAT, PINV, and PINV-REC algorithms for 1, 2, 3, and 4 touches in contact. The centers of the black circles indicate the coordinates of the touch events calculated using the post-processing procedure.


One can see in Fig. 3 that GCC-PHAT outputs 2D images with high background noise and therefore low contrast. This is mainly due to the highly reverberant conditions of the instrumented structure and the low number of transducers used for imaging. Indeed, Lamb waves are known to propagate over large distances, and only four small dampeners are used to hold the entire structure. Although GCC-PHAT is known to perform well in reverberant conditions, its high sensitivity to noise results in a high background noise level, and it is thus not suited to ultrasound image reconstruction with the prototype developed.

In Fig. 3, all the positions of the circles identified using the PINV and PINV-REC algorithms (center and bottom rows of Fig. 3) are highly consistent with the real touch positions. When comparing the two inverse imaging algorithms, one sees that the maxima identified by both algorithms lie in the same regions. The main difference between the two algorithms is the touch-to-background contrast ratio. Indeed, the PINV-REC algorithm only keeps the intensity of the pixels surrounding the identified maxima and sets all other pixels to zero. For a low number of touches (1 and 2), the PINV algorithm is able to reconstruct the touch coordinates with precision. However, the authors observed that when the number of touches increases or when the fingers are moving on the surface (swipe movement), the PINV algorithm fails to identify the correct touch coordinates. In that case, the high background noise fluctuations result in false positives anywhere on the plate. Looking at Fig. 3, one sees that the background noise level in the images reconstructed using PINV increases with the number of touches (from left to right) and results in false negatives for Ntouch = 4. This background noise fluctuation is also observed with finger movements on the surface and is thus responsible for many false positives and false negatives when reconstructing real-time images. On the other hand, the PINV-REC algorithm successfully reconstructs the positions of up to four simultaneous touches. The iterative process, in which the measurement vector M is modified by subtracting the contribution of the identified touch events, results in an identification that is more robust to noise and finger movements. The technique is consistent and robust for 3 simultaneous touches. However, the system sometimes fails to identify the position of a fourth contact point when there is movement. For this reason, real-time validation is performed for up to 3 simultaneous moving touches.

Since limited memory space and computational power are available on the prototype (described in Sec. 2.4), only the PINV-REC algorithm is implemented in the final system and presented in this section. To demonstrate the real-time imaging performance of the developed system, a first touch sequence, accessible in the supplementary material to this article and summarized in Fig. 4, is composed of

  • static touches with varying pressure applied to the surface (start to 6 s);

  • moving touches with 1, 2, and 3 fingers, respectively (7–13 s);

  • moving touch on the edge of the glass plate (14–18 s); and

  • moving touches on the back side of the plate with 1 and 2 fingers, respectively (19 s to end).

Fig. 4.

Real-time localization of multiple touch configurations: two moving fingers on the back side of the plate (green), single touches on the edge of the glass plate (blue-red), single touches with varying pressure applied to the surface (yellow-orange), and three simultaneous touches on the front surface (black).


A graphical user interface, shown in Fig. 4, was developed to test the prototype; it displays the defined grid, a colorbar on the right showing a qualitative pressure intensity, and a horizontal line marking edge contacts. As summarized in Fig. 4 and demonstrated in the supplementary material, the system allows robust real-time localization of up to three simultaneous touches with a latency below 20 ms. The front, back, and top edge of the glass plate are all touch capable. Although the signal database was recorded with touch events on the front side of the panel, the localization algorithm effectively identifies touch events on the back side as well. This is explained by the fact that both fundamental Lamb modes have equivalent amplitudes on both sides of the panel, albeit with opposing displacements for the A0 mode. This translates into a 180° phase shift of the A0 contribution between contacts on the two sides of the panel. Since the imaging metric used for reconstruction relies on the absolute value of the coherent sum implied in the matrix multiplication, the system can be used to identify touches on both sides of the panel. The pressure gauge on the right displays a relative pressure estimate ranging from minimal (no pressure applied) to maximal pressure (pressure applied during calibration). This qualitative indicator is calculated by comparing the amplitude of the residual signals sres with that of the database signals obtained by applying a force of 8 N. The pressure gauge therefore does not provide a quantitative measure of the applied force; instead, it serves as a qualitative tool that facilitates the distinction between strong and weak contacts on the surface. To achieve a quantitative measurement of pressure, the signal database would need to include measurements at various pressure levels, as the relationship between the amplitude of sres and the applied pressure is non-linear.
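A minimal sketch of this relative-pressure mapping is given below; the use of a simple amplitude ratio clipped to [0, 1] is an assumption about how the gauge is filled, not the prototype's exact mapping.

import numpy as np

def relative_pressure(touch_intensity, calibration_intensity):
    # Map a touch intensity (e.g., the summed 3x3 pixel amplitude) to a qualitative
    # indicator between 0 (no pressure) and 1 (the 8 N calibration level).
    return float(np.clip(touch_intensity / calibration_intensity, 0.0, 1.0))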

The overall system's refresh rate is 45 frames per second, and the system thus runs smoothly for the configuration detailed herein. When plugged into a computer via a USB port, the system is recognized as a dedicated USB HID interface and can thus be used to manipulate the mouse and generate click events. As described in the literature (Quaegebeur et al., 2016), the interaction between ultrasound waves and objects depends on the local contact impedance. Hence, the technology works with different types of contacts (hands with gloves, contacts with pencils or other objects), as demonstrated in the supplementary material. The technology can be adapted for larger surfaces; however, adjustments may be necessary for the localization and detection algorithms. Specifically, maintaining a 2 mm spatial resolution would necessitate an increase in the number of pixels as the size of the instrumented surface expands. Consequently, the ill-posed nature of the linear equation system would demand the development of new regularization methods to mitigate the sensitivity of matrix inversion to noise.

This paper introduces a cost-effective solution that brings real-time multi-touch capabilities to future electronic devices using ultrasound Lamb waves. The technology is based on the comparison of real-time measured radio frequency signals from four transducers with a pre-recorded signal baseline for imaging. A recursive pseudo-inverse algorithm formulated in the frequency domain is presented to identify the number of touches and to precisely localize the coordinates of each touch. This algorithm is more robust to noise and reverberation than the standard GCC-PHAT source localization algorithm and outperforms simple inversion using the pseudo-inverse at low signal-to-noise ratios and in the presence of multiple simultaneous contacts. It is demonstrated that the prototype developed allows real-time imaging of up to three simultaneous touches and is robust over time. Moreover, the technology is pressure sensitive and allows imaging edge contacts on the instrumented panel. The device outputs the touch coordinates through a USB port using the USB HID interface and can be used with a computer to control the mouse and perform actions such as clicking, zooming, and pinching. For future applications, an embedded system with a graphics processing unit could be used to instrument large structures with millions of pixels for real-time detection.

See the supplementary material for visualization of the real-time performances of the prototype in terms of multi-touch capability, localization, and pressure measurement.

The authors have no conflicts of interest to disclose.

The data that support the findings of this study are available from the corresponding author upon reasonable request.

1. Adler, R., and Desmares, P. J. (1987). "An economical touch panel using SAW absorption," IEEE Trans. Ultrason. Ferroelectr. Freq. Control 34, 195–201.
2. Bertero, M., Boccacci, P., and De Mol, C. (2021). Introduction to Inverse Problems in Imaging (CRC Press, Boca Raton, FL).
3. Bilodeau, M., Amyot, F.-A., Masson, P., and Quaegebeur, N. (2023). "Real-time ultrasound phase imaging," Ultrasonics 134, 107086.
4. Dai, J., and Chung, C. K. R. (2014). "Touchscreen everywhere: On transferring a normal planar surface to a touch-sensitive display," IEEE Trans. Cybern. 44, 1383–1396.
5. Firouzi, K., Nikoozadeh, A., Carver, T. E., and Khuri-Yakub, B. P. T. (2016). "Lamb wave multitouch ultrasonic touchscreen," IEEE Trans. Ultrason. Ferroelectr. Freq. Control 63, 2174–2186.
6. Högbom, J. (1974). "Aperture synthesis with a non-regular distribution of interferometer baselines," Astron. Astrophys. Suppl. 15, 417; available at http://tinyurl.com/pfm7m2n.
7. Ing, R. K., Quieffin, N., Catheline, S., and Fink, M. (2005). "In solid localization of finger impacts using acoustic time-reversal process," Appl. Phys. Lett. 87, 204104.
8. Kang, K. C., Kim, Y. H., Pyun, J. Y., and Park, K. K. (2022). "Feasibility study on multi-touch ultrasound large-panel touchscreen using guided Lamb waves," Measurement 190, 110755.
9. Knapp, C., and Carter, G. (1976). "The generalized correlation method for estimation of time delay," IEEE Trans. Acoust. Speech Signal Process. 24(4), 320–327.
10. Liu, Y., Nikolovski, J. P., Mechbal, N., Hafez, M., and Vergé, M. (2010). "An acoustic multi-touch sensing method using amplitude disturbed ultrasonic wave diffraction patterns," Sens. Actuators A Phys. 162, 394–399.
11. Quaegebeur, N., and Masson, P. (2012). "Correlation-based imaging technique using ultrasonic transmit–receive array for non-destructive evaluation," Ultrasonics 52(8), 1056–1064.
12. Quaegebeur, N., Masson, P., Beaudet, N., and Sarret, P. (2016). "Touchscreen surface based on interaction of ultrasonic guided waves with a contact impedance," IEEE Sens. J. 16, 3564–3571.
13. Reis, S., Correia, V., Martins, M., Barbosa, G., Sousa, R. M., Minas, G., Lanceros-Mendez, S., and Rocha, J. G. (2010). "Touchscreen based on acoustic pulse recognition with piezoelectric polymer sensors," in 2010 IEEE International Symposium on Industrial Electronics, pp. 516–520.
14. Sijtsma, P. (2007). "CLEAN based on spatial source coherence," Int. J. Aeroacoust. 6(4), 357–374.
15. Walker, G. (2012). "A review of technologies for sensing contact location on the surface of a display," J. Soc. Inf. Disp. 20, 413–440.
