Advanced instrumentation and versatile setups are needed for understanding light interaction with biological targets. Such instruments include (1) microscopes and 3D scanners for detailed spatial analysis, (2) spectral instruments for deducing molecular composition, (3) polarimeters for assessing structural properties, and (4) goniometers for probing the scattering phase function of, e.g., tissue slabs. While a large selection of commercial biophotonic instruments and laboratory equipment is available, these instruments are often bulky and expensive. Therefore, they remain inaccessible for secondary education, hobbyists, and research groups in low-income countries. This lack of equipment impedes hands-on proficiency with basic biophotonic principles and the ability to solve local problems with applied physics. We have designed, prototyped, and evaluated the low-cost Biophotonics, Imaging, Optical, Spectral, Polarimetric, Angular, and Compact Equipment (BIOSPACE) for high-quality quantitative analysis. BIOSPACE uses multiplexed light-emitting diodes with emission wavelengths from the ultraviolet to the near-infrared, captured by a synchronized camera. The angles of the light source, the target, and the polarization filters are automated by low-cost mechanics and a microcomputer. This enables multi-dimensional scatter analysis of centimeter-sized biological targets. We present the construction, calibration, and evaluation of BIOSPACE. The diverse functions of BIOSPACE include small-animal spectral imaging, measuring the nanometer thickness of a bark-beetle wing, acquiring the scattering phase function of a blood smear and estimating the anisotropic scattering and extinction coefficients, and contrasting muscle fibers using polarization. We provide blueprints, a component list, and software for replication by enthusiasts and educators to simplify the hands-on investigation of fundamental optical properties in biological samples.

Biophotonics is the discipline of using light for the diagnosis and treatment of biological tissue.1–3 The field of biophotonics has developed rapidly since the emergence of optoelectronics half a century ago, in particular thanks to devices such as the light-emitting diode (LED), semiconductor lasers, and image sensors. Over the decades, a myriad of approaches, specialized instruments, and applications have emerged within, e.g., medicine,4 environmental monitoring,5,6 and organic products.7–10 In the following sections, we discuss how biophotonic approaches address these domains and how they are covered by our Biophotonics, Imaging, Optical, Spectral, Polarimetric, Angular, and Compact Equipment (BIOSPACE).

Biological insights have been gained by analyzing the spatial features in tissue through optical imaging11 on macroscopic and microscopic scales.12,13 The latest advances allow the study of subcellular structures.14 Imaging concepts have also been extended to three dimensions with techniques such as photoacoustics,15,16 optical coherence tomography,17 structured illumination and light-sheet microscopy,18 diffuse optical tomography,19 and optical projection tomography.20,21

Although molecules are too small to be resolved spatially, spectroscopy provides the means to quantify the chemical composition of tissues; such techniques are known as molecular imaging or tissue spectroscopy.22,23 Strategies to acquire multispectral images include red-green-blue (RGB) imaging, filter wheels, and tunable liquid crystal filters,24 which all collect a limited number of spectral bands. In contrast, hyperspectral imaging by push-broom scanning8,25 or interferometry26–28 captures hundreds of bands. Multispectral imaging can also be accomplished by controlling and multiplexing the light source.29–33 Such multiplexing can be implemented inexpensively and uses the produced light efficiently.

Polarimetry is a viable technique for acquiring molecular contrast that also has the benefit of assessing microstructural features. In polarimetry, coherently scattered photons that retain their initial propagation direction, phase, and polarization state can be distinguished from incoherent photons resulting from multiple scattering and photon migration in the tissue.34,35 Polarimetric imaging can be accomplished by arranging a linear polarizer on the illumination and a rotating analyzer on the collecting optics. Approaches for snapshot polarization imaging have also been developed.36,37

The most common microscopy and imaging geometries comprise reflectance and transmittance. In effect, these geometries assess different light scattering angles. The imaging contrast can differ significantly between geometries depending on the type of sample.38 This concept can be extended to include modes such as total transmittance, ballistic transmittance,39 total diffuse reflectance, and specular reflectance. In particular, dark-field forward scattering can improve contrast in microscopy.40 In analogy to extending a handful of bands in multispectral imaging to hundreds of bands in hyperspectral imaging, the angular scattering modes can be extended from a few modes such as reflectance, transmittance, and dark-field31 to cover a continuous angular range from zero to π. In the spectral domain, as in the angular domain, this dramatically increases the gathered information and the conclusions that one can draw. For angular analysis, such an extended measurement is called goniometry.41,42 Although uncommon, goniometry can also be implemented in imaging mode.43,44

As understood, there are multiple types of both reflectance and transmittance modes, e.g., co- and de-polarized, and ballistic and diffuse. In general, the quantities reported from such studies depend on the specific measurement geometry, e.g., the numerical apertures of the illumination and the objective. Therefore, results are challenging to relate across different studies, laboratories, and instruments. The proposed solution is to convert such measurands into quantitative optical properties. Optical properties include the absorption, scattering, and de-polarization coefficients (µa, µs, and µLP, respectively) in units of cm−1. In addition, the dimensionless refractive index n and the scattering anisotropy factor g govern the radiative transport. The use of optical properties permits the inter-comparison of values between research groups.45
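For orientation, the standard radiative-transport relations connecting these quantities (textbook definitions, not specific to BIOSPACE) can be summarized as

```latex
\mu_t = \mu_a + \mu_s \quad [\mathrm{cm}^{-1}]
  \qquad \text{(total attenuation coefficient)}
\\
I(L) = I_0\, e^{-\mu_t L}
  \qquad \text{(Beer--Lambert decay of the ballistic, unscattered light over a path } L)
\\
g = \langle \cos\theta \rangle
  \qquad \text{(anisotropy factor: mean cosine of the scattering angle)}
\\
\mu_s' = \mu_s (1 - g)
  \qquad \text{(reduced scattering coefficient used in diffusion theory)}
```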

The task of disentangling optical properties from measurands is not trivial,46 and it constitutes a long-standing fundamental problem in biophotonics. For example, reflectance and transmittance from a blood sample are governed by µa, µs, n, and g at any wavelength. Furthermore, n is coupled to µa via the Kramers–Kronig relations, and in turn, µs and g are determined by deviations of n from the surrounding medium.47

In general, the solution to the disentangling challenge is to acquire more measurands than the number of optical properties varying within the study. An early approach measures multiple angular scatter lobes by using integrating spheres in total transmittance, total reflectance, ballistic transmittance, and specular reflectance modes.39 A modern approach acquires multiple measurands as a function of time-of-flight using mode-locked lasers and single-photon counting.48–50 Similar results can also be accomplished by frequency sweeps.51 For steady-state solutions, the number of measurands can instead be increased in the spatial domain by multiple injection or detection points, which yield a plurality of interrogation path lengths.9,52,53 Such concepts constitute the cornerstones in diffuse optical tomography.54 

While the ideas, instruments, and insights of modern biophotonics are fascinating, the aforementioned implementations are inaccessible for research groups in lower-income regions, secondary education students, and hobbyists. This limits hands-on learning opportunities for light–tissue interaction at early stages and prevents applied research teams from tackling local issues, for instance, in the tropics. Several initiatives for realistic or low-cost instrumentation have emerged. Such concepts include an ultra-low-cost microscope based on a cardboard origami structure;55 accessible optical instruments such as 3D-printed holographic microscopes,56 smartphone-based spectrometers,57,58 and LEGO®-based fluorometers;59 and advanced instruments such as Raman spectrometers,60 Brewster angle microscopes,61 and Michelson interferometers.62 In particular, recent progress in widely available 3D printers has facilitated projects in open-source hardware,63,64 biophotonic objects impossible to produce conventionally,65 and general-purpose toolboxes for 3D-printed optomechanical components.66,67 While these advances provide low-cost and more adaptable photonic instrumentation, they are, in general, considered inferior in quality compared to milled counterparts or commercial photonic laboratory supplies.

In contrast, here we report a low-cost biophotonic platform with a complexity and modularity beyond what is available commercially. Furthermore, the multiaxis mechanisms would be highly challenging to realize with conventional off-the-shelf photonic laboratory components. The structure of BIOSPACE is made of LEGO-technic and 3D-printed adaptors for optical elements, making it modular and adaptive for studies of diverse samples. The instrument captures multispectral images with polarization and light scatter angle information of a biological target rotated around two axes. This yields measurands of high dimensionality, allowing disentanglement of quantitative optical properties.

The biophotonic instrument BIOSPACE is inspired by simpler earlier work31,44 but now covers more measurement domains. The structure is made with LEGO-technic, which significantly reduces the cost. The instrument consists of three modules: an illuminator, a target rotation stage, and a receiver unit. Four servomotors control the angles of the parts: the illuminator can rotate around the target to achieve a goniometric scan; the target can rotate around two axes to project it from all viewing angles; and a linear polarizer on the receiver unit can be rotated to yield polarimetric information. In Fig. 1, the optical arrangement of BIOSPACE’s illuminator and receiver unit are illustrated, along with the dimensions of the placement of components. The schematic in Fig. 1 is drawn for a goniometric scatter angle of zero (ballistic transmittance mode). The rotation axes of the motors and the data flows are also indicated.

FIG. 1.

BIOSPACE’s components, optical arrangement, and diagram of the data flow of a measurement. The orange M-circles indicate rotating motors, while the orange arrows imply the control signals for the motors. The black arrows indicate other data communication such as image data. The green boxes are circuit boards. The red cone represents the light produced by the illuminator, and the blue cone represents the field of view (FoV) of the camera. The placement of optical components is indicated in mm on the z axis.


The optical arrangement of BIOSPACE is illustrated in Fig. 1 for ballistic transmittance, and the specific optical components are listed in supplementary material Table S1. The illuminator is seen as an exploded-view drawing in Fig. 2(a), and its mounting on BIOSPACE is shown in Fig. 2(b). It has eight LEDs with individual emission wavelengths ranging from 365 to 940 nm, mounted on a multiplexing printed circuit board (PCB). The LEDs are lit sequentially and synchronized with a strobe input. The LED light is injected into a white diffuse cavity and guided out through a homogenizing light pipe. The light is polarized by a linear polarization filter and collimated by a lens before it impinges on the biological target. The scattered light is collected by an achromatic objective lens. A motorized polarization analyzer can select, e.g., co- or de-polarized light. After this, the scattered light is imaged onto a camera.

FIG. 2.

(a) Exploded view of the illuminator. The components of the illuminator include an LED-multiplexer PCB, a white diffusive cavity, and a connector piece compatible with LEGO-technic. The arrows illustrate the light paths of the different LEDs of the illuminator. (b) Photograph of the mechanical construction of BIOSPACE. The placement of different parts is indicated with arrows. The motorized degrees of freedom include the scatter, polarization, roll, and yaw angles.


The BIOSPACE mechanical construction is made of LEGO-technic plastic parts, as shown in Fig. 2(b). The necessary parts can be acquired from a single LEGO-technic kit (Bucket Wheel Excavator, LEGO, Denmark). The target rotation stage consists of three motorized axes nested inside each other, where the inner axes control the roll and yaw angles of the sample and the outer axis supports the illuminator and controls the scatter angle. The roll and yaw rotation axes are tilted 45° with respect to each other, allowing full goniometric scans without obscuration. The rotation axes are shown in Fig. 2(b). These axes are controlled by servo motors (EV3 Medium Servo Motors, LEGO, Denmark). The size of BIOSPACE is 300 × 500 × 300 mm3, and the weight is 1 kg. Detailed assembly instructions for BIOSPACE are provided as supplementary material (S2).

The 3D-printed parts connect the illuminator, the optical components, and the camera to the LEGO structure. Individual lenses are secured with O-rings. The 3D-printing material used for connecting optical components and shielding for glare and stray light is black ABS (acrylonitrile butadiene styrene) plastic. The LEGO-optical adaptor CAD (computer-aided design) files are available as supplementary material (S3).

The white diffuse cavity, an integral part of the illuminator, is 3D printed with white ABS filament with 100% infill and coated with barium sulfate (BaSO4) applied with a brush. Common filament types for 3D printing were evaluated [see Fig. 3(a)]. The ABS material was selected as it exhibits maximal scattering, minimal absorption, and no fluorescence over the relevant wavelength range (365–940 nm). Each of the evaluated 3D-printed materials suffered from low reflectance in the UV region, and thus, the highly reflective BaSO4 coating was applied.

FIG. 3.

(a) Diffuse reflectance from candidate materials for manufacturing a white diffuse cavity; a high diffuse reflectance from 365 to 940 nm is desired. A strong UV and violet absorption is clearly seen for all 3D-printing filaments (PLA, ABS, and nylon). PTFE, polystyrene foam, and BaSO4 all show excellent reflectance over the relevant range, and their reflectance exceeding 100% is explained by their scattering coefficients being greater than that of the Spectralon standard. A fluorescence peak can be observed at 470 nm for PLA, which makes it unsuitable for LED spectroscopy. The downward slope from 500 nm onward is explained by photon escape as a consequence of the measurement geometry rather than absorption in the material. (b) PToFS histogram of the times 630 nm photons spend inside the cavity before leaving it; from this, the material with the highest scattering can be identified. The path length of the single-reflection peak is 7.6 cm, which corresponds to the shortest path the light can take within the cavity, i.e., a single reflection. The mean path length for all light leaving the cavity is 27 cm.


If milling or hot-wire subtractive methods were viable options, materials such as PTFE (polytetrafluoroethylene) or polystyrene foam would be well suited due to their broad, high reflectance. The evaluation was performed using spectroscopy with a bifurcated fiber connected to a tungsten–deuterium lamp (SLS204, Thorlabs, USA) and a compact spectrometer (USB4000, OceanOptics, USA). A Spectralon™ white standard (Labsphere, Inc., North Sutton, USA) with a specified 99% reflectance was used as a reference.

Further investigation of the materials' quantitative scattering coefficients was made by photon time-of-flight spectroscopy (PToFS).48 For ABS at wavelengths of 630, 810, and 940 nm, the absorption coefficients were found to be µa = 0.014, 0.0090, and 0.015 cm−1 and the reduced scattering coefficients µs′ = 46, 47, and 48 cm−1, respectively. The mean photon path length within the coated cavity (3 cm diameter), measured with a time-correlated single-photon counting (TCSPC) instrument, was found to be 27 cm at 630 nm [see Fig. 3(b)]. The TCSPC setup48 measures the time delay of light traveling between two fiber probes. Slabs of 5, 10, and 15 mm thickness of the different materials were placed in the setup. The absorption and reduced scattering coefficients of the different materials were found by solving the inverse diffusion equation for the slabs.65

BIOSPACE uses several components for performing and controlling a measurement, including parts for data acquisition, motor control, and user input/output. BIOSPACE is controlled by an open-source single-board computer (Raspberry Pi 4, Raspberry Pi Foundation, Cambridge, UK). The computer runs the Linux operating system and has ports for connecting a display, mouse, and keyboard for user input and output, as well as a USB3 port for connecting an external hard drive for storing the acquired measurement data. The computer communicates with the motors through an add-on board (BrickPi3, Dexter Industries, Washington DC, United States). The add-on board has four motor ports for setting and reading the positions of the servo motors. The measurement images are captured with an industrial camera (acA1920-155um, Basler AG, Ahrensburg, Germany) with a silicon CMOS chip. It records 2-megapixel images with a 12-bit dynamic range, has a global shutter, and can be binned up to four times on both rows and columns. Important for our implementation is its strobe output port, which outputs a high signal while a camera exposure is active. Finally, for multiplexing the light, a custom-designed and manufactured LED-multiplexing printed circuit board (PCB) is used (see Fig. 4).

FIG. 4.

Circuit diagram of the LED multiplexing PCB. The LEDs are fed with constant-current sources of 100 mA (500 mA for the 365 and 940 nm LEDs). Each LED is lit sequentially, synchronized with the camera strobe connected to the coaxial input.


The multiplexing PCB has a coaxial (BNC) input, which is connected to the strobe output of the camera. The strobe signal is amplified and connected to an inverted enable port of a decade counter. A transistor array then drives one of the eight LEDs sequentially. There is also one dark time slot used for background subtraction. All LEDs are driven with currents of 100 mA, except for the 365 and 940 nm LEDs, which are driven at 500 mA. This equalizes the spectral response of the silicon image chip and allows all bands to be captured within the dynamic range using the same exposure time.31

BIOSPACE has an open-source application that is available for download.68 The program is built in the free-to-use programming language, Python (Python Software Foundation, Delaware, United States), and uses various open-source software libraries. The graphical user interface is built with a library called Tkinter (Python Software Foundation, Delaware, United States).69 Tkinter is the standard Python binding of an open-source, cross-platform widget toolkit called Tk. Camera communication is implemented with a library called PyPylon.70 PyPylon is the official Python wrapper for the Basler Pylon Camera Software Suite (Basler AG, Ahrensburg, Germany). For controlling the servo motors, the BrickPi3 (Dexter Industries, Washington DC, United States) repository is used.71 

The application controls the standard camera parameters such as exposure time and gain. A calibration view is presented before a measurement is started with the press of a button. A complete measurement is then performed according to a spreadsheet measurement protocol file, where the relevant scatter, roll, yaw, and polarization angles are listed. These angles can then be associated with every multispectral image. The motors of BIOSPACE are moved to the listed positions, and a non-compressed .tiff image is captured for each spectral band. Each image is background-subtracted with an ambient image, ensuring that measurements can be run regardless of the light in the room. However, minimizing the ambient light is recommended to maximize the dynamic range of the measurement.
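As an illustration of the measurement flow described above, the following is a minimal Python sketch of such an acquisition loop. The helpers move_motors() and grab_frame(), the protocol columns, and the LED wavelength list are assumptions made for this sketch; the released BIOSPACE application on GitHub is the authoritative implementation.

```python
# Minimal sketch of the acquisition loop described above (illustrative only).
import csv
import os
import numpy as np
import tifffile  # third-party library for writing non-compressed .tiff files

BANDS_NM = [365, 405, 430, 490, 525, 630, 810, 940]  # LED peak wavelengths mentioned in the text

def run_protocol(protocol_csv, move_motors, grab_frame, out_dir="measurement"):
    """Step through a spreadsheet protocol of (scatter, roll, yaw, pol) angles and
    save one ambient-subtracted image per spectral band and motor position."""
    os.makedirs(out_dir, exist_ok=True)
    with open(protocol_csv, newline="") as fh:
        for i, row in enumerate(csv.DictReader(fh)):
            angles = {k: float(row[k]) for k in ("scatter", "roll", "yaw", "pol")}
            move_motors(**angles)                                 # servo motors via the BrickPi3 wrapper
            ambient = grab_frame(band=None).astype(np.int32)      # dark time slot, all LEDs off
            for band in BANDS_NM:
                img = grab_frame(band=band).astype(np.int32)      # camera strobe gates the LED
                img = np.clip(img - ambient, 0, None)             # ambient-light subtraction
                tifffile.imwrite(f"{out_dir}/pos{i:04d}_{band}nm.tiff", img.astype(np.uint16))
```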

Image analysis can be performed in any suitable software since the source images are saved as non-compressed image files. Matlab (Mathworks) was used for the image analysis in this paper, and scripts for importing and calibrating images to a white reference are available for download.68 

A BIOSPACE measurement results in a seven-dimensional intensity tensor, I(x, y, b, scatter, roll, yaw, pol), where x, y, b, scatter, roll, yaw, and pol are integer indices into the vectors x, y, λb, θscatter, θroll, θyaw, and θpol, respectively. The coordinates x and y [mm] are the horizontal and vertical distances from the optical axis on the camera sensor, respectively, and the imaging information is found in these vectors. Spectroscopic information is found in λb [nm], which denotes the wavelength bands of the LEDs. The goniometric information consists of the scatter lobes in θscatter [deg], while the 3D aspect angle information of the sample is contained in θroll and θyaw [deg]. Polarimetry is explored through the polarization angles θpol [deg]. The performance and calibration procedures are described for each of these measurement domains in Secs. III A–III F. The range and resolution in each domain are also specified in Table I.
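To make the tensor layout concrete, a small NumPy sketch of how such a measurement subset could be organized and indexed is shown below. The dimension sizes are examples only; in practice, several dimensions are kept singular, as discussed later in the text.

```python
# Illustrative organization of (a subset of) the seven-dimensional intensity tensor.
import numpy as np

bands_nm = np.array([365, 405, 430, 490, 525, 630, 810, 940])   # LED bands, lambda_b
scatter  = np.arange(-165, 166, 5)        # theta_scatter, 67 goniometric lobes (deg)
roll     = np.array([0.0])                # theta_roll, kept singular here
yaw      = np.array([90.0])               # theta_yaw, kept singular here
pol      = np.array([0.0, 90.0])          # theta_pol: co- and de-polarized analyzer angles

shape = (240, 150, bands_nm.size, scatter.size, roll.size, yaw.size, pol.size)  # binned pixels
I = np.zeros(shape, dtype=np.uint16)      # ~77 MB for this subset

# Example: extract the co-polarized 630 nm image in the ballistic geometry (theta_scatter ~ 0)
img = I[:, :, int(np.flatnonzero(bands_nm == 630)[0]), int(np.argmin(np.abs(scatter))), 0, 0, 0]
```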

TABLE I.

The measurement domains probed by BIOSPACE along with the corresponding examined features. The range and the discretization are provided for each domain. The number of measurands is reported for the relevant domains. The temporal domain could, in theory, provide an infinite number of measurands; however, this is not relevant in the samples studied, which are stationary in time. Similarly, the spatial domain has not been used for increasing the number of measurands but rather to isolate a spatial feature to study. Here, the number of sample viewing angle measurands is defined as the possible lobes in θroll and θyaw. The resolution reported is the framerate in the temporal domain and the average spectral FWHM in the spectral domain.

Subject/domain | Examined feature | Range | Resolution | Measurands
Intensity/dynamic | ⋯ | 0 → 4095 | 12 bit | ⋯
Time/temporal | ⋯ | 1 µs → 10 s | 164 Hz | ⋯
Space/spatial | Morphology, spatial features | 19.8 × 12.5 mm2 | 20 µm | 960 × 600 effective pixels
Photon energy/spectral | Molecular composition | 365 → 940 nm | 25 nm | 8 bands
Propagation angle/goniometric | n, g, µs′ | −165° → 165° | 5° | 67 lobes
Polarization/polarimetric | Surface structure | 0° → 180° | 2° | 4 parameters
Sample viewing angle/angular | Iridescence, morphology | 45° → 135° × 0° → 360° | 2° | 45 yaw × 180 roll lobes

A calibrated reflectance (|θscatter| > 90°) measurement can be accomplished according to

R = Rref · (Isample − Idark) / (Iref − Idark),        (1)

where Isample is the intensity measurement of the sample, Idark is the dark exposure, and Iref is a measurement of a reference standard. Rref is a traceable table value for the diffuse reflectance standard; in our case, Spectralon was used. Specular reflectance (where the illumination is folded into the receiver) can be calibrated with a metallic mirror instead of the diffuse standard used in Eq. (1). Forward scatter (5° < |θscatter| < 90°) can similarly be calibrated by opal diffusers with known Lambertian transmission lobes. Finally, a measurement in a ballistic configuration (|θscatter| < 5°) of a transparent sample can be calibrated by an empty sample holder. Specular and ballistic measurements are generally more intense than diffuse reflectance and forward scatter. Multiple exposures may be stitched together to acquire a high dynamic range (HDR) measurement for multi-scatter-angle or goniometric analysis. The dynamic range of the camera is 12 bit, which sets the full well capacity (FWC) to 2¹² − 1 = 4095 counts. The FWC per second (FWC/s) will be used throughout the paper as the intensity unit of the measurements. The level of the dark exposure is highly dependent on the ambient light. It was measured in a dark room to be 10⁻³ FWC/s, which can be compared to a ballistic measurement of 800 FWC/s and a backscatter measurement of a diffuse white target of 40 FWC/s. These values indicate the appropriate exposure time for a specific measurement configuration, e.g., a diffuse backscatter measurement should have an exposure time of less than 25 ms to avoid saturation. The dynamic resolution is given by the standard deviation of multiple acquisitions of a reference standard; it resulted in a signal-to-noise ratio of 42 dB on average over the spectral bands.
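As a sketch of how Eq. (1) and the HDR stitching described above could be applied in practice (function and variable names are placeholders; the released calibration scripts on GitHub may differ), consider:

```python
# Sketch of the calibration in Eq. (1) and of HDR stitching of multiple exposures.
import numpy as np

def calibrate_reflectance(I_sample, I_dark, I_ref, R_ref=0.99):
    """R = R_ref * (I_sample - I_dark) / (I_ref - I_dark), per pixel and band;
    R_ref is the tabulated reflectance of the diffuse standard (Spectralon, ~99%)."""
    denom = np.clip(I_ref - I_dark, 1e-9, None)   # guard against division by zero
    return R_ref * (I_sample - I_dark) / denom

def stitch_hdr(images, exposures_s, saturation=4095):
    """Combine frames taken at different exposure times into one intensity map in
    counts per second (FWC/s-like units), ignoring saturated pixels."""
    acc = np.zeros(np.shape(images[0]), dtype=float)
    weight = np.zeros_like(acc)
    for img, t in zip(images, exposures_s):
        valid = img < saturation                  # keep only unsaturated pixels
        acc += np.where(valid, img / t, 0.0)
        weight += valid
    return acc / np.clip(weight, 1, None)
```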

The intensity measurement is extended in the spatial domain with the positional information provided by the camera configuration, which maps the object plane, x′, y′ (mm), onto the (11.3 × 7.1 mm2) sensor of the camera according to

(x, y) = M · (x′, y′),        (2)

where M is the magnification of the system at the plane of the sensor. The magnification was calculated to be M = 0.57 by acquiring an image of a target with a known size while knowing the size of the imaging sensor. The image is spatially discretized by the pixel pitch of the camera sensor (5.86 × 5.86 µm2) into the discrete horizontal and vertical pixels (x, y). How the object maps to the sensor is described by the convolution of the light from the object, Iobj(x, y), and the point spread function (PSF) of the system,

I(x, y) = ∬ Iobj(x′, y′) PSF(x − x′, y − y′) dx′ dy′,        (3)

expressed in integral form.72 This assumes that the complete FoV is illuminated.
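A minimal sketch of this forward model, assuming a Gaussian PSF whose width matches the two-pixel sampling reported below (the true PSF of the instrument is not necessarily Gaussian), is:

```python
# Sketch of the imaging model of Eqs. (2) and (3): the object, mapped onto the
# sensor grid by the magnification M, is blurred by the system PSF.
import numpy as np
from scipy.signal import fftconvolve

def gaussian_psf(fwhm_px, size=15):
    """Isotropic Gaussian PSF with the given FWHM, in pixels of the sampled grid."""
    sigma = fwhm_px / 2.355
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    psf = np.exp(-(xx**2 + yy**2) / (2.0 * sigma**2))
    return psf / psf.sum()

def image_of(I_obj, fwhm_px=2.0):
    """I(x, y) = (I_obj * PSF)(x, y), cf. Eq. (3); I_obj is assumed to be the object
    radiance already mapped onto the sensor grid via the magnification M of Eq. (2)."""
    return fftconvolve(I_obj, gaussian_psf(fwhm_px), mode="same")
```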

To evaluate both the spatial resolution and the depth of field (DoF) of BIOSPACE, we imaged a Siemens star target at locations −5 mm < z < +5 mm from the focus plane of the instrument. The resolution was also evaluated for the various spectral bands to find the chromatic aberrations (see Fig. 5). The full width at half maximum (FWHM) of the PSF was calculated for the axial positions, and parabolic functions were fitted to the values. The resolution, depth of field, and focus displacement for each spectral band are seen in Fig. 5. The PSF FWHM represents the resolvable spot size and was measured to be 17–23 µm across the spectral bands. Our magnification and pixel size imply that the smallest resolvable features are sampled by two pixels, indicating a good match between the lens and sensor resolutions according to the Nyquist–Shannon sampling theorem. However, pixel binning could be used to increase the dynamic resolution (signal-to-noise ratio).
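A quick arithmetic check of this sampling argument, using the numbers quoted above:

```latex
% Smallest resolvable object-plane feature projected onto the sensor:
20\ \mu\mathrm{m} \times M = 20\ \mu\mathrm{m} \times 0.57 \approx 11.4\ \mu\mathrm{m}
\approx 2 \times 5.86\ \mu\mathrm{m},
% i.e., about two pixel pitches, consistent with Nyquist-limited sampling.
```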

FIG. 5.

The resolution of the imaging system for the spectral bands of BIOSPACE. The focus displacement is evident when comparing the NIR 940 nm band to the visible bands. The DoF is different for the bands and the focus displacement further reduces the combined DoF of all spectral bands. In the case of resolving a 50 µm feature in all bands, the DoF is 6 mm.


Flat-field calibration was accomplished by evaluating the spatial variation in the intensity of the image I(x, y, b), acquired of the Spectralon standard for each spectral band. The entire image field of 20 × 13 mm2 was illuminated to more than 80% of the highest recorded signal (see Fig. 6).

FIG. 6.

The spatial flatness of image I(x, y, b). (a) Horizontal direction. (b) Vertical direction. The signal is normalized to the highest recorded signal. All spectral bands have a high signal over the whole image; the UV 365 nm band sticks out with a slightly lower signal at some vertical positions, possibly due to contamination of the homogenizing hex rod or the reflectance standard.


The spectral intensity measurement is determined by

I(b) = ∫ Eλb(λ) Iobj(λ) S(λ) dλ,        (4)

where Eλb(λ) is the emitted light in the specific spectral band, Iobj(λ) is the intensity of the reflected light from the object, and S(λ) is the spectral sensitivity of the camera. White intensity calibration is accomplished according to Eq. (1). The resulting spectral shapes of Eλb(λ) and S(λ) were measured with the compact spectrometer (see Fig. 7), assuming a spectrometer sensitivity similar to that of the camera sensor.
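A sketch of the band integration in Eq. (4), assuming the LED emission, object spectrum, and camera sensitivity are available on a common wavelength grid (variable names are placeholders):

```python
# Sketch of the band integration in Eq. (4).
import numpy as np

def band_signal(wavelength_nm, E_led, I_obj, S_cam):
    """I(b) = integral of E_led(lambda) * I_obj(lambda) * S_cam(lambda) d(lambda)."""
    return np.trapz(E_led * I_obj * S_cam, wavelength_nm)

# Example: a spectrally flat (white) object simply integrates the instrument response,
# I_white = band_signal(wl, E_led, np.ones_like(wl), S_cam)
```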

FIG. 7.

Spectral measurement of the reference standard. The dots indicate the recorded intensity for each LED and are positioned at the peak emission wavelength according to their datasheets. The spectral shape, Eλb (λ), of the LEDs is recorded by the compact spectrometer. There is a minor overlap of the 405 and 430 nm as well as the 490 and 525 nm LEDs, but considerably less than for human color vision bands. The FWHM of the measured spectral bands ranged from 9 to 59 nm with an average of 25 nm. The bands are slightly red-shifted; however, this was deduced to be a spectrometer misalignment.


The goniometric measurement I(θscatter) is determined by the scatter angle between the optical axes of the illuminator and the receiver module (set by a LEGO motor). The accuracy and repeatability over a complete scan from −165° to 165° were measured to deviate by up to 2°. Potential backlash in the gears is avoided by keeping the same rotation direction throughout a measurement. The goniometric resolution of an I(θscatter) measurement is limited by the angular width of the illuminator light cone convolved with the received light cone (i.e., the goniometric instrument function). The FWHM of the goniometric resolution can be measured by scanning the scatter angle in the vicinity of the ballistic mode (θscatter ≈ 0°); a 2D instrument scatter lobe can also be measured in side-scatter mode (θscatter ≈ 90°) with a flat mirror in the sample holder, scanning both θroll and θyaw. We estimated the goniometric resolution to be 5°, indicating that the hysteresis of the LEGO gears does not limit the angular resolution. The scatter angles range over −165° < θscatter < 165°, yielding up to 67 resolvable scatter lobes.

LEGO motors and gears also control the two-axis rotation of the sample with similar precision. The two axes are 45° inclined with respect to each other to allow non-obstructed goniometric scans [see Fig. 2(b) inset]. This inclination provides a 2D observational viewing angle of the sample of 45° < θyaw < 135° and 0° < θroll < 360°. The number of unique combinations of target viewing angles depends on the angular step size for the measurement. For example, 5° steps for θroll and θyaw result in 1296 combinations of viewing angles. This is more than an adequate number to perform 3D reconstructions of, e.g., entomological museum collections.21,73

The polarizer is at a static angle, linearly polarizing the light illuminating the object, while the analyzer in front of the camera rotates to probe specific polarization angles of the scattered light. Independent of the precision of the rotation stages, the polarization information can always be reduced to four variables (the Stokes parameters).34
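As an illustration, the linear Stokes parameters and the degree of linear polarization can be reduced from analyzer readings at four angles, as sketched below; note that the circular component S3 would additionally require a retarder, which is not part of the arrangement described here, and the exact reduction used by the BIOSPACE software may differ.

```python
# Sketch: reduction of analyzer intensities (0, 45, 90, 135 deg) to the linear
# Stokes parameters and the degree of linear polarization (DoLP).
import numpy as np

def linear_stokes(I0, I45, I90, I135):
    S0 = 0.5 * (I0 + I45 + I90 + I135)     # total intensity (average of the two pairs)
    S1 = I0 - I90                          # 0 deg vs 90 deg preference
    S2 = I45 - I135                        # +45 deg vs -45 deg preference
    dolp = np.sqrt(S1**2 + S2**2) / np.clip(S0, 1e-12, None)
    return S0, S1, S2, dolp
```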

The combined number of measurands of a complete BIOSPACE measurement is the product of the number of measurands in each domain, which results in over 10¹³ 12-bit intensity values in the tensor I(x, y, b, scatter, roll, yaw, pol) described in Sec. III (see Table I). This corresponds to 10⁷ unique 2-megapixel images. However, a complete measurement of all domains is typically unfeasible and unnecessary (e.g., at 10 ms exposure, it would take over 48 h and consume over 34 terabytes of storage). The vectors x, y, and λb are generally complete in each measurement since the instrument captures multispectral images by default. However, the θscatter, θroll, θyaw, and θpol vectors can range in length from singular to the maximum number of measurands stated in Table I. Therefore, experiments should be designed cleverly to investigate the relevant indices of I. In practice, only a small subset of the full capabilities of BIOSPACE is used for each experiment. What this subset contains varies with the study, and typically, some dimensions of I can be singular. The huge number of possible acquirable images demonstrates the great versatility of BIOSPACE as a platform for pursuing unique experiments in research and education.

There are several design choices where trade-offs influence the resolution in multiple dimensions: typically, when one is increased, another is decreased. Examples include spatial binning, where spatial resolution is sacrificed for a higher dynamic signal-to-noise ratio, and the F-number of the lens. A low F-number yields a high spatial resolution and a small PSF but, on the other hand, a short DoF. Furthermore, as the F-number affects the size of the collected light cone, a low F-number also degrades the goniometric resolution, which is given by the convolution of the emitted and collected light cones. The spectral resolution is limited by the number of LEDs used, their availability at relevant peak wavelengths, and their spectral sharpness. Adding more LEDs would result in a more complex and bulky design and would lengthen the measurement durations.

For in situ experiments in entomology and monitoring of insect biodiversity, the gold standard is manually emptied traps, often prepared with bait to attract insects.74,75 Novel techniques such as lidar and hyperspectral imaging have emerged as complementing alternative methods for characterizing individual insect species, diversity, and behavior.6,76,77 Automated entomological lidar measurements have the potential to improve the temporal resolution of observations while reducing the need for labor-intensive trapping. However, it is not a simple task to classify insect species using a lidar signal. A library of target properties is needed for the classification. Such properties could include the target’s optical cross-section, diffuse reflectance spectrum, and the degree of polarization of the body and the wings. Note that such classification is not a biological classification.

BIOSPACE is capable of acquiring all of the aforementioned optical properties of insects. Although the optical configuration of BIOSPACE differs from lidars in terms of exposure time, spectral width, and the collimation of the lasers typically used, the measurements acquired with BIOSPACE are expected to be comparable with lidar. Insects are smaller than the laser pulse lengths and are unresolved by the detector bandwidths encountered in lidar; therefore, steady-state spectroscopy suffices for describing the light interaction. Furthermore, the spectrally dull features in biological tissues, contributed by, e.g., melanin, can be sufficiently resolved by BIOSPACE. The spectrally sharpest details from insects derive from their wing interference patterns, thoroughly addressed below. An advantage of BIOSPACE is that spectral shapes arising from absorption and interference can be distinguished since the structural colors disappear with cross-polarization and shift with the scatter angle θscatter. Finally, the sharpest angular details arise from the wings, whose surface-normal distributions are comparable with the light cones employed in BIOSPACE.

The captured diffuse reflectance spectrum of a hoverfly (Eupeodes corollae) is shown in Fig. 8. Different parts of the body were compared in co- and de-polarized light. The sample was rotated, demonstrating the change in the optical cross-section of the insect depending on its heading direction [see Fig. 8(d)], mimicking the variation in the lidar signature of a specific insect.

FIG. 8.

(a) and (c) Co- and de-polarized true-color images of Eupeodes corollae (a species of hoverfly) acquired by BIOSPACE, respectively. Four cropped-out regions are indicated: A. dark stripe of the body; B. bright stripe of the body; C. eye; D. full-body signal from the insect. (b) Spectral responses from the different regions of the insect in both co- and de-polarized light. The de-polarized signals are noted to be weaker for all regions as expected. (d) The optical cross-section in mm2 from 360° viewing angles of the full body of the insect, in the standard lidar wavelengths of 810 and 940 nm.


The wing of a European spruce bark beetle, Ips typographus, was studied in a specular configuration, which can occur in lidar observations.76 The measurements were compared to an analytical thin-film equation78,79 sampled with the spectral shape of the LED outputs according to Eq. (4). The measured thickness of the wing was around 0.5 µm and was found to decrease toward the posterior end of the wing, as expected.77 Although the thickness values were found with moderate coefficient of determination (R²) values of over 0.6, this is an example where an ex vivo target characterization study can simplify the classification of in vivo lidar measurements (Fig. 9).
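To indicate how such a fit can be set up, the sketch below uses a simplified thin-film model under stated assumptions: a free-standing film in air, near-normal incidence, a constant refractive index (n = 1.56 for chitin is an assumption, not a value from this paper), evaluation only at the LED peak wavelengths, and synthetic data. The published analysis additionally weights the model by the measured LED spectra according to Eq. (4).

```python
# Sketch of a thin-film interference fit for an approximate wing thickness.
import numpy as np
from scipy.optimize import curve_fit

N_FILM = 1.56   # assumed refractive index of the wing membrane

def film_reflectance(lam_nm, d_nm):
    """Airy reflectance of a free-standing film of thickness d_nm at wavelength lam_nm."""
    r = (1.0 - N_FILM) / (1.0 + N_FILM)            # Fresnel coefficient, air -> film
    delta = 4.0 * np.pi * N_FILM * d_nm / lam_nm   # round-trip phase at normal incidence
    num = np.abs(r * (1.0 - np.exp(1j * delta)))**2
    den = np.abs(1.0 - r**2 * np.exp(1j * delta))**2
    return num / den

lam = np.array([365.0, 405.0, 430.0, 490.0, 525.0, 630.0, 810.0, 940.0])     # LED peaks (nm)
measured = film_reflectance(lam, 500.0) + 0.005 * np.random.randn(lam.size)  # synthetic example
popt, _ = curve_fit(film_reflectance, lam, measured, p0=[400.0])
print(f"fitted thickness: {popt[0]:.0f} nm")       # ~500 nm, cf. the ~0.5 um reported above
```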

FIG. 9.

(inset) A false-color image of the wing of a bark beetle, acquired by BIOSPACE. The crop regions A, B, and C indicate regions further away from the posterior end of the wing. The RGB color bands are represented by 940, 810, and 630 nm. (a)–(c) Reflectance measured on the specular alignment of a wing of a bark beetle on regions A, B, and C where increasing thickness is expected. The reflectance values are compared to a white diffuse reference. Fits of the thin film equation are presented for the estimated wing thicknesses for each region.


The study of blood with optical analysis has a long and fruitful history. One technique is spectral analysis, where, e.g., the different responses of oxygenated and deoxygenated hemoglobin have been used to determine blood oxygenation in both research and clinical settings.47,80,81 Another technique is the visual or automated inspection of red blood cells (RBCs). RBCs are 7 µm discs; healthy cells are donut-shaped but can be inflated due to osmotic pressure.82 Morphological changes can occur due to diseases such as sickle cell anemia,83 and significant changes in scattering and absorption can occur after an infection by the malaria parasite (Plasmodium falciparum).43 The gold standard for blood analysis is typically to stain the blood and have a trained professional perform a microscopic investigation.84 Preliminary diagnoses without a pathologist have been explored but are not implemented clinically on a large scale. Such methods include, e.g., hyperspectral imaging, multivariate analysis,40,85 and studying the anisotropy of the scattered light.86 With BIOSPACE, the magnification is insufficient to resolve the shape of individual blood cells. Instead, spectroscopic analysis is feasible, as well as finding the scattering phase function and anisotropy factor of the RBCs using BIOSPACE's goniometric capabilities.

Within the framework of a study approved by the Swedish Ethical Review Authority, we obtained a blood sample from a healthy volunteer, who provided written consent after being informed about the study and the voluntary nature of participation. The unstained blood smear was goniometrically scanned with BIOSPACE to acquire the scattering phase function (see Fig. 10). The blood smear on a microscope slide was placed in the instrument at a 45° angle from the camera. This configuration results in a specular reflection on the glass slide when the light source is at −90°, where the ballistic configuration is defined as 0°. A photograph of the mounted microscope slide and a schematic of the experimental setup are presented in Figs. 10(d) and 10(e), respectively. The blood smear measurement provides the light intensity as a function of both wavelength and scattering angle [see the heat map presented in Fig. 10(a)]. A goniometric measurement has a large dynamic range when scanning an optically thin sample: the transmitted light intensity will typically be close to 100% of the incident light, while the scattered light can have intensities a factor of 10⁴ lower. This was the case for the blood smear measurement, as shown in Figs. 10(a) and 10(b), where the dynamic range of the measurement spans four orders of magnitude. The measurement was conducted with five different exposure times, 50 µs, 300 µs, 12.5 ms, 50 ms, and 200 ms, to resolve the large dynamic range with a sufficient signal-to-noise level.

FIG. 10.

(a) Goniometric measurement of a blood smear presented as a 2D heat map. (b) Horizontal integration of the 2D heat map yields the phase function of the blood smear as intensity versus scattering angle. A Henyey–Greenstein function, multiplied by a cosine projection to compensate for the amount of light hitting the sample and with an added Gaussian accounting for the specular reflection at −90°, is shown for anisotropy values g = 0.89, 0.92, and 0.95. (c) Vertical integration of the 2D heat map yields the spectral information of the different angular regions. Literature values for the extinction and scattering of hemoglobin are included.2 The literature extinction values resemble the specular measurements, and similarly, the literature scattering values resemble the forward scattering region. The plots in the black boxed legend correspond to the left axis, and the ones in the red boxed legend correspond to the right axis. (d) A photograph of the glass slide mounted for measurement in BIOSPACE. (e) A schematic of the different selected angular regions, where the arrows correspond to the light incidence on the target.


The anisotropy factor g was estimated from the goniometric measurement in Fig. 10(b) by fitting a Henyey–Greenstein function to the measured scattering phase function.42 The fit function was multiplied by a cosine projection to compensate for the fraction of light illuminating the sample at different light source angles. A Gaussian function representing the specular reflection of the light source occurring at −90° was also added to the fit. The anisotropy factor g of the blood smear was determined to be 0.92.
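A sketch of such a phase-function fit is given below. The model combines a Henyey–Greenstein lobe, a cosine projection for the 45°-tilted smear, and a Gaussian for the specular reflection near −90°; the parameter names, the form of the projection term, and the start values are illustrative and not the exact fit used for Fig. 10.

```python
# Sketch of the Henyey-Greenstein phase-function fit described above.
import numpy as np
from scipy.optimize import curve_fit

def henyey_greenstein(theta_deg, g):
    """HG phase function as a function of scattering angle (deg) and anisotropy g."""
    mu = np.cos(np.radians(theta_deg))
    return (1.0 - g**2) / (4.0 * np.pi * (1.0 + g**2 - 2.0 * g * mu)**1.5)

def model(theta_deg, g, amp, spec_amp, spec_width):
    projection = np.abs(np.cos(np.radians(theta_deg + 45.0)))   # sample tilted 45 deg to the camera
    specular = spec_amp * np.exp(-0.5 * ((theta_deg + 90.0) / spec_width)**2)
    return amp * henyey_greenstein(theta_deg, g) * projection + specular

# With measured arrays theta (deg) and intensity (FWC/s), the fit would be:
# popt, _ = curve_fit(model, theta, intensity, p0=[0.9, 1.0, 1.0, 5.0])
# g_fit = popt[0]   # ~0.92 for the blood smear reported above
```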

Skeletal muscle tissue is vital for functional anatomy, and many skeletal muscle diseases can be caused by mutations in the muscles' sarcomeric proteins. The sarcomeres have a periodic fibrous structure, making them suitable for polarization analysis,87 which is feasible with BIOSPACE.

An ∼1 mm thick slice of porcine skeletal muscle tissue was mounted in BIOSPACE and analyzed for the total intensity and degree of linear polarization (DoLP) of forward- and backward-scattered light.87 A notable contrast between different muscle tissue regions can be observed in Fig. 11. The images shown are captured from the same region of the muscle tissue. There is a substantial regional difference in the degree to which the light is de-polarized for the different scattering modes, as seen by comparing Figs. 11(a) and 11(b). In Fig. 11(a), a fine-striped pattern of the myosin and actin muscle-fiber arrangement is distinguishable in the co-polarized light, displayed as green in the figure. Previous research has reported that backward-reflected light maintains a higher DoLP along the axis perpendicular to the muscle fiber orientation.88 Thus, BIOSPACE enables polarization analysis to characterize tissue such as skeletal muscle.
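A false-color rendering of this kind can be approximated from only the co- and de-polarized images using the common two-image estimate DoLP ≈ (Ico − Ide)/(Ico + Ide); the sketch below is illustrative, and the exact rendering used for Fig. 11 may differ.

```python
# Sketch of a DoLP false-color overlay from co- and de-polarized images.
import numpy as np

def dolp_false_color(I_co, I_de):
    dolp = (I_co - I_de) / np.clip(I_co + I_de, 1e-9, None)  # 1 = fully co-polarized, 0 = de-polarized
    rgb = np.zeros(I_co.shape + (3,))
    rgb[..., 1] = dolp          # green channel: polarization-maintaining regions
    rgb[..., 0] = 1.0 - dolp    # red channel: de-polarizing regions
    return np.clip(rgb, 0.0, 1.0)
```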

FIG. 11.

A slice of muscle tissue in (a) backward scatter and (b) forward scatter configurations of BIOSPACE. The tissue is illuminated with 630 nm light. The DoLP is shown with green corresponding to co-polarized light and red corresponding to complete de-polarization of the light.


The performance evaluation in this paper is based on the components listed in the bill of materials found in supplementary material Table S1. The modular design of BIOSPACE allows for easy modification with, e.g., another camera or imaging lens if there are other specific requirements on spatial resolution or FoV. BIOSPACE could be of great interest to replicate in contexts including secondary education classroom experiments or hands-on biophotonic research in low-income countries. Three copies of the instrument have been assembled and used in a biophotonics course at Lund University.89 The project is co-funded by the International Science Program (ISP), Uppsala, in collaboration with the African Spectral Imaging Network (AFSIN). Multiple copies of BIOSPACE have been deployed in research groups at the University of Cape Coast, UCC, Ghana; Laboratory of Instrumentation Image and Spectroscopy, National Polytechnic Institute of Yamoussoukro, Ivory Coast; and Universidad Nacional de Ingeneria, Ecuador. The method is also disseminated to physics laboratories in Senegal, Togo, Mali, Burkina Faso, Cameroon, and Kenya.

Information for replicating the setup is available in supplementary files and on the software-sharing site, GitHub.68 The available files include the acquisition/controlling software, PCB etching files, CAD files for cavity and LEGO optics adaptors, and a manual for assembly and operation.

We have presented BIOSPACE, an instrument capable of analyzing the rotational aspects of a 3D target with multiple scatter angles, in different polarization modes, and with spectral sensitivity ranging from 365 to 940 nm. In total, a target can be captured in over 10 million unique images. This instrument enables automated multi-domain studies of a single sample. We have detailed the precision and calibration procedures of BIOSPACE so that this paper can serve as a guide to replicating the instrument for broader use. We have shown the versatility of BIOSPACE in a few biophotonic applications involving the directional dependence of an entomological lidar target, goniometric blood analysis, and polarimetric tissue analysis. The low cost and high versatility of BIOSPACE open up opportunities for research in the developing world and deployment in education, even at the bachelor or high-school level.

In analogy to the double integrating sphere method,39,90 the scatter angle stage yields multiple potential measurement geometries. For example, ballistic transmittance, forward scattering, and reflectance could be used to deduce a sample's optical properties (µa, µs, and g). By orienting the surface normal to bisect the angle between the illuminator and the receiver, the specular reflection can be captured, and the refractive index, n, can be found by identifying the Brewster angle (refractometry). Polarimetric diagnostics is emerging with novel sensors in medicine,91 food inspection,92 and vegetation analysis.93 Our instrument could serve for feasibility studies of such applications across various domains. Envisioned applications also comprise tissue slabs,45,90 food product inspection,9,94 and the study of the angular reflectance (BRDF) of leaves95–97 for diagnostics of health state or quality.

The potential applications extend beyond the examples reported in this study. One application within entomology is small animal imaging and 3D scanning of insects21,73 and building a database of known insect species from museum collections or the wild. Similar attempts are pursued in, e.g., the oVert project,98 where over 20 000 vertebrate species from museum collections are computer-tomography scanned and stored in a digital open-access format. Since there are over a million known insect species, a similar entomological project would need to be massively parallelized to succeed, e.g., by distribution to enthusiasts or high schools worldwide. BIOSPACE is suitable for such a project due to its low cost, use of standard components, and modular design. The extended optical capabilities of BIOSPACE could be of particular interest in such digitization projects since the insects display interesting optical phenomena such as iridescence, structural colors, ultraviolet features, and polarization-dependent reflection.99 

See the supplementary material for information on replicating BIOSPACE. The supplementary material includes the following: S1, a bill of materials; S2, a manual for assembly and operation; and S3, files for 3D printing the custom components needed.

This work was financed by Uforsk Grant No. 2018-04073 from the Swedish Research Council to support realistic instrumentation and capacity building in low-income countries. The Royal Physiographical Society supported pilot studies at the Lund Laser Center (LLC). We thank the LEGO Company and colleagues at Lund University for fruitful discussions. We extend our thanks to Olaf Diegel, David Frantz, Samuel Jansson, David Hill, David Sanned, and Andreas Johansson. We thank the staff at the Laser and Fiber Optics Center (LAFOC) at the University of Cape Coast for hosting a workshop on this topic and the participants from the African Spectral Imaging Network (AFSIN), as well as the guest lecturers Colin Sheppard, Katarina Svanberg, and Sune Svanberg. African workshops and instrument replication have been funded by the Swedish International Development Agency (SIDA) through the International Science Program (ISP, Uppsala) by a grant to AFSIN.

The authors have no conflicts to disclose.

Hampus Månefjord: Conceptualization (equal); Formal analysis (lead); Investigation (lead); Methodology (lead); Software (lead); Visualization (lead); Writing – original draft (lead); Writing – review & editing (lead). Meng Li: Investigation (supporting); Writing – review & editing (supporting). Christian Brackmann: Investigation (supporting); Supervision (supporting); Writing – review & editing (supporting). Nina Reistad: Funding acquisition (equal); Resources (supporting); Writing – review & editing (supporting). Anna Runemark: Writing – original draft (supporting); Writing – review & editing (supporting). Jadranka Rota: Resources (equal); Writing – original draft (supporting); Writing – review & editing (supporting). Benjamin Andersson: Conceptualization (supporting); Funding acquisition (equal); Writing – review & editing (supporting). Jeremie T. Zoueu: Conceptualization (equal); Funding acquisition (equal); Writing – review & editing (supporting). Aboma Merdasa: Conceptualization (equal); Methodology (equal); Project administration (equal); Supervision (supporting); Writing – review & editing (supporting). Mikkel Brydegaard: Conceptualization (equal); Formal analysis (equal); Funding acquisition (equal); Investigation (equal); Methodology (equal); Resources (equal); Supervision (equal); Writing – original draft (equal); Writing – review & editing (equal).

The data that support the findings of this study are available from the corresponding author upon reasonable request.

1. C. Boudoux, Fundamentals of Biomedical Optics: From Light Interactions with Cells to Complex Imaging Systems (Pollux, Montréal, 2018).
2. J. Popp et al., Handbook of Biophotonics, Volume 3: Photonics in Pharmaceutics, Bioanalysis and Environmental Research (John Wiley & Sons, 2012), Vol. 3.
3. S. L. Jacques and B. W. Pogue, "Tutorial on diffuse light transport," J. Biomed. Opt. 13(4), 041302 (2008).
4. V. V. Tuchin, Optical Biomedical Diagnostics (Izdatelstvo Fizikomatematicheskoy Literaturi, Moscow, 2007), Vol. 1, p. 560.
5. W. Turner et al., "Remote sensing for biodiversity science and conservation," Trends Ecol. Evol. 18(6), 306–314 (2003).
6. M. Brydegaard and S. Svanberg, "Photonic monitoring of atmospheric and aquatic fauna," Laser Photonics Rev. 12(12), 1800135 (2018).
7. B. Park and R. Lu, Hyperspectral Imaging Technology in Food and Agriculture (Springer, 2015).
8. J. M. Amigo, H. Babamoradi, and S. Elcoroaristizabal, "Hyperspectral image analysis. A tutorial," Anal. Chim. Acta 896, 34–51 (2015).
9. J. Qin and R. Lu, "Measurement of the optical properties of fruits and vegetables using spatially resolved hyperspectral diffuse reflectance imaging technique," Postharvest Biol. Technol. 49(3), 355–365 (2008).
10. S. Svanberg, "Gas in scattering media absorption spectroscopy–from basic studies to biomedical applications," Laser Photonics Rev. 7(5), 779–796 (2013).
11. J. G. Fujimoto and D. Farkas, Biomedical Optical Imaging (Oxford University Press, 2009).
12. E. M. C. Hillman et al., "In vivo optical imaging and dynamic contrast methods for biomedical research," Philos. Trans. R. Soc., A 369(1955), 4620–4643 (2011).
13. J. Michels and S. N. Gorb, "Detailed three-dimensional visualization of resilin in the exoskeleton of arthropods using confocal laser scanning microscopy," J. Microsc. 245(1), 1–16 (2012).
14. S. W. Hell, "Nobel Lecture: Nanoscopy with freely propagating light," Rev. Mod. Phys. 87(4), 1169 (2015).
15. P. Beard, "Biomedical photoacoustic imaging," Interface Focus 1(4), 602–631 (2011).
16. M. T. Stridh et al., "Photoacoustic imaging of periorbital skin cancer ex vivo: Unique spectral signatures of malignant melanoma, basal, and squamous cell carcinoma," Biomed. Opt. Express 13(1), 410–425 (2022).
17. P. Cimalla et al., "Simultaneous dual-band optical coherence tomography in the spectral domain for high resolution in vivo imaging," Opt. Express 17(22), 19486–19500 (2009).
18. O. E. Olarte et al., "Light-sheet microscopy: A tutorial," Adv. Opt. Photonics 10(1), 111 (2018).
19. H. Jiang, Diffuse Optical Tomography: Principles and Applications (CRC Press, 2018).
20. J. Sharpe, "Optical projection tomography," Annu. Rev. Biomed. Eng. 6, 209–228 (2004).
21. B. Strobel et al., "An automated device for the digitization and 3D modelling of insects, combining extended-depth-of-field and all-side multi-view imaging," ZooKeys 759, 1–27 (2018).
22. R. Richards-Kortum and E. Sevick-Muraca, "Quantitative optical spectroscopy for tissue diagnosis," Annu. Rev. Phys. Chem. 47(1), 555–606 (1996).
23. N. Reistad et al., "Diffuse reflectance spectroscopy of liver tissue," Proc. SPIE 9531, 95314E (2015).
24. N. Gat, "Imaging spectroscopy using tunable filters: A review," Proc. SPIE 4056, 50–64 (2000).
25. L. L. Randeberg and J. Hernandez-Palacios, "Hyperspectral imaging of bruises in the SWIR spectral region," Proc. SPIE 8207, 82070N (2012).
26. E. M. Georgieva, W. Huang, and W. S. Heaps, "A new remote sensing filter radiometer employing a Fabry-Perot etalon and a CCD camera for column measurements of methane in the Earth atmosphere," in 2012 IEEE International Geoscience and Remote Sensing Symposium (IEEE, 2012).
27. M. W. Kudenov and E. L. Dereniak, "Compact real-time birefringent imaging spectrometer," Opt. Express 20(16), 17973–17986 (2012).
28. Y. Ferrec et al., "Experimental results from an airborne static Fourier transform imaging spectrometer," Appl. Opt. 50(30), 5894–5904 (2011).
29. F. J. Bolton et al., "Portable, low-cost multispectral imaging system: Design, development, validation, and utilization," J. Biomed. Opt. 23(12), 1–11 (2018).
30. G. ElMasry et al., "Recent applications of multispectral imaging in seed phenotyping and quality monitoring—An overview," Sensors 19(5), 1090 (2019).
31. M. Brydegaard et al., "Versatile multispectral microscope based on light emitting diodes," Rev. Sci. Instrum. 82(12), 123106 (2011).
32. M. Goel et al., "HyperCam: Hyperspectral imaging for ubiquitous computing applications," in Proceedings of the 2015 ACM International Joint Conference on Pervasive and Ubiquitous Computing (Association for Computing Machinery, 2015).
33. S. Kim et al., "Smartphone-based multispectral imaging: System development and potential for mobile skin diagnosis," Biomed. Opt. Express 7(12), 5294–5307 (2016).
34. S. L. Jacques, "Polarized light imaging of biological tissues," in Handbook of Biomedical Optics (CRC Press, 2016), pp. 669–692.
35. V. V. Tuchin, "Polarized light interaction with tissues," J. Biomed. Opt. 21(7), 071114 (2016).
36. H. Luo et al., "Compact and miniature snapshot imaging polarimeter," Appl. Opt. 47(24), 4413–4417 (2008).
37. D. Rebhan et al., "Principle investigations on polarization image sensors," Proc. SPIE 11144, 111440A (2019).
38. D. B. Murphy, Fundamentals of Light Microscopy and Electronic Imaging (John Wiley & Sons, 2002).
39. J. W. Pickering, S. A. Prahl, N. van Wieringen, J. F. Beek, H. J. C. M. Sterenborg, and M. J. C. van Gemert, "Double-integrating-sphere system for measuring the optical properties of tissue," Appl. Opt. 32, 399 (1993).
40. A. Merdasa et al., "Staining-free malaria diagnostics by multispectral and multimodality light-emitting-diode microscopy," J. Biomed. Opt. 18(3), 036002 (2013).
41. F. Foschum and A. Kienle, "Optimized goniometer for determination of the scattering phase function of suspended particles: Simulations and measurements," J. Biomed. Opt. 18(8), 85002 (2013).
42. M. L. Askoura, F. Vaudelle, and J.-P. L'Huillier, "Multispectral measurement of scattering-angular light distribution in apple skin and flesh samples," Appl. Opt. 55(32), 9217–9225 (2016).
43. B. K. Wilson, M. R. Behrend, M. P. Horning, and M. C. Hegg, "Detection of malarial byproduct hemozoin utilizing its unique scattering properties," Opt. Express 19, 12190 (2011).
44. S. Jansson et al., "First polarimetric investigation of malaria mosquitoes as lidar targets," IEEE J. Sel. Top. Quantum Electron. 25(1), 1–8 (2019).
45. S. L. Jacques, "Optical properties of biological tissues: A review," Phys. Med. Biol. 58(11), R37–R61 (2013).
46. M. G. Müller et al., "Intrinsic fluorescence spectroscopy in turbid media: Disentangling effects of scattering and absorption," Appl. Opt. 40(25), 4633–4646 (2001).
47. D. J. Faber et al., "Oxygen saturation-dependent absorption and scattering of blood," Phys. Rev. Lett. 93(2), 028102 (2004).
48. T. Svensson et al., "Near-infrared photon time-of-flight spectroscopy of turbid materials up to 1400 nm," Rev. Sci. Instrum. 80(6), 063105 (2009).
49. M. Burresi et al., "Bright-white beetle scales optimise multiple scattering of light," Sci. Rep. 4(1), 6075 (2014).
50. D. Elson et al., "Time-domain fluorescence lifetime imaging applied to biological tissue," Photochem. Photobiol. Sci. 3(8), 795–801 (2004).
51. P. Herman et al., "Frequency-domain fluorescence microscopy with the LED as a light source," J. Microsc. 203(2), 176–181 (2001).
52. J. Borggren, "Combinatorial light path spectrometer for turbid liquids," Lund Reports in Atomic Physics, 2011.
53. S. H. Chung et al., "Macroscopic optical physiological parameters correlate with microscopic proliferation and vessel area breast cancer signatures," Breast Cancer Res. 17(1), 72 (2015).
54. T. Durduran et al., "Diffuse optics for tissue monitoring and tomography," Rep. Prog. Phys. 73(7), 076701 (2010).
55. J. S. Cybulski, J. Clements, and M. Prakash, "Foldscope: Origami-based paper microscope," PLoS One 9(6), e98781 (2014).
56. S. Rawat et al., "Compact and field-portable 3D printed shearing digital holographic microscope for automated cell identification," Appl. Opt. 56(9), D127–D133 (2017).
57. E. K. Grasse, M. H. Torcasio, and A. W. Smith, "Teaching UV–Vis spectroscopy with a 3D-printable smartphone spectrophotometer," J. Chem. Educ. 93(1), 146–151 (2016).
58. B. S. Hosker, "Demonstrating principles of spectrophotometry by constructing a simple, low-cost, functional spectrophotometer utilizing the light sensor on a smartphone," J. Chem. Educ. 95(1), 178–181 (2018).
59. A. Lietard et al., "A combined spectrophotometer and fluorometer to demonstrate the principles of absorption spectroscopy," J. Chem. Educ. 98(12), 3871–3877 (2021).
60. E. Montoya-Rossi, Ó. Baltuano-Elías, and A. Arbildo-López, "A homemade cost effective Raman spectrometer with high performance," J. Lab. Chem. Educ. 3, 67 (2015).
61. J. Fernsler et al., "A LEGO Mindstorms Brewster angle microscope," Am. J. Phys. 85(9), 655–662 (2017).
62. N. Haverkamp et al., "Measuring wavelengths with LEGO® bricks: Building a Michelson interferometer for quantitative experiments," Phys. Teach. 58(9), 652–655 (2020).
63. J. M. Pearce, "Building research equipment with free, open-source hardware," Science 337(6100), 1303–1304 (2012).
64. X.-C. Zhang et al., "Open source 3D printers: An appropriate technology for building low cost optics labs for the developing communities," in 14th Conference on Education and Training in Optics and Photonics: ETOP 2017 (SPIE, 2017).
65. J. Larsson et al., "Development of a 3-dimensional tissue lung phantom of a preterm infant for optical measurements of oxygen-laser-detector position considerations," J. Biophotonics 11(3), e201700097 (2018).
66. L. J. Salazar-Serrano, P. Torres, and A. Valencia, "A 3D printed toolbox for opto-mechanical components," PLoS One 12(1), e0169832 (2017).
67. J. P. Sharkey et al., "A one-piece 3D printed flexure translation stage for open-source microscopy," Rev. Sci. Instrum. 87(2), 025104 (2016).
68. H. Månefjord, BIOSPACE GitHub repository, 2021, https://github.com/HampusMLTH/BIOSPACE.
69. F. Lundh, An Introduction to Tkinter, 1999, www.pythonware.com/library/tkinter/introduction/index.html.
70. Stefanklug, PyPylon, 2021, https://github.com/basler/pypylon.
71. Dexter Industries, BrickPi, 2021, https://github.com/DexterInd/BrickPi.
72. K. Ahi, "Mathematical modeling of THz point spread function and simulation of THz imaging systems," IEEE Trans. Terahertz Sci. Technol. 7(6), 747–754 (2017).
73. C. V. Nguyen et al., "Capturing natural-colour 3D models of insects for species discovery and diagnostics," PLoS One 9(4), e94346 (2014).
74. J. B. Silver, Mosquito Ecology: Field Sampling Methods (Springer Science & Business Media, 2007).
75. C. A. Hallmann et al., "More than 75% decline over 27 years in total flying insect biomass in protected areas," PLoS One 12(10), e0185809 (2017).
76. S. Jansson, Entomological Lidar: Target Characterization and Field Applications (Lund University, 2020).
77. M. Li et al., "Bark beetles as lidar targets and prospects of photonic surveillance," J. Biophotonics 14, e202000420 (2020).
78. H. Yin et al., "Iridescence in the neck feathers of domestic pigeons," Phys. Rev. E: Stat., Nonlinear, Soft Matter Phys. 74(5), 051916 (2006).
79. D. G. Stavenga, "Thin film and multilayer optics cause structural colors of many insects and birds," Mater. Today: Proc. 1, 109–121 (2014).
80. H. Liu et al., "Determination of optical properties and blood oxygenation in tissue using continuous NIR light," Phys. Med. Biol. 40(11), 1983 (1995).
81. A. Merdasa et al., "Photoacoustic imaging of the spatial distribution of oxygen saturation in an ischemia-reperfusion model in humans," Biomed. Opt. Express 12(4), 2484–2495 (2021).
82. Y. Tan et al., "Mechanical characterization of human red blood cells under different osmotic conditions by robotic manipulation with optical tweezers," IEEE Trans. Biomed. Eng. 57(7), 1816–1825 (2010).
83. H. Byun et al., "Optical measurement of biomechanical properties of individual erythrocytes from a sickle cell patient," Acta Biomater. 8(11), 4130–4138 (2012).
84. World Health Organization, Basic Malaria Microscopy, Volume 2, 2010.
85. K. R. Beebe and B. R. Kowalski, "An introduction to multivariate calibration and analysis," Anal. Chem. 59(17), 1007A–1017A (1987).
86. Y. Kim et al., "Anisotropic light scattering of individual sickle red blood cells," J. Biomed. Opt. 17(4), 040501 (2012).
87. A. Shuaib, X. Li, and G. Yao, "Transmission of polarized light in skeletal muscle," J. Biomed. Opt. 16(2), 025001 (2011).
88. X. Li, J. C. Ranasinghesagara, and G. Yao, "Polarization-sensitive reflectance imaging in skeletal muscle," Opt. Express 16(13), 9927–9935 (2008).
89. H. Månefjord et al., "BIOSPACE—A low-cost platform for problem-based learning in biophotonics," in Proceedings of the Teaching and Learning Conferences 2021 (Centre for Engineering Education, Lund University, 2021), p. 11.
90. A. ul Rehman, I. Ahmad, and S. A. Qureshi, "Biomedical applications of integrating sphere: A review," Photodiagn. Photodyn. Ther. 31, 101712 (2020).
91. Z. Ali et al., "Assessment of tissue pathology using optical polarimetry," Lasers Med. Sci. 37, 1907–1919 (2021).
92. M. Peyvasteh et al., "Evolution of raw meat polarization-based properties by means of Mueller matrix imaging," J. Biophotonics 14(5), e202000376 (2021).
93. S. N. Savenkov et al., "Measurement and interpretation of Mueller matrices of barley leaves," Quantum Electron. 50(1), 55 (2020).
94. R. Lu et al., "Measurement of optical properties of fruits and vegetables: A review," Postharvest Biol. Technol. 159, 111003 (2020).
95. D. Biliouris et al., "A compact laboratory spectro-goniometer (CLabSpeG) to assess the BRDF of materials. Presentation, calibration and implementation on Fagus sylvatica L. leaves," Sensors 7(9), 1846–1870 (2007).
96. L. Lolli et al., "PHYTOS: A portable goniometer for in situ spectro-directional measurements of leaves," Metrologia 51(6), S309 (2014).
97. B. Tumendemberel, "Study of spectro-polarimetric bidirectional reflectance properties of leaves," Ph.D. thesis (Hokkaido University, 2019).
98. R. Cross, "The inside story on 20 000 vertebrates," Science 357, 742 (2017).
99. P. Vukusic and D. Stavenga, "Physical methods for investigating structural colours in biological systems," J. R. Soc., Interface 6(Suppl. 2), S133–S148 (2009).
