In this lesson, we discuss images captured from electromagnetic frequencies beyond the range of human perception. Multispectral imaging employs additional channels of electromagnetic radiation outside the visible spectrum, such as infrared (IR), ultraviolet (UV), X-rays, and more.
Here's a rainbow over CMU. Water droplets in the air split up the sun's white light into component colors.
Here's a prism in someone's window, splitting up sunlight into its component colors.
The visible spectrum is just one slice of electromagnetic radiation that human eyes happen to be able to see. But there are many other frequencies of "light" which can be detected and imaged using special cameras, such as ultraviolet and infrared.
Here is a useful chart showing different spectral bands and some of their applications. (Note that the direction of the spectrum is reversed from the two diagrams above.)
This lesson includes a discussion of the following:
- Overview of Multispectral Imaging
- Some Applications
- Near-Infrared Imaging
- Thermal Imaging
- Thermochromic Imaging
- Ultraviolet Imaging
- X-Ray Imaging
- Hyperspectral Imaging
- Radio-Wave Imaging
- Neutron Imaging
In addition, this lesson discusses two other imaging techniques, polarization imaging and schlieren imaging, which, while not "multi-spectral", deal with unusual properties of light and its transmissive medium.
Finally, it's not even necessary to image with electromagnetic radiation; let's not forget about sound.
Because the human eye cannot see things like X-rays, IR, UV, etc., we need to use some sort of imaging technology to translate energy patterns in these spectra into patterns that our eyes can see. The "right" way to visualize such energy patterns is always arbitrary — or, at least, an artifact of the sensing and display technology.
Here is a face, simultaneously imaged in (a) the visible spectrum, 0.4–0.7 µm; (b) SWIR, or short-wave infrared, 1.0–3.0 µm; (c) MWIR, or mid-wave infrared, 3.0–5.0 µm; and (d) LWIR, or long-wave infrared, 8.0–14.0 µm. As we image using increasingly long wavelengths, we shift from seeing the reflective (color) properties of the subject to the emissive (heat) properties of the subject.
- An excellent paper on multispectral face recognition
- Differences between UV, Vis, NIR (Wikipedia)
- More faces viewed in multiple spectra
Here are some faces viewed in UV, visible light, and IR. What sorts of features appear differently in different forms of light?
In the photos below, by Nick Spiker and Nevada Weir, it's clear that depending on the imaging frequency, light skin can look dark, and vice-versa. The dark appearance under Short-Wave IR (SWIR) is because water just below the skin’s surface absorbs radiation in this band.
Satellites employ multispectral imaging to understand the Earth. Here is a set of aligned images taken simultaneously, in different IR bands, by the Chinese weather satellite FY4A-AGRI. Whereas most images you see have three channels of information (RGB), this satellite records 14 different channels of visible, SWIR, MWIR, and LWIR light. Note how in some bands, like 5.8–6.7 µm, the water vapor in the atmosphere is opaque.
In order to visualize the structure of such multichannel data, a common trick in satellite imaging is to re-map a channel of invisible light (like IR) to the R,G,B channels of a regular image. Charlie Loyd discusses this in his terrific article, Putting Landsat8's multispectral imaging to work. This is called false-color imaging.
For example, here's a false-color Landsat8 image in which SWIR data is used in the "red" channel of the image; NIR data as "green"; and near-UV as "blue". The purpose of doing this is not to create some sort of garish, psychedelic image; rather, the false-color imaging makes the patch of forest in the lower-left (which had been affected by forest fire) starkly visible:
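The remapping itself is mechanically simple. Here's a minimal sketch (assuming NumPy; the band images are synthetic stand-ins for real co-registered satellite data):

```python
import numpy as np

# Hypothetical co-registered single-band images, normalized to 0.0-1.0.
# Real data would be loaded from satellite imagery; these are synthetic.
h, w = 4, 4
swir = np.full((h, w), 0.8)   # short-wave infrared band
nir  = np.full((h, w), 0.5)   # near-infrared band
nuv  = np.full((h, w), 0.2)   # near-ultraviolet band

# False-color compositing just re-maps invisible bands onto the three
# display channels: SWIR -> red, NIR -> green, near-UV -> blue.
false_color = np.dstack([swir, nir, nuv])   # shape (4, 4, 3)
```

Any permutation of bands into the R, G, B display channels works the same way; the choice is driven by which material contrasts you want to make visible.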
Astronomers use multispectral imaging to understand the Sun. This set of images of the Sun was recorded using mostly deep ultraviolet and X-rays:
Forensic specialists use infrared and ultraviolet imaging and/or fluorescence to recover writing lost to water damage, analyze medieval frescoes, detect fraudulent documents, determine the provenance of artifacts, sort fragments of shredded documents, detect underpaintings in famous artworks, and detect earlier versions of artworks (includes web interactive).
It's common to examine paintings under many sorts of light. In addition to visible, UV, IR, and X-Rays, experts may also use grazing/raking light (from the side), transmitted light (from behind), and co-axial light (retroreflective) to examine a painting.
SWIR radiation passes readily through haze and smog, which tend to scatter visible light. The image below shows two views of an oil rig imaged through 47km of air with a substantial amount of marine haze in the air path. The rig can just barely be seen in the visible image, while the SWIR image shows strong contrast and the presence of a flare at the end of a long boom.
SWIR's ability to see through haze and smoke is useful for firefighters and also self-driving cars.
IR is light that is beyond the red end of the visible spectrum. Wavelengths in the range of ~770 to ~1400 nanometers are called the near-infrared (or NIR) region of the electromagnetic spectrum, while longer wavelengths include Short-Wave IR (SWIR) and Long-Wave IR (LWIR). Near-infrared is widely used in standard security cameras, while imaging in the far infrared is known as thermal imaging (which we discuss in the next section).
NIR cameras can be inexpensive and easy to obtain. With IR illumination, they allow us to see in the "dark", so they are widely used as security cameras. Here's a BBC video tracking animals, unobtrusively, licking salt in a cave at night:
Owing to the different infrared reflectivity of blood, you can see veins easily in NIR:
It's important to distinguish between monochromatic IR images (a grayscale image whose content is exclusively from the infrared part of the spectrum), such as the image above, and various types of CIR (Color+IR) images: false-color images which store multiple (spatially aligned) channels of information from different parts of the spectrum. Since RGB images are a common display format for multichannel image data, one common CIR technique (described in this PDF) stores IR information in the Red channel, Red information in the Green channel, and Green information in the Blue channel. This is conceptually similar to the Landsat satellite imaging trick described above. Edward Thompson has compiled an artful book of such images, such as this one:
Incidentally, the visibility of veins in IR has been used in some medical augmented-projection applications, such as the Christie VeinViewer and other infrared vein finder devices:
It's common for CIR imaging to be used for aerial/satellite photography. Foliage, in particular, becomes much more visible:
Below is a technical breakdown of how NIR-R-G "false color composite" images are constructed. If you want to try constructing such an image yourself (using the Channels palette in Photoshop, for example), here are the component images: trees_nir.png, trees_rgb.png; scene_nir.png, scene_rgb.png; and here's a NIR+RGB dataset. Such images can be useful for calculating the NDVI (Normalized Difference Vegetation Index), a widely-used agricultural metric for quantifying the health and density of vegetation.
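NDVI is computed per pixel as (NIR - Red) / (NIR + Red), yielding values between -1 and +1. A minimal sketch, assuming NumPy and invented reflectance values:

```python
import numpy as np

def ndvi(nir, red, eps=1e-8):
    """Normalized Difference Vegetation Index, computed per pixel."""
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / (nir + red + eps)   # eps avoids division by zero

# Healthy vegetation reflects strongly in NIR and absorbs red light,
# so its NDVI approaches +1; bare soil and water sit near zero or below.
nir_band = np.array([[0.60, 0.10]])   # invented reflectances
red_band = np.array([[0.10, 0.10]])
index = ndvi(nir_band, red_band)
```

In practice the NIR and red bands must be spatially registered before this per-pixel arithmetic is meaningful.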
Richard Mosse's The Enclave (2013) is a documentary film about the ongoing civil war in Congo, shot on CIR film. Mosse's work has spurred controversy for the way in which it aestheticizes turmoil, especially as captured by a European working in Africa.
Some pigments are not visible in NIR. This is used as an anti-counterfeiting measure in US currency. The different patterns of (missing) stripes are also useful in machine recognition; each denomination has its own pattern.
Some materials are opaque in visible wavelengths, but transparent in NIR wavelengths. This means that NIR can be used to see certain kinds of obscured or invisible information. A common technique for this is infrared reflectography, which takes advantage of the NIR-transparency of some kinds of paint, in order to view a painting's underlayers:
For example, The Blue Boy (ca. 1770), an oil painting by Thomas Gainsborough (1727-1788), has an overpainted dog, discovered in 1994. Here the painting is shown in normal light photography (left), digital x-radiography, and infrared reflectography (right).
The paintings "The Lynching of Leo Frank" and "Stella at the Playground" by Oliver Lutz (2010) use a (visibly) black, IR-transparent overpainting—covering a "secret" image that can only be observed by means of a NIR security camera and a nearby CCTV. Lutz makes many projects with this IR-clear, visibly-black overpainting. His work appeared in the Walker Art Center exhibition "Exposed: Voyeurism, Surveillance and The Camera since 1870".
Artist Osman Khan created a strictly IR-viewable image, whose content is visible only through the audience's personal digital capture devices.
In some circumstances, depending on materials, NIR cameras can see through clothes, as in this example with a tattoo, below.
NIR imaging can be used to detect traced (i.e. forged) signatures:
There are also many uses of NIR imaging for real-time body tracking and computer vision.
Thermal imaging senses light wavelengths in the range of ~8000-14000 nanometers, also called long wave infrared, which corresponds to what we experience as heat. In short, we see where something is hot, and to what extent.
A touchless forehead thermometer (about USD 20) is essentially a one-pixel thermal camera. Niklas Roy made a DIY Thermal Camera by mounting one on a pan/tilt servo.
What we see when we observe radiation in the ~8000–14000 nm range is emissive rather than reflective. Interestingly, there is a band that is not used for imaging: the "low transmittance window" between 5000 and 8000 nm, because air is opaque (it absorbs IR) at those wavelengths. Put another way: those are the wavelengths of light-energy that heat the air itself.
- David Attenborough discusses the use of thermal imaging to understand lizard temperature self-regulation, in this BBC video:
- A cult classic, THE OPERATION by Jacob Pander and Marne Lucas (1995) is a hybrid art/porn movie, shot completely with a thermal camera. (NSFW)
- Lucas and Pander have also produced Incident Energy, a multi-channel thermal video which explores "themes of nature and humanity", including live human birth.
- Route 94: My Love (2013) is a more recent music video with much the same idea. Note how the intensity of thermal energy can be interpreted according to a grayscale spectrum, color spectrum, etc.
- Terike Haapoja's Community (2007) presents thermal videos of animals which have just died. We see the heat leaving their bodies.
Note that there is no "correct" way to view thermal imagery. Cameras offer a variety of color palettes for mapping their temperature ranges, including grayscale, black-body, chromatic (blue = cold), etc.
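Such a palette is just a function from temperature to display color. Here's a minimal sketch with an invented two-color ramp (assuming NumPy; real cameras offer far more elaborate mappings):

```python
import numpy as np

def thermal_to_rgb(temps):
    """Map a 2D array of temperatures onto a blue (cold) to red (hot) ramp."""
    t = (temps - temps.min()) / (np.ptp(temps) + 1e-8)  # normalize to 0..1
    rgb = np.zeros(temps.shape + (3,))
    rgb[..., 0] = t          # red channel grows with temperature
    rgb[..., 2] = 1.0 - t    # blue channel fades with temperature
    return rgb

temps = np.array([[20.0, 30.0, 40.0]])   # invented temperatures, deg C
palette_image = thermal_to_rgb(temps)
```

Swapping in a different ramp (grayscale, black-body, etc.) changes only this mapping function, not the underlying temperature data.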
Sometimes simply presenting such alternative views can be a provocative, entertaining or educational experience for audiences.
- Many science museums, such as the Exploratorium, have a "Heat Camera" display in which the public can see themselves in a thermal camera.
- For City Thermogram (2015), Peggy Ahwesh was commissioned to create a live-video installation in New York's Times Square, in which she connected a thermal camera (trained on passersby) to a large electronic billboard.
- In the exhibition "Laura Poitras: Astro Noise" (2016) at the Whitney Museum of American Art, the installation "Bed Down Location" features time-lapse video projections of night skies in Yemen and Somalia. In the next room over is Poitras' project "Last Seen", which shows a time-lagged view of the previous room's mattress, showing the viewer's own fading thermal imprints.
- Google has recently made satellite thermal imaging of roofs available to the public, to prompt people's awareness about the heat inefficiency of their homes.
- Different gases absorb energy in different parts of the thermal radiation spectrum. Just as air is opaque from 5-8um (MWIR), methane is opaque in the LWIR. Thermal LWIR video was thus useful in visualizing the 2015 Porter Ranch methane leak:
- These Swedish researchers have built a modified thermal camera specifically for seeing methane. This is important because, over a 20-year timescale, methane is roughly 86 times more potent than carbon dioxide as a greenhouse gas.
- High-resolution (1024x768) thermal video of bats, accompanied by their ultrasonic calls shifted into the audible range
- High-resolution thermal imaging of owls
- Envisioning Chemistry: Getting Hot
- Envisioning Chemistry: Getting Cold
- Simple liquid experiments: boiling water, water convection, oil heating, ice floating in water, etc.
- "3D thermography" (thermal imaging + photogrammetry)
While we're on the topic of visualizing heat: Some substances temporarily change color in response to heat. In different contexts, thermochromic pigments can work as a capture technology or a display technology.
- Jay Watson's Thermochromic Table (2011) reveals where and how people have sat at the furniture.
- The Temperature Sensitive Object chair (2001) by Orléans-based design group, Archilab, is similar.
- The revelatory potential of this technology is directly connected to considerations of mammalian territory-marking behavior in this thermochromic toilet seat and this remarkable thermochromic urinal.
- Dutch artist Carina Hesper has created a book, Like a Pearl in My Hand, in which portrait photographs have been overprinted with a thermochromic coating. The reader of the book must interactively reveal the underlying photographs.
- The World In UV, an excellent overview video by science YouTuber, Veritasium, discusses how UV offers new understandings of atmospheric haze, sunscreen, quinine, flowers, polar bears, and more:
The vast majority of humans cannot see ultraviolet light. However, this person was able to see UV light after he had cataract surgery.
Many animals appear different in ultraviolet light, and many can see in the ultraviolet. For example,
- Butterflies are thought to have the widest spectral visual range of any land animal. Butterflies can use ultraviolet markings to find healthier mates. Ultraviolet patterns also help certain species of butterflies appear identical to predators, while remaining distinguishable to potential mates.
- Reindeer rely on ultraviolet light to spot lichens that they eat. They can also easily spot the UV-absorbent urine of predators among the UV-reflective snow.
- One bird species was found to feed its young based on how much UV the chicks reflected.
- Some species of birds use UV markings to tell males and females apart.
- Black-eyed Susans have petals that appear uniformly yellow to humans, but UV markings give them a bull's-eye design that attracts bees.
- Sockeye salmon may use their ultraviolet perception to see food.
- Kestrels can see in the UV, which helps them find prey: they can spot the UV-fluorescent urine trails left by rodents. Here's David Attenborough:
- Scorpions glow under ultraviolet light, but scientists do not know why.
UV video overview by Thomas Leveritt, promoting sunscreen:
UV is also widely used in forensics:
Recently, skin splotches that are visible only in UV have been used as markers for face-tracking.
Fluorescent materials absorb UV light (and thus appear dark in a UV camera), then re-emit some of that absorbed energy as visible light. To the eye, fluorescent materials therefore appear unusually bright, as if glowing.
UV fluorescence as viewed by the STUDIO's UV camera:
UV fluorescence reveals hidden imagery in a Canadian Passport and Danish currency:
UV photography is very different from the photography of UV-excited fluorescence in the visible range. Fluorescence is the emission of longer wavelengths than those of incident light. Thus, illumination with UV radiation can cause the subject to fluoresce by emitting visible light. This is, for all practical purposes, photography in the visible range. (Source)
In addition to psychedelic posters and hidden features of paper currency, there's lots of interesting UV fluorescence in nature, such as in these minerals:
Cara Phillips makes portraits that explore the aesthetics of the human skin in UV:
Using the wet collodion chemical process, an early photographic technique invented by Frederick Scott Archer in 1851, photographer Michael Bradley developed a series of portraits, featuring facial tattoos from the indigenous New Zealand culture, the Māori.
The wet collodion process primarily records UV information, as can be seen in the spectrum recording below. The spectrum was generated by a prism, and was directly photographed using collodion. A photograph of the same spectrum was taken simultaneously with digital color. The two photographs were then overlaid using registration marks to ensure accuracy.
- UV-pass, visible-cut filters are available.
- It is also relatively inexpensive to have a Canon SLR permanently converted for UV.
- It is worth pointing out that UV & NIR photography also benefit from using special lenses which better focus these wavelengths.
- Of course, dedicated UV cameras also exist, some of which, like this Nurugo UV camera attachment for Android, are relatively inexpensive.
The STUDIO has an extremely sensitive monochrome security camera which, used with a UV-pass filter, is able to view the world in UV. The image above was recorded with this camera. You can see some of my skin damage.
Ultraviolet light can also produce dramatic fluorescence in some materials.
Fluorescence is the emission of light by a substance that has absorbed light or other electromagnetic radiation. It is a form of luminescence. In most cases, the emitted light has a longer wavelength, and therefore lower energy, than the absorbed radiation. The most striking example of fluorescence occurs when the absorbed radiation is in the ultraviolet region of the spectrum, and thus invisible to the human eye, while the emitted light is in the visible region, which gives the fluorescent substance a distinct color that can be seen only when exposed to UV light. Fluorescent materials cease to glow nearly immediately when the radiation source stops, unlike phosphorescent materials, which continue to emit light for some time after.
Because the material is absorbing UV light, most fluorescent materials appear dark or black in UV.
- Vitamin B2 fluoresces yellow.
- Tonic water fluoresces blue due to the presence of quinine.
- Highlighter ink is often fluorescent due to the presence of pyranine.
As X-rays can reveal the interior structure of objects and people, we expect to see artists exploring this form of 'revelation'. For example, here is a Rose by Bryan Whitney, from this Survey of X-Ray Photographic Art:
- In his series 'Xograms', Hugh Turvey (Artist in Residence, The British Institute of Radiology) takes a "deeper look at everyday objects":
- This is a stunning project by Cohen+Van Balen: Infrastructures of Natural History, X-rays of taxidermied animals:
It is also possible to design or arrange objects for the express purpose of having them discovered in X-Ray images.
- Evan Roth's TSA Communication project, a series of custom sheet metal cutouts placed in luggage, adopts a strategy similar to the Surveillance Camera Players:
- We can also reimagine extreme insertions as a form of performance art, for an audience of radiologists. (X-ray insertions)
- Backscatter X-Ray imaging operates differently, and measures the X-Rays that bounce off of materials rather than passing through them.
What is the spectrum of light captured by each pixel of a camera?
Standard color digital cameras use Bayer filters to record three channels (RGB) of spectral information. Using less-common filter mosaics, a few specialized cameras capture four channels (RGB+NIR), or even as many as 8 or 16. As the number of channels increases even further, multi-spectral imaging becomes hyperspectral imaging, which attempts to capture the entire reflective or emissive spectrum for every pixel, essentially representing an image as a three-dimensional (x,y,λ) data cube. Some hyperspectral cameras have as many as 200 spectral channels, some with bandwidths as narrow as 7 nanometers per channel.
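In code, such a data cube is naturally a three-dimensional array; a sketch with synthetic values (assuming NumPy; real cubes would be loaded from camera files):

```python
import numpy as np

# A toy hyperspectral data cube: 32 x 32 pixels, 200 spectral channels.
rng = np.random.default_rng(0)
cube = rng.random((32, 32, 200))

# Slicing the cube along different axes gives two complementary views:
spectrum = cube[10, 15, :]      # full spectrum at one pixel: shape (200,)
band_image = cube[:, :, 120]    # image at one wavelength: shape (32, 32)
```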
Multivariate statistics and machine learning are often required for the analysis of hyperspectral images, in order to perform "spectral unmixing": discriminating the spectral contributions from different image sources ("endmembers"), based on their unique spectral signatures. For example, in remote sensing and satellite imagery: vegetation, soil, moisture, certain minerals, etc.:
or, in human skin: melanin, hemoglobin, oxyhemoglobin (blood oxygenation), and bilirubin.
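In the simplest linear mixing model, each pixel's spectrum is a weighted sum of endmember spectra, and unmixing solves for the weights. A toy sketch with invented endmember signatures (assuming NumPy; real pipelines typically add non-negativity and sum-to-one constraints):

```python
import numpy as np

# Invented endmember spectra over 50 spectral bands.
n_bands = 50
veg  = np.linspace(0.1, 0.9, n_bands)    # "vegetation" signature
soil = np.linspace(0.6, 0.4, n_bands)    # "soil" signature
E = np.column_stack([veg, soil])         # (50, 2) endmember matrix

# Synthesize an observed pixel as a 70/30 mix of the two endmembers.
true_abundances = np.array([0.7, 0.3])
pixel = E @ true_abundances

# Least-squares estimate of the per-endmember abundances.
est, *_ = np.linalg.lstsq(E, pixel, rcond=None)
```

With noisy real data the estimate only approximates the true abundances, which is why constrained and statistical unmixing methods are used in practice.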
There are lots of applications in agriculture, food processing, dermatology, forensics, art conservation, etc....
Multispectral and hyperspectral cameras are built using several different technologies. Most methods require a great deal of time to capture a scene at all of the supported frequencies.
- Liquid Crystal Tunable Filter (LCTF): a device that filters the color of transmitted light, under computer control
- Linear Variable Bandpass Filter: literally a strip of glass with a rainbow on it. Imaging is accomplished by moving the camera and the filter relative to each other at a known rate, and then compiling image columns in a slit-scan-like process.
- Multiplexed monochromatic illumination. Here's a semi-DIY hyperspectral camera using 17 colored LEDs
- Filter wheels.
- Adjustable bandpass filter employing a prism
- Multispectral Bayer Filter. Such cameras can operate at video rates, but sacrifice resolution.
A "stud finder" (around USD 30 at Home Depot) is essentially a one-pixel RF camera.
[From Wikipedia] Radar is an object-detection system that uses radio waves to determine the range, angle, or velocity of objects. It can be used to detect aircraft, ships, spacecraft, missiles, motor vehicles, weather formations, and terrain. A radar transmits radio waves or microwaves that reflect from any object in their path.
Marine radar is surprisingly inexpensive and produces fascinating images of the world, allowing (for example) imaging of nearby whales. Devices such as the Furuno 1623 or Furuno DRS4W systems cost under $1500.
(Image from here).
Ground-penetrating radar images features such as underground pipes. We have a contact at Pittsburgh-area GPR providers, Geospatial Corporation.
Radio waves can be used in other ways for imaging, as in RF-Pose by researchers at MIT CSAIL. The STUDIO owns a Walabot RF imaging device (essentially a stud-finder array) which, used with appropriate machine-learning models, can form the basis for this type of imaging.
This is the stuff that really sees through clothing.
- Millimeter-wave scanners are used by the TSA
- Terahertz imagers operate at wavelengths around 300 micrometers.
X-ray computed tomography (X-ray CT) and magnetic resonance imaging (MRI) are medical imaging techniques that employ significant computation to produce 3D models of internal body structures and activities. Perhaps, if you have a good reason, you can get a scan of yourself at the hospital.
Kyle McDonald experimenting with some of his own CT scan data in openFrameworks.
Angiography is the process of imaging blood vessels. Recent progress in MRI imaging has made possible whole-body magnetic resonance angiography (MRA):
CT scan by Scott Echols capturing the tiny blood vessels in the head of a pigeon, made visible by a special 'contrast agent' that highlights the microvasculatory system.
Neutron Imaging is based on the neutron attenuation properties of the imaged object. The resulting images have much in common with X-ray images, but some things easily visible with neutron imaging may be very challenging or impossible to see with X-ray imaging techniques (and vice versa). X-rays are attenuated according to a material's density: denser materials stop more X-rays. With neutrons, attenuation is not related to density. Some light materials, such as boron, absorb neutrons; hydrogen generally scatters them; and many commonly used metals allow most neutrons to pass through completely.
Neutron imaging is often used to see the movement of fluids, such as oil or water, in large metal objects. (Car makers regularly visit neutron imaging facilities to carry out quality control tests on engines.) Another application of neutron imaging is the study of wooden objects. Video
Polarization is a property of light which describes, not its frequency or wavelength, but the orientation of the plane in which its electric field oscillates. It is useful in visualizing several phenomena which cannot otherwise be seen by the human eye.
Polarized light eliminates reflections. Here, a circular polarizer eliminates reflections on water, making another world visible beneath. From "Removing Glare with a Circular Polarizer", which includes a nice video.
By computing the difference between images of scenes taken with and without polarizations, it's possible to cleave the diffuse appearance of an object from its specular appearance. The images below, taken from "How To Split Specular And Diffuse In Real Images", show how this can be done. The first image is the 'regular' appearance, and then (through image differencing) the diffuse-only and specular-only images.
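As a toy sketch of the differencing idea (assuming NumPy, with synthetic pixel values; the half-intensity model for unpolarized diffuse light passing a polarizer is a simplification):

```python
import numpy as np

# Synthetic ground truth for a two-pixel "scene".
diffuse_true  = np.array([[0.4, 0.2]])
specular_true = np.array([[0.5, 0.0]])

# 'cross' is captured with the polarizer rotated 90 degrees, blocking the
# (polarized) specular reflection but passing half of the (unpolarized)
# diffuse light; 'parallel' passes the specular reflection as well.
cross    = diffuse_true / 2.0
parallel = diffuse_true / 2.0 + specular_true

diffuse_est  = 2.0 * cross         # recover the full diffuse component
specular_est = parallel - cross    # the difference isolates the specular
```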
You can see this technique applied to a human face in this video of the photoreal Digital Emily Project.
This face analysis research combines machine learning, polarimetry (to split specular and diffuse reflections), and hyperspectral imaging (to visualize differences in blood oxygenation and skin pigmentation):
Polarimetric thermal imaging enhances standard thermal imaging (2nd frame, below) with polarizing filters (generally wire-grid polarizers). This produces parallel- and cross-polarized views of the thermal radiation (3rd and 4th frames), which reveal structures not visible in uniform thermal imagery. This information encodes the spatial orientation of the surface emitting the heat.
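One simple quantity computed from two such frames is the degree of linear polarization (DoLP); a sketch with invented pixel values (assuming NumPy):

```python
import numpy as np

# Two thermal frames of the same scene, captured through a wire-grid
# polarizer at two orientations; the pixel values here are invented.
i_par  = np.array([[1.0, 0.6]])   # polarizer parallel to reference axis
i_perp = np.array([[0.4, 0.6]])   # polarizer rotated 90 degrees

# Degree of linear polarization: near zero for unpolarized emission,
# approaching 1 where the emitted radiation is strongly polarized.
dolp = (i_par - i_perp) / (i_par + i_perp + 1e-8)
```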
More information about polarized light in this video from PBS:
Incidentally, there are some other very clever ways of separating specular from diffuse appearances of objects.
Polarized light can also reveal internal stresses in (clear) materials, in a phenomenon known as photoelasticity. Here's a plastic ruler between cross-polarized filters:
Here's the setup to achieve this. More information can be found at (e.g.) Andrew Davidhazy's site.
Some nice videos of polarization and stress visualization:
- https://www.youtube.com/watch?v=3QBGgypAjkY
- https://www.youtube.com/watch?v=gP751qpm4n4
- https://www.youtube.com/watch?v=7YaoSODkymc
Sometimes, illuminating a subject from an unusual direction reveals new information about it. Raking (oblique) and transmissive illumination (from behind) are commonly used by art conservators.
Raking light is a technique in which a subject is illuminated from one side only, at an extremely oblique angle in relation to its surface. Raking light is used to reveal an object's surface texture: raised parts of the surface facing the light are illuminated, while those facing away cast shadows.
In this example, a single sheet of blank paper was imaged in normal illumination on the left and in raking light on the right. In the left image, the paper appears smooth and flat, while the right image reveals the laid structure of the paper by exaggerating the texture of the vertical chain lines and the watermark at the upper edge. Raking light also highlights the vertical crease running through the sheet, tears along that crease at the edges, and shorter handling creases scattered throughout.
In the example below, lighting the same piece of paper from behind reveals details (like its watermarks) that would not ordinarily be so easy to discern.
Raking light has been used to create a "crack map" of a well-known Vermeer painting. The middle image is a colorless texture image.
Vaclav Skala has used the raking light from a flatbed scanner, from 4 directions, to compute 3D reconstructions of nearly flat objects.
Schlieren imaging creates images which reveal, and depend on, minute differences in the index of refraction of air. In short, it depends not on a property of light, but on a property of light's medium.
- Kyle McDonald has explored Moiré-Schlieren imaging.
Shadowgraphy is an optical method that reveals non-uniformities in transparent media like air, water, or glass. It is related to, but simpler than, Schlieren methods, generally using a flash image and extremely brief exposure. (It's literally just the shadows of shimmering heat in the air.) Video
The shadowgraph is the simplest form of optical system suitable for observing a flow exhibiting variations of the fluid density. In principle, the system does not need any optical component except a light source and a recording plane onto which to project the shadow of the varying density field (Figure 1). A shadow effect is generated because a light ray is refractively deflected so that the position on the recording plane where the undeflected ray would arrive now remains dark. At the same time the position where the deflected ray arrives appears brighter than the undisturbed environment.
Here are some smartphone shadowgraphy techniques.
Dr. Gary Settles at Penn State University, an expert in Schlieren imaging and shadowgraphy, has built a "Full Scale Schlieren" system using retroreflective material.
It is possible to do Schlieren imaging by doing background-subtraction in front of a white-noise image:
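This approach, sometimes called background-oriented schlieren, can be sketched as follows (assuming NumPy; the "refraction" here is faked by shifting part of a noise pattern sideways):

```python
import numpy as np

rng = np.random.default_rng(0)

# Photograph a static white-noise background once (reference), and again
# through the disturbed air. Where refraction shifts the background
# pattern, the per-pixel difference becomes nonzero.
background = rng.random((8, 8))

distorted = background.copy()
# Fake a patch of "refraction" by shifting part of the pattern sideways.
distorted[2:5, 2:5] = np.roll(background, 1, axis=1)[2:5, 2:5]

schlieren = np.abs(distorted - background)   # bright where air was disturbed
```

Real implementations track the apparent displacement of the background pattern (e.g. with optical flow) rather than raw differences, but the principle is the same.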
The STUDIO has a portable ultrasound device. ExCap'20 student Cat Ploehn used this to create a typology or portrait series of her classmates' beating hearts.
It's well-known that bats and dolphins image the world through ultrasonic reflections.
But were you aware of human echolocation? Here are three different individuals, who despite being blind are able to map a detailed mental plan of their surroundings:
Of course, there are also devices for sonar imaging. You are probably familiar with ultrasonic fetal imaging. 2D is more common, but recently 3D ultrasound has become available.
It has become popular in Japan to 3D-print copies of the unborn.
Sonar can also be used to image environments in both 2D and 3D. Using equipment such as this, for example, people investigate and discover seafloor shipwrecks.
Microphone arrays, in combination with appropriate analysis software, can be used to localize audible or near-ultrasonic audio, in order to (for example) localize gas leaks in pipes — so-called "acoustic cameras".