
Multispectral Imaging

In this lesson, we discuss images captured from electromagnetic frequencies beyond the range of human vision. Multispectral imaging employs additional channels of electromagnetic radiation, such as infrared (IR), ultraviolet (UV), X-rays, and more.

Multispectral imaging is awesome


Here's a rainbow over CMU. Water droplets in the air split up the sun's white light into component colors.

CMU Rainbow

Here's a prism in someone's window, splitting up sunlight into its component colors.

The visible spectrum is just one slice of electromagnetic radiation that human eyes happen to be able to see. But there are many other frequencies of "light" which can be detected and imaged using special cameras, such as ultraviolet and infrared.

Here is a useful chart showing different spectral bands and some of their applications. (Note that the direction of the spectrum is reversed from the two diagrams above.)

Electromagnet_spectrum


This lesson includes a discussion of the following spectral bands and imaging modalities:

  • SWIR (Short-Wave Infrared)
  • Near-Infrared Imaging
  • Thermal Imaging
  • Thermochromic Imaging
  • Ultraviolet Imaging
  • X-Ray Imaging
  • Multispectral and Hyperspectral Imaging
  • Radio-Wave Imaging
  • Millimeter Wave & Terahertz Imaging
  • CT Scans, MRI, CAT etc.
  • Neutron Imaging

In addition, a few other imaging techniques discussed here, while not "multi-spectral", deal with unusual properties of light and its transmissive medium:

  • Polarimetry
  • Raking & Transmissive Light
  • Schlieren Imaging and Shadowgraphy

Finally, it's not even necessary to image with electromagnetic radiation; let's not forget about sound:

  • Sonar and Ultrasound
  • Sonic Imaging


Overview

What does the world look like when observed with non-visible light frequencies?

Because the human eye cannot see things like X-rays, IR, UV, etc., we need some sort of imaging technology to translate energy patterns in these spectra into patterns that our eyes can see. There is no single "right" way to visualize such energy patterns; any rendering is arbitrary, or at least an artifact of the sensing and display technology.

Here is a face, simultaneously imaged in (a) the visible spectrum, 0.4–0.7 µm; (b) SWIR, or short-wave infrared, 1.0–3.0 µm; (c) MWIR, or mid-wave infrared, 3.0–5.0 µm; and (d) LWIR, or long-wave infrared, 8.0–14.0 µm. As we image using increasingly long wavelengths, we shift from seeing the reflective (color) properties of the subject to its emissive (heat) properties.

Multi-spectral face

Here are some faces viewed in UV, visible light, and IR. What sorts of features appear differently in different forms of light?

Hyperspectral face by RShephorse

UV, Vis, NIR by Spigget / Nick Spiker

UV, Vis, NIR, from Wikipedia

In the photos below, by Nick Spiker and Nevada Weir, it's clear that depending on the imaging frequency, light skin can look dark, and vice-versa. The dark appearance under Short-Wave IR (SWIR) is because water just below the skin’s surface absorbs radiation in this band.

Vis, NIR, SWIR, from Wikipedia, by Nick Spiker

NIR portrait by Nevada Weir


Some Applications

Satellites employ multispectral imaging to understand the Earth. Here is a set of aligned images taken simultaneously, in different IR bands, by the Chinese weather satellite FY4A-AGRI. Whereas most images you see have three channels of information (RGB), this satellite records 14 different channels of visible, SWIR, MWIR, and LWIR light. Note how in some bands, like 5.8–6.7 µm, the water vapor in the atmosphere is opaque.

FY4A_AGRI_IR_earth

In order to visualize the structure of such multichannel data, a common trick in satellite imaging is to re-map channels of invisible light (like IR) to the R, G, and B channels of a regular image. This is called false-color imaging. Charlie Loyd discusses the technique in his terrific article, Putting Landsat8's multispectral imaging to work.

For example, here's a false-color Landsat8 image in which SWIR data is used in the "red" channel of the image, NIR data as "green", and near-UV as "blue". The purpose of doing this is not to create some sort of garish, psychedelic image; rather, the false-color mapping makes the patch of forest in the lower left (which had been affected by a forest fire) starkly visible:

False-color image using thermal data in the red channel
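
Here's a minimal sketch of how such a composite can be assembled, assuming you have three co-registered, single-band grayscale images on disk (the filenames below are hypothetical placeholders, not actual Landsat products):

```python
# A sketch of false-color compositing: map invisible bands onto display primaries.
import numpy as np
from PIL import Image

swir   = np.asarray(Image.open("band_swir.png").convert("L"), dtype=np.float32)
nir    = np.asarray(Image.open("band_nir.png").convert("L"), dtype=np.float32)
nearuv = np.asarray(Image.open("band_nearuv.png").convert("L"), dtype=np.float32)

def stretch(band):
    # Normalize each band to 0-255 so differences in exposure don't dominate.
    lo, hi = np.percentile(band, (2, 98))
    return np.clip((band - lo) / (hi - lo) * 255, 0, 255)

# SWIR -> "red", NIR -> "green", near-UV -> "blue"
false_color = np.dstack([stretch(swir), stretch(nir), stretch(nearuv)]).astype(np.uint8)
Image.fromarray(false_color).save("false_color_composite.png")
```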

Astronomers use multispectral imaging to understand the Sun. This set of images of the Sun was recorded using mostly deep ultraviolet and X-rays:

Multispectral image of the sun.

Forensic specialists use infrared and ultraviolet imaging and/or fluorescence to recover writing lost to water damage, analyze medieval frescoes, detect fraudulent documents, determine the provenance of artifacts, sort fragments of shredded documents, detect underpaintings in famous artworks, and detect earlier versions of artworks (includes web interactive).

IR image of water-damaged document

Using IR to detect fraud

Using UV fluorescence to distinguish different fragments of white paper

It's common to examine paintings under many sorts of light. In addition to visible light, UV, IR, and X-rays, experts may also use grazing/raking light (from the side), transmitted light (from behind), and co-axial (retroreflective) light.

multispectral-painting.jpg


SWIR (Short-Wave Infrared)

SWIR radiation passes readily through haze and smog, which tend to scatter visible light. The image below shows two views of an oil rig imaged through 47 km of air with a substantial amount of marine haze in the air path. The rig can just barely be seen in the visible image, while the SWIR image shows strong contrast and the presence of a flare at the end of a long boom.

swir-marine.jpg

SWIR's ability to see through haze and smoke is useful for firefighters and also for self-driving cars.


Near-Infrared Imaging

IR is light that lies beyond the red end of the visible spectrum. Wavelengths in the range of ~770 to ~1400 nanometers are called the near-infrared (NIR) region of the electromagnetic spectrum, while longer wavelengths include short-wave IR (SWIR) and long-wave IR (LWIR). Near-infrared is widely used in standard security cameras, while imaging in the far infrared is known as thermal imaging (which we discuss in the next section).

NIR Spectrum

NIR cameras can be inexpensive and easy to obtain. With IR illumination, they allow us to see in the "dark", so they are widely used as security cameras. Here's a BBC video that unobtrusively tracks animals licking salt in a cave at night:

Animals in NIR

Owing to the different infrared reflectivity of blood, you can see veins easily in NIR:

Veins, in NIR by Jasper Nance

It's important to distinguish between monochromatic IR images (grayscale images whose content comes exclusively from the infrared part of the spectrum), such as the image above, and various types of CIR (Color+IR) images: false-color images which store multiple (spatially aligned) channels of information from different parts of the spectrum. Since RGB images are a common display format for multichannel image data, one common CIR technique (described in this PDF) stores IR information in the Red channel, Red information in the Green channel, and Green information in the Blue channel. This is conceptually similar to the Landsat satellite imaging trick described above. Edward Thompson has compiled an artful book of such images, such as this one:

CIR images by Edward Thompson

Incidentally, the visibility of veins in IR has been used in some medical augmented-projection applications, such as the Christie VeinViewer and other infrared vein finder devices:
The VeinViewer visualizes an IR image with an augmented projection

It's common for CIR imaging to be used for aerial/satellite photography. Foliage, in particular, becomes much more visible:
CIR images by Edward Thompson

Below is a technical breakdown of how NIR-R-G "false color composite" images are constructed. If you want to try constructing such an image yourself (using the Channels palette in Photoshop, for example), here are the component images: trees_nir.png, trees_rgb.png; scene_nir.png, scene_rgb.png; and here's a NIR+RGB dataset. Such images can be useful for calculating the NDVI (Normalized Difference Vegetation Index), a widely used agricultural metric for quantifying the health and density of vegetation (see the sketch below).
trees-making-composite.png
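
Here's a small sketch of computing NDVI from the component images mentioned above, assuming trees_nir.png and trees_rgb.png are co-registered; the formula is NDVI = (NIR − Red) / (NIR + Red):

```python
# A sketch of NDVI computation from a paired NIR image and RGB image.
import numpy as np
from PIL import Image

nir = np.asarray(Image.open("trees_nir.png").convert("L"), dtype=np.float32)
red = np.asarray(Image.open("trees_rgb.png"), dtype=np.float32)[:, :, 0]  # R channel

# Healthy, dense vegetation reflects NIR strongly and absorbs red, so it scores high.
ndvi = (nir - red) / (nir + red + 1e-6)              # ranges roughly from -1 to +1

ndvi_img = ((ndvi + 1) / 2 * 255).astype(np.uint8)   # remap [-1, 1] -> [0, 255] for display
Image.fromarray(ndvi_img).save("ndvi.png")
```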

Richard Mosse's The Enclave (2013) is a documentary film about the ongoing civil war in Congo, shot on CIR film. Mosse's work has spurred controversy for the way in which it aestheticizes turmoil, especially as captured by a European working in Africa.
Richard Mosse's 'The Enclave'

Some pigments are not visible in NIR. This is used as an anti-counterfeiting measure in US currency. The different patterns of (missing) stripes are also useful in machine recognition; each denomination has its own pattern.

nir_10_and_100.jpg

Some materials are opaque in visible wavelengths, but transparent in NIR wavelengths. This means that NIR can be used to see certain kinds of obscured or invisible information. A common technique for this is infrared reflectography, which takes advantage of the NIR-transparency of some kinds of paint, in order to view a painting's underlayers:

NIR infrared reflectography

For example, The Blue Boy (ca. 1770), an oil painting by Thomas Gainsborough (1727–1788), contains an overpainted dog, discovered in 1994. Here the painting is shown in normal-light photography (left), digital X-radiography (center), and infrared reflectography (right).

The Blue Boy in visible, X-Ray, and NIR

The paintings "The Lynching of Leo Frank" and "Stella at the Playground" by Oliver Lutz (2010) use a (visibly) black, IR-transparent overpainting—covering a "secret" image that can only be observed by means of a NIR security camera and a nearby CCTV. Lutz makes many projects with this IR-clear, visibly-black overpainting. His work appeared in the Walker Art Center exhibition "Exposed: Voyeurism, Surveillance and The Camera since 1870".

Paintings by Oliver Lutz

Paintings by Oliver Lutz

Artist Osman Khan created a strictly IR-viewable image, whose content is visible only through the audience's personal digital capture devices.
Osman Khan IR-only art

In some circumstances, depending on materials, NIR cameras can see through clothes, as in this example with a tattoo, below.
NIR camera seeing through clothes

NIR imaging can be used to detect traced (i.e. forged) signatures:

Forged signatures in IR

There are also many uses of NIR imaging for real-time body tracking and computer vision.


Thermal Imaging

Thermal imaging senses wavelengths in the range of ~8,000–14,000 nanometers, also called long-wave infrared, which corresponds to what we experience as heat. In short, we see where something is hot, and to what extent.

forehead-thermometer.jpg
A touchless forehead thermometer (about USD20) is essentially a one-pixel thermal camera. Niklas Roy made a DIY Thermal Camera by mounting one on a pan/tilt servo.
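
Here's a rough sketch of that DIY scanning idea: sweep a one-pixel temperature sensor across a pan/tilt grid and assemble the readings into an image. The move_servo() and read_temperature() functions are hypothetical stand-ins for whatever servo and sensor libraries you actually use (e.g. an I2C IR thermometer and hobby servos):

```python
# Sweep a single-pixel temperature sensor on a pan/tilt rig to build a slow thermal image.
import time
import numpy as np

PAN_STEPS, TILT_STEPS = 64, 48          # resolution of the scanned "image"

def move_servo(pan_deg, tilt_deg):      # placeholder: command the pan/tilt rig
    pass

def read_temperature():                 # placeholder: read one temperature sample
    return 20.0

frame = np.zeros((TILT_STEPS, PAN_STEPS), dtype=np.float32)
for j in range(TILT_STEPS):
    for i in range(PAN_STEPS):
        move_servo(pan_deg=i * (90.0 / PAN_STEPS), tilt_deg=j * (60.0 / TILT_STEPS))
        time.sleep(0.05)                # let the sensor settle before sampling
        frame[j, i] = read_temperature()

np.save("thermal_scan.npy", frame)      # one slow, low-resolution thermal image
```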

What we see when we observe radiation in the ~8,000–14,000 nm range is emissive rather than reflective. Interestingly, there is a band between 5,000 and 8,000 nm that is not used for imaging, because the atmosphere has very low transmittance there: air itself absorbs IR at those wavelengths. Put another way, those are the wavelengths of light energy that heat the air itself.

IR Spectrum

  • David Attenborough discusses the use of thermal imaging to understand lizard temperature self-regulation, in this BBC video:

Thermal Lizards

  • A cult classic, THE OPERATION by Jacob Pander and Marne Lucas (1995) is a hybrid art/porn movie, shot completely with a thermal camera. (NSFW)

The Operation by Jacob Pander and Marne Lucas

  • Lucas and Pander have also produced Incident Energy, a multi-channel thermal video which explores "themes of nature and humanity", including live human birth.

Incident Energy by Jacob Pander and Marne Lucas

  • Route 94: My Love (2013) is a more recent music video with much the same idea. Note how the intensity of thermal energy can be interpreted according to a grayscale spectrum, color spectrum, etc.

Thermal image

  • Terike Haapoja's Community (2007) presents thermal videos of animals which have just died. We see the heat leaving their bodies.

Terike Haapoja animals

Portrait by Linda Alterwitz

Note that there is no "correct" way to view thermal imagery. Cameras offer a variety of palettes for mapping their temperature ranges, including grayscale, black-body, and chromatic (blue = cold) schemes.
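
As a small illustration, here's a sketch of remapping one temperature array through a few of OpenCV's built-in palettes; the input file is the hypothetical array saved by the scanning sketch above (any 2-D temperature array works), and none of these renderings is more "true" than another:

```python
# Render the same thermal data through several palettes.
import cv2
import numpy as np

temps = np.load("thermal_scan.npy")                    # degrees C, any 2-D float array
norm = cv2.normalize(temps, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)

grayscale = cv2.cvtColor(norm, cv2.COLOR_GRAY2BGR)     # white = hot
hot       = cv2.applyColorMap(norm, cv2.COLORMAP_HOT)  # black-body-style palette
chromatic = cv2.applyColorMap(norm, cv2.COLORMAP_JET)  # blue = cold, red = hot

cv2.imwrite("thermal_gray.png", grayscale)
cv2.imwrite("thermal_hot.png", hot)
cv2.imwrite("thermal_jet.png", chromatic)
```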

Sometimes simply presenting such alternative views can be a provocative, entertaining or educational experience for audiences.

  • Many science museums, such as the Exploratorium, have a "Heat Camera" display in which the public can see themselves in a thermal camera.

  • For City Thermogram (2015), Peggy Ahwesh was commissioned to create a live-video installation in New York's Times Square, in which she connected a thermal camera (trained on passersby) to a large electronic billboard.

City Thermogram by Peggy Ahwesh

  • In the exhibition "Laura Poitras: Astro Noise" (2016) at the Whitney Museum of American Art, the installation "Bed Down Location" features time-lapse video projections of night skies in Yemen and Somalia. In the next room over is Poitras' project "Last Seen", which presents a time-lagged thermal view of the previous room's mattress, revealing the viewer's own fading heat imprints.

Laura Poitras, Bed Down Location Laura Poitras, Last Seen

Google roof tool

  • Different gases absorb energy in different parts of the thermal radiation spectrum. Just as air is opaque from 5–8 µm, methane is opaque in parts of the LWIR. Thermal LWIR video was thus useful in visualizing the 2015 Porter Ranch methane leak:

Thermal video of methane leak

  • These Swedish researchers have built a modified thermal camera specifically for seeing methane. This matters because, over a 20-year period, methane is roughly 86 times more potent than CO2 as a greenhouse gas.

Thermal video of methane leak

Additional thermal videos of possible interest:


Thermochromic Imaging

While we're on the topic of visualizing heat: Some substances temporarily change color in response to heat. In different contexts, thermochromic pigments can work as a capture technology or a display technology.

  • Jay Watson's Thermochromic Table (2011) reveals where and how people have been sitting on the furniture.

Jay Watson's Thermochromic Table

Temperature Sensitive Object by Archilab

Thermochromic Urinal

  • Dutch artist Carina Hesper has created a book, Like a Pearl in My Hand, in which portrait photographs have been overprinted with a thermochromic coating. The reader must warm the pages by touch to reveal the underlying photographs.

Carina Hesper's book, 'Like a Pearl in My Hand'


Ultraviolet Imaging

  • The World In UV, an excellent overview video by science YouTuber Veritasium, discusses how UV offers new understandings of atmospheric haze, sunscreen, quinine, flowers, polar bears, and more:

The world in UV

Ultraviolet animal vision and markings.

The vast majority of humans cannot see ultraviolet light. However, this person was able to see UV light after he had cataract surgery.

seeing-uv-after-surgery.jpg

Many animals appear different in the ultraviolet, and many can see it. For example:

  • Butterflies are thought to have the widest spectral visual range of any land animal. Butterflies can use ultraviolet markings to find healthier mates. Ultraviolet patterns also help certain butterfly species mimic other species in the eyes of predators, while remaining distinguishable to potential mates.
  • Reindeer rely on ultraviolet light to spot lichens that they eat. They can also easily spot the UV-absorbent urine of predators among the UV-reflective snow.
  • One bird species was found to feed its young based on how much UV the chicks reflected.
  • Some species of birds use UV markings to tell males and females apart.
  • Black-eyed Susan flowers have petals that appear uniformly yellow to humans, but UV markings give them a bull's-eye-like design that attracts bees.
  • Sockeye salmon may use their ultraviolet perception to see food.

Kestrels can see in the UV, which helps them find prey by spotting its UV-fluorescent urine. Here's David Attenborough:
kestrel-vision.png

  • Scorpions glow under ultraviolet light, but scientists do not know why.

Scorpions glow in UV

Spectra of different species' vision

Bird vision makes use of UV

UV video overview by Thomas Leveritt, promoting sunscreen:
Thomas Leveritt video

UV is also widely used in forensics:
Cleaning marks visualized with UV

Recently, skin splotches that are visible only in UV have been used as markers for face-tracking.
UV face tracking

Fluorescence is not UV Imaging

Fluorescent materials absorb UV light (and thus appear dark in a UV camera), then convert some of that absorbed energy into visible light, which they re-emit. To the eye, fluorescent materials therefore appear unusually bright, as if glowing.

UV fluorescence as viewed by the STUDIO's UV camera:

fluorescence-test.jpg

UV fluorescence reveals hidden imagery in a Canadian Passport and Danish currency:

canadian-passport-with-uv.jpg

Money fluorescing

UV photography is very different from the photography of UV-excited fluorescence in the visible range. Fluorescence is the emission of longer wavelengths than those of incident light. Thus, illumination with UV radiation can cause the subject to fluoresce by emitting visible light. This is, for all practical purposes, photography in the visible range. (Source)

In addition to psychedelic posters and hidden features of paper currency, there's lots of interesting UV fluorescence in nature, such as in these minerals:
Rocks fluorescing

Artworks using UV

Cara Phillips makes portraits that explore the aesthetics of the human skin in UV:
Portrait by Cara Phillips

Using the wet collodion process, an early photographic technique invented by Frederick Scott Archer in 1851, photographer Michael Bradley developed a series of portraits featuring the facial tattoos of the Māori, the indigenous people of New Zealand.
Michael Bradley uv/vis portrait with Maori tattoos

The wet collodion process primarily records UV information, as can be seen in the spectrum recording below. The spectrum was generated by a prism and photographed directly using collodion. A photograph of the same spectrum was taken simultaneously with a digital color camera. The two photographs were then overlaid using registration marks to ensure accurate alignment.
visible and collodion spectra

UV imaging in practice

Golan in UV
The STUDIO has an extremely sensitive monochrome security camera which, used with a UV-pass filter, is able to view the world in UV. The image above was recorded with this camera. You can see some of my skin damage.

Ultraviolet light can also produce dramatic fluorescence in some materials.

Fluorescence is the emission of light by a substance that has absorbed light or other electromagnetic radiation. It is a form of luminescence. In most cases, the emitted light has a longer wavelength, and therefore lower energy, than the absorbed radiation. The most striking example of fluorescence occurs when the absorbed radiation is in the ultraviolet region of the spectrum, and thus invisible to the human eye, while the emitted light is in the visible region, which gives the fluorescent substance a distinct color that can be seen only when exposed to UV light. Fluorescent materials cease to glow nearly immediately when the radiation source stops, unlike phosphorescent materials, which continue to emit light for some time after.

Because they absorb UV light, most fluorescent materials appear dark or black when imaged in UV.

  • Vitamin B2 fluoresces yellow.
  • Tonic water fluoresces blue due to the presence of quinine.
  • Highlighter ink is often fluorescent due to the presence of pyranine.

X-Ray Imaging

X-Ray Imaging in the Arts

As X-rays can reveal the interior structure of objects and people, we expect to see artists exploring this form of 'revelation'. For example, here is a Rose by Bryan Whitney, from this Survey of X-Ray Photographic Art:

Rose X-Ray by Bryan Whitney

  • In his series 'Xograms', Hugh Turvey (Artist in Residence, The British Institute of Radiology) takes a "deeper look at everyday objects":

Xograms, by Hugh Turvey

infrastructures-of-natural-history-leopard.jpg

David Maisel X-Ray

X-ray portraits by Ayako Kanda and Mayuka Hayashi

Lick by Wim Delvoye

Pinup Calendar by BUTTER

It is also possible to design or arrange objects for the express purpose of having them discovered in X-Ray images.

Evan Roth's TSA Interventions

Evan Roth's TSA Interventions

  • We can also reimagine extreme insertions as a form of performance art, for an audience of radiologists. (X-ray insertions)

X-ray insertions

  • Backscatter X-Ray imaging operates differently, and measures the X-Rays that bounce off of materials rather than passing through them.
    Backscatter examination of truck with illegal immigrants

Multispectral Imaging

What is the spectrum of light captured by each pixel of a camera?

bayer_pattern_on_sensor.png

Using Bayer filters, standard color digital cameras record three channels (RGB) of spectral information. Using less-common, Bayer-like filter mosaics, a few specialized cameras record four channels (RGB+NIR), or even as many as 8 or 16. As the number of channels increases further, multispectral imaging becomes hyperspectral imaging, which attempts to capture the entire reflective or emissive spectrum at every pixel, essentially representing an image as a three-dimensional (x, y, λ) data cube. Some hyperspectral cameras have as many as 200 spectral channels, some with bands as narrow as 7 nanometers.

Hyperspectral imaging
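
As a concrete mental model, here's a tiny sketch of handling such a data cube in numpy; the file name, dimensions, and wavelength range are all hypothetical:

```python
# Treat a hyperspectral image as an (x, y, lambda) data cube.
import numpy as np

H, W, BANDS = 512, 512, 200                      # e.g. ~200 narrow spectral channels
cube = np.fromfile("scene_cube.raw", dtype=np.float32).reshape(H, W, BANDS)

wavelengths = np.linspace(400, 1000, BANDS)      # nm; assumed sensor range
spectrum = cube[256, 256, :]                     # the full spectrum at one pixel
band_30 = cube[:, :, 30]                         # one narrow-band grayscale image

print("Peak reflectance at pixel (256,256):", wavelengths[np.argmax(spectrum)], "nm")
```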

Multivariate statistics and machine learning are often required for the analysis of hyperspectral images, in order to perform "spectral unmixing": discriminating the spectral contributions of different materials ("endmembers") based on their unique spectral signatures. In remote sensing and satellite imagery, for example, such endmembers might include vegetation, soil, moisture, and certain minerals:

Hyperspectral imaging with spectral breakdowns

or, in human skin: melanin, hemoglobin, oxyhemoglobin (blood oxygenation), and bilirubin.

Hyperspectral imaging with spectral breakdowns

Hyperspectral imaging with spectral breakdowns
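
Here's a minimal sketch of the simplest version of the unmixing described above: linear unmixing with non-negative least squares. The cube and endmember files are hypothetical stand-ins for real data and a real spectral library:

```python
# Model each pixel's spectrum as a non-negative mixture of known endmember spectra.
import numpy as np
from scipy.optimize import nnls

cube = np.load("scene_cube.npy")            # shape (H, W, bands); hypothetical file
endmembers = np.load("endmembers.npy")      # shape (bands, n); e.g. vegetation, soil, water spectra

H, W, bands = cube.shape
pixels = cube.reshape(-1, bands)
abundances = np.zeros((pixels.shape[0], endmembers.shape[1]), dtype=np.float32)

for k, spectrum in enumerate(pixels):
    # Solve spectrum ~= endmembers @ a, with a >= 0 (non-negative least squares).
    abundances[k], _ = nnls(endmembers, spectrum)

abundance_maps = abundances.reshape(H, W, -1)   # one fractional-abundance map per endmember
```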

Case Studies

There are many applications in agriculture, food processing, dermatology, forensics, art conservation, and more.

Technologies

Multispectral and hyperspectral cameras are built using several different technologies. Most methods require a great deal of time to capture a scene at all of the supported frequencies.


Radio-Wave Imaging

Stud finder
A "stud finder" (around USD 30 at Home Depot) is essentially a one-pixel RF camera.

[From Wikipedia] Radar is an object-detection system that uses radio waves to determine the range, angle, or velocity of objects. It can be used to detect aircraft, ships, spacecraft, missiles, motor vehicles, weather formations, and terrain. A radar transmits radio waves or microwaves that reflect from any object in their path.

Marine radar

Marine radar is surprisingly inexpensive and produces fascinating images of the world, allowing (for example) imaging of nearby whales. Devices such as the Furuno 1623 or Furuno DRS4W systems cost under $1500.

Ground penetrating radar
(Image from here).

Ground-penetrating radar images features such as underground pipes. We have a contact at the Pittsburgh-area GPR provider, Geospatial Corporation.

Radio waves can be used in other ways for imaging, as in RF-Pose by researchers at MIT CSAIL. The STUDIO owns a Walabot RF imaging device (essentially a stud-finder array) which, used with appropriate machine-learning models, can form the basis for this type of imaging.

Millimeter Wave & Terahertz Imaging

Terahertz imaging

This is the stuff that really sees through clothing.

CT Scans, MRI, CAT etc.

X-ray computed tomography (X-ray CT) and magnetic resonance imaging (MRI) are medical imaging techniques that employ significant computation to produce 3D models of internal body structures and activities. Perhaps, if you have a good reason, you can get a scan of yourself at the hospital.
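
To get a feel for the computation involved, here's a small sketch of 2D filtered back-projection using scikit-image's Radon-transform utilities, run on a simulated slice rather than real scanner data:

```python
# Many 1-D projections taken at different angles (a sinogram) are combined
# by filtered back-projection into a 2-D slice.
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon, rescale

slice_true = rescale(shepp_logan_phantom(), 0.5)        # a synthetic "body" slice
theta = np.linspace(0.0, 180.0, 180, endpoint=False)    # projection angles (degrees)

sinogram = radon(slice_true, theta=theta)               # simulate the scanner
reconstruction = iradon(sinogram, theta=theta)          # filtered back-projection

error = np.sqrt(np.mean((reconstruction - slice_true) ** 2))
print("RMS reconstruction error:", error)
```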

Kyle McDonald CT scan
Kyle McDonald experimenting with some of his own CT scan data in openFrameworks.

Angiography is the process of imaging blood vessels. Recent progress in MRI has made whole-body magnetic resonance angiography (MRA) possible:
whole body angiography

Bird CT scan
CT scan by Scott Echols capturing the tiny blood vessels in the head of a pigeon, made possible by a special 'contrast agent' that highlights the microvasculature.


Neutron Imaging

Neutron imaging is based on the neutron attenuation properties of the imaged object. The resulting images have much in common with X-ray images, but some things easily visible with neutron imaging may be very challenging or impossible to see with X-ray imaging techniques (and vice versa). X-rays are attenuated according to a material's density: denser materials stop more X-rays. With neutrons, a material's likelihood of attenuating the beam is not related to its density. Some light materials, such as boron, absorb neutrons; hydrogen generally scatters them; and many commonly used metals allow most neutrons to pass through almost completely.

Neutron imaging is often used to see the movement of fluids, such as oil or water, inside large metal objects. (Car makers regularly visit neutron imaging facilities to carry out quality-control tests on engines.) Another application of neutron imaging is the study of wooden objects. Video

Neutron video of coffee brewing


Polarimetry

Polarization is a property of light which describes, not its frequency or wavelength, but the orientation of the plane in which its waves oscillate. It is useful in visualizing several phenomena which cannot otherwise be seen by the human eye.

Polarized light eliminates reflections
Polarized light eliminates reflections. Here, a circular polarizer eliminates reflections on water, making another world visible beneath. From "Removing Glare with a Circular Polarizer", which includes a nice video.

By computing the difference between images of a scene taken with and without polarizing filters, it's possible to separate the diffuse appearance of an object from its specular appearance. The images below, taken from "How To Split Specular And Diffuse In Real Images", show how this can be done. The first image is the 'regular' appearance, followed (through image differencing) by the diffuse-only and specular-only images.

Ordinary image of scissors
Diffuse image of scissors
Specular image of scissors
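
Here's a sketch of the differencing idea, assuming two aligned photographs of the same scene: one shot through a cross-oriented polarizer (which suppresses most specular reflection) and one shot parallel/unfiltered. The filenames are hypothetical:

```python
# Separate diffuse and specular appearance by differencing polarized captures.
import cv2
import numpy as np

parallel = cv2.imread("scene_parallel.png").astype(np.float32)
crossed  = cv2.imread("scene_crossed.png").astype(np.float32)

diffuse  = crossed                                 # cross-polarized ~= diffuse only
specular = np.clip(parallel - crossed, 0, 255)     # the difference ~= specular only

cv2.imwrite("diffuse_only.png", diffuse.astype(np.uint8))
cv2.imwrite("specular_only.png", specular.astype(np.uint8))
```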

You can see this technique applied to a human face in this video of the photoreal Digital Emily Project.

This face analysis research combines machine learning, polarimetry (to split specular and diffuse reflections), and hyperspectral imaging (to visualize differences in blood oxygenation and skin pigmentation):

Polarimetric thermal imaging enhances standard thermal imaging (2nd frame, below) with polarizing filters (generally wire-grid polarizers). This produces views of the thermal radiation through two orthogonal polarization orientations (3rd and 4th frames), which reveal structures not visible in conventional thermal imagery. This information encodes the spatial orientation of the surface emitting the heat.
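
As a sketch of how such measurements are typically combined (not necessarily the pipeline used in the work pictured), here are the standard Stokes-parameter calculations from four frames captured through a polarizer at 0°, 45°, 90°, and 135°; the filenames are hypothetical:

```python
# Compute degree and angle of linear polarization from four polarizer orientations.
import numpy as np
from PIL import Image

def load(name):
    return np.asarray(Image.open(name).convert("F"))

I0, I45, I90, I135 = (load(f"thermal_{a}.png") for a in (0, 45, 90, 135))

S0 = 0.5 * (I0 + I45 + I90 + I135)           # total intensity
S1 = I0 - I90                                # horizontal vs. vertical component
S2 = I45 - I135                              # diagonal components

dolp = np.sqrt(S1**2 + S2**2) / (S0 + 1e-6)  # 0 = unpolarized, 1 = fully polarized
aolp = 0.5 * np.arctan2(S2, S1)              # related to the emitting surface's orientation
```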

More information about polarized light in this video from PBS:
PBS Polarizing video

Incidentally, there are some other very clever ways of separating specular from diffuse appearances of objects.

Transmissive Polarized Light: Visualizing Stress

Polarized light can also reveal internal stresses in (clear) materials, in a phenomenon known as photoelasticity. Here's a plastic ruler between cross-polarized filters:

Polarized light stress analysis

here's the setup
Here's the setup to achieve this. More information can be found at (e.g.) Andrew Davidhazy's site.

Some nice videos of polarization and stress visualization:


Raking & Transmissive Light

Raking light

Sometimes, illuminating a subject from an unusual direction reveals new information about it. Raking (oblique) and transmissive illumination (from behind) are commonly used by art conservators.

Raking light is a technique in which a subject is illuminated from one side only, at an extremely oblique angle relative to its surface. Raking light is used to reveal an object's surface texture: raised parts of the surface facing the light are illuminated, while those facing away cast shadows.

In this example, a single sheet of blank paper was imaged in normal illumination on the left and in raking light on the right. In the left image, the paper appears smooth and flat, while the right image reveals the laid structure of the paper by exaggerating the texture of the vertical chain lines and the watermark at the upper edge. Raking light also highlights the vertical crease running through the sheet, tears along that crease at the edges, and shorter handling creases scattered throughout.
Raking light

In the example below, lighting the same piece of paper from behind reveals details (like its watermarks) that would not ordinarily be so easy to discern.
Transmissive light

Raking light has been used to create a "crack map" of a well-known Vermeer painting. The middle image is a colorless texture image.
Crack map from raking light

Vaclav Skala has used the raking light from a flatbed scanner, captured from four directions, to compute 3D reconstructions of nearly flat objects (see the sketch below).
vaclav_skala_3D_reconstruction_scanner.jpg
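
One way to sketch the underlying idea is classic photometric stereo: given several aligned images lit from known directions, solve for a surface normal at each pixel. The filenames and light directions below are assumptions for illustration, not Skala's actual method:

```python
# Photometric stereo: recover per-pixel surface normals from directionally lit scans.
import numpy as np
from PIL import Image

names = ["scan_left.png", "scan_right.png", "scan_top.png", "scan_bottom.png"]
images = np.stack([np.asarray(Image.open(n).convert("L"), dtype=np.float32)
                   for n in names], axis=-1)               # (H, W, 4)

# Unit light directions (x, y, z) for the four scans; assumed, not measured.
L = np.array([[-0.7, 0.0, 0.7],
              [ 0.7, 0.0, 0.7],
              [ 0.0,-0.7, 0.7],
              [ 0.0, 0.7, 0.7]], dtype=np.float32)

H, W, _ = images.shape
I = images.reshape(-1, 4).T                                 # (4, H*W) intensities
G, *_ = np.linalg.lstsq(L, I, rcond=None)                   # solve L @ g = I per pixel
normals = G / (np.linalg.norm(G, axis=0, keepdims=True) + 1e-6)
normal_map = normals.T.reshape(H, W, 3)                     # per-pixel surface normals
```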


Schlieren Imaging and Shadowgraphy

Schlieren imaging creates images which reveal, and depend on, minute differences in the index of refraction of air. In short, it depends not on a property of light, but on a property of light's medium.

Video:
Schlieren imaging

Video:
Schlieren imaging

Shadowgraphy is an optical method that reveals non-uniformities in transparent media like air, water, or glass. It is related to, but simpler than, Schlieren methods, generally using a flash and an extremely brief exposure. (It's literally just the shadows of shimmering heat in the air.) Video

The shadowgraph is the simplest form of optical system suitable for observing a flow exhibiting variations of the fluid density. In principle, the system does not need any optical component except a light source and a recording plane onto which to project the shadow of the varying density field (Figure 1). A shadow effect is generated because a light ray is refractively deflected so that the position on the recording plane where the undeflected ray would arrive now remains dark. At the same time the position where the deflected ray arrives appears brighter than the undisturbed environment.

shadowgraphy

Here are some smartphone shadowgraphy techniques.

Dr. Gary Settles at Penn State University, an expert in Schlieren imaging and shadowgraphy, has built a "Full Scale Schlieren" system using retroreflective material.

Full Scale Schlieren

Full Scale Schlieren

It is also possible to produce Schlieren-like images by performing background subtraction against a white-noise background pattern (an approach related to background-oriented schlieren):
Noise Schlieren
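
Here's a sketch of that idea, assuming a fixed camera, a reference photo of the noise background, and a second photo taken with (say) a heat plume in between; filenames are hypothetical:

```python
# Compare each frame to an undisturbed reference of the noise background.
# Simple differencing shows where refraction shifted the pattern; optical flow
# estimates how far (closer to true background-oriented schlieren).
import cv2
import numpy as np

reference = cv2.imread("noise_reference.png", cv2.IMREAD_GRAYSCALE)
frame     = cv2.imread("noise_with_heat_plume.png", cv2.IMREAD_GRAYSCALE)

difference = cv2.absdiff(frame, reference)        # bright where the pattern moved

flow = cv2.calcOpticalFlowFarneback(reference, frame, None,
                                    0.5, 3, 15, 3, 5, 1.2, 0)
magnitude = np.linalg.norm(flow, axis=2)          # apparent displacement of the background

cv2.imwrite("schlieren_diff.png", difference)
cv2.imwrite("schlieren_flow.png", cv2.normalize(magnitude, None, 0, 255,
                                                cv2.NORM_MINMAX).astype(np.uint8))
```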


Sonar and Ultrasound

The STUDIO has a portable ultrasound device. ExCap'20 student Cat Ploehn used this to create a typology or portrait series of her classmates' beating hearts.

Ploehn hearts

It's well-known that bats and dolphins image the world through ultrasonic reflections.

But were you aware of human echolocation? Here are three different individuals who, despite being blind, are able to build a detailed mental map of their surroundings:

Of course, there are also devices for sonar imaging. You are probably familiar with ultrasonic fetal imaging. 2D is more common, but recently 3D ultrasound has become available.

Fetal ultrasound

It has become popular in Japan to 3D-print copies of the unborn.

3D-printed fetus

Sonar can also be used to image environments in both 2D and 3D. Using equipment such as this, for example, people investigate and discover seafloor shipwrecks.

Sonar shipwreck


Sonic Imaging

Microphone arrays, in combination with appropriate analysis software, can be used to localize audible or near-ultrasonic sound sources, for example to pinpoint gas leaks in pipes. Such systems are often called "acoustic cameras".
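
Here's a minimal sketch of the core idea, delay-and-sum beamforming for a small linear microphone array; the array geometry and recording file are hypothetical:

```python
# Delay-and-sum beamforming: for each candidate direction, delay each mic's
# signal by the expected travel-time difference and sum; loud directions
# produce strong sums.
import numpy as np

SPEED_OF_SOUND = 343.0                       # m/s
FS = 48000                                   # sample rate, Hz
mic_x = np.array([0.00, 0.05, 0.10, 0.15])   # 4-mic linear array positions (m)
signals = np.load("mic_recordings.npy")      # shape (4, n_samples)

angles = np.deg2rad(np.arange(-90, 91, 2))   # candidate arrival directions
power = np.zeros(len(angles))

for i, theta in enumerate(angles):
    delays = mic_x * np.sin(theta) / SPEED_OF_SOUND          # seconds, per mic
    shifts = np.round(delays * FS).astype(int)               # integer-sample delays
    aligned = [np.roll(sig, -s) for sig, s in zip(signals, shifts)]
    beam = np.sum(aligned, axis=0)                           # delay-and-sum
    power[i] = np.mean(beam**2)

best = np.rad2deg(angles[np.argmax(power)])
print(f"Loudest source near {best:.0f} degrees off broadside")
```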

Acoustic camera

Acoustic camera

Acoustic camera