
OI (optical image)


Scripts for Optical Image and Optics


Optical Image (oi Prefix) fundamentals

ISETCam uses the term optical image to describe the light that has passed through the optics and is incident at the sensor surface. This light is also called the spectral irradiance at the sensor surface. The names of many functions that operate on the optical image begin with the letters 'oi', including the basic functions oiCreate, oiSet/Get, oiCompute, oiWindow, and oiPlot.

These lines of code create a scene and an optical image structure. They then compute the irradiance and render it in a window.

scene = sceneCreate;        % Default scene (Macbeth color checker)
oi = oiCreate('wvf');       % Optical image with wavefront (wvf) optics
oi = oiCompute(oi,scene);   % Compute the spectral irradiance
oiWindow(oi);               % Render the result in a window

[Figure: the oiWindow display of the computed optical image (Macbeth color checker scene)]

Typing 'oi<TAB>' at the Matlab command prompt brings up a long list of these functions. There are also many scripts and tutorials that compute, plot, and derive properties of the optical image.

The scene spectral radiance is measured in units of photons/sec/sr/nm/m^2. The optical image spectral irradiance is measured in photons/sec/nm/m^2; the steradian term drops out because the optics collect the radiance over the solid angle of the lens aperture.
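A small sketch of pulling the underlying spectral data and wavelength samples out of these structures, reusing the scene and oi computed above (the variable names are just for illustration):

radiance = sceneGet(scene,'photons');   % Scene radiance, photons/sec/sr/nm/m^2, size (row, col, wave)
irrad    = oiGet(oi,'photons');         % OI irradiance,  photons/sec/nm/m^2,    size (row, col, wave)
wave     = oiGet(oi,'wave');            % Wavelength samples (nm)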

Optical Image (oi) data are spectral

The image shown in an oiWindow is, of course, an RGB image. But the oi data represented in the software are spectral. For example, when we invoke oiPlot to show the data on the 63rd row of an optical image, the following images appear on the screen:

thisRow = 63; oiPlot(oi,'irradiance hline',[1 thisRow]);   % Spectral irradiance along row 63

The white line in the oiWindow shows which line is plotted. The mesh in the plot window shows the spectral irradiance (photons/sec/nm/m^2) as a function of position on the sensor surface (microns). This line passes through the gray series of the chart, and for this line the spectral irradiance curves in the gray patches are mostly just scaled copies of one another.
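The plotted data can also be pulled out directly. A minimal sketch, assuming the oi and thisRow variables defined above:

irrad   = oiGet(oi,'photons');            % Full (row, col, wave) irradiance cube
rowData = squeeze(irrad(thisRow,:,:));    % One row: spatial position x wavelength
% Each column of rowData is the irradiance profile at a single wavelength sample.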

Optical Image (oi) rendering

The rendered RGB image is computed from the spectral irradiance by using a standard calculation:

  • The spectral irradiance is converted to CIE XYZ values at each spatial sample (see ieXYZFromPhotons).
  • The display values in the window are computed from the XYZ values by assuming the user is viewing a standard sRGB display (xyz2srgb).

To learn more about how we represent displays see the ISETCam displays section. Read about the sRGB standard on Wikipedia.
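Here is a sketch of that rendering path using the functions named above; the normalization step is an illustrative assumption, since the window applies its own scaling.

irrad = oiGet(oi,'photons');            % Spectral irradiance (row, col, wave)
wave  = oiGet(oi,'wave');               % Wavelength samples (nm)
XYZ   = ieXYZFromPhotons(irrad, wave);  % CIE XYZ at every spatial sample
XYZ   = XYZ / max(XYZ(:));              % Normalize into the display range (illustrative)
srgb  = xyz2srgb(XYZ);                  % Standard sRGB display values
imagesc(srgb); axis image off           % Show the rendered image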

Optics

The relationship between the radiance and irradiance images is defined by the properties of the optics. The optics structure is represented as a slot within the optical image structure.

oi.optics

ans = 

  struct with fields:

             type: 'optics'
             name: 'standard (1/4-inch)'
            model: 'diffractionlimited'
          fNumber: 4
      focalLength: 0.0039
              OTF: [1×1 struct]
    transmittance: [1×1 struct]
          offaxis: 'cos4th'
       vignetting: 0
           cos4th: [1×1 struct]

To access the parameters of the optics, use oiGet with an 'optics ...' parameter string, such as

>> oiGet(oi,'optics f number')

ans =

     4
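You can also change optics parameters with oiSet and then recompute. A brief sketch; the specific values here are just examples:

fN = oiGet(oi,'optics fnumber');        % Current f-number (4 by default)
fL = oiGet(oi,'optics focal length');   % Focal length in meters
oi = oiSet(oi,'optics fnumber', 2.8);   % Switch to a faster (lower f-number) lens
oi = oiCompute(oi, scene);              % Recompute so the change takes effect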

The link at the top of the page, and this one, take you to a page of scripts and tutorials that illustrate many optical image calculations.

Types of optics

The optics are an essential part of the image systems simulation. We provide a quick overview here, and then link out to separate pages for more detailed information.

Shift-invariant optics - diffraction

Many evaluations use a simple approximation of the scene and optics. The scene is modeled as a two-dimensional plane, and for each wavelength the optics is modeled as a shift-invariant linear system. In this way the calculation from the scene radiance to the optical image is no more than a set of convolutions, one for each wavelength. The convolution kernel, called the point spread function (PSF), varies with wavelength but (in this case) not with position in the visual field. Even systems that are not completely shift-invariant are approximately shift-invariant over some region (isoplanatic).

There is an important special case of shift-invariant optics: the perfect lens. These are called diffraction-limited optics. See this Airy Disk tutorial. The wavelength-dependent point spread function (PSF) of a diffraction-limited lens can be calculated directly from a formula. The f-number (the ratio of focal length to aperture diameter) is the only parameter needed to define the diffraction-limited lens properties. As you can see from the code and image above, by default oiCreate() returns diffraction-limited optics with an f-number of 4.
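A minimal sketch of working with a diffraction-limited lens; the only free parameter is the f-number, and the f/8 value here is just an example:

oi = oiCreate('diffraction limited');   % Perfect (aberration-free) lens model
oi = oiSet(oi,'optics fnumber', 8);     % Change the default f/4 to f/8
oi = oiCompute(oi, sceneCreate);
oiWindow(oi);

% The first zero of the Airy pattern falls at roughly 1.22 * wavelength * f-number
lambda     = 550e-9;                                      % meters
airyRadius = 1.22 * lambda * oiGet(oi,'optics fnumber');  % about 5.4 microns at f/8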

Shift-invariant optics - wavefront aberrations

Deviations from the ideal are typically described by the optics' wavefront aberrations, and ISETCam implements shift-invariant optics using wavefront aberrations. There are many functions to create, set, get, and plot the wavefront representations (wvf), and these are at the heart of the shift-invariant optics calculations. The wavefront aberrations are described using the international standard Zernike polynomial representation, which includes aberrations with specific labels such as defocus, coma, and different types of astigmatism.

In the spectral irradiance calculations, wavefront aberrations are converted into a point spread function. The PSF is convolved with the spectral radiance. The implementation requires paying attention to the details of the sampling rate for the scene spectral radiance. There are many tutorials and examples of these calculations in the ISETCam toolbox.
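A sketch of the wavefront route, outlining the typical calling sequence. The exact set/get parameter strings, and whether the PSF must be computed explicitly (e.g., with wvfComputePSF), can vary between versions, so treat this as an outline rather than the definitive API:

wvf = wvfCreate;                                % Default wavefront, no aberrations
wvf = wvfSet(wvf,'zcoeffs', 0.5, {'defocus'});  % Half a micron of Zernike defocus
wvf = wvfComputePSF(wvf);                       % Compute the PSF (may differ by version)
oi  = wvf2oi(wvf);                              % Convert to an oi with shift-invariant optics
oi  = oiCompute(oi, sceneCreate);
oiWindow(oi);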

Wavefront aberrations are particularly helpful when modeling the on-axis properties of the human eye. Many investigators use adaptive optics to measure these aberrations, and we take advantage of these data in the ISETBio toolbox.

Shift-variant optics - Ray tracing

For many optical systems, the point spread function varies significantly with both wavelength and position in the visual field. We have a computational method ('ray trace') that works for these lenses when we have a lens description. Typically, these descriptions can be obtained from lens design programs, such as Code V or Zemax. For example, we have a Zemax macro that produces a large number of point spread functions along with a description of the geometric distortion and the relative illumination of the model lens. ISETCam (rtImagePSFFieldHeight) imports these data into the optics and uses them for the oiCompute calculation. We have validated that the ISETCam import of the Zemax data produces a rendered image that matches the one calculated using Zemax directly. More recently, we have built a similar macro for Code V.

ISET3d

Up to this point, we have only described calculations for a planar scene, or equivalently for a scene in which all the objects are far away. For many scenes the three-dimensional structure is important. To model how the three-dimensional scene radiance is converted to the irradiance at the sensor surface, we developed ISET3d (and more recently ISET3d-tiny). That software takes 3D graphics models as inputs, allows the user to specify multi-element lenses, and then uses Physically Based Ray Tracing (PBRT) to trace rays from the 3D scene through the lens, computing the spectral irradiance at the sensor surface.

ISET3d includes a great many functions and data, including lens models. The software even includes methods to place microlenses at the sensor surface to model advanced camera designs, such as dual pixel autofocus and light field cameras. These calculations are complex and somewhat specialized, and using the repository requires a more complex hardware configuration and data management. For that reason we separated the code into its own repository, ISET3d. When it is installed, the ISET3d toolbox can be used to calculate ISET scenes and optical images, and it works with both ISETCam and ISETBio.
