Sensor model and reference system documentation #36

Merged · 14 commits · Feb 27, 2024
3 changes: 2 additions & 1 deletion definitions.md
@@ -32,4 +32,5 @@
*[PVL]: Parameter Value Language (PVL) is used extensively by ISIS as a standard keyword value type language for naming and expressing data values. PVL format in ISIS is compatible with syntax used by the Planetary Data System.
*[MAP]: A representation of a three-dimensional target, such as a sphere, ellipsoid, or irregularly shaped body, onto a plane
*[IAU]: The International Astronomical Union
*[Scale]: The map resolution measured in pixels per degree
*[ISD]: Image Support Data. The interior and exterior orientation data used to instantiate a camera model.
Binary file added docs/assets/sensor_models/ess2507-fig-0001-m.jpg
6 changes: 5 additions & 1 deletion docs/concepts/glossary.md
@@ -197,4 +197,8 @@
### Camera Coordinate System
A camera coordinate system is used to determine the positions and directions of objects and vectors with respect to a sensor (i.e., a camera). The origin is at the center of the sensor, and the axes rotate and move through space with the sensor. These systems are non-inertial, meaning the velocity of the origin is not constant. The locations of the center of a spacecraft and its instruments at a given time are defined in NAIF SPK files. Generally, one or more frames are associated with a particular instrument.
### J2000 Coordinate System
The J2000 coordinate system (also known as EME2000) is based on the Earth mean equator and dynamical equinox at midnight on January 1, 2000. The origin is at the solar system barycenter. This system is inertial, since it does not rotate with respect to the stars and the origin is not accelerating (i.e., it has a constant velocity). This coordinate system is the root reference for NAIF's SPICE files and software.
### ISD
Image Support Data. The interior and exterior orientation data used to instantiate a camera model.
### Image Support Data
The interior and exterior orientation data used to instantiate a camera model.
25 changes: 25 additions & 0 deletions docs/concepts/sensor-models/reference-frames.md
@@ -0,0 +1,25 @@
# Reference Systems

In order to describe the location of an object, it is necessary to identify a point of reference by which the object can be localized. One might describe the object with respect to one's own position, but it is equally valid to describe it as lying at a certain location on a grid, or at some angular distance from a known location. Despite their differences, each of these descriptions produces a valid representation of the object's location with respect to some point of reference. By specifying an origin about which other objects can be oriented, we begin to form the basis of a "reference system," which establishes a coordinate system that can be used to describe positions. In practice, a reference system must define a datum (a known coordinate reference by which unknown points can be identified) and a set of axes with an associated plane in which angular measurements can be made. Some common photogrammetric reference systems are listed below; a small worked example converting between two of them follows the figure.

- Image Reference System: a 2-dimensional (column/row or line/sample) reference system with a defined (0,0) datum, typically at the upper left of the frame.
- Distorted Focal Plane Reference System: a 2-dimensional (x,y) reference system nominally represented in Cartesian space and measured in millimeters. This reference system is used to account for the distortion in an image reference system.
- Sensor Reference System: a 3-dimensional (x,y,z) reference system whose z axis typically follows the sensor's boresight (for an optical sensor) and whose x and y axes are traditionally parallel to the sensor's x and y axes.
- Spacecraft Reference System: a 3-dimensional (x,y,z) reference system in which the datum is centered on the spacecraft's center of mass. Axes are often represented using 𝜔, 𝜙, 𝜅, which are analogous to yaw, pitch, and roll, but instead of representing axial rotations with respect to onboard navigation, they are expressed with respect to the axes of an arbitrary coordinate system.
- Body Centered Body Fixed (BCBF): a 3-dimensional (x,y,z) reference system in which the datum is centered on the target body's center of mass.

<figure markdown>
![Reference Systems](../../assets/sensor_models/ess2507-fig-0001-m.jpg)
<figcaption>Reference Systems: (a) Framing sensor image reference frame with origin in upper left corner of the upper left pixel; (b) distorted focal plane reference frame with the distorted origin again in the upper left; (c) sensor (1), gimbal (2a, 2b illustrating an articulated gimbal), and spacecraft (3) reference frames; (d) spacecraft reference frame in the standard configuration with the x axis being the orbital path; (e) body-centered body-fixed (BCBF) reference frame (Laura et al., 2020).</figcaption>
</figure>
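
To make these reference systems concrete, the example below converts an image-space (line, sample) coordinate into a focal plane (x, y) coordinate. It is a minimal sketch: the pixel pitch and optical center are hypothetical values, and a real sensor model would also apply an instrument-specific distortion model at this step.

``` python
import numpy as np

# Hypothetical instrument parameters (illustrative only).
pixel_pitch_mm = 0.007                      # detector pixel size in millimeters
center_line, center_sample = 512.5, 512.5   # optical center in image space

def image_to_focal_plane(line, sample):
    """Convert an image coordinate (line, sample) to focal plane (x, y) in mm.

    Assumes a simple scale-and-translate mapping; a real model would also
    correct for optical distortion here.
    """
    x = (sample - center_sample) * pixel_pitch_mm
    y = (line - center_line) * pixel_pitch_mm
    return np.array([x, y])

# Offset, in millimeters, of pixel (line=100, sample=900) from the optical center.
print(image_to_focal_plane(100.0, 900.0))
```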


# Reference Frames
It is important to note that a reference system does not itself describe an object's location; instead, it provides a _means_ for describing locations. When locations are described using a reference system, the result is a "coordinate reference frame," which offers information about an object's position, orientation, and velocity at a single instant in time.

!!! INFO "Reference System vs Reference Frame"
A __reference system__ provides an origin, axes / planes, and a fixed point that allows for the description of an object. A __reference frame__ is a description within the context of that reference system, and provides the location of an object at a given moment in time.

# Frame Chains

Reference frames often have dependencies on other reference frames, which results in a "frame chain" that must be calculated. For example, consider a common scenario in which a camera is intended to record an image of a planetary body. The camera is mounted on a gimbal, which is mounted on the spacecraft. In order to move from the image plane to a BCBF system, it is necessary to know the location of both the camera and the planetary body. However, in order to know the location of the camera, it is necessary to know the extrinsics not only of the camera, but also of the gimbal and the spacecraft to which it is mounted. The dependency chain that is formed is known as a "frame chain," and the entire chain of extrinsics must be calculated in order to accurately model the objects' relative locations.
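
As a hedged illustration of evaluating a frame chain, the sketch below composes three hypothetical rotations (sensor to gimbal, gimbal to spacecraft, spacecraft to BCBF) into a single sensor-to-BCBF rotation. The angles are arbitrary placeholders for the time-dependent orientations that would normally be read from SPICE kernels.

``` python
import numpy as np
from scipy.spatial.transform import Rotation as R

# Hypothetical, time-fixed orientations for each link in the chain.
# In practice these rotations are time dependent and come from SPICE data.
sensor_to_gimbal     = R.from_euler("xyz", [0.0, 0.0, 5.0], degrees=True)
gimbal_to_spacecraft = R.from_euler("xyz", [0.0, 30.0, 0.0], degrees=True)
spacecraft_to_bcbf   = R.from_euler("xyz", [10.0, 0.0, 90.0], degrees=True)

# Composing the chain yields a single sensor-to-BCBF rotation.
sensor_to_bcbf = spacecraft_to_bcbf * gimbal_to_spacecraft * sensor_to_gimbal

# A look vector expressed in the sensor frame (e.g., the boresight)...
boresight_sensor = np.array([0.0, 0.0, 1.0])
# ...can now be expressed directly in the body-centered, body-fixed frame.
print(sensor_to_bcbf.apply(boresight_sensor))
```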
63 changes: 63 additions & 0 deletions docs/concepts/sensor-models/sensor-model-software.md
Original file line number Diff line number Diff line change
@@ -0,0 +1,63 @@
# Sensor Model Software
This document provides a list of several software packages that implement, manage, or interact with sensor models. This is not a complete list, and it is only intended to provide examples of how software packages can be used to create or interact with sensor models.

## The ASC Sensor Model Ecosystem
The Astrogeology Science Center (ASC) creates and maintains a suite of software packages that together form an end-to-end pipeline for the creation, management, and exploitation of sensor models, including a metadata specification, a CSM-compliant sensor model implementation, and tools that leverage those sensor models for scientific processing.


### Abstraction Library for Ephemerides (ALE)
!!! INFO "ALE -- Quick Definition"
ALE is a software package that relies on a collection of instrument-specific drivers in order to calculate and provide access to a camera's interior and exterior orientations. This information is output in a standard ISD format that provides all the information required to instantiate a CSM sensor model.

The [Abstraction Library for Ephemerides](https://github.com/DOI-USGS/ale/) (ALE) provides the tools and information necessary to derive and access a sensor's interior and exterior orientation. ALE provides a suite of instrument-specific drivers that combine information from a variety of metadata formats (image labels), sensor types, and data sources into a single CSM compliant format. It is important to note that ALE is responsible for abstracting away intermediate reference frames between the body-centered, body-fixed frame and the sensor frame so that the resulting model can transform directly from the sensor frame to the BCBF frame.

ALE uses a collection of drivers and class mixins to provide an ISD for a variety of sensor models. [Drivers](https://github.com/DOI-USGS/ale/tree/main/ale/drivers) for many major missions are available in ALE. Each driver is required to support at least one label type (ISIS or PDS3) and one source of SPICE data (ISIS or NAIF); note that not every supported mission provides a driver for every combination of label type and SPICE data source.
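
As a rough sketch of how ALE is typically driven, the example below asks ALE's top-level `loads` function to generate an ISD for an image. The cube name is hypothetical, the exact keyword arguments accepted by `ale.loads` vary between versions, and the keys printed at the end are typical ISD fields rather than a guaranteed schema.

``` python
import json

import ale

# Hypothetical input image; any label/SPICE combination supported by an
# ALE driver (e.g., an ISIS cube or a PDS3 label) could be used here.
image_label = "EN1072174528M.cub"

# ale.loads searches the registered drivers for one that can handle this
# label and returns the resulting ISD as a JSON string.
isd_string = ale.loads(image_label)
isd = json.loads(isd_string)

# The ISD bundles interior and exterior orientation into a single document.
print(isd.get("name_model"), isd.get("image_lines"), isd.get("image_samples"))
```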

### Integrated Software for Imagery and Spectrometry (ISIS)
[ISIS](https://github.com/DOI-USGS/ISIS3) is a software suite comprising a variety of image analysis tools, ranging from traditional operations such as contrast stretching, image algebra, and statistical analysis to more complex processes such as mosaicking, photometric modeling, noise reduction, and bundle adjustment. In addition to these analysis tools, ISIS defines the .cub format, which can capture 3-dimensional image data (lines, samples, and bands) as well as image metadata in the form of an image label. Beyond image data and metadata, .cub files can be augmented with camera geometry information so that elements of the camera model are captured alongside the data.

#### Bundle Adjustment / Jigsaw
ISIS's Jigsaw program implements bundle adjustment, a full 3-dimensional scene reconstruction technique that

1. provides estimates of the 3-dimensional coordinates of ground points
1. provides estimates of the sensor's exterior orientations
1. minimizes error between the reconstructed scene and observed point locations

Bundle adjustment is a critical part of the sensor model ecosystem in that it can be used to iteratively refine and correct a model's geometry, which allows for more accurate transitions between reference frames, for example correctly geolocating a ground point from an image plane. A conceptual sketch of the underlying minimization is shown below.

More information related to bundle adjustment can be found [here](https://isis.astrogeology.usgs.gov/Application/presentation/Tabbed/jigsaw/jigsaw.html).
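
The core idea can be sketched with a toy problem: treat the ground point coordinates and a per-image pointing error as free parameters and minimize the difference between predicted and observed measurements. This is a conceptual illustration only, not how Jigsaw is implemented; the observation model here is a deliberately simplified planar one.

``` python
import numpy as np
from scipy.optimize import least_squares

# Toy setup: 4 ground points seen in 2 "images". Image 0 is held fixed as a
# reference; image 1 has an unknown pointing bias (a crude stand-in for an
# exterior orientation error). Each observation is simply the ground point's
# (x, y) plus that image's bias.
true_points = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
true_bias_img1 = np.array([0.05, -0.02])
observations = np.vstack([true_points, true_points + true_bias_img1])

def residuals(params):
    # Free parameters: the 4 ground points and image 1's pointing bias.
    points = params[:8].reshape(4, 2)
    bias = params[8:]
    predicted = np.vstack([points, points + bias])
    return (predicted - observations).ravel()

# Start from a poor initial guess and let the solver minimize the error
# between the reconstructed scene and the observed point locations.
solution = least_squares(residuals, x0=np.zeros(10))
print(solution.x[:8].reshape(4, 2))  # recovered ground points
print(solution.x[8:])                # recovered pointing bias for image 1
```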

#### ISIS Camera Models
Because ISIS predates the CSM, it contains camera models that are not compliant with the CSM API. ISIS contains camera models for framing, pushframe, linescan, radar, point, and rolling shutter cameras. While ISIS's camera models are authoritative and mathematically correct, they are only usable within the context of ISIS. However, ISIS has also been modified to interoperate with CSM cameras. While ISIS camera models are still actively used, efforts are underway to replace these ISIS-specific models with CSM-compliant models via the USGSCSM library.

### USGS Community Sensor Model (USGSCSM)

!!! INFO "USGSCSM -- Quick Definition"
USGSCSM is a software library that provides generic, CSM-compliant sensor model implementations. USGSCSM sensor models can be instantiated via an ISD or a USGSCSM state string.

The [USGS Community Sensor Model](https://github.com/DOI-USGS/usgscsm) is a concrete implementation of sensor models according to the standards described in the CSM. Where the CSM is the set of rules and standards that guides the creation of sensor models, the USGSCSM is a library of sensor models that adheres to those rules. The USGSCSM library provides generic instances of sensor models for instantaneous (framing) cameras, time-dependent (line-scan) cameras, push-frame cameras, synthetic aperture radar (SAR), and projected sensor models. Additionally, USGSCSM provides an extensible plugin architecture that allows for additional, interface-compliant sensor models to be dynamically loaded.

A camera model can be instantiated using image support data (ISD), but the CSM does not describe any particular source or format for that information. USGSCSM allows ISDs to be formatted as [JSON](https://www.json.org/json-en.html), [NITF](https://pro.arcgis.com/en/pro-app/latest/help/data/nitf/introduction-to-nitf-data.htm), or bytestreams. Because an ISD is intended to provide all the information necessary to instantiate a sensor model, it is required to contain both interior and exterior orientation information.
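
As a small, hedged illustration of what a JSON-formatted ISD contains, the snippet below loads one with the standard library and checks for a few interior- and exterior-orientation fields. The file name is hypothetical, and the key names shown are common USGSCSM ISD fields but vary by sensor model.

``` python
import json

# Hypothetical JSON ISD, e.g. produced by ALE for a framing camera.
with open("EN1072174528M.json") as f:
    isd = json.load(f)

# Interior orientation: how the detector maps into the focal plane.
interior_keys = ["focal_length_model", "detector_center",
                 "focal2pixel_lines", "focal2pixel_samples"]

# Exterior orientation: where the sensor was and how it was pointed.
exterior_keys = ["sensor_position", "sensor_orientation", "sun_position"]

for key in interior_keys + exterior_keys:
    # Not every model uses every key, so missing entries are reported as such.
    print(key, "->", "present" if key in isd else "not in this ISD")
```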

Sensor models implemented within USGSCSM are stateful in that their underlying geometries can be modified. Moreover, it is possible to save a model's current state to a _state string_ so that a future model can be instantiated with that exact model state. This is an important capability when performing incremental modifications via a process like bundle adjustment or [performing alignment](https://stereopipeline.readthedocs.io/en/latest/tools/pc_align.html) between digital terrain models and reference DTMs or point clouds, particularly if the user decides to undo those adjustments or share the modified state with collaborators.


## Extended Sensor Model Ecosystem
This section details several packages that are created and maintained outside the Astrogeology Science Center but are commonly used in conjunction with elements of the ASC sensor model ecosystem.

### SOCET Geospatial eXploitation Products (GXP)
[SOCET GXP](https://www.geospatialexploitationproducts.com/content/socet-gxp/) is a software toolkit used to identify and analyze ground features. While it is possible to use a subset of GXP's capabilities with simple sensor models, its core capabilities are largely dependent on rigorous sensor models. GXP not only includes its own sensor model implementations, but also allows for external sensor models via CSM plugin support. By leveraging these sensor models, users can perform photogrammetric operations such as triangulation, mensuration, stereo viewing, automated DTM generation, and orthophoto generation. Unlike ISIS, GXP can be used for both terrestrial and extraterrestrial applications.

### Ames Stereo Pipeline (ASP)

The NASA [Ames Stereo Pipeline](https://stereopipeline.readthedocs.io/en/latest/introduction.html) (ASP) is an open-source toolkit used to create cartographic products from stereo images captured by satellites, rovers, and aerial cameras, as well as from historical imagery. ASP is commonly used to create digital elevation models (DEMs), orthographic images, 3D models, and bundle-adjusted networks of images ([Beyer et al., 2018](https://doi.org/10.1029/2018EA000409)).

While there is considerable overlap in the tools provided by ISIS and ASP, ASP specializes in stereographic imagery, and it provides both terrestrial and non-terrestrial imaging capabilities while ISIS focuses solely on non-terrestrial imagery. ASP adopts the USGSCSM camera model implementation and can therefore easily interoperate with ISIS and the ASC ecosystem.

``` mermaid
graph TD
A[/Interior Orientation e.g., SPICE instrument kernel/] --> C
B[/Exterior Orientation e.g., SPICE ephemeris/] --> C
C[ALE] --> |ISD|D[USGSCSM]
D <--> |Sensor Model|E[ISIS] & F[SOCET GXP] & G[ASP] --> H(Science Ready Data Product)
```