
Line-scan Frame Camera Calibration

Resonon line-scan hyperspectral camera with an Intel PrimeSense RGB-D camera
Cameras and patterns with corresponding coordinate frames
Calibrated camera poses with 3D view-line on pattern




Research Paper

You can find my research paper, which describes the active calibration algorithm, here:

Observability driven Multi-modal Line-scan Camera Calibration

@inproceedings{mehami2020observability,
  title={Observability driven multi-modal line-scan camera calibration},
  author={Mehami, Jasprabhjit and Vidal-Calleja, Teresa and Alempijevic, Alen},
  booktitle={2020 IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems (MFI)},
  pages={285--290},
  year={2020},
  organization={IEEE}
}

About

A probabilistic calibration framework for calibrating a camera system comprising a line-scan camera and a 2D frame camera. The framework includes uncertainty estimation due to measurement noise, so the optimised calibration parameters are returned with an associated error. The camera system is assumed to be rigidly mounted together, and calibration is performed using a calibration pattern with triangle features and cross-corner markers.

Observability is an important aspect of this calibration because it is common to capture the calibration pattern in similar poses. Such poses do not have sufficient parallax, so they provide no additional information to support the calibration and in some cases degrade the results. An active calibration algorithm is implemented which keeps only the measurements that improve parameter observability and ignores all others.
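The selection loop can be pictured with the minimal MATLAB sketch below. It is illustrative only: candidateImages and optimiseCalibration are placeholders, not functions in this repository, and the two thresholds stand in for the minimum_eigen_value and sum_eigen_threshold settings described later in the configuration section.

% Minimal sketch of the active image-selection loop (placeholder names).
minimumEigenValue = 1e-9;   % plays the role of minimum_eigen_value in config.yaml
sumEigenThreshold = 0;      % plays the role of sum_eigen_threshold in config.yaml

keptImages = {};
prevEig = [];
for i = 1:numel(candidateImages)
    trialSet = [keptImages, candidateImages(i)];
    [~, infoMat] = optimiseCalibration(trialSet);   % placeholder optimiser returning the information matrix
    eigVals = sort(eig(infoMat), 'descend');
    eigVals = max(eigVals, minimumEigenValue);      % avoid eigenvalues close to zero

    if isempty(prevEig)
        gain = Inf;                                 % always keep the first measurement
    else
        gain = sum((eigVals - prevEig) ./ prevEig); % summed normalised metric
    end

    if gain > sumEigenThreshold
        keptImages = trialSet;                      % measurement improves observability
        prevEig = eigVals;
    end                                             % otherwise the measurement is ignored
end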

Cameras and patterns with corresponding coordinate frames

(back to top)

Requirements

  • MATLAB 2020a or higher (Tested with 2021b)
  • GCC 7.0 or higher (Tested with 7.5.0)
  • OpenCV 4.0 or higher
  • CMake 3.10 or higher

Submodule

This repository has a submodule. Make sure to add the --recurse-submodules flag when pulling or cloning.

# Clone the repo and its submodules
git clone --recurse-submodules **REPO**

# Alternatively, clone the repo first, then initialise and update the submodule
git clone **REPO**
git submodule init
git submodule update

Calibration Board

The calibration is performed using a pattern with known triangles on an ArUco board, as shown below. The triangles provide feature points for the line-scan camera, and all triangles must be visible in the FOV of the line-scan camera in order to calibrate properly. The ArUco markers are used by the frame camera to acquire the board's pose automatically using OpenCV functionality (further details of the ArUco board pose estimation can be found in the README of the submodule).

An A3 PDF version of the board can be found in calibration_board, and its corresponding dimension measurements, which are used in the calibration, are saved in the file config.yaml

ArUco triangle pattern for calibrating line-scan camera

Calibration Images

When capturing calibration images, the board should be moved and rotated so that there is variation across the set of calibration images. At least two ArUco markers should be present in the frame camera image to estimate the board pose (via a PnP algorithm), and eight line features should be detected in the line-scan image as shown below. The line-scan image may need to be flipped to match the order of the lines on the board (see below for further details).
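For intuition, here is a rough MATLAB sketch of the PnP step. It is not what the repository runs (board pose estimation is done by the OpenCV ArUco MEX function from the submodule); imagePoints, worldPoints and cameraParams are assumed inputs.

% Conceptual PnP sketch (not the repository's implementation).
% imagePoints : detected marker corner pixels (N x 2)
% worldPoints : matching corner positions on the board, in metres (N x 3)
% cameraParams: MATLAB cameraParameters object from frame_camera_intrinsic.mat
[camOrientation, camLocation] = estimateWorldCameraPose( ...
    imagePoints, worldPoints, cameraParams);

% Convert the camera pose to board-to-camera extrinsics
% (MATLAB's row-vector convention: cameraPoints = worldPoints * R + t).
[R, t] = cameraPoseToExtrinsics(camOrientation, camLocation);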

Estimated pose of ArUco board Detected line features in the line-scan image

Data Directory

The calibration data should be organised as follows:

Calibration_Data_directory
├── Frame
│   ├── img1.png
│   ├── img2.png
│   └── ...
├── Line-scan
│   ├── hs1.png
│   ├── hs2.png
│   └── ...
├── frame_camera_intrinsic.mat
└── calibration_results (generated after successfully running the calibration script)

Both Frame and Line-scan directories should have the same number of images.
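A quick sanity check along these lines can be run in MATLAB before starting the calibration (the directory name below is a placeholder):

% Confirm both folders contain the same number of images.
dataDir  = 'Calibration_Data_directory';   % replace with your data directory
numFrame = numel(dir(fullfile(dataDir, 'Frame', '*.png')));
numLine  = numel(dir(fullfile(dataDir, 'Line-scan', '*.png')));
assert(numFrame == numLine, ...
    'Image counts differ: %d frame vs %d line-scan images.', numFrame, numLine);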

(back to top)

Frame Camera Intrinsic

The intrinsic parameters of your frame (RGB) camera must be calibrated before performing this calibration. This can be done through checkerboard calibration using MATLAB's Camera Calibrator app. The saved intrinsic parameters should be stored alongside the data with the filename frame_camera_intrinsic.mat. The MAT file must contain a MATLAB cameraParameters object named cameraParams.
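If you prefer to script this instead of using the app, a minimal sketch with the Computer Vision Toolbox looks like the following; the folder name and square size are example values only.

% Programmatic checkerboard calibration sketch (Computer Vision Toolbox).
files = dir(fullfile('frame_intrinsic_images', '*.png'));  % your checkerboard images
files = fullfile({files.folder}, {files.name});

[imagePoints, boardSize] = detectCheckerboardPoints(files);
squareSize  = 0.030;                                        % checkerboard square size in metres
worldPoints = generateCheckerboardPoints(boardSize, squareSize);

I = imread(files{1});
cameraParams = estimateCameraParameters(imagePoints, worldPoints, ...
    'ImageSize', [size(I, 1), size(I, 2)]);

% Save with the variable name expected by the calibration script.
save('frame_camera_intrinsic.mat', 'cameraParams');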

(back to top)

Setup Steps

  1. Extract images from the Rosbag by running the script Rosbag_Unpack_LineScan_Frame_Cameras.m located in rosbag_processing in MATLAB

    Pass the Rosbag location to the first window

    A new directory called *NAME OF Rosbag* will be created, and inside it will be Frame and Line-scan directories containing the images.

    Note: Extraction and saving might take a couple of minutes

  2. Compile ArUco pixel detection MEX function from submodule

    cd ext_lib/mex_ChArUco_Pose/
    mkdir build
    cd build
    cmake ..
    make -j
  3. Edit the calibration configuration file config.yaml as needed. Details of the calibration configuration are described in the next section.

  4. Run the calibration script Main_Calibration.m. Pass the directory location of the extracted images, Images/*NAME OF BAG*/, which contains the Frame and Line-scan directories.

  5. Check that the line-scan lines are properly aligned with the frame camera images. If successful, the results will be saved in the directory calibration_results. If results already exist, the program will ask whether you want to overwrite them.

(back to top)

Calibration Configuration

The configuration parameters below are set in the config.yaml file.

| Parameter | Description |
| --- | --- |
| display_on | TRUE/FALSE. Turns the intermediate visualisations on/off. Turn off to speed up processing. |
| t1_approximate | The approximate distance from the Resonon to the RGB camera along the positive x-axis of the Resonon (see below for further details). |
| t3_approximate | The approximate distance from the Resonon to the RGB camera along the positive z-axis of the Resonon (see below for further details). |
| flip_linescan_img | TRUE if the x-axis of the pattern is in the opposite direction to the y-axis of the Resonon (see below for further details). |
| naive_calibration | TRUE/FALSE. TRUE runs the naive calibration with all images; FALSE runs the active algorithm. |
| algorithm | Algorithm for solving the optimisation: 1 - Levenberg-Marquardt (default), 2 - Trust-Region-Reflective. Trust-region-reflective will generally result in proper calibration parameters because it takes constraints that keep the intrinsics of the hyperspectral camera within a proper range. (These should generally remain fixed.) |
| steadystate_readings | Stopping criterion for the active algorithm: the minimum number of consecutive optimisations until steady state is reached. Checks the sum of the relative change in eigenvalues for each optimisation (referred to as the summed normalised metric). |
| minimum_eigen_value | Minimum eigenvalue magnitude. This avoids having to deal with values close to 0. |
| sum_eigen_threshold | Threshold value for the summed normalised metric. The default value is 0 to ensure there is an increase in information. |
| skip_covariance | TRUE/FALSE. Skips covariance/error estimation. Can only be skipped if naive_calibration is TRUE. |
| std_pixel_error | Standard deviation of the pixel error for all cameras (default value 1). |
| lower_bounds | Lower bounds for the trust-region-reflective algorithm in the vector form [tx, ty, tz, qw, qx, qy, qz, fy, v0, K1, K2, P2]. |
| upper_bounds | Upper bounds for the trust-region-reflective algorithm in the vector form [tx, ty, tz, qw, qx, qy, qz, fy, v0, K1, K2, P2]. |

(back to top)

Initial Guesses

The closed-form solution of the calibration requires an initial guess for the t1 and t3 parameters. This avoids the calibration converging to an incorrect mirrored pose. The diagram below shows the approximate measurements required for the initial guess (measured in metres):

Line-scan camera, frame camera, and calibration pattern coordinate frames

Note that the extrinsic transformation describes the pose of the frame camera's coordinate frame with respect to the line-scan camera's coordinate frame. In the above example, both guesses would be negative.

(back to top)

Flipping Line-scan Image

Depending on the orientation of the calibration board, the line-scan camera images may need to be flipped horizontally so that the order of the feature points in the image corresponds to the order on the board. You will need to set the flag flip_linescan_img manually, as shown above. This depends on the setup of the cameras, but the general guideline for setting the flag to true is that the x-axis of the pattern is in the opposite direction to the y-axis of the camera.
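For reference, the flip itself is just a horizontal mirror of each line-scan image, assuming the scan line runs along the image columns (a sketch, not the repository's code):

% Horizontal flip of a line-scan image (second/column dimension).
lineScanImg = imread(fullfile('Line-scan', 'hs1.png'));   % example image from the data folder
flippedImg  = flip(lineScanImg, 2);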

For my setup, the following examples show when the line-scan image is flipped and not flipped respectively:

Successful calibration of cameras where the line-scan images needed to be flipped
Successful calibration of cameras where the line-scan images did not need to be flipped

Results

Results are saved as a MAT file called optimised_parameters.mat which stores all the necessary data after calibration. Load this file into MATLAB.

The final optimised calibration parameters can be found in the matrix linescan_Opt_Param. This is an n x 11 matrix where each column corresponds to one of the 11 calibration parameters [t1, t2, t3, rz, ry, rx, fy, v0, K1, K2, P2]. The last row contains the final optimised parameters you should use. The covariance matrix of these parameters is the last 11 x 11 slice of the 11 x 11 x n array linescan_Opt_ParamErr.

numOpt tells you how many optimisation iterations were performed. Each iteration of the active algorithm adds one new image, so numOpt + 1 is the number of images used in the final optimisation.
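A short sketch for pulling the final result out of the loaded MAT file (the path assumes the calibration_results directory described above):

% Extract the final calibration result and its uncertainty.
load(fullfile('calibration_results', 'optimised_parameters.mat'));

finalParams   = linescan_Opt_Param(end, :);        % [t1, t2, t3, rz, ry, rx, fy, v0, K1, K2, P2]
finalCov      = linescan_Opt_ParamErr(:, :, end);  % 11 x 11 covariance of the final parameters
paramStd      = sqrt(diag(finalCov));              % 1-sigma uncertainty for each parameter
numImagesUsed = numOpt + 1;                        % images used in the final optimisation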

Below is the final output figure showing the calibrated cameras and one instance of the pattern with the projected line-scan view-line.

Successful calibration result showing cameras, pattern and projected line

(back to top)

TODO

  • Rotation optimisation over the manifold rather than using quaternion parameterisation. This should fix convergence issues.
  • Read in frame camera intrinsic parameters from data folder or Rosbag
  • Allow for passing in Rosbag of images
  • Allow for any image naming schemes
  • Automatic method to set flip_linescan_img to true/false.

(back to top)

License

Distributed under the MIT License. See LICENSE.txt for more information.

(back to top)

Contact

Jasprabhjit Mehami

Email: [email protected]

(back to top)

Acknowledgments

(back to top)
