Calibration file won't load #375
Hi Marluca! Yes, I was able to reproduce your issue. When you ran your stereo calibration in the MATLAB Calibrator App, did you save/generate your MAT-file using the "Export Camera Parameters" button in the toolstrip? Could you please try again, but save your calibration using the "Save Session" button on the left of the toolstrip instead? This produces a slightly different format, and our code is written against those contents. When you first load the calibration, some diagnostics will run that internally calibrate some heuristics. This takes some time and produces some output, so don't be alarmed. Some of it is opaque, but some may be interesting (reprojection errors and the like).
Hi Allenleetc, thanks for your quick reply. Yes, exactly!

Unrecognized method, property, or field 'BoardSet' for class 'vision.internal.calibration.tool.Session'.
Error in CalRigMLStro/autoCalibrateProj2NormFuncTol (line 131)
Error in CalRigMLStro (line 84)
Error in CalRig.loadCreateCalRigObjFromFile (line 134)
Error in LabelerGUI>menu_setup_load_calibration_file_Callback (line 3181)
Error in gui_mainfcn (line 95)
Error in LabelerGUI (line 48)
Error in matlab.graphics.internal.figfile.FigFile/read>@(hObject,eventdata)LabelerGUI('menu_setup_load_calibration_file_Callback',hObject,eventdata,guidata(hObject))
Error while evaluating Menu Callback.

I attached the new file as well. The original data used for the calibration can be downloaded here: Do these heuristics also affect the epipolar lines shown for labeling, or are they mostly used during or after the training process?
Ah yes, I see. Are you using MATLAB 2021a or later? It looks like there have been some API changes in the Computer Vision Toolbox. It should be an easy fix, but I am seeing something interesting with the Bouguet sample calibration data. Will push the fix soon. The heuristics are definitely used for the epipolar lines, and that is what is acting a bit strange with this dataset.
The MATLAB version is '9.11.0.1769968 (R2021b)'. Hm, okay, I thought mostly the cameras' relative positions were necessary, like the essential matrix. What else is needed besides intrinsics/extrinsics/relative positions? I ask because I was thinking of using third-party calibration files and merging them into this format. Thanks!
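As an aside for readers: the essential matrix mentioned here encodes exactly the relative pose (R, t) between the two cameras. A minimal numerical sketch of the epipolar constraint x2'·E·x1 = 0 with E = [t]×R is below; the rotation, translation, and test point are invented for illustration and are not values from this calibration session.

```python
import math

def skew(t):
    """3x3 cross-product (skew-symmetric) matrix of a vector t."""
    tx, ty, tz = t
    return [[0.0, -tz,  ty],
            [ tz, 0.0, -tx],
            [-ty,  tx, 0.0]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def matvec(A, v):
    return [sum(A[i][k] * v[k] for k in range(3)) for i in range(3)]

# Hypothetical rig: camera 2 rotated 10 degrees about y and shifted along x,
# with X_cam2 = R*X_cam1 + t.
th = math.radians(10.0)
R = [[ math.cos(th), 0.0, math.sin(th)],
     [ 0.0,          1.0, 0.0         ],
     [-math.sin(th), 0.0, math.cos(th)]]
t = [0.2, 0.0, 0.0]

E = matmul(skew(t), R)  # essential matrix built from relative pose alone

# A 3D point seen by both cameras (expressed in camera-1 coordinates).
X1 = [0.1, -0.05, 2.0]
X2 = [matvec(R, X1)[i] + t[i] for i in range(3)]

# Normalized (intrinsics-free) image points: divide by z.
x1 = [X1[0] / X1[2], X1[1] / X1[2], 1.0]
x2 = [X2[0] / X2[2], X2[1] / X2[2], 1.0]

# Epipolar constraint: x2' * E * x1 vanishes for a true correspondence.
residual = sum(x2[i] * matvec(E, x1)[i] for i in range(3))
print(abs(residual) < 1e-9)
```

This illustrates why, for straight (undistorted) epipolar geometry, the relative pose plus intrinsics is in principle all that is needed.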
Use MATLAB/CV API when drawing epipolar lines, specifically when transforming from image coords to normalized coords. Update zrange-finding algorithm to account for situations where two cameras are aligned on-axis pointed in the same direction (in which case some epipolar rays never intersect the other view, or "saturate" fully in the other view, etc)
Added further commit fec3d0e. @marluca I pushed some updates; can you please pull the latest (on the develop branch) and try your workflow again? Your calibrationSession now seems to be working for me. A couple of issues were at play. The first, as mentioned, was that some MATLAB/vision classes changed in 2021 (in a minor way). I updated our code to accommodate this, and in the future we may insulate ourselves better from these classes. The second change was that I updated our epipolar-line drawing implementation to rely more on MATLAB's Computer Vision Toolbox API. This seems to work better with the MATLAB-generated calibration on this Bouguet sample data. That said, the MATLAB CV/calibration API is a bit rough, and I am going to open a ticket with them as there are some oddities. Please let us know how things go, and we can keep iterating if there are further hiccups.
You are right; basically it is just the camera intrinsic/extrinsic parameters that are needed. Our current implementation happens to also use the extrinsic positions of the particular calibration-board patterns as a starting point for some of the heuristic auto-calibrations I mentioned. When drawing an epipolar line, we currently sample 3D points along each epipolar ray so as to fully incorporate nonlinear distortions. (This can lead to curved epipolar lines, unlike a linear closed-form solution that doesn't incorporate distortions.) To sample the points semi-intelligently, we try to guess a range and spacing of z-values, etc. We use the calibration-board patterns as a crutch for this purpose. I'm guessing it would not be hard to remove this dependence. If you don't mind sharing, what other/third-party calibration tools are you planning to use? This is an interesting topic.
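The "curved epipolar lines" point can be seen with a tiny sketch: project 3D points that are exactly collinear through a lens model with one radial term, and the resulting image points bow off a straight chord. The focal length, k1, and the 3D ray below are all made-up illustrations, not APT's actual model or values from this session.

```python
def project(X, f=1000.0, cx=640.0, cy=512.0, k1=-0.3):
    """Pinhole projection with a single radial distortion term (illustrative)."""
    x, y = X[0] / X[2], X[1] / X[2]          # normalized coordinates
    r2 = x * x + y * y
    d = 1.0 + k1 * r2                        # radial distortion factor
    return (f * x * d + cx, f * y * d + cy)  # distorted pixel coordinates

# Sample 3D points along a straight ray (these are exactly collinear in space).
ray = [(0.3 + 0.1 * s, 0.2 + 0.05 * s, 1.0 + s) for s in range(6)]
pts = [project(X) for X in ray]

# Collinearity check: cross product of the first image segment with each later
# point. Nonzero values mean the projected "line" is actually curved.
(x0, y0), (xa, ya) = pts[0], pts[1]
devs = [abs((xa - x0) * (y - y0) - (ya - y0) * (x - x0)) for x, y in pts[2:]]
print(max(devs) > 1.0)  # deviations of several pixels: the line visibly bows
```

With k1 = 0 the deviations collapse to numerical noise, which is the linear closed-form case mentioned above.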
Sorry, I only just got back to this. Perfect, loading the calibration session works with this commit! So, I was using my own calibration pipeline that uses ChArUco boards in OpenCV, because of partial visibility and the like. In the end, you also get checkerboard corners from ChArUco boards, but these are regressed from fiducial markers, which means it should also be rather easy to incorporate into the calibration-session format (as long as estimation errors are not really necessary). I think in general it would be nice to have an advanced option for not auto-calibrating, in case one can only access the extrinsics, intrinsics, and distortion coefficients, as with anipose or some other software. Wouldn't you also get curved lines when using, e.g., the radial distortion parameters plus rectification? Thank you very much!
Great, glad to hear you can get running! Thanks for the notes on your pipeline; makes sense. I see your point about wanting an option to not auto-calibrate when specific calibration-board positions/detections are not available. Good suggestion, and I think it would not be difficult to implement. Rather than requiring you to put your parameters into the MATLAB data structure, another option could be for APT to accept those parameters directly in some generic format. It's possible, for instance, that the relevant MATLAB objects are read-only or difficult to manipulate manually. You are right that in general the distortion parameters alone will result in curved epipolar lines. What we are currently doing is just that and nothing more sophisticated: we start in Camera 1, undistort, extend a 3D ray out in Camera 1's positive z-direction, transform to Camera 2, and then project/distort onto Camera 2's image. The auto-calibration business is nothing fancy; it just tries to numerically select a z-range and z-spacing for points on the epipolar line, rather than doing something geometric. Let us know what you think, thanks!
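The ray-transfer procedure described above can be sketched in a few lines. This is a simplified skeleton with no distortion step, and R, t, the sample point, and the z-values are invented for illustration (they are not APT code or this session's calibration):

```python
import math

# Hypothetical rig: camera 2 yawed 15 degrees and shifted in x,
# with X_cam2 = R*X_cam1 + t.
th = math.radians(15.0)
R = [[ math.cos(th), 0.0, math.sin(th)],
     [ 0.0,          1.0, 0.0         ],
     [-math.sin(th), 0.0, math.cos(th)]]
t = [0.5, 0.0, 0.0]

p1 = (0.1, -0.2)  # normalized (already-undistorted) point in camera 1

def transfer(z):
    """Point at depth z along camera 1's ray, mapped to camera 2 and projected."""
    X1 = [p1[0] * z, p1[1] * z, z]                       # 3D point on the ray
    X2 = [sum(R[i][k] * X1[k] for k in range(3)) + t[i]  # rigid transform
          for i in range(3)]
    return (X2[0] / X2[2], X2[1] / X2[2])                # normalized in camera 2

# Sampling a z-range and z-spacing is exactly what the auto-calibrated
# heuristics choose; here the samples are hand-picked.
line = [transfer(z) for z in (0.5, 1.0, 2.0, 5.0, 20.0)]
```

Without a distortion step the transferred points fall on a straight line (the classical epipolar line); APT's extra project/distort step onto Camera 2's image is what bends it.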
Your suggestion is even better: it makes more sense to read these parameters directly into APT, also because there won't be API problems with the Vision Toolbox again. And yes, manipulating the read-only MATLAB objects is a mess, too. So I'd definitely use this option and would be glad if it could be implemented. Thanks a bunch! Got it :) P.S:
Cool, sounds good! Just FYI I am out on a trip for a bit but will put this on the todo list for when I get back. |
I hope your trip is going well. Somewhat related to this problem is the question of how to load calibration info via the calibrationSession for n cameras (n > 2). The wiki mentions that 3 cameras have been used already; do you have a reasonably quick solution for my case in mind? Thanks!
Hey @marluca, thanks for your patience! I just returned and will get back to this soon. For 3-camera support, do you have pairwise calibrations between all camera pairs (3 camera pairs in all)? If so, I think it should be straightforward to include 3-camera support for your case with this coming update. (We already have an N-camera calibration rig object that utilizes an array of pairwise calibrations between views.) We were just discussing workflows for another 3-camera rig today, and we would be curious about your thoughts here! As you note, there are multiple possibilities, e.g.:
Way back when, we tried 3, but with a newer rig we are thinking 1 or 2 might be preferable. Maybe the optimal solution depends on the rig/data/calibration. If you have any thoughts, please send them along!
Hi @allenleetc , in my case I will probably have 6 cameras available, so it would definitely be nice to interface with the N-camera calibration rig object. Yes, sure, I can compute the pairwise camera transformations (15 in total, I suppose). In a future update you might consider including this step in APT, since external calibration software probably won't calculate it; then only R, t, K, and d would basically be necessary for each camera. I'm glad you ask! I think this might work out pretty well: Procedure: Reasoning: Thoughts regarding the optimal solution:
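The pairwise-transform step being proposed is mechanical once each camera has world extrinsics. A sketch under the assumption X_cam_i = R_i·X_world + t_i (the six camera poses below are fabricated for illustration; the pair count matches the 15 mentioned above, since C(6,2) = 15):

```python
import itertools
import math

def rot_y(deg):
    """Rotation about the y-axis."""
    c, s = math.cos(math.radians(deg)), math.sin(math.radians(deg))
    return [[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def matvec(A, v):
    return [sum(A[i][k] * v[k] for k in range(3)) for i in range(3)]

def transpose(A):
    return [[A[j][i] for j in range(3)] for i in range(3)]

# Six hypothetical cameras, 60 degrees apart, each with X_cam = R*X_world + t.
cams = [(rot_y(60.0 * i), [1.0, 0.0, float(i)]) for i in range(6)]

def pairwise(Ri, ti, Rj, tj):
    """Transform mapping camera-i coordinates into camera-j coordinates."""
    Rij = matmul(Rj, transpose(Ri))                      # R_ij = R_j * R_i'
    tij = [tj[k] - matvec(Rij, ti)[k] for k in range(3)] # t_ij = t_j - R_ij*t_i
    return Rij, tij

pairs = {(i, j): pairwise(*cams[i], *cams[j])
         for i, j in itertools.combinations(range(6), 2)}
print(len(pairs))  # 15 pairwise transforms for 6 cameras
```

So per-camera R, t, K, d really would suffice as input; the pairwise array for the rig object can be derived inside the tool.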
Hey, those are some great ideas! Having the EP lines show up only when/after a point is clicked/selected definitely seems interesting. @iskwak Okay, first things first: let's get this simplified initialization in.
@marluca Just FYI, I checked a prototype solution into d5257ff (branch iss375). There is an example camera/rig specification file, stereo_cam_rig_example.yaml. The parameters are taken from your example calibration! This format is still preliminary. If you want to try this out, pull the above commit/branch, open your two-view project, and then under Label > Select Calibration File, select the rig specification yaml file above. This will create and initialize a calibration rig (CalRig) object. Some notes:
Any/all thoughts/suggestions welcome, let us know! |
@allenleetc
Apart from that:
Thanks a lot, looking forward to new updates! |
@marluca Great, thanks for trying that! Re: the rotation matrix. I actually thought my definition was transposed relative to MATLAB's; their row notation seems less idiomatic to me. To clarify, in APT the definitions of R and t are supposed to follow the column-vector convention [p;q;r] = R*[x;y;z] + t, where [x;y;z] is a vertical column vector, and same for [p;q;r].
Did I get this wrong or flipped? Re: the radial distortion coefficients, sure, we can add/accept a third coefficient. For N cameras, yes, it should be easy to compute the pairwise extrinsic transforms between cameras to initialize a "pairwise-calibrated" rig object (CalRig). I will be continuing these updates and integrating them into the develop branch.
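The transposition question comes down to the two conventions agreeing when one rotation matrix is the transpose of the other: the column form p = R·x + t versus the MATLAB-style row form p_row = x_row·Rm + t_row. A quick check with invented numbers:

```python
import math

# Arbitrary illustrative rotation (30 degrees about z), translation, and point.
th = math.radians(30.0)
R = [[math.cos(th), -math.sin(th), 0.0],
     [math.sin(th),  math.cos(th), 0.0],
     [0.0,           0.0,          1.0]]
t = [0.1, 0.2, 0.3]
x = [1.0, 2.0, 3.0]

# Column-vector convention: p = R*x + t (x and p are column vectors).
p_col = [sum(R[i][k] * x[k] for k in range(3)) + t[i] for i in range(3)]

# Row-vector convention: p = x*Rm + t, with Rm the transpose of R.
Rm = [[R[j][i] for j in range(3)] for i in range(3)]
p_row = [sum(x[k] * Rm[k][j] for k in range(3)) + t[j] for j in range(3)]

print(all(abs(a - b) < 1e-12 for a, b in zip(p_col, p_row)))  # True
```

So a transpose when importing/exporting between the two conventions is expected, and a doubled or missing transpose is the usual failure mode.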
@allenleetc |
@marluca Ah yes, I did transpose the "RotationOfCamera2" matrix from your MATLAB calibration session before converting it into a Rodrigues vector. Will be getting back to this. Side note, FYI: if you want to manually set the "z-range" of your epipolar lines, you can specify the range as follows after loading your project and setting the calibration file:
To CalRigMLStro and CalRig2CamCaltech: add an option to use compute_epipole2 in both CalRigs. In current testing (with a user-supplied stereo-calibrated camera pair), all implementations agree. CalRigMLStro+zray is more difficult to use, as it requires choosing a z-range (either manually, or automatically given calibration-pattern information); compute_epipole2 has built-in sampling via a geometric construction. Zray does have the benefit that "unphysical" or "wrong-sided" z-values are not drawn; e.g., for two cameras positioned side-by-side and pointing in the same direction, negative z-values would not be visible in either camera, yet they are drawn by compute_epipole2.
Hey @marluca, I pushed a fix for 'incomplete' epipolar lines. We now use a geometric procedure by default; no z-sampling or calibration-pattern locations required. I also added the 3rd radial distortion parameter to the stereo rig yaml file. This is all integrated into the develop branch; let me know if this works for you! Note that if you have an existing project, it's probably easiest to reload/re-specify the calibration for your project (via Label > Select Calibration File) if you want to add a 3rd radial distortion coefficient. Minor tech note: the previous z-projection method for drawing epipolar lines had the side benefit that unphysical/negative-z points were not drawn. With your example stereo pair (cameras side-by-side), the geometric approach draws a full epipolar line, which includes points "past infinity" along positive z. Kind of a fun note. For the N-camera rig, if you have an example rig yaml (e.g., for 3 cameras), please send it along as I can use it for testing.
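One way a "geometric" construction can avoid z-sampling entirely: the epipolar line in Camera 2 is fixed by two special points, the epipole (the image of Camera 1's center) and the image of the point at infinite depth along the ray. This sketch (not APT's actual compute_epipole2; R, t, and the point are invented) checks that a finite-depth transfer lands on the line through those two anchors:

```python
import math

# Hypothetical rig with X_cam2 = R*X_cam1 + t; t_z is nonzero so the
# epipole is at a finite image position.
th = math.radians(15.0)
R = [[ math.cos(th), 0.0, math.sin(th)],
     [ 0.0,          1.0, 0.0         ],
     [-math.sin(th), 0.0, math.cos(th)]]
t = [0.5, 0.0, 0.2]

def matvec(A, v):
    return [sum(A[i][k] * v[k] for k in range(3)) for i in range(3)]

x1 = [0.1, -0.2, 1.0]  # normalized homogeneous point in camera 1

# Two anchor points of the epipolar line in camera 2 (normalized coords):
e2  = (t[0] / t[2], t[1] / t[2])   # epipole: camera 1's center, which sits at t
d   = matvec(R, x1)                # ray direction mapped into camera 2
inf = (d[0] / d[2], d[1] / d[2])   # image of the infinite-depth point

# A finite-depth transfer (z = 3 along camera 1's ray) must be collinear.
X2 = [sum(R[i][k] * (x1[k] * 3.0) for k in range(3)) + t[i] for i in range(3)]
p  = (X2[0] / X2[2], X2[1] / X2[2])

dev = (inf[0] - e2[0]) * (p[1] - e2[1]) - (inf[1] - e2[1]) * (p[0] - e2[0])
print(abs(dev) < 1e-9)  # the sampled point lies on the two-anchor line
```

Because the full line is determined by the two anchors, this construction naturally includes the "past infinity" points mentioned above, which the old z-sampling approach never generated.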
Hey @allenleetc , sorry for my late response here. Yes, the negative z-values are a bit unfortunate, but they should only occur if the other camera is visible in the frame, right? Sure, I attached a 3-camera rig below.
Sweet, thanks for the example rig! Re: negative z-values, they would certainly occur in the example you describe, but they occur in other cases as well. For instance, suppose you have two cameras aligned side by side (call them L and R), with their positive z/optical axes parallel and extending to infinity. Assume the principal points of both cameras are centered in their images. Consider the principal point in L and its epipolar line as projected in R. If we consider only positive z-values, this line will "come from the left" in R and then never go past the middle of the R image (it will asymptotically approach the principal point of R). However, the usual definitions I've seen define the epipolar line as the entire line, including points in the right half of R. In this example I believe those points are mathematically generated by projecting the point in L in the negative-z direction. Hope that makes sense? Anyway, it's mostly just a curiosity, I imagine, but kind of fun. Maybe it could even be relevant for labeling on the rare occasion.
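The side-by-side example above can be checked with two lines of arithmetic (the baseline value is made up): L's principal ray is (0, 0, z), which in R's frame sits at (-b, 0, z), so its normalized x-coordinate in R is -b/z.

```python
b = 0.5   # baseline: camera L's center sits at x = -b in camera R's frame

def x_in_R(z):
    """Normalized x in R of the point at depth z on L's principal ray."""
    return -b / z

# Positive depths: always left of R's principal point (x = 0), creeping
# toward it asymptotically but never crossing.
pos = [x_in_R(z) for z in (0.5, 1.0, 10.0, 1000.0)]

# A negative depth lands in the right half of R's image.
neg = x_in_R(-2.0)

print(all(x < 0.0 for x in pos), neg > 0.0)
```

So the right-half points of the full epipolar line really do correspond to negative z, exactly as described.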
Ah, yes, you're right, thanks! I suppose labeling points beyond 'infinity' would mess up the triangulation, so this might be dealt with in the future.
I generated a test calibration file with the stereoCameraCalibrator app. The mat file is attached to this issue.
calib_test.zip
I tried to load this file into APT by clicking Label -> Select calibration file .. (There is no Load calibration file)
but the following error occurs:
Error using CalRig.loadCreateCalRigObjFromFile (line 145)
Calibration file 'D:\Repositories\APT\calib_test.mat' has unrecognized contents.
Error in LabelerGUI>menu_setup_load_calibration_file_Callback (line 3181)
crObj = CalRig.loadCreateCalRigObjFromFile(fname);
Error in gui_mainfcn (line 95)
feval(varargin{:});
Error in LabelerGUI (line 48)
gui_mainfcn(gui_State, varargin{:});
Error in matlab.graphics.internal.figfile.FigFile/read>@(hObject,eventdata)LabelerGUI('menu_setup_load_calibration_file_Callback',hObject,eventdata,guidata(hObject))
Error while evaluating Menu Callback.
I'd really appreciate help solving this issue.
Thanks!