Calibration Tool

Panoramic Estimation Tool

This section describes how to calibrate the equirectangular projection used by the rreqrprojector element. The calibration tool is a Python application that estimates the equirectangular projection of fisheye images and the horizontal offset needed to match the images.

Dependencies

  • Python 3.8
  • OpenCV headless
  • NumPy
  • PyQt5
  • PyQtDarkTheme

Virtual Environment

It is recommended to work in a virtual environment, since the required dependencies will be installed automatically and this keeps them isolated from the system installation. To create a new Python virtual environment, run the following commands:

ENV_NAME=calibration-env
python3 -m venv $ENV_NAME

A new folder will be created with the name stored in ENV_NAME. To activate the virtual environment, run the following command:

source $ENV_NAME/bin/activate

To install the tool and its dependencies, run the following command from the directory of the stitching-calibrator project:

pip3 install .

Image Directory

To correctly load the fisheye images into the calibration tool, create a root folder containing one subfolder per camera, named with the camera indexes 0, 1, 2, ... up to the number of cameras in the array being calibrated. Each subfolder contains the fisheye image of that specific camera index. The fisheye images must use the name format image_XX, with XX being the camera index.

Note: The images of each sensor should be in sync (all captured at the same time) so that the images overlap correctly.

An example of the hierarchical structure of the root folder for an array of three fisheye cameras looks like this:

fisheye-images/
├── 0
│   └── image_0.jpeg
├── 1
│   └── image_1.jpeg
└── 2
    └── image_2.jpeg
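
For reference, a layout like the one above can be created with a few shell commands. This is only a sketch; the capture file names below are placeholders for your own synchronized captures:

# Create one folder per camera index and copy the matching capture into it
mkdir -p fisheye-images/{0,1,2}
cp capture-cam0.jpeg fisheye-images/0/image_0.jpeg
cp capture-cam1.jpeg fisheye-images/1/image_1.jpeg
cp capture-cam2.jpeg fisheye-images/2/image_2.jpeg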

Calibration tool usage

1. To launch the calibration tool, run the following command and select the Panoramic Calibration button.

stitching-calibrator
Main-Window

2. Load the fisheye images. In the menu bar select File->Open; this opens a file dialog to select the root directory with the camera folders.

Open menu

An initial circle and equirectangular projection will be estimated for each image; this takes a few seconds. The image of sensor 0 will be automatically selected in the Samples view for calibration.

Initial estimation

3. Adjust the parameters of the first fisheye image using the controls on the right panel, where:

  • radius: A float property that defines, in pixels, the radius of the circle containing the fisheye image.
  • lens: A float property that defines the fisheye lens aperture.
  • circle-center-x: An integer property that defines the center position of the fisheye circle along the input image's X axis (horizontal).
  • circle-center-y: An integer property that defines the center position of the fisheye circle along the input image's Y axis (vertical).
  • rot-x: A float property that defines the camera's tilt angle correction in degrees, between -180 and 180. This assumes a coordinate system over the camera where the X axis points out of the side of the camera; rotating around that axis rotates the camera up and down.
  • rot-y: A float property that defines the camera's roll angle correction in degrees, between -180 and 180. This assumes a coordinate system over the camera where the camera lens looks along the Y axis; rotating around that axis rolls the camera over its center.
  • rot-z: A float property that defines the camera's pan angle correction in degrees, between -180 and 180. This assumes a coordinate system over the camera where the Z axis is above the camera; rotating around that axis pans the camera over its center, moving around the 360 horizon.

The Preview view will show the resulting projection of the selected image with the selected parameters.

Projection parameters for sample 1

Note: The size of each view can be adjusted individually using the sliders for a better look at the images; use scroll for vertical adjustment and Shift+scroll for horizontal adjustment. The images can also be zoomed in and out using Ctrl+scroll.

Sliders of views

4. Select another sensor image in the Samples view (by clicking on it) to adjust the projection parameters for the next image, and continue with all the available sensor images. The current sensor image is highlighted so you can tell which image you are adjusting.

Projection parameters for sample 1

5. Once all the images have been adjusted with their projection parameters, adjust the horizontal offset to match the images in the Stitching view. To adjust the offset, drag the images on top of one another until the scene matches.

If the scene still doesn't fully match, you can continue modifying the parameters for each sensor image until you get a good fit of the scene in the Stitching view.

You can use the transparency and lock features to help you match the scene. The transparency feature makes the images transparent so you can match objects between the projections. The lock feature locks an image's movement for easier projection tuning.

Stitching view

6. When the calibration is finished, save the parameters and homographies to a JSON file. In the menu bar select File->Save; this opens a file dialog to select the path and name of the JSON file.

Save menu

An example of the JSON file output using two fisheye images looks like:

{
    "projections": [
        {
            "0": {
                "radius": 640.0,
                "lens": 195.0,
                "center_x": 637.0,
                "center_y": 640.0,
                "rot_x": 0.0,
                "rot_y": 0,
                "rot_z": 180,
                "fisheye": true
            }
        },
        {
            "1": {
                "radius": 639.0,
                "lens": 195.0,
                "center_x": 641.0,
                "center_y": 638.0,
                "rot_x": 0.0,
                "rot_y": -0.8,
                "rot_z": 180,
                "fisheye": true
            }
        }
    ],
    "homographies": [
        {
            "images": {
                "target": 1,
                "reference": 0
            },
            "matrix": {
                "h00": 1,
                "h01": 0,
                "h02": 1309.612371682265,
                "h10": 0,
                "h11": 1,
                "h12": 0,
                "h20": 0,
                "h21": 0,
                "h22": 1
            }
        }
    ]
}
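
As a quick sanity check of the saved file, you can count the entries per section. This assumes the jq utility is installed; it is not required by the tool itself:

# Expect one projection per sensor and one homography per adjacent image pair
jq '.projections | length' result.json
jq '.homographies | length' result.json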

The Panoramic Calibration tool has a dark mode feature. You can enable the dark theme from the View->Dark Mode menu.


Dark mode

Projector usage example

The following example shows how to use the calibration tool's JSON output with the stitcher and the projector. It uses the JSON output shown above for two fisheye images, saved as result.json.

Write the projection parameters of each sensor image from the JSON file into shell variables:


S0_C_X=637
S0_C_Y=640
S0_rad=640
S0_R_X=0
S0_R_Y=0
S0_LENS=195

S1_C_X=641
S1_C_Y=638
S1_rad=639
S1_R_X=0
S1_R_Y=-0.8
S1_LENS=195
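
Alternatively, if the jq utility is available, the same values can be read directly from the JSON file instead of being copied by hand. This is only a convenience sketch (shown for sensor 0; sensor 1 is analogous with .projections[1]["1"]):

CONFIG_FILE="result.json"
# Read the sensor 0 projection parameters from the calibration output
S0_C_X=$(jq -r '.projections[0]["0"].center_x' $CONFIG_FILE)
S0_C_Y=$(jq -r '.projections[0]["0"].center_y' $CONFIG_FILE)
S0_rad=$(jq -r '.projections[0]["0"].radius' $CONFIG_FILE)
S0_R_X=$(jq -r '.projections[0]["0"].rot_x' $CONFIG_FILE)
S0_R_Y=$(jq -r '.projections[0]["0"].rot_y' $CONFIG_FILE)
S0_LENS=$(jq -r '.projections[0]["0"].lens' $CONFIG_FILE)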

The homography list is passed to the pipeline as the content of the JSON file with newlines and spaces removed:

CONFIG_FILE="result.json"
HOMOGRAPHIES="`cat $CONFIG_FILE | tr -d "\n" | tr -d " "`"

Assuming the calibration tool output is stored in a result.json file, a full pipeline looks like this:


S0_C_X=637
S0_C_Y=640
S0_rad=640
S0_R_X=0
S0_R_Y=0
S0_LENS=195

S1_C_X=641
S1_C_Y=638
S1_rad=639
S1_R_X=0
S1_R_Y=-0.8
S1_LENS=195

CONFIG_FILE="result.json"
HOMOGRAPHIES="`cat $CONFIG_FILE | tr -d "\n" | tr -d " "`"

GST_DEBUG=WARNING gst-launch-1.0 -e -v \
	cudastitcher name=stitcher homography-list=$HOMOGRAPHIES sink_0::right=1200 sink_1::right=1227 \
	filesrc location=~/videos/video0-s0.mp4 ! qtdemux ! queue ! h264parse ! nvv4l2decoder ! queue ! nvvidconv ! rreqrprojector center_x=$S0_C_X center_y=$S0_C_Y radius=$S0_rad rot-x=$S0_R_X rot-y=$S0_R_Y lens=$S0_LENS name=proj0 ! queue ! stitcher.sink_0 \
	filesrc location=~/videos/video0-s1.mp4 ! qtdemux ! queue ! h264parse ! nvv4l2decoder ! queue ! nvvidconv ! rreqrprojector center_x=$S1_C_X center_y=$S1_C_Y radius=$S1_rad rot-x=$S1_R_X rot-y=$S1_R_Y lens=$S1_LENS name=proj1 ! queue ! stitcher.sink_1 \
	stitcher.  ! queue ! nvvidconv ! nvv4l2h264enc bitrate=30000000 ! h264parse ! queue !  qtmux ! filesink location=360_stitched_video.mp4
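
To quickly inspect the stitched output, the resulting file can be played back as a flat equirectangular video; playbin is used here only as a convenience, and any media player works:

gst-launch-1.0 playbin uri=file://$PWD/360_stitched_video.mp4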

