GStreamer pipeline examples
This page contains basic example pipelines for the rrprojector plug-in.
Equirectangular projections
For the examples below, assume a fisheye camera with the following projection parameters:
S0_rad=750.0 S0_LENS=187.0 S0_C_X=993.0 S0_C_Y=762.0 S0_R_X=0.0 S0_R_Y=0.0 S0_R_Z=-89.6
For more information on how to set these values, visit the Rrprojector wiki page.
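Before running the examples, you can check that the projector element is available in your GStreamer installation and list its properties (this assumes the plug-in has already been installed as described in the Getting Started page):
# List the element's properties (radius, lens, center-x, center-y, rot-x, rot-y, rot-z, ...)
gst-inspect-1.0 rreqrprojector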
Projection from cameras
Saving a projection to mp4
gst-launch-1.0 -e nvarguscamerasrc sensor-id=0 ! nvvidconv ! "video/x-raw(memory:NVMM), width=1920, height=1080" ! queue ! rreqrprojector radius=$S0_rad lens=$S0_LENS center-x=$S0_C_X center-y=$S0_C_Y rot-x=$S0_R_X rot-y=$S0_R_Y rot-z=$S0_R_Z ! nvvidconv ! nvv4l2h264enc bitrate=20000000 ! h264parse ! mp4mux ! filesink location=projection.mp4
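To quickly verify the recording, you can inspect the resulting file or play it back with playbin; both tools ship with a standard GStreamer install (a minimal check, not part of the rrprojector examples):
# Print container, codec and resolution information of the recording
gst-discoverer-1.0 projection.mp4
# Play the recording back; playbin picks a suitable decoder and sink automatically
gst-launch-1.0 playbin uri=file://$PWD/projection.mp4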
Projection from videos
Saving a projection from MP4 videos
gst-launch-1.0 filesrc location=fisheye180.mp4 ! qtdemux ! h264parse ! nvv4l2decoder ! nvvidconv ! "video/x-raw(memory:NVMM), width=1920, height=1080, format=RGBA" ! queue ! rreqrprojector radius=$S0_rad lens=$S0_LENS center-x=$S0_C_X center-y=$S0_C_Y rot-x=$S0_R_X rot-y=$S0_R_Y rot-z=$S0_R_Z ! nvvidconv ! nvv4l2h264enc bitrate=20000000 ! h264parse ! mp4mux ! filesink location=projection.mp4
Projection from images
Saving a projection from a JPEG image
Example pipeline for Jetson platforms
gst-launch-1.0 filesrc location=fisheye180.jpeg ! jpegdec ! nvvidconv ! queue ! rreqrprojector radius=$S0_rad lens=$S0_LENS center-x=$S0_C_X center-y=$S0_C_Y rot-x=$S0_R_X rot-y=$S0_R_Y rot-z=$S0_R_Z ! videoconvert ! queue ! jpegenc ! filesink location=proj0.jpeg
Example pipeline for x86
gst-launch-1.0 filesrc location=fisheye180.jpeg ! jpegdec ! videoconvert ! queue ! rreqrprojector radius=$S0_rad lens=$S0_LENS center-x=$S0_C_X center-y=$S0_C_Y rot-x=$S0_R_X rot-y=$S0_R_Y rot-z=$S0_R_Z ! videoconvert ! queue ! jpegenc ! filesink location=proj0.jpeg
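To preview the projected image with GStreamer itself, a minimal sketch (assuming a local display is available) is:
# Decode the projected JPEG and display it as a still frame
gst-launch-1.0 filesrc location=proj0.jpeg ! jpegdec ! imagefreeze ! videoconvert ! autovideosink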
360 Stitching
For the examples below, assume there are two inputs and that the projection parameters and homographies are saved in a file that looks like this:
{
    "projections": [
        {
            "0": {
                "radius": 750,
                "lens": 187.0,
                "center_x": 993,
                "center_y": 762,
                "rot_x": 0.0,
                "rot_y": 0.0,
                "rot_z": -89.6,
                "fisheye": true
            }
        },
        {
            "1": {
                "radius": 750,
                "lens": 186.0,
                "center_x": 1044,
                "center_y": 776,
                "rot_x": 0.0,
                "rot_y": 0.0,
                "rot_z": 88.7,
                "fisheye": true
            }
        }
    ],
    "homographies": [
        {
            "images": {
                "target": 1,
                "reference": 0
            },
            "matrix": {
                "h00": 1,
                "h01": 0,
                "h02": 0.0,
                "h10": 0,
                "h11": 1,
                "h12": 0,
                "h20": 0,
                "h21": 0,
                "h22": 1
            }
        }
    ]
}
The projection parameters and homography list are stored in the parameters.json file, which contains projection parameters for N cameras and N-1 homographies. For more information on how to set these values, visit the Rrprojector and Calibration wiki pages.
Save the projection parameters described in parameters.json in environment variables:
S0_rad=750 S0_LENS=187.0 S0_C_X=993 S0_C_Y=762 S0_R_X=0.0 S0_R_Y=0.0 S0_R_Z=-89.6 S1_rad=750 S1_LENS=186.0 S1_C_X=1044 S1_C_Y=776 S1_R_X=0.0 S1_R_Y=0.0 S1_R_Z=88.7
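The stitching pipelines below pass the flattened contents of parameters.json to the stitcher's homography-list property by stripping newlines, tabs and spaces with tr. If jq is installed, the same single-line string can be produced and stored once in a variable (an optional convenience, assuming jq is available on your system):
# Compact parameters.json to a single line, equivalent to the tr chain used in the pipelines
HOMOGRAPHY_LIST="$(jq -c . parameters.json)"
# The stitcher can then be configured with: homography-list="$HOMOGRAPHY_LIST"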
360 Stitch from cameras
Saving a 360 stitch to MP4
OUTPUT=/tmp/360_stitching_result.mp4
gst-launch-1.0 -e -v cudastitcher name=stitcher \
homography-list="`cat parameters.json | tr -d "\n" | tr -d "\t" | tr -d " "`" \
nvarguscamerasrc sensor-id=0 ! nvvidconv ! rreqrprojector radius=$S0_rad lens=$S0_LENS center-x=$S0_C_X center-y=$S0_C_Y rot-x=$S0_R_X rot-y=$S0_R_Y rot-z=$S0_R_Z name=proj0 ! queue ! stitcher.sink_0 \
nvarguscamerasrc sensor-id=1 ! nvvidconv ! rreqrprojector radius=$S1_rad lens=$S1_LENS center-x=$S1_C_X center-y=$S1_C_Y rot-x=$S1_R_X rot-y=$S1_R_Y rot-z=$S1_R_Z name=proj1 ! queue ! stitcher.sink_1 \
stitcher. ! queue ! nvvidconv ! nvv4l2h264enc bitrate=30000000 ! h264parse ! queue ! qtmux ! filesink location=$OUTPUT
Displaying a 360 stitch
gst-launch-1.0 -e -v cudastitcher name=stitcher \
homography-list="`cat parameters.json | tr -d "\n" | tr -d "\t" | tr -d " "`" \
nvarguscamerasrc sensor-id=0 ! nvvidconv ! rreqrprojector radius=$S0_rad lens=$S0_LENS center-x=$S0_C_X center-y=$S0_C_Y rot-x=$S0_R_X rot-y=$S0_R_Y rot-z=$S0_R_Z name=proj0 ! queue ! stitcher.sink_0 \
nvarguscamerasrc sensor-id=1 ! nvvidconv ! rreqrprojector radius=$S1_rad lens=$S1_LENS center-x=$S1_C_X center-y=$S1_C_Y rot-x=$S1_R_X rot-y=$S1_R_Y rot-z=$S1_R_Z name=proj1 ! queue ! stitcher.sink_1 \
stitcher. ! queue ! nvvidconv ! xvimagesink
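To measure the frame rate the stitcher actually achieves on your board, the display sink can be swapped for fpsdisplaysink, which reports frame-rate measurements when run with -v (a sketch of the same display pipeline with only the sink changed):
gst-launch-1.0 -e -v cudastitcher name=stitcher \
homography-list="`cat parameters.json | tr -d "\n" | tr -d "\t" | tr -d " "`" \
nvarguscamerasrc sensor-id=0 ! nvvidconv ! rreqrprojector radius=$S0_rad lens=$S0_LENS center-x=$S0_C_X center-y=$S0_C_Y rot-x=$S0_R_X rot-y=$S0_R_Y rot-z=$S0_R_Z name=proj0 ! queue ! stitcher.sink_0 \
nvarguscamerasrc sensor-id=1 ! nvvidconv ! rreqrprojector radius=$S1_rad lens=$S1_LENS center-x=$S1_C_X center-y=$S1_C_Y rot-x=$S1_R_X rot-y=$S1_R_Y rot-z=$S1_R_Z name=proj1 ! queue ! stitcher.sink_1 \
stitcher. ! queue ! nvvidconv ! fpsdisplaysink video-sink=xvimagesink text-overlay=false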
360 Stitch from videos
Saving a 360 stitch from two MP4 videos
INPUT_0=video_0.mp4
INPUT_1=video_1.mp4
OUTPUT=/tmp/360_stitching_result.mp4
gst-launch-1.0 -e -v cudastitcher name=stitcher \
homography-list="`cat parameters.json | tr -d "\n" | tr -d "\t" | tr -d " "`" \
filesrc location=$INPUT_0 ! qtdemux ! queue ! h264parse ! nvv4l2decoder ! queue ! nvvidconv ! rreqrprojector radius=$S0_rad lens=$S0_LENS center-x=$S0_C_X center-y=$S0_C_Y rot-x=$S0_R_X rot-y=$S0_R_Y rot-z=$S0_R_Z name=proj0 ! queue ! stitcher.sink_0 \
filesrc location=$INPUT_1 ! qtdemux ! queue ! h264parse ! nvv4l2decoder ! queue ! nvvidconv ! rreqrprojector radius=$S1_rad lens=$S1_LENS center-x=$S1_C_X center-y=$S1_C_Y rot-x=$S1_R_X rot-y=$S1_R_Y rot-z=$S1_R_Z name=proj1 ! queue ! stitcher.sink_1 \
stitcher. ! queue ! nvvidconv ! nvv4l2h264enc bitrate=30000000 ! h264parse ! queue ! qtmux ! filesink location=$OUTPUT
360 Stitch from images
Saving a 360 stitch from two JPEG images
INPUT_0=image_0.jpeg
INPUT_1=image_1.jpeg
OUTPUT=/tmp/360_stitching_result.jpeg
gst-launch-1.0 -e -v cudastitcher name=stitcher \
homography-list="`cat parameters.json | tr -d "\n" | tr -d "\t" | tr -d " "`" \
filesrc location=$INPUT_0 ! jpegparse ! jpegdec ! nvvidconv ! rreqrprojector radius=$S0_rad lens=$S0_LENS center-x=$S0_C_X center-y=$S0_C_Y rot-x=$S0_R_X rot-y=$S0_R_Y rot-z=$S0_R_Z name=proj0 ! queue ! stitcher.sink_0 \
filesrc location=$INPUT_1 ! jpegparse ! jpegdec ! nvvidconv ! rreqrprojector radius=$S1_rad lens=$S1_LENS center-x=$S1_C_X center-y=$S1_C_Y rot-x=$S1_R_X rot-y=$S1_R_Y rot-z=$S1_R_Z name=proj1 ! queue ! stitcher.sink_1 \
stitcher. ! queue ! videoconvert ! jpegenc ! filesink location=$OUTPUT