OmniVision OS08A10 Linux driver




Problems running the pipelines shown on this page?
Please see our GStreamer Debugging guide for help.

Keywords: OS08A10, Jetson TX2, GStreamer, NVIDIA, RidgeRun, V4L2 Driver


OS08A10 Features

The OmniVision OS08A10 is an image sensor with the following features:

  • 2 µm x 2 µm pixel
  • Optical size of 1/1.8"
  • Programmable controls for:
    • Frame rate
    • Mirror and flip
    • Cropping
    • Windowing
  • Supports output formats:
    • 12-/10-bit RGB RAW
  • Supports image sizes:
    • 4K2K (3840x2160)
    • 2560 x 1440
    • 1080p (1920x1080)
    • 720p (1280x720)
  • Supports 2x2 binning
  • Standard serial SCCB interface
  • 12-bit ADC
  • Up to 4-lane MIPI/LVDS serial output interface (maximum speed of 1500 Mbps/lane)
  • 2-exposure staggered HDR support
  • Programmable I/O drive capability
  • Light sensing mode (LSM)
  • PLL with SCC support
  • Support for FSIN


RidgeRun has developed a driver for the Jetson TX1 platform with the following support:

  • L4T 28.2.1 and Jetpack 3.3
  • V4L2 Media Controller driver
  • Tested resolution 3840 x 2160 @ 30 fps
  • Capture with v4l2src and also with nvcamerasrc using the ISP.

Enabling the driver

In order to use this driver, you have to patch and compile the kernel source using JetPack:


Using Jetpack

  • Follow the instructions in [1] to get the kernel source code.
  • Once you have the source code, apply the following patches to add the changes required for the os08a10 cameras at the kernel and device tree (DTB) level.
3.2.1_os08a10.patch
  • Follow the instructions in (Build Kernel) to build the kernel, and then flash the image. A command-line sketch of this step is shown after the menuconfig snippet below.

Make sure to enable os08a10 driver support:

make menuconfig
-> Device Drivers                                                                                                                        
  -> Multimedia support                                                                                           
    -> Encoders, decoders, sensors and other helper chips
       -> <*> OS08A10 camera sensor support
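
For reference, the build steps above map roughly onto the following commands. This is only a sketch: the source location, patch level, cross-compiler prefix, and defconfig name (tegra21_defconfig is assumed here for the TX1 / t210 tree) depend on your JetPack setup, so adjust them to match the Build Kernel instructions.

cd <kernel_source>                      # top of the L4T 28.2.1 kernel source package
patch -p1 < 3.2.1_os08a10.patch         # adjust the -p level/directory to match the patch paths
cd kernel/kernel-4.4
export ARCH=arm64 CROSS_COMPILE=aarch64-linux-gnu-
make tegra21_defconfig                  # assumed TX1 (t210) defconfig; adjust for your platform
make menuconfig                         # enable the OS08A10 entry shown above
make -j4 Image dtbs modules

After the build finishes, install the resulting Image, DTBs, and modules and flash the board as described in the Build Kernel instructions.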

Using the driver

Gstreamer examples

The GStreamer version distributed with JetPack doesn't support Bayer RAW10 (only RAW8), so GStreamer needs to be patched in order to capture with v4l2src. Follow the steps in the following wiki page to add RAW10 support:

http://developer.ridgerun.com/wiki/index.php?title=Compile_gstreamer_on_tegra_X1

Important Note: When you access the board through serial or SSH and want to run a pipeline that displays with autovideosink, nveglglessink, xvimagesink, or any other video sink, you have to prepend DISPLAY=:0 to the pipeline description:

DISPLAY=:0 gst-launch-1.0 ...
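
Before running the pipelines below, it is worth confirming that the driver probed correctly and that the capture node advertises the expected Bayer formats. A quick sanity check, assuming the sensor registered as /dev/video0:

dmesg | grep -i os08a10
v4l2-ctl --list-devices
v4l2-ctl -d /dev/video0 --list-formats-ext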

Snapshots

In order to check the snapshot, you can use the following tool:

https://github.com/jdthomas/bayer2rgb

Run the following commands to download and compile the tool:

git clone git@github.com:jdthomas/bayer2rgb.git
cd bayer2rgb
make
sudo cp bayer2rgb /usr/bin/


bayer2rgb converts naked (headerless) Bayer grid data into RGB data. There are several choices of interpolation (though they all look essentially the same). It can output TIFF files and can integrate with ImageMagick to output other formats.

  • 3280x2464
gst-launch-1.0 -v v4l2src device=/dev/video0 num-buffers=1 ! "video/x-bayer, format=rggb, width=3280, height=2464" ! filesink location=test_3280x2464.bayer
./bayer2rgb --input=test_3280x2464.bayer --output=data.tiff --width=3296 --height=2464 --bpp=16 --first=RGGB --method=BILINEAR --tiff

Use ImageMagick to convert the TIFF to PNG:

convert data.tiff data.png
  • 1920x1080
gst-launch-1.0 -v v4l2src device=/dev/video0 num-buffers=1 ! "video/x-bayer, format=rggb, width=1920, height=1080" ! filesink location=test_1920x1080.bayer

Check the snapshot with:

./bayer2rgb --input=test_1920x1080.bayer --output=data.tiff --width=1920 --height=1080 --bpp=16 --first=RGGB --method=BILINEAR --tiff

Use ImageMagick to convert the TIFF to PNG:

convert data.tiff data.png
  • 1280x720
gst-launch-1.0 -v v4l2src device=/dev/video0 num-buffers=1 ! "video/x-bayer, format=rggb, width=1280, height=720" ! filesink location=test_1280x720.bayer

Check the snapshot with:

./bayer2rgb --input=test_1280x720.bayer --output=data.tiff --width=1280 --height=720 --bpp=16 --first=RGGB --method=BILINEAR --tiff

Use ImageMagick to convert the TIFF to PNG:

convert data.tiff data.png
  • 1640x1232 (Binning x2)
gst-launch-1.0 -v v4l2src device=/dev/video0 num-buffers=1 ! "video/x-bayer, format=rggb, width=1640, height=1232" ! filesink location=test_1640x1232.bayer

Check the snapshot with:

./bayer2rgb --input=test_1640x1232.bayer --output=data.tiff --width=1664 --height=1232 --bpp=16 --first=RGGB --method=BILINEAR --tiff

Use ImageMagick to convert the TIFF to PNG:

convert data.tiff data.png
  • 820x616 (Binning x4)
gst-launch-1.0 -v v4l2src device=/dev/video0 num-buffers=1 ! "video/x-bayer, format=rggb, width=820, height=616" ! filesink location=test_820x616.bayer

Check the snapshot with:

./bayer2rgb --input=test_820x616.bayer --output=data.tiff --width=832 --height=616 --bpp=16 --first=RGGB --method=BILINEAR --tiff

Use ImageMagick to convert the TIFF to PNG:

convert data.tiff data.png
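
The per-resolution steps above differ only in the capture size and in the padded width passed to bayer2rgb (the line stride). They can be wrapped into a small helper script; snapshot.sh is a hypothetical name, and the script assumes it is run from the bayer2rgb build directory with the camera on /dev/video0:

#!/bin/bash
# snapshot.sh - capture one Bayer frame and convert it to PNG
# Usage: ./snapshot.sh WIDTH HEIGHT [PADDED_WIDTH]
# e.g. : ./snapshot.sh 3280 2464 3296
WIDTH=$1
HEIGHT=$2
PADDED=${3:-$1}        # width passed to bayer2rgb (line stride); defaults to WIDTH
FILE=test_${WIDTH}x${HEIGHT}.bayer

gst-launch-1.0 -v v4l2src device=/dev/video0 num-buffers=1 ! \
    "video/x-bayer, format=rggb, width=${WIDTH}, height=${HEIGHT}" ! filesink location=$FILE
./bayer2rgb --input=$FILE --output=data.tiff --width=$PADDED --height=$HEIGHT \
    --bpp=16 --first=RGGB --method=BILINEAR --tiff
convert data.tiff data.png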

Capture

V4l2src

You can use the raw2rgbpnm tool to check all the buffers:

https://github.com/martinezjavier/raw2rgbpnm

Run the following commands to download the tool:

git clone git@github.com:martinezjavier/raw2rgbpnm.git
cd raw2rgbpnm

Open the file raw2rgbpnm.c and replace line 489 with:

int c = getopt(argc, argv, "a:b:f:ghs:wn");

This enables the option to extract multiple frames from a file. Now you can build the application:

make
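
If you prefer to script this edit instead of opening the file by hand, a sed one-liner such as the following (run before make) does the same change. It assumes GNU sed and that line 489 of the current revision still holds the getopt() call shown above:

sed -i '489s/.*/\tint c = getopt(argc, argv, "a:b:f:ghs:wn");/' raw2rgbpnm.c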

This tool converts from GRBG10 to PNM. Since the capture format here is RGGB, the colors in the output images will look wrong.


In order to capture 10 buffers and save them in a file, you can run the following pipelines:

  • 3280x2464
gst-launch-1.0 -v v4l2src device=/dev/video0 num-buffers=10 ! "video/x-bayer, format=rggb, width=3280, height=2464" ! filesink location=test_3280x2464.bayer

Check the buffers with:

./raw2rgbpnm -f SGRBG10 -s 3296x2464 -b 5.0 -n test_3280x2464.bayer output_3280x2464
  • 1920x1080
gst-launch-1.0 -v v4l2src device=/dev/video0 num-buffers=10 ! "video/x-bayer, format=rggb, width=1920, height=1080" ! filesink location=test_1920x1080.bayer

Check the buffers with:

./raw2rgbpnm -f SGRBG10 -s 1920x1080 -b 5.0 -n test_1920x1080.bayer output_1920x1080
  • 1280x720
gst-launch-1.0 -v v4l2src device=/dev/video0 num-buffers=10 ! "video/x-bayer, format=rggb, width=1280, height=720" ! filesink location=test_1280x720.bayer

Check the buffers with:

./raw2rgbpnm -f SGRBG10 -s 1280x720 -b 5.0 -n test_1280x720.bayer output_1280x720
  • 1640x1232 (Binning x2)
gst-launch-1.0 -v v4l2src device=/dev/video0 num-buffers=10 ! "video/x-bayer, format=rggb, width=1640, height=1232" ! filesink location=test_1640x1232.bayer

Check the buffers with:

./raw2rgbpnm -f SGRBG10 -s 1664x1232 -b 5.0 -n test_1640x1232.bayer output_1640x1232
  • 820x616 (Binning x4)
gst-launch-1.0 -v v4l2src device=/dev/video0 num-buffers=10 ! "video/x-bayer, format=rggb, width=820, height=616" ! filesink location=test_820x616.bayer

Check the buffers with:

./raw2rgbpnm -f SGRBG10 -s 832x616 -b 5.0 -n test_820x616.bayer output_820x616

Nvcamerasrc

  • 3280x2464
gst-launch-1.0 nvcamerasrc sensor-id=0 fpsRange="21 21" ! 'video/x-raw(memory:NVMM), width=(int)3280, height=(int)2464, format=(string)I420, framerate=(fraction)21/1' ! autovideosink
  • 1920x1080
gst-launch-1.0 nvcamerasrc sensor-id=0 fpsRange="30 30" ! 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)I420, framerate=(fraction)30/1' ! autovideosink
  • 1640x1232
gst-launch-1.0 nvcamerasrc sensor-id=0 fpsRange="30 30" ! 'video/x-raw(memory:NVMM), width=(int)1640, height=(int)1232, format=(string)I420, framerate=(fraction)30/1' ! autovideosink

This is an image captured with the above pipeline:

Image: capture with nvcamerasrc. The camera is aimed at a computer monitor on the left, which is reflecting the wall and ceiling shown on the right.


  • 1280x720
gst-launch-1.0 nvcamerasrc sensor-id=0 fpsRange="30 30" ! 'video/x-raw(memory:NVMM), width=(int)1280, height=(int)720, format=(string)I420, framerate=(fraction)30/1' ! autovideosink
  • 820x616
gst-launch-1.0 nvcamerasrc sensor-id=0 fpsRange="30 30" ! 'video/x-raw(memory:NVMM), width=(int)820, height=(int)616, format=(string)I420, framerate=(fraction)30/1' ! autovideosink
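
nvcamerasrc can also be used for single-frame snapshots through the ISP. The following is a sketch that saves one 1920x1080 frame as a JPEG; nvvidconv and nvjpegenc are part of the standard JetPack GStreamer plugins, and the resolution is only an example:

gst-launch-1.0 nvcamerasrc sensor-id=0 fpsRange="30 30" num-buffers=1 ! 'video/x-raw(memory:NVMM), width=(int)1920, \
height=(int)1080, format=(string)I420, framerate=(fraction)30/1' ! nvvidconv ! 'video/x-raw, format=(string)I420' ! \
nvjpegenc ! filesink location=snapshot.jpg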

Dual Capture

Using the following pipelines we can test the performance of the Jetson TX1 when doing dual capture:

V4l2src

gst-launch-1.0 v4l2src device=/dev/video0 ! 'video/x-bayer,format=rggb,width=3280,height=2464' ! fakesink \
v4l2src device=/dev/video3 ! 'video/x-bayer,format=rggb,width=3280,height=2464' ! fakesink

We noticed that when using two cameras at the maximum resolution of 3280x2464, the CPU load measured with tegrastats does not change considerably; it remains almost the same:

  • Tegrastats in normal operation:
RAM 563/3994MB (lfb 748x4MB) cpu [51%,0%,0%,0%]@518 GR3D 0%@76 EDP limit 0
RAM 563/3994MB (lfb 748x4MB) cpu [50%,0%,0%,0%]@518 GR3D 0%@76 EDP limit 0
RAM 563/3994MB (lfb 748x4MB) cpu [50%,0%,0%,0%]@518 GR3D 0%@76 EDP limit 0
RAM 563/3994MB (lfb 748x4MB) cpu [51%,0%,0%,0%]@518 GR3D 0%@76 EDP limit 0
RAM 563/3994MB (lfb 748x4MB) cpu [49%,0%,0%,0%]@518 GR3D 0%@76 EDP limit 0
RAM 563/3994MB (lfb 748x4MB) cpu [53%,0%,0%,0%]@518 GR3D 0%@76 EDP limit 0
RAM 563/3994MB (lfb 748x4MB) cpu [52%,0%,0%,0%]@403 GR3D 0%@76 EDP limit 0
  • Tegrastats with the above pipeline running
RAM 608/3994MB (lfb 651x4MB) cpu [57%,3%,0%,0%]@825 GR3D 0%@76 EDP limit 0
RAM 608/3994MB (lfb 651x4MB) cpu [52%,29%,0%,0%]@403 GR3D 0%@76 EDP limit 0
RAM 608/3994MB (lfb 651x4MB) cpu [56%,0%,1%,0%]@403 GR3D 0%@76 EDP limit 0
RAM 608/3994MB (lfb 651x4MB) cpu [60%,1%,1%,0%]@403 GR3D 0%@76 EDP limit 0
RAM 608/3994MB (lfb 651x4MB) cpu [57%,1%,0%,0%]@403 GR3D 0%@76 EDP limit 0
RAM 608/3994MB (lfb 651x4MB) cpu [58%,0%,1%,0%]@403 GR3D 0%@76 EDP limit 0
RAM 608/3994MB (lfb 651x4MB) cpu [58%,1%,0%,0%]@403 GR3D 0%@76 EDP limit 0
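
The numbers above come from NVIDIA's tegrastats utility. On L4T 28.x it is normally found in the home directory of the default user, so a typical way to collect these statistics while a pipeline runs in another terminal is:

cd ~
sudo ./tegrastats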

Nvcamerasrc

gst-launch-1.0 nvcamerasrc sensor-id=0 fpsRange="21 21" ! 'video/x-raw(memory:NVMM), width=(int)3280, height=(int)2464, format=(string)I420, framerate=(fraction)21/1' ! \
fakesink nvcamerasrc sensor-id=2 fpsRange="21 21" ! 'video/x-raw(memory:NVMM), width=(int)3280, height=(int)2464, format=(string)I420, framerate=(fraction)21/1' ! fakesink

These tests were done using the J20 board from Auvidea. See the Getting Started Guide for the Auvidea J20 board for setup details.

Dual capture and dual display

DISPLAY=:0 gst-launch-1.0 nvcamerasrc sensor-id=0 fpsRange="30 30" ! 'video/x-raw(memory:NVMM), width=(int)1640, height=(int)1232, format=(string)I420, \
framerate=(fraction)21/1' ! nvegltransform ! nveglglessink nvcamerasrc sensor-id=2 fpsRange="30 30" ! 'video/x-raw(memory:NVMM), width=(int)1640, \
height=(int)1232, format=(string)I420, framerate=(fraction)21/1' ! nvegltransform ! nveglglessink -e

Video Encoding Transport Stream 1640x1232@30fps

CAPS="video/x-raw(memory:NVMM), width=(int)1640, height=(int)1232, format=(string)I420, framerate=(fraction)30/1"

gst-launch-1.0 nvcamerasrc sensor-id=1 fpsRange="30 30" num-buffers=500 ! capsfilter caps="$CAPS" ! omxh264enc ! \
               mpegtsmux ! filesink location=test.ts
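
To verify the recording, the transport stream can be decoded and displayed on the board. This is a sketch using the stock demuxer and OMX decoder elements shipped with JetPack; as with the other display pipelines, DISPLAY=:0 is needed when running over serial or SSH:

DISPLAY=:0 gst-launch-1.0 filesrc location=test.ts ! tsdemux ! h264parse ! omxh264dec ! \
               nvegltransform ! nveglglessink -e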

Snapshots

gst-launch-1.0 -v nvcamerasrc sensor-id=1 fpsRange="30 30" num-buffers=100 ! 'video/x-raw(memory:NVMM), width=(int)1640, height=(int)1232, format=(string)I420, \
framerate=(fraction)30/1' ! nvvidconv ! 'video/x-raw, width=(int)1640, height=(int)1232, format=(string)I420, framerate=(fraction)30/1' ! multifilesink location=test_%d.yuv
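
Each test_%d.yuv file holds one raw 1640x1232 I420 frame. One way to review the snapshots is with the videoparse element from gst-plugins-bad, assuming it is available in your GStreamer build:

DISPLAY=:0 gst-launch-1.0 multifilesrc location="test_%d.yuv" ! \
               videoparse format=i420 width=1640 height=1232 framerate=30/1 ! \
               videoconvert ! autovideosink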

Demo SD card for Jetson TX1 J20

http://developer.ridgerun.com/wiki/index.php?title=RidgeRun_Demo_Images

Related articles

Imx219 vs ov5693 armload
Jetson J20 imx219 glass to glass latency

Contact Us

If you are interested in the evaluation version or have technical questions, please send an email to support@ridgerun.com. If you are interested in purchasing the driver, please post your inquiry at our Contact Us link.