Jetson TX2 ON Semiconductor MT9V024 image sensor Linux driver
Keywords: MT9V024 Jetson TX2, Gstreamer, NVIDIA, RidgeRun, V4L2 Driver
MT9V024 Features
The ON Semiconductor MT9V024 is an image sensor with the following features:
- Array format: Wide-VGA, active 752H x 480V (360,960 pixels)
- Global shutter photodiode pixels; simultaneous integration and readout
- Color filter arrays: RGB Bayer, Monochrome, or RCCC; enhanced NIR performance for use with non-visible NIR illumination
- Readout modes: progressive or interlaced
- Shutter efficiency: >99%
- Simple two-wire serial interface
- Real-time exposure context switching - dual register set
- Register lock capability
- Window size: User programmable to any smaller format (QVGA, CIF, QCIF). The data rate can be maintained independently of window size
- Binning: 2 x 2 and 4 x 4 of the full resolution
- ADC: On-chip, 10-bit column-parallel (option to operate in 12-bit to 10-bit companding mode)
- Automatic controls: Auto exposure control (AEC) and auto gain control (AGC); variable regional and variable weight AEC/AGC
- Support for four unique serial control register IDs to control multiple imagers on the same bus
- Data output formats:
- Single sensor mode:
- 10-bit parallel/stand-alone
- 8-bit or 10-bit serial LVDS
- Stereo sensor mode:
- Interspersed 8-bit serial LVDS
- High dynamic range (HDR) mode
RidgeRun has developed a driver for the Jetson TX2 platform with the following support:
- L4T 32.1 and Jetpack 4.2
- V4L2 Media Controller driver
- Tested resolution: 752 x 480 @ 30 fps
- Tested color filter array: Monochrome
- Capture with nvarguscamerasrc using the ISP.
Enabling the driver
In order to use this driver, you have to patch and compile the kernel source using JetPack:
- Follow the instructions in (Downloading sources) to get the kernel source code.
- Once you have the source code, apply the following patch to add the changes required for the MT9V024 camera at the kernel and device tree (dtb) level.
4.2_mt9v024.patch
- Follow the instructions in (Build Kernel) for building the kernel, and then flash the image.
Make sure to enable MT9V024 driver support:
make menuconfig
-> Device Drivers -> Multimedia support -> Encoders, decoders, sensors and other helper chips -> <*> MT9V024 camera sensor support
Using the driver
V4L2 Capture
To get RAW data from the MT9V024 sensor, run the following command:
v4l2-ctl -d /dev/video0 --set-fmt-video=width=752,height=480 --set-ctrl bypass_mode=0 --stream-mmap --stream-count=10 --stream-to=mt9v024_752x480.raw
This will generate the mt9v024_752x480.raw file; since it contains raw sensor data, you will need a tool to inspect the frames.
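As a quick way to preview a captured frame without a dedicated raw viewer, the sketch below converts one frame to an 8-bit PGM image. It assumes (this is not stated by the driver documentation, so verify against your capture) that the TX2 VI stores each 10-bit sample LSB-aligned in a 16-bit little-endian word and that frames are tightly packed:

```python
# Minimal sketch: extract one frame from the captured RAW file and save it
# as an 8-bit grayscale PGM. Assumptions: 10-bit samples LSB-aligned in
# 16-bit little-endian words, tightly packed 752x480 frames.
import struct

WIDTH, HEIGHT = 752, 480
BYTES_PER_FRAME = WIDTH * HEIGHT * 2  # 2 bytes per 10-in-16-bit sample

def raw10_to_pgm(raw_path, pgm_path, frame=0):
    with open(raw_path, "rb") as f:
        f.seek(frame * BYTES_PER_FRAME)
        data = f.read(BYTES_PER_FRAME)
    samples = struct.unpack("<%dH" % (WIDTH * HEIGHT), data)
    # Drop the 2 LSBs of each 10-bit sample to get 8-bit grayscale;
    # clamp in case the samples turn out to be MSB-aligned instead.
    pixels = bytes(min(s >> 2, 255) for s in samples)
    with open(pgm_path, "wb") as f:
        f.write(b"P5\n%d %d\n255\n" % (WIDTH, HEIGHT))
        f.write(pixels)
```

If the resulting image looks wrong, the alignment assumption likely does not match your platform; adjust the shift accordingly.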
Gstreamer examples
Single Capture
gst-launch-1.0 nvarguscamerasrc ! "video/x-raw(memory:NVMM),width=(int)752,height=(int)480,format=(string)NV12,framerate=(fraction)30/1" ! autovideosink
The sensor will capture in the 752x480@30fps mode:
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
GST_ARGUS: Creating output stream
CONSUMER: Waiting until producer is connected...
GST_ARGUS: Available Sensor modes :
GST_ARGUS: 752 x 480 FR = 59,999999 fps Duration = 16666667 ; Analog Gain range min 1,000000, max 64,000000; Exposure Range min 31000, max 166490000;
GST_ARGUS: Running with following settings:
   Camera index = 0
   Camera mode = 0
   Output Stream W = 752 H = 480
   seconds to Run = 0
   Frame Rate = 59,999999
GST_ARGUS: PowerService: requested_clock_Hz=4737600
GST_ARGUS: Setup Complete, Starting captures for 0 seconds
GST_ARGUS: Starting repeat capture requests.
CONSUMER: Producer has connected; continuing.
Note: Image colors will change because the color-filter-array supported by the driver is Monochrome, but this is not supported by the ISP.
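Since only the luma plane of the NV12 output carries real image data for a monochrome sensor, one option is to dump frames (for example with nvarguscamerasrc into a filesink) and keep just the Y plane. The sketch below assumes a tightly packed NV12 frame (planar Y followed by interleaved UV); buffers dumped from NVMM memory may carry pitch padding, in which case the offsets must be adjusted:

```python
# Sketch: recover the grayscale image from an NV12 frame captured from the
# monochrome MT9V024 by discarding the (meaningless) chroma plane.
# Assumed layout: tightly packed Y plane (w*h bytes) then interleaved UV.
WIDTH, HEIGHT = 752, 480
Y_SIZE = WIDTH * HEIGHT
FRAME_SIZE = Y_SIZE * 3 // 2  # NV12 is 12 bits per pixel

def nv12_luma(nv12_frame):
    """Return the Y (luma) plane of one tightly packed NV12 frame."""
    if len(nv12_frame) != FRAME_SIZE:
        raise ValueError("unexpected frame size; buffer may be pitch-padded")
    return nv12_frame[:Y_SIZE]
```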
Multi Capture (6 cameras)
The following is a multi-capture test using the gst-launch-1.0 application to run 6 simultaneous streams:
gst-launch-1.0 \
nvarguscamerasrc sensor-id=0 num-buffers=100 ! 'video/x-raw(memory:NVMM), width=(int)752, height=(int)480, format=(string)NV12, framerate=(fraction)30/1' ! fakesink \
nvarguscamerasrc sensor-id=1 num-buffers=100 ! 'video/x-raw(memory:NVMM), width=(int)752, height=(int)480, format=(string)NV12, framerate=(fraction)30/1' ! fakesink \
nvarguscamerasrc sensor-id=2 num-buffers=100 ! 'video/x-raw(memory:NVMM), width=(int)752, height=(int)480, format=(string)NV12, framerate=(fraction)30/1' ! fakesink \
nvarguscamerasrc sensor-id=3 num-buffers=100 ! 'video/x-raw(memory:NVMM), width=(int)752, height=(int)480, format=(string)NV12, framerate=(fraction)30/1' ! fakesink \
nvarguscamerasrc sensor-id=4 num-buffers=100 ! 'video/x-raw(memory:NVMM), width=(int)752, height=(int)480, format=(string)NV12, framerate=(fraction)30/1' ! fakesink \
nvarguscamerasrc sensor-id=5 num-buffers=100 ! 'video/x-raw(memory:NVMM), width=(int)752, height=(int)480, format=(string)NV12, framerate=(fraction)30/1' ! fakesink
Performance
Framerate
V4L2-ctl
We get 30 fps from the sensor on a V4L2 capture:
v4l2-ctl -d /dev/video0 --set-fmt-video=width=752,height=480,pixelformat=RG10 --set-ctrl bypass_mode=0 --stream-mmap
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 30.04 fps
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 30.03 fps
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 30.02 fps
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 30.01 fps
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 30.01 fps
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 30.01 fps
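The fps figures above are averages over frame intervals. If you log per-frame capture timestamps yourself (for example from a custom V4L2 application), the same mean framerate can be computed as frames divided by elapsed time, a small sketch:

```python
# Sketch: average framerate from a list of capture timestamps, the same
# quantity v4l2-ctl reports while streaming.
def mean_fps(timestamps_s):
    """timestamps_s: monotonically increasing capture times in seconds."""
    if len(timestamps_s) < 2:
        raise ValueError("need at least two timestamps")
    span = timestamps_s[-1] - timestamps_s[0]
    # N timestamps bound N-1 frame intervals.
    return (len(timestamps_s) - 1) / span
```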
GStreamer
Using the following pipeline, we measured the framerate for single capture with the perf element:
gst-launch-1.0 nvarguscamerasrc ! "video/x-raw(memory:NVMM),width=(int)752,height=(int)480,format=(string)NV12,framerate=(fraction)30/1" ! fakesink
perf: perf0; timestamp: 1:35:55.751602001; bps: 193879.384; mean_bps: 193879.384; fps: 29.994; mean_fps: 29.994
INFO: perf: perf0; timestamp: 1:35:56.761051137; bps: 192104.776; mean_bps: 192992.080; fps: 29.719; mean_fps: 29.856
INFO: perf: perf0; timestamp: 1:35:57.770470774; bps: 192110.390; mean_bps: 192698.183; fps: 29.720; mean_fps: 29.811
INFO: perf: perf0; timestamp: 1:35:58.779928270; bps: 192103.185; mean_bps: 192549.434; fps: 29.719; mean_fps: 29.788
INFO: perf: perf0; timestamp: 1:35:59.789433771; bps: 192094.050; mean_bps: 192458.357; fps: 29.718; mean_fps: 29.774
INFO: perf: perf0; timestamp: 1:36:00.798881673; bps: 192105.011; mean_bps: 192399.466; fps: 29.719; mean_fps: 29.765
INFO: perf: perf0; timestamp: 1:36:01.808314377; bps: 192107.903; mean_bps: 192357.814; fps: 29.720; mean_fps: 29.758
INFO: perf: perf0; timestamp: 1:36:02.817684397; bps: 192119.833; mean_bps: 192328.066; fps: 29.722; mean_fps: 29.754
INFO: perf: perf0; timestamp: 1:36:03.827163605; bps: 192099.053; mean_bps: 192302.621; fps: 29.718; mean_fps: 29.750
INFO: perf: perf0; timestamp: 1:36:04.836652768; bps: 192097.159; mean_bps: 192282.074; fps: 29.718; mean_fps: 29.747
The results show a constant framerate just below 30 fps when using nvarguscamerasrc, which passes frames through the ISP to convert them from Bayer to YUV.
For direct inquiries, please refer to the contact information available on our Contact page. Alternatively, you may complete and submit the form provided at the same link. We will respond to your request at our earliest opportunity.