Mira130 Linux Driver

Problems running the pipelines shown on this page? Please see our GStreamer Debugging guide for help.

Driver List Information
Refer to the RidgeRun Linux Camera Drivers page for the complete list of available drivers.

AMS MIRA130 Features

The Mira130 is a monochrome global shutter CMOS sensor with an effective pixel array output of 1080 H x 1280 V. It supports NIR enhancement of the quantum efficiency (QE) and operating modes such as high dynamic range (HDR), external triggering, windowing, and horizontal or vertical mirroring. At its maximum rate, the sensor delivers 120 fps with 10-bit data at the full 1080 H x 1280 V resolution. The chip operates from 2.5 V analog, 1.8 V digital, and 1.8 V interface supplies, and it offers high sensitivity, registers programmable through I2C, low power consumption, and a built-in temperature sensor. Typical applications include 3D structured light, 3D active stereo systems, and machine vision.

Supported Platforms

  • NVIDIA Jetson Nano Development Kit B01

Features Included in the Driver

Nano

  Feature            Details               SDK Support
  1080x1280@120fps   2 Lanes, RAW10, Y10   L4T 32.6.1 / JetPack 4.6

RidgeRun has developed a driver for the Jetson Nano platform with the following support:

  • V4L2 media controller driver
  • Capture with GStreamer v4l2src and v4l2-ctl

Enabling the driver

To use this driver, you have to patch and compile the kernel source.

Using Jetpack

Follow these instructions:

1. Download the toolchain following the instructions from:
Download and install the Toolchain
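
As a hedged sketch (the exact archive name and download link come from the Toolchain page above), the toolchain only needs to be extracted under $HOME/l4t-gcc so that the CROSS_COMPILE path exported in step 5 exists:

# Extract the downloaded toolchain archive (name assumed) so that
# $HOME/l4t-gcc/gcc-linaro-7.3.1-2018.05-x86_64_aarch64-linux-gnu/bin/ is available
mkdir -p $HOME/l4t-gcc
tar -xf <downloaded-toolchain-archive>.tar.xz -C $HOME/l4t-gcc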

2. Follow the instructions to download and install the NVIDIA SDK Manager from:
NVIDIA SDK Manager
- Then choose the platform (Jetson Nano) and the JetPack version (4.6).
- The NVIDIA SDK Manager will install the release in a directory similar to:

$HOME/nvidia/nvidia_sdk/JetPack_4.6_Linux_JETSON_NANO_TARGETS/

3. Get the L4T Nano sources:

cd $HOME/nvidia/nvidia_sdk/JetPack_4.6_Linux_JETSON_NANO_TARGETS/Linux_for_Tegra/
./source_sync.sh -t tegra-l4t-r32.6.1
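
After the sync completes, the kernel sources should be under sources/kernel/kernel-4.9, which is the path the KERNEL_DIR export in step 5 expects:

ls sources/kernel/kernel-4.9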

4. Apply the patches provided in 4.6_evm_mira130_v0.1.tar inside the sources directory:
- First, untar the provided tarball:

tar -xvf 4.6_evm_mira130_v0.1.tar

- Then apply the patches with quilt:

quilt push -a
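
As an optional sanity check (run from the same sources directory), quilt can list the patches that were applied:

quilt applied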

5. Compile the kernel with the following steps:

export DEVDIR=$HOME/nvidia/nvidia_sdk/JetPack_4.6_Linux_JETSON_NANO_TARGETS/Linux_for_Tegra
export PATCHESPATH=$HOME/nvidia/nvidia_sdk/JetPack_4.6_Linux_JETSON_NANO_TARGETS/Linux_for_Tegra/sources/patches/
cd $DEVDIR
# Create the directory to store the compiled image and dtb
mkdir -p $DEVDIR/images/dtb
export TEGRA_KERNEL_OUT=$DEVDIR/images
export ARCH=arm64
export KERNEL_DIR=$DEVDIR/sources/kernel/kernel-4.9
export CROSS_COMPILE=$HOME/l4t-gcc/gcc-linaro-7.3.1-2018.05-x86_64_aarch64-linux-gnu/bin/aarch64-linux-gnu-
export LOCALVERSION=-tegra
cd $KERNEL_DIR
make mrproper
  • Make sure to enable MIRA130 driver support:
make O=$TEGRA_KERNEL_OUT tegra_defconfig
make O=$TEGRA_KERNEL_OUT menuconfig
  • In the terminal menu that appears, select:
Device Drivers  --->
  <*> Multimedia support  --->
      NVIDIA overlay Encoders, decoders, sensors and other helper chips  --->
          <*> MIRA130 camera sensor support

If the driver is not selected, press the Y key to select the MIRA130 option. Then press Esc twice repeatedly to go back until the message Do you want to save your new configuration? appears, select Yes, and press Enter.

  • Compile the kernel:
make O=$TEGRA_KERNEL_OUT CROSS_COMPILE=${CROSS_COMPILE} -j4 zImage
  • Compile the device tree:
make O=$TEGRA_KERNEL_OUT CROSS_COMPILE=${CROSS_COMPILE} -j4 dtbs
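  • Optionally, before flashing, verify that the MIRA130 option made it into the configuration and that the build artifacts exist (the exact config symbol name may differ, so grep is used loosely here):
grep -i mira $TEGRA_KERNEL_OUT/.config
ls -lh $TEGRA_KERNEL_OUT/arch/arm64/boot/Image
ls $TEGRA_KERNEL_OUT/arch/arm64/boot/dts/ | grep -i tegra210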

6. Flash the Jetson Nano:

Make sure the Jetson Nano is in recovery mode.

  • Copy the compiled image to the kernel directory.
cp $TEGRA_KERNEL_OUT/arch/arm64/boot/Image $TEGRA_KERNEL_OUT/arch/arm64/boot/zImage $DEVDIR/kernel/
  • Copy the compiled device tree to the kernel directory.
cp -r $TEGRA_KERNEL_OUT/arch/arm64/boot/dts/* $DEVDIR/kernel/dtb/
  • Flash the memory with the following commands:
cd $DEVDIR
sudo ./flash.sh jetson-nano-qspi-sd mmcblk0p1
  • Reboot the board after the flashing is completed.
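
Once the board boots, a quick check (assuming v4l-utils is installed on the board; see the capture section below) confirms that the driver probed and the video node was created:

ls /dev/video*
v4l2-ctl --list-devices
dmesg | grep -i mira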

Apply the v4l2src patch

To capture with v4l2src, a patch needs to be applied to GStreamer so that v4l2src supports the Y10 output format.

1. Extract the contents of extra_gstreamer_flashing_patches.tar in the sources/patches directory:

cd $PATCHESPATH
tar -xvf extra_gstreamer_flashing_patches.tar

Apply the v4l2src patch to the Jetson Nano Devkit board

1. Transfer the patch to the board:

cd $PATCHESPATH
scp add-Y10-support-1.14.5.patch <nvidia-nano-user>@<nvidia-nano-ip>:/home/<nvidia-nano-user>

2. On the board, install the necessary GStreamer dependencies:

sudo apt update
sudo apt install libgstreamer1.0-dev libgstreamer-plugins-base1.0-dev

3. Download gst-plugins-good:

mkdir gstreamer-1.14.5; cd gstreamer-1.14.5
wget https://gstreamer.freedesktop.org/src/gst-plugins-good/gst-plugins-good-1.14.5.tar.xz
tar -xvf gst-plugins-good-1.14.5.tar.xz

4. Apply the patch:

cd gst-plugins-good-1.14.5/sys/v4l2/
patch -i $HOME/add-Y10-support-1.14.5.patch

Compile and install

1. Compile:

cd ~/gstreamer-1.14.5/gst-plugins-good-1.14.5
./configure --prefix=/usr --libdir=/usr/lib/aarch64-linux-gnu/
make
DESTDIR=$(pwd)/install make install

2. Install the library:

sudo cp install/usr/lib/aarch64-linux-gnu/gstreamer-1.0/libgstvideo4linux2.so /usr/lib/aarch64-linux-gnu/gstreamer-1.0/libgstvideo4linux2.so
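
As a quick check that the copy took effect, compare the freshly built library (run from the gst-plugins-good-1.14.5 directory used above) against the installed one; the checksums should match:

md5sum install/usr/lib/aarch64-linux-gnu/gstreamer-1.0/libgstvideo4linux2.so /usr/lib/aarch64-linux-gnu/gstreamer-1.0/libgstvideo4linux2.so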

Using the Driver

Capture with v4l2-ctl

  • Install v4l utils:
sudo apt install v4l-utils
  • Test the capture framerate:
v4l2-ctl -d /dev/video0 --set-fmt-video=width=1080,height=1280,pixelformat=Y10 --set-ctrl bypass_mode=0 --stream-mmap

The output should look like the following:

ams@ams-desktop:~$ v4l2-ctl -d /dev/video0 --set-fmt-video=width=1080,height=1280,pixelformat=Y10 --set-ctrl bypass_mode=0 --stream-mmap
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 120.00 fps
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 120.00 fps
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 120.00 fps
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 120.00 fps
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 120.00 fps
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 120.00 fps
  • Capture a single frame:
v4l2-ctl -d /dev/video0 --set-fmt-video=width=1080,height=1280,pixelformat=Y10 --set-ctrl bypass_mode=0 --stream-mmap --stream-count=1 --stream-to=test_frame_120fps.raw

The MIRA130 supports a resolution of 1080x1280, but the Jetson Nano platform pads each line with 8 extra pixels to align and optimize the capture process. To post-process the capture, open the image with a 1088x1280 resolution using Vooya or rawpixels; a quick size check for the captured file is sketched after the settings below.

Use the following settings to view the image correctly:

  • rawpixels:

- width: 1088
- height: 1280
- Predefined format: Grayscale 8bit
- Pixel format: Grayscale
- bpp1: 16
- Little Endian box checked

  • vooya:

- width: 1088
- height: 1280
- Frames/Second: 120
- Color Space: Single Channel
- Data Container: Single Integer
- Bit Depth (Value): 14bit
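
As a quick, hedged check that the capture matches the padded 1088x1280 layout (assuming the Y10 data is stored in 16-bit containers, i.e. 2 bytes per pixel), compare the raw file size against the expected value:

stat -c %s test_frame_120fps.raw   # expected 1088 * 1280 * 2 = 2785280 bytes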

Capture with GStreamer

Before capturing, maximize the clock frequencies on the board:

sudo jetson_clocks
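
To confirm that the clocks are locked at their maximum frequencies, print the current configuration:

sudo jetson_clocks --show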

Performance statistics

Run the capture pipeline to a fakesink and monitor the system load with tegrastats in a separate terminal; the statistics below were collected while the pipeline was running:

gst-launch-1.0 v4l2src device=/dev/video0 ! "video/x-raw,width=1080,height=1280,framerate=120/1,format=GRAY16_LE" ! queue ! fakesink
RAM 1233/3963MB (lfb 400x4MB) CPU [0%@1479,0%@1479,0%@1479,50%@1479]
RAM 1233/3963MB (lfb 400x4MB) CPU [1%@1479,0%@1479,0%@1479,52%@1479]
RAM 1233/3963MB (lfb 400x4MB) CPU [3%@1479,0%@1479,0%@1479,52%@1479]
RAM 1233/3963MB (lfb 400x4MB) CPU [8%@1479,1%@1479,0%@1479,51%@1479]
RAM 1233/3963MB (lfb 400x4MB) CPU [2%@1479,0%@1479,0%@1479,52%@1479]
RAM 1233/3963MB (lfb 400x4MB) CPU [1%@1479,0%@1479,0%@1479,51%@1479]
RAM 1233/3963MB (lfb 400x4MB) CPU [1%@1479,1%@1479,0%@1479,51%@1479]
RAM 1233/3963MB (lfb 400x4MB) CPU [2%@1479,2%@1479,16%@1479,37%@1479]
RAM 1233/3963MB (lfb 400x4MB) CPU [0%@1479,0%@1479,0%@1479,51%@1479]
RAM 1233/3963MB (lfb 400x4MB) CPU [0%@1479,0%@1479,45%@1479,7%@1479]
RAM 1233/3963MB (lfb 400x4MB) CPU [1%@1479,1%@1479,52%@1479,0%@1479]

Framerate

The capture framerate was measured with the perf element from RidgeRun's gst-perf plugin. If the element is not available on the board, a hedged build sketch is shown next, followed by the measurement pipeline and its output:
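
This sketch assumes the autotools build used by the public gst-perf repository; check its README if the steps differ for your version:

git clone https://github.com/RidgeRun/gst-perf.git
cd gst-perf
./autogen.sh
./configure --prefix=/usr --libdir=/usr/lib/aarch64-linux-gnu/
make
sudo make install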

gst-launch-1.0 v4l2src device=/dev/video0 ! perf ! "video/x-raw,width=1080,height=1280,framerate=120/1,format=GRAY16_LE" ! fakesink
perf: perf0; timestamp: 0:56:38.972435551; bps: 0,000; mean_bps: 0,000; fps: 0,000; mean_fps: 0,000
INFO:
perf: perf0; timestamp: 0:56:39.979731288; bps: 2540175360,000; mean_bps: 0,000; fps: 120,124; mean_fps: 120,124
INFO:
perf: perf0; timestamp: 0:56:40.979764633; bps: 2673868800,000; mean_bps: 2673868800,000; fps: 119,996; mean_fps: 120,060
INFO:
perf: perf0; timestamp: 0:56:41.979774896; bps: 2673868800,000; mean_bps: 2673868800,000; fps: 119,999; mean_fps: 120,039
INFO:
perf: perf0; timestamp: 0:56:42.979792024; bps: 2673868800,000; mean_bps: 2673868800,000; fps: 119,998; mean_fps: 120,029
INFO:
perf: perf0; timestamp: 0:56:43.979810601; bps: 2673868800,000; mean_bps: 2673868800,000; fps: 119,998; mean_fps: 120,023
INFO:
perf: perf0; timestamp: 0:56:44.979836408; bps: 2673868800,000; mean_bps: 2673868800,000; fps: 119,997; mean_fps: 120,018
INFO:
perf: perf0; timestamp: 0:56:45.979851268; bps: 2696151040,000; mean_bps: 2677582506,667; fps: 119,998; mean_fps: 120,016
INFO:
perf: perf0; timestamp: 0:56:46.979881952; bps: 2673868800,000; mean_bps: 2677051977,143; fps: 119,996; mean_fps: 120,013
INFO:
perf: perf0; timestamp: 0:56:47.979909813; bps: 2673868800,000; mean_bps: 2676654080,000; fps: 119,997; mean_fps: 120,011
INFO:
perf: perf0; timestamp: 0:56:48.979927144; bps: 2673868800,000; mean_bps: 2676344604,444; fps: 119,998; mean_fps: 120,010
INFO:
perf: perf0; timestamp: 0:56:49.979933060; bps: 2673868800,000; mean_bps: 2676097024,000; fps: 119,999; mean_fps: 120,009
INFO:
perf: perf0; timestamp: 0:56:50.979961414; bps: 2673868800,000; mean_bps: 2675894458,182; fps: 119,997; mean_fps: 120,008
INFO:
perf: perf0; timestamp: 0:56:51.979984914; bps: 2673868800,000; mean_bps: 2675725653,333; fps: 119,997; mean_fps: 120,007
INFO:
perf: perf0; timestamp: 0:56:52.980041219; bps: 2673868800,000; mean_bps: 2675582818,462; fps: 119,993; mean_fps: 120,006
INFO:
perf: perf0; timestamp: 0:56:53.988375122; bps: 2696151040,000; mean_bps: 2677051977,143; fps: 120,000; mean_fps: 120,006
INFO:
perf: perf0; timestamp: 0:56:54.996700006; bps: 2673868800,000; mean_bps: 2676839765,333; fps: 120,001; mean_fps: 120,005
INFO:
perf: perf0; timestamp: 0:56:55.996715395; bps: 2673868800,000; mean_bps: 2676654080,000; fps: 119,998; mean_fps: 120,005
INFO:
perf: perf0; timestamp: 0:56:56.996737859; bps: 2673868800,000; mean_bps: 2676490240,000; fps: 119,997; mean_fps: 120,005
INFO:
perf: perf0; timestamp: 0:56:57.996773699; bps: 2673868800,000; mean_bps: 2676344604,444; fps: 119,996; mean_fps: 120,004
INFO:
perf: perf0; timestamp: 0:56:58.996792290; bps: 2673868800,000; mean_bps: 2676214298,947; fps: 119,998; mean_fps: 120,004
INFO:
perf: perf0; timestamp: 0:56:59.996808788; bps: 2673868800,000; mean_bps: 2676097024,000; fps: 119,998; mean_fps: 120,004
INFO:
perf: perf0; timestamp: 0:57:00.996826163; bps: 2673868800,000; mean_bps: 2675990918,095; fps: 119,998; mean_fps: 120,003
INFO:
perf: perf0; timestamp: 0:57:01.996847486; bps: 2696151040,000; mean_bps: 2676907287,273; fps: 119,997; mean_fps: 120,003

GStreamer Examples

Capture and Display
gst-launch-1.0 v4l2src device=/dev/video0 ! "video/x-raw,width=1080,height=1280,framerate=120/1,format=GRAY16_LE" ! queue ! videoconvert ! xvimagesink sync=false
Video Encoding
gst-launch-1.0 v4l2src device=/dev/video0 ! "video/x-raw,width=1080,height=1280,framerate=120/1,format=GRAY16_LE" ! queue ! videoconvert ! queue ! omxh265enc ! h265parse ! qtmux ! filesink location=out.mp4 -e

The sensor will capture in the 1080x1280@120fps mode, and the pipeline will encode the video and save it to an out.mp4 file.
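
As a hedged playback check (assuming a display is attached to the board), the recording can be verified with playbin:

gst-launch-1.0 playbin uri=file://$PWD/out.mp4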


