Sony IMX327 Linux Driver
Revision as of 23:43, 7 July 2021
Sony IMX327 Features
The IMX327LQR-C is a diagonal 6.46 mm (Type 1/2.8) CMOS active pixel type solid-state image sensor with a square pixel array and 2.13 M effective pixels. This chip operates with analog 2.9 V, digital 1.2 V, and interface 1.8 V triple power supply, and has low power consumption. High sensitivity, low dark current and no smear are achieved through the adoption of R, G and B primary color mosaic filters. This chip features an electronic shutter with variable charge-integration time. (Applications: Surveillance cameras, FA cameras, Industrial cameras)
Supported Platforms
- NVIDIA Jetson Nano
Features Included in the Driver
Enabling the Driver
To use this driver, you have to patch and compile the kernel source using JetPack:
- Follow the instructions in (Downloading sources) to get the kernel source code.
- Once you have the source code, apply the following patch to add the changes required for the IMX327 camera at the kernel and device tree (DTB) level.
4.5_imx327.patch
- Follow the instructions in (Build Kernel) for building the kernel, and then flash the image.
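The steps above can be condensed into a shell sketch. The source tree and patch locations below are placeholders for your own JetPack layout, not paths taken from this page:

```shell
# Sketch only: adjust paths to match your JetPack/L4T source download.
cd "$JETSON_L4T/sources/kernel/kernel-4.9"   # hypothetical kernel source location

# Apply the IMX327 kernel and device tree changes
patch -p1 < /path/to/4.5_imx327.patch

# Build the kernel image and device tree blobs
make -j"$(nproc)" Image dtbs
```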
Make sure to enable IMX327 driver support:
make menuconfig
-> Device Drivers -> Multimedia support -> NVIDIA overlay Encoders, decoders, sensors and other helper chips -> <M> IMX327 camera sensor support
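After booting the patched kernel, a quick way to confirm the driver probed correctly (the module and node names below are assumptions, adjust to your setup) is:

```shell
# Check that the sensor registered a V4L2 capture node
ls /dev/video*

# Look for probe messages from the driver (driver name assumed to be imx327)
dmesg | grep -i imx327

# List the capture formats the node exposes (requires v4l-utils)
v4l2-ctl -d /dev/video0 --list-formats-ext
```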
Also select the runtime device tree blob by editing $JETSON_L4T/rootfs/boot/extlinux/extlinux.conf
to add the "FDT" line:
TIMEOUT 30
DEFAULT primary

MENU TITLE L4T boot options

LABEL primary
      MENU LABEL primary kernel
      LINUX /boot/Image
      INITRD /boot/initrd
      FDT /boot/tegra210-p3448-0000-p3449-0000-a02.dtb
      APPEND ${cbootargs} quiet
Using the Driver
GStreamer Examples
Capture and Display
- 1920x1080@30fps RGGB12
gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM), width=1920, height=1080, format=NV12, framerate=30/1' ! nvvidconv ! xvimagesink
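As a rough sanity check, the raw Bayer data rate implied by the 1920x1080@30fps 12-bit mode (ignoring line and frame blanking, which the real MIPI link also carries) can be computed directly:

```shell
# Approximate raw sensor data rate for the 1080p30 12-bit mode, excluding blanking
WIDTH=1920; HEIGHT=1080; FPS=30; BITS_PER_PIXEL=12
echo "$(( WIDTH * HEIGHT * FPS * BITS_PER_PIXEL / 1000000 )) Mbps"
# prints: 746 Mbps
```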
Video Encoding
CAPS="video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1"
gst-launch-1.0 nvarguscamerasrc sensor-id=0 num-buffers=500 ! "$CAPS" ! omxh264enc ! mpegtsmux ! filesink location=test.ts
The sensor will capture in the 1920x1080@30fps mode, and the pipeline will encode the video and save it to the test.ts file.
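The recording can be played back on the board to verify the encode. The pipeline below assumes an X display and the standard GStreamer demux/decode plugins; on Jetson you could also substitute the hardware decoder:

```shell
# Decode and display the recorded MPEG transport stream
gst-launch-1.0 filesrc location=test.ts ! tsdemux ! h264parse ! avdec_h264 ! videoconvert ! xvimagesink
```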
Performance
ARM Load
Tegrastats displays the following output when capturing with the sensor driver on the Jetson Nano platform:
RAM 1167/3963MB (lfb 522x4MB) CPU [25%@1132,16%@1132,9%@1132,12%@1132]
RAM 1168/3963MB (lfb 522x4MB) CPU [28%@921,12%@921,9%@921,13%@921]
RAM 1167/3963MB (lfb 522x4MB) CPU [23%@921,12%@921,13%@921,10%@921]
RAM 1167/3963MB (lfb 522x4MB) CPU [28%@921,8%@921,12%@921,12%@921]
RAM 1169/3963MB (lfb 522x4MB) CPU [26%@1479,9%@1479,16%@1479,9%@1479]
RAM 1167/3963MB (lfb 522x4MB) CPU [28%@921,13%@921,9%@921,16%@921]
RAM 1168/3963MB (lfb 522x4MB) CPU [23%@1036,13%@1036,14%@1036,7%@1036]
RAM 1167/3963MB (lfb 522x4MB) CPU [25%@921,12%@921,9%@921,11%@921]
RAM 1168/3963MB (lfb 522x4MB) CPU [25%@921,13%@921,16%@921,12%@921]
RAM 1169/3963MB (lfb 522x4MB) CPU [27%@921,12%@921,8%@921,13%@921]
RAM 1168/3963MB (lfb 522x4MB) CPU [24%@921,8%@921,13%@921,10%@921]
RAM 1169/3963MB (lfb 522x4MB) CPU [29%@921,13%@921,15%@921,6%@921]
Framerate
Using the following pipeline, we measured the framerate for a single capture with the perf element:
gst-launch-1.0 nvarguscamerasrc sensor-id=0 ! 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1' ! perf ! fakesink
GST-PERF INFO --> Timestamp: 0:07:19.108602798; Bps: 782; fps: 30.0
GST-PERF INFO --> Timestamp: 0:07:20.141189052; Bps: 782; fps: 30.3
GST-PERF INFO --> Timestamp: 0:07:21.174265435; Bps: 782; fps: 30.0
GST-PERF INFO --> Timestamp: 0:07:22.207318757; Bps: 782; fps: 30.0
GST-PERF INFO --> Timestamp: 0:07:23.240543516; Bps: 782; fps: 30.0
GST-PERF INFO --> Timestamp: 0:07:24.273697886; Bps: 782; fps: 30.0
GST-PERF INFO --> Timestamp: 0:07:25.306822764; Bps: 782; fps: 30.0
GST-PERF INFO --> Timestamp: 0:07:26.340117514; Bps: 782; fps: 30.0
GST-PERF INFO --> Timestamp: 0:07:27.373087284; Bps: 782; fps: 30.3
GST-PERF INFO --> Timestamp: 0:07:28.406069581; Bps: 782; fps: 30.3
GST-PERF INFO --> Timestamp: 0:07:29.439238457; Bps: 782; fps: 30.0
GST-PERF INFO --> Timestamp: 0:07:30.472398102; Bps: 782; fps: 30.0
GST-PERF INFO --> Timestamp: 0:07:31.472948042; Bps: 808; fps: 30.0
The results show a constant framerate of 30 fps when using nvarguscamerasrc, which passes the frames through the ISP to convert from Bayer to YUV.
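The perf element is not part of stock GStreamer; it comes from RidgeRun's open-source gst-perf plugin. One way to obtain it, assuming the autotools build steps from that project's repository, is:

```shell
# Build and install the gst-perf plugin from source (autotools steps assumed)
git clone https://github.com/RidgeRun/gst-perf.git
cd gst-perf
./autogen.sh
make
sudo make install
```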
Latency measurement
RidgeRun Resources
Contact Us