Sony IMX327 Linux Driver
Revision as of 18:50, 6 July 2021
Sony IMX327 Features
The IMX327LQR-C is a diagonal 6.46 mm (Type 1/2.8) CMOS active pixel type solid-state image sensor with a square pixel array and 2.13 M effective pixels. This chip operates with analog 2.9 V, digital 1.2 V, and interface 1.8 V triple power supply, and has low power consumption. High sensitivity, low dark current and no smear are achieved through the adoption of R, G and B primary color mosaic filters. This chip features an electronic shutter with variable charge-integration time. (Applications: Surveillance cameras, FA cameras, Industrial cameras)
Supported Platforms
- NVIDIA Jetson Nano
Features Included in the Driver
- Nano
- TX2
Enabling the Driver
To use this driver, you must patch and compile the kernel source using JetPack:
- Follow the instructions in (Downloading sources) to get the kernel source code.
- Once you have the source code, apply the following patch to add the changes required for the AR1335 camera at the kernel and device tree (DTB) level.
4.2.2_ar1335.patch
- Follow the instructions in (Build Kernel) for building the kernel, and then flash the image.
Make sure to enable AR1335 driver support:
make menuconfig
-> Device Drivers -> Multimedia support -> NVIDIA overlay Encoders, decoders, sensors and other helper chips -> <*> AR1335 camera sensor support
Using the Driver
GStreamer Examples
Capture and Display
- 3840x2160@30fps GRBG10
gst-launch-1.0 nvarguscamerasrc sensor-id=0 ! 'video/x-raw(memory:NVMM), width=(int)3840, height=(int)2160, format=(string)NV12, framerate=(fraction)30/1' ! autovideosink
- 4208x3120@15fps GRBG10
gst-launch-1.0 nvarguscamerasrc sensor-id=0 sensor-mode=0 ! 'video/x-raw(memory:NVMM), width=4208, height=3120, format=NV12, framerate=15/1' ! nvoverlaysink
- 4208x3120@15fps GRBG8
gst-launch-1.0 v4l2src ! "video/x-bayer, format=grbg, width=4208, height=3120, framerate=15/1" ! capssetter caps="video/x-bayer, format=grbg, width=4224, height=3120" ! bayer2rgb ! videoconvert ! xvimagesink
- 4208x3120@15fps GREY8
gst-launch-1.0 v4l2src ! "video/x-raw,width=4208,height=3120,format=GRAY8" ! nvvidconv ! 'video/x-raw(memory:NVMM),format=I420' ! nvoverlaysink sync=0
- 2104x1560@29fps GREY8
gst-launch-1.0 v4l2src ! "video/x-raw,width=2104,height=1560,format=GRAY8" ! nvvidconv ! 'video/x-raw(memory:NVMM),format=I420' ! nvoverlaysink sync=0
- 512x512@86fps GREY8
gst-launch-1.0 v4l2src ! "video/x-raw,width=512,height=512,format=GRAY8" ! nvvidconv ! 'video/x-raw(memory:NVMM),format=I420' ! nvoverlaysink sync=0
Video Encoding
CAPS="video/x-raw(memory:NVMM), width=(int)3840, height=(int)2160, format=(string)NV12, framerate=(fraction)30/1"
gst-launch-1.0 nvarguscamerasrc sensor-id=0 num-buffers=500 ! "$CAPS" ! omxh264enc ! mpegtsmux ! filesink location=test.ts
The sensor will capture in the 3840x2160@30 mode and the pipeline will encode the video and save it into the test.ts file.
Performance
ARM Load
Tegrastats displays the following output when capturing with the sensor driver on the TX2 platform:
RAM 1263/7855MB (lfb 1501x4MB) CPU [0%@2035,off,off,0%@2035,0%@2035,0%@2035]
RAM 1263/7855MB (lfb 1501x4MB) CPU [23%@960,off,off,17%@960,16%@960,23%@960]
RAM 1263/7855MB (lfb 1500x4MB) CPU [17%@345,off,off,17%@345,18%@345,20%@345]
RAM 1263/7855MB (lfb 1500x4MB) CPU [20%@345,off,off,16%@345,18%@345,15%@345]
RAM 1263/7855MB (lfb 1500x4MB) CPU [19%@345,off,off,13%@345,15%@345,14%@345]
RAM 1263/7855MB (lfb 1500x4MB) CPU [20%@345,off,off,15%@345,12%@345,15%@345]
RAM 1263/7855MB (lfb 1500x4MB) CPU [19%@345,off,off,15%@345,15%@345,16%@345]
RAM 1263/7855MB (lfb 1500x4MB) CPU [20%@345,off,off,18%@345,18%@345,17%@345]
RAM 1263/7855MB (lfb 1500x4MB) CPU [16%@345,off,off,15%@345,27%@345,17%@345]
RAM 1263/7855MB (lfb 1500x4MB) CPU [19%@345,off,off,18%@345,17%@345,19%@345]
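To summarize a sample like the ones above, the per-core loads can be averaged with a small parser. The helper below is an illustrative sketch, not part of the driver or of tegrastats; it assumes each active core is reported as `NN%@freq` and idle cores as `off`:

```python
import re

def avg_cpu_load(tegrastats_line: str) -> float:
    """Average the active-core loads from one tegrastats sample.

    Cores reported as 'off' are skipped; each active core appears
    as 'NN%@freq' inside the CPU [...] bracket.
    """
    match = re.search(r"CPU \[([^\]]+)\]", tegrastats_line)
    if match is None:
        raise ValueError("no CPU section found")
    loads = [int(core.split("%@")[0])
             for core in match.group(1).split(",")
             if core != "off"]
    return sum(loads) / len(loads)

sample = "RAM 1263/7855MB (lfb 1500x4MB) CPU [20%@345,off,off,16%@345,18%@345,15%@345]"
print(avg_cpu_load(sample))  # 17.25
```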
Framerate
Using the following pipeline, we measured the framerate for a single capture with the perf element:
gst-launch-1.0 nvarguscamerasrc sensor-id=0 ! 'video/x-raw(memory:NVMM), width=(int)3840, height=(int)2160, format=(string)NV12, framerate=(fraction)30/1' ! perf ! fakesink
GST-PERF INFO --> Timestamp: 0:07:19.108602798; Bps: 782; fps: 30.0
GST-PERF INFO --> Timestamp: 0:07:20.141189052; Bps: 782; fps: 30.3
GST-PERF INFO --> Timestamp: 0:07:21.174265435; Bps: 782; fps: 30.0
GST-PERF INFO --> Timestamp: 0:07:22.207318757; Bps: 782; fps: 30.0
GST-PERF INFO --> Timestamp: 0:07:23.240543516; Bps: 782; fps: 30.0
GST-PERF INFO --> Timestamp: 0:07:24.273697886; Bps: 782; fps: 30.0
GST-PERF INFO --> Timestamp: 0:07:25.306822764; Bps: 782; fps: 30.0
GST-PERF INFO --> Timestamp: 0:07:26.340117514; Bps: 782; fps: 30.0
GST-PERF INFO --> Timestamp: 0:07:27.373087284; Bps: 782; fps: 30.3
GST-PERF INFO --> Timestamp: 0:07:28.406069581; Bps: 782; fps: 30.3
GST-PERF INFO --> Timestamp: 0:07:29.439238457; Bps: 782; fps: 30.0
GST-PERF INFO --> Timestamp: 0:07:30.472398102; Bps: 782; fps: 30.0
GST-PERF INFO --> Timestamp: 0:07:31.472948042; Bps: 808; fps: 30.0
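The fps samples in a perf log like the one above can be reduced to a single average for reporting. The parser below is a hypothetical sketch (not part of the gst-perf element) that assumes each sample contains a `fps: NN.N` token:

```python
import re

def mean_fps(log: str) -> float:
    """Extract every 'fps: NN.N' value from a gst-perf log and average them."""
    values = [float(v) for v in re.findall(r"fps: ([\d.]+)", log)]
    return sum(values) / len(values)

log = ("GST-PERF INFO --> Timestamp: 0:07:19.108602798; Bps: 782; fps: 30.0 "
       "GST-PERF INFO --> Timestamp: 0:07:20.141189052; Bps: 782; fps: 30.3 "
       "GST-PERF INFO --> Timestamp: 0:07:21.174265435; Bps: 782; fps: 30.0")
print(round(mean_fps(log), 2))  # 30.1
```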
The results show a constant framerate of 30 fps when using nvarguscamerasrc, which passes the frames through the ISP to convert them from Bayer to YUV.
Latency measurement