i.MX8M Plus ISP Architecture: V4L2 Sensor Driver Porting Guide
Overview
The Image Signal Processor (ISP) receives an image from the camera sensor and converts it from raw Bayer to YUV so that it can be processed by the rest of the chip. The ISP also provides additional processing to improve image quality. Supported image quality processes include:
- HDR to retain image details in high-contrast scenes.
- Dewarp to correct the image geometry caused by lens distortion (e.g. fisheye lens).
- Image enhancements (e.g. AWB, Denoise, AE, etc).
This wiki explains the architecture of the i.MX8M Plus Image Signal Processor (ISP): the sensor driver, API functions, and calling process. It also describes how to port a V4L2 camera sensor driver to use the methods provided by the ISP. In the References Section you can find all the documentation that supports the content of this wiki.
Display, Imaging, and Camera Overview
The following diagram shows the high-level details of the peripherals that comprise the display, imaging, and camera interfaces on the i.MX8M Plus. These peripherals are interconnected through the chip to support several application-specific solutions, including those that combine vision with the ML/AI accelerator.
The chip supports two 4-lane MIPI CSI2 camera inputs and two instances of an ISP. Key features of each ISP include:
- Bayer de-mosaicing and filtering (including denoising, sharpening, and blurring).
- Defect pixel cluster correction.
- Color processor.
- Chromatic aberration correction.
- Denoise.
- Histogram.
- Lens shading correction.
- Wide Dynamic Range (local tone mapping).
- Color noise removal.
- Automatic white balance measurements (AWB).
- Exposure measurement for AE (AEC/AGC).
- Autofocus measurement (AF).
- High Dynamic Range (HDR).
A separate Dewarp engine is also supported, capable of reading from and writing to DRAM to adjust an image (for example, one from a fisheye lens) at resolutions up to the maximum supported by the ISP. One fisheye lens is supported at up to 4k or 12MP (4096x3072) at 30 fps.
ISP High-Level Block Diagram
The Image Signal Processing (ISP) core is a complete video and still picture input block. It contains image processing and color space conversion (RAW Bayer to YUV) functions. The integrated image processing unit supports simple CMOS sensors delivering RGB Bayer pattern without any integrated image processing and also image sensors with integrated YCbCr processing.
Input formats:
- YCbCr420
- YCbCr422
- RAW8
- RAW10
- RAW12
- RAW14
Output formats:
- YUYV 4:2:2
- NV12 4:2:0
- NV16 4:2:2
- RG10 10-bit Bayer
The supported resolutions and frame rates for a single ISP are shown in the table below:
| Resolution | Without HDR | 2-exposure line-interleaved HDR | 3-exposure line-interleaved HDR |
|---|---|---|---|
| 12MP (4096x3072) | 30 fps | 15 fps | 10 fps |
| 4k (3848x2168) | 45 fps | 22.5 fps | 15 fps |
| 1080p (1936x1198) | 120 fps | 60 fps | 40 fps |
The supported resolutions and frame rates when both ISPs are used simultaneously (capabilities per ISP) are shown in the table below:
| Resolution | Without HDR | 2-exposure line-interleaved HDR | 3-exposure line-interleaved HDR |
|---|---|---|---|
| 12MP (4096x3072) | - | - | - |
| 4k (3848x2168) | - | - | - |
| 1080p (1936x1198) | 80 fps | 40 fps | 27 fps |
Important Note: The video unit driver that comes by default with ISP sources 4.2.2.16.0, found at $ISP_SOURCES_TOP/vvcam/v4l2/video/, has some macros that define the maximum width and height parameters. According to the information in this NXP forum discussion about the maximum supported resolution, those parameters limit the resolution to 4k, so if a higher resolution is needed, the code (video.h file) has to be modified.
ISP Software Architecture
- ISI (Independent Sensor Interface) Layer: This layer contains the ISS (Image Sensor Specific) driver. The purpose of this driver is to define function pointers so that the ISP driver can use the functions defined by the API for different sensors independently, without altering the rest of the code. The driver itself is programmed using only the ISI Sensor API functions. The functions defined by the API are the ones that allow handling of the features provided by the ISP, such as auto exposure, auto white balance, black level correction, color space conversion, etc.
- VVCAM: This is the ISP kernel driver integration layer, which includes the MIPI driver, the I2C kernel driver, the camera sensor drivers, and the ISP driver. It can operate in two modes: V4L2 mode and native mode. In V4L2 mode, a device node for the sensor can be created for direct access. The function <custom_sensor>_priv_ioctl() is used in kernel space to receive the commands and parameters passed down by user space through ioctl and to call the corresponding functions in the sensor driver code. The VVCAM driver reuses the base code of a standard V4L2 sensor driver, but with some modifications: it is necessary to add a struct describing the VVCAM modes, besides including the functions for communication over ioctl. The application layer makes use of the API functions through the ISS driver to manage the ISP features from a user's point of view. The ISS driver communicates with the VVCAM driver via ioctl, so it can be thought of as a "communication bridge" between the ISI API and the kernel drivers. A conceptual sketch of this ioctl path is shown below.
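To make this ioctl path concrete, the minimal user-space sketch below opens the sensor's V4L2 sub-device node and issues one of the VVSENSORIOC commands that <sensor>_priv_ioctl() handles in the kernel. The sub-device node path and the use of VVSENSORIOC_G_CHIP_ID with a plain 32-bit argument are assumptions for illustration only; in practice this exchange is performed by the ISS driver through the ISI API and the definitions in vvsensor.h.

/* Conceptual sketch only: user space -> VVCAM sensor driver via ioctl.
 * The sub-device node and the argument layout are assumptions; the real
 * ISS drivers rely on the VVSENSORIOC_* definitions from vvsensor.h. */
#include <fcntl.h>
#include <stdint.h>
#include <stdio.h>
#include <sys/ioctl.h>
#include <unistd.h>
#include "vvsensor.h"   /* VVSENSORIOC_* command codes */

int main(void)
{
    uint32_t chip_id = 0;
    int fd = open("/dev/v4l-subdev1", O_RDWR);  /* assumed sensor sub-device node */

    if (fd < 0) {
        perror("open");
        return 1;
    }

    /* Dispatched to <sensor>_priv_ioctl() in the VVCAM sensor driver */
    if (ioctl(fd, VVSENSORIOC_G_CHIP_ID, &chip_id) == 0)
        printf("Sensor chip ID: 0x%x\n", chip_id);
    else
        perror("VVSENSORIOC_G_CHIP_ID");

    close(fd);
    return 0;
}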
Camera Sensor Driver in V4L2 Mode: Porting Guide
This section explains how to port a sensor driver in V4L2 mode. This guide assumes that the developer knows the V4L2 API and has worked with, or is familiar with, sensor drivers and their operation within the Linux kernel. Therefore, and to keep the explanation focused, this guide does not cover the development details of the sensor driver you want to port; it is assumed that you already have a working driver for your sensor before starting the port. Many technical details vary between sensors, and it is the developer's responsibility to use the datasheets provided by the manufacturer, or to contact the manufacturer directly for access to the driver code.
It should be noted that, at the time of writing this guide, the ISP sources version was 4.2.2.16.0 on Linux kernel 5.10.72. If a different version is used, it is the developer's responsibility to review the API documentation for that version, since there may be changes that affect what is indicated in this guide. Still, the general process can serve as a basis to adapt to your needs.
Important Note: Where <sensor> or <SENSOR> is indicated in the guide, replace it with the specific name of the sensor you are working with, respecting capitalization where applicable, for example imx219 or IMX219.
To port the camera sensor, the following steps must be taken, as described in the sections below:
- Define sensor attributes and create the sensor instance in CamDevice.
- ISS Driver and ISP Media Server.
- Sensor Calibration Files.
- VVCAM Driver Creation.
- Device Tree Modifications.
Define Sensor Attributes and Create Sensor Instance in CamDevice
According to the NXP documentation, the following three steps are already implemented in CamDevice and are included for reference only. Developers may not modify any code in CamDevice.
- Step 1: Define the sensor attributes in the IsiSensor_s data structure.
- Step 2: Define the IsiSensorInstanceConfig_t configuration structure that will be used to create a new sensor instance.
- Step 3: Call the IsiCreateSensorIss() function to create a new sensor instance.
ISS Driver and ISP Media Server
- Step 0 - Use a driver template as base code: The first step we recommend when porting your sensor is to reuse the base code of one of the drivers offered as part of the ISP sources. Creating the ISS driver code from scratch can be time-consuming, so starting from a base and making the respective changes is the most practical approach at this point. Example drivers can be found in $ISP_SOURCES_TOP/units/isi/drv/. For example, the 4.2.2.16.0 ISP sources come with the OV2775 and OS08a20 drivers. $ISP_SOURCES_TOP indicates the path of your working directory, where the respective sources are located.
- Step 1 - Add your <SENSOR> ISS Driver: Create the driver entry for your sensor in the path $ISP_SOURCES_TOP/units/isi/drv/<SENSOR>/source/<SENSOR>.c. For the code, you can take as a basis one of the drivers mentioned in the previous step. Change all occurrences of the respective sensor name within the code, for instance, OV2775 -> <SENSOR>, respecting capital letters where applicable.
- Step 2 - Check the information on the IsiCamDrvConfig_s data structure: According to the NXP documentation, data members defined in this data structure include the sensor ID (CameraDriverID) and the function pointer to the IsiSensor data structure. By using the address of the IsiCamDrvConfig_s structure, the driver can then access the sensor API attached to the function pointer. The following is an example of the structure:
/*****************************************************************************
 * Each sensor driver needs to declare this struct for ISI load
 *****************************************************************************/
IsiCamDrvConfig_t IsiCamDrvConfig = {
    .CameraDriverID       = 0x0000,
    .pIsiHalQuerySensor   = <SENSOR>_IsiHalQuerySensorIss,
    .pfIsiGetSensorIss    = <SENSOR>_IsiGetSensorIss,
};
Important Note: Modify the CameraDriverID according to the chip ID of your sensor. Apply this change to any chip ID occurrence within the code, as in the hypothetical example below.
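As a hypothetical illustration only (using the IMX219, whose datasheet reports model ID 0x0219), the entry from the previous example could be adjusted as follows; replace the value and the function names with the ones for your own sensor.

/* Hypothetical example for an IMX219 port: the model ID reported by the
 * sensor is 0x0219. Use the chip ID documented in your sensor's datasheet. */
#define IMX219_CHIP_ID 0x0219

IsiCamDrvConfig_t IsiCamDrvConfig = {
    .CameraDriverID     = IMX219_CHIP_ID,
    .pIsiHalQuerySensor = IMX219_IsiHalQuerySensorIss,
    .pfIsiGetSensorIss  = IMX219_IsiGetSensorIss,
};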
- Step 3 - Check sensor macro definitions: In case there is any macro definition in the ISS driver code that involves specific properties of the sensor, you should modify it according to your requirements. For example:
#define <SENSOR>_MIN_GAIN_STEP (1.0f/16.0f)
- Step 4 - Modify ISP Media Server build tools: Changes required in this step include:
  - Add a CMakeLists.txt file in $ISP_SOURCES_TOP/units/isi/drv/<SENSOR>/ that builds your sensor module.
  - Modify the CMakeLists.txt located at $ISP_SOURCES_TOP/units/isi/drv/CMakeLists.txt to include and reference your sensor directory.
  - Modify the $ISP_SOURCES_TOP/appshell/ and $ISP_SOURCES_TOP/mediacontrol/ build tools, since by default they refer to a particular sensor (for example, the OV2775), so the corresponding sensor name must be changed.
  - Modify the $ISP_SOURCES_TOP/build-all-isp.sh script to reference the sensor modules and generate the corresponding binaries when building the ISP Media Server instance.
- Step 5 - ISP Media Server run script: You need to add the operating modes defined for your sensor to the script. Each operating mode is associated with an index (mode 0, mode 1 ... mode N), a name used to execute the command in the terminal (e.g. <sensor>_custom_mode_1), a resolution, and a specific calibration file for the sensor. The script is located at $ISP_SOURCES_TOP/imx/run.sh.
- Step 6 - Sensor<X> config: At $ISP_SOURCES_TOP/units/isi/drv/ you can find the files that configure each sensor entry to the ISP, called Sensor0_Entry.cfg and Sensor1_Entry.cfg. There, the associated calibration files are indicated for each sensor operating mode, including the calibration files in XML format and the Dewarp unit configuration files in JSON format. In addition, the .drv file generated for your sensor is referenced, creating the association between the respective /dev/video<X> node and the sensor driver module output by the ISP Media Server. If you are using only one ISP channel, just modify Sensor0_Entry.cfg; if you require both ISP instances, you will need to modify both files.
Sensor Calibration Files
Using the ISP requires a calibration file in XML format, specific to the sensor you are using and to the resolution and working mode. To obtain the calibration files in XML format, there are three options:
- Use the NXP ISP tuning tool for this (you will need to ask for access).
- Pay NXP professional services to do the tuning.
- Pay a third-party vendor to do the tuning (you can get a list from NXP).
In the References Section, you can find documentation on the structure and format of these XML files. Once you have the corresponding calibration files, you will need to add them to the path $ISP_SOURCES_TOP/units/isi/drv/<SENSOR>/calib/<SENSOR>/, along with a CMakeLists.txt file. This allows the ISP Media Server to generate the full driver module (driver source code + calibration file).
VVCAM Driver Creation
The changes indicated below assume that a functional sensor driver already exists in its base form and that it is compatible with the V4L2 API. From here on, we focus on applying the changes suggested in the NXP documentation, specifically to establish the communication between the VVCAM driver (kernel side) and the ISI layer.
- Step 0 - Create the sensor driver entry: Developers must add the driver code at $ISP_SOURCES_TOP/vvcam/v4l2/sensor/<sensor>/<sensor>_xxxx.c, along with a Makefile for the sensor driver module. As indicated in the ISS Driver section, you can refer to one of the sample drivers included as part of the ISP sources to review details about the implementation of the driver and the structure of the required Makefile.
- Step 1 - Add the VVCAM mode info data structure array: This array stores all the supported mode information for your sensor. The ISI layer can get all the modes with the VVSENSORIOC_QUERY command. The following is a template of the structure; fill in the information using the attributes of your sensor and the modes it supports (an illustrative example follows the template).
#include "vvsensor.h" . . . static struct vvcam_mode_info_s <sensor>_mode_info[] = { { .index = 0, .width = ... , .height = ... , .hdr_mode = ... , .bit_width = ... , .data_compress.enable = ... , .bayer_pattern = ... , .ae_info = { . . . }, .mipi_info = { .mipi_lane = ... , }, }, { .index = 1, . . . }, }; . . .
- Step 2 - Define the sensor client-to-i2c macro: Define the client_to_<sensor> macro (in case you don't have one already) and check the segments of the driver code that require this macro.
#define client_to_<sensor>(client)\
    container_of(i2c_get_clientdata(client), struct <sensor>, subdev)
- Step 3 - Define the V4L2-subdev IOCTL function: Define and implement <sensor>_priv_ioctl, which is used to receive the commands and parameters passed down by user space through ioctl() and control the sensor.
long <sensor>_priv_ioctl(struct v4l2_subdev *subdev, unsigned int cmd, void *arg)
{
    struct i2c_client *client = v4l2_get_subdevdata(subdev);
    struct <sensor> *sensor = client_to_<sensor>(client);
    struct vvcam_sccb_data_s reg;
    uint32_t value = 0;
    long ret = 0;

    if (!sensor) {
        return -EINVAL;
    }

    switch (cmd) {
    case VVSENSORIOC_G_CLK: {
        ret = custom_implementation();
        break;
    }
    case VIDIOC_QUERYCAP: {
        ret = custom_implementation();
        break;
    }
    case VVSENSORIOC_QUERY: {
        ret = custom_implementation();
        break;
    }
    case VVSENSORIOC_G_CHIP_ID: {
        ret = custom_implementation();
        break;
    }
    case VVSENSORIOC_G_RESERVE_ID: {
        ret = custom_implementation();
        break;
    }
    case VVSENSORIOC_G_SENSOR_MODE: {
        ret = custom_implementation();
        break;
    }
    case VVSENSORIOC_S_SENSOR_MODE: {
        ret = custom_implementation();
        break;
    }
    case VVSENSORIOC_S_STREAM: {
        ret = custom_implementation();
        break;
    }
    case VVSENSORIOC_WRITE_REG: {
        ret = custom_implementation();
        break;
    }
    case VVSENSORIOC_READ_REG: {
        ret = custom_implementation();
        break;
    }
    case VVSENSORIOC_S_EXP: {
        ret = custom_implementation();
        break;
    }
    case VVSENSORIOC_S_POWER:
    case VVSENSORIOC_S_CLK:
    case VVSENSORIOC_RESET:
    case VVSENSORIOC_S_FPS:
    case VVSENSORIOC_G_FPS:
    case VVSENSORIOC_S_LONG_GAIN:
    case VVSENSORIOC_S_GAIN:
    case VVSENSORIOC_S_VSGAIN:
    case VVSENSORIOC_S_LONG_EXP:
    case VVSENSORIOC_S_VSEXP:
    case VVSENSORIOC_S_WB:
    case VVSENSORIOC_S_BLC:
    case VVSENSORIOC_G_EXPAND_CURVE:
        break;
    default:
        break;
    }

    return ret;
}
As you can see in the example, some cases are implemented and others are not. Developers are free to implement the features they consider necessary, as long as a minimum base of operation of the driver is guaranteed (query commands, register reads and writes, among others). It is the developer's responsibility to implement each custom function for each case or scenario that may arise when interacting with the sensor (a sketch of the register access cases is provided after the core ops example below). In addition to what was shown previously, a link must be created to make the ioctl connection with the driver in question. Link your priv_ioctl function in the v4l2_subdev_core_ops struct, as in the example below:
. . .
static const struct v4l2_subdev_core_ops <sensor>_core_ops = {
    .s_power           = v4l2_s_power,
    .subscribe_event   = v4l2_ctrl_subdev_subscribe_event,
    .unsubscribe_event = v4l2_event_subdev_unsubscribe,
    // IOCTL link
    .ioctl             = <sensor>_priv_ioctl,
};
. . .
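For reference, the sketch below shows one way the register access cases (VVSENSORIOC_READ_REG / VVSENSORIOC_WRITE_REG) could be implemented, assuming the driver keeps a regmap handle in its private structure (sensor->regmap) and that struct vvcam_sccb_data_s carries the usual register address/value pair (addr/data). The sample drivers shipped with the ISP sources may instead copy the argument from/to user space and use their own I2C helpers, so treat this only as a starting point.

/* Sketch only: register read/write through a regmap kept in the sensor's
 * private data. Assumes struct vvcam_sccb_data_s exposes addr/data members;
 * adapt the access method (regmap vs. raw I2C helpers) to your driver. */
#include <linux/regmap.h>

static long <sensor>_rw_reg(struct <sensor> *sensor, unsigned int cmd, void *arg)
{
    struct vvcam_sccb_data_s *sccb = arg;
    unsigned int val = 0;
    long ret;

    if (cmd == VVSENSORIOC_WRITE_REG)
        return regmap_write(sensor->regmap, sccb->addr, sccb->data);

    /* VVSENSORIOC_READ_REG */
    ret = regmap_read(sensor->regmap, sccb->addr, &val);
    if (!ret)
        sccb->data = val;
    return ret;
}

The corresponding cases of <sensor>_priv_ioctl() would then simply call this helper with cmd and arg instead of custom_implementation().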
- Step 4 - Verify your sensor's private data structure: After performing the suggested modifications, it is good practice to double-check your sensor's private data structure properties, in case one is missing, and also to check that the properties are initialized correctly in the driver's probe function.
- Step 5 - Modify VVCAM V4L2 sensor Makefile: At $ISP_SOURCES_TOP/vvcam/v4l2/sensor/Makefile, include your sensor object as follows:
. . .
obj-m += <sensor>/
. . .
- Important Note: There is a very common issue that appears when working with camera sensor drivers on i.MX8MP platforms. The kernel log shows a message similar to the following:
mxc-mipi-csi2.<X>: is_entity_link_setup, No remote pad found!
The link setup callback is required by the Media Controller when linking the media entities involved in the camera capture process. Normally, this callback is triggered by the imx8-media-dev driver included as part of the kernel sources. To make sure that the problem is not related to your sensor driver, verify that the link setup callback is already created in the code; if it is not, you can add the following template:
/* Function needed by i.MX8MP */
static int <sensor>_camera_link_setup(struct media_entity *entity,
                                      const struct media_pad *local,
                                      const struct media_pad *remote, u32 flags)
{
    /* Return always zero */
    return 0;
}

/* Add the link setup callback to the media entity operations struct */
static const struct media_entity_operations <sensor>_camera_subdev_media_ops = {
    .link_setup = <sensor>_camera_link_setup,
};

/* Verify the initialization process of the media entity ops in the sensor
 * driver's probe function */
static int <sensor>_probe(struct i2c_client *client, ...)
{
    . . .
    /* Initialize subdev */
    sd = &<sensor>->subdev;
    sd->dev = &client->dev;
    <sensor>->subdev.internal_ops = ...
    <sensor>->subdev.flags |= ...
    <sensor>->subdev.entity.function = ...
    /* Entity ops initialization */
    <sensor>->subdev.entity.ops = &<sensor>_camera_subdev_media_ops;
    . . .
}
In most cases, adding the link setup function will solve the media controller issue, or at least rule out problems on the driver side.
Device Tree Modifications
On the Device Tree side, it is necessary to enable the ISP channels that will be used. Likewise, it is necessary to disable the ISI channels, which are normally the ones that connect to the MIPI_CSI2 ports to extract raw data from the sensor (in case the ISP is not used). A MIPI_CSI2 port can be mapped to either an ISI channel or an ISP channel, but not both simultaneously. In this guide, we focus on using the ISP, so any other custom configuration that you want to implement may vary from what is shown. In the code below, ISP channel 0 is enabled, and the connection is made to the port where the sensor is connected (mipi_csi_0).
&mipi_csi_0 {
    status = "okay";

    port@0 {
        // Example endpoint to <sensor>_ep
        mipi0_sensor_ep: endpoint@1 {
            remote-endpoint = <&<sensor>_ep>;
        };
    };
};

&cameradev {
    status = "okay";
};

&isi_0 {
    status = "disabled";
};

&isi_1 {
    status = "disabled";
};

&isp_0 {
    status = "okay";
};

&isp_1 {
    status = "disabled";
};

&dewarp {
    status = "okay";
};
What is shown above does not represent a complete device tree file; it is only a general skeleton of the points you should pay attention to when working with ISP channels. For simplicity, we omitted all the attributes that are normally defined when working with camera sensor drivers and their respective configurations on the hardware's I2C port.
Important Note: Due to hardware restrictions when using ISP channels, it is recommended to use the isp_0 channel when working with only one sensor. If you need to use two sensors, you can enable both channels, taking into account the limitations on output resolutions and clock frequency when both channels work simultaneously. What is not recommended is to use the isp_1 channel when working with a single sensor.
References
Note: Some of the documents will require you to sign in on the NXP website before downloading them. If you don't have an account, you can create one on the same page.
- ISP Independent Sensor Interface (ISI) API Reference: i.MX 8M Plus Camera Sensor Porting User Guide, Chapter 3
- IOCTL: i.MX 8M Plus Camera Sensor Porting User Guide, Chapter 4
- VVCAM API Reference: i.MX 8M Plus Camera Sensor Porting User Guide, Chapter 5
- Camera Sensor Driver in V4L2 Mode: i.MX 8M Plus Camera Sensor Porting User Guide, Chapter 6
- Sensor Calibration Tool: Sensor Calibration Parameter Specifications for ISP Calibration Tool XML Generator
- Reference Manual: i.MX 8M Plus Applications Processor Reference Manual