Librraew 1.0


Overview

Auto white balance

The librraew library supports several AWB algorithms:

Gray World
The Gray World algorithm is based on the assumption that, given an image with a sufficient amount of color variation, the average reflectance of the scene is achromatic (gray). To be achromatic, the means of the red, green, and blue channels in a given scene should be roughly equal. If one color dominates the scene, the results of the algorithm may not be satisfactory.
White Patch
The White Patch algorithm assumes that the maximum response in an image is caused by a perfect reflector, which represents the color of the illumination. White balancing attempts to equalize the maximum value of the three channels to produce a white patch. This algorithm should only be used on scenes where the captured image does not include saturated pixels.
White Patch 2
This variation of the White Patch algorithm aims to resolve the problem of saturated pixels. White Patch 2 uses an average of local maxima instead of the absolute maximum. The results can vary depending on the number of sample windows used to process the image. A rough sketch of how these gain computations differ is shown after this list.
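
As a rough illustration of how these AWB gains can be derived (this is a hedged sketch, not librraew source code; the function and variable names are hypothetical), consider the following C fragment:

#include <stddef.h>
#include <stdint.h>

/* Gray World: scale red and blue so their averages match the green average
 * (the scene average is assumed to be achromatic). */
static void gray_world_gains(const uint8_t *r, const uint8_t *g, const uint8_t *b,
                             size_t n, double *gain_r, double *gain_b)
{
    double sum_r = 0.0, sum_g = 0.0, sum_b = 0.0;
    for (size_t i = 0; i < n; i++) {
        sum_r += r[i];
        sum_g += g[i];
        sum_b += b[i];
    }
    *gain_r = (sum_r > 0.0) ? sum_g / sum_r : 1.0;
    *gain_b = (sum_b > 0.0) ? sum_g / sum_b : 1.0;
}

/* White Patch: scale each channel so its maximum reaches full scale.
 * Saturated pixels distort this estimate, which is why White Patch 2
 * averages local maxima over several sample windows instead. */
static void white_patch_gains(const uint8_t *r, const uint8_t *g, const uint8_t *b,
                              size_t n, double *gain_r, double *gain_g, double *gain_b)
{
    uint8_t max_r = 0, max_g = 0, max_b = 0;
    for (size_t i = 0; i < n; i++) {
        if (r[i] > max_r) max_r = r[i];
        if (g[i] > max_g) max_g = g[i];
        if (b[i] > max_b) max_b = b[i];
    }
    *gain_r = (max_r > 0) ? 255.0 / max_r : 1.0;
    *gain_g = (max_g > 0) ? 255.0 / max_g : 1.0;
    *gain_b = (max_b > 0) ? 255.0 / max_b : 1.0;
}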

Auto exposure

Librraew includes one AE algorithm that can use five brightness metering methods. The algorithm follows an electronic-centric approach based on the mid-tone idea: it uses metrics obtained from the image captured by the sensor as its light metering, and calculates the exposure time required to reach an optimal image brightness using an expression that relates the actual scene brightness, the actual exposure time, and the defined optimal image brightness. The optimal image brightness is defined as the mid-tone value.
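
A minimal sketch of this exposure update, assuming a simple proportional relation between measured brightness and exposure time (hypothetical names, not librraew's actual implementation):

/* New exposure time from the mid-tone relation:
 * new_exposure = current_exposure * (target_brightness / measured_brightness). */
static unsigned int compute_exposure_time(unsigned int current_exposure_us,
                                          double measured_brightness,
                                          double target_brightness)
{
    if (measured_brightness <= 0.0)
        return current_exposure_us;   /* nothing metered, keep the current value */
    return (unsigned int)(current_exposure_us *
                          (target_brightness / measured_brightness));
}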

All the metering systems define pixel brightness as the average of the red, green, and blue components. A brief description of each brightness metering method is presented below:

Partial
Averages the brightness of the pixels in a portion at the center of the frame; the rest of the frame is ignored. It is generally used when very bright or very dark areas at the edges of the frame would otherwise adversely influence the scene illumination.
Center weighted
Averages the light information coming from the entire frame, computing the brightness average of two regions: the pixels in a portion at the center and the pixels in the rest of the frame (background). The total brightness is calculated with emphasis placed on the center area: 75% of the total brightness is given by the center brightness and 25% by the background (a rough sketch is shown after this list). This method can be used when you want the whole scene to be well illuminated without being affected by small brightness variations at the edges. The subjects of the picture must be at the center of the image. However, if a backlight is present in the scene, the central part turns out darker than the rest of the scene and an unpleasantly underexposed foreground is produced.
Segmented
This method is designed for scenes that have a principal object in a backlit condition. It emphasizes the luminance of the main object according to the degree of backlighting, dividing the frame into six pieces and weighting them.
Average
Averages the light information coming from the entire frame without weighting any particular portion of the metered area. This method can be used on scenes that do not have a principal object and where an average illumination is desired. If the scene has high contrast, the algorithm may cause parts of the scene to be under- or overexposed.

The center area for all the metering systems is defined as a percentage of the image size and can be set through a librraew parameter.
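
The following C fragment sketches center-weighted metering under these definitions (hypothetical code and names, not librraew's implementation; it assumes packed 8-bit RGB data and a 75%/25% center/background weighting):

#include <stdint.h>

/* Pixel brightness is defined as the average of the R, G and B components. */
static double pixel_brightness(uint8_t r, uint8_t g, uint8_t b)
{
    return (r + g + b) / 3.0;
}

static double center_weighted_brightness(const uint8_t *rgb, int width, int height,
                                         int center_percentage /* librraew parameter */)
{
    int cw = width  * center_percentage / 100;   /* center window width  */
    int ch = height * center_percentage / 100;   /* center window height */
    int x0 = (width - cw) / 2, y0 = (height - ch) / 2;

    double center_sum = 0.0, back_sum = 0.0;
    long center_n = 0, back_n = 0;

    for (int y = 0; y < height; y++) {
        for (int x = 0; x < width; x++) {
            const uint8_t *p = rgb + 3 * (y * width + x);
            double bright = pixel_brightness(p[0], p[1], p[2]);
            if (x >= x0 && x < x0 + cw && y >= y0 && y < y0 + ch) {
                center_sum += bright;
                center_n++;
            } else {
                back_sum += bright;
                back_n++;
            }
        }
    }

    double center_avg = center_n ? center_sum / center_n : 0.0;
    double back_avg   = back_n   ? back_sum   / back_n   : 0.0;

    /* Emphasis on the center area: 75% center, 25% background. */
    return 0.75 * center_avg + 0.25 * back_avg;
}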

Implementation Details

Three applications are used to support the auto exposure and auto white balance (AEW) adjustments in the RidgeRun SDK:

  • Ipiped, a D-Bus server for controlling and configuring the camera sensor, the DM365 video processor, and the AEW library.
  • librraew, a library that includes auto white balance and auto exposure algorithms.
  • Ipipe-client, a D-Bus client that can be used to invoke any of the methods supported by Ipiped.

Basic example of use

This section shows a basic example of using ipiped and the librraew library.

JPEG capture of a 640x480 pixel image

Once your board has started, run the following commands:

ipipe-client run-config-script dm365_mt9p031_config
ipipe-client init-aew G EC S C 200000 30 50 640 480 50

The first command will chain the ipipe using an existing configuration script. The second one will start the auto white balance and auto exposure algorithms for an image size of 640x480 pixels and a minimum frame rate of 30 fps (see the Controlling librraew with ipipe section for more details).

Once you have ipiped and the AEW algorithms running, you can run any image/video capture GStreamer pipeline to test them.

As an example, the pipeline shown below was used; it captures 30 images from the camera module and encodes them to JPEG format in order to produce 30 different JPEG images.

gst-launch -e v4l2src always-copy=false num-buffers=30 chain-ipipe=false ! video/x-raw-yuv,format=\(fourcc\)UYVY, \
width=640, height=480 !  dmaienc_jpeg ! queue !  multifilesink location=image%0d.jpeg

Figures 1 and 2 show the difference between an image captured with and without the AEW algorithm, respectively. As can be seen, the AEW algorithm provides better image quality by controlling the brightness and the exposure time.

Figure 1. Image taken using the AEW algorithm.
Figure 2. Image taken without the AEW algorithm.

JPEG capture of a 1280x720 (720P) pixel image

Once your board has started, run the following commands:

ipipe-client run-config-script dm365_mt9p031_config
ipipe-client init-aew G EC S C 200000 30 50 1280 720 50

The first command will chain the ipipe using an existing configuration script. The second one will start the auto white balance and auto exposure algorithms for an image size of 1280x720 pixels and a minimum frame rate of 30 fps (see the Controlling librraew with ipipe section for more details).

Once you have ipiped and the AEW algorithms running, you can run any image/video capture GStreamer pipeline to test them.

As an example, the pipeline shown below was used; it captures 30 images from the camera module and encodes them to JPEG format in order to produce 30 different JPEG images.

gst-launch -e v4l2src always-copy=false num-buffers=30 chain-ipipe=false ! video/x-raw-yuv,format=\(fourcc\)UYVY, \
width=1280, height=720 !  dmaienc_jpeg ! queue !  multifilesink location=image%0d.jpeg

Figures 3 and 4 show the difference between an image captured with and without the AEW algorithm, respectively. As can be seen, the AEW algorithm provides better image quality by controlling the brightness and the exposure time.

Figure 3. 720P image taken using the AEW algorithm.
Figure 4. 720P image taken without the AEW algorithm.