Librraew 1.0

The librraew library version 1.0 documentation is available for historical reasons. A newer version of the librraew library is available; version 1.0 is obsolete and no longer available.
= API documentation =
[https://www.ridgerun.com/developer/api/librraew/v1.0 librraew version 1.0 API documentation] is available online.
= Overview =


== Auto white balance ==
The [[RidgeRun Auto exposure/Auto white balance library | librraew]] library supports several AWB algorithms:
;Gray World
:The Gray World algorithm is based on the assumption that, given an image with a sufficient amount of color variation, the average reflectance of the scene is achromatic (gray). To be achromatic, the means of the red, green, and blue channels in a given scene should be roughly equal. If one color dominates the scene, the results of the algorithm may not be satisfactory.
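As a rough sketch of the idea (this is an illustration only, not librraew's actual implementation), the gray-world gains can be computed from the channel means. The function name <code>gray_world_gains</code> and the interleaved RGB buffer layout are assumptions made for this example:

<pre>
/* Gray World sketch (illustration only, not librraew code).
 * Computes red and blue gains so that their channel means match the
 * green mean, which is what the gray-world assumption requires. */
#include <stddef.h>

void gray_world_gains(const unsigned char *rgb, size_t num_pixels,
                      double *r_gain, double *b_gain)
{
    double r_sum = 0.0, g_sum = 0.0, b_sum = 0.0;
    size_t i;

    for (i = 0; i < num_pixels; i++) {
        r_sum += rgb[3 * i + 0];
        g_sum += rgb[3 * i + 1];
        b_sum += rgb[3 * i + 2];
    }

    /* If one color dominates the scene, these gains over-correct,
     * which is the limitation mentioned above. */
    *r_gain = (r_sum > 0.0) ? g_sum / r_sum : 1.0;
    *b_gain = (b_sum > 0.0) ? g_sum / b_sum : 1.0;
}
</pre>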


;White Patch
:The White Patch algorithm assumes that the maximum response in an image is caused by a perfect reflector that represents the color of the illumination. White balancing attempts to equalize the maximum values of the three channels to produce a white patch. This algorithm will only produce satisfactory results if the captured image does not include saturated pixels.


;White Patch 2
:This variation of the White Patch algorithm aims to resolve the problem of saturated pixels. White Patch 2 uses an average of local maxima instead of an absolute maximum. The results can change depending on the number of sample windows used to process the image.
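For illustration only (not librraew's implementation), the sketch below shows the local-maximum idea that distinguishes White Patch 2 from plain White Patch: the image is split into a number of sample windows, the per-window channel maxima are averaged, and the gains equalize red and blue toward green. The function name, the horizontal-stripe windowing, and the interleaved RGB layout are assumptions made for this example:

<pre>
/* White Patch 2 sketch (illustration only, not librraew code).
 * Instead of the single absolute maximum used by plain White Patch,
 * average the per-window channel maxima over num_windows horizontal
 * stripes, then equalize red and blue toward green. */
void white_patch2_gains(const unsigned char *rgb, int width, int height,
                        int num_windows, double *r_gain, double *b_gain)
{
    double r_avg = 0.0, g_avg = 0.0, b_avg = 0.0;
    int w, x, y;

    if (num_windows <= 0)
        num_windows = 1;

    for (w = 0; w < num_windows; w++) {
        int y_start = height * w / num_windows;
        int y_end = height * (w + 1) / num_windows;
        unsigned char r_max = 0, g_max = 0, b_max = 0;

        for (y = y_start; y < y_end; y++) {
            for (x = 0; x < width; x++) {
                const unsigned char *p = rgb + 3 * (y * width + x);
                if (p[0] > r_max) r_max = p[0];
                if (p[1] > g_max) g_max = p[1];
                if (p[2] > b_max) b_max = p[2];
            }
        }
        r_avg += r_max;
        g_avg += g_max;
        b_avg += b_max;
    }
    r_avg /= num_windows;
    g_avg /= num_windows;
    b_avg /= num_windows;

    /* The result depends on the number of sample windows, as noted above. */
    *r_gain = (r_avg > 0.0) ? g_avg / r_avg : 1.0;
    *b_gain = (b_avg > 0.0) ? g_avg / b_avg : 1.0;
}
</pre>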


== Auto exposure ==
Librraew includes one AE algorithm that can use five brightness metering methods. The algorithm takes an electronic-centric approach based on the mid-tone idea: it uses metrics obtained from the sensor-captured image as its light metering and calculates the exposure time required to reach an optimal image brightness, using an expression that relates the actual scene brightness, the actual exposure time, and the defined optimal image brightness. The optimal image brightness is defined as the mid-tone value.
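In rough terms, a mid-tone based update of this kind scales the exposure time by the ratio of the target brightness to the measured brightness. The sketch below illustrates that relation; it is not librraew's exact expression, and the function name and linear brightness-versus-exposure assumption are made up for this example:

<pre>
/* Mid-tone auto exposure sketch (illustration only, not librraew's
 * exact expression). Assuming brightness scales roughly linearly with
 * exposure time, scale the current exposure so the measured frame
 * brightness moves toward the defined optimal (mid-tone) value. */
double update_exposure_time(double current_exposure_us,
                            double measured_brightness,
                            double optimal_brightness)
{
    if (measured_brightness <= 0.0)
        return current_exposure_us;    /* avoid dividing by zero */

    return current_exposure_us * (optimal_brightness / measured_brightness);
}
</pre>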


All of the metering systems define pixel brightness as the average of the red, green, and blue components. The center area used by the different metering systems is defined as a percentage of the image size and can be set through a librraew parameter. Below is a brief description of the supported brightness metering methods; a code sketch illustrating partial metering follows the list:
;Partial
:Averages the brightness of the pixels in a portion at the center of the frame; the rest of the frame is ignored. It is generally used when very bright or very dark areas at the edges of the frame would otherwise degrade the metering of the scene illumination.
;Center weighted
:Averages the light information coming from the entire frame. It computes the brightness average of two regions: the pixels in a portion at the center and the pixels in the rest of the frame (the background). The total brightness is calculated with emphasis placed on the center area: 75% of the total brightness is given by the center brightness and 30% by the background. This algorithm can be used when you want the whole scene to be well illuminated and not affected by small brightness variations at the edges. The subjects of the picture must be at the center of the image; if backlighting is present in the scene, the central part ends up darker than the rest of the scene and the foreground is underexposed.
;Segmented
:This metering system is designed for scenes that have a principal object in a backlit condition. The frame is divided into five regions. The algorithm assumes that the background region is located in the upper part of the scene and that the main object is in the center of the scene. The metering system organizes the regions into two groups: the main object region and the background region. The luminance difference between the background and the main object region is called the degree of backlighting. According to the degree of backlighting, the metering system emphasizes the luminance of the main object to obtain the frame's total brightness.
;Average
:Averages the light information coming from the entire frame without weighting any particular portion of the metered area. This method can be used on scenes that do not have a principal object, when you want average illumination. If the scene has high contrast, the algorithm may cause under- or over-exposure of parts of the scene.
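As a rough illustration of the metering ideas above (not librraew's actual code), the sketch below implements partial metering: pixel brightness is the average of the three channels, and only a centered window whose size is a percentage of the image dimensions is averaged. The function name, the <code>center_percentage</code> parameter, and the interleaved RGB layout are assumptions made for this example:

<pre>
/* Partial metering sketch (illustration only, not librraew code).
 * Pixel brightness is the average of R, G and B; only a centered
 * window sized as a percentage of the image is metered. */
double partial_metering(const unsigned char *rgb, int width, int height,
                        int center_percentage)
{
    int win_w = width * center_percentage / 100;
    int win_h = height * center_percentage / 100;
    int x0 = (width - win_w) / 2;
    int y0 = (height - win_h) / 2;
    double sum = 0.0;
    int x, y;

    if (win_w <= 0 || win_h <= 0)
        return 0.0;

    for (y = y0; y < y0 + win_h; y++) {
        for (x = x0; x < x0 + win_w; x++) {
            const unsigned char *p = rgb + 3 * (y * width + x);
            sum += (p[0] + p[1] + p[2]) / 3.0;
        }
    }
    return sum / ((double)win_w * win_h);
}
</pre>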


= Using the librraew library =
 
== Using librraew with ipiped ==
Two applications are used to support the auto exposure and auto white balance (AEW) library in RidgeRun's SDK:
*ipiped, a D-Bus server for controlling and configuring the camera sensor, the dm365 video processor, and the librraew library.
*ipipe-client, a D-Bus client that can be used to invoke any of the methods supported by ipiped.
More information about these applications can be found in [https://github.com/RidgeRun/ipiped/wiki Image Pipe Daemon Documentation]. This library version is supported by ipiped 1.0.
 
== Using librraew with custom software ==
librraew is a plain C library and can be re-used and integrated with any custom application capable of making C function calls.  [https://www.ridgerun.com/developer/api/librraew/v1.0 librraew version 1.0 API documentation] is available online.


= Basic example of use =


This section shows a basic example of using ipiped and the librraew library.

== JPEG capture of a 640x480 image ==


In this example librraew is configured to use:
* The Gray World white balance algorithm working with the sensor gain.
* The auto exposure algorithm with the center weighted metering system.
* A center size of 50% of the image width and height.
* 200 ms between algorithm iterations.
* A minimum frame rate of 30 fps.
* 50% image segmentation.
* A 640x480 image.
 
This can be initialized with ipipe-client using the following command (see the [https://github.com/RidgeRun/ipiped/wiki/Image-Pipe-Daemon-1.0#wiki-Controlling_librraew_with_ipipe Controlling librraew with ipipe] section for more details):


<pre>
ipipe-client init-aew G EC S C 200000 30 50 640 480 50
</pre>


Once you have ipiped and the AEW algorithms running, you can run any image/video capture GStreamer pipeline to test them.
As an example, the pipeline shown below captures 30 images from the camera module and encodes them to JPEG format in order to get 30 different JPEG images.

<pre>
gst-launch -e v4l2src always-copy=false num-buffers=30 chain-ipipe=false ! video/x-raw-yuv,format=\(fourcc\)UYVY, \
width=640, height=480 !  dmaienc_jpeg ! queue !  multifilesink location=image%0d.jpeg
</pre>

Figures 1 and 2 show the difference between an image captured with and without the AEW algorithm, respectively. As you can see, the AEW algorithm provides better image quality by controlling the brightness and the exposure time.

Figure 1. Image taken using the AEW algorithm.
Figure 2. Image taken without the AEW algorithm.


== JPEG capture of a 1280x720 (720P) image ==
In this example librraew is configured to use:
* The Gray World white balance algorithm working with the sensor gain.
* The auto exposure algorithm with the center weighted metering system.
* A center size of 50% of the image width and height.
* 200 ms between algorithm iterations.
* A minimum frame rate of 30 fps.
* 50% image segmentation.
* A 1280x720 image.


This can be initialized with ipipe-client using the following command (see the [https://github.com/RidgeRun/ipiped/wiki/Image-Pipe-Daemon-1.0#wiki-Controlling_librraew_with_ipipe Controlling librraew with ipipe] section for more details):
 
<pre>
ipipe-client init-aew G EC S C 200000 30 50 1280 720 50
</pre>


Once you have ipiped and the AEW algorithms running, you can run any image/video capture GStreamer pipeline to test them.
As an example, the pipeline shown below captures 30 images from the camera module and encodes them to JPEG format in order to get 30 different JPEG images.

<pre>
gst-launch -e v4l2src always-copy=false num-buffers=30 chain-ipipe=false ! video/x-raw-yuv,format=\(fourcc\)UYVY, \
width=1280, height=720 !  dmaienc_jpeg ! queue !  multifilesink location=image%0d.jpeg
</pre>

Figures 3 and 4 show the difference between an image captured with and without the AEW algorithm, respectively. As you can see, the AEW algorithm provides better image quality by controlling the brightness and the exposure time.


[[File:Image_with_aew_720P.jpeg|300px|thumb|center| Figure 3. 720P image taken using the AEW algorithm.]] [[File:Image_without_aew_720P.jpeg|300px|thumb|center| Figure 4. 720P image taken without the AEW algorithm.]]
[[Category:Whitepaper]] [[Category:RidgeRunTechnology]]
