RidgeRun Auto exposure/Auto white balance library for DM368 and DM365
= Introduction =
<seo title="Auto Exposure | Auto White Balance Algorithm" titlemode="replace" keywords="GStreamer, Linux SDK, Linux BSP,  Embedded Linux, Device Drivers, Nvidia, Xilinx, TI, NXP, Freescale, Embedded Linux driver development, Linux Software development, Embedded Linux SDK, Embedded Linux Application development, GStreamer Multimedia Framework."  description="Read a comprehensive introduction to the auto exposure and auto white balance library for DM368 and DM365."></seo>
The DM365 and DM368 support the H3A hardware accelerator for [https://www.ridgerun.com/auto-exposure-auto-white-balance auto white balance (AWB) and auto exposure (AE)]. CMOS or CCD sensor video capture quality can be enhanced with AWB and AE image processing:
*Auto exposure performs automatic adjustments of the image brightness according to the amount of light that reaches the camera sensor.
*Auto white balance automatically compensates color differences based on lighting so white actually appears white.

Some camera sensors don't include auto white balance and/or auto exposure processing, so RidgeRun offers a library with AE and AWB algorithms called '''librraew'''. This library was initially developed for the DM365/DM368 (DM36x) platform. The DM36x video processing front end (VPFE) has an H3A engine designed to support control loops for auto focus, auto white balance and auto exposure by collecting statistics about the imaging/video data. There are two blocks in this module:
*Auto focus engine
*Auto exposure and auto white balance engine

__TOC__
{{ContactUs Button}}
== Auto Exposure and Auto White Balance Introduction  ==


The librraew library only uses the auto exposure and auto white balance hardware engine, which requires the video frames to be in the Bayer color space. The DM36x does not allow the H3A engine to be used when the color space is YCbCr, which is common if you are using NTSC/PAL composite video input.

The H3A engine divides each frame into two-dimensional blocks of pixels referred to as windows. The engine provides the following image/video metrics:
* Accumulation of clipped pixels along with all non-saturated pixels in each window on a per-color basis.
* Accumulation of the sum of squared pixels per color.
* Minimum and maximum pixel values in each window on a per-color basis.

The DM36x H3A engine can be configured to use up to 36 horizontal windows with sum + {sum of squares or min+max} output, or up to 56 horizontal windows with sum-only output. The H3A engine can also be configured to use up to 128 vertical windows. The width and height of the windows are programmable.
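
To illustrate how these per-window statistics are typically consumed, the sketch below averages per-window color sums into frame-level means. This is a minimal sketch: the structure and field names are illustrative stand-ins, not the actual DM36x H3A driver interface.

<pre>
/*
 * Illustrative only: average per-window color sums into frame-level means.
 * The structure below is a simplified stand-in for the statistics the H3A
 * engine produces; it is NOT the actual DM36x driver data layout.
 */
#include <stddef.h>
#include <stdint.h>

struct aew_window_stats {
    uint32_t sum_r;        /* accumulated red pixels in the window   */
    uint32_t sum_g;        /* accumulated green pixels in the window */
    uint32_t sum_b;        /* accumulated blue pixels in the window  */
    uint32_t pixel_count;  /* non-saturated pixels accumulated       */
};

/* Compute the frame-wide mean of each color channel from the window sums. */
static void aew_frame_means(const struct aew_window_stats *win, size_t n_windows,
                            double *mean_r, double *mean_g, double *mean_b)
{
    uint64_t r = 0, g = 0, b = 0, count = 0;
    size_t i;

    for (i = 0; i < n_windows; i++) {
        r += win[i].sum_r;
        g += win[i].sum_g;
        b += win[i].sum_b;
        count += win[i].pixel_count;
    }

    *mean_r = count ? (double)r / count : 0.0;
    *mean_g = count ? (double)g / count : 0.0;
    *mean_b = count ? (double)b / count : 0.0;
}
</pre>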


Currently, librraew testing has focused on the Aptina MT9P031 CMOS sensor, but if you provide the appropriate sensor-specific functions for the library, it can work with any sensor. The implementation is a plain C library that can be reused with and integrated into any application capable of making C function calls. RidgeRun uses ipiped (see below) for testing and demonstration.
= Algorithms =
== Auto white balance ==
When an image of a scene is captured by a digital camera sensor, the sensor response at each pixel depends on the scene illumination. Depending on the illumination, a distinct color cast appears over the captured scene. This effect appears in the captured image due to the color temperature of the light. If a white object is illuminated with a low color temperature light source, the object in the captured image will have a reddish tint. Similarly, when the white object is illuminated with a high color temperature light source, the object in the captured image will appear somewhat blue instead of pure white. The human eye compensates for color cast automatically through a characteristic known as color constancy, allowing perceived colors to be independent of the illumination. Auto white balance tries to simulate color constancy for captured images.

Many AWB algorithms follow a two-stage process:
* Illumination estimation: this can be done explicitly by choosing from a known set of possible illuminations, or implicitly with assumptions about the effect of such illuminations. The algorithms implemented in librraew use implicit estimation.
* Image color correction: this is achieved through an independent gain adjustment of the three color signals. Commonly only the blue and red gains are adjusted, assuming the green gain is fixed.
 
librraew supports several AWB algorithms (a gain-computation sketch follows the list):
;Gray World
:The Gray World algorithm is based on the assumption that, given an image with a sufficient amount of color variation, the average reflectance of the scene is achromatic (gray). To be achromatic, the means of the red, green, and blue channels in a given scene should be roughly equal. If one color dominates the scene, the results of the algorithm may not be satisfactory.
 
;White Patch
:The White Patch algorithm assumes that the maximum response in an image is caused by a perfect reflector, which represents the color of the illumination. White balancing attempts to equalize the maximum value of the three channels to produce a white patch. This algorithm should only be used on scenes where the captured image does not include saturated pixels.

;White Patch 2
:This variation of the White Patch algorithm aims to resolve the problem with saturated pixels. White Patch 2 uses an average of local maximums instead of an absolute maximum. The results can change depending on the number of sample windows used in processing the image.
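
The following sketch shows the gain computation that the Gray World and White Patch descriptions above imply, assuming the green channel is kept as the reference; the exact expressions used inside librraew may differ.

<pre>
/*
 * Illustrative only: red/blue white balance gains with green as reference.
 * These formulas follow the algorithm descriptions above; they are not
 * taken from the librraew sources.
 */

/* Gray World: equalize the channel means against the green mean. */
static void awb_gray_world_gains(double mean_r, double mean_g, double mean_b,
                                 double *gain_r, double *gain_b)
{
    *gain_r = (mean_r > 0.0) ? mean_g / mean_r : 1.0;
    *gain_b = (mean_b > 0.0) ? mean_g / mean_b : 1.0;
}

/* White Patch: equalize the channel maxima against the green maximum.
 * For White Patch 2, max_r/max_g/max_b would be averages of per-window
 * local maxima instead of absolute maxima. */
static void awb_white_patch_gains(double max_r, double max_g, double max_b,
                                  double *gain_r, double *gain_b)
{
    *gain_r = (max_r > 0.0) ? max_g / max_r : 1.0;
    *gain_b = (max_b > 0.0) ? max_g / max_b : 1.0;
}
</pre>

The resulting gains would then be applied through the sensor or ipipe digital gains (for example with the set-sensor-gain or set-digital-gain commands described later).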
 
== Auto exposure ==
One of the main problems affecting image quality, leading to disappointing pictures, comes from improper light exposure. The exposure is the amount of light that reaches the image sensor. Exposure determines the lightness or darkness of the resulting image. If too much light strikes the image sensor, the image will be overexposed, washed out, and faded. If too little light reaches the sensor, the image will be underexposed: dark and lacking in detail, especially in shadow areas. Auto exposure (AE) algorithms adjust the captured image in an attempt to reproduce the most important regions (according to contextual or perceptive criteria) with an average level of brightness, more or less in the middle of the possible range.

Auto exposure algorithms involve three processes:
* Light metering: generally accomplished using the camera sensor itself or an external device as the exposure detector.
* Scene analysis: a brightness metering method estimates the scene illumination according to image metrics. Using this value, the brightness adjustment needed to obtain the best exposure is calculated.
* Image brightness correction: ensures that the correct amount of light reaches the image sensor by adjusting the illumination and shutter time parameters. The CMOS image sensor parameter is often called the exposure time, defined as the amount of time that the sensor integrates light. In other words, it determines how long the sensor photodiode array is exposed to light.
 
The librraew library includes one AE algorithm that can use five brightness metering methods. The algorithm uses an electronic-centric approach based on the mid-tone idea: the metrics obtained from the captured image serve as the light metering, and the algorithm calculates the exposure time required to reach an optimal image brightness, using an expression that relates the current scene brightness, the current exposure time, and the defined optimal image brightness. The optimal image brightness is defined as the mid-tone value.
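
As a rough sketch of this mid-tone approach, assuming the image brightness scales approximately linearly with the integration time (the exact expression and any damping used by librraew are not shown here), one adjustment iteration could look like this:

<pre>
/*
 * Illustrative only: one iteration of a mid-tone based exposure update.
 * target_brightness is the desired mid-tone value (for example 128 for
 * 8-bit data); the clamping limits stand in for sensor constraints.
 */
static unsigned int ae_update_exposure(unsigned int exposure_us,
                                       double measured_brightness,
                                       double target_brightness,
                                       unsigned int min_exposure_us,
                                       unsigned int max_exposure_us)
{
    double new_exposure;

    if (measured_brightness <= 0.0)
        return max_exposure_us;     /* completely dark frame: open up fully */

    /* Brightness scales roughly linearly with integration time, so scale
     * the current exposure by the target-to-measured brightness ratio. */
    new_exposure = exposure_us * (target_brightness / measured_brightness);

    if (new_exposure < min_exposure_us)
        new_exposure = min_exposure_us;
    if (new_exposure > max_exposure_us)
        new_exposure = max_exposure_us;

    return (unsigned int)new_exposure;
}
</pre>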
 
All the metering methods define pixel brightness as the average of the red, green, and blue components. Below is a brief description of the brightness metering methods:
;Partial
:Averages the brightness of the pixels in a portion at the center of the frame; the rest of the frame is ignored. It is generally used when very bright or very dark areas on the edges of the frame would otherwise negatively influence the estimated scene illumination.
;Center weighted
:Averages the light information coming from the entire frame by computing the brightness average of two regions: the pixels in a portion at the center, and the pixels in the rest of the frame (background). The total brightness is calculated with emphasis placed on the center area: 75% of the total brightness is given by the center and the remainder by the background. This method can be used when you want the whole scene to be well illuminated and not affected by small brightness variations at the edges. The subject of the picture must be at the center of the image. However, if a backlight is present in the scene, the central part becomes darker than the rest of the scene and an unpleasantly underexposed foreground is produced.
;Segmented
:This method is designed for scenes that have a principal object in a backlighting condition. It emphasizes the luminance of the main object according to the degree of backlighting, dividing the frame into 6 regions and weighting them.
;Average
:Averages the light information coming from the entire frame without weighting any particular portion of the metered area. This method can be used on scenes that have no principal object and where an average illumination is desired. If the scene has high contrast, the algorithm causes under- or over-exposure of parts of the scene.
 
The center area for all the metering methods is defined as a percentage of the image size and can be set through a librraew parameter.
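
The sketch below shows how such a center-weighted measurement could be combined from per-window brightness values; the 75%/25% split and the function shape are illustrative, not the librraew implementation.

<pre>
/*
 * Illustrative only: center-weighted brightness metering over a set of
 * windows. Pixel brightness is the average of the R, G and B components,
 * and the center region is a configurable percentage of the image size.
 */
static double ae_center_weighted_brightness(const double *win_brightness,
                                            const int *win_in_center,
                                            int n_windows)
{
    double center = 0.0, background = 0.0;
    int n_center = 0, n_background = 0;
    int i;

    for (i = 0; i < n_windows; i++) {
        if (win_in_center[i]) {
            center += win_brightness[i];
            n_center++;
        } else {
            background += win_brightness[i];
            n_background++;
        }
    }

    if (n_center)
        center /= n_center;
    if (n_background)
        background /= n_background;

    /* Emphasize the center area when combining the two regions. */
    return 0.75 * center + 0.25 * background;
}
</pre>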
 
= Implementation Details =
Three applications are used to support the auto exposure and auto white balance (AEW) adjustments in RidgeRun's SDK:
*Ipiped, a D-Bus server for controlling and configuring the camera sensor, the DM365 video processor, and the AEW library.
*librraew, a library that includes auto white balance and auto exposure algorithms.
*Ipipe-client, a D-Bus client that can be used to invoke any of the methods supported by Ipiped.
 
== Running Ipiped ==
Ipiped must run in the background. If you are using RidgeRun's SDK and have enabled ipiped, it will be set up to start automatically when the system boots.
<pre>
ipiped &
</pre>


Ipiped registers with D-Bus and waits until ipipe-client requests to execute a method.


== Running Ipipe-client ==
Ipipe-client is a D-Bus client that uses commands to invoke methods of ipiped, so ipiped must be running in order to use ipipe-client. A command can require arguments depending on its functionality. Ipipe-client has two operation modes: you can execute a single command, or you can open an interactive console to execute a group of commands.


To execute a single command, you can use the following command line syntax:
<pre>
ipipe-client <command> <argument 1> ... <argument n>
</pre>


To get into the interactive console, run ipipe-client without any command. Then, to execute a command, you only need to type the command and the required arguments.
<pre>
ipipe-client
ipipe-client$ <command 1> <argument 1> ... <argument n>
ipipe-client$ <command 2> <argument 1> ... <argument n>
</pre>
To quit the interactive console, use quit or exit.


In order to know which commands are available, run:
<pre>
ipipe-client help
</pre>
or get into the interactive console and execute help.


This shows a description of each command, as follows:


<pre>
Command                 Description

help                    Displays the help text for all the possible commands or a specific command.
set-debug               Enable/Disable debug messages.
init-aew                Initialize AEW algorithms.
stop-aew                End AEW algorithm.
shell                   Execute a shell command (shell_cmd) using the interactive console.
ping                    Show if ipipe-daemon is alive.
quit                    Quit from the interactive console.
exit                    Exit from the interactive console.
get-video-processor     Show the video processor that is being used.
get-sensor              Show the sensor that is being used.
run-config-script       Execute a group of ipipe-client commands.
set-previewer-mode      Configure previewer on continuous or one-shot mode.
set-bayer-pattern       Sets R/Gr/Gb/B color pattern to the previewer.
set-digital-gain        Sets red (R), green (G) and blue (B) gains on the ipipe.
get-digital-gain        Returns the gain value for each color component (RGB).
set-luminance           Brightness (Br) and contrast (C) adjustment.
get-luminance           Returns the value of the brightness (Br) and contrast (C) adjustment.
flip-vertical           Flips the image vertically (on the sensor).
flip-horizontal         Flips the image horizontally (on the sensor).
set-exposure            Sets the effective shutter time of the sensor for the light integration.
get-exposure            Gets the exposure time of the sensor in us.
set-sensor-gain         Sets red (R), green (G) and blue (B) gains directly on the sensor.
get-sensor-gain         Gets the sensor red (R), green (G) and blue (B) gains.
</pre>


If you want more detailed information about a command, execute:
<pre>
ipipe-client help <command>
</pre>


== Controlling librraew with ipipe ==


Auto exposure and auto white balance adjustments can be started with the ipipe-client command init-aew. Init-aew requires some arguments to define the algorithms and other parameters. To see the required arguments you can request the command help, which shows the following:


<pre>
Command: init-aew
Syntax: init-aew <WB> <AE> <G> <EM> <T[us]> <fps> <seg> <width> <height>
Description: Initialize AEW algorithms                                 
Arguments:
        WB: white balance algorithm, the options are:
                G -for gray world algorithm
                W -for retinex algorithm
                W2 -for variant of retinex algorithm
                N -for none
        AE: auto exposure algorithm, the options are
                EC -for electronic centric
                N -for none
        G: gain type, the options are:
                S -to use the sensor gain
                D -to use the digital gain
        EM: exposure metering method, the options are:
                P -for partial metering that take into account the light
                information of a portion in the center and the rest of
                the frame is ignored. The size of the center depends
                on the parameter center_percentage
                C -for center weighted metering that take into account
                the light information coming from the entire frame with
                emphasis placed on the center area
                A -for average metering that take into account the light
                information from the entire frame without weighting
                SG -for segmented metering that divides the frame
                on 6 pieces and weighting them to avoid backlighting
        T: wait time in us, specifies the time between
                algorithm adjustments, max value=1s=1000000us
        fps: minimum frame rate
        seg: frame segmentation factor, each frame is segmented into
                regions, this factor represents the percentage of the
                maximum number of possible regions
        width: captured video/image horizontal size
        height: captured video/image vertical size
        center_percentage: defines the percentage of the image width
                and height to be used as the center size




</pre>
Also, you can stop the automatic adjustments with the stop-aew command.

Some of the init-aew arguments need to be explained in more detail:
*T: the time between iterations defines how fast the algorithm can adjust the scene parameters. If you don't need fast changes you can use a longer time to get lower CPU usage.
*seg: this factor is related to the amount of CPU usage and the precision of the automatic adjustments. If you use a high segmentation percentage you will have greater CPU usage, but you will get more precision in the adjustments.

= AWB/AE Limitations for H3A Engine =

There are some not-so-obvious limitations when using the H3A engine:
*AWB/AE correction is limited to the window sampling method listed above.
*The H3A engine cannot be used with the YCbCr color space, which includes NTSC/PAL composite video input.
*Auto exposure can affect the video frame rate with dark images. Set a maximum exposure limit to keep the frame rate from dropping below an acceptable value (see the sketch after this list).
*Only tested with Linux 2.6.32 and the RidgeRun MT9P031 V4L2 driver.
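
The frame-rate limitation follows from the frame period: the sensor cannot integrate light for longer than one frame time without lowering the frame rate. A minimal sketch of that bound, assuming exposure is specified in microseconds as in the init-aew arguments:

<pre>
/*
 * Illustrative only: maximum exposure time (in microseconds) that still
 * allows the requested minimum frame rate.
 */
static unsigned int max_exposure_for_fps(unsigned int min_fps)
{
    if (min_fps == 0)
        return 1000000;              /* no frame rate requested: cap at 1 s */
    return 1000000 / min_fps;        /* one frame period in microseconds    */
}
</pre>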


= Using librraew with other software rather than ipiped =
librraew is a plain C library and can be re-used and integrated with any custom application. Please contact RidgeRun for the documentation of the librraew API.
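
Since the librraew API documentation is distributed separately, the sketch below only shows the general shape of driving such a plain C control loop from an application; every rraew_* name in it is a hypothetical placeholder, not the actual librraew interface.

<pre>
/*
 * Hypothetical integration sketch: the rraew_* names are placeholders and
 * do NOT correspond to the actual librraew API. The stubs only illustrate
 * how a plain C AEW library is typically driven from a capture application.
 */
#include <unistd.h>

struct rraew_handle { int unused; };                      /* placeholder handle */

static struct rraew_handle *rraew_create(void)            /* placeholder setup  */
{
    static struct rraew_handle h;
    return &h;   /* real code would configure algorithms, metering, etc. */
}

static void rraew_run_iteration(struct rraew_handle *h)   /* placeholder step   */
{
    (void)h;     /* real code would read H3A statistics and update gains/exposure */
}

static void rraew_destroy(struct rraew_handle *h) { (void)h; }

int main(void)
{
    struct rraew_handle *aew = rraew_create();
    int i;

    for (i = 0; i < 100; i++) {       /* run a number of adjustment iterations */
        rraew_run_iteration(aew);
        usleep(200000);               /* wait time between adjustments (T)     */
    }

    rraew_destroy(aew);
    return 0;
}
</pre>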


= Using the demo version of librraew =


You can request a demo version of librraew at '''support@ridgerun.com''' in order to test the auto white balance and auto exposure algorithms before you decide to buy, and see whether the technology meets your needs. The demo library allows you to use all the features that come with the full version of librraew, but with the following limitations:


* The algorithm will darken the image periodically.
* After a while the algorithm will stop working and image capture will continue with the last values calculated by the library. In order to test the library again you will need to restart the algorithm.


= Basic example of use =


This section shows a basic example of using ipiped and the librraew library. Once your board has started, run the following commands:


<pre>
ipipe-client run-config-script dm365_mt9p031_config
ipipe-client init-aew G EC S C 200000 30 50 640 480 50
</pre>


The first command configures the ipipe chain using an existing script. The second one starts the auto white balance and auto exposure algorithms for an image size of 640x480 pixels and a minimum frame rate of 30 fps (see the [[#Controlling librraew with ipipe|Controlling librraew with ipipe]] section for more details).


Once you have ipiped and the AEW algorithms running, you can run any image/video capture GStreamer pipeline to test it.


As an example, the pipeline shown below captures 30 images from the camera module and encodes them to JPEG format in order to get 30 different JPEG images.


<pre>
gst-launch -e v4l2src always-copy=false num-buffers=30 chain-ipipe=false ! video/x-raw-yuv,format=\(fourcc\)UYVY, \
width=640, height=480 ! dmaienc_jpeg ! queue ! multifilesink location=image%0d.jpeg
</pre>


Figures 1 and 2 show the difference between a capture made with and without the AEW algorithm, respectively.


[[File:Image_with_aew.jpeg|300px|thumb|center| Figure 1. Image taken using the AEW algorithm.]] [[File:Image_without_aew.jpeg|300px|thumb|center| Figure 2. Image taken without the AEW algorithm.]]


= Documentation =

*[[Librraew 1.1]] (current version)
*[[Librraew 1.0]]

= Auto Exposure/Auto White Balance License =

RidgeRun auto-exposure/auto-white-balance library (C) Copyright 2010 - RidgeRun LLC.

== Evaluation and Development License ==

Subject to the terms and conditions of RidgeRun's SDK license, RidgeRun hereby grants to the customer a product-based, non-exclusive, non-transferable, non-sublicensable, limited, worldwide license to install and use, for internal purposes only, an unlimited number of copies of the source and object code versions.

== Distribution License ==

Subject to the terms and conditions of RidgeRun's SDK license, RidgeRun grants customers a non-exclusive, non-transferable, non-sublicensable, limited, worldwide license to distribute RidgeRun Software in object code format only (no source code) in one product model sold by the customer.

= How to Buy =

You can purchase the commercial version of '''librraew''' using our [http://www.ridgerun.com/#!online-store/pogiw/!/Auto-Exposure-Auto-White-Balance-DM36x-processor-only/p/59063248/category=16360695 Online Store] or you can post your purchasing inquiry through our [http://www.ridgerun.com/contact Contact Us] link.

= References =


# Battiato, G. Messina, and A. Castorina. Exposure correction for imaging devices: an overview. In Single-Sensor Imaging: Methods and Applications for Digital Cameras, chapter 12. Rastislav Lukac, October 2008.
# Lee J.S., Jung Y.Y, Kim B.S., and Ko S.J. An advanced video camera system with robust af, ae, and awb control. IEEE Transactions on Consumer Electronics, 47:694–699, August 2001
# Edmund Y. Lam. Combining gray world and retinex theory for automatic white balance in digital photography. Consumer Electronics, 2005. (ISCE 2005). Proceedings of the Ninth International Symposium on, pages 134–139, June 2005.
# Edmund Y. Lam and George S. K. Fung. Automatic white balancing in digital photography. In Single-Sensor Imaging: Methods and Applications for Digital Cameras. Taylor & Francis Group, LLC, 2009.
# Nitin Sampat, Shyam Venkataraman, Thomas Yeh, and Robert L. Kremens. System implications of implementing auto-exposure on consumer digital cameras. Proc. SPIE. Sensors, Cameras, and Applications for Digital Photography, 3650:100–107, March 1999






{{ContactUs}}

[[Category:Whitepaper]][[Category:RidgeRunTechnology]][[Category:DM36x]]
