RidgeRun Auto exposure/Auto white balance library for DM368 and DM365


Introduction

Video capture quality from a CMOS or CCD sensor can be enhanced with image processing such as auto white balance (AWB) and auto exposure (AE) algorithms:

  • Auto exposure automatically adjusts the image brightness according to the amount of light that reaches the camera sensor.
  • Auto white balance automatically compensates for color differences caused by the lighting so that white objects actually appear white.

Some camera sensors don't include auto white balance and/or auto exposure processing, so RidgeRun offers a library with AE and AWB algorithms called librraew. This library was initially developed for the DM365 platform. The DM365 video processing front end (VPFE) has an H3A module designed to support control loops for auto focus, auto white balance and auto exposure by collecting statistics about the imaging/video data. There are two blocks in this module:

  • Auto focus engine
  • Auto exposure and auto white balance engine

librraew uses only the auto exposure and auto white balance engine. This engine divides each frame into two-dimensional blocks of pixels referred to as windows and can provide the following image/video metrics:

  • Accumulation of clipped pixels along with all non-saturated pixels in each window per color.
  • Accumulation of the sum of squared pixels per color.
  • Minimum and maximum pixels values in each window per color.

The AE/AWB engine can be configured to use up to 36 horizontal windows when the output includes the sum plus either the sum of squares or the min/max values, or up to 56 horizontal windows when the output is the sum only. It can also be configured to use up to 128 vertical windows. The width and height of the windows are programmable.
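
To make the window-based statistics concrete, the following C sketch models the kind of per-window, per-color data such an engine exposes. The type and field names (aew_window_stats, sum, sum_sq, and so on) are hypothetical illustrations, not the actual DM365 H3A driver or librraew structures.

    /* Hypothetical per-window statistics, for illustration only; these are
     * not the real DM365 H3A driver or librraew data structures. */
    #include <stdint.h>

    #define MAX_H_WINDOWS 36    /* with sum + (sum of squares or min/max) output */
    #define MAX_V_WINDOWS 128

    enum aew_color { AEW_R = 0, AEW_GR, AEW_GB, AEW_B, AEW_NUM_COLORS };

    struct aew_window_stats {
        uint32_t sum[AEW_NUM_COLORS];     /* accumulated non-saturated pixel values */
        uint32_t sum_sq[AEW_NUM_COLORS];  /* accumulated squared pixel values */
        uint16_t min[AEW_NUM_COLORS];     /* minimum pixel value in the window */
        uint16_t max[AEW_NUM_COLORS];     /* maximum pixel value in the window */
        uint32_t saturated;               /* count of clipped (saturated) pixels */
    };

    /* Mean value of one color channel over the whole frame, computed from the
     * window sums; pixels_per_window is the number of pixels of that color
     * accumulated in each window. */
    static double aew_channel_mean(const struct aew_window_stats *win,
                                   int n_windows, uint32_t pixels_per_window,
                                   enum aew_color c)
    {
        double total = 0.0;
        for (int i = 0; i < n_windows; i++)
            total += win[i].sum[c];
        return total / ((double)n_windows * pixels_per_window);
    }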

Currently, librraew testing has focused on the Aptina mt9p031 CMOS sensor, but if you provide the appropriate sensor-specific functions to the library, it can work with any sensor. The implementation is a plain C library that can be reused and integrated with any application. RidgeRun uses ipiped (see below) for testing and demonstration.
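
As a rough idea of what "appropriate functions" means, the sketch below shows the kind of sensor-specific hooks an AE/AWB library typically needs. The struct and function-pointer names are hypothetical and do not represent librraew's actual interface.

    /* Hypothetical sensor adapter, for illustration only; this is not the
     * real librraew sensor interface. */
    #include <stdint.h>

    struct sensor_ops {
        /* Program the gain applied to each color channel. */
        int (*set_gain)(void *priv, double r_gain, double g_gain, double b_gain);
        /* Program the integration (exposure) time, in microseconds. */
        int (*set_exposure_time)(void *priv, uint32_t exposure_us);
        /* Report the gain and exposure-time ranges the sensor supports. */
        int (*get_limits)(void *priv, double *max_gain, uint32_t *max_exposure_us);
        void *priv;  /* sensor driver handle, e.g. an mt9p031 context */
    };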

Algorithms

Auto white balance

When an image of a scene is captured by a digital camera sensor, the sensor response at each pixel depends on the scene illumination. Depending on the illumination, a distinct color cast appears over the captured scene. This effect is caused by the color temperature of the light. If a white object is illuminated with a low color temperature light source, the object in the captured image will be reddish. Similarly, when the white object is illuminated with a high color temperature light source, the object in the captured image will be bluish. The human eye compensates for color cast automatically through a characteristic known as color constancy, which allows perceived colors to be independent of the illumination. Auto white balance tries to simulate this color constancy for captured images.

Many AWB algorithms follow a two-stage process:

  • Illumination estimation: can be done explicitly, by choosing from a known set of possible illuminants, or implicitly, by making assumptions about the effect of the illumination on the image. The algorithms implemented in librraew use implicit estimation.
  • Image color correction: achieved through an independent gain adjustment of the three color signals. Commonly only the blue and red gains are adjusted (see the sketch after this list).
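
A classic implicit estimator is the gray-world assumption: the average color of the scene is assumed to be achromatic, so any deviation of the red and blue means from the green mean is attributed to the illuminant. The sketch below illustrates that idea only; it is not librraew's exact algorithm.

    /* Gray-world white balance sketch (illustrative, not librraew's algorithm).
     * Green is kept as the reference channel; only red and blue are scaled. */
    static void gray_world_gains(double r_mean, double g_mean, double b_mean,
                                 double *r_gain, double *b_gain)
    {
        *r_gain = g_mean / r_mean;   /* boost red when the scene looks bluish */
        *b_gain = g_mean / b_mean;   /* boost blue when the scene looks reddish */
    }

The channel means can be computed from window sums like those provided by the AE/AWB engine, and the resulting gains applied through the sensor or the ISP.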

Auto exposure

One of the main problems affecting image quality, leading to unpleasant pictures, is improper exposure to light. Exposure is the amount of light that reaches the image sensor, and it determines the lightness or darkness of the resulting image. If too much light strikes the image sensor, the image will be overexposed, washed out, and faded. If too little light reaches the sensor, the image will be underexposed: dark and lacking detail, especially in shadow areas. Auto exposure (AE) algorithms adjust the captured image in an attempt to reproduce the most important regions (according to contextual or perceptive criteria) with a level of brightness roughly in the middle of the possible range.

Auto exposure algorithms involve three processes:

  • Light metering: generally accomplished using the camera sensor itself or an external device as the exposure detector.
  • Scene analysis: brightness metering methods estimate the scene illumination from the image metrics; this estimate is then used to calculate the brightness adjustment needed to obtain the best exposure.
  • Image brightness correction: to ensure that the correct amount of light reaches the image sensor, the illumination and shutter time parameters are adjusted. On a CMOS image sensor this parameter is often called the exposure time, defined as the amount of time the sensor integrates light; in other words, it determines how long the sensor photodiode array is exposed to light. A simplified correction step is sketched after this list.
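
The following sketch shows one simple way the correction step can work, assuming brightness scales roughly linearly with integration time. It is an illustration only, not librraew's actual exposure control loop; the target value and function names are assumptions.

    /* Simplified auto exposure step (illustrative only). The frame's mean
     * brightness is compared against a mid-range target and the exposure
     * time is rescaled so the next frame moves toward that target. */
    #include <stdint.h>

    #define AE_TARGET 128.0   /* middle of an 8-bit brightness range */

    static uint32_t ae_step(double mean_brightness, uint32_t exposure_us,
                            uint32_t min_exposure_us, uint32_t max_exposure_us)
    {
        double next = exposure_us * (AE_TARGET / mean_brightness);

        if (next < min_exposure_us) next = min_exposure_us;
        if (next > max_exposure_us) next = max_exposure_us;
        return (uint32_t)next;
    }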

Documentation
