Android Camera HAL3.2 Properties

Properties

colorCorrection
controls
Property Name Type Description Units Range Tags
android.colorCorrection.mode byte [public]
  • TRANSFORM_MATRIX

    Use the android.colorCorrection.transform matrix and android.colorCorrection.gains to do color conversion.

    All advanced white balance adjustments (not specified by our white balance pipeline) must be disabled.

    If AWB is enabled with android.control.awbMode != OFF, then TRANSFORM_MATRIX is ignored. The camera device will override this value to either FAST or HIGH_QUALITY.

  • FAST

    Must not slow down capture rate relative to sensor raw output.

    Advanced white balance adjustments above and beyond the specified white balance pipeline may be applied.

    If AWB is enabled with android.control.awbMode != OFF, then the camera device uses the last frame's AWB values (or defaults if AWB has never been run).

  • HIGH_QUALITY

    Capture rate (relative to sensor raw output) may be reduced by high-quality processing.

    Advanced white balance adjustments above and beyond the specified white balance pipeline may be applied.

    If AWB is enabled with android.control.awbMode != OFF, then the camera device uses the last frame's AWB values (or defaults if AWB has never been run).

The mode control selects how the image data is converted from the sensor's native color into linear sRGB color.

Details

When auto-white balance is enabled with android.control.awbMode, this control is overridden by the AWB routine. When AWB is disabled, the application controls how the color mapping is performed.

We define the expected processing pipeline below. For consistency across devices, this is always the case with TRANSFORM_MATRIX.

When either FAST or HIGH_QUALITY is used, the camera device may do additional processing but android.colorCorrection.gains and android.colorCorrection.transform will still be provided by the camera device (in the results) and be roughly correct.

Switching to TRANSFORM_MATRIX and using the data provided from FAST or HIGH_QUALITY will yield a picture with the same white point as what was produced by the camera device in the earlier frame.

The expected processing pipeline is as follows:

White balance processing pipeline

The white balance is encoded by two values, a 4-channel white-balance gain vector (applied in the Bayer domain), and a 3x3 color transform matrix (applied after demosaic).

The 4-channel white-balance gains are defined as:

android.colorCorrection.gains = [ R G_even G_odd B ]

where G_even is the gain for green pixels on even rows of the output, and G_odd is the gain for green pixels on the odd rows. These may be identical for a given camera device implementation; if the camera device does not support a separate gain for even/odd green channels, it will use the G_even value, and write G_odd equal to G_even in the output result metadata.

The matrices for color transforms are defined as a 9-entry vector:

android.colorCorrection.transform = [ I0 I1 I2 I3 I4 I5 I6 I7 I8 ]

which defines a transform from input sensor colors, P_in = [ r g b ], to output linear sRGB colors, P_out = [ r' g' b' ], as follows:

r' = I0r + I1g + I2b
g' = I3r + I4g + I5b
b' = I6r + I7g + I8b

Both the input and output value ranges must match. Overflow/underflow values are clipped to fit within the range.
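
To make the two stages concrete, the following is a minimal C++ sketch (illustrative helper code built from the definitions above, not part of any HAL API) that applies the 4-channel gains in the Bayer domain and then the 3x3 transform after demosaic, clipping results to the normalized range:

    #include <algorithm>

    // Stage 1: apply android.colorCorrection.gains in the Bayer domain.
    // 'gains' is ordered [R, G_even, G_odd, B]; 'row' selects G_even vs G_odd.
    static inline float applyBayerGain(float raw, int channel /*0=R,1=G,2=B*/,
                                       int row, const float gains[4]) {
        switch (channel) {
            case 0:  return raw * gains[0];                       // R
            case 1:  return raw * gains[(row % 2 == 0) ? 1 : 2];  // G_even / G_odd
            default: return raw * gains[3];                       // B
        }
    }

    // Stage 2: apply the row-major 3x3 android.colorCorrection.transform
    // [I0..I8] after demosaic; overflow/underflow is clipped to [0, 1].
    static void applyColorTransform(const float in[3], const float I[9],
                                    float out[3]) {
        for (int r = 0; r < 3; ++r) {
            float v = I[3*r] * in[0] + I[3*r + 1] * in[1] + I[3*r + 2] * in[2];
            out[r] = std::min(1.0f, std::max(0.0f, v));
        }
    }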

android.colorCorrection.transform rational x 3 x 3 [public]
3x3 rational matrix in row-major order

A color transform matrix to use to transform from sensor RGB color space to output linear sRGB color space

Details

This matrix is either set by the camera device when the request android.colorCorrection.mode is not TRANSFORM_MATRIX, or directly by the application in the request when the android.colorCorrection.mode is TRANSFORM_MATRIX.

In the latter case, the camera device may round the matrix to account for precision issues; the final rounded matrix should be reported back in this matrix result metadata. The transform should keep the magnitude of the output color values within [0, 1.0] (assuming input color values are within the normalized range [0, 1.0]), or clipping may occur.
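
Since the entries are rationals, device-side rounding can be pictured as quantizing each float coefficient to a fixed denominator. The sketch below uses camera_metadata_rational_t from the HAL metadata headers; the denominator of 1024 is an illustrative assumption, not a requirement of the spec:

    #include <system/camera_metadata.h>  // camera_metadata_rational_t

    // Sketch: quantize an application-supplied float matrix to rationals,
    // mimicking the precision rounding a device may apply before reporting
    // the final matrix back in the result metadata.
    static void quantizeTransform(const float in[9],
                                  camera_metadata_rational_t out[9]) {
        const int32_t kDenom = 1024;  // illustrative precision choice
        for (int i = 0; i < 9; ++i) {
            float scaled = in[i] * kDenom;
            out[i].numerator   = (int32_t)(scaled + (scaled >= 0 ? 0.5f : -0.5f));
            out[i].denominator = kDenom;
        }
    }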

android.colorCorrection.gains float x 4 [public]
A 1D array of floats for 4 color channel gains

Gains applying to Bayer raw color channels for white-balance.

Details

The 4-channel white-balance gains are defined in the order of [R G_even G_odd B], where G_even is the gain for green pixels on even rows of the output, and G_odd is the gain for green pixels on the odd rows. If a HAL does not support a separate gain for even/odd green channels, it should use the G_even value, and write G_odd equal to G_even in the output result metadata.

This array is either set by the camera device when the request android.colorCorrection.mode is not TRANSFORM_MATRIX, or directly by the application in the request when the android.colorCorrection.mode is TRANSFORM_MATRIX.

The output should be the gains actually applied by the camera device to the current frame.

dynamic
Property Name Type Description Units Range Tags
android.colorCorrection.mode byte [public]
  • TRANSFORM_MATRIX

    Use the android.colorCorrection.transform matrix and android.colorCorrection.gains to do color conversion.

    All advanced white balance adjustments (not specified by our white balance pipeline) must be disabled.

    If AWB is enabled with android.control.awbMode != OFF, then TRANSFORM_MATRIX is ignored. The camera device will override this value to either FAST or HIGH_QUALITY.

  • FAST

    Must not slow down capture rate relative to sensor raw output.

    Advanced white balance adjustments above and beyond the specified white balance pipeline may be applied.

    If AWB is enabled with android.control.awbMode != OFF, then the camera device uses the last frame's AWB values (or defaults if AWB has never been run).

  • HIGH_QUALITY

    Capture rate (relative to sensor raw output) may be reduced by high-quality processing.

    Advanced white balance adjustments above and beyond the specified white balance pipeline may be applied.

    If AWB is enabled with android.control.awbMode != OFF, then the camera device uses the last frame's AWB values (or defaults if AWB has never been run).

The mode control selects how the image data is converted from the sensor's native color into linear sRGB color.

Details

When auto-white balance is enabled with android.control.awbMode, this control is overridden by the AWB routine. When AWB is disabled, the application controls how the color mapping is performed.

We define the expected processing pipeline below. For consistency across devices, this is always the case with TRANSFORM_MATRIX.

When either FAST or HIGH_QUALITY is used, the camera device may do additional processing but android.colorCorrection.gains and android.colorCorrection.transform will still be provided by the camera device (in the results) and be roughly correct.

Switching to TRANSFORM_MATRIX and using the data provided from FAST or HIGH_QUALITY will yield a picture with the same white point as what was produced by the camera device in the earlier frame.

The expected processing pipeline is as follows:

White balance processing pipeline

The white balance is encoded by two values, a 4-channel white-balance gain vector (applied in the Bayer domain), and a 3x3 color transform matrix (applied after demosaic).

The 4-channel white-balance gains are defined as:

android.colorCorrection.gains = [ R G_even G_odd B ]

where G_even is the gain for green pixels on even rows of the output, and G_odd is the gain for green pixels on the odd rows. These may be identical for a given camera device implementation; if the camera device does not support a separate gain for even/odd green channels, it will use the G_even value, and write G_odd equal to G_even in the output result metadata.

The matrices for color transforms are defined as a 9-entry vector:

android.colorCorrection.transform = [ I0 I1 I2 I3 I4 I5 I6 I7 I8 ]

which defines a transform from input sensor colors, P_in = [ r g b ], to output linear sRGB colors, P_out = [ r' g' b' ], as follows:

r' = I0r + I1g + I2b
g' = I3r + I4g + I5b
b' = I6r + I7g + I8b

Both the input and output value ranges must match. Overflow/underflow values are clipped to fit within the range.

android.colorCorrection.transform rational x 3 x 3 [public]
3x3 rational matrix in row-major order

A color transform matrix to use to transform from sensor RGB color space to output linear sRGB color space

Details

This matrix is either set by the camera device when the request android.colorCorrection.mode is not TRANSFORM_MATRIX, or directly by the application in the request when the android.colorCorrection.mode is TRANSFORM_MATRIX.

In the latter case, the camera device may round the matrix to account for precision issues; the final rounded matrix should be reported back in this matrix result metadata. The transform should keep the magnitude of the output color values within [0, 1.0] (assuming input color values are within the normalized range [0, 1.0]), or clipping may occur.

android.colorCorrection.gains float x 4 [public]
A 1D array of floats for 4 color channel gains

Gains applying to Bayer raw color channels for white-balance.

Details

The 4-channel white-balance gains are defined in the order of [R G_even G_odd B], where G_even is the gain for green pixels on even rows of the output, and G_odd is the gain for green pixels on the odd rows. If a HAL does not support a separate gain for even/odd green channels, it should use the G_even value, and write G_odd equal to G_even in the output result metadata.

This array is either set by the camera device when the request android.colorCorrection.mode is not TRANSFORM_MATRIX, or directly by the application in the request when the android.colorCorrection.mode is TRANSFORM_MATRIX.

The output should be the gains actually applied by the camera device to the current frame.

control
controls
Property Name Type Description Units Range Tags
android.control.aeAntibandingMode byte [public]
  • OFF

    The camera device will not adjust exposure duration to avoid banding problems.

  • 50HZ

    The camera device will adjust exposure duration to avoid banding problems with 50Hz illumination sources.

  • 60HZ

    The camera device will adjust exposure duration to avoid banding problems with 60Hz illumination sources.

  • AUTO

    The camera device will automatically adapt its antibanding routine to the current illumination conditions. This is the default.

The desired setting for the camera device's auto-exposure algorithm's antibanding compensation.

android.control.aeAvailableAntibandingModes

Details

Some kinds of lighting fixtures, such as some fluorescent lights, flicker at the rate of the power supply frequency (60Hz or 50Hz, depending on country). While this is typically not noticeable to a person, it can be visible to a camera device. If a camera sets its exposure time to the wrong value, the flicker may become visible in the viewfinder or in a final captured image, as a set of variable-brightness bands across the image.

Therefore, the auto-exposure routines of camera devices include antibanding routines that ensure that the chosen exposure value will not cause such banding. The choice of exposure time depends on the rate of flicker, which the camera device can detect automatically, or the expected rate can be selected by the application using this control.
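
One common implementation strategy, shown here only as an illustrative sketch rather than anything mandated by this spec, is to snap the exposure time to a multiple of the flicker period. Lamps on a 50Hz (60Hz) supply typically vary in intensity at 100Hz (120Hz), twice the mains frequency, so banding-free exposure times are multiples of 10ms (~8.33ms):

    #include <cstdint>

    // Sketch: round an exposure time (ns) to the nearest nonzero multiple of
    // the illumination flicker period so that brightness variation integrates
    // out over the exposure.
    static int64_t antibandExposureNs(int64_t exposureNs, int mainsHz /*50 or 60*/) {
        const int64_t periodNs = 1000000000LL / (2 * mainsHz);  // 10ms or ~8.33ms
        int64_t multiples = (exposureNs + periodNs / 2) / periodNs;
        if (multiples < 1) multiples = 1;  // never round down to zero exposure
        return multiples * periodNs;
    }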

A given camera device may not support all of the possible options for the antibanding mode. The android.control.aeAvailableAntibandingModes key contains the available modes for a given camera device.

The default mode is AUTO, which must be supported by all camera devices.

If manual exposure control is enabled (by setting android.control.aeMode or android.control.mode to OFF), then this setting has no effect, and the application must ensure it selects exposure times that do not cause banding issues. The android.statistics.sceneFlicker key can assist the application in this.

HAL Implementation Details

For all capture request templates, this field must be set to AUTO. AUTO is the only mode that must be supported; OFF, 50HZ, and 60HZ are all optional.

If manual exposure control is enabled (by setting android.control.aeMode or android.control.mode to OFF), then the exposure values provided by the application must not be adjusted for antibanding.

android.control.aeExposureCompensation int32 [public]

Adjustment to AE target image brightness

count of positive/negative EV steps
Details

For example, if EV step is 0.333, '6' will mean an exposure compensation of +2 EV; -3 will mean an exposure compensation of -1 EV. Note that this control will only be effective if android.control.aeMode != OFF. This control will take effect even when android.control.aeLock == true.

If the exposure compensation value is changed, the camera device may take several frames to reach the newly requested exposure target. During that time, the android.control.aeState field will be in the SEARCHING state. Once the new exposure target is reached, android.control.aeState will change from SEARCHING to either CONVERGED, LOCKED (if AE lock is enabled), or FLASH_REQUIRED (if the scene is too dark for still capture).
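
The arithmetic behind this is straightforward; the sketch below (hypothetical helper names) combines the integer step count with the rational android.control.aeCompensationStep, with each +1 EV doubling the AE brightness target:

    #include <cmath>
    #include <cstdint>

    // Sketch: EV = steps * aeCompensationStep. With a 1/3 EV step, a setting
    // of 6 gives +2 EV and -3 gives -1 EV.
    static double compensationEv(int32_t steps, int32_t stepNum, int32_t stepDenom) {
        return steps * (double)stepNum / stepDenom;
    }

    // Each +1 EV doubles the AE target brightness.
    static double aeTargetScale(double ev) { return std::exp2(ev); }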

android.control.aeLock byte [public as boolean]
  • OFF

    Autoexposure lock is disabled; the AE algorithm is free to update its parameters.

  • ON

    Autoexposure lock is enabled; the AE algorithm must not update the exposure and sensitivity parameters while the lock is active. android.control.aeExposureCompensation setting changes will still take effect while auto-exposure is locked.

Whether AE is currently locked to its latest calculated values.

Details

Note that even when AE is locked, the flash may be fired if the android.control.aeMode is ON_AUTO_FLASH / ON_ALWAYS_FLASH / ON_AUTO_FLASH_REDEYE.

When android.control.aeExposureCompensation is changed, even if the AE lock is ON, the camera device will still adjust its exposure value.

If AE precapture is triggered (see android.control.aePrecaptureTrigger) when AE is already locked, the camera device will not change the exposure time (android.sensor.exposureTime) and sensitivity (android.sensor.sensitivity) parameters. The flash may be fired if the android.control.aeMode is ON_AUTO_FLASH/ON_AUTO_FLASH_REDEYE and the scene is too dark. If the android.control.aeMode is ON_ALWAYS_FLASH, the scene may become overexposed.

See android.control.aeState for AE lock related state transition details.

android.control.aeMode byte [public]

The desired mode for the camera device's auto-exposure routine.

android.control.aeAvailableModes

Details

This control is only effective if android.control.mode is AUTO.

When set to any of the ON modes, the camera device's auto-exposure routine is enabled, overriding the application's selected exposure time, sensor sensitivity, and frame duration (android.sensor.exposureTime, android.sensor.sensitivity, and android.sensor.frameDuration). If one of the FLASH modes is selected, the camera device's flash unit controls are also overridden.

The FLASH modes are only available if the camera device has a flash unit (android.flash.info.available is true).

If flash TORCH mode is desired, this field must be set to ON or OFF, and android.flash.mode set to TORCH.

When set to any of the ON modes, the values chosen by the camera device auto-exposure routine for the overridden fields for a given capture will be available in its CaptureResult.

android.control.aeRegions int32 x 5 x area_count [public as meteringRectangle]

List of areas to use for metering.

area_count <= android.control.maxRegions[0]

Details

The coordinate system is based on the active pixel array, with (0,0) being the top-left pixel in the active pixel array, and (android.sensor.info.activeArraySize.width - 1, android.sensor.info.activeArraySize.height - 1) being the bottom-right pixel in the active pixel array. The weight should be nonnegative.

If all regions have 0 weight, then no specific metering area needs to be used by the camera device. If the metering region is outside the used android.scaler.cropRegion returned in capture result metadata, the camera device will ignore the sections outside the region and output the used sections in the result metadata.

HAL Implementation Details

The HAL level representation of MeteringRectangle[] is an int[5 * area_count]. Every five elements represent a metering region of (xmin, ymin, xmax, ymax, weight). The rectangle is defined to be inclusive on xmin and ymin, but exclusive on xmax and ymax.
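
A sketch of unpacking that layout (struct and function names hypothetical):

    #include <cstddef>
    #include <cstdint>
    #include <vector>

    // Mirrors the HAL layout above: five ints per region, inclusive on
    // xmin/ymin and exclusive on xmax/ymax.
    struct MeteringRegion {
        int32_t xmin, ymin, xmax, ymax, weight;
    };

    static std::vector<MeteringRegion> unpackRegions(const int32_t* data,
                                                     size_t count) {
        std::vector<MeteringRegion> out;
        out.reserve(count / 5);
        for (size_t i = 0; i + 5 <= count; i += 5) {
            out.push_back({data[i], data[i + 1], data[i + 2],
                           data[i + 3], data[i + 4]});
        }
        return out;
    }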

android.control.aeTargetFpsRange int32 x 2 [public as rangeInt]

Range over which fps can be adjusted to maintain exposure

android.control.aeAvailableTargetFpsRanges

Details

Only constrains AE algorithm, not manual control of android.sensor.exposureTime

android.control.aePrecaptureTrigger byte [public]
  • IDLE

    The trigger is idle.

  • START

    The precapture metering sequence will be started by the camera device. The exact effect of the precapture trigger depends on the current AE mode and state.

Whether the camera device will trigger a precapture metering sequence when it processes this request.

Details

This entry is normally set to IDLE, or is not included at all in the request settings. When included and set to START, the camera device will trigger the autoexposure precapture metering sequence.

The effect of AE precapture trigger depends on the current AE mode and state; see android.control.aeState for AE precapture state transition details.

android.control.afMode byte [public]
  • OFF

    The auto-focus routine does not control the lens; android.lens.focusDistance is controlled by the application

  • AUTO

    If lens is not fixed focus.

    Use android.lens.info.minimumFocusDistance to determine if lens is fixed-focus. In this mode, the lens does not move unless the autofocus trigger action is called. When that trigger is activated, AF must transition to ACTIVE_SCAN, then to the outcome of the scan (FOCUSED or NOT_FOCUSED).

    Triggering AF_CANCEL resets the lens position to default, and sets the AF state to INACTIVE.

  • MACRO

    In this mode, the lens does not move unless the autofocus trigger action is called.

    When that trigger is activated, AF must transition to ACTIVE_SCAN, then to the outcome of the scan (FOCUSED or NOT_FOCUSED). Triggering cancel AF resets the lens position to default, and sets the AF state to INACTIVE.

  • CONTINUOUS_VIDEO

    In this mode, the AF algorithm modifies the lens position continually to attempt to provide a constantly-in-focus image stream.

    The focusing behavior should be suitable for good quality video recording; typically this means slower focus movement and no overshoots. When the AF trigger is not involved, the AF algorithm should start in INACTIVE state, and then transition into PASSIVE_SCAN and PASSIVE_FOCUSED states as appropriate. When the AF trigger is activated, the algorithm should immediately transition into AF_FOCUSED or AF_NOT_FOCUSED as appropriate, and lock the lens position until a cancel AF trigger is received.

    Once cancel is received, the algorithm should transition back to INACTIVE and resume passive scan. Note that this behavior is not identical to CONTINUOUS_PICTURE, since an ongoing PASSIVE_SCAN must immediately be canceled.

  • CONTINUOUS_PICTURE

    In this mode, the AF algorithm modifies the lens position continually to attempt to provide a constantly-in-focus image stream.

    The focusing behavior should be suitable for still image capture; typically this means focusing as fast as possible. When the AF trigger is not involved, the AF algorithm should start in INACTIVE state, and then transition into PASSIVE_SCAN and PASSIVE_FOCUSED states as appropriate as it attempts to maintain focus. When the AF trigger is activated, the algorithm should finish its PASSIVE_SCAN if active, and then transition into AF_FOCUSED or AF_NOT_FOCUSED as appropriate, and lock the lens position until a cancel AF trigger is received.

    When the AF cancel trigger is activated, the algorithm should transition back to INACTIVE and then act as if it has just been started.

  • EDOF

    Extended depth of field (digital focus). AF trigger is ignored, AF state should always be INACTIVE.

Whether AF is currently enabled, and what mode it is set to

android.control.afAvailableModes

Details

Only effective if android.control.mode = AUTO and the lens is not fixed focus (i.e. android.lens.info.minimumFocusDistance > 0).

If the lens is controlled by the camera device auto-focus algorithm, the camera device will report the current AF status in android.control.afState in result metadata.

android.control.afRegions int32 x 5 x area_count [public as meteringRectangle]

List of areas to use for focus estimation.

area_count <= android.control.maxRegions[2]

Details

The coordinate system is based on the active pixel array, with (0,0) being the top-left pixel in the active pixel array, and (android.sensor.info.activeArraySize.width - 1, android.sensor.info.activeArraySize.height - 1) being the bottom-right pixel in the active pixel array. The weight should be nonnegative.

If all regions have 0 weight, then no specific metering area needs to be used by the camera device. If the metering region is outside the used android.scaler.cropRegion returned in capture result metadata, the camera device will ignore the sections outside the region and output the used sections in the result metadata.

HAL Implementation Details

The HAL level representation of MeteringRectangle[] is an int[5 * area_count]. Every five elements represent a metering region of (xmin, ymin, xmax, ymax, weight). The rectangle is defined to be inclusive on xmin and ymin, but exclusive on xmax and ymax.

android.control.afTrigger byte [public]
  • IDLE

    The trigger is idle.

  • START

    Autofocus will trigger now.

  • CANCEL

    Autofocus will return to its initial state, and cancel any currently active trigger.

Whether the camera device will trigger autofocus for this request.

Details

This entry is normally set to IDLE, or is not included at all in the request settings.

When included and set to START, the camera device will trigger the autofocus algorithm. If autofocus is disabled, this trigger has no effect.

When set to CANCEL, the camera device will cancel any active trigger, and return to its initial AF state.

See android.control.afState for what that means for each AF mode.

android.control.awbLock byte [public as boolean]
  • OFF

    Auto-whitebalance lock is disabled; the AWB algorithm is free to update its parameters if in AUTO mode.

  • ON

    Auto-whitebalance lock is enabled; the AWB algorithm must not update its parameters while the lock is active.

Whether AWB is currently locked to its latest calculated values.

Details

Note that AWB lock is only meaningful for AUTO mode; in other modes, AWB is already fixed to a specific setting.

android.control.awbMode byte [public]
  • OFF

    The camera device's auto white balance routine is disabled; the application-selected color transform matrix (android.colorCorrection.transform) and gains (android.colorCorrection.gains) are used by the camera device for manual white balance control.

  • AUTO

    The camera device's auto white balance routine is active; the application's values for android.colorCorrection.transform and android.colorCorrection.gains are ignored.

  • INCANDESCENT

    The camera device's auto white balance routine is disabled; the camera device uses incandescent light as the assumed scene illumination for white balance. While the exact white balance transforms are up to the camera device, they will approximately match the CIE standard illuminant A.

  • FLUORESCENT

    The camera device's auto white balance routine is disabled; the camera device uses fluorescent light as the assumed scene illumination for white balance. While the exact white balance transforms are up to the camera device, they will approximately match the CIE standard illuminant F2.

  • WARM_FLUORESCENT

    The camera device's auto white balance routine is disabled; the camera device uses warm fluorescent light as the assumed scene illumination for white balance. While the exact white balance transforms are up to the camera device, they will approximately match the CIE standard illuminant F4.

  • DAYLIGHT

    The camera device's auto white balance routine is disabled; the camera device uses daylight light as the assumed scene illumination for white balance. While the exact white balance transforms are up to the camera device, they will approximately match the CIE standard illuminant D65.

  • CLOUDY_DAYLIGHT

    The camera device's auto white balance routine is disabled; the camera device uses cloudy daylight light as the assumed scene illumination for white balance.

  • TWILIGHT

    The camera device's auto white balance routine is disabled; the camera device uses twilight light as the assumed scene illumination for white balance.

  • SHADE

    The camera device's auto white balance routine is disabled; the camera device uses shade light as the assumed scene illumination for white balance.

Whether AWB is currently setting the color transform fields, and what its illumination target is.

android.control.awbAvailableModes

Details

This control is only effective if android.control.mode is AUTO.

When set to the AUTO mode, the camera device's auto white balance routine is enabled, overriding the application's selected android.colorCorrection.transform, android.colorCorrection.gains and android.colorCorrection.mode.

When set to the OFF mode, the camera device's auto white balance routine is disabled. The application manually controls the white balance by android.colorCorrection.transform, android.colorCorrection.gains and android.colorCorrection.mode.

When set to any other modes, the camera device's auto white balance routine is disabled. The camera device uses each particular illumination target for white balance adjustment.
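
As a sketch of the manual path described above (tag and enum constants are those from the HAL's camera_metadata_tags.h; error handling omitted for brevity), a fully manual white balance request would set:

    #include <system/camera_metadata.h>

    // Sketch: disable AWB and supply application-chosen white balance.
    static void setManualWhiteBalance(camera_metadata_t* settings,
                                      const float gains[4],
                                      const camera_metadata_rational_t transform[9]) {
        const uint8_t awbMode = ANDROID_CONTROL_AWB_MODE_OFF;
        const uint8_t ccMode  = ANDROID_COLOR_CORRECTION_MODE_TRANSFORM_MATRIX;
        add_camera_metadata_entry(settings, ANDROID_CONTROL_AWB_MODE, &awbMode, 1);
        add_camera_metadata_entry(settings, ANDROID_COLOR_CORRECTION_MODE, &ccMode, 1);
        add_camera_metadata_entry(settings, ANDROID_COLOR_CORRECTION_GAINS, gains, 4);
        add_camera_metadata_entry(settings, ANDROID_COLOR_CORRECTION_TRANSFORM,
                                  transform, 9);
    }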

android.control.awbRegions int32 x 5 x area_count [public as meteringRectangle]

List of areas to use for illuminant estimation.

area_count <= android.control.maxRegions[1]

Details

The coordinate system is based on the active pixel array, with (0,0) being the top-left pixel in the active pixel array, and (android.sensor.info.activeArraySize.width - 1, android.sensor.info.activeArraySize.height - 1) being the bottom-right pixel in the active pixel array. The weight should be nonnegative.

If all regions have 0 weight, then no specific metering area needs to be used by the camera device. If the metering region is outside the used android.scaler.cropRegion returned in capture result metadata, the camera device will ignore the sections outside the region and output the used sections in the result metadata.

HAL Implementation Details

The HAL level representation of MeteringRectangle[] is an int[5 * area_count]. Every five elements represent a metering region of (xmin, ymin, xmax, ymax, weight). The rectangle is defined to be inclusive on xmin and ymin, but exclusive on xmax and ymax.

android.control.captureIntent byte [public]
  • CUSTOM

    This request doesn't fall into the other categories. Default to preview-like behavior.

  • PREVIEW

    This request is for a preview-like usecase. The precapture trigger may be used to start off a metering w/flash sequence

  • STILL_CAPTURE

    This request is for a still capture-type usecase.

  • VIDEO_RECORD

    This request is for a video recording usecase.

  • VIDEO_SNAPSHOT

    This request is for a video snapshot (still image while recording video) usecase

  • ZERO_SHUTTER_LAG

    This request is for a ZSL usecase; the application will stream full-resolution images and reprocess one or several later for a final capture

  • MANUAL

    This request is for manual capture use case where the applications want to directly control the capture parameters (e.g. android.sensor.exposureTime, android.sensor.sensitivity etc.).

Information to the camera device 3A (auto-exposure, auto-focus, auto-white balance) routines about the purpose of this capture, to help the camera device to decide optimal 3A strategy.

All must be supported except for ZERO_SHUTTER_LAG and MANUAL.

Details

This control (except for MANUAL) is only effective if android.control.mode != OFF and any 3A routine is active.

ZERO_SHUTTER_LAG must be supported if android.request.availableCapabilities contains ZSL. MANUAL must be supported if android.request.availableCapabilities contains MANUAL_SENSOR.

android.control.effectMode byte [public]
  • OFF

    No color effect will be applied.

  • MONO optional

    A "monocolor" effect where the image is mapped into a single color. This will typically be grayscale.

  • NEGATIVE optional

    A "photo-negative" effect where the image's colors are inverted.

  • SOLARIZE optional

    A "solarisation" effect (Sabattier effect) where the image is wholly or partially reversed in tone.

  • SEPIA optional

    A "sepia" effect where the image is mapped into warm gray, red, and brown tones.

  • POSTERIZE optional

    A "posterization" effect where the image uses discrete regions of tone rather than a continuous gradient of tones.

  • WHITEBOARD optional

    A "whiteboard" effect where the image is typically displayed as regions of white, with black or grey details.

  • BLACKBOARD optional

    A "blackboard" effect where the image is typically displayed as regions of black, with white or grey details.

  • AQUA optional

    An "aqua" effect where a blue hue is added to the image.

A special color effect to apply.

android.control.availableEffects

Details

When this mode is set, a color effect will be applied to images produced by the camera device. The interpretation and implementation of these color effects is left to the implementor of the camera device, and should not be depended on to be consistent (or present) across all devices.

A color effect will only be applied if android.control.mode != OFF.

android.control.mode byte [public]
  • OFF

    Full application control of pipeline. All 3A routines are disabled, no other settings in android.control.* have any effect

  • AUTO

    Use settings for each individual 3A routine. Manual control of capture parameters is disabled. All controls in android.control.* besides sceneMode take effect

  • USE_SCENE_MODE

    Use specific scene mode. Enabling this disables control.aeMode, control.awbMode and control.afMode controls; the camera device will ignore those settings while USE_SCENE_MODE is active (except for FACE_PRIORITY scene mode). Other control entries are still active. This setting can only be used if scene mode is supported (i.e. android.control.availableSceneModes contains modes other than DISABLED).

  • OFF_KEEP_STATE

    Same as OFF mode, except that this capture will not be used by camera device background auto-exposure, auto-white balance and auto-focus algorithms to update their statistics.

Overall mode of 3A control routines.

all must be supported

Details

High-level 3A control. When set to OFF, all 3A control by the camera device is disabled. The application must set the fields for capture parameters itself.

When set to AUTO, the individual algorithm controls in android.control.* are in effect, such as android.control.afMode.

When set to USE_SCENE_MODE, the individual controls in android.control.* are mostly disabled, and the camera device implements one of the scene mode settings (such as ACTION, SUNSET, or PARTY) as it wishes. The camera device scene mode 3A settings are provided by android.control.sceneModeOverrides.

When set to OFF_KEEP_STATE, behavior is similar to OFF mode; the only difference is that this frame will not be used by the camera device's background 3A statistics update, as if this frame were never captured. This mode can be used in the scenario where the application doesn't want a 3A manual control capture to affect the subsequent auto 3A capture results.

android.control.sceneMode byte [public]
  • DISABLED 0

    Indicates that no scene modes are set for a given capture request.

  • FACE_PRIORITY

    If face detection support exists, use face detection data for auto-focus, auto-white balance, and auto-exposure routines. If face detection statistics are disabled (i.e. android.statistics.faceDetectMode is set to OFF), this should still operate correctly (but will not return face detection statistics to the framework).

    Unlike the other scene modes, android.control.aeMode, android.control.awbMode, and android.control.afMode remain active when FACE_PRIORITY is set.

  • ACTION optional

    Optimized for photos of quickly moving objects. Similar to SPORTS.

  • PORTRAIT optional

    Optimized for still photos of people.

  • LANDSCAPE optional

    Optimized for photos of distant macroscopic objects.

  • NIGHT optional

    Optimized for low-light settings.

  • NIGHT_PORTRAIT optional

    Optimized for still photos of people in low-light settings.

  • THEATRE optional

    Optimized for dim, indoor settings where flash must remain off.

  • BEACH optional

    Optimized for bright, outdoor beach settings.

  • SNOW optional

    Optimized for bright, outdoor settings containing snow.

  • SUNSET optional

    Optimized for scenes of the setting sun.

  • STEADYPHOTO optional

    Optimized to avoid blurry photos due to small amounts of device motion (for example: due to hand shake).

  • FIREWORKS optional

    Optimized for nighttime photos of fireworks.

  • SPORTS optional

    Optimized for photos of quickly moving people. Similar to ACTION.

  • PARTY optional

    Optimized for dim, indoor settings with multiple moving people.

  • CANDLELIGHT optional

    Optimized for dim settings where the main light source is a flame.

  • BARCODE optional

    Optimized for accurately capturing a photo of barcode for use by camera applications that wish to read the barcode value.

A camera mode optimized for conditions typical in a particular capture setting.

android.control.availableSceneModes

Details

This is the mode that is active when android.control.mode == USE_SCENE_MODE. Aside from FACE_PRIORITY, these modes will disable android.control.aeMode, android.control.awbMode, and android.control.afMode while in use.

The interpretation and implementation of these scene modes is left to the implementor of the camera device. Their behavior will not be consistent across all devices, and any given device may only implement a subset of these modes.

HAL Implementation Details

HAL implementations that include scene modes are expected to provide the per-scene settings to use for android.control.aeMode, android.control.awbMode, and android.control.afMode in android.control.sceneModeOverrides.

android.control.videoStabilizationMode byte [public]
  • OFF
  • ON

Whether video stabilization is active

Details

If enabled, video stabilization can modify the android.scaler.cropRegion to keep the video stream stabilized

static
Property Name Type Description Units Range Tags
android.control.aeAvailableAntibandingModes byte x n [public]
list of enums

The set of auto-exposure antibanding modes that are supported by this camera device.

Details

Not all of the auto-exposure anti-banding modes may be supported by a given camera device. This field lists the valid anti-banding modes that the application may request for this camera device; they must include AUTO.

android.control.aeAvailableModes byte x n [public]
list of enums

The set of auto-exposure modes that are supported by this camera device.

Details

Not all the auto-exposure modes may be supported by a given camera device, especially if no flash unit is available. This entry lists the valid modes for android.control.aeMode for this camera device.

All camera devices support ON, and all camera devices with flash units support ON_AUTO_FLASH and ON_ALWAYS_FLASH.

FULL mode camera devices always support OFF mode, which enables application control of camera exposure time, sensitivity, and frame duration.

android.control.aeAvailableTargetFpsRanges int32 x 2 x n [public as rangeInt]
list of pairs of frame rates

List of frame rate ranges supported by the AE algorithm/hardware

android.control.aeCompensationRange int32 x 2 [public as rangeInt]

Maximum and minimum exposure compensation setting, in counts of android.control.aeCompensationStep.

At least (-2,2)/(exp compensation step size)

android.control.aeCompensationStep rational [public]

Smallest step by which exposure compensation can be changed

<= 1/2

android.control.afAvailableModes byte x n [public]
List of enums

List of AF modes that can be selected with android.control.afMode.

Details

Not all the auto-focus modes may be supported by a given camera device. This entry lists the valid modes for android.control.afMode for this camera device.

All camera devices will support OFF mode, and all camera devices with adjustable focuser units (android.lens.info.minimumFocusDistance > 0) will support AUTO mode.

android.control.availableEffects byte x n [public]
List of enums (android.control.effectMode).

List containing the subset of color effects specified in android.control.effectMode that is supported by this device.

Any subset of enums from those specified in android.control.effectMode. OFF must be included in any subset.

Details

This list contains the color effect modes that can be applied to images produced by the camera device. Only modes that have been fully implemented for the current device may be included here. Implementations are not expected to be consistent across all devices. If no color effect modes are available for a device, this should simply be set to OFF.

A color effect will only be applied if android.control.mode != OFF.

android.control.availableSceneModes byte x n [public]
List of enums (android.control.sceneMode).

List containing a subset of scene modes specified in android.control.sceneMode.

Any subset of the enums specified in android.control.sceneMode not including DISABLED, or solely DISABLED if no scene modes are available. FACE_PRIORITY must be included if face detection is supported (i.e. android.statistics.info.maxFaceCount > 0).

Details

This list contains scene modes that can be set for the camera device. Only scene modes that have been fully implemented for the camera device may be included here. Implementations are not expected to be consistent across all devices. If no scene modes are supported by the camera device, this will be set to [DISABLED].

android.control.availableVideoStabilizationModes byte x n [public]
List of enums.

List of video stabilization modes that can be supported

OFF must be included

android.control.awbAvailableModes byte x n [public]
List of enums

The set of auto-white-balance modes (android.control.awbMode) that are supported by this camera device.

Details

Not all the auto-white-balance modes may be supported by a given camera device. This entry lists the valid modes for android.control.awbMode for this camera device.

All camera devices will support ON mode.

FULL mode camera devices will always support OFF mode, which enables application control of white balance, by using android.colorCorrection.transform and android.colorCorrection.gains (android.colorCorrection.mode must be set to TRANSFORM_MATRIX).

android.control.maxRegions int32 x 3 [hidden]

List of the maximum number of regions that can be used for metering in auto-exposure (AE), auto-white balance (AWB), and auto-focus (AF); this corresponds to the maximum number of elements in android.control.aeRegions, android.control.awbRegions, and android.control.afRegions.

Value must be >= 0 for each element. For full-capability devices this value must be >= 1 for AE and AF. The order of the elements is: (AE, AWB, AF).
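
The three synthetic per-algorithm keys that follow are simply the elements of this array, e.g. (helper names hypothetical):

    #include <cstdint>

    // The (AE, AWB, AF) ordering of android.control.maxRegions.
    static int32_t maxRegionsAe (const int32_t maxRegions[3]) { return maxRegions[0]; }
    static int32_t maxRegionsAwb(const int32_t maxRegions[3]) { return maxRegions[1]; }
    static int32_t maxRegionsAf (const int32_t maxRegions[3]) { return maxRegions[2]; }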

android.control.maxRegionsAe int32 [public] [synthetic]

List of the maximum number of regions that can be used for metering in auto-exposure (AE); this corresponds to the maximum number of elements in android.control.aeRegions.

Value will be >= 0. For FULL-capability devices, this value will be >= 1.

HAL Implementation Details

This entry is private to the framework. Fill in maxRegions to have this entry be automatically populated.

android.control.maxRegionsAwb int32 [public] [synthetic]

List of the maximum number of regions that can be used for metering in auto-white balance (AWB); this corresponds to the maximum number of elements in android.control.awbRegions.

Value will be >= 0.

HAL Implementation Details

This entry is private to the framework. Fill in maxRegions to have this entry be automatically populated.

android.control.maxRegionsAf int32 [public] [synthetic]

List of the maximum number of regions that can be used for metering in auto-focus (AF); this corresponds to the maximum number of elements in android.control.afRegions.

Value will be >= 0. For FULL-capability devices, this value will be >= 1.

HAL Implementation Details

This entry is private to the framework. Fill in maxRegions to have this entry be automatically populated.

android.control.sceneModeOverrides byte x 3 x length(availableSceneModes) [system]

Ordered list of auto-exposure, auto-white balance, and auto-focus settings to use with each available scene mode.

For each available scene mode, the list must contain three entries containing the android.control.aeMode, android.control.awbMode, and android.control.afMode values used by the camera device. The entry order is (aeMode, awbMode, afMode) where aeMode has the lowest index position.

Details

When a scene mode is enabled, the camera device is expected to override android.control.aeMode, android.control.awbMode, and android.control.afMode with its preferred settings for that scene mode.

The order of this list matches that of availableSceneModes, with 3 entries for each mode. The overrides listed for FACE_PRIORITY are ignored, since for that mode the application-set android.control.aeMode, android.control.awbMode, and android.control.afMode values are used instead, matching the behavior when android.control.mode is set to AUTO. It is recommended that the FACE_PRIORITY overrides be set to 0.

For example, if availableSceneModes contains (FACE_PRIORITY, ACTION, NIGHT), then the camera framework expects sceneModeOverrides to have 9 entries formatted like: (0, 0, 0, ON_AUTO_FLASH, AUTO, CONTINUOUS_PICTURE, ON_AUTO_FLASH, INCANDESCENT, AUTO).

HAL Implementation Details

To maintain backward compatibility, this list will be made available in the static metadata of the camera service. The camera service will use these values to set android.control.aeMode, android.control.awbMode, and android.control.afMode when using a scene mode other than FACE_PRIORITY.
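
Given the flat (aeMode, awbMode, afMode) layout shown in the 9-entry example above, a framework-side lookup can be sketched as (struct and function names hypothetical):

    #include <cstddef>
    #include <cstdint>

    // Three override bytes per scene mode, in availableSceneModes order.
    struct SceneOverride { uint8_t aeMode, awbMode, afMode; };

    static SceneOverride overrideFor(const uint8_t* overrides, size_t sceneIdx) {
        const size_t base = 3 * sceneIdx;
        return { overrides[base], overrides[base + 1], overrides[base + 2] };
    }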

dynamic
Property Name Type Description Units Range Tags
android.control.aePrecaptureId int32 [system] [deprecated]

The ID sent with the latest CAMERA2_TRIGGER_PRECAPTURE_METERING call

Deprecated. Do not use.

Details

Must be 0 if no CAMERA2_TRIGGER_PRECAPTURE_METERING trigger received yet by HAL. Always updated even if AE algorithm ignores the trigger

android.control.aeAntibandingMode byte [public]
  • OFF

    The camera device will not adjust exposure duration to avoid banding problems.

  • 50HZ

    The camera device will adjust exposure duration to avoid banding problems with 50Hz illumination sources.

  • 60HZ

    The camera device will adjust exposure duration to avoid banding problems with 60Hz illumination sources.

  • AUTO

    The camera device will automatically adapt its antibanding routine to the current illumination conditions. This is the default.

The desired setting for the camera device's auto-exposure algorithm's antibanding compensation.

android.control.aeAvailableAntibandingModes

Details

Some kinds of lighting fixtures, such as some fluorescent lights, flicker at the rate of the power supply frequency (60Hz or 50Hz, depending on country). While this is typically not noticeable to a person, it can be visible to a camera device. If a camera sets its exposure time to the wrong value, the flicker may become visible in the viewfinder or in a final captured image, as a set of variable-brightness bands across the image.

Therefore, the auto-exposure routines of camera devices include antibanding routines that ensure that the chosen exposure value will not cause such banding. The choice of exposure time depends on the rate of flicker, which the camera device can detect automatically, or the expected rate can be selected by the application using this control.

A given camera device may not support all of the possible options for the antibanding mode. The android.control.aeAvailableAntibandingModes key contains the available modes for a given camera device.

The default mode is AUTO, which must be supported by all camera devices.

If manual exposure control is enabled (by setting android.control.aeMode or android.control.mode to OFF), then this setting has no effect, and the application must ensure it selects exposure times that do not cause banding issues. The android.statistics.sceneFlicker key can assist the application in this.

HAL Implementation Details

For all capture request templates, this field must be set to AUTO. AUTO is the only mode that must be supported; OFF, 50HZ, and 60HZ are all optional.

If manual exposure control is enabled (by setting android.control.aeMode or android.control.mode to OFF), then the exposure values provided by the application must not be adjusted for antibanding.

android.control.aeExposureCompensation int32 [public]

Adjustment to AE target image brightness

count of positive/negative EV steps
Details

For example, if EV step is 0.333, '6' will mean an exposure compensation of +2 EV; -3 will mean an exposure compensation of -1 EV. Note that this control will only be effective if android.control.aeMode != OFF. This control will take effect even when android.control.aeLock == true.

If the exposure compensation value is changed, the camera device may take several frames to reach the newly requested exposure target. During that time, the android.control.aeState field will be in the SEARCHING state. Once the new exposure target is reached, android.control.aeState will change from SEARCHING to either CONVERGED, LOCKED (if AE lock is enabled), or FLASH_REQUIRED (if the scene is too dark for still capture).

android.control.aeLock byte [public as boolean]
  • OFF

    Autoexposure lock is disabled; the AE algorithm is free to update its parameters.

  • ON

    Autoexposure lock is enabled; the AE algorithm must not update the exposure and sensitivity parameters while the lock is active. android.control.aeExposureCompensation setting changes will still take effect while auto-exposure is locked.

Whether AE is currently locked to its latest calculated values.

Details

Note that even when AE is locked, the flash may be fired if the android.control.aeMode is ON_AUTO_FLASH / ON_ALWAYS_FLASH / ON_AUTO_FLASH_REDEYE.

When android.control.aeExposureCompensation is changed, even if the AE lock is ON, the camera device will still adjust its exposure value.

If AE precapture is triggered (see android.control.aePrecaptureTrigger) when AE is already locked, the camera device will not change the exposure time (android.sensor.exposureTime) and sensitivity (android.sensor.sensitivity) parameters. The flash may be fired if the android.control.aeMode is ON_AUTO_FLASH/ON_AUTO_FLASH_REDEYE and the scene is too dark. If the android.control.aeMode is ON_ALWAYS_FLASH, the scene may become overexposed.

See android.control.aeState for AE lock related state transition details.

android.control.aeMode byte [public]

The desired mode for the camera device's auto-exposure routine.

android.control.aeAvailableModes

Details

This control is only effective if android.control.mode is AUTO.

When set to any of the ON modes, the camera device's auto-exposure routine is enabled, overriding the application's selected exposure time, sensor sensitivity, and frame duration (android.sensor.exposureTime, android.sensor.sensitivity, and android.sensor.frameDuration). If one of the FLASH modes is selected, the camera device's flash unit controls are also overridden.

The FLASH modes are only available if the camera device has a flash unit (android.flash.info.available is true).

If flash TORCH mode is desired, this field must be set to ON or OFF, and android.flash.mode set to TORCH.

When set to any of the ON modes, the values chosen by the camera device auto-exposure routine for the overridden fields for a given capture will be available in its CaptureResult.

android.control.aeRegions int32 x 5 x area_count [public as meteringRectangle]

List of areas to use for metering.

area_count <= android.control.maxRegions[0]

Details

The coordinate system is based on the active pixel array, with (0,0) being the top-left pixel in the active pixel array, and (android.sensor.info.activeArraySize.width - 1, android.sensor.info.activeArraySize.height - 1) being the bottom-right pixel in the active pixel array. The weight should be nonnegative.

If all regions have 0 weight, then no specific metering area needs to be used by the camera device. If the metering region is outside the used android.scaler.cropRegion returned in capture result metadata, the camera device will ignore the sections outside the region and output the used sections in the result metadata.

HAL Implementation Details

The HAL level representation of MeteringRectangle[] is an int[5 * area_count]. Every five elements represent a metering region of (xmin, ymin, xmax, ymax, weight). The rectangle is defined to be inclusive on xmin and ymin, but exclusive on xmax and ymax.

android.control.aeTargetFpsRange int32 x 2 [public as rangeInt]

Range over which fps can be adjusted to maintain exposure

android.control.aeAvailableTargetFpsRanges

Details

Only constrains AE algorithm, not manual control of android.sensor.exposureTime

android.control.aePrecaptureTrigger byte [public]
  • IDLE

    The trigger is idle.

  • START

    The precapture metering sequence will be started by the camera device. The exact effect of the precapture trigger depends on the current AE mode and state.

Whether the camera device will trigger a precapture metering sequence when it processes this request.

Details

This entry is normally set to IDLE, or is not included at all in the request settings. When included and set to START, the camera device will trigger the autoexposure precapture metering sequence.

The effect of AE precapture trigger depends on the current AE mode and state; see android.control.aeState for AE precapture state transition details.

android.control.aeState byte [public]
  • INACTIVE

    AE is off or recently reset. When a camera device is opened, it starts in this state. This is a transient state; the camera device may skip reporting this state in capture result.

  • SEARCHING

    AE doesn't yet have a good set of control values for the current scene. This is a transient state; the camera device may skip reporting this state in capture result.

  • CONVERGED

    AE has a good set of control values for the current scene.

  • LOCKED

    AE has been locked.

  • FLASH_REQUIRED

    AE has a good set of control values, but flash needs to be fired for good quality still capture.

  • PRECAPTURE

    AE has been asked to do a precapture sequence (through the android.control.aePrecaptureTrigger START), and is currently executing it. Once PRECAPTURE completes, AE will transition to CONVERGED or FLASH_REQUIRED as appropriate. This is a transient state; the camera device may skip reporting this state in capture result.

Current state of AE algorithm

Details

Switching between or enabling AE modes (android.control.aeMode) always resets the AE state to INACTIVE. Similarly, switching android.control.mode (or android.control.sceneMode, if android.control.mode == USE_SCENE_MODE) resets all the algorithm states to INACTIVE.

The camera device can do several state transitions between two results, if it is allowed by the state transition table. For example: INACTIVE may never actually be seen in a result.

The state in the result is the state for this image (in sync with this image): if AE state becomes CONVERGED, then the image data associated with this result should be good to use.

Below are state transition tables for different AE modes.

When android.control.aeMode is AE_MODE_OFF:

State    | Transition Cause | New State | Notes
INACTIVE |                  | INACTIVE  | Camera device auto exposure algorithm is disabled

When android.control.aeMode is AE_MODE_ON_*:

State          | Transition Cause                             | New State      | Notes
INACTIVE       | Camera device initiates AE scan              | SEARCHING      | Values changing
INACTIVE       | android.control.aeLock is ON                 | LOCKED         | Values locked
SEARCHING      | Camera device finishes AE scan               | CONVERGED      | Good values, not changing
SEARCHING      | Camera device finishes AE scan               | FLASH_REQUIRED | Converged but too dark w/o flash
SEARCHING      | android.control.aeLock is ON                 | LOCKED         | Values locked
CONVERGED      | Camera device initiates AE scan              | SEARCHING      | Values changing
CONVERGED      | android.control.aeLock is ON                 | LOCKED         | Values locked
FLASH_REQUIRED | Camera device initiates AE scan              | SEARCHING      | Values changing
FLASH_REQUIRED | android.control.aeLock is ON                 | LOCKED         | Values locked
LOCKED         | android.control.aeLock is OFF                | SEARCHING      | Values not good after unlock
LOCKED         | android.control.aeLock is OFF                | CONVERGED      | Values good after unlock
LOCKED         | android.control.aeLock is OFF                | FLASH_REQUIRED | Exposure good, but too dark
PRECAPTURE     | Sequence done. android.control.aeLock is OFF | CONVERGED      | Ready for high-quality capture
PRECAPTURE     | Sequence done. android.control.aeLock is ON  | LOCKED         | Ready for high-quality capture
Any state      | android.control.aePrecaptureTrigger is START | PRECAPTURE     | Start AE precapture metering sequence

For the above table, the camera device may skip reporting any state changes that happen without application intervention (i.e. mode switch, trigger, locking). Any state that can be skipped in that manner is called a transient state.

For example, for the above AE modes (AE_MODE_ON_*), in addition to the state transitions listed in the above table, it is also legal for the camera device to skip one or more transient states between two results. See the table below for examples:

State          | Transition Cause                                            | New State      | Notes
INACTIVE       | Camera device finished AE scan                              | CONVERGED      | Values are already good, transient states are skipped by camera device.
Any state      | android.control.aePrecaptureTrigger is START, sequence done | FLASH_REQUIRED | Converged but too dark w/o flash after a precapture sequence, transient states are skipped by camera device.
Any state      | android.control.aePrecaptureTrigger is START, sequence done | CONVERGED      | Converged after a precapture sequence, transient states are skipped by camera device.
CONVERGED      | Camera device finished AE scan                              | FLASH_REQUIRED | Converged but too dark w/o flash after a new scan, transient states are skipped by camera device.
FLASH_REQUIRED | Camera device finished AE scan                              | CONVERGED      | Converged after a new scan, transient states are skipped by camera device.
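
The tables above condense to a small state machine; the sketch below is illustrative only (names hypothetical), and a real device may pass through, or skip, the transient states between any two results:

    // Condensed sketch of the AE_MODE_ON_* transitions tabulated above.
    enum class AeState { INACTIVE, SEARCHING, CONVERGED, LOCKED,
                         FLASH_REQUIRED, PRECAPTURE };

    struct AeInputs {
        bool lock;             // android.control.aeLock == ON
        bool precaptureStart;  // android.control.aePrecaptureTrigger == START
        bool precaptureDone;   // precapture metering sequence has finished
        bool scanRunning;      // device is actively searching for values
        bool needsFlash;       // converged values too dark without flash
    };

    static AeState nextAeState(AeState s, const AeInputs& in) {
        if (in.precaptureStart) return AeState::PRECAPTURE;  // from any state
        if (s == AeState::PRECAPTURE) {
            if (!in.precaptureDone) return AeState::PRECAPTURE;
            if (in.lock) return AeState::LOCKED;
            return in.needsFlash ? AeState::FLASH_REQUIRED : AeState::CONVERGED;
        }
        if (in.lock) return AeState::LOCKED;
        if (in.scanRunning) return AeState::SEARCHING;
        return in.needsFlash ? AeState::FLASH_REQUIRED : AeState::CONVERGED;
    }
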
android.control.afMode byte [public]
  • OFF

    The auto-focus routine does not control the lens; android.lens.focusDistance is controlled by the application

  • AUTO

    If lens is not fixed focus.

    Use android.lens.info.minimumFocusDistance to determine if lens is fixed-focus. In this mode, the lens does not move unless the autofocus trigger action is called. When that trigger is activated, AF must transition to ACTIVE_SCAN, then to the outcome of the scan (FOCUSED or NOT_FOCUSED).

    Triggering AF_CANCEL resets the lens position to default, and sets the AF state to INACTIVE.

  • MACRO

    In this mode, the lens does not move unless the autofocus trigger action is called.

    When that trigger is activated, AF must transition to ACTIVE_SCAN, then to the outcome of the scan (FOCUSED or NOT_FOCUSED). Triggering cancel AF resets the lens position to default, and sets the AF state to INACTIVE.

  • CONTINUOUS_VIDEO

    In this mode, the AF algorithm modifies the lens position continually to attempt to provide a constantly-in-focus image stream.

    The focusing behavior should be suitable for good quality video recording; typically this means slower focus movement and no overshoots. When the AF trigger is not involved, the AF algorithm should start in INACTIVE state, and then transition into PASSIVE_SCAN and PASSIVE_FOCUSED states as appropriate. When the AF trigger is activated, the algorithm should immediately transition into AF_FOCUSED or AF_NOT_FOCUSED as appropriate, and lock the lens position until a cancel AF trigger is received.

    Once cancel is received, the algorithm should transition back to INACTIVE and resume passive scan. Note that this behavior is not identical to CONTINUOUS_PICTURE, since an ongoing PASSIVE_SCAN must immediately be canceled.

  • CONTINUOUS_PICTURE

    In this mode, the AF algorithm modifies the lens position continually to attempt to provide a constantly-in-focus image stream.

    The focusing behavior should be suitable for still image capture; typically this means focusing as fast as possible. When the AF trigger is not involved, the AF algorithm should start in INACTIVE state, and then transition into PASSIVE_SCAN and PASSIVE_FOCUSED states as appropriate as it attempts to maintain focus. When the AF trigger is activated, the algorithm should finish its PASSIVE_SCAN if active, and then transition into AF_FOCUSED or AF_NOT_FOCUSED as appropriate, and lock the lens position until a cancel AF trigger is received.

    When the AF cancel trigger is activated, the algorithm should transition back to INACTIVE and then act as if it has just been started.

  • EDOF

    Extended depth of field (digital focus). AF trigger is ignored, AF state should always be INACTIVE.

Whether AF is currently enabled, and what mode it is set to

android.control.afAvailableModes

Details

Only effective if android.control.mode = AUTO and the lens is not fixed focus (i.e. android.lens.info.minimumFocusDistance > 0).

If the lens is controlled by the camera device auto-focus algorithm, the camera device will report the current AF status in android.control.afState in result metadata.

android.control.afRegions int32 x 5 x area_count [public as meteringRectangle]

List of areas to use for focus estimation.

area_count <= android.control.maxRegions[2]

Details

The coordinate system is based on the active pixel array, with (0,0) being the top-left pixel in the active pixel array, and (android.sensor.info.activeArraySize.width - 1, android.sensor.info.activeArraySize.height - 1) being the bottom-right pixel in the active pixel array. The weight should be nonnegative.

If all regions have 0 weight, then no specific metering area needs to be used by the camera device. If the metering region is outside the used android.scaler.cropRegion returned in capture result metadata, the camera device will ignore the sections outside the region and output the used sections in the result metadata.

HAL Implementation Details

The HAL level representation of MeteringRectangle[] is an int[5 * area_count]. Every five elements represent a metering region of (xmin, ymin, xmax, ymax, weight). The rectangle is defined to be inclusive on xmin and ymin, but exclusive on xmax and ymax.
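
A sketch of that packing, using a hypothetical MeteringRegion type (not part of the HAL headers):

    #include <cstdint>
    #include <vector>

    // Hypothetical representation of one metering region; the rectangle is
    // inclusive on (xmin, ymin) and exclusive on (xmax, ymax), as above.
    struct MeteringRegion {
        int32_t xmin, ymin, xmax, ymax;
        int32_t weight;  // nonnegative; all-zero weights disable metering
    };

    // Flatten regions into the int[5 * area_count] layout: every five
    // consecutive elements are (xmin, ymin, xmax, ymax, weight).
    static std::vector<int32_t> packRegions(const std::vector<MeteringRegion>& rs) {
        std::vector<int32_t> out;
        out.reserve(rs.size() * 5);
        for (const auto& r : rs) {
            out.push_back(r.xmin);
            out.push_back(r.ymin);
            out.push_back(r.xmax);
            out.push_back(r.ymax);
            out.push_back(r.weight);
        }
        return out;
    }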

android.control.afTrigger byte [public]
  • IDLE

    The trigger is idle.

  • START

    Autofocus will trigger now.

  • CANCEL

    Autofocus will return to its initial state, and cancel any currently active trigger.

Whether the camera device will trigger autofocus for this request.

Details

This entry is normally set to IDLE, or is not included at all in the request settings.

When included and set to START, the camera device will trigger the autofocus algorithm. If autofocus is disabled, this trigger has no effect.

When set to CANCEL, the camera device will cancel any active trigger, and return to its initial AF state.

See android.control.afState for what that means for each AF mode.
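
As an illustration of that sequencing, a hedged sketch of setting the trigger in a request's settings buffer; add_camera_metadata_entry is from system/media/camera, and the surrounding request plumbing is assumed:

    #include <system/camera_metadata.h>

    // Hypothetical sketch: write the AF trigger into one request's settings,
    // assuming the tag is not already present (otherwise the entry would be
    // updated in place via update_camera_metadata_entry).
    static int setAfTrigger(camera_metadata_t* settings, uint8_t trigger) {
        return add_camera_metadata_entry(settings,
                ANDROID_CONTROL_AF_TRIGGER, &trigger, 1);
    }

    // Usage: fire the trigger in exactly one request, then drop back to IDLE
    // so later requests do not re-trigger the autofocus algorithm:
    //   setAfTrigger(requestN,    ANDROID_CONTROL_AF_TRIGGER_START);
    //   setAfTrigger(requestNext, ANDROID_CONTROL_AF_TRIGGER_IDLE);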

android.control.afState byte [public]
  • INACTIVE

    AF off or has not yet tried to scan/been asked to scan. When a camera device is opened, it starts in this state. This is a transient state, the camera device may skip reporting this state in capture result.

  • PASSIVE_SCAN

    if CONTINUOUS_* modes are supported. AF is currently doing an AF scan initiated by a continuous autofocus mode. This is a transient state, the camera device may skip reporting this state in capture result.

  • PASSIVE_FOCUSED

    if CONTINUOUS_* modes are supported. AF currently believes it is in focus, but may restart scanning at any time. This is a transient state, the camera device may skip reporting this state in capture result.

  • ACTIVE_SCAN

    if AUTO or MACRO modes are supported. AF is doing an AF scan because it was triggered by AF trigger. This is a transient state, the camera device may skip reporting this state in capture result.

  • FOCUSED_LOCKED

    if any AF mode besides OFF is supported. AF believes it is focused correctly and is locked.

  • NOT_FOCUSED_LOCKED

    if any AF mode besides OFF is supported. AF has failed to focus successfully and is locked.

  • PASSIVE_UNFOCUSED

    if CONTINUOUS_* modes are supported. AF finished a passive scan without finding focus, and may restart scanning at any time. This is a transient state, the camera device may skip reporting this state in capture result.

Current state of AF algorithm.

Details

Switching between or enabling AF modes (android.control.afMode) always resets the AF state to INACTIVE. Similarly, switching between android.control.mode, or android.control.sceneMode if android.control.mode == USE_SCENE_MODE resets all the algorithm states to INACTIVE.

The camera device can do several state transitions between two results, if it is allowed by the state transition table. So INACTIVE may never actually be seen in a result.

The state in the result is the state for this image (in sync with this image): if AF state becomes FOCUSED, then the image data associated with this result should be sharp.

Below are state transition tables for different AF modes.

When android.control.afMode is AF_MODE_OFF or AF_MODE_EDOF:

State | Transition Cause | New State | Notes
INACTIVE | | INACTIVE | Never changes

When android.control.afMode is AF_MODE_AUTO or AF_MODE_MACRO:

State | Transition Cause | New State | Notes
INACTIVE | AF_TRIGGER | ACTIVE_SCAN | Start AF sweep, Lens now moving
ACTIVE_SCAN | AF sweep done | FOCUSED_LOCKED | Focused, Lens now locked
ACTIVE_SCAN | AF sweep done | NOT_FOCUSED_LOCKED | Not focused, Lens now locked
ACTIVE_SCAN | AF_CANCEL | INACTIVE | Cancel/reset AF, Lens now locked
FOCUSED_LOCKED | AF_CANCEL | INACTIVE | Cancel/reset AF
FOCUSED_LOCKED | AF_TRIGGER | ACTIVE_SCAN | Start new sweep, Lens now moving
NOT_FOCUSED_LOCKED | AF_CANCEL | INACTIVE | Cancel/reset AF
NOT_FOCUSED_LOCKED | AF_TRIGGER | ACTIVE_SCAN | Start new sweep, Lens now moving
Any state | Mode change | INACTIVE |

For the above table, the camera device may skip reporting any state changes that happen without application intervention (i.e. without a mode switch, trigger, or lock change). Any state that can be skipped in that manner is called a transient state.

For example, for these AF modes (AF_MODE_AUTO and AF_MODE_MACRO), in addition to the state transitions listed in the above table, it is also legal for the camera device to skip one or more transient states between two results. See the table below for examples:

State | Transition Cause | New State | Notes
INACTIVE | AF_TRIGGER | FOCUSED_LOCKED | Focus is already good or good after a scan, lens is now locked.
INACTIVE | AF_TRIGGER | NOT_FOCUSED_LOCKED | Focus failed after a scan, lens is now locked.
FOCUSED_LOCKED | AF_TRIGGER | FOCUSED_LOCKED | Focus is already good or good after a scan, lens is now locked.
NOT_FOCUSED_LOCKED | AF_TRIGGER | FOCUSED_LOCKED | Focus is good after a scan, lens is now locked.

When android.control.afMode is AF_MODE_CONTINUOUS_VIDEO:

State | Transition Cause | New State | Notes
INACTIVE | Camera device initiates new scan | PASSIVE_SCAN | Start AF scan, Lens now moving
INACTIVE | AF_TRIGGER | NOT_FOCUSED_LOCKED | AF state query, Lens now locked
PASSIVE_SCAN | Camera device completes current scan | PASSIVE_FOCUSED | End AF scan, Lens now locked
PASSIVE_SCAN | Camera device fails current scan | PASSIVE_UNFOCUSED | End AF scan, Lens now locked
PASSIVE_SCAN | AF_TRIGGER | FOCUSED_LOCKED | Immediate trans. if focus is good, Lens now locked
PASSIVE_SCAN | AF_TRIGGER | NOT_FOCUSED_LOCKED | Immediate trans. if focus is bad, Lens now locked
PASSIVE_SCAN | AF_CANCEL | INACTIVE | Reset lens position, Lens now locked
PASSIVE_FOCUSED | Camera device initiates new scan | PASSIVE_SCAN | Start AF scan, Lens now moving
PASSIVE_UNFOCUSED | Camera device initiates new scan | PASSIVE_SCAN | Start AF scan, Lens now moving
PASSIVE_FOCUSED | AF_TRIGGER | FOCUSED_LOCKED | Immediate trans. Lens now locked
PASSIVE_UNFOCUSED | AF_TRIGGER | NOT_FOCUSED_LOCKED | Immediate trans. Lens now locked
FOCUSED_LOCKED | AF_TRIGGER | FOCUSED_LOCKED | No effect
FOCUSED_LOCKED | AF_CANCEL | INACTIVE | Restart AF scan
NOT_FOCUSED_LOCKED | AF_TRIGGER | NOT_FOCUSED_LOCKED | No effect
NOT_FOCUSED_LOCKED | AF_CANCEL | INACTIVE | Restart AF scan

When android.control.afMode is AF_MODE_CONTINUOUS_PICTURE:

State | Transition Cause | New State | Notes
INACTIVE | Camera device initiates new scan | PASSIVE_SCAN | Start AF scan, Lens now moving
INACTIVE | AF_TRIGGER | NOT_FOCUSED_LOCKED | AF state query, Lens now locked
PASSIVE_SCAN | Camera device completes current scan | PASSIVE_FOCUSED | End AF scan, Lens now locked
PASSIVE_SCAN | Camera device fails current scan | PASSIVE_UNFOCUSED | End AF scan, Lens now locked
PASSIVE_SCAN | AF_TRIGGER | FOCUSED_LOCKED | Eventual trans. once focus good, Lens now locked
PASSIVE_SCAN | AF_TRIGGER | NOT_FOCUSED_LOCKED | Eventual trans. if cannot focus, Lens now locked
PASSIVE_SCAN | AF_CANCEL | INACTIVE | Reset lens position, Lens now locked
PASSIVE_FOCUSED | Camera device initiates new scan | PASSIVE_SCAN | Start AF scan, Lens now moving
PASSIVE_UNFOCUSED | Camera device initiates new scan | PASSIVE_SCAN | Start AF scan, Lens now moving
PASSIVE_FOCUSED | AF_TRIGGER | FOCUSED_LOCKED | Immediate trans. Lens now locked
PASSIVE_UNFOCUSED | AF_TRIGGER | NOT_FOCUSED_LOCKED | Immediate trans. Lens now locked
FOCUSED_LOCKED | AF_TRIGGER | FOCUSED_LOCKED | No effect
FOCUSED_LOCKED | AF_CANCEL | INACTIVE | Restart AF scan
NOT_FOCUSED_LOCKED | AF_TRIGGER | NOT_FOCUSED_LOCKED | No effect
NOT_FOCUSED_LOCKED | AF_CANCEL | INACTIVE | Restart AF scan

When switching between AF_MODE_CONTINUOUS_* (CAF modes) and AF_MODE_AUTO/AF_MODE_MACRO (AUTO modes), the initial INACTIVE or PASSIVE_SCAN states may be skipped by the camera device. When a trigger is included in a mode switch request, the trigger will be evaluated in the context of the new mode in the request. See the table below for examples:

State | Transition Cause | New State | Notes
any state | CAF-->AUTO mode switch | INACTIVE | Mode switch without trigger, initial state must be INACTIVE
any state | CAF-->AUTO mode switch with AF_TRIGGER | trigger-reachable states from INACTIVE | Mode switch with trigger, INACTIVE is skipped
any state | AUTO-->CAF mode switch | passively reachable states from INACTIVE | Mode switch without trigger, passive transient state is skipped
android.control.afTriggerId int32 [system] [deprecated]

The ID sent with the latest CAMERA2_TRIGGER_AUTOFOCUS call

Deprecated. Do not use.

Details

Must be 0 if no CAMERA2_TRIGGER_AUTOFOCUS trigger received yet by HAL. Always updated even if AF algorithm ignores the trigger

android.control.awbLock byte [public as boolean]
  • OFF

    Auto-whitebalance lock is disabled; the AWB algorithm is free to update its parameters if in AUTO mode.

  • ON

    Auto-whitebalance lock is enabled; the AWB algorithm must not update its parameters while the lock is active.

Whether AWB is currently locked to its latest calculated values.

Details

Note that AWB lock is only meaningful for AUTO mode; in other modes, AWB is already fixed to a specific setting.

android.control.awbMode byte [public]
  • OFF

    The camera device's auto white balance routine is disabled; the application-selected color transform matrix (android.colorCorrection.transform) and gains (android.colorCorrection.gains) are used by the camera device for manual white balance control.

  • AUTO

    The camera device's auto white balance routine is active; the application's values for android.colorCorrection.transform and android.colorCorrection.gains are ignored.

  • INCANDESCENT

    The camera device's auto white balance routine is disabled; the camera device uses incandescent light as the assumed scene illumination for white balance. While the exact white balance transforms are up to the camera device, they will approximately match the CIE standard illuminant A.

  • FLUORESCENT

    The camera device's auto white balance routine is disabled; the camera device uses fluorescent light as the assumed scene illumination for white balance. While the exact white balance transforms are up to the camera device, they will approximately match the CIE standard illuminant F2.

  • WARM_FLUORESCENT

    The camera device's auto white balance routine is disabled; the camera device uses warm fluorescent light as the assumed scene illumination for white balance. While the exact white balance transforms are up to the camera device, they will approximately match the CIE standard illuminant F4.

  • DAYLIGHT

    The camera device's auto white balance routine is disabled; the camera device uses daylight light as the assumed scene illumination for white balance. While the exact white balance transforms are up to the camera device, they will approximately match the CIE standard illuminant D65.

  • CLOUDY_DAYLIGHT

    The camera device's auto white balance routine is disabled; the camera device uses cloudy daylight light as the assumed scene illumination for white balance.

  • TWILIGHT

    The camera device's auto white balance routine is disabled; the camera device uses twilight light as the assumed scene illumination for white balance.

  • SHADE

    The camera device's auto white balance routine is disabled; the camera device uses shade light as the assumed scene illumination for white balance.

Whether AWB is currently setting the color transform fields, and what its illumination target is.

android.control.awbAvailableModes

Details

This control is only effective if android.control.mode is AUTO.

When set to the AUTO mode, the camera device's auto white balance routine is enabled, overriding the application's selected android.colorCorrection.transform, android.colorCorrection.gains and android.colorCorrection.mode.

When set to the OFF mode, the camera device's auto white balance routine is disabled. The application manually controls the white balance by setting android.colorCorrection.transform, android.colorCorrection.gains and android.colorCorrection.mode.

When set to any of the other modes, the camera device's auto white balance routine is disabled. The camera device uses that mode's particular illumination target for white balance adjustment.

android.control.awbRegions int32 x 5 x area_count [public as meteringRectangle]

List of areas to use for illuminant estimation.

area_count <= android.control.maxRegions[1]

Details

The coordinate system is based on the active pixel array, with (0,0) being the top-left pixel in the active pixel array, and (android.sensor.info.activeArraySize.width - 1, android.sensor.info.activeArraySize.height - 1) being the bottom-right pixel in the active pixel array. The weight should be nonnegative.

If all regions have 0 weight, then no specific metering area needs to be used by the camera device. If the metering region is outside the used android.scaler.cropRegion returned in capture result metadata, the camera device will ignore the sections outside the region and output the used sections in the result metadata.

HAL Implementation Details

The HAL level representation of MeteringRectangle[] is a int[5 * area_count]. Every five elements represent a metering region of (xmin, ymin, xmax, ymax, weight). The rectangle is defined to be inclusive on xmin and ymin, but exclusive on xmax and ymax.

android.control.captureIntent byte [public]
  • CUSTOM

    This request doesn't fall into the other categories. Default to preview-like behavior.

  • PREVIEW

    This request is for a preview-like usecase. The precapture trigger may be used to start off a metering w/flash sequence

  • STILL_CAPTURE

    This request is for a still capture-type usecase.

  • VIDEO_RECORD

    This request is for a video recording usecase.

  • VIDEO_SNAPSHOT

    This request is for a video snapshot (still image while recording video) usecase

  • ZERO_SHUTTER_LAG

    This request is for a ZSL usecase; the application will stream full-resolution images and reprocess one or several later for a final capture

  • MANUAL

    This request is for manual capture use case where the applications want to directly control the capture parameters (e.g. android.sensor.exposureTime, android.sensor.sensitivity etc.).

Information to the camera device 3A (auto-exposure, auto-focus, auto-white balance) routines about the purpose of this capture, to help the camera device to decide optimal 3A strategy.

All must be supported except for ZERO_SHUTTER_LAG and MANUAL.

Details

This control (except for MANUAL) is only effective if android.control.mode != OFF and any 3A routine is active.

ZERO_SHUTTER_LAG must be supported if android.request.availableCapabilities contains ZSL. MANUAL must be supported if android.request.availableCapabilities contains MANUAL_SENSOR.

android.control.awbState byte [public]
  • INACTIVE

    AWB is not in auto mode. When a camera device is opened, it starts in this state. This is a transient state, the camera device may skip reporting this state in capture result.

  • SEARCHING

    AWB doesn't yet have a good set of control values for the current scene. This is a transient state, the camera device may skip reporting this state in capture result.

  • CONVERGED

    AWB has a good set of control values for the current scene.

  • LOCKED

    AWB has been locked.

Current state of AWB algorithm

Details

Switching between or enabling AWB modes (android.control.awbMode) always resets the AWB state to INACTIVE. Similarly, switching between android.control.mode, or android.control.sceneMode if android.control.mode == USE_SCENE_MODE resets all the algorithm states to INACTIVE.

The camera device can do several state transitions between two results, if it is allowed by the state transition table. So INACTIVE may never actually be seen in a result.

The state in the result is the state for this image (in sync with this image): if AWB state becomes CONVERGED, then the image data associated with this result should be good to use.

Below are state transition tables for different AWB modes.

When android.control.awbMode != AWB_MODE_AUTO:

State | Transition Cause | New State | Notes
INACTIVE | | INACTIVE | Camera device auto white balance algorithm is disabled

When android.control.awbMode is AWB_MODE_AUTO:

State | Transition Cause | New State | Notes
INACTIVE | Camera device initiates AWB scan | SEARCHING | Values changing
INACTIVE | android.control.awbLock is ON | LOCKED | Values locked
SEARCHING | Camera device finishes AWB scan | CONVERGED | Good values, not changing
SEARCHING | android.control.awbLock is ON | LOCKED | Values locked
CONVERGED | Camera device initiates AWB scan | SEARCHING | Values changing
CONVERGED | android.control.awbLock is ON | LOCKED | Values locked
LOCKED | android.control.awbLock is OFF | SEARCHING | Values not good after unlock

For the above table, the camera device may skip reporting any state changes that happen without application intervention (i.e. without a mode switch, trigger, or lock change). Any state that can be skipped in that manner is called a transient state.

For example, for this AWB mode (AWB_MODE_AUTO), in addition to the state transitions listed in the above table, it is also legal for the camera device to skip one or more transient states between two results. See the table below for examples:

State | Transition Cause | New State | Notes
INACTIVE | Camera device finished AWB scan | CONVERGED | Values are already good, transient states are skipped by camera device.
LOCKED | android.control.awbLock is OFF | CONVERGED | Values good after unlock, transient states are skipped by camera device.
android.control.effectMode byte [public]
  • OFF

    No color effect will be applied.

  • MONO optional

    A "monocolor" effect where the image is mapped into a single color. This will typically be grayscale.

  • NEGATIVE optional

    A "photo-negative" effect where the image's colors are inverted.

  • SOLARIZE optional

    A "solarisation" effect (Sabattier effect) where the image is wholly or partially reversed in tone.

  • SEPIA optional

    A "sepia" effect where the image is mapped into warm gray, red, and brown tones.

  • POSTERIZE optional

    A "posterization" effect where the image uses discrete regions of tone rather than a continuous gradient of tones.

  • WHITEBOARD optional

    A "whiteboard" effect where the image is typically displayed as regions of white, with black or grey details.

  • BLACKBOARD optional

    A "blackboard" effect where the image is typically displayed as regions of black, with white or grey details.

  • AQUA optional

    An "aqua" effect where a blue hue is added to the image.

A special color effect to apply.

android.control.availableEffects

Details

When this mode is set, a color effect will be applied to images produced by the camera device. The interpretation and implementation of these color effects is left to the implementor of the camera device, and should not be depended on to be consistent (or present) across all devices.

A color effect will only be applied if android.control.mode != OFF.

android.control.mode byte [public]
  • OFF

    Full application control of pipeline. All 3A routines are disabled, no other settings in android.control.* have any effect

  • AUTO

    Use settings for each individual 3A routine. Manual control of capture parameters is disabled. All controls in android.control.* besides sceneMode take effect

  • USE_SCENE_MODE

    Use specific scene mode. Enabling this disables control.aeMode, control.awbMode and control.afMode controls; the camera device will ignore those settings while USE_SCENE_MODE is active (except for FACE_PRIORITY scene mode). Other control entries are still active. This setting can only be used if scene mode is supported (i.e. android.control.availableSceneModes contains some modes other than DISABLED).

  • OFF_KEEP_STATE

    Same as OFF mode, except that this capture will not be used by camera device background auto-exposure, auto-white balance and auto-focus algorithms to update their statistics.

Overall mode of 3A control routines.

All must be supported

Details

High-level 3A control. When set to OFF, all 3A control by the camera device is disabled. The application must set the fields for capture parameters itself.

When set to AUTO, the individual algorithm controls in android.control.* are in effect, such as android.control.afMode.

When set to USE_SCENE_MODE, the individual controls in android.control.* are mostly disabled, and the camera device implements one of the scene mode settings (such as ACTION, SUNSET, or PARTY) as it wishes. The camera device scene mode 3A settings are provided by android.control.sceneModeOverrides.

When set to OFF_KEEP_STATE, behavior is similar to OFF mode; the only difference is that this frame will not be used by the camera device's background 3A statistics update, as if the frame were never captured. This mode can be used in scenarios where the application doesn't want a 3A manual control capture to affect the subsequent auto 3A capture results.

android.control.sceneMode byte [public]
  • DISABLED 0

    Indicates that no scene modes are set for a given capture request.

  • FACE_PRIORITY

    If face detection support exists, use face detection data for auto-focus, auto-white balance, and auto-exposure routines. If face detection statistics are disabled (i.e. android.statistics.faceDetectMode is set to OFF), this should still operate correctly (but will not return face detection statistics to the framework).

    Unlike the other scene modes, android.control.aeMode, android.control.awbMode, and android.control.afMode remain active when FACE_PRIORITY is set.

  • ACTION optional

    Optimized for photos of quickly moving objects. Similar to SPORTS.

  • PORTRAIT optional

    Optimized for still photos of people.

  • LANDSCAPE optional

    Optimized for photos of distant macroscopic objects.

  • NIGHT optional

    Optimized for low-light settings.

  • NIGHT_PORTRAIT optional

    Optimized for still photos of people in low-light settings.

  • THEATRE optional

    Optimized for dim, indoor settings where flash must remain off.

  • BEACH optional

    Optimized for bright, outdoor beach settings.

  • SNOW optional

    Optimized for bright, outdoor settings containing snow.

  • SUNSET optional

    Optimized for scenes of the setting sun.

  • STEADYPHOTO optional

    Optimized to avoid blurry photos due to small amounts of device motion (for example: due to hand shake).

  • FIREWORKS optional

    Optimized for nighttime photos of fireworks.

  • SPORTS optional

    Optimized for photos of quickly moving people. Similar to ACTION.

  • PARTY optional

    Optimized for dim, indoor settings with multiple moving people.

  • CANDLELIGHT optional

    Optimized for dim settings where the main light source is a flame.

  • BARCODE optional

    Optimized for accurately capturing a photo of barcode for use by camera applications that wish to read the barcode value.

A camera mode optimized for conditions typical in a particular capture setting.

android.control.availableSceneModes

Details

This is the mode that is active when android.control.mode == USE_SCENE_MODE. Aside from FACE_PRIORITY, these modes will disable android.control.aeMode, android.control.awbMode, and android.control.afMode while in use.

The interpretation and implementation of these scene modes is left to the implementor of the camera device. Their behavior will not be consistent across all devices, and any given device may only implement a subset of these modes.

HAL Implementation Details

HAL implementations that include scene modes are expected to provide the per-scene settings to use for android.control.aeMode, android.control.awbMode, and android.control.afMode in android.control.sceneModeOverrides.
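
As a sketch of consuming those overrides, assuming the three-bytes-per-scene (aeMode, awbMode, afMode) layout documented for android.control.sceneModeOverrides, ordered to match android.control.availableSceneModes (the helper itself is hypothetical):

    #include <cstddef>
    #include <cstdint>

    // Hypothetical unpacked view of one scene mode's 3A overrides.
    struct SceneOverride { uint8_t aeMode, awbMode, afMode; };

    // Index into the flat overrides array: three bytes per entry of
    // availableSceneModes, in the same order as that list.
    static SceneOverride overrideFor(const uint8_t* overrides, size_t sceneIndex) {
        const uint8_t* p = overrides + 3 * sceneIndex;
        return SceneOverride{p[0], p[1], p[2]};
    }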

android.control.videoStabilizationMode byte [public]
  • OFF
  • ON

Whether video stabilization is active

Details

If enabled, video stabilization can modify the android.scaler.cropRegion to keep the video stream stabilized

demosaic
controls
Property Name Type Description Units Range Tags
android.demosaic.mode byte [system]
  • FAST

    Minimal or no slowdown of frame rate compared to Bayer RAW output

  • HIGH_QUALITY

    High-quality may reduce output frame rate

Controls the quality of the demosaicing processing

edge
controls
Property Name Type Description Units Range Tags
android.edge.mode byte [public]
  • OFF

    No edge enhancement is applied

  • FAST

    Must not slow down frame rate relative to sensor output

  • HIGH_QUALITY

    Frame rate may be reduced by high quality

Operation mode for edge enhancement.

Details

Edge/sharpness/detail enhancement. OFF means no enhancement will be applied by the camera device.

This must be set to one of the modes listed in android.edge.availableEdgeModes.

FAST/HIGH_QUALITY both mean camera device determined enhancement will be applied. HIGH_QUALITY mode indicates that the camera device will use the highest-quality enhancement algorithms, even if it slows down capture rate. FAST means the camera device will not slow down capture rate when applying edge enhancement.

android.edge.strength byte [system]

Control the amount of edge enhancement applied to the images

1-10; 10 is maximum sharpening
static
Property Name Type Description Units Range Tags
android.edge.availableEdgeModes byte x n [public]
list of enums

The set of edge enhancement modes supported by this camera device.

Details

This tag lists the valid modes for android.edge.mode.

Full-capability camera devices must always support OFF and FAST.

dynamic
Property Name Type Description Units Range Tags
android.edge.mode byte [public]
  • OFF

    No edge enhancement is applied

  • FAST

    Must not slow down frame rate relative to sensor output

  • HIGH_QUALITY

    Frame rate may be reduced by high quality

Operation mode for edge enhancement.

Details

Edge/sharpness/detail enhancement. OFF means no enhancement will be applied by the camera device.

This must be set to one of the modes listed in android.edge.availableEdgeModes.

FAST/HIGH_QUALITY both mean camera device determined enhancement will be applied. HIGH_QUALITY mode indicates that the camera device will use the highest-quality enhancement algorithms, even if it slows down capture rate. FAST means the camera device will not slow down capture rate when applying edge enhancement.

flash
controls
Property Name Type Description Units Range Tags
android.flash.firingPower byte [system]

Power for flash firing/torch

10 is max power; 0 is no flash. Linear

0 - 10

Details

Power for snapshot may use a different scale than for torch mode. Only one entry for torch mode will be used

android.flash.firingTime int64 [system]

Firing time of flash relative to start of exposure

nanoseconds

0-(exposure time-flash duration)

Details

Clamped to (0, exposure time - flash duration).

android.flash.mode byte [public]
  • OFF

    Do not fire the flash for this capture.

  • SINGLE

    If the flash is available and charged, fire flash for this capture.

  • TORCH

    Transition flash to continuously on.

The desired mode for the camera device's flash control.

Details

This control is only effective when flash unit is available (android.flash.info.available == true).

When this control is used, the android.control.aeMode must be set to ON or OFF. Otherwise, the camera device auto-exposure related flash control (ON_AUTO_FLASH, ON_ALWAYS_FLASH, or ON_AUTO_FLASH_REDEYE) will override this control.

When set to OFF, the camera device will not fire flash for this capture.

When set to SINGLE, the camera device will fire flash regardless of the camera device's auto-exposure routine's result. When used in still capture case, this control should be used along with AE precapture metering sequence (android.control.aePrecaptureTrigger), otherwise, the image may be incorrectly exposed.

When set to TORCH, the flash will be on continuously. This mode can be used for use cases such as preview, auto-focus assist, still capture, or video recording.

The flash status will be reported by android.flash.state in the capture result metadata.
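
For illustration, a hedged sketch of the SINGLE-flash still-capture sequence described above, reusing the metadata-entry helpers from system/media/camera (the two request buffers and their submission are assumed):

    #include <system/camera_metadata.h>

    // Hypothetical sketch: first run the AE precapture metering sequence so
    // exposure and flash power are computed with the flash in mind, then
    // force the flash on for the still frame itself.
    static void prepareFlashStill(camera_metadata_t* precaptureRequest,
                                  camera_metadata_t* stillRequest) {
        uint8_t trigger = ANDROID_CONTROL_AE_PRECAPTURE_TRIGGER_START;
        add_camera_metadata_entry(precaptureRequest,
                ANDROID_CONTROL_AE_PRECAPTURE_TRIGGER, &trigger, 1);

        // Submit the still request only after android.control.aeState shows
        // the sequence is done (CONVERGED or FLASH_REQUIRED).
        uint8_t flash = ANDROID_FLASH_MODE_SINGLE;
        add_camera_metadata_entry(stillRequest, ANDROID_FLASH_MODE, &flash, 1);
    }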

static
Property Name Type Description Units Range Tags
android.flash.info.available byte [public as boolean]
  • FALSE
  • TRUE

Whether this camera device has a flash.

Details

If no flash, none of the flash controls do anything. All other metadata should return 0.

android.flash.info.chargeDuration int64 [system]

Time taken before flash can fire again

nanoseconds

0-1e9

Details

1 second too long/too short for recharge? Should this be power-dependent?

android.flash.colorTemperature byte [system]

The x,y whitepoint of the flash

pair of floats

0-1 for both

android.flash.maxEnergy byte [system]

Max energy output of the flash for a full power single flash

lumen-seconds

>= 0

dynamic
Property Name Type Description Units Range Tags
android.flash.firingPower byte [system]

Power for flash firing/torch

10 is max power; 0 is no flash. Linear

0 - 10

Details

Power for snapshot may use a different scale than for torch mode. Only one entry for torch mode will be used

android.flash.firingTime int64 [system]

Firing time of flash relative to start of exposure

nanoseconds

0-(exposure time-flash duration)

Details

Clamped to (0, exposure time - flash duration).

android.flash.mode byte [public]
  • OFF

    Do not fire the flash for this capture.

  • SINGLE

    If the flash is available and charged, fire flash for this capture.

  • TORCH

    Transition flash to continuously on.

The desired mode for the camera device's flash control.

Details

This control is only effective when flash unit is available (android.flash.info.available == true).

When this control is used, the android.control.aeMode must be set to ON or OFF. Otherwise, the camera device auto-exposure related flash control (ON_AUTO_FLASH, ON_ALWAYS_FLASH, or ON_AUTO_FLASH_REDEYE) will override this control.

When set to OFF, the camera device will not fire flash for this capture.

When set to SINGLE, the camera device will fire flash regardless of the camera device's auto-exposure routine's result. When used in still capture case, this control should be used along with AE precapture metering sequence (android.control.aePrecaptureTrigger), otherwise, the image may be incorrectly exposed.

When set to TORCH, the flash will be on continuously. This mode can be used for use cases such as preview, auto-focus assist, still capture, or video recording.

The flash status will be reported by android.flash.state in the capture result metadata.

android.flash.state byte [public]
  • UNAVAILABLE

    No flash on camera.

  • CHARGING

    Flash is charging and cannot be fired.

  • READY

    Flash is ready to fire.

  • FIRED

    Flash fired for this capture.

  • PARTIAL

    Flash partially illuminated this frame. This is usually due to the next or previous frame having the flash fire, and the flash spilling into this capture due to hardware limitations.

Current state of the flash unit.

Details

When the camera device doesn't have flash unit (i.e. android.flash.info.available == false), this state will always be UNAVAILABLE. Other states indicate the current flash status.

hotPixel
controls
Property Name Type Description Units Range Tags
android.hotPixel.mode byte [public]
  • OFF

    The frame rate must not be reduced relative to sensor raw output for this option.

    No hot pixel correction is applied. The hotpixel map may be returned in android.statistics.hotPixelMap.

  • FAST

    The frame rate must not be reduced relative to sensor raw output for this option.

    Hot pixel correction is applied. The hotpixel map may be returned in android.statistics.hotPixelMap.

  • HIGH_QUALITY

    The frame rate may be reduced relative to sensor raw output for this option.

    A high-quality hot pixel correction is applied. The hotpixel map may be returned in android.statistics.hotPixelMap.

Set operational mode for hot pixel correction.

Details

Valid modes for this camera device are listed in android.hotPixel.availableHotPixelModes.

Hotpixel correction interpolates out, or otherwise removes, pixels that do not accurately encode the incoming light (i.e. pixels that are stuck at an arbitrary value).

static
Property Name Type Description Units Range Tags
android.hotPixel.availableHotPixelModes byte x n [public]
list of enums

The set of hot pixel correction modes that are supported by this camera device.

Details

This tag lists valid modes for android.hotPixel.mode.

FULL mode camera devices will always support FAST.

HAL Implementation Details

To avoid performance issues, there will be significantly fewer hot pixels than actual pixels on the camera sensor.

dynamic
Property Name Type Description Units Range Tags
android.hotPixel.mode byte [public]
  • OFF

    The frame rate must not be reduced relative to sensor raw output for this option.

    No hot pixel correction is applied. The hotpixel map may be returned in android.statistics.hotPixelMap.

  • FAST

    The frame rate must not be reduced relative to sensor raw output for this option.

    Hot pixel correction is applied. The hotpixel map may be returned in android.statistics.hotPixelMap.

  • HIGH_QUALITY

    The frame rate may be reduced relative to sensor raw output for this option.

    A high-quality hot pixel correction is applied. The hotpixel map may be returned in android.statistics.hotPixelMap.

Set operational mode for hot pixel correction.

Details

Valid modes for this camera device are listed in android.hotPixel.availableHotPixelModes.

Hotpixel correction interpolates out, or otherwise removes, pixels that do not accurately encode the incoming light (i.e. pixels that are stuck at an arbitrary value).

jpeg
controls
Property Name Type Description Units Range Tags
android.jpeg.gpsCoordinates double x 3 [public]
latitude, longitude, altitude. First two in degrees, the third in meters

GPS coordinates to include in output JPEG EXIF

(-180, 180], [-90, 90], [-inf, inf]

android.jpeg.gpsProcessingMethod byte [public as string]

32 characters describing GPS algorithm to include in EXIF

UTF-8 null-terminated string
android.jpeg.gpsTimestamp int64 [public]

Time GPS fix was made to include in EXIF

UTC in seconds since January 1, 1970
android.jpeg.orientation int32 [public]

Orientation of JPEG image to write

Degrees in multiples of 90

0, 90, 180, 270

android.jpeg.quality byte [public]

Compression quality of the final JPEG image

1-100; larger is higher quality

Details

85-95 is typical usage range

android.jpeg.thumbnailQuality byte [public]

Compression quality of JPEG thumbnail

1-100; larger is higher quality

android.jpeg.thumbnailSize int32 x 2 [public as size]

Resolution of embedded JPEG thumbnail

Size must be one of the sizes from android.jpeg.availableThumbnailSizes

Details

When set to (0, 0), the EXIF data of the JPEG will not contain a thumbnail, but the captured JPEG will still be a valid image.

When a JPEG image capture is issued, the thumbnail size selected should have the same aspect ratio as the JPEG image.
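
For illustration, a sketch of picking a matching thumbnail size from android.jpeg.availableThumbnailSizes; the Size type and helper are hypothetical, and the list is assumed to be sorted by increasing pixel area as described under that tag:

    #include <cstdint>
    #include <vector>

    struct Size { int32_t width, height; };  // hypothetical convenience type

    // Return the largest listed thumbnail whose aspect ratio matches the
    // JPEG output size; (0, 0) entries (no thumbnail) are skipped. Because
    // the list is sorted by increasing area, the last match wins.
    static Size pickThumbnail(const std::vector<Size>& available, Size jpeg) {
        Size best = {0, 0};
        for (const Size& s : available) {
            if (s.width == 0 || s.height == 0) continue;
            // Integer cross-multiplication avoids floating-point comparison:
            // s.width / s.height == jpeg.width / jpeg.height
            if ((int64_t)s.width * jpeg.height == (int64_t)s.height * jpeg.width)
                best = s;
        }
        return best;
    }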

static
Property Name Type Description Units Range Tags
android.jpeg.availableThumbnailSizes int32 x 2 x n [public as size]

Supported resolutions for the JPEG thumbnail

Will include at least one valid resolution, plus (0,0) for no thumbnail generation, and each size will be distinct.

Details

The conditions below will be satisfied for this size list:

  • The sizes will be sorted by increasing pixel area (width x height). If several resolutions have the same area, they will be sorted by increasing width.
  • The aspect ratio of the largest thumbnail size will be the same as the aspect ratio of the largest JPEG output size in android.scaler.availableStreamConfigurations. The largest size is defined as the size that has the largest pixel area in a given size list.
  • Each output JPEG size in android.scaler.availableStreamConfigurations will have at least one corresponding size that has the same aspect ratio in availableThumbnailSizes, and vice versa.
  • All non (0, 0) sizes will have non-zero widths and heights.
android.jpeg.maxSize int32 [system]

Maximum size in bytes for the compressed JPEG buffer

Must be large enough to fit any JPEG produced by the camera

Details

This is used for sizing the gralloc buffers for JPEG

dynamic
Property Name Type Description Units Range Tags
android.jpeg.gpsCoordinates double x 3 [public]
latitude, longitude, altitude. First two in degrees, the third in meters

GPS coordinates to include in output JPEG EXIF

(-180, 180], [-90, 90], [-inf, inf]

android.jpeg.gpsProcessingMethod byte [public as string]

32 characters describing GPS algorithm to include in EXIF

UTF-8 null-terminated string
android.jpeg.gpsTimestamp int64 [public]

Time GPS fix was made to include in EXIF

UTC in seconds since January 1, 1970
android.jpeg.orientation int32 [public]

Orientation of JPEG image to write

Degrees in multiples of 90

0, 90, 180, 270

android.jpeg.quality byte [public]

Compression quality of the final JPEG image

1-100; larger is higher quality

Details

85-95 is typical usage range

android.jpeg.size int32 [system]

The size of the compressed JPEG image, in bytes

>= 0

Details

If no JPEG output is produced for the request, this must be 0.

Otherwise, this describes the real size of the compressed JPEG image placed in the output stream. More specifically, if android.jpeg.maxSize = 1000000, and a specific capture has android.jpeg.size = 500000, then the output buffer from the JPEG stream will be 1000000 bytes, of which the first 500000 make up the real data.
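
A minimal sketch of a consumer applying those semantics, copying only the valid prefix of the stream buffer (the buffer pointer and size plumbing are assumed):

    #include <cstdint>
    #include <cstring>
    #include <vector>

    // The JPEG stream buffer is android.jpeg.maxSize bytes, but only the
    // first android.jpeg.size bytes hold real compressed data; copy just
    // that prefix.
    static std::vector<uint8_t> extractJpeg(const uint8_t* streamBuffer,
                                            int32_t jpegSize) {
        std::vector<uint8_t> jpeg(static_cast<size_t>(jpegSize));
        std::memcpy(jpeg.data(), streamBuffer, static_cast<size_t>(jpegSize));
        return jpeg;
    }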

android.jpeg.thumbnailQuality byte [public]

Compression quality of JPEG thumbnail

1-100; larger is higher quality

android.jpeg.thumbnailSize int32 x 2 [public as size]

Resolution of embedded JPEG thumbnail

Size must be one of the sizes from android.jpeg.availableThumbnailSizes

Details

When set to (0, 0), the EXIF data of the JPEG will not contain a thumbnail, but the captured JPEG will still be a valid image.

When a JPEG image capture is issued, the thumbnail size selected should have the same aspect ratio as the JPEG image.

lens
controls
Property Name Type Description Units Range Tags
android.lens.aperture float [public]

The ratio of lens focal length to the effective aperture diameter.

f-number (f/NNN)

android.lens.info.availableApertures

Details

This will only be supported on the camera devices that have variable aperture lens. The aperture value can only be one of the values listed in android.lens.info.availableApertures.

When this is supported and android.control.aeMode is OFF, this can be set along with android.sensor.exposureTime, android.sensor.sensitivity, and android.sensor.frameDuration to achieve manual exposure control.

The requested aperture value may take several frames to reach the requested value; the camera device will report the current (intermediate) aperture size in capture result metadata while the aperture is changing. While the aperture is still changing, android.lens.state will be set to MOVING.

When this is supported and android.control.aeMode is one of the ON modes, this will be overridden by the camera device auto-exposure algorithm, the overridden values are then provided back to the user in the corresponding result.

android.lens.filterDensity float [public]

State of lens neutral density filter(s).

Steps of Exposure Value (EV).

android.lens.info.availableFilterDensities

Details

This will not be supported on most camera devices. On devices where this is supported, this may only be set to one of the values included in android.lens.info.availableFilterDensities.

Lens filters are typically used to lower the amount of light the sensor is exposed to (measured in steps of EV). As used here, an EV step is the standard logarithmic representation, which is non-negative and inversely proportional to the amount of light hitting the sensor. For example, setting this to 0 would result in no reduction of the incoming light, and setting this to 2 would mean that the filter is set to reduce incoming light by two stops (allowing 1/4 of the prior amount of light to the sensor).
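
That relationship can be written directly; a one-line sketch (the helper name is hypothetical):

    #include <cmath>

    // Fraction of incoming light passed by an ND filter of the given density,
    // in EV steps: each step halves the light, so 0 passes everything and
    // 2 passes 1/4 of it.
    static double filterTransmission(double densityEv) {
        return std::pow(2.0, -densityEv);  // e.g. 2.0 -> 0.25
    }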

It may take several frames before the lens filter density changes to the requested value. While the filter density is still changing, android.lens.state will be set to MOVING.

android.lens.focalLength float [public]

The current lens focal length; used for optical zoom.

focal length in mm

android.lens.info.availableFocalLengths

Details

This setting controls the physical focal length of the camera device's lens. Changing the focal length changes the field of view of the camera device, and is usually used for optical zoom.

Like android.lens.focusDistance and android.lens.aperture, this setting won't be applied instantaneously, and it may take several frames before the lens can change to the requested focal length. While the focal length is still changing, android.lens.state will be set to MOVING.

This is expected not to be supported on most devices.

android.lens.focusDistance float [public]

Distance to plane of sharpest focus, measured from frontmost surface of the lens

See android.lens.info.focusDistanceCalibration for details.

>= 0

Details

0 means infinity focus. Used value will be clamped to [0, android.lens.info.minimumFocusDistance].

Like android.lens.focalLength, this setting won't be applied instantaneously, and it may take several frames before the lens can move to the requested focus distance. While the lens is still moving, android.lens.state will be set to MOVING.

android.lens.opticalStabilizationMode byte [public]
  • OFF

    Optical stabilization is unavailable.

  • ON optional

    Optical stabilization is enabled.

Sets whether the camera device uses optical image stabilization (OIS) when capturing images.

android.lens.info.availableOpticalStabilization

Details

OIS is used to compensate for motion blur due to small movements of the camera during capture. Unlike digital image stabilization, OIS makes use of mechanical elements to stabilize the camera sensor, and thus allows for longer exposure times before camera shake becomes apparent.

This is not expected to be supported on most devices.

static
Property Name Type Description Units Range Tags
android.lens.info.availableApertures float x n [public]

List of supported aperture values.

one entry required, > 0

Details

If the camera device doesn't support variable apertures, the listed value will be the fixed aperture.

If the camera device supports variable apertures, the aperture values in this list will be sorted in ascending order.

android.lens.info.availableFilterDensities float x n [public]

List of supported neutral density filter values for android.lens.filterDensity.

At least one value is required. Values must be >= 0.

Details

If changing android.lens.filterDensity is not supported, availableFilterDensities must contain only 0. Otherwise, this list contains only the exact filter density values available on this camera device.

android.lens.info.availableFocalLengths float x n [public]
The list of available focal lengths

The available focal lengths for this device for use with android.lens.focalLength.

Each value in this list must be > 0. This list must contain at least one value.

Details

If optical zoom is not supported, this will only report a single value corresponding to the static focal length of the device. Otherwise, this will report every focal length supported by the device.

android.lens.info.availableOpticalStabilization byte x n [public]
list of enums

List containing a subset of the optical image stabilization (OIS) modes specified in android.lens.opticalStabilizationMode.

Details

If OIS is not implemented for a given camera device, this should contain only OFF.

android.lens.info.hyperfocalDistance float [public]

Optional. Hyperfocal distance for this lens.

See android.lens.info.focusDistanceCalibration for details.

If lens is fixed focus, >= 0. If lens has focuser unit, the range is (0, android.lens.info.minimumFocusDistance]

Details

If the lens is not fixed focus, the camera device will report this field when android.lens.info.focusDistanceCalibration is APPROXIMATE or CALIBRATED.

android.lens.info.minimumFocusDistance float [public]

Shortest distance from frontmost surface of the lens that can be focused correctly.

See android.lens.info.focusDistanceCalibration for details.

>= 0

Details

If the lens is fixed-focus, this should be 0.

android.lens.info.shadingMapSize int32 x 2 [public as size]
width and height (N, M) of lens shading map provided by the camera device.

Dimensions of lens shading map.

Both values >= 1

Details

The map should be on the order of 30-40 rows and columns, and must be smaller than 64x64.

android.lens.info.focusDistanceCalibration byte [public]
  • UNCALIBRATED

    The lens focus distance is not accurate, and the units used for android.lens.focusDistance do not correspond to any physical units. Setting the lens to the same focus distance on separate occasions may result in a different real focus distance, depending on factors such as the orientation of the device, the age of the focusing mechanism, and the device temperature. The focus distance value will still be in the range of [0, android.lens.info.minimumFocusDistance], where 0 represents the farthest focus.

  • APPROXIMATE

    The lens focus distance is measured in diopters. However, setting the lens to the same focus distance on separate occasions may result in a different real focus distance, depending on factors such as the orientation of the device, the age of the focusing mechanism, and the device temperature.

  • CALIBRATED

    The lens focus distance is measured in diopters. The lens mechanism is calibrated so that setting the same focus distance is repeatable on multiple occasions with good accuracy, and the focus distance corresponds to the real physical distance to the plane of best focus.

The lens focus distance calibration quality.

Details

The lens focus distance calibration quality determines the reliability of focus related metadata entries, i.e. android.lens.focusDistance, android.lens.focusRange, android.lens.info.hyperfocalDistance, and android.lens.info.minimumFocusDistance.

android.lens.facing byte [public]
  • FRONT
  • BACK

Direction the camera faces relative to device screen

android.lens.opticalAxisAngle float x 2 [system]
degrees. First defines the angle of separation between the perpendicular to the screen and the camera optical axis. The second then defines the clockwise rotation of the optical axis from native device up.

Relative angle of camera optical axis to the perpendicular axis from the display

[0-90) for first angle, [0-360) for second

Details

Examples:

(0,0) means that the camera optical axis is perpendicular to the display surface;

(45,0) means that the camera points 45 degrees up when device is held upright;

(45,90) means the camera points 45 degrees to the right when the device is held upright.

Use FACING field to determine perpendicular outgoing direction

android.lens.position float x 3, location in mm, in the sensor coordinate system [system]

Coordinates of camera optical axis on device

dynamic
Property Name Type Description Units Range Tags
android.lens.aperture float [public]

The ratio of lens focal length to the effective aperture diameter.

f-number (f/NNN)

android.lens.info.availableApertures

Details

This will only be supported on the camera devices that have variable aperture lens. The aperture value can only be one of the values listed in android.lens.info.availableApertures.

When this is supported and android.control.aeMode is OFF, this can be set along with android.sensor.exposureTime, android.sensor.sensitivity, and android.sensor.frameDuration to achieve manual exposure control.

The requested aperture value may take several frames to reach the requested value; the camera device will report the current (intermediate) aperture size in capture result metadata while the aperture is changing. While the aperture is still changing, android.lens.state will be set to MOVING.

When this is supported and android.control.aeMode is one of the ON modes, this will be overridden by the camera device auto-exposure algorithm, the overridden values are then provided back to the user in the corresponding result.

android.lens.filterDensity float [public]

State of lens neutral density filter(s).

Steps of Exposure Value (EV).

android.lens.info.availableFilterDensities

Details

This will not be supported on most camera devices. On devices where this is supported, this may only be set to one of the values included in android.lens.info.availableFilterDensities.

Lens filters are typically used to lower the amount of light the sensor is exposed to (measured in steps of EV). As used here, an EV step is the standard logarithmic representation, which is non-negative and inversely proportional to the amount of light hitting the sensor. For example, setting this to 0 would result in no reduction of the incoming light, and setting this to 2 would mean that the filter is set to reduce incoming light by two stops (allowing 1/4 of the prior amount of light to the sensor).

It may take several frames before the lens filter density changes to the requested value. While the filter density is still changing, android.lens.state will be set to MOVING.

android.lens.focalLength float [public]

The current lens focal length; used for optical zoom.

focal length in mm

android.lens.info.availableFocalLengths

Details

This setting controls the physical focal length of the camera device's lens. Changing the focal length changes the field of view of the camera device, and is usually used for optical zoom.

Like android.lens.focusDistance and android.lens.aperture, this setting won't be applied instantaneously, and it may take several frames before the lens can change to the requested focal length. While the focal length is still changing, android.lens.state will be set to MOVING.

This is expected not to be supported on most devices.

android.lens.focusDistance float [public]

Distance to plane of sharpest focus, measured from frontmost surface of the lens

See android.lens.info.focusDistanceCalibration for details.

>= 0

Details

Should be zero for fixed-focus cameras

android.lens.focusRange float x 2 [public as rangeFloat]
Range of scene distances that are in focus

The range of scene distances that are in sharp focus (depth of field)

pair of focus distances in diopters: (near, far), see android.lens.info.focusDistanceCalibration for details.

>=0

Details

If variable focus not supported, can still report fixed depth of field range

android.lens.opticalStabilizationMode byte [public]
  • OFF

    Optical stabilization is unavailable.

  • ON optional

    Optical stabilization is enabled.

Sets whether the camera device uses optical image stabilization (OIS) when capturing images.

android.lens.info.availableOpticalStabilization

Details

OIS is used to compensate for motion blur due to small movements of the camera during capture. Unlike digital image stabilization, OIS makes use of mechanical elements to stabilize the camera sensor, and thus allows for longer exposure times before camera shake becomes apparent.

This is not expected to be supported on most devices.

android.lens.state byte [public]

Current lens status.

Details

For lens parameters android.lens.focalLength, android.lens.focusDistance, android.lens.filterDensity and android.lens.aperture, when changes are requested, they may take several frames to reach the requested values. This state indicates the current status of the lens parameters.

When the state is STATIONARY, the lens parameters are not changing. This could be either because the parameters are all fixed, or because the lens has had enough time to reach the most recently-requested values. If none of these lens parameters are changeable for a camera device, then this state will always be STATIONARY.

When the state is MOVING, it indicates that at least one of the lens parameters is changing.

noiseReduction
controls
Property Name Type Description Units Range Tags
android.noiseReduction.mode byte [public]
  • OFF

    No noise reduction is applied

  • FAST

    Must not slow down frame rate relative to sensor output

  • HIGH_QUALITY

    May slow down frame rate to provide highest quality

Mode of operation for the noise reduction algorithm

Details

Noise filtering control. OFF means no noise reduction will be applied by the camera device.

This must be set to a valid mode in android.noiseReduction.availableNoiseReductionModes.

FAST/HIGH_QUALITY both mean camera device determined noise filtering will be applied. HIGH_QUALITY mode indicates that the camera device will use the highest-quality noise filtering algorithms, even if it slows down capture rate. FAST means the camera device should not slow down capture rate when applying noise filtering.

android.noiseReduction.strength byte [system]

Control the amount of noise reduction applied to the images

1-10; 10 is max noise reduction

1 - 10

static
Property Name Type Description Units Range Tags
android.noiseReduction.availableNoiseReductionModes byte x n [public]
list of enums

The set of noise reduction modes supported by this camera device.

Details

This tag lists the valid modes for android.noiseReduction.mode.

Full-capability camera devices must always support OFF and FAST.

dynamic
Property Name Type Description Units Range Tags
android.noiseReduction.mode byte [public]
  • OFF

    No noise reduction is applied

  • FAST

    Must not slow down frame rate relative to sensor output

  • HIGH_QUALITY

    May slow down frame rate to provide highest quality

Mode of operation for the noise reduction algorithm

Details

Noise filtering control. OFF means no noise reduction will be applied by the camera device.

This must be set to a valid mode in android.noiseReduction.availableNoiseReductionModes.

FAST/HIGH_QUALITY both mean camera device determined noise filtering will be applied. HIGH_QUALITY mode indicates that the camera device will use the highest-quality noise filtering algorithms, even if it slows down capture rate. FAST means the camera device should not slow down capture rate when applying noise filtering.

quirks
static
Property Name Type Description Units Range Tags
android.quirks.meteringCropRegion byte [system] [deprecated]

If set to 1, the camera service does not scale 'normalized' coordinates with respect to the crop region. This applies to metering input (a{e,f,wb}Region) and output (face rectangles).

Deprecated. Do not use.

Details

Normalized coordinates refer to those in the (-1000,1000) range mentioned in the android.hardware.Camera API.

HAL implementations should instead always use and emit sensor array-relative coordinates for all region data. Does not need to be listed in static metadata. Support will be removed in future versions of camera service.

android.quirks.triggerAfWithAuto byte [system] [deprecated]

If set to 1, then the camera service always switches to FOCUS_MODE_AUTO before issuing an AF trigger.

Deprecated. Do not use.

Details

HAL implementations should implement AF trigger modes for AUTO, MACRO, CONTINUOUS_FOCUS, and CONTINUOUS_PICTURE modes instead of using this flag. Does not need to be listed in static metadata. Support will be removed in future versions of camera service.

android.quirks.useZslFormat byte [system] [deprecated]

If set to 1, the camera service uses CAMERA2_PIXEL_FORMAT_ZSL instead of HAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED for the zero shutter lag stream

Deprecated. Do not use.

Details

HAL implementations should use gralloc usage flags to determine that a stream will be used for zero-shutter-lag, instead of relying on an explicit format setting. Does not need to be listed in static metadata. Support will be removed in future versions of camera service.

android.quirks.usePartialResult byte [hidden] [deprecated]

If set to 1, the HAL will always split result metadata for a single capture into multiple buffers, returned using multiple process_capture_result calls.

Deprecated. Do not use.

Details

Does not need to be listed in static metadata. Support for partial results will be reworked in future versions of camera service. This quirk will stop working at that point; DO NOT USE without careful consideration of future support.

HAL Implementation Details

Refer to camera3_capture_result::partial_result for information on how to implement partial results.

dynamic
Property Name Type Description Units Range Tags
android.quirks.partialResult byte [hidden as boolean] [deprecated]
  • FINAL

    The last or only metadata result buffer for this capture.

  • PARTIAL

    A partial buffer of result metadata for this capture. More result buffers for this capture will be sent by the camera device, the last of which will be marked FINAL.

Whether a result given to the framework is the final one for the capture, or only a partial that contains a subset of the full set of dynamic metadata values.

Deprecated. Do not use.

Optional. Default value is FINAL.

Details

The entries in the result metadata buffers for a single capture may not overlap, except for this entry. The FINAL buffers must retain FIFO ordering relative to the requests that generate them, so the FINAL buffer for frame 3 must always be sent to the framework after the FINAL buffer for frame 2, and before the FINAL buffer for frame 4. PARTIAL buffers may be returned in any order relative to other frames, but all PARTIAL buffers for a given capture must arrive before the FINAL buffer for that capture. This entry may only be used by the camera device if quirks.usePartialResult is set to 1.

HAL Implementation Details

Refer to camera3_capture_result::partial_result for information on how to implement partial results.
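
For illustration, a sketch of a HAL splitting one capture's metadata into an early partial buffer plus a final buffer via camera3_capture_result::partial_result (struct and callback names are from camera3.h; the helper itself is hypothetical, and buffer handling is elided):

#include <stddef.h>
#include <hardware/camera3.h>

// Sketch: send an early metadata buffer (e.g. 3A state) followed by
// the final buffer for the same frame. Assumes the static metadata
// advertised android.request.partialResultCount == 2.
static void send_split_result(const camera3_callback_ops_t *ops,
                              uint32_t frame_number,
                              const camera_metadata_t *early_metadata,
                              const camera_metadata_t *final_metadata) {
    camera3_capture_result_t partial = {
        .frame_number = frame_number,
        .result = early_metadata,
        .num_output_buffers = 0,
        .output_buffers = NULL,
        .input_buffer = NULL,
        .partial_result = 1,   // a partial, not yet the final buffer
    };
    ops->process_capture_result(ops, &partial);

    camera3_capture_result_t final_buf = {
        .frame_number = frame_number,
        .result = final_metadata,  // remaining metadata; no overlap
        .num_output_buffers = 0,
        .output_buffers = NULL,
        .input_buffer = NULL,
        .partial_result = 2,   // == partialResultCount, marks the final
    };
    ops->process_capture_result(ops, &final_buf);
}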

request
controls
Property Name Type Description Units Range Tags
android.request.frameCount int32 [system] [deprecated]

A frame counter set by the framework. Must be maintained unchanged in output frame. This value monotonically increases with every new result (that is, each new result has a unique frameCount value).

incrementing integer

Deprecated. Do not use.

Any int.

android.request.id int32 [hidden]

An application-specified ID for the current request. Must be maintained unchanged in output frame

arbitrary integer assigned by application

Any int

android.request.inputStreams int32 x n [system] [deprecated]

List which camera reprocess stream is used for the source of reprocessing data.

List of camera reprocess stream IDs

Deprecated. Do not use.

Typically, only one entry allowed, must be a valid reprocess stream ID.

Details

Only meaningful when android.request.type == REPROCESS. Ignored otherwise

android.request.metadataMode byte [system]
  • NONE

    No metadata should be produced on output, except for application-bound buffer data. If no application-bound streams exist, no frame should be placed in the output frame queue. If such streams exist, a frame should be placed on the output queue with null metadata but with the necessary output buffer information. Timestamp information should still be included with any output stream buffers

  • FULL

    All metadata should be produced. Statistics will only be produced if they are separately enabled

How much metadata to produce on output

android.request.outputStreams int32 x n [system] [deprecated]

Lists which camera output streams image data from this capture must be sent to

List of camera stream IDs

Deprecated. Do not use.

List must only include streams that have been created

Details

If no output streams are listed, then the image data should simply be discarded. The image data must still be captured for metadata and statistics production, and the lens and flash must operate as requested.

android.request.type byte [system] [deprecated]
  • CAPTURE

    Capture a new image from the imaging hardware, and process it according to the settings

  • REPROCESS

    Process previously captured data; the android.request.inputStreams parameter determines the source reprocessing stream. TODO: Mark dynamic metadata needed for reprocessing with [RP]

The type of the request; either CAPTURE or REPROCESS. For HAL3, this tag is redundant.

Deprecated. Do not use.

static
Property Name Type Description Units Range Tags
android.request.maxNumOutputStreams int32 x 3 [hidden]

The maximum numbers of different types of output streams that can be configured and used simultaneously by a camera device.

For processed (and stalling) format streams, >= 1.

For Raw format (either stalling or non-stalling) streams, >= 0.

For processed (but not stalling) format streams, >= 3 for FULL mode devices (android.info.supportedHardwareLevel == FULL); >= 2 for LIMITED mode devices (android.info.supportedHardwareLevel == LIMITED).

Details

This is a 3 element tuple that contains the max number of output simultaneous streams for raw sensor, processed (but not stalling), and processed (and stalling) formats respectively. For example, assuming that JPEG is typically a processed and stalling stream, if max raw sensor format output stream number is 1, max YUV streams number is 3, and max JPEG stream number is 2, then this tuple should be (1, 3, 2).

This lists the upper bound of the number of output streams supported by the camera device. Using more streams simultaneously may require more hardware and CPU resources that will consume more power. The image format for an output stream can be any supported format provided by android.scaler.availableStreamConfigurations. The formats defined in android.scaler.availableStreamConfigurations can be categorized into the 3 stream types as follows:

  • Processed (and stalling): any non-RAW format with a stall duration > 0. Typically JPEG format (ImageFormat#JPEG).
  • Raw formats: ImageFormat#RAW_SENSOR and ImageFormat#RAW_OPAQUE.
  • Processed (but not-stalling): any non-RAW format without a stall duration. Typically ImageFormat#YUV_420_888, ImageFormat#NV21, ImageFormat#YV12.
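
A read-out sketch for the 3-element tuple described above (hypothetical helper; assumes the ANDROID_REQUEST_MAX_NUM_OUTPUT_STREAMS constant and the camera_metadata.h helpers):

#include <stdint.h>
#include <system/camera_metadata.h>

// Sketch: unpack (raw, processed non-stalling, processed stalling)
// stream counts, e.g. (1, 3, 2) in the example above.
static int get_max_output_streams(const camera_metadata_t *static_info,
                                  int32_t *raw, int32_t *proc,
                                  int32_t *proc_stalling) {
    camera_metadata_ro_entry_t entry;
    if (find_camera_metadata_ro_entry(
            static_info, ANDROID_REQUEST_MAX_NUM_OUTPUT_STREAMS,
            &entry) != 0 || entry.count != 3) {
        return -1;
    }
    *raw = entry.data.i32[0];
    *proc = entry.data.i32[1];
    *proc_stalling = entry.data.i32[2];
    return 0;
}
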
android.request.maxNumOutputRaw int32 [public] [synthetic]

The maximum numbers of different types of output streams that can be configured and used simultaneously by a camera device for any RAW formats.

>= 0

Details

This value contains the max number of output simultaneous streams from the raw sensor.

This lists the upper bound of the number of output streams supported by the camera device. Using more streams simultaneously may require more hardware and CPU resources that will consume more power. The image format for this kind of an output stream can be any RAW and supported format provided by android.scaler.streamConfigurationMap.

In particular, a RAW format is typically one of:

  • ImageFormat#RAW_SENSOR
  • Opaque RAW
android.request.maxNumOutputProc int32 [public] [synthetic]

The maximum numbers of different types of output streams that can be configured and used simultaneously by a camera device for any processed (but not-stalling) formats.

>= 3 for FULL mode devices (android.info.supportedHardwareLevel == FULL); >= 2 for LIMITED mode devices (android.info.supportedHardwareLevel == LIMITED).

Details

This value contains the max number of output simultaneous streams for any processed (but not-stalling) formats.

This lists the upper bound of the number of output streams supported by the camera device. Using more streams simultaneously may require more hardware and CPU resources that will consume more power. The image format for this kind of an output stream can be any non-RAW and supported format provided by android.scaler.streamConfigurationMap.

Processed (but not-stalling) is defined as any non-RAW format without a stall duration. Typically:

  • ImageFormat#YUV_420_888
  • ImageFormat#NV21
  • ImageFormat#YV12
  • Implementation-defined formats, i.e. StreamConfiguration#isOutputSupportedFor(Class)

For full guarantees, query StreamConfigurationMap#getOutputStallDuration with a processed format -- it will return 0 for a non-stalling stream.

android.request.maxNumOutputProcStalling int32 [public] [synthetic]

The maximum numbers of different types of output streams that can be configured and used simultaneously by a camera device for any processed (and stalling) formats.

>= 1

Details

This value contains the max number of output simultaneous streams for any processed (and stalling) formats.

This lists the upper bound of the number of output streams supported by the camera device. Using more streams simultaneously may require more hardware and CPU resources that will consume more power. The image format for this kind of an output stream can be any non-RAW and supported format provided by android.scaler.streamConfigurationMap.

A processed and stalling format is defined as any non-RAW format with a stall duration > 0. Typically this is only the JPEG format (ImageFormat#JPEG).

For full guarantees, query StreamConfigurationMap#getOutputStallDuration with a processed format -- it will return a non-0 value for a stalling stream.

android.request.maxNumReprocessStreams int32 x 1 [system] [deprecated]

How many reprocessing streams of any type can be allocated at the same time.

Deprecated. Do not use.

>= 0

Details

Only used by HAL2.x.

When set to 0, it means no reprocess stream is supported.

android.request.maxNumInputStreams int32 [public]

The maximum numbers of any type of input streams that can be configured and used simultaneously by a camera device.

>= 0 for LIMITED mode device (android.info.supportedHardwareLevel == LIMITED). >= 1 for FULL mode device (android.info.supportedHardwareLevel == FULL).

Details

When set to 0, it means no input stream is supported.

The image format for an input stream can be any supported format provided by android.scaler.availableInputOutputFormatsMap. When using an input stream, there must be at least one output stream configured to receive the reprocessed images.

For example, for Zero Shutter Lag (ZSL) still capture use case, the input stream image format will be RAW_OPAQUE, the associated output stream image format should be JPEG.

android.request.pipelineMaxDepth byte [public]

Specifies the maximum number of pipeline stages a frame has to go through from when it's exposed to when it's available to the framework.

Details

A typical minimum value for this is 2 (one stage to expose, one stage to readout) from the sensor. The ISP then usually adds its own stages to do custom HW processing. Further stages may be added by SW processing.

Depending on what settings are used (e.g. YUV, JPEG) and what processing is enabled (e.g. face detection), the actual pipeline depth (specified by android.request.pipelineDepth) may be less than the max pipeline depth.

A pipeline depth of X stages is equivalent to a pipeline latency of X frame intervals.

This value will be 8 or less.

HAL Implementation Details

This value should be 4 or less.

android.request.partialResultCount int32 [public]

Optional. Defaults to 1. Defines how many sub-components a result will be composed of.

>= 1

Details

In order to combat the pipeline latency, partial results may be delivered to the application layer from the camera device as soon as they are available.

A value of 1 means that partial results are not supported.

A typical use case for this might be: after requesting an AF lock the new AF state might be available 50% of the way through the pipeline. The camera device could then immediately dispatch this state via a partial result to the framework/application layer, and the rest of the metadata via later partial results.

android.request.availableCapabilities byte [public]
  • BACKWARD_COMPATIBLE

    The minimal set of capabilities that every camera device (regardless of android.info.supportedHardwareLevel) will support.

    The full set of features supported by this capability makes the camera2 api backwards compatible with the camera1 (android.hardware.Camera) API.

    TODO: @hide this. Doesn't really mean anything except act as a catch-all for all the 'base' functionality.

  • OPTIONAL

    This is a catch-all capability to include all other tags or functionality not encapsulated by one of the other capabilities.

    A typical example is all tags marked 'optional'.

    TODO: @hide. We may not need this if we @hide all the optional tags not belonging to a capability.

  • MANUAL_SENSOR

The camera device can be manually controlled (3A algorithms such as auto exposure and auto focus can be bypassed); this includes but is not limited to manual sensor, lens, and flash control:

    If any of the above 3A algorithms are enabled, then the camera device will accurately report the values applied by 3A in the result.

  • GCAM optional

    TODO: This should be @hide

    If auto white balance is enabled, then the camera device will accurately report the values applied by AWB in the result.

    The camera device will also support everything in MANUAL_SENSOR except manual lens control and manual flash control.

  • ZSL

    The camera device supports the Zero Shutter Lag use case.

    • At least one input stream can be used.
    • RAW_OPAQUE is supported as an output/input format
    • Using RAW_OPAQUE does not cause a frame rate drop relative to the sensor's maximum capture rate (at that resolution).
    • RAW_OPAQUE will be reprocessable into both YUV_420_888 and JPEG formats.
    • The maximum available resolution for RAW_OPAQUE streams (both input/output) will match the maximum available resolution of JPEG streams.
  • DNG optional

    The camera device supports outputting RAW buffers that can be saved offline into a DNG format. It can reprocess DNG files (produced from the same camera device) back into YUV.

    • At least one input stream can be used.
    • RAW16 is supported as output/input format.
    • RAW16 is reprocessable into both YUV_420_888 and JPEG formats.
    • The maximum available resolution for RAW16 streams (both input/output) will match the value in android.sensor.info.pixelArraySize.
    • All DNG-related optional metadata entries are provided by the camera device.

List of capabilities that the camera device advertises as fully supporting.

Details

A capability is a contract that the camera device makes in order to be able to satisfy one or more use cases.

Listing a capability guarantees that the whole set of features required to support a common use case will all be available.

Using a subset of the functionality provided by an unsupported capability may be possible on a specific camera device implementation; to do this, query each of android.request.availableRequestKeys, android.request.availableResultKeys, and android.request.availableCharacteristicsKeys.

XX: Maybe these should go into android.info.supportedHardwareLevel as a table instead?

The following capabilities are guaranteed to be available on android.info.supportedHardwareLevel == FULL devices:

  • MANUAL_SENSOR
  • ZSL

Other capabilities may be available on either FULL or LIMITED devices, but the app should query this field to be sure.

HAL Implementation Details

Additional constraint details per-capability will be available in the Compatibility Test Suite.

BACKWARD_COMPATIBLE capability requirements are not explicitly listed. Instead refer to "BC" tags and the camera CTS tests in the android.hardware.cts package.

Listed controls that can be either request or result (e.g. android.sensor.exposureTime) must be available both in the request and the result in order to be considered to be capability-compliant.

For example, if the HAL claims to support MANUAL control, then exposure time must be configurable via the request and the actual exposure applied must be available via the result.

android.request.availableRequestKeys int32 x n [hidden]

A list of all keys that the camera device has available to use with CaptureRequest.

Details

Attempting to set a key into a CaptureRequest that is not listed here will result in an invalid request and will be rejected by the camera device.

This field can be used to query the feature set of a camera device at a more granular level than capabilities. This is especially important for optional keys that are not listed under any capability in android.request.availableCapabilities.

TODO: This should be used by #getAvailableCaptureRequestKeys.

HAL Implementation Details

Vendor tags must not be listed here. Use the vendor tag metadata extensions C api instead (refer to camera3.h for more details).

Setting/getting vendor tags will be checked against the metadata vendor extensions API and not against this field.

The HAL must not consume any request tags that are not listed either here or in the vendor tag list.

The public camera2 API will always make the vendor tags visible via CameraCharacteristics#getAvailableCaptureRequestKeys.

android.request.availableResultKeys int32 x n [hidden]

A list of all keys that the camera device has available to use with CaptureResult.

Details

Attempting to get a key from a CaptureResult that is not listed here will always return a null value. Getting a key from a CaptureResult that is listed here must never return a null value.

Some keys may return null unless the corresponding feature is enabled in the request; those sometimes-null keys should nevertheless be listed here if they are available.

This field can be used to query the feature set of a camera device at a more granular level than capabilities. This is especially important for optional keys that are not listed under any capability in android.request.availableCapabilities.

TODO: This should be used by #getAvailableCaptureResultKeys.

HAL Implementation Details

Tags listed here must always have an entry in the result metadata, even if that size is 0 elements. Only array-type tags (e.g. lists, matrices, strings) are allowed to have 0 elements.

Vendor tags must not be listed here. Use the vendor tag metadata extensions C api instead (refer to camera3.h for more details).

Setting/getting vendor tags will be checked against the metadata vendor extensions API and not against this field.

The HAL must not produce any result tags that are not listed either here or in the vendor tag list.

The public camera2 API will always make the vendor tags visible via CameraCharacteristics#getAvailableCaptureResultKeys.

android.request.availableCharacteristicsKeys int32 x n [hidden]

A list of all keys that the camera device has available to use with CameraCharacteristics.

Details

This entry follows the same rules as android.request.availableResultKeys (except that it applies for CameraCharacteristics instead of CaptureResult). See above for more details.

TODO: This should be used by CameraCharacteristics#getKeys.

HAL Implementation Details

Tags listed here must always have an entry in the static info metadata, even if that size is 0 elements. Only array-type tags (e.g. lists, matrices, strings) are allowed to have 0 elements.

Vendor tags must not be listed here. Use the vendor tag metadata extensions C api instead (refer to camera3.h for more details).

Setting/getting vendor tags will be checked against the metadata vendor extensions API and not against this field.

The HAL must not have any tags in its static info that are not listed either here or in the vendor tag list.

The public camera2 API will always make the vendor tags visible via CameraCharacteristics#getKeys.

dynamic
Property Name Type Description Units Range Tags
android.request.frameCount int32 [public]

A frame counter set by the framework. This value monotonically increases with every new result (that is, each new result has a unique frameCount value).

count of frames

> 0

Details

Reset on release()

android.request.id int32 [hidden]

An application-specified ID for the current request. Must be maintained unchanged in output frame

arbitrary integer assigned by application

Any int

android.request.metadataMode byte [system]
  • NONE

    No metadata should be produced on output, except for application-bound buffer data. If no application-bound streams exist, no frame should be placed in the output frame queue. If such streams exist, a frame should be placed on the output queue with null metadata but with the necessary output buffer information. Timestamp information should still be included with any output stream buffers

  • FULL

    All metadata should be produced. Statistics will only be produced if they are separately enabled

How much metadata to produce on output

android.request.outputStreams int32 x n [system] [deprecated]

Lists which camera output streams image data from this capture must be sent to

List of camera stream IDs

Deprecated. Do not use.

List must only include streams that have been created

Details

If no output streams are listed, then the image data should simply be discarded. The image data must still be captured for metadata and statistics production, and the lens and flash must operate as requested.

android.request.pipelineDepth byte [public]

Specifies the number of pipeline stages the frame went through from when it was exposed to when the final completed result was available to the framework.

<= android.request.pipelineMaxDepth

Details

Depending on what settings are used in the request, and what streams are configured, the data may undergo less processing, and some pipeline stages skipped.

See android.request.pipelineMaxDepth for more details.

HAL Implementation Details

This value must always represent the accurate count of how many pipeline stages were actually used.

scaler
controls
Property Name Type Description Units Range Tags
android.scaler.cropRegion int32 x 4 [public as rectangle]

(x, y, width, height).

A rectangle with the top-left corner at (x, y) and size (width, height). The region of the sensor that is used for output. Each stream must use this rectangle to produce its output, cropping to a smaller region if necessary to maintain the stream's aspect ratio.

HAL2.x uses only (x, y, width)

(x,y) of top-left corner, width and height of region in pixels; (0,0) is top-left corner of android.sensor.activeArraySize
Details

The crop region is applied after the RAW to other color space (e.g. YUV) conversion. Since raw streams (e.g. RAW16) don't have this conversion stage, they are not croppable; the crop region will be ignored by raw streams.

For non-raw streams, any additional per-stream cropping will be done to maximize the final pixel area of the stream.

For example, if the crop region is set to a 4:3 aspect ratio, then 4:3 streams should use the exact crop region. 16:9 streams should further crop vertically (letterbox).

Conversely, if the crop region is set to a 16:9 aspect ratio, then 4:3 outputs should crop horizontally (pillarbox), and 16:9 streams should match exactly. These additional crops must be centered within the crop region.

The output streams must maintain square pixels at all times, no matter what the relative aspect ratios of the crop region and the stream are. Negative values for corner are allowed for raw output if full pixel array is larger than active pixel array. Width and height may be rounded to nearest larger supportable width, especially for raw output, where only a few fixed scales may be possible. The width and height of the crop region cannot be set to be smaller than floor( activeArraySize.width / android.scaler.availableMaxDigitalZoom ) and floor( activeArraySize.height / android.scaler.availableMaxDigitalZoom), respectively.
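
To make the floor() lower bound concrete, a small sketch (hypothetical helper; activeArraySize and android.scaler.availableMaxDigitalZoom are assumed to have been read from static metadata already):

#include <math.h>
#include <stdint.h>

// Sketch: clamp a requested crop size to the documented minimum,
// floor(activeArraySize / availableMaxDigitalZoom), per dimension.
static void clamp_crop_size(int32_t active_w, int32_t active_h,
                            float max_digital_zoom,
                            int32_t *crop_w, int32_t *crop_h) {
    int32_t min_w = (int32_t)floorf(active_w / max_digital_zoom);
    int32_t min_h = (int32_t)floorf(active_h / max_digital_zoom);
    if (*crop_w < min_w) *crop_w = min_w;
    if (*crop_h < min_h) *crop_h = min_h;
}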

static
Property Name Type Description Units Range Tags
android.scaler.availableFormats int32 x n [hidden as imageFormat] [deprecated]
  • RAW16 optional 0x20

    RAW16 is a standard, cross-platform format for raw image buffers with 16-bit pixels. Buffers of this format are typically expected to have a Bayer Color Filter Array (CFA) layout, which is given in android.sensor.info.colorFilterArrangement. Sensors with CFAs that are not representable by a format in android.sensor.info.colorFilterArrangement should not use this format.

    Buffers of this format will also follow the constraints given for RAW_OPAQUE buffers, but with relaxed performance constraints.

    See android.scaler.availableInputOutputFormatsMap for the full set of performance guarantees.

  • RAW_OPAQUE optional 0x24

    RAW_OPAQUE is a format for raw image buffers coming from an image sensor. The actual structure of buffers of this format is platform-specific, but must follow several constraints:

    1. No image post-processing operations may have been applied to buffers of this type. These buffers contain raw image data coming directly from the image sensor.
    2. If a buffer of this format is passed to the camera device for reprocessing, the resulting images will be identical to the images produced if the buffer had come directly from the sensor and was processed with the same settings.

    The intended use for this format is to allow access to the native raw format buffers coming directly from the camera sensor without any additional conversions or decrease in framerate.

    See android.scaler.availableInputOutputFormatsMap for the full set of performance guarantees.

  • YV12 optional 0x32315659

    YCrCb 4:2:0 Planar

  • YCrCb_420_SP optional 0x11

    NV21

  • IMPLEMENTATION_DEFINED 0x22

    System internal format, not application-accessible

  • YCbCr_420_888 0x23

    Flexible YUV420 Format

  • BLOB 0x21

    JPEG format

The list of image formats that are supported by this camera device for output streams.

Deprecated. Do not use.

Details

All camera devices will support JPEG and YUV_420_888 formats.

When set to YUV_420_888, application can access the YUV420 data directly.

HAL Implementation Details

These format values are from HAL_PIXEL_FORMAT_* in system/core/include/system/graphics.h.

When IMPLEMENTATION_DEFINED is used, the platform gralloc module will select a format based on the usage flags provided by the camera HAL device and the other endpoint of the stream. It is usually used by preview and recording streams, where the application doesn't need to access the image data.

YCbCr_420_888 format must be supported by the HAL. When an image stream needs CPU/application direct access, this format will be used.

The BLOB format must be supported by the HAL. This is used for the JPEG stream.

A RAW_OPAQUE buffer should contain only pixel data. It is strongly recommended that any information used by the camera device when processing images is fully expressed by the result metadata for that image buffer.

android.scaler.availableJpegMinDurations int64 x n [hidden] [deprecated]

The minimum frame duration that is supported for each resolution in android.scaler.availableJpegSizes.

ns

Deprecated. Do not use.

TODO: Remove property.

Details

This corresponds to the minimum steady-state frame duration when only that JPEG stream is active and captured in a burst, with all processing (typically in android.*.mode) set to FAST.

When multiple streams are configured, the minimum frame duration will be >= max(individual stream min durations)

android.scaler.availableJpegSizes int32 x n x 2 [hidden as size] [deprecated]

The JPEG resolutions that are supported by this camera device.

Deprecated. Do not use.

TODO: Remove property.

Details

The resolutions are listed as (width, height) pairs. All camera devices will support sensor maximum resolution (defined by android.sensor.info.activeArraySize).

HAL Implementation Details

The HAL must include sensor maximum resolution (defined by android.sensor.info.activeArraySize), and should include half/quarter of sensor maximum resolution.

android.scaler.availableMaxDigitalZoom float [public]

The maximum ratio between active area width and crop region width, or between active area height and crop region height, if the crop region height is larger than width

>=1

android.scaler.availableProcessedMinDurations int64 x n [hidden] [deprecated]

For each available processed output size (defined in android.scaler.availableProcessedSizes), this property lists the minimum supportable frame duration for that size.

ns

Deprecated. Do not use.

TODO: Remove property.

Details

This should correspond to the frame duration when only that processed stream is active, with all processing (typically in android.*.mode) set to FAST.

When multiple streams are configured, the minimum frame duration will be >= max(individual stream min durations).

android.scaler.availableProcessedSizes int32 x n x 2 [hidden as size] [deprecated]

The resolutions available for use with processed output streams, such as YV12, NV12, and platform opaque YUV/RGB streams to the GPU or video encoders.

Deprecated. Do not use.

TODO: Remove property.

Details

The resolutions are listed as (width, height) pairs.

For a given use case, the actual maximum supported resolution may be lower than what is listed here, depending on the destination Surface for the image data. For example, for recording video, the video encoder chosen may have a maximum size limit (e.g. 1080p) smaller than what the camera (e.g. maximum resolution is 3264x2448) can provide.

Please reference the documentation for the image data destination to check if it limits the maximum size for image data.

HAL Implementation Details

For FULL capability devices (android.info.supportedHardwareLevel == FULL), the HAL must include all JPEG sizes listed in android.scaler.availableJpegSizes and each below resolution if it is smaller than or equal to the sensor maximum resolution (if they are not listed in JPEG sizes already):

  • 240p (320 x 240)
  • 480p (640 x 480)
  • 720p (1280 x 720)
  • 1080p (1920 x 1080)

For LIMITED capability devices (android.info.supportedHardwareLevel == LIMITED), the HAL only has to list up to the maximum video size supported by the device.

android.scaler.availableRawMinDurations int64 x n [system] [deprecated]

For each available raw output size (defined in android.scaler.availableRawSizes), this property lists the minimum supportable frame duration for that size.

ns

Deprecated. Do not use.

TODO: Remove property.

Details

Should correspond to the frame duration when only the raw stream is active.

When multiple streams are configured, the minimum frame duration will be >= max(individual stream min durations)

android.scaler.availableRawSizes int32 x n x 2 [system as size] [deprecated]

The resolutions available for use with raw sensor output streams, listed as width, height

Deprecated. Do not use.

TODO: Remove property. Must include sensor maximum resolution.

android.scaler.availableInputOutputFormatsMap int32 x n [hidden as imageFormat]

The mapping of image formats that are supported by this camera device for input streams, to their corresponding output formats.

Details

All camera devices with at least 1 android.request.maxNumInputStreams will have at least one available input format.

The camera device will support the following map of formats, if its dependent capability is supported:

Input Format Output Format Capability
RAW_OPAQUE JPEG ZSL
RAW_OPAQUE YUV_420_888 ZSL
RAW_OPAQUE RAW16 DNG
RAW16 YUV_420_888 DNG
RAW16 JPEG DNG

For ZSL-capable camera devices, using the RAW_OPAQUE format as either input or output will never hurt maximum frame rate (i.e. StreamConfigurationMap#getOutputStallDuration(int,Size) for a format = RAW_OPAQUE is always 0).

Attempting to configure an input stream with output streams not listed as available in this map is not valid.

TODO: typedef to ReprocessFormatMap

HAL Implementation Details

For the formats, see system/core/include/system/graphics.h for a definition of the image format enumerations.

This value is encoded as a variable-size array-of-arrays. The inner array always contains [format, length, ...] where ... has length elements. An inner array is followed by another inner array if the total metadata entry size hasn't yet been exceeded.

A code sample to read/write this encoding (with a device that supports reprocessing RAW_OPAQUE to RAW16, YUV_420_888, and JPEG, and reprocessing RAW16 to YUV_420_888 and JPEG):

// reading: walk the [format, length, out_0, ..., out_{length-1}] groups
int32_t* contents = &entry.i32[0];
for (size_t i = 0; i < entry.count; ) {
    int32_t format = contents[i++];   // input format
    int32_t length = contents[i++];   // number of output formats
    int32_t output_formats[length];   // C99 variable-length array
    memcpy(&output_formats[0], &contents[i],
           length * sizeof(int32_t));
    i += length;
}

// writing (static example, DNG+ZSL)
int32_t contents[] = {
    RAW_OPAQUE, 3, RAW16, YUV_420_888, BLOB,  // RAW_OPAQUE -> 3 outputs
    RAW16,      2, YUV_420_888, BLOB,         // RAW16 -> 2 outputs
};
update_camera_metadata_entry(metadata, index, &contents[0],
      sizeof(contents)/sizeof(contents[0]), &updated_entry);

If the HAL claims to support any of the capabilities listed in the above details, then it must also support all the input-output combinations listed for that capability. It can optionally support additional formats if it so chooses.

Refer to android.scaler.availableFormats for the enum values which correspond to HAL_PIXEL_FORMAT_* in system/core/include/system/graphics.h.

android.scaler.availableStreamConfigurations int32 x n x 4 [hidden as streamConfiguration]
  • OUTPUT
  • INPUT

The available stream configurations that this camera device supports (i.e. format, width, height, output/input stream).

Details

The configurations are listed as (format, width, height, input?) tuples.

All camera devices will support sensor maximum resolution (defined by android.sensor.info.activeArraySize) for the JPEG format.

For a given use case, the actual maximum supported resolution may be lower than what is listed here, depending on the destination Surface for the image data. For example, for recording video, the video encoder chosen may have a maximum size limit (e.g. 1080p) smaller than what the camera (e.g. maximum resolution is 3264x2448) can provide.

Please reference the documentation for the image data destination to check if it limits the maximum size for image data.

Not all output formats may be supported in a configuration with an input stream of a particular format. For more details, see android.scaler.availableInputOutputFormatsMap.

The following table describes the minimum required output stream configurations based on the hardware level (android.info.supportedHardwareLevel):

Format Size Hardware Level Notes
JPEG android.sensor.info.activeArraySize Any
JPEG 1920x1080 (1080p) Any if 1080p <= activeArraySize
JPEG 1280x720 (720p) Any if 720p <= activeArraySize
JPEG 640x480 (480p) Any if 480p <= activeArraySize
JPEG 320x240 (240p) Any if 240p <= activeArraySize
YUV_420_888 all output sizes available for JPEG FULL
YUV_420_888 all output sizes available for JPEG, up to the maximum video size LIMITED
IMPLEMENTATION_DEFINED same as YUV_420_888 Any

Refer to android.request.availableCapabilities for additional mandatory stream configurations on a per-capability basis.

HAL Implementation Details

It is recommended (but not mandatory) to also include half/quarter of sensor maximum resolution for JPEG formats (regardless of hardware level).

(The following is a rewording of the above required table):

The HAL must include sensor maximum resolution (defined by android.sensor.info.activeArraySize).

For FULL capability devices (android.info.supportedHardwareLevel == FULL), the HAL must include all YUV_420_888 sizes that have JPEG sizes listed here as output streams.

It must also include each below resolution if it is smaller than or equal to the sensor maximum resolution (for both YUV_420_888 and JPEG formats), as output streams:

  • 240p (320 x 240)
  • 480p (640 x 480)
  • 720p (1280 x 720)
  • 1080p (1920 x 1080)

For LIMITED capability devices (android.info.supportedHardwareLevel == LIMITED), the HAL only has to list up to the maximum video size supported by the device.

Regardless of hardware level, every output resolution available for YUV_420_888 must also be available for IMPLEMENTATION_DEFINED.

This supersedes the following fields, which are now deprecated:

  • availableFormats
  • available[Processed,Raw,Jpeg]Sizes
android.scaler.availableMinFrameDurations int64 x 4 x n [hidden as streamConfigurationDuration]

This lists the minimum frame duration for each format/size combination.

(format, width, height, ns) x n
Details

This should correspond to the frame duration when only that stream is active, with all processing (typically in android.*.mode) set to either OFF or FAST.

When multiple streams are used in a request, the minimum frame duration will be max(individual stream min durations).

The minimum frame duration of a stream (of a particular format, size) is the same regardless of whether the stream is input or output.

See android.sensor.frameDuration and android.scaler.availableStallDurations for more details about calculating the max frame rate.

(Keep in sync with StreamConfigurationMap#getOutputMinFrameDuration)

android.scaler.availableStallDurations int64 x 4 x n [hidden as streamConfigurationDuration]

This lists the maximum stall duration for each format/size combination.

(format, width, height, ns) x n
Details

A stall duration is how much extra time would get added to the normal minimum frame duration for a repeating request that has streams with non-zero stall.

For example, consider JPEG captures which have the following characteristics:

  • JPEG streams act like processed YUV streams in requests for which they are not included; in requests in which they are directly referenced, they act as JPEG streams. This is because supporting a JPEG stream requires the underlying YUV data to always be ready for use by a JPEG encoder, but the encoder will only be used (and impact frame duration) on requests that actually reference a JPEG stream.
  • The JPEG processor can run concurrently to the rest of the camera pipeline, but cannot process more than 1 capture at a time.

In other words, using a repeating YUV request would result in a steady frame rate (let's say it's 30 FPS). If a single JPEG request is submitted periodically, the frame rate will stay at 30 FPS (as long as we wait for the previous JPEG to return each time). If we try to submit a repeating YUV + JPEG request, then the frame rate will drop from 30 FPS.

In general, submitting a new request with a non-0 stall time stream will not cause a frame rate drop unless there are still outstanding buffers for that stream from previous requests.

For a repeating request that uses a set of streams (call this S), the steady-state frame duration is the normal minimum frame duration corresponding to S plus the maximum stall duration for S.

If interleaving requests with and without a stall duration, a request will stall by the maximum of the remaining times for each can-stall stream with outstanding buffers.

This means that a stalling request will not have an exposure start until the stall has completed.

This should correspond to the stall duration when only that stream is active, with all processing (typically in android.*.mode) set to FAST or OFF. Setting any of the processing modes to HIGH_QUALITY effectively results in an indeterminate stall duration for all streams in a request (the regular stall calculation rules are ignored).

The following formats may always have a stall duration:

  • JPEG
  • RAW16

The following formats will never have a stall duration:

  • YUV_420_888
  • IMPLEMENTATION_DEFINED

All other formats may or may not have an allowed stall duration on a per-capability basis; refer to android.request.availableCapabilities for more details.

See android.sensor.frameDuration for more information about calculating the max frame rate (absent stalls).

(Keep up to date with StreamConfigurationMap#getOutputStallDuration(int, Size) )
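
A sketch of the resulting arithmetic (hypothetical helper; the per-stream durations are assumed to have been looked up already from this tag and android.scaler.availableMinFrameDurations):

#include <stdint.h>
#include <stddef.h>

// Sketch: the steady-state frame duration of a repeating request over
// a stream set S is max(per-stream min durations) plus
// max(per-stream stall durations).
static int64_t steady_state_frame_duration_ns(const int64_t *min_durations,
                                              const int64_t *stall_durations,
                                              size_t num_streams) {
    int64_t max_min = 0, max_stall = 0;
    for (size_t i = 0; i < num_streams; i++) {
        if (min_durations[i] > max_min) max_min = min_durations[i];
        if (stall_durations[i] > max_stall) max_stall = stall_durations[i];
    }
    return max_min + max_stall;
}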

HAL Implementation Details

If possible, it is recommended that all non-JPEG formats (such as RAW16) should not have a stall duration.

android.scaler.streamConfigurationMap int32 [public as streamConfigurationMap] [synthetic]

The available stream configurations that this camera device supports; also includes the minimum frame durations and the stall durations for each format/size combination.

Details

All camera devices will support sensor maximum resolution (defined by android.sensor.info.activeArraySize) for the JPEG format.

For a given use case, the actual maximum supported resolution may be lower than what is listed here, depending on the destination Surface for the image data. For example, for recording video, the video encoder chosen may have a maximum size limit (e.g. 1080p) smaller than what the camera (e.g. maximum resolution is 3264x2448) can provide.

Please reference the documentation for the image data destination to check if it limits the maximum size for image data.

The following table describes the minimum required output stream configurations based on the hardware level (android.info.supportedHardwareLevel):

Format Size Hardware Level Notes
JPEG android.sensor.info.activeArraySize Any
JPEG 1920x1080 (1080p) Any if 1080p <= activeArraySize
JPEG 1280x720 (720p) Any if 720p <= activeArraySize
JPEG 640x480 (480p) Any if 480p <= activeArraySize
JPEG 320x240 (240p) Any if 240p <= activeArraySize
YUV_420_888 all output sizes available for JPEG FULL
YUV_420_888 all output sizes available for JPEG, up to the maximum video size LIMITED
IMPLEMENTATION_DEFINED same as YUV_420_888 Any

Refer to android.request.availableCapabilities for additional mandatory stream configurations on a per-capability basis.

HAL Implementation Details

Do not set this property directly (it is synthetic and will not be available at the HAL layer); set the android.scaler.availableStreamConfigurations instead.

Not all output formats may be supported in a configuration with an input stream of a particular format. For more details, see android.scaler.availableInputOutputFormatsMap.

It is recommended (but not mandatory) to also include half/quarter of sensor maximum resolution for JPEG formats (regardless of hardware level).

(The following is a rewording of the above required table):

The HAL must include sensor maximum resolution (defined by android.sensor.info.activeArraySize).

For FULL capability devices (android.info.supportedHardwareLevel == FULL), the HAL must include all YUV_420_888 sizes that have JPEG sizes listed here as output streams.

It must also include each below resolution if it is smaller than or equal to the sensor maximum resolution (for both YUV_420_888 and JPEG formats), as output streams:

  • 240p (320 x 240)
  • 480p (640 x 480)
  • 720p (1280 x 720)
  • 1080p (1920 x 1080)

For LIMITED capability devices (android.info.supportedHardwareLevel == LIMITED), the HAL only has to list up to the maximum video size supported by the device.

Regardless of hardware level, every output resolution available for YUV_420_888 must also be available for IMPLEMENTATION_DEFINED.

This supersedes the following fields, which are now deprecated:

  • availableFormats
  • available[Processed,Raw,Jpeg]Sizes
android.scaler.croppingType byte [public]
  • CENTER_ONLY

    The camera device will only support centered crop regions.

  • FREEFORM

    The camera device will support arbitrarily chosen crop regions.

The crop type that this camera device supports.

Details

When passing a non-centered crop region (android.scaler.cropRegion) to a camera device that only supports CENTER_ONLY cropping, the camera device will move the crop region to the center of the sensor active array (android.sensor.info.activeArraySize) and keep the crop region width and height unchanged. The camera device will return the final used crop region in metadata result android.scaler.cropRegion.

Camera devices that support FREEFORM cropping will support any crop region that is inside of the active array. The camera device will apply the same crop region and return the final used crop region in capture result metadata android.scaler.cropRegion.

FULL capability devices (android.info.supportedHardwareLevel == FULL) will support FREEFORM cropping.

dynamic
Property Name Type Description Units Range Tags
android.scaler.cropRegion int32 x 4 [public as rectangle]

(x, y, width, height).

A rectangle with the top-left corner at (x, y) and size (width, height). The region of the sensor that is used for output. Each stream must use this rectangle to produce its output, cropping to a smaller region if necessary to maintain the stream's aspect ratio.

HAL2.x uses only (x, y, width)

(x,y) of top-left corner, width and height of region in pixels; (0,0) is top-left corner of android.sensor.activeArraySize
Details

The crop region is applied after the RAW to other color space (e.g. YUV) conversion. Since raw streams (e.g. RAW16) don't have this conversion stage, they are not croppable; the crop region will be ignored by raw streams.

For non-raw streams, any additional per-stream cropping will be done to maximize the final pixel area of the stream.

For example, if the crop region is set to a 4:3 aspect ratio, then 4:3 streams should use the exact crop region. 16:9 streams should further crop vertically (letterbox).

Conversely, if the crop region is set to a 16:9 aspect ratio, then 4:3 outputs should crop horizontally (pillarbox), and 16:9 streams should match exactly. These additional crops must be centered within the crop region.

The output streams must maintain square pixels at all times, no matter what the relative aspect ratios of the crop region and the stream are. Negative values for corner are allowed for raw output if full pixel array is larger than active pixel array. Width and height may be rounded to nearest larger supportable width, especially for raw output, where only a few fixed scales may be possible. The width and height of the crop region cannot be set to be smaller than floor( activeArraySize.width / android.scaler.availableMaxDigitalZoom ) and floor( activeArraySize.height / android.scaler.availableMaxDigitalZoom), respectively.

sensor
controls
Property Name Type Description Units Range Tags
android.sensor.exposureTime int64 [public]

Duration each pixel is exposed to light.

nanoseconds

android.sensor.info.exposureTimeRange

Details

If the sensor can't expose this exact duration, it should shorten the duration exposed to the nearest possible value (rather than expose longer).
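
A sketch of this "shorten rather than lengthen" rule (hypothetical helper; assumes the supported exposure times are known, sorted ascending, and that the requested value is no smaller than the minimum):

#include <stdint.h>
#include <stddef.h>

// Sketch: pick the longest supported exposure time that does not
// exceed the requested duration.
static int64_t quantize_exposure_ns(const int64_t *supported, size_t n,
                                    int64_t requested) {
    int64_t best = supported[0];
    for (size_t i = 0; i < n; i++) {
        if (supported[i] <= requested) best = supported[i];
        else break;
    }
    return best;
}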

android.sensor.frameDuration int64 [public]

Duration from start of frame exposure to start of next frame exposure.

nanoseconds

See android.sensor.info.maxFrameDuration, android.scaler.streamConfigurationMap. The duration is capped to max(duration, exposureTime + overhead).

Details

The maximum frame rate that can be supported by a camera subsystem is a function of many factors:

  • Requested resolutions of output image streams
  • Availability of binning / skipping modes on the imager
  • The bandwidth of the imager interface
  • The bandwidth of the various ISP processing blocks

Since these factors can vary greatly between different ISPs and sensors, the camera abstraction tries to represent the bandwidth restrictions with as simple a model as possible.

The model presented has the following characteristics:

  • The image sensor is always configured to output the smallest resolution possible given the application's requested output stream sizes. The smallest resolution is defined as being at least as large as the largest requested output stream size; the camera pipeline must never digitally upsample sensor data when the crop region covers the whole sensor. In general, this means that if only small output stream resolutions are configured, the sensor can provide a higher frame rate.
  • Since any request may use any or all the currently configured output streams, the sensor and ISP must be configured to support scaling a single capture to all the streams at the same time. This means the camera pipeline must be ready to produce the largest requested output size without any delay. Therefore, the overall frame rate of a given configured stream set is governed only by the largest requested stream resolution.
  • Using more than one output stream in a request does not affect the frame duration.
  • Certain format-streams may need to do additional background processing before data is consumed/produced by that stream. These processors can run concurrently to the rest of the camera pipeline, but cannot process more than 1 capture at a time.

The necessary information for the application, given the model above, is provided via the android.scaler.streamConfigurationMap field using StreamConfigurationMap#getOutputMinFrameDuration(int, Size). These are used to determine the maximum frame rate / minimum frame duration that is possible for a given stream configuration.

Specifically, the application can use the following rules to determine the minimum frame duration it can request from the camera device:

  1. Let the set of currently configured input/output streams be called S.
  2. Find the minimum frame durations for each stream in S, by looking it up in android.scaler.streamConfigurationMap using StreamConfigurationMap#getOutputMinFrameDuration(int, Size) (with its respective size/format). Let this set of frame durations be called F.
  3. For any given request R, the minimum frame duration allowed for R is the maximum out of all values in F. Let the streams used in R be called S_r.

If none of the streams in S_r have a stall time (listed in StreamConfigurationMap#getOutputStallDuration(int,Size) using its respective size/format), then the frame duration in F determines the steady state frame rate that the application will get if it uses R as a repeating request. Let this special kind of request be called Rsimple.

A repeating request Rsimple can be occasionally interleaved by a single capture of a new request Rstall (which has at least one in-use stream with a non-0 stall time) and if Rstall has the same minimum frame duration this will not cause a frame rate loss if all buffers from the previous Rstall have already been delivered.

For more details about stalling, see StreamConfigurationMap#getOutputStallDuration(int,Size).
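
A direct rendering of steps 1-3 above (hypothetical helper; F is assumed to hold the looked-up minimum frame durations for the streams used by R):

#include <stdint.h>
#include <stddef.h>

// Sketch: the minimum frame duration allowed for a request R is the
// maximum of the per-stream minimum durations F.
static int64_t request_min_frame_duration_ns(const int64_t *F,
                                             size_t num_streams_in_R) {
    int64_t max_duration = 0;
    for (size_t i = 0; i < num_streams_in_R; i++) {
        if (F[i] > max_duration) max_duration = F[i];
    }
    return max_duration;
}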

HAL Implementation Details

For more details about stalling, see android.scaler.availableStallDurations.

android.sensor.sensitivity int32 [public]

Gain applied to image data. Must be implemented through analog gain only if set to values below 'maximum analog sensitivity'.

If the sensor can't apply this exact gain, it should lessen the gain to the nearest possible value (rather than gain more).

ISO arithmetic units

android.sensor.info.sensitivityRange

Details

ISO 12232:2006 REI method

android.sensor.testPatternData int32 x 4 [public]

A pixel [R, G_even, G_odd, B] that supplies the test pattern when android.sensor.testPatternMode is SOLID_COLOR.

Optional. Must be supported if android.sensor.availableTestPatternModes contains SOLID_COLOR.

Details

Each color channel is treated as an unsigned 32-bit integer. The camera device then uses the most significant bits of each channel, matching the bit depth of its Bayer raw sensor output.

For example, a sensor with RAW10 Bayer output would use the 10 most significant bits from each color channel.
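
As a worked sketch of the most-significant-bits rule (hypothetical helper): for RAW10, test_pattern_sample(0xFFFFFFFF, 10) yields 0x3FF, the full-scale 10-bit sample.

#include <stdint.h>

// Sketch: keep only the top `bits` bits of the 32-bit channel value,
// e.g. bits = 10 for a RAW10 Bayer sensor.
static uint32_t test_pattern_sample(uint32_t channel_value, int bits) {
    return channel_value >> (32 - bits);
}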

android.sensor.testPatternMode int32 [public]
  • OFF

    Default. No test pattern mode is used, and the camera device returns captures from the image sensor.

  • SOLID_COLOR

    Each pixel in [R, G_even, G_odd, B] is replaced by its respective color channel provided in android.sensor.testPatternData.

    For example:

    android.sensor.testPatternData = [0, 0xFFFFFFFF, 0xFFFFFFFF, 0]
    

    All green pixels are 100% green. All red/blue pixels are black.

    android.sensor.testPatternData = [0xFFFFFFFF, 0, 0xFFFFFFFF, 0]
    

    All red pixels are 100% red. Only the odd green pixels are 100% green. All blue pixels are black.

  • COLOR_BARS

    All pixel data is replaced with an 8-bar color pattern.

    The vertical bars (left-to-right) are as follows:

    • 100% white
    • yellow
    • cyan
    • green
    • magenta
    • red
    • blue
    • black

    In general the image would look like the following:

    W Y C G M R B K
    W Y C G M R B K
    W Y C G M R B K
    W Y C G M R B K
    W Y C G M R B K
    . . . . . . . .
    . . . . . . . .
    . . . . . . . .
    
    (B = Blue, K = Black)
    

    Each bar should take up 1/8 of the sensor pixel array width. When this is not possible, the bar size should be rounded down to the nearest integer and the pattern can repeat on the right side.

    Each bar's height must always take up the full sensor pixel array height.

    Each pixel in this test pattern must be set to either 0% intensity or 100% intensity.

  • COLOR_BARS_FADE_TO_GRAY

    The test pattern is similar to COLOR_BARS, except that each bar should start at its specified color at the top, and fade to gray at the bottom.

    Each bar is further subdivided into a left and right half. The left half should have a smooth gradient, and the right half should have a quantized gradient.

    In particular, the right half should consist of blocks of the same color, each 1/16 of the active sensor pixel array width wide.

    The least significant bits in the quantized gradient should be copied from the most significant bits of the smooth gradient.

    The height of each bar should always be a multiple of 128. When this is not the case, the pattern should repeat at the bottom of the image.

  • PN9

    All pixel data is replaced by a pseudo-random sequence generated from a PN9 512-bit sequence (typically implemented in hardware with a linear feedback shift register).

    The generator should be reset at the beginning of each frame, and thus each subsequent raw frame with this test pattern should be exactly the same as the last.

  • CUSTOM1 256

    The first custom test pattern. All custom patterns that are available only on this camera device are at least this numeric value.

    All of the custom test patterns will be static (that is, the raw image must not vary from frame to frame).

When enabled, the sensor sends a test pattern instead of doing a real exposure from the camera.

Optional. Defaults to OFF. Value must be one of android.sensor.availableTestPatternModes

Details

When a test pattern is enabled, all manual sensor controls specified by android.sensor.* should be ignored. All other controls should work as normal.

For example, if manual flash is enabled, flash firing should still occur (and the test pattern should remain unmodified, since the flash would not actually affect it).

HAL Implementation Details

All test patterns are specified in the Bayer domain.

The HAL may choose to substitute test patterns from the sensor with test patterns from on-device memory. In that case, it should be indistinguishable to the ISP whether the data came from the sensor interconnect bus (such as CSI2) or memory.

static
Property Name Type Description Units Range Tags
android.sensor.info.activeArraySize int32 x 4 [public as rectangle]
Four ints defining the active pixel rectangle

Area of raw data which corresponds to only active pixels.

This array contains (xmin, ymin, width, height). The (xmin, ymin) must be >= (0,0). The (width, height) must be <= android.sensor.info.pixelArraySize.

Details

It is smaller than or equal to the sensor full pixel array, which could include the black calibration pixels.

android.sensor.info.sensitivityRange int32 x 2 [public]
Range of supported sensitivities

Range of valid sensitivities

Min <= 100, Max >= 1600

android.sensor.info.colorFilterArrangement byte [public]
  • RGGB
  • GRBG
  • GBRG
  • BGGR
  • RGB

    Sensor is not Bayer; output has 3 16-bit values for each pixel, instead of just 1 16-bit value per pixel.

Arrangement of color filters on sensor; represents the colors in the top-left 2x2 section of the sensor, in reading order

android.sensor.info.exposureTimeRange int64 x 2 [public]
nanoseconds

Range of valid exposure times used by android.sensor.exposureTime.

Min <= 100e3 (100 us). For FULL capability devices (android.info.supportedHardwareLevel == FULL), Max SHOULD be >= 1e9 (1sec), MUST be >= 100e6 (100ms)

HAL Implementation Details

For FULL capability devices (android.info.supportedHardwareLevel == FULL), the maximum of the range SHOULD be at least 1 second (1e9), and MUST be at least 100ms (100e6).

android.sensor.info.maxFrameDuration int64 [public]

Maximum possible frame duration (minimum frame rate).

nanoseconds

For FULL capability devices (android.info.supportedHardwareLevel == FULL), Max SHOULD be >= 1e9 (1sec), MUST be >= 100e6 (100ms)

Details

The largest possible android.sensor.frameDuration that will be accepted by the camera device. Attempting to use frame durations beyond the maximum will result in the frame duration being clipped to the maximum. See that control for a full definition of frame durations.

Refer to StreamConfigurationMap#getOutputMinFrameDuration(int,Size) for the minimum frame duration values.

HAL Implementation Details

For FULL capability devices (android.info.supportedHardwareLevel == FULL), the maximum of the range SHOULD be at least 1 second (1e9), and MUST be at least 100ms (100e6).

android.sensor.info.maxFrameDuration must be greater or equal to the android.sensor.info.exposureTimeRange max value (since exposure time overrides frame duration).

Available minimum frame durations for JPEG must be no greater than that of the YUV_420_888/IMPLEMENTATION_DEFINED minimum frame durations (for that respective size).

Since JPEG processing is considered offline and can take longer than a single uncompressed capture, refer to android.scaler.availableStallDurations for details about encoding this scenario.
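
A minimal sketch of checking the first constraint, assuming the android.hardware.camera2 Java API (where these static properties surface as CameraCharacteristics keys):

import android.hardware.camera2.CameraCharacteristics;
import android.util.Range;

final class FrameDurationCheck {
    // Sketch: maxFrameDuration must be >= the maximum exposure time,
    // since exposure time overrides frame duration.
    static boolean maxFrameDurationCoversExposure(CameraCharacteristics chars) {
        long maxFrameDuration =
                chars.get(CameraCharacteristics.SENSOR_INFO_MAX_FRAME_DURATION);
        Range<Long> exposureRange =
                chars.get(CameraCharacteristics.SENSOR_INFO_EXPOSURE_TIME_RANGE);
        return maxFrameDuration >= exposureRange.getUpper();
    }
}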

android.sensor.info.physicalSize float x 2 [public]
width x height in millimeters

The physical dimensions of the full pixel array

Details

Needed for FOV calculation for old API

android.sensor.info.pixelArraySize int32 x 2 [public as size]

Dimensions of full pixel array, possibly including black calibration pixels.

Details

Maximum output resolution for raw format must match this in android.scaler.availableStreamConfigurations.

android.sensor.info.whiteLevel int32 [public]

Maximum raw value output by sensor.

> 255 (8-bit output)

Details

This specifies the fully-saturated encoding level for the raw sample values from the sensor. This is typically caused by the sensor becoming highly non-linear or clipping. The minimum for each channel is specified by the offset in the android.sensor.blackLevelPattern tag.

The white level is typically determined either by sensor bit depth (8-14 bits is expected), or by the point where the sensor response becomes too non-linear to be useful. The default value for this is maximum representable value for a 16-bit raw sample (2^16 - 1).

HAL Implementation Details

The full bit depth of the sensor must be available in the raw data, so the value for linear sensors should not be significantly lower than the maximum raw value supported, i.e. 2^(sensor bits per pixel) - 1.

android.sensor.referenceIlluminant1 byte [public]
  • DAYLIGHT 1
  • FLUORESCENT 2
  • TUNGSTEN 3

    Incandescent light

  • FLASH 4
  • FINE_WEATHER 9
  • CLOUDY_WEATHER 10
  • SHADE 11
  • DAYLIGHT_FLUORESCENT 12

    D 5700 - 7100K

  • DAY_WHITE_FLUORESCENT 13

    N 4600 - 5400K

  • COOL_WHITE_FLUORESCENT 14

    W 3900 - 4500K

  • WHITE_FLUORESCENT 15

    WW 3200 - 3700K

  • STANDARD_A 17
  • STANDARD_B 18
  • STANDARD_C 19
  • D55 20
  • D65 21
  • D75 22
  • D50 23
  • ISO_STUDIO_TUNGSTEN 24

The standard reference illuminant used as the scene light source when calculating the android.sensor.colorTransform1, android.sensor.calibrationTransform1, and android.sensor.forwardMatrix1 matrices.

Details

The values in this tag correspond to the values defined for the EXIF LightSource tag. These illuminants are standard light sources that are often used when calibrating camera devices.

If this tag is present, then android.sensor.colorTransform1, android.sensor.calibrationTransform1, and android.sensor.forwardMatrix1 will also be present.

Some devices may choose to provide a second set of calibration information for improved quality, including android.sensor.referenceIlluminant2 and its corresponding matrices.

HAL Implementation Details

The first reference illuminant (android.sensor.referenceIlluminant1) and corresponding matrices must be present to support DNG output.

When producing raw images with a color profile that has only been calibrated against a single light source, it is valid to omit android.sensor.referenceIlluminant2 along with the android.sensor.colorTransform2, android.sensor.calibrationTransform2, and android.sensor.forwardMatrix2 matrices.

If only android.sensor.referenceIlluminant1 is included, it should be chosen so that it is representative of typical scene lighting. In general, D50 or DAYLIGHT will be chosen for this case.

If both android.sensor.referenceIlluminant1 and android.sensor.referenceIlluminant2 are included, they should be chosen to represent the typical range of scene lighting conditions. In general, low color temperature illuminant such as Standard-A will be chosen for the first reference illuminant and a higher color temperature illuminant such as D65 will be chosen for the second reference illuminant.

android.sensor.referenceIlluminant2 byte [public]

The standard reference illuminant used as the scene light source when calculating the android.sensor.colorTransform2, android.sensor.calibrationTransform2, and android.sensor.forwardMatrix2 matrices.

Details

See android.sensor.referenceIlluminant1 for more details. Valid values for this are the same as those given for the first reference illuminant.

If this tag is present, then android.sensor.colorTransform2, android.sensor.calibrationTransform2, and android.sensor.forwardMatrix2 will also be present.

android.sensor.calibrationTransform1 rational x 3 x 3 [public]
3x3 matrix in row-major-order

A per-device calibration transform matrix that maps from the reference sensor colorspace to the actual device sensor colorspace.

Details

This matrix is used to correct for per-device variations in the sensor colorspace, and is used for processing raw buffer data.

The matrix is expressed as a 3x3 matrix in row-major-order, and contains a per-device calibration transform that maps colors from reference sensor color space (i.e. the "golden module" colorspace) into this camera device's native sensor color space under the first reference illuminant (android.sensor.referenceIlluminant1).

android.sensor.calibrationTransform2 rational x 3 x 3 [public]
3x3 matrix in row-major-order

A per-device calibration transform matrix that maps from the reference sensor colorspace to the actual device sensor colorspace (this is the colorspace of the raw buffer data).

Details

This matrix is used to correct for per-device variations in the sensor colorspace, and is used for processing raw buffer data.

The matrix is expressed as a 3x3 matrix in row-major-order, and contains a per-device calibration transform that maps colors from reference sensor color space (i.e. the "golden module" colorspace) into this camera device's native sensor color space under the second reference illuminant (android.sensor.referenceIlluminant2).

This matrix will only be present if the second reference illuminant is present.

android.sensor.colorTransform1 rational x 3 x 3 [public]
3x3 matrix in row-major-order

A matrix that transforms color values from CIE XYZ color space to reference sensor color space.

Details

This matrix is used to convert from the standard CIE XYZ color space to the reference sensor colorspace, and is used when processing raw buffer data.

The matrix is expressed as a 3x3 matrix in row-major-order, and contains a color transform matrix that maps colors from the CIE XYZ color space to the reference sensor color space (i.e. the "golden module" colorspace) under the first reference illuminant (android.sensor.referenceIlluminant1).

The white points chosen in both the reference sensor color space and the CIE XYZ colorspace when calculating this transform will match the standard white point for the first reference illuminant (i.e. no chromatic adaptation will be applied by this transform).

android.sensor.colorTransform2 rational x 3 x 3 [public]
3x3 matrix in row-major-order

A matrix that transforms color values from CIE XYZ color space to reference sensor color space.

Details

This matrix is used to convert from the standard CIE XYZ color space to the reference sensor colorspace, and is used when processing raw buffer data.

The matrix is expressed as a 3x3 matrix in row-major-order, and contains a color transform matrix that maps colors from the CIE XYZ color space to the reference sensor color space (i.e. the "golden module" colorspace) under the second reference illuminant (android.sensor.referenceIlluminant2).

The white points chosen in both the reference sensor color space and the CIE XYZ colorspace when calculating this transform will match the standard white point for the second reference illuminant (i.e. no chromatic adaptation will be applied by this transform).

This matrix will only be present if the second reference illuminant is present.

android.sensor.forwardMatrix1 rational x 3 x 3 [public]
3x3 matrix in row-major-order

A matrix that transforms white balanced camera colors from the reference sensor colorspace to the CIE XYZ colorspace with a D50 whitepoint.

Details

This matrix is used to convert to the standard CIE XYZ colorspace, and is used when processing raw buffer data.

This matrix is expressed as a 3x3 matrix in row-major-order, and contains a color transform matrix that maps white balanced colors from the reference sensor color space to the CIE XYZ color space with a D50 white point.

Under the first reference illuminant (android.sensor.referenceIlluminant1) this matrix is chosen so that the standard white point for this reference illuminant in the reference sensor colorspace is mapped to D50 in the CIE XYZ colorspace.

android.sensor.forwardMatrix2 rational x 3 x 3 [public]
3x3 matrix in row-major-order

A matrix that transforms white balanced camera colors from the reference sensor colorspace to the CIE XYZ colorspace with a D50 whitepoint.

Details

This matrix is used to convert to the standard CIE XYZ colorspace, and is used when processing raw buffer data.

This matrix is expressed as a 3x3 matrix in row-major-order, and contains a color transform matrix that maps white balanced colors from the reference sensor color space to the CIE XYZ color space with a D50 white point.

Under the second reference illuminant (android.sensor.referenceIlluminant2) this matrix is chosen so that the standard white point for this reference illuminant in the reference sensor colorspace is mapped to D50 in the CIE XYZ colorspace.

This matrix will only be present if the second reference illuminant is present.

android.sensor.baseGainFactor rational [system]

Gain factor from electrons to raw units when ISO=100

android.sensor.blackLevelPattern int32 x 4 [public]
2x2 raw count block

A fixed black level offset for each of the color filter arrangement (CFA) mosaic channels.

>= 0 for each.

Details

This tag specifies the zero light value for each of the CFA mosaic channels in the camera sensor. The maximal value output by the sensor is represented by the value in android.sensor.info.whiteLevel.

The values are given in row-column scan order, with the first value corresponding to the element of the CFA in row=0, column=0.
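
As a hedged sketch of the indexing implied by this layout (the method name and array parameter are illustrative, not part of the API):

final class BlackLevels {
    // Sketch: look up the black level offset for the CFA channel at sensor
    // position (row, col). `pattern` holds the four
    // android.sensor.blackLevelPattern values in row-column scan order,
    // with index 0 corresponding to row=0, column=0.
    static int blackLevelFor(int[] pattern, int row, int col) {
        return pattern[(row % 2) * 2 + (col % 2)];
    }
}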

android.sensor.maxAnalogSensitivity int32 [public]

Maximum sensitivity that is implemented purely through analog gain.

Details

For android.sensor.sensitivity values less than or equal to this, all applied gain must be analog. For values above this, the gain applied can be a mix of analog and digital.

android.sensor.orientation int32 [public]

Clockwise angle through which the output image needs to be rotated to be upright on the device screen in its native orientation. Also defines the direction of rolling shutter readout, which is from top to bottom in the sensor's coordinate system.

degrees clockwise rotation, only multiples of 90

0,90,180,270

android.sensor.profileHueSatMapDimensions int32 x 3 [system]
Number of samples for hue, saturation, and value

The number of input samples for each dimension of android.sensor.profileHueSatMap.

Hue >= 1, Saturation >= 2, Value >= 1

Details

The number of input samples for the hue, saturation, and value dimension of android.sensor.profileHueSatMap. The order of the dimensions given is hue, saturation, value; where hue is the 0th element.

android.sensor.availableTestPatternModes int32 x n [public]
list of enums

Optional. Defaults to [OFF]. Lists the supported test pattern modes for android.sensor.testPatternMode.

Must include OFF. All custom modes must be >= CUSTOM1

dynamic
Property Name Type Description Units Range Tags
android.sensor.exposureTime int64 [public]

Duration each pixel is exposed to light.

nanoseconds

android.sensor.info.exposureTimeRange

Details

If the sensor can't expose this exact duration, it should shorten the duration exposed to the nearest possible value (rather than expose longer).

android.sensor.frameDuration int64 [public]

Duration from start of frame exposure to start of next frame exposure.

nanoseconds

See android.sensor.info.maxFrameDuration, android.scaler.streamConfigurationMap. The duration is capped to max(duration, exposureTime + overhead).

Details

The maximum frame rate that can be supported by a camera subsystem is a function of many factors:

  • Requested resolutions of output image streams
  • Availability of binning / skipping modes on the imager
  • The bandwidth of the imager interface
  • The bandwidth of the various ISP processing blocks

Since these factors can vary greatly between different ISPs and sensors, the camera abstraction tries to represent the bandwidth restrictions with as simple a model as possible.

The model presented has the following characteristics:

  • The image sensor is always configured to output the smallest resolution possible given the application's requested output stream sizes. The smallest resolution is defined as being at least as large as the largest requested output stream size; the camera pipeline must never digitally upsample sensor data when the crop region covers the whole sensor. In general, this means that if only small output stream resolutions are configured, the sensor can provide a higher frame rate.
  • Since any request may use any or all the currently configured output streams, the sensor and ISP must be configured to support scaling a single capture to all the streams at the same time. This means the camera pipeline must be ready to produce the largest requested output size without any delay. Therefore, the overall frame rate of a given configured stream set is governed only by the largest requested stream resolution.
  • Using more than one output stream in a request does not affect the frame duration.
  • Certain format-streams may need to do additional background processing before data is consumed/produced by that stream. These processors can run concurrently to the rest of the camera pipeline, but cannot process more than 1 capture at a time.

The necessary information for the application, given the model above, is provided via the android.scaler.streamConfigurationMap field using StreamConfigurationMap#getOutputMinFrameDuration(int, Size). These are used to determine the maximum frame rate / minimum frame duration that is possible for a given stream configuration.

Specifically, the application can use the following rules to determine the minimum frame duration it can request from the camera device:

  1. Let the set of currently configured input/output streams be called S.
  2. Find the minimum frame durations for each stream in S, by looking it up in android.scaler.streamConfigurationMap using StreamConfigurationMap#getOutputMinFrameDuration(int, Size) (with its respective size/format). Let this set of frame durations be called F.
  3. For any given request R, the minimum frame duration allowed for R is the maximum out of all values in F. Let the streams used in R be called S_r.

If none of the streams in S_r have a stall time (listed in StreamConfigurationMap#getOutputStallDuration(int,Size) using its respective size/format), then the frame duration in F determines the steady state frame rate that the application will get if it uses R as a repeating request. Let this special kind of request be called Rsimple.

A repeating request Rsimple can be occasionally interleaved by a single capture of a new request Rstall (which has at least one in-use stream with a non-0 stall time) and if Rstall has the same minimum frame duration this will not cause a frame rate loss if all buffers from the previous Rstall have already been delivered.

For more details about stalling, see StreamConfigurationMap#getOutputStallDuration(int,Size).
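
A minimal sketch of rules 2-3 above, assuming the android.hardware.camera2 Java API; the formats/sizes arrays describing the configured streams are illustrative inputs:

import android.hardware.camera2.params.StreamConfigurationMap;
import android.util.Size;

final class FrameDurations {
    // Sketch: look up each configured stream's minimum frame duration
    // (the set F) and take the maximum, which bounds the minimum frame
    // duration for any request using those streams.
    static long minFrameDurationFor(StreamConfigurationMap map,
                                    int[] formats, Size[] sizes) {
        long duration = 0;
        for (int i = 0; i < formats.length; i++) {
            duration = Math.max(duration,
                    map.getOutputMinFrameDuration(formats[i], sizes[i]));
        }
        return duration;
    }
}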

HAL Implementation Details

For more details about stalling, see android.scaler.availableStallDurations.

android.sensor.sensitivity int32 [public]

Gain applied to image data. Must be implemented through analog gain only if set to values below 'maximum analog sensitivity'.

If the sensor can't apply this exact gain, it should lessen the gain to the nearest possible value (rather than gain more).

ISO arithmetic units

android.sensor.info.sensitivityRange

Details

ISO 12232:2006 REI method

android.sensor.timestamp int64 [public]

Time at start of exposure of first row

nanoseconds

> 0

Details

Monotonic, should be synced to other timestamps in system

android.sensor.temperature float [system]

The temperature of the sensor, sampled at the time exposure began for this frame.

The thermal diode being queried should be inside the sensor PCB, or somewhere close to it.

celsius

Optional. This value is missing if no temperature is available.

android.sensor.neutralColorPoint rational x 3 [public]

The estimated camera neutral color in the native sensor colorspace at the time of capture.

Details

This value gives the neutral color point encoded as an RGB value in the native sensor color space. The neutral color point indicates the currently estimated white point of the scene illumination. It can be used to interpolate between the provided color transforms when processing raw sensor data.

The order of the values is R, G, B; where R is in the lowest index.

android.sensor.profileHueSatMap float x hue_samples x saturation_samples x value_samples x 3 [system]
Mapping for hue, saturation, and value

A mapping containing a hue shift, saturation scale, and value scale for each pixel.

Hue shift is given in degrees; saturation and value scale factors are unitless.

Details

hue_samples, saturation_samples, and value_samples are given in android.sensor.profileHueSatMapDimensions.

Each entry of this map contains three floats corresponding to the hue shift, saturation scale, and value scale, respectively; where the hue shift has the lowest index. The map entries are stored in the tag in nested loop order, with the value divisions in the outer loop, the hue divisions in the middle loop, and the saturation divisions in the inner loop. All zero input saturation entries are required to have a value scale factor of 1.0.
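
A hedged sketch of the indexing implied by this storage order (the helper name is illustrative):

final class HueSatMap {
    // Sketch: fetch the (hue shift, saturation scale, value scale) triple at
    // hue index h, saturation index s, value index v. Entries are stored
    // value-major, then hue, then saturation, three floats per entry.
    static float[] entryAt(float[] map, int hueSamples, int satSamples,
                           int h, int s, int v) {
        int base = ((v * hueSamples + h) * satSamples + s) * 3;
        return new float[] { map[base], map[base + 1], map[base + 2] };
    }
}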

android.sensor.profileToneCurve float x samples x 2 [system]
Samples defining a spline for a tone-mapping curve

A list of x,y samples defining a tone-mapping curve for gamma adjustment.

Each sample has an input range of [0, 1] and an output range of [0, 1]. The first sample is required to be (0, 0), and the last sample is required to be (1, 1).

Details

This tag contains a default tone curve that can be applied while processing the image as a starting point for user adjustments. The curve is specified as a list of value pairs in linear gamma. The curve is interpolated using a cubic spline.

android.sensor.greenSplit float [public]

The worst-case divergence between Bayer green channels.

>= 0

Details

This value is an estimate of the worst case split between the Bayer green channels in the red and blue rows in the sensor color filter array.

The green split is calculated as follows:

  1. A 5x5 pixel (or larger) window W within the active sensor array is chosen. The term 'pixel' here is taken to mean a group of 4 Bayer mosaic channels (R, Gr, Gb, B). The location and size of the window chosen is implementation defined, and should be chosen to provide a green split estimate that is both representative of the entire image for this camera sensor, and can be calculated quickly.
  2. The arithmetic mean of the green channels from the red rows (mean_Gr) within W is computed.
  3. The arithmetic mean of the green channels from the blue rows (mean_Gb) within W is computed.
  4. The maximum ratio R of the two means is computed as follows: R = max((mean_Gr + 1)/(mean_Gb + 1), (mean_Gb + 1)/(mean_Gr + 1))

The ratio R is the green split divergence reported for this property, which represents how much the green channels differ in the mosaic pattern. This value is typically used to determine the treatment of the green mosaic channels when demosaicing.
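
A minimal sketch of steps 2-4 above, in Java; the inputs are assumed to be the green samples gathered from the window W:

final class GreenSplit {
    // Sketch: gr holds the green samples from red rows (Gr) within W, and
    // gb the green samples from blue rows (Gb). Returns the ratio R.
    static double greenSplit(int[] gr, int[] gb) {
        double meanGr = 0, meanGb = 0;
        for (int v : gr) meanGr += v;
        for (int v : gb) meanGb += v;
        meanGr /= gr.length;
        meanGb /= gb.length;
        return Math.max((meanGr + 1) / (meanGb + 1),
                        (meanGb + 1) / (meanGr + 1));
    }
}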

The green split value can be roughly interpreted as follows:

  • R < 1.03 is a negligible split (<3% divergence).
  • 1.03 <= R <= 1.20 will require some software correction to avoid demosaic errors (3-20% divergence).
  • R > 1.20 will require strong software correction to produce a usable image (>20% divergence).

HAL Implementation Details

The green split given may be a static value based on prior characterization of the camera sensor using the green split calculation method given here over a large, representative, sample set of images. Other methods of calculation that produce equivalent results, and can be interpreted in the same manner, may be used.

android.sensor.testPatternData int32 x 4 [public]

A pixel [R, G_even, G_odd, B] that supplies the test pattern when android.sensor.testPatternMode is SOLID_COLOR.

Optional. Must be supported if android.sensor.availableTestPatternModes contains SOLID_COLOR.

Details

Each color channel is treated as an unsigned 32-bit integer. The camera device then uses the X most significant bits, where X is the bit depth of its Bayer raw sensor output.

For example, a sensor with RAW10 Bayer output would use the 10 most significant bits from each color channel.
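
A one-method sketch of this truncation (names are illustrative):

final class TestPattern {
    // Sketch: keep the rawBits most significant bits of a 32-bit channel
    // value; e.g. 0xFFFFFFFF with rawBits = 10 yields 0x3FF (full scale
    // for a RAW10 sensor).
    static int toRawSample(int channelValue, int rawBits) {
        return channelValue >>> (32 - rawBits);
    }
}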

android.sensor.testPatternMode int32 [public]
  • OFF

    Default. No test pattern mode is used, and the camera device returns captures from the image sensor.

  • SOLID_COLOR

    Each pixel in [R, G_even, G_odd, B] is replaced by its respective color channel provided in android.sensor.testPatternData.

    For example:

    android.sensor.testPatternData = [0, 0xFFFFFFFF, 0xFFFFFFFF, 0]
    

    All green pixels are 100% green. All red/blue pixels are black.

    android.sensor.testPatternData = [0xFFFFFFFF, 0, 0xFFFFFFFF, 0]
    

    All red pixels are 100% red. Only the odd green pixels are 100% green. All blue pixels are 100% black.

  • COLOR_BARS

    All pixel data is replaced with an 8-bar color pattern.

    The vertical bars (left-to-right) are as follows:

    • 100% white
    • yellow
    • cyan
    • green
    • magenta
    • red
    • blue
    • black

    In general the image would look like the following:

    W Y C G M R B K
    W Y C G M R B K
    W Y C G M R B K
    W Y C G M R B K
    W Y C G M R B K
    . . . . . . . .
    . . . . . . . .
    . . . . . . . .
    
    (B = Blue, K = Black)
    

    Each bar should take up 1/8 of the sensor pixel array width. When this is not possible, the bar size should be rounded down to the nearest integer and the pattern can repeat on the right side.

    Each bar's height must always take up the full sensor pixel array height.

    Each pixel in this test pattern must be set to either 0% intensity or 100% intensity.

  • COLOR_BARS_FADE_TO_GRAY

    The test pattern is similar to COLOR_BARS, except that each bar should start at its specified color at the top, and fade to gray at the bottom.

    Furthermore, each bar is subdivided into a left and right half. The left half should have a smooth gradient, and the right half should have a quantized gradient.

    In particular, the right half should consist of blocks of the same color, each 1/16th of the active sensor pixel array width.

    The least significant bits in the quantized gradient should be copied from the most significant bits of the smooth gradient.

    The height of each bar should always be a multiple of 128. When this is not the case, the pattern should repeat at the bottom of the image.

  • PN9

    All pixel data is replaced by a pseudo-random sequence generated from a PN9 512-bit sequence (typically implemented in hardware with a linear feedback shift register).

    The generator should be reset at the beginning of each frame, and thus each subsequent raw frame with this test pattern should be exactly the same as the last.

  • CUSTOM1 256

    The first custom test pattern. All custom patterns that are available only on this camera device are at least this numeric value.

    All of the custom test patterns will be static (that is, the raw image must not vary from frame to frame).

When enabled, the sensor sends a test pattern instead of doing a real exposure from the camera.

Optional. Defaults to OFF. Value must be one of android.sensor.availableTestPatternModes

Details

When a test pattern is enabled, all manual sensor controls specified by android.sensor.* should be ignored. All other controls should work as normal.

For example, if manual flash is enabled, flash firing should still occur (and the test pattern should remain unmodified, since the flash would not actually affect it).

HAL Implementation Details

All test patterns are specified in the Bayer domain.

The HAL may choose to substitute test patterns from the sensor with test patterns from on-device memory. In that case, it should be indistinguishable to the ISP whether the data came from the sensor interconnect bus (such as CSI2) or memory.

shading
controls
Property Name Type Description Units Range Tags
android.shading.mode byte [public]
  • OFF

    No lens shading correction is applied

  • FAST

    Must not slow down frame rate relative to sensor raw output

  • HIGH_QUALITY

    Frame rate may be reduced by high quality

Quality of lens shading correction applied to the image data.

Details

When set to OFF mode, no lens shading correction will be applied by the camera device, and identity lens shading map data will be provided if android.statistics.lensShadingMapMode == ON. For example, for a lens shading map with size specified as android.lens.info.shadingMapSize = [ 4, 3 ], the output android.statistics.lensShadingMap for this case will be the identity map shown below:

[ 1.0, 1.0, 1.0, 1.0,  1.0, 1.0, 1.0, 1.0,
    1.0, 1.0, 1.0, 1.0,  1.0, 1.0, 1.0, 1.0,
  1.0, 1.0, 1.0, 1.0,  1.0, 1.0, 1.0, 1.0,
    1.0, 1.0, 1.0, 1.0,  1.0, 1.0, 1.0, 1.0,
  1.0, 1.0, 1.0, 1.0,   1.0, 1.0, 1.0, 1.0,
    1.0, 1.0, 1.0, 1.0,  1.0, 1.0, 1.0, 1.0 ]

When set to other modes, lens shading correction will be applied by the camera device. Applications can request lens shading map data by setting android.statistics.lensShadingMapMode to ON, and then the camera device will provide lens shading map data in android.statistics.lensShadingMap, with size specified by android.lens.info.shadingMapSize; the returned shading map data will be the one applied by the camera device for this capture request.

The shading map data may depend on the AE and AWB statistics, so the reliability of the map data may be affected by the AE and AWB algorithms. When AE and AWB are in AUTO modes (android.control.aeMode != OFF and android.control.awbMode != OFF), it is recommended that applications wait for AE and AWB to converge before using the returned shading map data, to get the best results.

android.shading.strength byte [system]

Control the amount of shading correction applied to the images

unitless: 1-10; 10 is full shading compensation

dynamic
Property Name Type Description Units Range Tags
android.shading.mode byte [public]
  • OFF

    No lens shading correction is applied

  • FAST

    Must not slow down frame rate relative to sensor raw output

  • HIGH_QUALITY

    Frame rate may be reduced by high quality

Quality of lens shading correction applied to the image data.

Details

When set to OFF mode, no lens shading correction will be applied by the camera device, and identity lens shading map data will be provided if android.statistics.lensShadingMapMode == ON. For example, for a lens shading map with size specified as android.lens.info.shadingMapSize = [ 4, 3 ], the output android.statistics.lensShadingMap for this case will be the identity map shown below:

[ 1.0, 1.0, 1.0, 1.0,  1.0, 1.0, 1.0, 1.0,
    1.0, 1.0, 1.0, 1.0,  1.0, 1.0, 1.0, 1.0,
  1.0, 1.0, 1.0, 1.0,  1.0, 1.0, 1.0, 1.0,
    1.0, 1.0, 1.0, 1.0,  1.0, 1.0, 1.0, 1.0,
  1.0, 1.0, 1.0, 1.0,   1.0, 1.0, 1.0, 1.0,
    1.0, 1.0, 1.0, 1.0,  1.0, 1.0, 1.0, 1.0 ]

When set to other modes, lens shading correction will be applied by the camera device. Applications can request lens shading map data by setting android.statistics.lensShadingMapMode to ON, and then the camera device will provide lens shading map data in android.statistics.lensShadingMap, with size specified by android.lens.info.shadingMapSize; the returned shading map data will be the one applied by the camera device for this capture request.

The shading map data may depend on the AE and AWB statistics, so the reliability of the map data may be affected by the AE and AWB algorithms. When AE and AWB are in AUTO modes (android.control.aeMode != OFF and android.control.awbMode != OFF), it is recommended that applications wait for AE and AWB to converge before using the returned shading map data, to get the best results.

statistics
controls
Property Name Type Description Units Range Tags
android.statistics.faceDetectMode byte [public]
  • OFF
  • SIMPLE

    Optional. Return rectangle and confidence only

  • FULL

    Optional. Return all face metadata

State of the face detector unit

android.statistics.info.availableFaceDetectModes

Details

Whether face detection is enabled, and whether it should output just the basic fields or the full set of fields. Value must be one of the android.statistics.info.availableFaceDetectModes.

android.statistics.histogramMode byte [system as boolean]
  • OFF
  • ON

Operating mode for histogram generation

android.statistics.sharpnessMapMode byte [system as boolean]
  • OFF
  • ON

Operating mode for sharpness map generation

android.statistics.hotPixelMapMode byte [public as boolean]
  • OFF
  • ON

Operating mode for hotpixel map generation.

Details

If set to ON, a hotpixel map is returned in android.statistics.hotPixelMap. If set to OFF, no hotpixel map should be returned.

This must be set to a valid mode from android.statistics.info.availableHotPixelMapModes.

android.statistics.lensShadingMapMode byte [public]
  • OFF
  • ON

Whether the camera device will output the lens shading map in output result metadata.

Details

When set to ON, android.statistics.lensShadingMap must be provided in the output result metadata.

static
Property Name Type Description Units Range Tags
android.statistics.info.availableFaceDetectModes byte x n [public]
List of enums from android.statistics.faceDetectMode

Which face detection modes are available, if any

List of enum: OFF SIMPLE FULL
Details

OFF means face detection is disabled; it must always be included in the list.

SIMPLE means the device supports the android.statistics.faceRectangles and android.statistics.faceScores outputs.

FULL means the device additionally supports the android.statistics.faceIds and android.statistics.faceLandmarks outputs.

android.statistics.info.histogramBucketCount int32 [system]

Number of histogram buckets supported

>= 64

android.statistics.info.maxFaceCount int32 [public]

Maximum number of simultaneously detectable faces

>= 4 if android.statistics.info.availableFaceDetectModes lists modes besides OFF, otherwise 0

android.statistics.info.maxHistogramCount int32 [system]

Maximum value possible for a histogram bucket

android.statistics.info.maxSharpnessMapValue int32 [system]

Maximum value possible for a sharpness map region.

android.statistics.info.sharpnessMapSize int32 x 2 [system as size]
width x height

Dimensions of the sharpness map

Must be at least 32 x 32

android.statistics.info.availableHotPixelMapModes byte x n [public as boolean]
list of enums

The set of hot pixel map output modes supported by this camera device.

Details

This tag lists valid output modes for android.statistics.hotPixelMapMode.

If no hotpixel map is available for this camera device, this will contain only OFF. If the hotpixel map is available, this should include both the ON and OFF options.

dynamic
Property Name Type Description Units Range Tags
android.statistics.faceDetectMode byte [public]
  • OFF
  • SIMPLE

    Optional. Return rectangle and confidence only

  • FULL

    Optional. Return all face metadata

State of the face detector unit

android.statistics.info.availableFaceDetectModes

Details

Whether face detection is enabled, and whether it should output just the basic fields or the full set of fields. Value must be one of the android.statistics.info.availableFaceDetectModes.

android.statistics.faceIds int32 x n [hidden]

List of unique IDs for detected faces

Details

Only available if faceDetectMode == FULL

android.statistics.faceLandmarks int32 x n x 6 [hidden]
(leftEyeX, leftEyeY, rightEyeX, rightEyeY, mouthX, mouthY)

List of landmarks for detected faces

Details

Only available if faceDetectMode == FULL

android.statistics.faceRectangles int32 x n x 4 [hidden as rectangle]
(xmin, ymin, xmax, ymax). (0,0) is top-left of active pixel area

List of the bounding rectangles for detected faces

Details

Only available if faceDetectMode != OFF

android.statistics.faceScores byte x n [hidden]

List of the face confidence scores for detected faces

1-100

Details

Only available if faceDetectMode != OFF. The value should be meaningful (for example, setting 100 at all times is illegal).

android.statistics.faces int32 x n [public as face] [synthetic]

List of the faces detected through camera face detection in this result.

Details

Only available if android.statistics.faceDetectMode != OFF.

android.statistics.histogram int32 x n x 3 [system]
count of pixels for each color channel that fall into each histogram bucket, scaled to be between 0 and maxHistogramCount

A 3-channel histogram based on the raw sensor data

Details

The k'th bucket (0-based) covers the input range (with w = android.sensor.info.whiteLevel) of [ k * w/N, (k + 1) * w / N ). If only a monochrome histogram is supported, all channels should have the same data.
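
A minimal sketch of the bucket computation implied by that range (names are illustrative):

final class Histogram {
    // Sketch: the 0-based bucket for a raw sample value, given N buckets
    // and white level w; bucket k covers [ k*w/N, (k+1)*w/N ).
    static int bucketOf(int value, int n, int w) {
        return Math.min((int) ((long) value * n / w), n - 1);
    }
}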

android.statistics.histogramMode byte [system as boolean]
  • OFF
  • ON

Operating mode for histogram generation

android.statistics.sharpnessMap int32 x n x m x 3 [system]
estimated sharpness for each region of the input image. Normalized to be between 0 and maxSharpnessMapValue. Higher values mean sharper (better focused)

A 3-channel sharpness map, based on the raw sensor data

Details

If only a monochrome sharpness map is supported, all channels should have the same data

android.statistics.sharpnessMapMode byte [system as boolean]
  • OFF
  • ON

Operating mode for sharpness map generation

android.statistics.lensShadingMap float x 4 x n x m [public]
2D array of float gain factors per channel to correct lens shading

The shading map is a low-resolution floating-point map that lists the coefficients used to correct for vignetting, for each Bayer color channel.

Each gain factor is >= 1

Details

The least shaded section of the image should have a gain factor of 1; all other sections should have gains above 1.

When android.colorCorrection.mode = TRANSFORM_MATRIX, the map must take into account the colorCorrection settings.

The shading map is for the entire active pixel array, and is not affected by the crop region specified in the request. Each shading map entry is the value of the shading compensation map over a specific pixel on the sensor. Specifically, with a (N x M) resolution shading map, and an active pixel array size (W x H), shading map entry (x,y) ∈ (0 ... N-1, 0 ... M-1) is the value of the shading map at pixel ( ((W-1)/(N-1)) * x, ((H-1)/(M-1)) * y) for the four color channels. The map is assumed to be bilinearly interpolated between the sample points.

The channel order is [R, Geven, Godd, B], where Geven is the green channel for the even rows of a Bayer pattern, and Godd is the odd rows. The shading map is stored in a fully interleaved format, and its size is provided in the camera static metadata by android.lens.info.shadingMapSize.

The shading map should have on the order of 30-40 rows and columns, and must be smaller than 64x64.

As an example, given a very small map defined as:

android.lens.info.shadingMapSize = [ 4, 3 ]
android.statistics.lensShadingMap =
[ 1.3, 1.2, 1.15, 1.2,  1.2, 1.2, 1.15, 1.2,
    1.1, 1.2, 1.2, 1.2,  1.3, 1.2, 1.3, 1.3,
  1.2, 1.2, 1.25, 1.1,  1.1, 1.1, 1.1, 1.0,
    1.0, 1.0, 1.0, 1.0,  1.2, 1.3, 1.25, 1.2,
  1.3, 1.2, 1.2, 1.3,   1.2, 1.15, 1.1, 1.2,
    1.2, 1.1, 1.0, 1.2,  1.3, 1.15, 1.2, 1.3 ]

The low-resolution scaling map images for each channel are (displayed using nearest-neighbor interpolation):

Red lens shading map, Green (even rows) lens shading map, Green (odd rows) lens shading map, Blue lens shading map

As a visualization only, inverting the full-color map to recover an image of a gray wall (using bicubic interpolation for visual quality) as captured by the sensor gives:

Image of a uniform white wall (inverse shading map)
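
A hedged sketch of evaluating the map at an arbitrary active-array pixel, using the grid-to-pixel mapping and bilinear interpolation described above (helper name and parameters are illustrative):

final class ShadingMap {
    // Sketch: sample one channel (0=R, 1=Geven, 2=Godd, 3=B) of the
    // interleaved N x M shading map at active-array pixel (px, py), where
    // (w, h) is the active pixel array size. Grid point (x, y) corresponds
    // to pixel ( ((w-1)/(n-1)) * x, ((h-1)/(m-1)) * y ).
    static float sample(float[] map, int n, int m, int channel,
                        float px, float py, int w, int h) {
        float gx = px * (n - 1) / (w - 1);  // grid coordinates
        float gy = py * (m - 1) / (h - 1);
        int x0 = (int) gx, y0 = (int) gy;
        int x1 = Math.min(x0 + 1, n - 1), y1 = Math.min(y0 + 1, m - 1);
        float fx = gx - x0, fy = gy - y0;
        float g00 = map[(y0 * n + x0) * 4 + channel];
        float g10 = map[(y0 * n + x1) * 4 + channel];
        float g01 = map[(y1 * n + x0) * 4 + channel];
        float g11 = map[(y1 * n + x1) * 4 + channel];
        float top = g00 + fx * (g10 - g00);
        float bottom = g01 + fx * (g11 - g01);
        return top + fy * (bottom - top);
    }
}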

HAL Implementation Details

The lens shading map calculation may depend on exposure and white balance statistics. When AE and AWB are in AUTO modes (android.control.aeMode != OFF and android.control.awbMode != OFF), the HAL may have all the information it needs to generate the most accurate lens shading map. When AE or AWB is in manual mode (android.control.aeMode == OFF or android.control.awbMode == OFF), the shading map may be adversely impacted by manual exposure or white balance parameters. To avoid generating unreliable shading map data, the HAL may choose to lock the shading map to the latest known-good map generated while AE and AWB were in AUTO modes.

android.statistics.predictedColorGains float x 4 [hidden] [deprecated]
A 1D array of floats for 4 color channel gains

The best-fit color channel gains calculated by the camera device's statistics units for the current output frame.

Deprecated. Do not use.

Details

This may be different than the gains used for this frame, since statistics processing on data from a new frame typically completes after the transform has already been applied to that frame.

The 4 channel gains are defined in Bayer domain, see android.colorCorrection.gains for details.

This value should always be calculated by the AWB block, regardless of the android.control.* current values.

android.statistics.predictedColorTransform rational x 3 x 3 [hidden] [deprecated]
3x3 rational matrix in row-major order

The best-fit color transform matrix estimate calculated by the camera device's statistics units for the current output frame.

Deprecated. Do not use.

Details

The camera device will provide the estimate from its statistics unit on the white balance transforms to use for the next frame. These are the values the camera device believes are the best fit for the current output frame. This may be different than the transform used for this frame, since statistics processing on data from a new frame typically completes after the transform has already been applied to that frame.

These estimates must be provided for all frames, even if capture settings and color transforms are set by the application.

This value should always be calculated by the AWB block, regardless of the android.control.* current values.

android.statistics.sceneFlicker byte [public]
  • NONE
  • 50HZ
  • 60HZ

The camera device estimated scene illumination lighting frequency.

Details

Many light sources, such as most fluorescent lights, flicker at a rate that depends on the local utility power standards. This flicker must be accounted for by auto-exposure routines to avoid artifacts in captured images. The camera device uses this entry to tell the application what the scene illuminant frequency is.

When manual exposure control is enabled (android.control.aeMode == OFF or android.control.mode == OFF), android.control.aeAntibandingMode does not perform antibanding; instead, the application can consult this metadata field to select exposure times that will not cause banding issues. See android.control.aeAntibandingMode for more details.

Report NONE if there doesn't appear to be flickering illumination.
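
As an illustrative sketch (names are not part of the API), an application doing manual exposure can snap its exposure time to a multiple of the illumination flicker period; lights flicker at twice the mains frequency, so the safe step is 10ms for 50HZ and about 8.33ms for 60HZ:

final class Antibanding {
    // Sketch: round an exposure time (ns) down to a whole number of
    // illumination flicker periods (1 / (2 * mains frequency)).
    static long flickerSafeExposureNs(long exposureNs, int mainsHz) {
        long periodNs = 1_000_000_000L / (2 * mainsHz);
        long cycles = Math.max(1, exposureNs / periodNs);
        return cycles * periodNs;
    }
}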

android.statistics.hotPixelMapMode byte [public as boolean]
  • OFF
  • ON

Operating mode for hotpixel map generation.

Details

If set to ON, a hotpixel map is returned in android.statistics.hotPixelMap. If set to OFF, no hotpixel map should be returned.

This must be set to a valid mode from android.statistics.info.availableHotPixelMapModes.

android.statistics.hotPixelMap int32 x 2 x n [public]
list of coordinates based on android.sensor.info.pixelArraySize

List of (x, y) coordinates of hot/defective pixels on the sensor.

n <= number of pixels on the sensor. The (x, y) coordinates must be bounded by android.sensor.info.pixelArraySize.

Details

A coordinate (x, y) must lie between (0, 0), and (width - 1, height - 1) (inclusive), which are the top-left and bottom-right of the pixel array, respectively. The width and height dimensions are given in android.sensor.info.pixelArraySize. This may include hot pixels that lie outside of the active array bounds given by android.sensor.info.activeArraySize.

HAL Implementation Details

A hotpixel map contains the coordinates of pixels on the camera sensor that do not report valid values (usually due to defects in the camera sensor). This includes pixels that are stuck at certain values, or have a response that does not accurately encode the incoming light from the scene.

To avoid performance issues, there should be significantly fewer hot pixels than actual pixels on the camera sensor.

android.statistics.lensShadingMapMode byte [public]
  • OFF
  • ON

Whether the camera device will output the lens shading map in output result metadata.

Details

When set to ON, android.statistics.lensShadingMap must be provided in the output result metadata.

tonemap
controls
Property Name Type Description Units Range Tags
android.tonemap.curveBlue float x n x 2 [public]
1D array of float pairs (P_IN, P_OUT). The maximum number of pairs is specified by android.tonemap.maxCurvePoints.

Tonemapping / contrast / gamma curve for the blue channel, to use when android.tonemap.mode is CONTRAST_CURVE.

same as android.tonemap.curveRed

same as android.tonemap.curveRed

Details

See android.tonemap.curveRed for more details.

android.tonemap.curveGreen float x n x 2 [public]
1D array of float pairs (P_IN, P_OUT). The maximum number of pairs is specified by android.tonemap.maxCurvePoints.

Tonemapping / contrast / gamma curve for the green channel, to use when android.tonemap.mode is CONTRAST_CURVE.

same as android.tonemap.curveRed

same as android.tonemap.curveRed

Details

See android.tonemap.curveRed for more details.

android.tonemap.curveRed float x n x 2 [public]
1D array of float pairs (P_IN, P_OUT). The maximum number of pairs is specified by android.tonemap.maxCurvePoints.

Tonemapping / contrast / gamma curve for the red channel, to use when android.tonemap.mode is CONTRAST_CURVE.

0-1 on both input and output coordinates, normalized as a floating-point value such that 0 == black and 1 == white.

Details

Each channel's curve is defined by an array of control points:

android.tonemap.curveRed =
  [ P0in, P0out, P1in, P1out, P2in, P2out, P3in, P3out, ..., PNin, PNout ]
2 <= N <= android.tonemap.maxCurvePoints

These are sorted in order of increasing Pin; it is always guaranteed that input values 0.0 and 1.0 are included in the list to define a complete mapping. For input values between control points, the camera device must linearly interpolate between the control points.

Each curve can have an independent number of points, and the number of points can be less than the maximum (that is, the request doesn't always have to provide a curve with a number of points equal to android.tonemap.maxCurvePoints).

A few examples, and their corresponding graphical mappings; these only specify the red channel and the precision is limited to 4 digits, for conciseness.

Linear mapping:

android.tonemap.curveRed = [ 0, 0, 1.0, 1.0 ]

Linear mapping curve

Invert mapping:

android.tonemap.curveRed = [ 0, 1.0, 1.0, 0 ]

Inverting mapping curve

Gamma 1/2.2 mapping, with 16 control points:

android.tonemap.curveRed = [
  0.0000, 0.0000, 0.0667, 0.2920, 0.1333, 0.4002, 0.2000, 0.4812,
  0.2667, 0.5484, 0.3333, 0.6069, 0.4000, 0.6594, 0.4667, 0.7072,
  0.5333, 0.7515, 0.6000, 0.7928, 0.6667, 0.8317, 0.7333, 0.8685,
  0.8000, 0.9035, 0.8667, 0.9370, 0.9333, 0.9691, 1.0000, 1.0000 ]

Gamma = 1/2.2 tonemapping curve

Standard sRGB gamma mapping, per IEC 61966-2-1:1999, with 16 control points:

android.tonemap.curveRed = [
  0.0000, 0.0000, 0.0667, 0.2864, 0.1333, 0.4007, 0.2000, 0.4845,
  0.2667, 0.5532, 0.3333, 0.6125, 0.4000, 0.6652, 0.4667, 0.7130,
  0.5333, 0.7569, 0.6000, 0.7977, 0.6667, 0.8360, 0.7333, 0.8721,
  0.8000, 0.9063, 0.8667, 0.9389, 0.9333, 0.9701, 1.0000, 1.0000 ]

sRGB tonemapping curve
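
A minimal sketch of the linear interpolation the camera device must apply between control points (the helper name is illustrative):

final class Tonemap {
    // Sketch: evaluate a flattened curve [ P0in, P0out, ..., PNin, PNout ]
    // at input `in`; the curve is sorted by increasing input value and is
    // guaranteed to cover inputs 0.0 and 1.0.
    static float apply(float[] curve, float in) {
        for (int i = 0; i + 3 < curve.length; i += 2) {
            float x0 = curve[i], y0 = curve[i + 1];
            float x1 = curve[i + 2], y1 = curve[i + 3];
            if (in <= x1) {
                return x1 == x0 ? y1
                        : y0 + (in - x0) * (y1 - y0) / (x1 - x0);
            }
        }
        return curve[curve.length - 1];
    }
}

For example, apply(new float[] { 0, 0, 1.0f, 1.0f }, 0.25f) returns 0.25 under the linear mapping above.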

HAL Implementation Details

For good quality of mapping, at least 128 control points are preferred.

A typical use case of this would be a gamma-1/2.2 curve, with as many control points used as are available.

android.tonemap.mode byte [public]
  • CONTRAST_CURVE

    Use the tone mapping curve specified in the android.tonemap.curve* entries.

    All color enhancement and tonemapping must be disabled, except for applying the tonemapping curve specified by android.tonemap.curveRed, android.tonemap.curveBlue, or android.tonemap.curveGreen.

    Must not slow down frame rate relative to raw sensor output.

  • FAST

    Advanced gamma mapping and color enhancement may be applied.

    Should not slow down frame rate relative to raw sensor output.

  • HIGH_QUALITY

    Advanced gamma mapping and color enhancement may be applied.

    May slow down frame rate relative to raw sensor output.

High-level global contrast/gamma/tonemapping control.

Details

When switching to an application-defined contrast curve by setting android.tonemap.mode to CONTRAST_CURVE, the curve is defined per-channel with a set of (in, out) points that specify the mapping from input high-bit-depth pixel value to the output low-bit-depth value. Since the actual pixel ranges of both input and output may change depending on the camera pipeline, the values are specified by normalized floating-point numbers.

More-complex color mapping operations such as 3D color look-up tables, selective chroma enhancement, or other non-linear color transforms will be disabled when android.tonemap.mode is CONTRAST_CURVE.

This must be set to a valid mode in android.tonemap.availableToneMapModes.

When using either FAST or HIGH_QUALITY, the camera device will emit its own tonemap curve in android.tonemap.curveRed, android.tonemap.curveGreen, and android.tonemap.curveBlue. These values are always available, and as close as possible to the actually used nonlinear/nonglobal transforms.

If a request is sent with CONTRAST_CURVE using the curve the camera device provided in FAST or HIGH_QUALITY, the resulting image's tonemap will be roughly the same.

static
Property Name Type Description Units Range Tags
android.tonemap.maxCurvePoints int32 [public]

Maximum number of supported points in the tonemap curve that can be used for android.tonemap.curveRed, android.tonemap.curveGreen, or android.tonemap.curveBlue.

>= 64

Details

If the actual number of points provided by the application (in android.tonemap.curve*) is less than max, the camera device will resample the curve to its internal representation, using linear interpolation.

The output curves in the result metadata may have a different number of points than the input curves, and will represent the actual hardware curves used as closely as possible when linearly interpolated.

HAL Implementation Details

This value must be at least 64, and should be at least 128.

android.tonemap.availableToneMapModes byte x n [public]
list of enums

The set of tonemapping modes supported by this camera device.

Details

This tag lists the valid modes for android.tonemap.mode.

Full-capability camera devices must always support CONTRAST_CURVE and FAST.

dynamic
Property Name Type Description Units Range Tags
android.tonemap.curveBlue float x n x 2 [public]
1D array of float pairs (P_IN, P_OUT). The maximum number of pairs is specified by android.tonemap.maxCurvePoints.

Tonemapping / contrast / gamma curve for the blue channel, to use when android.tonemap.mode is CONTRAST_CURVE.

same as android.tonemap.curveRed

same as android.tonemap.curveRed

Details

See android.tonemap.curveRed for more details.

android.tonemap.curveGreen float x n x 2 [public]
1D array of float pairs (P_IN, P_OUT). The maximum number of pairs is specified by android.tonemap.maxCurvePoints.

Tonemapping / contrast / gamma curve for the green channel, to use when android.tonemap.mode is CONTRAST_CURVE.

same as android.tonemap.curveRed

same as android.tonemap.curveRed

Details

See android.tonemap.curveRed for more details.

android.tonemap.curveRed float x n x 2 [public]
1D array of float pairs (P_IN, P_OUT). The maximum number of pairs is specified by android.tonemap.maxCurvePoints.

Tonemapping / contrast / gamma curve for the red channel, to use when android.tonemap.mode is CONTRAST_CURVE.

0-1 on both input and output coordinates, normalized as a floating-point value such that 0 == black and 1 == white.

Details

Each channel's curve is defined by an array of control points:

android.tonemap.curveRed =
  [ P0in, P0out, P1in, P1out, P2in, P2out, P3in, P3out, ..., PNin, PNout ]
2 <= N <= android.tonemap.maxCurvePoints

These are sorted in order of increasing Pin; it is always guaranteed that input values 0.0 and 1.0 are included in the list to define a complete mapping. For input values between control points, the camera device must linearly interpolate between the control points.

Each curve can have an independent number of points, and the number of points can be less than the maximum (that is, the request doesn't always have to provide a curve with a number of points equal to android.tonemap.maxCurvePoints).

A few examples, and their corresponding graphical mappings; these only specify the red channel and the precision is limited to 4 digits, for conciseness.

Linear mapping:

android.tonemap.curveRed = [ 0, 0, 1.0, 1.0 ]

Linear mapping curve

Invert mapping:

android.tonemap.curveRed = [ 0, 1.0, 1.0, 0 ]

Inverting mapping curve

Gamma 1/2.2 mapping, with 16 control points:

android.tonemap.curveRed = [
  0.0000, 0.0000, 0.0667, 0.2920, 0.1333, 0.4002, 0.2000, 0.4812,
  0.2667, 0.5484, 0.3333, 0.6069, 0.4000, 0.6594, 0.4667, 0.7072,
  0.5333, 0.7515, 0.6000, 0.7928, 0.6667, 0.8317, 0.7333, 0.8685,
  0.8000, 0.9035, 0.8667, 0.9370, 0.9333, 0.9691, 1.0000, 1.0000 ]

Gamma = 1/2.2 tonemapping curve

Standard sRGB gamma mapping, per IEC 61966-2-1:1999, with 16 control points:

android.tonemap.curveRed = [
  0.0000, 0.0000, 0.0667, 0.2864, 0.1333, 0.4007, 0.2000, 0.4845,
  0.2667, 0.5532, 0.3333, 0.6125, 0.4000, 0.6652, 0.4667, 0.7130,
  0.5333, 0.7569, 0.6000, 0.7977, 0.6667, 0.8360, 0.7333, 0.8721,
  0.8000, 0.9063, 0.8667, 0.9389, 0.9333, 0.9701, 1.0000, 1.0000 ]

sRGB tonemapping curve

HAL Implementation Details

For good quality of mapping, at least 128 control points are preferred.

A typical use case of this would be a gamma-1/2.2 curve, with as many control points used as are available.

android.tonemap.mode byte [public]
  • CONTRAST_CURVE

    Use the tone mapping curve specified in the android.tonemap.curve* entries.

    All color enhancement and tonemapping must be disabled, except for applying the tonemapping curve specified by android.tonemap.curveRed, android.tonemap.curveBlue, or android.tonemap.curveGreen.

    Must not slow down frame rate relative to raw sensor output.

  • FAST

    Advanced gamma mapping and color enhancement may be applied.

    Should not slow down frame rate relative to raw sensor output.

  • HIGH_QUALITY

    Advanced gamma mapping and color enhancement may be applied.

    May slow down frame rate relative to raw sensor output.

High-level global contrast/gamma/tonemapping control.

Details

When switching to an application-defined contrast curve by setting android.tonemap.mode to CONTRAST_CURVE, the curve is defined per-channel with a set of (in, out) points that specify the mapping from input high-bit-depth pixel value to the output low-bit-depth value. Since the actual pixel ranges of both input and output may change depending on the camera pipeline, the values are specified by normalized floating-point numbers.

More-complex color mapping operations such as 3D color look-up tables, selective chroma enhancement, or other non-linear color transforms will be disabled when android.tonemap.mode is CONTRAST_CURVE.

This must be set to a valid mode in android.tonemap.availableToneMapModes.

When using either FAST or HIGH_QUALITY, the camera device will emit its own tonemap curve in android.tonemap.curveRed, android.tonemap.curveGreen, and android.tonemap.curveBlue. These values are always available, and as close as possible to the actually used nonlinear/nonglobal transforms.

If a request is sent with CONTRAST_CURVE using the curve the camera device provided in FAST or HIGH_QUALITY, the resulting image's tonemap will be roughly the same.

led
controls
Property Name Type Description Units Range Tags
android.led.transmit byte [hidden as boolean]
  • OFF
  • ON

This LED is nominally used to indicate to the user that the camera is powered on and may be streaming images back to the Application Processor. In certain rare circumstances, the OS may disable this when video is processed locally and not transmitted to any untrusted applications.

In particular, the LED must always be on when the data could be transmitted off the device. The LED should always be on whenever data is stored locally on the device.

The LED may be off if a trusted application is using the data in a way that doesn't violate the above rules.

dynamic
Property Name Type Description Units Range Tags
android.led.transmit byte [hidden as boolean]
  • OFF
  • ON

This LED is nominally used to indicate to the user that the camera is powered on and may be streaming images back to the Application Processor. In certain rare circumstances, the OS may disable this when video is processed locally and not transmitted to any untrusted applications.

In particular, the LED must always be on when the data could be transmitted off the device. The LED should always be on whenever data is stored locally on the device.

The LED may be off if a trusted application is using the data in a way that doesn't violate the above rules.

static
Property Name Type Description Units Range Tags
android.led.availableLeds byte x n [hidden]

A list of camera LEDs that are available on this system.

info
static
Property Name Type Description Units Range Tags
android.info.supportedHardwareLevel byte [public]
  • LIMITED
  • FULL

Generally classifies the overall set of the camera device functionality.

Optional. Default value is LIMITED.

Details

Camera devices will come in two flavors: LIMITED and FULL.

A FULL device has the most support possible and will enable the widest range of use cases such as:

  • 30 FPS at maximum resolution (== sensor resolution)
  • Per frame control
  • Manual sensor control
  • Zero Shutter Lag (ZSL)

A LIMITED device may have some or none of the above characteristics. To find out more, refer to android.request.availableCapabilities.
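
A minimal sketch, assuming the android.hardware.camera2 Java API, of gating FULL-only features on this property:

import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CameraMetadata;

final class HardwareLevel {
    // Sketch: true only for devices reporting the FULL hardware level.
    static boolean isFull(CameraCharacteristics chars) {
        Integer level =
                chars.get(CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL);
        return level != null
                && level == CameraMetadata.INFO_SUPPORTED_HARDWARE_LEVEL_FULL;
    }
}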

HAL Implementation Details

The camera 3 HAL device can implement one of two possible operational modes: limited and full. Full support is expected from new higher-end devices. Limited mode has hardware requirements roughly in line with those for a camera HAL device v1 implementation, and is expected from older or inexpensive devices. Full is a strict superset of limited, and they share the same essential operational flow.

For full details refer to "S3. Operational Modes" in camera3.h

blackLevel
controls
Property Name Type Description Units Range Tags
android.blackLevel.lock byte [public as boolean]
  • OFF
  • ON

Whether black-level compensation is locked to its current values, or is free to vary.

Details

When set to ON, the values used for black-level compensation will not change until the lock is set to OFF.

Since changes to certain capture parameters (such as exposure time) may require resetting of black level compensation, the camera device must report whether setting the black level lock was successful in the output result metadata.

For example, if a sequence of requests is as follows:

  • Request 1: Exposure = 10ms, Black level lock = OFF
  • Request 2: Exposure = 10ms, Black level lock = ON
  • Request 3: Exposure = 10ms, Black level lock = ON
  • Request 4: Exposure = 20ms, Black level lock = ON
  • Request 5: Exposure = 20ms, Black level lock = ON
  • Request 6: Exposure = 20ms, Black level lock = ON

And the exposure change in Request 4 requires the camera device to reset the black level offsets, then the output result metadata is expected to be:

  • Result 1: Exposure = 10ms, Black level lock = OFF
  • Result 2: Exposure = 10ms, Black level lock = ON
  • Result 3: Exposure = 10ms, Black level lock = ON
  • Result 4: Exposure = 20ms, Black level lock = OFF
  • Result 5: Exposure = 20ms, Black level lock = ON
  • Result 6: Exposure = 20ms, Black level lock = ON

This indicates to the application that on frame 4, black levels were reset due to exposure value changes, and pixel values may not be consistent across captures.

The camera device will maintain the lock to the extent possible, only overriding the lock to OFF when changes to other request parameters require a black level recalculation or reset.
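
The override behavior in the sequence above can be sketched as follows (illustrative Request struct; this sketch assumes, for simplicity, that an exposure change is the only trigger for a black level reset):

    #include <cstdint>
    #include <cstdio>

    struct Request { int64_t exposureNs; bool blackLevelLock; };

    // Lock state to report in the result for 'cur', given the previous
    // request in the sequence.
    bool reportedLock(const Request& prev, const Request& cur) {
        if (!cur.blackLevelLock) return false;           // app asked for OFF
        bool mustRecalculate = (cur.exposureNs != prev.exposureNs);
        return !mustRecalculate;                         // override to OFF once
    }

    int main() {
        // Requests 1-6 from the example: 10ms/10ms/10ms/20ms/20ms/20ms.
        Request seq[] = {
            {10000000, false}, {10000000, true}, {10000000, true},
            {20000000, true},  {20000000, true}, {20000000, true},
        };
        for (int i = 1; i < 6; ++i)
            std::printf("Result %d: Black level lock = %s\n", i + 1,
                        reportedLock(seq[i - 1], seq[i]) ? "ON" : "OFF");
    }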

HAL Implementation Details

If for some reason black level locking is no longer possible (for example, the analog gain has changed, which forces black level offsets to be recalculated), then the HAL must override this request (and it must report 'OFF' when this does happen) until the next capture for which locking is possible again.

dynamic
Property Name Type Description Units Range Tags
android.blackLevel.lock byte [public as boolean]
  • OFF
  • ON

Whether black-level compensation is locked to its current values, or is free to vary.

Details

Whether the black level offset was locked for this frame. Should be ON if android.blackLevel.lock was ON in the capture request, unless a change in other capture settings forced the camera device to perform a black level reset.

HAL Implementation Details

If for some reason black level locking is no longer possible (for example, the analog gain has changed, which forces black level offsets to be recalculated), then the HAL must override this request (and it must report 'OFF' when this does happen) until the next capture for which locking is possible again.

sync
dynamic
Property Name Type Description Units Range Tags
android.sync.frameNumber int64 [hidden]
  • CONVERGING -1

    The current result is not yet fully synchronized to any request. Synchronization is in progress, and metadata read from this result may include a mix of data that have taken effect since the last synchronization time.

    In some future result, within android.sync.maxLatency frames, this value will update to the actual frame number the result is guaranteed to be synchronized to (as long as the request settings remain constant).

  • UNKNOWN -2

    The current result's synchronization status is unknown. The result may have already converged, or it may be in progress. Reading from this result may include some mix of settings from past requests.

    After a settings change, the new settings will eventually all take effect for the output buffers and results. However, this value will not change when that happens. Altering settings rapidly may provide outcomes using mixes of settings from recent requests.

    This value is intended primarily for backwards compatibility with the older camera implementations (for android.hardware.Camera).

The frame number corresponding to the last request with which the output result (metadata + buffers) has been fully synchronized.

Either a non-negative value corresponding to a frame_number, or one of the two enums (CONVERGING / UNKNOWN).

Details

When a request is submitted to the camera device, there is usually a delay of several frames before the controls get applied. A camera device may either choose to account for this delay by implementing a pipeline and carefully submit well-timed atomic control updates, or it may start streaming control changes that span over several frame boundaries.

In the latter case, whenever a request's settings change relative to the previously submitted request, the full set of changes may take multiple frame durations to fully take effect. Some settings may take effect sooner (in fewer frame durations) than others.

While a set of control changes are being propagated, this value will be CONVERGING.

Once it is fully known that a set of control changes have finished propagating, and the resulting updated control settings have been read back by the camera device, this value will be set to a non-negative frame number (corresponding to the request to which the results have synchronized).

Older camera device implementations may not have a way to detect when all camera controls have been applied, and will always set this value to UNKNOWN.

FULL capability devices will always have this value set to the frame number of the request corresponding to this result.

Further details:

  • Whenever a request differs from the last request, any future results not yet returned may have this value set to CONVERGING (this could include any in-progress captures not yet returned by the camera device, for more details see pipeline considerations below).
  • Submitting a series of multiple requests that differ from the previous request (e.g. r1, r2, r3 s.t. r1 != r2 != r3) moves the new synchronization frame to the last non-repeating request (using the smallest frame number from the contiguous list of repeating requests).
  • Submitting the same request repeatedly will not change this value to CONVERGING, if it was already a non-negative value.
  • When this value changes to non-negative, that means that all of the metadata controls from the request have been applied, all of the metadata controls from the camera device have been read to the updated values (into the result), and all of the graphics buffers corresponding to this result are also synchronized to the request.

Pipeline considerations:

Submitting a request with updated controls relative to the previously submitted requests may also invalidate the synchronization state of all the results corresponding to currently in-flight requests.

In other words, results for this current request and up to android.request.pipelineMaxDepth prior requests may have their android.sync.frameNumber change to CONVERGING.
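
A worst-case model of this bookkeeping (illustrative types, assuming the device converges exactly android.sync.maxLatency frames after the last settings change):

    #include <cstdint>

    constexpr int64_t CONVERGING = -1;

    struct SyncTracker {
        int64_t lastChangeFrame = 0;  // first frame of the latest distinct settings
        int32_t maxLatency      = 2;  // this device's android.sync.maxLatency

        void onRequest(int64_t frameNumber, bool settingsDiffer) {
            if (settingsDiffer) lastChangeFrame = frameNumber;
        }

        // Value to report as android.sync.frameNumber in this frame's result.
        int64_t syncFrameNumber(int64_t frameNumber) const {
            return (frameNumber - lastChangeFrame >= maxLatency)
                       ? lastChangeFrame : CONVERGING;
        }
    };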

HAL Implementation Details

Using UNKNOWN here is illegal unless android.sync.maxLatency is also UNKNOWN.

FULL capability devices should simply set this value to the frame_number of the request this result corresponds to.

static
Property Name Type Description Units Range Tags
android.sync.maxLatency int32 [public]
  • PER_FRAME_CONTROL 0

    Every frame has the requests immediately applied; furthermore, for all results, android.sync.frameNumber == android.request.frameCount.

    Changing controls over multiple requests one after another will produce results that have those controls applied atomically each frame.

    All FULL capability devices will have this as their maxLatency.

  • UNKNOWN -1

    Each new frame has some subset (potentially the entire set) of the past requests applied to the camera settings.

    By submitting a series of identical requests, the camera device will eventually have the camera settings applied, but it is unknown when that exact point will be.

The maximum number of frames that can occur after a request (different from the previous one) has been submitted, and before the result's state becomes synchronized (by setting android.sync.frameNumber to a non-negative value).

number of processed requests

>= -1

Details

This defines the maximum distance (in number of metadata results), between android.sync.frameNumber and the equivalent android.request.frameCount.

In other words, this acts as an upper bound on how many frames must occur before the camera device knows for a fact that the newly submitted camera settings have been applied in outgoing frames.

For example, if the distance were 2:

initial request = X (repeating)
request1 = X
request2 = Y
request3 = Y
request4 = Y

where requestN has frameNumber N, and the first of the repeating initial requests has frameNumber F (and F < 1).

initial result = X' + { android.sync.frameNumber == F }
result1 = X' + { android.sync.frameNumber == F }
result2 = X' + { android.sync.frameNumber == CONVERGING }
result3 = X' + { android.sync.frameNumber == CONVERGING }
result4 = X' + { android.sync.frameNumber == 2 }

where resultN has frameNumber N.

Since result4 has a frameNumber == 4 and android.sync.frameNumber == 2, the distance is clearly 4 - 2 = 2.
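
Replaying this example with the worst-case tracker sketched under android.sync.frameNumber above (F chosen as -3 purely for illustration) reproduces the result table:

    #include <cstdint>
    #include <cstdio>

    int main() {
        const int64_t CONVERGING = -1, F = -3;  // F: first repeating X frame
        const int32_t maxLatency = 2;
        int64_t lastChange = F;                 // settings last changed at F
        for (int64_t frame = 1; frame <= 4; ++frame) {
            if (frame == 2) lastChange = 2;     // request2 switched X -> Y
            int64_t sync = (frame - lastChange >= maxLatency) ? lastChange
                                                              : CONVERGING;
            if (sync == CONVERGING)
                std::printf("result%lld: CONVERGING\n", (long long)frame);
            else
                std::printf("result%lld: sync.frameNumber == %lld\n",
                            (long long)frame, (long long)sync);
        }
    }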

HAL Implementation Details

Use frame_count from camera3_request_t instead of android.request.frameCount.

LIMITED devices are strongly encouraged to use a non-negative value. If UNKNOWN is used here then app developers do not have a way to know when sensor settings have been applied.
