Compatible media transcoding, introduced in Android 12, is a feature that allows devices to use more modern, storage-efficient media formats for video capture, such as HEVC, while maintaining compatibility with apps. With this feature, device manufacturers can use HEVC instead of AVC by default to improve video quality while reducing storage and bandwidth requirements. For devices with compatible media transcoding enabled, Android can automatically convert videos (up to one minute in length) recorded in formats such as HEVC or HDR when the videos are opened by an app that doesn't support the format. This allows apps to function even when videos are captured in newer formats on the device.
The compatible media transcoding feature is off by default. To request media transcoding, apps must declare their media capabilities. For more information on declaring media capabilities, see Compatible media transcoding on the Android Developers site.
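For example, an app that plays HEVC itself but wants HDR10+ content converted for it might declare its capabilities in code roughly as follows. This is a minimal sketch using the public ApplicationMediaCapabilities API added in Android 12; the helper name is illustrative, and apps more commonly declare the same information in an XML resource referenced from their manifest.
import android.media.ApplicationMediaCapabilities;
import android.media.MediaFeature;
import android.media.MediaFormat;

// Minimal sketch: declare HEVC as supported and HDR10+ as unsupported so
// that HDR10+ videos are served to this app in a converted form.
ApplicationMediaCapabilities buildCapabilities() {
    return new ApplicationMediaCapabilities.Builder()
            .addSupportedVideoMimeType(MediaFormat.MIMETYPE_VIDEO_HEVC)
            .addUnsupportedHdrType(MediaFeature.HdrType.HDR10_PLUS)
            .build();
}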
How it works
The compatible media transcoding feature consists of two main parts:
- Transcoding services in the media framework: These services convert files from one format to another using hardware for low latency and high quality conversions. This includes the transcoding API, the transcoding service, an OEM plugin for custom filters, and hardware. For further details, see Architecture overview.
- Compatible media transcoding feature in media providers: This component found in media providers intercepts apps accessing media files and serves either the original file or a transcoded file based on the app's declared capabilities. If an app supports the format of the media file, no special handling is required. If an app doesn't support the format, the framework converts the file to an older format, such as AVC, when the app accesses the file.
Figure 1 shows an overview of the media transcoding process.
Figure 1. Overview of compatible media transcoding.
Supported formats
The compatible media transcoding feature supports the following format conversions:
- HEVC (8-bit) to AVC: Codec conversions are performed by connecting one MediaCodec decoder to one MediaCodec encoder.
- HDR10+ (10-bit) to AVC (SDR): HDR to SDR conversions are performed using MediaCodec instances with a vendor plugin hooked into the decoder instances. For more information, see HDR to SDR encoding.
Supported content sources
The compatible media transcoding feature supports on-device media generated by the native OEM camera app and stored in the DCIM/Camera/ folder on the primary external volume. The feature doesn't support media on secondary storage. Content passed to devices through email or SD cards isn't supported.
Apps access files through various filepaths. The following lists describe the filepaths where transcoding is enabled or bypassed:
Transcoding enabled:
- App access through MediaStore APIs
- App access through direct filepath APIs including Java and native code
- App access through the Storage Access Framework (SAF)
- App access through the OS share sheet intents (MediaStore URI only), as illustrated in the sketch after these lists
- MTP/PTP file transfer from phone to PC
Transcoding bypassed:
- Transferring files off a device by ejecting the SD card
- Transferring files from device to device using options such as Nearby Share or Bluetooth transfer.
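For example, the share sheet path above applies when the sender shares a MediaStore content URI rather than a raw filepath. The following minimal sender-side sketch uses only standard Intent APIs; the helper name and MIME type are illustrative.
import android.content.Context;
import android.content.Intent;
import android.net.Uri;

// Minimal sketch: share a video through the OS share sheet using its
// MediaStore content URI, which keeps compatible media transcoding in play
// when the receiving app opens the shared item.
void shareVideo(Context context, Uri mediaStoreVideoUri) {
    Intent share = new Intent(Intent.ACTION_SEND)
            .setType("video/*")
            .putExtra(Intent.EXTRA_STREAM, mediaStoreVideoUri)
            .addFlags(Intent.FLAG_GRANT_READ_URI_PERMISSION);
    context.startActivity(Intent.createChooser(share, null));
}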
Add customized filepaths for transcoding
Device manufacturers can optionally add filepaths for media transcoding under the DCIM/ directory. Any paths outside the DCIM/ directory are rejected. Adding such filepaths might be required to meet carrier requirements or local regulations.
To add a filepath, use the transcode path runtime resource overlay (RRO), config_supported_transcoding_relative_paths. The following is an example of how to add a filepath:
<string-array name="config_supported_transcoding_relative_paths" translatable="false">
<item>DCIM/JCF/</item>
</string-array>
To verify the configured filepaths, use:
adb shell dumpsys activity provider com.google.android.providers.media.module/com.android.providers.media.MediaProvider | head -n 20
Architecture overview
This section describes the architecture of the media transcoding feature.
Figure 2. Media transcoding architecture.
The media transcoding architecture consists of the following components:
- MediaTranscodingManager system API: Interface that allows the client to communicate with the MediaTranscoding service. The MediaProvider module uses this API (see the sketch after this list).
- MediaTranscodingService: Native service that manages client connections, schedules transcoding requests, and manages bookkeeping for TranscodingSessions.
- MediaTranscoder: Native library that performs transcoding. This library is built on top of the media framework NDK to be compatible with modules.
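The following is a rough sketch of how a client with access to the system API (in practice, the MediaProvider module; the API isn't available to regular apps) might submit a session to the transcoding service through MediaTranscodingManager. The class and method names reflect the Android 12 system API surface, but treat the exact signatures and the helper method as assumptions rather than a reference.
import android.content.Context;
import android.media.MediaFormat;
import android.media.MediaTranscodingManager;
import android.media.MediaTranscodingManager.TranscodingSession;
import android.media.MediaTranscodingManager.VideoTranscodingRequest;
import android.net.Uri;
import java.util.concurrent.Executors;

// Rough sketch (system API): request an HEVC-to-AVC conversion of srcUri
// into dstUri and get called back when the TranscodingSession completes.
TranscodingSession transcodeToAvc(Context context, Uri srcUri, Uri dstUri,
        int width, int height) {
    MediaTranscodingManager manager =
            context.getSystemService(MediaTranscodingManager.class);
    MediaFormat avcFormat = MediaFormat.createVideoFormat(
            MediaFormat.MIMETYPE_VIDEO_AVC, width, height);
    VideoTranscodingRequest request =
            new VideoTranscodingRequest.Builder(srcUri, dstUri, avcFormat).build();
    return manager.enqueueRequest(
            request,
            Executors.newSingleThreadExecutor(),
            session -> {
                // Inspect session.getResult() to check for success or failure.
            });
}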
The compatible media transcoding feature logs transcoding metrics in both the service and the media transcoder. The client-side and service-side code is in the MediaProvider module to allow for timely bug fixes and updates.
File access
Compatible media transcoding is built on top of the Filesystem in Userspace (FUSE) filesystem, which is used for scoped storage. FUSE enables the MediaProvider module to examine file operations in user space and to gate access to files based on the policy to allow, deny, or redact access.
When an app attempts to access a file, the FUSE daemon intercepts the file read access from the app. If the app supports a newer format (such as HEVC), the original file is returned. If the app doesn't support the format, the file is transcoded to an older format (such as AVC) or is returned from cache if a transcoded version is available.
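No transcoding-specific code is needed on this path. The following minimal sketch shows an ordinary open of a MediaStore video URI; whether the returned descriptor points at the original HEVC file or a transcoded copy is decided behind FUSE based on the app's declared capabilities (the helper name is illustrative).
import android.content.ContentResolver;
import android.net.Uri;
import android.os.ParcelFileDescriptor;
import java.io.FileNotFoundException;

// Minimal sketch: a plain read-only open. The FUSE daemon transparently
// serves either the original file or a transcoded version of it.
ParcelFileDescriptor openVideo(ContentResolver resolver, Uri videoUri)
        throws FileNotFoundException {
    return resolver.openFileDescriptor(videoUri, "r");
}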
Request transcoded files
The compatible media transcoding feature is disabled by default, meaning that if the device supports HEVC, Android doesn't transcode files unless an app declares its media capabilities in its manifest file or is included in the force transcode list.
Apps can request transcoded assets using the following options:
- Declare unsupported formats in the manifest file. For details, see Declare capabilities in a resource and Declare capabilities in code.
- Add apps to the force transcode list that's included in the MediaProvider module. This enables transcoding for apps that haven't updated their manifest file. Once an app updates its manifest file with unsupported formats, it must be removed from the force transcode list. Device manufacturers can nominate their apps to be added or removed from the force transcode list by submitting a patch or by reporting a bug. The Android team periodically reviews the list and may remove apps from the list.
- Disable supported formats with the app compatibility framework at run time (users can also disable this for each app in Settings).
- Open a file with MediaStore while explicitly specifying unsupported formats with the openTypedAssetFileDescriptor API, as shown in the sketch after this list.
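The last option might look like the following sketch, which declares HEVC as unsupported in an ApplicationMediaCapabilities object and passes it through MediaStore.EXTRA_MEDIA_CAPABILITIES when opening the file, so a transcoded version is served when the original is HEVC. The APIs shown are public in Android 12; the helper name and the "video/*" MIME filter are illustrative choices.
import android.content.ContentResolver;
import android.content.res.AssetFileDescriptor;
import android.media.ApplicationMediaCapabilities;
import android.media.MediaFormat;
import android.net.Uri;
import android.os.Bundle;
import android.provider.MediaStore;
import java.io.FileNotFoundException;

// Sketch: explicitly tell MediaProvider that HEVC isn't supported so the
// returned descriptor is transcoded when the underlying file is HEVC.
AssetFileDescriptor openCompatibleVideo(ContentResolver resolver, Uri videoUri)
        throws FileNotFoundException {
    ApplicationMediaCapabilities capabilities =
            new ApplicationMediaCapabilities.Builder()
                    .addUnsupportedVideoMimeType(MediaFormat.MIMETYPE_VIDEO_HEVC)
                    .build();
    Bundle opts = new Bundle();
    opts.putParcelable(MediaStore.EXTRA_MEDIA_CAPABILITIES, capabilities);
    return resolver.openTypedAssetFileDescriptor(videoUri, "video/*", opts);
}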
For USB transfers (device to PC), transcoding is disabled by default, but users can enable it using the Convert videos to AVC toggle on the USB Preferences settings screen, as shown in Figure 3.
Figure 3. Toggle to enable media transcoding in USB Preferences screen.
Restrictions on requesting transcoded files
To prevent transcoding requests from locking up system resources for extended periods, apps requesting transcoding sessions are limited to:
- 10 consecutive sessions
- a total running time of three minutes
If an app exceeds either of these restrictions, the framework returns the original file descriptor.
Device requirements
To support the compatible media transcoding feature, devices must meet the following requirements:
- Device has HEVC encoding enabled by default on the native camera app
- (Devices supporting HDR to SDR transcoding) Device supports HDR video capture
To ensure device performance for media transcoding, video hardware and storage read/write performance must be optimized. When media codecs are configured with a priority equal to 1, the codecs must operate at the highest possible throughput. We recommend that transcoding performance achieves a minimum of 200 fps. To test your hardware performance, run the media transcoder benchmark at frameworks/av/media/libmediatranscoding/transcoder/benchmark.
Validation
To validate the compatible media transcoding feature, run the following CTS tests:
android.media.mediatranscoding.cts
android.mediaprovidertranscode.cts
Enable media transcoding globally
To test the media transcoding framework or app behavior with transcoding, you can enable or disable the compatible media transcoding feature globally. On the Settings > System > Developer options > Media transcoding page, set the Override transcoding defaults toggle to on, and then set the Enable transcoding toggle to on or off. If this setting is enabled, media transcoding might occur in the background for apps other than the one you're developing.
Check transcoding status
During testing, you can use the following ADB shell command to check transcoding status, including current and past transcoding sessions:
adb shell dumpsys media.transcoding
Extend video length limitation
For testing purposes, you can extend the one-minute video length limitation for transcoding by using the following command. A reboot might be required after running this command.
adb shell device_config put storage_native_boot transcode_max_duration_ms <LARGE_NUMBER_IN_MS>
AOSP source and references
The following is AOSP source code related to compatible media transcoding.
Transcoding System API (only used by MediaProvider)
ApplicationMediaCapabilities API
frameworks/base/apex/media/framework/java/android/media/ApplicationMediaCapabilities.java
MediaTranscoding Service
frameworks/av/services/mediatranscoding/
frameworks/av/media/libmediatranscoding/
Native MediaTranscoder
frameworks/av/media/libmediatranscoding/transcoder
HDR sample plugin for MediaTranscoder
MediaProvider file interception and transcoding code
MediaTranscoder benchmark
frameworks/av/media/libmediatranscoding/transcoder/benchmark
CTS tests
cts/tests/tests/mediatranscoding/
HDR to SDR encoding
To support HDR to SDR encoding, device manufacturers can use the AOSP sample Codec 2.0 filter plugin located in /platform/frameworks/av/media/codec2/hidl/plugin/.
This section describes how the filter plugin works, how to implement the plugin, and how to test it.
If a device doesn't include a plugin that supports HDR to SDR encoding, an app accessing an HDR video gets the original file descriptor regardless of the app's media capabilities declared in the manifest.
How it works
This section describes the general behavior of the Codec 2.0 filter plugin.
Background
Android provides an adaptation layer implementation between the Codec 2.0 interface and the android.hardware.media.c2 HAL interface at android::hardware::media::c2. For filter plugins, AOSP includes a wrapper mechanism that wraps decoders together with filter plugins. MediaCodec recognizes these wrapped components as decoders with filtering features.
Overview
The FilterWrapper class takes vendor codecs and returns wrapped codecs back to the media.c2 adaptation layer. The FilterWrapper class loads libc2filterplugin.so through the FilterWrapper::Plugin API and records the available filters from the plugin. On creation, FilterWrapper instantiates all available filters. Only filters that alter the buffer are started at start.
Figure 4. Filter plugin architecture.
Filter plugin interface
The FilterPlugin.h interface defines the following APIs to expose the filters:
- std::shared_ptr<C2ComponentStore> getComponentStore(): Returns a C2ComponentStore object that contains filters. This store is separate from what the vendor's Codec 2.0 implementation exposes. Typically, this store contains only the filters used by the FilterWrapper class.
- bool describe(C2String name, Descriptor *desc): Describes the filters in addition to what is available from C2ComponentStore. The following descriptions are defined:
  - controlParam: Parameters that control the behavior of the filters. For example, for the HDR to SDR tone-mapper, the control parameter is the target transfer function.
  - affectedParams: Parameters that are affected by the filtering operations. For example, for the HDR to SDR tone-mapper, the affected parameters are the color aspects.
- bool isFilteringEnabled(const std::shared_ptr<C2ComponentInterface> &intf): Returns true if the filter component alters the buffer. For example, the tone-mapping filter returns true if the target transfer function is SDR and the input transfer function is HDR (HLG or PQ).
FilterWrapper details
This section describes details of the FilterWrapper class.
Creation
The wrapped component instantiates the underlying decoder and all defined filters at creation.
Query and configuration
The wrapped component separates incoming parameters from queries or configuration requests according to the filter descriptions. For example, configuration of a filter's control parameter is routed to the corresponding filter, and affected parameters from the filters are present in query results (instead of being read from the decoder, whose parameters are unaffected).
Figure 5. Query and configuration.
Start
At start, the wrapped component starts the decoder and all the filters that alter the buffers. If no filter is enabled, the wrapped component starts the decoder and passes buffers and commands through to the decoder itself.
Buffer handling
Figure 6. Buffer handling.
Buffers queued to the wrapped decoder go to the underlying decoder. The wrapped component grabs the output buffer from the decoder through an onWorkDone_nb() callback, and then queues it to the filters. The final output buffer from the last filter is reported to the client.
For this buffer handling to work, the wrapped component must configure C2PortBlockPoolsTuning to the last filter so that the framework outputs buffers from the expected block pool.
Stop, reset, and release
At stop, the wrapped component stops the decoder and all enabled filters that were started. At reset and release, all components get reset or released regardless of whether they're enabled or not.
Implement the sample filter plugin
To enable the plugin, do the following:
- Implement the FilterPlugin interface in a library and drop it at /vendor/lib[64]/libc2filterplugin.so.
- Add additional permissions to mediacodec.te if required.
- Update the adaptation layer to Android 12 and rebuild the media.c2 service.
Test the plugin
To test the sample plugin, do the following:
- Rebuild and flash the device.
- Build the sample plugin using the following command:
m sample-codec2-filter-plugin
- Remount the device and rename the vendor plugin so that it's recognized by the codec service:
adb root
adb remount
adb reboot
adb wait-for-device
adb root
adb remount
adb push /out/target/<...>/lib64/sample-codec2-filter-plugin.so \
  /vendor/lib64/libc2filterplugin.so
adb push /out/target/<...>/lib/sample-codec2-filter-plugin.so \
  /vendor/lib/libc2filterplugin.so
adb reboot