This page explains how Android processes the various inputs it receives from the keyboard, sensors, and more.
Haptics
The Android haptics subsystem comprises the hardware and software features that create stimuli through the sense of touch. This section provides guidance and compliance instructions on how best to use the Android haptics APIs.
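At the bottom of that stack sits the vibrator driver. As a rough illustration only, the sketch below drives a vibrator through the Linux force-feedback (FF) interface, which some vibrator HAL implementations use as their backend; the device node path is hypothetical, and real devices may expose the motor differently.

```c
/*
 * Minimal sketch: play a short rumble through the Linux force-feedback
 * interface. The evdev path is hypothetical; the node must advertise
 * FF_RUMBLE support.
 */
#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/input.h>

int main(void) {
    int fd = open("/dev/input/event0", O_RDWR);   /* hypothetical node */
    if (fd < 0) { perror("open"); return 1; }

    /* Upload a short rumble effect to the kernel. */
    struct ff_effect effect;
    memset(&effect, 0, sizeof(effect));
    effect.type = FF_RUMBLE;
    effect.id = -1;                     /* let the kernel assign an id */
    effect.u.rumble.strong_magnitude = 0x8000;
    effect.replay.length = 50;          /* milliseconds */
    if (ioctl(fd, EVIOCSFF, &effect) < 0) { perror("EVIOCSFF"); return 1; }

    /* Trigger the uploaded effect by writing an EV_FF event. */
    struct input_event play = { .type = EV_FF, .code = effect.id, .value = 1 };
    if (write(fd, &play, sizeof(play)) != sizeof(play)) perror("write");

    usleep(60 * 1000);                  /* let the effect finish */
    ioctl(fd, EVIOCRMFF, effect.id);    /* remove the uploaded effect */
    close(fd);
    return 0;
}
```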
Input
The Android input subsystem nominally consists of an event pipeline that traverses multiple layers of the system. At the lowest layer, the physical input device produces signals that describe state changes such as key presses and touch contact points.
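A minimal sketch of that lowest layer, assuming the device exposes a standard Linux evdev node (the path below is a placeholder); on a real device this role is played by the platform's input reader, not application code.

```c
/*
 * Read raw evdev events (key presses and multitouch contacts) straight
 * from a kernel input node, as the input pipeline does at its lowest layer.
 */
#include <fcntl.h>
#include <stdio.h>
#include <unistd.h>
#include <linux/input.h>

int main(void) {
    int fd = open("/dev/input/event0", O_RDONLY);   /* hypothetical node */
    if (fd < 0) { perror("open"); return 1; }

    struct input_event ev;
    while (read(fd, &ev, sizeof(ev)) == sizeof(ev)) {
        if (ev.type == EV_KEY) {
            /* value: 1 = press, 0 = release, 2 = autorepeat */
            printf("key code=%u value=%d\n", ev.code, ev.value);
        } else if (ev.type == EV_ABS && ev.code == ABS_MT_TRACKING_ID) {
            /* A negative tracking id means the contact was lifted. */
            printf("touch contact %s\n", ev.value < 0 ? "up" : "down");
        }
    }
    close(fd);
    return 0;
}
```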
Neural Networks API
The Android Neural Networks API (NNAPI) runs computationally intensive operations for machine learning. This document provides an overview of how to implement a Neural Networks API driver for Android 9.
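For context on what such a driver ultimately serves, here is a sketch of the client side of NNAPI: building and running a one-operation model (element-wise ADD of two float32 tensors) through the NDK interface. Error checks are trimmed for brevity; each call returns ANEURALNETWORKS_NO_ERROR on success, and the program links against -lneuralnetworks.

```c
#include <stdio.h>
#include <android/NeuralNetworks.h>

int main(void) {
    /* Operands 0 and 1: input tensors; operand 2: the fused activation
     * scalar required by ADD; operand 3: the output tensor. */
    uint32_t dims[1] = {4};
    ANeuralNetworksOperandType tensor = {
        .type = ANEURALNETWORKS_TENSOR_FLOAT32,
        .dimensionCount = 1, .dimensions = dims, .scale = 0.f, .zeroPoint = 0};
    ANeuralNetworksOperandType scalar = {
        .type = ANEURALNETWORKS_INT32,
        .dimensionCount = 0, .dimensions = NULL, .scale = 0.f, .zeroPoint = 0};

    ANeuralNetworksModel* model = NULL;
    ANeuralNetworksModel_create(&model);
    ANeuralNetworksModel_addOperand(model, &tensor);   /* 0: input a    */
    ANeuralNetworksModel_addOperand(model, &tensor);   /* 1: input b    */
    ANeuralNetworksModel_addOperand(model, &scalar);   /* 2: activation */
    ANeuralNetworksModel_addOperand(model, &tensor);   /* 3: output     */

    int32_t none = ANEURALNETWORKS_FUSED_NONE;
    ANeuralNetworksModel_setOperandValue(model, 2, &none, sizeof(none));

    uint32_t in[] = {0, 1, 2}, out[] = {3};
    ANeuralNetworksModel_addOperation(model, ANEURALNETWORKS_ADD, 3, in, 1, out);

    uint32_t modelIn[] = {0, 1}, modelOut[] = {3};
    ANeuralNetworksModel_identifyInputsAndOutputs(model, 2, modelIn, 1, modelOut);
    ANeuralNetworksModel_finish(model);

    ANeuralNetworksCompilation* compilation = NULL;
    ANeuralNetworksCompilation_create(model, &compilation);
    ANeuralNetworksCompilation_finish(compilation);

    float a[4] = {1, 2, 3, 4}, b[4] = {10, 20, 30, 40}, result[4];
    ANeuralNetworksExecution* execution = NULL;
    ANeuralNetworksExecution_create(compilation, &execution);
    ANeuralNetworksExecution_setInput(execution, 0, NULL, a, sizeof(a));
    ANeuralNetworksExecution_setInput(execution, 1, NULL, b, sizeof(b));
    ANeuralNetworksExecution_setOutput(execution, 0, NULL, result, sizeof(result));

    /* Run asynchronously and wait for completion. */
    ANeuralNetworksEvent* event = NULL;
    ANeuralNetworksExecution_startCompute(execution, &event);
    ANeuralNetworksEvent_wait(event);
    ANeuralNetworksEvent_free(event);

    printf("%f %f %f %f\n", result[0], result[1], result[2], result[3]);

    ANeuralNetworksExecution_free(execution);
    ANeuralNetworksCompilation_free(compilation);
    ANeuralNetworksModel_free(model);
    return 0;
}
```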
Peripherals and accessories
Using a suite of standard protocols, you can implement compelling peripherals and other accessories that extend Android capabilities in a wide range of Android-powered devices.
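One of those standard protocols is the Android Open Accessory (AOA) protocol. The sketch below shows the AOA 1.0 handshake performed by the accessory, which acts as the USB host; it assumes libusb on the accessory side (libusb is not part of Android), an already-opened device handle, and placeholder identifying strings.

```c
#include <stdint.h>
#include <string.h>
#include <libusb-1.0/libusb.h>

/* AOA vendor control requests. */
#define AOA_GET_PROTOCOL    51
#define AOA_SEND_STRING     52
#define AOA_START_ACCESSORY 53

int switch_to_accessory_mode(libusb_device_handle* handle) {
    uint8_t version[2];
    /* 1. Ask the Android device which AOA protocol version it supports. */
    if (libusb_control_transfer(handle,
            LIBUSB_ENDPOINT_IN | LIBUSB_REQUEST_TYPE_VENDOR,
            AOA_GET_PROTOCOL, 0, 0, version, sizeof(version), 1000) < 0)
        return -1;

    /* 2. Send identifying strings (0 manufacturer, 1 model, 2 description,
     *    3 version, 4 URI, 5 serial); all values here are placeholders. */
    const char* strings[] = {"ExampleCorp", "ExampleAccessory",
                             "Demo accessory", "1.0",
                             "https://example.com", "0000000012345678"};
    for (uint16_t i = 0; i < 6; i++) {
        libusb_control_transfer(handle,
            LIBUSB_ENDPOINT_OUT | LIBUSB_REQUEST_TYPE_VENDOR,
            AOA_SEND_STRING, 0, i,
            (unsigned char*)strings[i], strlen(strings[i]) + 1, 1000);
    }

    /* 3. Ask the device to re-enumerate in accessory mode. */
    return libusb_control_transfer(handle,
            LIBUSB_ENDPOINT_OUT | LIBUSB_REQUEST_TYPE_VENDOR,
            AOA_START_ACCESSORY, 0, 0, NULL, 0, 1000);
}
```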
Sensors
Android sensors give apps access to a mobile device's underlying physical sensors. They are data-providing virtual devices defined by sensors.h, the sensor Hardware Abstraction Layer (HAL).
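As an illustration, the sketch below consumes that HAL through the legacy sensors.h interface: load the module, enable the first sensor it lists, and poll for events. It builds against libhardware; newer devices front this with the HIDL or AIDL sensors interface, so treat it as illustrative rather than current practice.

```c
#include <stdio.h>
#include <hardware/hardware.h>
#include <hardware/sensors.h>

int main(void) {
    const struct hw_module_t* hw_module = NULL;
    if (hw_get_module(SENSORS_HARDWARE_MODULE_ID, &hw_module) != 0) {
        fprintf(stderr, "no sensors HAL module\n");
        return 1;
    }
    struct sensors_module_t* module = (struct sensors_module_t*)hw_module;

    /* Enumerate the virtual sensors the HAL exposes. */
    const struct sensor_t* list = NULL;
    int count = module->get_sensors_list(module, &list);
    if (count <= 0) return 1;
    printf("first sensor: %s\n", list[0].name);

    /* Open the poll device and activate the first sensor. */
    struct sensors_poll_device_t* device = NULL;
    if (sensors_open(&module->common, &device) != 0) return 1;
    device->activate(device, list[0].handle, 1 /* enable */);

    /* Block until the HAL delivers a batch of events. */
    sensors_event_t events[16];
    int n = device->poll(device, events, 16);
    for (int i = 0; i < n; i++) {
        printf("sensor=%d type=%d t=%lld\n",
               events[i].sensor, events[i].type,
               (long long)events[i].timestamp);
    }

    device->activate(device, list[0].handle, 0 /* disable */);
    sensors_close(device);
    return 0;
}
```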
Context Hub Runtime Environment
Context Hub Runtime Environment (CHRE) provides a common platform for running system-level apps on a low-power processor, with a simple, standardized, embedded-friendly API. CHRE makes it easy for device OEMs to offload processing from the applications processor, saving battery, improving various areas of the user experience, and enabling a class of always-on, contextually aware features.
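A minimal sketch of a nanoapp written against the CHRE API is shown below: the three entry points every nanoapp exports, plus a continuous accelerometer subscription so the app keeps sensing while the applications processor sleeps. Build metadata (nanoapp ID, version, build files) is omitted, and the chosen sampling interval and latency are arbitrary example values.

```c
#include <chre.h>

static uint32_t gAccelHandle;

/* Called once when the nanoapp is loaded; return false to abort loading. */
bool nanoappStart(void) {
    chreLog(CHRE_LOG_INFO, "nanoapp starting");
    if (!chreSensorFindDefault(CHRE_SENSOR_TYPE_ACCELEROMETER, &gAccelHandle)) {
        return false;  /* no accelerometer available to this nanoapp */
    }
    /* Sample continuously every 20 ms, allowing up to 1 s of batching. */
    return chreSensorConfigure(gAccelHandle,
                               CHRE_SENSOR_CONFIGURE_MODE_CONTINUOUS,
                               20000000 /* ns */, 1000000000 /* ns */);
}

/* Called for every event delivered to this nanoapp. */
void nanoappHandleEvent(uint32_t senderInstanceId, uint16_t eventType,
                        const void *eventData) {
    if (eventType == CHRE_EVENT_SENSOR_ACCELEROMETER_DATA) {
        const struct chreSensorThreeAxisData *data = eventData;
        chreLog(CHRE_LOG_INFO, "got %u accel samples",
                (unsigned)data->header.readingCount);
    }
}

/* Called once before the nanoapp is unloaded. */
void nanoappEnd(void) {
    chreLog(CHRE_LOG_INFO, "nanoapp stopping");
}
```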