
7.2. Input Devices

Device implementations:

7.2.1. Keyboard

If device implementations include support for third-party Input Method Editor (IME) applications, they:

Device implementations:

  • [C-0-1] MUST NOT include a hardware keyboard that does not match one of the formats specified in android.content.res.Configuration.keyboard (QWERTY or 12-key); a non-normative sketch of how this field surfaces to applications follows this list.
  • SHOULD include additional soft keyboard implementations.
  • MAY include a hardware keyboard.
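
The following sketch is a non-normative illustration of how the android.content.res.Configuration.keyboard field referenced above surfaces to applications; the activity name is hypothetical and the snippet only shows one way of reading the reported keyboard format.

    import android.app.Activity;
    import android.content.res.Configuration;
    import android.os.Bundle;
    import android.util.Log;

    // Hypothetical activity that logs the keyboard format the device reports.
    public class KeyboardConfigActivity extends Activity {
        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            Configuration config = getResources().getConfiguration();
            switch (config.keyboard) {
                case Configuration.KEYBOARD_QWERTY:
                    Log.i("KeyboardConfig", "Hardware QWERTY keyboard reported");
                    break;
                case Configuration.KEYBOARD_12KEY:
                    Log.i("KeyboardConfig", "Hardware 12-key keyboard reported");
                    break;
                case Configuration.KEYBOARD_NOKEYS:
                default:
                    Log.i("KeyboardConfig", "No hardware keyboard; soft keyboard expected");
                    break;
            }
        }
    }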

7.2.2. Non-touch Navigation

Android includes support for d-pad, trackball, and wheel as mechanisms for non-touch navigation.

Device implementations:

If device implementations lack non-touch navigation, they:

  • [C-1-1] MUST provide a reasonable alternative user interface mechanism for the selection and editing of text, compatible with Input Management Engines. The upstream Android open source implementation includes a selection mechanism suitable for use with devices that lack non-touch navigation inputs.
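
As a non-normative illustration of the navigation mechanisms named above, an application can inspect which non-touch navigation hardware a device reports through android.content.res.Configuration.navigation; the helper class name below is hypothetical.

    import android.content.Context;
    import android.content.res.Configuration;

    // Illustrative helper (hypothetical name) that maps the reported
    // non-touch navigation hardware to a human-readable label.
    public final class NavigationProbe {
        public static String describeNavigation(Context context) {
            Configuration config = context.getResources().getConfiguration();
            switch (config.navigation) {
                case Configuration.NAVIGATION_DPAD:      return "d-pad";
                case Configuration.NAVIGATION_TRACKBALL: return "trackball";
                case Configuration.NAVIGATION_WHEEL:     return "wheel";
                case Configuration.NAVIGATION_NONAV:
                default:                                 return "no non-touch navigation";
            }
        }
    }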

7.2.3. Navigation Keys

The Home, Recents, and Back functions, typically provided via an interaction with a dedicated physical button or a distinct portion of the touch screen, are essential to the Android navigation paradigm and therefore, device implementations:

  • [C-0-1] MUST provide a user affordance to launch installed applications that have an activity with the <intent-filter> set with ACTION=MAIN and CATEGORY=LAUNCHER or CATEGORY=LEANBACK_LAUNCHER for Television device implementations. The Home function SHOULD be the mechanism for this user affordance (see the sketch after this list).
  • SHOULD provide buttons for the Recents and Back function.
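
The following sketch is a non-normative illustration of the user affordance in [C-0-1]: a Home application typically enumerates launchable applications by querying for activities whose intent filter declares ACTION_MAIN with CATEGORY_LAUNCHER (or CATEGORY_LEANBACK_LAUNCHER on Television devices). The class name is hypothetical.

    import android.content.Context;
    import android.content.Intent;
    import android.content.pm.PackageManager;
    import android.content.pm.ResolveInfo;
    import java.util.List;

    // Hypothetical helper showing how a Home app can list activities that
    // declare ACTION_MAIN with CATEGORY_LAUNCHER in their intent filter.
    // Television launchers would use Intent.CATEGORY_LEANBACK_LAUNCHER instead.
    public final class LauncherQuery {
        public static List<ResolveInfo> queryLaunchableActivities(Context context) {
            Intent mainIntent = new Intent(Intent.ACTION_MAIN);
            mainIntent.addCategory(Intent.CATEGORY_LAUNCHER);
            PackageManager pm = context.getPackageManager();
            return pm.queryIntentActivities(mainIntent, 0);
        }
    }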

If the Home, Recents, or Back functions are provided, they:

  • [C-1-1] MUST be accessible with a single action (e.g. tap, double-click or gesture) when any of them are accessible.
  • [C-1-2] MUST provide a clear indication of which single action would trigger each function. Having a visible icon imprinted on the button, showing a software icon on the navigation bar portion of the screen, or walking the user through a guided step-by-step demo flow during the out-of-box setup experience are examples of such an indication.

Device implementations:

  • [SR] Are STRONGLY RECOMMENDED to not provide the input mechanism for the Menu function as it is deprecated in favor of the action bar since Android 4.0.

If device implementations provide the Menu function, they:

  • [C-2-1] MUST display the action overflow button whenever the action overflow menu popup is not empty and the action bar is visible.
  • [C-2-2] MUST NOT modify the position of the action overflow popup displayed by selecting the overflow button in the action bar, but MAY render the action overflow popup at a modified position on the screen when it is displayed by selecting the Menu function.

If device implementations do not provide the Menu function, for backwards compatibility, they:

  • [C-3-1] MUST make the Menu function available to applications when targetSdkVersion is less than 10, either by a physical button, a software key, or gestures. This Menu function should be accessible unless hidden together with other navigation functions.
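
As a non-normative illustration related to the Menu function, an application can ask whether the device reports a permanent Menu key through ViewConfiguration; the helper class name is hypothetical.

    import android.content.Context;
    import android.view.ViewConfiguration;

    // Illustrative snippet: applications running on API 14+ can check whether
    // the device reports a permanent (hardware) Menu key and, if not, rely on
    // the action bar overflow instead.
    public final class MenuKeyCheck {
        public static boolean hasPermanentMenuKey(Context context) {
            return ViewConfiguration.get(context).hasPermanentMenuKey();
        }
    }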

If device implementations provide the Assist function, they:

  • [C-4-1] MUST make the Assist function accessible with a single action (e.g. tap, double-click or gesture) when other navigation keys are accessible.
  • [SR] Are STRONGLY RECOMMENDED to use a long press on the HOME function as this designated interaction.
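
For illustration only, the Assist function corresponds on the application side to the activity that handles Intent.ACTION_ASSIST; the sketch below, with a hypothetical helper name, shows one way an application can hand off to it.

    import android.content.ActivityNotFoundException;
    import android.content.Context;
    import android.content.Intent;

    // Hypothetical helper: launches whatever activity handles the Assist
    // function (Intent.ACTION_ASSIST), if one is available on the device.
    public final class AssistLauncher {
        public static boolean launchAssist(Context context) {
            Intent assistIntent = new Intent(Intent.ACTION_ASSIST);
            assistIntent.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
            try {
                context.startActivity(assistIntent);
                return true;
            } catch (ActivityNotFoundException e) {
                return false; // no assist activity installed
            }
        }
    }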

If device implementations use a distinct portion of the screen to display the navigation keys, they:

  • [C-5-1] Navigation keys MUST use a distinct portion of the screen, not available to applications, and MUST NOT obscure or otherwise interfere with the portion of the screen available to applications.
  • [C-5-2] MUST make available a portion of the display to applications that meets the requirements defined in section 7.1.1.
  • [C-5-3] MUST honor the flags set by the app through the View.setSystemUiVisibility() API method, so that this distinct portion of the screen (a.k.a. the navigation bar) is properly hidden away as documented in the SDK.
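
A minimal, non-normative sketch of the app-side API referenced in [C-5-3]: an application hides the navigation bar (and status bar) by setting system UI visibility flags on its decor view. The helper class name is hypothetical.

    import android.app.Activity;
    import android.view.View;

    // Illustrative use of View.setSystemUiVisibility(): hide the navigation
    // bar (and status bar) in sticky immersive mode so the distinct portion
    // of the screen used for navigation keys is given back to the app.
    public final class ImmersiveModeHelper {
        public static void enterStickyImmersiveMode(Activity activity) {
            View decorView = activity.getWindow().getDecorView();
            decorView.setSystemUiVisibility(
                    View.SYSTEM_UI_FLAG_IMMERSIVE_STICKY
                            | View.SYSTEM_UI_FLAG_HIDE_NAVIGATION
                            | View.SYSTEM_UI_FLAG_FULLSCREEN
                            | View.SYSTEM_UI_FLAG_LAYOUT_HIDE_NAVIGATION
                            | View.SYSTEM_UI_FLAG_LAYOUT_FULLSCREEN
                            | View.SYSTEM_UI_FLAG_LAYOUT_STABLE);
        }
    }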

If the navigation function is provided as an on-screen, gesture-based action:

If a navigation function is provided from anywhere on the left and right edges of the current orientation of the screen:

  • [C-7-1] The navigation function MUST be Back and provided as a swipe from both left and right edges of the current orientation of the screen.
  • [C-7-2] If custom swipeable system panels are provided on the left or right edges, they MUST be placed within the top 1/3rd of the screen with a clear, persistent visual indication that dragging in would invoke the aforementioned panels, and hence not Back. A system panel MAY be configured by a user such that it lands below the top 1/3rd of the screen edge(s), but the system panel MUST NOT occupy more than 1/3rd of the edge(s).
  • [C-7-3] When the foreground app has either the View.SYSTEM_UI_FLAG_IMMERSIVE or View.SYSTEM_UI_FLAG_IMMERSIVE_STICKY flags set, swiping from the edges MUST behave as implemented in AOSP, which is documented in the SDK.
  • [C-7-4] When the foreground app has either the View.SYSTEM_UI_FLAG_IMMERSIVE or View.SYSTEM_UI_FLAG_IMMERSIVE_STICKY flags set, custom swipeable system panels MUST be hidden until the user brings in the system bars (a.k.a. navigation and status bar) as implemented in AOSP.
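
As a non-normative, app-side illustration of how application content interacts with the edge swipe for Back described above, a view on API level 29 or higher can ask the system to exclude small edge regions from system gesture handling; the view subclass below is hypothetical and the 48-pixel strip is an arbitrary example.

    import android.content.Context;
    import android.graphics.Rect;
    import android.view.View;
    import java.util.Collections;

    // Hypothetical view that excludes a small strip along its left edge from
    // system gesture handling (API 29+), e.g. for an in-app drawer handle.
    // The system may still limit how much exclusion it honors.
    public class EdgeHandleView extends View {
        public EdgeHandleView(Context context) {
            super(context);
        }

        @Override
        protected void onLayout(boolean changed, int left, int top, int right, int bottom) {
            super.onLayout(changed, left, top, right, bottom);
            Rect handle = new Rect(0, 0, 48, getHeight());
            setSystemGestureExclusionRects(Collections.singletonList(handle));
        }
    }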

7.2.4. Touchscreen Input

Android includes support for a variety of pointer input systems, such as touchscreens, touch pads, and fake touch input devices. Touchscreen-based device implementations are associated with a display such that the user has the impression of directly manipulating items on screen. Since the user is directly touching the screen, the system does not require any additional affordances to indicate the objects being manipulated.

Device implementations:

  • SHOULD have a pointer input system of some kind (either mouse-like or touch).
  • SHOULD support fully independently tracked pointers.

If device implementations include a touchscreen (single-touch or better) on a primary Android-compatible display, they:

  • [C-1-1] MUST report TOUCHSCREEN_FINGER for the Configuration.touchscreen API field.
  • [C-1-2] MUST report the android.hardware.touchscreen and android.hardware.faketouch feature flags.

If device implementations include a touchscreen that can track more than a single touch on a primary Android-compatible display, they:

  • [C-2-1] MUST report the appropriate feature flags android.hardware.touchscreen.multitouch, android.hardware.touchscreen.multitouch.distinct, android.hardware.touchscreen.multitouch.jazzhand corresponding to the type of the specific touchscreen on the device.

If device implementations rely on an external input device such as mouse or trackball (i.e. not directly touching the screen) for input on a primary Android-compatible display and meet the fake touch requirements in section 7.2.5, they:

  • [C-3-1] MUST NOT report any feature flag starting with android.hardware.touchscreen.
  • [C-3-2] MUST report only android.hardware.faketouch.
  • [C-3-3] MUST report TOUCHSCREEN_NOTOUCH for the Configuration.touchscreen API field.
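
For illustration only, the feature flags and the Configuration.touchscreen field referenced in this section surface to applications through PackageManager and Configuration; the probe class below is hypothetical.

    import android.content.Context;
    import android.content.pm.PackageManager;
    import android.content.res.Configuration;
    import android.util.Log;

    // Hypothetical probe that logs how the device reports its touch input,
    // using the feature flags and Configuration.touchscreen field from 7.2.4.
    public final class TouchCapabilityProbe {
        public static void logTouchCapabilities(Context context) {
            PackageManager pm = context.getPackageManager();
            boolean touchscreen = pm.hasSystemFeature(PackageManager.FEATURE_TOUCHSCREEN);
            boolean fakeTouch = pm.hasSystemFeature(PackageManager.FEATURE_FAKETOUCH);
            boolean multitouchDistinct =
                    pm.hasSystemFeature(PackageManager.FEATURE_TOUCHSCREEN_MULTITOUCH_DISTINCT);

            int reported = context.getResources().getConfiguration().touchscreen;
            boolean finger = (reported == Configuration.TOUCHSCREEN_FINGER);
            boolean noTouch = (reported == Configuration.TOUCHSCREEN_NOTOUCH);

            Log.i("TouchProbe", "touchscreen=" + touchscreen
                    + " faketouch=" + fakeTouch
                    + " multitouch.distinct=" + multitouchDistinct
                    + " TOUCHSCREEN_FINGER=" + finger
                    + " TOUCHSCREEN_NOTOUCH=" + noTouch);
        }
    }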

7.2.5. Fake Touch Input

A fake touch interface provides a user input system that approximates a subset of touchscreen capabilities. For example, a mouse or remote control that drives an on-screen cursor approximates touch, but requires the user to first point or focus and then click. Numerous input devices, such as the mouse, trackpad, gyro-based air mouse, gyro-pointer, joystick, and multi-touch trackpad, can support fake touch interactions. Android includes the feature constant android.hardware.faketouch, which corresponds to a high-fidelity non-touch (pointer-based) input device such as a mouse or trackpad that can adequately emulate touch-based input (including basic gesture support) and indicates that the device supports an emulated subset of touchscreen functionality.

If device implementations do not include a touchscreen but include another pointer input system which they want to make available, they:

  • SHOULD declare support for the android.hardware.faketouch feature flag.

If device implementations declare support for android.hardware.faketouch, they:

  • [C-1-1] MUST report the absolute X and Y screen positions of the pointer location and display a visual pointer on the screen.
  • [C-1-2] MUST report the touch event with the action code that specifies the state change that occurs when the pointer goes down or up on the screen.
  • [C-1-3] MUST support pointer down and up on an object on the screen, which allows users to emulate tap on an object on the screen.
  • [C-1-4] MUST support pointer down, pointer up, pointer down then pointer up in the same place on an object on the screen within a time threshold, which allows users to emulate double tap on an object on the screen.
  • [C-1-5] MUST support pointer down on an arbitrary point on the screen, pointer move to any other arbitrary point on the screen, followed by a pointer up, which allows users to emulate a touch drag.
  • [C-1-6] MUST support pointer down then allow users to quickly move the object to a different position on the screen and then pointer up on the screen, which allows users to fling an object on the screen.
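
The following non-normative sketch shows the same gestures from the application side: a GestureDetector classifies the pointer down/up sequences described in [C-1-3] through [C-1-6] into tap, double tap, drag (scroll), and fling callbacks. The wrapper class name is hypothetical.

    import android.content.Context;
    import android.view.GestureDetector;
    import android.view.MotionEvent;
    import android.view.View;

    // Hypothetical wiring: a GestureDetector classifies the pointer-down/up
    // sequences required above into tap, double tap, drag, and fling callbacks.
    public final class FakeTouchGestures {
        public static void attach(Context context, View target) {
            final GestureDetector detector = new GestureDetector(context,
                    new GestureDetector.SimpleOnGestureListener() {
                        @Override
                        public boolean onSingleTapUp(MotionEvent e) {
                            return true; // pointer down then up: tap
                        }

                        @Override
                        public boolean onDoubleTap(MotionEvent e) {
                            return true; // two taps within the time threshold
                        }

                        @Override
                        public boolean onScroll(MotionEvent e1, MotionEvent e2,
                                                float dx, float dy) {
                            return true; // pointer down, move, up: drag
                        }

                        @Override
                        public boolean onFling(MotionEvent e1, MotionEvent e2,
                                               float vx, float vy) {
                            return true; // quick move then pointer up: fling
                        }
                    });
            target.setOnTouchListener((v, event) -> detector.onTouchEvent(event));
        }
    }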

If device implementations declare support for android.hardware.faketouch.multitouch.distinct, they:

  • [C-2-1] MUST declare support for android.hardware.faketouch.
  • [C-2-2] MUST support distinct tracking of two or more independent pointer inputs.

If device implementations declare support for android.hardware.faketouch.multitouch.jazzhand, they:

  • [C-3-1] MUST declare support for android.hardware.faketouch.
  • [C-3-2] MUST support distinct tracking of 5 (tracking a hand of fingers) or more pointer inputs fully independently.
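
As a non-normative illustration of distinct pointer tracking, an application observes each independently tracked pointer in a single MotionEvent through its pointer index and stable pointer ID; the logger class name is hypothetical.

    import android.util.Log;
    import android.view.MotionEvent;

    // Illustrative handler: logs every independently tracked pointer in a
    // MotionEvent using its stable pointer ID, which is how "distinct"
    // multitouch tracking is observed by applications.
    public final class MultiTouchLogger {
        public static void logPointers(MotionEvent event) {
            int pointerCount = event.getPointerCount();
            for (int i = 0; i < pointerCount; i++) {
                int pointerId = event.getPointerId(i);
                float x = event.getX(i);
                float y = event.getY(i);
                Log.d("MultiTouch", "pointer " + pointerId + " at (" + x + ", " + y + ")");
            }
        }
    }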

7.2.6. Game Controller Support

7.2.6.1. Button Mappings

Device implementations:

  • [C-1-1] MUST be capable of mapping HID events to the corresponding InputEvent constants as listed in the tables below. The upstream Android implementation satisfies this requirement.

If device implementations embed a controller or ship with a separate controller in the box that would provide a means to input all the events listed in the tables below, they:

  • [C-2-1] MUST declare the feature flag android.hardware.gamepad.

Button                     HID Usage [2]      Android Button
A [1]                      0x09 0x0001        KEYCODE_BUTTON_A (96)
B [1]                      0x09 0x0002        KEYCODE_BUTTON_B (97)
X [1]                      0x09 0x0004        KEYCODE_BUTTON_X (99)
Y [1]                      0x09 0x0005        KEYCODE_BUTTON_Y (100)
D-pad up [1]
D-pad down [1]             0x01 0x0039 [3]    AXIS_HAT_Y [4]
D-pad left [1]
D-pad right [1]            0x01 0x0039 [3]    AXIS_HAT_X [4]
Left shoulder button [1]   0x09 0x0007        KEYCODE_BUTTON_L1 (102)
Right shoulder button [1]  0x09 0x0008        KEYCODE_BUTTON_R1 (103)
Left stick click [1]       0x09 0x000E        KEYCODE_BUTTON_THUMBL (106)
Right stick click [1]      0x09 0x000F        KEYCODE_BUTTON_THUMBR (107)
Home [1]                   0x0c 0x0223        KEYCODE_HOME (3)
Back [1]                   0x0c 0x0224        KEYCODE_BACK (4)

[1] KeyEvent

[2] The above HID usages must be declared within a Game pad CA (0x01 0x0005).

[3] This usage must have a Logical Minimum of 0, a Logical Maximum of 7, a Physical Minimum of 0, a Physical Maximum of 315, Units in Degrees, and a Report Size of 4. The logical value is defined to be the clockwise rotation away from the vertical axis; for example, a logical value of 0 represents no rotation and the up button being pressed, while a logical value of 1 represents a rotation of 45 degrees and both the up and left keys being pressed.

[4] MotionEvent

Analog Controls [1]   HID Usage                    Android Button
Left Trigger          0x02 0x00C5                  AXIS_LTRIGGER
Right Trigger         0x02 0x00C4                  AXIS_RTRIGGER
Left Joystick         0x01 0x0030 / 0x01 0x0031    AXIS_X / AXIS_Y
Right Joystick        0x01 0x0032 / 0x01 0x0035    AXIS_Z / AXIS_RZ

[1] MotionEvent
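
The following non-normative sketch shows how the mappings in the tables above reach an application: gamepad buttons arrive as KeyEvents carrying KEYCODE_BUTTON_* constants, while d-pad hats, sticks, and triggers arrive as MotionEvent axes. The android.hardware.gamepad flag itself is exposed to applications as PackageManager.FEATURE_GAMEPAD. The activity name is hypothetical.

    import android.app.Activity;
    import android.view.InputDevice;
    import android.view.KeyEvent;
    import android.view.MotionEvent;

    // Hypothetical activity showing how the HID mappings above surface to
    // apps: gamepad buttons as KeyEvents, sticks/triggers/d-pad as axes.
    public class GamepadActivity extends Activity {
        @Override
        public boolean onKeyDown(int keyCode, KeyEvent event) {
            if ((event.getSource() & InputDevice.SOURCE_GAMEPAD) == InputDevice.SOURCE_GAMEPAD) {
                switch (keyCode) {
                    case KeyEvent.KEYCODE_BUTTON_A:  // 0x09 0x0001
                    case KeyEvent.KEYCODE_BUTTON_B:  // 0x09 0x0002
                    case KeyEvent.KEYCODE_BUTTON_L1: // 0x09 0x0007
                    case KeyEvent.KEYCODE_BUTTON_R1: // 0x09 0x0008
                        return true; // handle the button press in game logic
                }
            }
            return super.onKeyDown(keyCode, event);
        }

        @Override
        public boolean onGenericMotionEvent(MotionEvent event) {
            if ((event.getSource() & InputDevice.SOURCE_JOYSTICK) == InputDevice.SOURCE_JOYSTICK) {
                float leftX = event.getAxisValue(MotionEvent.AXIS_X);
                float leftY = event.getAxisValue(MotionEvent.AXIS_Y);
                float rightX = event.getAxisValue(MotionEvent.AXIS_Z);
                float rightY = event.getAxisValue(MotionEvent.AXIS_RZ);
                float hatX = event.getAxisValue(MotionEvent.AXIS_HAT_X);       // d-pad left/right
                float hatY = event.getAxisValue(MotionEvent.AXIS_HAT_Y);       // d-pad up/down
                float lTrigger = event.getAxisValue(MotionEvent.AXIS_LTRIGGER);
                float rTrigger = event.getAxisValue(MotionEvent.AXIS_RTRIGGER);
                // ... feed the axis values into game logic ...
                return true;
            }
            return super.onGenericMotionEvent(event);
        }
    }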

7.2.7. Remote Control

See Section 2.3.1 for device-specific requirements.