Google today launched the first Android 11 developer preview, available for download now at developer.android.com. The preview includes a preview SDK for developers with system images for the Pixel 2, Pixel 2 XL, Pixel 3, Pixel 3 XL, Pixel 3a, Pixel 3a XL, Pixel 4, and Pixel 4 XL, as well as the official Android Emulator.
While it’s the fifth year running that Google has released the first developer preview of the next Android version in Q1, this is the earliest developer preview yet. Android N (later named Android Nougat), Android O (Android Oreo), Android P (Android Pie), and Android Q (Android 10) were all first shown off in March. Last year, Google used the Android Beta Program, which lets you get early Android builds via over-the-air updates on select devices. This year, however, Google is not making the first preview available as a beta — you’ll need to manually flash your device. In other words, Android 11 is not ready for early adopters to try, just developers.
If Google sticks to its usual cadence, the company will likely open Android 11 testing to more phones with the second developer preview. The first Android 11 developer preview is, however, available on a couple more phones (eight Pixels as opposed to six). Still, that’s a tiny slice of the over 2.5 billion monthly active Android devices — the main reason developers are eager to see what’s new for the platform in the first place.
With Android Q, Google talked up support for 5G and foldable devices. The company is keeping those going, while also emphasizing machine learning improvements.
If you want the short version, here are the first Android 11 developer preview highlights: 5G experiences, people and conversations improvements, Neural Networks API 1.3, privacy and security features, Google Play System updates, app compatibility, connectivity, image and camera improvements, and low latency tweaks.
Developer Preview 1 features
Still not satisfied? Here is the longer version of all the new APIs and features (and there is more to come; this is just the developer preview, after all):
- Bandwidth estimator API: Google updated this API for 5G to make it easier to check the downstream/upstream bandwidth, without needing to poll the network or compute your own estimate. If the modem doesn’t provide support, the API makes a default estimation based on the current connection.
- Dynamic meteredness API: This API lets you check whether the connection is unmetered and, if so, offer higher resolution or quality that may use more data. The API now includes cellular networks so that you can identify users whose carriers are offering truly unmetered data while connected to the carrier’s 5G network.
- Pinhole and waterfall screens: Apps can manage pinhole screens and waterfall screens using the existing display cutout APIs. If you want, a new API lets your app use the entire waterfall screen including the edges, with insets to help you manage interaction near the edges.
- Dedicated conversations section in the notification shade: Users can find their ongoing conversations with people in their favorite apps.
- Bubbles: Help users keep conversations in view and accessible while they multitask on their phones. Messaging and chat apps should use the Bubbles API on notifications to enable this in Android 11.
- Insert images into notification replies: If your app supports image copy/paste, you can now let users insert assets directly into notification inline replies as well as in the app itself. Image copy support is initially available in Chrome, while image paste support will show up in Gboard’s clipboard.
- Neural Networks API (NNAPI) 1.3: Google is expanding the operations and controls for running computationally intensive operations for machine learning on Android devices. Quality of Service APIs support priority and timeout for model execution and Memory Domain APIs reduce memory copying and transformation for consecutive model execution. Quantization support has gained signed integer asymmetric quantization, where signed integers are used in place of floating point numbers to enable smaller models and faster inference.
- One-time permission: For the most sensitive types of data — not just location but also for the device microphone and camera — users can now grant temporary access through a one-time permission. This permission means that apps can access the data until the user moves away from the app, and they must then request permission again for the next access.
- Scoped storage: Enhancements including opt-in raw file path access for media, updated DocumentsUI, and batch edit operations in MediaStore.
- Safer location access: Google updated its Play policy to ensure that apps request location permissions only when truly necessary.
- Biometrics: Expanded to meet the needs of a wider range of devices. BiometricPrompt now supports three authenticator types with different levels of granularity — strong, weak, and device credential. The BiometricPrompt has been decoupled from the app’s Activity lifecycle to make it easier to integrate with various app architectures, and to improve the transaction UI.
- Platform hardening: Expanded use of compiler-based sanitizers in security-critical components, including BoundSan, IntSan, CFI, and Shadow-Call Stack. Google has also enabled heap pointer tagging for apps targeting Android 11 or higher, to help apps catch memory issues in production. The team used HWAsan to find and fix memory errors — developers can now use HWAsan-enabled system images to find such issues in their apps.
- Secure storage and sharing of data: Apps can now share data blobs easily and more safely with other apps through a BlobstoreManager. The Blob store is ideal for use cases like sharing ML models among multiple apps for the same user.
- Identity credentials: Android 11 adds platform support for secure storage and retrieval of verifiable identification documents, such as ISO 18013-5 compliant Mobile Driving Licenses.
- Google Play System Updates: Project Mainline has gained 12 new updatable modules, for a total of 22 modules. New modules include a permissions module that standardizes user and developer access to critical privacy controls on Android devices, a media provider module, and an NNAPI module that optimizes performance and guarantees consistent APIs across devices.
- Minimizing the impact of behavior changes: Android 11 minimizes behavioral changes that could affect apps by closely reviewing their impact and by making them opt-in, wherever possible, until the developer sets targetSdkVersion to Android 11.
- Easier testing and debugging: Many of the breaking changes are toggleable. Developers can force-enable or disable the changes individually from Developer options or ADB, so there’s no longer a need to change targetSdkVersion or recompile apps for basic testing.
- Updated greylists: Google updated the lists of restricted non-SDK interfaces.
- Dynamic resource loader: A Resource Loader framework in Android 11 loads resources and assets dynamically at runtime.
- New platform stability milestone: A new release milestone called “Platform Stability” is set for early June. This milestone includes not only final SDK/NDK APIs, but also final internal APIs and system behaviors that may affect apps.
- Call screening service improvements: Call-screening apps can now do more to help users. Apps can get the incoming call’s STIR/SHAKEN verification status as part of the call details, and they can customize a system-provided post call screen to let users perform actions such as marking a call as spam or adding to contacts.
- Wi-Fi suggestion API enhancements: Google extended the Wi-Fi suggestion API to give connectivity management apps greater ability to manage their own networks.
- Passpoint enhancements: Android now enforces and notifies about expiration date of a Passpoint profile, supports Common Name specification in the profile, and allows self-signed private CAs for Passpoint R1 profiles. Connectivity apps can now use the Wi-Fi suggestion API to manage Passpoint networks.
- HEIF animated drawables: The ImageDecoder API now lets you decode and render image sequence animations stored in HEIF files, so you can make use of high-quality assets while minimizing impact on network data and APK size. HEIF image sequences can offer drastic file-size reductions for image sequences when compared to animated GIFs. Developers can display HEIF image sequences in their apps by calling decodeDrawable with an HEIF source. If the source contains a sequence of images, an AnimatedImageDrawable is returned.
- Native image decoder: New NDK APIs let apps decode and encode images (such as JPEG, PNG, and WebP) from native code for graphics or post processing, while retaining a smaller APK size since you don’t need to bundle an external library. The native decoder also takes advantage of Android’s process for ongoing platform security updates.
- Muting during camera capture: Apps can use new APIs to mute vibration from ringtones, alarms, or notifications while a camera capture session is active.
- Bokeh modes: Apps can use metadata tags to enable bokeh modes on camera capture requests on devices that support it. A still-image mode offers the highest-quality capture, while a continuous mode ensures that capture keeps up with sensor output, such as for video capture.
- Low-latency video decoding in MediaCodec: Low latency video is critical for real-time video streaming apps and services like Stadia. Video codecs that support low latency playback return the first frame of the stream as quickly as possible after decoding starts. Apps can use new APIs to check and configure low-latency playback for a specific codec.
- HDMI low-latency mode: Apps can use new APIs to check for and request auto low latency mode (also known as game mode) on external displays and TVs. In this mode, the display or TV disables graphics post-processing in order to minimize latency.
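To make the connectivity items above concrete, here is a minimal sketch (not from Google's release notes) of how an app might combine the existing bandwidth-estimate getters with the new temporarily-unmetered capability via a default network callback; the bandwidth threshold and the stream-switching reaction are hypothetical app logic:

```kotlin
import android.content.Context
import android.net.ConnectivityManager
import android.net.Network
import android.net.NetworkCapabilities

// Sketch: watch the default network and react to bandwidth/meteredness changes.
fun watchNetwork(context: Context) {
    val cm = context.getSystemService(ConnectivityManager::class.java)
    cm.registerDefaultNetworkCallback(object : ConnectivityManager.NetworkCallback() {
        override fun onCapabilitiesChanged(network: Network, caps: NetworkCapabilities) {
            // Modem-backed estimate on Android 11; a default estimate otherwise.
            val downKbps = caps.linkDownstreamBandwidthKbps
            // New in Android 11: a carrier can flag 5G data as temporarily unmetered.
            val unmetered =
                caps.hasCapability(NetworkCapabilities.NET_CAPABILITY_NOT_METERED) ||
                    caps.hasCapability(NetworkCapabilities.NET_CAPABILITY_TEMPORARILY_NOT_METERED)
            if (unmetered && downKbps > 10_000) {
                // e.g. offer a higher-resolution stream (app-specific, hypothetical)
            }
        }
    })
}
```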
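The one-time permission change needs no new API on the app side: you request the permission as usual, and on Android 11 the system dialog offers an "Only this time" option, so the grant may silently lapse once the user leaves the app. A hedged sketch using the AndroidX Activity Result API (the `startRecording`/`showRationale` helpers are hypothetical placeholders):

```kotlin
import android.Manifest
import android.content.pm.PackageManager
import androidx.activity.result.contract.ActivityResultContracts
import androidx.appcompat.app.AppCompatActivity
import androidx.core.content.ContextCompat

class RecorderActivity : AppCompatActivity() {
    // Request flow is unchanged; on Android 11 the dialog can grant one-time access.
    private val requestMic =
        registerForActivityResult(ActivityResultContracts.RequestPermission()) { granted ->
            if (granted) startRecording() else showRationale()
        }

    fun recordIfAllowed() {
        if (ContextCompat.checkSelfPermission(this, Manifest.permission.RECORD_AUDIO) ==
            PackageManager.PERMISSION_GRANTED
        ) {
            // A one-time grant may have expired, so re-check before every use.
            startRecording()
        } else {
            requestMic.launch(Manifest.permission.RECORD_AUDIO)
        }
    }

    private fun startRecording() { /* app-specific */ }
    private fun showRationale() { /* app-specific */ }
}
```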
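For the biometrics item, the three authenticator levels surface in the AndroidX BiometricPrompt library as `BIOMETRIC_STRONG`, `BIOMETRIC_WEAK`, and `DEVICE_CREDENTIAL` flags. A minimal sketch of requesting a strong biometric with a device-credential fallback (the prompt title and the success reaction are illustrative):

```kotlin
import androidx.biometric.BiometricManager
import androidx.biometric.BiometricManager.Authenticators.BIOMETRIC_STRONG
import androidx.biometric.BiometricManager.Authenticators.DEVICE_CREDENTIAL
import androidx.biometric.BiometricPrompt
import androidx.core.content.ContextCompat
import androidx.fragment.app.FragmentActivity

// Sketch: ask for a strong biometric, falling back to the device PIN/pattern/password.
fun authenticate(activity: FragmentActivity) {
    val allowed = BIOMETRIC_STRONG or DEVICE_CREDENTIAL
    if (BiometricManager.from(activity).canAuthenticate(allowed) !=
        BiometricManager.BIOMETRIC_SUCCESS
    ) return

    val prompt = BiometricPrompt(
        activity,
        ContextCompat.getMainExecutor(activity),
        object : BiometricPrompt.AuthenticationCallback() {
            override fun onAuthenticationSucceeded(result: BiometricPrompt.AuthenticationResult) {
                // unlock protected content (app-specific)
            }
        }
    )
    prompt.authenticate(
        BiometricPrompt.PromptInfo.Builder()
            .setTitle("Verify it's you")
            .setAllowedAuthenticators(allowed)
            .build()
    )
}
```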
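The HEIF animated drawables item maps onto a short ImageDecoder call, as the bullet describes; a sketch, assuming the HEIF sequence ships as a local file:

```kotlin
import android.graphics.ImageDecoder
import android.graphics.drawable.AnimatedImageDrawable
import android.widget.ImageView
import java.io.File

// Sketch: decode an HEIF image sequence and animate it in an ImageView.
fun showHeifAnimation(view: ImageView, file: File) {
    val source = ImageDecoder.createSource(file)
    val drawable = ImageDecoder.decodeDrawable(source)
    view.setImageDrawable(drawable)
    // A source containing a sequence of images yields an AnimatedImageDrawable.
    if (drawable is AnimatedImageDrawable) {
        drawable.start()
    }
}
```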
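For low-latency decoding, the new hooks are a codec feature flag to check and a format key to set. A hedged sketch of the check-then-configure pattern (the surrounding decoder setup is assumed to exist elsewhere in the app):

```kotlin
import android.media.MediaCodec
import android.media.MediaCodecInfo
import android.media.MediaFormat

// Sketch: check for low-latency support and request it when configuring a decoder.
fun configureLowLatency(codec: MediaCodec, format: MediaFormat) {
    val caps = codec.codecInfo
        .getCapabilitiesForType(format.getString(MediaFormat.KEY_MIME)!!)
    if (caps.isFeatureSupported(MediaCodecInfo.CodecCapabilities.FEATURE_LowLatency)) {
        // New in Android 11: ask the codec to return frames as quickly as possible.
        format.setInteger(MediaFormat.KEY_LOW_LATENCY, 1)
    }
    codec.configure(format, /* surface = */ null, /* crypto = */ null, /* flags = */ 0)
}
```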
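And the HDMI low-latency mode item boils down to one query plus one window flag; a minimal sketch, assuming the activity is already showing on the external display:

```kotlin
import android.app.Activity
import android.os.Build

// Sketch: request auto low latency mode ("game mode") on the current display.
fun requestLowLatencyDisplay(activity: Activity) {
    if (Build.VERSION.SDK_INT >= 30 &&
        activity.display?.isMinimalPostProcessingSupported == true
    ) {
        // Asks the TV or display to skip graphics post-processing to cut latency.
        activity.window.setPreferMinimalPostProcessing(true)
    }
}
```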
After you’ve flashed Android 11 onto your device or fired up the Android Emulator, you’ll want to update your Android Studio environment with the Android 11 Preview SDK (setup guide). For a complete rundown on what’s new, check the API overview, API reference, and diff report.
The goal of the first developer preview is to let developers play with the build early so they can explore new features and APIs for apps, test for compatibility, and give feedback before more details are shared at I/O 2020, scheduled for May 12 to May 14. More new features and capabilities will be released in subsequent previews and betas.
Last year, there were six betas. This year, there will be three developer previews and three betas. Here’s the preview/beta schedule for Android 11:
- February: Developer Preview 1 (Early baseline build focused on developer feedback, with new features, APIs, and behavior changes.)
- March: Developer Preview 2 (Incremental update with additional features, APIs, and behavior changes.)
- April: Developer Preview 3 (Incremental update for stability and performance.)
- May: Beta 1 (Initial beta-quality release, over-the-air update to early adopters who enroll in Android Beta.)
- June: Beta 2 (Platform Stability milestone. Final APIs and behaviors. Play publishing opens.)
- Q3: Beta 3 (Release candidate build.)
- Q3: Final release (Android 11 release to AOSP and ecosystem.)
Google is asking developers to make their apps compatible with Android 11 so that their users can expect a seamless transition when they upgrade. If you find any bugs, you can report them here.