
Android 16 QPR2 Beta 2 is Here




Posted by Matthew McCullough, VP of Product Management, Android Developer



Android 16 QPR2 has reached Platform Stability today with Beta 2! That means that the API surface is locked, and the app-facing behaviors are final, so you can incorporate...

HDR and User Interfaces




Posted by Alec Mouri - Software Engineer






As explained in What is HDR?, we can think of HDR as referring simply to a luminance range brighter than SDR. When integrating HDR content into a user interface, you must be careful if your user interface consists primarily of SDR colors and assets. The human visual system adapts its perception of color based on the surrounding environment, which can lead to surprising results. We’ll look at one pertinent example.

Simultaneous Contrast

Consider the following image:

Source: Wikipedia


This image shows two gray rectangles with different background colors. For most people viewing this image, the two gray rectangles appear to be different shades of gray: the topmost rectangle with a darker background appears to be a lighter shade than the bottommost rectangle with a lighter background.

But these are the same shades of gray! You can prove this to yourself by using your favorite color picking tool or by looking at the below image:




This illustrates a visual phenomenon called simultaneous contrast. Readers who are interested in the biological explanation may learn more here.

Nearby differences in color are therefore “emphasized”: colors appear darker when immediately next to brighter colors. That same color would appear lighter when immediately next to darker colors.

Implications of Mixing HDR and SDR

Simultaneous contrast affects the appearance of user interfaces that present a mixture of HDR and SDR content. The peak luminance allowed by HDR creates the effect: the eye adapts* to a higher peak luminance (and often a higher average luminance in practice), which causes SDR content to appear perceptually dimmer even though its actual luminance has not changed at all. Users typically describe this as their phone screen looking “grey” or “washed out”.

We can see this phenomenon in the below image. The device on the right simulates how photos may appear with an SDR UI, if those photos were rendered as HDR. Note that the August photos look identical when compared side-by-side, but the quality of the SDR UI is visually degraded.




When designing for HDR, applications need to consider how much SDR is shown on screen at any given time when deciding how bright HDR is allowed to be. A UI that is dominated by SDR, such as a gallery view where small amounts of HDR content are displayed, can suddenly appear darker than expected.

When building your UI, consider the impact of HDR on text legibility and the appearance of nearby SDR assets, and use the appropriate APIs provided by your platform to constrain HDR brightness, or even disable HDR. For example, a 2x headroom for HDR brightness may be acceptable to balance the quality of your HDR scene with your SDR elements. In contrast, a UI that is dominated by HDR, such as full-screen video without other UI elements on top, does not need to consider this as strongly, as the focus of the UI is on the HDR content itself. In those situations, a 5x headroom (or higher, depending on content metadata such as UltraHDR's max_content_boost) may be more appropriate.

It might be tempting to “brighten” SDR content instead. Resist this temptation! This will cause your application to be too bright, especially if there are other applications or system UI elements on-screen.

How to control HDR headroom

Android 15 introduced a control for desired HDR headroom. Your application can request that the system use a particular HDR headroom based on the context of your UI:

If you only want to show SDR content, simply request no headroom.
If you only want to show HDR content, request a high HDR headroom, up to what the content demands.
If you want to show a mixture of HDR and SDR content, request an intermediate headroom value. Typical headroom amounts are around 2x for a mixed scene and 5-8x for a fully HDR scene.


Here is some example usage:

// Required for the window to respect the desired HDR headroom.
// Note that the equivalent API on SurfaceView does NOT require
// COLOR_MODE_HDR to constrain headroom if there is HDR content displayed
// on the SurfaceView.
window.colorMode = ActivityInfo.COLOR_MODE_HDR
// Illustrative values: different headroom values may be used depending on
// the desired headroom of the content AND particularities of the app's UI
// design.
window.desiredHdrHeadroom =
    if (/* SDR only */) {
        0f
    } else if (/* Mixed, mostly SDR */) {
        1.5f
    } else if (/* Mixed, mostly HDR */) {
        3f
    } else {
        /* HDR only */
        5f
    }


Other platforms also have APIs that give developers some control over constraining HDR content in their applications.

Web platforms have a coarser concept: the First Public Working Draft of the CSS Color HDR Module adds a constrained-high option to constrain the headroom for mixed HDR and SDR scenes. Within the Apple ecosystem, constrainedHigh is similarly coarse, reckoning with the challenges of displaying mixed HDR and SDR scenes on consumer displays.

If you are a developer who is considering supporting HDR, be thoughtful about how HDR interacts with your UI and use HDR headroom controls appropriately.



*There are other mechanisms the eye employs for light adaptation, like pupillary light reflex, which amplifies this visual phenomenon (brighter peak HDR light means the pupil constricts, which causes less light to hit the retina).

How Dashlane Brought Credential Manager to Wear OS with Only 78 New Lines of Code




Posted by John Zoeller - Developer Relations Engineer, Loyrn Hairston - Product Marketing Manager, and Jonathan Salamon - Dashlane Staff Software Engineer







Dashlane is a password management and provisioning tool that provides a secure way to manage user credentials, access control, and authentication across multiple systems and applications.

Dashlane has over 18 million users and 20,000 businesses in 180 countries. It’s available on Android, Wear OS, iOS, macOS, Windows, and as a web app with an extension for Chrome, Firefox, Edge, and Safari.

Recently, they expanded their offerings by creating a Wear OS app with a Credential Provider integration from the Credential Manager API, bringing passkeys to their clients and users on smartwatches.

Streamlining Authentication on Wear OS

Dashlane users have frequently requested a Wear OS solution that provides standalone authentication for their favorite apps. In the past, Wear OS lacked the key APIs necessary for this request, which kept Dashlane from being able to provide the functionality. In their words:

“Our biggest challenge was the lack of a standard credentials API on Wear OS, which meant that it was impossible to bring our core features to this platform.”

This has changed with the introduction of the new Credential Manager API on Wear OS.

Credential Manager provides a simplified, standardized user sign-in experience with built-in authentication options for passkeys, passwords, and federated identities like Sign in with Google. Conveniently, it can be implemented with minimal effort by reusing the same code as the mobile version.

The Dashlane team was thrilled to learn about this, as it meant they could save a lot of time and effort: “[The] CredentialManager API provides the same API on phones and Wear OS; you write the code only once to support multiple form factors.”

Selecting Dashlane-provided credentials is simple for users


After Dashlane had planned out their roadmap, they were able to execute their vision for the new app with only a small engineering investment, reusing 92% of the Credential Manager code from their mobile app. And because the developers built Dashlane’s app UI with Jetpack Compose for Wear OS, 60% of their UI code was also reused.




Developing for Wear OS

To provide credentials to other apps with Credential Manager, Dashlane needed to implement the Credential Provider interface on Wear OS. This proved to be a simple exercise in calling their existing mobile code, where Dashlane had already implemented behavior for credential querying and credential selection.

For example, Dashlane was able to reuse their logic to handle client invocations of CredentialManager.getCredential. When a client invokes this, the Android framework propagates the client’s getCredentialRequest to Dashlane’s CredentialProviderService.onBeginGetCredentialRequest implementation to retrieve the credentials specified in the request.
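
For context, here is a rough, illustrative sketch of what such a client-side Credential Manager call can look like (this is not Dashlane’s code; the request JSON, activity, and error handling are placeholder assumptions):

import android.app.Activity
import androidx.credentials.CredentialManager
import androidx.credentials.GetCredentialRequest
import androidx.credentials.GetPasswordOption
import androidx.credentials.GetPublicKeyCredentialOption
import androidx.credentials.exceptions.GetCredentialException

// Illustrative client-side call: when this runs, the framework fans the
// request out to providers such as Dashlane via onBeginGetCredentialRequest.
suspend fun signIn(activity: Activity, requestJson: String) {
    val credentialManager = CredentialManager.create(activity)
    val request = GetCredentialRequest(
        listOf(
            GetPasswordOption(),
            // requestJson is the WebAuthn request produced by your server.
            GetPublicKeyCredentialOption(requestJson = requestJson)
        )
    )
    try {
        val response = credentialManager.getCredential(activity, request)
        // Hand response.credential to your authentication backend.
    } catch (e: GetCredentialException) {
        // The user cancelled or no matching credential was available.
    }
}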

Dashlane delegates the logic for onBeginGetCredentialRequest to their handleGetCredentials function, below, which is shared between their mobile and Wear OS implementations.

// When a Credential Manager client calls 'getCredential', the Android
// framework invokes `onBeginGetCredentialRequest`. Dashlane
// implemented this `handleGetCredentials` function to handle some of
// the logic needed for `onBeginGetCredentialRequest`.
override fun handleGetCredentials(
    context: Context,
    request: BeginGetCredentialRequest
): List<CredentialEntry> =
    request.beginGetCredentialOptions.flatMap { option ->
        when (option) {
            // Handle passkey credentials
            is BeginGetPublicKeyCredentialOption -> {
                val passkeyRequestOptions = Gson().fromJson(
                    option.requestJson, PasskeyRequestOptions::class.java
                )

                credentialLoader.loadPasskeyCredentials(
                    passkeyRequestOptions.rpId,
                    passkeyRequestOptions.allowCredentials ?: listOf()
                ).map { passkey ->
                    val passkeyDisplayName = getSuggestionTitle(passkey, context)

                    PublicKeyCredentialEntry.Builder(
                        context,
                        passkeyDisplayName,
                        pendingIntentForGet(context, passkey.id),
                        option
                    )
                        .setLastUsedTime(passkey.locallyViewedDate)
                        .setIcon(buildMicroLogomarkIcon(context = context))
                        .setDisplayName(passkeyDisplayName)
                        .build()
                }
            }
            // Handle other credential types (passwords, etc.) here;
            // closing branch added for completeness
            else -> emptyList()
        }
    }

Reusing precise logic flows like this made it a breeze for Dashlane to implement their Wear OS app.

“The Credential Manager API is unified across phones and Wear OS, which was a huge advantage. It meant we only had to write our code once.”

Impact and Improved Growth

The team is excited to be among the first credential providers on wearables: “Being one of the first on Wear OS was a key differentiator for us. It reinforces our brand as an innovator, focusing on the user experience, better meeting and serving our users where they are.”

As an early adopter of this new technology, Dashlane has already seen early promise from its Wear OS app, as described by Dashlane software engineer Sebastien Eggenspieler: “In the first 3 months, our Wear OS app organically grew to represent 1% of our active device install base.”

With their new experience launched, Wear OS apps can now rely on Dashlane as a trusted credential provider for their own Credential Manager integrations, letting users log in with a single tap. Users can also view details about their credentials right from their wrist.

Dashlane’s innovative design helps users manage their credentials


Dashlane’s Recommendations to Wear OS Developers

With their implementation complete, the Dashlane team can offer some advice for other developers who are considering the Credential Manager API. Their message is clear: “the future is passwordless… and passkeys are leading the way, [so] provide a passkey option.”

Dashlane is a true innovator in their field and the preferred credential provider for many users, and we are thrilled to have them supporting Credential Manager. They truly inspired us with their commitment to providing Wear OS users with the best experience possible:

“We hope that in the future every app developer will migrate their existing users to the Credential Manager API.”

Get Started with Credential Manager

With its built-in secure authentication methods and elegant simplicity, the Credential Manager API provides a straightforward authentication experience for users that changes the game on Wear OS.

Want to find out more about how Dashlane is driving the future of end-user authentication? Check out our video blog with their team in Paris, and read about how they saw a 70% increase in sign-in conversion rates with passkeys.

To learn more about how you can implement Credential Manager, read our official developer and UX guides, and be sure to check out our brand new blog post and video blog as part of Wear OS Spotlight week!

We’ve also expanded our existing Credential Manager sample to support Wear OS, to help guide you along the way, and if you'd like to provide credentials like Dashlane, you can use our Credential Provider sample.

Finally, explore how you can start developing additional experiences for Wear OS today with our documentation and samples.

The evolution of Wear OS authentication




Posted by John Zoeller – Developer Relations Engineer



This post is part of Wear OS Spotlight Week. Today, we're focusing on implementing Credential Manager on Wear OS, aiming to streamline the authentication experience.

For all software develope...

Further explorations with Watch Face Push




Posted by Garan Jenkin – Developer Relations Engineer

This post is part of Wear OS Spotlight Week. Today, we're exploring the wonderful world of watch faces.

At Google I/O ‘25 we launched Watch Face Push, a new API aimed at enabling watch face mar...

Building experiences for Wear OS




Posted by Michael Stillwell – Developer Relations Engineer



This post is part of Wear OS Spotlight Week. Today, we're focusing on creating engaging experiences across the various surfaces available on the wrist.

Developing for the growing ecosystem of Wear OS is a unique and rewarding challenge that encourages you to think beyond mobile patterns. Wear's design philosophy focuses on crafting experiences for a device that's always with the user, where meaningful interactions take seconds, not minutes. A successful wearable app doesn't attempt to maximize screen time; it instead aims to deliver meaningful glanceable experiences that help people stay present and productive while on the go. This vision is now fully enabled by the next generation of hardware, which we explored last week with the introduction of the new Pixel Watch 4.

Wear OS devices also introduce constraints that push you to innovate. Power efficiency is critical, requiring you to build experiences that are both beautiful and battery-conscious. You'll also tackle challenges like handling offline use cases and catering for a variety of screen sizes.

Despite these differences, you'll find yourself on familiar technical foundations. Wear OS is based on Android, which means you can leverage your existing knowledge of the platform, architecture, developer APIs, and tools to create wearable experiences.

Wear OS surfaces

Wear OS offers a range of surfaces to inform and engage users. This allows you to tailor your app's presence on the watch, providing the right information at the right time and scaling your development investment to best meet your users' needs.

Watch faces display the time and are the first thing a user sees when they look at their watch. We'll cover watch faces in more detail in other blog posts across Wear OS Spotlight week.

The watch face is the first thing a user sees when they look at their watch


Apps provide a richer, more immersive UI for complex tasks that are too involved for other surfaces.

Apps support complex tasks and can scroll vertically


Notifications provide glanceable, time-sensitive information and actions.

A notification provides glanceable, time-sensitive information


Complications display highly-glanceable, relevant data from your app directly on the user's chosen watch face. Learn more about building complication data sources for Wear OS.

Complications display glanceable data from your app directly on the user's watch face.


Tiles (Widgets for Wear OS) offer fast, predictable access to information and actions with a simple swipe from the watch face.

Tiles offer fast, predictable information and actions


While the variety of Wear OS surfaces lets developers engage with users in different ways, getting started can feel overwhelming. We recommend approaching Wear OS development in phases, scaling up your investment over time:

Recommended Wear OS development phases: enhance the wearable experience of your Android app, build Tiles and complications, and then create a complete wearable experience.



Improve the wearable experience of your mobile app. You can do this with minimal effort: by default, notifications from your phone app are automatically bridged to the watch. Start by enhancing these with wearable-specific actions using NotificationCompat.WearableExtender, offering a more tailored experience without building a full Wear OS experience.
Build a companion experience. When you're ready for a dedicated UI, create a tethered app that depends on the phone app for its core features and data and works in tandem with it, allowing you to design a customized UI for the wrist and take advantage of surfaces like tiles and complications.
Graduate to a standalone app. Finally, you can evolve your app into a standalone experience that works independently of a phone, which is ideal for offline scenarios like exercising. This provides the most flexibility but also requires more effort to optimize for constraints like power efficiency.


Notifications

Notifications are a core part of the Wear OS experience, delivering glanceable, time-sensitive information and actions for the user. Because Wear OS is based on Android, it shares the same notification system as mobile devices, letting you leverage your existing knowledge to build rich experiences for the wrist.

From a development perspective, it helps to think of a notification not as a simple alert, but as a declarative UI data structure that is shared between the user's devices. You define the content and actions, and the system intelligently renders that information to best suit the context and form factor. This declarative approach has become increasingly powerful. On Wear OS, for example, it's the mechanism behind ongoing activities.

Alert-style notifications

One great thing about notifications is that you don't even need a Wear OS app for your users to see them on their watch. By default, notifications generated by your phone app are automatically "bridged", or mirrored, to a connected watch, providing an instant wearable presence for your app with no extra work. These bridged notifications include an action to open the app on the phone.

You can enhance this default behavior by adding wearable-specific functionality to your phone notifications. Using NotificationCompat.WearableExtender, you can add actions that only appear on the watch, offering a more tailored experience without needing to build a full Wear OS app.

// Prerequisites:
//
// 1. You've created the notification channel identified by channelId below
// 2. You've obtained the POST_NOTIFICATIONS permission

val channelId = "my_channel_id"
val sender = "Clem"
val subject = "..."

val notification =
    NotificationCompat.Builder(applicationContext, channelId)
        .apply {
            setContentTitle("New mail from $sender")
            setContentText(subject)
            setSmallIcon(R.drawable.new_mail_mobile)
            // Added for Wear OS
            extend(
                NotificationCompat.WearableExtender().apply {
                    setSmallIcon(R.drawable.new_mail_wear)
                }
            )
        }
        .build()

NotificationManagerCompat.from(applicationContext).notify(0, notification)


Prevent duplicate notifications

Once you build a dedicated app for Wear OS, you'll need to develop a clear notification strategy to avoid a common challenge: duplicate notifications. Since notifications from your phone app are bridged by default, a user with both your phone and watch apps installed could see two alerts for the same event.

Wear OS provides a straightforward way to manage this:

On the mobile app's notification, assign a string identifier using setBridgeTag().
In your Wear OS app, you can then programmatically prevent notifications with certain tags from being bridged using a BridgingConfig. This gives you fine-grained control, allowing you to bridge some notifications while handling others natively in your Wear OS app.


If your mobile and watch apps generate similar but distinct notifications, you can link them using setDismissalId(). When a user dismisses a notification on one device, any notification with the same dismissal ID on another connected device is also dismissed.
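
As a rough phone-side sketch of both techniques (the channel, tag, dismissal ID, and drawable names are illustrative placeholders), you can tag a notification so your Wear OS app can exclude it from bridging, and give it a dismissal ID so dismissals stay in sync:

import android.app.Notification
import android.content.Context
import androidx.core.app.NotificationCompat

// Illustrative only: channel, tag, dismissal ID, and drawable are placeholders.
fun buildTaggedNotification(context: Context): Notification =
    NotificationCompat.Builder(context, "my_channel_id")
        .setContentTitle("New message")
        .setSmallIcon(R.drawable.new_mail_mobile)
        .extend(
            NotificationCompat.WearableExtender()
                // Your Wear OS app can exclude this tag from bridging
                // via its BridgingConfig.
                .setBridgeTag("messages")
                // Dismissing this notification on one device dismisses
                // notifications with the same ID on connected devices.
                .setDismissalId("message_1234")
        )
        .build()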

Creating interactive experiences

From a user's perspective, apps and tiles may feel very similar. Both are full-screen experiences that are visually rich, support animations, and handle user interaction. The main differences are in how they are launched, and their specific capabilities:

Apps can be deeply immersive and handle complex, multi-step tasks. They are the obvious choice when handling data that must be synced between the watch app and its associated phone app, and the only choice for long-running tasks like tracking workouts and listening to music.
Tiles are designed for fast, predictable access to the information and actions users need most, providing glanceable content with a simple swipe from the watch face. Think of tiles as widgets for Wear OS.


Apps and tiles are built using distinct technologies. Apps can be built with Jetpack Compose, while tiles are defined declaratively using the ProtoLayout library. This distinction allows each surface to be highly optimized for its specific role – apps can provide rich, interactive experiences while tiles remain fast and power-efficient.

Building apps

Apps provide the richest experience on Wear OS. Jetpack Compose for Wear OS is the recommended UI toolkit for building them – it works seamlessly with other Jetpack libraries and accelerates development. Many prominent apps, like Gmail, Calendar and Todoist, are built entirely with Compose for Wear OS.

Compose for Wear OS for beautiful UIs

If you've used Jetpack Compose for mobile development, you'll find that Compose for Wear OS shares the same foundational principles and mental model. However, building for the wrist requires some different techniques, and the toolkit provides a specialized UI component library optimized for watches.

Wear OS has its own dedicated Material Design, foundation, and navigation libraries to use instead of the mobile Jetpack libraries. These libraries provide UI components tailored for round screens and glanceable interactions, and are each supported by Android Studio's preview system.


Lists: On mobile, you might use a LazyColumn to display a vertical collection of items. On Wear OS, the TransformingLazyColumn is the equivalent component. It applies scaling and transparency effects to items at the edge of a round screen, improving legibility. It also has built-in support for scrolling with rotary input.
Navigation: Handling screen transitions and the back stack also requires a component that’s specific to Wear OS. Instead of the standard NavHost, you must use SwipeDismissableNavHost. This component works with the system's swipe-to-dismiss gesture, ensuring users can intuitively navigate back to the previous screen.


Learn how to use Jetpack Compose on Wear OS to get started, including sample code.
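
To make the difference concrete, here is a minimal, illustrative sketch of Wear OS navigation (the route names and screen contents are placeholder assumptions, not a complete app):

import androidx.compose.runtime.Composable
import androidx.wear.compose.material3.Button
import androidx.wear.compose.material3.Text
import androidx.wear.compose.navigation.SwipeDismissableNavHost
import androidx.wear.compose.navigation.composable
import androidx.wear.compose.navigation.rememberSwipeDismissableNavController

@Composable
fun WearApp() {
    // Wear-specific nav controller and host: the system's swipe-to-dismiss
    // gesture pops the back stack, so no back button is needed.
    val navController = rememberSwipeDismissableNavController()
    SwipeDismissableNavHost(
        navController = navController,
        startDestination = "home" // placeholder route names
    ) {
        composable("home") {
            // Placeholder screen content; a real screen would typically use a
            // TransformingLazyColumn for scrolling lists.
            Button(onClick = { navController.navigate("details") }) {
                Text("Show details")
            }
        }
        composable("details") {
            Text("Details")
        }
    }
}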

Implementing core app features

Wear OS also provides APIs designed for power efficiency and the on-wrist use case, as well as Wear OS versions of mobile APIs:

Authentication: Credential Manager API unifies the user sign-in process and supports modern, secure methods like passkeys, passwords, and federated identity services (like Sign-in with Google), providing a seamless and secure experience without relying on a companion phone.
Ambient: To handle the low-power ambient (always-on) state, we recommend using the AmbientLifecycleObserver to receive callbacks for state transitions. In the onEnterAmbient() callback, adjust your UI for low-power display by dimming colors and hiding non-essential elements. Use onExitAmbient() to restore your app's full UI. Learn more about always-on apps and system ambient mode, and see the sketch after this list.
Health and Fitness (sensor data): While you can use the standard Android Sensor APIs, it's not recommended for performance reasons, especially for long-running workouts. Instead, use Health Services on Wear OS. It acts as an intermediary to the various sensors, providing your app with batched, power-efficient updates for everything from heart rate to running metrics, without needing to manage the underlying sensors directly.
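
As an illustrative sketch of the ambient recommendation above (the activity and the UI-switching helpers are placeholder assumptions), an activity can register an AmbientLifecycleObserver and react to the transitions:

import android.os.Bundle
import androidx.activity.ComponentActivity
import androidx.wear.ambient.AmbientLifecycleObserver

class MainActivity : ComponentActivity() {

    private val ambientCallback =
        object : AmbientLifecycleObserver.AmbientLifecycleCallback {
            override fun onEnterAmbient(
                ambientDetails: AmbientLifecycleObserver.AmbientDetails
            ) {
                // Placeholder: dim colors and hide non-essential elements.
                switchToAmbientUi()
            }

            override fun onExitAmbient() {
                // Placeholder: restore the full interactive UI.
                switchToInteractiveUi()
            }
        }

    private val ambientObserver = AmbientLifecycleObserver(this, ambientCallback)

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        lifecycle.addObserver(ambientObserver)
    }

    // Hypothetical helpers: a real app would update its Compose state or views.
    private fun switchToAmbientUi() { /* ... */ }
    private fun switchToInteractiveUi() { /* ... */ }
}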


Building tiles

Tiles offer quick, predictable access to the information and actions users need most, reachable with a simple swipe from the watch face. By using platform data bindings to display sources like step count or heart rate, you can provide timely and useful information in your tile.

Tiles are built declaratively using the ProtoLayout libraries, which are optimized for performance and power efficiency—critical considerations on a wearable device. Learn more about how to get started with tiles and how to make use of sample tile layouts.
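
For orientation, here is a minimal, illustrative tile service sketch using the ProtoLayout builders (the service name, text, and layout are placeholder assumptions; a real tile would typically start from the sample layouts mentioned above and must also be registered in the manifest):

import androidx.wear.protolayout.LayoutElementBuilders
import androidx.wear.protolayout.ResourceBuilders
import androidx.wear.protolayout.TimelineBuilders
import androidx.wear.tiles.RequestBuilders
import androidx.wear.tiles.TileBuilders
import androidx.wear.tiles.TileService
import com.google.common.util.concurrent.Futures
import com.google.common.util.concurrent.ListenableFuture

private const val RESOURCES_VERSION = "1"

// Illustrative tile service: declare it in the manifest with the
// BIND_TILE_PROVIDER permission and the TileProvider intent filter.
class HelloTileService : TileService() {

    override fun onTileRequest(
        requestParams: RequestBuilders.TileRequest
    ): ListenableFuture<TileBuilders.Tile> =
        Futures.immediateFuture(
            TileBuilders.Tile.Builder()
                .setResourcesVersion(RESOURCES_VERSION)
                // A single static layout; real tiles often use timelines and
                // platform data bindings (e.g. heart rate, step count).
                .setTileTimeline(
                    TimelineBuilders.Timeline.fromLayoutElement(
                        LayoutElementBuilders.Text.Builder()
                            .setText("Hello, tile!")
                            .build()
                    )
                )
                .build()
        )

    override fun onTileResourcesRequest(
        requestParams: RequestBuilders.ResourcesRequest
    ): ListenableFuture<ResourceBuilders.Resources> =
        Futures.immediateFuture(
            ResourceBuilders.Resources.Builder()
                .setVersion(RESOURCES_VERSION)
                .build()
        )
}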

More resources for building experiences for Wear OS

Wear OS Documentation Hub: The essential resource for developers looking to create experiences for Wear OS, from design guidelines to code samples.
App design guidance: The official resource to learn how to design for Wear OS.
Compose starter sample app: A starter project that provides a solid foundation for a new Wear OS app.
Media sample app from Jetcaster: A sample podcast app showcasing how to reuse code between form factors, updated to Material 3 Expressive on Wear OS.
WearTilesKotlin sample app: Demonstrates the fundamentals of building a tile but also includes templates for common layouts, letting you quickly bootstrap your own designs while following best practices.
Compose on Wear OS codelab: A step-by-step tutorial for building a functional app for Wear OS from scratch.
Tiles on Wear OS codelab: For a more guided, step-by-step introduction to building tiles.


There has never been a better time to start building for Wear OS. If you have feedback on the APIs, please let us know using the issue trackers for Wear Compose and Tiles. We look forward to seeing what you build!

Welcome to Wear OS Spotlight Week




Posted by Chiara Chiappini – Android Developer Relations Engineer, and Kevin Hufnagle - Android Technical Writer






Wear OS is rapidly expanding its presence in the market, presenting a unique and significant opportunity for developers. With a gr...

Android 16 QPR2 Beta 1 is here




Posted by Matthew McCullough – VP of Product Management, Android Developer



Today we're releasing Android 16 quarterly platform release 2 (QPR2) Beta 1, providing you with an early opportunity to try out the APIs and features that are moving Andro...

Media3 1.8.0 – What’s new?




Posted by Toni Heidenreich – Engineering Manager




This release includes several bug fixes, performance improvements, and new features. Read on to find out more, and as always please check out the full release notes for a comprehensive overview of...

What is HDR?




Posted by John Reck – Software Engineer

For Android developers, delivering exceptional visual experiences is a continuous goal. High Dynamic Range (HDR) unlocks new possibilities, offering the potential for more vibrant and immersive content. Techn...

Our first Spotlight Week: diving into Android 15




Posted by Aaron Labiaga – Android Developer Relations Engineer




By now, you’ve probably heard the news: Android 15 was just released earlier today to AOSP. To celebrate, we’re kicking off a new series called “Spotlight Week” where we’ll shine a li...

Android 15 is released to AOSP




Posted by Matthew McCullough – VP of Product Management, Android Developer



Today we're releasing Android 15 and making the source code available at the Android Open Source Project (AOSP). Android 15 will be available on supported Pixel de...

Android Studio Koala Feature Drop is Stable!




Posted by Sandhya Mohan, Product Manager, Android Studio



Today, we are thrilled to announce the stable release of Android Studio Koala Feature Drop (2024.1.2)!🐨

Earlier this year, we announced that every Android Studio animal version will have ...

Adding 16 KB Page Size to Android




Posted by Steven Moreland – Staff Software Engineer, Sandeep Patil – Principal Software Engineer



A page is the granularity at which an operating system manages memory. Most CPUs today support a 4 KB page size and so the Android OS and applica...