Google has unveiled the schedule for its much-anticipated I/O developer conference, which includes developer-centric sessions set to shed light on its new Android XR operating system. A quick glance at the lineup, however, shows that Google isn’t quite ready to make Android XR the star of the show.
Things have been somewhat quiet around Android XR since its initial announcement last December, when it was introduced alongside Samsung’s ambitious ‘Project Moohan’ mixed reality headset. Neither has a confirmed release date yet, but both are expected to see the light of day later this year.
Google has lifted the veil on some Android XR features, most notably the eagerly awaited support for passthrough camera access. Developers already have their hands on the Android XR SDK, but there’s keen interest in how it will fare against more established XR platforms like Meta’s Horizon OS and Apple’s visionOS.
The Google I/O event, scheduled for May 20th and 21st, is set to feature numerous streamed keynotes, yet only two sessions are specifically dedicated to Android XR—and they won’t be streamed live. That said, there’s a “What’s New in Android” session that should include an Android XR segment.
While the live sessions might not reveal much, the two developer-focused talks indicate Google’s eagerness to pull developers into its XR ecosystem, albeit without the fanfare of the main keynote presentations.
Here’s what stands out from the session descriptions: Android XR is gearing up for a public release later this year. Google is assembling a fresh XR toolchain that combines Jetpack SceneCore and ARCore within its XR variant of Jetpack. Currently in developer preview, Jetpack XR lets developers working on mobile or large-screen Android apps craft spatially aware experiences using 3D models and immersive environments. Rolling ARCore into Jetpack XR suggests that Google is aiming for a cohesive platform for building both AR and VR applications.
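To give a concrete sense of what that unified toolchain looks like from a developer's seat, the Jetpack XR libraries are already published as alpha artifacts on Google's Maven repository. The sketch below shows a minimal Gradle setup; the artifact coordinates follow the developer preview as we understand it, and the version numbers are placeholders that will differ by release.

```kotlin
// build.gradle.kts (app module); illustrative only, version numbers are placeholders
dependencies {
    // Compose-based spatial UI for Android XR
    implementation("androidx.xr.compose:compose:1.0.0-alpha01")
    // SceneCore: 3D content, entities, and spatial media primitives
    implementation("androidx.xr.scenecore:scenecore:1.0.0-alpha01")
    // ARCore for Jetpack XR: perception features such as hand tracking and anchors
    implementation("androidx.xr.arcore:arcore:1.0.0-alpha01")
}
```

Pulling all three from the same androidx.xr namespace is what gives the "cohesive platform" framing its weight: one dependency family covers UI, 3D content, and perception.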
The sessions also emphasize enhancing current apps with XR components like 3D models, hand-tracking, and stereoscopic video. Google’s intent seems to be reaching beyond game development, as they strive for Android XR to reach feature parity with the wider Android ecosystem.
Furthermore, Jetpack Compose, Google’s declarative UI toolkit, is coming to XR, indicating a push towards a unified UI design experience that can translate seamlessly from mobile to immersive formats.
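To illustrate what that might look like in practice, here is a minimal, hedged sketch of spatializing an existing Compose screen with the Compose for XR developer preview. Subspace, SpatialPanel, and SubspaceModifier come from the preview's androidx.xr.compose artifact as currently documented, MyExistingScreen is a hypothetical stand-in for an app's current top-level composable, and exact names could shift before the public release.

```kotlin
import androidx.compose.material3.Text
import androidx.compose.runtime.Composable
import androidx.compose.ui.unit.dp
import androidx.xr.compose.spatial.Subspace
import androidx.xr.compose.subspace.SpatialPanel
import androidx.xr.compose.subspace.layout.SubspaceModifier
import androidx.xr.compose.subspace.layout.height
import androidx.xr.compose.subspace.layout.movable
import androidx.xr.compose.subspace.layout.resizable
import androidx.xr.compose.subspace.layout.width

// Hypothetical stand-in for an app's existing 2D Compose UI.
@Composable
fun MyExistingScreen() {
    Text("Existing large-screen content")
}

@Composable
fun SpatialMainScreen() {
    // When spatial UI is available, wrap the existing 2D UI in a floating
    // panel that the user can move and resize in the environment.
    Subspace {
        SpatialPanel(
            modifier = SubspaceModifier
                .width(1280.dp)
                .height(800.dp)
                .movable()
                .resizable()
        ) {
            MyExistingScreen()
        }
    }
}
```

The appeal for existing apps is that the panel content is ordinary Compose code, which is presumably what Google means by a UI design experience that carries over from mobile to immersive formats.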
The second session points to intriguing new AI capabilities being worked into Android XR, hinting at future possibilities like real-time object recognition, scene comprehension, or AI-fueled environments.
None of these sessions will be live-streamed, which hints at a reserved approach, but we’re still hoping to hear updates on Samsung’s forthcoming ‘Project Moohan’ headset, touted to be the first to run Android XR.
Regardless, we’ll be listening in on livestreams and following the technical discussions keenly, eager for new revelations.
Building Differentiated Apps for Android XR with 3D Content: Join developers Dereck Bridié and Patrick Fuentes as they introduce new toolsets like Jetpack SceneCore and ARCore for Jetpack XR. This session will walk through the process of adding 3D models, stereoscopic video, and hand-tracking capabilities to apps, offering a sneak peek into the Android XR SDK developer preview just ahead of its public rollout.
The Future Is Now, with Compose and AI on Android XR: Led by senior product manager Cecilia Abadie and developer relations engineer Jan Kleinert, this session explores the advancements in the Android XR SDK Beta launching at I/O. Discover how enhancements to Jetpack Compose for XR and new AI features can extend your existing large-screen apps into the realm of Android XR.