VR (virtual reality) and AR (augmented reality) have been at the forefront of development for years, with major technology companies and startups working to build their own XR software and hardware.
The problem, however, has been the relatively fragmented nature of this development. The space needed a ubiquitous platform that different companies and developers could use to build products and apps on a single ecosystem, much like what we see in the Android world. That’s where Qualcomm Technologies, Inc. gets serious with its Snapdragon Spaces™ technology, offering a simple and convenient way for developers to jump into the world of XR. Here’s what you need to know:
What is XR?
Before we get into all the exciting features that Snapdragon Spaces has to offer, let’s talk about XR (Extended Reality), the umbrella term that encompasses VR, AR, and MR (Mixed Reality).
VR and AR are the better known of the three. In the former, you find yourself in a completely computer-generated space and can interact with objects in the virtual world using the accompanying hardware. AR, on the other hand, mixes in the real world: you see everything in your physical space through a transparent display, while augmented projections add information and details to enhance the experience.
Finally, there is MR, which combines the best of both worlds. The environment is still computer-generated, but it takes images from the real world and incorporates them into the VR scene to create mixed reality.
There are three different types of computer-based “reality,” all of which fall under the umbrella term “XR.”
Qualcomm takes the lead with Snapdragon Spaces
Snapdragon Spaces is powered by Qualcomm, which is truly at the forefront of the XR space. Products built on Qualcomm’s XR processors are already on the market, most recently the Meta Quest 3 with the Snapdragon® XR2 Gen 2 platform and the Ray-Ban Meta smart glasses collection with the Snapdragon AR1 platform. Chips are essential, providing the processing power and power efficiency to run a variety of “real world” environments, but Qualcomm wants to do more than just make processors.
On the hardware side, Qualcomm is helping create reference devices that partners can use to quickly move through the prototyping phase and build commercially available products. At the same time, developers can utilize hardware development kits to test their apps. These hardware kits offer more than just processing power: they include not only powerful CPUs and GPUs, but also the sensors and other technologies needed to build and use XR apps, such as connectivity, hand-movement tracking, and plane detection.
And of course, Qualcomm’s involvement extends beyond the hardware. Snapdragon Spaces is a software development kit that developers can use to build their XR apps for a variety of products, from headworn devices to smartphones. It was initially only used by Qualcomm partners to develop headsets, but now the company is opening it up to all creators. Qualcomm is fully involved in expanding the ecosystem, providing developers with software support, updates, new features, and everything else they need to build XR apps.
For developers interested in creating XR apps for a variety of headsets that use Snapdragon Spaces, have Qualcomm processors, or use Android, the Snapdragon Spaces SDK is just what you need.
Snapdragon Spaces makes it easy to create XR apps
Starting a career in a technology field that is still in its infancy can be daunting. Developers interested in XR may be concerned about having to start with low-level libraries and build from scratch.
But Snapdragon Spaces builds on Unreal Engine and Unity, tools used by hundreds of thousands of developers around the world to create 3D games. You can use either to create XR apps, so you don’t have to relearn all your tools or start from scratch.
The Snapdragon Spaces SDK for Unreal Engine or Unity includes a variety of technologies you can use, including anchor points, hand tracking, object tracking, and plane detection. Anchor points let you pin content to specific points in space, so it stays exactly where it belongs in a virtual, augmented, or mixed 3D environment. Hand tracking, a key aspect of any XR world, follows the movements of your hands and controllers so you can interact with objects in space.
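The SDK handles all of this for you, but the idea behind an anchor can be sketched independently of any engine: an anchor stores a fixed world-space pose, and each frame the runtime re-expresses that pose relative to the device’s latest position and orientation, so anchored content stays put as you move. Here is a minimal, purely illustrative Python sketch (these function names are invented for the example and are not the Snapdragon Spaces API; the rotation convention assumes y is up):

```python
import math

def anchor_in_device_frame(anchor_world, device_pos, device_yaw):
    """Re-express a fixed world-space anchor in the device's local frame.

    anchor_world, device_pos: (x, y, z) world coordinates.
    device_yaw: device heading in radians, a rotation about the vertical y axis.
    """
    # Offset from the device to the anchor, still in world coordinates.
    dx = anchor_world[0] - device_pos[0]
    dy = anchor_world[1] - device_pos[1]
    dz = anchor_world[2] - device_pos[2]
    # Apply the inverse of the device's yaw rotation to that offset.
    c, s = math.cos(-device_yaw), math.sin(-device_yaw)
    return (c * dx + s * dz, dy, -s * dx + c * dz)

# The anchor itself never moves; only the device pose changes frame to frame.
anchor = (2.0, 0.0, 5.0)
print(anchor_in_device_frame(anchor, (0.0, 0.0, 0.0), 0.0))
print(anchor_in_device_frame(anchor, (2.0, 0.0, 4.0), math.pi / 2))
```

The second call shows the point: after the user walks forward and turns, the anchor’s device-relative coordinates change, but its world position does not, which is what keeps virtual content glued to real space.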
It also includes image and object tracking as well as plane detection, which is especially useful for augmented and mixed reality setups. Plane detection identifies flat horizontal and vertical surfaces in the physical environment, and once content can be projected onto those surfaces there’s a lot you can do with them, from object modeling to games.
Importantly, all of this is built into the SDK and available to developers within their tools of choice. This means you can spend your time being creative when building your XR app, without worrying about building support software from scratch at a lower level. You can trust the SDK to take care of everything for you and focus on the inspiration to create something great.
Great new feature: Dual Render Fusion
As mentioned earlier, Qualcomm is deeply involved in supporting and expanding the ecosystem, and this means new features. An exciting feature recently added to the SDK is called Dual Render Fusion.
Many current-generation headworn devices pair a wired or wirelessly connected headset with an Android smartphone. In this setup, the smartphone does most of the heavy lifting in terms of processing power, allowing for a lighter, slimmer, easier-to-wear headset.
Dual Render Fusion lets that smartphone act as a second display rather than just a second device used for processing. Developers get two screens they can use to display content on both the Android smartphone and the headset.
What’s really interesting is that this also acts as a stepping stone from regular 3D smartphone apps to XR apps that can be viewed on a headset. Rather than rebuilding an XR app from scratch, you can use this feature to add a second screen and migrate smartphone apps and app functionality to an XR environment.
For example, if you’re playing a game on your phone while wearing an AR headset, you can keep playing on the phone while the headset shows additional information from the game, such as inventory and statistics. The reverse also works: the headset becomes the main screen and additional details are displayed on the smartphone. All of this can be done without redeveloping the game specifically for AR or MR.
Dual Render Fusion allows developers to easily migrate from smartphone apps to XR apps without having to start from scratch.
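The "one app, two displays" idea above can be modeled in a few lines. This is a conceptual sketch only — the actual display routing lives inside the Snapdragon Spaces runtime, and the names here are invented for illustration — but it captures how main content can follow whichever screen is primary while companion views land on the other:

```python
def route_views(views, primary="headset"):
    """Split an app's views between the headset and the phone.

    views: list of (name, placement) pairs, where placement is "main"
    (follows the primary display) or "companion" (goes to the other one).
    """
    secondary = "phone" if primary == "headset" else "headset"
    targets = {"headset": [], "phone": []}
    for name, placement in views:
        if placement == "main":
            targets[primary].append(name)
        else:
            targets[secondary].append(name)
    return targets

game_views = [("world_render", "main"), ("inventory", "companion"), ("stats", "companion")]
print(route_views(game_views, primary="headset"))
print(route_views(game_views, primary="phone"))  # same app, roles swapped, no rewrite
```

The second call is the migration story in miniature: the app declares its views once, and swapping which display is primary requires no redevelopment.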
Jump to XR with Snapdragon Spaces
Often in new technology areas, it is difficult to bridge the gap between hardware and software due to a lack of support. How can you create an app for hardware that isn’t readily available? And even with access to hardware, a developer may not have a software platform to build an app for it.
Qualcomm is solving this problem by approaching it from both sides. Snapdragon Spaces builds on existing 3D tools such as Unreal Engine and Unity and provides a standard way to develop XR apps, with necessary features such as plane detection, object tracking, and hand tracking, so developers avoid a steep learning curve. Qualcomm is also constantly adding great new features like Dual Render Fusion, which lets you jump from a smartphone app to an XR app without having to recreate anything from scratch.
There are also a number of devices you can test on, from hardware reference devices available from Qualcomm to Snapdragon Spaces-enabled Android smartphones like the OnePlus 11. This really narrows the gap between software and hardware and lets developers reach their full potential with XR apps.
Sponsored by Qualcomm Technologies, Inc. Snapdragon and Qualcomm branded products are products of Qualcomm Technologies, Inc. and/or its subsidiaries.