Editorial frame

How to read this page without getting lost in the hardware cycle

This page now does more than list products. It separates durable infrastructure from short-lived device hype, shows where open standards actually matter, and highlights which toolchains are realistic for different kinds of teams.

What lasts

The stack outlives any single headset

The most durable decisions happen below the device layer: OpenXR, WebXR, glTF, OpenUSD, identity, and low-latency infrastructure. Hardware changes quickly; portable formats and runtimes compound.

What ships now

Smartphones and passthrough MR are still the practical bridge

For most teams, browser-based AR, ARKit or ARCore pilots, and OpenXR-capable passthrough devices remain the fastest way to validate spatial UX before betting on lightweight all-day glasses.

Where to start

Choose by deployment context, not by brand prestige

Enterprise collaboration, browser-native publishing, premium spatial computing, and asset-pipeline authoring each need different tools. The next section gives concrete starting paths.

Decision guide

Recommended starting paths
by team and objective

If you are deciding what to build with first, these are the most defensible starting points in the current Augmentiverse landscape.

Browser-first pilot

For teams testing public-facing spatial experiences

Start with WebXR, glTF assets, and Three.js or Babylon.js. Use smartphones and headset browsers first so the experience remains URL-addressable, easy to share, and less dependent on app-store friction.
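The graceful-degradation idea behind a URL-addressable pilot can be sketched in a few lines. This is a hedged sketch: `pickPresentationMode` and the mode strings are hypothetical names of ours, but `navigator.xr.isSessionSupported` is the standard WebXR Device API call.

```javascript
// Choose how to present a URL-addressable spatial experience.
// Falls back to a flat 3D canvas when WebXR is absent (desktop
// browsers without XR hardware, or non-browser runtimes), which is
// what keeps the experience shareable as a plain link.
// `pickPresentationMode` and the returned strings are our own names;
// `isSessionSupported` is the standard WebXR Device API method.
async function pickPresentationMode(xr = globalThis.navigator?.xr) {
  if (!xr) return "inline-3d"; // no WebXR: plain Three.js/Babylon.js canvas
  if (await xr.isSessionSupported("immersive-ar")) return "immersive-ar";
  if (await xr.isSessionSupported("immersive-vr")) return "immersive-vr";
  return "inline-3d";
}
```

On a recent smartphone this typically resolves to immersive-ar; on a desktop without a headset it degrades to the inline viewer, so the same URL works for everyone.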

Enterprise MR

For training, assistance, and anchored shared work

Prioritise OpenXR-capable headsets, Unity-based deployment, and shared anchoring services. HoloLens, Magic Leap, and Android XR class devices remain more realistic than consumer glasses for high-reliability field workflows.

Premium content pipeline

For teams building durable 3D asset workflows

Treat glTF and OpenUSD as the center of gravity. Author in Blender, Maya, 3ds Max, or Omniverse, validate before publishing, and keep the asset layer portable so content can move across runtimes over time.

Editorial note
This page covers tools across five layers of the Augmentiverse stack: (1) display hardware — what users wear; (2) development SDKs — how experiences are built; (3) 3D engines — how content is rendered; (4) authoring tools — how assets are created; and (5) infrastructure — how content is delivered at low latency. All specifications and pricing are as of April 2026. Prices are in USD unless noted.
Augmentiverse technology stack

The Augmentiverse requires a complete, interoperable stack from physical display to infrastructure. Each layer below links to the tools covered on this page.

User layer
AR Glasses · MR Headsets · Smartphones · VR Headsets
Runtime layer
OpenXR 1.1 · WebXR API · ARKit · ARCore · visionOS · Android XR
Engine layer
Unity 6 · Unreal Engine 5 · Three.js / Babylon.js · RealityKit
Asset layer
glTF 2.0 · OpenUSD · USDZ · Blender · Maya / 3ds Max
Identity layer
W3C DIDs · Verifiable Credentials · Ceramic Network
Network layer
ETSI MEC · 5G / Wi-Fi 7 · AWS Wavelength · Azure Edge Zones
01 — AR & AI smart glasses

Consumer & prosumer
AR glasses

The fastest-growing hardware segment in the Augmentiverse ecosystem. Global XR device shipments grew 44.4% year-over-year in 2025, driven almost entirely by smart glasses rather than VR or MR headsets (IDC, March 2026). Meta holds approximately 82% of smart glasses shipments (Counterpoint Research, H2 2025).

Consumer · Prosumer
~$699
X1 chip · 3DoF · spatial display
Xreal One Pro
Xreal · 2025

The best true AR display glasses for spatial computing enthusiasts. The proprietary X1 chip handles all display processing and 3DoF tracking entirely on-device, without needing external software. Creates a massive virtual display floating in space — up to 100-inch equivalent. Connects to any USB-C device including smartphones, laptops, and gaming consoles.

Display Micro-OLED · 1080p per eye · 500 nits
Tracking 3DoF on-device via X1 chip
Connection USB-C wired · phone/laptop/console
Platform Nebula OS · Android / iOS / macOS / Windows
Weight ~80g
Developer
$99/mo
46° FOV · hand tracking · standalone
Spectacles Gen 5
Snap Inc. · 2024 · Developer only

Standalone AR glasses from Snap with one of the widest fields of view in the consumer category (46°). Features hand tracking, gesture recognition, and full Snapchat Lens integration. Currently developer-only via subscription. A critical platform for social and creative AR content creators, used to build and prototype next-generation AR experiences.

FOV 46° diagonal — widest in class
Compute Standalone · Snapdragon XR processor
Input Hand tracking · voice · touch
Platform Snap OS · Lens Studio SDK
Access Developer subscription ($99/month)
Consumer
~$500–700
Micro-LED binocular · standalone
RayNeo X2
TCL · RayNeo · 2024–2025

The first standalone AR glasses with binocular full-colour Micro-LED waveguide displays, powered by Qualcomm Snapdragon XR2. Supports real-time translation, smart assistant functions, and navigation. One of the most technically capable standalone AR glasses available to consumers, with built-in processing independent of a smartphone.

Display Micro-LED binocular full-colour
Processor Qualcomm Snapdragon XR2
Features Navigation · translation · AI assistant
Compute Standalone (no phone required)
Developer · Open source
~$349
AI · 39g · open source · Noa AI
Frame
Brilliant Labs · 2024

An open-source AI AR glasses platform weighing just 39g. Uses a small projector to display text and simple graphics, powered by the Noa AI assistant for real-time question answering, translation, and object identification. As an open-source platform, developers can build and publish their own apps — making Frame a creative playground for AI hardware innovation.

Weight 39g — ultralight
Display Micro projector · text & simple graphics
AI Noa multimodal assistant · on-device + cloud
Platform Open source · Lua scripting · app store
Upcoming 2027 · Consumer
TBD
??? acetate frame · "icon" design · AI
Apple Smart Glasses
Apple · rumoured late 2026 / 2027

Apple's unreleased smart glasses, reported by Bloomberg's Mark Gurman to be on track for a late 2026 or early 2027 debut. Display-less at launch (camera and microphones, AI-powered, no display), positioned to rival Meta's Ray-Ban glasses directly. Apple is leaning into premium materials — an acetate frame — with what it internally calls an "icon" design goal. Expect deep Apple Intelligence integration.

Display None at launch (camera + audio AI)
Material Acetate frame — luxury build quality
AI Apple Intelligence · Siri integration
Timeline Projected late 2026 / early 2027
02 — Mixed Reality headsets

Spatial computing
headsets & MR devices

Mixed Reality headsets are the primary platform for the Augmentiverse's core use cases — persistent spatial holograms anchored to the physical world, with full environmental awareness. These devices map their physical surroundings and render digital content that stays precisely in place.

Enterprise
~$3,500
see-through · 52° FOV · Windows MR
HoloLens 2
Microsoft · 2019 · enterprise standard

The most mature and widely deployed enterprise MR headset. Optical see-through holographic displays with a holographic density of over 2.5K light points per radian. Advanced hand and eye tracking, voice input, and a flip-up visor for switching between physical and holographic modes. The reference platform for industrial AR: surgery, manufacturing, field service, and training. Deeply integrated with Microsoft Azure and Azure Remote Assist.

Display Waveguide · 2K per eye · 52° FOV
Tracking Eye + hand + voice · depth sensor
Platform Windows Holographic · Azure Spatial Anchors
Standards OpenXR · Unity / Unreal Engine support
Battery ~2–3 hours (tethered option available)
Enterprise
~$3,299
260g · 70° FOV · AMD Zen 2 · dimming
Magic Leap 2
Magic Leap · 2022 · enterprise

Magic Leap 2 is more powerful and lighter than HoloLens 2 at just 260g, with an AMD Zen 2 quad-core processor and RDNA 2 GPU. Its best-in-class 70° field of view is the widest of any enterprise AR headset. Includes a unique active dimming capability, allowing digital content to be more clearly visible in bright environments. Bundles controllers with hand and eye tracking. Runs Magic Leap OS (Android-based).

FOV 70° — widest enterprise AR FOV
Weight 260g — lighter than HoloLens 2
Processor AMD Zen 2 quad-core + RDNA 2 GPU
Unique Active global & segmented dimming
Input Controller + eye + hand + voice
Consumer · Android XR
~$1,800
Galaxy XR · Wi-Fi 7 · BT 5.4 · 545g
Samsung Galaxy XR
Samsung · Google · launched October 2025

Samsung's first spatial computing headset, launched October 2025 at $1,800. Runs Android XR — Google's new headset platform built on OpenXR 1.1 — making it the first major consumer headset with deep OpenXR integration at the OS level. It pairs a rich sensor stack (eye, hand, voice, six world-facing cameras, depth sensor) with Wi-Fi 7 and Bluetooth 5.4. Positioned as the most accessible sub-$2K MR headset with flagship-grade specs.

OS Android XR · OpenXR 1.1 native
Tracking Eye + hand + voice · 6 cameras · depth
Connectivity Wi-Fi 7 · Bluetooth 5.4
Battery ~2–2.5h · external battery pack
Weight 545g headset + 302g battery pack
03 — VR headsets

Virtual Reality
headsets

VR occupies the far virtual end of the Reality–Virtuality Continuum — the philosophical opposite of the Augmentiverse's AR-first approach. However, VR headsets remain important Augmentiverse tools for training simulation, content development workflows, and as stepping stones to MR. Modern VR platforms also increasingly support passthrough MR modes via camera-based see-through.

Consumer · VR + MR passthrough
$499
Snapdragon XR2 Gen 3 · MR passthrough
Meta Quest 3
Meta · 2023

The market-leading standalone VR headset, now with full-colour MR passthrough enabled by depth sensors and cameras. Snapdragon XR2 Gen 3 processor. The most widely used platform for WebXR development and testing. Supports OpenXR, making it a key Augmentiverse development target for passthrough MR experiences.

Display 2064×2208 per eye · pancake lenses
Processor Snapdragon XR2 Gen 3
MR mode Full-colour passthrough via cameras
Standards OpenXR · WebXR
Consumer · Gaming / VR
$999–$1,399
PlayStation VR2 · 4K HDR OLED · eye tracking
PlayStation VR2
Sony · 2023

Sony's second-generation VR headset, requiring a PlayStation 5. Features 4K HDR OLED displays, eye tracking for foveated rendering, and adaptive trigger haptics. Primarily a gaming platform but increasingly relevant for interactive media and simulation applications. Now also supports PC VR via adapter.

Display 4K HDR OLED per eye · 110° FOV
Tracking Inside-out · eye tracking · foveated render
Platform PS5 required · PC adapter available
Developer / Enthusiast · PC VR
$999
Valve Index · 130° FOV · 144Hz · finger tracking
Valve Index
Valve · 2019 · developer reference

High-fidelity PC VR headset with 130° field of view and 144Hz refresh rate. Unique Knuckle controllers track individual finger movements. The reference platform for high-end PC VR development via SteamVR. Widely used for Augmentiverse content development and prototyping before deployment to lighter platforms.

Display 1440×1600 per eye · 130° FOV · 144Hz
Controllers Knuckles — individual finger tracking
Platform SteamVR · OpenXR
04 — Development SDKs & frameworks

How Augmentiverse
content is built

Development SDKs provide the foundational APIs for detecting physical surfaces, tracking movement, anchoring content, and delivering experiences to devices. The most important open-standard SDKs are OpenXR and WebXR — covered in the Standards section. Below are the major platform-specific and cross-platform tools.

Apple ARKit
Apple · iOS / iPadOS / visionOS

Apple's flagship AR framework, deeply integrated with iPhone, iPad, and Vision Pro hardware. LiDAR depth sensing on Pro devices enables high-precision surface detection and environment occlusion. visionOS 26 added shared world anchors, 90Hz hand tracking, and expanded camera access APIs. Supports SceneKit, RealityKit, Unity, and Unreal Engine.

iOS · iPadOS · visionOS · LiDAR depth · Location anchors · Free
Google ARCore
Google · Android · iOS · Web

The most accessible AR platform in 2026, running on 87%+ of active Android devices. The Geospatial API uses 15 years of Google Maps and Street View data to place AR content at exact real-world coordinates — enabling true location-anchored Augmentiverse content. Terrain and rooftop anchors, Streetscape Geometry API, and AI-powered scene classification. Completely free with no per-call charges.

Android · iOS · Geospatial API · Location anchors · Free
Unity AR Foundation 6.x
Unity Technologies · cross-platform

The cross-platform solution to AR fragmentation — a unified API that works across ARKit (iOS/visionOS), ARCore (Android), and OpenXR. Unity 6 (6.3 LTS, December 2025) powers everything from mobile AR to Apple Vision Pro via PolySpatial. QR code and marker tracking added in version 6.4. Write once, deploy to iOS, Android, and visionOS. The most widely used AR development framework by developer count.

iOS · Android · visionOS · OpenXR · Cross-platform · Free tier
WebXR Device API
W3C · all major browsers

The only open-standard, browser-native path to AR without app installation. A W3C Candidate Recommendation shipped in Chrome, Edge, Firefox, and Samsung Internet. The WebXR AR Module enables immersive-ar sessions for smartphone and headset AR. Enables the Augmentiverse's vision of URL-addressable spatial content — no app store required. See the Standards section for full coverage.

Browser-native · W3C Standard · No install required · Free / open
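As a concrete sketch of what an immersive-ar session request looks like under the WebXR AR Module (hedged: `startAr` and `arSessionInit` are our own wrapper names; the `'hit-test'` and `'dom-overlay'` feature strings and the `requestSession` signature are the standard ones):

```javascript
// Build the session-init dictionary for an immersive-ar request.
// 'hit-test' (raycasting against detected real-world surfaces) is
// required; 'dom-overlay' is requested optionally so 2D HTML UI can
// float over the camera view when the browser supports it.
function arSessionInit(overlayRoot) {
  const init = { requiredFeatures: ["hit-test"] };
  if (overlayRoot) {
    init.optionalFeatures = ["dom-overlay"];
    init.domOverlay = { root: overlayRoot };
  }
  return init;
}

// In a real page this must run inside a user-gesture handler
// (e.g. a button click); browsers reject ambient session requests.
async function startAr(overlayRoot) {
  const xr = globalThis.navigator?.xr;
  if (!xr || !(await xr.isSessionSupported("immersive-ar"))) {
    throw new Error("immersive-ar not available on this device/browser");
  }
  return xr.requestSession("immersive-ar", arSessionInit(overlayRoot));
}
```

The returned XRSession then drives the render loop via `requestAnimationFrame` in whichever engine sits on top (Three.js, Babylon.js, or raw WebGL).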
Vuforia Engine
PTC · enterprise · industrial AR

The gold standard for enterprise and industrial AR with market-leading marker-based and markerless tracking. Achieves millimetre-level precision for assembly guidance, maintenance, and quality inspection. Works with Unity and Unreal Engine. Particularly strong for model-target tracking — recognising and tracking 3D objects like industrial machinery. Widely deployed in manufacturing, healthcare, and training.

Enterprise / Industrial · Image + model targets · HoloLens · iOS · Android · Paid
Snap Lens Studio
Snap Inc. · social AR authoring

The most widely used social AR creation platform, enabling designers and developers to build AR filters, lenses, and experiences for Snapchat and Spectacles. Used by millions of creators globally. Features body tracking, face mesh, hand tracking, and world AR. Critical for understanding how Augmentiverse content will be consumed by mass-market audiences — Snap processes over 6 billion snaps per day.

Social AR · Snapchat · Spectacles · Body / face / hand tracking · Free
05 — 3D engines & renderers

Rendering the
Augmentiverse

3D engines are the creative and rendering heart of the Augmentiverse — the platforms where developers build the spatial experiences that run on AR glasses and MR headsets. Engine choice determines cross-platform reach, visual fidelity, and performance.

Unity 6
Unity Technologies · dominant AR engine

The dominant engine for AR and spatial computing by developer count. Unity 6 (6.3 LTS, December 2025) powers everything from mobile AR apps to Apple Vision Pro experiences via PolySpatial. AR Foundation 6.x provides a single API across ARKit, ARCore, and OpenXR. By far the most widely used engine for Augmentiverse content creation, with extensive community, documentation, and asset library support.

iOS · Android · visionOS · Windows MR · OpenXR · AR Foundation · Free personal tier
Unreal Engine 5
Epic Games · photorealistic fidelity

Epic's engine provides photorealistic rendering for high-end AR experiences, with strong C++ integration and industry-leading visual fidelity via Nanite and Lumen systems. Particularly strong for enterprise AR in architecture, manufacturing visualisation, and product design. The AR Template simplifies project setup with optimised rendering pipelines. Supports OpenXR natively.

Photorealistic · Enterprise AR · C++ / Blueprints · Royalty after $1M
Three.js / Babylon.js
Open source · web-native 3D

The primary rendering libraries for WebXR-based Augmentiverse experiences. Three.js is the most widely used web 3D library; Babylon.js provides a higher-level framework with built-in physics and AR features. Both integrate with the WebXR Device API to deliver browser-native AR without installation. Essential for anyone building the web-accessible layer of the Augmentiverse.

Web-native · WebXR integration · JavaScript / TypeScript · Open source · free
06 — 3D authoring & asset tools

Creating the
spatial content layer

The Augmentiverse requires high-quality 3D assets — models, scenes, animations, and textures — authored in open formats (glTF, OpenUSD) for maximum interoperability. These tools sit at the top of the content creation pipeline.

Blender
Blender Foundation · open source

The leading open-source 3D creation suite. Native glTF 2.0 export, OpenUSD support, sculpting, rigging, animation, physically-based materials, and rendering. The most accessible professional-grade tool for Augmentiverse content creators. Blender's glTF export is one of the most compliant and feature-complete implementations available. Free, open-source, runs on all major platforms.

glTF 2.0 native · OpenUSD export · Full 3D pipeline · Free · open source
Autodesk Maya & 3ds Max
Autodesk · professional pipeline

Industry-standard 3D modelling and animation packages, widely used in film, games, and enterprise spatial computing. Both export to glTF and USD formats. Maya is particularly strong for character rigging and animation; 3ds Max for architectural visualisation. Standard tools at large enterprises building Augmentiverse content for industrial training and product design.

Professional grade · glTF / USD export · Animation / rigging · Subscription
Adobe Substance 3D
Adobe · materials & textures

The industry-standard suite for physically-based material authoring. Substance Painter for texture painting; Substance Designer for procedural material creation; Substance Sampler for photogrammetry-to-material workflows. All produce PBR materials fully compatible with glTF 2.0's PBR material model. Essential for creating visually accurate Augmentiverse assets that render consistently across devices.

PBR materials · glTF PBR compatible · Photogrammetry · Subscription
NVIDIA Omniverse
NVIDIA · OpenUSD collaboration platform

NVIDIA's real-time 3D collaboration and simulation platform built entirely on OpenUSD. Enables multi-user simultaneous editing of shared 3D scenes, high-fidelity path tracing, and physics simulation. Critical for teams building large-scale Augmentiverse environments — digital twins of factories, cities, or facilities. NVIDIA is a founding AOUSD member, and Omniverse is the reference implementation for OpenUSD in production pipelines.

OpenUSD native · Multi-user real-time · Digital twins · Free tier
Apple Reality Composer Pro
Apple · visionOS spatial authoring

Apple's native authoring environment for visionOS spatial experiences. Uses Reality Composition Format (USDA/USDZ) and integrates directly with Xcode. Visual scene assembly, physics, particles, and audio. The fastest path to publishing on Apple Vision Pro. Reality Composer Pro works with Reality files that can be shared as USDZ — Apple's spatial content format built on OpenUSD, ensuring interoperability with the broader ecosystem.

visionOS · USDZ / OpenUSD · Visual scene editor · Free (with Xcode)
Khronos glTF Validator & Viewer
Khronos Group · quality assurance

The official Khronos reference tools for glTF asset validation and preview. The glTF Validator checks assets against the full glTF 2.0 specification and all extensions — essential quality control before deploying Augmentiverse content. The glTF Sample Viewer is the browser-based reference renderer; the Khronos glTF Viewer (iOS app) enables on-device preview in AR. Both are open source under Apache 2.0.

glTF 2.0 validation · Reference renderer · iOS AR viewer · Open source · free
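For pipelines that want a cheap pre-flight check before invoking the official Validator, a few structural invariants from the glTF 2.0 spec can be tested directly on the parsed JSON. A minimal sketch (`preCheckGltf` is a hypothetical helper of ours, not a Khronos tool, and it covers only a tiny slice of what the real Validator checks):

```javascript
// Quick structural pre-check of a parsed glTF 2.0 JSON document.
// Not a substitute for the Khronos glTF Validator — just a fast
// sanity gate before assets enter a publishing pipeline.
function preCheckGltf(gltf) {
  const issues = [];
  // The asset.version field is mandatory and must be "2.0".
  if (gltf?.asset?.version !== "2.0") {
    issues.push('asset.version must be "2.0"');
  }
  // Every required extension must also be declared as used.
  for (const ext of gltf?.extensionsRequired ?? []) {
    if (!(gltf.extensionsUsed ?? []).includes(ext)) {
      issues.push(`required extension ${ext} missing from extensionsUsed`);
    }
  }
  // Each buffer must declare its byteLength.
  (gltf?.buffers ?? []).forEach((buf, i) => {
    if (typeof buf.byteLength !== "number") {
      issues.push(`buffers[${i}] missing byteLength`);
    }
  });
  return issues; // empty array = passed the pre-check
}
```

Anything that survives this gate still goes through the full Validator; the point is to fail fast on obviously broken exports.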
07 — Infrastructure & delivery

Delivering spatial
content at low latency

Real-time AR requires infrastructure designed to minimise latency between action and response. The Augmentiverse's persistent, spatially anchored content demands network architectures that keep compute close to the user — and platforms that manage spatial anchoring at cloud scale.

Microsoft Azure Spatial Anchors
Microsoft · cloud spatial anchoring

Azure Spatial Anchors enables persistent, shared spatial content that can be discovered by multiple users across HoloLens, iOS (ARKit), and Android (ARCore). Anchors are stored in the cloud and recalled across sessions — exactly what the Augmentiverse's persistent-presence principle requires. Deep integration with HoloLens 2 and Unity. Used by enterprises for shared MR collaboration.

Persistent anchors · Multi-user sharing · HoloLens · iOS · Android · Azure pricing
ARCore Geospatial API
Google · location-based AR at scale

Google's Geospatial API uses 15 years of Street View and Maps data to place AR content at precise real-world coordinates globally — without requiring manual anchor placement. Terrain anchors place content on real ground geometry; rooftop anchors target building rooftops. Geospatial depth sensing extends up to 65 metres. This is the closest existing infrastructure to the Augmentiverse's vision of globally addressable spatial content.

Global location anchors · Street View data · Terrain + rooftop anchors · Free API
ETSI MEC — Mobile Edge Computing
ETSI · network infrastructure standard

The ETSI Multi-Access Edge Computing (MEC) standard enables compute nodes at mobile network edges — physically close to the user's device. This reduces round-trip latency from 50–150ms (centralised cloud) to under 5ms — essential for real-time spatial tracking and rendering. AWS Wavelength, Azure Edge Zones, and Google Distributed Cloud all implement MEC-compatible infrastructure. The physical network layer of the Augmentiverse.

Sub-5ms latency · Open standard · 5G native · AWS · Azure · Google implementations
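The latency argument above is simple arithmetic, and worth making explicit: at 90 Hz a display frame lasts 1000 / 90 ≈ 11.1 ms, so a 50–150 ms centralised-cloud round trip can never complete inside a single frame, while a sub-5 ms edge round trip can. A hedged back-of-envelope sketch (the 3 ms processing allowance and the helper name are our assumptions; the RTT figures are the ones cited above):

```javascript
// Can a remote-rendered frame's network round trip, plus some
// server-side processing, fit inside one display refresh interval?
function fitsFrameBudget(networkRttMs, refreshHz, processingMs = 3) {
  const frameBudgetMs = 1000 / refreshHz; // e.g. ~11.1 ms at 90 Hz
  return networkRttMs + processingMs <= frameBudgetMs;
}

console.log(fitsFrameBudget(100, 90)); // centralised cloud (100 ms RTT) → false
console.log(fitsFrameBudget(5, 90));   // MEC edge (5 ms RTT) → true
```

This is why edge placement is a hard requirement rather than an optimisation for remote-rendered spatial tracking: no amount of server horsepower recovers a round trip that exceeds the frame budget.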
08 — Device comparison

Augmentiverse
device comparison matrix

How current and near-future devices score against the three Augmentiverse commitments: AR-first display, open standards support, and persistent spatial presence.

Device | Category | AR-first display | OpenXR support | WebXR support | Spatial anchoring | Open standards | Price
Apple Vision Pro 2 | MR Headset | See-through | Via 3rd party | visionOS 26 | World anchors | USDZ / OpenUSD | ~$3,199
Microsoft HoloLens 2 | MR Headset | Optical waveguide | Native | Via browser | Azure Spatial | OpenXR / glTF | ~$3,500
Magic Leap 2 | MR Headset | Optical + dimming | Native | Limited | Spatial mapping | OpenXR | ~$3,299
Samsung Galaxy XR | MR Headset | Camera passthrough | Android XR native | Android XR | ARCore anchors | OpenXR 1.1 | ~$1,800
Meta Quest 3 | VR + passthrough | Camera passthrough | Native | Browser | Limited | OpenXR | $499
Xreal One Pro | AR Glasses | Optical display | Via Android | Limited | Basic anchoring | Partial | ~$699
Ray-Ban Meta Display | AR Glasses | Monocular display | Proprietary | No | No | Closed platform | $799
Snap Spectacles Gen 5 | Developer | 46° FOV display | No | No | World tracking | Snap-specific | $99/mo
Smartphone (ARCore/ARKit) | Mobile | Camera overlay | Partial | WebXR in browser | Session-limited | ARKit / ARCore | Existing
09 — Hardware roadmap

What's coming to the
Augmentiverse toolchain

Key hardware and platform milestones expected in the near-to-medium term. The most significant inflection point remains mass-market AR glasses with comfortable all-day wearability — projected around 2028–2030.

2025 — done
Meta Ray-Ban Display — monocular display + Neural Band, launched Sep 2025 ($799)
Apple Vision Pro 2 — M2 + R1, visionOS 26, price reduction, launched Oct 2025
Samsung Galaxy XR — Android XR with native OpenXR 1.1, launched Oct 2025
OpenXR 1.1 — core spec consolidation released by Khronos
Ray-Ban Meta — sales tripled in H1 2025; 7M units total in 2025
2026 — in progress
Xreal Project Aura — Android XR-powered glasses · 70° FOV · computing puck
ASUS ROG Xreal R1 — gaming AR glasses · 240Hz refresh rate
Open Metaverse Browser — Khronos / RP1 open-source spatial browser initiative
Apple Smart Glasses — display-less, AI-powered, rumoured late 2026
DID v1.1 — W3C Candidate Recommendation (March 2026)
Android XR glasses — multiple OEM releases on the Android XR platform
2027 → near term
Apple Smart Glasses — display model, projected 2027 follow-up
HoloLens 3 — Microsoft's next enterprise headset (unconfirmed)
Meta Orion — prototype AR glasses · 70° FOV · silicon carbide waveguide
WebXR ratification — W3C Recommendation expected from the Working Group
2028–2030 → inflection
Mass-market AR eyewear — 15M units/year projected by 2030 · 85%+ with displays
MicroLED at scale — expected to become the standard AR display technology
Augmentiverse at scale — open standards + edge compute + mass hardware = critical mass
All-day wearable AR — the hardware inflection point for mainstream adoption

Continue exploring

Return to the full Augmentiverse reference

Concept & theory · Open standards · Bibliography