The Augmentiverse Framework
An AR-first, standards-aligned path to persistent, interoperable spatial computing — the grounded alternative to the Metaverse myth, built on open protocols and physical reality.
Beyond the Metaverse myth
"The Augmentiverse is not a product or a platform — it is a conceptual framework for building the next layer of the internet on top of physical reality, not in replacement of it."
Elmqaddem, iJET vol. 21 no. 01, 2026

The term Augmentiverse was introduced in January 2026 by researcher N. Elmqaddem in the International Journal of Emerging Technologies in Learning (iJET, Vol. 21 No. 01, pp. 59–72). The paper offers a rigorous reframing of where immersive computing is genuinely heading, as distinct from the over-hyped Metaverse narrative of the early 2020s.
Where the Metaverse implied a totalising virtual replacement for reality — popularised by Meta's 2021 rebrand and billions of dollars in investment that ultimately disappointed — the Augmentiverse proposes something far more achievable and more consequential: a persistent, standards-compliant layer of digital content anchored to and aware of the physical world.
Three defining commitments
Digital content is built from the ground up to coexist with the physical environment, not replace it. The real world is the canvas, not an obstacle to be overcome.
Interoperability is non-negotiable. Content, identity, and assets must travel freely across devices and platforms via ratified, royalty-free protocols — never proprietary APIs that create platform lock-in.
Digital objects remain where they are placed in physical space, discoverable and consistent for all users, over time, on any conformant device. This is the opposite of ephemeral, app-siloed AR filters.
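The persistence commitment can be made concrete with a small sketch. Nothing below comes from a ratified specification — the function name and every field are illustrative assumptions — but it shows the minimum a persistent anchor must capture: a physical location, a renderable asset, and a portable author identity.

```javascript
// Hypothetical sketch of a persistent spatial-content record. None of these
// field names come from a ratified standard; they simply make the commitment
// concrete: an anchor ties a digital asset to physical coordinates and an
// author identity, so any conformant device can discover and render it later.
function makeSpatialAnchor({ id, lat, lon, altitudeM, assetUrl, authorDid }) {
  if (typeof lat !== "number" || typeof lon !== "number") {
    throw new Error("a spatial anchor needs numeric coordinates");
  }
  return Object.freeze({
    id,
    position: { lat, lon, altitudeM }, // where the content lives in the world
    assetUrl,                          // what to render, e.g. a glTF 2.0 model
    authorDid,                         // who placed it, e.g. a W3C DID
  });
}

const plaque = makeSpatialAnchor({
  id: "anchor-001",
  lat: 48.8584,
  lon: 2.2945,
  altitudeM: 35,
  assetUrl: "https://example.org/models/plaque.gltf",
  authorDid: "did:example:alice",
});
```

The record is frozen because persistence implies that what one device writes, every other device must be able to trust and reproduce.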
Theoretical foundations
The framework builds on two foundational intellectual pillars:
The reality–virtuality continuum
Based on Milgram & Kishino (1994). The Augmentiverse occupies the AR and MR zones — reality-anchored, not reality-replacing.
Augmented Reality (AR)
Digital content overlaid on physical reality
Digital information is overlaid on the user's view of the real world via a camera or transparent display. The real world is not replaced; it is annotated. Conventional AR does not necessarily understand or deeply interact with its physical surroundings: it annotates reality rather than responding to it.
Augmentiverse relevance: AR is the foundational layer. The Augmentiverse extends current AR by requiring spatial anchoring and open standards — moving from ephemeral overlays to persistent, discoverable spatial content.
Virtual Reality (VR)
Full immersion in synthetic environments
A fully immersive, computer-generated environment that replaces the user's view of the real world entirely. VR occupies the far virtual end of Milgram's continuum. Users have no visual awareness of their physical surroundings while immersed.
Augmentiverse relevance: VR is the opposite of the Augmentiverse's philosophy. However, VR training environments and simulation tools feed skills and content back into the physical-world-first Augmentiverse stack.
Mixed Reality (MR)
Spatial computing at its fullest
Digital objects are spatially anchored to and deeply aware of the physical world. MR systems map their environment — holograms stay fixed in space as you move, can be occluded behind real surfaces, and interact with real objects. This is the Augmentiverse's primary mode of experience.
Why MR is the Augmentiverse's home: It satisfies all three commitments — AR-first, spatially persistent, and the target of OpenXR and WebXR AR module standards.
Extended Reality (XR)
The umbrella term for all immersive technologies
XR is not itself a technology — it is a collective label used in standards documents, policy frameworks, and enterprise strategy to refer to AR, VR, and MR without enumerating each. The term appears in OpenXR's name, in W3C's Immersive Web Working Group, and in ETSI's infrastructure standards.
Relationship to Augmentiverse: The Augmentiverse uses XR standards (especially OpenXR and WebXR) as its runtime foundation but specifically prioritises the AR/MR end of the XR spectrum.
XR vs Augmentiverse
XR is a definitional umbrella encompassing all immersive technologies. The Augmentiverse is a normative framework — it specifies how immersive technologies should be built (AR-first, open standards, persistent) rather than simply categorising them.
The Augmentiverse uses XR's standards infrastructure while taking a strong architectural position: digital content should enrich the physical world, not replace it. This is a philosophical commitment that XR as a category does not impose.
The open standards that make it possible
The Augmentiverse is not a product — it is an ecosystem whose viability depends entirely on cross-industry coordination around open, ratified protocols. Without interoperability, persistent spatial content fragments into proprietary silos. Six foundational standards form its technical architecture.
OpenXR — Khronos Group
A royalty-free, open standard providing a common API for XR hardware and software. OpenXR 1.1 consolidates multiple extensions into the core specification, dramatically reducing fragmentation. Where previously developers had to target each headset separately via proprietary APIs, OpenXR enables a single codebase to run across devices from Meta, Microsoft, Valve, Sony, HTC, and Pico. The fundamental runtime layer of the Augmentiverse.
WebXR Device API — W3C
A W3C Candidate Recommendation enabling AR and VR experiences directly in web browsers without app installation. Developed by the W3C Immersive Web Working Group; shipped in Chrome, Edge, Firefox, and Samsung Internet. The WebXR AR Module specifically enables immersive-ar sessions. Critical for the Augmentiverse: a shared AR layer must be accessible via URL, not locked to an app store.
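In practice, a page feature-detects AR support before offering an "Enter AR" button. The API names below (`navigator.xr`, `isSessionSupported`, the `"immersive-ar"` session mode) are from the WebXR Device API and its AR Module; the wrapper function is our own, and `xr` is passed in so the logic can run outside a browser as a minimal sketch.

```javascript
// Minimal sketch of WebXR AR feature detection. In a browser, pass
// navigator.xr; injecting it as a parameter keeps the logic testable
// outside one.
async function canStartImmersiveAR(xr) {
  // No navigator.xr means the browser does not implement WebXR at all.
  if (!xr || typeof xr.isSessionSupported !== "function") return false;
  try {
    // Resolves to a boolean per the WebXR Device API.
    return await xr.isSessionSupported("immersive-ar");
  } catch {
    return false; // e.g. rejected by a permissions policy
  }
}

// In a browser, and only inside a user gesture, a page would then call:
//   const session = await navigator.xr.requestSession("immersive-ar");
```

This URL-first entry path — no install, a single capability check, then a session request — is exactly what makes WebXR critical to a shared, app-store-independent AR layer.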
glTF 2.0 — Khronos Group
The "JPEG of 3D" — a royalty-free specification for efficient transmission and loading of 3D scenes and models. Recognised as ISO/IEC 12113:2022. Supports physically-based rendering (PBR) and is the primary format for web-based and real-time 3D content. glTF minimises asset size and runtime processing. Essential for authoring Augmentiverse content once and deploying it everywhere.
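Part of why glTF transmits so efficiently is that a scene is plain JSON referencing compact binary buffers. A minimal, empty-scene asset looks like this — per the glTF 2.0 specification, `asset.version` is the only mandatory content; the rest of the skeleton below is illustrative.

```javascript
// A minimal glTF 2.0 asset expressed as plain JSON. `asset.version` is the
// only property the spec requires; the empty scene is an illustrative shell.
const gltf = {
  asset: { version: "2.0" },        // mandatory: declares the glTF version
  scene: 0,                         // index of the default scene
  scenes: [{ nodes: [0] }],         // one scene containing one root node
  nodes: [{ name: "placeholder" }], // a real asset adds meshes, PBR materials,
                                    // accessors, and binary buffers here
};

// Because the scene graph is JSON, it round-trips through standard tooling —
// one reason glTF earned the "JPEG of 3D" nickname.
const roundTripped = JSON.parse(JSON.stringify(gltf));
```

Everything heavy (geometry, textures) lives in referenced buffers, so the JSON stays small enough to author once and deploy anywhere.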
OpenUSD — Alliance for OpenUSD
Universal Scene Description — originally developed by Pixar, now governed by AOUSD (Apple, Nvidia, Pixar, Adobe, Epic Games, IKEA, Unity, Meta). OpenUSD is a high-performance format for composing large-scale 3D scenes. AOUSD and Khronos have a formal liaison to align OpenUSD and glTF. OpenUSD Core Spec 1.0 is now available.
DIDs & Verifiable Credentials — W3C
Decentralised Identifiers (DIDs v1.1) and Verifiable Credentials (VC Data Model 2.0) enable portable, self-sovereign digital identity. Users carry persistent identity across spatial environments without platform lock-in. DIDs are cryptographically verifiable without any central authority. Essential for Augmentiverse content authorship, access control, and persistent user presence.
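The shape of a DID document shows why no central authority is needed: the document binds a DID to the public keys that control it. The sketch below follows the structure defined in W3C DID Core; `did:example` is the spec's illustrative method, and the key value is a placeholder, not real cryptographic material.

```javascript
// Sketch of a minimal DID document, following the shape defined in W3C DID
// Core. "did:example" is the spec's illustrative method; the key value is a
// placeholder, not real key material.
function makeDidDocument(did, publicKeyMultibase) {
  const keyId = `${did}#key-1`;
  return {
    "@context": ["https://www.w3.org/ns/did/v1"],
    id: did,                  // the DID itself; no central registry required
    verificationMethod: [{
      id: keyId,
      type: "Multikey",       // one of several verification-method types
      controller: did,        // the DID that controls this key
      publicKeyMultibase,     // placeholder value in this sketch
    }],
    authentication: [keyId],  // this key may authenticate as the DID subject
  };
}

const doc = makeDidDocument("did:example:alice", "zPlaceholderKeyValue");
```

Anyone resolving the DID can verify a signature against the listed key — which is what lets Augmentiverse content carry portable, platform-independent authorship.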
Multi-access Edge Computing (MEC) — ETSI
Real-time AR requires sub-20 ms latency for spatial tracking and rendering. Multi-access edge computing places compute nodes physically close to the user, at base stations or local data centres, avoiding the round trip to centralised cloud infrastructure. Persistent spatial content that responds to the world in real time cannot tolerate high-latency connections. Standardised by ETSI's MEC Industry Specification Group.
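A back-of-envelope calculation shows why distance alone settles the argument. Assuming signal propagation in fibre at roughly 200,000 km/s (about two-thirds of c) — and ignoring routing, queuing, and processing delay, which only make things worse — propagation to a distant cloud region consumes most of a 20 ms budget before any work is done.

```javascript
// Back-of-envelope propagation delay, assuming ~200,000 km/s in fibre
// (about 2/3 of the speed of light). Real networks add routing, queuing,
// and server processing time on top of this floor.
const FIBRE_KM_PER_S = 200_000;

function roundTripMs(distanceKm) {
  // Out and back, converted from seconds to milliseconds.
  return (2 * distanceKm * 1000) / FIBRE_KM_PER_S;
}

console.log(roundTripMs(1500)); // 15  — distant cloud region: most of a 20 ms budget
console.log(roundTripMs(20));   // 0.2 — nearby edge node: negligible
```

With an edge node 20 km away, effectively the whole budget remains for tracking and rendering; at 1500 km, 15 ms of the 20 ms is gone in transit alone.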
Where the Augmentiverse is already emerging
The Augmentiverse is not a future state — its constituent technologies are already deployed across critical sectors. The following applications illustrate what persistent, standards-aligned AR enables today.
Surgical planning & medical training
AR overlays provide surgeons with real-time anatomical data during procedures. VR simulations train clinicians in high-fidelity environments. XR-enabled digital twins allow personalised treatment planning. A 2025 systematic review confirmed XR effectiveness in medical education and surgical planning across 21 peer-reviewed studies.
Healthcare AR: $610M (2018) → projected $4.2B (2026) · CAGR 33.9%
Assembly guidance & remote assistance
AR overlays real-time work instructions onto assembly tasks, reducing errors. Remote experts guide field workers via shared spatial views. Digital twins of factory floors allow simulation before physical changes. XR device shipments for industrial use grew over 40% year-over-year in 2025.
AR IoT in manufacturing: projected $90–110B by 2030
Immersive learning environments
AR overlays contextual information on physical materials. VR classrooms provide gamified, adaptive learning. 30% of universities worldwide offered VR-based courses as of 2024, with VR deployment growing 69.4% that year.
Up to 78% better learning outcomes vs. traditional methods
Spatial design & building visualisation
Architects overlay proposed structures onto physical sites. Stakeholders experience buildings before construction. AR allows real-time comparison of design options in situ, with multiple users sharing the same spatial view. A core Augmentiverse application: persistent content anchored to specific physical coordinates.
Product visualisation & virtual try-on
AR allows consumers to place furniture at home (IKEA Place), try on eyewear or clothing virtually, or inspect product details in 3D. glTF and WebXR make these experiences accessible via browser without an app download. IKEA is an AOUSD member, reflecting the strategic importance of 3D asset standards to retail.
Smart glasses CAGR: 38%+ expected 2026–2033
Site reconstruction & museum augmentation
AR overlays reconstructed historical structures onto archaeological sites. Visitors activate contextual information by pointing devices at exhibits. Spatial anchoring ensures content is geographically tied to the correct physical location — exemplifying the Augmentiverse's persistent-presence principle in a public, open-access context.
The road to spatial computing
Key intellectual and technological milestones in the emergence of the Augmentiverse framework — from foundational theory to current standards ratification.
The intellectual lineage
Paul Milgram & Fumio Kishino
Co-authors of the Reality–Virtuality Continuum (1994). The continuum remains the definitive theoretical framework for classifying immersive technologies, and is the scaffold the Augmentiverse uses to position itself along the spectrum from real to virtual.
Mark Weiser
Coined the term ubiquitous computing: the idea that the most profound technologies disappear into everyday life. His 1991 paper, "The Computer for the 21st Century", is a direct intellectual antecedent of the Augmentiverse's physical-world-first philosophy: computing as environment, not device.
Simon Greenwold
Defined "spatial computing" in his 2003 MIT thesis as "human interaction with a machine in which the machine retains and manipulates referents to real objects and spaces." This definition, later adopted by Apple in 2023, is central to the Augmentiverse's machine-understood physical world.
N. Elmqaddem
Author of the paper introducing the Augmentiverse concept, synthesising developments in AR smart glasses, XR standards, decentralised identity, and edge computing into a coherent framework for the AR-first path to shared spatial reality.
Neil Trevett
Leads Khronos Group's work on OpenXR and glTF; a leading advocate for open interoperability standards as the foundation for the spatial internet. Active in the Metaverse Standards Forum and the Open Metaverse Browser Initiative (launched 2026).
Influential analyst who has extensively studied the competing terminology around spatial computing. His 2024 essay provides essential context for understanding why the Augmentiverse is a distinct and deliberate framing rather than a synonym for "spatial computing" or "metaverse."
Khronos Group
Consortium of 150+ companies developing OpenXR, glTF, Vulkan, WebGL. The runtime and 3D format heart of the Augmentiverse.
khronos.org
W3C — World Wide Web Consortium
Develops WebXR, DIDs, and Verifiable Credentials — ensuring the Augmentiverse is web-accessible and identity-portable.
w3.org
Alliance for OpenUSD (AOUSD)
Governs OpenUSD with Apple, Nvidia, Pixar, Adobe, IKEA, Epic, Unity. Formal Khronos liaison for glTF alignment.
aousd.org
Metaverse Standards Forum
Broad industry coordination on XR standards, including the 3D asset interoperability working group and Open Metaverse Browser Initiative (2026).
metaverse-standards.org
iJET — Int'l Journal of Emerging Technologies in Learning
Peer-reviewed journal publishing the Augmentiverse framework (Jan 2026, Vol. 21 No. 01). Indexed in EBSCO, DBLP; archived in Portico.
online-journals.org
ETSI — European Telecommunications Standards Institute
Develops the Multi-Access Edge Computing (MEC) standards providing the low-latency network infrastructure essential for real-time AR.
etsi.org/mec
Primary sources & further reading

Elmqaddem, N. (2026). From Metaverse Myth to Augmentiverse Reality: AR Smart Glasses and Standards-Led Convergence for Interoperability and Spatial Computing. International Journal of Emerging Technologies in Learning (iJET), 21(01), pp. 59–72. doi:10.3991/ijet.v21i01.59733

For the foundational theoretical framework: Milgram, P., Takemura, H., Utsumi, A., & Kishino, F. (1994). Augmented reality: A class of displays on the reality-virtuality continuum. Telemanipulator and Telepresence Technologies, 2351. doi:10.1117/12.197321