(non)positional audio

 10th September 2025 at 11:41am

Create an empty mesh object (in Blender it's called an 'Empty') and add a src custom property with an OGG/MP3/WAV URL. For example:

src : https://foo.com/podcast.mp3

1. if the empty is positioned at 0,0,0, the audio is nonpositional (background music e.g.)

2. if the empty is not positioned at 0,0,0, it'll be positional audio

3. the scale of the audio object determines the amplitude/reach of the positional audio.


NOTE: audio does not play unless #t triggers it. For auto-play try adding #t=1 to the default fragment # on the scene (see this video)
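The origin-rule above can be sketched as a tiny decision helper (a minimal sketch; the names are illustrative and not part of the spec):

```javascript
// Hypothetical helper: decide how an audio-empty should be played,
// based on the rules above (illustrative, not the reference implementation)
function audioMode(obj){
  const [x, y, z] = obj.position
  const positional = !(x === 0 && y === 0 && z === 0)  // rules 1 & 2: origin = nonpositional
  return {
    positional,
    reach: positional ? obj.scale : null               // rule 3: scale sets amplitude/reach
  }
}
```

An empty at the origin yields nonpositional playback; any other position yields positional audio whose reach is taken from the object's scale.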

# aliases

 10th September 2025 at 11:53am

# indicates a default fragment to execute during scene-load.

fragment   type                   example value   info
#          string (& separated)   #-cube&t=0      hide object with name cube and start 3D animations (implies xrf://-cube&t=0)
NOTE: this only gets published to the hashbus and does not update the top-level URL

» discussion

#!

 26th September 2025 at 8:45pm

Object teleports & imports

Prefixing an object name with an exclamation symbol (!) will teleport a (local or remote) referenced object from its original location to the usercamera location (and back).

Usecases:

  • show/hide objects/buttons (menu e.g.) in front of user
  • embed remote (object within) 3D file via remote URL
  • instance an interactive object near the user regardless of location
  • instance HUD or semi-transparent-textured-sphere (LUT) around the user
#!menu

Clicking the href-value above will:

  1. reposition the referenced object (menu) to the usercamera's coordinates.
  2. zoom in case of (non-empty) mesh/sceneroot-object: rescale to 1 m³, and position 1m in front of the camera
  3. toggle behaviour: revert values if 1/2 were already applied
  4. #+ is always implied (objects are always made visible)

This tiny but powerful symbol allows incredible interactive possibilities, by carefully positioning re-usable objects outside of a scene (below the floor e.g.).
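The toggle-behaviour (steps 1-4 above) can be sketched like this (a minimal sketch; property names like userData._orig are illustrative, not part of the spec):

```javascript
// sketch: toggle-teleport (#!name), remembering the original transform on the object
function teleportToggle(obj, camera){
  if( !obj.userData._orig ){
    obj.userData._orig = { pos: [...obj.position] }   // remember original location
    obj.position = [...camera.position]               // 1. move to the usercamera
    obj.visible  = true                               // 4. #+ is implied (always show)
  } else {
    obj.position = obj.userData._orig.pos             // 3. toggle: revert to original
    delete obj.userData._orig
  }
}
```

Clicking #!menu twice would therefore move the object in front of the user, then back below the floor.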


#whiteroom&!explainer&!exitmenu

This will teleport the user to whiteroom and move the objects explainer and exitmenu in front of the user.

https://my.org/foo.glb #!

Clicking the href-value above will:

  1. import foo.glb from my.org's webserver
  2. show it in front of the user (because #! indicates object teleport)

https://foo.glb #roomB&!bar

Clicking the href-value above will:

  1. replace the current scene with foo.glb
  2. teleport the user to #roomB inside foo.glb
  3. instance the referenced object (bar inside foo.glb) in front of the user.
  4. it will update the top-level URL (because xrf: was not used)
  5. hide the instanced object when clicked again (toggle visibility)

NOTE: level2 teleportation links, as well as instancing, mitigate the 'broken embedded image' issue of HTML: href-values are always attached to a 3D (preview) object (that way broken links will not break the design).

Example: clicking a 3D button with title 'menu' and href-value xrf:menu.glb?instance#t=4,5 would instance a 3D menu (menu.glb) in front of the user, and loop its animation between 4 and 5 seconds (t=4,5)

NOTE: combining instance-operators allows dynamic construction of 3D scenes (#london&!welcomeMenu&!fadeBox e.g.)

#*

 26th September 2025 at 8:45pm

Object multipliers

The star-prefix will clone a (local or remote) referenced object to the usercamera's location, and make it grabbable.
Usecases:

  • object-picker (build stuff with objects)

NOTE: this is basically the #! operator which infinitely clones the referenced object (instead of repositioning the object).

#+-

 6th October 2025 at 11:58am

(De)selectors to show/hide

How to show/hide/group materials or objects by name?

or

How to open/close/group menu-buttons or doors in a 3D scene?

Clicking href-values below will:

1. show/hide the targeted material- or object-name (incl. children)

#-parent

#+parent

#-VR*

https://foo.glb #bar & -welcome

Matching logic

1. - and + prefixes select exact matches (welcome e.g.)

2. the * postfix matches name-beginnings (VR* matches VR_skybox and VR_skyboxmat e.g.)

NOTE: hiding a skybox when importing/loading a 3D file (to force AR) is possible by linking to https://my.org/foo.glb#-skybox or https://my.org/foo.glb#-skyboxmaterial
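The matching logic above can be sketched as (an illustrative helper, not the reference parser):

```javascript
// sketch of the +/- selector matching rules described above
function matches(selector, name){
  if( selector.endsWith('*') ) return name.startsWith(selector.slice(0, -1)) // * postfix: prefix match
  return name === selector                                                   // otherwise: exact match
}
```

With this, #-VR* would hide both VR_skybox and VR_skyboxmat, while #-welcome only hides the object named exactly welcome.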

#|

 26th September 2025 at 8:45pm

Sharing object or file

The pipe-symbol (|) sends a (targeted) object to the OS. Clicking the href-value below will:

  1. share the (targeted object in the) file with another application

This URL can be fed straight into Web Share API or xdg-open

# |bar

https://foo.glb #|bar

NOTE: sharing is limited to the xrf: scheme only

#loop 🌱

 9th September 2025 at 3:52pm

Specify playback loopmode

This compensates for a missing element in Media Fragments to enable/disable temporal looping.

fragment   type     functionality
#loop      string   enables animation/video/audio loop
#-loop     string   disables animation/video/audio loop



Spec

Below is the related section of the spec (full spec here: HTML, TXT)



» discussion

#rot

 10th September 2025 at 11:42am
NOTE: in the next iteration of the spec, this will be non-normative.
Reason: in a VR/AR setting, setting the 'lookat' of the camera is not possible (while keeping headtracking-sensors active), leading to ambiguous results compared to desktop.

set the rotation of the camera (or queried object(s)).

fragment      type      access         functionality
#rot=0,90,0   vector3   🔓 🎲 💥 🔗   rotate camera (or filtered object(s))

You can add this URI Fragment to the top-level URLbar, or as href value (to trigger via click) in a 3D model Editor (Blender e.g.):


Developers only:

» example implementation
» discussion

Spec

Below is the related section of the spec (full spec here: HTML, TXT)

#s 🌱

 28th February 2024 at 1:18pm

Play back speed

controls the animation(s) of the scene (or src resource which contains a timeline)

fragment   type       functionality
#s=1       [-]float   set playback speed

🌱 = fully functional, but not part of the official W3C Media Fragments (yet?)



Spec

Below is the related section of the spec (full spec here: HTML, TXT)



» discussion

#t

 20th September 2025 at 10:35am

Animation(s) timeline

controls the animation(s) of the scene (or src resource which contains a timeline)

fragment        type                      functionality
#t=start,stop   vector2 (default: #t=0)   start,stop (in seconds)

Example value   Explanation
#t=1            play (3D) animations from 1 second till the end (and stop)
#t=1,100        play (3D) animations from 1 till 100 seconds (and stop)
#t=1,1          stop (3D) animations at second 1
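Interpreting a #t value can be sketched as (a minimal sketch, assuming a missing stop means 'play till the end'):

```javascript
// sketch: parse a #t=start,stop value into a playback instruction
function parseT(value){
  const [start, stop = Infinity] = value.split(',').map(Number) // '#t=1' has no stop
  return { start, stop, frozen: start === stop }                // start == stop freezes playback
}
```

A viewer would then seek its animation mixer to start, and pause when the timeline reaches stop.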

» example implementation
» discussion



Spec

Below is the related section of the spec (full spec here: HTML, TXT)

Controlling embedded content

use URI Templates to control embedded media, for example a simple video-player:


 foo.usdz                                            
    │                                                 
    │                                                 
    ├── ◻ stopbutton 
    │      ├ #:    #-stopbutton
    │      └ href: #player=stop&-stopbutton  (stop and hide stop-button)
    │                                                 
    └── ◻ plane                        
           ├ play: #t=0,10
           ├ stop: #t=0,0
           ├ href: #player=play&stopbutton   (play and show stop-button)
           └ src:  cat.mp4#{player}
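The {player} substitution above follows URI Template expansion; a minimal sketch (only simple {var} expansion, not full RFC6570):

```javascript
// sketch: expand {var} placeholders in a src-value with the current template variables
function expand(src, vars){
  return src.replace(/\{(\w+)\}/g, (_, k) => vars[k] ?? '') // unknown vars expand to ''
}
```

So clicking the plane's href would set player to the play-value, turning cat.mp4#{player} into cat.mp4#t=0,10.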



» discussion

#uv 🌱

 28th February 2024 at 1:16pm

UV offset

sets the uv-coordinates of polygons/texture

fragment                type      functionality
#uv=u,v,uspeed,vspeed   vector2   set/scroll to uv coordinate
🌱 means that this is an optional element of the XR Fragments spec (you can position the default uv-coordinates in your 3D editor)



Spec

Below is the related section of the spec (full spec here: HTML, TXT)



» discussion

↪ Parser.parse(k,v,store)

 26th September 2025 at 8:30pm

in xrfragment.js, and xrfragment.module.js

Parse a fragment (key/value) and add it to an object store (if valid).

NOTE: You probably want to use the higher-level URI.parse(url,filter) which calls this function

args    type     example   comment
key     string   pos
value   string   1.2,3,4   datatype must comply with spec
store   object   {}        will not be touched if validation fails

returns true if validated, otherwise false

here are some interactive examples:



Unknown fragments, or fragments with a wrong type, will be rejected:


Spec

version 0.2 @ 2023-06-27T11:10:08+0200

In case your programming language has no parser (check here) you can crosscompile it, or roll your own Parser.parse(k,v,store) using the spec:

  1. requirement: receive arguments: key (string), value (string), store (writable associative array/object)
  2. add keys without values to store as predefined view
  3. check if fragment is official XR Fragment
  4. guess the type of the value (string,int,float,x,y,z,color,args,query)
  5. don't add to store if value-type is incorrect
  6. if valid, add to store
  7. prefix non-official fragment keys with an underscore (and add to store)

icanhazcode? yes, see Parser.hx
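Steps 1-7 above could be roughed out like this (the type-regex and key-list are illustrative; the real validator is Parser.hx):

```javascript
// rough sketch of the parse steps: guess the value-type and only store valid fragments
function parse(key, value, store){
  const vec3 = /^-?\d+(\.\d+)?(,-?\d+(\.\d+)?){2}$/         // illustrative x,y,z type-check
  const official = { pos: vec3, rot: vec3 }                  // illustrative subset of official keys
  if( value === '' ){ store[key] = { predefinedView: true }; return true } // step 2
  if( official[key] ){
    if( !official[key].test(value) ) return false            // step 5: wrong type, don't store
    store[key] = { value }                                   // step 6: valid, add to store
    return true
  }
  store['_' + key] = { value }                               // step 7: non-official keys get '_'
  return true
}
```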

Tests

the spec is tested with JSON unittests consumed by Test.hx to cross-test all languages.

↪ URI.parse(url,filter)

 26th September 2025 at 8:30pm

in xrfragment.js, and xrfragment.module.js

Validates and turns an XR Fragment string (document.location.hash in javascript e.g.) into object-form using Parser.parse(k,v,store).

args     type                example                   comment
url      string              #pos=1.2,3,4
                             #pos=1.2,3,4&t=1,2
filter   integer (bitmask)   0 = no filter             filter out fragments which are
                             XRF.NAVIGATOR (default)   not flagged with this flag
                             XRF.EMBEDDED
                             XRF.PV_OVERRIDE

For example, parsing with XRF.NAVIGATOR will ignore any fragments which are not flagged as such (like scale in case of top-level URI navigation).
On the other hand, XRF.EMBEDDED makes sense when parsing an embedded URI (src: other.gltf#scale=2,2,2 e.g.)
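The bitmask filtering could be sketched like this (flag values and the per-fragment flags are illustrative, not the real XRF constants):

```javascript
// sketch: each fragment carries flags for the contexts it is allowed in
const NAVIGATOR = 1, EMBEDDED = 2                  // illustrative bitmask values
const flags = { pos: NAVIGATOR | EMBEDDED,         // pos works everywhere
                scale: EMBEDDED }                  // scale is embedded-only
function allowed(key, filter){
  return filter === 0 || !!(flags[key] & filter)   // 0 = no filter
}
```

Parsing with NAVIGATOR would therefore drop scale, while parsing an embedded src keeps it.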

returns an object with validated fragment(values) as key(objects)

here are some interactive examples:



You can combine them with the & character:


Unallowed fragments can be filtered out:


The above is perfect for top-level browser navigation (which should not parse embedded-only XR Fragments like scale or queries)

Another example for parsing embedded assets


The above is perfect for embedded content (scale fragment is allowed here)

Finally, parsing custom framework-specific fragments is also possible (they are prefixed with _)


The above is perfect for embedded content (scale fragment is allowed here)

Spec

version 0.2, generated by make doc @ 2023-06-27T11:18:12+0200

XR Fragment URI Grammar

    reserved    = gen-delims / sub-delims
    gen-delims  = "#" / "&"                      
    sub-delims  = "," /  "="

Example: ://foo.com/my3d.asset#pos=1,0,0&prio=-5&t=0,100

Explanation
pos=1,2,3 vector/coordinate argument e.g.
pos=1,2,3&rot=0,90,0&q=.foo combinators

In case your programming language has no parser (check here) you can crosscompile it, or roll your own Parser.parse(k,v,store) using the spec:

  1. store key/values into an associative array or dynamic object
  2. fragment URI starts with #
  3. fragments are split by &
  4. loop thru each fragment
  5. for each fragment split on = to separate key/values
  6. fragment-values are urlencoded (space becomes + using encodeURIComponent e.g.)
  7. for every recognized fragment key/value-pair call Parser.parse

icanhazcode? yes, see URI.hx
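Steps 1-7 above can be sketched as (a minimal sketch; the parse callback stands in for Parser.parse):

```javascript
// sketch: split a fragment URI and delegate each key/value pair to a validator
function parseFragment(uri, parse){
  const store = {}
  const hash = uri.split('#')[1] || ''          // 2. fragment URI starts with #
  for( const frag of hash.split('&') ){         // 3./4. split on & and loop
    if( !frag ) continue
    const [k, v = ''] = frag.split('=')         // 5. split key/value on =
    parse(k, decodeURIComponent(v), store)      // 6./7. urldecode, then delegate validation
  }
  return store
}
```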

Tests

the spec is tested with JSON unittests consumed by Test.hx to cross-test all languages.

⏯️ XR Macros

 22nd November 2023 at 3:59pm

DISCLAIMER: XR Macros is a tiny logic-layer which falls outside of the scope of the XR Fragment spec (for now).
They currently demonstrate features beyond the addressability/navigation/embedding-focus of XR Fragments, and fall more into the demo/authoring-side of things. Every XR macro demonstration should potentially be replaced by piggybacking OMI extensions (when they support the usecase).

XR Macros are a tiny roundrobin logic-layer defined inside 3D assets/scenes.
Macros introduce a lowest-common-denominator logic-layer, by recursive, economic re-use of the querystring syntax (which the XR Fragment parser already uses).

The example XR macros are based on common usecases & features found in 3D engines, to offer a lowcode alternative to basic experiences without requiring a scripting language.
(Sometimes a spreadsheet will do instead of a programming language).

Spec

Below is the related section of the spec (full spec here: HTML, TXT, XML)

Example macros

XR macro      type            access            info
fov           int 1..360      🔓 💥 👩 🔗      hints field of view to renderer
clip          vector2         🔓 💥 👩 🔗      hints camera clipping to renderer
bg            vector3         🔓 💥 👩 🔗      hints background rgb values (0..255) to renderer
fog           vector2         🔓 💥 👩 🔗      hints fog settings to renderer
env           string          🔓 💥 🔗         query selector / object manipulation
show          integer [0-1]   🔒 🎲 💥 🔗      show/hide queried object(s)
mov           vector3         🔒 🎲 💥 🔗      move queried object(s) relatively (instead of absolute using #pos=)
scale         vector3         🔓 🎲 💥 🔗      scale queried object(s)
prio          int             🔒                asset loading / linking
gravity       vector3         🔓 💥 🔗         animation
physics       vector3         🔓 💥 🔗         animation
namespace     string          🔒                author / metadata
SPDX          string          🔒                author / metadata
unit          string          🔒                author / metadata
description   string          🔒                author / metadata
session       url             🔓 💥 👩 🔗 ✋?   multiparty

🔒 = value(s) can only be defined in 3D asset (immutable)
🔓 = value(s) can be overwritten in certain context (by url)
🎲 = multiple values will be roundrobin'ed (#pos=0,0,0|1,0,0 e.g.)
💥 = value(s) can be overwritten by predefined_view
👩 = value(s) can be overwritten when the user clicks a href (value) or the top-level URL changes (see How it works)
🔗 = value(s) can be overwritten when 3D asset is embedded/linked as src value
✋? = value(s) can be overwritten by offering confirmation/undo to user


for more info see How it works

❓ What are levels?

 28th September 2025 at 8:08pm

🌎 3D hypermedia browsers

 7th November 2025 at 9:53pm

XR Fragment browsers

JanusWeb can browse JML 3D documents via level1 XR URI Fragments (basically hashtag camera teleports) and is spec-compatible regarding the <Link> tag for level2 explicit hyperlinks

read more at 🌎 JanusXR immersive web

How to make one yourself

3D Hypermedia browsers supporting XR Fragments, can be implemented on various levels:

  • you can extend JanusWeb browser (best risk/reward-ratio)
  • thru the lens of HTML (your own pseudo-browser using javascript like the sandbox which uses the THREE or AFRAME javascript library)
  • the Godot XRF Library is also a suitable direction for making native XR hypermedia browsers.
  • thru the lens of hypermedia browsers (opening XR Documents (.gltf, .obj e.g) natively using URLs, these don't exist (yet))

brainstorm-phase: integrating the XR Fragment parser at native browser-level (Wolvic, Chromium-based browsers e.g.) for best performance.

🌎 JanusXR immersive web

 16th November 2025 at 5:03pm







What are JanusXR URLs?

Spoiler alert: they're just (existing) URLs (but with a spatial text-snippet).

JanusXR pioneered adding an immersive, collaborative 3D webspace to regular webpages by re-imagining them as interconnected virtual rooms accessed via portals.

JanusXR interconnected virtual worlds are deeply decentralized because content remains hosted on the open web, allowing users to control their own content, and even customize the open-source software, rather than relying on a proprietary central platform.

You own it, just by exposing a textfile to the web.

XR Fragment compliant

JanusWeb is XR Fragment level1-compliant (basically hashtag camera teleports) and is spec-compatible regarding the <Link> tag for level2 explicit hyperlinks

Digital Common Good

JanusXR is arguably the most open immersive digital common good, to distribute 3D files to.

arguably, it offers:

  • the lowest integration cost
  • no lock-in regarding content
  • no lock-in regarding hosting
  • no lock-in regarding code
  • no lock-in regarding blockchain
  • inclusive: zero-login
  • the highest network-value
  • the highest resilience / selfgovernance
  • its (JML) markup language is embeddable by nature (inject into any textfile)

Like HTML, it is application-agnostic.
It is a great example of technology which uses a digital common good (the web/URLs), to produce a new common good. (without lock-in).

Interconnected spaces

JanusWeb links piggyback on traditional links (to a blogpage, textfile etc).

It does this without any lock-in, as any JanusWeb webpage acts as entrypoint to all immersive spaces without restrictions.

If something has a URL, it can be viewed (immersively if possible).



JML

How can regular URLs suddenly turn into a virtual world?

By injecting a snippet of text (JML) in your blog-article, html-page etc.

   <!--
   <FireBoxRoom>
      <Assets>
        <assetobject id="experience" src="my.glb"/>
      </Assets>
      <Room>
        <object pos="0 0 0" collision_id="experience" id="experience" />
      </Room>
   </FireBoxRoom>
   -->

When a JanusWeb page surfs to a URL, it will detect any JML and unfold your 2D page into its immersive counterpart.

You don't have to publish immersive content on different channels (appstores e.g.), you can use your existing content as distribution channel for the immersive web.

Publishing, How to start?

Well, you can basically add a JML snippet (like above) to any webpage (besides uploading the 3D file).

Many users enrich their existing content (webpages e.g.) with JML,
however these platforms below offer JanusXR content hosting:

  • check VESTA
  • check XRFORGE (click the meeting- or JanusXR link of an experience)

Documentation

A good starting point is madjin's janusxr guide

Missing a feature?

JanusXR is an unopinionated immersive layer. But you can add that yourself to the content, or build viewer 'apps', as everything is completely modular.

A public layer, not a product

JanusXR is not a product, it is a digital common anyone can publish to and/or build on top of.

JanusXR is ________

a game engine
for building social experiences
collaborative
the spatial web
decentralized
not controlled by anyone
uncensored
fun
not for online banking
a learning experience
a waste of time
looking forwards
looking back
building the future
preserving the past
compatible with your device
fusing the real and virtual worlds
some markup
a programming environment
a UI toolkit
infinitely extensible
whiteboxable
the metaverse
not the metaverse
standing on the shoulders of giants
application-agnostic
pushing the envelope for virtual events
an open source project
a work in progress
a labor of love
sometimes a bit janky
still alive
a great experiment
born from the ashes of JanusVR, Inc.
in need of some help

🎞 Media Fragments

 6th September 2025 at 12:05pm

The current scene and src media (including the 3D timeline) can be further manipulated using Media Fragment URIs.

So far #t= has been standardized by W3C.
Though not being part of the XR Fragments standard, the demos suggest extra media fragments like #loop, #s and #uv to compensate for the lack of loop/speed/uv control.

XR Fragments is endorsing W3C Media Fragments for media-control, as well as URI Templates for text-templating.



📜 level0: File

 16th March 2026 at 3:22pm

Capable 3D fileformats

glTF, usdz, obj, collada, THREE.json, X3D e.g. are capable of XR Fragments.
The requirement for any 3D file is: a scenegraph with one or more named objects.

XR Fragment Compatible

A 3D scene-file can be considered XR Fragment-compatible when it matches one of these heuristics

  1. implicit: it has at least one sidecar-file
  2. explicit: an object has an href extra.
  3. explicit: it has at least one system folder (one objectname starting with underscore)
  (last wins in case of non-unique names)

Loading a file

  1. application sidecar file
  2. camera HUDLUT
  3. sidecar files
  4. system folders
  5. timelines

📜 level1: URL

 24th March 2026 at 2:22pm
protocol:// my.org/world.glb #room2

URLs are the heart of XR Fragment-based 3D Hypermedia:

  • it allows navigating the XR browser to a (different) 3D scene (world.glb e.g.)
  • it allows teleporting the user to a different location (room2 e.g.)
  • it allows back/forward navigation
  • it allows addressability

Examples:

https://linux.world/#roomF&t=1,100
linuxapp://conference/nixworkshop?newuser#roomC&t=2,200
androidapp://page1?tutorial#roomB&t=1,100

Implicit and explicit addressability is the core of the XR Fragments spec, as well as user teleportation:

  1. implicit deeplinks
  2. portal rendering
  3. teleport camera spawnpoint
  4. XRF microformat

Here's pseudocode for a level1 XR Fragments browser in THREE.js:

// by default the (VR) user gets position at 0,0,0 at the loaded scene (+VR userheight)

url   = 'world.glb#roomB&car'
file  = url.split("#")[0] 
xrf   = new URLSearchParams( url.split("#")[1] )
refs  = ([...xrf.keys()]).filter( (k) => !k.match(/(t)/) ) // skip mediafrags

// set last (nonoperator) objectname as camera-location
for( i in refs ){
  scene.setActiveCameraByName(refs[i]) 
}

for robust parsing of XR Fragments use a URL query-string parser, or the polyglot parser



Below is the related section of the spec (full spec here: HTML, TXT)



For more info see How it works

📜 level2: explicit hyperlinks

 23rd March 2026 at 5:17pm

What if we embed XR Fragment URI's inside a 3D file?

level2 promotes adding clickable hyperlinks, via the href metadata attribute.

All modern 3D editors can attach metadata to objects, and export these as 'extras' with the 3D file.

An easy nocode way to add metadata is by adding custom properties in Blender e.g. This is demonstrated in the getting started video:



custom property   type                              functionality
href              string (uri or predefined view)   navigation / portals / teleporting to other XR documents

3D documents using href are spec-compliant, but similar variants (<Link url="..."> in JanusXR JML e.g.) are considered spec-compatible


Object metadata can also be added/exported programmatically; for example, AFRAME/THREE.js can export GLB/USDZ/OBJ/COLLADA-files with it, after setting myobject.userData.href = "#nameofplane" e.g.

Spec

Below is the related section of the spec (full spec here: HTML, TXT)

📜 level3: Media Fragments

 16th March 2026 at 3:22pm

XR Movies

Just like with 2D media-files, W3C mediafragments (#t=1,2) can be used to control a timeline via the #t primitive. XR Fragments Level3 makes the 3D timeline, as well as URL-referenced files controllable via Media Fragments like:

  • level2 hrefs (href: #t=4 e.g. to control 3D timeline)
  • level4: xrf: URI scheme:
    • href: xrf:foo.wav#t=0 to play a wav
    • href: xrf:news.glb?clone#t=0 to instance and play another experience

The fragments

  1. #t
  2. 🎞 Media Fragments

this allows for interactive deeplinkable XR movies.

Combined with href metadata attached to button-objects, clickable interactive timelines can be constructed (interactive XR movies e.g.).

For more info see #t

XRF viewer pseudocode

// by default the (VR) user gets position at 0,0,0 at the loaded scene (+VR userheight)

url   = 'world.glb#roomB&car'
file  = url.split("#")[0] 
xrf   = new URLSearchParams( url.split("#")[1] )
refs  = ([...xrf.keys()]).filter( (k) => !k.match(/(t)/) ) // skip mediafrags (t is handled below)

if( xrf.get('t') ) scene.playAnimations( xrf.get('t') )

📜 level4: operators

 16th March 2026 at 3:22pm

Prefixing/postfixing objectnames with the following simple operators allow for extremely powerful XR interactions.

  1. #!
  2. #*
  3. #+-
  4. #|
  5. multilingual
  6. xrf: URI scheme

Examples: #+menu to show an object, #-menu to hide a menu, #!menu to teleport a menu, #*block to clone a grabbable block, #|object to share an object
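Classifying these operator prefixes can be sketched as (an illustrative dispatcher, not the reference implementation):

```javascript
// sketch: map a fragment's prefix to its level4 operator
function operatorOf(frag){
  if( frag.startsWith('!') ) return { op: 'teleport', name: frag.slice(1) }
  if( frag.startsWith('*') ) return { op: 'clone',    name: frag.slice(1) }
  if( frag.startsWith('+') ) return { op: 'show',     name: frag.slice(1) }
  if( frag.startsWith('-') ) return { op: 'hide',     name: frag.slice(1) }
  if( frag.startsWith('|') ) return { op: 'share',    name: frag.slice(1) }
  return { op: 'navigate', name: frag }               // no prefix: plain navigation
}
```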

📜 XR fragments

 30th August 2023 at 6:00pm

this document was moved here

📜 level5: URI Templates

 16th March 2026 at 3:22pm

NOTE: level5 is non-normative and incomplete, so please keep an eye on the full spec here: HTML, TXT.

There are cases where reactivity and statemachines are desired, to enable a greater degree of interactivity.

URI Templates (RFC6570) are a safe no-code way to enable reactive href-values.

Example

Let's design an escape-room game in a 3D file.
First we add an 'exit door' object with the following metadata:

href: https://my.org/{door}.glb

Now also imagine the following extra metadata on that object:

  • door: gameover.glb
  • door_success: level2.glb

Now imagine a secret button (which the user needs to find and press), with:

  • href: xrf://#door=door_success

Result: the user will only be teleported to https://my.org/level2.glb when the door was entered after the secret button was pressed.
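The escape-room flow can be sketched as (illustrative names and values, not normative):

```javascript
// sketch: URI-template state for the exit-door example
const vars = { door: 'gameover' }                       // default template value (illustrative)
function pressSecretButton(){ vars.door = 'level2' }    // effect of clicking xrf://#door=door_success
function exitDoorURL(){                                 // the door's href, expanded on click
  return 'https://my.org/{door}.glb'.replace('{door}', vars.door)
}
```

Until the secret button is pressed, the exit door resolves to the gameover-file; afterwards it resolves to the next level.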

📜 level6: XDG soundtheme

 16th March 2026 at 3:22pm

NOTE: level6 is non-normative and incomplete, so please keep an eye on the full spec here: HTML, TXT.

Sound Theme Specification

A sound naming convention for (user-interface) audiofiles.
See the list of possible events/filenames here. The audio-files can be specified on an application-level, but can also be overridden on a 3D file-based level.

Example

Here are 2 events from the list of possible events.

Property        Event
link-pressed    The sound used when a link in a web or help browser is pressed.
link-released   The sound used when a link in a web or help browser is released.

Now imagine a radio 3D object, with buttons. Instead of hearing the default sounds (based on the selected XDG Sound Theme in the application), these button-sounds can be overridden.
All we need to do is add this metadata to a 3D (button)object with a href-value in the 3D file:

  • href: #+on&-off
  • link-pressed: oldradio-press.wav
  • link-release: oldradio-release.wav

Profit! When pressing the button, we will not hear the global XDG audio, but our custom provided ones (binary metadata loaded from the 3D file or filesystem).

We can also override the link-pressed and link-release event, by specifying an XDG event in the href:

  • href: #+on&-off&menu_click

This could be appropriate if it's an interactive object which presents a menu in front of the user e.g.
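Resolving such a sound-event, preferring per-object overrides from the 3D file over the global theme, could be sketched as (illustrative):

```javascript
// sketch: per-object metadata wins over the global XDG sound theme
function soundFor(event, obj, theme){
  return (obj && obj[event]) || theme[event]   // fall back to the theme's default sound
}
```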

📜 level7: engine prefixes

 18th March 2026 at 2:19pm

↑ Example of janusweb executing engine prefixes from a glTF file (defined in Blender)

Engine prefixes are like browser vendor prefixes, focused on 3D viewers & Game engines. They encourage permissionless innovation, and messy perfectable democracy instead of one neat and tidy standard (monopoly).

Progressive enhancement

There are cases where the 3D scene file might want to hint specific features to the viewer-engine (JANUSWEB, THREE.js, AFRAME, Godot e.g.).

This makes sense when the features cannot be triggered via other means (extensions e.g.)

Examples

This is how gltf extras / custom properties on 3D entities, can act as initializers:

custom property on will initialize
-three-background: rgb(255, 0, 0) sceneroot scene.background = new Color('rgb(255, 0, 0)')
-three-material.blending: THREE.AdditiveBlending object obj.material.blending = THREE.AdditiveBlending
-godot-material.blend_mode: BLEND_ADD object if( obj.material.blend_mode ) obj.material.blend_mode = material.BLEND_ADD
-godot4-material.blend: BLEND_MODE_ADD object if( obj.material.blend_mode && godot.version[0] == 4 ) obj.material.blend_mode = material.BLEND_MODE_ADD
-myapp-myfeature: true object obj.myfeature = true

JANUSWEB adopts XRF level7 gltf extras for both -three-* and -janus-*:

custom property on will initialize
-janus-collision_id: myplane object hints .collision_id = 'myplane' in janusweb viewer
-janus-billboard: true object hints .billboard = true in janusweb viewer
-janus-use_local_asset: room2
-janus-source sceneroot contains the JML-sourcecode of the scene
-janus-tag: paragraph object instantiates a <paragraph> object
-janus-text: Lorem ipsum object instantiates <paragraph text="Lorem ipsum"/>
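A viewer could split such vendor-prefixed extras into an engine-name and property-path like this (an illustrative sketch, not the janusweb implementation):

```javascript
// sketch: parse a '-engine-property' custom property key
function enginePrefix(key){
  const m = key.match(/^-(\w+)-(.+)$/)                  // e.g. '-three-material.blending'
  return m ? { engine: m[1], property: m[2] } : null    // non-prefixed keys are ignored
}
```

A THREE.js viewer would then only apply entries whose engine is 'three', and skip '-godot-*' entries untouched.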

Advanced: add your engine

warning: this is only for advanced developers. Endusers probably want to use an engine which already supports level7 engine prefixes (like janusweb)

Here's a snippet of how janusweb supports level7 gltf extras:

🔥 Remotestorage

 7th November 2025 at 9:51pm

NOTE: technically this is not related to XR Fragments, however it complements the local-first philosophy of XR Fragments very well ♥

Remotestorage is an awesome local-first paradigm to store data which promotes digital sovereignty.
Instead of storing data behind webapplication-servers, you decide where your data lives:



How to try?

Launch this demo:

  1. click the hamburger menu
  2. click the '3D file' button
  3. click the 'remote storage'-tab

surf to 5apps.com to get a testaccount (before selfhosting armadietto)

🔨 For Developers

 7th November 2025 at 9:47pm

🔨 xrfragment-haxe

 7th November 2025 at 9:47pm

This community repository features parser-code (written in HaXe, a crosscompiler) with many compilation targets (lua, python, javascript etc).

It's available as git repository and directly below:

language     link
python       xrfragment.py
lua          xrfragment.lua
javascript   xrfragment.js
javascript   xrfragment.module.js
any          crosscompile using HaXe
spec         you can literally write a parser yourself, the spec is intentionally kept simple

With that, you can immediately add 4D addressability to your app.

Congrats! After connecting pos and rot to your camera, and providing back/forward navigation, you have an XR Fragments navigator-compliant client.


For example, the AFRAME / THREE.js client-libraries use it like this:


If you want to build your own client/browser, see the documentation for these functions in the sidemenu

function info
xrfragment.URI.parse( str, flag ) see URI.parse
xrfragment.Parser.parse(k,v,store) see Parser.parse

🖥 Blender ✅🔥

 7th November 2025 at 9:55pm
  • download and install Blender here (for desktop)
  • export 3D files (File > Export > glTF 2.0) after adding href metadata as custom properties
  • in the export dialog, set extension to .glb (=easiest)
  • in the export dialog, check these checkboxes (Include dropdown):

✅ custom properties (=XR fragment metadata)

✅ cameras

✅ lights

  • click the export-button and save the file (example.glb) somewhere
  • load the 3D file (example.glb e.g.) into the sandbox by clicking the hamburger-menu: load 3D file-button
  • see getting started video to see the above steps in detail

🖥 Blender export-script

 7th November 2025 at 9:55pm

The following Blender-script will automatically:

  1. add decimate-modifiers to all objects (if not present)

  2. export a .glb 3D model when saving your blender project

  3. all decimations are applied to the exported model

#
# This is a convenient way to convert the scene to lowpoly
# (by adding decimate-modifiers)
# and then exporting a gltf to <blenderdocument>.glb
# 
# All this is done automatically when saving the blender file
#
# Usage: 1. open script-tab in blender
#        2. copy/paste the script into the texteditor
#        3. press the play-button (ALT+P) once
#        4. profit! from now on ctrl+s will execute it
#
import bpy
import os
from bpy_extras.io_utils import ImportHelper

# 'gltf_file' below hardcodes the exported filename; remove it to derive the name from the .blend file
data = {
  "gltf_file" : "/home/leon/projects/xrfragment/parser/example/assets/index.glb"
}

def notify(message = "", title = "Message Box", icon = 'INFO'):
    def draw(self, context):
        self.layout.label(text=message)
    bpy.context.window_manager.popup_menu(draw, title = title, icon = icon)

# redirect print to all consoles
#def print(data):
#    for window in bpy.context.window_manager.windows:
#        screen = window.screen
#        for area in screen.areas:
#            if area.type == 'CONSOLE':
#                override = {'window': window, 'screen': screen, 'area': area}
#                bpy.ops.console.scrollback_append(override, text=str(data), type="OUTPUT")
                

# Function to add Decimate Modifier to objects without one (except those in the exclusion list)
def add_decimate_modifier_to_objects():
    for obj in bpy.data.objects:
        print(obj.type)
        if obj is not None and (obj.type == 'FONT' or (obj.type == 'MESH' and len(obj.data.polygons) > 8)):
            if not obj.modifiers.get("Decimate"):
                #if obj.name not in exclusion_list and "Decimate" not in obj.modifiers:
                print("adding decimate-modifier to:"+obj.name)
                bpy.context.view_layer.objects.active = obj
                bpy.data.objects[obj.name].select_set(True)

                # Add Decimate Modifier with ratio 0.5
                bpy.ops.object.modifier_add(type='DECIMATE')
                bpy.context.object.modifiers["Decimate"].ratio = 0.5

# Function to be called on file save
def on_save_handler(blenderdoc):
    if 'gltf_file' not in data:
        gltf_file = bpy.data.filepath.replace('.blend','.glb')
    else:
        gltf_file = data['gltf_file']
    print(gltf_file)
    
    add_decimate_modifier_to_objects()
    
    # Export to glTF with specified settings and apply modifiers
    bpy.ops.export_scene.gltf(
        filepath=gltf_file,
        export_format='GLB',
        export_extras=True,
        export_lights=True,
        #export_keep_originals=True,
        export_apply=True,
        export_animations=True,
        #export_frame_range=True,
        #export_gpu_instances=True,
        #export_jpeg_quality=75,
        #export_image_quality=75,
        export_force_sampling=False,
    )
    notify(os.path.basename(gltf_file),"OK export")

# Register the handler
bpy.app.handlers.save_post.clear()
bpy.app.handlers.save_post.append(on_save_handler)
print("sourced gltf_export_on_save")

🖱️ For Designers

 7th November 2025 at 9:54pm

🧪 experimental

 2nd September 2025 at 4:30pm

Feature heuristics are features which can be inferred from the absence or presence of certain metadata.

For example, 3D objects always have a name, and are (not) children of certain 3D objects. All this indirect information can be used to activate certain viewer-features.

All feature heuristics have been chosen with care, to ensure they can be extracted from both new and legacy 3D file formats.

    🧪 levelX: non-normative metadata

     6th September 2025 at 9:59am

    These metadata or URI Fragments are not considered to be part of the spec, but allow for interesting extensions.

    1. #!
    2. #*
    3. #+-
    4. #|
    5. multilingual
    6. xrf: URI scheme

    All modern 3D editors allow embedding metadata in objects of an exported 3D file.

    An easy no-code way to add metadata is by adding custom properties (in Blender e.g.). This is demonstrated in the getting started video:



    • href for clickable links
    • src for embedding content
    • tag to tag things

    custom property | type                                   | functionality
    href            | string (uri or predefined view)        | navigation / portals / teleporting to other XR documents
    src             | string (uri, predefined view or query) | lazyloading of (partial) local/external assets/scenes (3D iframes)
    tag             | string                                 | space-separated tagging of objects (like CSS classes) for the XRWG and/or queries

    > In Editors like blender.org these are called ''custom properties''.

    Object metadata can also be added programmatically: AFRAME/THREE.js, for example, can export GLB/USDZ/OBJ/COLLADA files with metadata included, after setting myobject.userData.href = "#nameofplane" e.g.
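A minimal sketch of that programmatic route (the `setXRFMeta` helper and the plain-object stand-in are made up for illustration; any THREE.Object3D works the same way):

```javascript
// Hypothetical helper: attach XR Fragment metadata to a THREE.js-style
// object before export; `userData` entries end up as glTF `extras`.
function setXRFMeta (obj, meta) {
  obj.userData = Object.assign(obj.userData || {}, meta)
  return obj
}

// usage: a stand-in for a real mesh
const myobject = { userData: {} }
setXRFMeta(myobject, { href: '#nameofplane', tag: 'menu button' })
// myobject.userData.href is now '#nameofplane'
```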

    Descriptive Metadata

    XR Fragments does not re-invent descriptive metadata, but encourages adding existing standards to 3D nodes, most notably:

    • ARIA attributes (aria-*: .....)

    ARIA (aria-description) is the most important to support, as it promotes accessibility and allows scene transcripts. Please start aria-description with a verb to aid transcripts.

    Example: object 'tryceratops' with aria-description: is a huge dinosaur standing on a #mountain generates the transcript #tryceratops is a huge dinosaur standing on a #mountain.
    These hashtags are clickable XR Fragments.
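The transcript rule above can be sketched as follows (illustrative only, not part of any library):

```javascript
// Hypothetical sketch: a viewer generates a transcript line by prefixing
// the object name as a hashtag and appending its aria-description.
const transcript = (name, ariaDescription) => '#' + name + ' ' + ariaDescription

// transcript('tryceratops', 'is a huge dinosaur standing on a #mountain')
//   → '#tryceratops is a huge dinosaur standing on a #mountain'
```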


    Besides ARIA, the following attributes are also encouraged:

    • SPDX license information
    • Open Graph attributes (og:*: .....)
    • Dublin-Core attributes(dc:*: .....)
    • BibTeX, when known bibtex-keys exist, with values enclosed in { and }

    These attributes can be scanned and presented during an href or src eye/mouse-over.


    Spec

    Below is the related section of the spec (full spec here: HTML, TXT)

    🧰 <model-viewer> 🚧

     7th November 2025 at 9:48pm

    🚧 the demo needs upgrading due to recent changes in model-viewer

    <model-viewer> is a way to easily embed glTF 3D files into webpages.

    There's an xrfragment overlay which turbo-boosts it with XR Fragment support, allowing minimum viable interactions like navigation, teleportation, showing/hiding objects, portals, lenses, and loading/embedding of scenes, hypermedia files and URLs. This turns <model-viewer> into immersive experiences: interactive storytelling, e-learning, basically 3D hypermedia for all devices, like VR/AR headsets, laptops, tablets and mobiles.

    See example

    Here's the example snippet using the xrfragment.model-viewer.js overlay

    🧰 AFRAME

     7th November 2025 at 9:48pm

    AFRAME is a popular WebXR choice for immersive experiences.
    Since it supports (glTF) 3D files out of the box, it's a perfect fit for playing 3D files with XR Fragments.

    Below are projects which use XR Fragments with AFRAME

    xrfragment-haxe

    This community repository features a specialized aframe-component to help with that.
    It allows simply adding these lines to your AFRAME project:

    <script src="https://xrfragment.org/dist/xrfragment.aframe.js"></script>
     
    <a-entity xrf="https://xrfragment.org/index.glb"></a-entity> 
    

    Obviously, don't use this in production.

    Basically, navigation automatically happens via href values embedded in 3D models (glb e.g.) or programmatically: xrf.navigator.to('https://xrfragment.org/index.glb#pos=start') e.g.


    The snippet above can be found in this source-example or see it in action here:



    Getting started

    1. use the code from the codepen above as a starting point
    2. add your own 3D model (index.glb in the example)

    This setup automatically launches the (THREE.js) xrf.init() which injects xrf-capabilities into THREE.js loaders. It'll automatically detect any XR Fragments in 3D assets (loaded afterwards).
    On top of that, it'll reflect changes in the URL-bar.

    Also note that xrf-get allows converting objects inside assets into AFRAME <a-entity>, and xrf-button allows for simple interactions.

    See the above in action below:

    The xrfragment library lives at window.AFRAME.XRF so you can call AFRAME.XRF.navigator.to('foo.gltf#pos=1,1,2') e.g.

    Everything else works the same as the THREE.js library (and can be extended in the same ways; see the THREE.js section for more info)

    The navigator

    All (clicked/requested) links will go through the navigator, which lives at xrf.navigator and can be replaced/extended with your own navigator.js

    By default it opens unknown links (an HTML/PDF link e.g.) in a new tab; this can be disabled:

    document.querySelector('a-scene').addEventListener('loaded', function(){
      xrf.navigator.opts.openInNewTab = false
    })
    

    plugins

    There are various optional plugins which add a small 2D overlay interface, add network features etc:

    
    <script src="dist/xrfragment.plugin.p2p.js"></script>          <!-- serverless p2p connectivity   -->
    <script src="dist/xrfragment.plugin.matrix.js"></script>       <!-- matrix connectivity           -->
    <script src="dist/xrfragment.plugin.network.js"></script>      <!-- matrix and webrtc chat/scene examples --> 
    <script src="dist/xrfragment.plugin.editor.js"></script>       <!-- basic editor example          --> 
    <script src="dist/xrfragment.plugin.frontend.css.js"></script> <!-- basic menu interface css      -->
    <script src="dist/xrfragment.plugin.frontend.js"></script>     <!-- basic menu interface          -->
    
    

    see here for an example and index.html for its code

    While these are not part of the spec, you can use/modify them to your own likings.

    plugin: frontend

    This creates a hamburger menu and popup notifications for href and aria-description metadata in 3D content.
    By default, hovering/gazing href (buttons) will show a popup with its href and aria-description value.
    You can disable/enable this using:

    frontend.notify_links = false
    

    Everything else works the same as the THREE.js library (and can be extended in the same ways; see the THREE.js section for more info)

    🧰 GODOT

     7th November 2025 at 9:48pm

    Godot is a Game/XR multi-platform builder environment.

    Godot developers can use the xrfragment.gd library to build their own XR browser.

    There's an Example Godot Project included, which uses it via this simple main.gd script.

    NOTE: the XR Fragment support is not as mature as the AFRAME library (see Example Model Browser in sidemenu)



    🧰 libraries

     30th January 2024 at 1:08pm

    🧰 Libraries & Tools

     24th March 2026 at 3:23pm

    🧰 THREE.js

     12th March 2026 at 4:36pm

    THREE.js is a popular choice for 3D graphics on the web.

    A library is not necessary, because the spec is really simple:

    TLDR

    level0: hide objects in system folders

    level1: initing spawnpoint via (default) XR URI Fragment

    level2: making href extras (naive) clickable:
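As an illustration of level1, here is a plain-JavaScript sketch of deriving a spawnpoint from the URI fragment (all function names are made up, not part of any library):

```javascript
// Hypothetical level1 sketch: parse the URI fragment and derive a
// spawnpoint from `#pos=x,y,z`.
function parseFragment (hash) {
  const frag = {}
  for (const kv of hash.replace(/^#/, '').split('&')) {
    const [k, v] = kv.split('=')
    if (k) frag[k] = v
  }
  return frag
}

function spawnpoint (url) {
  const frag = parseFragment(new URL(url).hash)
  if (!frag.pos || !/,/.test(frag.pos)) return null  // named views need a lookup instead
  const [x, y, z] = frag.pos.split(',').map(Number)
  return { x, y, z }
}

// spawnpoint('https://foo.com/index.glb#pos=1,2,3') → { x: 1, y: 2, z: 3 }
```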


    Below are projects/libs which use XR Fragments with THREE

    xrfragment-haxe

    This community repository features a specialized THREE-library to help with that.

    NOTE: Expect just THREE.js boilerplate here, for a more mature demo see the AFRAME wrapper (contains better UX).

    Here you can download xrfragment.three.js or xrfragment.three.module.js, and here's how to empower your THREE.js app with XR Fragments:

    xrf.init() injects itself into THREE.js. It'll automatically detect any XR Fragments in 3D assets (loaded afterwards).
    On top of that, it'll reflect changes in the URL-bar.

    From here you will want to use xrf.navigator.to('index.glb#pos=start') e.g. to navigate scenes via javascript, or preferably simply by embedding href metadata into your 3D files.
    The snippet above can be found in this source-example or see it in action here:


    The example above loads a gltf-scene which contains embedded XR fragments which:
    • replaces certain objects with tiny clones of themselves by instancing src self-references (src: #cube or src: #-sky&-cube)

    For all XR fragments see the list

    Events / Customizing

    There are various ways to customize the behaviour of XR Fragments. There's addEventListener, which allows promise-ification of events:


    Above you can see how XR Macros extend the XR Fragments parser with custom behaviour.

    event           | info
    init            | emitted when xrf.init() is called
    href            | emitted when the user interacts with an href ('hover' or click)
    hashbus ⚠️      | emitted when the hashbus is processing (XR) fragments
    reset           | emitted when the current scene is emptied (and the next model is loaded)
    parseModel ⚠️   | emitted when a global or embedded 3D model is being parsed
    navigate        | emitted when a new scene is going to be loaded via xrf.navigator.to(...) or the back/forward button
    navigateLoading | emitted while a new scene is loading via xrf.navigator.to(...) or the back/forward button
    navigateLoaded  | emitted when a new scene has loaded via xrf.navigator.to(...) or the back/forward button
    navigateError   | emitted when a new scene could not be loaded
    focus           | emitted when the user hovers over an object
    play            | emitted when a media fragment is activated on an object
    stop            | emitted when a media fragment is stopped on an object
    XRWG ⚠️         | emitted when the XR Word Graph is being recalculated
    predefinedView  | emitted when a predefined view is triggered
    selection       | emitted when a Selection of Interest is triggered
    updateHash      | emitted when the top-level URI (XR Fragments) hash changes
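The promise-ification of events mentioned above can be sketched with a plain event bus (the `bus` and `once` names are illustrative, not the library's API):

```javascript
// Hypothetical event bus: `once` wraps addEventListener so the next
// occurrence of an event resolves a promise.
const bus = {
  handlers: {},
  addEventListener (ev, fn) {
    (this.handlers[ev] = this.handlers[ev] || []).push(fn)
  },
  emit (ev, data) {
    (this.handlers[ev] || []).forEach(fn => fn(data))
  }
}
const once = ev => new Promise(resolve => bus.addEventListener(ev, resolve))

// usage: await the next `navigateLoaded` event, then emit it
once('navigateLoaded').then(opts => console.log('scene loaded', opts))
bus.emit('navigateLoaded', { url: 'index.glb' })
```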

    You can also override/patch the init-code per XR fragment:


    And in the same fashion, you can introduce init-code which reacts to custom properties embedded in 3D assets (which are not part of the XR Fragment spec).
    This is handy for attaching framework-specific logic to your assets:

    Navigator

    To navigate manually, call XRF.navigator.to( 'foo.gltf#pos=1,1,1' ) e.g. The default navigator is tied to the browser-history, but you can also provide your own navigator:

    see default navigator for an example implementation
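A replacement navigator could look roughly like this (an illustrative sketch; the real interface is defined by the default navigator.js):

```javascript
// Hypothetical custom navigator: keeps its own history instead of
// relying on the browser's.
const myNavigator = {
  history: [],
  to (url) {
    this.history.push(url)
    // ...a real implementation would load the file and apply the fragment...
    return Promise.resolve(url)
  },
  back () {
    this.history.pop()
    return this.history[this.history.length - 1]
  }
}

// usage:
myNavigator.to('index.glb#pos=start')
myNavigator.to('other.glb#pos=0,0,0')
// myNavigator.back() now returns 'index.glb#pos=start'
```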

    The navigator

    All (clicked/requested) links will go through the navigator, which lives at xrf.navigator and can be replaced/extended with your own navigator.js

    By default it opens unknown links (an HTML/PDF link e.g.) in a new tab; this can be disabled:

    document.querySelector('a-scene').addEventListener('loaded', function(){
      xrf.navigator.opts.openInNewTab = false
    })
    

    Accessing the parser / internals

    Besides XRF.navigator.to( 'foo.gltf#pos=1,1,1' ), you'd probably never need to mess with the internals (besides using addEventListeners).
    Apart from that, here are some internal functions:

    function info
    XRF.reset() completely wipe current XR Fragment related meshes from scene
    XRF.add(mesh) add mesh to scene (it gets wiped by XRF.reset() / scene-replacement)

    You can also access the XR Fragment parser directly via the xrfragment.* global object

    🧰 Unity ⚠️

     7th November 2025 at 9:53pm

    Unity is partly closed technology, in contrast to Godot, an open alternative which fosters an inclusive future.

    Unity can load various 3D models, but the following glTFast plugin is advised for XR Fragments:

    • realtime import/export of glTF/glb 3D models
    • support for reading custom metadata via extras event

    NOTE: the XR Fragment metadata is not detected out of the box, so you have to do that manually by parsing extras.

    You can use the parser library via its C#, JavaScript, Python or Lua version

    Please submit a PR upstream in case you've managed to come up with a demo (to help other people using Unity)

    29th May 2023 future of text presentation notes

     30th May 2023 at 2:10pm
    16:04:16 From Leon van Kammen : https://xrf.isvery.ninja/example/aframe/sandbox
    16:04:51 From Frode Hegland : https://futuretextlab.info
    16:04:54 From Leon van Kammen : https://xrf.isvery.ninja
    16:05:21 From Frode Hegland : https://futuretextlab.info/category/vr-resource/
    16:15:08 From Peter Wasilko : What is MR in the center of the diagram?
    16:15:28 From Brandel Zachernuk : “Mixed Reality”
    16:15:54 From Peter Wasilko : How is it distinguished from AR?
    16:17:09 From Brandel Zachernuk : It’s a term that people use to encompass the lot. Many people claimed that Google Glass-style AR with no ‘world registration’ as AR, which drove people to coining an additional term
    16:17:30 From Frode Hegland : Ah… thanks Brandel
    16:19:05 From Patrick Lichty : MR, XR, AR, VR, it seems these are used rather fungible, it’d be good to have a small discussion about the Venn diagram here after the talk.
    16:19:24 From Frode Hegland : Yes exactly http://community.cim3.net/wiki/PurpleNumbers.html
    16:19:39 From Frode Hegland : Doug’s paragraph level addressing made live on the web through this
    16:19:52 From Karl Hebenstreit, Jr. : Reacted to “MR, XR, AR, VR, it s…” with 👍
    16:22:14 From Peter Wasilko : I particularly loved Doug’s deep linking distinction between a location in a document whose content might change vs. content in a document whose location might change. We need anchors to both.
    16:22:43 From Frode Hegland : Reacted to “I particularly loved…” with 👍
    16:24:17 From Frode Hegland : Hi Dene
    16:24:36 From Patrick Lichty : Reacted to “I particularly loved…” with 👏
    16:25:08 From Dene Grigar To Frode Hegland(privately) : Good moring
    16:25:12 From Dene Grigar To Frode Hegland(privately) : morning
    16:25:27 From Frode Hegland To Dene Grigar(privately) : 🙂
    16:30:25 From Frode Hegland : Hi Matthias
    16:31:40 From Patrick Lichty : This is amazing, actually.
    16:31:51 From Peter Wasilko : It looked like the image on the surface of the portal object was changing with one’s relative position to it.
    16:33:09 From Dene Grigar : Is there an example of how this has been used for art?
    16:33:25 From Patrick Lichty : Yes.
    16:36:22 From Karl Hebenstreit, Jr. : Future of Interface Workshop (February 15-16), https://futureofinterface.org/info-center/accessibility/ and there’s an XR Accessibility community, https://xraccess.org/
    16:40:15 From Peter Wasilko : https://en.wikipedia.org/wiki/TouchDesigner
    16:40:47 From Peter Wasilko : https://derivative.ca
    16:45:25 From Fabien : addressability of the known universe with infinite resolution
    16:47:49 From Frode Hegland : Fabien, infinite resolution depends on stated context, so cool
    16:47:50 From Frode Hegland : Can this generate a link to a specific location and view by the user performing an action in that location and sharing it? Like a GPS coordinate maybe.
    16:50:22 From Brandel Zachernuk : This is the W3C TPAC: https://www.w3.org/2023/09/TPAC/, and the IW WG (webXR etc) is here: https://www.w3.org/immersive-web/
    16:50:28 From Fabien : Reacted to “This is the W3C TP…” with 👍
    16:51:33 From Matthias mprove : The TPAC link says ”Sorry, Insufficient Access Privileges”
    16:51:34 From Karl Hebenstreit, Jr. : I see XR accessibility as one of the most complex challenges. How can we make it accessible so people with disabilities are not excluded from virtual worlds?
    16:52:22 From Matthias mprove : Oh, the comma was playing a trick on me: https://www.w3.org/2023/09/TPAC/
    16:53:45 From Peter Wasilko : De Bruijn Indices! https://en.wikipedia.org/wiki/De_Bruijn_index
    17:01:36 From Dene Grigar : Yes, I do
    17:02:14 From Dene Grigar : VR poses a challenge for conservation
    17:02:39 From Frode Hegland : 1 second sorry
    17:02:49 From Fabien : indeed, participated to Not Time To Wait specifically for that https://mediaarea.net/NoTimeToWait6
    17:03:13 From Fabien : (for conservation, in art or not)
    17:03:28 From Daveed Benjamin : Love it Peter! Bit.ly for XR fragments
    17:03:55 From Matthias mprove : paraphrasing Jef Raskin: the beginning of a document should be usable as a filename to refer to the document itself
    17:04:43 From Leon van Kammen : Jeff Rascin
    17:04:51 From Peter Wasilko : https://en.wikipedia.org/wiki/The_Humane_Interface
    17:04:56 From Matthias mprove : /from his book The Humane Interface
    17:05:37 From Dene Grigar To Frode Hegland(privately) : I need to go to another meeting. Thanks for today!
    17:09:32 From Peter Wasilko : Just to play it safe, you might need to query IPFS for the URI phrase before using it on the extremely odd chance some else had previously generated it. In Law, we actually deal with this in naming corporations by breaking out an explicit Name Reservation step.
    17:09:37 From Fabien : for ref https://en.wikipedia.org/wiki/World_Geodetic_System
    17:12:06 From Karl Hebenstreit, Jr. : @Fabien are you talking about a universal (UTC) timestamp?
    17:13:03 From Fabien : was thinking more of the spatial aspect but timestamp could be an example too as we do have to be able to “convert” from one timezone to another or have different timescales too
    17:15:26 From Fabien : other example https://en.wikipedia.org/wiki/Web_Mercator_projection most of us might be familiar with, without being aware of it, point being that with its name and https://en.wikipedia.org/wiki/Well-known_text_representation_of_coordinate_reference_systems it’s explicited and thus can be converted from and to, basically avoiding “the one solution” that we lately notice just doesn’t cover all cases
    17:15:34 From Peter Wasilko : @Patrick, do you have any favorite Brutalist Architecture resources?
    17:17:02 From Patrick Lichty : Brutalism appreciation society on FB, sci fi movie Sankofa and Last Men,
    17:22:14 From Peter Wasilko : One can’t help contemplate Brutalism in VR.
    17:22:38 From Patrick Lichty : Book – Soviet Bus Stops
    17:26:38 From Peter Wasilko : @Brandel do you have a link to that work?
    17:29:02 From Karl Hebenstreit, Jr. : List of speakers from Future of Interface workshop: https://docs.google.com/document/d/1tUjMRyLHEtyzHotfAmQUnDsdMHuMWdBkmuG8TlRKEqs/edit
    17:29:54 From Brandel Zachernuk : https://zachernuk.neocities.org/2016/beestConfigurator/#wr=38&bl=170&ta=272&tb=157&ba=210&bb=94&fl=200&fa=1.371&
    17:30:00 From Karl Hebenstreit, Jr. : @Dene will want to view the Exploratorium: https://futureofinterface.org/exploratorium/
    17:30:15 From Karl Hebenstreit, Jr. : Thank you everyone, on to my next meeting…
    17:30:44 From Leon van Kammen : https://xrf.isvery.ninja/#List%20of%20fragments
    17:31:25 From Matthias mprove : Flashback: URLs were never meant to be seen by the end-user. This just happened some years after Berners-Lee/Cailliau have introduced their browser w/o visible links. Very insightful interview with Robert Cailliau (1999) at https://www.youtube.com/watch?v=x2GylLq59rI
    17:36:04 From Matthias mprove : related to Leons wonderful work :: my ChronoLinks contain perspective and and zoom and time and old map identifier. Example: click the 5 buttons below at EXPEDITION AUF TENERIFFA https://mprove.de/chronolab/world/humboldt/index.html#teneriffa
    (This is a 3d mapbox world)
    17:36:44 From Fabien : Reacted to “Flashback: URLs we…” with 👀
    17:43:44 From Fabien : see also https://git.benetou.fr/utopiah/text-code-xr-engine/issues/24 on “in 3D” screenshot (as glTF snapshot) from defined perspective, tried with “friends” via Immers and had a working version
    17:44:24 From Frode Hegland : Reacted to “see also https://git…” with 👍
    17:45:19 From Patrick Lichty : I have to go to another meeting soon. This was great. Thank you!
    17:46:21 From Matthias mprove : Bonus info: Chronoscope’s ChronoLinks can also contain commands that are executed once the link is loaded by a browser. Example: this animation is just one link. Video: https://twitter.com/chronohh/status/1551203958730985472
    DIY ChonoLink inside here https://twitter.com/chronohh/status/1550876398960709638
    17:47:15 From Fabien : https://wicg.github.io/scroll-to-text-fragment/
    17:48:12 From Matthias mprove : @Leon set a marker on something: This is again Jef Raskin, the distinction between focus of attention and locus of attention. /also from The Humane Interface
    17:48:22 From Fabien : https://markjs.io
    17:50:42 From Brandel Zachernuk : Just like SMS, never intended to be exposed as an end-user capability
    17:51:25 From Brandel Zachernuk : But there’s an explicit line from SMS to Twitter and many (most?) other social media platforms
    17:53:36 From Fabien : code related to federated share moment/position https://git.benetou.fr/utopiah/text-code-xr-engine/src/branch/federation/index.html#L222 that is accessible as a URL e.g https://immers.benetou.fr/s/639cb4171757b8382c120da1
    17:56:54 From Fabien : related on “pointing” at things in the real life with MR/AR/XR, this a “layer” of the “real” world https://git.benetou.fr/utopiah/text-code-xr-engine/issues/73
    18:09:04 From Matthias mprove : My sort of fun: Connecting the Colosseum with Stonehenge https://mprove.de/chrono?q=41.89018,12.49231&z=17.11&r=20&t=14&m=SZAT7400-MIII40-8&o=0.8&s=1&c=z,-5,9,vIAb28777542-16,h,z,-22,9,vIAb22010294-26,z,-7,9,9
    18:09:34 From Fabien : Reacted to “My sort of fun: Co…” with 👍
    18:15:44 From Peter Wasilko : We could exfiltrate data as a Visual Meta blipvert in VR. Rez a placard with the VM for 10th of a second. https://en.wikipedia.org/wiki/Blipvert
    
    jeff rascins, phrases not urls purle-elk-mount-fuji (ipfs)
    like that
    
    hyperbolic browsers, mindmaps
    
    fragment: highlight point of interest
    
    political cultural implications, who is it excluding, obligations, worst thing to apply
    
    XR Fragments,
    Brandel, url as authorisation (client decodes url, logic)
    
    visual meta
    
    mauve: matrix, 
    
    https://github.com/omigroup/gltf-extensions
     
    Mauve (they/it) says:
    https://github.com/omigroup/gltf-extensions/pulls
     
    Mauve (they/it) says:
    https://github.com/omigroup/gltf-extensions/pull/85
     
    Mauve (they/it) says:
    https://blog.mauve.moe/slides/p2p-spatial-indexes/#1
     
    Mauve (they/it) says:
    https://github.com/omigroup/omi-scripting-group
     
    Mauve (they/it) says:
    https://unit.land/
     
    21:58
    me says:
    https://searxr.me
     
    22:06
    MT
    Mauve (they/it)
    Mauve (they/it) says:
    https://stardustxr.org/
    
    22:12
    me says:
    https://github.com/copy/v86
     
    Mauve (they/it) says:
    https://webassembly.sh/
     
    Mauve (they/it) says:
    https://matrix.to/#/#LoFirCy:mauve.moe

    4d3.jpg

     

    4dassets.jpg

     

    aboutleon.png

     

    accessibility

     28th September 2025 at 10:53am

    XR Fragment-capable clients offer increased XR Accessibility via aria-description-metadata, and the so-called 2-button navigation:

    TAB and ENTER actions allow for cycling/executing objects with href metadata contextually, meaning that only relevant objects are candidates for this (allowing comfortable tab-navigation inside huge worlds with large amounts of buttons). The buttons are remappable, and can also be triggered via speech.
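The 2-button cycle can be sketched as follows (a hypothetical sketch with a stand-in scene graph; real clients track focus contextually):

```javascript
// Hypothetical sketch of 2-button navigation: TAB cycles only objects
// carrying href metadata, ENTER follows the focused one.
function hrefObjects (scene) {
  return scene.children.filter(o => o.userData && o.userData.href)
}

let cursor = -1
function tab (scene) {                     // TAB: focus the next candidate
  const objs = hrefObjects(scene)
  cursor = (cursor + 1) % objs.length
  return objs[cursor]
}

function enter (scene, navigate) {         // ENTER: follow the focused href
  const obj = hrefObjects(scene)[cursor]
  if (obj) navigate(obj.userData.href)
}

// usage with a stand-in scene graph:
const scene = { children: [
  { userData: {} },                        // decorative, skipped by TAB
  { userData: { href: '#platformA' } },
  { userData: { href: '#platformB' } }
] }
tab(scene)                                 // focuses '#platformA'
tab(scene)                                 // focuses '#platformB'
```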

    Example of metadata of an accessible button:

    • href: #platformB
    • aria-description: is a space where the history of mankind is explained

    Selecting the button will display/read to the user: #platformB is a space where the history of mankind is explained.
    The spec encourages starting aria-description with a verb (for easy transcripts).


    Here's the relevant part of the spec:

    The spec also defines a simple text-input interface which allows navigation via speech or text-input:

    AFRAME template

     23rd May 2023 at 2:56pm

    application sidecar file

     3rd October 2025 at 11:19am

    When an XR player (myplayer.exe e.g.) is launched (without a specific file), it should poll for a default file (myplayer.glb e.g.).

    Sidecar files

    These are optional auto-loaded files which enable hassle-free publishing of XR Movie applications.
    The application should poll for only those 3D file-formats it supports.


    Webviewer example

    1. https://my.org/demo.html loads demo.glb
    2. https://my.org/demo.html?file=foo.glb loads foo.glb
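The webviewer rule above can be sketched as follows (a hypothetical helper, not part of any library):

```javascript
// Hypothetical sketch of webviewer sidecar logic: derive the default
// 3D file from the page URL unless ?file= overrides it.
function sidecarFile (pageUrl) {
  const u = new URL(pageUrl)
  const override = u.searchParams.get('file')
  if (override) return override
  return u.pathname.split('/').pop().replace(/\.html?$/, '.glb')
}

// sidecarFile('https://my.org/demo.html')              → 'demo.glb'
// sidecarFile('https://my.org/demo.html?file=foo.glb') → 'foo.glb'
```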

    Linux/Mac example

    $ ls -la
    myplayer
    myplayer.glb
    $ ./myplayer          <-- automatically loads myplayer.glb
    $ ./myplayer foo.glb  <-- no sidecar logic
    

    Windows example

    > dir
    myplayer.exe
    myplayer.glb
    > myplayer.exe         <-- automatically loads myplayer.glb
    > myplayer.exe foo.glb <-- no sidecar logic
    

    NOTE: after loading a file the usual sidecar file-logic applies

    automatic reflection mapping

     2nd September 2025 at 4:30pm

    Reflection mapping enhances the realism of 3D objects by reflecting their surroundings.
    To make sure each object uses the right environment map, set it in your 3D editor (Blender e.g.) based on the closest parent object with a (seamless) texture.

    This way, objects automatically inherit the appropriate reflections and lighting from their nearest parent, ensuring a consistent and realistic look across the scene.
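The inheritance rule can be sketched as follows (illustrative; a real viewer would assign a THREE.js material.envMap during scene traversal):

```javascript
// Hypothetical sketch: walk up the parent chain and reuse the
// environment map of the closest ancestor that has one.
function inheritEnvMap (obj) {
  for (let p = obj.parent; p; p = p.parent) {
    if (p.envMap) {
      obj.envMap = p.envMap
      return obj.envMap
    }
  }
  return null
}

// usage with stand-in objects: room → table → cup
const room  = { envMap: 'roomHDR', parent: null }
const table = { parent: room }
const cup   = { parent: table }
inheritEnvMap(cup)   // cup now reflects the room
```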


    Below is the related section of the spec (full spec here: HTML, TXT)

    balloon.css

     15th April 2021 at 7:07pm

    :root {
      --balloon-color: rgba(16, 16, 16, 0.95);
      --balloon-font-size: 12px;
      --balloon-move: 4px; }
    
    button[aria-label] {
      overflow: visible; }
    
    [aria-label] {
      position: relative;
      cursor: pointer; }
    [aria-label]:after {
      opacity: 0;
      pointer-events: none;
      transition: all .18s ease-out .18s;
      text-indent: 0;
      font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', Roboto, Oxygen, Ubuntu, Cantarell, 'Open Sans', 'Helvetica Neue', sans-serif;
      font-weight: normal;
      font-style: normal;
      text-shadow: none;
      font-size: var(--balloon-font-size);
      background: var(--balloon-color);
      border-radius: 2px;
      color: #fff;
      content: attr(aria-label);
      padding: .5em 1em;
      position: absolute;
      white-space: nowrap;
      z-index: 10; }
    [aria-label]:before {
      width: 0;
      height: 0;
      border: 5px solid transparent;
      border-top-color: var(--balloon-color);
      opacity: 0;
      pointer-events: none;
      transition: all .18s ease-out .18s;
      content: "";
      position: absolute;
      z-index: 10; }
    [aria-label]:hover:before, [aria-label]:hover:after, [aria-label][data-balloon-visible]:before, [aria-label][data-balloon-visible]:after, [aria-label]:not([data-balloon-nofocus]):focus:before, [aria-label]:not([data-balloon-nofocus]):focus:after {
      opacity: 1;
      pointer-events: none; }
    [aria-label]:not([data-balloon-pos]):after {
      bottom: 100%;
      left: 50%;
      margin-bottom: 10px;
      transform: translate(-50%, var(--balloon-move));
      transform-origin: top; }
    [aria-label]:not([data-balloon-pos]):before {
      bottom: 100%;
      left: 50%;
      transform: translate(-50%, var(--balloon-move));
      transform-origin: top; }
    [aria-label]:not([data-balloon-pos]):hover:after, [aria-label]:not([data-balloon-pos])[data-balloon-visible]:after {
      transform: translate(-50%, 0); }
    [aria-label]:not([data-balloon-pos]):hover:before, [aria-label]:not([data-balloon-pos])[data-balloon-visible]:before {
      transform: translate(-50%, 0); }
    [aria-label].font-awesome:after {
      font-family: FontAwesome, -apple-system, BlinkMacSystemFont, 'Segoe UI', Roboto, Oxygen, Ubuntu, Cantarell, 'Open Sans', 'Helvetica Neue', sans-serif; }
    [aria-label][data-balloon-break]:after {
      white-space: pre; }
    [aria-label][data-balloon-break][data-balloon-length]:after {
      white-space: pre-line;
      word-break: break-word; }
    [aria-label][data-balloon-blunt]:before, [aria-label][data-balloon-blunt]:after {
      transition: none; }
    [aria-label][data-balloon-pos="up"]:after {
      bottom: 100%;
      left: 50%;
      margin-bottom: 10px;
      transform: translate(-50%, var(--balloon-move));
      transform-origin: top; }
    [aria-label][data-balloon-pos="up"]:before {
      bottom: 100%;
      left: 50%;
      transform: translate(-50%, var(--balloon-move));
      transform-origin: top; }
    [aria-label][data-balloon-pos="up"]:hover:after, [aria-label][data-balloon-pos="up"][data-balloon-visible]:after {
      transform: translate(-50%, 0); }
    [aria-label][data-balloon-pos="up"]:hover:before, [aria-label][data-balloon-pos="up"][data-balloon-visible]:before {
      transform: translate(-50%, 0); }
    [aria-label][data-balloon-pos="up-left"]:after {
      bottom: 100%;
      left: 0;
      margin-bottom: 10px;
      transform: translate(0, var(--balloon-move));
      transform-origin: top; }
    [aria-label][data-balloon-pos="up-left"]:before {
      bottom: 100%;
      left: 5px;
      transform: translate(0, var(--balloon-move));
      transform-origin: top; }
    [aria-label][data-balloon-pos="up-left"]:hover:after, [aria-label][data-balloon-pos="up-left"][data-balloon-visible]:after {
      transform: translate(0, 0); }
    [aria-label][data-balloon-pos="up-left"]:hover:before, [aria-label][data-balloon-pos="up-left"][data-balloon-visible]:before {
      transform: translate(0, 0); }
    [aria-label][data-balloon-pos="up-right"]:after {
      bottom: 100%;
      right: 0;
      margin-bottom: 10px;
      transform: translate(0, var(--balloon-move));
      transform-origin: top; }
    [aria-label][data-balloon-pos="up-right"]:before {
      bottom: 100%;
      right: 5px;
      transform: translate(0, var(--balloon-move));
      transform-origin: top; }
    [aria-label][data-balloon-pos="up-right"]:hover:after, [aria-label][data-balloon-pos="up-right"][data-balloon-visible]:after {
      transform: translate(0, 0); }
    [aria-label][data-balloon-pos="up-right"]:hover:before, [aria-label][data-balloon-pos="up-right"][data-balloon-visible]:before {
      transform: translate(0, 0); }
    [aria-label][data-balloon-pos="down"]:after {
      left: 50%;
      margin-top: 10px;
      top: 100%;
      transform: translate(-50%, calc(var(--balloon-move) * -1)); }
    [aria-label][data-balloon-pos="down"]:before {
      width: 0;
      height: 0;
      border: 5px solid transparent;
      border-bottom-color: var(--balloon-color);
      left: 50%;
      top: 100%;
      transform: translate(-50%, calc(var(--balloon-move) * -1)); }
    [aria-label][data-balloon-pos="down"]:hover:after, [aria-label][data-balloon-pos="down"][data-balloon-visible]:after {
      transform: translate(-50%, 0); }
    [aria-label][data-balloon-pos="down"]:hover:before, [aria-label][data-balloon-pos="down"][data-balloon-visible]:before {
      transform: translate(-50%, 0); }
    [aria-label][data-balloon-pos="down-left"]:after {
      left: 0;
      margin-top: 10px;
      top: 100%;
      transform: translate(0, calc(var(--balloon-move) * -1)); }
    [aria-label][data-balloon-pos="down-left"]:before {
      width: 0;
      height: 0;
      border: 5px solid transparent;
      border-bottom-color: var(--balloon-color);
      left: 5px;
      top: 100%;
      transform: translate(0, calc(var(--balloon-move) * -1)); }
    [aria-label][data-balloon-pos="down-left"]:hover:after, [aria-label][data-balloon-pos="down-left"][data-balloon-visible]:after {
      transform: translate(0, 0); }
    [aria-label][data-balloon-pos="down-left"]:hover:before, [aria-label][data-balloon-pos="down-left"][data-balloon-visible]:before {
      transform: translate(0, 0); }
    [aria-label][data-balloon-pos="down-right"]:after {
      right: 0;
      margin-top: 10px;
      top: 100%;
      transform: translate(0, calc(var(--balloon-move) * -1)); }
    [aria-label][data-balloon-pos="down-right"]:before {
      width: 0;
      height: 0;
      border: 5px solid transparent;
      border-bottom-color: var(--balloon-color);
      right: 5px;
      top: 100%;
      transform: translate(0, calc(var(--balloon-move) * -1)); }
    [aria-label][data-balloon-pos="down-right"]:hover:after, [aria-label][data-balloon-pos="down-right"][data-balloon-visible]:after {
      transform: translate(0, 0); }
    [aria-label][data-balloon-pos="down-right"]:hover:before, [aria-label][data-balloon-pos="down-right"][data-balloon-visible]:before {
      transform: translate(0, 0); }
    [aria-label][data-balloon-pos="left"]:after {
      margin-right: 10px;
      right: 100%;
      top: 50%;
      transform: translate(var(--balloon-move), -50%); }
    [aria-label][data-balloon-pos="left"]:before {
      width: 0;
      height: 0;
      border: 5px solid transparent;
      border-left-color: var(--balloon-color);
      right: 100%;
      top: 50%;
      transform: translate(var(--balloon-move), -50%); }
    [aria-label][data-balloon-pos="left"]:hover:after, [aria-label][data-balloon-pos="left"][data-balloon-visible]:after {
      transform: translate(0, -50%); }
    [aria-label][data-balloon-pos="left"]:hover:before, [aria-label][data-balloon-pos="left"][data-balloon-visible]:before {
      transform: translate(0, -50%); }
    [aria-label][data-balloon-pos="right"]:after {
      left: 100%;
      margin-left: 10px;
      top: 50%;
      transform: translate(calc(var(--balloon-move) * -1), -50%); }
    [aria-label][data-balloon-pos="right"]:before {
      width: 0;
      height: 0;
      border: 5px solid transparent;
      border-right-color: var(--balloon-color);
      left: 100%;
      top: 50%;
      transform: translate(calc(var(--balloon-move) * -1), -50%); }
    [aria-label][data-balloon-pos="right"]:hover:after, [aria-label][data-balloon-pos="right"][data-balloon-visible]:after {
      transform: translate(0, -50%); }
    [aria-label][data-balloon-pos="right"]:hover:before, [aria-label][data-balloon-pos="right"][data-balloon-visible]:before {
      transform: translate(0, -50%); }
    [aria-label][data-balloon-length="small"]:after {
      white-space: normal;
      width: 80px; }
    [aria-label][data-balloon-length="medium"]:after {
      white-space: normal;
      width: 150px; }
    [aria-label][data-balloon-length="large"]:after {
      white-space: normal;
      width: 260px; }
    [aria-label][data-balloon-length="xlarge"]:after {
      white-space: normal;
      width: 380px; }
    @media screen and (max-width: 768px) {
      [aria-label][data-balloon-length="xlarge"]:after {
                          white-space: normal;
                          width: 90vw; } }
    [aria-label][data-balloon-length="fit"]:after {
      white-space: normal;
      width: 100%; }
    
    
    /* Customization -------------- */
    
    /* Add this to your CSS */
    .tooltip-red {
      --balloon-color: red;
    }
    
    .tooltip-big-text {
      --balloon-font-size: 20px;
    }
    
    .tooltip-slide {
      --balloon-move: 30px;
    }

    Best practices

     8th October 2025 at 12:01pm
    To guarantee a smooth XR ride, remember: small, optimized 3D files prevent motion sickness.

    There's tons of info out there on 3D modeling & rendering.
    However, how to do things economically (to run smooth on standalone VR hardware e.g.) is a different thing.

    This is an effort to summarize design techniques to make the most out of standalone XR (fragment) experiences.

    1. Decimate your objects
    2. File formats
    3. Fog materials
    4. JPG vs PNG
    5. Lights
    6. Pixel- and gradient-maps
    7. Trimsheet textures
    8. Unlit textures
    9. UV mirroring
    10. Why small file-size matters

    camera HUDLUT

     24th March 2026 at 9:32pm

    HUDLUT extension

    NOTE: this is optional

    This allows XR experiences with:

    • HUD-displays (a transparent plane with infographics e.g.)
    • camera LUT, by wrapping a semi-transparent sphere around the camera, e.g.

    The heuristic: if a 3D file's default spawn point (objectname spawn) is of type Camera, then reparent its children to the player's camera.

    Engine prefixes

    To make the texture-plane act like a lens-filter in various engines, embedding these engine prefixes is recommended:

    • -three-material.blending: THREE.AdditiveBlending
    • -godot4-material.transparency: BaseMaterial3D.TRANSPARENCY_ALPHA
    • -godot4-material.blend_mode: BaseMaterial3D.BLEND_MODE_ADD
    • -godot3-material.flags_transparent: true
    • -godot3-material.params_blend_mode: SpatialMaterial.BLEND_MODE_ADD
    • -lovr-setBlendMode: add
    • -playcanvas-material.blendType: pc.BLEND_ADDITIVE
    • -babylonjs-material.alphaMode: BABYLON.Engine.ALPHA_ADD

    centralized.png

     

    changing object materials

     2nd September 2025 at 4:30pm

    XR Fragment-capable clients can change (and reset) materials of objects with a certain name or tag in various ways:

    #<tag_or_objectname>[*]=<materialname>

    | example | includes children | info |
    | --- | --- | --- |
    | #foo=dark | no | changes material of object with foo as name or part of tag (space-separated) to material with name dark |
    | #foo*=dark | yes | changes material of object with foo as name or part of tag (space-separated) to material with name dark |
    | #!foo | no | resets material of object with foo as name or part of tag back to original material |
    | #!foo* | yes | resets material of object with foo as name or part of tag back to original material |

    NOTE: if a material does not exist, the update does not happen.
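The syntax above is simple enough to sketch a parser for. A minimal sketch in JavaScript (the helper name parseMaterialFragment is hypothetical, not part of any client or the spec):

```javascript
// Hypothetical sketch: parse material-change fragments of the form
//   #name[*]=material   (change)   and   #!name[*]   (reset)
function parseMaterialFragment(frag){
  frag = frag.replace(/^#/, '')
  const reset = frag.startsWith('!')
  if( reset ) frag = frag.slice(1)
  const [selector, material] = frag.split('=')
  const children = selector.endsWith('*')
  return {
    name: children ? selector.slice(0, -1) : selector, // object name or tag
    children,                                          // also apply to children?
    reset,                                             // revert to original material?
    material: reset ? null : material
  }
}
```

For example, parseMaterialFragment('#foo*=dark') yields { name: 'foo', children: true, reset: false, material: 'dark' }.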

    child objects

     28th September 2025 at 5:49pm
    In the context of 3D files and editors, a **child object** is any asset—such as a mesh, light, or camera—that is linked in a dependency relationship to a parent object to create a hierarchical structure.

    The XR Fragments spec uses these relationships to hide objects, or render them in portals.

    By default, this structure, often called a scene graph, means that when the parent is moved, rotated, or scaled, the child object's position is automatically updated relative to the parent, effectively creating a single logical unit.

    For example, a character's eyeball would be a child of the head bone, ensuring the eye always follows the head's movements during animation. This powerful system simplifies complex scene management by allowing an artist to control a collection of parts through the single parent object.

    When a 3D file is saved and loaded, this exact parent-child relationship is preserved, guaranteeing the model and scene's structural integrity across different editing sessions and applications.
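The parent-child rule can be illustrated with a toy, position-only sketch (real engines propagate full 4×4 transform matrices, including rotation and scale; worldPosition is a hypothetical helper):

```javascript
// Toy scene-graph illustration: a child's world position is its local
// position offset by its parent's world position (rotation/scale ignored).
function worldPosition(node){
  const p = node.parent ? worldPosition(node.parent) : { x: 0, y: 0, z: 0 }
  return { x: p.x + node.position.x,
           y: p.y + node.position.y,
           z: p.z + node.position.z }
}

// moving the head moves the eye along with it
const head = { position: { x: 0, y: 2, z: 0 }, parent: null }
const eye  = { position: { x: 1, y: 1, z: 0 }, parent: head }
```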

    collidable / walkable objects

     2nd September 2025 at 4:30pm
    NOTE: the following is advised but non-mandatory for clients: the default floor is assumed to be at coordinate 0,0,0

    XR Fragment-capable clients can sense walkable meshes, by scanning all objects in a scene for:

    1. non-existence of href-attribute
    2. non-existence of src-attribute
    3. non-existence of material

    If all conditions are met, the mesh can be considered collidable/walkable (to teleport to e.g.)
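A minimal sketch of this heuristic, assuming glTF-style nodes whose custom properties (href, src) live in userData (names are illustrative, not mandated by the spec):

```javascript
// Sketch of the walkable-mesh heuristic: a mesh is considered
// collidable/walkable when it has no href, no src and no material.
function isWalkable(node){
  const meta = node.userData || {}
  return meta.href === undefined   // 1. no href attribute
      && meta.src  === undefined   // 2. no src attribute
      && !node.material            // 3. no material
}
```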

    conflict.jpg

     

    create a teleport button

     20th September 2025 at 10:20am
    An easy no-code way to add metadata to a 3D file is by adding custom properties in a 3D editor (Blender, e.g.). Basically:

    Internal teleports

    • create a plane object with a name (foo e.g.)
    • create a plane object elsewhere with a name (bar e.g.)
    • now create a button object, and add metadata href:#bar to teleport the user (after clicking)

    Now you can load the 3D file (in the example editor) and click the button to teleport.

    NOTE 1: teleportations focus on the **origin** of an object, so you might need to adjust them in case of a box e.g. (the origin is usually in the middle, which might not be what you want in case you want to teleport on top of the box).
    NOTE 2: put a camera object inside your (root)object, so that users can teleport to it from external files.
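Under the hood, clicking such a button boils down to something like this sketch (plain objects stand in for a real engine scene; teleport is a hypothetical helper, not a client API):

```javascript
// Sketch: resolve an internal href like '#bar' to the target object's
// origin and move the user/camera there.
function teleport(href, scene, camera){
  const target = scene.children.find( (o) => '#' + o.name === href )
  if( !target ) return false
  camera.position = { ...target.position } // focuses on the object's *origin* (see NOTE 1)
  return true
}
```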

    External teleports

    Create a plane or box-object, and add the following metadata:

    • href:https://xrfragment.org (open a website in a new tab, unless XRF microformat was detected)
    • href:https://me.com/model.glb (to surf to a new 3D model)
    • href:#..... (interactivity: execute some fragments, see 🖇 auto-generated fragments )

    Now you can load the 3D file (in the example editor) and click the button to teleport.

    Decimate your objects

     8th October 2025 at 10:47am
    How to turn large objects into low-poly objects?

    TIP: use the 'Decimate' modifier in your 3D editor

    This blender-script does this automatically in Blender during project-save ❤

    Many 3D designers feel disappointed when incorporating high-fidelity models downloaded from the web directly into Extended Reality (XR) experiences, such as VR or AR, because these models are often not optimized for real-time performance on resource-constrained devices like mobile headsets. They commonly feature excessively high polygon counts and overly complex geometry, which results in significant strain on the GPU and CPU, leading to slow, low-FPS framerates and a poor, disorienting user experience, often referred to as "jank" or "lag" 🐢.

    The Decimate Modifier in 3D editors is a vital solution to this problem, as it is a powerful tool for reducing the polygon count of a mesh automatically while attempting to preserve the object's original shape and visual fidelity, thereby significantly improving rendering efficiency and achieving the smooth, high frame rate critical for comfortable and immersive XR applications.

    How to Decimate in Blender

    To apply the Decimate Modifier in Blender 🛠️, follow these steps:

    1. Select Your Model: In Object Mode, click on the 3D model you wish to optimize.

    2. Add the Modifier: Navigate to the Modifier Properties tab (the wrench icon) in the Properties panel and click "Add Modifier," selecting "Decimate" from the list.

    3. Adjust the Ratio: The default mode, "Collapse," is usually best for general reduction. Adjust the Ratio value, which ranges from 0 (maximum decimation) to 1 (original geometry). For example, a ratio of 0.25 will attempt to reduce the face count by approximately 75%. Keep an eye on the face count and the model's visual integrity as you adjust this value.

    4. Apply the Change: Once satisfied with the optimized look and lower face count, you must click "Apply" on the modifier in the stack to bake the changes permanently into the mesh geometry, making the model ready for export to your XR platform.

    decimate.svg

     

    default spawn location

     13th November 2025 at 4:11pm

    create an object with name spawn.

    When a 3D file is loaded, it will position the user/camera at xyz: 0,0,0 OR to the location of (an object called) spawn.

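This spawn lookup can be sketched as follows (spawnLocation is a hypothetical helper over plain objects, not a real client API):

```javascript
// Sketch of the default-spawn heuristic: position the user/camera at the
// object named 'spawn' if present, otherwise at 0,0,0.
function spawnLocation(scene){
  const spawn = scene.children.find( (o) => o.name === 'spawn' )
  return spawn ? spawn.position : { x: 0, y: 0, z: 0 }
}
```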

    Draft of '↪ URI.parse(url,flags)'

     4th May 2023 at 11:36am

    Draft of '📜 level0: File'

     22nd September 2025 at 9:50pm

    Draft of '📜 XR fragments'

     4th August 2023 at 12:29pm

    Draft of '📜level7: engine prefixes'

     18th March 2026 at 1:58pm

    ↑ Example of janusweb executing engine prefixes (defined in Blender)

    Engine prefixes are like [browser vendor prefixes](https://developer.mozilla.org/en-US/docs/Glossary/Vendor_Prefix), focused on 3D viewers & Game engines
      1. Progressive enhancement

    There are cases where the 3D scene file might want to hint specific features to the viewer-engine ([JANUSWEB](https://github.com/jbaicoianu/janusweb), THREE.js, AFRAME, Godot e.g.).

    This makes sense when the features cannot be triggered via other means (extensions e.g.)
        1. Examples

    This is how gltf extras / custom properties on 3D entities, can act as initializers:

    | custom property | on | functionality |
    | --- | --- | --- |
    | -three-background: rgb(255, 0, 0) | sceneroot | hints scene.background = new Color('rgb(255, 0, 0)') to the THREE scene |
    | -three-material.blending: THREE.AdditiveBlending | object | hints .material.blending = THREE.AdditiveBlending in THREE.js viewers |
    | -godot-material.blend_mode: BLEND_ADD | object | hints .material.blend_mode = material.BLEND_ADD |
    | -godot4-mat.blend: ADD | object | similar, but for Godot v4: hints .material.blend_mode = material.BLEND_MODE_ADD |
    | -touchdesigner-mat.blend: ADDITIONAL | object | similar, but for TouchDesigner |
    | -myapp-myfeature: true | object | hints a feature-toggle for some private app |

    [JANUSWEB](https://github.com/jbaicoianu/janusweb) adopts XRF level7 gltf extras for both -three-* and -janus-*:

    | custom property | on | functionality |
    | --- | --- | --- |
    | -janus-collision_id: myplane | object | hints .collision_id = 'myplane' in janusweb viewer |
    | -janus-billboard: true | object | hints .billboard = true in janusweb viewer |
    | -janus-use_local_asset: room2 | | |
    | -janus-source | sceneroot | contains the JML-sourcecode of the scene |
    | -janus-tag: paragraph | object | instantiates a <paragraph> object |
    | -janus-text: Lorem ipsum | object | instantiates <paragraph text="Lorem ipsum"/> |

      1. Support your own engine/app

    scene.traverse( (obj) => {
      for( let key in obj.userData ){
        const realKey = key.replace(/^-(janus|three)-/, '')
        // THREE fallthrough
        if( key.match(/^-three-/) ){
          if( key.match(/-material\./) ){
            if( obj.material ) obj.material[ realKey.replace('material.', '') ] = obj.userData[key];
          }else{
            obj[realKey] = obj.userData[key];
          }
        }
        // JANUS fallthrough
        if( key.match(/^-janus-/) ){
          if( obj.name == 'Scene' || obj.parent.name == '' ){
            room[realKey] = obj.userData[key];
          }else{
            toJanusObject(obj)[realKey] = obj.userData[key]
          }
        }
      }
    })

    Draft of '$:/webxr-notebook/boot.css'

     28th April 2023 at 1:27pm

    Draft of 'How it works'

     5th May 2023 at 1:23pm

    Draft of 'Vertical fog'

     6th October 2025 at 2:38pm

    Draft of 'XR Fragments'

     8th February 2024 at 1:53pm

    dynamic scenes via server

     2nd September 2025 at 4:26pm

    Create a button, but add the following metadata for HTTP-links:

    dynamic 3D file via HTTP backend

    1. add to your button the metadata href:https://yourserver.io/latest.glb
    2. make your server programmatically return a 3D file when latest.glb is requested

    Please refer to your backend/framework documentation on how to serve a (binary) file.

    This allows for flexible server-controllable URLs inside a model. Keep in mind that this obviously reduces the portability of the model, which may (not) be a problem depending on the usecase.

    webserver-backend redirect

    1. add to your button the metadata href:https://yourserver.io/a?b
    2. make your server return Location: /final/link/here based on a or ?b, together with statuscode 302 (=temporary redirect).

    Please refer to your server documentation on how to do a 302 redirect.

    This allows for flexible server-controllable URLs inside a model. Keep in mind that this obviously reduces the portability of the model, which may (not) be a problem depending on the usecase.

    nginx static redirect

    location ~ /a\?b$ {
      return 302 https://foobar.com;
    }
    

    apache static redirect

    RewriteEngine On
    RewriteCond %{QUERY_STRING} ^a?b$
    RewriteRule ^(.*)$ https://foobar.com? [R=302,L]
    

    Edit a 3D scene file

     5th September 2025 at 3:33pm

    There are loads of 3D editors and 3D file formats out there.
    For maximum interoperability, glTF (.glb and .gltf) is the suggested file format.
    For editors the following FOSS (free) 3D editors are suggested for importing/exporting glTF files:

    embed a 3D object

     20th September 2025 at 10:36am
    **NOTE**: Embedding is only possible via user-interactions. This is to ensure portable 3D scenes.

    Create an empty mesh object (in Blender it's called an 'Empty') and add an href custom property whose value is optionally prefixed with #! (toggle) or #* (multiply):

    • href: #!menu (teleports object 'menu' in front of user)
    • href: #*cube (duplicates object 'cube' in front of user)
    • href: https://foo.com/menu.glb (imports menu in front of user (*))
    • href: https://foo.com/menu.glb#bar (imports object bar in front of user (*))
    * = when file contains 0 href's and 0 sidecar-files (see 📜 level0: File)

    The best practice is to show a lowpoly preview-version, or image-texture of the embedded object/file. This hints the user what to expect.

    NOTE: the src attribute has been deprecated as it can lead to

    engine-prefixes.webp

     

    EU keeps/stops funding FOSS?

     22nd July 2024 at 10:58am

    Since 2020, Next Generation Internet (NGI) programmes, part of European Commission's Horizon programme, fund free software in Europe using a cascade funding mechanism (see for example NLnet's calls). This year, according to the Horizon Europe working draft detailing funding programmes for 2025, we notice that Next Generation Internet is not mentioned any more as part of Cluster 4.

    NGI programmes have shown their strength and importance in supporting the European software infrastructure, as a generic funding instrument to fund digital commons and ensure their long-term sustainability. We find this transformation incomprehensible, especially since NGI has proven efficient and economical in supporting free software as a whole, from the smallest to the most established initiatives. This ecosystem diversity backs the strength of European technological innovation, and maintaining the NGI initiative to provide structural support to software projects at the heart of worldwide innovation is key to enforcing the sovereignty of a European infrastructure. Contrary to common perception, technical innovations often originate from European rather than North American programming communities, and are mostly initiated by small-scale organizations.

    Previous Cluster 4 allocated 27 million euros to:

    "Human centric Internet aligned with values and principles commonly shared in Europe" ;
    "A flourishing internet, based on common building blocks created within NGI, that enables better control of our digital life" ;
    "A structured eco-system of talented contributors driving the creation of new internet commons and the evolution of existing internet commons" .
    

    In the name of these challenges, more than 500 projects received NGI funding in the first 5 years, backed by 18 organisations managing these European funding consortia.

    NGI contributes to a vast ecosystem, as most of its budget is allocated to fund third parties by the means of open calls, to structure commons that cover the whole Internet scope - from hardware to application, operating systems, digital identities or data traffic supervision. This third-party funding is not renewed in the current program, leaving many projects short on resources for research and innovation in Europe.

    Moreover, NGI allows exchanges and collaborations across all the Euro zone countries as well as "widening countries"¹, currently both a success and an ongoing progress, like the Erasmus programme before it. NGI also contributes to opening and supporting longer relationships than strict project funding does. It encourages implementing funded projects as pilots, backing collaboration, identification and reuse of common elements across projects, interoperability in identification systems and beyond, and setting up development models that mix diverse scales and types of European funding schemes.

    While the USA, China or Russia deploy huge public and private resources to develop software and infrastructure that massively capture private consumer data, the EU can't afford this renunciation. Free and open source software, as supported by NGI since 2020, is by design the opposite of potential vectors for foreign interference. It lets us keep our data local and favors a community-wide economy and know-how, while allowing an international collaboration. This is all the more essential in the current geopolitical context: the challenge of technological sovereignty is central, and free software allows to address it while acting for peace and sovereignty in the digital world as a whole.

    In this perspective, we urge you to call for preserving the NGI programme as part of the 2025 funding programme.

    ¹ As defined by Horizon Europe, widening Member States are Bulgaria, Croatia, Cyprus, the Czech Republic, Estonia, Greece, Hungary, Latvia, Lithuania, Malta, Poland, Portugal, Romania, Slovakia and Slovenia. Widening associated countries (under condition of an association agreement) include Albania, Armenia, Bosnia and Herzegovina, the Faroe Islands, Georgia, Kosovo, Moldova, Montenegro, Morocco, North Macedonia, Serbia, Tunisia, Turkey and Ukraine. Widening overseas regions are: Guadeloupe, French Guiana, Martinique, Reunion Island, Mayotte, Saint-Martin, the Azores, Madeira and the Canary Islands.

    Examples

     26th September 2025 at 7:40pm

    NOTE: some examples were made during experimental stages of the spec, they will be removed in the future.
    example.glb simple startingpoint
    website.glb website startingpoint
    elearning.glb quiz startingpoint
    telescopic.glb reveal via href + media fragments
    index.glb kitchensink (buggy WIP)
    cdrom.glb with animations controlled via href + media fragments
    xrsh overlay showing aria-descriptions + scene transcripts
    searxr.me metasearch engine supporting XR Fragments

    feedback.png

     

    File formats

     6th October 2025 at 3:10pm

    What 3D file-format should I save my XR Fragments-compatible experience to?

    While the XR Fragments spec is file format-agnostic, it's recommended to save to one of the following formats (via your 3D editor):

    * = export to .glb for a compressed small file-result.

    For thumbnails (see sidecar files) the spec uses PNG images

    Textures

    3D editors allow adding textures to objects. It is recommended to use:

    Reason: JPG usually results in smaller file sizes. However, there are excellent PNG compressors available, like tinypng, or saving a PNG in GIMP in Indexed mode (Image > Mode > Indexed).

    filters

     2nd September 2025 at 4:30pm

    3D Objects inside a 3D model can be referenced/shown/hidden via URI filters:

    This allows high re-usability of 3D models for remote-, local- and recursive (embedded src) usecases:

    
      my.io/scene.usdz                 Embeddable as:
      +─────────────────────────────+
      │ sky                         │  src: http://my.io/scene.usdz#sky          (includes building,mainobject,floor)
      │ +─────────────────────────+ │ 
      │ │ building                │ │  src: http://my.io/scene.usdz#building     (includes mainobject,floor)
      │ │ +─────────────────────+ │ │
      │ │ │ mainobject          │ │ │  src: http://my.io/scene.usdz#mainobject   (includes floor)
      │ │ │ +─────────────────+ │ │ │
      │ │ │ │ floor           │ │ │ │  src: http://my.io/scene.usdz#floor        (just floor object)
      │ │ │ │                 │ │ │ │
      │ │ │ +─────────────────+ │ │ │  href: http://my.io/scene.usdz#-mainobject (hides mainobject when clicked)
      │ │ +─────────────────────+ │ │
      │ +─────────────────────────+ │
      +─────────────────────────────+
    
    

    The href and src documentation show various examples, but the full syntax is explained in the spec below.
    On top of that, tagged objects allow using tag metadata to group objects to trigger grouped features

    What does "&-interactions*" do in the demo scene?

    The scene-node (3D root) of the demo scene indeed contains (startup) # metadata (#pos=start&rot=0,40,0&t=0&-interactions*).
    It hides all 3D objects (and their children) which are tagged with 'interactions'.
    For example: you can see all the menu-items in Blender, but not in the browser.

    • & is just a separator ('AND do the following:')
    • - means 'hide'
    • interactions selects all objects with name 'interactions' or tag: interactions metadata
    • * selects all objects inside those selected objects too (text-objects etc)
    For more on syntax see the spec below
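The `&`/`-`/`*` rules above can be sketched as a tiny parser. This is a minimal, illustrative sketch (not the official grammar defined in the spec below); the function name `parseFragment` and the returned shapes are assumptions:

```javascript
// Sketch: split a fragment like '#pos=start&rot=0,40,0&t=0&-interactions*'
// into operations. '-' means hide, '*' selects children too, '=' is a setter.
function parseFragment(hash) {
  return hash.replace(/^#/, '').split('&').filter(Boolean).map((part) => {
    if (part.includes('=')) {
      const [key, value] = part.split('=');
      return { type: 'setter', key, value };          // pos=start, t=0 etc
    }
    const hide = part.startsWith('-');                // '-' means 'hide'
    const name = part.replace(/^[-+]/, '');
    return {
      type: hide ? 'hide' : 'show',
      name: name.replace(/\*$/, ''),
      recursive: name.endsWith('*'),                  // '*' selects children too
    };
  });
}

console.log(parseFragment('#pos=start&rot=0,40,0&t=0&-interactions*'));
```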



    Fragment identifiers are derived from metadata inside the loaded 3D Model.
    More specifically: object-, material-, and camera-names, via a strategy called 'Fragment-to-metadata mapping':



    Fog materials

     7th October 2025 at 6:42pm
    How to define disappearing objects above/below/far away from you in a 3D file?

    TIP: use transparent materials for fog




    View WebXR demo

    Fact 1: Fog shaders are not always necessary for atmospheric effects.
    Fact 2: Use geometry for near-sight, and images/sprites for far-sight backgrounds (humans cannot perceive depth many meters away).

    You can often achieve a great sense of depth and visual separation by using a large, partially transparent cylinder or sphere, textured with a semi-transparent color or a vertical gradient.png texture.
    This will act as a soft volumetric divider between an inner and outer scene.

    Horizontal fog

    By dividing the scene depth with several semi-transparent 'filters', a sense of horizontal depth can be achieved:

    fact: perceiving fog depends on the objects placed at various distances (3 layers of fog-perception is enough), not on how discrete the fog is.

    Vertical fog

    See Pixel- and gradient-maps for info on UV-mapping.

    important: the gradient texture should be transparent (at least the middle), so the objects can be seen clearly.

    gradient.png

    the texture can be 16x16 pixels (no need for big textures). This allows for a variety of gradient colors & experimentation.

    Getting started

     26th September 2025 at 7:50pm

    Just get your hands on a 3D editor (see this 🖥 Blender ✅🔥 guide) and follow the steps in the video:


    Join Matrix Community

    Here are various ways to create/test 3D files with XR Fragments:

    | scenario | how | notes |
    | --- | --- | --- |
    | easiest | see the 🖥 Blender ✅🔥 workflow below, by loading a .glb 3D file into any demo on xrfragment.org | export 3D file (.glb) in Blender, after adding href metadata as custom properties, and load exported files into any demo (see video above) |



    Developers

    For developers wanting to integrate or build their own 3D hypermedia browser, the easiest approach is WebXR:

    » View website.glb online or download website.glb and open it in Blender.
    (developers can extend a 3D model viewer via this codepen)

    But there are also other approaches, as XR Fragments is not tied to any XR-technology or fileformat:

    | scenario | how | notes |
    | --- | --- | --- |
    | dev #godot | load the example project | |
    | dev #threejs #github #modular | fork xfragment-three-helloworld | requires javascript- and threejs developer-knowledge |
    | dev #polyglot | use the XR Fragment parser | lowlevel approach, more suitable for other scenarios |
    | dev #spec #browser | implement the spec yourself | the spec is simple: parse URL and iterate over a scene |
    | dev #aframe #github | hosted sandbox by forking xrfragment-helloworld | basically #1, but it will be hosted for free at your own github URL |
    | dev #aframe #github #modular | fork xfragment-aframe-helloworld | requires javascript- and aframe.io developer-knowledge |

    Next to that, familiarize yourself with XR Fragments by checking these videos:

    1. All videos on github (tip: star the repo)
    2. All videos on Youtube (tip: subscribe or add to 'Watch-later' list)

    GLSL template

     25th April 2023 at 7:05pm

    glTF extensions

     7th November 2025 at 8:42am

    XR Fragments is not a fileformat-specific extension; it's a spec for deeplinking any 3D file.
    The level2 metadata (see reference) is easier to embed in any 3D editor (not only Blender) than it would be to support new glTF extensions.
    This is not to say extensions are bad (they are superior in certain cases).

    Just like URLs allow fileformat-agnostic navigation, 3D asset 'extras' are fileformat-agnostic too, which together allow for XR Fragments.

    To deal with extensions/overlapping features see native vs XRF features

    hashbus

     28th February 2024 at 1:22pm

    The hashbus sits in between HTML's traditional href and the top-level URL.
    Say what?

    For historical reasons, the href bundles interaction (a click) and navigation (replacing the viewport with another resource).

    XR Fragments also allows separating these historically merged actions, by introducing a hashbus:

    | href value | updates top-level URL |
    | --- | --- |
    | #foo | yes |
    | xrf://#foo | no |

    This allows many more document interactions, with the following benefits:

    • interactions don't clutter URLs for back/forward button navigation
    • many usecases don't require a scripting language anymore (hiding/scrolling via #uv e.g.)
    • use same URI Fragment DSL for navigation and interactions
    • re-use URI Templates across 3D nodes
    • allow 3D nodes to publish updates to other 3D nodes (via the hashbus)

    In short, a complete hypermediatic feedback loop (HFL).
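The routing rule in the table above can be sketched in a few lines. This is an illustrative sketch, not spec API; `hashbus`, `subscribe`, `publish` and `navigate` are assumed names:

```javascript
// Sketch of a hashbus: xrf:// hrefs are interactions (published to listeners
// only), while plain #... hrefs also update the top-level URL.
const listeners = [];
const hashbus = {
  subscribe: (fn) => listeners.push(fn),
  publish: (frag) => listeners.forEach((fn) => fn(frag)),
};

function navigate(href, updateTopLevelURL) {
  if (href.startsWith('xrf://')) {
    hashbus.publish(href.slice('xrf://'.length));   // interaction only
  } else {
    hashbus.publish(href);
    updateTopLevelURL(href);                        // e.g. location.hash = href
  }
}
```

This way back/forward buttons only see real navigations, while interactions still reach every subscribed 3D node.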


    Below is the related section of the spec (full spec here: HTML, TXT)

    Home

     11th February 2025 at 6:01pm

    horizontal_fog.svg

     

    How it works

     16th March 2026 at 3:31pm

    protocol :// my.org/world.glb #room3

    Short answer: spatial anchors/spawnpoints in 3D scene(files).

    XR Fragments turns 3D files into bookmarkable, clickable & teleportable XR experiences.

    Explain it like I'm 5 y/o

    Sure, press play below:

    ~10min podcast deepdive

    Clickable links

    When clicking an href-value, the user(camera) is teleported to the referenced object.

    The imported/teleported destination can be another object in the same scene-file, or a different file.

    Adding a link

    Above is typical level1 syntax. See ❓ What are levels? for more.

    3D (Game)engine feature-hinting

    XR Fragments promotes being file-agnostic, but allows enabling engine-specific features via 📜level7: engine prefixes

    How can XR Browsers surf these worlds?

    Using a URL-bar in your browser, app or OS, or a button-object inside your 3D file (with href extra).
    The URL should point to a 3D scene or file (glTF, USDZ, OBJ, COLLADA, FBX e.g.):


    2D documents: https :// foo.org/article.html #chapter2

    3D documents: protocol :// foo.org/world.glb #room2


    Above is typical level1 syntax. See ❓ What are levels? for more.

    Example: internal & external teleport

    
      +────────────────────────────────────────────────────────+ 
      │                                                        │
      │  index.glb                                             │
      │    │                                                   │ during teleport:
      │    ├── ◻ roomB                                         │ camera's Y-coord will be set
      │    │                                                   │ ~1.6m above (roomB's) origin
      |    |                                                   | except in case of camera-rig
      │    ├── ◻ buttonA                                       │ (=non-root camera for VR e.g.)
      │    │      └ href: #roomB                               │
      |    │                                                   │  +────────────────────────+
      │    └── ◻ buttonB                                       │  | other.usdz             |
      │           └ href: other.usdz#foo                       │  |   |                    |
      │                                                        │  |   ├── ◻ camera         |
      +────────────────────────────────────────────────────────+  |   └── ◻ foo            |
                                                                  |          └─◻ camera3   |    
                                                                  +────────────────────────+
    																															
      clicking buttonA will teleport the user to roomB 
      (or import roomB if it's not XR Fragment-compatible)
      
      clicking buttonB will teleport the user to foo in other.usdz 
      (or import foo if it's not XR Fragment-compatible)
       
    	 
    See href for more explanation
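The same-file teleport rule in the diagram above (camera ~1.6m above the target's origin) can be sketched as follows. This is a hedged sketch under assumed data shapes (`scene.objects`, `camera.position` are illustrative, not any particular engine's API):

```javascript
// Sketch: resolve an href click. Same-file fragments teleport the camera
// ~1.6m above the referenced object's origin; other files must be loaded first.
function teleport(camera, scene, href) {
  if (!href.startsWith('#')) return 'external';     // e.g. other.usdz#foo
  const target = scene.objects[href.slice(1)];
  if (!target) return 'not-found';
  camera.position = {
    x: target.position.x,
    y: target.position.y + 1.6,                     // ~1.6m above origin
    z: target.position.z,
  };
  return 'teleported';
}
```

The camera-rig exception mentioned in the diagram (non-root cameras for VR) is left out for brevity.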

    Example: internal && external importing objects

    
      +────────────────────────────────────────────────────────+ 
      │                                                        │
      │  index.glb                                             │  Usecase: trigger
      │    │                                                   │  interactive experiences
      │    ├── ◻ bar                                           │  in front of the user
      │    │      └◻ chart                                     │ 
      |    |                                                   | 
      │    ├── ◻ buttonA                                       │ 
      │    │      └ href: #!bar                                │
      │    │                                                   │  +────────────────────────+
      │    └── ◻ buttonB                                       │  | other.usdz             |
      │           └ href: other.usdz#!infographic&t=0          │  |   |                    |
      +────────────────────────────────────────────────────────+  |   └── ◻ infographic    |
                                                                  |          └─◻ KPIs      |    
                                                                  +────────────────────────+
    																															
      clicking buttonA will reposition (and play) the chart in front of the user
      clicking buttonB will clone (and play) the infographic from other.usdz 
    
    
    See #!-operator for more explanation

    How can I add interactions to existing 3D assets/scenes?

    By manually adding metadata inside 3D objects/assets/scenes, or via a sidecar-file, which gives a 3D file interactive powers.




    Below is the related section of the spec (full spec here: HTML, TXT)

    see Getting started to get going!

    Howto

     2nd September 2025 at 4:14pm

    hyperpreview vs 2D hyperlinks

     7th July 2023 at 11:04am

    Let's look at the browser thru the lens of XR, and not the other way around (it's a trap).

    • a 2D hyperlink navigates/replaces the current document (or opens a tab)
    • a hyperpreview simply links/shows/summarizes a 2D/3D object/document/image

    A hyperpreview promotes approximated summaries of text documents, instead of fully supporting/rendering them.
    That way, opening the content (spatially) will be offloaded to (other applications) on the client or operating system.
    This is in contrast with the traditional 2D (space-restricted) way of opening hyperlinks in new tabs (or replacing the current document).

    Basically: the moment you want to implement HTML iframes into your spatial experience, you're looking at XR thru the lens of 2D (a common trap). The higher-dimensional recursive nature of XR Fragments already allows recursive (spatial i)frames.

    Spec 0.5

    1. mimetype text/html instanced by src should be hyperpreviewable (a non-interactive 2D image-texture).

    2. When interacting with a hyperpreview, the XR Fragment host/client should offer copy/share of the address (to clipboard and optionally other applications which can handle the mimetype).

    3. hyperpreviews should not aim for achieving 100% render-compatibility of all mimetypes. The goal is addressability and approximated summarization, not embedding javascript-supported browser-iframes.

    4. Designers can solve unsupported mimetypes by using src for an image-thumbnail and href for the content (which should be offloaded to the (applications on the) operating system)

    mimetype behaviour when user interacts with src:

    | mimetype | render | hyperpreview | action | update URL fragment | clipboard contents after clicking |
    | --- | --- | --- | --- | --- | --- |
    | unknown mimetypes | no | | | | |
    | text/html | no | yes | summarize HTML-text (first paragraph hinted by a fragment identifier e.g.) using crude html-to-image | name of object (#website) | |
    | 3d objects (model/gltf+json, model/glb, model/obj, ..and so on) | yes | no | highlight (draw boundingbox e.g.) | name of object (#cube e.g.) | src-value + linebreak + url with fragment: http://other.com/other.gltf https://foo.com/#cube |
    | images (image/png, image/jpg, image/gif, ..and so on) | yes | no | highlight (draw border/boundingbox e.g.) | name of object (#poster e.g.) | object url with fragment (https://foo.com/#cube e.g.) |

    Sharing such a 'trail' (with the clipboardmanager) promotes backwards-reasoning (other.gltf is a cube in scene.gltf e.g.)
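A client's dispatch on these mimetypes could be sketched as follows. This is an illustrative sketch; `hyperpreview` and the returned action strings are assumptions, not spec API:

```javascript
// Sketch: decide render/action per mimetype, following the table above.
function hyperpreview(mimetype) {
  if (mimetype === 'text/html') {
    return { render: false, action: 'summarize-html-to-image' };
  }
  if (mimetype.startsWith('model/')) {
    return { render: true, action: 'highlight' };   // draw boundingbox e.g.
  }
  if (mimetype.startsWith('image/')) {
    return { render: true, action: 'highlight' };   // draw border e.g.
  }
  return { render: false, action: null };           // unknown: offload to OS
}
```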

    Example: embed an HTML document into your scene

    • create a plane with custom property src and value https://mysite.com/foo.html#summary or https://mysite.com/foo.html#chapter1.
    • add a custom property so that the XR Fragment client can easily render a html-to-image conversion to a texture.
      This is perfect for simple text.
      CRUD/scripting/animations don't belong in hyperpreviews and can partially be re-used in the 3D assets (using src or fbx/gltf animations).

    Q: How can I embed text from a textfile on a server?

    A: create an src with value https://mysite.com/foo.txt so that the XR Fragment client can easily render a html-to-image conversion to a (non)scrolling texture.


    Why are hyperpreviews so limited?

    Because hyperpreviews separate the following concerns of hyperlinks: navigation, addressability, interaction and rendering.
    In 2D hyperlinks we click links, which navigates us to AND renders the destination.

    In Spatial Experiences, endusers are better off hyperpreviewing hyperlinks, which (due to their addressability) can optionally be opened in another application or device.

    Forcing a user to interact with all mimetypes spatially is not a realistic goal.

    If we were to indulge in the latter, we'd be opening a can of worms regarding:

    • security (malicious actors thrive when going beyond read-only previews or HTTP GET)
    • the spatial browser becomes a mimetype-rendering-silo (ballooning in size & support)
    • rendering speed / framedropping

    image_VR_lady

     8th October 2025 at 12:09pm

    implicit deeplinks

     24th March 2026 at 3:23pm

    Every 3D object, material or camera can be addressed by URI fragments (#myobject e.g.) which are derived from the 3D scene objectnames (implicit metadata).
    These are inferred at runtime from the 3D scene-nodes (object names, object metadata etc).

    Free fragments and features, generated for you... How great is that? 🎉

    
      my.io/scene.usdz                 Embeddable as:
      +─────────────────────────────+
      │ sky                         │  src: http://my.io/scene.usdz#sky          (includes building,mainobject,floor)
      │ +─────────────────────────+ │ 
      │ │ building                │ │  src: http://my.io/scene.usdz#building     (includes mainobject,floor)
      │ │ +─────────────────────+ │ │
      │ │ │ mainobject          │ │ │  src: http://my.io/scene.usdz#mainobject   (includes floor)
      │ │ │ +─────────────────+ │ │ │
      │ │ │ │ floor           │ │ │ │  src: http://my.io/scene.usdz#floor        (just floor object)
      │ │ │ │                 │ │ │ │
      │ │ │ +─────────────────+ │ │ │  href: http://my.io/scene.usdz#-mainobject (hides mainobject when clicked)
      │ │ +─────────────────────+ │ │
      │ +─────────────────────────+ │
      +─────────────────────────────+
    
    

    Fragments (#building e.g.) allow for very convenient, guess-able filters to reference/show/hide objects within a 3D scene.

    For more examples see filters and the reference-menu (by feature)
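Deriving these implicit fragments boils down to walking the scene-graph and indexing objectnames. A minimal sketch, assuming a simple `{ name, children }` node shape (illustrative, not any particular engine's API):

```javascript
// Sketch: build a '#name' -> node index from implicit scene metadata.
function indexFragments(node, index = {}) {
  index['#' + node.name] = node;
  (node.children || []).forEach((child) => indexFragments(child, index));
  return index;
}

const scene = { name: 'sky', children: [{ name: 'building', children: [{ name: 'floor' }] }] };
console.log(Object.keys(indexFragments(scene)));  // ['#sky', '#building', '#floor']
```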

    Below is the related section of the spec (full spec here: HTML, TXT)


    Imploding 3D scene to Text

     2nd September 2025 at 4:30pm

    This can be done using Descriptive metadata and ARIA descriptions (see 🧩 Object metadata)

    Demo


    interlinked.png

     

    JPG vs PNG

     6th October 2025 at 3:13pm

    3D editors allow adding textures to objects. Recommended is to use:

    Reason: JPG usually results in smaller filesizes by default.

    But..there's a but

    For PNG there are excellent PNG compressors available like tinypng, saving a PNG in GIMP to Indexed mode (Image>Mode>Indexed).

    Texture filtering

    Usually 3D editors allow you to set a different filter for a texture:

    Lights

     8th October 2025 at 11:49am
    A 3D-file can contain one or more lights..but limiting scenes to one light (and using unlit materials) is advised

    Note: below focuses on glTF features

    Why one light?

    Lights are computationally heavy, so adding more reduces accessibility across hardware (frame skipping e.g.).

    Unlit materials

    A light cannot shine infinitely (without overexposing near objects); therefore unlit materials (which are not affected by lights) are advised.

    Read more about unlit materials for glTF objects for Blender here
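In glTF, unlit materials are expressed with the KHR_materials_unlit extension (which Blender's glTF exporter can emit). A minimal hand-written material entry might look like this; the texture index and factor values are assumptions about the rest of the file:

```javascript
// Minimal glTF material entry using the KHR_materials_unlit extension.
// A supporting viewer skips lighting calculations for this material.
const unlitMaterial = {
  name: 'wall',
  pbrMetallicRoughness: {
    baseColorTexture: { index: 0 },   // assumes texture 0 exists in this file
    roughnessFactor: 1.0,
    metallicFactor: 0.0,
  },
  extensions: { KHR_materials_unlit: {} },
};
```

Remember to also list "KHR_materials_unlit" in the file's extensionsUsed array.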

    PBR?

    Physically-based rendering (PBR) properties are material properties which can be assigned to 3D objects.

    Screenshot of blender, with the UV edit-tab open. It shows the selected pyramid uv-locations in the pixelmap. Resulting in a 28 kilobytes 3D file-size [ .glb .blend ] without using lights/shaders.

    Setting values like below, a variety of material-characteristics can be added to the textured 3D scene above:

    • roughness
    • metallic
    • alpha

    logo.svg

     26th September 2025 at 7:29pm

    lut.webp

     

    mov

     17th August 2023 at 9:59am

    updates the position of the queried object(s) relative to their original position

    | fragment | type | access | functionality |
    | --- | --- | --- | --- |
    | #mov=0,0,0 | vector3 | 🔓 🎲 💥 🔗 | translate position |

    » example implementation
    » discussion

    spec

    version 0.2

    1. translate the object(s) by adding the vector on top of its current position
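Spec point 1 above can be sketched in a few lines. A minimal sketch, assuming a simple `{ position: {x,y,z} }` object shape (`applyMov` is an illustrative name):

```javascript
// Sketch of #mov: parse 'x,y,z' and add the vector to the object's
// current position (per spec point 1 above).
function applyMov(object, fragmentValue) {
  const [x, y, z] = fragmentValue.split(',').map(Number);  // '0,0.5,-1' e.g.
  object.position.x += x;
  object.position.y += y;
  object.position.z += z;
  return object;
}
```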

    Demo


    example of interactions using mov

    multilingual

     28th September 2025 at 6:09pm

    multilingual 3D file

    How to enable multilingual 3D experiences?

    Answer: by adding extra objectnames/metadata which include a language postfix (ISO 639-1 two-character language code).

    Example object-metadata:

    • aria-description: this is a cube
    • aria-description-nl: dit is een kubus

    Example object-name:

    Given a scene with objectnames banner and banner-nl

    • href: #+banner

    If the language is set to nl and banner-nl exists too, then that will be selected/operated instead.

    How to determine the language?

    This should be a setting of the browser. For example, to get the language from a webbrowser:

    navigator.language.slice(0,2)  // 'en'
    

    NOTE: this is completely optional, and works via progressive enhancement.
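The postfix rule above can be sketched as a small lookup. A minimal sketch; `selectLocalized` and the argument shapes are illustrative names, not spec API:

```javascript
// Sketch: prefer 'banner-nl' over 'banner' when the client language is 'nl';
// falls back to the plain name (progressive enhancement).
function selectLocalized(name, objectNames, lang) {
  const localized = `${name}-${lang}`;               // ISO 639-1 postfix
  return objectNames.includes(localized) ? localized : name;
}

// in a webbrowser, lang could come from navigator.language.slice(0, 2)
selectLocalized('banner', ['banner', 'banner-nl'], 'nl');  // → 'banner-nl'
selectLocalized('banner', ['banner', 'banner-nl'], 'en');  // → 'banner'
```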

    multiparty networking

     26th September 2025 at 7:42pm

    XR Fragment-capable clients promote a network-agnostic-or-a-la-carte philosophy. Therefore, networking can be hinted using the href metadata.

    This demoviewer basically detects href metadata with values:

    matrix://r/myroom:matrix.org
    trystero://r/myroom:bittorent
    <someprotocol>://<resource>



    In the demoviewer, just click the 'meeting link' or '[matrix]' button to see that it triggers a connection popup.
    An important detail is that the user can always decide what (not) to connect (webcam/chat/scene events/avatar e.g.).

    Important to notice: there's no compelling reason why a networked experience should go all-in (with avatars & scene-sync), just as there's no reason for websites to show the mousecursors of all active visitors.

    Technically, any XR Fragment-compatible client can support as many protocols as they want (natively, thru extensions, or using a viewer scriptinglanguage).
    This is why XR Fragments is ready for future networks too.

    My first XRF editor

     8th October 2025 at 12:53pm
    NOTE: this is entertainment for developers. The best environment for editing 3D models is...a 3D editor. So most of us are better off using those.

    However, since XR Fragment browsers use URI (Fragments) for interactivity, a simple metadata-editor can easily be implemented.
    Below is a demonstration of an editor.js vanilla javascript-plugin, which adds a button to the example viewer:


    Basically, a simple metadata editor allows for:

    updating href values

    • to change teleportation destinations (internal links, external links)
    • to show/hide certain objects when clicking a button
    • to open a certain website when clicking a button
    • to change networking addresses (Matrix room, WebRTC P2P roomname e.g.)

    updating src values

    • to link to different (internal/external) 3D objects

    updating tag values

    • to create/modify groups of objects (for group-hiding/showing via #-mygroup and #mygroup in href-values)

    Beyond this, you can choose to allow the enduser to edit any metadata.

    native vs XRF features

     7th November 2025 at 8:47am

    How to deal with overlapping functionality?


    Well, native (extensions) take precedence; otherwise the 'fallback' applies.

    Examples

    | Native feature | XRF | Solution |
    | --- | --- | --- |
    | <Link url="..."> in JanusXR JML | href: .... | use <Link>, otherwise href |
    | OMI_Link gltf extension | href: .... | use OMI_Link, otherwise href |
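The precedence rule above is a one-liner in practice. A minimal sketch, assuming a node shape where a detected native link feature (OMI_Link, JanusXR <Link> e.g.) has already been normalized into a `nativeLink` field (an illustrative name):

```javascript
// Sketch: native (extension) link features take precedence,
// the XRF href metadata is the fallback.
function resolveLink(node) {
  if (node.nativeLink) return node.nativeLink;   // native extension wins
  return node.href || null;                      // otherwise fall back to href
}
```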

    Below is the related section of the spec (full spec here: HTML, TXT)



    For more info see How it works

    navigation.png

     

    neo.png

     

    nlnet.png

     

    objecteleport.png

     10th September 2025 at 11:32am

    optional levels

     16th March 2026 at 3:22pm

    parent_child_toggle.svg

     

    parent_child.svg

     

    perception_reality6.jpg

     

    Philosophy & FAQ

     16th March 2026 at 3:28pm
    "Future 3D file formats commoditize yesterdays 3D engines" ~ Leon van Kammen

    Why it matters

    In the last 12 years, the web has (arguably) been enshittified with either:

    • disjointed XR experiences via WebXR apps or appstores
    • seamless XR in isolated ecosystems/games (which own the content)

    XR Fragments removes this 'apartheid', and enables seamless XR-first virtual worlds, via read-only deep immersive web experiences in browsers. JanusWeb is a great example:

    Sit back and watch this #convergence-not-metaverse appstore-agnostic-but-symbiotic philosophy below:



    FAQ

    Q: Why is X not possible / so limited?
    A: You're probably referring to the Example Model Browser, while XR Fragments is a code-less, framework-less, language-less SPEC for navigating and linking 3D models. This is important to realize. The spec is about parsing spatial hints in URIs from the URL-bar and metadata inside 3D models. Developers can decide to build anything on top of this paradigm which falls outside the spec.


    Q: How can XR Fragments support all 3D files ever made?
    A: By targeting the lowest common denominator of all 3D fileformats: objectnames, positions, rotations (and metadata called custom properties/extras)


    Q: Do my 3D files need a specific (metadata) layout/format?
    A: No, XR Fragments are file-agnostic and metadata is optional.
    The objectnames inside the 3D file are used as URL references.


    Q: Do I need complex infrastructure?
    A: No, XR Fragments are protocol-agnostic, you can host your files on a USB-stick, wordpress webserver, ftp-directory, ipfs, blockchain etc!


    Q: How will this enable the metaverse?
    A: The metaverse is a fantasy sci-fi concept from a book. XR fragments deals with real people creating 3D interlinked content & storytelling.


    Q: Why not attach a programming language to XR Fragments
    A: The intention is understandable, but it is out of scope. Programming languages & frameworks come and go. Hence XR Fragments is a spec of interactive metadata for 3D viewers. Hypermedia viewers based on metadata outsurvive programming languages in general.
    However, you are free to build a programming language to extend experiences, or build a viewer or parser in your favorite language.


    Q: Why don't you add feature X from game Y?
    A: To keep the spec simple, it is limited to 1 href primitive which allows myriads of URL-controllable experiences (including the metadata already present in 3D files like object names and hierarchy).
    It's a pragmatic approach after witnessing many metaverse-inspired do-it-all complex technology-stacks.


    Q: Why does XR Fragments use href as a panacea?
    A: It covers 2 core XR usecases (teleport & import).
    HTML's href traditionally also does more than one thing (clickability, document-loading and updating top-level URL), so it's not always a bad thing.


    Q: Why focus on fileformat-agnostic & designers?
    A: this is the 3D hypermedia file sweetspot (the middle greenzone below).




    Developers tend to fall in love with specific shiny 3D technologies, which typically buries 3D content inside them.
    These however, still lack addressability and interoperability, unlike 3D Models, which can use XR Fragments URLs as a basic primitive.

    See XR Hypermedia Federation for more.

    Philosophy



    We have plenty of well-crafted, amazing 3D assets on the web.
    What's missing? Hyperlinked no-code storytelling ❤

    What else is missing? addressability of XR experiences.
    Less boilerplate code = productive XR design ❤

    Do ask yourself: why do code-heavy XR applications tend to break over time due to browser/OS/dependency updates?

    Solution: XR Fragments

    Let's invite some old battle-proof friends (src, href, class, queries, URLs and protocols), and connect our 3D assets directly:

    Many meaningful experiences can be achieved using solely interlinked (cached) 3d-assets. The definition of meaningful here is: the highest person-to-person create-and-share value.
    This is possible by piggybacking the W3C media fragment-format, as well as the href, src and class concept from HTML.
    XR Fragments are fileformat-agnostic, so we can link/embed FBX to/inside glTF and so on ❤

    Earlier attempts

    Most attempts either fall into client-server or fileformat lock-in.
    They fall into typical 'Metaverse' / Mozilla Hubs spinoffs: lots of code, laser-focused on a specific 3D fileformat, and lots of experts building centralized server-complexity.

    The end-game of these (to be fair: interesting & amazing) solutions is: users & resources trapped in walled gardens.
    The bug of centralized solutions is that they (just like the financial economy) must grow (their profits/audience) to survive.

    It has been solved before

    How? By enriching the things mere mortals already produce.
    HTML was enriching text which we've already been writing.
    XR Fragments are enriching 3D assets which we've already been making.
    Instead of coming up with new enormous codebases, a simple standard can reduce so much code and complexity ❤

    Focuspoints

    • there's a lack of compelling WebXR content
      • focus on where contentcreators are (not devs)
      • piggyback on export-features of existing 3D editors (blender e.g.)
      • be fileformat agnostic (FBX, glTF etc we love you all)
      • don't lock designers into a specific editor
      • XR Fragments should free devs from coding nontrivial things
    • 3D content should be surfable locally too
      • Just like HTML-files can be saved/viewed on a smartphone
    • "people dont want to run servers" (partially true)
      • focus on browser, lowbarrier & simplicity
      • don't introduce new servers, softwarestacks or frameworks
    • centralized stakeholders maximize securityrisks AND design by committee
      • 3D assets should be allowed to be read-only (100% HTTP GET)
      • XR Fragments are 100% optional (to ease adoption/backwardscompatibility)
      • XR Fragments are only concerned with public navigateable content
    • 3D asset-formats & frameworks come and go
    • Pragmatic solutions: Induction, Deduction, Abduction method using survey

    Out of scope (client or asset responsibility)

    • avatars
    • realism/performance (responsibility of asset & client)
    • realtime gaming event-propagation
    • webrtc
    • gltf (OMI) extensions and glXF draft-format contain interesting ideas, but are hardcoupled to glTF and require creation of specialized editors/exporters.
    • scripting / wasm e.g. (responsibility of client & designer to offer progressively enhanced XR experiences). XR Fragments at most supports interactivity thru roundrobin & predefined views (spec level 0), and queries (spec level 1); anything beyond would overcomplicate the (adoption of the) spec.
    See the session XR fragment, which indicates to the client where extended (session-based) information can be found. People who insist on scripting could hint clients where scripting-layers can be found in the session-fragment.

    Pixel- and gradient-maps

     7th October 2025 at 6:03pm

    TIP: don't chase "photorealism"

    Note: below focuses on glTF features

    TIP: Stretch texture-pixels around 3D objects (not huge textures).
    How? By configuring the UV coordinates of your 3D file.

    Why?

    Your 3D scenes will shrink to kilobytes
    This significantly improves UX on standalone headsets and WebXR.
    Basically: it makes (educational) virtual worlds accessible to all kinds of (lofi) hardware.

    Lowpoly art

    virtual (educative) worlds don't need to be 'realistic'

    Forget chasing "photorealism" with those massive, gigabyte-sized texture sheets that take ages to load! When you UV-map a simple, tiny pixelmap instead of a huge, detailed texture, you massively cut down on filesize and loading times.
    This approach is so much more flexible too, because if you want a different color scheme, you just swap out that tiny color-map for another one, instantly changing the whole look of your model without loading another giant file.
    It’s a huge win for speed and performance, proving that smaller is definitely better when it comes to XR art.

    UV Mapping?

    So, what is this UV-mapping magic anyway?

    Screenshot of blender, with the UV edit-tab open. It shows the selected pyramid uv-locations in the pixelmap. Resulting in a 28 kilobytes 3D file-size [ .glb .blend ] without using lights/shaders.

    It’s basically like taking the 3D model (your fancy digital sculpture) and carefully cutting it apart and flattening it out like a papercraft template, which gives you the "UV space."
    In 3D software like Blender, you do this by selecting a face or vertex, and moving it across the 2D texture.

    The general rule for low-poly UV-mapping is: just play around; if it looks good, it's good.

    UV-mapping is a rabbit-hole which goes beyond the low-poly design usecase, but for those interested check the Blender manual

    pixelmap.png

     

    popper.png

     

    portal rendering

     24th March 2026 at 2:23pm
    NOTE: this **extension** is optional, and can be used in a progressively enhanced way across (non)supporting viewers.


    Portals avoid XR clutter. Rinsema's effect: "To observe much is to absorb little"

    A materialless mesh with the following heuristics will render as an immersive portal:

    | Portal type | Heuristics | Result |
    | --- | --- | --- |
    | XR lens | child objects in materialless mesh | objects are only visible in mesh |
    | XR portal | href value | the other mesh will be used as camera-point (location will have 1.6m camera-height added) |

    NOTE: in 3D engines this is referred to as rendering to a designated stencil.

    You can see this demonstrated in index.glb or the demo-video & viewer below:

    NOTE: designers are advised to always accompany the portal with a button (see video), to keep things backwards-compatible with barebones (non-portal-supporting) XRF viewers.

    Examples

    Example scene hierarchy:

    
    
      my.io/scene.usdz
      +─────────────────────────────+
      │ world1                      │  
      │ +─────────────────────────+ │ 
      │ │ myportalmesh------------------+ href: #portalpoint  
      | |                         | |     
      │ +─────────────────────────+ │   myportalmesh renders world2
      │ world2                      │   inside from portalpoint's of view
      │ +─────────────────────────+ │   (if myportalmesh has no material)
      │ │ cube                    │ │  
    	| | portalpoint             | |
      │ +─────────────────────────+ │	
      +─────────────────────────────+
    
    

    Lenses

    Will render objects inside of the portal (children) ONLY inside of the portal.

    Example scene hierarchy:

    
    
      my.io/scene.usdz
      +─────────────────────────────+
      │ world1                      │  
      │ +─────────────────────────+ │ 
      │ │ myportal                | |  myportal renders someinfo inside  
    	| | +──────────+            | |  as portal  
      | | | someinfo |            | |  (if myportal has no material)
      | | +──────────+            | |
      │ +─────────────────────────+ │
      +─────────────────────────────+
    
    

    Demo viewer

    Press the 'Teleport down there'-button (in the lens) and witness the portals yourself afterwards:

    Potential future additions

     30th August 2023 at 5:59pm

    this document was moved here

    predefined_view

     15th August 2023 at 11:57am

    Just like with SVG fragments, predefined views are settings embedded in the asset.
    They are basically an alias for a (bundle of) XR Fragments.

    When are they triggered?

    • upon load by default (the # custom property, embedded in the asset)
    • when they occur in a top-level URL change
    • on-demand (by clicking a href-property with value #my_view e.g.)

    Basically, a custom property-key in a 3D file/scene needs to match the fragment, in order to have its value executed as an XR Fragments URI.



    Example scene

     
       🌎
       ├ #: #q=-sphere
       ├ #hide: #q=-.foo
       ├ #show: #q=.foo
       │
       ├── ◻ sphere
       │      └ href: #show|hide
       │
       └── ◻ cube
              └ class: foo
     	
    

    Upon load, # will hide a mesh with name sphere by default, but when triggering #hide or #show (*) it will show/hide any object with class foo (in other words: the cube mesh) in roundrobin fashion using |.

    * = by navigating the browser to #hide or clicking the sphere's href e.g.

    » example implementation

    Spec

    version 0.2

    1. upon scene-load, the XR Fragment parser should look for metadata (a unique custom property) in the scene with key # (and value #presetA&english e.g.)
    2. upon scene-load, the XR Fragment parser should look for predefined views in the top-level URL (basically keys without values, like #foo&bar)
    3. after collecting the predefined views, their respective string-values should be evaluated by searching thru the scene again (like step 1): a predefined view #foo will be defined somewhere as #foo: #q=cube&scale=1,1,1 e.g.
    4. the final XR Fragment strings (#q=cube&scale=1,1,1 e.g.) should be applied to the scene.
    5. Recursion is OK (#foo -> #bar -> #flop) but the XR Fragment parser should protect against cyclic dynamics (#foo -> #bar -> #foo e.g.) by not evaluating the originating predefined view (#foo) twice during the same evaluation.
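The recursion and cycle-protection of step 5 can be sketched as follows (hypothetical helper names, not part of the spec; `views` stands for the predefined views collected in steps 1-3):

```javascript
// Resolve a predefined view to its final XR Fragment strings (step 4 input),
// recursing into aliases (step 5) while protecting against cycles.
function resolveView(name, views, seen = new Set()) {
  if (seen.has(name)) return [];            // cyclic: skip re-evaluation
  seen.add(name);
  const value = views[name];
  if (value === undefined) return [];       // not a predefined view
  const fragments = [];
  for (const part of value.replace(/^#/, '').split('&')) {
    const key = part.split('=')[0].replace(/^-/, '');
    if (part.includes('=') || !(key in views)) {
      fragments.push(part);                 // plain fragment: apply as-is
    } else {
      fragments.push(...resolveView(key, views, seen)); // recurse (step 5)
    }
  }
  return fragments;
}
```

For example, `resolveView('foo', { foo: '#bar', bar: '#q=cube&scale=1,1,1' })` yields the final fragments of `#bar`, while a cyclic `#foo -> #bar -> #foo` simply stops at the repeated view.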

    DIY Parsing

    The AFRAME/THREE libraries do this for you, but here's how you would parse a top-level browser URI (document.location.href in javascript e.g.) using the parser for other languages:

    
    

    Presentation: XR Fragments (Future of Text)

     7th June 2023 at 1:28pm

    XR Fragments 2023 github.com/coderofsalvation/xrfragment

    Powered by TiddlyWiki, Reveal.js & NLnet

    Progressive enhancement

     28th September 2025 at 6:11pm
    All XR Fragments features are optional

    Why?

    To ease adoption.

    Even within the featureset, there are optional optionals:

    1. 📜level6: XDG soundtheme
    2. application sidecar file
    3. camera HUDLUT
    4. href
    5. multilangual
    6. portal rendering
    7. Progressive enhancement
    8. sidecar files
    9. system folders
    10. XRF microformat

    Publishing

     4th November 2025 at 5:51pm

    Open ecosystems

    The Permanence of Open Ecosystems vs. The Ephemerality of Closed Clouds

    Publishing to a Free and Open Source Software (FOSS) ecosystem decisively trumps reliance on closed, proprietary cloud products primarily because it safeguards your intellectual labor against the risk of content burial and obsolescence.

    In a closed cloud environment, every hour of work invested—be it code, design, or written content—is inherently ephemeral, subject entirely to the vendor's capricious business decisions: they can raise prices, discontinue features, or simply "sunset" the entire platform, effectively closing down and burying your carefully crafted material with no recourse.

    The true danger lies in the loss of control; you are merely a renter of your own digital presence, and when the proprietor locks the doors, all that effort becomes invisible and inaccessible. In stark contrast, by publishing to a FOSS ecosystem, your work is secured by the fundamental principles of openness: the code and platform remain auditable, forkable, and owned by the community, ensuring the longevity and persistence of your content far beyond the lifecycle of any single company.

    1. 🌎 JanusXR immersive web

    Reference

     27th April 2023 at 5:10pm

    roundrobin

     22nd June 2023 at 11:33am

    RoundRobin cycles thru a list of options (separated by |). It is a very basic way to cycle thru predefined views or object-selections.



    For example, when the user interacts with an embedded href: #foo|bar, it will update the top-URL to:



    ://url#foo


    But after clicking it the second time:



    ://url#bar


    And after clicking it the third time:



    ://url#foo


    And so on..

    You can add as many | options as you want, you're simply restricted to the maximum-length limitations of URLs.
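The cycling above can be sketched as a tiny helper (hypothetical function name, not part of the spec):

```javascript
// Cycle a `|`-separated fragment such as #foo|bar: given the fragment
// currently shown in the top-URL, return the next option (wrapping around).
function roundrobin(hrefValue, currentFragment) {
  const options = hrefValue.replace(/^#/, '').split('|');
  const index = options.indexOf(currentFragment.replace(/^#/, ''));
  return '#' + options[(index + 1) % options.length]; // unknown/none: first option
}
```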

    scale

     17th August 2023 at 9:57am

    updates the scale of queried object(s)

    | fragment     | type    | access      | functionality         |
    |--------------|---------|-------------|-----------------------|
    | #scale=0,0,0 | vector3 | 🔓 🎲 💥 🔗 | scale queried objects |

    » example implementation
    » discussion

    spec

    version 0.2

    1. scale the object(s) by overwriting the scale-vector of the object(s) with the vector3 value of scale

    Demo


    example of interactions using mov, pos (scale can be used as well)

    scaling of instanced objects

     4th August 2023 at 12:45pm

    Sometimes embedded properties (like href or src) instance new objects.
    But what about their scale?
    How does the scale of the object (with the embedded properties) impact the scale of the referenced content?

    Rule of thumb: visible placeholder objects act as a '3D canvas' for the referenced scene (a plane acts like a 2D canvas for images, a cube as a 3D canvas e.g.).

    Spec

    version 0.2

    1. IF an embedded property (src e.g.) is set on a non-empty placeholder object (geometry of >2 vertices):

    • calculate the bounding box of the placeholder object (maxsize=1.4 e.g.)
    • hide the placeholder object (material e.g.)
    • instance the src scene as a child of the existing object
    • calculate the bounding box of the instanced scene, and scale it accordingly (to 1.4 e.g.)
    REASON: a non-empty placeholder object can act as a protective bounding-box (for remote content which might grow over time e.g.)

    TODO: needs intermediate visuals to make things more obvious

    2. ELSE multiply the scale-vector of the instanced scene with the scale-vector of the placeholder object.
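Both rules can be sketched as follows (hypothetical names, not part of the spec; `bbox` stands for a precomputed bounding-box size per axis, `maxsize=1.4` as in the example above):

```javascript
// Scale an instanced (src) scene relative to its placeholder object.
function fitInstance(placeholder, instance) {
  if (placeholder.vertices > 2) {
    // step 1: non-empty placeholder acts as a bounding box:
    // scale the instanced scene so its largest side fits the placeholder's
    const s = Math.max(...placeholder.bbox) / Math.max(...instance.bbox);
    return [s, s, s];
  }
  // step 2: empty placeholder: multiply the scale-vectors
  return instance.scale.map((v, i) => v * placeholder.scale[i]);
}
```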

    Selection of interest

     16th May 2025 at 5:14pm
    NOTE: in the next iteration of the spec, selections of interest will be equivalent to #pos.
    Reason: in VR/AR settings the 'lookat' of the camera is not possible (while keeping headtracking-sensors active), leading to ambiguous results compared to desktop.

    A Selection of Interest (SoI) adheres to the spirit of the original URI fragment, and its rationale is further explained by Fabien Benetou in this video.
    Let's have a look at this url:



    ://url#cube&pos=0,0,0


    This allows link-sharing and referencing on a macrolevel (pos positions the camera) and microlevel (#cube):

    IF the scene or file contains an object with name cube, then the camera should look at that object and highlight it (draw a wire-frame bounding box e.g.).

    Another example:



    ://url#.cubes&pos=0,0,0


    IF the scene or file contains objects with custom property class: cubes, then the camera should look at at least one of those objects and highlight them (draw wire-frame bounding boxes e.g.).

    NOTE: it is up to the end-user/client to not create links which contain a pos which makes it impossible to see the Selection of interest.



    show

     17th August 2023 at 10:02am

    toggles the visibility of queried objects

    | fragment | type          | access      | functionality                        |
    |----------|---------------|-------------|--------------------------------------|
    | #show=1  | integer [0-1] | 🔓 🎲 💥 🔗 | show (1) or hide (0) queried objects |

    » example implementation
    » discussion

    spec

    version 0.2

    1. hide the object (material e.g.) when show has value 0

    2. show the object (material e.g.) when show has value 1

    3. not supported in src values (there, plain queries are used to hide/show objects)

    Demo


    example of interactions using show

    showing/hiding object(children)

     2nd September 2025 at 4:30pm

    XR Fragment-capable clients can show/hide objects with a certain name or tag in various ways:

    #[-]<tag_or_objectname>[*]

    | example | includes children | info                                                           |
    |---------|-------------------|----------------------------------------------------------------|
    | #foo    | no                | shows object with foo as name or part of tag (space-separated) |
    | #foo*   | yes               | shows object with foo as name or part of tag (space-separated) |
    | #-foo   | no                | hides object with foo as name or part of tag (space-separated) |
    | #-foo*  | yes               | hides object with foo as name or part of tag (space-separated) |
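Parsing this `#[-]<tag_or_objectname>[*]` grammar can be sketched as (hypothetical function name, not part of the spec):

```javascript
// Parse a show/hide fragment into its parts:
// leading '-' means hide, trailing '*' means include children.
function parseShowHide(fragment) {
  const m = fragment.replace(/^#/, '').match(/^(-?)([^*]+)(\*?)$/);
  if (!m) return null;
  return { show: m[1] !== '-', name: m[2], children: m[3] === '*' };
}
```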

    sidecar files

     23rd March 2026 at 5:14pm

    Sidecar files

    These are optional auto-loaded files to enable hassle-free XR Movies:


    via href metadata

    scene.glb   <--- 'href' extra [heuristic] detected inside!
    scene.png   <--- then poll preview thumbnail
    scene.ogg   <--- then poll soundtrack (plays when global 3D animation starts)
    scene.vtt   <--- then poll fallback subtitles for accessibility or screenreaders
    scene.json  <--- then poll sidecar JSON-file with explicit metadata
    
    heuristics
    1. if at least one `href` custom property/extra is found in a 3D scene,
    2. the viewer should poll for the above-mentioned sidecar-file extensions (and present accordingly)

    via chained extension

    scene.xrf.glb   <--- 'href' extra [heuristic] detected inside!
    scene.xrf.png   <--- then poll preview thumbnail
    scene.xrf.ogg   <--- then poll soundtrack (plays when global 3D animation starts)
    scene.xrf.vtt   <--- then poll fallback subtitles for accessibility or screenreaders
    scene.xrf.json  <--- then poll sidecar JSON-file with explicit metadata
    

    A fallback-mechanism to turn 3D files into XR Movies without editing them.

    heuristics

    the chained-extension heuristic .xrf. should be present in the filename (scene.xrf.glb e.g.)
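Both heuristics (href-metadata and chained extension) can be sketched as a candidate-list generator (hypothetical names, not part of the spec; actual polling/presenting is viewer-specific):

```javascript
// Return the sidecar URLs a viewer should poll for a given 3D file URL.
// `hasHref` indicates whether at least one `href` extra was found in the scene.
function sidecarCandidates(url, hasHref) {
  const chained = /\.xrf\.[^.\/]+$/.test(url);  // scene.xrf.glb e.g.
  if (!hasHref && !chained) return [];          // no heuristic triggered
  const base = url.replace(/\.[^.\/]+$/, '');   // strip the final extension
  return ['png', 'ogg', 'vtt', 'json'].map((ext) => `${base}.${ext}`);
}
```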


    multilanguage example

    Use symbolic links and language-directories

    scene.glb   
    scene.json  
    es/scene.glb --> ../scene.glb
    es/scene.png
    es/scene.ogg
    es/scene.vtt
    es/scene.json --> ../scene.json
    

    For more see multilangual

    via subdocuments/xattr

    More secure protocols (Nextgraph e.g.) don't allow simply polling files. In such cases, subdocuments or extended attributes should be polled:

    NOTE: in the examples below we use the href-heuristic, but also the .xrf. chained-extension applies here.

    myspreadsheet.ods
    └── explainer.glb      <--- 'href' extra [heuristic] detected inside!
        ├── explainer.ogg  (soundtrack to play when global 3D animation starts)
        ├── explainer.png  (preview thumbnail)
        ├── explainer.json (sidecar JSON-file with explicit metadata)
        └── explainer.vtt  (subtitles for accessibility or screenreaders)
    

    If only extended attributes (xattr) are available, the respective referenced file can be embedded:

    $ setfattr -n explainer.ogg -v "soundtrack.ogg" explainer.glb
    $ setfattr -n explainer.png -v "thumbnail.png" explainer.glb
    $ setfattr -n explainer.vtt -v "subtitles.vtt" explainer.glb
    

    NOTE: Linux's setfattr/getfattr is xattr on mac, and Set-Content/Get-Content on Windows. See pxattr for low-level access.

    Slide_FutureOfText/01

     30th May 2023 at 3:20pm

    XR Fragments


    A textual surfboard for XR experiences 💙

    A specification to discover, link, navigate & query 4D URLs.

    Slide_FutureOfText/02

     29th May 2023 at 11:26am

    DISCLAIMER

    Tasty speculations, oversimplifications ahead.

    Slide_FutureOfText/03

     27th May 2023 at 4:16pm

    About me

    Working for the internet (+vice versa).
    Observing internet & text (with a smile) thru the lens of Karl Popper & Neil Postman.

    Slide_FutureOfText/04

     28th May 2023 at 5:40pm


    Slide_FutureOfText/05.2

     28th May 2023 at 4:30pm

    Slide_FutureOfText/06

     28th May 2023 at 6:17pm

    Fragment

    a piece of information that is smaller than the whole

    XR Fragment adds: teleporting friends to specific (nested) experiences (at a certain time)

    Slide_FutureOfText/06.01

     29th May 2023 at 11:44am

    Bold statement

    Text keeps inviting itself to every party

    Slide_FutureOfText/06.3

     30th May 2023 at 3:07pm

    2D
    Fragment
    ://experience.html#something


    a friend teleports me to something somewhere at sometime

    Slide_FutureOfText/07

     29th May 2023 at 11:54am

    Many projections of 4D XR

    Slide_FutureOfText/07.3

     28th May 2023 at 3:45pm

    Slide_FutureOfText/07.46

     30th May 2023 at 3:19pm

    4D
    URL
    ://experience#pos=0,0,1&rot=0,90,0&t=100,500


    a friend teleports me to something somewhere at sometime

    Slide_FutureOfText/07.473

     28th May 2023 at 6:45pm

    Metadata

    Text keeps inviting itself to every party

    Slide_FutureOfText/07.475

     28th May 2023 at 5:35pm

    Slide_FutureOfText/07.48

     29th May 2023 at 11:53am

    src & href inviting themselves to the party (again)

    Slide_FutureOfText/07.49

     28th May 2023 at 5:03pm


    Slide_FutureOfText/07.495

     28th May 2023 at 8:15pm

    Slide_FutureOfText/07.497

     28th May 2023 at 8:07pm


    Slide_FutureOfText/07.5

     28th May 2023 at 5:41pm

    URL is the teleport

    offline-friendly

    Slide_FutureOfText/07.6

     28th May 2023 at 5:41pm

    Early attempts

    centralized, complex, survival-issues

    Slide_FutureOfText/09

     28th May 2023 at 3:55pm

    "URLs? been there done that."


    Slide_FutureOfText/16

     28th May 2023 at 8:18pm

    Bold redefinition of URLs

    Multidimensional Cognitive Transformers (MCT)


    "an interesting feedbackloop of experiences becoming metadata of text and/or vice-versa"

    Slide_FutureOfText/17

     28th May 2023 at 5:39pm

    Slide_FutureOfText/18

     29th May 2023 at 1:26pm

    src

     10th September 2025 at 11:45am
    NOTE: **deprecated** for portability/design reasons/issues (for safely importing remote content without breaking the design, use #!)

    src is the 3D version of the iframe.
    It instances content (in objects) in the current scene/asset.

    | fragment | type                                     | example value                                        |
    |----------|------------------------------------------|------------------------------------------------------|
    | src      | string (uri or predefined view or query) | #cube                                                |
    |          |                                          | #-ball_inside_cube                                   |
    |          |                                          | #-/sky&-rain                                         |
    |          |                                          | #-language&english                                   |
    |          |                                          | #price:>2&price:<5                                   |
    |          |                                          | https://linux.org/penguin.png                        |
    |          |                                          | https://linux.world/distrowatch.gltf#t=1,100         |
    |          |                                          | linuxapp://conference/nixworkshop/apply.gltf#q=flyer |
    |          |                                          | androidapp://page1?tutorial#pos=0,0,1&t1,100         |
    |          |                                          | foo.mp3#t=0,0,0                                      |
    NOTE: when the enduser clicks href: #cube while object cube has a timeline-supported src set (src: foo.mp3 or src: bar.mp4#t=0,0,0 e.g.), then #t=1,1,0 (play oneshot) will be executed for that src (see #t).

    » example implementation
    » example 3D asset
    » discussion


    Non-euclidian portals / lenses

    When src values are projected onto flat 3D objects, they will be rendered non-euclidian as:

    1. A portal: render objects ALSO inside portal (which the enduser can walk into)
    2. A lens: render objects ONLY visible inside lens

    Read more on the non-euclidian portals & lenses page

    XR audio/video integration

    • add a src: foo.mp3 or src: bar.mp4 metadata to a 3D object (cube e.g.)
    • to disable auto-play: add #t=0,0,0 (src: bar.mp3#t=0,0,0 e.g.)
    • to play it, add href: #cube somewhere else
    • when the enduser clicks the href, #t=1,0,0 (play) will be applied to the src value
    For more info see #t.
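The click-to-play steps above can be sketched as (hypothetical names, not part of the spec; a real viewer would apply the returned fragment to the object's src timeline):

```javascript
// When an href targets an object with a timeline-supported src (audio/video),
// derive the #t fragment to apply on click.
function onHrefClick(object) {
  // only objects with a timeline-supported src (mp3/mp4 e.g.) react this way
  if (!object.src || !/\.(mp3|mp4|ogg|wav)(#|$)/.test(object.src)) return null;
  return '#t=1,0,0'; // play, applied to the src's timeline
}
```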





    Spec

    Below is the related section of the spec (full spec here: HTML, TXT)

    system folders

     28th September 2025 at 6:09pm
    How to hide an object by default, after loading a scene?

    Answer: by parenting that object to an object whose name starts with an underscore-symbol (_)

    This indicates that the (underscore) object (and its children) should never be rendered (unless teleported by #!)

    • objectname _hidden and child will be hidden after scene-load
    • objectname child will be visible after being teleported out of _hidden (by clicking href: #!child)
    Rationale: this prevents unused assets from floating around in space in VR/AR scenes.

    tag

     10th September 2025 at 11:45am
    NOTE: tag is non-normative

    tag metadata allows tagging objects with strings (similar to id and class in HTML).
    It is used by filters to reference groups of objects, and by the XRWG to associate things with each other.

    | fragment | type                     | example value      |
    |----------|--------------------------|--------------------|
    | tag      | string (space separated) | #cube              |
    |          |                          | #cubes             |
    |          |                          | #-sky&rain         |
    |          |                          | #-language&english |
    |          |                          | #price=>2&price=<5 |

    » discussion

    tagged objects

     2nd September 2025 at 4:30pm

    XR Fragment-capable clients can reference objects with a certain name or tag, take for example this URL:

    https://foo.com/index.glb#cubes

    After loading the scene, all tags and object-names will be loaded into the XRWG, so that:

    1. objects with name cubes will be matched
    2. objects with tag cubes will be matched

    If objects are matched, the client can draw visible links to/from the objects/visitor to 'point' to those objects of interest.

    see predefined_view for more info
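The matching described above can be sketched as a minimal XRWG-like index (hypothetical names, not part of the spec):

```javascript
// Build an index mapping every object name and tag word to its objects,
// so a fragment like #cubes can be matched against names AND tags.
function buildXRWG(objects) {
  const index = new Map();
  const add = (word, obj) => {
    if (!index.has(word)) index.set(word, []);
    index.get(word).push(obj);
  };
  for (const obj of objects) {
    add(obj.name, obj);                                              // rule 1
    for (const tag of (obj.tag || '').split(' ').filter(Boolean)) {  // rule 2
      add(tag, obj);
    }
  }
  return index;
}
```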


    teleport camera spawnpoint

     24th March 2026 at 4:23pm

    set the position of the camera.

    level1 spawnpoints

    protocol://my.org/world.glb#bar

    Spawn user at object (with name) bar after loading (and replacing the current scene with) world.glb

    level2 href spawnpoints

    | objectname | optional | functionality                                                                                                                            |
    |------------|----------|------------------------------------------------------------------------------------------------------------------------------------------|
    | spawn      | yes      | specifies the default camera-position (if it exists). Detected after loading a scene (world.glb e.g.), if no URI hashtag is present (world.glb#roomC e.g.) |

    | URI fragment                    | type   | functionality                                                                                          |
    |---------------------------------|--------|--------------------------------------------------------------------------------------------------------|
    | #roomB                          | string | position camera at the position of object with name roomB                                              |
    | #cam2                           | string | position camera at the position of camera with name cam2, and make it the active camera [to follow its animation e.g.] |
    | https://my.org/worldX.glb#roomY | string | position camera at the position of object roomY after replacing the current scene with worldX.glb      |

    The usercamera (usually default at 0,0,0, or at objectname spawn) is repositioned to the origin and upvector of the target object; only in VR/AR is its height adjusted (~1.6m is subtracted, to compensate for the camera-rig).

    To enable VR elevators e.g., make sure:

    the camera is attached/parented to that object (so it animates along with the object)

    You can add this URI Fragment to the top-level URLbar, or as href value (to trigger via click) in a 3D model Editor (Blender e.g.).
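The camera repositioning can be sketched as (hypothetical names, not part of the spec; `immersive` indicates a VR/AR session):

```javascript
// Reposition the usercamera onto a spawn/target object.
// In VR/AR, ~1.6m is subtracted to compensate for the camera-rig height.
function spawnCamera(target, immersive) {
  const [x, y, z] = target.position;      // origin of the target object
  return [x, immersive ? y - 1.6 : y, z];
}
```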


    IMPORTANT: #pos=roomB has been deprecated in favor of #roomB to simplify the spec.

    Developers only:

    » example implementation
    » discussion

    Spec

    Below is the related section of the spec (full spec here: HTML, TXT)

    the-deep-immersive-web.svg

     

    THREE template

     23rd May 2023 at 2:56pm

    THREE template #online

     23rd May 2023 at 2:56pm

    timelines

     23rd October 2025 at 10:54am

    when a loaded file contains a timeline (animations/subtitles e.g.), then:

    All timelines must play (looped) by default (to enable XR Movies)

    Unless an object has multiple animations (actions e.g.); then just play the first one, or the one named default.

    Trimsheet textures

     21st October 2025 at 5:25pm
    How to get things done without endlessly adding geometry/textures which bloat your 3D file?

    Answer: trim sheets

    🧩 What’s a Trim Sheet?

    A trim sheet is a single texture packed with reusable material details — panels, bolts, edges, decals, etc. You UV-map parts of your model onto sections of this sheet, instead of giving every object its own texture.

    think of it as a stylesheet

    ⚡ Superproductivity Benefits

    1. One Texture, Endless Variety

    • Reuse trims across models for walls, pipes, floors, props.
    • Mix different UV regions for quick visual variety.
    • Perfect for modular environment design.

    2. Fast & Consistent

    • Edit one sheet → update all assets instantly.
    • Consistent texel density and material look across your whole scene.
    • Less time baking or re-exporting textures.

    📦 Tiny File Size Advantages

    • Fewer materials = fewer draw calls.
    • Shared texture = better batching and rendering performance.
    read Why small file-size matters in XR

    🧠 Simple Blender Workflow

    1. Create your trimsheet in GIMP (put textures together into 1 file)

    2. Import your trim sheet.

    3. UV-map meshes to the desired trims.

    4. Align borders in the UV Editor.

    5. Done — detailed results from one small texture!


    ✅ TL;DR

    | Benefit     | Why It Matters                 |
    |-------------|--------------------------------|
    | Speed       | Texture fast, reuse everything |
    | Consistency | Shared style across assets     |
    | Efficiency  | Small files, fast rendering    |

    Free blender addons

    trimsheet.svg

     

    Unlit textures

     8th October 2025 at 12:17pm
    Got a light? Nope.. too expensive (but I've got a texture for you)

    The above scene results in a 9 kilobytes 3D file-size [ .glb .blend ] using a texture as color-lookup table.

    With unlit materials we achieve:

    • a gradient horizon mapped onto a sphere (orange area)
    • a plane with an (unlit) checkers pattern
    • an unlit cube
    Read more about saving unlit materials into glTF 3D files for Blender here

    To render all objects in our 3D file, we don't need to introduce more resource-heavy lights.
    Instead we use a single unlit texture as a color lookup table (via UV-mapping).

    Utilizing unlit materials for distant or intrinsically bright objects in Extended Reality (XR) is a critical optimization strategy because it drastically reduces the computational burden on resource-constrained mobile VR/AR devices.

    Unlike lit materials, which require the GPU to perform expensive, real-time lighting calculations (determining light direction, shadows, specularity, and reflections for every pixel of every mesh under the influence of every light), an unlit shader renders the object using only the texture's raw color (or its emissive color), bypassing these complex and time-consuming processes entirely.

    This is especially advantageous for far-off background elements, which lose visual detail at a distance, and for objects that are meant to *be* the light source (like the sun, stars, or a neon sign), where lighting calculations are redundant or undesirable.

    Introducing multiple real-time light sources is highly detrimental to performance in XR, as the rendering cost often multiplies with each light affecting a surface, potentially causing the scene to drop below the minimum required frame rate and leading to user discomfort or motion sickness ("jank").

    unlit.jpg

     

    URI templates (reactivity)

     6th September 2025 at 9:59am

    Fragments can be dynamic thanks to URI Templating (RFC6570).
    This allows for dynamic and reactive fragments in src and href.

    NOTE 1: the domain+path of a URL cannot be modified

    NOTE 2: src, href and tag object metadata is mutable; default metadata (#) however is not.

    Here are some examples:

    dynamic teleports (escape-room)

    
     foo.usdz                                            
        │   
        ├── ◻ level2
        │           
        └── ◻ level1
            |
            ├── ◻ secretbutton 
            │      └ href: #nextlevel=level2
            │                                                 
            └── ◻ exitdoor
                   └ href: #pos={nextlevel}
    
    
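The `{nextlevel}` substitution in the escape-room example can be sketched as (hypothetical function name; only RFC6570 level-1 simple string expansion is assumed):

```javascript
// Expand {var} placeholders in an href/src value using variables
// previously set via fragments (e.g. #nextlevel=level2).
function expandTemplate(value, vars) {
  return value.replace(/\{([^}]+)\}/g, (_, name) => vars[name] ?? '');
}
```

So after clicking `secretbutton`, the exitdoor's `#pos={nextlevel}` expands to `#pos=level2`.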

    a simple videoplayer

    
     foo.usdz                                            
        │                                                 
        │                                                 
        ├── ◻ stopbutton 
        │      ├ #:    #-stopbutton
        │      └ href: #player=stop&-stopbutton  (stop and hide stop-button)
        │                                                 
        └── ◻ plane                        
               ├ play: #t=l:0,10
               ├ stop: #t=0,0
               ├ href: #player=play&stopbutton   (play and show stop-button)
               └ src:  cat.mp4#{player}
    
    
    

    urls.svg

     22nd September 2025 at 4:33pm

    using shaders

     2nd September 2025 at 4:30pm

    Shaders can be applied to meshes by adding src metadata, which supports sidecar loading of fragment/vertex shaderfiles.
    The following fileformats are encouraged:

    • GLSL (.frag/.vert are automatically sidecar loaded)
    • ISF
    At the moment GLSL is supported in the XR Fragment demo-viewer:

    src: https://foo.com/my.frag

    this will sidecar-load https://foo.com/my.vert (if it exists)

    Uniforms

    As per the XR Fragment spec, these can be modified using the u:<name> fragment, for example:

    * src: https://foo.com/my.frag#u:speed=0.2,0.4

    these can be manipulated via href-clicks and URI Fragment Templates:
    • src: https://foo.com/my.frag#{uspeed}
    • uspeed_slow: 0.2,0.4
    • uspeed_fast: 0.2,0.4
    • href: xrf://uspeed=uspeed_slow
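Parsing a `u:<name>` fragment can be sketched as (hypothetical function name, not part of the spec):

```javascript
// Parse a uniform fragment like #u:speed=0.2,0.4 into { name, values },
// ready to be applied to the shader's uniforms.
function parseUniform(fragment) {
  const m = fragment.replace(/^#/, '').match(/^u:([^=]+)=(.+)$/);
  if (!m) return null;
  return { name: m[1], values: m[2].split(',').map(Number) };
}
```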

    uv and texture scrolling

     2nd September 2025 at 4:30pm
    An easy nocode way to add metadata is by adding custom properties (in Blender e.g.). Basically:



    Create a plane or box-object with a texture-material, and add the following metadata:

    • #:#uv=0,0,0.1,0.1
    Profit! This will position the uv-coords initially at 0,0 and scroll 0.1 in the u and v direction.

    Read more about #uv

    NOTE: combine it with Reactivity / URI templating if you want the user to control/change presets.
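Parsing the `#uv=` value can be sketched as (hypothetical names, not part of the spec; the scroll deltas are returned as-is, how they are applied over time is viewer-specific):

```javascript
// Parse #uv=0,0,0.1,0.1 into an initial uv-offset and scroll deltas.
function parseUV(fragment) {
  const m = fragment.replace(/^#/, '').match(/^uv=(.+)$/);
  if (!m) return null;
  const [u, v, du, dv] = m[1].split(',').map(Number);
  return { offset: [u, v], scroll: [du || 0, dv || 0] };
}
```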

    UV mirroring

     21st October 2025 at 5:25pm
    How to save up to 50% of your textures?

    Answer: by using mirrored uv-mapping

    read more on Trimsheet textures here

    🧩 What are mirrored UV's?

    Mirroring UV coordinates lets you reuse the same texture space for symmetrical parts of a model — perfect for quads or triangles that share mirrored geometry.

    This effectively halves your texture usage, saving up to 50% of file size while keeping visual detail identical on both sides.

    Beyond optimization, mirrored and tiled UVs can be arranged to create kaleidoscopic or fractal-like patterns, producing complex, symmetrical designs from a single small texture — ideal for stylized or modular environments in Blender.

    Free blender addons

    UVMAP.png

     

    uvmirror.svg

     

    vector

     27th April 2023 at 10:53pm

    comma-separated coordinates, which after parsing can be accessed using .x, .y, .z etc.

    | type    | example              |
    |---------|----------------------|
    | vector2 | 1.2,3 or 0.4,0.6     |
    | vector3 | 1.2,3,4 or 0.4,0.6,5 |

    here are some interactive examples:
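A sketch (hypothetical function name, not part of the spec) of parsing such vectors:

```javascript
// Parse a comma-separated vector value into an object with .x/.y(/.z).
function parseVector(value) {
  const [x, y, z] = value.split(',').map(Number);
  return z === undefined ? { x, y } : { x, y, z };
}
```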

    
    

    vertical_fog.svg

     

    WebVTT subtitles

     24th March 2026 at 2:50pm

    Guided & interactive XR navigation is possible when using href in WebVTT subtitle-files.

    Potential

    • guided XR tours across multiple files/URLs
    • timeline for spawnpoints
    • rich storytelling / e-learning

    Example using sidecar files:

    myscene.xrf.glb
    myscene.xrf.vtt
    myscene.xrf.ogg
    

    Since .xrf. (or at least one href) is a heuristic for loading sidecar files, the WebVTT subtitles (myscene.xrf.vtt) can be displayed while the experience plays (synchronized with myscene.xrf.ogg):

    WEBVTT
    
    00:01.000 --> 00:04.000 href:#spawn
    <v narrator voice>welcome to a special experience.
    one
    two
    three
    
    00:06.000 --> 00:12.000 
    Let me take you to the divine..
    
    00:13.000 --> 00:14.000 href:#fadeAudioOut&inside
    Here we are 
    
    00:14.000 --> 00:19.000
    Now lets open a portal 
    to a remote location 
    
    00:19.000 --> 00:25.000 href:https://xrfragment.org
    That portal will take us 
    to https://xrfragment.org
    Just click it..
    
    00:25.100 --> 00:30.000 
    Welcome to another Janus URL
    

    Here href is used as a CUE setting, which the player can act upon.
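Extracting that CUE setting can be sketched as (hypothetical function name, not part of the spec):

```javascript
// Extract the href CUE setting from a WebVTT timing line such as
// "00:01.000 --> 00:04.000 href:#spawn". Returns null if absent.
function cueHref(timingLine) {
  const m = timingLine.match(/href:(\S+)/);
  return m ? m[1] : null;
}
```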

    WebXR

     27th April 2023 at 12:34pm

    Why small file-size matters

     8th October 2025 at 12:08pm

    See the example files or XRForge for example assets.

    importance of optimizing 3D file size

    It's all about the future of immersive environments.

    Do you want your experience to run fast or with hiccups?

    Golden rule

    To guarantee a smooth XR ride, remember: ''small optimized 3D files'' prevent motion sickness.

    A lot of money has been poured into XR experiences with improper use of 360 imagery, or unoptimized (WebXR) experiences for standalone VR headsets, resulting in motion sickness. Create small optimized 3D files instead, compatible with XR Fragments, to make your efforts worthwhile.

    The primary reason

    The primary reason 3D assets, specifically models and textures, must remain relatively small lies in the necessity for rapid data delivery and memory efficiency.

    Low-polygon models

    Low-polygon models and small, compressed texture sizes directly translate to smaller file sizes, which drastically reduces the amount of data that needs to be downloaded, streamed, and loaded into system memory (RAM and VRAM). This is critical for achieving the 'fast-loading' virtual worlds of the future, particularly those accessed via mobile devices or slower connections, or those employing continuous procedural loading (like large open-world games). By minimizing the initial data transfer and the subsequent memory footprint, developers ensure users can achieve:

    immediate immersion

    Reduced latency during zone transitions and saved memory resources prevent system bottlenecks before rendering even begins.

    Essential

    Furthermore, keeping polygon counts low and texture resolutions manageable is essential for maintaining stable, high frame rates and preventing distracting 'framedrops' during real-time rendering. Every vertex and every texture pixel contributes to the GPU's workload: high-poly models require significantly more processing in the geometry pipeline, while large textures demand more memory bandwidth and computational power for sampling and fragment shading. In a complex virtual world with numerous concurrent users and dynamic objects, this computational burden multiplies rapidly. Adopting a strict low-poly approach and using efficient texture atlases ensures that the GPU can consistently render the scene within a tight millisecond budget, guaranteeing a smooth visual experience and enabling the scalable, fluid performance required for competitive gaming, collaborative work, and large-scale social virtual environments.

    XR Fragments

     17th March 2026 at 4:29pm

    Hyperlink the 3D world
    The 3D deeplinking standard for the deep immersive web.
    Turn 3D files into local-first, interactive, accessible XR movies, E-learnings & 3D websites.

    How? By using URLs with spawnpoints.



    Empower existing 3D fileformats like glTF, USDZ, OBJ and COLLADA, which are used in websites, game engines, and 3D editors.
    XR Fragments makes 3D files interactive
    via URLs, using any protocol: not necessarily HTTP, but also IPFS, hypercore, or webtorrent e.g. This enables spatial interactions like browser-navigation, teleportation, importing scenes, and spatial hypermedia, allowing useful audiovisual immersive experiences (e-learnings, quizzes, realtime-rendered 3D movies, audiovisual storytelling) via 3D metadata, so-called 'extras' embedded in 3D files ('custom properties' in Blender). It also promotes URIs and local-first data, which lives locally and ideally only syncs/shares elsewhere via open, user-operated internet protocols.

    ~10 mins podcast introduction

    Avoid cloud lock-in, by making your 3D experiences portable to outlast current technologies.

    website.glb#scene1
    Try 3D file



    #spatialweb #openinternet #interoperable #accessibility #3Dhypermedia
    Join Matrix Community

    🎨 no-code design-first
    🏄 surf 3D scenes in AR/VR
    📎 embeddable
    🤝 interoperable
    ⛔ network-agnostic, local-first
    💾 compatible with glTF FBX USDZ OBJ and more
    🔮 99% compatible with future fileformats
    🌱 friendly to opensource & corporations
    ❤️ no fileformat or editor lock-in
    🧑‍🌾 solo-user read-only 3D content

    Made for 3D designers



    See How it works

    TLDR

    The TLDR of processing 3D files with XR Fragments [pseudocode]:
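    The pseudocode block itself is missing here; the sketch below is a hypothetical reconstruction (not taken from the spec) of the gist: load the 3D file referenced by the URL, then split the `&`-separated fragment into directives (key=value pairs like `t=1`, or bare object selectors like `scene1` or `-cube`) and apply them on scene-load:

```python
from urllib.parse import urlparse

def parse_xr_fragments(url):
    """Hypothetical sketch: split an XR Fragments URL into the 3D file URL
    and a dict of fragment directives to execute on scene-load."""
    parsed = urlparse(url)
    directives = {}
    if parsed.fragment:
        for entry in parsed.fragment.split("&"):
            key, _, value = entry.partition("=")
            directives[key] = value  # bare selectors (scene1, -cube) get ""
    return parsed._replace(fragment="").geturl(), directives

file_url, frags = parse_xr_fragments("https://foo.com/website.glb#scene1&t=1")
# file_url == "https://foo.com/website.glb", frags == {"scene1": "", "t": "1"}
```

    A real viewer would then fetch `file_url`, walk the scene's custom properties for `href`/`src` metadata, and feed `frags` to its hashbus.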


    Virtual worlds without lock-in

    Scale beyond companies, appstores, network protocols and file-formats:


    Virtual worlds connected via URLs


    XR Fragments is a spec to link 3D models into a basic distributed interactive XR experience.
    Think of it as bundling virtual worlds into a spatial book.

    Progressive enhancement

    Use engine prefixes to hint specific game-engine features from within your 3D file.

    XR Fragments empowers designers to embed engine-hints, simple interactions & navigation inside a 3D file.
    This no longer requires developers to implement trivial interactive stuff.
    It promotes design-first, secure, durable and interoperable XR experiences from 3D models, basically 3D hypermedia, mitigating handcoded-XR-apps-as-3D-content-burial-sites.

    Getting Started

    Just get your hands on a 3D editor and follow the steps in the video:



    Check How it works, or view a demo.glb scene right now, or see the menu in the left corner for more.

    Presentation




    XR Movies

     4th September 2025 at 12:58pm

    The viewer should ideally present a play-button when:

    • at least one animation-data item is defined in the 3D file
    • and/or a timeline sidecar-file (soundtrack or subtitle) is detected

    See complementary file for detection of sidecar-files, for enhanced accessibility via WebVTT subtitles, thumbnails, or a soundtrack e.g.

    XR Movies & books

     16th March 2026 at 3:27pm

    Soundtrack, subtitles, thumbnail

    XR Movies anyone?

    Simple: just add those files as sidecar-files and name them accordingly (mymovie.xrf.ogg next to mymovie.xrf.glb e.g.)
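    A minimal sketch of this naming convention, assuming a viewer that probes for sidecars by swapping only the last extension (the extension list here is illustrative, not normative):

```python
from pathlib import Path

# Illustrative sidecar extensions: soundtrack, subtitles, thumbnail.
SIDECAR_EXTENSIONS = [".ogg", ".vtt", ".jpg"]

def find_sidecars(model_path):
    """Hypothetical sketch: mymovie.xrf.glb -> probe mymovie.xrf.ogg etc."""
    model = Path(model_path)
    stem = model.name[: -len(model.suffix)]  # strip only the last extension
    candidates = [model.with_name(stem + ext) for ext in SIDECAR_EXTENSIONS]
    return [p for p in candidates if p.exists()]
```

    Note that only the final extension is replaced, so the `.xrf` marker survives and the sidecars stay visibly paired with their movie.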

    Local-first

    While level1 is the core of the XR Fragments spec (spawnpoints + href-links):

    level0 completes the spec for more local usecases, especially:

    1. decorating/detecting a 3D file with extra files
    2. credible exit between XR platforms

    XRF microformat

     13th November 2025 at 3:59pm

    How can applications discover 3D experiences on a network?

    Answer: spatial microformats

    The XRF microformat is an optional set of text heuristics which applications can detect across various usecases.

    via HTML webpage

    If the browser/application requests a webpage (https://nlnet.nl e.g.) it should check for the spatial-entrypoint microformat:

    <link rel="alternate" as="spatial-entrypoint" href="scene.xrf.glb">
    

    This way the application loads https://nlnet.nl/scene.xrf.glb when the user types nlnet.nl into the URL bar.
    Optionally, type can be specified for dynamically generated 3D files:

    <link rel="alternate" as="spatial-entrypoint" href="https://worlds.org/scene.php#platformB" type="model/gltf+binary" />
    

    The type-attribute is for fallback-purposes.
    Viewer-supported 3D file-extensions (.glb e.g.) will ALWAYS take precedence over the (non)presence of the type attribute.
    The reason is that platforms (Mastodon 'labels' e.g.) don't allow specifying type-attributes.
    Another reason is that XR Fragments is filetype-agnostic, so flexibility is expected on the viewer-side.

    NOTE: in case of multiple 3D files mentioned in <link> elements, only the first (supported) 3D filetype will be chosen.

    Example of multiple spatial microformats:

    <link rel="alternate" as="spatial-entrypoint" href="scene.xrf.glb"/>
    <link rel="me" href="myavatar.vrm"/>
    <!-- JanusXR microformat https://github.com/jbaicoianu/janusweb
       <FireBoxRoom>
          <Assets>
            <assetobject id="experience" src="scene.xrf.glb"/>
          </Assets>
          <Room>
            <object pos="0 0 0" collision_id="experience" id="experience" />
          </Room>
       </FireBoxRoom>
    -->
    

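    The selection rule above (first supported extension wins, type-attribute only as fallback) could be sketched as follows; the regex-based scanning and the extension list are assumptions for illustration, not part of the spec:

```python
import re

# Illustrative list of viewer-supported 3D extensions.
SUPPORTED = (".glb", ".gltf", ".usdz", ".obj")

def spatial_entrypoint(html):
    """Hypothetical sketch: return the first <link> href with a
    viewer-supported 3D extension, per the precedence rule above."""
    for match in re.finditer(r'<link\b[^>]*href="([^"]+)"', html, re.I):
        href = match.group(1)
        if href.lower().endswith(SUPPORTED):
            return href  # first supported 3D filetype wins
    return None
```

    In the multi-microformat example above, `myavatar.vrm` would be skipped by a viewer that doesn't support VRM, and `scene.xrf.glb` chosen.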
    via WebFinger

    When John has an account on foo.com, how can other applications request his 3D homepage by simply entering john@foo.com?

    Answer: it can be requested at https://foo.com/.well-known/webfinger?resource=acct:john@foo.com, resulting in:

    {
      "subject": "acct:john@foo.com",
      "aliases": [
        "https://mastodon.example/social/john",
        "https://john.foo.com",
        "https://3d.john.foo.com/model/scene.glb"
      ],
      "properties": {
        "http://schema.org/name": "John Doe",
        "http://schema.org/description": "Developer, 3D Enthusiast, and Social Explorer"
      },
      "links": [
        {
          "rel": "http://ostatus.org/schema/1.0/subscribe",
          "template": "https://mastodon.example/social/john/{uri}"
        },
        {
          "rel": "self",
          "type": "text/html",
          "href": "https://john.foo.com"
        },
        {
          "rel": "me",
          "type": "text/html",
          "href": "https://john.foo.com"
        },
      {
          "rel": "me",
          "type": "model/gltf+binary",
          "href": "https://3d.john.foo.com/model/avatar.vrm"
        },
        {
          "rel": "scene",
          "type": "model/gltf+binary",
          "href": "https://3d.john.foo.com/model/scene.xrf.glb"
        }
      ]
    }
    

    This way the application will load https://3d.john.foo.com/model/scene.xrf.glb when the user types john@foo.com into the user field.
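    A sketch of that WebFinger lookup, assuming the viewer prefers a rel="scene" link over a rel="me" model link (the `fetch` parameter and the rel-precedence are illustrative assumptions):

```python
import json
from urllib.request import urlopen  # real apps would also handle errors

def scene_for_account(account, fetch=None):
    """Hypothetical sketch: resolve john@foo.com to a 3D scene URL
    via the WebFinger document shown above."""
    user, _, host = account.partition("@")
    url = f"https://{host}/.well-known/webfinger?resource=acct:{account}"
    raw = fetch(url) if fetch else urlopen(url).read()
    doc = json.loads(raw)
    # Illustrative precedence: a dedicated scene link beats a model self-link.
    for rel in ("scene", "me"):
        for link in doc.get("links", []):
            if link.get("rel") == rel and link.get("type", "").startswith("model/"):
                return link["href"]
    return None
```

    Run against the example document above, this yields the rel="scene" href rather than the VRM avatar.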

    via Text (URI)

    Another way for an application to trigger loading a 3D scene is by detecting URIs of 3D scene-files in any text:

    • foo.glb (or any other popular 3D extension)
    • https://foo.com/scene.glb (or any other popular protocol)

    This way, the application can highlight the link whenever it detects the URI (in a text-file or a text-section of a 3D model).
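    Such detection could be done with a simple pattern match; the extension list below is an illustrative assumption, not an exhaustive one:

```python
import re

# Illustrative set of popular 3D file extensions.
EXTS = r"(?:glb|gltf|usdz|obj|fbx)"
URI_3D = re.compile(rf"\b\S+\.{EXTS}(?:#\S*)?\b", re.I)

def find_3d_uris(text):
    """Hypothetical sketch: return 3D scene URIs found in arbitrary text,
    matching both bare filenames (foo.glb) and full URLs."""
    return URI_3D.findall(text)
```

    For example, both `foo.glb` and `https://foo.com/scene.glb` in a sentence would be picked up and could then be rendered as clickable links.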

    xrf: URI scheme

     26th September 2025 at 8:45pm

    Prefixing href-values with xrf: will prevent level2 href-values from changing the top-level URL.

    Usecase: non-shareable URLs like href: xrf:#t=4,5 (to display a stateful message e.g.).

    Reason: XR Fragments is inspired by HTML's href-attribute, which does various things:

    1. it updates the browser-location
    2. it makes something clickable
    3. it jumps to another document / elsewhere in the same document
    4. and more

    The xrf: scheme will just do 2 & 3 (so the URL-values will not leak into the top-level URL).

    compliance with RFC 3986

    • unimplemented/unknown URI schemes (xrf:... e.g.) will not update the top-level URL
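    The behaviour described above can be sketched as a tiny dispatcher; the `browser` object and its two methods are hypothetical stand-ins for a viewer's navigation layer:

```python
def follow_href(href, browser):
    """Hypothetical sketch: xrf:-prefixed hrefs stay clickable and navigable
    (points 2 & 3), but never leak into the top-level URL (point 1)."""
    if href.startswith("xrf:"):
        # unknown/locally-implemented scheme: do NOT update the top-level URL
        browser.navigate(href[len("xrf:"):])
    else:
        browser.update_location(href)  # shareable: updates the URL bar
        browser.navigate(href)
```

    So clicking `xrf:#t=4,5` plays the timeline segment without producing a shareable (and here meaningless) top-level URL.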

    xrfragment.jpg

     6th September 2025 at 12:03pm

    xrfsweetspot.jpg