diff --git a/index.html b/index.html index 4c18b7e..490f7ca 100644 --- a/index.html +++ b/index.html @@ -1391,7 +1391,7 @@ button.sidebar-toggle{ {"created":"20251006100256067","text":"> What 3D file-format should I save my XR Fragments-compatible experience to?\n\nWhile the XR Fragments spec is fileformat-agnostic, it's recommended to save to one of the following formats (via your [3D editor](#Edit%20a%203D%20scene%20file)):\n\n* [glTF](https://www.khronos.org/gltf/) (*)\n* [usdz](https://openusd.org/release/spec_usdz.html)\n* [X3D](https://en.wikipedia.org/wiki/X3D)\n* [obj](https://en.wikipedia.org/wiki/Wavefront_.obj_file)\n* [collada](https://www.khronos.org/collada)\n* [THREE.json](https://github.com/mrdoob/three.js/wiki/JSON-Object-Scene-format-4)\n\n\\* = export to `.glb` for a small, compressed file.\n\n> For thumbnails (see [[sidecar files]]) the spec uses [[PNG images|https://en.wikipedia.org/wiki/PNG]]\n\n## Textures\n\n[3D editors](#Edit%20a%203D%20scene%20file) allow adding textures to objects.\nIt's recommended to use:\n\n* [JPEG images](https://en.wikipedia.org/wiki/JPEG) for opaque images\n* [PNG images](https://en.wikipedia.org/wiki/PNG) for transparent images\n\n> **Reason:** JPG usually results in smaller file sizes. 
However, there are excellent PNG compressors available (like [tinypng](https://tinypng.com)), and saving a PNG in [GIMP](https://gimp.org) in **Indexed mode** (Image>Mode>Indexed) also reduces its size.","tags":"[[Best practices]]","title":"File formats","modified":"20251006131015252","type":"text/markdown"}, {"created":"20240619105321821","text":"3D Objects inside a 3D model can be referenced/shown/hidden via URI filters:\n\u003Cbr>\n\n\u003Cimg src=\"https://coderofsalvation.codeberg.page/xrfragment.media/images/filters.gif\" style=\"width:100%\"/>\n\nThis allows high re-usability of 3D models for remote-, local- and recursive (embedded `src`) use cases:\n\u003Cbr>\u003Cbr>\n\n\u003Cpre>\n\u003Ccode>\n my.io/scene.usdz Embeddable as:\n +─────────────────────────────+\n │ sky │ src: http://my.io/scene.usdz#sky (includes building,mainobject,floor)\n │ +─────────────────────────+ │ \n │ │ building │ │ src: http://my.io/scene.usdz#building (includes mainobject,floor)\n │ │ +─────────────────────+ │ │\n │ │ │ mainobject │ │ │ src: http://my.io/scene.usdz#mainobject (includes floor)\n │ │ │ +─────────────────+ │ │ │\n │ │ │ │ floor │ │ │ │ src: http://my.io/scene.usdz#floor (just floor object)\n │ │ │ │ │ │ │ │\n │ │ │ +─────────────────+ │ │ │ href: http://my.io/scene.usdz#-mainobject (hides mainobject when clicked)\n │ │ +─────────────────────+ │ │\n │ +─────────────────────────+ │\n +─────────────────────────────+\n\u003C/code>\n\u003C/pre>\n\nThe [[href]] and [[src]] documentation show various examples, but the full syntax is explained in the spec below.\u003Cbr>\nOn top of that, [[tagged objects]] allow using `tag` metadata to group objects and trigger grouped features.\n\n\u003Ch2>What does \"&-interactions*\" do in the demo scene?\u003C/h2>\n\nThe scene-node (3D root) of the [[demo scene|example/assets/index.glb]] indeed contains (startup) [[#]] metadata (`#pos=start&rot=0,40,0&t=0&-interactions*`).\n\u003Cbr>\nIt's hiding all 3D objects (and their children) which are tagged with 
'interactions'.\u003Cbr>\nFor example: you can see all the menu-items in Blender, but not in the browser.\u003Cbr>\n\n* `&` is just a separator ('AND do the following:')\n* `-` means 'hide'\n* `interactions` selects all objects named 'interactions' or carrying `tag: interactions` metadata\n* `*` also selects all objects inside those selected objects (text-objects etc.)\n\n> For more on syntax see the spec below\n\n\u003Cbr>\u003Cbr>\n\u003Ciframe src=\"doc/RFC_XR_Fragments.html#xr-fragment-filters\" frameborder=\"0\" class=\"spec\">\u003C/iframe>\n\nFragment identifiers are derived from \u003Cb>metadata\u003C/b> inside the loaded 3D Model.\u003Cbr>More specifically: \u003Cb>object-\u003C/b>, \u003Cb>material-\u003C/b>, and \u003Cb>camera-\u003C/b>names, via a strategy called 'Fragment-to-metadata mapping':\n\n\u003Cbr>\u003Cbr>\n\u003Ciframe src=\"doc/RFC_XR_Fragments.html#fragment-to-metadata-mapping\" frameborder=\"0\" class=\"spec\">\u003C/iframe>\n\n","tags":"[[🧪 experimental]]","title":"filters","modified":"20250902143004749"}, {"created":"20251006115535367","text":"> How to define disappearing objects above/below/far away from you ''in a 3D file''?\n\n''TIP:'' use transparent materials for fog\n\n\u003Cbr>\n\u003Ccenter>\n\u003Cvideo width=\"100%\" style=\"max-width:500px\" controls src=\"https://files.mastodon.online/media_attachments/files/115/299/267/585/663/936/original/d65107f3667189a6.mp4\"/>\n\u003Cbr>\n\u003Cbr>\n\u003Ca target=\"_blank\" href=\"https://coderofsalvation.codeberg.page/xrfragment-haxe/example/aframe/sandbox/?https://copyparty.benetou.fr/webxr-neonjungle-gltf/space_fogtronic.glb\" style=\"padding:20px; border-radius:10px; background:#A0F; color:white;text-decoration:none\">View WebXR demo\u003C/a>\n\u003C/center>\n\u003Cbr>\n\n> ''Fact 1:'' Fog shaders \u003Cb>are not\u003C/b> always necessary for atmospheric effects.\n\n> ''Fact 2:'' Use geometry for near-sight, and images/sprites for far-sight backgrounds (humans cannot 
perceive depth at far distances).\n\nThis is because you can often achieve a great sense of depth and visual separation by using a large, partially transparent cylinder or sphere textured with a ''semi-transparent color'' or vertical ''gradient.png'' texture.\u003Cbr>\nThis will act as a soft volumetric divider between an inner and outer scene.\n\n!! Horizontal fog\n\nBy dividing the scene depth into several semi-transparent 'filters', a sense of horizontal depth can be achieved:\n\n[img[horizontal_fog.svg]]\n\n> ''fact:'' perceiving fog depends on the objects placed at various distances (3 layers of fog-perception are enough), not on how discrete the fog is.\n\n!! Vertical fog\n\n> See [[Pixel- and gradient-maps]] for info on UV-mapping.\n\n[img[vertical_fog.svg]]\n\n> ''important'': the gradient texture should be transparent (at least in the middle), so the objects can be seen clearly.\n\n!!!! gradient.png\n\n\u003Cdiv id=\"texture\">\u003C/div>\n\n> the texture can be 16x16 pixels (no need for big textures). 
This allows for a variety of gradientcolors & experimentation.","tags":"[[Best practices]]","title":"Fog materials","modified":"20251007164236479","type":"text/vnd.tiddlywiki"}, -{"created":"20230808113746326","text":"Just get your hands on a 3D editor (see this [[🖥 Blender ✅🔥]] guide) and follow the steps in the video:\n\u003Cbr>\u003Cbr>\n\u003Cdiv style=\"max-width:600px\">\n\u003C$videojs controls=\"controls\" aspectratio=\"16:9\" preload=\"auto\" poster=\"\" fluid=\"fluid\" class=\"vjs-big-play-centered\">\n \u003Csource src=\"https://coderofsalvation.codeberg.page/xrfragment.media/gettingstarted2024.mp4\" type=\"video/mp4\"/>\n\u003C/$videojs>\n\u003C/div>\n\n\n\u003Ccenter>\n \u003Ca class=\"btn\" href=\"https://matrix.to/#/#xrfragments:matrix.org\" target=\"_blank\" style=\"padding:10px 30px\">Join Matrix Community\u003C/a>\n\u003C/center>\n\nHere are various ways to create/test 3D files with XR Fragments:\n\n| ''scenario'' | ''how'' | ''notes'' |\n| easiest | see the [[🖥 Blender ✅🔥]] workflow below, by loading a `.glb` 3D file into any \u003Ca href=\"https://coderofsalvation.codeberg.page/xrfragment-haxe/example/aframe/sandbox/?./../../assets/website.glb\" target=\"_blank\">demo\u003C/a> on xrfragment.org | export 3D file (.glb) in \u003Ca href=\"https://blender.org\" target=\"_blank\">Blender\u003C/a>, after adding a [[href]] \u003Cb>metadata\u003C/b> as \u003Ca href=\"https://docs.blender.org/manual/en/2.79/data_system/custom_properties.html\" target=\"_blank\">custom properties\u003C/a>, and load exported files into \u003Ca href=\"https://coderofsalvation.codeberg.page/xrfragment-haxe/example/aframe/sandbox/?./../../assets/website.glb\" target=\"_blank\">any demo\u003C/a> (see video above)|\n\n\u003Cbr>\n\u003Cdiv style=\"max-width:600px\">\n\u003C$videojs controls=\"controls\" aspectratio=\"16:9\" preload=\"auto\" poster=\"\" fluid=\"fluid\" class=\"vjs-big-play-centered\">\n \u003Csource 
src=\"https://coderofsalvation.codeberg.page/xrfragment.media/loading.mp4\" type=\"video/mp4\"/>\n\u003C/$videojs>\n\u003C/div>\n\n\u003Ch2>Developers\u003C/h2>\n\nFor developers wanting to integrate or build your own 3D hypermedia browser, the easiest is WebXR:\n\n\u003Ca href=\"https://coderofsalvation.codeberg.page/xrfragment-haxe/example/aframe/sandbox/?./../../assets/website.glb\" target=\"_blank\">» View \u003Cb>website.glb\u003C/b> online\u003C/a> or \u003Ca href=\"https://coderofsalvation.codeberg.page/xrfragment-haxe/example/aframe/sandbox/?./../../assets/website.glb\" target=\"_blank\">download \u003Cb>website.glb\u003C/b> and open\u003C/a> it in \u003Ca href=\"https://blender.org\" target=\"_blank\">Blender\u003C/a>.\u003Cbr>\n(developers can extend a 3D model viewer here \u003Ca href=\"https://codepen.io/coderofsalvation/pen/yLwedvX\" target=\"_blank\">this codepen\u003C/a>)\n\u003Cbr>\u003Cbr>\n\nBut there are also other approaches, as XR Fragments is not tied to any XR-technology or fileformat:\n\n| ''scenario'' | ''how'' | ''notes'' |\n| dev #godot | load the \u003Ca href=\"#%F0%9F%A7%B0%20GODOT\">example project\u003C/a> | |\n| dev #threejs #github #modular | fork \u003Ca href=\"https://github.com/coderofsalvation/xrfragment-three-helloworld\">xfragment-three-helloworld\u003C/a> | requires javascript- and \u003Ca href=\"https://threejs.org\" target=\"_blank\">threejs\u003C/a> developer-knowledge |\n| dev #polyglot | use the [[XR Fragment parser|https://github.com/coderofsalvation/xrfragment/tree/main/dist]] | lowlevel approach, more suitable for other scenarios |\n| dev #spec #browser | implement [[the spec|📜 XR fragments]] yourself | the spec is simple: parse URL and iterate over a scene |\n| dev #aframe #github | hosted sandbox by \u003Ca href=\"https://github.com/coderofsalvation/xrfragment-helloworld\" target=\"_blank\">forking xrfragment-helloworld\u003C/a> | Basically #1 but it will be hosted for free at your own github URL |\n| dev #aframe 
#github #modular | fork \u003Ca href=\"https://github.com/coderofsalvation/xrfragment-aframe-helloworld\">xfragment-aframe-helloworld\u003C/a> | requires javascript- and \u003Ca href=\"https://aframe.io\" target=\"_blank\">aframe.io\u003C/a> developer-knowledge |\n\nNext to that, familiarize yourself with XR Fragments by checking these videos: \n\n1. \u003Ca href=\"https://github.com/coderofsalvation/xrfragment.media\" target=\"_blank\">All videos on github\u003C/a> (tip: star the repo)\u003Cbr>\n2. \u003Ca href=\"https://www.youtube.com/playlist?list=PLctjJGlTmeE64XPSQER2BSbjmqVGaWM4J\" target=\"_blank\">All videos on Youtube\u003C/a> (tip: subscribe or add to 'Watch-later' list)","tags":"Home","title":"Getting started","modified":"20250926175054497","type":"text/vnd.tiddlywiki","list-before":"Philosophy & FAQ"}, +{"created":"20230808113746326","text":"Just get your hands on a 3D editor (see this [[🖥 Blender ✅🔥]] guide) and follow the steps in the video:\n\u003Cbr>\u003Cbr>\n\u003Cdiv style=\"max-width:600px\">\n\u003C$videojs controls=\"controls\" aspectratio=\"16:9\" preload=\"auto\" poster=\"\" fluid=\"fluid\" class=\"vjs-big-play-centered\">\n \u003Csource src=\"https://coderofsalvation.codeberg.page/xrfragment.media/gettingstarted2024.mp4\" type=\"video/mp4\"/>\n\u003C/$videojs>\n\u003C/div>\n\n\n\u003Ccenter>\n \u003Ca class=\"btn\" href=\"https://matrix.to/#/#xrfragments:matrix.org\" target=\"_blank\" style=\"padding:10px 30px\">Join Matrix Community\u003C/a>\n\u003C/center>\n\nHere are various ways to create/test 3D files with XR Fragments:\n\n| ''scenario'' | ''how'' | ''notes'' |\n| easiest | see the [[🖥 Blender ✅🔥]] workflow below, by loading a `.glb` 3D file into any \u003Ca href=\"https://coderofsalvation.codeberg.page/xrfragment-haxe/example/aframe/sandbox/?./../../assets/website.glb\" target=\"_blank\">demo\u003C/a> on xrfragment.org | export 3D file (.glb) in \u003Ca href=\"https://blender.org\" target=\"_blank\">Blender\u003C/a>, after adding a 
[[href]] \u003Cb>metadata\u003C/b> as \u003Ca href=\"https://docs.blender.org/manual/en/2.79/data_system/custom_properties.html\" target=\"_blank\">custom properties\u003C/a>, and load exported files into \u003Ca href=\"https://coderofsalvation.codeberg.page/xrfragment-haxe/example/aframe/sandbox/?./../../assets/website.glb\" target=\"_blank\">any demo\u003C/a> (see video above)|\n\n\u003Cbr>\n\u003Cdiv style=\"max-width:600px\">\n\u003C$videojs controls=\"controls\" aspectratio=\"16:9\" preload=\"auto\" poster=\"\" fluid=\"fluid\" class=\"vjs-big-play-centered\">\n \u003Csource src=\"https://coderofsalvation.codeberg.page/xrfragment.media/loading.mp4\" type=\"video/mp4\"/>\n\u003C/$videojs>\n\u003C/div>\n\n\u003Ch2>Developers\u003C/h2>\n\nFor developers wanting to integrate or build your own 3D hypermedia browser, the easiest is WebXR:\n\n\u003Ca href=\"https://coderofsalvation.codeberg.page/xrfragment-haxe/example/aframe/sandbox/?./../../assets/website.glb\" target=\"_blank\">» View \u003Cb>website.glb\u003C/b> online\u003C/a> or \u003Ca href=\"https://coderofsalvation.codeberg.page/xrfragment-haxe/example/aframe/sandbox/?./../../assets/website.glb\" target=\"_blank\">download \u003Cb>website.glb\u003C/b> and open\u003C/a> it in \u003Ca href=\"https://blender.org\" target=\"_blank\">Blender\u003C/a>.\u003Cbr>\n(developers can extend a 3D model viewer here \u003Ca href=\"https://codepen.io/coderofsalvation/pen/yLwedvX\" target=\"_blank\">this codepen\u003C/a>)\n\u003Cbr>\u003Cbr>\n\nBut there are also other approaches, as XR Fragments is not tied to any XR-technology or fileformat:\n\n| ''scenario'' | ''how'' | ''notes'' |\n| dev #godot | load the \u003Ca href=\"#%F0%9F%A7%B0%20GODOT\">example project\u003C/a> | |\n| dev #threejs #github #modular | fork \u003Ca href=\"https://github.com/coderofsalvation/xrfragment-three-helloworld\">xfragment-three-helloworld\u003C/a> | requires javascript- and \u003Ca href=\"https://threejs.org\" 
target=\"_blank\">threejs\u003C/a> developer-knowledge |\n| dev #polyglot | use the [[XR Fragment parser|https://github.com/coderofsalvation/xrfragment/tree/main/dist]] | low-level approach, more suitable for other scenarios |\n| dev #spec #browser | implement [[the spec|📜 XR fragments]] yourself | the spec is simple: parse the URL and iterate over a scene |\n| dev #aframe #github | hosted sandbox by \u003Ca href=\"https://github.com/coderofsalvation/xrfragment-helloworld\" target=\"_blank\">forking xrfragment-helloworld\u003C/a> | Basically #1, but it will be hosted for free at your own GitHub URL |\n| dev #aframe #github #modular | fork \u003Ca href=\"https://github.com/coderofsalvation/xrfragment-aframe-helloworld\">xrfragment-aframe-helloworld\u003C/a> | requires javascript- and \u003Ca href=\"https://aframe.io\" target=\"_blank\">aframe.io\u003C/a> developer-knowledge |\n\nNext to that, familiarize yourself with XR Fragments by checking these videos:\n\n1. \u003Ca href=\"https://github.com/coderofsalvation/xrfragment.media\" target=\"_blank\">All videos on GitHub\u003C/a> (tip: star the repo)\u003Cbr>\n2. 
\u003Ca href=\"https://www.youtube.com/playlist?list=PLctjJGlTmeE64XPSQER2BSbjmqVGaWM4J\" target=\"_blank\">All videos on Youtube\u003C/a> (tip: subscribe or add to 'Watch-later' list)","tags":"Home","title":"Getting started","modified":"20250926175054497","type":"text/vnd.tiddlywiki","list-before":"Permacomputing & FAQ"}, {"created":"20230425160210102","text":"\u003Cshader-doodle>\n \u003Csd-node name=\"motionblur\" prevbuffer>\n \u003Csd-node name=\"rotate\">\n \u003Csd-node name=\"basic_gl\">\n \u003Cscript type=\"x-shader/x-fragment\">\n void main() {\n vec2 st = gl_FragCoord.xy / u_resolution.xy;\n vec3 color = vec3(st.x, st.y, abs(sin(u_time)));\n\n gl_FragColor = vec4(color, 1.);\n }\n \u003C/script>\n \u003C/sd-node>\n \u003Cscript type=\"x-shader/x-fragment\">\n uniform sampler2D basic_gl;\n\n const float PI = 3.1415926;\n\n void main() {\n vec2 st = gl_FragCoord.xy / u_resolution.xy;\n\n float angle = 2. * PI * (.5 + .5 * cos(u_time));\n float scale = .7 + .4 * cos(u_time);\n\n mat2 rotation = mat2(cos(angle), -sin(angle), sin(angle), cos(angle));\n vec2 p = (st - vec2(.5)) * rotation / scale + vec2(.5);\n\n gl_FragColor = p.x \u003C 0. || p.x > 1. || p.y \u003C 0. || p.y > 1.\n ? 
vec4(0., 0., 0., 1.)\n : texture2D(basic_gl, p);\n }\n \u003C/script>\n \u003C/sd-node>\n \u003Cscript type=\"x-shader/x-fragment\">\n uniform sampler2D rotate, u_prevbuffer;\n\n void main () {\n vec2 st = gl_FragCoord.xy / u_resolution.xy;\n gl_FragColor = vec4(mix(\n texture2D(rotate, st),\n texture2D(u_prevbuffer, st),\n .8\n ).rgb, 1.);\n }\n \u003C/script>\n \u003C/sd-node>\n \u003Cscript type=\"x-shader/x-fragment\">\n uniform sampler2D motionblur;\n\n void main() {\n vec2 st = gl_FragCoord.xy / u_resolution.xy;\n gl_FragColor = texture2D(motionblur, st);\n }\n \u003C/script>\n\u003C/shader-doodle>","tags":"GLSL","title":"GLSL template","modified":"20230425170513931","type":"text/vnd.tiddlywiki"}, {"created":"20240924135721168","text":"XR Fragments is \u003Cb>not a\u003C/b> fileformat-specific extension, it's a spec for \u003Cb>deeplinking\u003C/b> any 3D file.\u003Cbr>\nThe level2 metadata (See reference) is easy to embed in any 3D editor (not only blender) than it would be to support new GLTF extensions.\u003Cbr>\nThis is not to say extensions are bad (they are superior in certain cases).\u003Cbr>\n\n> Just like URLs allow fileformat-agnostic navigation, 3D asset 'extras' are fileformat-agnostic too, which together allow for XR Fragments.\n\nTo deal with extensions/overlapping features \nsee [native vs XRF features](#native%20vs%20XRF%20features)","tags":"Reference","title":"glTF extensions","modified":"20251107074242841","type":"text/markdown"}, {"created":"20240226111559175","text":"The hashbus sits inbetween HTML's traditional `href` and the toplevel URL.\u003Cbr>\nSay what?\u003Cbr>\n\u003Cbr>\n> Because of historical reasons the `href` bundles interaction (a click) and navigation (replacing the viewport with another resource).\n\nXR Fragments also allows separating these historicially merged actions, by introducing a hashbus:\n\n| href value | updates top-level URL |\n|-|-|\n| `#foo` | yes |\n|`xrf://#foo` | no |\n\nThis allows much more document 
interactions, with the following benefits:\n\n* interactions don't clutter URLs for back/forward button navigation\n* many use cases don't require a scripting language anymore (hiding/scrolling via [#uv](#uv) e.g.)\n* use the same URI Fragment DSL for navigation and interactions\n* re-use URI Templates across 3D nodes\n* allow 3D nodes to publish updates to other 3D nodes (via the hashbus)\n\nIn short, a complete **hypermediatic feedback loop** (HFL).\n\n\u003Cbr>\nBelow is the related section of the spec (full spec here: \u003Ca href=\"doc/RFC_XR_Fragments.html\" target=\"_blank\">HTML\u003C/a>, \u003Ca href=\"doc/RFC_XR_Fragments.txt\" target=\"_blank\">TXT\u003C/a>)\n\n\u003Ciframe src=\"doc/RFC_XR_Fragments.html#hypermediatic-feedbackloop-for-xr-browsers\" frameborder=\"0\" class=\"spec\">\u003C/iframe>\n\n","tags":"","title":"hashbus","modified":"20240228122229072","type":"text/markdown"}, @@ -1488,7 +1488,7 @@ button.sidebar-toggle{ {"created":"20260324134403628","text":"> Guided & interactive XR navigation is possible when using [[href]] in WebVTT subtitle-files.\n\n### Potential\n\n* guided XR tours across multiple files/URLs\n* timeline for spawnpoints\n* rich storytelling / e-learning\n\nExample using [[sidecar files]]:\n\n```\nmyscene.xrf.glb\nmyscene.xrf.vtt\nmyscene.xrf.ogg\n```\n\nSince `.xrf.` (or at least one [[href]]) is a heuristic for loading [[sidecar files]], the WebVTT subtitles (`myscene.xrf.vtt`) can be displayed while the experience plays (synchronized with `myscene.xrf.ogg`):\n\n```\nWEBVTT\n\n00:01.000 --> 00:04.000 href:#spawn\n\u003Cv narrator voice>welcome to a special experience.\none\ntwo\nthree\n\n00:06.000 --> 00:12.000 \nLet me take you to the divine..\n\n00:13.000 --> 00:14.000 href:#fadeAudioOut&inside\nHere we are \n\n00:14.000 --> 00:19.000\nNow let's open a portal \nto a remote location \n\n00:19.000 --> 00:25.000 href:https://xrfragment.org\nThat portal will take us \nto https://xrfragment.org\nJust click it..\n\n00:25.100 --> 00:30.000 
\nWelcome to another Janus URL\n```\n\nHere [[href]] is used as a CUE setting, which the player can act upon.\n","tags":"href","title":"WebVTT subtitles","modified":"20260324135016197","type":"text/markdown"}, {"created":"20230427103350051","text":"","tags":"","title":"WebXR","modified":"20230427103400217"}, {"created":"20251006103353443","text":"> See the [example files](https://codeberg.org/coderofsalvation/xrfragment/src/branch/main/assets) or [XRForge](https://xrforge.isvery.ninja) for example assets\n\n## Importance of optimizing 3D file size\n\n{{image_VR_lady}}\n\nIt's all about the future of immersive environments.\n\n> Do you want your experience to run fast or with hiccups?\n\n## Golden rule\n\nTo guarantee a smooth XR ride, remember: ''small optimized 3D files'' prevent motion sickness.\n\n> A lot of money has been poured into XR experiences with improper use of 360 imagery, or unoptimized (WebXR) experiences for standalone VR headsets, resulting in motion sickness. Create small, optimized 3D files compatible with XR Fragments instead, to **make your efforts worthwhile**.\n\n## The primary reason\n\nThe primary reason 3D assets, specifically models and textures, must remain relatively small lies in the necessity for rapid data delivery and memory efficiency.\n\n## Low-polygon models\n\nLow-polygon models and small, compressed texture sizes directly translate to smaller file sizes, which drastically reduces the amount of data that needs to be downloaded, streamed, and loaded into system memory (RAM and VRAM). This is critical for achieving the 'fast-loading' virtual worlds of the future, particularly those accessed via mobile devices or slower connections, or those employing continuous procedural loading (like large open-world games). 
By minimizing the initial data transfer and the subsequent memory footprint, developers ensure users can achieve the benefits below.\n\n## Immediate immersion\n\nReducing latency during zone transitions and saving precious memory resources prevents system bottlenecks before rendering even begins.\n\n## Stable frame rates\n\nFurthermore, keeping polygon counts low and texture resolutions manageable is essential for maintaining stable, high frame rates and preventing distracting 'frame drops' during real-time rendering. Every vertex and every texture pixel contributes to the GPU's workload: high-poly models require significantly more processing in the geometry pipeline, while large textures demand more memory bandwidth and computational power for sampling and fragment shading. In a complex virtual world with numerous concurrent users and dynamic objects, this computational burden multiplies rapidly. Adopting a strict low-poly approach and using efficient texture atlases ensures that the GPU can consistently render the scene within a tight millisecond budget, guaranteeing a smooth visual experience and enabling the scalable, fluid performance required for competitive gaming, collaborative work, and large-scale social virtual environments.","tags":"[[Best practices]]","title":"Why small file-size matters","modified":"20251008100849876","type":"text/markdown"}, -{"created":"20230424092557827","text":"\u003Cb>Hyperlink the 3D world\u003C/b>\u003Cbr>\nThe 3D deeplinking standard for [[the deep immersive web|Philosophy & FAQ]].\u003Cbr>\n''Turn'' 3D files ''into'' local-first, interactive, accessible \u003Ca href=\"#XR%Movies\">XR movies\u003C/a>, E-learnings & 3D websites.\u003Cbr>\n\u003Cbr>\nHow? 
By using \u003Cb>URLs with spawnpoints\u003C/b>.\u003Cbr>\n\u003Cbr>\n\n\u003Cdiv style=\"max-width:800px;box-shadow:none\" class=\"border\">\n\u003C$videojs _autoplay controls=\"controls\" aspectratio=\"16:9\" preload=\"auto\" poster=\"\" fluid=\"fluid\" class=\"vjs-big-play-centered\">\n \u003Csource src=\"https://coderofsalvation.codeberg.page/xrfragment.media/hyperlinking-the-3d-world.mp4\" type=\"video/mp4\"/>\n\u003C/$videojs>\n\u003C/div>\n\n\u003Cbr>\nEmpower existing 3D\n\u003Cu tabindex=\"0\">fileformats\n \u003Cspan>like \u003Cb>glTF\u003C/b>, \u003Cb>usdz\u003C/b>, \u003Cb>obj\u003C/b>, \u003Cb>collada\u003C/b> which are used in websites, Game Engines, and like \u003Ca href=\"#Edit%20a%203D%20scene%20file\">3D editors\u003C/a>.\u003Cbr>XR Fragments makes 3D files interactive \u003C/span>\n\u003C/u> via \n\u003Cu tabindex=\"0\">URLS\n \u003Cspan>, using any \n \u003Cu tabindex=\"0\">protocol\n \u003Cspan>, not necessarily served via HTTP, but also \u003Ca href=\"https://ipfs.com\" target=\"_blank\">IPFS\u003C/a>, \u003Ca href=\"https://hypercore-protocol.github.io/new-website/guides/getting-started/\" target=\"_blank\">hypercore\u003C/a>, \u003Ca href=\"https://github.com/webtorrent/webtorrent\" target=\"_blank\">webtorrent\u003C/a> e.g\u003C/span>\n\t \u003C/u>\n\t\u003C/span>\n\u003C/u>.\nThis allows spatial\n \u003Cu tabindex=\"0\">interactions \n\t \u003Cspan>, like browser-navigation, teleportation, importing scenes, spatial hypermedia, allowing useful audiovisual immersive\u003C/span>\n\t\t\u003Cu tabindex=\"0\">experiences\n\t\t \u003Cspan>like e-learnings, quiz, realtime-rendered 3D movies, and audiovisual storytelling\u003C/span>\n\t\t\u003C/u>\n\t\u003C/u>\nvia 3D \n\u003Cu tabindex=\"0\">metadata\n \u003Cspan>, so called 'extras' embedded in 3D files ('custom properties' in \u003Ca href=\"https://blender.org\" target=\"_blank\">Blender\u003C/a>)\u003C/span>\n\u003C/u>\nand promote URI's and \n\u003Cu tabindex=\"0\">Local-First\n \u003Cspan> 
data, which lives local, and ideally only syncs/shares elsewhere via ''open user-operated internet'' protocols.\u003C/span>\n\u003C/u>\n.\n\n\n\u003Cdiv style=\"text-align:center\">\n\u003Cb style=\"font-size:11px\">~10 mins podcast introduction\u003C/b>\u003Cbr>\n\u003Caudio controls src=\"https://coderofsalvation.codeberg.page/xrfragment.media/podcast-xrfragments-intro.mp3\" type=\"audio/mpeg\">\n\u003C/audio>\n\u003C/div>\n\u003Cbr>\nAvoid \u003Cb>cloud lock-in\u003C/b>, by making your 3D experiences \u003Cb>portable\u003C/b> to \u003Cb>outlast\u003C/b> current technologies.\n\n\u003Ch2>Examples\u003C/h2>\n\u003Cbr>\n\u003Ciframe allowfullscreen allow=\"xr-spatial-tracking; fullscreen\" src=\"https://xrforge.isvery.ninja/view/index.html?profile=preview#janus.url=https://xrforge.isvery.ninja/models/zzswz4qlqw8w/model_files/test.xrf.glb\" style=\"width:100%; height: 50vh; border:0; border-radius:10px;\">\u003C/iframe>\n\n> A 3D file as [[XR Movie|XR Movies]] on the selfhostable [[XRForge|https://xrforge.isvery.ninja]] (try above)\n\n\u003Cbr>\u003Cbr>\n\u003Cdiv style=\"display:inline-block; padding:0px 20px; border-radius:5px 5px 0px 0px; border:2px solid #555;background: #ededed;font-weight: bold;font-size: 16px;border-bottom: none;\">website.glb#scene1\u003C/div>\n\u003Ca href=\"https://coderofsalvation.codeberg.page/xrfragment-haxe/example/aframe/sandbox/?./../../assets/website.glb\" target=\"_blank\">\n \u003Cimg src=\"https://coderofsalvation.codeberg.page/xrfragment.media/images/website.glb.jpg\" style=\"border-left: 2px solid #555; cursor:pointer\">\n\u003C/a>\n\n> A 3D file as XR website via AFRAME: \u003Ca href=\"https://coderofsalvation.codeberg.page/xrfragment-haxe/example/aframe/sandbox/?./../../assets/website.glb\" target=\"_blank\" style=\"padding:10px 30px\">Try here\u003C/a>\n\n\n\u003Cbr>\n\u003Cdiv style=\"max-width:800px;box-shadow:none\" class=\"border\">\n\u003C$videojs _autoplay controls=\"controls\" aspectratio=\"16:9\" preload=\"auto\" 
poster=\"\" fluid=\"fluid\" class=\"vjs-big-play-centered\">\n \u003Csource src=\"https://coderofsalvation.codeberg.page/xrfragment.media/showreel_2024.mp4\" type=\"video/mp4\"/>\n\u003C/$videojs>\n\u003C/div>\n\n\u003Cdiv>#spatialweb #openinternet #interoperable #accessibility #3Dhypermedia\u003C/div>\n\u003Ccenter>\n \u003Ca class=\"btn\" href=\"https://matrix.to/#/#xrfragments:matrix.org\" target=\"_blank\" style=\"padding:10px 30px\">Join Matrix Community\u003C/a>\n\u003C/center>\n\u003Cbr>\n\n\u003Ctable style=\"border:none\">\n \u003Ctr>\n\t \u003Ctd style=\"border:none;vertical-align:top; width:49%\">\n\t\t\t\u003Cb>🎨 no-code design-first\u003C/b>\u003Cbr/>\n\t\t\t\u003Cb>🏄 surf 3D scenes in AR/VR\u003C/b>\u003Cbr/>\n\t \u003Cb>📎 embeddable\u003C/b>\u003Cbr/>\n\t\t\t\u003Cb>🤝 interoperable\u003C/b>\u003Cbr/>\n\t\t\t\u003Cb>⛔ network-agnostic, local-first\u003C/b>\u003Cbr/>\n \u003Cb>💾 compatible with glTF FBX USDZ OBJ and more\u003C/b>\u003Cbr/>\t\t\t\t\t\n\t\t\u003C/td>\n\t\t\u003Ctd style=\"border:none;vertical-align:top\">\n\t\t\t\u003Cb>🔮 99% compatible with \u003Cb>future fileformats\u003C/b>\u003C/b>\u003Cbr/>\n \u003Cb>🌱 friendly to opensource & corporations\u003C/b>\u003Cbr/>\n\t\t\t\u003Cb>❤️ \u003Cb>no\u003C/b> fileformat or editor lock-in\u003C/b>\u003Cbr/>\n\t\t\t\u003Cb>🧑‍🌾 solo-user read-only 3D content\u003C/b>\u003Cbr/>\n\t\t\u003C/td>\n\t\u003C/tr>\n\u003C/table>\n\u003Cbr>\n\n\u003Ch2>Made for 3D designers\u003C/h2>\n[img[xrfragment.jpg]]\n\u003Cbr>\u003CBr>\n\n\n\u003C\u003C\u003C\nSee [[How it works]]\n\u003C\u003C\u003C\n\n\n\u003Ch2>TLDR\u003C/h2>\n\nThe \u003Cb>TLDR\u003C/b> of processing 3D files with XR Fragments [pseudocode]:\n\u003Cbr>\u003Cbr>\n\u003Cdiv>\n \u003Ctextarea spellcheck=\"false\" class=\"sandboxify noresult\" style=\"min-height:190px;width:100%;max-width:800px;\">foreach object in scene:\n if object.extra.href:\n\t object.onClick = updateCameraFromURL(object.extra.href, camera, timeline)\n \nif changed(app.URL):\n 
camera.updateCameraFromURL(app.URL)\n \t\ndocument.location.href = 'my.org/foo.glb#roomC'\n\n\u003C/textarea>\n\u003C/div> \n\u003Cbr>\n\u003Ch2>Virtual worlds without lock-in\u003C/h2>\n\nScale beyond companies, appstores, network protocols and file-formats:\n\n\u003Cdiv style=\"max-width:600px;box-shadow:none;padding:15px\" class=\"border\">\n\u003C$videojs _autoplay controls=\"controls\" aspectratio=\"16:9\" preload=\"auto\" poster=\"\" fluid=\"fluid\" class=\"vjs-big-play-centered\">\n \u003Csource src=\"https://coderofsalvation.codeberg.page/xrfragment.media/xrfragment.bumper2.mp4\" type=\"video/mp4\"/>\n\u003C/$videojs>\n\u003C/div>\n\u003Cbr>\n\n\u003Ch3>Virtual worlds connected via URLs\u003C/h3>\n\n[img[urls.svg]]\n\n\u003Cbr>\n\nXR Fragments is a spec to link 3D models into a basic ''distributed'' interactive XR experience.\u003Cbr>\nThink of it as bundling virtual worlds into a \u003Cb>spatial book\u003C/b>.\u003Cbr>\n\u003Cbr>\n\n\n!! Progressive enhancement\n\nUse [[engine prefixes|📜level7: engine prefixes]] to hint specific \u003Cb>game-engine features\u003C/b> from within your 3D file.\n\u003Cbr>\u003Cbr>\n[img[engine-prefixes.webp]]\n\n\nXR Fragments \u003Cb>empowers designers\u003C/b> to embed engine-hints, \u003Cb>simple interactions & navigation\u003C/b> inside a \u003Cb>3D file\u003C/b>.\u003Cbr>\nThis \u003Cb>no longer\u003C/b> requires developers to implement trivial interactive stuff.\u003Cbr>\nIt promotes \u003Cb>design-first, secure, durable and interoperable\u003C/b> XR experiences from \u003Cb>3D models\u003C/b>, basically 3D hypermedia, mitigating \u003Cb>handcoded-XR-apps-as-3D-content-burial-sites\u003C/b>.\u003Cbr>\n\u003Cbr>\n\n\u003Ch2>Getting Started\u003C/h2>\n\nJust get your hands on a 3D editor and follow the steps in the video:\n\u003Cbr>\u003Cbr>\n\u003Cdiv style=\"max-width:600px\">\n\u003C$videojs controls=\"controls\" aspectratio=\"16:9\" preload=\"auto\" poster=\"\" fluid=\"fluid\" class=\"vjs-big-play-centered\">\n 
\u003Csource src=\"https://coderofsalvation.codeberg.page/xrfragment.media/gettingstarted2024.mp4\" type=\"video/mp4\"/>\n\u003C/$videojs>\n\u003C/div>\n\u003Cbr>\u003CBr>\nCheck [[How it works|How it works]], or \u003Ca href=\"https://coderofsalvation.codeberg.page/xrfragment-haxe/example/aframe/sandbox/?./../../assets/website.glb\" target=\"_blank\">view a \u003Cb>demo.glb\u003C/b> scene right now\u003C/a>, or see the menu in the left corner for more.\n\u003Cbr>\u003Cbr>\n\u003Ch2>Presentation\u003C/h2>\n\u003Cbr>\n\u003Ciframe width=\"560\" height=\"315\" src=\"https://www.youtube.com/embed/bfxqm1q_GXw?start=1445\" title=\"YouTube video player\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" allowfullscreen>\u003C/iframe>\n\n\u003C!-- persist telescopic unfolds -->\n\u003C\u003Cscript>>\n\u003Cscript>\n([...document.querySelectorAll('u')]).map( (u) => {\n u.addEventListener('click', e => e.target.className = 'show' )\n});\n\u003C/script>\n\u003C\u003Cscript 0>>\n","tags":"Home","title":"XR Fragments","modified":"20260416093935651","list-before":"How it works"}, +{"created":"20230424092557827","text":"\u003Cb>Hyperlink the 3D world\u003C/b>\u003Cbr>\nThe 3D deeplinking standard for [[the deep immersive web|Permacomputing & FAQ]].\u003Cbr>\n''Turn'' 3D files ''into'' local-first, interactive, accessible \u003Ca href=\"#XR%Movies\">XR movies\u003C/a>, E-learnings & 3D websites.\u003Cbr>\n\u003Cbr>\nHow? 
By using \u003Cb>URLs with spawnpoints\u003C/b>.\u003Cbr>\n\u003Cbr>\n\n\u003Cdiv style=\"max-width:800px;box-shadow:none\" class=\"border\">\n\u003C$videojs _autoplay controls=\"controls\" aspectratio=\"16:9\" preload=\"auto\" poster=\"\" fluid=\"fluid\" class=\"vjs-big-play-centered\">\n \u003Csource src=\"https://coderofsalvation.codeberg.page/xrfragment.media/hyperlinking-the-3d-world.mp4\" type=\"video/mp4\"/>\n\u003C/$videojs>\n\u003C/div>\n\n\u003Cbr>\nEmpower existing 3D\n\u003Cu tabindex=\"0\">fileformats\n \u003Cspan>like \u003Cb>glTF\u003C/b>, \u003Cb>usdz\u003C/b>, \u003Cb>obj\u003C/b>, \u003Cb>collada\u003C/b>, which are used in websites, game engines, and \u003Ca href=\"#Edit%20a%203D%20scene%20file\">3D editors\u003C/a>.\u003Cbr>XR Fragments makes 3D files interactive \u003C/span>\n\u003C/u> via \n\u003Cu tabindex=\"0\">URLs\n \u003Cspan>, using any \n \u003Cu tabindex=\"0\">protocol\n \u003Cspan>, not necessarily served via HTTP, but also \u003Ca href=\"https://ipfs.com\" target=\"_blank\">IPFS\u003C/a>, \u003Ca href=\"https://hypercore-protocol.github.io/new-website/guides/getting-started/\" target=\"_blank\">hypercore\u003C/a>, or \u003Ca href=\"https://github.com/webtorrent/webtorrent\" target=\"_blank\">webtorrent\u003C/a>, for example\u003C/span>\n\t \u003C/u>\n\t\u003C/span>\n\u003C/u>.\nThis allows spatial\n \u003Cu tabindex=\"0\">interactions \n\t \u003Cspan>, like browser navigation, teleportation, importing scenes, and spatial hypermedia, allowing useful audiovisual immersive\u003C/span>\n\t\t\u003Cu tabindex=\"0\">experiences\n\t\t \u003Cspan>like e-learnings, quizzes, realtime-rendered 3D movies, and audiovisual storytelling\u003C/span>\n\t\t\u003C/u>\n\t\u003C/u>\nvia 3D \n\u003Cu tabindex=\"0\">metadata\n \u003Cspan>, so-called 'extras' embedded in 3D files ('custom properties' in \u003Ca href=\"https://blender.org\" target=\"_blank\">Blender\u003C/a>)\u003C/span>\n\u003C/u>\nand promote URIs and \n\u003Cu tabindex=\"0\">Local-First\n \u003Cspan> 
data, which lives locally, and ideally only syncs/shares elsewhere via ''open user-operated internet'' protocols.\u003C/span>\n\u003C/u>\n.\n\n\n\u003Cdiv style=\"text-align:center\">\n\u003Cb style=\"font-size:11px\">~10 mins podcast introduction\u003C/b>\u003Cbr>\n\u003Caudio controls src=\"https://coderofsalvation.codeberg.page/xrfragment.media/podcast-xrfragments-intro.mp3\" type=\"audio/mpeg\">\n\u003C/audio>\n\u003C/div>\n\u003Cbr>\nAvoid \u003Cb>cloud lock-in\u003C/b> by making your 3D experiences \u003Cb>portable\u003C/b>, so they \u003Cb>outlast\u003C/b> current technologies.\n\n\u003Ch2>Examples\u003C/h2>\n\u003Cbr>\n\u003Ciframe allowfullscreen allow=\"xr-spatial-tracking; fullscreen\" src=\"https://xrforge.isvery.ninja/view/index.html?profile=preview#janus.url=https://xrforge.isvery.ninja/models/zzswz4qlqw8w/model_files/test.xrf.glb\" style=\"width:100%; height: 50vh; border:0; border-radius:10px;\">\u003C/iframe>\n\n> A 3D file as [[XR Movie|XR Movies]] on the self-hostable [[XRForge|https://xrforge.isvery.ninja]] (try above)\n\n\u003Cbr>\u003Cbr>\n\u003Cdiv style=\"display:inline-block; padding:0px 20px; border-radius:5px 5px 0px 0px; border:2px solid #555;background: #ededed;font-weight: bold;font-size: 16px;border-bottom: none;\">website.glb#scene1\u003C/div>\n\u003Ca href=\"https://coderofsalvation.codeberg.page/xrfragment-haxe/example/aframe/sandbox/?./../../assets/website.glb\" target=\"_blank\">\n \u003Cimg src=\"https://coderofsalvation.codeberg.page/xrfragment.media/images/website.glb.jpg\" style=\"border-left: 2px solid #555; cursor:pointer\">\n\u003C/a>\n\n> A 3D file as XR website via AFRAME: \u003Ca href=\"https://coderofsalvation.codeberg.page/xrfragment-haxe/example/aframe/sandbox/?./../../assets/website.glb\" target=\"_blank\" style=\"padding:10px 30px\">Try here\u003C/a>\n\n\n\u003Cbr>\n\u003Cdiv style=\"max-width:800px;box-shadow:none\" class=\"border\">\n\u003C$videojs _autoplay controls=\"controls\" aspectratio=\"16:9\" preload=\"auto\" 
poster=\"\" fluid=\"fluid\" class=\"vjs-big-play-centered\">\n \u003Csource src=\"https://coderofsalvation.codeberg.page/xrfragment.media/showreel_2024.mp4\" type=\"video/mp4\"/>\n\u003C/$videojs>\n\u003C/div>\n\n\u003Cdiv>#spatialweb #openinternet #interoperable #accessibility #3Dhypermedia\u003C/div>\n\u003Ccenter>\n \u003Ca class=\"btn\" href=\"https://matrix.to/#/#xrfragments:matrix.org\" target=\"_blank\" style=\"padding:10px 30px\">Join Matrix Community\u003C/a>\n\u003C/center>\n\u003Cbr>\n\n\u003Ctable style=\"border:none\">\n \u003Ctr>\n\t \u003Ctd style=\"border:none;vertical-align:top; width:49%\">\n\t\t\t\u003Cb>🎨 no-code design-first\u003C/b>\u003Cbr/>\n\t\t\t\u003Cb>🏄 surf 3D scenes in AR/VR\u003C/b>\u003Cbr/>\n\t \u003Cb>📎 embeddable\u003C/b>\u003Cbr/>\n\t\t\t\u003Cb>🤝 interoperable\u003C/b>\u003Cbr/>\n\t\t\t\u003Cb>⛔ network-agnostic, local-first\u003C/b>\u003Cbr/>\n \u003Cb>💾 compatible with glTF FBX USDZ OBJ and more\u003C/b>\u003Cbr/>\t\t\t\t\t\n\t\t\u003C/td>\n\t\t\u003Ctd style=\"border:none;vertical-align:top\">\n\t\t\t\u003Cb>🔮 99% compatible with \u003Cb>future fileformats\u003C/b>\u003C/b>\u003Cbr/>\n \u003Cb>🌱 friendly to opensource & corporations\u003C/b>\u003Cbr/>\n\t\t\t\u003Cb>❤️ \u003Cb>no\u003C/b> fileformat or editor lock-in\u003C/b>\u003Cbr/>\n\t\t\t\u003Cb>🧑‍🌾 solo-user read-only 3D content\u003C/b>\u003Cbr/>\n\t\t\u003C/td>\n\t\u003C/tr>\n\u003C/table>\n\u003Cbr>\n\n\u003Ch2>Made for 3D designers\u003C/h2>\n[img[xrfragment.jpg]]\n\u003Cbr>\u003CBr>\n\n\n\u003C\u003C\u003C\nSee [[How it works]]\n\u003C\u003C\u003C\n\n\n\u003Ch2>TLDR\u003C/h2>\n\nThe \u003Cb>TLDR\u003C/b> of processing 3D files with XR Fragments [pseudocode]:\n\u003Cbr>\u003Cbr>\n\u003Cdiv>\n \u003Ctextarea spellcheck=\"false\" class=\"sandboxify noresult\" style=\"min-height:190px;width:100%;max-width:800px;\">foreach object in scene:\n if object.extra.href:\n\t object.onClick = updateCameraFromURL(object.extra.href, camera, timeline)\n \nif changed(app.URL):\n 
camera.updateCameraFromURL(app.URL)\n \t\ndocument.location.href = 'my.org/foo.glb#roomC'\n\n\u003C/textarea>\n\u003C/div> \n\u003Cbr>\n\u003Ch2>Virtual worlds without lock-in\u003C/h2>\n\nScale beyond companies, app stores, network protocols and file-formats:\n\n\u003Cdiv style=\"max-width:600px;box-shadow:none;padding:15px\" class=\"border\">\n\u003C$videojs _autoplay controls=\"controls\" aspectratio=\"16:9\" preload=\"auto\" poster=\"\" fluid=\"fluid\" class=\"vjs-big-play-centered\">\n \u003Csource src=\"https://coderofsalvation.codeberg.page/xrfragment.media/xrfragment.bumper2.mp4\" type=\"video/mp4\"/>\n\u003C/$videojs>\n\u003C/div>\n\u003Cbr>\n\n\u003Ch3>Virtual worlds connected via URLs\u003C/h3>\n\n[img[urls.svg]]\n\n\u003Cbr>\n\nXR Fragments is a spec to link 3D models into a basic ''distributed'' interactive XR experience.\u003Cbr>\nThink of it as bundling virtual worlds into a \u003Cb>spatial book\u003C/b>.\u003Cbr>\n\u003Cbr>\n\n\n!! Progressive enhancement\n\nUse [[engine prefixes|📜level7: engine prefixes]] to hint specific \u003Cb>game-engine features\u003C/b> from within your 3D file.\n\u003Cbr>\u003Cbr>\n[img[engine-prefixes.webp]]\n\n\nXR Fragments \u003Cb>empowers designers\u003C/b> to embed engine-hints, \u003Cb>simple interactions & navigation\u003C/b> inside a \u003Cb>3D file\u003C/b>.\u003Cbr>\nDevelopers \u003Cb>no longer\u003C/b> need to hand-code trivial interactivity.\u003Cbr>\nIt promotes \u003Cb>design-first, secure, durable and interoperable\u003C/b> XR experiences from \u003Cb>3D models\u003C/b>, basically 3D hypermedia, mitigating \u003Cb>handcoded-XR-apps-as-3D-content-burial-sites\u003C/b>.\u003Cbr>\n\u003Cbr>\n\n\u003Ch2>Getting Started\u003C/h2>\n\nJust get your hands on a 3D editor and follow the steps in the video:\n\u003Cbr>\u003Cbr>\n\u003Cdiv style=\"max-width:600px\">\n\u003C$videojs controls=\"controls\" aspectratio=\"16:9\" preload=\"auto\" poster=\"\" fluid=\"fluid\" class=\"vjs-big-play-centered\">\n 
\u003Csource src=\"https://coderofsalvation.codeberg.page/xrfragment.media/gettingstarted2024.mp4\" type=\"video/mp4\"/>\n\u003C/$videojs>\n\u003C/div>\n\u003Cbr>\u003Cbr>\nCheck [[How it works]], or \u003Ca href=\"https://coderofsalvation.codeberg.page/xrfragment-haxe/example/aframe/sandbox/?./../../assets/website.glb\" target=\"_blank\">view a \u003Cb>demo.glb\u003C/b> scene right now\u003C/a>, or see the menu in the left corner for more.\n\u003Cbr>\u003Cbr>\n\u003Ch2>Presentation\u003C/h2>\n\u003Cbr>\n\u003Ciframe width=\"560\" height=\"315\" src=\"https://www.youtube.com/embed/bfxqm1q_GXw?start=1445\" title=\"YouTube video player\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" allowfullscreen>\u003C/iframe>\n\n\u003C!-- persist telescopic unfolds -->\n\u003C\u003Cscript>>\n\u003Cscript>\n([...document.querySelectorAll('u')]).map( (u) => {\n u.addEventListener('click', e => e.target.className = 'show' )\n});\n\u003C/script>\n\u003C\u003Cscript 0>>\n","tags":"Home","title":"XR Fragments","modified":"20260416093935651","list-before":"How it works"}, {"created":"20250903111630328","text":"The viewer should ideally \u003Cb>present a play-button\u003C/b> when:\n\n* at least one animation-data item is defined in the 3D file\n* and/or a timeline sidecar-file (soundtrack or subtitles) is detected\n\nSee [complementary file](#📜%20level0:%20File) for detection of sidecar-files, which enhance accessibility via [WebVTT subtitles](#WebVTT%20subtitles), thumbnails, a soundtrack, etc.\n\n\n\n","tags":"","title":"XR Movies","modified":"20250904105805727","type":"text/markdown"}, {"created":"20260316112303145","text":"!! Soundtrack, subtitles, thumbnails \n\n[[XR Movies]] anyone?\u003Cbr>\n\n> ''Simple:'' just add those files and name them accordingly (e.g. `mymovie.xrf.ogg` for `mymovie.xrf.glb`) 
as [[sidecar-files|📜 level0: File]]\n\n\u003Ch3>Local-first\u003C/h3>\nWhile level1 is the core of the XR Fragments spec (spawnpoints + [[href]]-links):\n\n> [[level0|📜 level0: File]] completes the spec for more ''local'' usecases, especially:\n\n1. decorating/detecting a 3D file with extra files\u003Cbr>\n2. credible exit between XR platforms\n","title":"XR Movies & books","modified":"20260316142752927","type":"text/vnd.tiddlywiki","tags":"Home"}, {"created":"20250516081212327","text":"How can applications discover 3D experiences on a network?\n\n> Answer: **spatial microformats** \n\nThe XRF microformat is an [optional](#Progressive%20enhancement) text heuristic which applications can detect across various usecases.\n\n## via HTML webpage\n\nIf the browser/application requests a webpage (e.g. `https://nlnet.nl`), it should check for the [rel-me microformat](https://gmpg.org/xfn/):\n\n```\n\u003Clink rel=\"alternate\" as=\"spatial-entrypoint\" href=\"scene.xrf.glb\">\n```\n\nThis way the application loads `https://nlnet.nl/scene.xrf.glb` when the user types `nlnet.nl` into the URL bar.\u003Cbr>\nOptionally, `type` can be specified for dynamically generated 3D files:\n\n```\n\u003Clink rel=\"alternate\" as=\"spatial-entrypoint\" href=\"https://worlds.org/scene.php#platformB\" type=\"model/gltf+binary\" />\n```\n\nThe `type` attribute is for fallback purposes.\u003Cbr>\nViewer-supported 3D file-extensions (e.g. `.glb`) will **ALWAYS** take precedence over the (non)presence of the `type` attribute.\u003Cbr>\nThe reason is that platforms (e.g. Mastodon 'labels') 
don't allow specifying type-attributes.\u003Cbr>\nAnother reason is that XR Fragments is filetype-agnostic, so flexibility is expected on the viewer-side.\n\n> NOTE: in case of multiple 3D files mentioned in `\u003Clink rel=\"me\"`, only the first (supported 3D filetype) will be chosen.\n\nExample of multiple spatial microformats:\n\n```\n\u003Clink rel=\"alternate\" as=\"spatial-entrypoint\" href=\"scene.xrf.glb\"/>\n\u003Clink rel=\"me\" href=\"myavatar.vrm\"/>\n\u003C!-- JanusXR microformat https://github.com/jbaicoianu/janusweb\n \u003CFireBoxRoom>\n \u003CAssets>\n \u003Cassetobject id=\"experience\" src=\"scene.xrf.glb\"/>\n \u003C/Assets>\n \u003CRoom>\n \u003Cobject pos=\"0 0 0\" collision_id=\"experience\" id=\"experience\" />\n \u003C/Room>\n \u003C/FireBoxRoom>\n-->\n```\n\n## via WebFinger\n\nWhen John has an account on foo.com, how can other applications request his 3D homepage by simply entering `john@foo.com`?\n\n> Answer: it can be requested at `https://foo.com/.well-known/webfinger?resource=acct:john@foo.com`, resulting in:\n\n```\n{\n \"subject\": \"acct:john@foo.com\",\n \"aliases\": [\n \"https://mastodon.example/social/john\",\n \"https://john.foo.com\",\n \"https://3d.john.foo.com/model/scene.glb\"\n ],\n \"properties\": {\n \"http://schema.org/name\": \"John Doe\",\n \"http://schema.org/description\": \"Developer, 3D Enthusiast, and Social Explorer\"\n },\n \"links\": [\n {\n \"rel\": \"http://ostatus.org/schema/1.0/subscribe\",\n \"template\": \"https://mastodon.example/social/john/{uri}\"\n },\n {\n \"rel\": \"self\",\n \"type\": \"text/html\",\n \"href\": \"https://john.foo.com\"\n },\n {\n \"rel\": \"me\",\n \"type\": \"text/html\",\n \"href\": \"https://john.foo.com\"\n },\n\t\t{\n \"rel\": \"me\",\n \"type\": \"model/gltf+binary\",\n \"href\": \"https://3d.john.foo.com/model/avatar.vrm\"\n },\n {\n \"rel\": \"scene\",\n \"type\": \"model/gltf+binary\",\n \"href\": \"https://3d.john.foo.com/model/scene.xrf.glb\"\n }\n ]\n}\n```\n\nThis 
way the application will load `https://3d.john.foo.com/model/scene.xrf.glb` when the user types `john@foo.com` into the user field.\n\n## via Text (URI)\n\nAnother way for an application to trigger loading a 3D scene is by detecting URIs of 3D scene-files in any text:\n\n* `foo.glb` (or any other popular 3D extension)\n* `https://foo.com/scene.glb` (or any other popular protocol)\n\nThis way, the application can highlight the link whenever it detects the URI (in a text-file or text-section of a 3D model).","tags":"[[📜 level1: URL]] level1 optional","title":"XRF microformat","modified":"20251113145944756","type":"text/markdown"},