update documentation

This commit is contained in:
Leon van Kammen 2023-09-09 11:30:03 +02:00
parent 9b512f12fd
commit 266983455e
4 changed files with 301 additions and 391 deletions

View File

@ -245,7 +245,7 @@ In case of <code>buttonA</code> the end-user will be teleported to another locat
<h1 id="embedding-3d-content">Embedding 3D content</h1>
<p>Here&rsquo;s an ascii representation of a 3D scene-graph with 3D objects <code></code> which embeds remote &amp; local 3D objects <code></code> (without) using queries:</p>
<p>Here&rsquo;s an ascii representation of a 3D scene-graph with 3D objects <code></code> which embeds remote &amp; local 3D objects <code></code> with or without using queries:</p>
<pre><code> +--------------------------------------------------------+ +-------------------------+
| | | |
@ -422,9 +422,10 @@ Ideally metadata must come <strong>with</strong> text, but not <strong>obfuscate
<p>This way:</p>
<ol>
<li>XR Fragments allows <b id="tagging-text">hasslefree XR text tagging</b>, using BibTeX metadata <strong>at the end of content</strong> (like <a href="https://visual.meta.info">visual-meta</a>).</li>
<li>XR Fragments allows <b id="tagging-text">hasslefree spatial tagging</b>, by detecting BibTeX metadata <strong>at the end of content</strong> of text (see default mimetype &amp; Data URI)</li>
<li>XR Fragments allows <b id="tagging-objects">hasslefree spatial tagging</b>, by treating 3D object name/class-pairs as BibTeX tags.</li>
<li>XR Fragments allows hasslefree <a href="#textual-tag">textual tagging</a>, <a href="#spatial-tag">spatial tagging</a>, and <a href="#supra-tagging">supra tagging</a>, by mapping 3D/text object (class)names using BibTeX &lsquo;tags&rsquo;</li>
<li>Bibs/BibTeX-appendices is first-choice <strong>requestless metadata</strong>-layer for XR text, HTML/RDF/JSON is great (but fits better in the application-layer)</li>
<li>BibTex &amp; Hashtagbibs are the first-choice <strong>requestless metadata</strong>-layer for XR text, HTML/RDF/JSON is great (but fits better in the application-layer)</li>
<li>Default font (unless specified otherwise) is a modern monospace font, for maximized tabular expressiveness (see <a href="#core-principle">the core principle</a>).</li>
<li>anti-pattern: hardcoupling a mandatory <strong>obtrusive markuplanguage</strong> or framework with an XR browser (HTML/VRML/Javascript) (see <a href="#core-principle">the core principle</a>)</li>
<li>anti-pattern: limiting human introspection, by immediately funneling human thought into typesafe, precise, pre-categorized metadata like RDF (see <a href="#core-principle">the core principle</a>)</li>
@ -432,23 +433,33 @@ Ideally metadata must come <strong>with</strong> text, but not <strong>obfuscate
<p>This allows recursive connections between text itself, as well as 3D objects and vice versa, using <strong>BibTags</strong> :</p>
<pre><code> +---------------------------------------------+ +------------------+
| My Notes | | / \ |
| | | / \ |
| The houses here are built in baroque style. | | /house\ |
| | | |_____| |
| | +---------|--------+
| @house{houses, &gt;----'house'--------| class/name match?
| url = {#.house} &gt;----'houses'-------` class/name match?
| } |
+---------------------------------------------+
<pre><code> http://y.io/z.fbx | (Evaluated) BibTex/ 'wires' / tags |
----------------------------------------------------------------------------+-------------------------------------
| @house{castle,
+-[src: data:.....]----------------------+ +-[3D mesh]-+ | url = {https://y.io/z.fbx#castle}
| My Notes | | / \ | | }
| | | / \ | | @baroque{castle,
| The houses are built in baroque style. | | / \ | | url = {https://y.io/z.fbx#castle}
| | | |_____| | | }
| @house{baroque, | +-----│-----+ | @house{baroque,
| description = {classic} | ├─ name: castle | description = {classic}
| } | └─ class: house baroque | }
+----------------------------------------+ | @house{contactowner,
| }
+-[remotestorage.io / localstorage]------+ | @todo{contactowner,
| #contactowner@todo@house | | }
| ... | |
+----------------------------------------+ |
</code></pre>
<p>BibTex (generated from 3D objects) can be extended by the enduser with personal BibTex or <a href="https://github.com/coderofsalvation/hashtagbibs">hashtagbibs</a>.</p>
<blockquote>
<p>The enduser can add connections by speaking/typing/scanning <a href="https://github.com/coderofsalvation/hashtagbibs">hashtagbibs</a> which the XR Browser can expand to (hidden) BibTags.</p>
<p><a href="https://github.com/coderofsalvation/hashtagbibs">hashtagbibs</a> allows the enduser to add &lsquo;postit&rsquo; connections (compressed BibTex) by speaking/typing/scanning text, which the XR Browser saves to remotestorage (or localStorage per toplevel URL), as well as referencing BibTags per URI later on (e.g. <code>https://y.io/z.fbx#@baroque@todo</code>).</p>
</blockquote>
<p>This allows instant realtime tagging of objects at various scopes:</p>
<p>Obviously, expressing the relationships above in XML/JSON instead of BibTeX would cause instant cognitive overload.<br>
This allows instant realtime filtering of relationships at various levels:</p>
<table>
<thead>
@ -461,40 +472,43 @@ Ideally metadata must come <strong>with</strong> text, but not <strong>obfuscate
<tbody>
<tr>
<td><b id="textual-tagging">textual</b></td>
<td>text containing &lsquo;houses&rsquo; is now automatically tagged with &lsquo;house&rsquo; (incl. plaintext <code>src</code> child nodes)</td>
<td>text containing &lsquo;baroque&rsquo; is now automatically tagged with &lsquo;house&rsquo; (incl. plaintext <code>src</code> child nodes)</td>
</tr>
<tr>
<td><b id="spatial-tagging">spatial</b></td>
<td>spatial object(s) with <code>&quot;class&quot;:&quot;house&quot;</code> (because of <code>{#.house}</code>) are now automatically tagged with &lsquo;house&rsquo; (incl. child nodes)</td>
<td>spatial object(s) with name <code>baroque</code> or <code>&quot;class&quot;:&quot;house&quot;</code> are now automatically tagged with &lsquo;house&rsquo; (incl. child nodes)</td>
</tr>
<tr>
<td><b id="supra-tagging">supra</b></td>
<td>text- or spatial-object(s) (non-descendant nodes) elsewhere, named &lsquo;house&rsquo;, are automatically tagged with &lsquo;house&rsquo; (current node to root node)</td>
<td>text- or spatial-object(s) (non-descendant nodes) elsewhere, (class)named &lsquo;baroque&rsquo; or &lsquo;house&rsquo;, are automatically tagged with &lsquo;house&rsquo; (current node to root nodes)</td>
</tr>
<tr>
<td><b id="omni-tagging">omni</b></td>
<td>text- or spatial-object(s) (non-descendant nodes) elsewhere, containing class/name &lsquo;house&rsquo;, are automatically tagged with &lsquo;house&rsquo; (too node to all nodes)</td>
<td>text- or spatial-object(s) (non-descendant nodes) elsewhere, (class)named &lsquo;baroque&rsquo; or &lsquo;house&rsquo;, are automatically tagged with &lsquo;house&rsquo; (top node to all nodes)</td>
</tr>
<tr>
<td><b id="infinite-tagging">infinite</b></td>
<td>text- or spatial-object(s) (non-descendant nodes) elsewhere, containing class/name &lsquo;house&rsquo; or &lsquo;houses&rsquo;, are automatically tagged with &lsquo;house&rsquo; (too node to all nodes)</td>
<td>text- or spatial-object(s) (non-descendant nodes) elsewhere, (class)named &lsquo;baroque&rsquo; or &lsquo;house&rsquo;, are automatically tagged with &lsquo;house&rsquo; (top node to all nodes)</td>
</tr>
</tbody>
</table>
<p>This empowers the enduser spatial expressiveness (see <a href="#core-principle">the core principle</a>): spatial wires can be rendered, words can be highlighted, spatial objects can be highlighted/moved/scaled, links can be manipulated by the user.<br>
The simplicity of appending BibTeX &lsquo;tags&rsquo; (humans first, machines later) is also demonstrated by <a href="https://visual-meta.info">visual-meta</a> in greater detail.</p>
<p>BibTex allows the enduser to adjust different levels of associations (see <a href="#core-principle">the core principle</a>): spatial wires can be rendered, words can be highlighted, spatial objects can be highlighted/moved/scaled, links can be manipulated by the user.<br></p>
<ol>
<blockquote>
<p>NOTE: infinite matches both &lsquo;baroque&rsquo; and &lsquo;style&rsquo;-occurrences in text, as well as spatial objects with <code>&quot;class&quot;:&quot;style&quot;</code> or name &ldquo;baroque&rdquo;. This multiplexing of id/category is deliberate because of <a href="#core-principle">the core principle</a>.</p>
</blockquote>
<ol start="8">
<li>The XR Browser needs to adjust tag-scope based on the enduser&rsquo;s needs/focus (infinite tagging only makes sense when environment is scaled down significantly)</li>
<li>The XR Browser should always allow the human to view/edit the metadata, by clicking &lsquo;toggle metadata&rsquo; on the &lsquo;back&rsquo; (contextmenu e.g.) of any XR text, anywhere anytime.</li>
</ol>
<blockquote>
<p>NOTE: infinite matches both &lsquo;house&rsquo; and &lsquo;houses&rsquo; in text, as well as spatial objects with <code>&quot;class&quot;:&quot;house&quot;</code> or name &ldquo;house&rdquo;. This multiplexing of id/category is deliberate because of <a href="#core-principle">the core principle</a>.</p>
<p>The simplicity of appending BibTeX (and leveling the metadata-playfield between humans and machines) is also demonstrated by <a href="https://visual-meta.info">visual-meta</a> in greater detail.</p>
</blockquote>
<h2 id="default-data-uri-mimetype">Default Data URI mimetype</h2>
@ -555,8 +569,8 @@ The simplicity of appending BibTeX &lsquo;tags&rsquo; (humans first, machines la
+--------------------------------------------------------------+
</code></pre>
<p>The enduser will only see <code>welcome human</code> and <code>Hello friends</code> rendered spatially.
The beauty is that text (AND visual-meta) in Data URI promotes rich copy-paste.
<p>The enduser will only see <code>welcome human</code> and <code>Hello friends</code> rendered spatially (see mimetype).
The beauty is that text in Data URI automatically promotes rich copy-paste (retaining metadata).
In both cases, the text gets rendered immediately (onto a plane geometry, hence the name &lsquo;_canvas&rsquo;).
The XR Fragment-compatible browser can let the enduser access visual-meta(data)-fields after interacting with the object (contextmenu e.g.).</p>
@ -564,31 +578,6 @@ The XR Fragment-compatible browser can let the enduser access visual-meta(data)-
<p>additional tagging using <a href="https://github.com/coderofsalvation/hashtagbibs">bibs</a>: to tag spatial object <code>note_canvas</code> with &lsquo;todo&rsquo;, the enduser can type or speak <code>@note_canvas@todo</code></p>
</blockquote>
<p>The mapping between 3D objects and text (src-data) is simple (the :</p>
<p>Example:</p>
<pre><code> +------------------------------------------------+
| |
| index.gltf |
| │ |
| └── ◻ rentalhouse |
| └ class: house &lt;----------------- matches -------+
| └ ◻ note | |
| └ src:`data: todo: call owner | hashtagbib |
| #owner@house@todo | ----&gt; expands to @house{owner,
| | bibtex: }
| ` | @contact{
+------------------------------------------------+ }
</code></pre>
<p>Bi-directional mapping between 3D object names and/or classnames and text using bibs,BibTags &amp; XR Fragments, allows for rich interlinking between text and 3D objects:</p>
<ol>
<li>When the user surfs to https://&hellip;/index.gltf#rentalhouse the XR Fragments-parser points the enduser to the rentalhouse object, and can show contextual info about it.</li>
<li>When (partial) remote content is embedded thru XR Fragment queries (see XR Fragment queries), indirectly related metadata can be embedded along.</li>
</ol>
<h2 id="bibs-bibtex-lowest-common-denominator-for-linking-data">Bibs &amp; BibTeX: lowest common denominator for linking data</h2>
<blockquote>
@ -747,14 +736,14 @@ In that sense, it&rsquo;s one step up from the <code>.ini</code> fileformat (whi
<p>To keep XR Fragments a lightweight spec, BibTeX is used for rudimentary text/spatial tagging (not JSON, RDF or a scripting language because they&rsquo;re harder to write/speak/repair).</p>
</blockquote>
<p>Applications are also free to attach any JSON(LD / RDF) to spatial objects using custom properties (but is not interpreted by this spec).</p>
<p>Of course, at the application level JSON(LD / RDF) can still be used at will, by embedding RDF-urls/data as custom properties (though these are not interpreted by this spec).</p>
<h2 id="xr-text-example-parser">XR Text example parser</h2>
<ol>
<li>The XR Fragments spec does not aim to harden the BiBTeX format</li>
<li>respect multi-line BibTex values because of <a href="#core-principle">the core principle</a></li>
<li>Expand hashtag(bibs) and rulers (like <code>${visual-meta-start}</code>) according to the <a href="https://github.com/coderofsalvation/hashtagbibs">hashtagbibs spec</a></li>
<li>Respect hashtag(bibs) and rulers (like <code>${visual-meta-start}</code>) according to the <a href="https://github.com/coderofsalvation/hashtagbibs">hashtagbibs spec</a></li>
<li>BibTeX snippets should always start at the beginning of a line (regex: ^@), hence mimetype <code>text/plain;charset=utf-8;bib=^@</code></li>
</ol>

View File

@ -180,7 +180,7 @@ In case of `buttonA` the end-user will be teleported to another location and tim
# Embedding 3D content
Here's an ascii representation of a 3D scene-graph with 3D objects `◻` which embeds remote & local 3D objects `◻` (without) using queries:
Here's an ascii representation of a 3D scene-graph with 3D objects `◻` which embeds remote & local 3D objects `◻` with or without using queries:
```
+--------------------------------------------------------+ +-------------------------+
@ -295,47 +295,59 @@ Ideally metadata must come **with** text, but not **obfuscate** the text, or **i
This way:
1. XR Fragments allows <b id="tagging-text">hasslefree XR text tagging</b>, using BibTeX metadata **at the end of content** (like [visual-meta](https://visual.meta.info)).
1. XR Fragments allows hasslefree <a href="#textual-tag">textual tagging</a>, <a href="#spatial-tag">spatial tagging</a>, and <a href="#supra-tagging">supra tagging</a>, by mapping 3D/text object (class)names using BibTeX 'tags'
1. Bibs/BibTeX-appendices is first-choice **requestless metadata**-layer for XR text, HTML/RDF/JSON is great (but fits better in the application-layer)
1. Default font (unless specified otherwise) is a modern monospace font, for maximized tabular expressiveness (see [the core principle](#core-principle)).
1. anti-pattern: hardcoupling a mandatory **obtrusive markuplanguage** or framework with an XR browsers (HTML/VRML/Javascript) (see [the core principle](#core-principle))
1. anti-pattern: limiting human introspection, by immediately funneling human thought into typesafe, precise, pre-categorized metadata like RDF (see [the core principle](#core-principle))
1. XR Fragments allows <b id="tagging-text">hasslefree spatial tagging</b>, by detecting BibTeX metadata **at the end of text content** (see default mimetype & Data URI)
2. XR Fragments allows <b id="tagging-objects">hasslefree spatial tagging</b>, by treating 3D object name/class-pairs as BibTeX tags.
3. XR Fragments allows hasslefree <a href="#textual-tag">textual tagging</a>, <a href="#spatial-tag">spatial tagging</a>, and <a href="#supra-tagging">supra tagging</a>, by mapping 3D/text object (class)names using BibTeX 'tags'
4. BibTex & Hashtagbibs are the first-choice **requestless metadata**-layer for XR text, HTML/RDF/JSON is great (but fits better in the application-layer)
5. Default font (unless specified otherwise) is a modern monospace font, for maximized tabular expressiveness (see [the core principle](#core-principle)).
6. anti-pattern: hardcoupling a mandatory **obtrusive markuplanguage** or framework with an XR browser (HTML/VRML/Javascript) (see [the core principle](#core-principle))
7. anti-pattern: limiting human introspection, by immediately funneling human thought into typesafe, precise, pre-categorized metadata like RDF (see [the core principle](#core-principle))
This allows recursive connections between text itself, as well as 3D objects and vice versa, using **BibTags** :
```
+---------------------------------------------+ +------------------+
| My Notes | | / \ |
| | | / \ |
| The houses here are built in baroque style. | | /house\ |
| | | |_____| |
| | +---------|--------+
| @house{houses, >----'house'--------| class/name match?
| url = {#.house} >----'houses'-------` class/name match?
| } |
+---------------------------------------------+
```
http://y.io/z.fbx | (Evaluated) BibTex/ 'wires' / tags |
----------------------------------------------------------------------------+-------------------------------------
| @house{castle,
+-[src: data:.....]----------------------+ +-[3D mesh]-+ | url = {https://y.io/z.fbx#castle}
| My Notes | | / \ | | }
| | | / \ | | @baroque{castle,
| The houses are built in baroque style. | | / \ | | url = {https://y.io/z.fbx#castle}
| | | |_____| | | }
| @house{baroque, | +-----│-----+ | @house{baroque,
| description = {classic} | ├─ name: castle | description = {classic}
| } | └─ class: house baroque | }
+----------------------------------------+ | @house{contactowner,
| }
+-[remotestorage.io / localstorage]------+ | @todo{contactowner,
| #contactowner@todo@house | | }
| ... | |
+----------------------------------------+ |
```
> The enduser can add connections by speaking/typing/scanning [hashtagbibs](https://github.com/coderofsalvation/hashtagbibs) which the XR Browser can expand to (hidden) BibTags.
BibTex (generated from 3D objects) can be extended by the enduser with personal BibTex or [hashtagbibs](https://github.com/coderofsalvation/hashtagbibs).
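Below is a minimal, non-normative sketch of how such BibTex could be generated from a 3D node, following the `castle` example in the diagram above: every class becomes a BibTag category, keyed by the object name. The node shape `{ name, classes }` is an assumption for illustration, not part of the spec.

```
// sketch only: derive BibTags from a 3D node's name + class list (assumed node shape)
function bibsFromNode(node, url) {
  // one BibTag per class, keyed by the object name, pointing back at the object
  return node.classes.map( (cls) => `@${cls}{${node.name},\n  url = {${url}#${node.name}}\n}` )
}

// the 'castle' mesh with classes 'house baroque' from the diagram above
console.log( bibsFromNode({ name: 'castle', classes: ['house','baroque'] }, 'https://y.io/z.fbx').join('\n') )
// @house{castle,
//   url = {https://y.io/z.fbx#castle}
// }
// @baroque{castle,
//   url = {https://y.io/z.fbx#castle}
// }
```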
This allows instant realtime tagging of objects at various scopes:
> [hashtagbibs](https://github.com/coderofsalvation/hashtagbibs) allows the enduser to add 'postit' connections (compressed BibTex) by speaking/typing/scanning text, which the XR Browser saves to remotestorage (or localStorage per toplevel URL), as well as referencing BibTags per URI later on (e.g. `https://y.io/z.fbx#@baroque@todo`).
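A rough sketch of such a hashtagbib expansion, based on the `#contactowner@todo@house` example in the diagram above (the exact grammar is defined by the hashtagbibs spec; this only illustrates the idea):

```
// sketch: '#key@tag1@tag2' expands to one (hidden) BibTag per trailing tag
function expandHashtagbib(bib) {
  const [key, ...tags] = bib.replace(/^#/, '').split('@')
  return tags.map( (t) => `@${t}{${key},\n}` )
}

console.log( expandHashtagbib('#contactowner@todo@house') )
// [ '@todo{contactowner,\n}', '@house{contactowner,\n}' ]
```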
Obviously, expressing the relationships above in XML/JSON instead of BibTeX, would cause instant cognitive overload.<br>
This allows instant realtime filtering of relationships at various levels:
| scope | matching algo |
|---------------------------------------|------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| <b id="textual-tagging">textual</b> | text containing 'houses' is now automatically tagged with 'house' (incl. plaintext `src` child nodes) |
| <b id="spatial-tagging">spatial</b> | spatial object(s) with `"class":"house"` (because of `{#.house}`) are now automatically tagged with 'house' (incl. child nodes) |
| <b id="supra-tagging">supra</b> | text- or spatial-object(s) (non-descendant nodes) elsewhere, named 'house', are automatically tagged with 'house' (current node to root node) |
| <b id="omni-tagging">omni</b> | text- or spatial-object(s) (non-descendant nodes) elsewhere, containing class/name 'house', are automatically tagged with 'house' (too node to all nodes) |
| <b id="infinite-tagging">infinite</b> | text- or spatial-object(s) (non-descendant nodes) elsewhere, containing class/name 'house' or 'houses', are automatically tagged with 'house' (too node to all nodes) |
| <b id="textual-tagging">textual</b> | text containing 'baroque' is now automatically tagged with 'house' (incl. plaintext `src` child nodes) |
| <b id="spatial-tagging">spatial</b> | spatial object(s) with name `baroque` or `"class":"house"` are now automatically tagged with 'house' (incl. child nodes) |
| <b id="supra-tagging">supra</b> | text- or spatial-object(s) (non-descendant nodes) elsewhere, (class)named 'baroque' or 'house', are automatically tagged with 'house' (current node to root nodes) |
| <b id="omni-tagging">omni</b> | text- or spatial-object(s) (non-descendant nodes) elsewhere, (class)named 'baroque' or 'house', are automatically tagged with 'house' (too node to all nodes) |
| <b id="infinite-tagging">infinite</b> | text- or spatial-object(s) (non-descendant nodes) elsewhere, (class)named 'baroque' or 'house', are automatically tagged with 'house' (too node to all nodes) |
This empowers the enduser spatial expressiveness (see [the core principle](#core-principle)): spatial wires can be rendered, words can be highlighted, spatial objects can be highlighted/moved/scaled, links can be manipulated by the user.<br>
The simplicity of appending BibTeX 'tags' (humans first, machines later) is also demonstrated by [visual-meta](https://visual-meta.info) in greater detail.
BibTex allows the enduser to adjust different levels of associations (see [the core principle](#core-principle)): spatial wires can be rendered, words can be highlighted, spatial objects can be highlighted/moved/scaled, links can be manipulated by the user.<br>
1. The XR Browser needs to adjust tag-scope based on the endusers needs/focus (infinite tagging only makes sense when environment is scaled down significantly)
1. The XR Browser should always allow the human to view/edit the metadata, by clicking 'toggle metadata' on the 'back' (contextmenu e.g.) of any XR text, anywhere anytime.
> NOTE: infinite matches both 'baroque' and 'style'-occurrences in text, as well as spatial objects with `"class":"style"` or name "baroque". This multiplexing of id/category is deliberate because of [the core principle](#core-principle).
> NOTE: infinite matches both 'house' and 'houses' in text, as well as spatial objects with `"class":"house"` or name "house". This multiplexing of id/category is deliberate because of [the core principle](#core-principle).
8. The XR Browser needs to adjust tag-scope based on the enduser's needs/focus (infinite tagging only makes sense when environment is scaled down significantly)
9. The XR Browser should always allow the human to view/edit the metadata, by clicking 'toggle metadata' on the 'back' (contextmenu e.g.) of any XR text, anywhere anytime.
> The simplicity of appending BibTeX (and leveling the metadata-playfield between humans and machines) is also demonstrated by [visual-meta](https://visual-meta.info) in greater detail.
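As an illustration of the 'spatial wires / highlights' mentioned above, here is a non-normative sketch of collecting every node related to a BibTag, so an XR Browser could draw wires or highlights between them. The node shape `{ name, classes, text, children }` and the flat name/class/text match are assumptions; the per-scope rules from the table are deliberately not reproduced here:

```
// sketch: gather all nodes related to a tag so the browser can draw wires/highlights
function collect(node, tag, out = []) {
  const inText  = (node.text || '').includes(tag)                           // textual match
  const inNames = node.name === tag || (node.classes || []).includes(tag)   // spatial match
  if (inText || inNames) out.push(node)
  for (const child of (node.children || [])) collect(child, tag, out)
  return out
}

// e.g. highlight or wire up everything related to 'house':
// collect(sceneRoot, 'house').forEach( highlightOrDrawWire )
```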
## Default Data URI mimetype
@ -388,37 +400,13 @@ For all other purposes, regular mimetypes can be used (but are not required by t
+--------------------------------------------------------------+
```
The enduser will only see `welcome human` and `Hello friends` rendered spatially.
The beauty is that text (AND visual-meta) in Data URI promotes rich copy-paste.
The enduser will only see `welcome human` and `Hello friends` rendered spatially (see mimetype).
The beauty is that text in Data URI automatically promotes rich copy-paste (retaining metadata).
In both cases, the text gets rendered immediately (onto a plane geometry, hence the name '_canvas').
The XR Fragment-compatible browser can let the enduser access visual-meta(data)-fields after interacting with the object (contextmenu e.g.).
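For illustration only, a minimal sketch of what an XR Fragment-compatible browser might do with such a `src` Data URI before drawing it onto the `_canvas` plane (the variable names are hypothetical):

```
// sketch: pull the plain text (incl. trailing BibTags) out of a text/plain Data URI
const src     = 'data:text/plain;charset=utf-8;bib=^@,Hello friends.%0A%0A@friend{friends,%0A}'
const payload = decodeURIComponent( src.slice(src.indexOf(',') + 1) )
// payload now holds the text + metadata to render onto the plane geometry ('_canvas'),
// and to retain when the enduser copy-pastes the object elsewhere
```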
> additional tagging using [bibs](https://github.com/coderofsalvation/hashtagbibs): to tag spatial object `note_canvas` with 'todo', the enduser can type or speak `@note_canvas@todo`
The mapping between 3D objects and text (src-data) is simple (the :
Example:
```
+------------------------------------------------+
| |
| index.gltf |
| │ |
| └── ◻ rentalhouse |
| └ class: house <----------------- matches -------+
| └ ◻ note | |
| └ src:`data: todo: call owner | hashtagbib |
| #owner@house@todo | ----> expands to @house{owner,
| | bibtex: }
| ` | @contact{
+------------------------------------------------+ }
```
Bi-directional mapping between 3D object names and/or classnames and text using bibs,BibTags & XR Fragments, allows for rich interlinking between text and 3D objects:
1. When the user surfs to https://.../index.gltf#rentalhouse the XR Fragments-parser points the enduser to the rentalhouse object, and can show contextual info about it.
2. When (partial) remote content is embedded thru XR Fragment queries (see XR Fragment queries), indirectly related metadata can be embedded along.
## Bibs & BibTeX: lowest common denominator for linking data
> "When a car breaks down, the ones **without** turbosupercharger are easier to fix"
@ -457,14 +445,14 @@ In that sense, it's one step up from the `.ini` fileformat (which has never leak
> To keep XR Fragments a lightweight spec, BibTeX is used for rudimentary text/spatial tagging (not JSON, RDF or a scripting language because they're harder to write/speak/repair).
Applications are also free to attach any JSON(LD / RDF) to spatial objects using custom properties (but is not interpreted by this spec).
Of course, at the application level JSON(LD / RDF) can still be used at will, by embedding RDF-urls/data as custom properties (though these are not interpreted by this spec).
## XR Text example parser
1. The XR Fragments spec does not aim to harden the BiBTeX format
2. respect multi-line BibTex values because of [the core principle](#core-principle)
3. Expand hashtag(bibs) and rulers (like `${visual-meta-start}`) according to the [hashtagbibs spec](https://github.com/coderofsalvation/hashtagbibs)
3. Respect hashtag(bibs) and rulers (like `${visual-meta-start}`) according to the [hashtagbibs spec](https://github.com/coderofsalvation/hashtagbibs)
4. BibTeX snippets should always start at the beginning of a line (regex: ^@), hence mimetype `text/plain;charset=utf-8;bib=^@` (see the sketch below)
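As a minimal illustration of requirement 4 (the full (de)multiplexer follows below), the `bib=^@` hint can be read as: the metadata appendix starts at the first line beginning with `@`. This sketch is an interpretation, not normative:

```
// sketch: the metadata appendix starts at the first line beginning with '@'
function splitTextAndBibs(src) {
  const m = src.match(/^@/m)            // first BibTag at the start of a line
  if (!m) return { text: src, bibs: '' }
  return { text: src.slice(0, m.index), bibs: src.slice(m.index) }
}

const { text, bibs } = splitTextAndBibs('hello world\n\n@house{castle,\n}')
// text = 'hello world\n\n'   bibs = '@house{castle,\n}'
```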
Here's an XR Text (de)multiplexer in javascript, which ticks all the above boxes:

View File

@ -3,7 +3,7 @@
Internet Engineering Task Force L.R. van Kammen
Internet-Draft 8 September 2023
Internet-Draft 9 September 2023
Intended status: Informational
@ -40,7 +40,7 @@ Status of This Memo
time. It is inappropriate to use Internet-Drafts as reference
material or to cite them other than as "work in progress."
This Internet-Draft will expire on 11 March 2024.
This Internet-Draft will expire on 12 March 2024.
Copyright Notice
@ -53,7 +53,7 @@ Copyright Notice
van Kammen Expires 11 March 2024 [Page 1]
van Kammen Expires 12 March 2024 [Page 1]
Internet-Draft XR Fragments September 2023
@ -83,11 +83,11 @@ Table of Contents
9.3. Bibs & BibTeX: lowest common denominator for linking
data . . . . . . . . . . . . . . . . . . . . . . . . . . 13
9.4. XR Text example parser . . . . . . . . . . . . . . . . . 15
10. HYPER copy/paste . . . . . . . . . . . . . . . . . . . . . . 18
11. Security Considerations . . . . . . . . . . . . . . . . . . . 18
12. IANA Considerations . . . . . . . . . . . . . . . . . . . . . 18
13. Acknowledgments . . . . . . . . . . . . . . . . . . . . . . . 18
14. Appendix: Definitions . . . . . . . . . . . . . . . . . . . . 19
10. HYPER copy/paste . . . . . . . . . . . . . . . . . . . . . . 17
11. Security Considerations . . . . . . . . . . . . . . . . . . . 17
12. IANA Considerations . . . . . . . . . . . . . . . . . . . . . 17
13. Acknowledgments . . . . . . . . . . . . . . . . . . . . . . . 17
14. Appendix: Definitions . . . . . . . . . . . . . . . . . . . . 18
1. Introduction
@ -109,7 +109,7 @@ Table of Contents
van Kammen Expires 11 March 2024 [Page 2]
van Kammen Expires 12 March 2024 [Page 2]
Internet-Draft XR Fragments September 2023
@ -165,7 +165,7 @@ Internet-Draft XR Fragments September 2023
van Kammen Expires 11 March 2024 [Page 3]
van Kammen Expires 12 March 2024 [Page 3]
Internet-Draft XR Fragments September 2023
@ -221,7 +221,7 @@ Internet-Draft XR Fragments September 2023
van Kammen Expires 11 March 2024 [Page 4]
van Kammen Expires 12 March 2024 [Page 4]
Internet-Draft XR Fragments September 2023
@ -235,8 +235,8 @@ Internet-Draft XR Fragments September 2023
7. Embedding 3D content
Here's an ascii representation of a 3D scene-graph with 3D objects
&#9723; which embeds remote & local 3D objects &#9723; (without)
using queries:
&#9723; which embeds remote & local 3D objects &#9723; with/out using
queries:
+--------------------------------------------------------+ +-------------------------+
| | | |
@ -277,7 +277,7 @@ Internet-Draft XR Fragments September 2023
van Kammen Expires 11 March 2024 [Page 5]
van Kammen Expires 12 March 2024 [Page 5]
Internet-Draft XR Fragments September 2023
@ -333,7 +333,7 @@ Internet-Draft XR Fragments September 2023
van Kammen Expires 11 March 2024 [Page 6]
van Kammen Expires 12 March 2024 [Page 6]
Internet-Draft XR Fragments September 2023
@ -389,7 +389,7 @@ Internet-Draft XR Fragments September 2023
van Kammen Expires 11 March 2024 [Page 7]
van Kammen Expires 12 March 2024 [Page 7]
Internet-Draft XR Fragments September 2023
@ -418,54 +418,74 @@ Internet-Draft XR Fragments September 2023
This way:
1. XR Fragments allows <b id="tagging-text">hasslefree XR text
tagging</b>, using BibTeX metadata *at the end of content* (like
visual-meta (https://visual.meta.info)).
2. XR Fragments allows hasslefree <a href="#textual-tag">textual
1. XR Fragments allows <b id="tagging-text">hasslefree spatial
tagging</b>, by detecting BibTeX metadata *at the end of content*
of text (see default mimetype & Data URI)
2. XR Fragments allows <b id="tagging-objects">hasslefree spatial
tagging</b>, by treating 3D object name/class-pairs as BibTeX
tags.
3. XR Fragments allows hasslefree <a href="#textual-tag">textual
tagging</a>, <a href="#spatial-tag">spatial tagging</a>, and <a
href="#supra-tagging">supra tagging</a>, by mapping 3D/text
object (class)names using BibTeX 'tags'
3. Bibs/BibTeX-appendices is first-choice *requestless metadata*-
4. BibTex & Hashtagbibs are the first-choice *requestless metadata*-
layer for XR text, HTML/RDF/JSON is great (but fits better in the
application-layer)
4. Default font (unless specified otherwise) is a modern monospace
5. Default font (unless specified otherwise) is a modern monospace
font, for maximized tabular expressiveness (see the core
principle (#core-principle)).
5. anti-pattern: hardcoupling a mandatory *obtrusive markuplanguage*
6. anti-pattern: hardcoupling a mandatory *obtrusive markuplanguage*
or framework with an XR browser (HTML/VRML/Javascript) (see the
core principle (#core-principle))
6. anti-pattern: limiting human introspection, by immediately
7. anti-pattern: limiting human introspection, by immediately
funneling human thought into typesafe, precise, pre-categorized
metadata like RDF (see the core principle (#core-principle))
van Kammen Expires 12 March 2024 [Page 8]
Internet-Draft XR Fragments September 2023
This allows recursive connections between text itself, as well as 3D
objects and vice versa, using *BibTags* :
http://y.io/z.fbx | (Evaluated) BibTex/ 'wires' / tags |
----------------------------------------------------------------------------+-------------------------------------
| @house{castle,
+-[src: data:.....]----------------------+ +-[3D mesh]-+ | url = {https://y.io/z.fbx#castle}
| My Notes | | / \ | | }
| | | / \ | | @baroque{castle,
| The houses are built in baroque style. | | / \ | | url = {https://y.io/z.fbx#castle}
| | | |_____| | | }
| @house{baroque, | +-----│-----+ | @house{baroque,
| description = {classic} | ├─ name: castle | description = {classic}
| } | └─ class: house baroque | }
+----------------------------------------+ | @house{contactowner,
| }
+-[remotestorage.io / localstorage]------+ | @todo{contactowner,
| #contactowner@todo@house | | }
| ... | |
+----------------------------------------+ |
BibTex (generated from 3D objects), can be extended by the enduser
with personal BiBTex or hashtagbibs
(https://github.com/coderofsalvation/hashtagbibs).
van Kammen Expires 11 March 2024 [Page 8]
Internet-Draft XR Fragments September 2023
+---------------------------------------------+ +------------------+
| My Notes | | / \ |
| | | / \ |
| The houses here are built in baroque style. | | /house\ |
| | | |_____| |
| | +---------|--------+
| @house{houses, >----'house'--------| class/name match?
| url = {#.house} >----'houses'-------` class/name match?
| } |
+---------------------------------------------+
| The enduser can add connections by speaking/typing/scanning
| hashtagbibs (https://github.com/coderofsalvation/hashtagbibs)
| which the XR Browser can expand to (hidden) BibTags.
| allows the enduser to add 'postit' connections (compressed BibTex)
| by speaking/typing/scanning text, which the XR Browser saves to
| remotestorage (or localStorage per toplevel URL). As well as,
| referencing BibTags per URI later on: https://y.io/
| z.fbx#@baroque@todo e.g.
This allows instant realtime tagging of objects at various scopes:
Obviously, expressing the relationships above in XML/JSON instead of
BibTeX would cause instant cognitive overload.
This allows instant realtime filtering of relationships at
various levels:
@ -481,98 +501,82 @@ Internet-Draft XR Fragments September 2023
van Kammen Expires 11 March 2024 [Page 9]
van Kammen Expires 12 March 2024 [Page 9]
Internet-Draft XR Fragments September 2023
+====================================+=============================+
| scope | matching algo |
+====================================+=============================+
| <b id="textual- | text containing 'houses' is |
| tagging">textual</b> | now automatically tagged |
| | with 'house' (incl. |
| | plaintext src child nodes) |
+------------------------------------+-----------------------------+
| <b id="spatial- | spatial object(s) with |
| tagging">spatial</b> | "class":"house" (because of |
| | {#.house}) are now |
| | automatically tagged with |
| | 'house' (incl. child nodes) |
+------------------------------------+-----------------------------+
| <b id="supra-tagging">supra</b> | text- or spatial-object(s) |
| | (non-descendant nodes) |
| | elsewhere, named 'house', |
| | are automatically tagged |
| | with 'house' (current node |
| | to root node) |
+------------------------------------+-----------------------------+
| <b id="omni-tagging">omni</b> | text- or spatial-object(s) |
| | (non-descendant nodes) |
| | elsewhere, containing |
| | class/name 'house', are |
| | automatically tagged with |
| | 'house' (too node to all |
| | nodes) |
+------------------------------------+-----------------------------+
| <b id="infinite- | text- or spatial-object(s) |
| tagging">infinite</b> | (non-descendant nodes) |
| | elsewhere, containing |
| | class/name 'house' or |
| | 'houses', are automatically |
| | tagged with 'house' (too |
| | node to all nodes) |
+------------------------------------+-----------------------------+
+====================================+============================+
| scope | matching algo |
+====================================+============================+
| <b id="textual- | text containing 'baroque' |
| tagging">textual</b> | is now automatically |
| | tagged with 'house' (incl. |
| | plaintext src child nodes) |
+------------------------------------+----------------------------+
| <b id="spatial- | spatial object(s) with |
| tagging">spatial</b> | name baroque or |
| | "class":"house" are now |
| | automatically tagged with |
| | 'house' (incl. child |
| | nodes) |
+------------------------------------+----------------------------+
| <b id="supra-tagging">supra</b> | text- or spatial-object(s) |
| | (non-descendant nodes) |
| | elsewhere, (class)named |
| | 'baroque' or 'house', are |
| | automatically tagged with |
| | 'house' (current node to |
| | root nodes) |
+------------------------------------+----------------------------+
| <b id="omni-tagging">omni</b> | text- or spatial-object(s) |
| | (non-descendant nodes) |
| | elsewhere, (class)named |
| | 'baroque' or 'house', are |
| | automatically tagged with |
| | 'house' (too node to all |
| | nodes) |
+------------------------------------+----------------------------+
| <b id="infinite- | text- or spatial-object(s) |
| tagging">infinite</b> | (non-descendant nodes) |
| | elsewhere, (class)named |
| | 'baroque' or 'house', are |
| | automatically tagged with |
| | 'house' (too node to all |
| | nodes) |
+------------------------------------+----------------------------+
Table 5
Table 5
This empowers the enduser spatial expressiveness (see the core
principle (#core-principle)): spatial wires can be rendered, words
can be highlighted, spatial objects can be highlighted/moved/scaled,
links can be manipulated by the user.
The simplicity of appending BibTeX 'tags' (humans first, machines
later) is also demonstrated by visual-meta (https://visual-meta.info)
in greater detail.
BibTex allows the enduser to adjust different levels of associations
(see the core principle (#core-principle)): spatial wires can be
rendered, words can be highlighted, spatial objects can be
highlighted/moved/scaled, links can be manipulated by the user.
van Kammen Expires 11 March 2024 [Page 10]
van Kammen Expires 12 March 2024 [Page 10]
Internet-Draft XR Fragments September 2023
1. The XR Browser needs to adjust tag-scope based on the endusers
| NOTE: infinite matches both 'baroque' and 'style'-occurrences in
| text, as well as spatial objects with "class":"style" or name
| "baroque". This multiplexing of id/category is deliberate because
| of the core principle (#core-principle).
8. The XR Browser needs to adjust tag-scope based on the enduser's
needs/focus (infinite tagging only makes sense when environment
is scaled down significantly)
2. The XR Browser should always allow the human to view/edit the
9. The XR Browser should always allow the human to view/edit the
metadata, by clicking 'toggle metadata' on the 'back'
(contextmenu e.g.) of any XR text, anywhere anytime.
| NOTE: infinite matches both 'house' and 'houses' in text, as well
| as spatial objects with "class":"house" or name "house". This
| multiplexing of id/category is deliberate because of the core
| principle (#core-principle).
| The simplicity of appending BibTeX (and leveling the metadata-
| playfield between humans and machines) is also demonstrated by
| visual-meta (https://visual-meta.info) in greater detail.
9.1. Default Data URI mimetype
@ -606,18 +610,18 @@ Internet-Draft XR Fragments September 2023
* out-of-the-box (de)multiplex human text and metadata in one go
(see the core principle (#core-principle))
* no network-overhead for metadata (see the core principle (#core-
principle))
* ensuring high FPS: HTML/RDF historically is too 'requesty'/'parsy'
for game studios
van Kammen Expires 11 March 2024 [Page 11]
van Kammen Expires 12 March 2024 [Page 11]
Internet-Draft XR Fragments September 2023
* no network-overhead for metadata (see the core principle (#core-
principle))
* ensuring high FPS: HTML/RDF historically is too 'requesty'/'parsy'
for game studios
* rich send/receive/copy-paste everywhere by default, metadata being
retained (see the core principle (#core-principle))
* netto result: less webservices, therefore less servers, and
@ -647,57 +651,29 @@ Internet-Draft XR Fragments September 2023
+--------------------------------------------------------------+
The enduser will only see welcome human and Hello friends rendered
spatially. The beauty is that text (AND visual-meta) in Data URI
promotes rich copy-paste. In both cases, the text gets rendered
immediately (onto a plane geometry, hence the name '_canvas'). The
XR Fragment-compatible browser can let the enduser access visual-
meta(data)-fields after interacting with the object (contextmenu
e.g.).
spatially (see mimetype). The beauty is that text in Data URI
automatically promotes rich copy-paste (retaining metadata). In both
cases, the text gets rendered immediately (onto a plane geometry,
hence the name '_canvas'). The XR Fragment-compatible browser can
let the enduser access visual-meta(data)-fields after interacting
with the object (contextmenu e.g.).
| additional tagging using bibs
| (https://github.com/coderofsalvation/hashtagbibs): to tag spatial
| object note_canvas with 'todo', the enduser can type or speak
| @note_canvas@todo
The mapping between 3D objects and text (src-data) is simple (the :
Example:
van Kammen Expires 11 March 2024 [Page 12]
van Kammen Expires 12 March 2024 [Page 12]
Internet-Draft XR Fragments September 2023
+------------------------------------------------+
| |
| index.gltf |
| │ |
| └── ◻ rentalhouse |
| └ class: house <----------------- matches -------+
| └ ◻ note | |
| └ src:`data: todo: call owner | hashtagbib |
| #owner@house@todo | ----> expands to @house{owner,
| | bibtex: }
| ` | @contact{
+------------------------------------------------+ }
Bi-directional mapping between 3D object names and/or classnames and
text using bibs,BibTags & XR Fragments, allows for rich interlinking
between text and 3D objects:
1. When the user surfs to https://.../index.gltf#rentalhouse the XR
Fragments-parser points the enduser to the rentalhouse object,
and can show contextual info about it.
2. When (partial) remote content is embedded thru XR Fragment
queries (see XR Fragment queries), indirectly related metadata
can be embedded along.
9.3. Bibs & BibTeX: lowest common denominator for linking data
| "When a car breaks down, the ones *without* turbosupercharger are
@ -717,19 +693,6 @@ Internet-Draft XR Fragments September 2023
2. an introspective 'sketchpad' for metadata, which can (optionally)
mature into RDF later
van Kammen Expires 11 March 2024 [Page 13]
Internet-Draft XR Fragments September 2023
+================+=====================================+===============+
|characteristic |UTF8 Plain Text (with BibTeX) |RDF |
+================+=====================================+===============+
@ -759,6 +722,14 @@ Internet-Draft XR Fragments September 2023
|content+metadata| | |
+----------------+-------------------------------------+---------------+
|easy to write/ |yes |depends |
van Kammen Expires 12 March 2024 [Page 13]
Internet-Draft XR Fragments September 2023
|repair for | | |
|layman | | |
+----------------+-------------------------------------+---------------+
@ -778,14 +749,6 @@ Internet-Draft XR Fragments September 2023
|preserves | |application |
|metadata | | |
+----------------+-------------------------------------+---------------+
van Kammen Expires 11 March 2024 [Page 14]
Internet-Draft XR Fragments September 2023
|emoji |yes |depends on |
| | |encoding |
+----------------+-------------------------------------+---------------+
@ -812,16 +775,23 @@ Internet-Draft XR Fragments September 2023
| rudimentary text/spatial tagging (not JSON, RDF or a scripting
| language because they're harder to write/speak/repair.).
Applications are also free to attach any JSON(LD / RDF) to spatial
objects using custom properties (but is not interpreted by this
spec).
Of course, on an application-level JSON(LD / RDF) can still be used
at will, by embedding RDF-urls/data as custom properties (but is not
interpreted by this spec).
van Kammen Expires 12 March 2024 [Page 14]
Internet-Draft XR Fragments September 2023
9.4. XR Text example parser
1. The XR Fragments spec does not aim to harden the BiBTeX format
2. respect multi-line BibTex values because of the core principle
(#core-principle)
3. Expand hashtag(bibs) and rulers (like ${visual-meta-start})
3. Respect hashtag(bibs) and rulers (like ${visual-meta-start})
according to the hashtagbibs spec
(https://github.com/coderofsalvation/hashtagbibs)
4. BibTeX snippets should always start in the beginning of a line
@ -830,18 +800,6 @@ Internet-Draft XR Fragments September 2023
Here's an XR Text (de)multiplexer in javascript, which ticks all the
above boxes:
van Kammen Expires 11 March 2024 [Page 15]
Internet-Draft XR Fragments September 2023
xrtext = {
expandBibs: (text) => {
@ -876,6 +834,14 @@ xrtext = {
t.split( pat[2] )
.map( kv => {
if( !(kv = kv.trim()) || kv == "}" ) return
van Kammen Expires 12 March 2024 [Page 15]
Internet-Draft XR Fragments September 2023
v[ kv.match(/\s?(\S+)\s?=/)[1] ] = kv.substr( kv.indexOf("{")+1 )
})
tags.push( { k:tag, v } )
@ -890,14 +856,6 @@ xrtext = {
let item = tags[i]
if( item.ruler ){
str += `@${item.ruler}\n`
van Kammen Expires 11 March 2024 [Page 16]
Internet-Draft XR Fragments September 2023
continue;
}
str += `@${item.k}\n`
@ -932,28 +890,16 @@ tags.find( (t) => t.k == 'flap{' ).v.asdf = 1 // edit tag
tags.push({ k:'bar{', v:{abc:123} }) // add tag
console.log( xrtext.encode(text,tags) ) // multiplex text & bibtex back together
This expands to the following (hidden by default) BibTex appendix:
van Kammen Expires 11 March 2024 [Page 17]
van Kammen Expires 12 March 2024 [Page 16]
Internet-Draft XR Fragments September 2023
This expands to the following (hidden by default) BibTex appendix:
hello world
here are some hashtagbibs followed by bibtex:
@ -1000,16 +946,16 @@ Internet-Draft XR Fragments September 2023
13. Acknowledgments
* NLNET (https://nlnet.nl)
* Future of Text (https://futureoftext.org)
van Kammen Expires 11 March 2024 [Page 18]
van Kammen Expires 12 March 2024 [Page 17]
Internet-Draft XR Fragments September 2023
* NLNET (https://nlnet.nl)
* Future of Text (https://futureoftext.org)
* visual-meta.info (https://visual-meta.info)
14. Appendix: Definitions
@ -1056,16 +1002,16 @@ Internet-Draft XR Fragments September 2023
| | possible |
+---------------+----------------------------------------------+
| introspective | inward sensemaking ("I feel this belongs to |
| | that") |
+---------------+----------------------------------------------+
van Kammen Expires 11 March 2024 [Page 19]
van Kammen Expires 12 March 2024 [Page 18]
Internet-Draft XR Fragments September 2023
| | that") |
+---------------+----------------------------------------------+
| extrospective | outward sensemaking ("I'm fairly sure John |
| | is a person who lives in oklahoma") |
+---------------+----------------------------------------------+
@ -1115,6 +1061,4 @@ Internet-Draft XR Fragments September 2023
van Kammen Expires 11 March 2024 [Page 20]
van Kammen Expires 12 March 2024 [Page 19]

View File

@ -167,7 +167,7 @@ In case of <tt>buttonA</tt> the end-user will be teleported to another location
</section>
<section anchor="embedding-3d-content"><name>Embedding 3D content</name>
<t>Here's an ascii representation of a 3D scene-graph with 3D objects <tt></tt> which embeds remote &amp; local 3D objects <tt></tt> (without) using queries:</t>
<t>Here's an ascii representation of a 3D scene-graph with 3D objects <tt></tt> which embeds remote &amp; local 3D objects <tt></tt> with or without using queries:</t>
<artwork> +--------------------------------------------------------+ +-------------------------+
| | | |
@ -335,28 +335,39 @@ Ideally metadata must come <strong>with</strong> text, but not <strong>obfuscate
<t>This way:</t>
<ol spacing="compact">
<li>XR Fragments allows &lt;b id=&quot;tagging-text&quot;&gt;hasslefree XR text tagging&lt;/b&gt;, using BibTeX metadata <strong>at the end of content</strong> (like <eref target="https://visual.meta.info">visual-meta</eref>).</li>
<li>XR Fragments allows &lt;b id=&quot;tagging-text&quot;&gt;hasslefree spatial tagging&lt;/b&gt;, by detecting BibTeX metadata <strong>at the end of text content</strong> (see default mimetype &amp; Data URI)</li>
<li>XR Fragments allows &lt;b id=&quot;tagging-objects&quot;&gt;hasslefree spatial tagging&lt;/b&gt;, by treating 3D object name/class-pairs as BibTeX tags.</li>
<li>XR Fragments allows hasslefree &lt;a href=&quot;#textual-tag&quot;&gt;textual tagging&lt;/a&gt;, &lt;a href=&quot;#spatial-tag&quot;&gt;spatial tagging&lt;/a&gt;, and &lt;a href=&quot;#supra-tagging&quot;&gt;supra tagging&lt;/a&gt;, by mapping 3D/text object (class)names using BibTeX 'tags'</li>
<li>Bibs/BibTeX-appendices is first-choice <strong>requestless metadata</strong>-layer for XR text, HTML/RDF/JSON is great (but fits better in the application-layer)</li>
<li>BibTex &amp; Hashtagbibs are the first-choice <strong>requestless metadata</strong>-layer for XR text, HTML/RDF/JSON is great (but fits better in the application-layer)</li>
<li>Default font (unless specified otherwise) is a modern monospace font, for maximized tabular expressiveness (see <eref target="#core-principle">the core principle</eref>).</li>
<li>anti-pattern: hardcoupling a mandatory <strong>obtrusive markuplanguage</strong> or framework with an XR browser (HTML/VRML/Javascript) (see <eref target="#core-principle">the core principle</eref>)</li>
<li>anti-pattern: limiting human introspection, by immediately funneling human thought into typesafe, precise, pre-categorized metadata like RDF (see <eref target="#core-principle">the core principle</eref>)</li>
</ol>
<t>This allows recursive connections between text itself, as well as 3D objects and vice versa, using <strong>BibTags</strong> :</t>
<artwork> +---------------------------------------------+ +------------------+
| My Notes | | / \ |
| | | / \ |
| The houses here are built in baroque style. | | /house\ |
| | | |_____| |
| | +---------|--------+
| @house{houses, &gt;----'house'--------| class/name match?
| url = {#.house} &gt;----'houses'-------` class/name match?
| } |
+---------------------------------------------+
<artwork> http://y.io/z.fbx | (Evaluated) BibTex/ 'wires' / tags |
----------------------------------------------------------------------------+-------------------------------------
| @house{castle,
+-[src: data:.....]----------------------+ +-[3D mesh]-+ | url = {https://y.io/z.fbx#castle}
| My Notes | | / \ | | }
| | | / \ | | @baroque{castle,
| The houses are built in baroque style. | | / \ | | url = {https://y.io/z.fbx#castle}
| | | |_____| | | }
| @house{baroque, | +-----│-----+ | @house{baroque,
| description = {classic} | ├─ name: castle | description = {classic}
| } | └─ class: house baroque | }
+----------------------------------------+ | @house{contactowner,
| }
+-[remotestorage.io / localstorage]------+ | @todo{contactowner,
| #contactowner@todo@house | | }
| ... | |
+----------------------------------------+ |
</artwork>
<blockquote><t>The enduser can add connections by speaking/typing/scanning <eref target="https://github.com/coderofsalvation/hashtagbibs">hashtagbibs</eref> which the XR Browser can expand to (hidden) BibTags.</t>
</blockquote><t>This allows instant realtime tagging of objects at various scopes:</t>
<t>BibTex (generated from 3D objects) can be extended by the enduser with personal BibTex or <eref target="https://github.com/coderofsalvation/hashtagbibs">hashtagbibs</eref>.</t>
<blockquote><t><eref target="https://github.com/coderofsalvation/hashtagbibs">hashtagbibs</eref> allows the enduser to add 'postit' connections (compressed BibTex) by speaking/typing/scanning text, which the XR Browser saves to remotestorage (or localStorage per toplevel URL), as well as referencing BibTags per URI later on (e.g. <tt>https://y.io/z.fbx#@baroque@todo</tt>).</t>
</blockquote><t>Obviously, expressing the relationships above in XML/JSON instead of BibTeX would cause instant cognitive overload.<br />
This allows instant realtime filtering of relationships at various levels:</t>
<table>
<thead>
<tr>
@ -368,38 +379,38 @@ Ideally metadata must come <strong>with</strong> text, but not <strong>obfuscate
<tbody>
<tr>
<td>&lt;b id=&quot;textual-tagging&quot;&gt;textual&lt;/b&gt;</td>
<td>text containing 'houses' is now automatically tagged with 'house' (incl. plaintext <tt>src</tt> child nodes)</td>
<td>text containing 'baroque' is now automatically tagged with 'house' (incl. plaintext <tt>src</tt> child nodes)</td>
</tr>
<tr>
<td>&lt;b id=&quot;spatial-tagging&quot;&gt;spatial&lt;/b&gt;</td>
<td>spatial object(s) with <tt>&quot;class&quot;:&quot;house&quot;</tt> (because of <tt>{#.house}</tt>) are now automatically tagged with 'house' (incl. child nodes)</td>
<td>spatial object(s) with name <tt>baroque</tt> or <tt>&quot;class&quot;:&quot;house&quot;</tt> are now automatically tagged with 'house' (incl. child nodes)</td>
</tr>
<tr>
<td>&lt;b id=&quot;supra-tagging&quot;&gt;supra&lt;/b&gt;</td>
<td>text- or spatial-object(s) (non-descendant nodes) elsewhere, named 'house', are automatically tagged with 'house' (current node to root node)</td>
<td>text- or spatial-object(s) (non-descendant nodes) elsewhere, (class)named 'baroque' or 'house', are automatically tagged with 'house' (current node to root nodes)</td>
</tr>
<tr>
<td>&lt;b id=&quot;omni-tagging&quot;&gt;omni&lt;/b&gt;</td>
<td>text- or spatial-object(s) (non-descendant nodes) elsewhere, containing class/name 'house', are automatically tagged with 'house' (too node to all nodes)</td>
<td>text- or spatial-object(s) (non-descendant nodes) elsewhere, (class)named 'baroque' or 'house', are automatically tagged with 'house' (top node to all nodes)</td>
</tr>
<tr>
<td>&lt;b id=&quot;infinite-tagging&quot;&gt;infinite&lt;/b&gt;</td>
<td>text- or spatial-object(s) (non-descendant nodes) elsewhere, containing class/name 'house' or 'houses', are automatically tagged with 'house' (too node to all nodes)</td>
<td>text- or spatial-object(s) (non-descendant nodes) elsewhere, (class)named 'baroque' or 'house', are automatically tagged with 'house' (top node to all nodes)</td>
</tr>
</tbody>
</table><t>This empowers the enduser spatial expressiveness (see <eref target="#core-principle">the core principle</eref>): spatial wires can be rendered, words can be highlighted, spatial objects can be highlighted/moved/scaled, links can be manipulated by the user.<br />
The simplicity of appending BibTeX 'tags' (humans first, machines later) is also demonstrated by <eref target="https://visual-meta.info">visual-meta</eref> in greater detail.</t>
<ol spacing="compact">
</table><t>BibTex allows the enduser to adjust different levels of associations (see <eref target="#core-principle">the core principle</eref>): spatial wires can be rendered, words can be highlighted, spatial objects can be highlighted/moved/scaled, links can be manipulated by the user.<br />
</t>
<blockquote><t>NOTE: infinite matches both 'baroque' and 'style'-occurrences in text, as well as spatial objects with <tt>&quot;class&quot;:&quot;style&quot;</tt> or name &quot;baroque&quot;. This multiplexing of id/category is deliberate because of <eref target="#core-principle">the core principle</eref>.</t>
</blockquote>
<ol spacing="compact" start="8">
<li>The XR Browser needs to adjust tag-scope based on the enduser's needs/focus (infinite tagging only makes sense when environment is scaled down significantly)</li>
<li>The XR Browser should always allow the human to view/edit the metadata, by clicking 'toggle metadata' on the 'back' (contextmenu e.g.) of any XR text, anywhere anytime.</li>
</ol>
<blockquote><t>NOTE: infinite matches both 'house' and 'houses' in text, as well as spatial objects with <tt>&quot;class&quot;:&quot;house&quot;</tt> or name &quot;house&quot;. This multiplexing of id/category is deliberate because of <eref target="#core-principle">the core principle</eref>.</t>
<blockquote><t>The simplicity of appending BibTeX (and leveling the metadata-playfield between humans and machines) is also demonstrated by <eref target="https://visual-meta.info">visual-meta</eref> in greater detail.</t>
</blockquote>
<section anchor="default-data-uri-mimetype"><name>Default Data URI mimetype</name>
<t>The <tt>src</tt>-values work as expected (respecting mime-types), however:</t>
@ -446,34 +457,12 @@ The simplicity of appending BibTeX 'tags' (humans first, machines later) is also
| |
+--------------------------------------------------------------+
</artwork>
<t>The enduser will only see <tt>welcome human</tt> and <tt>Hello friends</tt> rendered spatially.
The beauty is that text (AND visual-meta) in Data URI promotes rich copy-paste.
<t>The enduser will only see <tt>welcome human</tt> and <tt>Hello friends</tt> rendered spatially (see mimetype).
The beauty is that text in Data URI automatically promotes rich copy-paste (retaining metadata).
In both cases, the text gets rendered immediately (onto a plane geometry, hence the name '_canvas').
The XR Fragment-compatible browser can let the enduser access visual-meta(data)-fields after interacting with the object (contextmenu e.g.).</t>
<blockquote><t>additional tagging using <eref target="https://github.com/coderofsalvation/hashtagbibs">bibs</eref>: to tag spatial object <tt>note_canvas</tt> with 'todo', the enduser can type or speak <tt>@note_canvas@todo</tt></t>
</blockquote><t>The mapping between 3D objects and text (src-data) is simple (the :</t>
<t>Example:</t>
<artwork> +------------------------------------------------+
| |
| index.gltf |
| │ |
| └── ◻ rentalhouse |
| └ class: house &lt;----------------- matches -------+
| └ ◻ note | |
| └ src:`data: todo: call owner | hashtagbib |
| #owner@house@todo | ----&gt; expands to @house{owner,
| | bibtex: }
| ` | @contact{
+------------------------------------------------+ }
</artwork>
<t>Bi-directional mapping between 3D object names and/or classnames and text using bibs,BibTags &amp; XR Fragments, allows for rich interlinking between text and 3D objects:</t>
<ol spacing="compact">
<li>When the user surfs to https://.../index.gltf#rentalhouse the XR Fragments-parser points the enduser to the rentalhouse object, and can show contextual info about it.</li>
<li>When (partial) remote content is embedded thru XR Fragment queries (see XR Fragment queries), indirectly related metadata can be embedded along.</li>
</ol>
</section>
</blockquote></section>
<section anchor="bibs-bibtex-lowest-common-denominator-for-linking-data"><name>Bibs &amp; BibTeX: lowest common denominator for linking data</name>
<blockquote><t>&quot;When a car breaks down, the ones <strong>without</strong> turbosupercharger are easier to fix&quot;</t>
@ -626,7 +615,7 @@ In that sense, it's one step up from the <tt>.ini</tt> fileformat (which has nev
</tr>
</tbody>
</table><blockquote><t>To keep XR Fragments a lightweight spec, BibTeX is used for rudimentary text/spatial tagging (not JSON, RDF or a scripting language because they're harder to write/speak/repair).</t>
</blockquote><t>Applications are also free to attach any JSON(LD / RDF) to spatial objects using custom properties (but is not interpreted by this spec).</t>
</blockquote><t>Of course, at the application level JSON(LD / RDF) can still be used at will, by embedding RDF-urls/data as custom properties (though these are not interpreted by this spec).</t>
</section>
<section anchor="xr-text-example-parser"><name>XR Text example parser</name>
@ -634,7 +623,7 @@ In that sense, it's one step up from the <tt>.ini</tt> fileformat (which has nev
<ol spacing="compact">
<li>The XR Fragments spec does not aim to harden the BiBTeX format</li>
<li>respect multi-line BibTex values because of <eref target="#core-principle">the core principle</eref></li>
<li>Expand hashtag(bibs) and rulers (like <tt>${visual-meta-start}</tt>) according to the <eref target="https://github.com/coderofsalvation/hashtagbibs">hashtagbibs spec</eref></li>
<li>Respect hashtag(bibs) and rulers (like <tt>${visual-meta-start}</tt>) according to the <eref target="https://github.com/coderofsalvation/hashtagbibs">hashtagbibs spec</eref></li>
<li>BibTeX snippets should always start at the beginning of a line (regex: ^@), hence mimetype <tt>text/plain;charset=utf-8;bib=^@</tt></li>
</ol>
<t>Here's an XR Text (de)multiplexer in javascript, which ticks all the above boxes:</t>