<pre class="metadata">
Shortname: webxr-ar-module
Title: WebXR Augmented Reality Module - Level 1
Group: immersivewebwg
Status: ED
TR: https://www.w3.org/TR/webxr-ar-module-1/
ED: https://immersive-web.github.io/webxr-ar-module/
Repository: immersive-web/webxr-ar-module
Level: 1
Mailing List Archives: https://lists.w3.org/Archives/Public/public-immersive-web-wg/
Implementation Report: https://wpt.fyi/results/webxr/ar-module?label=experimental&label=master&aligned
WPT Path Prefix: webxr/ar-module
!Participate: <a href="https://github.com/immersive-web/webxr-ar-module/issues/new">File an issue</a> (<a href="https://github.com/immersive-web/webxr-ar-module/issues">open issues</a>)
!Participate: <a href="https://lists.w3.org/Archives/Public/public-immersive-web-wg/">Mailing list archive</a>
!Participate: <a href="irc://irc.w3.org:6665/">W3C's #immersive-web IRC</a>
Editor: Brandon Jones 87824, Google https://google.com/, [email protected]
Former Editor: Nell Waliczek 93109, Amazon [Microsoft until 2018] https://amazon.com/, [email protected]
Editor: Manish Goregaokar 109489, Google [Mozilla until 2020], [email protected]
Editor: Rik Cabanier 106988, Meta https://facebook.com, [email protected]
Abstract: The WebXR Augmented Reality module expands the <a href="https://www.w3.org/TR/webxr/">WebXR Device API</a> with the functionality available on AR hardware.
Status Text: This WebXR Augmented Reality Module is designed to be implemented in addition to the <a href="https://www.w3.org/TR/webxr/">WebXR Device API</a>, and was originally part of the WebXR Device API before that specification was divided into a core and modules.
</pre>
<pre class="link-defaults">
spec:infra;
type:dfn; text:string
spec: webxr-1;
type:enum-value; text:"immersive-vr"
type:enum-value; text:"inline"
type: enum; text: XRSessionMode
type: dfn; text: exclusive access
type: dfn; text: immersive xr device
type: dfn; text: xr device; for: /
type: dfn; text: mode; for: XRSession
type: dfn; text: inline session
type: dfn; text: immersive session
type: dfn; text: xr compositor
type: dfn; text: native origin
type: dfn; text: viewer reference space
type: dfn; text: feature descriptor
type: dfn; text: view
type: dfn; text: eye; for: view
type: dfn; text: secondary view
type: dfn; text: secondary-views; for:secondary view
type: event; text:select
</pre>
<pre class="anchors">
spec: compositing-1; urlPrefix: https://www.w3.org/TR/compositing-1
type: dfn; text: source-over; url: porterduffcompositingoperators_srcover
type: dfn; text: lighter; url: porterduffcompositingoperators_plus
</pre>
<link rel="icon" type="image/png" sizes="32x32" href="favicon-32x32.png">
<link rel="icon" type="image/png" sizes="96x96" href="favicon-96x96.png">
<style>
.unstable::before {
content: "This section is not stable";
display: block;
font-weight: bold;
text-align: right;
color: red;
}
.unstable {
border: thin solid pink;
border-radius: .5em;
padding: .5em;
margin: .5em calc(-0.5em - 1px);
background-image: url("data:image/svg+xml,<svg xmlns='http://www.w3.org/2000/svg' width='300' height='290'><text transform='rotate(-45)' text-anchor='middle' font-family='sans-serif' font-weight='bold' font-size='70' y='210' opacity='.1'>Unstable</text></svg>");
background-repeat: repeat;
background-color: #FFF4F4;
}
.unstable h3:first-of-type {
margin-top: 0.5rem;
}
.unstable.example:not(.no-marker)::before {
content: "Example " counter(example) " (Unstable)";
float: none;
}
.non-normative::before {
content: "This section is non-normative.";
font-style: italic;
}
.tg {
border-collapse: collapse;
border-spacing: 0;
}
.tg th {
border-style: solid;
border-width: 1px;
background: #90b8de;
color: #fff;
font-family: sans-serif;
font-weight: bold;
border-color: grey;
}
.tg td {
padding: 4px 5px;
background-color: rgb(221, 238, 255);
font-family: monospace;
border-style: solid;
border-width: 1px;
border-color: grey;
overflow: hidden;
word-break: normal;
}
</style>
Introduction {#intro}
============
<section class="non-normative">
Hardware that enables Virtual Reality (VR) and Augmented Reality (AR) applications is now broadly available to consumers, offering an immersive computing platform with both new opportunities and challenges. The ability to interact directly with immersive hardware is critical to ensuring that the web is well equipped to operate as a first-class citizen in this environment. The WebXR Augmented Reality module expands the functionality available to developers when their code is running on AR hardware.
</section>
Terminology {#terminology}
-----------
Augmented Reality describes a class of XR experiences in which virtual content is aligned and composed with the <dfn>real-world environment</dfn> before being displayed to users.
XR hardware can be divided into categories based on <dfn>display technology</dfn>: [=additive light=], [=pass-through=], and [=opaque=].
Devices described as having an <dfn>additive light</dfn> [=display technology=], also known as see-through, use transparent optical displays to present virtual content. On these devices, the user may always be able to see through to the [=real-world environment=] regardless of developer requests during session creation.
Note: Such devices typically will not do any compositing in software, relying on the natural compositing afforded by transparent displays.
<div class=example>Examples of such devices include the <a href="https://www.microsoft.com/en-us/hololens/hardware">HoloLens 2</a> and <a href="https://www.magicleap.com/en-us">Magic Leap</a> devices. </div>
Devices described as having a <dfn>pass-through</dfn> [=display technology=] use an opaque display to combine virtual content with a camera stream of the [=real-world environment=]. On these devices, the [=real-world environment=] will only be visible when the developer has made an explicit request for it during session creation.
Note: Such devices will typically use cameras to collect images of the real world, and composite the AR scene with these images in software before displaying them to the user.
<div class=example>Examples of such devices include handheld mobile AR with a phone, and the <a href="https://varjo.com/products/xr-3/">Varjo XR-3</a> device. </div>
Devices described as having an <dfn>opaque</dfn> [=display technology=] fully obscure the [=real-world environment=] and do not provide a way to view the [=real-world environment=].
Note: Such devices are typically VR devices that have chosen to allow {{XRSessionMode/"immersive-ar"}} sessions in an attempt to provide a compatibility path for AR content on VR devices.
WebXR Device API Integration {#webxr-device-api-integration}
==============
XRSessionMode {#xrsessionmode-enum}
-------------
The <a href="https://www.w3.org/TR/webxr/">WebXR Device API</a> categorizes {{XRSession}}s based on their {{XRSessionMode}}. This module enables use of the {{XRSessionMode/"immersive-ar"}} {{XRSessionMode}} value.
A session mode of <dfn enum-value for="XRSessionMode">"immersive-ar"</dfn> indicates that the session's output will be given [=exclusive access=] to the [=immersive XR device=] display and that content <b>is</b> intended to be [=blend technique|blended=] with the [=real-world environment=].
On compatible hardware, user agents MAY support {{XRSessionMode/"immersive-vr"}} sessions, {{XRSessionMode/"immersive-ar"}} sessions, or both. Supporting the additional {{XRSessionMode/"immersive-ar"}} session mode does not change the requirement that user agents MUST support {{XRSessionMode/"inline"}} sessions.
NOTE: This means that {{XRSessionMode/"immersive-ar"}} sessions support all the features and reference spaces that {{XRSessionMode/"immersive-vr"}} sessions do, since both are [=immersive sessions=].
<div class="example">
The following code checks to see if {{XRSessionMode/"immersive-ar"}} sessions are supported.
<pre highlight="js">
navigator.xr.isSessionSupported('immersive-ar').then((supported) => {
  if (!supported) { return; }
  // 'immersive-ar' sessions are supported.
  // Page should advertise AR support to the user.
});
</pre>
</div>
<div class="example">
The following code attempts to retrieve an {{XRSessionMode/"immersive-ar"}} {{XRSession}}.
<pre highlight="js">
let xrSession;
navigator.xr.requestSession("immersive-ar").then((session) => {
  xrSession = session;
});
</pre>
</div>
XREnvironmentBlendMode {#xrenvironmentblendmode-enum}
----------------------
When drawing XR content, it is often useful to understand how the rendered pixels will be blended by the [=XR Compositor=] with the [=real-world environment=].
<pre class="idl">
enum XREnvironmentBlendMode {
  "opaque",
  "alpha-blend",
  "additive"
};

partial interface XRSession {
  // Attributes
  readonly attribute XREnvironmentBlendMode environmentBlendMode;
};
</pre>
The <dfn attribute for="XRSession">environmentBlendMode</dfn> attribute MUST report the {{XREnvironmentBlendMode}} value that matches [=blend technique=] currently being performed by the [=XR Compositor=].
- A blend mode of <dfn enum-value for="XREnvironmentBlendMode">opaque</dfn> MUST be reported if the [=XR Compositor=] is using [=opaque environment blending=].
- A blend mode of <dfn enum-value for="XREnvironmentBlendMode">alpha-blend</dfn> MUST be reported if the [=XR Compositor=] is using [=alpha-blend environment blending=].
- A blend mode of <dfn enum-value for="XREnvironmentBlendMode">additive</dfn> MUST be reported if the [=XR Compositor=] is using [=additive environment blending=].
XRInteractionMode {#xrinteractionmode-enum}
----------------------
Sometimes the application will wish to draw UI that the user may interact with. WebXR allows for a variety of form factors, including both handheld phone AR and head-worn AR. For different form factors, the UI belongs in different spaces to facilitate smooth interaction: for example, the UI for handheld phone AR will likely be drawn directly on the screen without projection, whereas the UI for head-worn AR will likely be drawn a small distance from the head so that users may use their controllers to interact with it.
<pre class="idl">
enum XRInteractionMode {
  "screen-space",
  "world-space",
};

partial interface XRSession {
  // Attributes
  readonly attribute XRInteractionMode interactionMode;
};
</pre>
The <dfn attribute for="XRSession">interactionMode</dfn> attribute describes the best space (according to the user agent) for the application to draw interactive UI for the current session.
- An {{XRSession/interactionMode}} value of {{XRInteractionMode/"screen-space"}} indicates that the UI should be drawn directly to the screen without projection. Typically in this scenario, {{select!!event}} events are triggered with {{XRInputSourceEvent/inputSource}}s having an {{XRInputSource/targetRayMode}} of {{XRTargetRayMode/"screen"}}.
- An {{XRSession/interactionMode}} value of {{XRInteractionMode/"world-space"}} indicates that the UI should be drawn in the world, some distance from the user, so that they may interact with it using controllers. Typically in this scenario, {{select!!event}} events are triggered with {{XRInputSourceEvent/inputSource}}s having an {{XRInputSource/targetRayMode}} of {{XRTargetRayMode/"tracked-pointer"}} or {{XRTargetRayMode/"gaze"}}.
Note: The <a href="https://immersive-web.github.io/dom-overlays/">WebXR DOM Overlays module</a>, if supported, can be used in some of these cases instead.
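<div class="example">
The following code is a non-normative sketch of how an application might place interactive UI based on {{XRSession/interactionMode}}. The <code>drawScreenSpaceUI()</code> and <code>drawWorldSpaceUI()</code> functions are hypothetical application code.
<pre highlight="js">
if (xrSession.interactionMode === 'screen-space') {
  // Handheld AR: draw the UI directly to the screen, without projection.
  drawScreenSpaceUI();
} else {
  // Head-worn AR: draw the UI in the world, a short distance from the viewer,
  // so it can be targeted with "tracked-pointer" or "gaze" input sources.
  drawWorldSpaceUI();
}
</pre>
</div>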
XR Compositor Behaviors {#xr-compositor-behaviors}
---------------------
When presenting content to the [=/XR device=], the [=XR Compositor=] MUST apply the appropriate <dfn>blend technique</dfn> to combine virtual pixels with the [=real-world environment=]. The appropriate technique is determined based on the [=XR device=]'s [=display technology=] and the [=XRSession/mode=].
- When performing <dfn>opaque environment blending</dfn>, the rendered buffers obtained by the [=XR Compositor=] are composited using [=source-over=] blending on top of buffers containing exclusively 100% opaque black pixels. The composited output is then presented on the [=XR device=]. This technique MUST be applied on [=opaque=] and [=pass-through=] displays when the [=XRSession/mode=] is set to either {{XRSessionMode/"immersive-vr"}} or {{XRSessionMode/"inline"}}. This technique MUST NOT be applied when the [=XRSession/mode=] is set to {{XRSessionMode/"immersive-ar"}}, regardless of the [=XR Device=]'s [=display technology=].
- When performing <dfn>alpha-blend environment blending</dfn>, the rendered buffers obtained by the [=XR Compositor=] are composited using [=source-over=] blending on top of buffers containing pixel representations of the [=real-world environment=]. These pixel representations must be aligned on each {{XRFrame}} to the {{XRView/transform}} of each view in {{XRViewerPose/views}}. The composited output is then presented on the [=XR device=]. This technique MUST be applied on [=pass-through=] displays when the [=XRSession/mode=] is set to {{XRSessionMode/"immersive-ar"}}. This technique MUST NOT be applied when the [=XRSession/mode=] is set to {{XRSessionMode/"immersive-vr"}} or {{XRSessionMode/"inline"}} regardless of the [=XR Device=]'s [=display technology=].
- When performing <dfn>additive environment blending</dfn>, the rendered buffers obtained by the [=XR Compositor=] are composited using [=lighter=] blending before being presented on the [=/XR device=]. This technique MUST be applied on [=additive light=] displays, regardless of the [=XRSession/mode=].
NOTE: When using a device that performs [=alpha-blend environment blending=], use of a {{XRRenderState/baseLayer}} with no alpha channel will result in the [=real-world environment=] being completely obscured. It should be assumed that this is intentional on the part of the developer, and the user agent may wish to suspend compositing of the [=real-world environment=] as an optimization in such cases.
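<div class="example">
The following code is a non-normative sketch that sets up a {{XRRenderState/baseLayer}} with an alpha channel and clears it to transparent black each frame, so the [=real-world environment=] stays visible on devices that perform [=alpha-blend environment blending=]. Rendering of the actual virtual content is elided.
<pre highlight="js">
const canvas = document.createElement('canvas');
const gl = canvas.getContext('webgl', { xrCompatible: true });

const session = await navigator.xr.requestSession('immersive-ar');
session.updateRenderState({
  // alpha defaults to true; stated here for clarity.
  baseLayer: new XRWebGLLayer(session, gl, { alpha: true })
});

session.requestAnimationFrame(function onFrame(time, frame) {
  const layer = session.renderState.baseLayer;
  gl.bindFramebuffer(gl.FRAMEBUFFER, layer.framebuffer);

  // Clear to transparent black; fully opaque pixels would completely
  // obscure the real-world environment on alpha-blend devices.
  gl.clearColor(0, 0, 0, 0);
  gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);

  // ... draw virtual content here ...

  session.requestAnimationFrame(onFrame);
});
</pre>
</div>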
The [=XR Compositor=] MAY make additional color or pixel adjustments to optimize the experience. The timing of composition MUST NOT depend on the [=blend technique=] or source of the [=real-world environment=]. The [=XR Compositor=] MUST NOT perform occlusion based on pixel depth relative to real-world geometry; only rendered content MUST be composited on top of the real-world background.
NOTE: Future modules may enable automatic or manual pixel occlusion with the [=real-world environment=].
The [=XR Compositor=] MUST NOT automatically grant the page access to any additional information such as camera intrinsics, media streams, real-world geometry, etc.
NOTE: Developers may request access to an [=/XR Device=]'s camera, should one be exposed through the existing [[mediacapture-streams|Media Capture and Streams]] specification. However, doing so does not provide a mechanism to query the {{XRRigidTransform}} between the camera's location and the [=native origin=] of the [=viewer reference space=]. It also does not provide a guaranteed way to determine the camera intrinsics necessary to match the view of the [=real-world environment=]. As such, performing effective computer vision algorithms will be significantly hampered. Future modules or specifications may enable such functionality.
First Person Observer Views {#first-person-observer}
--------------------------------
Many AR devices have a camera; however, the camera is typically not aligned with the eyes. When capturing video of the session for streaming or saving to a file, it is suboptimal to simply composite this camera feed with one of the rendered eye feeds, as there will be an offset between them. Devices may use reprojection or other techniques to fix up the stream, but some may instead expose a [=secondary view=], the <dfn>first-person observer view</dfn>, which has a [=view/eye=] of {{XREye/"none"}}.
Site content MUST explicitly opt-in to receiving a [=first-person observer view=] by enabling the "[=secondary view/secondary-views=]" [=feature descriptor=].
Enabling the "[=secondary view/secondary-views=]" feature for a session that supports [=first-person observer views=] SHOULD NOT enable the [=first-person observer view=] unconditionally on every frame of the session; rather, the view should only be exposed in the {{XRViewerPose/views}} array for frames during which capture is in progress.
While the {{XRSession}} has a [=blend technique=] exposed by the {{XRSession/environmentBlendMode}}, [=first-person observer views=] always use [=alpha-blend environment blending=].
Site content may wish to know which view is the [=first-person observer view=] so that it can account for the different [=blend technique=], or choose to render UI elements differently. {{XRView}} objects that correspond to the [=first-person observer view=] have an {{isFirstPersonObserver}} attribute that returns <code>true</code>.
<pre class="idl">
partial interface XRView {
readonly attribute boolean isFirstPersonObserver;
};
</pre>
<div class=example>
For most programs, supporting secondary views is simply a matter of:
- Including `"secondary-views"` as an optional feature in {{XRSystem/requestSession()}}
- Ensuring that {{XRViewerPose/views}} is iterated over instead of just accessing the first two elements
<pre highlight="js">
let session = await navigator.xr.requestSession("immersive-ar", {optionalFeatures: ["secondary-views"]});
let space = await session.requestReferenceSpace("local");
// perform other set up

let gl = /* obtain a graphics context */;

session.requestAnimationFrame(function(time, frame) {
  let viewerPose = frame.getViewerPose(space);

  // IMPORTANT: iterate over all of the returned views instead of
  // directly indexing the first two or three elements
  for (const view of viewerPose.views) {
    render(session, gl, view);
  }
});

function render(session, gl, view) {
  // render content to the view
  // potentially use view.isFirstPersonObserver if necessary to
  // distinguish between compositing info
}
</pre>
</div>
Privacy & Security Considerations {#privacy-security}
=================================
Implementations of the AR Module MUST NOT expose camera images to the content; rather, they MUST handle any compositing with the real world in their own implementations via the [=XR compositor=]. Further extensions to this module MAY expose real-world information (like raw camera frames or lighting estimation); however, they MUST gate this behavior on an additional [=feature descriptor=] that requires user consent.
Compared to the WebXR Device API it extends, the AR module only provides some additional details about the nature of the device it is running on via the {{XRSession/environmentBlendMode}} and {{XRSession/interactionMode}} attributes. It allows websites to start an XR session as
{{XRSessionMode/"immersive-ar"}} which blends the real world behind the XR
scene.
Even if this module does not allow websites to access the camera images, it may
not be obvious to end users and user agents SHOULD clarify this.
<h2 id="changes" class="no-num">
Changes</h2>
<h3 id="changes-from-20191010" class="no-num">
Changes from the <a href="https://www.w3.org/TR/2019/WD-webxr-ar-module-1-20191010/">First Public Working Draft 10 October 2019</a></h3>
- Added Privacy and Security considerations (<a href="https://github.com/immersive-web/webxr-ar-module/pull/49">GitHub #49</a>, <a href="https://github.com/immersive-web/webxr-ar-module/pull/63">GitHub #63</a>)
- Clarification of terminology (<a href="https://github.com/immersive-web/webxr-ar-module/pull/63">GitHub #63</a>)
- Added first person observer view (<a href="https://github.com/immersive-web/webxr-ar-module/pull/57">GitHub #57</a>)
- Renamed XRInteractionSpace to XRInteractionMode (<a href="https://github.com/immersive-web/webxr-ar-module/pull/52">GitHub #52</a>)
- Added XRInteractionSpace (<a href="https://github.com/immersive-web/webxr-ar-module/pull/50">GitHub #50</a>)
Acknowledgements {#ack}
================
The following individuals have contributed to the design of the WebXR Device API specification:
* <a href="mailto:[email protected]">Chris Van Wiemeersch</a> (<a href="https://mozilla.org/">Mozilla</a>)
* <a href="mailto:[email protected]">Kearwood Gilbert</a> (<a href="https://mozilla.org/">Mozilla</a>)
* <a href="mailto:[email protected]">Rafael Cintron</a> (<a href="https://microsoft.com/">Microsoft</a>)
* <a href="mailto:[email protected]">Sebastian Sylvan</a> (Formerly <a href="https://microsoft.com/">Microsoft</a>)
And a special thanks to <a href="mailto:[email protected]">Vladimir Vukicevic</a> (<a href="https://unity3d.com/">Unity</a>) for kick-starting this whole adventure!
</section>