
Accessibility out of scope? #32

Closed
alice opened this issue May 26, 2020 · 2 comments

@alice

alice commented May 26, 2020

Coming here from the TAG review thread, as we are looking at this at our virtual face-to-face.

I was a little taken aback at the comment that the overlaid interface would help with internationalisation and accessibility, but that accessibility is out of scope.

  • Could you elaborate on why accessibility is out of scope, if it is worth mentioning as a benefit?
  • How do you envisage accessibility benefiting from this API?
@klausw
Contributor

klausw commented May 26, 2020

I wasn't sure how to answer the accessibility scope question. The DOM overlays API itself is fairly minimal: it doesn't provide any new user-visible dialogs or interactive elements of its own, so it doesn't expose new user-facing features that would need new accessibility considerations.

However, it does make existing platform features available in a new context, and it makes it possible to use existing web accessibility features in immersive applications as well.
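To illustrate how small the API surface is, here's a rough sketch (not from the explainer or this thread; the `overlay-root` id and `enterAR` function name are placeholders) of requesting an AR session with a DOM overlay:

```js
// Rough sketch: request an immersive AR session with a DOM overlay.
// requestSession() must be called from a user gesture, e.g. a button click.
async function enterAR() {
  const overlayRoot = document.getElementById('overlay-root'); // placeholder id

  const session = await navigator.xr.requestSession('immersive-ar', {
    requiredFeatures: ['dom-overlay'],
    domOverlay: { root: overlayRoot },
  });

  // The UA reports how the overlay is being presented,
  // e.g. "screen" on handheld AR devices.
  console.log(session.domOverlayState.type);
  return session;
}
```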

As a specific example, enabling "TalkBack" mode on an Android phone changes how platform UIs work: touching the screen causes the touched text or UI element to be read aloud or described by voice output. This already works for 2D web content in the browser, but a plain WebXR application couldn't take advantage of it, since that system feature had no information about the screen content to work with.

When showing a DOM overlay on top of WebXR content, the existing TalkBack functionality now works for that DOM content. For example, if the application shows an "Exit AR" button, that button can be discovered by touch and activated by a double tap while TalkBack is enabled. More complex UI such as dialogs or explanatory text is accessible in the same way, without any extra work by the application developer.
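As a hedged sketch of what such an overlay might look like (the markup and ids are placeholders, not taken from modelviewer.dev or the explainer), the overlay is just ordinary DOM, so TalkBack treats the button like any button on a 2D page:

```html
<!-- Placeholder overlay content: plain DOM elements, so TalkBack and other
     screen readers handle them with no application-specific work. -->
<div id="overlay-root">
  <button id="exit-ar">Exit AR</button>
  <p>Point the camera at a flat surface to place the model.</p>
</div>

<script>
  // Assuming "session" is the XRSession from the sketch above:
  document.getElementById('exit-ar').addEventListener('click', () => {
    session.end();
  });
</script>
```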

Of course, fully accessible applications will still require significant effort from application developers, and some have started working on this, for example by adding spoken descriptions of scene views, but DOM overlays make this easier, for example when they are used for text annotations.

If you have an ARCore-compatible Android phone, you can try this at https://modelviewer.dev/examples/annotations.html using Chrome Beta (or regular Chrome, Dev, or Canary, as long as the version is >= 83). Enable TalkBack in Android's system "Accessibility" settings, and set the "Volume Up + Volume Down" key combination as a shortcut to toggle the feature.

@klausw
Contributor

klausw commented May 27, 2020

I've rephrased this in the explainer, removing the "beyond the scope" part and clarifying that the goal is to let UAs leverage existing DOM accessibility features.

@klausw klausw closed this as completed Dec 10, 2020