A blind man walks into a bar. The bartender asks what he’s having. The blind man points to the shelf of expired grape juice. The bartender, amazed by the man’s ability to point out the over-aged grape water, asks how he is able to navigate the space so effortlessly.
The blind man tips his shades, revealing the glow of a screen showing the floor the lenses are aimed at. What he reveals is an emergent technology that allows those who are legally blind to see without much impairment. His VR device uses front-facing cameras to capture whatever his glasses are pointed at and display it right in front of his eyes.
This works because not all who are legally blind are completely blind. To be legally blind, your vision with corrective lenses must be 20/200 or worse. And what does this mean? If you have 20/20 vision, the smallest letters you can read from 20 feet away are what a person with normal vision reads at that distance. But if you have 20/200 vision, the standard for legal blindness, the smallest letters you can read from 20 feet can be read from 200 feet away by someone with normal 20/20 vision.
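The arithmetic behind Snellen fractions is simple, and a minimal sketch makes it concrete (the function name and sample values here are mine, purely for illustration):

```python
def decimal_acuity(test_distance_ft, normal_distance_ft):
    """Decimal acuity from a Snellen fraction: 20/20 -> 1.0, 20/200 -> 0.1."""
    return test_distance_ft / normal_distance_ft

# A 20/200 viewer resolves at 20 feet what a 20/20 viewer resolves at 200 feet,
# i.e. one tenth of normal acuity -- the legal blindness threshold.
print(decimal_acuity(20, 200))  # 0.1
```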
Vision is a weird thing. I use a -4.25 and a -4.00 prescription for my contacts, which puts my unaided vision at roughly 20/425 and 20/400 respectively. With the magic of curved lenses I am able to navigate the world with ‘normal vision’, which we define as 20/20. Any time I want to step into the legally blind perspective, all I have to do is navigate a day without my lenses. And that is incredibly challenging.
However, many vision-impaired and legally blind individuals can still see, as I can without my corrective lenses, and many of them are nearsighted. When a VR display rests just inches from your eyes, being nearsighted is hardly a problem. In fact, many people who wear glasses find they no longer need them when operating in VR spaces. With the proper technology, our nearsighted disability is no longer a disability.
And with interactions trending toward the virtual world, there are more opportunities for User Experience designers to create spaces where accessibility removes disability. If front-facing cameras on a VR device can make a user interface accessible to more than just the blind, it is an even smaller technological burden to adapt the virtual spaces we already occupy so heavily into accessible environments.
Software devs and frequent internet users alike may already be familiar with the ADA and Section 508, which require closed captioning. Different facets of these laws apply to different styles of visual media. For example, live streams, even public-facing ones, are not required to have closed captioning, even though modern technology makes it easy to do with modest accuracy. While different media companies face different sides of these laws and regulations, it is recommended to use closed captioning whenever possible. And with the current increase in live video content, there are helpful guides for adding captions to Facebook Live, Livestream and video conference calls.
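Adding captions programmatically is also not hard. As a minimal sketch (the cue timings and text below are invented for illustration), here is how a list of timed cues could be rendered into WebVTT, the caption format that HTML5 video’s `<track>` element consumes:

```python
def to_webvtt(cues):
    """Render (start_seconds, end_seconds, text) cues as a WebVTT caption file."""
    def stamp(seconds):
        hours, rem = divmod(seconds, 3600)
        minutes, secs = divmod(rem, 60)
        return f"{int(hours):02d}:{int(minutes):02d}:{secs:06.3f}"

    lines = ["WEBVTT", ""]          # required file header, then a blank line
    for start, end, text in cues:
        lines.append(f"{stamp(start)} --> {stamp(end)}")
        lines.append(text)
        lines.append("")            # blank line separates cues
    return "\n".join(lines)

vtt = to_webvtt([(0.0, 2.5, "Welcome to the stream."),
                 (2.5, 5.0, "Captions make it accessible.")])
print(vtt)
```

Saved as a `.vtt` file, this can be attached to a video with `<track kind="captions" src="captions.vtt">`.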
Just because it is not required does not mean it should not be done. Law lags behind ethics, so make sure your industry or production team leads with ethics that precede laws. If you are designing web content you can use the WCAG guidelines (and WAI-ARIA for rich internet applications), which can be summed up with one premise: allow all visual content to be equally accessible without visuals, and vice versa.
This simple summary of hundreds of pages of accessibility legalese extends beyond the written guidelines and into UX design. It is one thing to have captions for your content; it is another to do them well. FCC online video guidelines require 99 percent accuracy, exact wording, time synchronization, completeness, and placement that does not obscure other important content. But captions should strive for more.
WCAG guidelines require all visuals to have text descriptions, but good UX writing can paint a more complete picture with words.
Captions need to surpass the basic coverage of dialogue that so often goes wrong. Using different fonts and colors, captions can carry expression and emotion. Red text, paired with proper priming tools such as emojis or the scene itself, can show love, anger, or even signify a character. Motifs and themes can be enhanced with subtitles for all viewers. Have a scene with an alien language? Use a less readable font to show confusion. Have a dramatic scene in a cold climate? Attach captions to the visible clouds of breath as characters speak. Captions and artistic subtitles can enhance communication for everyone. Let’s examine some examples: the first I crafted in about two minutes; the other two are from the 2004 film Night Watch.
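WebVTT already has hooks for this kind of expressive styling: a `<v>` tag names the speaker, and a `<c.class>` span can be colored from CSS with a `::cue(.class)` selector. A rough sketch (the speaker, class name and timings are my own invented examples):

```python
def stamp(seconds):
    """Format seconds as a short WebVTT timestamp (under one hour)."""
    minutes, secs = divmod(seconds, 60)
    return f"00:{int(minutes):02d}:{secs:06.3f}"

def styled_cue(start, end, text, speaker=None, style=None):
    """One WebVTT cue; <v> tags the speaker, <c.class> hooks ::cue() styling."""
    body = f"<c.{style}>{text}</c>" if style else text
    if speaker:
        body = f"<v {speaker}>{body}</v>"
    return f"{stamp(start)} --> {stamp(end)}\n{body}\n"

cue = styled_cue(12.0, 14.5, "You dare enter my lair?",
                 speaker="Shelob", style="angry")
print(cue)
```

On the page, a rule like `video::cue(.angry) { color: red; }` would then render that line in red for every viewer.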
There are plenty of opportunities to use captions and subtitles as visual art in film. They can enhance immersion, provide expression, create familiarity with characters, or add to motifs and themes. The same strategies can be invoked to enrich the experience of the vision impaired while reading prose. Try adding sound files to embellish scenes. Have a web story that takes place in an airplane? Add ambient airplane noise. Following an epic quest to destroy a ring on the side of a volcano? Add rumbling. Have your protagonists entered a loud club where they can barely hear each other? What’s playing (below)?
With emergent technologies such as VR and haptic suits, there is the possibility to make experiences that open up accessibility and enhance every user’s experience at the same time. Imagine watching Lord of the Rings without sound: rumble packs could create epic moments for charges into battle, or let you feel the cold of entering Shelob’s lair with Frodo. Haptic suits such as the Teslasuit can do all this and more. With temperature control, rumble packs and electric stimulation, the Teslasuit does everything but add smells, and even that technology is available, albeit in lower demand.
There are risks to adding more features through emergent technologies. Smell-o-vision can trigger allergies or, in acute cases, nausea. Electric stimulation may be bad news for people with pacemakers or preexisting conditions. However, risks in immersive media are nothing new for the entertainment industry, and we can mitigate them with proper warning labels and cautionary yet simple agreements.
From a User Experience perspective, we must make sure this new tech adds to the enjoyment of the experience without creating fruitless barriers to entry that may even pose risks. Media has never been without risk; whether artistic risks or the risks of breaking into emergent technologies, risk-taking drives trends. As early as 1997, an episode of Pokemon triggered epileptic seizures in many viewers during a scene where Pikachu uses “Thunderbolt” and the screen flashes red and blue rapidly. Today there are tools to catch such triggers: the Photosensitive Epilepsy Analysis Tool (PEAT) is free for content producers everywhere to check that their content will not trigger seizures. Similar guidelines can be applied to VR experiences, but none yet exist for haptic body suits or smell-o-vision. It’s safe to say that as these products reach more consumers, we will have some growing pains to overcome.
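PEAT’s actual analysis is far more sophisticated, but the core idea can be sketched with a toy check (the threshold, flash definition and sample data here are simplified assumptions of mine, not PEAT’s algorithm): count opposing brightness swings per one-second window and flag anything over the three-flashes-per-second limit described in WCAG success criterion 2.3.1:

```python
def worst_flash_rate(luminance, fps, delta=0.1):
    """Largest number of flashes in any one-second window.

    `luminance` holds one average-brightness sample per frame (0.0-1.0).
    A flash is approximated as a pair of opposing brightness swings
    larger than `delta`; more than 3 per second risks triggering
    photosensitive seizures (WCAG 2.3.1).
    """
    swings, last_dir = [], 0
    for i in range(1, len(luminance)):
        change = luminance[i] - luminance[i - 1]
        if abs(change) >= delta:
            direction = 1 if change > 0 else -1
            if direction != last_dir:        # only direction reversals count
                swings.append(i)
                last_dir = direction
    flashes = swings[1::2]                   # two opposing swings = one flash
    worst = 0
    for f in flashes:                        # slide a one-second (fps-frame) window
        worst = max(worst, sum(1 for g in flashes if f <= g < f + fps))
    return worst

strobe = [0.0, 1.0] * 12                 # one second of full-brightness strobing at 24 fps
slow_fade = [i / 24 for i in range(25)]  # a gentle one-second fade
print(worst_flash_rate(strobe, 24) > 3)  # True: well over the 3-flash limit
print(worst_flash_rate(slow_fade, 24))   # 0: no flashes at all
```

A real checker would derive the luminance samples from video frames and weight red flashes separately, as PEAT does.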
Accessibility is greatly enhanced through emergent technologies that highlight how much disability depends on setting. Motor, visual and cognitive constraints can be eliminated through careful and conscientious user experience design. It just takes time. The payoff for that time is the value of your content’s community, the quality of the media, and your content’s overall reach. Whether you are producing VR games, software, video content or podcasts, taking the time to add sensory experiences to your content increases its quality and provides you a larger market audience.