
Blind Accessibility - Step One: Get to know your players


Over the last decade, game accessibility has seen a huge rise in awareness and support from the industry. Subtitles exist for most AAA and indie games, and even colorblind modes, resizable text and re-mappable controls are becoming commonplace. Microsoft also released the Xbox Adaptive Controller, featuring ads for the device during the Macy’s Thanksgiving Parade and Super Bowl LIII—making these commercials quite the investment.

Blind accessibility, too, has made big improvements in recent years. Zach Quarles, Supervising Sound Director at Microsoft, wrote in 2017 about how they made the sound design in Killer Instinct accessible to blind competitors by enhancing player feedback. Mike Zaimont of Lab Zero Games added text-to-speech to the menus in the PC version of Skullgirls, first through ClipTrap and ClipReader and then by implementing Tolk, an open source screen reader abstraction library for Windows. Karen Stevens, Accessibility Lead at EA Sports, also led an effort to include haptic feedback in Madden NFL 18 to make the core gameplay more accessible to players with visual impairments. And the list goes on: Microsoft and the Shepherd Center have teamed up to form The Accessibility User Research Collective, which is currently conducting a study in the US to improve screen reader access in the Xbox app, and they've had success in the past with testing accessible game chat in Halo Wars 2 and Forza Motorsport 7. More recently, Crackdown 3 and the private beta of The Division 2 have included voiced menus, which I will talk more about later. There are also several new indie game studios that have made blind accessibility part of their mission statements, and there are now dozens of Twitch and YouTube channels hosted by gamers with visual impairments.

With all that said, blind accessibility is still relatively unheard of outside communities who are passionate about game accessibility. While there are those who think gaming without sight isn't possible because it may not conform to their ideas of entertainment, these cynics are not the audience I'd like to reach. Instead, I'd like to speak to the sighted developers and sound designers whose ears perk up and whose minds buzz with ideas when they are introduced to blind gamers. So if that's you, please read on.

 


Since I mentioned all these great initiatives that are taking place to improve accessibility, you may be wondering, isn't it all being taken care of? What kind of role would I play? As a sound designer—whether you design sound effects, write music, implement sounds, create audio tools, etc.—you have a very important role. You must communicate with players, not only through your sound design but also in a more literal sense. In order to know what someone with visual impairments requires to play your game, you need to speak with them and learn about their experiences, and then ask them to play your game and get their feedback. Luckily, there are several professionals who are willing to help:

  • SightlessKombat is an accessibility consultant who has spoken at the Game Accessibility Conference (GAconf) and Microsoft’s Gaming and Disability boot camps. In 2017, he undertook a fellowship with the Winston Churchill Memorial Trust where he attended E3 and reported on the state of blind accessibility in the games industry, covering companies like Nintendo, Microsoft, Sony, and Valve. He has also visited several studios in person to offer insight into how he plays games, and he creates a range of content for his YouTube channel.
  • Brandon Cole is an accessibility consultant and blogger who has been writing about blind accessibility and his experiences as a blind gamer since 2012. He, too, has made in-person studio visits and spoken at several conferences, including GDC, GAconf, and accessibility boot camps for Microsoft and Sony.
  • Tomasz Tworek is a sound designer and gamer with visual impairments who, in 2013, started a thread to compile all the mainstream games that could be played without sight, meaning they could be generally learned through sound but may require online manuals or sighted assistance to navigate menus or certain scenes. This was at a time when blind accessibility was relatively unknown even in the game accessibility community. Since he first wrote this post, it has received over 887,000 views and 1,400+ comments.
  • Jesse Anderson owns the YouTube channel IllegallySighted and has over 1,000 videos about gaming from his perspective as a low vision gamer. He reviews mainstream and blind-specific games, and he has a playlist dedicated to accessibility in VR for users with visual impairments. He has presented on VR accessibility at the #ID24 conference, and he will be on a panel at this year’s GAconf.

These consultants have years of experience and insight from not only playing and writing about games but also from using assistive technology and navigating the world in a fundamentally different way than someone who is sighted. While I have heard of "blindfold" days where developers attempt to use technology without sight in order to raise awareness and empathy, blind accessibility goes vastly beyond simply closing one's eyes. From the insanely fast speech rates at which some players comprehend their screen readers to the ability to mentally map their surroundings and pick up on the slightest audio details, these are skills sighted people can generally understand but not attain, at least not to the level at which they are intuitive for blind gamers, who use them daily and can turn them to their advantage in games.

However, it is a knee-jerk reaction for sighted developers, especially sound designers, to want to test the accessibility of their games by closing their eyes and listening. One such developer, Brian Schmidt, whom we all know as the creator of GameSoundCon and President of the Game Audio Network Guild, discovered just how nuanced and unexpected these interactions can be for someone who is sighted. After releasing his audio game Ear Monsters, he received feedback from blind players who said the audio's left and right panning was flipped. It turns out that because blind players did not hold their phones toward their faces as sighted people instinctively do, instead holding them flat or even angled slightly away, the gyroscope would flip the game's landscape orientation and reverse the stereo panning. Brian was able to solve the issue simply by locking the screen orientation. He then shared this experience in a Gamasutra post, explaining how easy it is to overlook these differences in how blind and sighted people interact with technology.

So while you as a sighted developer may have an expert ear from years of designing sound and may be up for learning to use a screen reader, I instead recommend that you hire the accessibility consultants mentioned above to test your games and give feedback from their perspectives. And don't let any worries that you may say something unintentionally offensive stop you. The English language is full of expressions like "see you later" and "take a look" (which many sound designers have modified to "take a listen" instead), and the paranoia around using these vision-oriented words seems to stem more from sighted people not wanting to hurt anyone's feelings than from any actual sensitivity in the blind community. As long as you treat others as being worthy of dignity and respect and are open to adjusting your language as you better understand another person's experiences, there is no reason to fear reaching out to gamers with disabilities.


Now that you have a pathway to communicate with blind gamers, you may wonder where to even begin with accessibility in your development process. What are the known issues? What if you get halfway and realize you can't keep your promises or afford to make your game completely blind accessible? Not everyone may agree with me, but I believe every attempt at accessibility is a step in the right direction. Even if your game may not be playable without sight, perhaps you've opened it up to gamers with low vision or dyslexia. But you also want to have a plan in order to avoid falling into the rabbit holes of design, since we all have limited time. While I talk about ways to approach accessible sound design in my presentation for MIGS18, I'd like to go more in-depth about screen readers in this post.

 

Screen readers have a nebulous relationship with games. So far, the most successful implementations of text-to-speech have come from games made with proprietary or open source engines: Skullgirls was developed with the Z-Engine, A Hero's Call was developed with MonoGame, and most of the blind-accessible games on Audiogames.net were developed in Python. Part of the reason is that two of the most popular game engines, Unity and Unreal Engine 4, do not have built-in support for screen readers. Luckily, this has been improving in recent years, and hopefully this part of my post will soon be outdated. For now, there are third-party solutions with varying degrees of success. One issue is that some of these plug-ins introduce a slight delay, either from generating audio files or from interacting with the screen reader itself. This may be acceptable for menus, but these delays can quickly kill the feel of a fast-paced game that uses text to indicate gameplay elements. Another issue is that the Xbox SDK's text-to-speech API was released less than two years ago, so developers are only beginning to implement it in games. However, as I mentioned before, two recent releases have included voiced menus: Crackdown 3 and the current beta of The Division 2. Though text-to-speech isn't complete in these games, this is very exciting news. To understand their importance, I'd like to introduce you to the CVAA.
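
Before we get to the CVAA, here is a quick sketch of what the direct screen reader route can look like in practice, using Tolk, the abstraction library mentioned earlier. This is only a minimal illustration, not how Skullgirls or any particular engine does it: it loads Tolk.dll through Python's ctypes, falls back to SAPI if no screen reader is running, and speaks a line of menu text, interrupting anything already being spoken. The function names come from Tolk's documented C API, but treat the binding details as assumptions to verify against the version you ship.

```python
# Minimal sketch (Windows only): sending menu text to whatever screen
# reader is running, via the Tolk abstraction library. Assumes Tolk.dll
# and its dependencies sit next to this script.
import ctypes

tolk = ctypes.CDLL("Tolk.dll")
tolk.Tolk_TrySAPI.argtypes = [ctypes.c_bool]
tolk.Tolk_Output.argtypes = [ctypes.c_wchar_p, ctypes.c_bool]
tolk.Tolk_DetectScreenReader.restype = ctypes.c_wchar_p

tolk.Tolk_TrySAPI(True)   # fall back to Microsoft SAPI if no screen reader is found
tolk.Tolk_Load()          # initialize the library

print("Active screen reader:", tolk.Tolk_DetectScreenReader() or "SAPI fallback")

def announce(text: str, interrupt: bool = True) -> None:
    """Send one line of UI text to the active screen reader."""
    tolk.Tolk_Output(text, interrupt)

announce("Main menu. New Game, 1 of 4.")
announce("Options, 2 of 4.")

tolk.Tolk_Unload()        # release the library on shutdown
```

Calling into the screen reader directly like this keeps menu narration responsive and lets players keep their own voice and speech rate, but as noted above, any per-call delay a plug-in adds on top of this compounds quickly once text is used for fast-paced gameplay elements.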

The 21st Century Communications and Video Accessibility Act of 2010 (CVAA) covers products and services in any industry that uses "advanced communications services"—specifically electronic messaging, interconnected and non-interconnected VoIP, and interoperable video conferencing—and it requires these services to offer at least one setting or mode that is usable by individuals with disabilities such as epilepsy, colorblindness, limited strength, and, the focus of this article, blindness and low vision.

This law applies specifically to communication services—and the UI elements and information used to operate or navigate to these services—in video games that have been released or substantially updated after 1 January 2019. This is why the Xbox SDK allows real-time two-way transcription between text and speech for in-game chat, supporting games like Forza Horizon 4, Sea of Thieves, Halo Wars 2, and soon more. Other games, like Apex Legends, have started to implement their own bespoke, platform-agnostic text-to-speech solutions, and the Xbox includes a "Let games read to me" option in its Ease of Access menu that makes UI elements accessible, with Crackdown 3 as a limited example. The CVAA also covers hardware, which is why Narrator was implemented on the Xbox One home screen to meet the October 2015 deadline for screen reader accessibility on consoles with online capability. However, while these text-to-speech integrations are historic for consoles, and accessible home screens and game chat are excellent for blind gamers, the law sets no accessibility requirements for gameplay elements that are not related to player-to-player communication—such as tutorials, story text, and other in-game menus and user interface elements—even though these are essential for blind gamers to get a fuller experience. So why am I so hopeful?

Throughout the history of accessible media and disability law, it has often taken the threat of a lawsuit for companies to abide by accessibility and discrimination laws, such as US regulations under the Americans with Disabilities Act, the Federal Communications Commission, Section 508, and the CVAA. However, game accessibility faces a wider range of complex design scenarios than web and broadcast accessibility. While those industries have the Web Content Accessibility Guidelines 2.0 and FCC regulations for audio-described programming, the CVAA is more open about how designers implement accessibility. This can open it up to the inconsistencies of case-by-case interpretation and may not ensure complete access, just as, even under ADA laws, Hulu has offered no audio description while iTunes and Netflix provide over 900 audio-described titles. Yet while web and broadcast accessibility are effective in ensuring basic access, the games industry has been more creatively inspired in how it approaches accessibility. For example, compare the customizable subtitles in games like Spider-Man or Life is Strange to closed captioning on television, or look at how the game Eagle Island offers not only screen reader accessibility but also options to slow the in-game clock, outline the characters, dim the background, and disable the lighting to aid players with low vision and even sensory sensitivities.

It also seems many companies have already been exceeding the base level of accessibility required by the CVAA. The text-to-speech in Crackdown 3 and The Division 2 may not cover gameplay elements as extensively as Skullgirls or A Hero's Call, but both offer text-to-speech for menus and UI elements not related to game chat, going beyond the minimum requirements. As experimentation with text-to-speech continues, developers can expand upon each success—just as colorblind modes have incrementally improved—and perhaps one day text-to-speech will achieve the same level of industry support as subtitles. Plus, just as many gamers who use these other features are not deaf or colorblind, so too could text-to-speech become a "nice feature" for some and a crucial element for those who need it.

[Embedded video: my MIGS18 presentation on accessible sound design]

In the meantime, while text-to-speech is being improved and streamlined, it is not the only method for creating accessible menus. Audio files containing the menu text can be triggered through middleware or the engine's built-in audio system, which offers not only lightning-fast responsiveness but also a consistent experience on any device, like the voiced menus in The Division 2. However, this comes at the cost of a screen reader's flexible speaker voices and clear, fast rates of speech—unless you build these into your solution. One method I used recently was RT-Voice Pro in Unity, which let me quickly generate audio files from text files that I then implemented in Wwise while attempting to make a tower defense game blind accessible. You can also go further and have the menus spoken by voice actors, which can bring more character to your soundscape, as in the game Frequency Missing.
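
To make the pre-rendered approach concrete, here is a minimal sketch of batch-converting a file of menu strings into WAV files that could then be hooked up to Wwise events or an engine's built-in audio system. It uses pyttsx3, an offline text-to-speech library for Python, purely as a stand-in for a tool like RT-Voice Pro, and the input file, naming scheme, and folder layout are my own assumptions rather than a description of any shipped pipeline.

```python
# Minimal sketch: batch-render menu text to WAV files for implementation
# in Wwise or an engine's built-in audio system. pyttsx3 stands in here
# for a commercial tool like RT-Voice Pro.
import re
from pathlib import Path

import pyttsx3  # pip install pyttsx3

MENU_TEXT = Path("menu_strings.txt")  # hypothetical input: one menu line per row, e.g. "New Game"
OUT_DIR = Path("generated_vo")
OUT_DIR.mkdir(exist_ok=True)

engine = pyttsx3.init()
engine.setProperty("rate", 220)  # words per minute; many blind players prefer faster speech

for line in MENU_TEXT.read_text(encoding="utf-8").splitlines():
    text = line.strip()
    if not text:
        continue
    # Derive a stable, engine-friendly file name from the text itself.
    slug = re.sub(r"[^a-z0-9]+", "_", text.lower()).strip("_")
    out_file = OUT_DIR / f"menu_{slug}.wav"
    engine.save_to_file(text, str(out_file))
    print(f"queued: {text!r} -> {out_file.name}")

engine.runAndWait()  # render everything queued above
```

Because the files are rendered ahead of time, playback latency is whatever your audio engine already delivers, which is the responsiveness advantage described above; the trade-off, as noted, is that players lose their own screen reader voice and rate settings unless you expose similar options in your game.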

This debate over using synthesized speech versus voice actors, however, brings me to my MIGS talk, which you can view above. While I was originally going to write a post that mirrored my presentation, I thought it would be more useful to go into the areas I wasn't able to cover in depth because of time. So if you are interested in making a blind-accessible game, you can hear in the talk how to develop personas to better understand your audience's accessible audio needs, including those of gamers with disabilities other than visual impairments. I also break down the core elements of accessible sound design, which were influenced by my research in User Interface and User Experience design.

For more about these concepts, you can also check out my presentation videos and slides from GameSoundCon and GAconf (as well as my MIGS slides), which cover more methods for approaching accessible sound design and the populations that can benefit from blind accessibility. I am currently converting these into Open Educational Resources with more substantial information and evidence, which will be available through the Start, Design and Engineering Tracks on the Game Accessibility Special Interest Group (GASIG) website in the coming months.

Thank you very much for reading! If you'd like to contact me, please send me an email at adriane (at) smashclay (dot) com or DM me at @smashclayaudio. For more information and resources about blind accessibility in games, check my website Smash Clay Audio.

And a huge thanks to everyone who offered extensive feedback about this post:
  • Tomasz Tworek can be emailed at seal11111 (at) gmail (dot) com, and you can follow him at @lirin111 and on Audiogames.net under his username Lirin.
  • Ian Hamilton, whom I have not yet introduced, is an accessibility expert who explained the ins and outs of the CVAA to me and offered incredible feedback. He can be followed on Twitter at @ianhamilton_, where he shares valuable information, and he runs the incredible resource Game Accessibility Guidelines. To learn more about the CVAA, check out his Gamasutra article and interview with DualShockers. Finally, if you are attending GDC, consider attending the Game Accessibility Conference mentioned above, which was started by Ian and will take place for its third year on 19 March 2019 at the Children's Creativity Museum, conveniently located across the street from the Moscone Center.
If you’d like to support current initiatives for blind accessibility in the indie games space, here are a few to follow:
  • Ebon Sky Studios are developing Sable, a game engine that is completely screen reader accessible, allowing blind creatives to express themselves by developing their own audio games without needing to code.
  • Blind Sparrow Interactive are dedicated to creating accessible games for blind and low vision players, and they have announced their first project, Ready to Roll, an RPG dice game.
  • Pixelnicks are developing extensive accessibility in their beautiful game Eagle Island, which was mentioned above.
And if you would like to get in touch with the blind communities around you, you can visit:
Finally, be sure to look out for the next Audiogame Jam, and I recommend you check out the Polygamer podcast interview with Joseph Bein from Out of Sight Games, who talks about his experiences developing A Hero’s Call as a developer with visual impairments.

Adriane Kuzminski

Producer

Team Audio


Adriane is a sound designer with a passion for accessibility and collaboration. This has led her to work on blind-accessible games and apps like A Hero’s Call, Frequency Missing, Earplay and others, and serve as an accessibility consultant for companies like imitone LLC and The Deep End Games. She has also spoken about her research in blind accessibility in video games at GameSoundCon, GAconf, CarouselCon and MIGS, and she has donated countless hours to advising developers and educators who are looking to learn more about accessibility by sharing her research and getting them in touch with gamers and sound designers with visual impairments. She is currently a Producer with Team Audio, a Content Creator with A Sound Effect/Soundlister, a leading Volunteer Council member with The Audio Mentoring Project, a collaborator with the Game Accessibility SIG, and a Drill Sergeant Training NCO in the U.S. Army Reserves.

For more information, you can check out her LinkedIn profile and her website: www.smashclay.com. If you'd like to get in touch, you can email her at adriane (at) smashclay (dot) com or message her on Twitter at @smashclayaudio.

