Microsoft’s Inclusive Technologies Lab is full of insightful ideas about how to enable as many people as possible to enjoy video games. One of them is a game controller that is really, really heavy. It is meant to give game designers an idea of what it’s like for a person without strong hands to use a typical controller.
And on the wall is a sign that says, “When you do not intentionally, deliberately include … you will unintentionally exclude.” The lab is meant to change the perspective of game designers and challenge their assumptions about what accessibility really means. Bryce Johnson and Evelyn Thomas took me on an exclusive tour of the lab this week, and they showed me the work they are doing to make gaming more inclusive for people with disabilities.
I spent just a half hour at the Xbox team’s Inclusive Technologies Lab, but I feel like it made a lasting impression on me. The lab really shows how to think about people and games in an inclusive and accessible way.
And one of the first things Johnson, who is on the Xbox design team and is the design lead for "gaming for everyone," pointed out to me was that the World Health Organization has redefined disability. If someone in a wheelchair arrives at a building that is accessible only by stairs, the person isn't the one who is disabled. Rather, it's the building that has the disability.
“We built this space around the gaming for everyone goal, that Xbox is a place where everyone can have fun,” Johnson said. “The goal is to inform the visitors who come here, whether it’s Microsoft game designers or third-party partners or the public, what it’s like to play games with a disability, what are the challenges those people face. We want to inspire the creators of Xbox to be intentional about including people with disabilities in the things that we create.”
Thomas, who works on the Xbox platform team and is in charge of Xbox accessibility, said the software and hardware teams share the room and use it to educate third-party partners on inclusive gaming. The room also serves to strengthen Microsoft’s relations with charities such as Special Effect in the United Kingdom, Able Gamers, and Warfighter Engaged, which have helped the company be more inclusive over the years.
“We think of this room as an embassy for them on campus,” Johnson said.
I tried out a PC version of Rocket League, the popular esports title where you drive cars and try to bump a soccer ball into a goal. But the machine I played it on had no keyboard, mouse, or game controller. Rather, it had a foot pedal, which could make the car go forward, and two large buttons, one on either side of my knees. I could hit the button on the right with my knee to turn right, or the one on the left to turn left.
At first, I was pretty bad at it. But after a while, I got the hang of it and was able to score a couple of goals. The experience made me realize that I shouldn’t take my hands for granted.
“The primary use case for this handheld game controller makes a lot of assumptions: that you have two thumbs, that you can reach the buttons, and that you have the endurance to hold it,” Johnson said.
The next demo showed how you can play a game using only your mouth or your face. Quadriplegic people don’t have use of their limbs, so the team showed off a “quad stick,” which is a device that you can blow air into or suck on. Johnson showed how he could use it as a computer mouse to play a game of Mahjong. With three tubes, a user could get the functionality of six buttons.
Thomas showed another demo of Solitaire, which uses Tobii eye-tracking sensors to allow you to control the mouse cursor with your eyes. And Johnson used his voice to control where to point the cursor. He said, “Mouse grid. Six. Four. Double click.” The commands brought up a grid and focused on one part of the screen and then performed a double-click on the object in that section of the screen.
Next to that demo was the Eyes Free station. The screen was covered, giving the visitor the same experience that a blind person would have. Thomas used a game controller stick to move around the user interface menu. As she did so, the computer read aloud the description of the icons that Thomas was hovering over. That allows a blind person to maneuver through menus and start listening to a Netflix video or otherwise navigate through a visual user interface.
Across the room, Halo Wars 2 was running on a big TV. Thomas showed how you can normally interact with friends in multiplayer through voice chat. But if you can’t hear, then you won’t be included in the conversation. So the system can translate your voice to text so that someone who can’t hear can see what is being said. They can then respond with voice or typed responses. The same kind of system can also work for people who can’t speak or can’t see.
As we all know, my big disability is an inability to play Cuphead. So Thomas showed something called Co-pilot, which enables two people to control the same character in a video game. The controllers are virtualized as if they were one controller.
One person might use the controller merely to keep shooting, while a second player handles the tough part of jumping at the right moment. With Cuphead, I could actually perform an easy role and get someone else to play the hard part. The Co-pilot could be useful if you’re showing a young child how to get past a tough part in a game, without forcing the child to give up using a controller. I played Cuphead with Johnson, and it turned out to be far easier than with just one person.
“We had blind people in here who said that their kids could be their eyes,” Johnson said.
The video above shows a girl playing Minecraft with her father using Co-pilot. The Minecraft education team took a tour of the lab. Johnson said the aim isn’t to tell game developers what to do. Rather, it is to let them hear what people need, and then come up with their own ways of serving those people.
“It lights up a whole set of experiences for kids, parents, grandparents, and more,” Thomas said.
Johnson and Thomas and their colleagues have made most of the devices from off-the-shelf components. AbleNet makes big buttons that can be incorporated into larger designs. Friends at Special Effect helped the Microsoft team create new kinds of controllers. Microsoft tries to work on technologies that deliver the biggest bang for the buck, but the goal is to enable everyone on the planet, Johnson said.
Thomas said, “Our goal is to open up gaming for everybody. We are exploring all of the aspects of gaming and how we can light those up for people.”