When Jessie Humani made her debut in April, she was a young white woman with straight reddish-brown hair, wearing a tank top and a pair of sunglasses. About a month ago she became a racially ambiguous young woman with hoop earrings, curly hair, and a nose ring.
I see and try new bots daily. Occasionally, early in their lives, bots change their name, logo, or avatar, but this is the first instance I've encountered of a bot's avatar transforming from a white person to a person of color.
It's fair to say that when a personal assistant has a face, that face is very often less than diverse. I'm aware of no formal analysis of diversity among avatars, but virtual assistant avatars are quite often white women.
PullString didn't make Jessie a woman of color to make a statement at a time of much conversation about inequality and the lack of diversity in tech. Nor was it because it makes business sense for a character meant to have the personality of a teenage girl to be a person of color: according to U.S. Census Bureau data, more than 50 percent of people under 18 will not be white by 2020.
And it’s not because they wanted to revamp the bot’s personality, said PullString CEO Oren Jacob, who spoke with VentureBeat at the PullString office in San Francisco.
Jessie changed because the team who built Jessie changed how they felt about her four months after her launch.
The real Jessie
Jessie Humani is a character bot, part of PullString’s effort to make bots for a wide array of age groups. PullString was founded by former Pixar employees, and it has made characters for Mattel (Hello Barbie), Thomas the Tank Engine, and Sesame Street. On Tuesday the company released its platform to help people make bots and characters.
Talking to Jessie is like a choose-your-own-adventure conversation with a playful young woman who's having trouble with adulting and needs your help. When you meet her, she has just lost her job and her apartment. She's got a job interview lined up, and she's supposed to be looking for a new place when she experiences a "hottie sighting" at the coffee shop.
“It came from a feeling about that drawing and what it would be like if you imagine a piece of art being a nice personality you can talk to, like what would it be like to hang out with that Jessie. The Jessie bound to that image versus the Jessie image we chose at launch, our opinion of her drifted over several weeks,” he said.
PullString released the bot this spring on Kik, Facebook Messenger, and Skype. As with any other product, the PullString team followed the bot’s initial progress, shared it with family and friends for a few months, and came to find that the bot’s avatar no longer matched its personality, Jacob said.
“At some point, we’re sitting at lunch here and someone says, ‘I don’t ... think we picked the right one. I think she’s actually more like Jessie this way,’” he said. “I think in the end we just felt having lived with that personality and the character for a couple of weeks, as a group, we felt that the visual representation of her now more closely maps to how we feel about her, having texted and lived with her.”
Reflecting the brand
Like a company logo, a bot mascot or avatar is chosen based on the impression the brand behind it wants to leave with the consumer.
Mattress maker Casper’s Insomnobot, for example, has the personality of a night owl.
A.I.-powered group shopping bot Kip chose its blue penguin mascot because blue is a soothing color, penguins are sort of gender-neutral animals, people like penguins, and they’re generally awesome animals, Kip cofounder Rachel Law told VentureBeat.
With major conversational commerce buy-in from Facebook, Apple, Microsoft, Google, and developers in recent months, we could all be on the edge of a wave of bots and virtual assistants.
Jessie Humani’s change is a reminder of the thinking that goes into the visual depiction of a bot, and of who, or what, A.I.-powered bots will look like in the coming wave. It’s only software, but especially in an industry in which virtual assistants are almost always female, how these bots or assistants look can bring up issues of race, gender, and identity.