Kik, a chat app with 15 million monthly active users, is primarily used by teenagers and has been called a de facto app for child predators. Today, Kik announced that it has appointed a safety advisory board ahead of its plans to launch a cryptocurrency that will be used to expand its developer ecosystem.
In May, Kik revealed plans to base transactions on the chat app around Kin, an Ethereum-based cryptocurrency maintained by a foundation Kik established. An initial token sale in September raised $98 million.
“We’re bringing [the advisory panel] on now to help us inform this strategy, so as Kin evolves and there are a few iterations, we will have a wide breadth of expertise advising us on making these product decisions and the strategy decisions with Kin,” Kik’s head of trust and safety, Catherine Teitelbaum, told VentureBeat in a phone interview. “They are an advisory [panel], but my philosophy and the philosophy for Kik is we want that advice, and we want to get this right from the get-go, and we believe a big part of getting it right and baking safety into our cryptocurrency is having these educated minds and having them be empowered to give us very frank advice.”
In a reversal of initial plans for the chat app’s cryptocurrency project, Kin will only be given to users 18 or over, Teitelbaum told VentureBeat. Initial Kin Foundation plans called for all Kik users to receive wallets preloaded with Kin in order to seed an ecosystem of human and automated services. The first Kin transactions on Kik will now focus on buying and selling editorial content or sticker packs, Teitelbaum said.
The advisory board announcement comes after a $10 million commitment made earlier this year for trust and safety investments in the platform. It also follows a series of reports earlier this year that found sexual predators and convicted pedophiles routinely used Kik to contact underage users.
“The terms, guidelines, and our respective policies for content moderation have all been expanded and are being used with more rigor than they were in the past. I think we still have room to grow on that and that’s why we’re investing so much more,” Teitelbaum said.
The board will have the power to publish reviews of safety matters without permission or review by Kik Interactive or the Kin Foundation, said Teitelbaum, who is relatively new at the company. She added that when she started at Kik she saw areas that needed urgent revision. Those changes include expanding user controls, such as giving users the ability to choose who can contact them and making it clearer how to report something when things go wrong.
Earlier this year, a sweep of 18,000 public groups found that 4,000 of those groups were in violation of Kik’s terms of service, according to a Forbes report.
Safety upgrades planned for Kik in the year ahead include using machine learning to moderate content flagged by users. Kik and its partners are working on those moderation tools now, Teitelbaum said, and a company spokesperson told VentureBeat that additional details will be announced in 2018.
One solution available today comes from Crisis Text Line, an all-volunteer operation whose machine learning is used on several platforms to identify people in need of help with everything from suicide prevention to anxiety. Teitelbaum is currently in talks with Crisis Text Line to bring additional support to Kik for those with less serious concerns, like users who are nervous before taking a big test or who have health or financial worries.
Another chat service, Koko, also offers emotional support. Plans to implement Koko and Crisis Text Line into Kik were shared with VentureBeat last fall.