Paul LaFontaine has lived through as bad a crisis as any chief executive of an online company has encountered. The head of Sulake, creator of the social-networking service Habbo, faced a public outcry after a two-month TV news investigation found the company’s virtual-world site, which had 250 million registered users, to be “full of pornographic sexual chat despite the fact that it is aimed at children as young as 13 years old.” The report prompted venture-capital firm Balderton Capital to drop its stake in Habbo, and retailers stopped selling access cards for the game. Habbo “muted” all chat, effectively removing one of the main reasons teens visited the service. LaFontaine went into crisis mode and had his team conduct a full review of chat records, which showed that about 3.7 percent of members had engaged in inappropriate chat. The service lost 35 percent of its user base.
Habbo now has about 85 percent of its users back and is on track to grow beyond the pre-crisis size of its audience. At the Casual Connect game conference in Seattle, LaFontaine recently called for an online safety standard and for collaboration among sites to make game worlds safer for kids. We sat down with LaFontaine for an interview. Here is an edited transcript.
Paul LaFontaine: I’m going to give a brief overview of the story, the events that occurred, and what we did in response. I think people have read about it, but they might be interested in hearing some of the details. Then I’m going to talk about the need for the industry to actually set standards around what good safety is. Because right now everyone is positioning themselves, alone, as, “I set a standard by doing X. I set a standard by doing Y.” Another person says, “I do Z.” It creates the collective perception that there really is no standard, just multiple approaches. It’s time for the industry to stop, take stock, and actually say, “This is what we consider to be good safety.” And third, we’re going to make as much as we can of our lessons learned available to our colleagues, so the collective can learn from this, much like any sort of incident where lessons can be learned. Rather than retreating, we’re going to become more transparent. We’d like to be the first ones to push this idea that the whole industry should make safety non-negotiable. We all create safety and we’re all going to learn together, and then we compete on fun and engagement and these types of game concepts. Not safety.
GamesBeat: How much will you suggest as far as the minimum that everybody should do? As you tell the story of what happened to you guys, what would be apparent?
LaFontaine: We think that there are three levels that make a site safe, or as safe as possible. There’s the community level. You have to engage the community in collaborating to be educated and to help identify when there’s inappropriate content occurring. At a minimum, the community needs to be engaged. There needs to be a software layer. Certainly you need to have some sort of algorithmic way to filter chat and filter communication. And then there is what you might call a judicial layer: rules and sanctions for acting on what the other two layers surface.
GamesBeat: For you guys, how long was it down? The whole service.
LaFontaine: We limited communication for a period of a month. We started by muting it completely while we went back and investigated, to make sure we had a complete understanding of the situation. Then we began to restore limited, what we call white-list chat, meaning you could only say certain words that we’d put into a white list. Our team developed that in three days, and then over a period of three weeks, we put all the sites back online. Ultimately, the site is about friendship. We wanted friends to be able to communicate, even in a limited way. And we’ve been moving back to a freer, but filtered, chat. We now have 85 percent of the user base restored to that. By the end of next week, everyone will be back.
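The white-list chat LaFontaine describes can be sketched in a few lines. This is a purely illustrative toy, not Sulake’s actual implementation; the word list and the masking behavior are assumptions.

```python
# Sketch of a white-list chat filter: only words on an approved list pass
# through; everything else is masked. The list here is hypothetical.
WHITE_LIST = {"hi", "hello", "friend", "play", "fun", "thanks", "bye"}

def filter_message(message: str) -> str:
    """Replace any word not on the white list with asterisks of equal length."""
    filtered = []
    for word in message.split():
        if word.lower().strip(".,!?") in WHITE_LIST:
            filtered.append(word)
        else:
            filtered.append("*" * len(word))
    return " ".join(filtered)
```

A white list inverts the usual blacklist approach: instead of enumerating what is forbidden, it enumerates what is allowed, which is far more restrictive but much harder to evade with misspellings.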
GamesBeat: How much drop did you see in activity?
LaFontaine: Initially we saw around 35 percent of folks staying away. We’re now back up to 85 percent.
GamesBeat: Does the trend look like it’ll completely come back, then?
LaFontaine: We think it’ll completely come back. We also plan on opening the Turkish market, so we think in short order we’ll be ahead of where we were before we had to mute the site.
GamesBeat: When you looked back, what did you find? Was there a way to improve it on any one of those levels?
LaFontaine: Sure. We went back and took, basically, an algorithmic approach. We keep all the chat logs for a period of time, so we can recreate what happened. We went back and found that 3.7 percent of the users in that period were engaging in inappropriate chat. When we did the unmute and could see the comments of the whole community again, one thing that we saw was that there was more talk about avoiding that behavior than there was about the positive alternative. So we think that the inappropriateness was over-represented in terms of what people were talking about. We initiated what we called the Habbo Patrol. We called for volunteers from people who wanted to help police the site, engaging that upper layer of the community. We had 8 percent of the users volunteer to be on the Patrol. You look at the whole user base, about 3.7 percent are behaving inappropriately. Eight percent were passionate enough to help police that. Then the remainder were just neutral, using the service to interact with their friends and have fun. When we initiated the Patrol and added that layer, it immediately put the odds on the side of appropriate behavior.
GamesBeat: Have you already been working on filtering out predators as well?
LaFontaine: Very actively. One area that I want to be really clear on is, we have a deep cooperation with law enforcement on that issue. We can’t talk about our individual cases, but we have contributed to convictions, and we have been the reporting agency in that area. We have a very strong stance on that. We’re very aggressive in terms of pursuing that kind of behavior…We want to set the record straight on that. We have an absolutely hard stance on that issue. We have a data-sharing agreement with law enforcement. We follow all the privacy procedures. We’ve mapped out every jurisdiction. We’re tied into the national child-safety task force. So we have an incredibly close and long-standing relationship with law enforcement on that issue.
GamesBeat: How many people are there now on the service?
LaFontaine: Right now it’s about 6 million a month. That breaks down to around 2 million a week. We use weekly actives because of the way teens’ lives are organized. They’re organized around the weekend. Catching them on the weekend is a big part of the service.
GamesBeat: What’s the hot thing they do now, what’s trending?
LaFontaine: They form groups and roleplay. The Habbo Patrol that I mentioned, it was a service to the community, but it’s also a chance to roleplay. Within the Patrol, they created ranks. They create a room…. What they do is they socially organize around status. There’s a very clear hierarchy of who’s been there the longest, who has the most badges, who is the expert at a certain thing, who wins contests. What they like to do is create really social groups. That’s the main activity. We have over 65 percent of the users involved in some active group. That group is a tribe or an identity around a certain activity.
GamesBeat: Have you ever broken out how many people deal with the safety part [of Habbo], then?
LaFontaine: We have 28 staff directly responsible for overseeing the day-to-day activity. We have a team of 16 that is a product team. They create new tools for us. And then we devote a significant proportion of our expenses to the moderation complement: people who moderate and use the tools to follow different activities. Easily a third of our cost base is around safety and moderation. I think that’s been consistent for the service throughout its life.
GamesBeat: It sounds like you have an increase, then, in the number of people who are willing to join the self-policing efforts.
LaFontaine: For sure. Habbo has had programs over the years that…they called it all sorts of different things. But they were largely manual because many of the social-networking tools that are very common today weren’t available. The idea of reputation management…you can earn reputation points, and that way, we give you more credibility as you become an internal guardian. That’s something there was less precedent for. Because Habbo started in the early days of this version of the Internet. Now, when we talk about a reputation-management system…there are several different models we’ve drawn from to create this skills base, and we can automate it. We can create a sustainable program around self-policing, the gamification of self-policing, because you achieve more levels the more you help. The more you monitor, the higher you go. We’re gamifying it in a way, which is part of what Habbo is all about.
GamesBeat: How long has that kind of gamification of the safety part been around?
LaFontaine: Well, for Habbo, I can only speak for our site…. When we reorganized in February, that’s when I put the full product team on the creation of this talent tree. The talent tree starts as a tour guide or a helper role, rather than a policing role. In order to build your reputation up, you want to show that you can reliably interact with other users. You can build up through that. We launched that at the end of May and had tremendous participation. All we’re doing now for the next phase is adding a level, the Guardian level, which allows you to take some limited police actions. It allows you to participate in the process of cleaning up the site. The infrastructure had already been built earlier this year. The other thing we did that we started earlier in the year is, we opened an API to the platform, so third parties can build entertainment content, games, and so on for it. We’re launching, next week, our first third-party game. So what happens is, we increase the skill path for safety, and we put more games in the service. There’s more to do, more people watching what’s going on. It’s part of a plan we initiated back in February.
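The talent tree LaFontaine describes, where helper roles come first and limited moderation powers unlock only at the Guardian level, can be sketched as a simple threshold ladder. The point thresholds and the “member” base role are invented for illustration; only the tour-guide, helper, and Guardian role names come from the interview.

```python
# Hypothetical reputation ladder: users climb from helper roles toward
# Guardian, the only level with limited police actions. Thresholds are
# illustrative, not Habbo's actual values.
LEVELS = [
    (0, "member"),
    (100, "tour_guide"),
    (500, "helper"),
    (2000, "guardian"),  # only this level can take limited police actions
]

def role_for(points: int) -> str:
    """Return the highest role whose threshold the user has reached."""
    role = LEVELS[0][1]
    for threshold, name in LEVELS:
        if points >= threshold:
            role = name
    return role

def can_moderate(points: int) -> bool:
    """Limited police actions are gated behind the top level."""
    return role_for(points) == "guardian"
```

Gating moderation powers behind a long climb is the gamification LaFontaine mentions: the design makes trust something users earn through sustained, visible helpfulness rather than something granted on signup.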
GamesBeat: Is there any correlation between how much time people put in [Habbo] and whether or not they were part of this 3.7 percent?
LaFontaine: Yes. People who spend more time on the site are less prone to participate in this. What we see is that the people who participate in this kind of inappropriate behavior pop in. They make an account. They do their thing. They pop out. Then they pop in with a different account. Which is why we’re moving to a system where we’re stiffening the requirements to make a new account. You have to go through the safety program…. You have to earn your way to free chat. It mitigates this fast move in, fast inappropriate behavior, fast out. We definitely want to look at that behavior cycle. That’s part of what we did to improve the site.
GamesBeat: Is there a particular point at which you decide to go to law enforcement? A repeat-offender sort of thing?
LaFontaine: There are understood patterns of behavior that are suspicious. Those are understood by psychologists and by law enforcement. So we sit with law enforcement and we say, tell us the patterns of behavior. When we identify the pattern of behavior, we can flag it. That’s what we do. We have initiated several investigations, and we’re the collaborative partner with law enforcement. We are very clear on how to do that. Our user-care people are trained on how adults who prey on children behave. They go through training on that. They’re able to identify that and verify it through the log once it’s flagged by the software. We have a system in place to handle that.
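The pipeline LaFontaine outlines, where software flags a known behavioral pattern and trained humans verify it against the logs, can be sketched generically. The real patterns come from law enforcement and psychologists and are not public, so the rule below (a brand-new account fanning out to many recipients, echoing the pop-in/pop-out cycle described earlier) is entirely made up for illustration.

```python
# Illustrative rule-based flagger. The actual behavioral patterns Habbo
# matches are defined with law enforcement and are not public; this rule
# is a placeholder. Flagged accounts go to human review, never to
# automatic action.
from dataclasses import dataclass

@dataclass
class Account:
    age_hours: float          # how long ago the account was created
    distinct_recipients: int  # distinct users this account has messaged

def is_flagged(acct: Account) -> bool:
    """Flag brand-new accounts that immediately message many different users."""
    new_account = acct.age_hours < 24
    high_fan_out = acct.distinct_recipients > 20
    return new_account and high_fan_out
```

The key design point is the division of labor: the software only narrows millions of users down to a small review queue, and the trained user-care staff make the actual judgment from the chat logs.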
GamesBeat: Are they able to track some of the offenders, then, across different games and services?
LaFontaine: Depends on law enforcement. Law enforcement has very strict rules around privacy, as you know. So in each jurisdiction, they have a different amount of information that’s appropriate to share. We always start there. We sit down with a law enforcement agency and we say…if we identify suspicious behavior, what is the appropriate way to report? We start from there and work backwards. It varies by jurisdiction.
LaFontaine: It doesn’t always indicate anything specific because there could be a variety of reasons for that. There’s a pattern of behavior that is very clear, and it has nothing to do with dirty chat. That pattern of behavior is…you go to your psychology books, and it lays it out pretty clearly. When you set your software to detect a pattern of behavior that’s very well known in the law-enforcement-psychology community, you can flag it pretty quickly. One of the facts that I’d like to set straight is the claim that we didn’t use software. We used software to a tremendous degree. Our software filtering, the way that we built white-list and blacklist systems, we’ve done all this. We’re still going to partner with a software vendor, which we’ll announce very shortly, to continue to make sure we have the best software available. That software layer has always been very important to the company, and it remains so. There was a bit of a discussion about large numbers of users just running around in a circle; that’s totally not true. We’ve always used software, back to 2003, when we developed our first filtering system.
GamesBeat: It still sounds like there’s no perfect software, no perfect system.
LaFontaine: No. There’s no silver-bullet solution. The combination of community, software, and then your judicial system, if you will, has to rest on the community. A community of young children with adults looking over their shoulders is going to have a different system than a community of teens who specifically don’t enjoy having adults looking over their shoulders. A 13-plus set is going to have a different approach than anyone else. Our community is unique in its enjoyment, so we have to have a system that responds to that.
GamesBeat: Is the industry changing much, or otherwise being very proactive here? Is it consistently improving?
LaFontaine: They are. We sit on the EU CEO Coalition for Child Safety. It’s a group of companies and CEOs who signed off on a charter saying, we’re going to adhere to five areas of improvement. We were one of the early members of that. I was just at a meeting there a month ago. The really big topic on everyone’s mind is imagery and how imagery is uploaded onto the Internet, and certainly creating protections around that. Most of the software solutions that I’ve seen, just anecdotally…there’s a lot of discussion around software solutions that go in that direction. Well, Habbo, we have no imagery. We don’t allow any imagery, specifically because that’s an area where we want to make sure we maintain absolute control. In Habbo, the only expressions are movement and chat. For those services that allow uploading photos, for example…moderating and controlling the imagery is a very big area.