20 million people are sharing their deepest secrets on this app

VENICE, Calif. — Bullying. Sexual harassment. Terrorist plots. Anonymous social media apps have been called havens for any number of these activities.

Anonymous apps have found themselves in the spotlight over the years: One saw so much harassment that it led to the murder of a college student, while another has been dubbed the “hot new thing” among jihadists.

Whisper, an anonymous social app that launched in 2012, wants to prove that anonymous social platforms can be a vehicle for stories that often go unheard, like men opening up about being raped or what it’s really like to wear a hijab. Users turn to the app to vent and divulge their innermost thoughts for others to “heart” and respond to.

Whisper cofounder and CEO Michael Heyward said the most used and the most searched-for word on the app isn’t one of hate or anger: It’s “love.”

Whisper has raised over $61 million in funding, but it hit a road bump in October 2014: The Guardian published an article questioning Whisper’s claims of anonymity, though the paper later walked back some of its reporting. Since then, the company has been quietly growing.

On Friday, it announced that it has 20 million active users in 187 countries — double the number of users it had six months ago.

“We’re actually really not even about anonymity, we’re about authenticity,” said Heyward, 28, while noting that the company treats all its user data “like private health information.”

Creating a safe space for authenticity isn’t easy. That’s why the startup — whose golden rule is, “Don’t be mean, don’t be gross, and don’t break the law” — has been laser-focused on moderating its content. Heyward says it’s the company’s core competency.

“We set the tone in the very beginning to be positive,” he said. “We have zero tolerance for bad stuff.”

Whisper bans things like bullying, fat-shaming and slut-shaming.

The goal is to filter out such posts proactively so they never go live on Whisper.

For years, Whisper has been using human moderators to vet posts. For the past year and a half, Whisper has also been automatically flagging posts that appear to come from high schoolers: Words such as “parents” or “football game” point to a user’s age, prompting moderators to review the post more closely.
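Whisper hasn’t described exactly how that flagging works. As a rough illustration only, the kind of keyword screen the description implies could look like the sketch below; the function names, queue labels and example post are invented for the example, and only “parents” and “football game” come from the company.

```python
# Illustrative sketch only; Whisper has not published its flagging logic.
# Everything beyond the two example phrases is an assumption.

HIGH_SCHOOL_HINTS = {"parents", "football game"}  # age cues named in the article

def likely_high_schooler(post_text: str) -> bool:
    """Return True if the post contains phrases that hint at a high-school-age author."""
    text = post_text.lower()
    return any(hint in text for hint in HIGH_SCHOOL_HINTS)

def route_post(post_text: str) -> str:
    """Send posts with age hints to closer human review; others follow the normal flow."""
    if likely_high_schooler(post_text):
        return "priority_human_review"
    return "standard_queue"

print(route_post("hope my parents let me go to the football game friday"))
# -> priority_human_review
```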

Yik Yak, an anonymous social platform popular among college students, says that it bans high schoolers altogether.

Six months ago, Whisper developed a machine-learning system to make faster decisions about posts’ content. Using hundreds of millions of data points based on decisions that human moderators made in the past, The Arbiter (as it’s called internally) can determine which posts to allow and which to ban.

“There’s a level of friction when a human moderator has to take the time and make a decision,” said Nona Farahnik, Whisper’s director of trust and safety. “A machine is doing it in the blink of an eye.”

When the machine isn’t sure how to categorize the post, it goes to the human moderators.
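Farahnik’s description maps onto a common triage pattern: train a classifier on past human decisions, act automatically when the model is confident, and escalate everything else. The sketch below shows that pattern in generic form; the toy data, model choice and confidence threshold are illustrative assumptions, not details Whisper has disclosed.

```python
# Generic sketch of the triage pattern, not Whisper's actual system:
# learn from past moderator decisions, auto-decide when confident,
# and hand uncertain posts back to human moderators.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny stand-in for the "hundreds of millions" of past moderation decisions.
past_posts = [
    "sending love to anyone going through a hard time",
    "you are worthless and everyone hates you",
    "i finally told my family the truth and it felt amazing",
    "nobody would miss you if you disappeared",
]
past_decisions = ["allow", "ban", "allow", "ban"]

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(past_posts, past_decisions)

CONFIDENCE_THRESHOLD = 0.8  # arbitrary cutoff chosen for the example

def triage(post_text: str) -> str:
    """Return 'allow', 'ban', or 'human_review' for a new post."""
    probabilities = model.predict_proba([post_text])[0]
    best = probabilities.argmax()
    if probabilities[best] < CONFIDENCE_THRESHOLD:
        return "human_review"            # the machine isn't sure: escalate
    return str(model.classes_[best])     # confident automatic decision

print(triage("sending you so much love tonight"))
```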

(Yik Yak said it is also committed to creating a safe place for anonymous comments, but it relies on a combination of moderation and user upvoting or downvoting to flag content. “Encouraging a positive, constructive, and supportive community environment on Yik Yak is a top area of focus for us,” said a spokeswoman.)

Whisper’s Farahnik also works closely with law enforcement, Whisper data scientists and a team of more than 100 moderators in the Philippines. Farahnik or a colleague travels to the Philippines on a quarterly basis to train moderators on cultural issues. That includes showing clips from movies like Mean Girls and sharing recent newspaper clippings.

After all, precise moderation is also a business strategy: “There’s no successful [company] that can be built on top of a product that makes kids cry,” said Heyward.