VANCOUVER - As online communities come under attack from cyberbullies, racist speech and spam, a British Columbia tech firm has developed technology to keep the trolls under the bridge.
Community Sift, based in Kelowna, has built digital armour for social media and gaming companies trying to protect their virtual worlds. The chat filter and moderation tool examines real-time website commentary, chat room conversations and banter between game players.
"We're not just talking about four-letter words," said CEO Chris Priebe, a senior programmer and security specialist. "We want to get rid of bullying across the entire Internet."
The firm's technology advances a global campaign against digital abuse in part spurred by the 2012 suicide of Amanda Todd, a teenager from Port Coquitlam, B.C., who was victimized by online sexual exploitation.
"The Amanda Todds of the world, we want to prevent that," said Karen Olsson, the firm's chief operating officer. "We want to be part of the solution."
Based on the firm's analysis of the four billion messages it sorts daily, less than one per cent of social users behave badly, yet they cause the bulk of the harm. Offensive material is classified into categories such as bullying, sexting, racism and bomb threats.
The firm has catalogued more than one million phrases used frequently by trolls, for example, "u r so ugly," Priebe said.
The technology takes context into account when identifying toxic behaviour. It combines machine learning with human verification, employing artificial intelligence alongside 30 language specialists. Priebe said online users are shielded from cyberbullies much as anti-virus software protects computers.
"We're looking for social viruses that are causing social destruction of social products and social lives."
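The article describes a catalogue of more than a million troll phrases, sorted into offence categories, that is matched against messages after accounting for spelling and spacing variants. A minimal sketch of that idea might look like the following; the phrase list, category names and function names are invented for illustration and are not Community Sift's actual system.

```python
import re
from typing import Optional

# Tiny stand-in for the firm's catalogue of troll phrases,
# each mapped to an offence category (illustrative only).
PHRASE_CATALOGUE = {
    "u r so ugly": "bullying",
    "kill yourself": "bullying",
}

def normalize(message: str) -> str:
    """Lowercase and collapse whitespace so spelling variants match."""
    return re.sub(r"\s+", " ", message.lower()).strip()

def classify(message: str) -> Optional[str]:
    """Return the offence category if a catalogued phrase appears."""
    text = normalize(message)
    for phrase, category in PHRASE_CATALOGUE.items():
        if phrase in text:
            return category
    return None

print(classify("U  r so UGLY"))     # bullying
print(classify("have a nice day"))  # None
```

A real system would layer machine learning and human review on top of this kind of lookup, since raw phrase matching cannot weigh context the way the firm describes.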
About 30 global clients are already using Community Sift. The flexible technology is tailored to client specifications, such as modifying content filters to be age appropriate.
An internal database query by the firm estimated it has protected at least 34 million users over a recent two-week period in its U.S. data centre alone.
Online cruelty inflicted on a Kelowna teenager was also part of the impetus for Community Sift, Priebe said. The teenager was goaded into uploading a selfie that trolls turned against her, generating pages of comments urging her to kill herself.
To derail such attacks, the technology sifts posts to emphasize positive comments from the 40 per cent of online users who are normally well-behaved.
"They're going to say, 'You're beautiful, you're wonderful, you're helpful,'" Priebe said. "Now she'll have two voices inside her head and she can build the ability to handle all this bullying."
The firm builds reputations for users participating online and detects when someone crosses a high-risk threshold. Consequences may include routing identified trolls into queues where a moderator decides whether the content is inflammatory, silencing them automatically or banning them outright.
"We always joke you can put them in the basement with all the other trolls and let them harass themselves," Olsson said.
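The reputation system described above — track a user's record, then escalate from human review to automatic muting to an outright ban as they cross thresholds — can be sketched as follows. The strike counts, action names and class are invented for illustration, not drawn from Community Sift's product.

```python
from dataclasses import dataclass

@dataclass
class UserReputation:
    """Tracks flagged messages for one user (hypothetical thresholds)."""
    strikes: int = 0

    def record_offence(self) -> str:
        """Record one flagged message and return the consequence."""
        self.strikes += 1
        if self.strikes >= 5:
            return "ban"             # banned outright
        if self.strikes >= 3:
            return "mute"            # silenced automatically
        return "moderator_queue"     # held for human review

user = UserReputation()
actions = [user.record_offence() for _ in range(5)]
print(actions)  # ['moderator_queue', 'moderator_queue', 'mute', 'mute', 'ban']
```

Escalating tiers like this match the article's description of giving moderators the borderline cases while reserving automatic penalties for repeat offenders.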
Others have also taken up the cause.
As a Google Global Science Fair project, 13-year-old Trisha Prabhu of Illinois designed software that detects hurtful language. Her program, ReThink, prompts posters to think twice before hitting send; she found that more than 93 per cent of teens altered their posts when prompted.
Programmers with the National Youth Mental Health Foundation in Australia have also developed a Google Chrome extension called "reword" that flags potential insults by crossing them out with a red line.
Community Sift identifies the tone of online communities rather than policing the Internet, Priebe said. It gives users options to choose settings for avoiding unwanted content, in the same way moviegoers can select films based on ratings.
Medium.com, an emerging social platform, has deployed Community Sift to protect its users as they interact and post personal stories.
"We want to provide the best place for people to freely and openly express themselves," said Greg Gueldner, who implements the startup's trust and safety protocol.
Priebe has boosted online security before, co-developing safety and moderation tools for Club Penguin, a virtual world where children can play games and interact safely. The company partnered with Disney in 2007 and has a user base of 300 million.
The B.C. programmer, who has his own painful story about being bullied into his teens, said people currently believe they're powerless against trolls.
"When people realize that it's a solvable problem," he said, "they won't put up with it anymore."
— Follow @TamsynBurgmann on Twitter
Note to readers: This is a corrected version of a story originally published April 3. The earlier story erroneously reported that Chris Priebe co-founded Club Penguin.