After a string of violent crimes shown live on social media, companies around the world are racing to improve artificial intelligence so software can automatically spot and block undesirable videos. Rosanna Philpott reports.
Grisly murders on Facebook Live. On Tuesday (April 25), a self-shot video of a father in a deserted hotel in Thailand killing his 11-month-old daughter by hanging her with a rope from a rooftop lingered on the site for 24 hours. Barely two weeks ago in Cleveland, a man bragged live about murdering a 74-year-old before posting the video of the killing itself online. Just the latest in a string of violent crimes appearing on the social media site.

Now competing to solve the policing problem: tech firms. From Singapore to Finland, they are racing to improve artificial intelligence so it can automatically spot and block content before it goes viral.

(SOUNDBITE)(English) PRESIDENT AND CEO OF GRAYMATICS, ABHIJIT SHANBHAG, SAYING: "So it will detect 'there is a person', it will detect 'there is a baby', it will detect the activities that the person is involved with the baby, it will detect the emotions of the person, it will detect any, the screaming sound of the baby and various other clues that might be there. So if the baby has moved from one place to another place in an anomalous way that will be part of the data which it will extract."

Graymatics is an AI company in Singapore using "deep learning": a type of AI that mimics the way neurones work and interact in the brain. The system is taught by feeding it images of what might appear in a violent scene: anything from weapons to hacking movements and blood. But the software can only identify examples it has been taught to watch out for, and until now, someone hanging a child from a building may not have been one of them.

As a dozen or more companies wrestle with the problem, many are focusing on this method. Facebook is under pressure over how it monitors content and live feeds, and Google faces similar problems with its YouTube service. None, so far, claim to have cracked the problem completely.
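Graymatics has not published how its system works, but the general approach described, training a deep network on labelled examples of violent imagery so it can flag similar frames, can be sketched briefly. The Python sketch below is a minimal illustration only, assuming a hypothetical folder of video frames labelled "violent" and "safe"; the dataset path, labels, and model choice are assumptions, not the company's actual code.

```python
# Minimal sketch: train an image classifier to flag possibly violent frames.
# Assumes a hypothetical dataset laid out as frames/violent/*.jpg and
# frames/safe/*.jpg -- this is NOT Graymatics' system, just an illustration.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

# Labelled training frames: the network only learns the categories it is
# shown, which is why an unseen kind of violence can slip past it.
train_set = datasets.ImageFolder("frames", transform=transform)
loader = DataLoader(train_set, batch_size=32, shuffle=True)

# A pretrained backbone with a new two-class head (violent / safe).
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()

# At inference time, frames scoring above some threshold would be flagged
# for blocking or for human review.
```

The sketch also shows the limitation the report points to: a classifier like this recognises only what its training images contain, so a scene unlike anything it was shown can go undetected.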