Facebook Apologizes for Auto-Suggesting Child-Porn Videos
Facebook has apologized after a glitch in its search function suggested that users might be interested in child sexual abuse videos.

While intelligent search algorithms are useful and are a large part of how we efficiently access information online, they sometimes lack the common sense a human would provide. Users took to Twitter to draw attention to the search suggestions, which included terms such as "video of girl sucking dick under water", "videos of sexuals" and "video of little girl giving oral". Other users reported similarly concerning results unrelated to their search history, including "video of florida school shooting", "videos of florida shooting" and "videos of school shooting".

The search feature still appears to be acting up.

Facebook has always wanted to help its users easily find what they want among the billions of statuses, photos, and videos shared on the social network.

Unlike Google's autocomplete feature, which is based on the words people use to search, Facebook mixes predictions that link to profiles and pages as well as to content like posts. "As soon as we became aware of these offensive predictions we removed them," the company said. The incident sparked outrage in the community and among politicians anxious about the safety of children using social media.

Questions are also being raised about the autocomplete search algorithm Facebook uses and whether this was just a one-off incident.

The problem was not limited to English-language users, and there were many reports of Facebook users seeing highly unusual suggestions even after the offensive ones were removed.