Suara Malaysia
Monday, December 23, 2024

    Google works to reduce non-consensual deepfake porn in search


    Google is making adjustments to its search engine to reduce how prominently sexually explicit fake content appears in results, responding to the explosion of non-consensual imagery people have created using generative artificial intelligence tools.

    When that AI-generated content features a real person’s face or body without their permission, that person can request its removal from search results. Now, when Google decides a takedown is warranted, it will filter all explicit results on similar searches and remove duplicate images, the company said July 31 in a blog post.

    The Alphabet Inc unit also said it had improved its search ranking systems so that explicit fake content would not appear as top results – a change that Bloomberg reported in May was already in the works.

    “We’ve long had policies to enable people to remove this content if they find it in search, but we’re in the middle of a technology shift,” said Emma Higham, a product manager who spearheads protections for Google’s generative AI technology in search and other apps, in a briefing with reporters. “As with every technology shift, we’re also seeing new abuses.”

    In 2023, Bloomberg found that Google Search was the top traffic driver to websites hosting deepfakes, or sexually explicit AI-generated pornography. On Google, searches pairing many well-known celebrities’ names with the word “deepfake” pointed users to MrDeepfakes.com and other sites that largely exist to trade in pornographic imagery. One year ago, Google Search accounted for 44% of the 4 million desktop visits to MrDeepfakes.com, according to data from Similarweb.


    Google has been adjusting the results for queries specifically seeking deepfake content tied to someone’s name. Instead of showing those images, the search engine will be trained to surface high-quality, non-explicit content, like news articles, when those results are available, the company said. So far, such changes have reduced exposure to explicit image results on these types of queries by over 70%, Google said.

    Between April and May, US-based search traffic to the top two deepfake pornography websites plummeted, according to data from Similarweb published in a May Bloomberg report.

    Google is also demoting websites that feature a high volume of pages that have been removed from search because they violated policies against explicit fake content, the company said in its blog post.

    In a separate interview, Higham, the product manager, said that Google is facing new challenges in deciding when to take action on non-consensual explicit imagery and when to leave results on its search engine untouched, citing the need to balance users’ ability to find information with their online safety. Advocates have criticised Google for de-ranking, rather than completely de-listing, the deepfaked content.

    “We have to be careful about not taking too blunt an approach and having unintended consequences on access to information,” Higham said, referencing adult performers who may not want explicit, consensual content to be de-ranked on the search engine. “But when we’re seeing high-risk queries and a high risk of fake explicit content showing up non-consensually, we are taking strong action now.” – Bloomberg
