Suara Malaysia
Friday, September 20, 2024

    Bollywood star or deepfake? AI floods social media in Asia


    Several incredibly convincing deepfake images have caused quite the stir in Asia recently, from a Bollywood star dressed in skin-tight lycra to a Bangladeshi politician purportedly filmed in a bikini. But it’s more than just harmless fun – one such fake image has even been linked to a murder. This outbreak of deepfakes highlights the sophistication of artificial intelligence and the potential threats it poses to women across Asia.

    Neither the videos nor the photo are real, but they went viral in a vibrant social media space struggling to come to grips with technology that can create convincing copies capable of upending real lives. Indian actor Rashmika Mandanna expressed her concern in a post on X, formerly Twitter, that has amassed more than 6.2 million views, urging the community to address the dangers of identity theft through deepfakes.

    Like Mandanna, several other Bollywood stars including Katrina Kaif, Alia Bhatt, and Deepika Padukone have been targeted with deepfakes, highlighting the very real threat that generative AI poses to vulnerable individuals. The prevalence of deepfake content is particularly challenging in conservative societies, where women have long been harassed online and abuse has gone largely unpunished.

    But social media firms are struggling to keep up with this threat, which is escalating in severity with every new generation of AI software released into the market. Google’s YouTube and Meta Platforms have updated their policies to require creators and advertisers to label all AI-generated content. However, as AI continues to evolve and deepfakes become increasingly realistic, these measures may not be sufficient to protect individuals.


    Regulations to address deepfakes have been slow to emerge globally, though some countries have begun to act: China requires the reporting of illegal deepfakes, while South Korea has made it illegal to distribute deepfakes that harm the public interest. India is also taking a tough stance, requiring social media firms to remove deepfakes within 36 hours of receiving a notification or risk losing their safe-harbor status.

    Deepfakes of women and other vulnerable communities – especially sexual images and videos – can be particularly dangerous in deeply religious or conservative societies, human rights activists say. In Pakistan, a woman was allegedly shot dead by her father and uncle after a photograph of her with a man went viral. It was later revealed that the image had been doctored.

    Founder of the non-profit Digital Rights Foundation in Pakistan, Nighat Dad, has expressed deep concern over the threat to women’s privacy and safety, particularly as disinformation campaigns gain steam ahead of an election scheduled for Feb 8. She believes that deepfakes are creating an increasingly unsafe online environment for women, even non-public figures, and may discourage women from participating in politics and online spaces.

    It is crucial to recognize that the threats posed by deepfakes extend beyond just public figures and to focus on creating a safer and more inclusive online environment.

    Wan