
    Predators hiding in plain sight: Putting protections in place to safeguard our children

    Amidst global calls for safer online environments for children, the Malaysian government has been taking significant steps.

    Communications Deputy Minister Teo Nie Ching stated in May that the forthcoming Online Safety Bill, scheduled for tabling this month, supports the government’s move to require social media platform operators to obtain licences to operate within the country and to establish standard operating procedures for collaborating with local authorities.

    Additionally, the Bill will tackle issues such as scams, online gambling, cyberbullying and sex crimes against children.

    Communications Minister Fahmi Fadzil indicated that amendments to the Penal Code would be required to clearly define these criminal activities.

    Other countries have also taken steps to ensure online child safety, such as the United States’ Kids Online Safety Act and Britain’s Online Safety Act. Some countries have also banned devices in schools.

    A key example is Turkiye’s blanket ban in early August on Roblox, a multiplayer platform aimed primarily at children. The ban was imposed over concerns about harmful user-generated content and the risk of child abuse.

    Siraj highlights that predators often pose as peers, offer gifts for information or images, and threaten children to keep them silent. — SIRAJ JALIL

    Malaysia Cyber Consumer Association (MCCA) president Siraj Jalil believes that similar precautions should be considered in Malaysia, especially if there is “credible evidence that platforms like Roblox are facilitating child exploitation”.

    Siraj says the threats posed by such services cannot be ignored, stressing that the government, relevant agencies such as the Malaysian Communications and Multimedia Commission (MCMC), and law enforcement must “closely monitor these platforms”.

    “We cannot ignore the dangers posed by such platforms, especially when it comes to the safety of our children.

    “If any are found to enable exploitation, swift action should be taken, including potential restrictions or bans,” he says, adding that “incidents of child exploitation and grooming have been on the rise in Malaysia”.

    Angela M. Kuga Thas, partner-director of volunteer group Kryss Network, calls such blanket bans problematic, arguing that rather than outright banning certain content or actively censoring platforms, governments should instead “criminalise the business model of profiting from harmful content”.

    Angela adds that platform algorithms steer users based on their content consumption patterns and online behaviour, such as web searches, which effectively means these platforms benefit indirectly from the trafficking of harmful content.

    CRIB Foundation (Child Rights Innovation and Betterment) co-chairperson Srividhya Ganapathy also wants a nuanced approach to the issue, pointing out that the platforms are not inherently harmful; rather, it is the people abusing them for nefarious purposes who are at fault.

    “It’s like saying, ‘There are so many WhatsApp scams out there to con people, so we should ban WhatsApp’. It is a platform meant to facilitate communication, but how people communicate is a different thing entirely. How can you say you want to ban the entire platform?” she asks.

    Angela notes that while child predators, exploitation and grooming are issues that have always existed, we are now witnessing the effective organisation of perpetrators using online and digital means.

    Patterns of predation

    Reports of perpetrators “bribing” children with video game currency in exchange for sexual favours are not uncommon.

    One such case was reported in late August involving a 25-year-old from Kentucky, United States, who pleaded guilty to child sexual exploitation charges. The predator had traded Fortnite in-game currency for sexually explicit photos from minors.

    Such tactics are commonly employed to manipulate children into creating and sharing child sexual abuse materials (CSAM) on their own, according to Srividhya.

    She cites Internet Watch Foundation (IWF) data showing that much CSAM is self-generated, with 92% of the material removed in 2023 falling into that category. The IWF also says it received 392,665 reports of suspected CSAM online in 2023 alone.

    She adds that the problem is exacerbated as children gain access to the Internet and the wider connected world, with even three- or four-year-olds livestreaming on the Internet.

    “When you look at Roblox, for example, children can do live streaming and then they get paid in Robux, that’s one of the ways that it happens.

    “But really, children can get paid in money, in gifts, in food, like now with online shopping and online deliveries,” she says, adding that even children in their early teens who know the risks are willingly and voluntarily engaging with these predators.

    Reports of perpetrators “bribing” children with video game currency in exchange for sexual favours are not uncommon. — Image by Freepik

    According to reports received by Siraj’s MCCA, incidents involving child exploitation and grooming are on the rise, with predators soliciting victims on social media, gaming and messaging platforms that are popular among children and teens.

    “Common tactics include pretending to be someone of the child’s age, offering gifts or rewards in exchange for information or images, and using threats to silence the child if they try to speak out.

    “These predators are highly manipulative, and they exploit the naivety of young users,” says Siraj.

    Siraj also shared several screenshots from the general chat of a popular mobile game, revealing public messages of a sexual nature, including offers targeting underage children.

    “Increased moderation and oversight are critical. These platforms must be held accountable for creating a safe environment for their users, especially children. Collaboration between platform operators, law enforcement, and parents is key to ensuring safety,” he says.

    However, for Angela, an increase in oversight and moderation is insufficient. She emphasises that perpetrators would circumvent such measures by simply creating new accounts and trying again.

    Other countries have also taken steps to ensure online child safety, such as the United States’ Kids Online Safety Act and Britain’s Online Safety Act. — Image by Freepik

    “Unlike other types of content, it is not sufficient for tech platforms to merely be obliged to introduce systems that will allow the users to better filter out the ‘harmful’ content they do not want to see.

    “When it comes to child predators, child exploitation and grooming, not seeing the content does not mean it is not there or that it does not remain harmful.

    “Additionally, platforms have already stated that their automated censoring or filtering systems cannot identify all harmful text and images with 100% accuracy because of context, nuance and language issues.

    “They still need human beings who can speak or understand multiple languages and cultures to help them filter harmful content, and such content numbers in the hundreds of thousands,” she says.

    The role of moderators is especially crucial on community-centric platforms like Discord, a popular communication service among gamers. The platform is organised into communities known as “servers”, which users can join based on their interests.

    In a report published last year, NBC News disclosed that it had identified 35 cases over the past six years where adults were prosecuted on charges of kidnapping, grooming, or sexual assault allegedly involving communications on Discord. This was based on a review of international, national, and local criminal complaints, news articles, and law enforcement communications since Discord’s inception.

    Some countries have banned devices in schools. — Image by Freepik

    The report revealed that at least 15 of these prosecutions resulted in guilty pleas or verdicts, while many other cases remain pending.

    Additionally, NBC News identified 165 more cases where adults were prosecuted for transmitting or receiving CSAM via Discord or allegedly using the platform to extort children into sending sexually explicit images of themselves.

    A Discord moderator, who requested to be quoted only as Adrian, notes that such cases are common, sharing an incident from his own server, which is centred on a popular multiplayer online battle arena game.

    “There was one user sending explicit pictures of his genitalia to some of the younger kids in their early teens, and was trying to get into private video calls with them.

    “When we found out, we removed him from the server and asked the kids if they wanted to make a police report, but they declined, feeling nothing would be done,” he says.

    According to Adrian, it is difficult for moderators to be aware of such incidents, especially since such predators tend to approach their targets via direct messaging features on these online platforms.

    “There’s also a perception that there are no consequences for actions online, so whether it’s private servers, voice chat groups, or platforms like Discord or Roblox, parents need to be involved in some way,” he says.

    Teaching about threats

    Srividhya stresses that, more than anything, children have to be properly equipped to engage with the online world, and that depriving them of technology entirely is not the solution.

    “When your child is about to cross a road, you tell them to look both ways, pay attention, and watch what’s happening before crossing – we have to give them directions on how to navigate hazardous situations. But we don’t give children any kind of instructions when we launch them online,” she says.

    Angela believes that children need to feel comfortable discussing what they consume online with their parents. — Kryss Network

    Srividhya points out that even when parents take the trouble to monitor their kids’ messages, they may not account for how tech-savvy their children can be.

    In her experience, most children have the know-how to hide apps, conceal photos with other apps, or even send secret messages in ways that most parents may not detect.

    “It’s not about playing big brother and monitoring your kids. It’s about teaching them how to be responsible online.

    “They need to be aware that there are real dangers out there – people who may try to sexually exploit them or misuse their photos, which will remain online for the rest of their lives,” she says, adding that every time such images resurface, they will be re-victimised.

    The CRIB Foundation has also been advocating for comprehensive sex education in schools, with Srividhya noting that this is a conversation many Malaysians find difficult.

    “They say, ‘How can we talk about sex and sexuality to children who are so young?’ But other people are talking to your kids about it already, and what are we doing to raise their awareness about it?”

    “It’s really a pressing problem that should have been remedied five years ago, we don’t talk to children enough about sex, and this is a huge responsibility that schools should undertake personally,” she says.

    Angela also highlights the role of schools as instrumental in teaching children about online safety, adding that a change in approach is necessary.

    “The approach to teaching about online safety should change from one that is authoritative to one that allows children to safely ask questions and encourages them to think critically for themselves.

    “Just saying ‘don’t do’ does not work, even if you share about the harms that have affected other children because for children, it is a distant reality,” she says. “We need to provide children facts and information about online predators, including what they say and do to build trust.”

    “Online predators take time to build that trust with children. We need to do the same if we hope to be effective in educating children on online harms.”

    Angela says parents should monitor their children’s online activities, but with their knowledge, and set clear boundaries on what they can and cannot do online. — Photo by Aaron Burden on Unsplash

    She also believes that children need to be taught how to differentiate between good and bad or misleading content, and that the only way this can be achieved is if they feel comfortable discussing what they consume online with their parents.

    “Children may be able to discuss these issues with their friends and build their own support network, but the reality is, some children are isolated or not as social, and hence, being encouraged to share what they think about what they consume with their parents or someone they trust is important,” she says.

    Angela says parents should monitor their children’s online activities, but with their knowledge, and set clear boundaries on what they can and cannot do online.

    “Open communication – without accusation or overreaction – is usually the most effective. This way the relationship between parents and children remains a relationship of trust,” Angela says.

    ‘Safe by design’

    Angela is also of the opinion that digital platforms should be held criminally liable for capitalising on and profiting from CSAM.

    In late August, a notable instance occurred in France, where Telegram CEO Pavel Durov was charged and barred from leaving the country for allegedly allowing criminal activity on the company’s messaging app.

    The company fired back, arguing that it is unjust to hold the platform and owner responsible for its misuse by criminals.

    According to reports, Telegram has become a hotspot for illegal activities, including the distribution of CSAM, though it has since become more cooperative with authorities than it was previously.

    Srividhya feels that digital platforms should be obligated to be safe by design and to take down, report and block CSAM in a timely manner under the purview of a new, specialised agency.

    “Establish a clear and harmonised legal framework to prevent and respond to online child sexual exploitation materials.

    “We need to have a single clear and accountable agency with the mandate, capacity, and resources to deal with this, which can establish multi-sectoral and international coordination,” she says.
