Suara Malaysia
Friday, September 20, 2024

    Deepfake imposter scams are driving a new wave of fraud

    Computer-generated children’s voices that fool their own parents. Masks built from photos lifted off social media that can defeat systems protected by face ID. These tricks sound like the stuff of science fiction, but they are already available to criminals preying on everyday consumers. The proliferation of scam tech has alarmed regulators, police, and people at the highest levels of the financial industry.

    Artificial intelligence (AI) in particular is being used to “turbocharge” fraud, US Federal Trade Commission chair Lina Khan warned in June, calling for increased vigilance from law enforcement. Even before AI broke loose and became available to anyone with an Internet connection, the world was struggling to contain an explosion in financial fraud.

    In the United States alone, consumers lost almost US$8.8bil (RM40.9bil) to fraud last year, up 44% from 2021, despite record investment in detection and prevention. Financial crime experts at major banks, including Wells Fargo & Co and Deutsche Bank AG, say the fraud boom on the horizon is one of the biggest threats facing their industry. On top of paying the cost of fighting scams, the financial industry risks losing the faith of burned customers.

    “It’s an arms race,” says James Roberts, who heads up fraud management at the Commonwealth Bank of Australia, the country’s biggest bank. “It would be a stretch to say that we’re winning.”

    The history of scams is surely as old as the history of trade and business. One of the earliest known cases, more than 2,000 years ago, involved a Greek sea merchant who tried to sink his ship to get a fraudulent payout on an insurance policy. Look back through any newspaper archive, and you’ll find countless attempts to part the gullible from their money. But the dark economy of fraud, just like the broader economy, has periodic bursts of destabilising innovation. New technology lowers the cost of running a scam and lets the criminal reach a larger pool of unprepared victims.


    The AI explosion offers not only new tools but also the potential for life-changing financial losses. And the increased sophistication and novelty of the technology mean that everyone, not just the credulous, is a potential victim. The Covid-19 lockdowns accelerated the adoption of online banking around the world, with phones and laptops replacing face-to-face interactions at bank branches. That shift has brought lower costs and greater speed for financial firms and their customers, as well as openings for scammers.

    Some of the new techniques go beyond what current off-the-shelf technology can do, and it’s not always easy to tell when you’re dealing with a garden-variety fraudster or a nation-state actor. “We are starting to see much more sophistication with respect to cybercrime,” says Amy Hogan-Burney, general manager of cybersecurity policy and protection at Microsoft Corp.

    Globally, cybercrime costs, including scams, are set to hit US$8 trillion (RM37.18 trillion) this year, outstripping the economic output of Japan, the world’s third-largest economy. By 2025, it will reach US$10.5 trillion (RM48.8 trillion), after more than tripling in a decade, according to researcher Cybersecurity Ventures.

    In the Sydney suburb of Redfern, some of Roberts’ team of more than 500 spend their days eavesdropping on cons to hear firsthand how AI is reshaping their battle. A fake request for money from a loved one isn’t new. But now parents get calls that clone their child’s voice with AI to sound indistinguishable from the real thing. These tricks, known as social engineering scams, tend to have the highest hit rates and generate some of the quickest returns for fraudsters.


    Today, cloning a person’s voice is becoming increasingly easy. Once a scammer obtains a short sample – as little as 30 seconds of audio from someone’s social media or voicemail message – they can use AI voice-synthesising tools readily available online to create the content they need.

    As fraud gets more sophisticated, the question of who’s responsible for losses is getting more contentious. In the United Kingdom, for example, victims of unauthorised transactions – say, someone copies and uses your credit card – are legally protected against losses. If someone tricks you into making a payment yourself, responsibility becomes less clear. In July, the UK’s top court ruled that a couple who were fooled into sending money abroad couldn’t hold their bank liable simply for following their instructions. But legislators and regulators have leeway to set other rules: the government is preparing to require banks to offer better fraud protections.


    Credit: The Star – Tech Feed

    Suara
    https://www.suara.my
    Tech enthusiast turning dreams into reality, one byte at a time 🚀
