NEW YORK (Reuters) – Mark Zuckerberg has launched Meta’s new app, Threads, with a focus on creating a friendly and kind online environment for public discourse. This sets Threads apart from its rival, Twitter, which is owned by Elon Musk.
Zuckerberg stated, “We are definitely focusing on kindness and making this a friendly place,” during the launch of the app on Wednesday.
However, maintaining this idealistic vision for Threads may prove to be a challenge. Meta Platforms, the parent company of Facebook and Instagram, has years of experience managing online rage and explicit content, and it has promised to enforce the same rules on Threads that it applies on Instagram.
Meta has also leaned on algorithmic content curation, giving it more control over which kinds of posts perform well, with the goal of steering users toward entertainment rather than news. But by promising to make Threads interoperable with other social media platforms such as Mastodon, Meta is charting a new path that will bring fresh moderation challenges.
Meta has decided not to extend its existing fact-checking program to Threads, a departure from its approach to misinformation on its other apps. However, posts rated false by fact-checkers on Facebook or Instagram will carry those labels over if posted on Threads.
Meta declined to comment when asked about the reason for the different approach.
Adam Mosseri, the head of Instagram, acknowledged that Threads is likely to attract a news-focused crowd, but said the company intends to focus on lighter subjects such as sports, music, fashion, and design.
Within hours of its launch, Threads was already grappling with controversy: users posted conspiracy theories, sparred over sensitive topics, and complained about censorship.
Meta’s next challenge will come when Threads connects to the “fediverse,” allowing its users to communicate with people on servers Meta does not control. The company has said its rules will also apply to those outside users when they interact with Threads accounts.
Experts in online media caution that the details of how Meta handles these interactions will be crucial. Alex Stamos, the director of the Stanford Internet Observatory, highlighted the challenges Meta will face in enforcing content moderation without access to back-end data.
“Federation makes it harder to stop spammers, troll farms, and economically driven abusers as the metadata used by big platforms aren’t available,” Stamos said.
Stamos predicted that Threads would limit the visibility of fediverse servers with abusive accounts and impose stricter penalties for posting illegal content, such as child sexual abuse material.
However, those interactions raise additional challenges, particularly around illegal material: experts question whether Meta’s responsibility ends at blocking such content from Threads or extends further.
(Reporting by Katie Paul in San Francisco; Editing by Kenneth Li and Matthew Lewis)