6 Social Media Platforms Issued With ‘Please Explain’ Notice on Extremist Material


TikTok, however, was exempt.

Australia’s online safety watchdog has required social media giants to explain how they are tackling terrorist and violent extremist materials on their platforms.

On March 19, the eSafety commissioner issued legal notices to six social media companies: Meta, X (formerly known as Twitter), Google, WhatsApp, Telegram, and Reddit. The notices require the companies to report on the measures they have put in place to protect Australians from radical online content.

Those companies will have to answer a series of detailed questions about how they are dealing with the issue.

eSafety said authorities in Australia and other countries were concerned about the role of violent extremist materials in some terror attacks, such as the Christchurch mosque shootings in 2019 and the murder of 10 Black Americans in New York in 2022.

eSafety Commissioner Julie Inman Grant said online users had been reporting that perpetrator-produced material from terror attacks continued to be reshared on mainstream social media apps.

“We remain concerned about how extremists weaponise technology like live-streaming, algorithms and recommender systems and other features to promote or share this hugely harmful material,” she said.


Ms. Inman Grant also noted that there were rising concerns about terrorists and violent extremists taking advantage of the emerging generative AI technology to find new ways to cause harm.

“Earlier this month the UN-backed Tech against Terrorism reported that it had identified users of an Islamic State forum comparing the attributes of Google’s Gemini, ChatGPT, and Microsoft’s Copilot,” she said.

“The tech companies that provide these services have a responsibility to ensure that these features and their services cannot be exploited to perpetrate such harm.”

The six companies will have 49 days to respond to eSafety’s inquiry.

Why the Six Companies Were Chosen

The commissioner also explained the rationale behind her choice of companies.

A 2022 report (pdf) by the Organisation for Economic Co-operation and Development (OECD) identified Telegram as the platform hosting the largest amount of terrorist and violent extremist material, followed by YouTube, Twitter/X, Facebook, and Instagram.

WhatsApp ranked eighth, while there was evidence that Reddit had played a role in the radicalisation of the perpetrator of the New York shooting.

“It’s no coincidence we have chosen these companies to send notices to, as there is evidence that their services are exploited by terrorists and violent extremists. We want to know why this is and what they are doing to tackle the issue,” Ms. Inman Grant said.

“Transparency and accountability are essential for ensuring the online industry is meeting the community’s expectations by protecting their users from these harms.

“Also, understanding proactive steps being taken by platforms to effectively combat terrorist and violent extremist content is in the public and national interest.”

At the same time, Ms. Inman Grant said she was disappointed that none of the social media platforms provided the watchdog with information on the issue under the current voluntary framework, forcing eSafety to issue legal notices.

eSafety’s announcement was welcomed by opposition communications spokesman David Coleman, who said self-regulation had not worked.

“This kind of content is completely abhorrent and the fight against it must continue to be a top priority,” he said.

“The big digital platforms must absolutely be held accountable for the content they publish and profit from.”

eSafety’s announcement comes one year after the watchdog conducted an inquiry into the measures implemented by social media companies to tackle child sexual abuse materials.

The inquiry resulted in X being fined $610,000 (US$400,000) for failing to comply with eSafety’s requirements.

However, the watchdog recently admitted that it had no power to force X to pay the fine.
