Australia has fined the company formerly known as Twitter, now X, 610,500 Australian dollars (approximately $386,000) for its inadequate response to inquiries about its efforts to combat child sexual abuse material on the platform. The eSafety Commission, Australia's online safety regulator, criticized X for failing to answer certain questions, leaving some sections blank, and offering incomplete or inaccurate responses.
Empty Talk and Incomplete Answers
X claimed that addressing child sexual exploitation was its top priority, but the eSafety Commissioner, Julie Inman Grant, stated that words alone were insufficient and called for tangible actions.
The commission had previously asked five tech firms, namely X, TikTok, Google (including YouTube), Discord, and Twitch, about their measures to combat crimes against children on their platforms. X's non-compliance was deemed more serious than that of the other companies.
Non-Compliance and Consequences
X has 28 days to either pay the fine or request that the notice be withdrawn. The eSafety Commission cited several key questions that X did not answer, including how long it takes to address reports of child sexual exploitation, what measures it uses to detect child sexual exploitation in livestreams, and which tools and technologies it employs for detection.
Asked about measures to prevent predators from grooming children, X responded that the platform was not widely used by young people and that its anti-grooming technology was not sufficiently capable or accurate.
The eSafety Commission also noted that Google failed to answer several key questions on child abuse. Google has received a formal warning to prevent future non-compliance. Lucinda Longcroft, Google’s director of government affairs and public policy for Australia and New Zealand, stated that the company had made significant investments in fighting child sexual abuse material and remains committed to collaborating constructively with the eSafety Commissioner.
This action follows previous reports from the Australian regulator, which identified “serious shortfalls” in how other tech companies, including Apple, Meta, Microsoft, Skype, Snap, WhatsApp, and Omegle, address online child sexual exploitation. The move highlights growing concerns about the responsibility of tech companies to protect children from online harm.
In the face of these challenges, regulatory bodies are emphasizing the need for swift and effective action to ensure the safety of users, particularly children, on digital platforms.