Singapore Places X and TikTok Under Enhanced Supervision Over Online Safety Failures
Singapore’s Infocomm Media Development Authority (IMDA) has issued Letters of Caution to social media platforms X and TikTok after identifying serious weaknesses in how the companies detect and remove harmful online content. Both platforms have also been placed under enhanced supervision and must regularly report their progress to the regulator while implementing corrective measures. The action follows IMDA findings of a rise in cases of child sexual exploitation and abuse material (CSEM) and terrorism-related content involving Singapore-based users.
According to an official statement released on 31 March 2026, the regulator found that safeguards required under Singapore’s Code of Practice for Online Safety – Social Media Services (SMS Code) were not sufficiently effective on the two platforms. The code requires designated social media services to proactively detect and swiftly remove serious harms such as CSEM and terrorism content before users encounter them.
Rise in harmful content linked to Singapore users
IMDA reported a significant rise in CSEM cases on X originating from or targeting Singapore users. The number more than doubled, from 33 cases in 2024 to 73 cases in 2025, an increase of about 120 per cent. The regulator concluded that the platform’s measures for proactively detecting and removing such material were insufficient under the requirements of the SMS Code.
In TikTok’s case, IMDA identified 17 instances of terrorism-related content shared by Singapore-based accounts in 2025. This marked the first time such cases had been detected on the platform in Singapore. The findings raised concerns about the effectiveness of TikTok’s detection and moderation processes in preventing the dissemination of extremist material.
Enhanced regulatory oversight and remediation requirements
Following the assessment, both companies have been placed under enhanced supervision by IMDA. This requires them to regularly report on progress in strengthening their detection and removal systems until the authority is satisfied that the weaknesses have been addressed.
X and TikTok must also submit supporting data and evidence demonstrating the effectiveness of their remedial measures. IMDA has set a deadline of 30 June 2026 for the platforms to provide this information.
Part of broader efforts to strengthen online safety
The action reflects Singapore’s wider strategy to strengthen oversight of digital platforms and reduce users’ exposure to harmful online material. These regulatory steps complement national efforts to improve digital resilience and governance, including programmes to raise cybersecurity standards for critical infrastructure and policy work on building a resilient digital future.
IMDA noted that serious harms such as CSEM and terrorism-related content require robust proactive safeguards. Under the SMS Code, designated social media services must deploy appropriate technologies and operational processes to identify and remove such content rapidly, ensuring that users are protected before encountering it on their platforms.