Singapore is set to create a powerful online safety commission that can order social media platforms to block harmful content and restrict access to abusive material, under a bill introduced to parliament on Wednesday.
The new body, expected to begin operating by mid-2026, will address user reports of cyberbullying, doxxing, stalking, child abuse, and other online harms. It will have the authority to ban offenders, force content removal, and order internet service providers to block specific websites or online groups within the country.
The Infocomm Media Development Authority (IMDA) said earlier this year that platforms ignored more than half of legitimate complaints about harmful posts. “More often than not, platforms fail to take action to remove genuinely harmful content reported to them by victims,” said Digital Development Minister Josephine Teo.
The commission will be formed under the new Online Safety Bill, which lawmakers will debate in the next parliamentary session. Over time, its scope will expand to include non-consensual sharing of private information and incitement of hostility.
The law builds on Singapore’s Online Criminal Harms Act, enacted in February 2024, under which the government has already ordered Meta to curb impersonation scams on Facebook, threatening fines of up to S$1 million ($771,664) for non-compliance.