
MCMC’s deputy managing director for development Eneng Faridah Iskandar said the law would place the responsibility on social media companies, allowing the government to take legal action against them should they fail to effectively filter out harmful content on their platforms.
She said current laws only allow legal action to be taken against individuals who create or share harmful or dangerous content.
Eneng said complaints could soon be lodged against both the senders and the platform when users receive child sexual abuse material, as the platform would be required to ensure that such content does not reach users in the first place.
“This is the main difference between the act (Onsa) and other legislation related to online content. Onsa is not directed at the user, it is about imposing responsibility on the platform providers,” she told FMT during an interview with selected media companies.
Platforms will be required to respond to complaints fairly and transparently, and to explain to users the mechanisms implemented to weed out harmful content.
“As licensees, they will be subjected to regulations, and MCMC can request periodic reports to monitor their compliance with Onsa,” Eneng said.
The law was passed by Parliament last year and gazetted in May this year. It will require social media companies to ensure that their platforms are free from nine types of harmful content, with content related to child sexual abuse and fraud explicitly highlighted.
Platforms will also need to submit to the government an annual digital safety plan detailing their mitigation strategies and accomplishments, with non-compliance resulting in penalties of up to RM10 million.
Eneng said internet users’ freedom of expression would not be restricted once the new law comes into effect, as the nine types of prohibited content, including drug dealing, scams and the spread of radical ideologies, are already deemed illegal.
Focus on children’s cyber safety
Eneng said one of the law’s major objectives was to protect children from harmful and predatory content, as they record high internet usage despite being the most vulnerable group.
She said statistics show that children make up about a third of internet users, while media reports have revealed the disturbing reality of children facing sexual threats or abuse online.
“Various platforms may have different standard operating procedures and Onsa aims to harmonise them,” she said, adding that this would create easier reporting mechanisms.
On Oct 24, police announced that they had crippled a criminal network linked to child sexual abuse material, with the arrest of 31 people and the seizure of more than 880,000 digital files.
MCMC cannot directly block or restrict under-16s
Eneng said that while MCMC cannot directly block or restrict people from accessing platforms once the social media ban on under-16 users kicks in next month, the onus of implementing the age restrictions will be placed on the platform providers.
“Users should not be overly concerned about what will happen on Jan 1. We will only monitor platforms to see how they are blocking users who are under 16. MCMC cannot actually directly block the accounts,” she said.