YouTube and TikTok discuss child social media restrictions with Indonesian government

Major technology platforms including YouTube and TikTok are in active discussions with Indonesia's government as authorities move forward with plans to restrict social media access for children under sixteen. The proposal is part of a broader push to strengthen digital safety protections for young users amid growing global concern about social media's impact on mental health and online behavior. Indonesian officials say the new regulation will require technology platforms to identify and deactivate accounts deemed high risk when they belong to underage users, reflecting mounting pressure on digital companies to improve safeguards for minors.
Indonesia's communications and digital ministry recently circulated a ministerial regulation outlining how the restrictions would be implemented. Under the proposal, social media platforms would need to monitor and deactivate accounts belonging to children below the age threshold if those accounts are categorized as high risk. Authorities have identified several major platforms within the scope of the regulation, including TikTok, YouTube, Instagram and Roblox. Officials say the goal is to ensure stronger protections for children while encouraging technology companies to adopt clearer systems for verifying user age and managing youth accounts.
Technology companies have responded by engaging with Indonesian regulators to better understand how the new rules will be applied. Representatives from YouTube said the platform is reviewing the regulation and is seeking ways to maintain access to educational content while strengthening parental oversight tools. The company emphasized that digital platforms can provide valuable learning opportunities for younger users when appropriate safety measures are in place. TikTok also confirmed that it is communicating with government officials to clarify the technical requirements of the new regulation and how compliance measures will be implemented.
Social media firms argue that their platforms already include a range of safety features designed specifically for younger users. TikTok noted that its teen accounts include numerous built-in protections covering privacy settings, content filtering and time management tools. Meta, the parent company of Facebook and Instagram, has also warned that strict bans could unintentionally push teenagers toward unregulated online spaces where fewer protections exist. Technology companies have therefore encouraged policymakers to focus on strengthening safety features and parental control systems rather than imposing blanket restrictions.
Indonesia’s proposed policy reflects a broader global trend as governments consider stronger regulation of social media use among minors. Countries including Australia and several European nations have introduced measures aimed at limiting harmful online experiences for younger audiences. These policies often address concerns related to cyberbullying, exposure to inappropriate content and excessive screen time. Indonesia’s government plans to implement the new rules later this month as part of its wider strategy to improve digital governance and create safer online environments for children.

