Indonesia’s new regulation frames age assurance as defense against child exploitation

Even where porn is illegal, kids face risk of harm in online interactions

The complexity of the task facing age assurance providers comes into sharp focus when viewed against the global context. For large swaths of the world’s population, pornography – arguably the largest driver of the age assurance industry at present – is outright illegal. While there are grey areas in some countries’ laws, in China, India, and Indonesia, which have three of the four largest populations in the world and together account for more than 3 billion of Earth’s roughly 8 billion people, porn is officially banned.

Yet as noted by Tony Allen of the Age Check Certification Scheme (ACCS) on LinkedIn, the same region of the world has seen technology enable a proliferation of child sexual abuse and exploitation activity. A 2023 report by ECPAT Indonesia and the National Commission for Child Protection found that more than 15,000 children were victims of online sexual exploitation.

As such, the mission to (in Allen’s words) “make the internet age aware” is not just about controlling who can look at pornography, but also about how age assurance can protect kids from physical harm.

These concerns are driving Indonesia’s current legislative push toward more robust online child safety laws. A report in MLex has the country framing its new online child protection regulation as “designed to protect, not restrict.” In February, Minister of Communication and Digital Affairs Meutya Hafid said the regulation “is presented not only as a government policy, but as a real form of response to the concerns of parents, educators, and the digital community who want a safe environment online.” It takes cues from the UK’s Age Appropriate Design Code in outlining requirements for assessment, age classification and age verification.

Age assurance law targets sites for kids, or just sites kids use

Which is to say, it is focused on regulating sites that are specifically intended for children – rather than on sites that wish to block children’s access to adult material. An article in Lexology outlines the law’s scope: “the Regulation applies broadly to all electronic system operators – public and private, domestic and foreign – as long as their products, services, or features are specifically intended to be used or accessed by children (defined as anyone under 18 years old) in Indonesia.”

This includes “social media sites, online games, e-commerce platforms, educational apps, smart toys and devices, as well as streaming services for entertainment and other content.”

The law is broad enough that requirements could still apply even where a site argues it is not specifically aimed at children – Instagram, for instance, could argue that it’s a grown-up app – so long as it is “likely that children will use it.” What qualifies is a complicated matter that is “not yet crystal clear” to regulators at the Ministry of Communication and Digital Affairs. Potential indicators include a site’s formal terms and conditions, evidence of a large user base of young people, targeted ads and “if its design and presentation appeal to children.”

Law has comprehensive risk assessment requirement, detailed age categories

Entities covered by the law are required to conduct a risk assessment that gauges specific indicators of harm to kids. The assessment evaluates the likelihood of children being contacted by strangers and of children encountering harmful content such as pornography or violence – both core issues for age assurance globally. It also looks at “the presence of design elements that could lead to addictive behaviour in children,” and broad negative impacts on kids’ mental or physical health.
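For readers wondering what such an assessment might look like operationally, here is a minimal sketch in Python. The indicator names and the crude risk tiers are illustrative assumptions; the regulation names the indicators but does not prescribe a scoring model.

```python
from enum import Enum, auto

# Risk indicators summarized above; the enum names and tiering logic
# are assumptions for illustration, not taken from the regulation.
class RiskIndicator(Enum):
    STRANGER_CONTACT = auto()   # likelihood of children being contacted by strangers
    HARMFUL_CONTENT = auto()    # exposure to pornography, violence and similar content
    ADDICTIVE_DESIGN = auto()   # design elements that could lead to addictive behaviour
    HEALTH_IMPACT = auto()      # broad negative effects on mental or physical health

def risk_tier(indicators_present: set[RiskIndicator]) -> str:
    # Crude illustration: the more indicators a service triggers,
    # the higher its risk tier and the stronger the expected mitigations.
    if not indicators_present:
        return "low"
    return "moderate" if len(indicators_present) == 1 else "high"

# Example: a social platform that enables stranger contact and uses
# engagement-maximizing design would land in the highest tier.
print(risk_tier({RiskIndicator.STRANGER_CONTACT, RiskIndicator.ADDICTIVE_DESIGN}))  # high
```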

Parental consent is also a key piece. Entities must implement a clear opt-in consent mechanism, meaning they “absolutely cannot allow a child to register, access, or interact with their services unless a parent or legal guardian has actively and explicitly given their consent.”

There must be a clear “OK” from a parent or guardian.

Things get complicated with users aged 17 or over, who are able to give their own consent, which must then be validated by a parent or guardian within a six-hour window. The law also gets granular about age groups, identifying five distinct groups within the 3-18 age range.
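To make the mechanics concrete, here is a minimal sketch of that consent flow in Python, under one reading of the rule: a 17-year-old’s own consent stands only while a parent or guardian’s validation is pending inside the six-hour window. All names are hypothetical; the regulation specifies the requirement, not the implementation.

```python
from datetime import datetime, timedelta, timezone

VALIDATION_WINDOW = timedelta(hours=6)  # guardian validation window for 17-year-olds

class ConsentRecord:
    """Tracks opt-in consent for one child user (hypothetical model)."""

    def __init__(self, user_age: int):
        self.user_age = user_age
        self.self_consented_at: datetime | None = None
        self.guardian_validated = False

    def give_self_consent(self) -> None:
        # Only 17-year-olds may consent for themselves; younger children
        # need a parent or guardian to opt in on their behalf.
        if self.user_age < 17:
            raise PermissionError("guardian consent required")
        self.self_consented_at = datetime.now(timezone.utc)

    def guardian_validate(self) -> None:
        self.guardian_validated = True

    def access_allowed(self) -> bool:
        # Default deny: no registration, access or interaction without
        # active, explicit consent (the clear "OK" described above).
        if self.user_age < 17:
            return self.guardian_validated
        if self.self_consented_at is None:
            return False
        within_window = datetime.now(timezone.utc) - self.self_consented_at <= VALIDATION_WINDOW
        return self.guardian_validated or within_window
```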

Indonesia’s regulation adopts a privacy-by-default approach, meaning the most stringent data protection safeguards must be automatically activated; this includes limiting data sharing, restricting tracking of online activity, turning off location services and other measures. It is the responsibility of the platform, app or service to ensure this is the case.
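In code, privacy-by-default amounts to making the strictest settings the ones a new child account starts with. The field names below are assumptions for illustration; the regulation names the outcomes (limited sharing, no tracking, location off), not the fields.

```python
from dataclasses import dataclass

@dataclass
class ChildAccountSettings:
    # The strictest values are the defaults, so protection requires no
    # action from the child; loosening any of these is a deliberate act.
    share_data_with_third_parties: bool = False  # limit data sharing
    activity_tracking_enabled: bool = False      # restrict tracking of online activity
    location_services_enabled: bool = False      # location services off

settings = ChildAccountSettings()  # a new child account gets the strict defaults
```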

Any entity that processes children’s personal data is required to conduct a Data Protection Impact Assessment (DPIA) in accordance with the Personal Data Protection Law (Law No. 27 of 2022), and to appoint a Data Protection Officer (DPO) to oversee data protection activities.

Entities must establish and clearly communicate the minimum age required for children to access their platforms. And here is where age assurance services come into play: to ensure compliance, they “also need to implement a mechanism to check a user’s age when they sign up. These age verification tools should be built into the registration process and designed with children’s privacy in mind. Any information to verify age should be limited to what is absolutely necessary, used only to confirm if the user meets the age requirement, and then securely deleted once that is done (unless retaining such information is legally required).”
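A minimal sketch of such a registration-time check follows, using a self-declared date of birth purely for illustration (real deployments would use a stronger age assurance method). The minimum age and function names are hypothetical; the key point, per the passage above, is that only the pass/fail outcome is kept.

```python
from datetime import date

MINIMUM_AGE = 13  # example threshold; each platform must set and publish its own

def age_in_years(dob: date, today: date) -> int:
    # Subtract one year if the birthday has not yet occurred this year.
    return today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))

def register(username: str, date_of_birth: date) -> dict:
    # The date of birth is used only to confirm the user meets the minimum
    # age; it is never written to the account record, mirroring the
    # "confirm, then securely delete" requirement quoted above.
    if age_in_years(date_of_birth, date.today()) < MINIMUM_AGE:
        raise ValueError("user does not meet the minimum age requirement")
    return {"username": username, "age_check_passed": True}
```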

Indonesia’s child protection law is clearly a work in progress, as evidenced by the frequency with which reports “anticipate further clarification.” The compliance deadline is still two years away, on March 27, 2027. But in the meantime, per Lexology, “it is important to note that even during this transition period, affected parties like parents, guardians, or child protection groups can still take private legal action, such as filing civil lawsuits, if they believe a service is harming children or not complying with the Regulation.”
