The European Commission has opened an investigation into Snap’s Snapchat, Google’s YouTube, Apple’s App Store and Google Play over whether they provide the protections for children required by the Digital Services Act, technology commissioner Henna Virkkunen said.
The Commission said it is asking the platforms to provide information on their age-verification systems and how they prevent minors from accessing illegal products such as drugs and vapes, and harmful materials such as information promoting eating disorders.
The platforms have been designated as Very Large Online Platforms under the DSA and as such must comply with stringent user safety rules.
Harmful materials
“Today, alongside national authorities in the member states, we are assessing whether the measures taken so far by the platforms are indeed protecting children,” Virkkunen said.
Google said it has already instituted measures to ensure its platforms offer age-appropriate experiences, and has “robust” controls for parents.
Concern has grown over the potentially harmful effects of online services on young people, with a number of countries instituting or considering increased restrictions on social media services and other platforms.
Last month the Netherlands’ Authority for Consumers and Markets, or ACM, opened a formal probe into Snapchat after doctors filed a complaint that the company was not doing enough to stem sales of electronic cigarettes, or vapes, to minors.
The doctors, through the youth smoking prevention group Stichting Rookpreventie Jeugd (SRPJ), contacted Snapchat parent Snap in early June asking it to urgently take measures to prevent vape sales to minors via the platform.
In early August, Snap said it had taken various measures, including improved detection of slang words and emojis used by illegal vape sellers, account blocks, filters for teenagers and additional parental controls.
Youth protections
The SRPJ group worked with a youth panel to test the effectiveness of the measures, but found the changes had made little difference.
Australia earlier this year instituted a ban on under-16s using social media platforms, including YouTube, which takes effect in December, but the measure faces challenges in implementation and enforcement.
New York City last week filed a federal lawsuit against Snapchat, Facebook and Instagram parent Meta Platforms, TikTok and YouTube for allegedly contributing to a mental health crisis among young people by developing intentionally addictive platforms.
The lawsuit says the platforms are not doing enough to block underage users, accusing them of gross negligence and of creating a public nuisance.