Instagram owner Meta fined €405m over handling of teens’ data
Instagram owner Meta has been fined €405m (£349m) by the Irish data watchdog for letting teenagers set up accounts that publicly displayed their phone numbers and email addresses.
The Data Protection Commission confirmed the penalty after a two-year investigation into potential breaches of the European Union’s general data protection regulation (GDPR).
Instagram had allowed users aged between 13 and 17 to operate business accounts on the platform, which showed the users’ phone numbers and email addresses. The DPC also found the platform had operated a user registration system whereby the accounts of 13-to-17-year-old users were set to “public” by default.
The DPC regulates Meta – which is also the owner of Facebook and WhatsApp – on behalf of the entire EU because the company’s European headquarters are in Ireland.
The penalty is the largest the watchdog has imposed on Meta, exceeding a €225m fine issued in September 2021 for “severe” and “serious” infringements of GDPR at WhatsApp and a €17m fine in March this year.
The fine is the second largest under GDPR, behind the €746m levied on Amazon in July 2021.
A DPC spokesperson said: “We adopted our final decision last Friday and it does contain a fine of €405m. Full details of the decision will be published next week.”
Caroline Carruthers, a UK data consultancy owner, said Instagram had not thought through its privacy responsibilities when letting teenagers set up business accounts and had shown an “obvious lack of care” in users’ privacy settings.
“GDPR has special provisions to make sure any services which target children are living up to a high standard of transparency. Instagram fell foul of this when accounts of children were set to open by default rather than private.”
Last year Meta suspended work on a version of Instagram for children following revelations about the app’s impact on teen mental health.
Instagram said it was “pausing” work to address concerns raised by parents, experts and regulators. The move followed revelations from a whistleblower, Frances Haugen, that Facebook’s own research showed Instagram could affect girls’ mental health on issues such as body image and self-esteem.
Instagram has said that, prior to September 2019, business accounts displayed users’ contact details and that users were informed of this during the setup process. Under-18s now have their accounts set to private automatically when they join the platform.
Andy Burrows, head of child safety online policy at the NSPCC, said: “This was a major breach that had significant safeguarding implications and the potential to cause real harm to children using Instagram.
“The ruling demonstrates how effective enforcement can protect children on social media and underlines how regulation is already making children safer online.”
A Meta spokesperson said: “This inquiry focused on old settings that we updated over a year ago, and we’ve since released many new features to help keep teens safe and their information private.
“Anyone under 18 automatically has their account set to private when they join Instagram, so only people they know can see what they post, and adults can’t message teens who don’t follow them.
“While we’ve engaged fully with the DPC throughout their inquiry, we disagree with how this fine was calculated and intend to appeal it. We’re continuing to carefully review the rest of the decision.”