
Meta fined $400 million for failing to protect children’s privacy

The commission alleges that Instagram failed to protect children's privacy when they switched to business accounts; it is the third fine imposed on a Meta-owned company by the Irish regulator

Image: Shutterstock

Ireland’s Data Protection Commission is set to hit Instagram with the second-largest fine ever levied in the European Union for failing to safeguard children’s information.

It also happens to be the third fine for a Meta-owned company handed down by the Irish regulator.

The popular social media platform, owned by Meta, faces a fine of 405 million euros (approximately $400 million, AED1,471 million). It will be the second-biggest fine imposed on a company, after the record $888 million (746 million euros at the time) penalty Luxembourg handed Amazon in July last year.

The Washington Post reported that the decision came after a two-year investigation by the Irish watchdog into Instagram’s ‘business accounts’, which give users more advanced metrics for tracking views and likes. Before 2019, business accounts displayed users’ phone numbers and email addresses by default. Instagram’s minimum age for users is 13.

A 2019 study found that Instagram allowed more than 60 million users under the age of 18 to convert their personal accounts into business accounts. Many underage users did so, drawn by access to metrics such as how many people had visited their profile and views on individual posts.

The Data Protection Commission said that underage users may have been unaware their contact information was being exposed by default.

Full details of the decision will be published next week.

A spokesperson said Meta disagrees with how the fine was calculated and is reviewing the rest of the decision.


“This inquiry focused on old settings that we updated over a year ago, and we’ve since released many new features to help keep teens safe and their information private,” a Meta spokesperson said.

“Anyone under 18 automatically has their account set to private when they join Instagram, so only people they know can see what they post, and adults can’t message teens who don’t follow them.

“We engaged fully with the DPC throughout their inquiry, and we’re carefully reviewing their final decision.”

Last year, Instagram was on the verge of launching a kids’ version of the app, but several members of Congress wrote to the company urging it to “cease all efforts” after internal research indicated that Instagram can be harmful to young users.

Meta-owned companies have previously been fined for breaching data privacy rules. The Irish regulator fined the company 17 million euros in March following an investigation into Facebook, and last year imposed a 225 million euro fine for privacy violations on WhatsApp.


Abdul Rawuf, Author