Title: Meta Faces Lawsuit Over Alleged Targeting of Children on Instagram
In a major legal battle for social media giant Meta, formerly known as Facebook, 33 states have jointly filed a lawsuit accusing the company of intentionally targeting children under the age of 13 on its platforms, particularly Instagram. The lawsuit alleges that Meta has engaged in deceptive practices, knowingly allowing underage users' accounts to remain active and continuing to collect their data.
The recently unsealed complaint sheds light on previously redacted information, asserting that Meta’s knowledge of underage users is an “open secret.” Despite Facebook and Instagram’s policies clearly stating that users must be at least 13 years old to sign up, the lawsuit claims that Meta has not done enough to prevent children from lying about their age during registration.
The lawsuit draws attention to Meta's potential violation of the Children's Online Privacy Protection Act (COPPA), which prohibits collecting personal information from children under 13 without parental consent. The complaint argues that Meta's platforms, including Instagram, deceive young users into spending excessive amounts of time on the apps, promote body dysmorphia, and expose them to potentially harmful content.
In response to the allegations, Meta expressed disappointment and emphasized its commitment to providing safe online experiences for teenagers. The company acknowledged the need for improved parental control over children’s app downloads and recently published a blog post advocating for federal legislation that would grant parents more authority in monitoring their children’s online activities.
Meta's global head of safety proposed requiring parental approval for app downloads by users under 16. This proposal reflects Meta's stance on addressing the concerns raised in the lawsuit and its stated intention to prioritize the safety and well-being of young users.
The lawsuit against Meta underscores growing concern about the influence of social media platforms on children's lives. As the case progresses, it is likely to intensify calls for robust regulations and safeguards to protect vulnerable users from potential harm on these platforms.