Meta allegedly had a 17-strike policy for accounts involved in sex trafficking.

A former employee of Meta, the company behind the social media platforms Facebook, Instagram, WhatsApp and Threads, testified as part of a major lawsuit that the tech company had a policy allowing 17 strikes before suspending accounts involved in “sex trafficking.”

Vaishnavi Jayakumar, Instagram’s former head of safety and well-being, also testified that as of March 2020, Meta had no specific way for people to report child sexual abuse material (CSAM) on Instagram, according to federal court documents filed in the Northern District of California on Friday.

“I was very surprised by this,” Jayakumar said, adding that she raised the issue “many times” but was told it would take a lot of work to build.

Jayakumar became concerned when she learned about Meta’s so-called “17x” policy.

“So, there are 16 violations for prostitution and sexual solicitation, and a 17th offense could result in your account being suspended,” she said, according to court documents. She called it “an extremely high strike standard for the industry as a whole.”

The plaintiffs argue that Jayakumar’s testimony is corroborated by internal documents, and state in court filings that they believe Meta never told parents, public agencies or school districts that it “will not delete accounts that engage in sex trafficking more than 15 times.”

Meta told USA TODAY on Saturday, Nov. 22, that the company now has a “one-strike” policy and will immediately delete accounts if the company determines that a user has violated its strictest policies against human exploitation and human trafficking. The company said its strike policy began in 2019, and the number of violations required for discipline has decreased over time.

Why was Meta named in the lawsuit?

Court documents filed by the plaintiffs allege that Meta, along with other named technology companies such as ByteDance (TikTok) and Snap (Snapchat), is contributing to the “unprecedented mental health crisis” facing American teenagers.

According to court filings, Meta is “following the same strategy used in the past by Big Tobacco” by determining that its most valuable users, or “the ones that generate the most advertising revenue,” are young people. The plaintiffs allege that even though Meta’s platforms, especially Instagram, are targeted at young people, the company “deliberately omitted safeguards for parents and teachers.”

“Meta designs social media products and platforms that it knows are addictive to children, and it knows that this addiction leads to a number of serious mental health problems,” Previn Warren, co-lead counsel for the plaintiffs in the case, told Time, which first reported the court filing. “Just like with tobacco, we have a situation where we have dangerous products marketed to children,” Warren added. “They did it anyway, because the more usage, the more profit the company makes.”

Meta spokeswoman Stephanie Otway said the company is proud of its progress and stands by its record.

“We strongly oppose these claims, which rely on cherry-picked quotes and misinformed opinions in an attempt to present an intentionally misleading portrayal,” she told USA TODAY. “The full record shows that for more than a decade, we have listened to parents, investigated the issues that matter most, and made real changes to protect teens, including introducing Teen Accounts with built-in protections and giving parents more control to manage their teens’ experiences.”

How difficult was it to build a way to report CSAM in 2020?

Jayakumar said it would not have taken much effort at the time to build a feature that would allow users to report child sexual abuse content on Instagram.

“Essentially, it would add additional options to the existing reporting menu…options that are there,” she said, according to court documents.

At the time, Instagram already allowed users to report several other “less serious violations” directly within the app, including “spam,” “intellectual property infringement,” and “promotion of firearms,” according to court filings.

In response, Meta pointed USA TODAY to a February 2021 blog post detailing the company’s analysis of illegal child exploitation content it reported to the National Center for Missing and Exploited Children (NCMEC) in October and November 2020. The content had been shared on Facebook and Instagram, the company said.

Based on those findings, the company said it had begun “developing targeted solutions, including new tools and policies to reduce the sharing” of illegal child exploitation content. Part of the solution includes in-app pop-ups offering assistance to people who encounter such content.

What does Instagram’s child sexual abuse policy currently contain?

Instagram currently provides users with instructions on how to report child sexual abuse on the platform.

Instagram’s Information for Law Enforcement page states that the platform reports “all instances of apparent child sexual exploitation” appearing on its service “to the National Center for Missing and Exploited Children (NCMEC) anywhere in the world in accordance with applicable law, including those brought to our attention by government requests.” From there, NCMEC refers the matter to law enforcement to help the victim, the platform said.
