Discord is facing a new lawsuit from the state of New Jersey, which claims that the chat app is engaged in “deceptive and unconscionable business practices” that put its younger users in danger.
The lawsuit, filed on Thursday, comes after a multiyear investigation by the New Jersey Office of the Attorney General. The AG’s office claims it has uncovered evidence that, despite Discord’s policies to protect children and teens, the popular messaging app is putting youth “at risk.”
“We’re the first state in the country to sue Discord,” Attorney General Matthew Platkin tells WIRED.
Platkin says there were two catalysts for the investigation. One is personal: A few years ago, a family friend came to Platkin, astonished that his 10-year-old son was able to sign up for Discord, despite the platform forbidding children under 13 from registering.
The second was the 2022 mass shooting in Buffalo, in neighboring New York. The perpetrator used Discord as his personal diary in the lead-up to the attack and livestreamed the carnage directly to the chat and video app. (The footage was quickly removed.)
“These companies have consistently, knowingly, put profit ahead of the interest and well-being of our children,” Platkin says.
The AG’s office claims in the lawsuit that Discord violated the state’s Consumer Fraud Act. The allegations turn on a set of policies Discord adopted to keep children younger than 13 off the platform and to keep teenagers safe from sexual exploitation and violent content. The suit joins a growing list of state litigation against major social media firms, litigation that has, thus far, proven fairly ineffective.
Discord’s child and teen safety policies are clear: Children under 13 are barred from the messaging app, and the platform more broadly forbids any sexual interaction with minors, including youth “self-endangerment.” It also runs algorithmic filters to block unwanted sexual direct messages. The California-based company’s safety policy, published in 2023, claims “we built Discord to be different and work relentlessly to make it a fun and safe space for teens.”
But New Jersey says “Discord’s promises fell, and continue to fall, flat.”
The attorney general points out that Discord offers three safety settings to protect younger users from unwanted and exploitative messages from adults: “Keep me safe,” in which the platform scans all messages arriving in a user’s inbox; “my friends are nice,” in which it does not scan messages from friends; and “do not scan,” in which it scans no messages at all.
Even for teenage users, the lawsuit alleges, the platform defaults to “my friends are nice.” The attorney general claims this is an intentional design choice that puts younger users at risk. The lawsuit also alleges that Discord fails to conduct age verification that would prevent children under 13 from signing up for the service.
In 2023, Discord added new filters to detect and block unwanted sexual content, but the AG’s office says the company should have enabled the “keep me safe” option by default.