According to Mashable, Discord just announced new Family Center features that give parents visibility into their teen’s activity on the platform. Starting this week, parents can see their teen’s top five messaging and calling contacts, frequently messaged servers, total call minutes, and all purchases. But here’s the catch: this tracking only covers the past seven days, so parents who forget to check lose that historical data. The company’s global head of product policy, Savannah Badalich, says these features reflect direct feedback from parents and partner organizations. Discord plans to introduce more safety measures early next year, though the platform currently uses age assurance only in the UK and Australia.
The legal backdrop is impossible to ignore
Look, these safety updates don’t exist in a vacuum. Discord is facing multiple lawsuits, including one filed by Dolman Law Group that names the platform as a co-defendant. The lawsuit alleges Discord and Roblox created a “breeding ground for predators” and describes horrific cases in which predators used both platforms to target children. One case involves an 11-year-old who was allegedly groomed and exploited through these apps. Another was brought on behalf of a parent whose child died by suicide after being manipulated by someone who contacted them via Roblox and Discord. Read against that context, these new safety features start to feel… inadequate.
The privacy versus safety tension
Badalich acknowledged the fundamental challenge here: teens want privacy while parents want oversight. Discord’s trying to walk this tightrope by giving parents some visibility without completely invading teen privacy. But honestly, the one-week tracking limit seems designed more to avoid looking like Big Brother than to actually protect kids. If a predator grooms a child over several weeks, that pattern could easily slip past parents who only see a rolling seven-day snapshot. The company says it wants to “catalyze conversations” between parents and teens, which sounds nice but feels like corporate-speak for “we don’t want to be responsible.”
Where this fits in the bigger picture
Every social platform eventually faces this moment: the pivot from growth-at-all-costs to responsible moderation. Discord built its reputation on being the cool, privacy-focused chat app for gamers. Now it’s discovering that what makes it appealing (anonymity, ease of connection) also makes it dangerous. The company is playing catch-up while lawsuits pile up and public pressure mounts. It isn’t alone; Roblox recently launched age verification too. But here’s the thing: when your platform allegedly facilitates life-altering harm to children, adding parental controls feels like putting a bandage on a hemorrhage. These features might help somewhat, but they don’t address the fundamental design choices that made Discord attractive to predators in the first place.
What’s still missing
Discord’s approach feels reactive rather than proactive. Sure, parents can now see who their teens talked to last week. But what about the anonymous users who pop in and out of servers? What about the content that gets deleted before parents ever see it? The company says it proactively flags risky content and accounts, but the lawsuits suggest those systems aren’t working well enough. And while it prohibits synthetic child sexual abuse material, that’s basically the bare minimum any platform should do. The real test will be whether these new features actually prevent harm or just give Discord legal cover. Given the limited tracking window and the fact that teens can still easily lie about their age, I’m skeptical.
