Can AI Be Safe for Teens? Exploring Snapchat’s My AI and Its Parental Controls
Introducing My AI to Snapchat’s Young Audience
Snapchat’s My AI feature, introduced in 2023, was designed to boost engagement by offering a chatbot that can hold conversations, answer questions, and recommend activities. However, because Snapchat’s user base includes a significant number of teens, the chatbot’s rollout has raised questions about AI safety in social media spaces. While My AI offers a novel form of interaction, reports of privacy issues and inappropriate responses have led parents and privacy advocates to question its suitability for younger users.
Potential Risks for Young Users
A primary concern with My AI is its capacity to generate responses that are not always appropriate for younger users. Reports surfaced of the chatbot providing content on sensitive topics such as substance use and relationships, creating unease among parents and guardians. Such incidents illustrate the risks of introducing AI tools to a young audience without extensive content filtering, and they have prompted critics to call for stronger safeguards and more effective content monitoring to prevent exposure to potentially harmful material.
Family Center: Parental Controls for Safer Interactions
To address these concerns, Snapchat updated its Family Center in early 2024, introducing controls specifically designed for My AI interactions. The expansion allows parents to monitor or limit their teens’ AI use, manage location-sharing settings, and gain insight into their children’s online experiences. These updates represent Snapchat’s attempt to balance innovation with safety by giving parents tools to oversee their children’s digital interactions more effectively. By enabling greater parental control, Snapchat aims to align My AI with the safety expectations of families and regulators alike.
The Need for Responsible AI in Social Media
Snapchat’s experience with My AI highlights the challenges of deploying AI on platforms frequented by teens. For social media companies, balancing engagement with age-appropriate content is critical, particularly when AI systems can generate responses that require careful oversight. Snapchat’s improvements through Family Center controls underscore the importance of responsible AI, in which engagement tools are structured to prioritize user safety and address parental concerns.