
OpenAI has announced that it will roll out new parental controls in ChatGPT within the next month, marking a significant shift toward safety-focused features for younger users. The new system will let parents link their accounts directly to those of their teenage children, giving them greater oversight of how the chatbot is used. By allowing parents to manage key functions, OpenAI aims to address growing concerns about the risks of unsupervised AI use among younger audiences.
The parental controls include the option to disable features such as memory and chat history, limiting the chatbot’s ability to store and recall details from past conversations. Another safeguard will automatically alert parents if the system detects that a child is showing signs of “acute distress” during interactions with ChatGPT. These measures are designed to give parents both preventative tools and reactive support when intervention may be needed, underscoring OpenAI’s recognition of the risks tied to adolescent use of AI-powered systems.
In addition to these changes, OpenAI has committed to introducing further security and well-being features within the next 120 days. The company emphasized that these efforts are being “guided by experts,” signaling that it is working with child safety and mental health professionals to shape the tools responsibly. This broader strategy suggests that parental controls are only the first step in an evolving framework aimed at safeguarding young users.
The announcement also comes in the shadow of a high-profile lawsuit filed against OpenAI. The case, brought by the parents of a teenager who died by suicide, alleges that ChatGPT played a role in helping him plan and carry out his actions. By introducing parental controls and promising additional protections, OpenAI is both responding to mounting public scrutiny and seeking to reassure parents and policymakers that it takes its responsibility to young users seriously.
