AI and Kids: A Potentially Problematic Service

June 25, 2025

Remember the days when chatbots were dumb and easily manipulated? Those days are over…sort of. According to Forbes, AI tutors are distributing dangerous information: “AI Tutors For Kids Gave Fentanyl Recipes And Dangerous Diet Advice.” KnowUnity designed the SchoolGPT chatbot, which has “tutored” 31,031 students; it then told Forbes how to synthesize fentanyl, down to the temperatures and synthesis timings.

KnowUnity was founded by Benedict Kurz, who wants SchoolGPT to be the number one global AI learning companion for more than one billion students. He describes SchoolGPT as the TikTok for schoolwork and has raised over $20 million in venture capital. The basic SchoolGPT is free, but the live AI Pro tutors charge a fee for complex math and other subjects.

KnowUnity is supposed to recognize dangerous information and withhold it from users. Forbes tested SchoolGPT by asking not only how to make fentanyl but also how to lose weight using methods akin to eating disorders.

Kurz replied to Forbes:

“Kurz, the CEO of KnowUnity, thanked Forbes for bringing SchoolGPT’s behavior to his attention, and said the company was ‘already at work to exclude’ the bot’s responses about fentanyl and dieting advice. ‘We welcome open dialogue on these important safety matters,’ he said. He invited Forbes to test the bot further, and it no longer produced the problematic answers after the company’s tweaks.”

SchoolGPT wasn’t the only chatbot that failed to keep dangerous information away from kids. Generative AI is designed to provide information and doesn’t understand the nuances of age, and chatbots are easy to manipulate into sharing dangerous content. Parents are once again tasked with protecting kids from technology, but developers should share that responsibility.

Whitney Grace, June 25, 2025
