Blog
All Things AI.
When Growth Is More Important Than Safeguards: The Real Cost for Youth in the AI Era
In the rush to monetize generative AI tools, something critical is being sidelined: the welfare of young people. When Sam Altman announced a new direction for OpenAI’s next-generation system, GPT‑5, and the related move toward more “friend-like” and adult-oriented chat experiences, the company claimed it now has “better tools” to make these experiences safe. But as someone who works daily with students and educators navigating the realities of AI in the classroom and at home, I remain profoundly unconvinced. My issue isn’t porn, or whether the feature gets dressed up as just “adult content.” My issue is that we have not yet meaningfully protected our young people, nor have we sufficiently educated the broader public about how GenAI works, what information it collects, and how it shapes relationships, emotions, and behaviors.
Request for Collaboration: Help Shape the AI & Data Safety Shield for Schools
Across the country, school leaders are navigating a growing paradox: AI is becoming part of classrooms, communications, and district operations, but the systems that keep students safe haven’t caught up. As part of my work in the EdSAFE AI Catalyst Fellowship, I’m studying this challenge through a research project called the AI & Data Safety Shield for Schools. The EdSAFE AI Catalyst Fellowship is a national program that supports applied research and innovation to advance ethical, transparent, and safe AI in education. Each Fellow explores a Problem of Practice: a real-world challenge that, if solved, could help schools use AI responsibly and equitably.
“HIPAA doesn’t apply to public schools.” That statement is technically correct, and dangerously misleading.
For years, the education sector has operated on the belief that FERPA (the Family Educational Rights and Privacy Act) is the only law that matters when it comes to student data. And for much of the traditional classroom environment, that’s true. But the moment health-related services intersect with educational technology, whether through telehealth platforms, mental health apps, or digital IEP tools, the ground shifts. Suddenly, the boundary between FERPA and HIPAA isn’t just academic. It’s operational, legal, and reputational.
Schools Don’t Just Buy Software. They Buy Trust.
The best product doesn’t always win. In fact, in K–12, it often doesn’t. You can have the cleanest UI, the sharpest onboarding flow, and the most impressive AI feature set in your category, and still get dropped in procurement. Not because of price. Not because of a competitor’s edge. But because the district couldn’t say yes with confidence. They couldn’t explain your AI use to their superintendent. They couldn’t get your DPA (data privacy agreement) past legal in under six weeks. They couldn’t bet their district’s reputation on a product that only might be compliant. And so, they passed. Not because they didn’t like you, but because you didn’t feel safe enough to approve. In K–12, Trust Isn’t the Last Thing. It’s the First.
Your AI Feature Isn’t the Problem. The Trust Gap Is.
AI is everywhere in EdTech: automated feedback, adaptive learning paths, grading support, and content generation. If you’re building smart, AI-powered tools for K–12, you’re in the right race. But many vendors hit the same wall: enthusiastic interest from district leaders, then a long stall… or silence. The reason? Your product is technically impressive but governance-blind.

