
Blog
All Things AI.

The First AI Incident in Your Organization Won’t Be a Big One. That’s the Problem.
Your first AI incident won’t be big. But it will be revealing. It will expose the cracks in your processes, the ambiguity in your policies, and the reality of how your team uses AI. If you wait for a significant event before acting, you’ll already be behind. Building responsible AI systems doesn’t start with compliance. It begins with clarity and a willingness to take the first step before the incident occurs.

“HIPAA doesn’t apply to public schools.” That statement is technically correct, and dangerously misleading.
For years, the education sector has operated on the belief that FERPA (Family Educational Rights and Privacy Act) is the only law that matters when it comes to student data. And for much of the traditional classroom environment, that's true. But the moment health-related services intersect with educational technology, whether through telehealth platforms, mental health apps, or digital IEP tools, the ground shifts. Suddenly, the boundary between FERPA and HIPAA isn't just academic. It's operational, legal, and reputational.

Schools Don’t Just Buy Software. They Buy Trust.
The best product doesn't always win. In fact, in K–12, it often doesn't. You can have the cleanest UI, the sharpest onboarding flow, and the most impressive AI feature set in your category and still get dropped in procurement. Not because of price. Not because of a competitor's edge. But because the district couldn't say yes with confidence. They couldn't explain your AI use to their superintendent. They couldn't get your DPA past legal in under six weeks. They couldn't bet their district's reputation on a product that might be compliant. And so they passed. Not because they didn't like you, but because you didn't feel safe enough to approve. In K–12, Trust Isn't the Last Thing. It's the First.

Your AI Feature Isn’t the Problem. The Trust Gap Is.
AI is everywhere in EdTech: automated feedback, adaptive learning paths, grading support, and content generation. If you're building smart, AI-powered tools for K–12, you're in the right race. But many vendors hit the same wall: enthusiastic interest from district leaders, then a long stall… or silence. The reason? Your product is technically impressive but governance-blind.