AI for an EHCP? Here's why it's an absolute no-go.
- Lillie
- May 3
Updated: May 5

There's a lot of talk at the moment about artificial intelligence (AI). Tools like ChatGPT, Claude, and Copilot are being recommended more and more for things like writing documents, drafting emails, and even preparing for legal cases. Some services are encouraging parents to use them when applying for an EHCP or preparing for a Tribunal or Judicial Review. On the surface, this might sound like a smart, cost-saving option.
But we want to be absolutely clear: using AI tools to prepare or manage your child's SEND case is unsafe and unsuitable, and it could seriously damage your child's chances of getting the support they need. There are very real risks (legal, personal, and even cybersecurity-related) that parents must understand before handing over their child's future to a piece of software.
AI isn't legal expertise; it just looks like it
AI is very good at producing things that look polished. But legal language is not just about how something reads. It's about precision, accuracy, and meeting specific statutory thresholds. An EHCP request or tribunal bundle that misses or misstates the legal test will be ineffective, no matter how well-written it appears.
AI systems don't know your child. They don't understand how the law applies to the unique facts of your case. They can't question an expert report, spot inconsistencies in LA evidence, or adapt arguments in response to new developments. We've seen examples where AI has confidently given completely incorrect information or misinterpreted UK SEND law based on outdated or non-UK legal principles.
Even more worrying, parents often don't realise there's a problem until they receive a refusal or, worse, attend a hearing where a judge points out flaws in their case. The stakes are too high to leave this to guesswork. A Tribunal decision can shape a child's educational journey for years to come.
Your child deserves more than a robot's guess.
At the heart of this issue is the human side of SEND advocacy. Supporting families through the EHCP process or a legal challenge isn't just about paperwork; it's about understanding the child, their needs, and the wider family context. It's about building trust, navigating complex emotions, and making families feel heard and empowered.
AI can't do this. It doesn't understand trauma. It doesn't recognise distress. It doesn't pick up on the subtle signs that a parent is overwhelmed or that a piece of evidence needs careful handling. Only a real person with relevant experience and empathy can offer this kind of support.
Parents need more than a few paragraphs generated by a black-box algorithm. They need someone to talk to, to challenge ideas with, to check things over with fresh eyes. Someone who knows what to say in a difficult meeting and how to tailor arguments to what's actually happening on the ground. No AI tool can do that.
Cybersecurity and data protection: What you share might not be safe
There's another issue here, and it's not just about poor advice; it's about where your information ends up.
AI tools, especially well-known ones like ChatGPT, are typically hosted by companies based in the United States or otherwise outside the UK and EU. When you type something into these tools, such as your child's name, diagnosis, or personal circumstances, you may be uploading that data to external servers. Many of these companies state in their terms that your input may be stored, analysed, and even used to improve their systems.
That means your child's sensitive data could be accessed by people you don't know, stored in locations you can't control, and processed under laws that don't meet UK GDPR standards. Even if no one intentionally misuses the data, breaches happen, and when they do, your child's privacy is at risk.
You might also unknowingly breach data protection rules by copying medical reports, educational assessments, or other third-party content into these platforms. Once that information is submitted, there's often no way to delete it or guarantee it hasn't been shared further.
When it comes to your child's future, get the right support.
We understand why people are drawn to AI: it's quick, free, and promises instant answers. However, there is no such thing as a free lunch. The price you pay is your, or your child's, personal and sensitive data, which may be processed globally and, if used for LLM training, could become accessible to anyone. This is your child's legal right to education, and getting it wrong can have long-term consequences. Mistakes made at the EHCP or Tribunal stage can delay support for months or even years.
SEND law is complex. The stakes are high. And your child deserves advocacy that reflects their unique circumstances, not the output of a generic tool trained on internet data.
Please don't risk your child's future for the sake of convenience. Work with trained professionals who understand the law, the process, and the lived reality of SEND families. Look for transparent, qualified, and experienced professionals who will treat your child's information with the sensitivity and security it deserves.
You wouldn't trust a robot to write your Will, manage your divorce, or represent you in court. So don't trust one with your child's education either.
We are, and always will be, Stronger Together.
You can find testimonials for our bespoke services on our website and Facebook business page - here