AI and EHCPs

The Future of Local Authority Decision-Making, or a Risk to Families?

This week, several reports have highlighted the use of artificial intelligence (AI) by Local Authorities (LAs) in managing Education, Health and Care Plans (EHCPs). It is a development that raises both hope and concern for families navigating the SEND system.

So what does it mean if AI becomes part of EHCP decision-making? Let’s weigh up the pros and cons.

The Potential Benefits of AI

1. Faster Administration
One of the biggest frustrations for families is delay. AI tools could speed up routine tasks like flagging missing evidence, checking timeframes, or drafting standard sections of reports. This could, in theory, free up human staff to focus on meaningful engagement with families.

2. Consistency Across Cases
AI systems can apply rules consistently, reducing the chance of two families receiving different decisions simply because they had different caseworkers. In principle, this could improve fairness.

3. Better Use of Data
AI can analyse large amounts of information quickly. Used well, it could help spot gaps in provision across an area, identify common needs, and highlight where schools or services are struggling to meet demand.

The Risks and Concerns

1. “Computer Says No” Decisions
Families fear that decisions about children’s futures could be reduced to algorithms. AI can only work with the data it is given. If reports are vague, incomplete, or biased, the outcome will be flawed. There is a danger that complex human needs will be oversimplified.

2. Lack of Transparency
If a decision is influenced by an AI model, how can families challenge it? Unlike a human caseworker, an algorithm cannot readily explain its reasoning in plain English. Without transparency, parents may be left powerless.

3. Risk of Cost-Cutting
If AI is trained with financial “savings” as a priority, it may recommend lower levels of provision. This could turn AI into a tool for reducing support rather than improving it.

4. Data Protection Issues
EHCPs contain sensitive personal information. Families have a right to know how that data is used, who can access it, and whether it is secure. The use of AI adds another layer of risk.

Striking the Balance

AI has the potential to improve efficiency, but it cannot replace professional judgement, empathy, or legal accountability. The Children and Families Act 2014 and the SEND Code of Practice place duties on LAs that cannot be delegated to a machine.

For families, the key questions to ask are:

  • Is AI being used to support caseworkers, or to replace them?

  • Are decisions still being made by accountable human professionals?

  • How can parents appeal or challenge AI-influenced outcomes?

Final Thoughts

Technology is not the enemy. Used carefully, AI could help reduce delays and bring greater consistency to the EHCP process. But families must be assured that human judgement, lawful process, and the child’s voice remain central.

The SEND system already struggles with trust. If AI is seen as a cost-cutting shortcut, that trust will erode further. Families need reassurance that any use of AI in EHCPs will enhance fairness, not undermine it.
