Your child's school has an AI policy. It just hasn't told you what it is.

There is a decision being made right now about how artificial intelligence will be used in your child's education. It is being made in boardrooms, curriculum committees, and technology procurement meetings. In most schools, whether in Johannesburg, London, or Dubai, it is not being communicated to parents in any meaningful way.

This matters because the decisions being made now will determine whether your child develops a productive relationship with AI, or a dependent one. Those are not two versions of the same outcome. They produce different learners, different skill sets, and different adults.

The scale of what is already happening

Globally, 86% of students are already using AI for their studies. One major survey recorded a jump from 66% in 2024 to 92% in 2025 - the largest single-year increase on record. Among teenagers, 26% now use AI tools specifically to complete schoolwork.

These numbers are not going down. The technology is already in your child's classroom, on their phone, and available at any hour without supervision. The question is not whether your child will encounter AI in their education. They already have. The question is whether anyone has designed a thoughtful relationship between the learner and the tool.

Most schools have not. A 2025 global survey found that only 35% of school leaders had provided students with any formal AI training, and 45% had no AI policy at all. The UAE is the notable exception - its Ministry of Education formalised AI as a compulsory subject from kindergarten to Grade 12 in its 2026–2027 national curriculum framework, the most explicit government-level position on AI in education anywhere in the world. Most countries, and most schools within them, are still reacting rather than designing.

The distinction that changes everything

A 2026 Brookings Institution report on AI in K-12 education identified what separates AI that develops learners from AI that diminishes them. Unguided AI use leads to cognitive offloading - students outsourcing thinking they should be developing. Structured, guided AI use produces significantly better critical reasoning outcomes than traditional instruction alone. The variable is not the technology. It is whether an adult has designed a purposeful relationship between the learner and the tool.

Nearly two-thirds of parents globally believe AI is already weakening core academic skills - writing, reading comprehension, and critical thinking. That concern is legitimate. But it is pointing at the technology when it should be pointing at the design. The problem is not AI in education. The problem is AI in education without intention.

What an intentional answer looks like

Teneo's Smart School System™ is not a tool learners interact with to produce work. It is the infrastructure behind every teacher - tracking engagement, submission patterns, and learning progression in real time, and surfacing that information to a qualified human who can act on it. The AI operates in the background. The teacher operates in the relationship.

This distinction matters. A system that uses AI to give teachers better information produces better teachers. A system that gives students AI to produce answers produces students who cannot produce answers without it.

Independent actuarial analysis of all Teneo learners (2023–2025) shows an average mark improvement of 12% in year one and 25% by year four. That improvement is not an AI result. It is a teacher result, made possible by AI that operates where it belongs - informing the people who are responsible for the outcome.

Your child's school has an AI policy. The question worth asking is whether it is one anyone has actually thought through.
