The Use of AI for Better Education
In Response to the Brookings Institution's Rebecca Winthrop and Her Top 5 Davos Takeaways
First Published on LinkedIn January 29, 2025
Setting the scene
Rebecca Winthrop, Senior Fellow and Director of the Center for Universal Education at the Brookings Institution, reported on what she heard at Davos.
The discussion below comments on three of her five takeaways, those concerning neuroplasticity and AI in education.
Link to Rebecca's article: http://tiny.cc/gf18001
Takeaway #1: Neuroplasticity and Manipulation
A comment from Daniel Barcay, Executive Director of the Center for Humane Technology, has stuck with me. He said something close to “I think the most beautiful thing about humans is neuroplasticity, but it also makes humans really easy to manipulate.” This dual nature of neuroplasticity lies at the core of why education is often targeted by oppressive regimes.
This comment couldn’t be more apt. Rather than unlocking a world of potential and possibility for learners and teachers, the influence of oppressive regimes on education does the inverse: it manipulates learners’ neuroplasticity, helping turn them into instruments of the regime. Yet we should also recognize that even in non-oppressive regimes, education is not entirely free of manipulation and often falls short of promoting learner agency. That, however, is a discussion for another time.
Daniel Barcay’s comment, highlighted by Rebecca Winthrop, also invites us to consider its implications for the world of AI.
For quite some time, AI has received significant attention in education and education research, especially in the context of personalized learning, where machine learning has been, and still is, gradually replacing rule-based algorithms. Personalized learning became the new promise for improved learning outcomes. Adaptivity and personalization became a new kind of holy grail for education. Simply because the technology makes it possible?
This is where the intersection of neuroplasticity and AI merits deeper questions and discussion.
We are all unique learners with distinct learning needs and aptitudes and, most crucially, unique brains. This provided the perfect argument for the introduction of ‘personalized learning’. Let’s be clear, context allowing, good and experienced teachers have used personalized approaches in the classroom for years, without technology.
But the promise of AI-driven personalized learning became quite beguiling. I remember reading a report published in 2015 by the World Economic Forum in collaboration with the Boston Consulting Group: New Vision for Education: Unlocking the Potential of Technology[1] (note what is being “unlocked” here). The report championed the use of technology not only to improve learning outcomes but also to develop higher-order thinking skills. What I found rather staggering was that it referred to, and recommended, the ‘closed loop’ instructional system served beautifully by technology. The alarm bells started ringing.
Serving the needs of closed-loop instructional systems? What does this mean in practice? Too often, adaptive and personalized learning technologies have focused very narrowly on improving grades, relying solely on a ‘rote learning’ regimen. This may help memorization but doesn’t necessarily foster neuroplasticity. Although spaced practice is beneficial for learning, the approach of many an adaptive platform traps learners, and teachers (when using adaptive teaching platforms), in learning loops. These often amplify the learner’s struggles through repeated narrow practice rather than appropriate feedback and intervention, unless the good and practiced teacher steps in. The goal? Passing the exam.
While machine learning has come a long way since 2015, it has also intensified the drive toward, and amplification of, personalization. This highly data-dependent approach has been combined with prediction, risking a rather deterministic definition of a learner’s ability and promoting the prediction of grades.
Is the beauty of neuroplasticity being developed, or are we simply amplifying machine-led manipulation? If the focus remains on knowledge construction through curriculum adherence, with too strong an emphasis on memorization rather than application and self-regulation, aren’t we missing the real opportunity?
The question remains open. Promising EdTech innovations continue to emerge, but in K-12 they are still too often retrofitted into systems that do not promote higher-order thinking skills and that undervalue learner agency. Without thoughtful implementation, AI’s predictive capabilities could easily stifle the beauty of neuroplasticity, boxing learners into predefined expectations, their potential predetermined by the algorithm, rather than nurturing their neuroplasticity and capacity to grow.
Takeaway #2: Education Is Fundamentally Different When It Comes to AI
Rebecca Winthrop highlights AI for good and its innovative use, e.g. to support the delivery of clean water to underserved communities: a super tool with meaningful goals, one firmly controlled by the adults.
Her discussion on educator-mediated vs direct-to-child use highlights the importance of retaining teacher agency in mediated use of AI in the classroom.
I would argue that there is, and must be, room for both. This is especially the case with the introduction of generative AI (GenAI) in the classroom and beyond. These technologies, though not yet ubiquitous in education, are increasingly becoming integral to life outside of school and to society at large. They will play an even greater role in the workplace of tomorrow. The extent of their influence remains debated and uncertain, but one thing is certain: education needs to become fundamentally different when it comes to AI.
And here is where the beauty of neuroplasticity can flourish, if we allow it. The adult in the room should remain the mediator, but mediation itself must evolve. Learning needs to move beyond the mere reception of instruction and make room for more co-creation. Too often, teachers are distanced from learning, positioned as disseminators of learning rather than as active meta-learners. This must change.
The introduction of GenAI in the classroom makes this even more pressing. As an epistemic tool, it has the potential to aid the development of knowledge, skills and agency. If used in collaborative and social learning contexts, rather than limited to the tunnel vision of personalized practice, AI can genuinely transform education and how learners engage in learning.
This is why education cannot simply adapt AI to its current structure, nor can it stick to its current views; it needs a rethink for a human-AI world. That means ensuring that learners and their teachers can be the custodians of, and in control of, the relationship between human intelligence and artificial intelligence. This will be a fundamental skill for humanity’s future, one that will determine whether we merely survive or truly thrive.
Where should this start? With teacher-mediated (or rather, meta-learner and co-learner) experimentation in the use of GenAI in the classroom. This needs to happen in a safe, structured and intentional process, supported by relevant pedagogies in a dialogue that benefits both learners and educators[2].
Takeaway #3: Learning “AI Skills” Is Just One Piece of the Puzzle
Rebecca points to the growing emphasis on equipping young people to use AI to meet workplace demands but argues that this alone is not the endgame. The real goal must be to cultivate the ability of “learning to learn”, empowering young people to adapt to rapidly evolving technologies. After all, AI itself will soon look radically different.
This strengthens my argument that mediated and self-regulated use of GenAI and AI tools in the classroom is critical for learners and their teachers. If they are not able to engage as active co-learners, they will fall behind and be at risk of skills atrophy, unable to adapt as technologies advance.
The time has come to see GenAI and AI not as mere tools, but as partners in learning and teaching (co-learning). They must serve as co-mediators, ensuring that learners can become the controlling human agent in the process of “learning to learn”, augmenting and augmented by the beauty of neuroplasticity, not manipulated through it, whether by the machine or the regime.
Final Thoughts
The intersection of AI and education presents immense opportunity—but also profound challenges. If we allow AI to determine learning pathways and predict learner potential, without human oversight and control, we risk undermining the very essence of what makes us human: our ability to learn, adapt, and think freely. But if we leverage AI as another co-learner rather than a controlling force, we can create an education system that not only prepares learners for the future but galvanises them to shape it.
The choice is ours.
[1] World Economic Forum (2015). New Vision for Education: Unlocking the Potential of Technology. Industry Agenda. https://www3.weforum.org/docs/WEFUSA_NewVisionforEducation_Report2015.pdf
[2] Note: The arrival of new LLMs that don’t require the bells and whistles, such as DeepSeek, may also help bridge the inequality in access to education and learning, making the use of GenAI available in under-resourced contexts. However, the challenge of supporting minority languages and local contexts may not be solved that soon (a topic for another post).