In my previous article, When Machines Do the Thinking, I raised a set of provocations around the deployment of Artificial Intelligence in education. The thrust of that piece was to question the seductive promise of AI-led personalization and efficiency by pointing to what gets lost when education becomes data-driven: the embodied, ethical, and imaginative dimensions of learning.
This second article picks up where the first left off. It ventures further into the grey zone between technology and pedagogy, where consciousness meets cognition, presence meets interaction, and decision-making transcends algorithms.
But let me start with a simple, perhaps inconvenient, question:
What is it that AI cannot know?
And this, in turn, leads us to ask:
What is education really for?
These questions are not rhetorical. In India today, where EdTech companies court government partnerships and AI tutors populate urban classrooms and remote intervention programs, they have real consequences. When we install AI systems to teach, test, and track students, especially those from economically and socially disadvantaged backgrounds, we must also ask: Whose understanding of learning is being encoded in these systems? And what gets effaced in the process?
Let us look at five domains where human learning stretches far beyond what algorithms can currently comprehend:
- Consciousness and Emotion
- Social Interaction
- Adaptation and Learning
- Physical Presence
- Decision-making
I. Consciousness and Emotion: What Machines Simulate but Cannot Sense
We often confuse cognition with computation. A student solving an algebraic equation or analyzing a poem may appear to be engaged in pure logic. But beneath that visible task lies a turbulent emotional landscape. The fear of failure. The joy of discovery. The anxiety of being watched. The self-consciousness of speaking in a second language. The boredom of repetition. All these feelings shape, distort, deepen, or block the learning process.
AI, even in its most sophisticated avatar, is a pattern recognizer. It can analyze sentiment from keystrokes or facial micro-expressions, but it does not feel with the learner, nor does it feel for the learner.
Let me offer two examples from field interventions where the limits of AI’s perception become painfully evident. In a government school in the Giridih district of Jharkhand, tablets had been introduced to support digital storytelling modules. The interface was vibrant, with animated visuals, cheerful voices, and interactive prompts. Everything looked successful on paper. One of the students, a young Adivasi girl, was marked present and fully logged in. She was technically completing the tasks, moving through the lessons as expected. But something felt off. Her teacher noticed that she sat quietly, her eyes distant, her body language unresponsive. She was going through the motions, but the spark of engagement was missing.
Trusting her instinct, the teacher paused the session and walked over to the girl. She sat beside her and asked gently, “Tum kahaani mein kya banna chahti ho?” which means, “Who would you like to be in the story?” The girl hesitated. After a long silence, she whispered, “Main toh hamesha bimar banti hoon,” or “I’m always the sick one.”
In the days that followed, the teacher came to understand the deeper meaning behind those words. The girl’s mother had recently left to work in a faraway city, and the child, struggling with separation and loneliness, had begun to express her emotional pain through the idea of illness. But there was more to the story. The tales being told through the digital platform were completely alien to her world. They were set in urban environments with characters and routines she could not relate to. There were no forests, no cattle, no mahua trees, none of the festivals she knew. There were no signs of her Adivasi life. The stories excluded everything she knew and loved.
To the AI system, she appeared to be engaged. But in reality, she was invisible within the world it presented.
The teacher responded not with more data, but with care and creativity. She began telling stories that included the forest trails the girl walked on, the birds she recognized, the local festivals she celebrated, and the wisdom of elders in her community. These stories gave the child a sense of belonging. They made her feel seen. Over time, the girl began to participate. She asked questions, offered answers, and even laughed during class.
The second instance comes from a government school in Udham Singh Nagar, Uttarakhand. A Muslim girl, older than her classmates due to delayed enrollment, had been placed in Grade 2. She often reacted violently in the classroom. She pushed other children, threw objects, and screamed when contradicted. If an AI-based behavior management system had been in place, it might have flagged her as disruptive or aggressive and could have suggested isolation or disciplinary action. Instead, her teacher, concerned by the repeated outbursts, chose to visit her home. There, she observed a crucial detail. The girl’s father routinely used physical aggression to assert control within the household. For this child, violence was not rebellion. It was mimicry. It was the only language of assertion she had seen modeled. Recognizing this, the teacher changed her approach. She introduced routines built on trust, helped the child express herself through drawing and storytelling, and slowly guided her toward more regulated behavior, not through punishment, but through empathy and steady presence.
In both cases, it was not a dashboard or dataset that revealed the deeper issue. It was the attentiveness, the human inquiry, and the moral imagination of a teacher.
No matter how personalized an algorithm may appear, it cannot reach into the emotional and cultural depth that learning often requires.
And this should compel us to ask a fundamental question:
When we delegate core pedagogical responsibilities to machines, are we prepared for what they are bound to overlook?
II. Social Interaction: Education as a Human Dialogue
In much of India, learning is not an isolated process. It is deeply social. Children learn not just from textbooks but from mimicking elder siblings, singing in groups, debating in class, playing under banyan trees, and arguing over who gets to hold the chalk. Sociality is not peripheral to learning; it is learning.
AI systems, however, engage learners in a dyad: one student, one machine. The conversation may be sophisticated, but it remains flat. It lacks the unpredictability, the shared silences, the off-topic jokes, the subtle power dynamics, and the peer learning that give real classrooms their texture.
Consider the philosophy clubs in Krishnamurti Foundation schools, where students engage with moral dilemmas through open-ended dialogue. If these were replaced by AI chatbots, something vital would be lost. It would not just be the content, but the relational fabric of learning - the ability to challenge, misunderstand, respond, and grow together in a shared space.
Even in highly digitized contexts such as South Korea, where platforms like Riiid are commonly used for exam preparation, the government had to reintroduce structured peer-interaction modules. Students were reporting social fatigue, isolation, and a diminishing sense of community. In India, where education is deeply embedded in networks of kinship, caste, language, and local culture, this challenge is likely to be even more pronounced.
This brings us to a critical question:
How can we integrate AI into classrooms without displacing social interaction?
The answer may lie in using AI to facilitate human dialogue, not replace it. Technology can be helpful when it recommends conversation prompts, organizes students into groups based on complementary skills, or tracks equity in participation. But the conversation itself must remain distinctly human, rooted in curiosity, emotion, disagreement, and mutual recognition.
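To make the facilitation role concrete, here is a minimal sketch, in Python, of the kind of quiet background support described above: forming groups around complementary strengths and flagging when a discussion is dominated by a few voices. The data model, the skill labels, and the equity measure are illustrative assumptions of mine, not features of any existing platform.

```python
from collections import Counter
from itertools import zip_longest

def complementary_groups(students, group_size=3):
    """Group students so each group mixes different strong skills.

    `students` is a list of (name, strongest_skill) pairs, a stand-in
    for whatever richer profile a real platform might hold.
    """
    by_skill = {}
    for name, skill in students:
        by_skill.setdefault(skill, []).append(name)
    # Interleave across skill pools so each group draws from several.
    mixed = [n for row in zip_longest(*by_skill.values()) for n in row if n]
    return [mixed[i:i + group_size] for i in range(0, len(mixed), group_size)]

def participation_equity(turns):
    """Return a 0-to-1 score; 1.0 means all speakers spoke equally often.

    `turns` is the sequence of speaker names. A max/min ratio is one of
    many possible measures; students who never speak do not appear in
    `turns`, so a real tool would also check against the roster.
    """
    counts = Counter(turns)
    return min(counts.values()) / max(counts.values())

roster = [("Asha", "drawing"), ("Ravi", "storytelling"), ("Meena", "maths"),
          ("Kabir", "drawing"), ("Tara", "maths"), ("Imran", "storytelling")]
print(complementary_groups(roster))                      # mixed-skill trios
print(participation_equity(["Asha", "Ravi", "Asha",
                            "Meena", "Asha", "Ravi"]))   # 0.33: Asha dominates
```

Note what the sketch does not do: it never generates the conversation itself. It only hands the teacher a grouping and a number; the dialogue, and the judgment about what that number means, remain human.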
III. Adaptation and Learning: The Intelligence of Improvisation
Machine learning thrives on pattern recognition. It adjusts its internal parameters to optimize outputs. But human learning is not merely reactive. It is reflective, imaginative, and deeply contingent on context. Human learners improvise in ways that machines do not yet comprehend.
Let me illustrate with an example from a middle school in Purnia district, Bihar. During a math lesson on perimeter and area, the textbook posed a problem that involved calculating the area of a lawn enclosed by a wooden fence. The teacher quickly realized that most students, who were children of daily-wage labourers and small farmers, had never seen a manicured lawn or a wooden fence. The abstract nature of the question made it irrelevant to their experience. Instead of continuing with the textbook, the teacher walked the students outside to a shared handpump and asked them to trace the rectangular concrete slab that surrounded it. She said, “Yeh haath-pump wala chabutra kitne pair lambaa hai?” (How many footsteps long is this platform around the handpump?) The children began measuring it with their feet, laughing, arguing, and estimating. She then wrote the measurements on the wall and returned to the problem, now calculating the area not of a distant and unfamiliar lawn, but of a chabutra they used every day.
Can AI do this?
At present, no. Adaptive learning platforms can modulate difficulty levels, rephrase questions, or select different examples - but they cannot tap into culturally embedded metaphors or objects from lived experience.
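For readers unfamiliar with how such platforms adapt, here is a deliberately simple Python sketch of response-driven difficulty adjustment. The thresholds, window size, and question bank are invented for illustration, but the structure makes the limitation visible: the system can only select from what its bank already contains, which is exactly why it could not have improvised the chabutra.

```python
# A toy version of the pattern-matching behind adaptive difficulty.
# Thresholds, window size, and questions are illustrative assumptions.

QUESTION_BANK = {
    1: "Find the perimeter of a 4 m by 2 m rectangle.",
    2: "Find the area of a 7 m by 5 m rectangle.",
    3: "A 1 m wide path surrounds a 7 m by 5 m lawn. Find the area of the path.",
}

def next_difficulty(history, level, window=3):
    """Step difficulty up or down based on recent correctness.

    `history` is a list of booleans (True = correct answer).
    """
    recent = history[-window:]
    if len(recent) < window:
        return level                                 # not enough signal yet
    accuracy = sum(recent) / len(recent)
    if accuracy >= 0.8:
        return min(level + 1, max(QUESTION_BANK))    # step up, capped
    if accuracy <= 0.4:
        return max(level - 1, min(QUESTION_BANK))    # step down, floored
    return level

history, level = [], 2
for correct in [True, True, True, False, False, False]:
    history.append(correct)
    level = next_difficulty(history, level)
    print(level, QUESTION_BANK[level])
```

The Purnia teacher's move, by contrast, reached outside any bank: she replaced the question's world, not just its difficulty.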
Moreover, India's cognitive landscapes are multilingual, multiepistemic, and often non-linear. Students simultaneously navigate oral traditions, religious texts, visual storytelling, and digital media. AI systems designed with Western epistemologies frequently fail to grasp this complexity.
We must thus ask: What forms of knowledge does AI privilege? And whose knowledge does it render invisible?
IV. Physical Presence: The Pedagogy of Proximity
The physical presence of a teacher communicates far more than just academic content. Her body, gestures, gaze, voice modulation, and even her silences create an atmosphere of trust, rhythm, accountability, and care. For young children, especially in the foundational years, learning is deeply sensory. They observe, touch, and explore; they mimic movements and mirror emotions. This kind of learning depends on human presence.
Digital AI tutors, regardless of how advanced they may be, remain disembodied. They can simulate interaction on a screen, but they cannot inhabit the same space as the learner. In Indian classrooms, where teacher-student relationships are shaped by rituals such as greeting each other with a namaste, chanting shlokas together, or sharing a midday meal, the absence of physical and emotional presence becomes all the more evident.
During the pandemic, several NGOs introduced AI bots and IVR-based learning modules to maintain some form of educational continuity. These tools offered access to lessons, but in many tribal and rural areas, the simple return of a teacher had a far greater impact. Even one caring adult in the room brought a dramatic improvement in student retention and participation. Learning regained its vitality through human interaction, trust, and the comforting rhythm of shared time.
This insight is not unique to India. In Finland, the National Agency for Education emphasized the need for classrooms to sustain a shared emotional space. That kind of space can only be nurtured through real human presence, not through virtual proxies.
As India continues to adopt AI in education, we must be clear about its role. AI should support teachers, not replace them. It can assist quietly in the background, organizing information or suggesting tasks, but it cannot replace the warmth in a teacher’s voice or the meaning conveyed through a glance.
The real work of teaching still belongs to those who are present with children, in body, mind, and heart.
V. Decision-Making: From Predictive Analytics to Ethical Judgment
Finally, the most profound concern with the use of AI in education is not just a technical one. It is an ethical one. Increasingly, AI systems are being used to make important decisions. They determine who should receive remedial support, who qualifies for scholarships, and who might be best suited for STEM pathways. These decisions are often made through opaque processes, built on data histories that reflect existing structural inequalities.
Consider what happened in the United Kingdom in 2020. An algorithm used to assign A-Level grades ended up downgrading students from disadvantaged schools because it relied heavily on historical performance data. The result was widespread outrage. Many high-performing students, especially from working-class backgrounds, were unfairly penalized. Eventually, the system had to be withdrawn.
Now think about what this could mean in India, where social hierarchies related to caste, gender, and language are deeply embedded in educational experiences. Imagine an AI system trying to assess the performance of a Dalit girl whose academic struggles stem from long-standing exclusion, limited resources, or systemic neglect.
Will the system understand her context and history, or will it simply classify her as a low performer and place her in a remedial track?
Making decisions in education requires more than data. It requires judgment. It demands the ability to hold context alongside performance, and to balance fairness with care. These are qualities that no machine, no matter how advanced, can replicate. What we need are systems that keep teachers and educators at the center. AI can offer suggestions, but humans must interpret them. Teachers must be able to review, question, and adapt AI-generated recommendations. Professional discretion must guide the final call, not automated prediction.
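What would "teachers at the center" look like in system design? Here is a minimal human-in-the-loop sketch in Python, built on an assumption of mine rather than any vendor's product: a recommendation is inert until a named educator accepts or overrides it, with reasons recorded. The names and fields are hypothetical.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class Recommendation:
    """An AI suggestion that stays inert until a teacher rules on it."""
    student_id: str
    suggestion: str                 # e.g. "remedial_track"
    rationale: str                  # the model's stated basis, shown to the teacher
    status: str = "pending"         # pending -> accepted | overridden
    decided_by: str | None = None
    note: str | None = None
    decided_at: str | None = None

def teacher_decision(rec, teacher, accept, note):
    """Record the human call. Downstream systems act only on decided items,
    and every override keeps the teacher's reasoning for audit."""
    rec.status = "accepted" if accept else "overridden"
    rec.decided_by = teacher
    rec.note = note
    rec.decided_at = datetime.now(timezone.utc).isoformat()
    return rec

rec = Recommendation("S-104", "remedial_track",
                     "below-median scores on the last three assessments")
teacher_decision(rec, teacher="Ms. Devi", accept=False,
                 note="Scores reflect a recent family disruption, not ability; "
                      "trying a peer-mentor pairing first.")
```

The design choice matters more than the code: the default state is "pending", so the machine acting alone is impossible by construction, and the override note preserves exactly the kind of contextual judgment, like the hypothetical one recorded here, that predictive systems cannot supply.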
In a country as diverse and unequal as India, human insight is not a luxury. It is a necessity.
Conclusion: Not Just Smarter Systems, But Wiser Societies
As India explores the possibilities of AI in its pursuit of educational equity and excellence, we must remind ourselves of a few simple truths. Efficiency is not the same as wisdom. Prediction does not equal understanding. And simulation cannot replace presence.
Yes, AI can help us scale interventions, customize learning paths, and ease the burden on overworked teachers. It can assist, analyze, and organize. But it cannot care. It cannot respond with warmth to a child’s silence. It cannot improvise when a lesson is met with confusion or grief. It cannot ignite wonder or curiosity. Most importantly, it cannot carry the moral weight of decisions that shape young lives.
The future of education in India does not depend on a choice between machines and teachers. It depends on our ability to reimagine the relationship between the two. We must build a partnership where technology extends human capacity, but never replaces human responsibility.
Such a partnership will require a new kind of institutional culture. One that acknowledges the limits of machines, equips teachers to engage with AI critically, and places consciousness, ethics, and cultural context at the heart of every educational decision. Jiddu Krishnamurti once said, “When you learn something in relationship, then that learning is yours. It is not imposed by authority. It is not forced through fear. It is not learned to pass an examination.” This insight stands in stark contrast to how AI systems in education are being designed today. Authority is now coded into the algorithm. Assessment precedes experience. Relationship is reduced to interface.
In the final part of this series, I will offer a detailed framework for integrating AI into Indian education ethically and inclusively. This will include suggestions on policy design, EdTech governance, teacher training, and context-sensitive implementation.
Until then, I leave you with a simple thought.