
Ruchi Tewari, Associate Dean – Marketing, Communications and Public Affairs – MICA

The question oft asked is, ‘Is Artificial Intelligence stifling students’ thinking?’ My answer to that is, ‘Of course, yes’, but it forces me to ask a bigger question: ‘Can we do something about it?’ And the answer to that is, ‘Yes, but you will have to think critically, work passionately and seek support from learners and the administrative machinery to make it happen.’

The joy of being part of education lies in the challenges it throws up. Each challenge carries a similar threat, ‘compromised learning output leading to reduced human potential’, and a true educator is always willing to strike back. In today’s rapidly evolving academic landscape, the use of Artificial Intelligence (AI) in management education has become widespread, almost ubiquitous. From auto-completing case study analyses to assisting in business writing, AI is now the go-to for students seeking quick answers. According to a 2024 McKinsey report, nearly 67% of management students in postgraduate programs globally admit to using AI tools for academic assignments at least once a week. However, this convenience comes at a cost.

The AI Paradox in Education

Generative AI systems such as ChatGPT process massive amounts of information to generate coherent responses. While incredibly helpful, these systems operate within the boundaries of the prompts given to them. This “prompt dependency” often results in content that lacks depth, contextual relevance, and emotional or cultural sensitivity. Scholars such as Brennen & Kreiss (2020) argue that undue reliance on AI in academic writing promotes superficiality and dilutes authenticity, creating a generation of thinkers who may never have learned how to think deeply.

So, what can management institutions do to build cognitive rigor alongside technological fluency?

Critical Thinking: The Missing Piece

Critical thinking remains a core skill in the modern management toolkit. As Paul & Elder (2020) put it, critical thinking enables students to evaluate AI-generated responses not just for grammar and structure, but for contextual fitment, ethical nuance, and organizational relevance. This skill is of paramount importance for management students, whose success depends on the decisions they take, which in turn rest on the nature and quality of the data and information at hand. Given the growing inclusion of and dependence on AI-generated output, students’ critical thinking skills need to be sharp enough to spot biases, false claims and assertions, misrepresentation, and fake output. Most schools are rising to this need and testing ways in which AI use is affecting student learning. To combat the ill effects of AI, schools are adopting unique pedagogical innovations to ensure that students combine analytical ability with creative agility.

Reporting One Such Experiment

Name of the Experiment: Writing Course Meets a Digital Dilemma

Respondents: First-trimester students of a management program taking a course on ‘Communication – Thinking and Writing’. The course was writing-intensive and designed to train students in logical reasoning, argument building, and strategic message design.

Challenge: It was known from past experience that many students used AI tools to draft their assignments, even though this clashed with the course’s academic integrity guidelines.

Response: Instead of simply clamping down on AI use, the faculty adopted a research-based approach. Collaborating with a teaching assistant, the faculty member conducted:

  • Field notes during classroom writing activities
  • Analyses of submitted assignments
  • Focus group discussions and semi-structured interviews

Results: The findings were both revealing and instructive:

  • Students lacked awareness of the risks and limitations of generative AI
  • They were torn between ease and ethics, unsure whether to choose speed over originality
  • AI usage peaked during the first assignment, and persisted unless explicitly addressed
  • Constructive feedback loops and iterative writing tasks helped students rediscover their voice


Designing a Solution: AI as a Tool, Not a Crutch

Rather than banning AI outright, the course design was restructured to integrate AI use transparently. The updated pedagogy followed a three-tiered approach:

  1. Structured Group Task with AI Use

Students were encouraged to use AI tools, but they had to submit their prompts, outputs, and rationale for each interaction. This not only made the process open but also helped them reflect on what they were asking, and why.

  2. Collaborative Feedback + Individual Rewrite

Each group then participated in a faculty-led feedback session. Students took notes and reworked the group output into individual submissions, once again using AI if needed—but now with a clear critical lens.

  3. Reflection and Self-Assessment

The final submission included a reflective note on AI’s limitations. Many students admitted that while AI was helpful in structuring their thoughts, it failed to produce the insight and emotional resonance they could bring through their own thinking.

By the end of the third round, most students chose not to use AI at all—opting instead for their newly strengthened writing abilities.

Key Learnings and Takeaways

This exercise surfaced three critical lessons for educators and curriculum designers:

  • AI doesn’t need to be banned—it needs to be framed. When integrated with intention, AI can be a springboard for deeper thought rather than a replacement for it.
  • Reflection and iteration are essential. Students only begin to trust their own voice when given space to try, fail, and improve with support.
  • Ethics matter. Transparent use of technology and honest conversations around it create a culture of responsibility and trust.


The Road Ahead: Human-Centered Learning in an AI World

As more AI tools flood into classrooms, management education stands at a crossroads. Should we fear AI’s influence, or teach students to master it critically and ethically?

The results of the experiment showed that the answer lies not in choosing between human and machine, but in aligning them for better learning outcomes. With AI as a collaborative partner—not a ghostwriter—students are more likely to become the strategic thinkers, persuasive communicators, and ethical leaders that the future demands.

In the words of one student, “The more I wrote on my own, the more I realized that AI couldn’t think like me. It could only repeat what was already said.” And in that realization lies the true power of education.
