
An AI Companion Helps Students Learn to Learn

Monday, December 2, 2024
By Anand Nandkumar
Illustration by iStock/Svetlana Larshina
In a Strategic Innovation Management class at the Indian School of Business, students learn to ask questions by engaging with a customized chatbot.
  • For each assignment in an EMBA class, a session bot is primed with materials such as newspaper articles and transcripts of videos.
  • Before class starts, students are given assignments to complete, as well as guided prompts they can use to query the AI tutor to deepen their learning.
  • Students who embrace the chatbot’s capabilities engage with primary sources at a higher rate and perform better than students who do not use the tool.

 
In today’s workplace, managers must constantly navigate ambiguous or uncertain domains. They are expected to make strategic decisions about problems or situations they have never faced before.

That’s why one of the most critical skills that aspiring managers can have is the ability to learn to learn continuously. Managers may not have all the answers, but they will make better decisions if they are willing to invest the time in educating themselves.

Instructors can help students develop the learning “muscle” by teaching them how to ask questions. One tool instructors can use to achieve this goal is artificial intelligence (AI). When used well or creatively, AI has the potential to amplify individual student learning. It can support student exploration, push students to be curious, and interact with them in ways that match their individual experiences and abilities.

In fact, I believe AI can be part of the instructional process that Kathleen Hogan and Michael Pressley describe as scaffolding. It can be a device that allows students to learn by themselves or with some help.

For that reason, I recently incorporated an AI tutor into the Strategic Innovation Management (SIMT) course I teach in the Indian School of Business’s executive MBA program for working professionals. I piloted this version of the SIMT course with a total of about 125 students in the Delhi and Hyderabad cohorts. Here are the results of that experiment.

Introducing the AI Tutor

I designed the pilot of my SIMT course around the central theme that learning comes from asking questions. My goal was to simulate real-world business challenges, where decision-making often requires managers to understand the interplay between unfamiliar strategy, ever-changing technology, and a constant need for innovation.

In the course, students meet in person one weekend a month. Traditionally, EMBA students receive assignments and take exams after the in-person meetings, but that’s not the way I organized this version of the SIMT class.

Instead, I traded the traditional lecture-homework-exam approach for a learning-by-doing format. As Charles Bonwell and James Eison write, strategies that promote active learning should be “instructional activities involving students in doing things—discussing, writing, analysing—and thinking about what they are doing.” My class exemplified active learning.


To ignite critical thinking and train students to ask for information in a structured way, I created integrative assessments that had to be submitted before the in-person class even began. Course outlines and assignment questions were given to students three weeks before each assignment was due.

To tackle these assessments, students were encouraged to question an AI tutor that had been trained on class materials and videos. To guide students in their discussions with the AI tutor, I shared 10 to 12 prompts per session in the course summary. Students could use the AI tutor as a sparring or discussion partner as they engaged with the course material. Then they could interact with it more deeply, either using the guided prompts I had provided or asking their own questions.

“After using the given prompts, sometimes I needed more clarifications or details or did not understand the focal point of the reading,” says Anu Jalan, a student from the Delhi cohort who works as a data analyst with an international bank. “I added my questions to the given prompts and asked the AI tutor to give more clarity based on my background and experience. It was like having a study companion there, especially for me.”

I also alerted students to the possibility that the AI tutor could sometimes hallucinate, which meant they would need to double-check the information it provided. Student feedback indicated that the prompt-based approach helped mitigate the issues associated with AI hallucinations.

I directed students to work with the AI tool until they understood multiple frameworks and became informed enough to complete their assignments. Then, once they were in class and we went over the assignments, they could figure out what they did well and what they could have done differently.

Although some students were hesitant about this format at first, many of them came to greatly appreciate it. “I was taken aback when the course outline came,” admits Sarat Chandra Pemmaraju, a technical architect at Hewlett Packard who was part of the Hyderabad cohort. “I thought to myself, ‘With this grading pattern, we have finished the course even before we set foot into the classroom.’ However, since 90 percent of the grading was on what we did in our assignments, I had to be more prepared. Having an AI tutor meant I covered all the reading material, which is not usually the case.”

Creating the Companion

Introducing an AI tutor meant I had to prepare differently for this course, even though I have used this material multiple times. Since questions or prompts had to be planted up front, I first had to prepare eight sets of detailed questions—almost 100 altogether. Then, I spent a significant amount of time synchronizing the responses from the AI tutor with how I would teach the case in the classroom. This required me to construct the story in advance.

Working with predefined materials, I used the AI tool Cody to train and create custom GPT-based bots built on ChatGPT 3.5. I created an individual session bot for each lecture, setting its characteristic to “creative” so as not to restrict its responses. I uploaded reading materials such as newspaper articles, session slides, and transcripts of videos (Cody does not accept video or MP3 files). All of this information became training material for the SIMT bots.
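To make this setup concrete, here is a minimal sketch of how a per-session bot of this kind could be assembled. It uses the OpenAI Python client as an illustrative stand-in for Cody; the file names, system prompt, and temperature value are my assumptions for the example, not the actual course configuration.

```python
# Sketch of a per-session tutor bot grounded in uploaded course materials.
# Illustrative only: stands in for the Cody setup described in the article.
from pathlib import Path

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Hypothetical course materials for one session: newspaper articles,
# session slides, and transcripts of videos (text files only).
MATERIALS = ["article_disruption.txt", "session1_slides.txt", "video1_transcript.txt"]
corpus = "\n\n".join(Path(name).read_text() for name in MATERIALS)

SYSTEM_PROMPT = (
    "You are an AI tutor for one session of a Strategic Innovation "
    "Management course. Answer student questions from the course materials "
    "below, and say so when the materials do not cover a question.\n\n"
    + corpus
)

def ask_tutor(question: str) -> str:
    """Send one student question to the session bot and return its answer."""
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        temperature=1.0,  # a higher temperature approximates the 'creative' setting
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content

print(ask_tutor("Summarize the pre-reads for this session."))
```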


Since I have a background in information systems, designing the technical architecture was easy for me. The challenging part was designing the prompts for every session and each reading. I didn’t want students to start working with the AI tutor until I had tested the prompts myself. I spent about 20 hours with each set of prompts, doing dry runs, testing updates, and refining the questions.

I wanted to validate the accuracy and appropriateness of the bot’s responses for all the significant focal points from the reading material, and I also wanted to check for hallucinations. I found that the tool occasionally produced inaccurate or fabricated responses. For example, sometimes it cited nonexistent sources or references and, at times, offered overly generic information that didn’t align closely with the intended concepts.

When I thought of introducing an AI tutor, it never occurred to me that I would have to do so much work myself. However, planning the details of the case teaching should be a one-time effort, and I don’t expect to change much for the next few cohorts. I believe I have anticipated the questions students might think of when they first encounter a case, and I have created the right prompts to guide them. And, overall, I think all the effort I invested has made the AI tutor that much more effective for students.

So far, most students have quickly learned how to use this “evolving tool,” as it was described by Raghu Kiran Gajula, a student from the Hyderabad cohort who is a product lead with a software firm. According to Gajula, “We needed to give it the correct question to get the response that suited our specific level of understanding. If I wanted to drill down, I had to prompt the AI tutor again. In terms of accuracy, the bots were closer to a 90 percent match with what I read and what it gave as a summary.”

Assessing the Results

The integrative assessments of the SIMT course allowed me to test the potential of AI to act as a tutor and aid students as they worked on their assignments. These assessments also enabled me to determine if students could identify and analyze managerial and strategic issues related to innovation and technology, two of the learning goals for the course.

In their individual submissions, students had to document the range of their interactions with the AI tutor. For instance, they had to show how they had dealt with problems as if they were consultants to CEOs, commentators on real-life business issues, or even advisors to global companies looking to foray into India.

In their group work, teams of eight or nine students had to present a case linked to a lecture. William R. Slomanson describes this type of flipped classroom as a methodology in which the professor’s lecture is delivered at home and the student’s homework is done in class. I consider doing “homework in class” to be a way of encouraging higher-order thinking because students must engage in discussions to clarify their ideas. They cannot simply take notes passively while they listen to me lecture.

After each two-hour classroom session, students received comparative feedback on the individual assignments they had submitted. I employed a GPT to categorize these submissions into three distinct clusters based on how well each submission matched my benchmark and how it compared to the work of others in the class. This gave students a qualitative measure of their responses. It is important to note that a dedicated teaching assistant—a human—conducted the final grading for individual assignments.
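For illustration, a categorization step of this kind might look like the sketch below. It assumes the OpenAI chat completions API; the cluster labels, prompt wording, and function name are hypothetical, since the article does not specify the exact setup.

```python
# Sketch: ask a GPT to place one submission into one of three clusters
# relative to an instructor benchmark. Labels and prompt are illustrative.
from openai import OpenAI

client = OpenAI()

CLUSTERS = ["close to benchmark", "partially aligned", "needs rework"]

def cluster_submission(benchmark: str, submission: str) -> str:
    """Return the cluster label the model assigns to a submission."""
    prompt = (
        "Instructor benchmark answer:\n" + benchmark
        + "\n\nStudent submission:\n" + submission
        + "\n\nClassify the submission into exactly one of these clusters: "
        + ", ".join(CLUSTERS) + ". Reply with the cluster name only."
    )
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        temperature=0,  # deterministic output for consistent categorization
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content.strip()
```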


Post-class surveys revealed that students who embraced the AI tutor’s capabilities engaged with primary sources at a significantly higher rate than students in previous cohorts. They also performed better than students who did not take advantage of the tutor, leading me to believe it was a viable tool for scaffolding student learning.

“Usually, defining the applicability of the concepts is challenging,” says Revanth Vaddi, a student in the Hyderabad cohort who works as a product manager with a global investment banking firm. “It leads to a lot of conversations and debates during group work. In this course, we got on a group call, debated the relevance of the solution, and used the AI tutor a lot to think through the solutions.”

Paying Attention to Prompts

After the pilot programs were completed, I also assessed how student performance for the Hyderabad cohort was affected by the type and number of questions students asked.

A correlation analysis for that cohort showed me the relationship between the number of questions a student asked per assignment and the score that the student received. The positive correlation indicated that those who interacted more deeply with the AI tutor scored better.
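As a sketch of what this analysis involves, the snippet below computes a Pearson correlation between questions asked and scores over a simple per-student table. The column names and numbers are invented for illustration and are not the cohort’s data.

```python
import pandas as pd

# Hypothetical per-student records for one assignment (illustrative only).
df = pd.DataFrame({
    "questions_asked": [3, 7, 12, 5, 18, 9],  # questions posed to the AI tutor
    "score": [62, 70, 81, 66, 88, 74],        # assignment score
})

# Pearson correlation; a positive value is consistent with the finding
# that students who interacted more deeply with the tutor scored better.
r = df["questions_asked"].corr(df["score"])
print(f"Correlation between questions asked and score: {r:.2f}")
```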

A prompt analysis report allowed me to evaluate the quality of the AI’s responses to the students’ questions and compare them to the responses the tool had generated for me. I found that the depth and relevance of the answers increased as students repeatedly asked questions on the same topic, indicating to the bot that they needed more details. Students got the best responses when they requested a summary of the pre-reads before they asked specific questions.

The prompt analysis report also enabled me to assess the number of questions that were asked per topic for each session. This gave me a better view of which readings students engaged with more deeply.
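A per-topic tally like this amounts to a simple aggregation over the tutor’s logs. The sketch below assumes a hypothetical export with one row per student question, tagged by session and topic; the column names and values are illustrative.

```python
import pandas as pd

# Hypothetical log export: one row per student question to the AI tutor.
logs = pd.DataFrame({
    "session": [1, 1, 1, 2, 2, 2],
    "topic": ["disruption", "disruption", "platforms",
              "ecosystems", "ecosystems", "platforms"],
})

# Count questions per topic within each session to see which readings
# drew the deepest engagement.
print(logs.groupby(["session", "topic"]).size())
```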

Finally, the report analyzed the differences between the students’ prompts and mine. I learned that most students chose to combine or slightly modify the prompts I gave.

The Way Ahead

Why is it so critical for business schools to provide students with opportunities to engage with AI? The answer can be found in the post-class survey I conducted of the Hyderabad cohort. Of the 21 percent of students who responded, 87 percent indicated that they already use AI weekly at work, and 74 percent had used AI to help with assignments.

I did not design the pilot to reduce student workload. I designed it to prepare students for a future where AI is an integral part of the workplace. When they understand how to pose questions to an AI tool, they will turn it into a partner that helps them continually learn.

Seema Chowdhry, director of the Centre for Learning and Teaching Excellence at ISB, contributed to this article.

Authors
Anand Nandkumar
Associate Professor of Strategy, Associate Dean of the Centre for Learning and Teaching Excellence, and Executive Director of the Srini Raju Centre for IT and the Networked Economy, Indian School of Business
The views expressed by contributors to AACSB Insights do not represent an official position of AACSB, unless clearly stated.