In February, Ella Stapleton, then a senior at Northeastern University, was reviewing lecture notes from her organizational behavior class when she noticed something strange. Was that a ChatGPT prompt from her professor?
Halfway through a document her business professor had created for a lesson on leadership models was an instruction to ChatGPT: “Expand on all areas. Be more detailed and specific.” It was followed by a list of positive and negative leadership traits, each with a definition and a bullet-pointed example.
Stapleton texted a friend in her class.
“Did you see the notes he put on Canvas?” she wrote, referring to the university's software platform for hosting course materials. “He made it with ChatGPT.”
“OMG STOP,” her classmate replied. “What's wrong?”
Stapleton decided to do some digging. She reviewed her professor's slide presentations and discovered other telltale signs of AI: distorted text, photos of office workers with extraneous body parts and egregious misspellings.
She was not happy. Given the school's cost and reputation, she had expected a top-notch education. The course was required for her business minor; its syllabus banned “academically dishonest activities,” including the unauthorized use of artificial intelligence and chatbots.
“He tells us not to use it, and he uses it himself,” she said.
Stapleton filed a formal complaint with Northeastern's business school, citing the undisclosed use of AI as well as other issues she had with his teaching style, and demanded a refund of her tuition for that class. That amounted to a quarter of her tuition for the semester, over $8,000.
When ChatGPT was released at the end of 2022, it caused a panic at all levels of education because it made cheating incredibly easy. Students asked to write a history paper or a literary analysis could have the tool do it in mere seconds. Some schools banned it, while others deployed AI detection services despite concerns about their accuracy.
But now the tables have turned. Students are complaining on review sites about their instructors' overreliance on AI, and are scrutinizing course materials for words that chatbots tend to favor, such as “crucial” and “delve.” In addition to calling out hypocrisy, they make a financial argument: they are paying a lot to be taught by humans, not by an algorithm that they, too, could consult for free.
For their part, professors said they used AI chatbots as tools to provide a better education. Instructors interviewed by The New York Times said the chatbots saved time, helped them manage overwhelming workloads and served as automated teaching assistants.
Their numbers are growing. In a national survey of more than 1,800 higher-education instructors last year, 18 percent described themselves as frequent users of generative AI tools; in this year's repeat of the survey, that share nearly doubled, according to Tyton Partners, the consulting group that conducted the research. The AI industry wants to help, and to profit: the start-ups OpenAI and Anthropic recently created enterprise versions of their chatbots designed for universities.
(The Times has sued OpenAI for copyright infringement over the use of news content without permission.)
Generative AI is clearly here to stay, but universities are struggling to keep up with the changing norms. Now professors are the ones on the learning curve and, like Stapleton's teacher, muddling their way through the technology's pitfalls and their students' disdain.
Making the Grade
Last fall, Marie, 22, wrote a three-page essay for an online anthropology course at Southern New Hampshire University. She looked up her grade on the school's online platform and was happy to see an A. But in the comments section, her professor had accidentally posted an exchange with ChatGPT. It included the grading rubric the professor had asked the chatbot to use, and a request for “really great feedback” to give to Marie.
“From my point of view, the professor didn't even read anything that I wrote,” said Marie, who asked to go by her middle name and not to reveal her professor's identity. She said she could understand the temptation to use AI: teaching at an online school was a “third job” for many instructors, who might have hundreds of students.
Still, Marie felt wronged, and she confronted her professor during a Zoom meeting. The professor told her that student essays were indeed read, and that ChatGPT was used only as a guide, which the school permitted.
Robert MacAuslan, vice president of AI at Southern New Hampshire University, said the school believed “in the power of AI to transform education” and had guidelines for both faculty and students to “ensure that this technology enhances, rather than replaces, human creativity and oversight.” The dos and don'ts for faculty warn against using tools such as ChatGPT and Grammarly “in place of authentic, human-centric feedback.”
“These tools should not be used to ‘do the work' for them,” Dr. MacAuslan said. “Rather, they can be seen as an enhancement to their already established processes.”
After a second professor appeared to use ChatGPT to give her feedback, Marie transferred to another university.
Paul Shovlin, an English professor at Ohio University in Athens, Ohio, said he could understand her frustration. “I'm not a huge fan of that,” Dr. Shovlin said after being told about Marie's experience. Dr. Shovlin is also an AI faculty fellow, a role that involves developing appropriate ways to incorporate AI into teaching and learning.
“The value you add as an instructor is the feedback you can give to your students,” he said. “It is the human connection we build with our students as people who are reading their words and are influenced by them.”
Dr. Shovlin is an advocate for incorporating AI into teaching, but not simply to make an instructor's life easier. Students, he said, need to learn to use the technology responsibly and to “develop an ethical compass with AI,” because otherwise there can be consequences. “If you mess it up, you're going to get fired,” Dr. Shovlin said.
One example he uses in his classes: In 2023, officials at Vanderbilt University's education school responded to a mass shooting at another university by sending an email to students calling for community cohesion. The message, which said that “building strong relationships with one another” fosters a “culture of care,” ended with a sentence revealing that ChatGPT had been used to write it. After students criticized the outsourcing of empathy to a machine, the officials involved temporarily stepped down.
Not every situation is so clear-cut. Dr. Shovlin said it was tricky to come up with rules because reasonable AI use may vary depending on the subject. The Center for Education, Learning and Assessment, where he is a fellow, has “principles” for AI integration instead.
The Times contacted dozens of professors whose students had mentioned their AI use in online reviews. The professors said they had used ChatGPT to create computer science programming assignments and quizzes on required reading, even though students complained that the results didn't always make sense. They used it to organize their feedback to students, or to make it kinder. As experts in their fields, they said, they can recognize when it hallucinates, or gets facts wrong.
There was no consensus among them about what is acceptable. Some used ChatGPT to help grade students' work; others denounced the practice. Some emphasized the importance of transparency with students when deploying generative AI, while others said they didn't disclose its use because of students' skepticism about the technology.
Most, however, felt that Stapleton's experience at Northeastern, in which her professor appeared to use AI to generate class notes and slides, was completely fine. That was Dr. Shovlin's view, as long as the professor edited what ChatGPT spat out to reflect his expertise. Dr. Shovlin compared it to a longstanding practice in academia of using content, such as lesson plans and case studies, from third-party publishers.
To say a professor is “some kind of monster” for using AI to generate slides “is, to me, ridiculous,” he said.
A Calculator on Steroids
Christopher Kwaramba, a business professor at Virginia Commonwealth University, described ChatGPT as a time-saving partner. Lesson plans that used to take days to develop now take hours, he said. He uses it to generate data sets for fictional chains of stores, which students use in exercises to understand various statistical concepts.
“I see it as the age-old calculator, but on steroids,” Dr. Kwaramba said.
Dr. Kwaramba said he now had more time for office hours with students.
Other professors, like David Malan at Harvard, said the use of AI meant fewer students were coming to office hours for remedial help. Dr. Malan, a computer science professor, has integrated a custom AI chatbot into a popular class he teaches on the fundamentals of computer programming. His hundreds of students can turn to it for help with their coding assignments.
Dr. Malan has had to tinker with the chatbot to hone its pedagogical approach, so that it offers only guidance rather than full answers. The majority of the 500 students surveyed in 2023, the first year it was offered, said they found it helpful.
Rather than spend time on “more mundane questions about introductory material” during office hours, he and his teaching assistants prioritize interactions with students at weekly lunches and hackathons: “more memorable moments and experiences,” Dr. Malan said.
Katy Pearce, a communication professor at the University of Washington, developed a custom AI chatbot by training it on versions of old assignments she had graded. It can now give students feedback on their writing that mimics her own at any time, day or night. She said it benefited students who might otherwise hesitate to ask for help.
“Is there going to be a point in the foreseeable future when much of what graduate teaching assistants do can be done by AI?” she said. “Yeah, absolutely.”
What happens then to the pipeline of future professors who come up through the teaching assistant ranks?
“That's definitely going to be a problem,” Dr. Pearce said.
A Teachable Moment
After filing her complaint with Northeastern, Stapleton had a series of meetings with officials at the business school. In May, the day after her graduation ceremony, the officials told her that she would not be getting her tuition money back.
Her professor, Rick Arrowood, was contrite about the episode. Dr. Arrowood, an adjunct professor who has been teaching for nearly two decades, said he had uploaded his class files and documents to ChatGPT, the AI search engine Perplexity and an AI presentation generator called Gamma to “give them a fresh feel.” At a glance, he said, the notes and presentations they generated looked great.
“In hindsight, I wish I had looked more closely,” he said.
He put the materials online for students to review, but emphasized that he never used them in the classroom. He realized the materials were flawed only when school officials questioned him about them.
The embarrassing experience, he said, made him realize that professors should approach AI more carefully and disclose to students when and how they use it. Northeastern recently issued a formal AI policy: it requires attribution when AI systems are used and a review of the output for “accuracy and appropriateness.” A Northeastern spokeswoman said the school “embraces the use of artificial intelligence to enhance all aspects of its education, research and operations.”
“I'm all about teaching,” Dr. Arrowood said. “If my experience is something people can learn from, then it's my happy place.”

