
By Ralph Berrier Jr.
The clock neared 3 a.m. and Jack Graves ’24 had a paper due in just a few hours.
Bleary-eyed, the Roanoke College senior wanted someone to provide meaningful feedback about his work before he turned it in. But at that hour, the Writing Center wasn’t an option, and it was way too early to email his professor.
So Graves used another route for a scholarly critique: artificial intelligence. He popped his draft into a chatbot that within seconds analyzed it and provided several paragraphs of helpful comments that commended what he had done well and pointed out areas that needed improvement, such as clearer sourcing and deeper supporting arguments.
“It was incredibly helpful,” said Graves, a history major from Shawsville, Virginia. “Let’s be honest, it was late, so my brain couldn’t look through the draft and say, ‘that comma shouldn’t be there.’”
Graves wasn’t using artificial intelligence, or AI, as a shortcut. The chatbot he used was designed by his professor, Ivonne Wallace Fuentes, who is experimenting with AI in class. Wallace Fuentes, who teaches history at Roanoke, used a version of ChatGPT, the most common AI tool in use today, to create a virtual teaching assistant that could provide immediate feedback. She wrote a lengthy prompt that included the following:
Engage the student by addressing them directly and inviting them to imagine scenarios or participate in the discussion. Provide clear and detailed instructions when necessary to ensure the student understands your feedback. Then encourage students to revise their work based on your feedback. … [Then] wrap up the conversation in a friendly way, reminding them that writing is thinking and a reiterative craft. Remember, they are very worried about their grades and this will help them do much better.
What a supportive, nurturing chatbot — uh, professor.
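For readers curious how such a custom assistant is typically assembled, here is a minimal, hypothetical sketch using the OpenAI Python client. The system prompt text, model name, and function names below are illustrative assumptions for this article, not Wallace Fuentes' actual configuration.

```python
# Hypothetical sketch of a "virtual teaching assistant" chatbot like the one
# described above. Prompt wording, model name, and helper names are
# illustrative assumptions.

SYSTEM_PROMPT = (
    "You are a supportive teaching assistant for an undergraduate history "
    "course. Address the student directly, commend what they did well, point "
    "out areas that need improvement, and encourage revision. Close by "
    "reminding them that writing is thinking and an iterative craft."
)

def build_feedback_request(student_draft: str) -> list[dict]:
    """Pair the instructor's standing prompt with a student's draft."""
    return [
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user",
         "content": "Please give feedback on this draft:\n\n" + student_draft},
    ]

def get_feedback(student_draft: str) -> str:
    """Send the draft to a chat model and return its comments.

    Requires the `openai` package and an OPENAI_API_KEY in the environment.
    """
    # Imported here so build_feedback_request stays usable without the package.
    from openai import OpenAI
    client = OpenAI()
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model; any chat-capable model works
        messages=build_feedback_request(student_draft),
    )
    return response.choices[0].message.content
```

The design point is that the pedagogy lives in the standing system prompt, so every student draft gets the same tutor persona and the same encouragement to revise.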
AI is having a profound impact in society today, from the workplace — where rapid technological advancement potentially threatens some jobs while possibly creating new ones along the way — to the fields of health care, business and creative pursuits. Generative AI (which can generate original content, words and images based on user prompts) is particularly disruptive — or it’s a great tool that helps creative people perform work quickly and efficiently. Perhaps it’s both.
Higher education is already being affected by AI. At Roanoke, students use AI to help solve problems, improve their writing, construct resumes, analyze text and more. In this Wild West era of advancement, where technology runs wild and rules are few, many professors are scrambling to find ways to embrace AI in the classroom while others steadfastly oppose it. A third group has qualms about AI, but they realize that many of the burgeoning applications and programs are here to stay.
“The toothpaste is not going back in the tube,” said Gwen Nuss, Roanoke’s assistant director of institutional research, effectiveness and planning, who leads Roanoke’s efforts to integrate AI effectively into the classroom.
History Professor Ivonne Wallace Fuentes has experimented with artificial intelligence in her classes by creating a virtual teaching assistant that can provide immediate feedback to students.
That’s why Wallace Fuentes, like many other professors at Roanoke, is dipping her toe into AI-generated waters. She wants to understand how AI can be constructively used in the classroom before it controls teaching and learning.
“I do see, much like my colleagues in math who had to reckon with a calculator a couple generations ago, that this is a tool that most students have,” she said. “It's about using it in a way that's productive, and that still highlights and centers human agency and human meaning.”
To that end, Roanoke College is working with four other Virginia colleges and universities on an ambitious state grant-funded project that aims to figure out how to bring AI into education ethically and strategically. George Mason University in Fairfax received $75,000 from the State Council of Higher Education for Virginia’s Fund for Excellence and Innovation grant program. Roanoke joins George Mason, Bridgewater College, James Madison University and the University of Virginia to develop cross-disciplinary resources and workshops geared to help faculty integrate AI into the classroom.
Meghan Jester, the director of career exploration who will co-lead Roanoke’s end of the project, said faculty, students and administration must be ready to accept AI and the changes it will bring to education.
“It's going to be continually evolving,” Jester said. “For our students, something that is inherent in a liberal arts education is that they continue to upskill and understand the value of lifelong learning. This technology is going to continue to change, and how we're using it is going to continue to change, and we must be open to what that change looks like.”
ENVISIONING THE FUTURE
When it comes to planning the future of education in an age of AI, Roanoke administrator Gwen Nuss likes to quote that deep technological futurist and thinker, former hockey great Wayne Gretzky.
“Wayne Gretzky has a quote, ‘You skate to where the puck is going to be,’” said Nuss, who is co-leading the grant project with Jester.
In this case, the puck is AI. Her point is that we must think about what AI will be like in the future, not just at the current, exasperating moment.
“Right now, we're talking about plagiarism and cheating and basic ethics questions, but what's the future going to look like? We have to lay some groundwork before we can even start to think about those conversations.”
To be sure, AI can be used for egregious purposes in class, such as students submitting papers that were completely AI-generated.
In an era of rampant misinformation online, others worry that AI will only exacerbate that problem. Many AI programs rely on prompts or submitted information, then the programs spit out answers based on those prompts. What if, sometime in the future, all the prompts are filled with bad information? What kind of wrong answers will the programs generate, which then could be used as future prompts?
Even if all the information is correct, it’s not difficult to imagine a world where an AI-generated test is given to students who submit AI-generated answers, which are then graded by an AI program … and nobody learns much.
“That’s a nightmare scenario,” said Kathy Wolfe, Roanoke’s dean and vice president of academic affairs. In that case, “we've ceded all the learning and all the thinking and we'll all be like the people in the floaty chairs in ‘Wall-E.’ We will have nothing to do and nothing to learn.”
Picking up on the Gretzky quote, she added: “How do we stop the puck from going there?”
That’s the $75,000 grant-supported question that Nuss, Jester and others are trying to answer.
Nuss is working on developing a college-wide policy on AI, building a set of sturdy guardrails that ensures Roanoke stays up to date with technological advances while upholding integrity of learning and thinking, and protecting privacy.
Currently, Roanoke asks that teachers include a clear AI policy in their syllabi, but it does not mandate what that policy should be.
Although most AI discussion and controversy focuses on ChatGPT, many other AI programs abound. This rush of AI tools can be exciting for time-pressed students, but it raises concerns for faculty. Even those who have qualms about AI, however, often admit that the technology can be beneficial for students.
Leslie Anne Warden, assistant vice president for curriculum and advising and an associate professor of art history and archaeology, grudgingly realized the benefits of some AI programs when talking to a student whose primary language was not English.
“AI is a great leveler,” Warden said, “particularly for international students or anyone who reads and writes English as a second language. I was complaining about AI to an alumna who is Burmese, and she looked at me like I had grown a third eye. She said, ‘I use this all the time to help me clarify my writing.’ And she was an A student here.”
Those kinds of one-on-one faculty-student interactions, which are a hallmark of a small liberal arts college experience, are one reason Roanoke College is in a good position to effect change and understanding around AI.
Said Wolfe, “Liberal arts colleges, and the humanities and arts specifically, are primed to help students investigate what it means to be human in a world where generative AI exists.”
Rob Lamour '25 (right) discusses a class project with Anthony Cate, an assistant professor of psychology. The course, "Artificial Intelligence Versus Human Cognition," had students use artificial intelligence models to understand how AI mimics human cognitive skills, and whether it produces those skills the same way the human brain does.
REAL WORLD EXPERIENCE
Tony Saade ’25 had never used ChatGPT until an internship last summer.
While working for a local aircraft company that specializes in building drones, Saade, an engineering science major from Roanoke, was asked to write grant proposals for funding. His bosses knew he had no experience with grants, writing or the extremely technical terms that he needed to know, so they told him to use “something like ChatGPT.”
Saade fed the chatbot some background information, including the fact that he was trying to write a technical document. What he got back “really blew my mind,” he said. “I didn't expect it to actually give me something that would pass as a first draft.”
He made some modifications to the next set of prompts and tried again. “What it was putting out kept getting closer and closer to what the boss wanted,” he said. His superiors were thrilled with his work.
Saade returned to classes in the fall, fully committed to the benefits of chatbots and other large language model programs (so-called because of the program’s ability to produce clear, conversation-style text and answers). For a design class, his group had to write a commitment document outlining each team member’s roles and responsibilities. The students brainstormed ideas but struggled to write them in a coherent way, so Saade punched information into ChatGPT and got a template the group easily completed.
Their teacher called the paper “the most professional and most comprehensive of all the groups,” Saade said. “And then we told her that we used ChatGPT, and I think it even surprised her, because I don't think she'd had much exposure to it, either.”
But is the AI-generated document truly the group’s original work? Saade says it is, and for a valid reason: all the information came from the students; the chatbot helped organize it.
“The quality of its output is highly dependent on the quality of what you put into it,” he said.
Saade, who transferred to Roanoke from Virginia Western Community College, not only has real-world AI experience from his internship, but he also started his own light-manufacturing company that uses 3-D printing to make adaptive devices for children with disabilities. AI might someday help with the design and manufacture of those products.
He agrees that AI poses myriad ethical problems: Use it to produce a template for a document? Sure. Use it to manufacture an essay passed off as your own original work? Of course not. But, like others, he knows the technology is here to stay.
“I only started using it because they’re already using it in industry,” he said. “That's sort of my justification.”
That’s justification enough for Jester. She gave a presentation earlier this year on AI and the future of work that informed students that many jobs will be disrupted by technology, but others will be created. She said that agility, continuous learning and adaptability are paramount.
“Education systems are going to have to adjust, employers are going to adjust, and make upskilling opportunities available,” Jester said, adding that in coming years, workers are likely to have 20 jobs across 4-6 careers during their lifetime.
“AI is not going away,” she said. “What can we be doing to make sure our students are using it ethically and appropriately in a way that's going to support them in the workforce? And where are there opportunities for us to leverage our human skills in a different way to support society?”
Professors are finding creative ways to integrate AI into the classroom, though doing so is incredibly time-consuming.
Wallace Fuentes spent hours designing her chatbot prompts for her upper-level history classes. The scope was limited: students used the prompt for feedback on short reviews of books they read for class. Designing feedback prompts for every single assignment will mean more work in the beginning.
“I'm not sure [AI] is saving me a lot of time yet up front,” Wallace Fuentes said, “but maybe down the road.”
Melanie Trexler, an associate professor of religion, used AI to write an exam question, which she then handed to her students with a simple assignment: Tell her everything that was wrong with the question. The students thought deeply about their responses.
“I asked them, ‘What did you fix or did not? But you have to tell me based on what you've learned in this class.’ I got way better essays than if I had given them a white page and had them type. I know that because I've given them a version of this test. They wrote fantastic essays.”
Trexler is also co-director of Roanoke College’s Teaching Collaborative, which took the lead in generative AI discussions in 2023-24 by hosting monthly programs that involved about 15 participants who worked to integrate AI on campus. Additionally, she said that 73 Roanoke faculty, staff, administration and trustees have signed up for AI training through online courses from Auburn University. Some participants are working toward a "badge," a type of AI certification offered in some of the courses.
Certified Deloitte facilitator Patrick Brugh leads students in a discussion about career preparation during the Deloitte Future of Work Institute, which encouraged participants to embrace artificial intelligence as they prepare for future careers.
WORKING TO ADAPT
Not all faculty have had positive experiences with AI in the classroom. Many worry about the loss of critical-thinking skills as students outsource some tasks, such as writing or reading texts, to AI programs. Some students have violated class rules by submitting AI-produced work for grades.
Last year, Shannon Anderson, associate professor and director of strategic health initiatives, was alerted by an online grading tool that some students may have submitted final papers that “came straight out of generative AI.”
Anderson invited the students to her office for a conversation about the assignment. Each of them acknowledged using some material from chatbots, and one student said they thought it was no different than using Google to search for answers.
Instead of immediately assigning the students a failing grade, Anderson had them redo their assignments in the building with no take-home notes. She said the experience was a learning opportunity for everyone involved. It inspired her to develop an AI policy that she explains before assignments.
Anderson describes herself as a "hopeful skeptic" of AI’s use in education, and she is working to adapt to an AI future.
"We need to learn to use it," she said. “It has tremendous possibilities and potential if it’s used in the right way ... ways that are constructive and help us learn. I think we are moving in the right direction. We have very good people figuring this out."
Other faculty also have apprehensions about AI. Economics Professor Edward Nik-Khah questions whether ChatGPT is even a tool, as many people call it.
“A tool is supposed to be something that helps you accomplish a preset goal, maybe more efficiently … I'm not sure that that's what this is doing,” he said. He believes that ChatGPT, specifically, is a threat to people’s ability to think critically and to learn tasks.
“Some people might say, ‘This is a tool to help us write,’ but I would encourage us to think more about that,” Nik-Khah said. “Because writing itself is a way of understanding. It is not merely a way of conveying information to someone else. So what does it mean when we delegate the task of writing to an algorithm?”
Nuss said administrators do think about policies and guardrails that could regulate AI on campus.
“We're going to see good things and bad things come out of AI,” she said. “All the more reason why we can't ignore our responsibility to train as educators.”
As part of the multi-college, state-grant-funded project, which will take two and a half years to complete, Nuss will help design Roanoke’s plans and resources regarding AI in the classroom. Jester will create an advisory board of business and industry leaders, along with faculty, to “bridge that gap between higher education institutions and the workforce,” she said.
Wallace Fuentes said Roanoke faculty, students and administrators can’t stick their heads in the sand when it comes to handling AI in the classroom. It’s a civic duty.
“I am absolutely in that camp,” she said. “I would go a little bit further, that it is a requirement for our democracy that students and citizens learn about the capabilities and learn the kinds of digital literacy skills that will be required to survive in an AI world.”