
Wade Maki has been using artificial intelligence for the past year. The ponytailed philosophy lecturer at the University of North Carolina Greensboro has experimented with the technology’s ability to summarize talking points from difficult readings or identify missing ideas from lecture notes. As chair of the UNC System Faculty Assembly, he employed AI-generated images of cats in a boxing ring for a presentation on diversity, equity, and inclusion before the Board of Governors.

Slide from Wade Maki’s February presentation to the Board of Governors. (Courtesy of Maki)

“You don’t even have to go find a meme,” Maki said. “You can create it. Describe it to the AI, and there it is. That’s useful for everybody.” 

In Maki’s classroom, students have permission to use AI to brainstorm ideas, conduct research, and even proofread their papers, so long as they don’t use it to write their essays and exams. His principle is best summed up as: “You can work with AI, but AI shouldn’t be doing the work for you.”

Faculty and students across the country have had to adapt as generative artificial intelligence tools have become increasingly ubiquitous. The technology has changed how students take notes, conduct research, and summarize readings. It’s also changing the college admissions process and recruitment efforts, as well as how teachers generate course content and grade assignments.

Some experts worry that heavy reliance on generative AI will strip away students’ critical thinking skills and allow them to take shortcuts instead of learning new material. “I hear anecdotes about students that are completing assignments that are AI-generated,” UNC System President Peter Hans said last month at the first event in The Assembly’s Newsmakers Series. “And faculty are using the tool more often as well. It’s one bot talking to another bot.”

But university leaders also acknowledge that students have to learn how to use the technology. “There’s no way we’re going to get around it,” North Carolina Central University Chancellor Karrie Dixon said at the same event.

UNC System President Peter Hans speaks at The Assembly‘s Newsmakers event. (Kate Sheppard for The Assembly)

Higher education institutions across North Carolina are scrambling to implement classroom guidelines to keep pace with the rapidly evolving technology. Duke University unveiled its own AI platform this summer, DukeGPT. Public universities have crafted example syllabi to help professors set guardrails for chatbots, machine learning tools, and more. The UNC System is working on its own recommendations.

Maki serves as a member of the AI Oversight Committee at UNCG and oversees a UNC System AI task force. He said applying a one-size-fits-all policy is not sustainable. He thinks schools should establish best practices and offer plenty of wiggle room for faculty to customize their AI policies.

“The Wild West is how we always start with these things, and then it gets less wild, and the roads get paved, and there’s more law and order,” Maki said. 

Taming the Wild West

The current conversation about the line between study aid and cheating is in some ways reminiscent of past debates about tools like CliffsNotes or using the Internet for research. Maki, 51, said he had professors who “didn’t like calculators because you should do all that math in your head.” 

None of the five North Carolina universities surveyed by The Assembly bans AI in all classrooms. Universities have generally been reluctant to establish campuswide policies regulating generative AI, leaving it to faculty members’ discretion. But many institutions have established recommendations and best practices that faculty can adopt when drafting their course syllabi. 

A mobile phone displays the ChatGPT application. AI technologies are becoming ubiquitous on college campuses. (Jonathan Raa/NurPhoto via AP)

UNC Charlotte has outlined two suggested syllabus policies that faculty can adapt. The first option allows AI use in all assignments, so long as students disclose how the technology is used. The second limits AI use to designated assignments. Use of AI that is not authorized by the instructor constitutes a violation of academic integrity, the guideline states. 

“We wanted to put guidance around best practices, knowing that the cat’s out of the bag in a lot of ways. This is not going away,” said Jules Keith-Le, academic technology support analyst with the Office of OneIT, UNC Charlotte’s IT services office.  

Similarly, at North Carolina State University, sample statements range from most restrictive—no AI allowed—to least restrictive, which incorporates the technology into the coursework. A “moderately restrictive” option allows students to use chatbots, text generators, and paraphrasers when seeking guidance on assignments, but not for other purposes.

Wake Forest University offers an AI decision tree to help faculty evaluate how knowledge and skills are gained and assessed in a course, and whether it is important that students develop them independently of AI assistance. The tree then guides faculty to syllabus options based on their answers.

Some faculty across the state said the university suggestions for classroom policies don’t help them with one big concern: enforcement.

Scott Simkins, associate professor in the Department of Economics at North Carolina A&T State University, said that despite not wanting to spend time policing AI use in his classroom, he finds it inevitable. This has prompted him to adapt his courses to “focus as much on the process as the product of the learning,” he said, through things like team-based assignments and establishing a “learning community compact,” an agreement outlining shared goals and expectations. 

He said a campuswide guideline for the emerging technology would help foster a culture of responsible AI use among students, faculty, and administrators. 

But Sarah Egan Warren, an assistant teaching professor at NC State’s Institute for Advanced Analytics and a digital education faculty fellow, said that’s impractical because what works today may not work in the future. She pointed to Google’s AI Overviews, introduced in 2024 to place AI-generated summaries at the top of web search results. Students and faculty can use Google to search for information, she said, but there is no guideline on whether the AI-generated summaries can be used in an academic setting. An AI policy implemented in 2022 wouldn’t have accounted for the feature.

She also said that she prefers the flexibility of setting her own classroom AI guidelines. 

“I do want them to experiment with AI,” she said. “I do want them to see the limits of it. I want them to talk about what are the biases that are built into it.”

Wade Maki leads the UNC System’s Faculty Assembly. (Courtesy of Maki)

Keith-Le, who teaches visual design of instructional products at the graduate level, said that not all of her students like using AI, but she encourages them to explore the tool. 

After students used Google Gemini to generate an image for a visual design project, Keith-Le polled them on whether they would use AI going forward. She was surprised when the results showed a 50-50 split: half of the students said they didn’t like using AI for class.

For Maki, what’s most important is telling students the rules from the get-go and recognizing that student AI use does not always imply cheating. “Cheaters aren’t going to ask,” he said. Therefore, he said, faculty should encourage students to learn how to use the right tools. 

“Every time we drive down the road, we want to know what the speed limit is,” Maki said. “Some of us might still speed, but we at least want to know what it is.”

DukeGPT

Like many of its peer institutions, Duke has no blanket rule on AI in the classroom. Yet the university has stood out for how it is pushing AI into students’ lives. 

“Our goal is to help faculty create learning experiences that prepare students to thrive in a world where AI is part of everyday life—and where human curiosity and critical thinking still make all the difference,” said Aria Chernik, assistant vice provost for Faculty Development and Applied Research in Learning Innovation at Duke.

All Duke undergraduates, staff, faculty, and professional students in Duke’s graduate schools have free, unlimited access to ChatGPT. The university also launched DukeGPT, an interface that lets users compare an array of language models. Duke community members can use suggested prompts to learn about university events and resources. Duke’s pilot with OpenAI is part of the company’s broader strategy to make AI a “core infrastructure of higher education.” 

The university says that giving all students access to ChatGPT levels the playing field for those who cannot afford subscriptions. Exposure to the technology will help students prepare for a job market where AI skills are valued and necessary, Duke says.

More recently, the university introduced MyGPT Builder, a tool for creating customized chatbots, which lets students generate flashcards, practice quizzes, and study guides from a syllabus and set personalized study schedules. The AI tool will also help staff streamline administrative procedures and support faculty in generating course content and answering common student questions.

Duke recently launched DukeGPT, an interface that lets users compare an array of language models. (Lucas Lin for The Assembly)

This fall, prospective Duke students have the option to write an essay about AI in their application. 

Duke is not the only institution in North Carolina that has partnered with AI companies. In March, N.C. Central announced a collaboration with OpenAI to launch the Institute for Artificial Intelligence and Emerging Research, the first program of its kind among HBCUs. Students at UNCG have free access to Microsoft Copilot; UNCG said it chose Microsoft over companies like OpenAI because it believed Copilot offered stronger security safeguards.

The UNC System is also part of Google’s $1 billion initiative to fund AI literacy programs and research across more than 100 public universities. Under the partnership, all students will gain access to Google’s Gemini 2.5 Pro and Google Career Certificates, an online AI training program. 

The UNC System has formed professional development groups for faculty to discuss generative AI issues in education, and it holds an annual learning and technology symposium that covers similar topics. The system doesn’t have guidelines on AI use in the classroom, but officials are studying the issue and expect to release recommendations this fall, said Heather McCullough, director of learning technology and open education.

“The technologies are changing on a daily basis, and so to be able to have a policy that is nimble enough to be adaptable to the shape of AI six months from now feels a little daunting,” she said.


Lucas Lin is a junior at Duke University pursuing a major in economics and a certificate in documentary studies. He is managing editor at The Duke Chronicle, the university’s independent student newspaper. Beyond journalism, he is passionate about storytelling through documentary filmmaking and photography.