Using Artificial Intelligence to Make Us More Human

A TV monitor in an enhanced learning classroom at the School of Nursing shows a blue head with points lit up in the brain.

Generative AI is rapidly reshaping the future of nursing education and practice — and at the UW–Madison School of Nursing, faculty are leaning into the shift to improve person-centered care.

By Christina Hernandez Sherwood

Last November, UW–Madison School of Nursing Clinical Instructor Britta Lothary, MSN, ANP-C, gave her pharmacology students an unusual assignment. Instead of warning them against using generative artificial intelligence (AI) platforms, like ChatGPT, Gemini, or Claude, to complete schoolwork, she integrated AI into a class activity. The result, she said, was that her students learned more about both nursing practice and how to responsibly use AI.

“This is the way health care is going,” Lothary said about the ubiquity of AI. “This is what these students are going to be exposed to probably upon graduation. I wanted to give them an introduction.”

Here’s how it worked: In the activity, dubbed “Artificial intelligence (AI) Patient Education,” groups of students were each randomly assigned one of 20 medications, such as oxycodone, lithium, or lorazepam, and an age group ranging from early adolescence to older adulthood. Then, Lothary instructed each group to determine together “the three priority pieces of patient education regarding this patient’s age and medication” based on what they’d learned in class lectures and other course materials.

After they made their own determination, the students were to ask Microsoft Copilot, a generative AI chatbot that is free for all UW–Madison students, faculty, and staff, for its opinion on the three priority education pieces, instructing it to use “only scholarly resources” as sources. Finally, each group reflected on how the AI platform performed compared to them.

As Lothary roamed the classroom while groups completed the assignment, she heard a range of reflections from students. Some, she said, were unaware they even had access to Copilot, while others were adept at using the platform to, for instance, find recipes and make grocery lists. When the AI chatbot suggested different patient education pieces than the students had, the groups dug deeper into its sources. And many students determined that Copilot’s responses were not specific enough to be particularly useful. “I heard a lot of the word ‘generic,’” Lothary said.

All in all, Lothary said the assignment was illuminating — in more ways than one. “AI is a tool that needs to be used in the right way to get the best results for the patient,” she said. “Your nursing gut is still going to be needed.”

At the UW–Madison School of Nursing, faculty and staff are adapting to — and sometimes even embracing — the sudden ubiquity of generative artificial intelligence platforms. The School requires its instructors to state in their syllabi whether and how AI may be used in their courses. Otherwise, it’s up to individual instructors whether to incorporate AI usage, or even discuss its implications, in class.

Some instructors, like Lothary, are using AI to develop class assignments and assessments, or encouraging students to use the tools to help them study. Others are researching the impact of AI on nursing practice. Still others, whether concerned with the potential for academic misconduct or waiting for broader uptake of generative AI tools in clinical settings, have so far eschewed the technology.

In other words, in both nursing education and nursing practice, the use of generative AI is still a moving target. “Nurses are excited about certain aspects [of AI], mostly how it could help them get back to the patient care they love,” said Ann Wieben, PhD, RN, a Postdoctoral Trainee and Clinical Assistant Professor whose research focuses on the integration of AI tools into nursing practice. “But there’s still a need for AI literacy and [for guidance] about how to integrate it collectively as a profession.”

Artificial intelligence isn’t new to health care. For years, health systems have been using predictive AI tools, such as fall risk calculators, as a way to preempt adverse health events before they happen. What’s newer, Wieben said, is generative AI: artificial intelligence tools that create new content, such as text, images, videos, or data.

There’s hope that generative AI could exploit the mountains of data within electronic health records. According to a recent study from the UW School of Medicine and Public Health, nearly one in five patients has a medical chart the length of Moby Dick. “That’s a lot of data that could be leveraged to uncover insights about patients,” Wieben said. “We have limited resources in health care, so if [AI] can help us make sure we get the right resources to the right patients, I think there’s a lot of potential.” In particular, she said, synthesizing patient data to find the most important insights could be extremely helpful to health care providers, especially in emergency situations.

Some health care institutions are piloting projects that use generative AI tools to help respond to patient portal messages, which have exploded since the COVID-19 pandemic, Wieben said, and for “ambient listening” tools that assist providers with documentation.

But the “black box” nature of generative AI — the inability for users to get the full picture of how the complex algorithms work — is a barrier to its broader use in health care, Wieben said. In fact, some 200 California nurses protested last spring against the use of “untested and unregulated” AI in their workplaces, such as AI chatbots and monitors that they said might give patients incorrect medical advice or issue false alarms. (Wieben noted that AI isn’t meant to replace decision making by a clinician.)

The outcry highlighted the need for nurses and other clinicians to be part of efforts with data scientists to further develop AI tools for health care, Wieben said. “There needs to be some clinical rationale,” she said. “That’s going to build trust, knowing it was developed with clinician input.”

That sentiment was highlighted by a 2024 survey by National Nurses United, the nation’s largest union of registered nurses, which found that AI “often contradicts and undermines nurses’ own clinical judgment and threatens patient safety.” The union also released a set of guiding principles for AI in nursing, which included the right to high-quality person-to-person care. “The right to health care in-person by a licensed health care professional underlies all other medical care and should not be compromised by uses of A.I. or other technologies that contribute to worker displacement or deskilling,” the union stated.

The document is among the resources Wieben has compiled on her public website, AI Resources for Nurses and Health Professionals, a compendium of publications, media, and training meant to empower nurses to learn about AI and its potential impact on nursing. “I’m consolidating the resources that I found accessible and well done in terms of understanding what AI is, the policies around it — which is a moving target — and the clinical applications that are coming,” she said. “There’s a lot of need for nurses to get educated about AI.”

Though AI is still a hot-button topic in the nursing profession, Wieben said it’s crucial for nursing schools to keep on top of the trend. Last summer, she worked on an educational module that School of Nursing professors could use in their courses to help students remain compliant with university policies on AI tools, and to introduce them to how AI could impact clinical practice. Wieben also started a nursing and AI journal club for the School of Nursing community. “AI is a rapidly changing technology,” she said. “It’s hard to keep up.”

Wieben’s Doctor of Nursing Practice (DNP) students hear about AI from her in her informatics class, and she’s working on an undergraduate nursing informatics course that would also include AI content. AI content “is not currently baked into the curriculum,” she said. “We’re working hard to get it there.”

Clinical Instructor Caitlin Weitzel, ACNP, APNP, who teaches didactic and clinical courses in the DNP program, said she hasn’t incorporated generative AI into her coursework because the tools aren’t used in the inpatient setting at UW Health, where she practices once a week. “Until AI starts being integrated into the clinical setting, I feel like we’d be doing students a disservice by using it to support clinical education,” she said. “The technology isn’t there yet.”

Faculty generally fall into two camps when it comes to AI, said George Jura, PhD, academic technology director for the School of Nursing. They’re either reactive — waiting to address the technology until it’s used for nefarious purposes, like cheating — or proactive. He suggests taking the latter approach. “The vast majority [of students] are honest and want to do a good job,” he said.

It’s up to faculty and staff to set an example for students on how to use AI as the tool it’s meant to be, Jura said. That’s why he approached Lothary with an idea for a pilot project that would both save her time and boost her students’ skills.

Lothary, along with many other faculty members, delivers asynchronous narrated PowerPoint lectures for her students to view before class. Then class time can be used to apply that knowledge. Self-testing is one of the most effective ways to determine whether a student understood the lecture material, Jura said, but developing these optional quizzes from scratch is labor intensive for professors. That’s where AI comes in.

Jura and his team uploaded Lothary’s pharmacology lectures to several generative AI platforms, including ChatGPT, Claude, and Gemini. They asked each platform to write multiple choice assessment questions based on the provided lectures. The process wasn’t quite seamless. Most platforms couldn’t handle the 45-minute lectures. Some of the chatbot-written questions needed a total rewrite. And in the first iteration of one multiple-choice quiz, the correct answers were all choice “A.”

But while all the questions required instructor review and some editing, Jura said about 80 percent of the AI-produced content was usable. Jura and his team presented the project at University forums, such as the UW–Madison Teaching & Learning Symposium, and national conferences, including the Online Learning Consortium (OLC) Innovate 2024 Conference. And Lothary has since used AI to help create quizzes for three different lectures, saving her time and inspiring her in surprising ways. “Sometimes, even if I didn’t like a quiz question, it would spark an idea in my head,” she said. “Then I would write the question within five minutes.”

Even Weitzel, who isn’t ready to teach her students about generative AI because it’s not in the care setting yet, has found ways to boost her teaching with the tool. After a colleague told her about NotebookLM, Weitzel used the AI platform when writing a final exam. She crafted assessment questions, then asked NotebookLM to find references for each from among hundreds of pages of resources. Weitzel said she appreciated how NotebookLM provided clickable citations.

Weitzel has also encouraged her students to try using NotebookLM, suggesting they ask the tool to help them write study guides or study questions. But, she noted, learners should only use AI once they have foundational knowledge so they can carefully appraise what the tool generates. “Everything you’re doing with AI you’ve got to take with a grain of salt,” she said. “You still have to be able to critically appraise the information you’re being given.”

Clinical Instructor Alyssa Haure, MSN, RN, teaches courses on fundamental and advanced concepts in nursing and began experimenting with AI in the past year. “I might be late to the game,” she said, adding: “This is not something that’s going to go away.” Haure said she started simply: asking generative AI to perform a basic task, like rephrasing an objective.

As she grew more comfortable with the technology — learning to craft more specific prompts and to feed AI the resources she wanted it to use, such as a chapter from a digital textbook — Haure began to wonder whether AI might help with class assignments. “This is very much the way the world is moving,” she said. “If you don’t jump on board, I feel like you’re going to be lagging behind… We all need to do ourselves a service and learn with our students.”

Now, Haure said she regularly uses AI as a “jumping off point” to help her write undergraduate nursing education case studies. While she doesn’t use the AI-generated responses word-for-word, she often gets an idea for a scenario to use, or some new way of posing questions to students. “We’re not taking everything from AI at face value,” Haure said, “but it’s a great way to get started.”

Haure said she’s told her students about using AI, which leads to class conversations about the appropriate use of the tool. “For our younger students, this is so much part of their culture now that to work against it almost hinders your relationship,” she said. “We have to figure out how nurses can use it in a way that’s constructive; that’s good for our practice and good for our patients.”

One of Haure’s students is Brandon Kreger, a UW–Madison junior and first-year nursing student and School of Nursing student ambassador, who initially had a negative opinion of AI, thinking it was a tool mostly for academic misconduct. His opinion was softened, he said, by hearing professors like Haure and Lothary speak positively about AI.

Kreger and his classmates have since used AI to help them prepare for a big exam by asking the platform to develop study questions to help them practice. He also used AI to help create a student ambassador presentation: starting with basic AI-generated topics, then going deeper with his own knowledge. “The longer we wait, the harder it becomes to understand the tool,” Kreger said. “We can be the leaders and the innovators in this space.”

Even as a student nurse, Kreger understands the time crunch practicing nurses face. He said he hopes that, by the time he’s in the workforce, AI can help him and his colleagues save valuable time. “AI is efficient,” Kreger said. “It helps give us back time that we need to move along through the day… even if it’s 15 minutes saved so we can have our 30-minute lunch.”

Even more key, AI could serve as a sort of sparring partner for nurses who are tasked with making rapid-fire decisions on sometimes limited information, Kreger said, such as before a patient has a diagnosis. “It could be like that second nurse you talk to when doing a med check, or when you’re maybe not so sure about your own opinion,” he said. “We can use AI as a tool to further propel [our] knowledge and understanding… to enhance patient-centered care.”