It’s the spring semester of 2025, and Sam Lapiejko ’26 is walking classmates through a literature review on automated rotator cuff tear detection using 3D convolutional neural networks. The study’s authors trained a computer program to analyze 3D images of shoulders and locate tears; using the academic paper as a guide, Lapiejko applied comparable techniques to similar medical images to test whether artificial intelligence tools could accurately detect tears.
Lapiejko is one of 14 students in the “AI in Healthcare” course taught by Biological and Biomedical Sciences Professor Brian Blais, Ph.D., which helps students learn to understand and implement AI methods in healthcare.
“AI is becoming more prevalent and changing every day — you’ve got to get ahead of the curve,” says Lapiejko.
From diagnostics and clinical decision-making to healthcare analytics support, AI is transforming clinical practice. In response, Bryant’s School of Health and Behavioral Sciences has quickly moved to embed AI literacy into its curriculum and experiential learning opportunities.
“Our programs prepare learners to lead, not just to adapt,” says School of Health and Behavioral Sciences Director Kirsten Hokeness, Ph.D., noting that an institutional push led by Provost and Chief Academic Officer Rupendra Paliwal, Ph.D., requires all students to take at least one AI course during their time at Bryant.
High-impact training experiences
In a medical simulation lab tucked into the lower floor of the Physician Assistant Studies wing, PA students sit with Beverly Slate, a high-fidelity mannequin, to discuss yesterday’s lab results and the upcoming treatment course. Known as an immersive clinical simulation, this four-day exercise centers on narrative medicine, in which clinicians use listening, storytelling, and reflection to better understand the whole patient, not just the disease.
“We try to dial up the realism as much as we can,” says Director of Medical Simulation and Assistant Clinical Professor Stephen Sherman ’19 MSPAS, PA-C.
To support this authenticity, Sherman has been using AI to generate realistic, reactive scenarios that closely mirror what students will encounter in an emergency room, so they are better trained to react calmly yet decisively. Outside the simulation, PA students are also learning how to use artificial intelligence to generate clinical notes, reducing the operational burden so that time can be shifted toward more attentive patient care.
Meanwhile, in the Doctor of Clinical Psychology program, graduate students refine their skills by using Skillsetter, a platform that leverages AI. Pre-recorded clients explain a situation they’re going through, and Psy.D. students record their responses and receive feedback.
“For someone who didn’t come in with a ton of experience, being able to rerecord and think it through more deeply has helped me in my in-the-moment responses,” says Mackenzie Dickman ’30PsyD.
While these tools support students’ skill development, faculty are also emphasizing the importance of data privacy, security, accountability, and the limits of AI-supported decision-making.
“The more students understand not just the how of AI but the why of AI, the more prepared they will be to manage these challenging issues with the hopes of improving a system that has been struggling for some time,” Hokeness says, noting that AI has the potential to benefit both patients and providers in a more affordable and equitable way.
Adapting to evolving educational needs
AI is reshaping not only healthcare but also how faculty lead healthcare coursework. After teaching physics for 25 years, Blais opted for a new approach in the 2025 fall semester.
“In this age of AI, it is too easy for students who experience some level of discomfort with the challenging task of learning a new subject to lean on the technology to solve problems for them,” says Blais. “Because the technology is so accessible, one has to ask what the role of education is, what the role of teachers should be, and how students can effectively face those challenges.”
He has implemented a flipped classroom model in which, during class, undergrads complete what would normally be done as homework. Outside of class, students watch recorded lectures at their own pace — whether that means rewinding to grasp a particular concept or playing at 2x speed because they’re confident in the topic. During this time, Blais encourages undergrads to use tools, including AI, to help explain ideas, create tutorials, and generate practice problems.
In addition to these changes, Blais and Psychology Department Chair Heather Lacey, Ph.D., have each been selected as Davis Fellows of AI through a Davis Educational Foundation grant supporting AI integration across the curriculum. Lacey is developing an AI tutor for “Introduction to Psychology” while also drafting a proposal for a new AI Psychology course that would examine the interaction of human intelligence and artificial intelligence. In this course, students would explore how machines learn, how humans think, and how the two can collaborate productively and ethically. Drawing on critical readings, debates, and creative projects, students would also analyze AI’s impact on human experience and envision its future role.
Moving full steam ahead
As healthcare organizations increasingly expect graduates to be comfortable collaborating with AI tools and to understand how to use AI to translate data into action, Hokeness emphasizes that education must evolve to meet employer expectations.
“Our students have a strong foundation in health and behavioral sciences. What sets them apart is their AI literacy, their awareness of AI applications in the field, and their business acumen,” Hokeness says, noting that SHBS is making strides in the graduate and professional education space by offering stackable credentials in AI in healthcare, from a certificate to a Master of Science in Applied AI with a concentration in Health Informatics.
Alongside new offerings such as the “AI in Health and Human Performance” course, which exposes students to how AI is being used in the field and focuses on ethical and safety considerations, faculty are also advancing AI scholarship. For instance, in the fall, Healthcare Informatics and AI Program Director Nafees Qamar, Ph.D., was awarded a RI-INBRE pilot grant to support his project, “Multi-Site Big Data Chest X-Ray Analysis: Graph Neural Networks for Diagnostic AI,” which examines the use of artificial intelligence in medical imaging, with the potential to improve diagnostic accuracy and patient outcomes.
“Healthcare is a data-rich, data-driven field with incredibly high stakes that faces enormous challenges,” says Hokeness, noting how provider shortages, caregiver burnout, skyrocketing healthcare costs, fragmented data systems, and inequities in care could be addressed, or even solved, by the intentional and ethical implementation of new technologies. “As a result, healthcare is uniquely positioned to lead in the adoption of AI.”