It sounds counterintuitive, but could using AI help housing departments make more human connections?
By Pete Trentacoste
It started as a harmless exercise when I first logged into ChatGPT and asked it to provide a primer on the housing and residence life profession. My expectations were low, as my first thought was "What could an AI program know about the profession I have dedicated most of my adult life to?" There was a little bit of a John Henry human-versus-machine feel to it. Then I watched the scrolling screen report back with excellent prose and APA-style references. It was then that I realized I had underestimated the power of AI as a resource.
I immediately began to think about ways I could incorporate AI into my daily work. I had asked my question as a way to generate a curated report for someone new to the field, and what I received, in less than 10 minutes, was more concise and accurate than any book I've come across. I suppose I could have spent that time going through resources by hand and summarizing the vast body of work that exists to bring someone up to speed on the inner workings of our field. But would that really have been the best use of my time?
All of this was a reminder that housing professionals can find themselves so busy managing the administrative machinery of the profession that they often miss the very human connections that drew them to this work in the first place. But what if there was a way to reclaim that time? What if technology could handle the routine tasks so people could focus on what truly matters, such as developing staff and serving students?
Professionals at other colleges and universities have had similar epiphanies and have begun making changes to their housing operations through strategic AI implementation – not in a flashy, headline-grabbing way, but through a thoughtful, measured approach. These aren't institutions rushing to replace human judgment with algorithmic decisions. Instead, they're discovering how AI can amplify their existing strengths while freeing up precious time for the work that requires a human touch.
The conversation about AI in student housing often starts with the theme of efficiency, and for good reason. Take Julia Keahey and Sommer Dunlevy at Kent State University. They were part of a project to determine what furniture students would prefer for the residence halls. After a model room was set up, students were invited to try it out and share their feedback via an electronic form. Before long, Keahey and Dunlevy found themselves facing 2,000 student responses about their preferences. Instead of spending hours on manual analysis, they used ChatGPT to transform overwhelming qualitative feedback into actionable insights within minutes. But here's what's interesting: That time savings wasn't the real victory. The real victory was what happened next.
"Within the University Housing at Kent State, we're using AI not to replace the human touch – but to enhance it," Keahey explained. "We use AI as a creative partner and a productivity tool, whether it's drafting social copy, assisting in generating standard operating procedures, brainstorming campaign ideas, analyzing student feedback, or refining strategy. It helps us work smarter, not just faster, allowing more time to focus on student connection and storytelling."
This philosophy is echoing across other campuses in surprising ways. At Louisiana State University, faculty and staff can now turn to MikeGPT, an AI assistant named after the tiger mascot that draws on 35,000 up-to-date university documents. What started as a tool to help navigate institutional information has evolved into something much more significant. When computer science students worked full time during the summer of 2024 to catalog LSU's more than 350,000 web pages, they weren't just creating a search engine. They were building a foundation for the kind of informed decision making that housing professionals need daily.
The resulting product goes beyond a simple keyword search. MikeGPT can synthesize information from multiple policies to answer complex questions such as, "What are the specific steps to appeal a student's housing decision?" or "What resources are available for students facing mental health crises in residence halls?" It doesn't just provide links; it offers direct, synthesized answers with source citations, enabling faster, more accurate decisions. There are ongoing plans to integrate MikeGPT with other campus software platforms such as Moodle and Workday Student, furthering its evolution into a proactive assistant capable of navigating and explaining dynamic, personalized information for students and staff alike.
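To make that pattern concrete, here is a toy sketch of retrieval-backed answering: score documents by keyword overlap with a question, then return the best matches with their titles as citations. The documents, scoring method, and code are purely illustrative assumptions – LSU's actual system is far more sophisticated than simple keyword matching.

```python
import re

def tokenize(text):
    """Lowercase the text and extract its words as a set."""
    return set(re.findall(r"[a-z]+", text.lower()))

def retrieve(question, documents, top_n=2):
    """Rank documents by how much vocabulary they share with the question."""
    q_words = tokenize(question)
    scored = [(len(q_words & tokenize(d["text"])), d) for d in documents]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [d for score, d in scored[:top_n] if score > 0]

# Hypothetical policy snippets standing in for a real document catalog.
documents = [
    {"title": "Housing Appeals Policy",
     "text": "Steps to appeal a housing decision: submit a written appeal to the housing office within ten days."},
    {"title": "Residence Hall Safety Guide",
     "text": "Mental health resources are available to residents through the counseling center."},
    {"title": "Dining Services FAQ",
     "text": "Meal plans may be changed during the first two weeks of the semester."},
]

results = retrieve("What are the specific steps to appeal a student's housing decision?", documents)
for d in results:
    print(f"{d['title']}: {d['text']}")
```

In a production assistant, the keyword scoring would be replaced by semantic search over the full catalog, with a language model synthesizing the retrieved passages into a cited answer.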
Another example comes from Baldwin Wallace University. There, senior director Bob Beyer has found AI to be a useful means for creating training scenarios. “We’ve found AI particularly valuable for creating realistic training scenarios for our graduate and undergraduate staff,” said Beyer. “For instance, we recently developed a session on ethical decision making, a topic we hadn’t always emphasized in previous training. We asked the AI to ‘generate a scenario where a staff member encounters a conflict between privacy and safety obligations,’ and it produced a situation involving a student pursuing a relationship with a student on their floor. AI didn’t replace our work. It enhanced it. It helped us prepare for complex, emotionally charged moments that our staff are likely to face on the job.”
When AI is considered in administrative contexts, the conversation often centers on obvious applications such as automated responses to frequently asked questions, streamlined communication processes, and basic data analysis. These are important, certainly, but the real transformation happens when AI is more creatively leveraged to address the unique challenges of housing operations.
One of these areas is roommate matching, a process that has long frustrated housing professionals. Yes, the matching and assignment process has moved on from manually shuffled notecards to technology-driven systems. Still, traditional matching systems rely on survey responses about considerations such as cleanliness preferences, sleep schedules, and study habits. But what if AI could analyze communication patterns, social media interactions (with appropriate permissions), and behavioral indicators to identify compatibility factors not previously considered? What if it could flag potential conflicts before they escalate into room changes or judicial issues?
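As a thought experiment, even today's survey-based matching can be framed as a simple scoring problem. The sketch below uses hypothetical factors, weights, and a review threshold – not any institution's or vendor's actual algorithm – to show how large mismatches might be flagged for human review before an assignment is finalized.

```python
# Assumed priorities: cleanliness mismatches weigh most heavily.
WEIGHTS = {"cleanliness": 3, "sleep_schedule": 2, "study_habits": 1}

def compatibility(a, b, flag_threshold=4):
    """Return a weighted mismatch score and whether the pair needs human review.

    Survey answers are on a 1-5 scale; a lower total mismatch means a more
    compatible pairing. The threshold is an illustrative assumption.
    """
    mismatch = sum(w * abs(a[f] - b[f]) for f, w in WEIGHTS.items())
    return mismatch, mismatch >= flag_threshold

alex = {"cleanliness": 5, "sleep_schedule": 2, "study_habits": 4}
jo   = {"cleanliness": 2, "sleep_schedule": 2, "study_habits": 3}

score, needs_review = compatibility(alex, jo)
print(score, needs_review)  # mismatch of 10 -> flagged for staff review
```

The point of the flag is that the algorithm never makes the final call; it simply routes borderline pairings to a staff member's judgment.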
The facilities management applications are equally intriguing. At Mississippi State University, for example, Dei Allard, the executive director of housing and residence life, is exploring how AI might enhance administrative efficiencies in areas such as workflow automation and data analysis. "Changes are coming, and I am taking a virtual course to learn more," Allard said, reflecting the thoughtful approach many housing professionals are taking toward AI adoption. It's not about rushing to implement every new tool that emerges. It's about strategically identifying where technology can solve real problems.
AI is changing the time-and-attention equation in ways that feel both revolutionary and surprisingly natural. At the University of Missouri, Tyler Page, the senior director of housing, has integrated Microsoft Copilot into his daily workflow in ways that free up mental energy for more strategic thinking. Along with using Copilot to prioritize and summarize emails, as many do, he uses it to record and summarize meetings and to identify the action steps that come out of each conversation. He sees this as a way to keep good ideas from getting lost in the shuffle after a staff meeting and to keep action items from falling through the cracks because someone was too busy taking notes to fully engage in the discussion. By handling these routine cognitive tasks, AI allows housing professionals to be more present in the moments that matter most.
There are also potential applications in the area of professional development, a vital part of a campus housing department but also the piece most likely to slip to the proverbial back burner. Planned training sessions get postponed due to crises. Learning modules sit unfinished because there aren't enough hours in the day. Individualizing training for dozens of staff members sounds great in theory but feels overwhelming in reality. Beyer's use of AI to create diverse training scenarios is one way to address a persistent challenge in housing: how to prepare staff for situations they haven't encountered yet. Traditional role-playing exercises are limited by an individual’s own experiences and imagination, but AI can generate scenarios that stretch beyond the usual repertoire, helping staff think through complex situations before they arise.
And it is not just about crisis preparation. AI can create personalized learning paths that adapt to individual learning styles and career goals. Imagine a new hall director who's strong in community building but needs development in budget management. AI could design a series of micro-learning modules that build on existing strengths while addressing specific areas for growth. The technology exists to make this kind of individualized development scalable in ways we never thought possible.
ChatGPT and other AI engines have proven adept at analyzing survey data, coding open-ended responses, and identifying trends – so adept that users must carefully consider what data they make available to these tools. For campus housing, that means stopping short of putting sensitive data, or the names of students or staff, into public AI systems. This kind of careful boundary setting is crucial as campuses navigate the ethical implications of AI implementation; it helps them avoid pitfalls and unlock AI's full potential while safeguarding privacy and trust.

There are plenty of concerning stories about data breaches and AI overreach, and these can push some organizations into analysis paralysis, fearing the ethical complexities so much that they hesitate to explore AI's benefits. A more nuanced approach recognizes that ethical considerations are not an excuse to avoid innovation but a framework to guide it responsibly. It starts with understanding the difference between tools. Publicly accessible AI models, like the general version of ChatGPT, are powerful but aren't designed for confidential information. They learn from the data they process, which means sensitive details could inadvertently become part of their training set or be exposed. This differs from internal frameworks, like LSU's private and secure MikeGPT, which operates within the university's own protected environment, ensuring that data remains confidential and controlled.
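That boundary setting can be made routine with a small preprocessing step: scrub names and ID numbers from free-text feedback before it ever reaches a public AI tool. The name list and nine-digit ID pattern below are hypothetical; a real workflow would draw on the housing roster and a much broader set of PII patterns.

```python
import re

# Hypothetical roster names and ID format used only for illustration.
KNOWN_NAMES = ["Jordan Smith", "Maria Lopez"]
ID_PATTERN = re.compile(r"\b\d{9}\b")  # assumes nine-digit student IDs

def redact(text):
    """Replace known names and student ID numbers with placeholders."""
    for name in KNOWN_NAMES:
        text = text.replace(name, "[STUDENT]")
    return ID_PATTERN.sub("[ID]", text)

comment = "Jordan Smith (ID 123456789) said the laundry room floods weekly."
print(redact(comment))
# -> [STUDENT] (ID [ID]) said the laundry room floods weekly.
```

The redacted text still carries everything an AI tool needs to identify themes, while the identifying details never leave the institution's systems.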
Housing departments have always been responsible for protecting students' physical safety while also serving as guardians of their privacy and advocates for equitable treatment. AI implementation raises new questions about how we fulfill these responsibilities. The data privacy implications are particularly complex in housing contexts, where staff have access to sensitive information about students' living situations, behavioral patterns, and personal challenges.
The bias considerations are equally important. As Keahey and Dunlevy noted in a presentation about Kent State’s AI usage, "There are downsides to using AI, particularly with biases built into AI." Housing professionals must understand that AI systems can perpetuate or amplify existing inequities if not carefully monitored. A roommate matching algorithm, for example, might inadvertently discriminate against students from certain backgrounds or with specific needs.
There is also the question of transparency about sharing the ways in which AI is used to create the student housing experience. What obligations do housing departments have to disclose AI involvement in decision-making processes? These aren't just technical questions; they're fundamental issues of trust and accountability that go to the heart of the profession. Page's approach to data security offers a practical model. His careful attention to what information he shares with public AI systems demonstrates the kind of professional judgment we need to exercise. Similarly, the gradual rollout of LSU's MikeGPT – starting with faculty and staff before expanding to students – shows how institutions can test and refine AI systems before full deployment.
Institutions that are successfully implementing AI in housing operations share several characteristics that offer a roadmap for others. First, they start small and focused. Rather than trying to revolutionize everything at once, they identify specific pain points and test AI solutions in controlled environments. One suggestion is to identify which tasks are the most frustrating. Is it the time spent analyzing survey data? The challenge of creating diverse training scenarios? The difficulty of managing email communication? Successful early adopters emphasize starting small and learning as they go. Tools like ChatGPT, Grammarly, and Microsoft Copilot are accessible and affordable. The key is approaching them with clear objectives and appropriate safeguards.
Kent State's approach exemplifies this strategy. The staff there began by optimizing social media through Sprout Social's AI assistant, analyzing performance patterns, and adjusting posting times. This provided experience with AI tools in a low-risk environment before expanding to more complex applications like data analysis and content creation.
Second, successful implementers maintain a clear distinction between tasks that benefit from AI enhancement and those that require human judgment. As Keahey and Dunlevy note in their webinar, "AI handles the data; humans craft the story. AI speeds up decision making; humans build relationships. AI provides insights; humans add emotion and context." This philosophy prevents the technology from overwhelming the human elements that make housing work meaningful.
Third, successful programs invest in training and support. LSU's MikeGPT project succeeded partly because it involved collaboration between computer science faculty, students, and information technology support staff. The project was large enough that no individual department would have been able to succeed on its own. Building AI capacity requires cross-functional partnerships.
The resource considerations are more manageable than many professionals assume. While some institutions invest in custom AI solutions, others achieve significant results with existing tools. For example, Allard's use of Grammarly for editing her writing, Evernote for managing her tasks, and ChatGPT for communication enhancement demonstrates how professionals can start incorporating AI without major budget requests.
Over the last generation, the campus housing profession – like so many others – has adapted to numerous technological advancements: the shift from paper-based systems to digital platforms, the integration of social media into student engagement strategies, and the adoption of mobile technologies for everything from key cards to maintenance requests. Each transformation felt significant at the time, but AI feels different. It feels bigger.
It is likely that AI will fundamentally transform how housing operations function through the rest of this decade. This prediction is echoed by technology leaders across industries. Former IBM CEO Ginni Rometty famously stated, "AI will not replace humans, but those who use AI will replace those who don't." Google CEO Sundar Pichai consistently emphasizes AI's role in "augmenting human capabilities" – not because the technology will replace human judgment, but because it will free humans to exercise that judgment more effectively. The administrative tasks that consume so much time – data analysis, routine communications, basic research – will become background processes that happen automatically, allowing people to focus on the complex human challenges that define our work.
This isn't about becoming less human in our approach to housing. It's about becoming more human. When AI handles the routine cognitive load, housing professionals can spend more time mentoring new professionals, developing innovative programming, and building the kind of communities that help students thrive. People can be more present in crisis situations because they are not distracted by administrative concerns. They can make better decisions because they have better information more quickly.
The institutions that start implementing AI strategically now will have significant advantages in recruiting and retaining both students and staff. They'll be more efficient, more responsive, and better positioned to adapt to changing student needs. But, more importantly, they'll be able to offer the kind of personalized, high-touch experience that has always been the hallmark of excellent housing programs.
As campus housing departments prepare for a quickly changing future, the question isn't whether AI will transform student housing. It is whether those departments will be intentional about shaping that transformation. The professionals who embrace this challenge thoughtfully, ethically, and strategically will find themselves better equipped to serve students. Most importantly, they will recognize that AI is a tool, not a solution. The creativity, empathy, and judgment that make housing professionals effective can't be automated. But when AI is strategically implemented to handle routine tasks, it creates space for uniquely human qualities to flourish.
Pete Trentacoste is the executive director of residential life at Louisiana State University in Baton Rouge. Cover illustration by Clint Reno.