When making your case, data alone won't move the needle. Connecting that data to a compelling story will.
By James A. Baumann
Lee Shulman, a professor and past president of the Carnegie Foundation for the Advancement of Teaching, was responsible for one of the best assessments of assessment when he wrote, “Counting and recounting can only be pursued together. Counting without narrative is meaningless. Narrative without counting is suspicious.”
That belief serves as the foundation for chapters in not one but two separate books about student affairs assessment whose second editions were released in recent months: Student Affairs Assessment: Theory to Practice and Coordinating Divisional and Departmental Student Affairs Assessment. Everyone knows about the proverbial survey results collecting dust on a shelf (or, probably more accurately these days, taking up space on a computer drive). However, not enough student affairs and housing professionals are putting in the time and effort to develop strategies that bring the data to life, illustrate program successes, or identify existing challenges.
In these collections, Darby Roberts, Stacy Jacob, Cassie Walizer, and Jillian Kinzie offer instruction and inspiration about how and why to communicate the results of a survey or research, a vital step of the assessment loop that is too often breezed past. They draw on their professional experience to recount the importance of understanding who will review the assessment results, where their motivations lie, and what strategies can help the most important data and results cut through the clutter and receive the attention they deserve.
Recently, Talking Stick met with Roberts, the director of student life studies in the Division of Student Affairs at Texas A&M University; Jacob, who spent more than 14 years teaching student affairs; and Walizer, a strategy director at Complete College America and former director at the University of Iowa. The result was a lively discussion about assessment, storytelling, the role it can play in nurturing equity, and the importance of getting to the point. The interview has been edited for clarity and length.
To start off, tell me what brought you to these book projects and how communicating assessment manifests in your professional lives.
Cassie Walizer: This project really stemmed from a conversation that Stacy and I were having at a conference a couple of years ago. We started talking about how there are a lot of tools out there for conducting an assessment and gathering data. Most of the training, though, is really just point-and-click instruction rather than a process of thinking systematically or intentionally about how to tell your data story.
I really came to this, though, from the point of working in diversity, equity, and inclusion. I worked in an anti-DEI state where I constantly had to communicate the value of my team’s work and my students’ stories. The purpose was really twofold. How do I make sure that everybody on campus knows that my department is doing great work to support our students and also understands that this work is valuable? Then, while working on my MBA, I started reading about strategic communications philosophies. In our chapter, we explore that nexus of strategic communication and data storytelling.
Stacy Jacob: I am a qualitative researcher and really think a lot about stories. In fact, when I was doing my degree at Indiana University, I took fiction writing classes because I wanted my dissertation to be really interesting, and I tried to sell my committee on letting me write a novella. They did not buy it. But years later, I let students do creative things like that. When Cassie started to talk with me about what she was doing, it lit my whole brain on fire.
I also really loved what Darby did in her chapter where she writes about digging into that political frame of mind. I’ve taught master’s students for years, and when I tell them that assessment is political, they are like, “Huh?” It was just so great the way her chapter took those organizational behavior theories and systems of power and pulled them out. It fits so well with the ability to tell a good story and with building a coalition.
Darby Roberts: I’ve been doing assessment for a long time. When I first started, getting people to do assessment was the hard part. I would hear, “I don’t know how to do it. I don’t wanna do it. I don’t have time to do it.” All that kind of stuff. As we started building our program, we got to where people said, “Okay, I need to assess.”
We have grown in thinking about the assessment cycle. We start with, “What’s your goal? What’s your outcome?” But what used to be the end of the cycle is not really the end. It is this thing about using and sharing results. I don’t think we do that as well as we can. That was a key piece of writing this book.
I think many folks in student affairs and residence life don’t get into it knowing how to communicate strategically. In the last couple of years, I’ve been on this soapbox about how we are communicating the results of the many amazing things that we do. Many people say, “Student affairs is the best-kept secret.” And I respond, “Why keep it a secret?” We should be telling people all these things. I go back to knowing who your audience is because that will tell you how you might communicate with them. A student in your program will have a different need than the vice president, a faculty member, or a state legislator.
Stacy, you say you’re a storyteller. Until recently, I would have said that I’m not a storyteller because I think of that as fiction. I may not be a storyteller, but I am a communicator.
Jacob: Right.
Roberts: I’m telling a story to spur people to action. I’ve had to rethink my vocabulary. I want our assessment committee to be storytellers. I want our marketing people to be storytellers with data. I recently told our marketing director and our development vice president that I wanted to put a workshop together with the directors of our different departments and, if they have them, their assessment person and marketing person. I want to bring them together and show that these things are actually related. We should be working together in the overlapping middle of this Venn diagram. They can take data, whether it is quantitative or qualitative, and make a story that tells other people they are doing all these things. Good things will come to your department if you do. That has really been my kick in the last couple of years.
Jacob: I am honestly not sure we have come to a point where everyone believes we must do assessment. Again, it goes back to all this being political. Assessment puts us out there in a way we may be afraid of. We want to serve students, but we don’t want to prove that we are not serving students. And so, assessment can be super scary.
Roberts: I think there are some institutions that do it better and do it more as a culture. Then there are a lot that are still developing in that area.
Walizer: Part of the work that I do now that I’m no longer on a campus and am higher-ed adjacent is to help institutions build a deeper data bench. That is how we frame it, but the reality is that it is not just about the data and how to point and click and use the graphs and charts – it is about how to tell that story internally. I always tell institutional leadership that this technical assistance is not for them and this project is not for them, because they know how to do it or should know how to do it. It is about them building a culture of assessment across their campus.
We do that because there’s an equity component. We can talk at length about the weaponization of data. But we also know that assessment can be a tool for equity and inclusion when it reveals inequities and surfaces voices that have not been included. When entry-level practitioners, the ones actually working with the students, have those skills, that really matters.
Jacob: It also opens up equity conversations when we can present data in ways that people can absorb. The old assessment report that we all talked about gets put on the shelf. When we report it as statistics, there is a whole set of people who go, “Not me. I don’t want to read that. I don’t even understand that.” When we use great visualization and great storytelling, it actually helps people who have no background in research absorb and understand the information. It makes it an even more compelling thing to work with.
Walizer: Yes, absolutely. And if we just boil it down to a bar graph of demographics, who is likely to be reading that? Not the people who are historically marginalized or whose voices need to be uplifted. When we learn how to tell those stories well, we can put aside our biases and privileges and uplift others. There is my equity soapbox for the day.
Jacob: It was great.
Roberts: Working on a campus and thinking of how we take that theory to practice, we should be equitable. One of the things we have been very cognizant of doing is including student organizations in our assessment projects. Any student organization can come to us and say they want to do an assessment, whether it is student government or the chess club. Come on in, and we will help you just as we help any department in our division. We tell them we are partners in this. They are not relinquishing their power to us; they are the content experts, and we will be the process experts to help them get through this. What is the purpose of the assessment that you want to do? What is the key factor in that? Have you thought about individual questions you want to ask? We engage them in that process.
When we get the data back, we dig into getting students to help us interpret and analyze it because what I might see in the data is not what a 22-year-old will see in the data, or they may interpret words that are unfamiliar to me. They tell me, “That’s 2023 language.”
Or it’s just because they’re closer to the subject.
Roberts: You’re right. “That’s what that means? Oh, okay, I didn’t know that.”
Walizer: Yeah.
You write about how sometimes jargon is off-putting, but sometimes jargon is a necessary part of that storytelling. Another thing that came out in your chapters was the idea of starting with the core message – what you are looking to discuss, answer, and solve – and then how the core message could change based on the audience. Have you seen some changes in recent years about people asking for different core messages? Or have the audiences that want to see the data expanded?
Roberts: I think at an institutional level we’ve been asked, and not necessarily in a negative way, how student affairs has contributed to retention, persistence, time to graduation, and so on. Residence life is a key piece of that. They also increasingly want to share data and are becoming a little more astute about it.
We’ve had a shift in our division about emphasizing how important it is to talk to donors and find out what they are interested in. What do we tell them about our programs in terms of success or need? And what’s the data and the story that goes along with that? Going back to knowing who your audience is, is your audience the provost? Is your audience the vice president of student affairs? A potential donor? The students who are in the program?
I’ve heard from different housing departments about how the development office is coming to them now and wanting to focus on all the great memories and relationships that were formed within those residence halls.
Jacob: Over the years, I worked with housing a lot because I had a very willing participant in the director of housing at Slippery Rock University. I did a problem-based learning assessment class, and our director of residence life would come in and present to the students, and then they would decide what problems they wanted to work out. One year, as a project, a set of students examined how residence life connects with the mission of the university. It was a really interesting project, and I remember the students got really invested in those questions. They ended up presenting back to the students they had interviewed, and then it was interesting to hear those students talk about how they didn’t realize how important it was to live in a residence hall. Students as stakeholders can be really interesting.
Walizer: I’ve heard more and more about foundations and private donors wanting to engage with residence life and see those outcomes. I think that’s due to the turnover of the old guard, who saw residence halls as just a place to bring in a keg and have a great time and didn’t want their kid living there.
A few years ago, the Indiana Commission for Higher Education conducted a big campaign on the value of college. Considering the cost of college, expressing the value of living on campus is critical. When they did some research afterward, they found that prospective students did not care at all about data showing they would earn a million dollars more over their lifetime. That seems so far off to them. They are focused more on the short term. Psychology would probably tell you that. The commission realized its campaign had been totally wrong.
They are getting another grant to redo the research post-COVID, which will be interesting. But I think about that a lot: about what we think will be an important message, what we think a demographic should care about, and what we want them to care about or pay attention to in our messaging. Our data isn’t necessarily the reality.
Another question that comes up regards those campuses that don’t necessarily have the resources to do this level of assessment. What are the common barriers you’ve seen to closing the loop on assessment?
Roberts: I think time is probably one of the key ones because no one has extra time. Beyond that is also the skill or confidence of “How am I presenting and to whom? In what way? When?” We had someone from our marketing committee come in and talk about infographics. Not in a “Let’s get into a system and create an infographic” way, but rather “What’s the point of an infographic? What are you trying to communicate?” Some overall best practices.
I’m not expecting our assessment committee to be experts in creating infographics. But the marketing people, most of them, if not all of them, have that skill. If you’re asking a general student affairs professional, maybe a coordinator in residence life who is working with a program, they probably do not have the technical skills to do these things. And so that will be an impediment, and maybe they will just go on their merry way.
Another barrier is created if they haven’t sat back and considered what the most important thing they need to communicate is. There are 57 things they could communicate. How do they choose what’s most important? If they are overwhelmed by everything they get, they don’t know how to narrow it down.
You wrote about a staff person who gives a big presentation filled with data, but it never answers the one question that the person listening to it cared about. There are these cases where someone has collected all this data, and they feel if they don’t tell everyone every last thing, then they must have wasted their time. So you get a fire hose effect, as opposed to lasering in on something.
Walizer: Those examples in our chapters didn’t just come out of thin air. (laughs) I work on a lot of different campuses, but predominantly small, under-resourced campuses where there might be one or two people in institutional research roles. They are just trying to get through accreditation, IPEDS (the Integrated Postsecondary Education Data System), and the many requests coming through to them because they are the only people who have access to do it all. That is why I like democratizing the data and building data capacity. They have to be done together, and it is so important that we upskill staff and let the institutional research office work on the big-picture stuff.
It’s definitely about capacity. If you’re just trying to turn off the fire hose, if we’re going to use that metaphor, there is not much time or energy to truly think strategically. That’s not a criticism at all.
That’s just the way it is. You’re just trying to get things done and check them off the list and not really sit with it. I think that is a huge issue because even the folks who want to be doing these things better and think through these things just don’t have the capacity to do so.
Jacob: When I taught assessment, I told students that, by the end of the class, I wanted them to think monkeys could do this. I don’t really think that monkeys can do assessment, but I wanted students to feel that they can ask a question about somebody’s experience and then use that data to do better. I wanted them to feel open to the possibility that even a bad question teaches you about doing assessment, and you get better at it. Just start anywhere.
Now that you’ve said that, I feel guilty about my next question. How do you feel about the influx of assessment tools? Everybody’s got a SurveyMonkey account. Everybody can log in to Canva or use Excel to churn out pie charts left and right. As people committed to thinking about the benefits of telling the assessment story, what do you think it means for the work that you all do?
Jacob: I think it helps people get into it. I want my students to try it; as long as they realize that their first try may not be the best thing ever, they have places to learn, and they can do better. You can ask for a critique and help and those kinds of things. I am for these things.
Roberts: I tell people this is not your dissertation. So, going with what Stacy said, you don’t have to be an expert. Now, the people in my department, I would say, are pretty close to being experts in what we do, but I don’t think it has to be so overwhelming and so complicated and at such a high level. You’re asking people some questions. What is it that you want to know?
I think the other word of wisdom is to realize that assessment is a team sport. I could sit down and create a survey, and I could ask these questions, but if I have no one else looking at it, it is probably not as good as if I ask someone else, “I have this idea. What do you think? Can you read through this? Here’s what I’m trying to accomplish.” You’re not in this alone. You are going to learn from each other. Assessment is a hot skill, so the more you know and the better you can do it, the better your professional career is going to be. Just understand that it doesn’t happen overnight.
Jacob: I love the team sports analogy because, even as someone who is really good at qualitative research, I test the questions that I ask. We all should be testing them.
Walizer: Yeah, I think there are really two categories of learning curves in assessment. The first is the methodology: “How do I know what questions to ask? How do I capture this information?” The second is the technology that comes with it. If you can make one of those things a little bit more comfortable and accessible, I don’t see that as a bad thing.
Also, I do not have the answer to this, but I will put it out there that I’m really interested to see how artificial intelligence will help us design better assessment tools and reports, make sure we are asking good quality questions, and those sorts of things. I think this conversation will look different in a couple of years, and it will not only be the learning curve of the technology and the methodology but also how to use AI with it.
As a communicator, I was drawn to your points about the placement of the thesis and the danger of burying the lead. It made me think about articles where the reader has to read through all the methodology and questions before getting to two paragraphs about the impact of the findings.
Jacob: It is what everyone does in every research report ever. I remember getting to the end of my dissertation a long time ago and then thinking, “What does all this mean?” I really think sometimes we get so caught up in what we are doing that we forget the important part. We are doing this to improve services and experiences for students, and we need to get to the point.
Roberts: I’ll give you a great example. For many years, when we would write reports to our clients, they were structured like a journal article. Here’s this intro. Here’s our method and sample. Here are all these details, then here are the results, and, at the end, our recommendations. We interviewed some of our clients and asked what they thought about our reports, and they pretty much said, “We skip to the end, and we read the recommendations.”
So we restructured our reports. Now they say, “Here’s what we did. Here are some key findings.” Then we go into the results. People are looking for the impact. So rather than making them skip to the end, we just put it in the front.
You’re not writing a mystery novel. The clients are not going to say, “I’m going to keep reading this report until I find out who did it!”
Jacob: Maybe we want to employ some mystery to tell the story. But one of the things I value about the ACUHO-I Journal is that the articles center on that recommendation. It draws people in, and students can take things away from the articles after they read the research.
You mentioned it earlier, but how do you tell the story when the results maybe aren’t so great, or you uncover some potentially bad news?
Roberts: I tell people that it is better that you know the bad news and can focus on it than if someone external tells you. You can say, “Okay, this wasn’t exactly what we wanted, but here are the steps we will take so we are not mired in that state. Here’s our plan, and then we will reassess it.”
Jacob: It’s like what they say about that technique of sandwiching bad news between two pieces of good news when you’re supervising someone. “You’re really great because of this, and here’s how you need to improve, and you’re also really great here.” Except research over the years has shown that to be a terrible idea because people only remember the good stuff you told them. Getting to the heart of the matter is important. That truth is what really lets you do the things you need to do in your job.
Walizer: I agree.
Even more subtle were the examples you made about the point of view and the language used: using active voice versus passive voice or phrasing things like, “Here are things that we can do to improve service to students” instead of “These students are getting bad service.” Make it clear that this is something that the university is doing versus something that is happening to these students.
Jacob: Perhaps we can own our problems, so to speak.
Walizer: Sometimes, an assessment that shows bad results, or less-than-great results, is actually the best assessment because it holds people accountable to doing better. If all you can say is, “Hey, retention rose again for the students who live on campus,” then, great, we can just keep on keeping on. You don’t get pushed to do better when you get good assessment results like that. Ultimately, I think that when it’s done well, it can be a very good thing.
Roberts: I agree, and for some of our staff, we tell them this is low-hanging fruit. If something wasn’t what you wanted it to be, right there, it tells you that you can take action on it.
Jacob: Yeah.
Roberts: That’s great, versus thinking, “Well, I guess things are okay, and I don’t know what I should be doing, but I’m supposed to report on some change, and we were supposed to implement something.” Fortunately, for a lot of our departments, units, and programs, it is not a failure. It is something you could tweak, and it probably would be better.
Jacob: I even think that it can be like in life where, when we mess up a little, we learn the most. How much do you really remember about all those things you aced versus the ones you had to work hard to fix or make better? Those are the places in my life where I have always learned the most. It has helped me become a better professional, be more empathetic, be more student-centered, serve students better, or any of those things. Messing up a little bit is perhaps a very good thing.
Walizer: The challenge is ensuring that there is a culture of continuous improvement where failure is okay, there is support for that, and it is not weaponized.
Roberts: I also ask people why they would continue to do something that is not effective. Wouldn’t you want to know? You don’t have time to be inefficient. You want to change to do the best you can for the students you serve. So let’s figure that out and make some changes.
I have one more question, but it’s not an easy one. It’s about balancing all these different issues. You all write in your chapters about the ethical and practical reasons to present data in the aggregate. But we also know about the power of identifying someone and telling an individual story. At the same time, we know the danger of focusing on one individual story versus showing that something will have a greater effect on a larger number of students. So when communicating the assessment results, how do you find that balance? I guess maybe it is determined on a case-by-case basis.
Roberts: Yes. (laughs)
I hate it when I answer my own question.
Roberts: I thought you said it would be a tough question. No, I think what you are talking about is who you are presenting to, what they care about, and in what way they want it. A vice president, for example, doesn’t have time to read a 10-page report. They might care about students, but a long student story might not be helpful for them in making a decision about what budget money they are going to allocate to a department. And then, if you’re talking to a donor and you have students tell their individual stories about how this program helped them, and you can support that with quantitative data, then that meets your goal.
It all goes back to the audience and purpose. And I’m big on the fact that it is not just quantitative and qualitative. It’s all mixed in together. So I have a hard time if people are strictly this or strictly that.
Jacob: When you collect data, there are so many things you could say, and it is hard, but I agree that it’s important to know what message we need to give to make things better. Sometimes there are surprises that you find in it. Especially because I’m a qualitative researcher, I allow things to emerge, and there’s always a surprise to find in data. Sometimes, that surprise ends up being something that is an add-on or a side story that I find important to tell.
Walizer: I agree with all the things that they said. My other closing tidbit would be something I always tell young professionals or anyone starting a new position. You have to know what kind of data your boss wants to see. That is oversimplifying, yes. But you need to know very early on, and you need to intentionally ask, what types of data and information does your boss need to make a decision, so that you can be strategic in how you present it.
Roberts: Definitely.
Jacob: That’s a good old managing-up principle right there.
Walizer: I have taken to literally just asking that question, and I really encourage others to do that as well. Or even if it is just someone asking me to present a report, I ask them what kinds of things are helpful. In this conversation and in our chapters, we talk a lot about recognizing different audiences, but one of the things I don’t think we explicitly state is, “Just ask.”
James A. Baumann is the editor of Talking Stick and the ACUHO-I Publications Director. He also works with the Future of the Profession volunteer team focused on communicating the value of campus housing.