by Craig Seager and Thomas Bruick
College and university campuses have made a number of dramatic changes in response to the COVID-19 pandemic. In some cases, as students moved into residence halls and classes resumed, many processes were reinvented almost from scratch. One of those changes has been how resident assistant candidates are vetted, particularly within group processes. And while a previous article illustrated how those traditionally face-to-face experiences can be transitioned to online platforms, it is important to remember that, as the exercise changes, so should the means by which candidates are evaluated.
Group process activities and similar exercises are intentionally crafted to support the evaluation of specific skills and characteristics such as communication, critical thinking, the ability to work with others, and creativity. While the evaluation of these so-called soft skills can be difficult even when done as designed, it becomes even more so when modified for a new platform. Cate Morrison, who performs community engagement and outreach for eRezLife Software, used to help manage the housing department at the University of British Columbia in Vancouver, Canada. There she oversaw 4,500 residents and supervised five professional staff, 16 paraprofessional staff, and approximately 200 student staff. In this role she was part of a group interview process that included more than 600 candidates. In determining what is most important to assess during an application process, she suggests starting at the end and working backwards. “Look at the applications, interviews, and group processes of your highest-performing RAs,” she says. “What stands out? Are there any trends or characteristics that are consistently present in this group? Next, complete the same assessment of your low-performing staff. At the application and interview stage, what red flags were present in this low-performing group? This process will likely indicate the attributes and characteristics that are present in your selection process that predict a successful candidate.”
Beyond simply determining which qualities to assess, those who evaluate the candidates must be provided with clear expectations of which skills and characteristics to consider within each activity. This can be a challenge given limited staff and training time as well as the relative subjectivity that comes with evaluating such skills. These limitations can hinder best practices such as rubric development, training evaluators on using the rubrics, and testing the rubrics through techniques such as measuring inter-rater reliability while pilot-testing activities.
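For departments that do have the capacity to pilot-test their rubrics, the inter-rater reliability check mentioned above can be done with very little tooling. The sketch below computes Cohen's kappa, one common agreement statistic, for two evaluators who scored the same pilot candidates on a 0-3 rubric; the function and the sample scores are our own illustration, not part of any institution's actual process.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: agreement between two raters, corrected for chance."""
    assert len(rater_a) == len(rater_b), "raters must score the same candidates"
    n = len(rater_a)
    # Observed agreement: fraction of candidates both raters scored identically
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement if each rater assigned scores at their own base rates
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[s] * counts_b[s] for s in counts_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical rubric scores (0-3) from two evaluators for eight candidates
evaluator_one = [3, 2, 2, 1, 0, 3, 2, 1]
evaluator_two = [3, 2, 1, 1, 0, 3, 2, 2]
print(round(cohens_kappa(evaluator_one, evaluator_two), 2))  # prints 0.65
```

A kappa near 1.0 suggests evaluators are applying the rubric consistently, while a low value signals that the rubric language or the evaluator training needs another pass before the live process.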
It is important that each step in a selection process has carefully constructed evaluation rubrics that focus on specific attributes of an ideal RA candidate, as indicated by that qualitative review. The rubrics ensure that all desired attributes are evaluated as consistently as possible. As an example, in the newly developed virtual RA selection process at the University of Central Arkansas (UCA) in Conway, the first two steps in the model do not use rubrics, instead basing the evaluation of commitment and responsibility on grade point average and conduct violations. All the other steps in the model, though, utilize rubrics to evaluate specific attributes. All assignments in the RA class, which all RA candidates are required to take, are tailored to evaluate candidates across six main categories – demeanor, developmental skills, universal competencies, commitment, cognitive skills, and interpersonal skills – as well as their technology skills.
A critical piece to ensure the success of this model is the training of facilitators and evaluators. “Candidate assessment is difficult in any selection process, but it’s especially cumbersome when you consider the breadth and depth of expectations placed on an RA,” says Morrison. “When time and resources are limited, conversations around transparency and trust in a selection process are often the first to go. It’s important to determine if the staff trusts each other’s opinions and the staff can be trusted to make clear, unbiased assessments.” Morrison adds that it is important for the staff to look for the same qualities in the candidates. “The institution may have documented qualities that they’re looking for, but even if just one staff member veers from that expectation, the processes are compromised. These conversations about trust are fundamental to the effectiveness of the process.”
At UCA, facilitators and evaluators are full-time staff as well as graduate assistants and some returning RAs. Every step of the process and its desired outcomes are explained in detail. Everyone involved has a well-defined role and understands exactly what is needed from them in their specific capacity. Individuals are trained on utilizing the rubrics and given various scenarios to ensure that they are evaluating candidates consistently. Most importantly, a mock run-through of the process is conducted to iron out any logistical wrinkles, whether in the physical or virtual world. Lastly, evaluators and facilitators complete a short assessment about the process to help fine-tune any pieces for future implementations.
Moving processes online has provided an opportunity for additional assessment of candidate abilities, specifically their technological proficiency. Many assume that today’s students arrive on campus with a certain level of comfort with technology, but research has shown that, while students are generally skilled in social and individual technology, that does not necessarily translate to proficiency in other technology areas. Viswanath Venkatesh of the Walton College of Business at the University of Arkansas has been a leading scholar of what he has dubbed the Unified Theory of Acceptance and Use of Technology (UTAUT). Developed through an empirical comparison of eight existing models, UTAUT identifies four core constructs that influence user acceptance and usage of technology: performance expectancy, effort expectancy, social influence, and facilitating conditions. Additionally, gender, age, experience, and voluntariness of use were identified as key moderators. Performance expectancy was defined by Venkatesh as the extent to which an “individual believes that using the system will help [them] attain gains in job performance.” As technology integration was already an integral aspect of housing and residence life operations before the pandemic, and many technological needs will remain after it subsides, it is critical that professionals are prepared to create an optimal path forward.
PLAN B
As campuses reimagine how candidates apply for positions, and as professional staff split their time between home and the office, many are shifting to a paperless process where materials are accessible online from any location. Such a system will contain all the applications, scoresheets, notes, references, and hiring decisions as well as demographic and selection data in one place.
The move online has also shone a light on different approaches to evaluating candidates. Cate Morrison suggests the following processes that can provide insight while respecting social distancing boundaries.
Open Application: Ask candidates to provide a written argument to tout their qualifications. Another option is to ask candidates to submit a two-minute video describing why they want to be an RA or how their experience has set them up for success in the position.
Coffee Chat: In cases where an institution typically uses the group interview process to involve current RAs in the selection process, consider scheduling a one-on-one online conversation between the RA candidate and a current RA. The most effective assessment often comes when the format is left open-ended so the candidate is able to showcase their ability to drive a conversation.
Video Interview: Interviews in person and online can be nerve-wracking. To make the process less stressful for candidates, provide the questions in advance and even allow them to record and upload their answers. This allows candidates to be confident that they are showcasing their best self and provides evaluators the opportunity to rewatch as needed.
Candidate Reflection: If an online interview is conducted, ask the candidate to reflect on their experience after the fact. This unique perspective will provide insight about the candidate as well as showcase their commitment to the role.
Due to the virtual nature of this updated application model, as well as the housing department’s new expectation that RAs be able to engage with residents virtually, technology proficiency was included as an evaluated criterion in the virtual process. In simple terms, if a candidate is not able to complete an activity due to their limited proficiency with a technology system or tool, this would affect their candidacy.
But what does technological proficiency look like? Again, a rubric and established standards are valuable. Throughout the UCA group processes, candidates were scored as follows:
Zero points: The candidate could not locate or utilize the specific program for assigned tasks or could not retrieve appropriate files.
One point: The candidate demonstrated the ability to open the required program but could not retrieve appropriate files or did not complete tasks appropriately.
Two points: The candidate demonstrated the ability to locate and open the required program and utilized the program to complete assigned tasks.
Three points: The candidate demonstrated the ability to locate and open the required program and utilized unique program functions to complete the assigned tasks.
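For departments scoring many candidates, a rubric this concrete can even be encoded so that evaluators simply record what they observed and the score follows mechanically. The sketch below is a minimal illustration of UCA's four-level technology rubric; the function name and boolean inputs are our own invention, not part of UCA's actual tooling.

```python
def tech_proficiency_score(opened_program, retrieved_files,
                           completed_tasks, used_unique_functions):
    """Map an evaluator's observations onto the 0-3 technology rubric.

    Each argument is a yes/no observation recorded during the activity.
    """
    if not opened_program:
        return 0  # could not locate or utilize the required program
    if not (retrieved_files and completed_tasks):
        return 1  # opened the program but could not retrieve files or finish tasks
    if used_unique_functions:
        return 3  # completed tasks using the program's unique functions
    return 2      # located, opened, and used the program to complete tasks
```

Encoding the levels this way also doubles as a training aid: evaluators can be walked through scenarios and shown which observation flips a candidate from one score to the next.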
The move to online processes, necessitated by the pandemic, has required a good deal of innovation among housing programs and has also exposed potential weaknesses in processes. As departments rethink the how, what, and when of their work, it’s also important to not forget the why. By clearly establishing guidelines and metrics for assessment and evaluation, they can help make sure that their work adheres to their overall mission.
Craig Seager, Ph.D., is the associate director for housing and residence life at the University of Central Arkansas. Thomas Bruick, Ph.D., is an assistant professor for the University of Central Arkansas College Student Personnel Administration program and previously served as an assistant director for housing and residence life.