by Camille Perlman
Because residence life and housing professionals are often considered the campus experts on relationships and community, one might think they’d also be the most attuned to avoiding bias and divisiveness. But is that the case? “I think we take it (the ability to relate to people without bias) for granted in our field. We say we are good at recruiting, we’re good people, [but], no, we need just as much work to confront our bias as anybody else,” says Shana Alston, director of housing at Temple University in Philadelphia, Pennsylvania. “Res life does it [hiring] better, but I don’t know if I’ve seen any improvement as far as bias-related screening and practices.” April Barnes, executive director of housing and residence life at the University of South Carolina-Columbia, agrees that good intentions are not enough: “We are relational, right, and we want people to be relational, but how can you put that into words, and what can you do to recognize [bias]?”
The truth of the matter is that everyone has some degree of bias. Whether they wear it on their sleeve or it lies buried in their subconscious, it exists. That doesn’t mean it must have a negative impact on one’s behavior or choices, but before any positive change can be made, the bias must be uncovered and explored. This is particularly true in the hiring process. Consider the nature of that process and why it is so susceptible to the influence of bias: when hiring new staff, a group of people (members of a department or a placement committee, for example) is asked to evaluate individuals based on the limited information supplied by a résumé, a cover letter, and perhaps some references. Even as they conduct the interview, the interviewers are not only registering the candidate’s responses to questions but also observing their accent, dress, mannerisms, and countless other elements – and all of that information passes through the filter of their bias.
Daniel Gonzalez, coordinator of residential life at the University of Minnesota-Twin Cities in Minneapolis, has presented on the subject of bias a number of times. He often cites the definition of bias as expressed by the ADVANCE Center for Institutional Change at the University of Washington in Seattle: Unexamined bias “is a form of stereotyping that is often unintentional, automatic, and outside of our awareness. Often contradictory to our conscious beliefs. Also called subtle or implicit bias. Framing it specifically as ‘unexamined’ puts onus for change on the person who harbors or acts on bias, holding them accountable.” Confirmation bias is similar to implicit bias and, as defined by the Oxford English Dictionary, involves “the tendency to interpret new evidence as confirmation of one’s existing beliefs or theories.” In fact, it is the “core cause for lack of diversity,” according to Bobby Schultz, co-founder and CEO of the budgeting application Fiskal: “It starts with how candidates are screened and persists through the entire hiring process. The simplest explanation for Confirmation Bias is ‘if a candidate is similar to me, they must be good’” (UX Collective, April 23, 2017). Alston, for one, understands how pervasive this kind of bias can be in hiring decisions. “That confirmation bias hit it right on the head. I appreciate having that language now. We have a super-structured process, we have rubrics, there’s a certain committee that does application evaluations, and then it’s very specific how our [interview] rubrics work – it’s all competency-based – but still people will write anything in the overall comments.”
The Science Behind Bias
Project Implicit was founded in 1998 by three scientists – Tony Greenwald at the University of Washington, Mahzarin Banaji at Harvard University, and Brian Nosek at the University of Virginia – who were interested in researching the unconscious thoughts behind attitudes, stereotypes, and bias. They developed tools for measuring these, such as the Implicit Association Test (IAT), which measures the strength of associations between concepts. Today, Bethany Teachman at the University of Virginia and Matt Nock at Harvard University continue the research under a new project name, Project Implicit Health. According to its website, the project’s mission is to “educate the public about bias and to provide a ‘virtual laboratory’ for collecting data on the internet.” For those interested in exploring their own bias more deeply, more information about the IAT is available online. Project Implicit Health also offers consulting and educational services to help human resources staff identify bias, mitigate it, and learn the science behind it.
Staff may look around at their coworkers and assume that this is a fair and unbiased team that will act respectfully and fairly in the hiring process. Unfortunately, studies show that our subconscious tendencies invite bias to color our judgement. One randomized double-blind study, “Science Faculty’s Subtle Gender Biases Favor Male Students” (PNAS, October 9, 2012), revealed that science faculty rated male applicants as considerably more competent for a lab manager position than otherwise identical female applicants and even selected a higher starting salary for the men. A study in the American Journal of Sociology, “Pride and Prejudice: Employment Discrimination against Openly Gay Men in the United States,” revealed significant discrimination against gay applicants in geographical regions where the LGBTQ community is less supported; discrimination against openly gay men was strongest among employers who emphasized the importance of stereotypically male heterosexual traits. These research examples illustrate the importance of recognizing bias. As Gonzalez notes, “It’s important to raise awareness of bias and put policies and procedures in place to break the link between bias and behavior.” Furthermore, there is both an individual and an organizational responsibility to bring awareness to biases and actively challenge them.
It’s important to recognize and understand the pitfalls associated with bias so that, when they occur, changes can be made in the work environment and in specific processes. Again, this is particularly important to address within the hiring process, since the selection of new staff or the advancement of existing ones will likely affect the direction and performance of a department for years to come. Bias takes many forms throughout a hiring process. One of the most common negative outcomes from implicit bias in hiring is referred to as cloning. “Cloning happens when one person gives preference to an applicant because they remind them of someone or they have a similar background to someone in the department,” Gonzalez explains, offering an example. “Jane is a good person, and this candidate reminds me of her, so this candidate is as good as she is.” This resonates strongly with Barnes. “I think the cloning piece happens all the time,” she says.
Bias can lead to important decisions being made through snap judgements, which are “quick decisions about an applicant that are negative or positive. They are made without sufficient evidence to back [the decision] up,” Gonzalez explains. For example, he notes, a snap judgement might be made about him because of the school flag visible in his background during online meetings. These snap judgements often inform the negative or positive stereotypes that can be assigned to a candidate. As Gonzalez notes, “There’s plenty of scholarship out there that talks about how marginalized candidates are many times unfairly scrutinized with a presumption of incompetence in their work, and they also can be seen as having a quote/unquote agenda that others don’t have.”
Bias can also lead to raising-the-bar or elitist behavior, in which committee members hold candidates to different expectations, expecting those with different identities to answer questions differently. As Gonzalez advises, “You need to pull yourself away from this behavior. The committee members need to help each other and call attention to this and make sure it doesn’t continue to happen.” The last pitfall he highlights is wishful thinking, which occurs “when committee members have an assumption that racism, sexism, ableism, or other forms of bias no longer exist.” For example, he says, “When someone says, ‘I don’t see race,’ that’s when I would say back to the committee member, I see more training that needs to happen.”
Most would say that all of these inclinations – bias, cloning, snap judgements, stereotypes, and elitism – are detrimental to the hiring process. In many ways, though, they can all lead to a determination that, for many, is still considered an acceptable outcome: the discussion about whether or not a candidate is a good fit for an organization. The professionals interviewed here share a deep frustration with the idea of fit. “This is my biggest pet peeve in the search process,” says Gonzalez. “Sometimes the word fit is coded language for either how comfortable that committee member would feel about that person or how comfortable that committee member thinks other people would feel about that candidate. And I ask you to think about this: If that candidate would be the only or one of the only persons of color, or the only queer-identified person, or the only marginalized identity, how likely is their fit going to be questioned in that process?”
Learning what implicit bias is and understanding how it can influence decisions has value beyond the hiring process. Just because a round of interviews is over doesn’t mean that awareness of bias and knowledge of how to mitigate it can go on the back burner. It’s also important to think about what resources supervisors and their staff need to reflect on their own bias. Hopefully, the information provided here gives readers a good place to start the work of self-reflection and to plant seeds of change in the overall structure of the hiring process.
Camille Perlman is the managing editor of Talking Stick.