Candidates expect to go through background screening when taking on a new role, but they also expect it to be a smooth, obstacle-free process. “A positive candidate screening and onboarding experience is more important today than ever before, with a demand for hiring employees as quickly as possible, so you don’t lose them to competition,” says Lasky. “It is truly one of the first impressions a candidate will have of the organization, affecting in some way the decision-making of accepting a position. State-of-the-art technology, excellent customer service, and seamless compliance procedures are at the foundation of creating a positive experience.”
And good processes lead to good performers. In fact, a LinkedIn study finds that organizations with a strong candidate experience improve the quality of their new hires by 70%.
In today’s tech-enabled world, what factors make for a positive candidate experience? Lasky and Standerwick share several key factors to consider.
AI tools have been hailed as efficient timesavers for HR leaders, but when it comes to background screening, it is smarter to err on the side of caution. “Artificial intelligence is under great scrutiny right now,” explains Standerwick. “Algorithms can create pitfalls for employers depending upon what decisions are being automated utilizing AI.”
In fact, in January, the U.S. Equal Employment Opportunity Commission (EEOC) released its “Strategic Enforcement Plan (SEP) for 2023-2027” that addressed the increased use of AI by organizations in hiring and background screening. The SEP will examine “screening tools or requirements that disproportionately impact workers based on their protected status, including those facilitated by artificial intelligence or other automated systems, pre-employment tests, and background checks.”
Standerwick says some of the information AI tools find and use is not as current or accurate as the Fair Credit Reporting Act and state laws require. “If the underlying data is not suitable for the purpose of being used in a hiring decision, the employer may be buying a deficient bill of goods. These are attractive because they are often cheap and fast, but may fall woefully short from an accuracy perspective,” she says.
Lasky agrees. “Organizations need to be sensitive to ‘junk data’ that can be returned with automation and AI tools, which can lead to false-positive or potentially discriminatory information being reported out. The issue with automation and technology is that it still is not foolproof and can lead to false-positive information if not manually vetted before reporting out, which can increase the risk of class-action litigation and high-dollar damages.”
Lasky points to an advisory opinion the Consumer Financial Protection Bureau issued at the end of last year stating that the accuracy of consumer reports is critical and that consumer reporting agencies need to establish “reasonable internal controls” to prevent false data from being reported. “The government has made it clear that reasonable procedures by screening firms and their clients need to be implemented to assure maximum possible accuracy,” he says.