Innovations in the background check industry are transforming the employment landscape for everyone. We’re seeing a growing use of AI and a rise in companies performing recurrent background checks on existing employees. These checks take a deeper look at employees’ personal lives, even raising sensitive privacy issues in some cases.
Predictive technology is revealing more and more about our lives. Companies like CheckPeople and Checkr are applying AI algorithms to limit employment bias by classifying relevant data more effectively.
One controversial tool, Predictim, was marketed to parents looking for a reliable nanny for their kids. The company claimed it could identify candidates with questionable behavior, poor attitudes, and drug use, as well as “disrespectful” care providers, by processing data from criminal records and candidates’ social media accounts. The logical question: how would the company reduce bias?
The company’s CEO went on record saying they had spent a year and a half training their algorithm and product to make sure it was neither biased nor unethical. They added a review process with human intervention and removed gender, sex, race, and other protected classes from the training set. Their model was audited continuously as well.
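The idea of dropping protected-class fields before training can be sketched in a few lines. This is only an illustration of the general technique, not Predictim’s actual pipeline; the field names and records are invented for the example.

```python
# Hypothetical sketch: strip protected-class fields from training records
# before model fitting. Field names are illustrative, not a real schema.
PROTECTED_FIELDS = {"gender", "sex", "race", "religion", "national_origin"}

def strip_protected(records):
    """Return copies of each record with protected-class fields removed."""
    return [
        {k: v for k, v in rec.items() if k not in PROTECTED_FIELDS}
        for rec in records
    ]

# Invented example records for two candidates.
candidates = [
    {"name": "A. Smith", "gender": "F", "posts_flagged": 2},
    {"name": "B. Jones", "gender": "M", "posts_flagged": 0},
]

cleaned = strip_protected(candidates)
```

Note that dropping protected attributes alone does not eliminate bias, since other fields can act as proxies for them, which is presumably why the company paired this step with human review and continuous audits.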
The people finder Pipl doesn’t require more than a name to perform a search, though the accuracy of results is directly proportional to the amount of information provided. Pipl attempts to profile targets as if it knows them. Is that good or bad? It depends on who you ask. When the results are accurate, you get a very in-depth look at a search target. When they aren’t, you may be badly misled about a person.
The tool can confuse people who share a username, for example. Generally, it’s possible to detect inaccurate or false information as long as you make an effort to spot discrepancies.
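Spotting those discrepancies amounts to comparing field values across the aggregated source records and flagging any field where the sources disagree. A minimal sketch, with invented data (this is not Pipl’s actual matching logic):

```python
from collections import defaultdict

def find_discrepancies(records):
    """Collect each field's values across source records and
    return the fields where the sources conflict."""
    values = defaultdict(set)
    for rec in records:
        for field, val in rec.items():
            values[field].add(val)
    return {f: vals for f, vals in values.items() if len(vals) > 1}

# Two hypothetical records aggregated under the same username "jdoe".
sources = [
    {"username": "jdoe", "city": "Austin", "employer": "Acme"},
    {"username": "jdoe", "city": "Boston", "employer": "Acme"},
]

conflicts = find_discrepancies(sources)
```

Here the conflicting `city` field is a hint that two different people may be hiding behind the same username, exactly the kind of mix-up described above.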
When prospective or current employers get people’s financial data or information about weapons permits, things get concerning. Alerts for major purchases, notices of bankruptcies…it’s all fair game. Your employer can gather thousands of data points on you by running a background check. Will they know that you bought a luxury vehicle or registered a new firearm? Yes. Do they need to?
The company Endera establishes a baseline of employee behavior and then identifies any pressures employees may be under that could threaten the company or their work. Endera admits that information collected in the workplace can be biased just as easily as it can be helpful, and states that one cannot rely solely on internal data to determine insider risk: warning signs might be ignored because they are “personal” in nature, and supervisory intervention is tainted by human error.
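The baseline-then-deviation approach can be illustrated with a simple statistical sketch: record a normal range for some behavioral metric, then flag observations that fall far outside it. The metric and threshold here are assumptions for illustration, not Endera’s actual method.

```python
from statistics import mean, stdev

def flags_anomaly(baseline, observation, threshold=3.0):
    """Flag an observation that deviates from the baseline mean
    by more than `threshold` standard deviations (a z-score test)."""
    mu, sigma = mean(baseline), stdev(baseline)
    if sigma == 0:
        return observation != mu
    return abs(observation - mu) / sigma > threshold

# Hypothetical metric: weekly after-hours system logins.
baseline_weeks = [2, 3, 1, 2, 4, 3, 2, 3]

flags_anomaly(baseline_weeks, 3)   # within the normal range
flags_anomaly(baseline_weeks, 25)  # far outside the baseline
```

A real system would weigh many signals at once, but the core idea is the same: deviation from an established pattern, not the behavior itself, is what raises the flag.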
Advanced Background Checks: Pros and Cons
Ongoing monitoring and continuous checking are two relatively recent developments in background checking software. Both are aimed at performing background checks on people who have already been hired. The days of manual, static background checks are long gone, and a new era of monitoring is here.
Checkr introduced its “Continuous Check” product to give Uber ongoing screenings it could use to terminate the contracts of workers who committed a violation. Initially, the product was designed to meet the requirements of the ridesharing industry. The background check service then began working with the insurer Allstate and HR giant Adecco, meaning these corporations get all that personal data as well.
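At its core, a continuous check is a scheduled diff: compare the latest snapshot of a worker’s record against the previous one and alert on anything new. A minimal sketch, with invented record entries (not Checkr’s actual API or data format):

```python
def new_violations(previous, latest):
    """Diff two snapshots of a worker's record and return the
    entries that appeared since the last check."""
    return [entry for entry in latest if entry not in previous]

# Hypothetical snapshots from two scheduled checks.
jan_check = ["speeding_2019"]
jun_check = ["speeding_2019", "dui_2021"]

alerts = new_violations(jan_check, jun_check)
```

In a deployed system the diff would run on a schedule (or on notification from court-record feeds) and route each alert to the employer for review rather than triggering termination automatically.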
Opponents of this technology claim it will have a disproportionate impact on independent contractors and other gig-economy workers. The Equal Employment Opportunity Commission (EEOC), which restricts employers from using someone’s criminal record to make a hiring decision, protects formal company employees. Such use can violate Title VII of the Civil Rights Act.