When Good Algorithms Go Bad: Implicit Bias by the Numbers

Algorithms already pervade the hiring process. If these systems are to remain in use, both the vendors who build them and the companies that subscribe to them must test, and keep retesting, to ensure the software is not picking up hidden signals that reflect society's pre-existing biases: word choice, neighborhood, zip code, volunteer activities, names, or any of the many other seemingly neutral factors that serve as surrogates for gender, race, and other protected characteristics.
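One simple form such a test can take is checking whether a "neutral" input feature is statistically correlated with a protected attribute in the applicant pool. The sketch below is illustrative only: the data, the feature names, and the warning threshold are all hypothetical, and a real audit would use far richer methods than a single correlation.

```python
# Minimal sketch: does a seemingly neutral feature (here, a zip-code
# indicator) act as a proxy for a protected attribute? All data below
# is hypothetical, and the 0.5 threshold is an arbitrary illustration.

def phi_coefficient(xs, ys):
    """Phi (Pearson) correlation between two binary 0/1 sequences."""
    n = len(xs)
    n11 = sum(1 for x, y in zip(xs, ys) if x and y)
    n10 = sum(1 for x, y in zip(xs, ys) if x and not y)
    n01 = sum(1 for x, y in zip(xs, ys) if not x and y)
    n00 = n - n11 - n10 - n01
    denom = ((n11 + n10) * (n01 + n00) * (n11 + n01) * (n10 + n00)) ** 0.5
    return (n11 * n00 - n10 * n01) / denom if denom else 0.0

# Hypothetical applicant records.
in_zip_a   = [1, 1, 1, 0, 0, 0, 1, 0]   # "neutral" feature fed to the model
in_group_x = [1, 1, 0, 0, 0, 0, 1, 0]   # protected attribute

corr = phi_coefficient(in_zip_a, in_group_x)
if abs(corr) > 0.5:
    print(f"warning: feature may proxy a protected attribute (phi={corr:.2f})")
```

In this toy data the zip-code indicator correlates strongly with group membership, so the check flags it; a model trained on that feature could discriminate without ever seeing the protected attribute directly.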