Didn’t Get The Job? A Computer May Be To Blame

Robots continue to invade the workplace. But these bots aren’t going to sift through mountains of paperwork or fetch a pair of scissors for you. On the contrary, it is you who will have to answer to them.

There’s a fast-growing trend in the corporate world to replace human bias with algorithmic precision during the hiring process. In addition to an interview – or even during pre-interview screening – hopeful applicants complete questionnaires. But unlike similar questionnaires of the past, it’s the computer that looks at the answers and decides whether or not the applicant has what the company is looking for.

Companies like Xerox. In looking for people to staff their call centers, Xerox used to put an emphasis on hiring people with call center experience. But then they analyzed the data. It turned out that the characteristic that best predicted a quality hire was not experience but creativity.

A large part of the testing is personality-based. For instance, the Xerox software asked applicants to choose between statements such as: “I ask more questions than most people do,” or “People tend to trust what I say.” With enough data, statistical analysis reveals which answers are the best predictors that an applicant will be a good employee – or not.

Some of the questions attempt to gauge the applicant’s feelings toward alcohol or how tolerant they would be of a long commute. According to Evolv Inc., the San Francisco start-up that runs the so-called talent management software for Xerox, the ideal applicant is creative but not very inquisitive or empathetic, lives close to the job, has reliable transportation, and is a member of one or more social networks. The software even sets a limit on how many social networks is too many: five. How many HR offices would have been so specific about an applicant’s social network limit? Only by rigorously crunching the numbers on hundreds of thousands of previous good and bad hires can such particularities surface.
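
The article doesn’t say how Evolv’s scoring works under the hood, but the general technique – fitting a statistical model to data on past hires and then using it to score new applicants – can be sketched in a few lines. Everything below (the feature names, the toy numbers, the choice of logistic regression) is an illustrative assumption, not a description of Evolv’s actual software.

    # Toy sketch of data-driven applicant scoring -- not Evolv's actual model.
    # Assumes questionnaire answers have been encoded as numeric features and
    # each past hire has been labeled a good (1) or bad (0) employee.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Hypothetical features per applicant:
    # [creativity score, inquisitiveness score, commute miles, social networks]
    past_applicants = np.array([
        [0.9, 0.2,  3, 2],
        [0.4, 0.8, 25, 6],
        [0.8, 0.3,  5, 1],
        [0.3, 0.7, 30, 7],
    ])
    was_good_hire = np.array([1, 0, 1, 0])  # outcome labels from HR records

    # Fit a simple classifier to the historical data.
    model = LogisticRegression()
    model.fit(past_applicants, was_good_hire)

    # Score a new applicant's questionnaire the same way.
    new_applicant = np.array([[0.7, 0.4, 4, 3]])
    print(model.predict_proba(new_applicant)[0, 1])  # estimated P(good hire)

With hundreds of thousands of labeled hires instead of four, this kind of model is what lets oddly specific thresholds – like “more than five social networks” – fall out of the data rather than out of an HR manager’s intuition.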

But do the algorithms really work, or are they simply a way for companies to automate bad hiring practices? According to research, they do. Companies using the software have seen performance go up while attrition rates and disability claims go down.

As an applicant, you can try to game the system, but that can be difficult. Companies know that applicants will try to tell them what they want to hear, so they craft their questions in ways that make it difficult to determine the “obvious” answer. It’s an intelligent approach, based not on what some HR person thinks is a tough question but on data about how applicants in the past have answered.

If company spending is any indication, it seems as though the predictive power of the software is paying off. Global spending on talent-management software rose from $3.3 billion in 2010 to $3.8 billion in 2011, an increase of 15 percent. Some big tech companies are taking notice of the trend and making sure they get a piece of the automated hiring pie. Tech giant Oracle acquired Taleo in February, and IBM acquired Kenexa in August. Both Taleo and Kenexa specialize in employee assessment technologies.

Related to these technologies is industrial-organizational (I/O) psychology, a branch of psychology that applies psychological principles to whole organizations. It makes sense: the same things that make for a healthy person, like physical and mental well-being, make for a healthy workplace. But since I/O psychology ultimately focuses on the bottom line, it tends to be practical-minded. Instead of resolving sibling rivalries, the I/O psychologist will pair an employee with the job that best fits their communication skills, attention to detail, leadership skills, etc. The principle behind talent management software is the same: pairing the best individual with the job. The difference is that the software is fine-tuned with data – pairing, in the Xerox case, “dependable call center employee” with “creativity” without any a priori notions about what defines a dependable call center employee.

With the unemployment rate the way it is, the software’s efficiency is no doubt loathsome to the vast majority who don’t pass the test. Employers, being human (at least for now), are hostage to human biases. Confidence, thinking on your feet, an easy smile and a firm handshake are all admirable qualities on the surface. But hiring based on gut feelings about a person someone has just met doesn’t always work. In fact, experts argue that intuition has proven a poor hiring tool. Compared to the software, conventional hiring methods result in a markedly varied group of hires and aren’t very good at predicting which applicants will become quality, long-term employees. To believe this, we need only look at the large number of studies showing that an applicant’s perceived qualifications are powerfully affected by whether their name is John or Jane.

At the same time that companies are touting the hard logic of talent management software, disgruntled non-hires are calling a very human foul: bias. In one case, a woman who is speech- and hearing-impaired faced this question: “Describe the hardest time you’ve had understanding what someone was talking about.” Scoring 40 percent overall, the software concluded she was less likely than other applicants to “listen carefully, understand and remember.” It also suggested that her interviewer be attentive to “correct language” and “clear enunciation.” The woman has filed a complaint with the Equal Employment Opportunity Commission alleging she was discriminated against because of her disability.

There’s an inherent risk in using large data sets to select for the ‘ideal applicant.’ At some point it can begin, inadvertently, to select against protected groups, people with certain characteristics that employers cannot, by law, consider when hiring. Some of the most visible of these characteristics are age, gender, pregnancy, race and color. The above example is a case where the software unknowingly considered the applicant’s disability – and counted it against her. And while she took her complaint to the EEOC, many people do not, largely because they never see their assessment scores. In 2011 the EEOC received a total of almost 100,000 complaints. Of these, only 164 were related to talent management software.

From farm to factory to food-making, robots are increasingly working alongside us or replacing us altogether. Standardizing job applicant evaluations seems only too obvious, especially since companies will do whatever they can to trim the fat, to increase productivity and profits. But as with any new technology, there are going to be some growing pains as talent management software use increases. Employers will have to be watchful for any unintended consequences of applying an algorithm to the future health of their company, and for the repercussions of passing human bias on to unwitting software. Maybe they’ll come up with another program for that too.

Discussion — 14 Responses

  • lcalmus October 21, 2012 on 1:10 pm

    The software will certainly find the “best fit” for the current corporation. But what about the future? What about when work environments change? It would seem that the way to use the software is as just one of a number of gatekeepers. In a sense, all software of this type is not only fighting “the previous war,” it is invested in continuing the status quo.

    • Tracy R. Atkins lcalmus October 22, 2012 on 6:11 am

      The nature of employment is shifting towards a “W-9” style economy in many sectors. I can see a day when the majority of the low- and middle-income workforce consists of contract workers or people who work for an employment firm that manages all aspects of HR. In this environment, there is no real longevity, and employment will shift with the business environment for many people. A bit of the “boom town” lifestyle.

      Systems like this will likely make it easy to quickly assemble teams and workgroups that will work well together in the short term.

  • why06 October 21, 2012 on 1:14 pm

    Even Target & Wal-mart do this.

  • Tracy R. Atkins October 22, 2012 on 6:17 am

    This has some interesting repercussions for the future. Already, certain personality types are pretty much disqualified from certain lines of work in some corporations. I can see a consequence where profiling and preference will lead to entire lines of work being barred to people based on a psychological profile. Can you imagine an environment where only passive people can work in office jobs, passive-aggressive types in middle management and “alpha” types at the top, all by design?
    How about glass ceilings based upon a profile?

    It’s not a far stretch to take employment profiling to the genetic level, à la Gattaca.

    So, what happens when someone who has continually been denied work wants to “change”? Will there be personality adjustment training to assist those who want to move up?

    • Gauss156 Tracy R. Atkins October 23, 2012 on 4:23 pm

      This is congruent with where other matters of life are heading, i.e. tons of surveillance, overriding of privacy, basing decisions on graph-theoretic models and Bayesian stats on large data sets. Consider a similar tactic, i.e. what intelligence agencies are doing with mobile feeds and all of this “sentiment analysis” and such. They’re so obsessed with “predicting riots”, they haven’t stopped to think about the legal consequences of what it is they’re pushing.

      You’ve got a similar situation here: these corporations have become so obsessed with getting what they think to be the exact, perfect candidate, they’ve overridden so many of the crucial elements that go into the hiring process.

      I guess what I’m trying to say is that more and more these days I see people being reduced to data sets, and we all know that’s an incredibly dangerous place to go. It’s pervasive in government and intelligence, and now it’s hit the corporate hiring process. Can’t say I’m surprised.

  • Bill October 22, 2012 on 7:10 am

    If companies use these methods, they should be required by law to provide the applicant a copy of the results upon request so that they can review it for any instances of discrimination, be it based on gender, age, disability, orientation, ethnicity or race. Human discrimination on these criteria is already rife enough without it being encoded in an HR algorithm.

  • Jason Doran October 25, 2012 on 2:10 pm

    Relying on these backwards-looking technologies is an inherently flawed long-term strategy since it will be self-perpetuating and not allow for the discovery of new best candidate profiles.

    • Neurosys Jason Doran October 28, 2012 on 1:23 pm

      I agree. While these applications may be sufficient to staff the monkey cubicles, companies will lose a lot of quirky talent that in days past has often been responsible for surges in creativity and innovation. Such talent is not required for the sales floor or the cold-call section – they can read and show up on time, good job.

      But consider using these same applications to test software designers and network admins. Haha, good luck with your company – you have 0 employees.

  • turtles_allthewaydown October 26, 2012 on 8:04 am

    I don’t know if I’d blame “computers” for not getting a job. Once again, this is just a way for people to automate a process, and they’re using data that statisticians have pored over and programmers have coded into the computer. The computer is not AI and it’s not making the decisions. Not yet.

    Basically they’re outsourcing HR, and taking some of the interview process off of the people doing the hiring. Any good company is still going to talk one-on-one with the candidates, and with social networking they’re going to talk to people they know who’ve worked with the person in the past. That’s much easier to do now and changes the game every bit as much as this personality screening test does.

  • Greendogo November 2, 2012 on 8:27 pm

    I hate these tests, it’s getting impossible to trick people into giving you a face-to-face interview!

    How are the sociopaths and socially inept ever supposed to get a job now?