How to Help Students Overcome AI Hiring Algorithms

NYU professor Hilke Schellmann's new book The Algorithm explores the ways in which AI is influencing the job market and what applicants can do about it. (Image credit: Jennifer Altman)

AI is taking over the job market. 

That’s the thesis of the new book The Algorithm: How AI Decides Who Gets Hired, Monitored, Promoted, and Fired, And Why We Need To Fight Back by Hilke Schellmann, an Emmy award-winning investigative reporter and contributor to The Wall Street Journal and The Guardian, as well as an assistant professor of journalism at New York University. 

In the book, Schellmann recounts how major companies are increasingly turning to AI to scan resumes, assess video interviews, and predict applicant or employee success through games that do not always have a clear connection to the job. 

This is all a problem, Schellmann tells Tech & Learning, because the decisions AI systems make can be arbitrary, unfair, and sometimes blatantly biased against certain types of applicants. Students preparing to enter the job market need to know how to navigate this evolving hiring landscape. 

Schellmann offers advice and techniques that you can use to help prepare your students to stand out to AI recruiters and overcome some of the problems associated with the growing AI jobs market. 

Overcoming AI Hiring Algorithms: A Time-Saving Tool for Companies Inundated by Applicants  

The cover of The Algorithm by Hilke Schellmann

(Image credit: Hilke Schellmann)

"Job platforms, like LinkedIn, Monster, ZipRecruiter, and Indeed, they really tried to democratize hiring and they have — a lot of us can apply to a lot of jobs,” Schellmann says. “That has led to a lot of companies getting a lot of applications for a role. We see large companies like Google, they get about 3 million or so applications a year.” 

And it’s not just tech giants such as Google dealing with this challenge. Anyone who has ever served on a hiring committee knows that any semi-desirable job will receive at least dozens of applications, many from candidates who seemingly meet none of the requirements for the role.  

“A lot of companies are just overwhelmed by the number of applications," Schellmann says. "So they're looking for a technological solution, and the vendors come in and say, ‘These tools are going to make hiring more efficient, they're going to save a lot of money, they will have no bias and they will pick the most qualified candidates.'” 

These AI screening tools do increase efficiency and save money, but the vendors’ other claims can be more dubious, Schellmann says. “Are they without bias? That’s certainly not true. And do they pick the most qualified candidates? We don't have a whole lot of evidence that is true.” 

Problems With AI Bias  

Bias can be built into AI algorithms in subtle and unintentional ways, with potentially devastating consequences. An employment lawyer Schellmann interviewed for the book found a resume screener that gave more points to applicants who mentioned baseball on their resume and fewer points to those who mentioned softball. 

“That's probably a case of gender discrimination,” Schellmann says, since you’d expect more men to list baseball on their resume and more women to list softball. What likely happened in that case is that the AI tool was trained on the resumes of past employees at the company. 

“There might have been historical bias at the company, they may have hired more men than women in the past,” she says. So the AI may have seen more mentions of baseball and inferred, wrongly, that those with baseball on their resume were better employees. 

As bad as bias in individuals is, AI bias has the potential to be much worse. “The problem with resume screeners, or with a lot of AI tools, is the scope is just unprecedented,” Schellmann says. “A human hiring manager can discriminate against only a certain number of people, and I'm really sorry for them, but an AI tool can discriminate against millions of people.” 

Overcoming AI Hiring Algorithms

To stand out to AI algorithms when you or your students are applying for work, Schellmann suggests: 

  • Keep your resume simple. Forget the old advice to make your resume visually stand out; the opposite is now true, as you want it in a format AI already understands. “Use short, crisp, clear sentences,” Schellmann says. Avoid images, multiple columns, and other layout flourishes. 
  • List skills separately. Many AI tools scan resumes for specific skills, so applicants should create a separate section that lists their skills clearly with bullet points. This helps ensure the algorithm will “ingest their skills correctly,” Schellmann says. 
  • Quantify wherever possible. “Anything that you can quantify, quantify, versus just describing it — this is really helpful for machines,” Schellmann says. 
  • Use technology to assess your resume before submitting it. Schellmann recommends a tool such as Jobscan, which lets job hunters input their resume and the description of the job they are seeking, then compare the two. “You always try to use the keywords that are on the job description, but make sure it's not 100%,” Schellmann says. Instead, aim for about a 60-80% overlap. “Because if it's 100% of the same words, some of the resume screeners will infer that this is just a copy of that job description,” she says. 
  • Reach out to a person. If you can learn the name of the person in charge of hiring for a position, it can be worthwhile to send them a short message on LinkedIn. This ensures a human will look at your resume, which matters given how unpredictable automated screeners can be. “It’s hard to understand how machines will treat your resume,” she says. 
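For readers who want to see the keyword-overlap idea concretely, here is a minimal sketch in Python. This is only an illustration of the general heuristic; it is not how Jobscan or any commercial screener actually computes its score, and the stopword list and sample texts are invented for the example.

```python
# Rough sketch of the keyword-overlap heuristic behind tools like Jobscan.
# Assumption: "overlap" means the fraction of job-description keywords
# that also appear in the resume. Real tools are far more sophisticated.
import re

# A tiny, illustrative stopword list (not exhaustive).
STOPWORDS = {"a", "an", "and", "the", "to", "of", "in", "for", "with", "on"}

def keywords(text: str) -> set:
    """Lowercase word tokens from the text, minus common stopwords."""
    return {w for w in re.findall(r"[a-z]+", text.lower()) if w not in STOPWORDS}

def overlap(resume: str, job_description: str) -> float:
    """Fraction of job-description keywords that also appear in the resume."""
    jd = keywords(job_description)
    return len(jd & keywords(resume)) / len(jd) if jd else 0.0

# Hypothetical sample texts for demonstration.
jd = "Data analyst with SQL and Python experience for reporting"
resume = "Analyst experienced in SQL reporting and Python dashboards"

score = overlap(resume, jd)
# Per Schellmann's advice, aim for roughly 60-80% overlap, not 100%.
print(f"{score:.0%}")
```

Running this on the sample texts yields an overlap of about 67%, inside the 60-80% band Schellmann suggests; a resume that copied the job description verbatim would score 100% and risk being flagged as a copy.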

The Algorithm: How AI Decides Who Gets Hired, Monitored, Promoted, and Fired, And Why We Need To Fight Back is available on Amazon.

Erik Ofgang

Erik Ofgang is a Tech & Learning contributor. A journalist, author and educator, his work has appeared in The New York Times, the Washington Post, the Smithsonian, The Atlantic, and Associated Press. He currently teaches at Western Connecticut State University’s MFA program. While a staff writer at Connecticut Magazine he won a Society of Professional Journalism Award for his education reporting. He is interested in how humans learn and how technology can make that more effective.