
SAP adds functionality targeting gender bias in job ads

SAP SuccessFactors Recruiting Management's job analyzer seeks to identify gender bias in job ads by checking words in context. The goal is a higher-quality applicant pool.

SAP is adding new functionality this quarter to its cloud-based HCM software that detects gender bias in job ads. The goal is to improve response rates from women to job postings. SAP doesn't yet have results showing gains in female applicants and is working with customers to gather that data.

Federal law prohibits job ads that show a specific hiring preference based on gender, race and other characteristics. But gender bias in job ads isn't necessarily blatant. A job ad that seeks someone capable of "moving the needle inside of our dominant enterprise business" or that uses tactical military language may act as a subtle signal to men. Seemingly simple words and expressions can take on a masculine context and deter women, SAP said.

The capability to identify gender bias is being embedded in the SAP SuccessFactors Recruiting Management job analyzer, and the roadmap calls for similar work with respect to sexual orientation and age. "Gender is really just where we are starting, but not where we are finishing," said Lyndal Hagar, a product manager of diversity and inclusion at SAP. The tool, which SAP says uses machine learning algorithms to detect gender bias in job ads, is a free upgrade for its customers.

SAP SuccessFactors Recruiting Management job analyzer
SAP SuccessFactors' job analyzer tool flags gender bias in job ads. The left chart shows words that score high as either masculine or feminine. The other chart shows how words can take on a gender bias in certain contexts.
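SAP has not published how its job analyzer scores words, but the idea of flagging gender-coded language can be illustrated with a simple lexicon-based sketch in the spirit of published gender-decoder research (the word lists below are illustrative examples, not SAP's):

```python
# Illustrative sketch only -- NOT SAP's actual algorithm. It counts
# words from small, hypothetical masculine- and feminine-coded lexicons
# and reports which way an ad leans.
MASCULINE = {"dominant", "competitive", "aggressive", "driven", "lead"}
FEMININE = {"collaborative", "supportive", "nurture", "together", "community"}

def gender_score(ad_text: str) -> dict:
    """Count gender-coded words in a job ad and report the net lean."""
    # Normalize: lowercase and strip surrounding punctuation.
    words = [w.strip(".,;:!?\"'()").lower() for w in ad_text.split()]
    m = sum(w in MASCULINE for w in words)
    f = sum(w in FEMININE for w in words)
    lean = "masculine" if m > f else "feminine" if f > m else "neutral"
    return {"masculine": m, "feminine": f, "lean": lean}

print(gender_score("A dominant, competitive self-starter to lead our team"))
# -> {'masculine': 3, 'feminine': 0, 'lean': 'masculine'}
```

A production tool would go further, as the charts suggest: weighting words by how strongly they are gender-coded and, crucially, scoring them in context rather than in isolation.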

Diversity is key to business success

The case for writing a better job ad is strong. In its "Women Matter" report last month, McKinsey & Co. argued that gender diversity "is a key battlefield for economic growth in aging countries and for companies to win the talent war." This is particularly acute in the U.S., where women make up only 26% of the computing workforce and African-American women just 3%, according to National Center for Women & Information Technology data.

The goal is to make the applicant pool more inclusive and thereby improve the overall quality of the applicants. But even if employers produce gender-neutral job ads, it doesn't mean women will see them.

A gender-neutral ad may be delivered by ad-serving systems that favor men because women are more expensive to reach, according to new research outlined in the paper "Algorithmic Bias? An Empirical Study into Apparent Gender-Based Discrimination in the Display of STEM Career Ads."

"Though we can remove the bias from the text of ads, our paper shows that the algorithm which decides who sees the ad may still be biased if it is trying to cost-minimize, and male eyeballs are cheaper than female eyeballs," said paper co-author Catherine Tucker, a professor at the MIT Sloan School of Management, in an email.


John Kostoulas, a Gartner analyst, said businesses are realizing that diversity and inclusion improve the business. In the past, the goal, especially in the U.S., was to ensure that quotas were being met in workforce and management.

"What is emerging now as the reason for carrying out diversity and inclusion interventions is the fact that diverse teams with a culture of inclusion contribute significantly to business performance," Kostoulas said.

Businesses are now turning to machine learning and AI systems to help inform data-driven decision-making. "They are trying to make their processes less prone to bias," Kostoulas said.

Kostoulas said the AI use cases include developing taxonomies so candidate searches return the right results; ranking candidates based on their fit for the job; augmenting human capability to screen resumes; and creating tools that predict a candidate's success in a given job. Some employers may also apply facial detection technology to video interviews, analyzing expressions to glean personality insights.

"Most [of] these technologies are in proof-of-concept or pilot mode," Kostoulas said. "These technologies may be only optimizing at this stage for one language or cultural background."

Advice for buyers of intelligent systems

For buyers, Kostoulas had some recommendations. First, ask the vendor about its definition of AI to understand whether the vendor uses machine learning techniques or simply rebrands aging technology (such as resume parsing, semantic search or keyword-based matching algorithms for ranking candidates) as AI.

It's also important to understand where the data is coming from and how complete and rich the data set is. Spending more time asking about and understanding the models and hypotheses behind the tool and its data sources "will essentially bring more transparency from the vendors," Kostoulas said.

P.K. Agarwal, regional dean and CEO of Northeastern University Silicon Valley, believes the need for this type of tool is indicative of a broader problem in the availability of tech workers.

"The fact that companies feel the need to place gender-neutral ads to remove job bias truly highlights the shortage of quality STEM workers -- and not even just with women," Agarwal said.

Independent of tech needs, the broader employer market is recognizing the need for a diverse workforce, said Patricia Fletcher, a leadership futurist and solution manager at SAP SuccessFactors.

"Diversity is not an option," Fletcher said. "We have to update the system in order to reflect who the workforce is today, who it's going to be tomorrow and how the practices, the processes may change as a result."
