Google’s motto is “Don’t be evil,” and one could argue that for the most part, it has lived up to that goal. But a new study says that the algorithm behind the company’s ad-targeting network may be inadvertently discriminating against women.
A new paper published by researchers from Carnegie Mellon University and the International Computer Science Institute — originally reported by MIT Technology Review — looked at ads served up on third-party websites using Google’s ad network.
It found that when users identified themselves as males, the ad network was more likely to display an advertisement on the site for a career-coaching agency promising executive positions offering $200,000 salaries. Female users, on the other hand, were more likely to see ads for a “generic job posting service and an auto dealer.”
How did they figure this out?
The researchers discovered the disparity using a program called AdFisher, which simulates users’ Web-browsing habits. They then used Google’s Ads Settings tool — a service that lets you control the kinds of ads you see based on your demographic and browsing habits — to set the gender of the simulated user to either male or female.
Prior to performing the test, they made sure the only difference between the male and female simulated users was their gender. All other browsing habits and demographics were the same.
Using AdFisher, they had 500 simulated male users and 500 simulated female users browse the same job-hunting sites. The researchers then had the same simulated users visit the Times of India’s website, where the career-coaching ad appeared.
The Barrett Group of Warwick, R.I., placed the career-coaching ads shown mostly to men in these tests.
When the testing was completed, AdFisher found that the simulated male users saw significantly more ads for the career-coaching service than the simulated females. The males were shown the ad 1,852 times, while the women were shown it just 318 times. And that could have some legal implications — including potential inquiries from the Federal Trade Commission to determine whether discriminatory practices were at play.
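A gap of 1,852 impressions versus 318 is far too large to be chance. As a rough illustration (this is not the researchers' own analysis, which used a more sophisticated significance test), a simple chi-squared check on the reported counts, assuming the two groups of 500 simulated users were otherwise identical:

```python
import math

# Impression counts reported in the study. This is an illustrative
# back-of-the-envelope re-analysis, not the paper's own statistics.
male_impressions = 1852
female_impressions = 318

total = male_impressions + female_impressions
expected = total / 2  # under the null hypothesis of no gender effect

# Chi-squared goodness-of-fit statistic (1 degree of freedom).
chi2 = sum((obs - expected) ** 2 / expected
           for obs in (male_impressions, female_impressions))

# For 1 degree of freedom, the p-value is erfc(sqrt(chi2 / 2)).
p_value = math.erfc(math.sqrt(chi2 / 2))

print(f"chi-squared = {chi2:.1f}, p = {p_value:.3g}")
```

The statistic comes out above 1,000, with a p-value vanishingly close to zero; whatever caused the disparity, it was not random noise.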
What’s an advertising algorithm?
A key question in this conversation, though, is whether the advertiser chose to target ads for the career-coaching service at men, or if Google’s ad-serving software was at fault. To figure that out, it helps to know how Google’s ad network serves up ads.
It starts when a website like the Times of India signs up with Google to place ads on its site. Advertisers participating in Google’s ad network tell Google the kinds of consumers they want to reach with their ads.
According to Google, advertisers can specify a multitude of parameters for consumers they want to market to, including the types of websites they visit, their age, general interests, and, of course, gender.
“Advertisers can choose to target the audience they want to reach, and we have policies that guide the type of interest-based ads that are allowed,” a spokesperson for Google told us.
“We provide transparency to users with ‘Why This Ad’ notices and Ads Settings, as well as the ability to opt out of interest-based ads.”
When you visit a website that participates in Google’s ad network, Google detects the kind of consumer you are based on your profile and matches you with the appropriate ad.
For example, if you’re a 30-year-old man who likes to visit sports and gaming sites, Google’s ad network will recognize that and probably show you an ad for a new video game or sporting event. Google’s algorithms can also calculate who’s more likely to click on a specific ad. The more people who click on ads served up by Google, the more money the company and the website publisher make; that means Google is more likely to show ads that have a high click-through rate.
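That ranking step can be sketched in a few lines. This is a toy model, not Google's actual system; the ads, bids, and predicted click-through rates below are invented for illustration:

```python
# Toy model of ad selection: pick the candidate ad with the highest
# expected revenue for this user (advertiser bid x predicted
# click-through rate). All figures are made up for illustration.

def pick_ad(candidates, user_profile):
    """Return the ad with the highest bid * predicted CTR for this user."""
    return max(candidates,
               key=lambda ad: ad["bid"] * ad["predicted_ctr"][user_profile])

candidates = [
    {"name": "video game",     "bid": 0.50,
     "predicted_ctr": {"sports_fan": 0.04, "other": 0.010}},
    {"name": "sporting event", "bid": 0.80,
     "predicted_ctr": {"sports_fan": 0.03, "other": 0.005}},
]

# For the sports fan: 0.50 * 0.04 = 0.020 vs 0.80 * 0.03 = 0.024
print(pick_ad(candidates, "sports_fan")["name"])  # prints "sporting event"
```

Note that the higher bid wins here even with a lower click rate; in a real system the predicted click-through rates come from models trained on past behavior, which is where demographic patterns can creep in.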
How could this be discrimination?
For the most part, advertisers are allowed to target specific gender demographics for certain products. For instance, a company selling maternity clothing is completely within its rights to target women in a certain age group who have visited pregnancy-related websites.
The problem starts when the thing being advertised is what the law calls a public accommodation, says UC Berkeley law professor David Oppenheimer.
“You can’t say you’re offering a service and restrict it based on gender, or race, or any other protected characteristics,” Oppenheimer explained. But he added that this would apply only if the services were purposely being targeted at men rather than women.
Because the researchers were unable to find out whether the advertiser was specifically targeting men — or to get Google to reveal how its algorithm works — they were unable to determine whether women were being deliberately excluded from seeing the career-coaching ad.
According to Barrett Group president Waffles Natusch, the company did not set gender parameters for its ads.
“The ads, there are at least 75 different ones, are targeted to senior executives and attorneys of any gender,” Natusch explained. “My web guys and marketing team agreed it had more to do with Google’s ad algorithms than anything on our side.”
Still, the ad network’s apparent exclusion of female users could be considered an adverse impact form of discrimination, Oppenheimer explained.
Adverse impact discrimination happens when companies attempt to do something completely benign but end up inadvertently discriminating against a group.
Oppenheimer points to the example of a city that sets a height requirement for bus drivers based on the size of the driver's seat. If the minimum height is set too high, it will disproportionately exclude women from the job, even though that was never the intent. And that is illegal.
In the case of Google’s ad network, the algorithm that decides where to place ads based on what users are most likely to click could have inadvertently concluded that men would click the career-coaching link more often than women. As a result, it would have shown the ad to more men, putting women at a disadvantage.
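This feedback dynamic is easy to simulate. In the toy model below, the click probabilities are invented and stand in for whatever small early signal a real system might pick up; the point is that a purely click-maximizing policy can compound a small initial gap into a large one:

```python
import random

random.seed(0)

# Toy simulation: an ad server that always shows the ad to whichever
# group has the higher *observed* click rate so far. The "true" click
# probabilities are invented for illustration; they differ only slightly.
TRUE_CTR = {"male": 0.05, "female": 0.04}

shown = {"male": 0, "female": 0}
clicks = {"male": 1, "female": 1}  # smoothing so both rates start equal
views = {"male": 1, "female": 1}

for _ in range(10_000):
    # Greedily serve the ad to the group with the higher observed CTR.
    group = max(views, key=lambda g: clicks[g] / views[g])
    shown[group] += 1
    views[group] += 1
    if random.random() < TRUE_CTR[group]:
        clicks[group] += 1

print(shown)
```

Run long enough, the greedy policy locks onto one group and shows it the overwhelming majority of impressions, even though the underlying difference in click rates was small. Nobody programmed an exclusion; it emerged from the optimization.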
According to Oppenheimer, if the algorithm turns out to be the cause of the disparity, a government regulatory agency such as the Equal Employment Opportunity Commission or the Department of Justice Civil Rights Division could step in and work to ensure the issue is addressed.
In addition, an individual or a group could bring a class action suit against Google or its advertisers seeking injunctive relief. The company or its advertisers could also face possible punitive damages.
For what it’s worth, the authors of the study aren’t accusing Google or its ad partners of deliberately keeping women out of high-paying jobs. They do, however, call for greater transparency in the way ads are selected for individual users.
“We would remain concerned if the cause of the discrimination was an algorithm run by Google, or if the advertiser automatically determined that males are more likely than females to click on the ads in question,” the researchers wrote.
“The amoral status of an algorithm does not negate its effects on society.”