Google Changes Search Process To Avoid Displaying Wrong Answers

Eric Chiu

Fake and inaccurate news has been a persistent problem for many tech platforms lately, but a new Google initiative looks to reduce its reach, according to a report from Search Engine Land.

According to Search Engine Land, Google’s quality raters (contractors who help evaluate the quality of search results) can now tag content as “Upsetting-Offensive.” Quality raters don’t directly affect search results that you can see, but their data is used to help improve Google’s search algorithms over time.
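To make the mechanism concrete, here is a minimal, purely illustrative sketch of how rater labels might be aggregated into a per-page signal. All names and the aggregation scheme are assumptions for illustration; Google has not disclosed how rater data actually feeds its algorithms.

```python
# Hypothetical sketch: aggregating quality-rater labels into a per-URL
# "Upsetting-Offensive" rate that could serve as one input when tuning
# a ranking algorithm. Not Google's actual pipeline.
from collections import defaultdict

def offensive_rate(ratings):
    """ratings: list of (url, label) pairs from quality raters.
    Returns {url: fraction of raters who flagged the page}."""
    flags = defaultdict(int)
    totals = defaultdict(int)
    for url, label in ratings:
        totals[url] += 1
        if label == "Upsetting-Offensive":
            flags[url] += 1
    return {url: flags[url] / totals[url] for url in totals}

ratings = [
    ("example.com/a", "Upsetting-Offensive"),
    ("example.com/a", "OK"),
    ("example.com/b", "OK"),
]
print(offensive_rate(ratings))  # {'example.com/a': 0.5, 'example.com/b': 0.0}
```

The key point the sketch captures is the one in the paragraph above: individual ratings never change a specific result directly, but aggregated over many raters and pages they become training or evaluation data for the algorithm as a whole.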

Read: Facebook News Feed Will Rank Authentic Content Higher With New Algorithm

Google’s guidebook for quality raters gives several guidelines and examples for them to use. Examples of content that could earn the “Upsetting-Offensive” tag include sites that discuss how to do illegal or harmful acts, graphic violence and racist or discriminatory terminology. Google also guides raters on ways to tell the difference between informative and uninformative content by determining intent and the authority of a site on a given topic.

However, Google doesn’t want to issue blanket bans across its search results. While the algorithm updates and the new tag are intended to keep low-quality results from being incorrectly elevated, the rater guidelines also stress that Google wants search to remain a usable resource even for sensitive queries.

“Remember that users of all ages, genders, races, and religions use search engines for a variety of needs,” Google’s guidebook said. “One especially important user need is exploring subjects which may be difficult to discuss in person. For example, some people may hesitate to ask what racial slurs mean. People may also want to understand why certain racially offensive statements are made. Giving users access to resources that help them understand racism, hatred, and other sensitive topics is beneficial to society.”

Read: Google Rethinking Search Queries After 'Did Holocaust Happen' Results Feature Denial Sites

The internal change comes after Google’s search results were at the center of several high-profile embarrassments for the company, thanks in part to its quick answer box, the section at the top of results that displays an algorithmically determined answer when you type in an answerable question.

But as The Outline noted, it has also showcased false information, ranging from claims about presidents who were in the Ku Klux Klan to whether former President Barack Obama was planning a coup of the U.S. While Google’s past approach to these stories has been to manually remove the offending answers once they were made public, the search engine likely hopes these tweaks will improve its search quality more broadly and help it avoid similar embarrassments.
