SAN FRANCISCO — Google is making changes to its search algorithm to promote more authoritative content and demote “low quality” content such as Holocaust denial.

The Internet giant is responding to growing pressure to make sure the top answers it provides to people’s search queries are accurate and do not contribute to the spread of misinformation, conspiracy theories, hoaxes and offensive content on the Web.

People searching Google will be able to more easily flag search results that are “unexpected, inaccurate or offensive,” says Ben Gomes, vice president of engineering for Google Search. And search quality raters have received new guidance on how to spot and report this content, too.

About 0.25% of searches return offensive or misleading content, according to Google. Features designed to get people answers more quickly and on more platforms, such as the Google Home smart speaker, are contributing to the problem. Autocomplete, for example, fills in information you may be searching for as you type, while featured snippets highlight an answer to a query at the top of search results. Both are generated by search algorithms and reflect what people are searching for and what’s available on the Web.

Public outcry helped torpedo a featured snippet that claimed President Obama was planning a coup d’état. Other inaccurate answers supplied in featured snippets included claims that MSG causes brain damage and that some U.S. presidents were members of the Ku Klux Klan.

Facebook has taken the brunt of criticism for “fake news” that spread before and after the U.S. presidential election. But Google has come under scrutiny, too, with critics pointing to inaccurate or misleading articles in search results, prompting a national debate over the responsibility of technology companies that disseminate information to billions of people.

Fake news has created a public relations problem for Google, whose dominance in search rests on the trust of the public, says search expert Danny Sullivan.

“When your core product is being the best search engine in the world with the greatest results and you have searches that are clearly not the best answer you can come up with, it just looks bad,” says Sullivan, founding editor of Search Engine Land.

In November Google said it would bar fake news sites from using its advertising software. Earlier this month Google introduced a feature that places “Fact Check” tags on snippets of articles in its news results.

“From our perspective, there should just be no situation where fake news gets distributed, so we are all for doing better here,” Google Chief Executive Officer Sundar Pichai told BBC News after the election.

Google scours hundreds of billions of web pages. Tens of thousands of pages come online every minute, and 15% of the searches Google sees each day are new, making it challenging for Google to always present people with the best answers to their queries, the company says. It’s also under siege from so-called content farms trying to game Google’s algorithm to appear higher in its search rankings and get their blue links seen and clicked by more people.

That’s why Google says it’s soliciting feedback from users who spot troublesome content in autocomplete and featured snippets. The people who assess the quality of Google search results have received updated guidelines so they, too, can flag what Google calls “low quality content,” such as misleading information, offensive results, hoaxes and conspiracy theories, Google says.

Quality raters can’t change how search results are ranked, but feedback from these contractors is used by engineers and machine-learning systems to improve Google search results.

“They are saying it won’t be perfect, you are still going to find stuff that may be shocking or may surprise you,” Sullivan said. “I think it’s good that Google is making an effort and is showing concern. Hopefully it will pay off with some improvements.”

Source: USA Today