Google reportedly moved to increase its control over research reports written by its scientists.
In 2020, the company created a new review, or examination, process for research writings, Reuters news agency reports. The process asks researchers to get advice from legal, policy and public relations teams before writing about some issues.
The new review process aims to identify subjects that could be considered “sensitive” for the company. In at least three cases, Google officials asked writers not to show its technology in a bad light, Reuters reported. The news agency said its report was based on Google documents it was able to examine, as well as information from researchers involved in the work.
One company communication told researchers that technology progress and growing complexities in the outside “environment” had led to projects that could raise moral, legal or other problems.
Reuters could not confirm the date of the company communication. Three current employees said the policy began in June. Google did not have any comment on the Reuters story.
The new process for “sensitive” subjects adds more careful study to Google’s usual review process for research papers, eight current and past employees said.
One subject considered “sensitive” was how some of Google’s machine learning-powered services might be biased against some groups of people. Other subjects included the oil industry, China, Iran, Israel, COVID-19, home security, location data, religion, self-driving vehicles, telecommunications and systems designed to suggest websites.
For some projects, Google officials intervened later in the research. Earlier this year, a Google official reviewing a study on content suggestion technology told the writers to “take great care to strike a positive tone,” documents provided to Reuters showed.
“Strike a positive tone” is an expression that means to avoid negative or critical language.
The official added, “This doesn’t mean we should hide from the real challenges” created by the software.
Additional messages from a researcher to reviewers show that the writers made changes “to remove all references to Google products.”
Four researchers, including scientist Margaret Mitchell, said they believe Google is starting to interfere with important studies on the possible harms of technology.
“If we are researching the appropriate thing given our expertise, and we are not permitted to publish that on grounds that are not in line with high-quality peer review, then we’re getting into a serious problem of censorship,” Mitchell said.
Google states on its public website that its scientists have “substantial” freedom. Substantial is a term that means a large amount.
Ethics and AI
Disagreements between Google and some of its employees broke into public view this month after research scientist Timnit Gebru said she had been fired by the company. Gebru, along with Mitchell, led a 12-person team that studied ethics, or moral decisions, in artificial intelligence (AI) software.
Gebru says she was fired after questioning an order not to publish research claiming that AI software that copies speech could hurt some groups. Google said it had accepted Gebru’s resignation. Reuters could not confirm whether Gebru’s paper had gone through a “sensitive” subjects review.
Google Senior Vice President Jeff Dean said in a statement this month that Gebru’s paper discussed possible harms without discussing efforts underway to address them.
Dean added that Google supports AI ethics research and said the company was “actively working on improving our paper review processes.”
I’m John Russell.
Paresh Dave and Jeffrey Dastin reported on this story for Reuters. John Russell adapted it for Learning English. Bryan Lynn was the editor.
Words in This Story
sensitive – adj. likely to cause people to become upset
biased – adj. having or showing a bias : having or showing an unfair tendency to believe that some people, ideas, etc., are better than others
location – n. a place or position
challenge – n. a difficult task or problem : something that is hard to do
reference – n. the act of mentioning something in speech or in writing : the act of referring to something or someone
appropriate – adj. right or suited for some purpose or situation
peer – n. a person who belongs to the same age, education or social group as someone else
censorship – n. the system or practice of censoring books, movies, letters, etc.
artificial intelligence – n. an area of computer science that deals with giving machines the ability to seem like they have human intelligence