Montana Attorney General Knudsen Demanding Answers From Google About 'Woke' AI Product
By Denise Rivette
Montana Attorney General Austin Knudsen wrote a letter today demanding answers from Google executives regarding the company’s artificial intelligence (AI) system, “Gemini”. The system may violate Montana law by providing inaccurate information biased toward Google’s political preferences without disclosing that fact to consumers.
In the letter, sent to Google’s chief executive officer and chief legal officer, Knudsen outlined his concern that Gemini appears deliberately designed to provide inaccurate information that furthers Google’s political agenda, even as the company represents to consumers that the system’s goal is to deliver “high-quality and accurate” information.
Because the AI system appears to provide inaccurate information in order to align with the company’s political bias, and may discriminate based on race or other protected characteristics, it may violate the Montana Unfair Trade Practices and Consumer Protection Act (UTPCPA) and the Montana Human Rights Act (MHRA), both of which the attorney general is responsible for enforcing. Knudsen asked Google to respond to his 15 questions regarding the program’s alleged biases by March 29.
“Google has offered Gemini to consumers in Montana and has represented that its goal is to create an AI system that provides ‘high-quality and accurate’ information. But behind the scenes, Google appears to have deliberately intended to provide inaccurate information, when those inaccuracies fit with Google’s political preferences. This fact was not disclosed to consumers,” Knudsen wrote. “These representations and omissions may implicate the UTPCPA. Furthermore, if Google directed employees to build an AI system that discriminates based on race or other protected characteristics, that could implicate civil rights laws, including constituting a hostile work environment.”
This all began when users of Google Gemini requested images of America’s Founding Fathers, WWII German soldiers, and Vikings and received results with a striking similarity: none of the figures were Caucasian. A case of overcorrecting for race? That would be a plausible explanation if it weren’t for the written content the program also produced, and what it refused to produce. According to The Economist, “Gemini happily provided arguments in favour of affirmative action in higher education, but refused to provide arguments against. It declined to write a job ad for a fossil-fuel lobby group, because fossil fuels are bad and lobby groups prioritise ‘the interests of corporations over public well-being’.”
Gemini seems deliberately programmed to produce these responses. They are not the artifacts known as “hallucinations”, in which a program simply makes things up.
Is this just a case of not seeing the forest for the trees while rushing a product to market, or did Google get caught incorporating social engineering into its offerings?
Google Chief Executive Officer Sundar Pichai says Gemini is being fixed. But does Google’s culture need fixing as well?
You can read Knudsen’s full letter to Google executives HERE.