Senators spar with Google exec over use of ‘persuasive technology’
Lawmakers expressed disbelief on Tuesday when a Google executive told a Senate panel that the company does not use persuasive techniques targeted at its users.
During a Senate Commerce technology subcommittee hearing, Maggie Stanphill, Google’s director of user experience, told the panel, “No, we do not use persuasive technology at Google.”
At issue before the panel was how algorithms used by companies like Google, Facebook and others might influence their users.
But Stanphill’s statement prompted pushback from senators who had been scrutinizing the company over its content decisions on platforms like YouTube.
“You don’t want to clarify that a little further?” Sen. Brian Schatz (D-Hawaii) asked. “Either I misunderstand your company or I misunderstand the definition of persuasive technology.”
Stanphill responded by saying “dark patterns and persuasive technology are not core to our design.”
“We build our products with privacy, security and control for the users,” she continued. “And ultimately this builds a lifelong relationship with the user, which is primary. That’s our trust.”
“I don’t understand what any of that meant,” Schatz said.
A skeptical Sen. Richard Blumenthal (D-Conn.) said that Stanphill’s answer was hard to accept given that persuasion appeared to be baked into Google’s business model.
“On the issue of persuasive technology, I find, Ms. Stanphill, your contention that Google does not build systems with the idea of persuasive technology in mind somewhat difficult to believe, because I think Google tries to keep people glued to its screens, at the very least,” Blumenthal said.
Many on the panel took Google and other internet platforms to task for the lack of transparency in their algorithms and how those technologies influence their users’ behavior.
Sen. John Thune (R-S.D.), the chairman of the subcommittee, suggested that he was considering a bill that would tackle the issue and called for greater transparency from the industry.
“Congress has a role to play in ensuring companies have the freedom to innovate but in a way that keeps consumers’ interests and well-being at the forefront of their progress,” Thune said. “Consumers should have the option to engage with a platform without being manipulated by algorithms powered by their own personal data — particularly if those algorithms are opaque to the average user.”
“Companies are letting algorithms run wild and only using humans to clean up the mess,” added Schatz, the top Democrat on the panel. “Algorithms are amoral. Companies designed them to optimize for engagement as their highest priority, and in doing so eliminated human judgment as part of their business model.”
The subcommittee also heard from Tristan Harris, a former Google programmer who has since become one of the foremost critics of social media companies’ design decisions. Harris told lawmakers that some internet platforms can easily predict from a user’s data what their sexuality is, their personality traits, whether they suffer from low self-esteem and whether they’re about to enter a relationship.
“It’s a race between who can better predict your behavior,” Harris said. “They can predict things about us that we don’t know about ourselves.”