Amber Davisson, assistant professor of rhetoric and media studies
Search: Your results may vary
Amber Davisson knew Google collected information about its users. But not until she examined search results for a U.S. presidential candidate did she learn how the data was used.
A year later, the Willamette University rhetoric and media studies professor is writing a paper on her findings titled “Agency: Google Search Customization and the Future of Political Identity.” In it, she explains that far from being an inherently objective search engine, Google reinforces people’s existing opinions.
“Google isn’t deciding what we need to know,” Davisson says. “It’s telling us what it thinks we want to know.”
For her case study, Davisson focused her research on Michele Bachmann, a Minnesota congresswoman who vied for the Republican Party’s nomination for U.S. president.
With the help of two Willamette students — whose time was funded by the Liberal Arts Research Collaborative (LARC) — Davisson set out to answer two questions. First, she wanted to better understand who Google thinks its users are. Second, she wanted to know how Google builds a search engine with those users in mind.
What she discovered surprised her. Not only can Google estimate people’s ages from how fast they type and how many spelling mistakes they make, but it can also infer users’ reading levels, locations and interests. Furthermore, Google keeps a yearlong record of searches tied to individual users, Davisson says.
“You should be nervous about how much Google knows,” she says. “Instead of giving you an image of the world, it’s giving your image back to you. It’s like looking in the mirror.”
Tracking a presidential hopeful
As Bachmann gained traction within the Republican Party, Davisson ran Google searches on her every two weeks for six months. She took steps to make the searches anonymous, and she enlisted 15 Facebook friends to conduct similar searches.
Based on what Google thought it knew about the users, the types of results were remarkably different, Davisson says.
For example, while some people were shown biographical information about Bachmann, others were prompted to search for her in conjunction with words like “joke,” “naked” and “lies.”
“Suggested searches was a major issue,” Davisson says. “Google directs you to different parts of the web that are more salacious because it thinks that’s what you want to know. Its searching algorithms are not inherently objective.”
The number of results also varied between users, with the greatest difference totaling 20 million, Davisson says.
“If we aren’t all seeing the same web, we aren’t all having the same conversation,” Davisson says about what her data revealed. “When the search results differ this drastically, having a productive conversation becomes virtually impossible.”
Making informed choices
The point of her research was not to decode Google’s impact on the upcoming presidential election, Davisson says. Rather, its purpose was to better understand how the search engine operates so people can use it more effectively.
To obtain objective information, Davisson advises users to keep their searches broad and simple. They’ll also find the most information by reading past the first page of results.
Eoin Sinclair ’13, who helped Davisson with her project, says he now views Google as a librarian who knows what he likes and always suggests things he’d want to read.
“As a result of working with professor Davisson, I’ve become more conscious of the ways I navigate the internet, and I think twice about why I’m being sent in a certain direction,” he says. “Professor Davisson’s work supports the liberal arts mission in this way, pushing us to reexamine the way we view the world and how we navigate it.”
Ultimately, Davisson says it’s important to remember that technology isn’t bad. But it does have a goal.
“Google is making arguments,” she says. “They’re not intended, but they’re being made nonetheless.”