A study of Google search results by anti-tracking rival DuckDuckGo suggests that escaping the so-called ‘filter bubble’ of personalized online search is a perniciously hard problem for the put-upon Internet consumer who just wants to carve out a little unbiased space online, free from the suggestive taint of algorithmic fingers.
DDG reckons it’s not possible even for logged-out users of Google search, browsing in Incognito mode, to prevent their online activity from being used by Google to program — and thus shape — the results they see.
DDG says it found significant variation in Google search results, with most of the participants in the study seeing results that were unique to them — and some seeing links others simply did not.
Results within news and video infoboxes also varied significantly, it found.
Meanwhile, it says it found very little difference between the results served to logged-out users browsing in Incognito mode and those served in normal mode.
“It’s simply not possible to use Google search and avoid its filter bubble,” it concludes.

DDG has also made the data from the study downloadable — and the code it used to analyze the data open source — allowing others to look and draw their own conclusions.
It carried out a similar study in 2012, around the earlier US presidential election, and claimed then to have found that Google’s search had inserted tens of millions more links for Obama than for Romney in the run-up to that vote.
It says it wanted to revisit the state of Google search results now, in the wake of the 2016 presidential election that installed Trump in the White House — to see if it could find evidence to back up Google’s claims to have ‘de-personalized’ search.
For the latest study DDG asked 87 volunteers in the US to search for the politically charged topics of “gun control”, “immigration”, and “vaccinations” (in that order) at 9pm ET on Sunday, June 24, 2018 — initially searching in private browsing mode and logged out of Google, and then again without using Incognito mode.
You can read its full write-up of the study results here.
The results ended up being based on 76 users, as those searching on mobile were excluded to control for significant variation in the number of displayed infoboxes.
Here’s the topline of what DDG found:
Private browsing mode (and logged out):

  • “gun control”: 62 variations with 52/76 participants (68%) seeing unique results.
  • “immigration”: 57 variations with 43/76 participants (57%) seeing unique results.
  • “vaccinations”: 73 variations with 70/76 participants (92%) seeing unique results.

‘Normal’ mode:

  • “gun control”: 58 variations with 45/76 participants (59%) seeing unique results.
  • “immigration”: 59 variations with 48/76 participants (63%) seeing unique results.
  • “vaccinations”: 73 variations with 70/76 participants (92%) seeing unique results.
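The "variations" and "unique results" tallies above can be reproduced with a simple count: treat each participant's ordered list of results as a fingerprint, count the distinct fingerprints, and count how many participants had a fingerprint nobody else shared. A minimal sketch in Python (the input format is an assumption for illustration; DDG's actual open-sourced analysis code may differ):

```python
from collections import Counter

def summarize_variation(results_by_user):
    """results_by_user: dict mapping participant id -> ordered tuple of result domains.

    Returns (variations, participants_with_unique_results, percent_unique).
    """
    fingerprints = Counter(tuple(r) for r in results_by_user.values())
    variations = len(fingerprints)                                  # distinct result pages
    unique_users = sum(n for n in fingerprints.values() if n == 1)  # seen by exactly one person
    percent = round(100 * unique_users / len(results_by_user))
    return variations, unique_users, percent

# Toy example: 4 participants, 3 distinct result orderings, 2 seen by only one person.
data = {
    "p1": ("a.com", "b.com"),
    "p2": ("a.com", "b.com"),
    "p3": ("b.com", "a.com"),
    "p4": ("c.com", "a.com"),
}
print(summarize_variation(data))  # → (3, 2, 50)
```

Note that order matters here: two participants who saw the same links in a different order count as two variations, which matches DDG's framing that ranking itself is part of the personalization.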

DDG’s contention is that truly ‘unbiased’ search should serve everyone largely the same results.
Yet the search results its volunteers were served were, in the majority, unique (ranging from 57% at the low end to a full 92% at the upper end).

“With no filter bubble, one would expect to see very little variation of search result pages — nearly everyone would see the same single set of results,” it writes. “Instead, most people saw results unique to them. We also found about the same variation in private browsing mode and logged out of Google vs. in normal mode.”
“We often hear of confusion that private browsing mode enables anonymity on the web, but this finding demonstrates that Google tailors search results regardless of browsing mode. People should not be lulled into a false sense of security that so-called ‘incognito’ mode makes them anonymous,” DDG adds.

DDG says it controlled for time- and location-based variation in the served search results by having all participants in the study carry out the search from the US and do so at the very same time.
It says it controlled for the inclusion of local links (i.e. to cancel out any localization-based variation) by bundling such results under a localdomain.com placeholder (and a ‘Local Source’ placeholder for infoboxes).
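The localization control described above amounts to a normalization pass before comparison: any locally-targeted result is collapsed to a single placeholder so that, say, two participants seeing different local newspapers don't register as a personalization difference. A hypothetical sketch (the `is_local` heuristic and input format are assumptions; DDG's real pipeline would detect local results from the page markup):

```python
# Stand-ins named in DDG's write-up for local results and local infoboxes.
LOCAL_PLACEHOLDER = "localdomain.com"
LOCAL_INFOBOX_PLACEHOLDER = "Local Source"

def is_local(result):
    # Assumed marker for illustration; a real classifier would inspect
    # the SERP itself (e.g. a local pack / maps unit), not a flag.
    return result.get("local", False)

def normalize(results):
    """Collapse local links to one placeholder so location differences
    don't count as filter-bubble variation."""
    return tuple(
        LOCAL_PLACEHOLDER if is_local(r) else r["domain"]
        for r in results
    )

serp = [
    {"domain": "nytimes.com"},
    {"domain": "springfield-gazette.com", "local": True},
    {"domain": "cdc.gov"},
]
print(normalize(serp))  # → ('nytimes.com', 'localdomain.com', 'cdc.gov')
```

After this pass, two participants whose pages differ only in which local outlet was slotted in produce identical fingerprints, so any remaining variation is attributable to something other than geography.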
Yet even after taking steps to control for these time- and location-based variations, it still found the majority of Google search results to be unique to the individual.
“These editorialized results are informed by the personal information Google has on you (like your search, browsing, and purchase history), and puts you in a bubble based on what Google’s algorithms think you’re most likely to click on,” it argues.

More at: https://techcrunch.com/2018/12/04/go...ds/?yptr=yahoo