Book 8 - Eli Pariser "The Filter Bubble"
Jan. 25th, 2016 03:44 pm
Eli Pariser, "The Filter Bubble" (Penguin)

Another paperback from last year's covert charity shop visits, which I started reading in December.
The central message of The Filter Bubble is that the search algorithms used by websites like Amazon, Netflix, Facebook, and (most perniciously, Pariser argues) Google are incredibly good at showing us content similar to content we’ve already looked at. The cumulative effect, Pariser argues, is that if we do nothing we wind up living in a tightly circumscribed online world filled with information, ideas, and outlooks already familiar to us: the “filter bubble” of the title. Pariser also has reservations about the ways in which companies like Google and Facebook gather, store, and use information about us: the raw material their algorithms use to decide what we want to see. The internet shows us what we want to see, not what we need to see, and that deeply frustrates him.
What frustrated me, for virtually the entire length of the book, is that Pariser seems far more concerned with warning readers that they’re on the road that leads to filter-bubble Hell than with asking why that particular route might have seemed – or might still seem – more attractive than the other routes available. He never stops, for example, to consider why filters feel like essential tools when exploring even a narrow, bounded world like Facebook (much less the web as a whole): a hyper-abundance of information, a horrific signal-to-noise ratio, and users with limited time and shaky information-literacy skills. Filtered search results and tailored news feeds have flourished, in part, because people find them useful and efficient.
Pariser, who wants search engines to return a higher proportion of results that aren’t just what the user would expect (and thus want), is thus in the odd position of arguing that they would be improved if they were – in the eyes of most users – made less efficient. Arguing that efficiency isn’t an absolute virtue is far from absurd (it works for hand-dipped milkshakes, artisan bread, and craft-brewed beer), but it’s hard to see that argument being used to sell lifeboat bilge pumps or body armor. Or search engines. Some things you just want to be boringly efficient.
The premise underlying Pariser’s case for less-tightly-filtered (and thus seemingly less-efficient) search engines and news feeds isn’t absurd, either. It’s that “efficiency” in search isn’t giving the user the information they want; it’s giving them the information they need – the information that will make them better informed, better able to think, and thus better able to deal with the world. It’s far from clear, however, that an internet search engine programmed (by others) to give them that is any more desirable than one programmed (by others) to give them just what they want. It’s also far from clear that most people, if presented with that broader range of information, would not – using their own homegrown filters – immediately weed out (as “irrelevant,” “biased,” “uninteresting,” or simply “wrong”) precisely the information that Pariser is so determined to provide them with.