Running with a podcast, the filter bubble and its relevance for policy research

I like to run and listen to podcasts. While listening to people talk about, for example, the demonstrations in Egypt or the changes that Barack Obama has brought to America, the steady, almost hypnotic rhythm of the run makes me feel part of the audience. Particularly on days like today, when I run in a soft, warm rain brought by the low clouds that hide the top of Mount Talinis, just behind Dumaguete.

This afternoon I listened to a very interesting presentation given at the LSE in London. The presenter was Eli Pariser, who has written a well-received book titled The Filter Bubble. You can find the podcast of his presentation and the Q&A session on the LSE website. You can also watch an almost identical presentation given at a TED event on his blog (http://www.thefilterbubble.com/).

By the 3km mark, I had learned a few things that really surprised me. Google has been personalizing search results since 2009. This means that if I Google ‘decentralisation processes in Southeast Asia’ at home on my Mac, I get one set of results. If I do the same in London on a desktop at ODI, the results may change a bit. But what is more surprising is that if my wife and I do the same search, sitting next to each other and using the same keywords on our laptops, it is unlikely that we will get the same results.

The reason is that our search and browsing history + all the subscriptions we have made + the hardware configuration of our PCs or laptops + the software (and cookies) installed on our computers are like fingerprints that suggest to Google’s algorithms what our preferences are. Google then provides us with search results in line with those perceived preferences.
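(For the technically minded, here is a purely invented toy sketch in Python of what such personalization could look like in principle. It is not Google’s actual ranking code, which is proprietary; the profile-building and scoring rules below are my own assumptions for illustration. The point is simply that the same query can return different orderings once a crude ‘profile’ inferred from history and subscriptions is applied.)

```python
# Toy sketch only: invented profile-building and scoring, not Google's algorithm.
from collections import Counter

def build_profile(search_history, subscriptions):
    """Infer a crude interest profile (word counts) from past searches and subscriptions."""
    terms = []
    for query in search_history:
        terms.extend(query.lower().split())
    terms.extend(topic.lower() for topic in subscriptions)
    return Counter(terms)

def personalise(results, profile):
    """Re-rank a shared baseline result list by overlap with the inferred profile."""
    def score(result):
        return sum(profile[word] for word in result.lower().split())
    return sorted(results, key=score, reverse=True)

results = [
    "Decentralisation reforms in Indonesia",
    "Decentralisation and local budgets in the Philippines",
]
me = build_profile(["local budgets Philippines"], ["governance"])
my_wife = build_profile(["travel to Indonesia"], ["culture"])
print(personalise(results, me)[0])        # the Philippines result comes first for me
print(personalise(results, my_wife)[0])   # the Indonesia result comes first for her
```

Two people, the same query, two different top results: that is the whole mechanism in miniature.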

Google’s ‘knowledge’ of our preferences is based on statistical analysis, and it is valuable in the sense that it allows advertising companies (which buy that information) to better target and personalize their ads.

Facebook does something similar.  Pariser notes that ‘if you have hundreds of Facebook friends, you see relevant updates only from the closest of them. Facebook, in other words, relies on your earlier interactions to predict what, and who, is most likely to interest you.’
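(Again, a hypothetical sketch rather than Facebook’s real News Feed logic: an invented ‘affinity’ score based on interaction counts, which quietly drops updates from the friends you rarely interact with.)

```python
# Toy sketch only: an invented 'affinity' filter, not Facebook's actual News Feed logic.
def filter_updates(updates, interactions, keep=3):
    """Show updates only from the friends you have interacted with most often."""
    closest = sorted(interactions, key=interactions.get, reverse=True)[:keep]
    return [update for update in updates if update["friend"] in closest]

interactions = {"ana": 42, "ben": 17, "carla": 3, "dan": 1, "eve": 0}
updates = [{"friend": name, "text": "update from " + name} for name in interactions]

for update in filter_updates(updates, interactions):
    print(update["text"])   # ana, ben and carla appear; dan and eve quietly disappear
```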

All this happens without us knowing. Choices are made without us being aware of them or being able to influence them. Pariser therefore asks whether this personalization of the Internet is a good thing or not.

To some extent he thinks it is. The exabytes of information uploaded and made available every single day require some sort of filtering, he says. But he is also worried. He thinks that personalization filters are a kind of invisible auto-propaganda, indoctrinating us with our own ideas, amplifying our desire for things that are familiar and leaving us oblivious to the dangers lurking in the dark territory of the unknown.

I looked at my wristwatch. I had passed the 5.5km mark, with 2.5km still to go. While I worked out how many 400m loops were left, I found myself thinking about the relevance of Pariser’s worries for policy research, i.e. research that aims to change and influence policies.

I have always thought that the Internet is not value-free. Information and knowledge never are. But I thought that the web made it easier to look for alternative views that help provide as full a picture as possible of a certain policy issue. From Pariser’s arguments it emerges that algorithms are not value-free either, which means that it is probably becoming more difficult for me, given my Internet history, to find opinions not in line with my preferences.

I know, one way to overcome this is to dig a bit deeper into the information available on the web. But those more in-depth searches will leave a digital mark behind and follow me into the future. They will reshape my personal filter bubble and influence my future work and research.

The aim of policy research is to influence policies and/or policy makers. To do so, it needs to reach a wide audience and shape public opinion, and/or be targeted at specific policy actors. A communication strategy should always entail a range of tools rather than the simple upload of a publication to the web. It should make use of, for example, interviews and reviews in the media, direct communication with key policy makers and decision makers, presentations at conferences and workshops, and publication in journals. The Internet, however, plays a direct and indirect role in all these tools. The concerns highlighted by Pariser show, in my opinion, that it may become more difficult to reach policy makers and decision makers, as they are somehow shielded by their own filter bubbles.

One more km to go. Time to wrap up. The economist Gunnar Myrdal came to mind towards the end of the run. In the 1960s, he argued that no research is value-free. He believed that points of view require viewpoints. Thus, researchers have to state their values and beliefs very clearly to give stronger legitimacy to their work.

The problem now is that the filter bubble makes it more difficult to find points of view that are different from ours, regardless of whether they are stated clearly or not. Moreover, this is not our individual choice, but the result of choices made on the basis of what an algorithm ‘thinks’ our preferences and values are. This threatens the relevance of the thesis we research and the antithesis we analyze, and may result in a weaker synthesis.

Final loop to reach 8km, and final open questions: Do we really need to worry about our policy research if this is the way the Internet is going? Do you think that communicating and targeting our policy research has become more difficult? How do we reduce the chance that our values and viewpoints will be filtered out on their way to other Internet users, and particularly to the policy makers we want to influence?

2 thoughts on “Running with a podcast, the filter bubble and its relevance for policy research”

  1. Funny coincidence: I happened to listen to, I guess, pretty much the same lecture the day before yesterday, this one recorded at the RSA (http://www.thersa.org/events/audio-and-past-events/2011/the-filter-bubble-how-the-hidden-web-is-shaping-lives). It reminded me of the echo chamber argument that was tossed around some years back when blogs were becoming more popular. I didn’t buy into that, but this new filtering through automatic personalization is rather more insidious.

    Filtering, of course, is an inevitable reaction to information overload. I guess this just requires new kinds of skills from us to recognize the context in which we are receiving information, the same kind of intuition we have developed with traditional media. The problem, of course, is that the laws and principles of traditional media changed very slowly, allowing us to learn to read between the lines.
