Filed under Technology.

Netflix Uses an Algorithm

Most people who browse the internet are vaguely aware that their previous browsing history helps customise their search results. In fact, the tailoring runs much deeper than most people appreciate: a search engine can take into account some 57 individual data points it knows about a person before returning their results.
This gives rise to debate about the broader privacy issue, explored by Eli Pariser, president of MoveOn.org, in his book The Filter Bubble (2011). He exposed the ‘invisible algorithmic editing’ that goes on so that people see what the search system thinks they want to see.

Pariser explained in an interview in May 2011 that ‘your filter bubble is the personal universe of information that you live online’. It is unique to you, constructed for you alone from the array of personalised filters that now power the web. Facebook contributes data, Google tailors your searches, Yahoo News and Google News edit your news for you…

Pariser called the bubble a ‘comfortable place’ for us as individuals, bristling with the things that compel us to click. His concern was that those are not necessarily the things we need to know. He found that Google wasn’t alone in filtering data for people, although Wikipedia was the notable exception.

He conceded that algorithms, once rolling, are very cost efficient for search companies, but the lack of human curatorial values is worrying. If you ‘like’ something, it doesn’t follow that you really do like it, him, her or them. To ‘follow’ a company on Facebook is not really to follow it, but to be drawn into handing over your data.
The Netflix example made Pariser’s point about human input. Netflix’s recommendation system is tuned to minimise root mean squared error (RMSE), a metric that measures the distance between the ratings the algorithm predicts and the ratings viewers actually give. This is good at predicting which movies you will like, but it rewards safe bets and is not very adventurous. Human curators take risks with movies you might or might not like.
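To make the RMSE idea concrete, here is a minimal sketch (not Netflix’s actual code; the rating values are invented for illustration). It shows why optimising for RMSE favours cautious predictions: a recommender that guesses close to a viewer’s real ratings scores a lower error than one that makes bold guesses.

```python
from math import sqrt

def rmse(predicted, actual):
    """Root mean squared error between predicted and actual ratings."""
    if len(predicted) != len(actual):
        raise ValueError("rating lists must be the same length")
    return sqrt(sum((p - a) ** 2 for p, a in zip(predicted, actual)) / len(predicted))

# Hypothetical 1-5 star ratings a viewer actually gave four movies:
actual = [5, 3, 4, 2]

safe_predictions = [4.5, 3.0, 4.0, 2.5]  # conservative guesses, close to the truth
bold_predictions = [5.0, 1.0, 5.0, 4.0]  # adventurous guesses, further off

print(rmse(safe_predictions, actual))  # lower error: RMSE rewards playing it safe
print(rmse(bold_predictions, actual))  # higher error: risk-taking is penalised
```

An algorithm judged purely on this metric will keep steering you toward the predictable middle, which is exactly the ‘not very adventurous’ behaviour Pariser describes.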

Pariser believed that the primary purpose of an editor (of information) is to ‘extend the horizon of what people are interested in’. The same old stuff is dull. To him, personalisation is ‘privacy turned inside out’. We ought to have more control over what we get to see. The filter bubble is ‘pernicious invisibility’ as we don’t know it’s happening.

Well, now we do. Is there anything we can do about it? Pariser suggests constantly varying your routes to things. Erasing browser cookies only deletes part of the data about you. If you’re working on a specific computer, even when not online, THEY know.

That may well be all, apart from not giving any information about yourself beyond the strict minimum needed to function in the data age. Easier said than done.

The future may force us to be more proactive. The web connects people, helping to offer a better, more democratic world and to create meaningful, peaceful, organic change in society. But while the net is good at putting like minds together, as Pariser said, ‘it’s not so hot at introducing people to different people and ideas. Democracy requires discourse and personalization is making that more and more elusive. And that just won’t happen if we’re all isolated in a web of one’.

Read On:
Interview by Maria Popova with Eli Pariser, May 2011.
The Filter Bubble: What the Internet Is Hiding From You, Penguin Press, May 2011.
MoveOn.org. Democracy in Action.
The Internet Hides More Secrets than a Magician’s Box of Tricks.