“The Filter Bubble”
Background: As a young political activist, author, and entrepreneur, Eli Pariser became interested in how the internet influenced Americans’ civic engagement. He began raising awareness of how technology companies were limiting what users saw online. In 2010, Pariser coined the term “the filter bubble” to describe how these companies promoted ideological silos online, in which people saw only content they were already likely to agree with.
Today’s Internet giants — Google, Facebook, Yahoo and Microsoft — see the remarkable rise of available information as an opportunity. If they can provide services that sift through the data and supply us with the most personally relevant and appealing results, they’ll get the most users and the most ad views. As a result, they’re racing to offer personalized filters that show us the Internet that they think we want to see. These filters, in effect, control and limit the information that reaches our screens.
By now, we’re familiar with ads that follow us around online based on our recent clicks on commercial Web sites. But increasingly, and nearly invisibly, our searches for information are being personalized too. Two people who each search on Google for “Egypt” may get significantly different results, based on their past clicks. Both Yahoo News and Google News make adjustments to their home pages for each individual visitor. And just last month, this technology began making inroads on the Web sites of newspapers like The Washington Post and The New York Times.
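To make the mechanism concrete, here is a minimal sketch, in Python, of one way click-based personalization can work. Everything in it is a hypothetical illustration, not any real search engine’s algorithm: the scoring rule, the sample data, and the names (personalized_rank, click_history, boost) are all assumptions. Each result’s base relevance is nudged upward by how often the user has previously clicked results from the same site, so two users issuing the identical query see different orderings.

```python
# Toy illustration of click-based personalization (hypothetical; not the
# actual algorithm of Google, Yahoo, or any other company). Results from
# sites a user has clicked before get boosted, so two users typing the
# same query see different rankings.
from collections import Counter

def personalized_rank(results, click_history, boost=0.5):
    """Re-rank (title, source, base_score) results using past clicks.

    click_history: Counter mapping a source site to how often this
    user has clicked results from it in the past.
    """
    def score(result):
        title, source, base_score = result
        # The boost grows with the user's prior clicks on this source.
        return base_score + boost * click_history[source]
    return sorted(results, key=score, reverse=True)

# The same "Egypt" query, two different users, two different orderings:
results = [
    ("Egypt travel deals",       "travel-site.example",  1.0),
    ("Protests in Cairo",        "news-site.example",    1.0),
    ("History of ancient Egypt", "encyclopedia.example", 1.0),
]
traveler = Counter({"travel-site.example": 4})
news_reader = Counter({"news-site.example": 5})

print([title for title, _, _ in personalized_rank(results, traveler)])
print([title for title, _, _ in personalized_rank(results, news_reader)])
```

Under these assumptions, the habitual travel-site clicker sees the travel deal first while the habitual news reader sees the protests first, which is the “Egypt” example above in miniature.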
All of this is fairly harmless when information about consumer products is filtered into and out of your personal universe. But when personalization affects not just what you buy but how you think, different issues arise. Democracy depends on the citizen’s ability to engage with multiple viewpoints; the Internet limits such engagement when it offers up only information that reflects your already established point of view. While it’s sometimes convenient to see only what you want to see, it’s critical at other times that you see things that you don’t.…
Companies that make use of these algorithms must take this curatorial responsibility far more seriously than they have to date. They need to give us control over what we see — making it clear when they are personalizing, and allowing us to shape and adjust our own filters. We citizens need to uphold our end, too — developing the “filter literacy” needed to use these tools well and demanding content that broadens our horizons even when it’s uncomfortable.
It is in our collective interest to ensure that the Internet lives up to its potential as a revolutionary connective medium. This won’t happen if we’re all sealed off in our own personalized online worlds.
Source: Eli Pariser. “When the Internet Thinks It Knows You.” The New York Times, May 22, 2011.