"On what basis can we trust 'knowledge' acquired from a range of sources?"
1.1 Reliability and Integrity, 1.12 Digital Citizenship, 3.5 Internet
The BBC article Are we trapped in our own web bubbles? and Eli Pariser's TED talk 'Beware online filter
bubbles' are two resources that discuss how personalised search results could limit our access to new information.
Search engines play a major role in providing access to knowledge and information. The order in which links appear in search results therefore has a significant impact on the information most people access (witness how many users never look beyond the first page, or even the first half page, of results). Additionally, some search engines and social media sites have started to personalise search results, prioritising pages similar to those we have previously viewed and thus forming a so-called 'search bubble' or 'filter bubble' that may limit our exposure to new views.
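The personalisation idea described above can be illustrated with a toy sketch: results that share topics with a user's browsing history get pushed to the top. Real search engines use far more sophisticated signals; the function name, the topic sets, and the sample data below are all invented for illustration.

```python
# Toy sketch of personalised re-ranking (illustrative only):
# results whose topics overlap with the user's viewing history
# are moved up the list, strengthening the 'filter bubble'.

def personalised_rank(results, history):
    """Sort results so those sharing more topics with `history` come first.

    results: list of (title, set_of_topics) pairs
    history: set of topics from previously viewed pages
    """
    def overlap(result):
        _title, topics = result
        return len(topics & history)  # count of shared topics

    # Higher overlap first; ties keep the engine's original order,
    # because Python's sort is stable.
    return sorted(results, key=overlap, reverse=True)

results = [
    ("Climate report", {"science", "climate"}),
    ("Football scores", {"sport"}),
    ("New phone review", {"technology"}),
]
history = {"technology", "gadgets"}  # user mostly reads technology pages

for title, _topics in personalised_rank(results, history):
    print(title)
```

Run with the sample data, the technology review rises to the top even though the engine originally ranked it last; pages on unfamiliar topics sink, which is exactly the bubble effect the sources above describe.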
Despite this, there is still debate over just how significant the filter bubble effect is. A 2015 study of Facebook data suggested the effect was minimal or non-existent, but the study itself was quickly criticised.
Filter bubbles returned to the media spotlight after political events including the election of Donald Trump and the UK Brexit vote.
The Guardian attempted to examine the effect in 2018, while the University of Illinois has an interesting page examining the effect and presenting an experiment you can try for yourself.
This can be a useful starting point for exercise 1.8, and also links closely to the IB Theory of Knowledge (TOK) course.