Within many of those apps, what we are exposed to is largely controlled by algorithms. Our feeds are “personalized” based on our previous behaviour in the app.
A friend of mine recently shared that she “liked” something on Facebook that was posted by one of her daughter’s friends, and suddenly her own feed was populated with all kinds of content targeted at gay women. The daughter’s friend openly identifies on Facebook as a lesbian.
My husband often complains to me that he misses pictures of his granddaughter that I see on Facebook, even though we are both “friends” with her parents.
Just because you follow someone doesn’t mean you will see everything they post. What you do see is controlled by an algorithm that selects your content based on your past activity. A few months ago, Facebook fired its human editors in favour of more algorithm-determined content.
In the CBC Ideas podcast, Sue Gardner describes what happened when she created a fake Facebook identity as a Trump supporter: a feed full of fake news targeting Hillary Clinton suddenly appeared (36:54 here).
The goal of an app is to keep you in there, so that you can be tracked and fed a diet of personalized advertising. The longer you stay, the more information the app has about your interests, and the algorithm can then feed you the very things that you like, enticing you to stay longer and generate more advertising revenue.
As Brodie Fenton says, “Aleppo doesn’t share well” (unless it is packaged as an image of a child that entices clicks). In the apps, world news looks exactly like cat videos, pictures of our “friends”, and advertising, and the platforms are set up to discourage further exploration, handing you simple like/share buttons to tap on your phone before you move on.
What are the social consequences of seeing only what you like, and what you enjoy?
Human curators can make mistakes if they lack good judgement. But human curators can also explain their choices and actions.
Algorithms are secret, can’t be held responsible, can change without notice, and can filter in ways that are discriminatory.
This is censorship on a grand scale.
“And Facebook is just one player in complex ecology of algorithmically-supplemented determinations with little external monitoring to see how decisions are made or what the effects might be.”
According to Hossein Derakhshan, right now, algorithms are determining “who we date, what we eat, where we shop, what we see, what we buy” without oversight.
He argues that the algorithms should be both viewable and customizable by us.
What if, instead of popularity and newness, we valued diversity and quality?
How would that change the experience of so many people for hours every day? How might that impact thinking and behaviour both inside and outside the corporately controlled digital environments we enter into so willingly?
Facebook is a “technology” company, and therefore is not subject to the same restrictions as a “media” company.
Is government involvement needed in digital environments to ensure democratic values are upheld online?
Featured image by ApolitikNow CC BY-NC-SA 2.0
This post is part of a 10 day posting challenge issued by Tina Zita. You can’t be a connected educator if you don’t contribute. Sometimes we need a nudge to remember that if nobody shares, nobody learns. Thanks Tina!
A caution about this post: I am a learner, not an expert.
I have set out here to use my #10posts10days (#10days10posts) challenge to explore this area that deeply interests me, in an open way that lets others see what I am learning. If you know more than I do, please correct me if necessary, and share! If you have more questions, please post those in the comments too. Let’s learn more together.
All of the posts in this series can be found here: You Live in a Bubble