Tag Archives: algorithm

A World of Feelings, Not Facts 7/10

In our online world, increasingly dominated by social media, feelings are more important than facts.

This is the assertion of Hossein Derakhshan, an idea I first encountered in the CBC Ideas episode Screened Off: The Dangers of the Insular Web.

As an online teacher, I have always considered video an excellent tool for engaging learners.  But do we understand how this engagement affects learning and thinking?

The spread of high-speed internet made streaming video possible, and we now consume this emotion-driven form of content more often than text.  This makes the Internet far more like watching television than reading a book or an article.

Increasingly,  our online time is spent inside apps, not on the open web.  According to Hossein Derakhshan, a blogger and writer based in Iran,

“Like TV it now increasingly entertains us, and even more so than television it amplifies our existing beliefs and habits. It makes us feel more than think, and it comforts more than challenges. The result is a deeply fragmented society, driven by emotions, and radicalized by lack of contact and challenge from outside. This is why Oxford Dictionaries designated “post-truth” as the word of 2016: an adjective “relating to circumstances in which objective facts are less influential in shaping public opinion than emotional appeals.””

The content we consume inside apps like Facebook entertains us.  Corporate owners design these spaces to keep us there, because the longer we stay, the more money they make.  Negativity does not encourage engagement, so we are presented with “feel good” content.

Instead of engaging in dialogue around important social issues, and reading text where “facts” can be challenged, we increasingly choose to spend our time in the comfort of our apps, accepting a diet of targeted, advertisement-driven mush fed to us by the algorithms that control our content.

Watching video (television) instead of reading text makes us vulnerable to those who have mastered this one-way, emotion-driven form of media. We become passive consumers of whatever grabs our attention instead of citizens engaged in discourse around facts.

How does this impact our understanding of our world?

CBC Ideas Podcast: Screened Off: The Dangers of the Insular Web http://www.cbc.ca/radio/ideas/screened-off-the-dangers-of-an-insular-web-1.3937638

____________________

Featured image shared by Toa Heftiba on Unsplash

All of the posts in this series can be found here:

3/10 – How the “smart phone” and mobile apps have changed the way we interact online

4/10 – Historical perspective – the co-created open web to corporately owned platforms

5/10 – Algorithms: What’s controlling what you see and read?

6/10 – Information Literacy: What will your lesson plan look like now?

8/10 – Popularity over Importance: Celebrity culture in a time of wicked world problems

9/10 – The Attention Economy

10/10 – Escape Your [Filter] Bubble

This post is part of a 10 day posting challenge issued by Tina Zita. You can’t be a connected educator if you don’t contribute. Sometimes we need a nudge to remember that if nobody shares, nobody learns. Thanks Tina! 

~~~~~~~~~~~~~~

A caution about this post: I am a learner, not an expert.  

I have set out here to use my #10posts10days  (#10days10posts) challenge to explore this area that deeply interests me, in an open way that lets others see what I am learning.  If you know more than I do, please correct me if necessary, and share! If you have more questions, please post those in the comments too.  Let’s learn more together.

All of the posts in this series can be found here: You Live in a Bubble

 


Who’s Controlling Your World? 5/10

According to former Wikimedia Executive Director Sue Gardner, we spend three hours each day inside our mobile apps for every hour we spend on the open web (CBC Ideas, Jan 16, 2017).

In many of those apps, what we are exposed to is largely controlled by algorithms.  Our feeds are “personalized” based on our previous behaviour in the app.

A friend of mine recently shared that she “liked” something on Facebook posted by one of her daughter’s friends, who openly identifies on Facebook as a lesbian. Suddenly, her own feed was populated with all kinds of content targeted at gay women.

My husband often complains to me that he misses pictures of his granddaughter that I see on Facebook, even though we are both “friends” with her parents.

Just because you follow someone doesn’t mean you will see everything they post. What you do see is controlled by an algorithm that selects your content based on your past activity. A few months ago, Facebook fired its human editors in favour of more algorithm-determined content.
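To make that idea concrete, here is a minimal sketch of what engagement-based feed ranking could look like. It is purely illustrative: the authors, topics, weights, and function names are all invented, and the real ranking systems used by platforms like Facebook are proprietary and far more complex.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    topic: str
    age_hours: float

# Hypothetical record of past activity: how often this user has liked
# or commented on posts from each friend and about each topic.
past_likes_by_author = {"daughters_friend": 12, "granddaughters_parents": 1}
past_likes_by_topic = {"lgbtq_community": 12, "family_photos": 1, "world_news": 0}

def engagement_score(post: Post) -> float:
    """Predict engagement: favour authors and topics the user has
    interacted with before, plus a boost for newer posts."""
    author_affinity = past_likes_by_author.get(post.author, 0)
    topic_affinity = past_likes_by_topic.get(post.topic, 0)
    recency = 1.0 / (1.0 + post.age_hours)
    return 2.0 * author_affinity + 1.5 * topic_affinity + 10.0 * recency

def build_feed(posts: list[Post], limit: int = 10) -> list[Post]:
    # Only the highest-scoring posts are shown; everything else is
    # silently dropped, which is why a friend's photos can simply
    # never appear in your feed.
    return sorted(posts, key=engagement_score, reverse=True)[:limit]
```

One “like” on a new topic bumps that topic’s affinity, so the next feed contains more of it, which invites more likes: the loop that produced my friend’s sudden wall of targeted content.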

In the CBC Ideas podcast, Sue Gardner describes what happened when she created a fake Facebook identity as a Trump supporter: a feed full of fake news targeting Hillary Clinton suddenly appeared (36:54 here).

Shared by Bill Ferriter CC By-NC 2.0 https://www.flickr.com/photos/plugusin/15147354484/

The goal of an app is to keep you inside it, so that you can be tracked and fed a diet of personalized advertising.  The longer you stay, the more information the app has about your interests, and the algorithm can then feed you the very things you like, enticing you to stay longer and generate more advertising revenue.

As Brodie Fenton says, “Aleppo doesn’t share well” (unless it is packaged as an image of a child that entices clicks).  In the apps, world news looks exactly like cat videos, pictures of our “friends”, and advertising, and the platforms are set up to discourage further exploration, handing you simple like/share buttons to touch on your phone before you move on.

What are the social consequences of seeing only what you like, and what you enjoy?

Human curators can make mistakes if they lack good judgement, but they can also explain their choices and actions.

Algorithms are secret, cannot be held accountable, can change without notice, and can filter in discriminatory ways.

This is censorship on a grand scale.

Kin Lane, in the Tech Gypsies Podcast episode below, pulled out this quote:

“And Facebook is just one player in complex ecology of algorithmically-supplemented determinations with little external monitoring to see how decisions are made or what the effects might be.”

Kate Crawford and Meredith Whittaker

According to Hossein Derakhshan,  right now, algorithms are determining “who we date, what we eat, where we shop, what we see, what we buy” without oversight.

He argues that the algorithms should be both viewable and customizable by us.

What if, instead of popularity and newness, we valued diversity and quality?

How would that change the experience of so many people for hours every day?  How might that impact thinking and behaviour both inside and outside the corporately controlled digital environments we enter into so willingly?
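Continuing the illustrative sketch above, here is one way such a re-weighting could look: a score that rewards an assumed external quality rating and topics the reader has not yet seen, instead of affinity and recency. Everything here is hypothetical, including the quality signal itself, which no platform currently exposes.

```python
def diversity_quality_score(post, shown_topics, quality_ratings, diversity_weight=5.0):
    """Alternative ranking: reward quality and unseen topics rather than
    popularity and recency. `quality_ratings` is an imagined external
    signal (for example, editorial or community review), keyed by topic."""
    quality = quality_ratings.get(post.topic, 0.5)
    novelty = 0.0 if post.topic in shown_topics else 1.0
    return quality + diversity_weight * novelty

def build_diverse_feed(posts, quality_ratings, limit=10):
    # Greedily pick the best remaining post, then mark its topic as seen,
    # so each further pick is pushed toward a topic not yet shown.
    feed, shown_topics = [], set()
    remaining = list(posts)
    for _ in range(min(limit, len(remaining))):
        best = max(remaining,
                   key=lambda p: diversity_quality_score(p, shown_topics, quality_ratings))
        feed.append(best)
        shown_topics.add(best.topic)
        remaining.remove(best)
    return feed
```

The point is not these particular weights, but that the objective is a choice: optimizing for diversity and quality rather than engagement would surface a very different feed.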

Facebook is a “technology” company, and therefore is not subject to the same restrictions as a “media” company.

Is government involvement needed in digital environments to ensure democratic values are upheld online?

Featured image by ApolitikNow CC BY-NC-SA 2.0

All of the posts in this series can be found here:

3/10 – How the “smart phone” and mobile apps have changed the way we interact online

4/10 – Historical perspective – the co-created open web to corporately owned platforms

6/10 – Information Literacy: What will your lesson plan look like now?

7/10 – Videos and Images – From Facts to Feelings

8/10 – Popularity over Importance: Celebrity culture in a time of wicked world problems

9/10 – The Attention Economy

10/10 – Escape Your [Filter] Bubble

~~~~~~~~~~~~~~~

This post is part of a 10 day posting challenge issued by Tina Zita. You can’t be a connected educator if you don’t contribute. Sometimes we need a nudge to remember that if nobody shares, nobody learns. Thanks Tina! 

~~~~~~~~~~~~~~

A caution about this post: I am a learner, not an expert.  

I have set out here to use my #10posts10days  (#10days10posts) challenge to explore this area that deeply interests me, in an open way that lets others see what I am learning.  If you know more than I do, please correct me if necessary, and share! If you have more questions, please post those in the comments too.  Let’s learn more together.

All of the posts in this series can be found here: You Live in a Bubble

__________________________________________

You Live in a Bubble – A Filter Bubble 2/10

This post is part of a 10 day posting challenge issued by Tina Zita. You can’t be a connected educator if you don’t contribute. Sometimes we need a nudge to remember that if nobody shares, nobody learns. Thanks Tina! 

~~~~~~~~~~~~~

In this year’s 10 posts in 10 days challenge, I wanted to find a theme for my posts. It had to be something I needed to explore more deeply, but I also wanted to present the ideas as an organized curation to better help others approach a topic that might be new to them.

In 2016, I was intrigued by how Silvia Rosenthal Tolisano offered her readers a structure to learn from her thinking.  By providing an advance organizer, she linked her ideas together in a way that helped her readers navigate through the topic in a meaningful way.

Recently, the idea that we live in an information filter bubble, and that these “bubbles” are part of massive social change, has entered into conversations on multiple platforms.  I want to learn more about how we came to this place, and the strategies and habits we need to intentionally escape from those bubbles.

I am planning the following posts, and I will link back to them here  (this is my initial thinking, so as I learn more, the topics might shift).

3/10 – How the “smart phone” and mobile apps have changed the way we interact online

4/10 – Historical perspective – the co-created open web to corporately owned platforms

5/10 – Algorithms: What’s controlling what you see and read?

6/10 – Information Literacy: What will your lesson plan look like now?

7/10 – Videos and Images – From Facts to Feelings

8/10 – Popularity over Importance: Celebrity culture in a time of wicked world problems

9/10 – The Attention Economy

10/10 – Escape Your [Filter] Bubble

 

I welcome your feedback, comments and suggestions as I learn more about who has the power in our interconnected, yet insular, digital world.

___________________________________________________________

Some initial resources:

CBC Ideas: The Insular Web

Tech Gypsies Podcast (caution – adult language)

Danah Boyd – Why America is Self-Segregating

 

 

Featured image by Califmom, shared under a CC BY-NC-ND 2.0 license.