
A World of Feelings, Not Facts 7/10

In our online world, increasingly dominated by social media, feelings are more important than facts.

This is the assertion of Hossein Derakhshan, an idea I first encountered in the CBC Ideas episode Screened Off: The Dangers of the Insular Web.

As an online teacher, I always considered video to be an excellent tool to engage learners.  But do we understand how this engagement impacts learning and thinking?

The spread of high-speed internet enabled streaming video, and more and more we consume this emotion-driven form of content instead of text.  This makes the Internet far more like watching television than like reading a book or an article.

Increasingly, our online time is spent inside apps, not on the open web.  According to Hossein Derakhshan, a blogger and writer based in Iran:

“Like TV it now increasingly entertains us, and even more so than television it amplifies our existing beliefs and habits. It makes us feel more than think, and it comforts more than challenges. The result is a deeply fragmented society, driven by emotions, and radicalized by lack of contact and challenge from outside. This is why Oxford Dictionaries designated “post-truth” as the word of 2016: an adjective “relating to circumstances in which objective facts are less influential in shaping public opinion than emotional appeals.””

The content we consume inside apps like Facebook entertains us.  Corporate owners design these spaces to keep us there, because the longer we stay, the more money they make.  Negativity does not encourage engagement, so we are presented with “feel good” content.

Instead of engaging in dialogue around important social issues, and reading text where “facts” can be challenged, we increasingly choose to spend our time in the comfort of our apps, accepting a diet of targeted, advertisement-driven mush fed to us by the algorithms that control our content.

Watching video (television) instead of reading text makes us vulnerable to those who have mastered this one-way, emotion-driven form of media. We become passive consumers of whatever grabs our attention, rather than citizens engaged in discourse around facts.

How does this impact our understanding of our world?

CBC Ideas Podcast: Screened Off: The Dangers of the Insular Web http://www.cbc.ca/radio/ideas/screened-off-the-dangers-of-an-insular-web-1.3937638

____________________

Featured image shared by Toa Heftiba on Unsplash

All of the posts in this series can be found here:

3/10 – How the “smart phone” and mobile apps have changed the way we interact online

4/10 – Historical perspective – the co-created open web to corporately owned platforms

5/10 – Algorithms: What’s controlling what you see and read?

6/10 – Information Literacy: What will your lesson plan look like now?

8/10 – Popularity over Importance: Celebrity culture in a time of wicked world problems

9/10 – The Attention Economy

10/10 – Escape Your [Filter] Bubble

This post is part of a 10 day posting challenge issued by Tina Zita. You can’t be a connected educator if you don’t contribute. Sometimes we need a nudge to remember that if nobody shares, nobody learns. Thanks Tina! 

~~~~~~~~~~~~~~

A caution about this post: I am a learner, not an expert.  

I have set out here to use my #10posts10days  (#10days10posts) challenge to explore this area that deeply interests me, in an open way that lets others see what I am learning.  If you know more than I do, please correct me if necessary, and share! If you have more questions, please post those in the comments too.  Let’s learn more together.

All of the posts in this series can be found here: You Live in a Bubble

 

Text vs. World Trumps

Who’s Controlling Your World? 5/10

According to former Wikimedia Executive Director Sue Gardner, we spend three hours inside our mobile apps for every hour we spend on the open web (CBC Ideas, Jan 16, 2017).

In many of those apps, what we are exposed to is largely controlled by algorithms.  Our feeds are “personalized” based on our previous behaviour in the app.

A friend of mine recently shared that she “liked” something on Facebook posted by one of her daughter’s friends, and suddenly her own feed was populated with all kinds of content targeted at gay women.  The daughter’s friend openly identifies on Facebook as a lesbian.

My husband often complains to me that he misses pictures of his granddaughter that I see on Facebook, even though we are both “friends” with her parents.

Just because you follow someone doesn’t mean you will see everything they post. What you do see is controlled by an algorithm that selects content based on your past activity. A few months ago, Facebook fired its human editors in favour of more algorithm-determined content.

In the CBC Ideas podcast, Sue Gardner describes what happened when she created a fake Facebook identity as a Trump supporter: a feed full of fake news targeting Hillary Clinton suddenly appeared (at 36:54).

Shared by Bill Ferriter, CC BY-NC 2.0: https://www.flickr.com/photos/plugusin/15147354484/

The goal of an app is to keep you inside, where you can be tracked and fed a diet of personalized advertising.  The longer you stay, the more information the app has about your interests, and the algorithm can then feed you the very things you like, enticing you to stay longer and generate more advertising revenue.
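
To make that loop concrete, here is a minimal, purely illustrative sketch (in Python) of the kind of ranking logic being described: score each candidate post by how well it matches the interests inferred from your past activity and by how long it is predicted to hold your attention. Every name, weight, and data structure here is hypothetical; real platform ranking systems are proprietary and far more complex.

```python
from dataclasses import dataclass, field

@dataclass
class Post:
    author: str
    topic: str
    predicted_watch_seconds: float  # hypothetical engagement prediction

@dataclass
class UserProfile:
    # interest scores inferred from past likes and clicks, e.g. {"cats": 0.9}
    interests: dict = field(default_factory=dict)

def engagement_score(user: UserProfile, post: Post) -> float:
    """Score a post purely by predicted engagement: how well it matches
    inferred interests and how long it is expected to hold attention."""
    interest_match = user.interests.get(post.topic, 0.0)
    return interest_match * post.predicted_watch_seconds

def rank_feed(user: UserProfile, candidates: list) -> list:
    """Show the user more of whatever they already engage with."""
    return sorted(candidates, key=lambda p: engagement_score(user, p), reverse=True)

# A user who mostly clicks on cat videos sees cat videos first;
# world news sinks toward the bottom of the feed.
user = UserProfile(interests={"cats": 0.9, "world-news": 0.1})
feed = rank_feed(user, [
    Post("a friend", "cats", predicted_watch_seconds=45),
    Post("a news outlet", "world-news", predicted_watch_seconds=30),
])
print([p.topic for p in feed])  # ['cats', 'world-news']
```

Notice that nothing in this kind of scoring rewards accuracy, importance, or diversity; it rewards whatever you already engage with.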

As Brodie Fenlon says, “Aleppo doesn’t share well” (unless it is packaged as an image of a child that entices clicks).  In the apps, world news looks exactly like cat videos, pictures of our “friends”, and advertising, and the platforms are set up to discourage further exploration, handing you simple like/share buttons to touch on your phone before you move on.

What are the social consequences of seeing only what you like, and what you enjoy?

Human curators can make mistakes if they don’t have good judgement, but they can also explain their choices and actions.

Algorithms are secret, can’t be held accountable, can change without notice, and can filter in discriminatory ways.

This is censorship on a grand scale.

Kin Lane, in an episode of the Tech Gypsies Podcast, pulled out this quote:

“And Facebook is just one player in a complex ecology of algorithmically-supplemented determinations with little external monitoring to see how decisions are made or what the effects might be.”

Kate Crawford and Meredith Whittaker

According to Hossein Derakhshan, algorithms right now are determining “who we date, what we eat, where we shop, what we see, what we buy” without oversight.

He argues that these algorithms should be both viewable and customizable by the people they affect.

What if, instead of popularity and newness, we valued diversity and quality?

How would that change the experience of so many people for hours every day?  How might that impact thinking and behaviour both inside and outside the corporately controlled digital environments we enter into so willingly?
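
As a thought experiment, the same kind of sketch can be re-weighted so that the ranking rewards quality and topic diversity instead of predicted engagement. Again, this is only an illustration with hypothetical weights, and the “quality” value is a placeholder; who assigns it, and how it is audited, is exactly the oversight question raised above.

```python
def diversity_quality_score(post: dict, topics_already_shown: set,
                            w_quality: float = 0.7, w_novelty: float = 0.3) -> float:
    """Hypothetical alternative: reward a quality signal (however it might be
    assigned and audited) plus topics the reader has not already been fed,
    rather than predicted engagement alone."""
    novelty = 0.0 if post["topic"] in topics_already_shown else 1.0
    return w_quality * post["quality"] + w_novelty * novelty

def rank_feed(candidates: list, topics_already_shown: set) -> list:
    """Order posts so that diverse, higher-quality items surface first."""
    return sorted(candidates,
                  key=lambda p: diversity_quality_score(p, topics_already_shown),
                  reverse=True)

# Even for a reader who has been saturated with cat videos,
# a well-sourced world-news item can now rise to the top of the feed.
posts = [
    {"topic": "cats", "quality": 0.4},
    {"topic": "world-news", "quality": 0.9},
]
print(rank_feed(posts, topics_already_shown={"cats"}))
```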

Because Facebook is classified as a “technology” company rather than a “media” company, it is not subject to the same restrictions.

Is government involvement needed in digital environments to ensure democratic values are upheld online?

Featured image by ApolitikNow CC BY-NC-SA 2.0

All of the posts in this series can be found here:

3/10 – How the “smart phone” and mobile apps have changed the way we interact online

4/10 – Historical perspective – the co-created open web to corporately owned platforms

6/10 – Information Literacy: What will your lesson plan look like now?

7/10 – Videos and Images – From Facts to Feelings

8/10 – Popularity over Importance: Celebrity culture in a time of wicked world problems

9/10 – The Attention Economy

10/10 – Escape Your [Filter] Bubble

~~~~~~~~~~~~~~~

This post is part of a 10 day posting challenge issued by Tina Zita. You can’t be a connected educator if you don’t contribute. Sometimes we need a nudge to remember that if nobody shares, nobody learns. Thanks Tina! 

~~~~~~~~~~~~~~

A caution about this post: I am a learner, not an expert.  

I have set out here to use my #10posts10days  (#10days10posts) challenge to explore this area that deeply interests me, in an open way that lets others see what I am learning.  If you know more than I do, please correct me if necessary, and share! If you have more questions, please post those in the comments too.  Let’s learn more together.

All of the posts in this series can be found here: You Live in a Bubble

__________________________________________

From A Level Playing Field to a Few Empires: What Happened to the Web? 4/10

This post is part of a 10 day posting challenge issued by Tina Zita. You can’t be a connected educator if you don’t contribute. Sometimes we need a nudge to remember that if nobody shares, nobody learns. Thanks Tina! 

~~~~~~~~~~~~~~

A caution as you read this: I am a learner, not an expert.  

I have set out here to use my #10posts10days  (#10days10posts) challenge to explore this area that deeply interests me, in an open way that lets others see what I am learning.  If you know more than I do, please correct me if necessary, and share! If you have more questions, please post those in the comments too.  Let’s learn more together.

All of the posts in this series can be found here: You Live in a Bubble

__________________________________________

The first timetable I was assigned as a new secondary school teacher was three sections of DIC2A/2G: Introduction to Computers.  It was all about DOS, binary numbers, file storage and hacking (this was before the excitement around the Windows GUI), and I spent more time fixing the network than I did teaching!  But it was a popular course, and we recommended it to students.  Computers were new, but we seemed to understand that learning about them was important.

PET personal computer (image from Wikimedia)

A few years later, in the mid-1990s, when the “Internet” arrived for most users, we taught kids to craft websites using HTML code.

Anyone could make a website to share information, and we saw the Internet as a place that gave people voices.

Shared by Kurazaybo Martinez Cabellaro CC BY-NC-SA 2.0

Early users accessed the web and created HTML pages through Netscape, but that all changed with the release of Microsoft’s Internet Explorer, the first “monopoly” on the web. By 2002, 95% of online computer users were accessing the Web through this browser, yet only 11% of the world’s population was online.  After losing the “browser wars” to IE, Netscape open-sourced its code, and Mozilla was born.

According to Mark Surman, Executive Director of the Mozilla Foundation, developers were essentially able to maintain free access to the web by ensuring that free alternatives were available.

Today, though, this has shifted.  Essentially four companies – Apple, Facebook, Amazon and Google – control the internet, making it next to impossible to compete, or to be heard online, if you don’t fit the monetization strategies of the tech giants.

With nearly 50% of the world’s population now online, and many users convinced that Facebook IS the Internet, how do we get back to an online world where all users have a voice?

From CBC Ideas: Screened Off – The Dangers of the Insular Web

“If you’re going to engage the modern world, you’re going to use the internet the way tech companies are making it for you. And you’re going to benefit from it in a bunch of ways. But you’re not really exercising a completely free choice.”
Tech thinker Sue Gardner

Tomorrow, we’ll look at how the tech giants use algorithms to control what you see online.

Featured image from Wikimedia

All of the posts in this series can be found here:

3/10 – How the “smart phone” and mobile apps have changed the way we interact online

5/10 – Algorithms: What’s controlling what you see and read?

6/10 – Information Literacy: What will your lesson plan look like now?

7/10 – Videos and Images – From Facts to Feelings

8/10 – Popularity over Importance: Celebrity culture in a time of wicked world problems

9/10 – The Attention Economy

10/10 – Escape Your [Filter] Bubble