Thanks, Facebook. The why, the how, and the “wtf” of the attention economy.

Visualize your middle school cafeteria. The goths sat together, the jocks somewhere else, the bookworms at another table. It’s human nature to gravitate toward people who share our worldview: we want to feel heard and understood, so we surround ourselves with the like-minded.
It’s natural, and to a certain extent, a healthy part of our culture.
The modern (digital) lunch table
Zoom forward to today. Most people over the age of 13 (and many of their pets) have a Facebook or Instagram account.
We choose who we friend. We self-select.
I’ve seen & heard of many people who selectively unfriend or mute people who don’t share their political views. You might have done something like this, especially around election time.
Again, here we go self-selecting the people around us and the stories we see. It’s the lunchroom tables all over again.
To a point, this can be helpful. I don’t need to see my uncle rant about political or ideological views that don’t match my own; it’s not healthy for our relationship, and it’s not necessary. Muting that conversation (or in my case, using a timeline blocker altogether so I don’t see anyone’s updates) allows me to keep my relationships, and my sanity, intact.
What happens when it’s not the user selecting the content they will consume?
Unfortunately, it doesn’t look good for us when “someone” else is driving. A key influencer in this interaction goes silently unnoticed, even though its impact is huge… the algorithm.
When an unthinking, unfeeling piece of code is in the driver’s seat, we begin to see strange outcomes that the average human moderator wouldn’t choose or predict.
These algorithms are huge, unwieldy collections of code, and often no single person understands precisely how they work. They are coded with specific goals in mind, and this narrow-sighted mission can create a cascade of weird side effects for users.
How an algorithm radicalizes and separates humans
A social media algorithm is built with one key purpose in mind: to engage you and keep you on the platform. This stems from the business model of most major players in the social media world.
We’re talking YouTube, Facebook & Instagram.
They make money by selling your attention to advertisers. This means the longer they keep you on the platform, the more ads they can show you, and the more money they make.
It’s an attention-based model, and your attention is the product for sale.
Over time, these algorithms have learned what keeps us engaged and online longer, and they use these insights to keep us scrolling.
It’s not deliberately negative, but the cumulative impact of the algorithm’s decisions prioritizes you staying on the platform longer, using whatever content will achieve that end.
It’s hard to consider this a perk. In a public place, look around at the people with their heads down & their attention buried in their phones. Clearly, these technologies are successful at pulling us into their attention economy, whether we are cognizant of their impact or not.
The most compelling content for users is the most emotionally triggering.
Cute puppies get an “awww” but don’t always compel us to share. When we see something that triggers outrage or other fiery emotions, we can’t help but reply or share or express ourselves.
The algorithm knows this. It’s not doing it to deliberately piss you off; it only “understands” that this type of content increases engagement, so it keeps showing you more of it.
This creates a ratchet effect: the more charged a piece of content is, the more it spreads, and the more it spreads, the more we see topics that create highly charged emotional states.
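To make the mechanism concrete, here’s a minimal sketch in Python. Everything in it is made up for illustration: the “charge” score, the numbers, and the one-line engagement model stand in for the far more complex predictors real platforms learn from user behavior. The point is only that a feed ranked purely on predicted engagement, with no other objective coded in, systematically over-serves the most charged content:

```python
import random

random.seed(42)

# 100 hypothetical posts, each with an emotional-charge score from
# 0.0 (cute puppies) to 1.0 (pure outrage bait).
posts = [{"id": i, "charge": random.random(), "impressions": 0}
         for i in range(100)]

def predicted_engagement(post):
    # The feed's only objective: expected clicks, shares, watch time.
    # Assumption for this sketch: charged content engages more, so we
    # simply reuse the charge score. Nothing here asks "is this good
    # for the user?" -- that goal isn't in the code at all.
    return post["charge"]

# Ten feed refreshes: each time, show the ten highest-scoring posts.
for refresh in range(10):
    feed = sorted(posts, key=predicted_engagement, reverse=True)[:10]
    for post in feed:
        post["impressions"] += 1  # shown again -> spreads further

avg_all = sum(p["charge"] for p in posts) / len(posts)
shown = [p for p in posts if p["impressions"] > 0]
avg_shown = sum(p["charge"] for p in shown) / len(shown)

print(f"average charge, all posts:        {avg_all:.2f}")   # ~0.50
print(f"average charge, posts ever shown: {avg_shown:.2f}")  # ~0.95
```

The inventory of posts is perfectly balanced, but the feed the user actually sees is not. The ratchet lives in the ranking step, not in any single piece of content.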
Getting a political education from YouTube’s suggested videos
On YouTube, this comes in the form of suggested videos. The algorithm understands that to keep you watching, the content needs to ratchet up, hooking your attention with novel and increasingly sensational videos.
A while back, this meant that a teenage girl looking for diet videos would soon be suggested pro-anorexia videos.
YouTube has since fixed this “bug” after being called out on it, but what other dark rabbit holes have public outcry not been able to highlight yet?
This is how someone gets radicalized and polarized by an algorithm.
Someone looking for political news is suggested increasingly radical and polarizing videos; where they may once have found common ground with others, they are now being steadily indoctrinated by divisive content.
Not long ago, the Amazon rainforest burned, with much of the blame falling in the lap of Brazil’s new far-right government, a government elected thanks in part to radical videos. It sounds like madness, but YouTube helped radicalize an entire country.
It goes both ways, too: someone watching left-leaning content will be pushed further left, while someone watching right-leaning content will be pushed toward increasingly radical right-wing videos.
What are we left with? People who are increasingly divided, radicalized and separated.
The same thing happens with the other advertising-based players in the social media world, such as Facebook & Twitter. The algorithms are not coded to consider cultural impact; they’re coded to win your attention. And they do. These algorithms are “technically” neutral, but their impact is not.
While these platforms are pitched as technologies that allow us to connect, in reality they push us further apart.
When you are regularly fed polarizing content it becomes increasingly hard to find common ground with people who are different from you.
At the same time, it’s even harder to be heard.
Who hasn’t felt stressed by the media? Every day is a new crisis, because the media has learned to play the “win eyeballs” game, monetizing its content through clicks & traffic.
What happens when this chatter hits a fever pitch?
If every day is a crisis, and every person is a critic, and everyone can speak out, and does, about the slightest grievances…
What happens when we truly need to be heard? When a crisis really happens?
Martin Niemöller was a pastor & social activist who spoke out against Nazi Germany. One of his better-known passages is as follows:
When the Nazis came for the communists,
I remained silent;
I was not a communist.
[…]
When they came for the Jews,
I remained silent;
I wasn’t a Jew.
When they came for me,
there was no one left to speak out.
It’s scary. You would assume that in a world where we are all empowered to speak out, it would be easier to be heard and this fear would no longer be a problem. That doesn’t seem to be the case.
In her book How to Do Nothing, Jenny Odell explores how the fever-pitch rate of information we receive ends up making it harder for us to notice what really matters.
The coffers of our attention are full to overflowing, so adding more is just more. We’re no longer able to distinguish what truly matters when our attention is overburdened with every new crisis and drama of the day.
Odell doesn’t call for quitting Facebook outright or never reading the news, but for training ourselves not just to withdraw attention, but to invest it somewhere else and tune its acuity. This is a big job, too: pulling back from the 24-hour (or shorter) media cycles and pausing to consider longer time scales.
We must “study the ways that media and advertising play upon our emotions, to understand the algorithmic versions of ourselves that such forces have learned to manipulate, and to know when we are being guilted, threatened, and gaslighted into reactions that come not from will and reflection, but from fear and anxiety…”
When every day is a crisis in the media, and even our friends publish hastily written rants on the regular, it becomes easier to tune it all out. Odell argues that tuning it out, while a natural reaction when we’re drowning in information, is an outcome we can’t afford. In a cycle where financially driven platforms close down the space of attention — the very attention needed to resist this onslaught — it may be only in the space of our own minds that some of us can begin to reclaim our agency.
When our human rights are truly threatened and we need to speak up, will anyone hear us through all of the chatter? Will anyone act or will they just post heartfelt memes about it and call it a day?
Hope for the middle ground
While the algorithms push to radicalize us and send us deeper into dark rabbit holes, there is hope.
Jenny Odell with her call to “do nothing”, as well as former Google ethicist Tristan Harris and the Center for Humane Technology, each take a stand with suggestions for creating better relationships between humans and the tech they use (or, sometimes, the tech that uses them).
You are not a hypocrite if you change your mind after getting new information. That’s an important message for a world where it’s so easy to share our opinions loudly. And the critical thinking that drives you to absorb new information and change your mind is more important than ever.
A shining light comes in the form of communities built around changing your mind. The subreddit that started it all, Change My View, has since spun off into a website where people can make statements and invite others to change their minds with well-reasoned arguments. You earn “deltas” from other users when you successfully change their minds.
Reddit gets a lot of flak for trolling and for silencing novel views, but this particular community is a model of thoughtful discourse in a world where chatting about differing viewpoints tends toward heated diatribe.
This is not your typical political debate; it’s more like getting tea with a stranger, a way of kindly understanding each other’s differing views, take it or leave it.
We’re being divided by tech, and it doesn’t have to be this way
What’s good for Facebook is not necessarily good for the world and humanity. Unfortunately, the people in charge of these incredibly powerful platforms assume otherwise. There is a huge problem in the tech world: the assumption that progress is inevitable, that it’s going to happen regardless, so they may as well be out in front.
We’ve all heard people in tech make this argument before, trying to justify their ethically questionable inventions: ‘You can’t stop progress. Technology will take over either way. It’s the natural evolution of things.’ This article calls out their bullshit.
There is no natural evolution of technology. It’s all on us. Changing the path of the attention economy will take conscious leadership from humans like you.
While it’s easy to brush off Facebook, Twitter, YouTube, Snapchat and other social platforms as harmless recreation and helpful connectors, it’s time to be honest about the impact these platforms have on our culture and our daily lives.
It’s hard to celebrate them for connecting humanity when the algorithms that run their platforms are actively dividing us.
*This article originally appeared on my Medium page