Linda Kinning

Your algorithm is not the whole internet: the dangers of thinking you see it all

"No one is talking about X"


On the internet, X can be pop culture trends, hot takes, or geopolitics -- but we've all seen these posts that decry with exasperation that "no one is talking about this!" and plead with you, dear reader, to pay attention and become a someone talking about this. The form is familiar enough that writer and Twitter superuser Patricia Lockwood wrote a novel of the same title.


While these posters seek to highlight what they believe is a gap in the cultural dialogue, what they actually draw attention to is a gap in their own understanding of how the internet works. This gap becomes especially dangerous when the topics are questions of:

  • Who does our society care about?

  • Who is marginalized? Who has power?

  • Where is our government's attention going?

  • Where are the blindspots in parties that hold power?

I want to name this gap in understanding explicitly: what you see on your social feed is NOT what everyone sees on their social feeds. This is by design. It's called an algorithm, and it's what the business model of every social media platform is built on.


The illusion of an accurate cultural pulse

Simply put, a social media algorithm is the automated decision-making framework that decides what each user will see on their feed. Think of an IF/THEN statement that assesses behavior and patterns across a large body of users and then decides what content, when shown to a specific user, will meet the company's optimization objective.


IF user393 shows a preference, with clicks and attention, for: 

(x) cat videos 

THEN show user393 more cat videos 
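
To make that IF/THEN concrete, here is a minimal sketch in Python. The click log, topic names, and candidate posts are all invented for illustration -- a toy, not any platform's actual system:

from collections import Counter

# Hypothetical interaction log: which topics user393 clicked on.
clicks = ["cat_videos", "cat_videos", "cooking", "cat_videos"]

# Count clicks per topic to estimate preference.
preferences = Counter(clicks)

# Candidate posts, each tagged with a topic.
candidates = [
    {"id": 1, "topic": "cat_videos"},
    {"id": 2, "topic": "politics"},
    {"id": 3, "topic": "cooking"},
]

# IF the user clicked a topic more often, THEN rank posts on that topic higher.
feed = sorted(candidates, key=lambda post: preferences[post["topic"]], reverse=True)
print([post["id"] for post in feed])  # the cat video comes first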

This is a fine algorithm when the stakes are cat videos, or otherwise confined to the realm of entertainment. But social media (which, let's be real, is most of what people mean when they talk about their experiences on 'the internet') is not strictly entertainment.


Social media is the public square. It's the way people get information about their communities and governments, and it's how people check the pulse of the world. All media is social. The algorithms are also tailored to preferences that extend well beyond what makes us laugh. These algorithms optimize for engagement and attention -- and what do we engage with most?

  • Ideas and people who are most like us and make us feel good (GO, US!)

  • Ideas and people who are most unlike us and make us angry (NO, THEM!)


In practice, these algorithms function more like this:

IF user393 shows a preference, with clicks and attention, for: 

(x) content that reflects their worldview and makes them feel strongly 
(x) content that contradicts their worldview and makes them feel strongly 

THEN show user393 more content that makes them feel strongly 
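
As a rough illustration (again a toy sketch with invented scores, not any platform's actual model), an engagement-optimized ranker doesn't care whether the predicted reaction is agreement or outrage -- only that it is strong:

# Toy engagement ranker: strong feelings in either direction rank higher.
candidates = [
    {"id": 1, "agreement": 0.9, "outrage": 0.0},   # "GO, US!"
    {"id": 2, "agreement": 0.0, "outrage": 0.9},   # "NO, THEM!"
    {"id": 3, "agreement": 0.4, "outrage": 0.1},   # mild, merely informative
]

def predicted_engagement(post):
    # The objective rewards intensity, not accuracy or balance.
    return max(post["agreement"], post["outrage"])

feed = sorted(candidates, key=predicted_engagement, reverse=True)
print([post["id"] for post in feed])  # the mild post ranks last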

What any individual sees on their feed, their version of the internet, is more of a reflection of what activates them than a reflection of the world.


I want to say that again: what you see on your phone is a reflection of what social media companies believe you will engage with most -- not a reflection of the world as it is, and definitely not a reflection of what OTHER people care about.


Social media is filled with tiny pictures and names of people we know, and people we think we know; the interface is designed to make it feel like an accurate and trusted reflection of our community. But we must resist the urge, and the egoism, of thinking that what other people care about can be accurately gauged from our singular experience of the internet.


A better approach to checking the pulse of the culture

These questions -- Who has power? Who is marginalized? Where is our collective attention, and how can we focus it on the issues that matter most? -- are important. So important that we need better tools beyond looking at our phones and thinking that what we see is a shared experience of the internet.


I don't have all the answers, but here are a few methods to expand your cultural listening toolkit.


  1. Slow down -- lean into long form | Algorithms work at the pace of computers: they make quick decisions based on statistics, probability, and patterns. Humans need more time and more inputs to make good decisions. Take in information at a human pace and read long-form articles, carefully researched, from a variety of trusted sources.

  2. Work through your thoughts offline | The internet is a great place for accessing information -- but it's not always a great place for processing and interpreting it. Algorithms measure engagement with clicks and time spent looking at a screen. Engage with ideas instead in conversation, in listening, and in fumbling through them together, so that our ideas and opinions become more thoroughly considered.

    1. This could sound like, "Hey friend, I'm reading a lot about X online and I'm working through my own thoughts on the issue. Can we play in the sandbox together with these ideas?"

  3. Diversify your algorithm, intentionally | Algorithms predict what you want to see based on how you identify yourself, and our culture around social media often posits that who you follow is a reflection of your values and interests. But if you want to use social media as a tool to check the pulse of the culture, diversify your feed with people who are different from you.

    1. Note: Do not feed the trolls. Do not follow radical groups who can monetize your eyeballs. Do not engage with bad-faith actors. But your cousin who posts things you'd rather not talk about at Thanksgiving? Maybe this is a chance to listen to that corner of the internet -- it's shaping your world whether you like it or not.


Learning to live with our biases


Algorithms have contributed to society in many positive ways, and we need to learn how to live with them in a way that serves people and society -- not just the corporate entities and shareholders of the internet's infrastructure. If we zoom out, we are in the earliest chapters of an internet-enabled society; future historians will look back at this moment and trace far-reaching consequences to it that we can't even consider yet.


We can face this reality with fear of the unknown, or with the generosity and humility of knowing that there is a learning curve to every new technology and that adoption is filled with pitfalls and dangers. We are living through those dangers now and seeing the consequences of context collapse, radicalization, and polarization. But we have a choice to treat these technologies not as inevitable forces, but as tools that can be used for creation or destruction.


In order to make the choice to use social media as a creative tool, we must recognize its limitations. We must recognize that the underlying algorithms are not neutral and that, in their current state, they are not designed to optimize for an accurate reflection of social sentiment or experience.


Your experience of what you see on your phone is just that: your experience. Acknowledging that limitation is the first step in getting a better pulse on what people are talking about.




