Social media didn't make you an ignorant jerk. It let you become the ignorant jerk you wanted to be.
Ben Dreyfuss is responding.
(This is the first of a series of posts in response to Jonathan Haidt’s Atlantic essay “Why the past 10 years of American life have been uniquely stupid.”)
When Facebook first launched, users had a profile page that they would fill in with photos, their favorite bands, their relationship status, and other things like that. You would add your friends and then you could go to their profiles and leave messages on their “walls.” You could also write on your own wall. This was called a status update. It was very much about your current status. So much so that it included the word “is” after your name. You were given the prompt “Ben Dreyfuss is…” and would write “watching basketball.”
As on MySpace, you had to put in a bit of effort even to see what your friends were doing. You had to go to their pages. But not long after it launched, Facebook premiered a new homepage that showed you updates from all your connections: “Jeff uploaded a new photo,” “Marsha changed her status to ‘Marsha is at the library.’”
The Facebook news feed was at this stage not really a discovery tool. It was an aggregator. It took a bunch of updates from your friends and showed them to you, but it didn’t really show you content from strangers. It didn’t try to get you new connections.
You could still use Facebook as a discovery tool, but it took more effort. For instance, you could tell Facebook what classes you were taking and look up people in your class and connect with them. You could join groups. You could do any number of things but you had to do them. You had to decide to use Facebook to look up new connections.
Around 2008, things started to change. People were gaining more and more friends and liking more and more brands, and those friends and brands were doing more and more things, and the simple timeline of updates was becoming chaotic. Facebook began giving users the chance to provide feedback on activity. Private feedback: “This activity doesn’t interest me,” etc. People didn’t use that very much, though, and in 2009 Facebook introduced another version: the “Like.” This was public. If you “Liked” someone’s activity, they would know about it, and so would other people. Your “Like” was another bit of content in someone’s feed.
The Like button was incredibly popular, and with this data about your preferences, Facebook was able to get a better idea of how to customize the content in your news feed. By 2011, your news feed was mostly algorithmic, and since it was no longer so cluttered and was now showing you things a computer had good reason to think you might engage with, you started engaging more with what you saw. The algorithm got better and smarter at predicting what you’d engage with, and Facebook rolled out features that made engaging even easier.
For instance, you had been able to “share a link” on Facebook for years. If you saw a link someone shared, you could click on it, comment on it, or Like it. If you Liked it, your friends who engaged with your activity a lot might see that link in their news feeds even if they didn’t know the poster, but probably not. Only people who really engaged with your content, or who had only one or two friends, would see an activity like that. It just wasn’t a strong enough signal. But in 2012 Facebook added a “share” button on mobile, so if I saw that Joe shared a link to a TIME article about puppies that can fly, I could click share and write “wow puppies can fly,” and that’s a much stronger action by me. So a larger number of my friends would see it. And if you click on my post about the TIME article about puppies that can fly that I got from Joe, Facebook will decide that you like content from me, and that you might like content 1) about puppies, 2) from TIME, and 3) from Joe. And maybe you should Like TIME’s Page, and maybe you should be friends with Joe, and maybe you’d be interested in this article about dogs?
This is all a vicious cycle: the more accurately these algorithms predict your behavior, the more predictably you act. Every time the algorithm skips a post you couldn’t care less about, like some random person you went to high school with changing their job, it has more real estate to show you something you will care about from someone else, someone with whom your connection might be purely digital.
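If you want to see the shape of that loop in miniature, here is a deliberately toy sketch in Python. It is not Facebook’s actual ranking code (which isn’t public and is vastly more complicated); the topics, weights, and class names are all invented for illustration. The only point it makes is mechanical: rank by predicted engagement, record what gets engaged with, and the feed converges on more of the same.

```python
from collections import defaultdict

# Toy model of an engagement-ranked feed. All topics and weights are
# invented for illustration; this is not Facebook's real algorithm.

class ToyFeed:
    def __init__(self):
        # Start with no particular preference for any topic.
        self.affinity = defaultdict(lambda: 1.0)

    def rank(self, posts):
        # Score each post by the user's learned affinity for its topic,
        # highest-scoring posts first.
        return sorted(posts, key=lambda p: self.affinity[p["topic"]], reverse=True)

    def record_engagement(self, post):
        # Every like/share nudges the model toward more of the same topic.
        self.affinity[post["topic"]] += 1.0


posts = [
    {"id": 1, "topic": "puppies"},
    {"id": 2, "topic": "local news"},
    {"id": 3, "topic": "puppies"},
    {"id": 4, "topic": "politics"},
]

feed = ToyFeed()
for round_number in range(3):
    ranked = feed.rank(posts)
    print(f"round {round_number}: feed order = {[p['id'] for p in ranked]}")
    # The user engages with whatever is on top, which is exactly what the
    # ranker predicted they'd engage with -- the feedback loop in one line.
    feed.record_engagement(ranked[0])
```

After a couple of rounds the top of this toy feed is all puppies, not because the user ever asked for that, but because engagement is the only signal the ranker optimizes.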
Eventually, this fantasy world of gumdrops and agreement, where everything is posted by people who share your preferences and are outraged by the same things, is punctured by the outside world, and you are reminded that millions of people believe the exact opposite of you, and it gives you a heart attack.
Everything I have written so far is basically the story of how Facebook’s product team, in an attempt to anticipate your desires, ultimately ended up creating perverse incentives that wrought a very dangerous case of epistemic closure on this whole country.
And it’s true! But it omits something important that also happened.