Scatological Sensibilities

growthfetish

An exchange seen in the droppings of the bird

I follow @hardmaru and @BenedictEvans on Twitter. Both are people I follow because of their keen view of their field and interesting insights. TL;DR: I follow them to learn from them.

So about a month ago, the bird surfaced this chain of comments at me:

  • “Many people simply don’t want an algorithm to decide what they should see. And we should respect that. (note: includes images)” — @hardmaru

  • “Then they’ll see 1-2k posts per day – that’s how much is actually posted. But of course no one will scroll that much. So they’ll see a random sample of whatever was posted in the hour or so before they open the app, because that’s all they’ll get through.” — @BenedictEvans

  • “I think it depends on the number of “friends” one has, but yeah someone might be dumped 1-2k posts/day if they have a thousand “friends”. But hey, if that is what they want, they should have that option...” — @hardmaru

  • “Average posts/day the average user is eligible to see is 1-2k. 150 friends, post/comment/like/share a photo a total of 10 times each, and there you are at 1500. Chronological feeds are just too big.” — @BenedictEvans

  • “Chiming in here to say my opinion here is there are other options aside from: users have to see everything with no control / users see what an algorithm says with no control. (I think the answer is finding ways to put the user in control)” — @jezzamonn


Ranting behind a protected account. Subtweeting on steroids.

That exchange upset me. Enough that I wrote a lot of tweets in a rant.

My account was protected at the time. It was fun having a locked-down account. I could speak my mind even though I knew Benedict wasn't going to change his, much less see my rant.

Being locked-down, I could stream out an emotionally inspired rant, tagging it onto his comment knowing he was none the wiser. He never seemed to block me, so I suspect no one told him about it. Or no one saw. Either way, I got it off my chest and I felt better.

In the end, those tweets I made were really just venting. I saw people's comments on the topic of Facebook's algorithmic feeds, and that they were causing pain. Then I saw Benedict's comments showing up in my feed. And I felt anger.

It makes me angry because of my own persistent lens on the topic:

“The personas targeted by shady marketing and #darkpatterns have the faces of my friends and family. I rage when I see them being targeted.” — @ultimape

So I deleted the tweets I made and instead decided to write about it here.

I'm not from the camp of people who frame VCs from SV as uncaring monsters. I choose to capture the anger I felt and try to see where he was coming from. Even in my rant, I wasn't trying to be uncharitable in my framing. I could have used harsh words, like calling him callous and pointing out how his obstinate framing of this topic is a broken record. But I didn't.

I've been chewing on these aggressive thoughts a bit to try and understand them. Digesting my thoughts. Letting the ideas stew in my mouth for 29 days. And now I'm spitting them out.

My goal here isn't to harass Benedict, but to understand him. To try and frame my own understanding of his point of view and hopefully explain where my own is coming from. To help me be less angry next time.


Riding the wave of emotional turmoil.

So after chewing on my anger for a while, I think I figured out what was happening in that original exchange. It feels like an artifact of Benedict's expertise.

The thing you have to understand about Benedict is that he's a great strategist. I'm fairly certain this is a big reason why he's a partner at A16Z. He's smart, sharp, and his narratives are strong. He can see things from the perspective of an org. He is able to take this perspective, map out the larger territory of a software space, and see how features manifest to create the desired results.

When I'm at my best, I would be flattering myself if I said I was “half as astute in thinking about how all of these things fit together”. At least in software land. There's a reason why I follow him to learn.

However, in this exchange, this strength was his weakness. I think @hardmaru was empathizing. I, as a passive reader, was also empathizing. And Benedict? Well, he was in systematizing mode as far as I can tell. And like oil and water, it didn't mix well in my head.

He's not looking at it from a user experience lens, or at least if he is, he's not mentioning that framing very much. If he's looking at it from the organization's perspective, well, it's not the most productive frame to take. It comes off as more of an ¯\_(ツ)_/¯ response than one trying to empathize with the user's sentiment.

I think that is what was making me angry. It felt like he was disrespecting and disregarding people's perspectives and views. He was replying with his bit about how feeds are limited and necessarily need to be filtered. And he seems to miss the point about it expressing a feeling of control, the core issue. This is a classic Empathizing vs Systematizing divide, and one that I see all too often in myself.

Replacing Outrage with Curiosity

It would be unfair of me to say he's ignorant of the way our info streams are being managed. Some of his other tweets on similar topics suggest that he's not entirely disregarding the user experience. I've seen examples where he's more than willing to speak his mind about how shitty some of these things become under the wrong incentives.

Heck, he even pointed out the very reason why I stopped using, and eventually deleted my own Instagram:

  • “I know Instagram needs to hit its numbers, but a low-quality and irrelevant ad every 5 posts is excessive. No, I am not renting an apartment in Jersey, nor do I want to learn Russian so I can speak to women.” — @BenedictEvans

  • “If I report an ad for abuse, I really shouldn’t see it again. Or is that the equivalent of the ‘close door’ button on elevators?” — @BenedictEvans

My respect for Benedict is because of his trend-spotting, and the way he explains them. So it makes me really happy that I think I saw this before him. Though maybe I was just part of a different A/B testing group?

“I've stopped using Instagram now that every 4th post is an advertisement. I haven't posted since October. What started as a way for me to share my world and connect with people is now just another vehicle advertisers use to inject their bullshit into my world.” — @ultimape

Like a broken record.

Scrolling thru Benedict's tweets, much of his discussion ends up taking the same shape as this perception filter. Others have tried to argue roughly the same thing to him that I'm about to argue in the rant. But it is like he's got this statistic of 300 statuses and a 1500-2000 limit stuck in his head.

You can see it here in his musings about the direction of where algorithms are going next:

  • “All social apps grow until you need a newsfeed All newsfeeds grow until you need an algorithmic feed All algorithmic feeds grow until you get fed up of not seeing stuff/seeing the wrong stuff & leave for new apps with less overload All those new apps grow until...” — @BenedictEvans

  • “Why can't the users get a choice? Algorithmic or Raw!” — @zahidtg

  • “raw is a filter too. you only have so much time in the day” — @BenedictEvans

  • “Raw is not a filter, it just leaves the filtering to the user, which I prefer” — @klangberater

  • “With 1500-2000 items, then the filter is time, and the ordering is random.” — @BenedictEvans

  • “Yes, but I still prefer that compared to an algorithm where my preferences are only a small part of the equation” — @klangberater

  • “Raw is transparent to the consumer; their own sliding scale of commitment vs value/reward. Algorithmic is opaque and, overtime, degrades the user’s perceived value of the feed content. The big devaluation point comes when you see you missed something you badly wish you hadn’t” — @1jrice

And when I take on his frame, I don't disagree. I even remember reading the same report that cited those statistics when it came out.

For reference, I think what he originally read back in 2013 was directly from the engineering team (or maybe he heard it thru the grapevine?):

“This allows us to prioritize an average of 300 stories out of these 1,500 stories to show each day.” News Feed FYI: A Window Into News Feed

But it might have been this Guardian piece from 2014:

“Backstrom explained in August that Facebook's news feed algorithm boils down the 1,500 posts that could be shown a day in the average news feed into around 300 that it “prioritises”.” How does Facebook decide what to show in my news feed?

Or the article in Time magazine that broke down the history:

“Facebook says the average user has access to about 1,500 posts per day but only looks at 300. (A user who scrolls endlessly will eventually see every post from their friends and a smattering of posts from Pages they follow.)” Here's How Facebook's News Feed Actually Works

And I think it will take more than my spiel here to change his thoughts on it. It seems to be part of his head canon and narrative. A strong point in his larger theorizing. He repeats it often while building up to larger premises.

“The average FB user is eligibile to see 1500-2000 items a day. This sounds absurd, but it’s only 150 friends posting, liking, sharing 10 times a day each. And 3 seconds x 1500 items is 75 minutes. There will always be a filter – the only question is what kind.” — @BenedictEvans
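His back-of-envelope math checks out, and is easy to reproduce. A quick sketch using the figures from his tweet (150 friends, 10 actions each per day, 3 seconds to skim an item):

```python
# Back-of-envelope reproduction of the feed-volume math.
friends = 150          # a Dunbar-number-sized friend graph
actions_per_day = 10   # posts/comments/likes/shares per friend per day
seconds_per_item = 3   # skim time per feed item

eligible_items = friends * actions_per_day                # items a user is eligible to see
reading_minutes = eligible_items * seconds_per_item / 60  # time to skim all of them

print(eligible_items, reading_minutes)  # -> 1500 75.0
```

Which is exactly his point: 1500 items and 75 minutes a day. There will always be a filter of some kind; the disagreement is over who holds the dials.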

Stuck in the mud

His lens and responses always seem to be about growth and effectiveness of the design in terms of the org's desires. You can look up “feed” and see it in all his responses. It is like a myopia biasing his thought patterns.

I might be projecting my own negative stance here, but it seems he's got his head stuck on the question of “what kind of filter” – it assumes that Facebook's only option is to paternalistically filter your feed for you. Like a broken record, this topic comes up, and I think he misses the point people try to make every time.

  • “Joining Twitter, you have to spend weeks/months working out who to follow before it gets great. An automated feed is an obvious solution” — @BenedictEvans

  • “@BenedictEvans people are stupid and can't look at lists? oh my.” — @CaseyParksIt

  • “@CaseyParksIt that’s not a sensible view. Certainly isn’t what I said” — @BenedictEvans

  • “@BenedictEvans an automated feed is not an obvious solution unless you think ppl are too stupid to figure this simple design out.” — @CaseyParksIt

  • “@CaseyParksIt it is self-evident that the current UX is baffling to most people. Hence Twitter has stopped getting new users.” — @BenedictEvans

And like a car stuck in the mud, he keeps kicking up the same sort of muck while trying to dig himself out of his entrenched way of thinking:

But if Google and Facebook don't have any control of what is in your feed, and you don't have any control of what is in your feed... who does? From where I'm standing, this is why government regulators are stepping in and asking questions. They want to limit that control (or take it?).

And if you read his article, he describes the very problem I'd like to highlight with the way these systems are built. A different sort of point from what he is getting at with his ideas.

“while this is explicit for Google, it's implicit for Facebook. You tell Google explicitly what you want and you don't think you tell Facebook, but actually you've spent months and years telling it, through everything you've interacted with or ignored. Facebook makes technical, mechanistic judgements about what will be in your newsfeed that are just as bound by things beyond its control – by the internet – as Google's are. It's an index of its users. Every now and then, it decides that it's got off track, no longer aligns with users, and course-corrects. [...] This means that Facebook is surfing user behaviour, and must go where the user takes it. This is why it looks like such an unreliable partner: it will invite you onto the surf board, certainly, but if you're unbalancing the board then it will push you off, and that isn't Facebook's choice. If it didn't push you off then the board would upset and Facebook would be at the bottom of the ocean as well, next to MySpace. The genius of Facebook has been to stay on the board all this time, and especially through the transition from desktop to mobile.”

And that, I think, is the core problem. The reason people don't like algorithmic news feeds is that the filter is being chosen for them. And everybody keeps talking about this, but because of Benedict's expertise and default perspective on the issue, he comes off as blind to the experience.

So if I use his metaphor about Google / Facebook being surfing platforms, I think I can frame the problem that the users are running into. The source of their frustration, and Benedict's apparent blind-spot.

The users (riders?) want to be able to choose the style of the board based on what they are comfortable with.

  • Maybe they want a wake-board experience – to be pulled along behind the stream currents of a fast ship.

  • Maybe they are a pro-surfer and need something lightweight and highly glossed so they can execute cool tricks on the waves.

  • Maybe they're just there to hang out with their friends and need a paddle board, wanting to stay away from the waves.

But Facebook decides all of this for you, by “making technical, mechanistic judgements about what will be in your newsfeed”.

And so, now that I have a better understanding of where he is coming from, I am going to recapitulate the rant I had written.


The rant

THIS IS ABOUT CONTROL.

Listen to what is being whispered, not what is being said. The subtext... the subtext is why articles like this get so many clicks: GOOGLE’S SELFISH LEDGER IS AN UNSETTLING VISION OF SILICON VALLEY SOCIAL ENGINEERING

The subtext is a fear over the loss of control. Fear over silicon valley manipulating the masses. Or someone manipulating them. Sock-puppets, bots, fake accounts, fake news. Lies. Misdirection. Forced manipulation of our attention.

This is the conversation our networked society is whispering. They just aren't saying it with the right words.

Who is controlling the Surfboard?

Facebook presents a black-box system that is constantly changing and not transparent. This leads to all sorts of emotional buttons being pushed. Loss-aversion, FOMO, etc.

People are frustrated by it. So they blame “algorithms” when the real pain they are feeling is an unease about paternalistic design.

Playing poindexter over the definition of algorithms is completely missing the point. People use the language that is presented to them thru modern culture. Algorithms are the demons haunting them; says the media. And so they parrot the language they were told to express their hatred for the demons.

Now, 95% of the junk out there is conspiracy theory about Facebook. But the underlying whisper? Fear + loss of control. Over time, these feelings translate into fear of subterfuge and evil motive. But framing your user-base as 1) ignorant or 2) crazy is not how to generate an empathetic design.

People are so frustrated and disillusioned by the lack of control and transparency, that the governments of the world are starting to take notice. Albert Wenger talked about this in 2015 (emphasis mine):

“This is important, not just for drivers. We are all freelance workers on Facebook and on Twitter and on all these big social networks. Yes we in part we get paid through free services, free image storage, free communication tools. But we’re also creating value. And it’s not just the distribution of value that we’re worried about, we’re also worried about what do these companies do? We’re worried about questions such as censorship. We’re worried about questions such as, are we being manipulated by what’s being shown to us in the feed? And at the moment, what regulators are doing is they’re trying to come up with ad-hoc regulations to regulate each and every one of these aspects. And many of these ad-hoc regulations, are going to have completely unintended consequences. And often these consequences will be bad. Let me just give you one example. The European Union has said, if you want to have information on people who live in the European Union, you have to keep it on European Union servers. That actually makes it harder for new networks to get started, not easier. It actually cements the role of the existing networks instead of saying we need to create opportunities for competition with existing networks.” BIG and BOT Policy Proposals (Transcript)

This is Mechanism Design.

Do you know how people respond to a “random” feed? By following more people to increase the chance they'll see the content they want. My mom had 8 Facebook accounts – she wanted to make sure she didn't miss any of her kids' posts. Fighting an algorithm she had no control over.

Do you know how people respond to a “random” feed? By posting more content to increase the chance they'll be seen. And if their livelihood depends on it, they'll pay for and join bot-rings just to get seen: Real People Are Turning Their Accounts Into Bots On Instagram — And Cashing In

People are so sick of the lack of control that there are even art projects being written that are effectively 'chaffing & winnowing' the “algorithms” to hide themselves. Confuse Facebook's Algorithms with the 'Go Rando' Web Extension

This is what happens when under-thought design butts up against basic human frailties in reasoning about time. Whispers of corrupt algorithms slither thru the user-base. They don't quite understand. It coalesces into asinine requests to get random sampling of part of their feed. “Protect us from Ourselves”

Social foraging as a swarm intelligence filter? “It's Not Information Overload. It's Filter Failure.”

Design problems? Design Solutions.

So we miss the point of what people are saying here. They want feature X because they have problem Y; they don't know how to express Y, but they know X would solve it. Give them Z.

And yes, there is a problem here to solve, but it is one of mechanism design.

How Game Designers Protect Players From Themselves | Game Maker's Toolkit

... But would that ruin the ability for Facebook to slip ads into the stream and manipulate people's behavior?

It's not about the algorithm; the algorithm is a red herring. People would be fine with algorithms if they were something they could curate and opt into themselves. Heck, the problem would fix itself if you turned on the spigot of chronologically ordered content and presented the choice to opt in.

This suggests what people really want: better filtering tools that they can control – even if that just means a dashboard that lets them choose how the feed biases and curates, and a view of which factors led to it.
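As a thought experiment, such a dashboard could be as simple as exposing the ranking weights to the user and showing the per-post factor breakdown. A minimal sketch of the idea; the `Post` fields, weight names, and example posts are all hypothetical, invented purely to illustrate:

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    age_hours: float
    is_close_friend: bool
    is_page_content: bool

def score(post, weights):
    """Score a post AND return the factor breakdown, so nothing is hidden."""
    factors = {
        "recency": -weights["recency"] * post.age_hours,
        "close_friend": weights["close_friend"] * post.is_close_friend,
        "page_penalty": -weights["page_penalty"] * post.is_page_content,
    }
    return sum(factors.values()), factors

# The user's own dial settings -- their dashboard, not Facebook's.
my_weights = {"recency": 1.0, "close_friend": 5.0, "page_penalty": 3.0}

posts = [
    Post("mom", age_hours=6.0, is_close_friend=True, is_page_content=False),
    Post("SomeBrand", age_hours=0.5, is_close_friend=False, is_page_content=True),
]
ranked = sorted(posts, key=lambda p: score(p, my_weights)[0], reverse=True)
print([p.author for p in ranked])  # -> ['mom', 'SomeBrand']
```

An older post from a close friend outranks a fresh brand post, because the user cranked up the close-friend dial and penalized page content. The point isn't this particular scoring scheme; it's that the weights and the breakdown are visible and user-owned.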

But they might make a choice that doesn't lead to Facebook's targets. They might be night owls, and their friends might not be up, so they might not use the platform as much. An algorithmic feed controlled by Facebook means the incentive is for a dopamine drip to boost engagement.

Or worse, Facebook decides that it wants to show how many friends have done or liked something as a form of social proof. It's nudge theory, but weaponized in a way that is outside the user's control. It feels manipulative, but users can't quite say why.

So why the focus on the feed?

What does the want of an unfiltered linear feed mean? What are people really asking for when they ask for that? What pain are they solving for when they make this request?

A linear, chronologically ordered feed is predictable. It's not hiding anything, it's not wresting control away from you. It isn't manipulating you in a way you can't ascertain. That should be the baseline.

Arguing for algorithmic feeds is fine, but it should never take away a user's sense of control. If something is hidden, it better damn well be because I asked the system to explicitly hide that kind of thing from me. I don't want some hidden algorithm tuned to manipulate me, and I especially don't want it presented to me under a guise of paternalism. That smells like bullshit.

But of course Facebook hasn't done that. They turned all the things you had 'liked' into pages, and handed control of those pages over to random people. Suddenly your feed was full of content from brands who had snuck in thru Girardian-style mimetic signaling. And they're using it to manipulate us, as far as we can tell...

And it's creepy.

“So half the Earth's Internet population is using Facebook. They are a site, along with others, that has allowed people to create an online persona with very little technical skill, and people responded by putting huge amounts of personal data online. So the result is that we have behavioral, preference, demographic data for hundreds of millions of people, which is unprecedented in history. And as a computer scientist, what this means is that I've been able to build models that can predict all sorts of hidden attributes for all of you that you don't even know you're sharing information about.” Your social media “likes” expose more than you think

People started complaining they couldn't see all their friends. But the options at the time ran counter to Facebook's intention of being a platform for celebrities, brands and community building thru pages. They are only just now undoing this crappy mechanism design mistake.

Even their ads don't admit the mistake. They talk about friends and friends of friends, and all the crap that started polluting the feed. But the cat's out of the bag and the ecosystem is polluted with people who have built up lives around those pages.

Facebook Here Together (UK)

And when we step back and wonder what is going on? We see something fishy and it smells rotten. Facebook moves 1.5bn users out of reach of new European privacy law

TL;DR: Is this Loss?

The incentives aren't there, and the arguments for changing this are misunderstood. Which is why I deleted my Facebook even though it's the only way I can contact my dad.

I miss my dad.

And now you know my perspective.


#rant #socialmedia #technology #userexperience #ifightfortheusers #refactoring #techdetox #detoxingthecommons #growthfetish


'Scat Sense' is a personal blog written by Nicholas '@ultimape' Perry. Follow them on the Fediverse here: @ultimape@mastodon.social