With Regards to Your Letter on Meaningful Interaction & Time Well Spent
A few days ago, you announced that your number one goal for 2018 was to make Facebook “Time Well Spent”.
This was especially gratifying for me because five years ago, I coined this term in a conversation with Tristan Harris (who’s worked tirelessly to spread and elaborate the concept, turning it into a movement).
Back in 2013, Tristan and I were worried about the entire tech industry, but your product’s News Feed was then — and is still — our best example of what needed to change. And that was before election manipulation, fake news, teen depression and suicide, and worries about children’s videos.
Now that you’re on board with Time Well Spent, let’s get practical about how a company like Facebook (and an industry like consumer tech) can be retooled around “meaningful interactions” instead of engagement.
But first, how did Facebook screw this up? Popular articles assign the blame variously: to the advertising business model, centralization, tech bro culture, tech-giant monopolies, or just capitalism-as-usual.
But I think the blame lies somewhere else: in the nature of software itself. I believe even the most well-intentioned teams, operating in the best possible culture, would struggle with meaningful interactions and time well spent. Even in a small startup. Even at an open source, peer-to-peer nonprofit.
Why do I think so? I’ll tell you in this letter. Then, in the follow-up essay, I’ll say what to do about it.
It’s possible (but very tricky) to design software that addresses users’ sense of meaning. But it requires profound changes to how software gets made! These changes make the others your company has gone through (such as the adoption of machine learning or the transition from web to mobile) look easy.
But that’s what it will take.
On Social Software & Meaningful Interactions
Sometimes new social software works out well. Few people would show up to protest Wikipedia, Couchsurfing, or Meetup, for example. These products — and the social changes that come with them — are welcomed, even embraced.
But people are less enthusiastic about Facebook, Twitter, the “Fake News” ecosystem, Uber, AirBnb, and even smartphones themselves. Why are reactions to these systems different? I think we need the concept of values to understand:
Values: The ideas a person has about how they want to live, especially ideas about what kinds of relationships and what kinds of actions are of lasting importance in their life.¹
Values are like vertebrae: even if you never think about them, you have them, and they structure much of what you do. Values are ideas that direct the manner in which you act, rather than the outcomes you want. Let’s say you’re planning a social event, like the F8 conference you put on once a year. You might have a goal in mind, maybe “getting a lot of people to participate”. But while you craft your invitation, you also have a manner in which you pursue that goal (perhaps honestly or cleverly). Maybe you were inspired by a friend who writes cleverly, or by another invitation from someone who spoke honestly and from the heart.
And here’s the problem: generally speaking, your product (Facebook) makes it more difficult for all of us to live according to our values.
When a person spends hours on News Feed before bed, are they cultivating the type of social relationships they believe in? Are they engaging in acts of lasting importance?
Maybe! Facebook can be used in all sorts of ways. Perhaps this sleepy individual was planning a political revolution, or getting feedback on their first-ever breakdance video.
But many of us wake up the next day feeling like our late-night scrolling session was a waste of time. That’s because living according to our values doesn’t happen automatically. Some social environments make being honest more difficult, while others make it easier. It is similar with courage, creativity, and with every other manner in which a person wants to act or relate to others.
As we’ll see below, social software simplifies and expedites certain social relationships, and certain actions, at the expense of others. And if the simplified actions and relationships weren’t designed with a user’s particular values in mind, then using the software can make living by those values more difficult, which leaves them feeling like their time was not well spent.
For example, it may be harder to live by the value of honesty on Instagram, if honest posts get fewer likes. Similarly, a courageous statement on Twitter could lead to harassing replies. On every platform, a person who wants to be attentive to their friends can find themselves in a state of frazzled distraction.
As users, we end up acting and socializing in ways we don’t believe in, and later regret. We act against our values: by procrastinating instead of working, by avoiding our feelings, by pandering to other people’s opinions, by participating in a hateful mob reacting to the news, and so on.
This is one of the hidden costs of social software. Let’s call it the cost of values-misaligned systems.
Any social environment can be misaligned with our values, but with social software it is harder to resist. Compared to past social systems — governed by social conventions or laws — software gives less space for personal reinterpretation or disobedience. It tends to code up exactly how we are intended to interact.
Our Choices Are Structured
Consider how social conventions shape our lives: teenagers are ostracized for wearing the wrong clothes, adults for saying the wrong words or spouting unpopular beliefs. But it’s still possible to flout convention, and by operating expressively, outside of conventions, a person can sometimes initiate a new trend or subculture. With software, on the other hand, acting in a way the designers didn’t intend is often impossible: a user can’t sing “Thrift Shop” to a stranger on Tinder or wear their Facebook cover photo on the bottom of the screen. The software has structured the sequence and style with which they interact.²
We see something similar if we compare software with laws. Imagine if Twitter were implemented through government regulation: there’d be a law about how many letters you used when you spoke, and an ordinance deciding who wore a checkmark near their face and who didn’t. Imagine bureaucrats deciding who’s visible to the public, and who gets ignored. Could a law make you carry around and display everything you’d recently said?
In practice, laws can’t structure social life that tightly. Even in the worst dictatorships — when the Nazis had Jews wear stars — they couldn’t ensure complete compliance. As law, the “Twitter Code” would be impossible to enforce. But as software, it’s impossible not to comply.
Social software is therefore different from laws and social conventions.³ It guides us much more strictly through certain actions and ways of relating. As a result, we have less of a chance to pursue our own values. The coded structure of push notifications makes it harder to prioritize a value of personal focus; the coded structure of likes makes it harder to prioritize not relying on others’ opinions; and similar structures interfere with other values, like being honest or kind to people, being thoughtful, etc.
This doesn’t just cause problems for individuals. In the follow-up essay, I show that the problems I mentioned at the outset — election manipulation, fake news, internet addiction, teen isolation/depression/suicide, the mistreatment of children — are fueled by the fact that actors are guided along in ways that don’t accord with anyone’s values.
So Zuck, let’s return to the issues that you are struggling with: meaningful interactions, time well spent, and the future of politics.
Since they are connected to the nature of software, they won’t be an easy fix. But I believe you’re serious about making a time-well-spent Facebook, and serious about addressing the harms to democracy and society.
So what can you do? What can any software team do?
There are two approaches that could work: in the long term, you (and other technologists) can learn to build software that’s less constraining, software that works more like social conventions, which can be defied, expressively reinterpreted, and remodeled by the user. But realistically, that will take decades of research, innovation, business change, and cultural evolution to achieve.⁴
The only other option (besides rejecting the idea of social software entirely) is to learn a lot about values, and to explicitly redesign everything to be as value-aligned as possible, making room for the broad diversity of values of your users.
If you go this route, technology products must be re-conceptualized. They must be considered as spaces: virtual places where people struggle to live out the acts and relationships they find meaningful.
Teams will face questions like these:
- What variety of values do users hold?
- For each such value, are there features of social spaces that make practicing it easier?
- How do users decide which values to bring into their socializing? How can software support this decision?
- Are there more or less meaningful kinds of conversations? Is there a way to identify less value-aligned talk?
- Can we accomplish all of this without imposing our own corporate or personal values?⁵
Tackling such questions may seem impossible. But this kind of focus on meaning can work out. It’s part of what’s made Couchsurfing (in its heyday), Meetup, and Wikipedia less objectionable than Facebook has been. They were designed with a deeper understanding and prioritization of the values of their users, as spaces for practicing those values. A complex and general product like Facebook will have to go much further in this direction.