Facebook's Business Model? Capitalize on Human Nature.
Facebook capitalizes on human nature — quite literally, in the form of $40 billion in annual revenue. In the era of viral fake news, we hear the same rhetoric about this hacking of human nature again and again: "confirmation bias," "confirmation bias," "confirmation bias." It's true: We love being told that our views are right, and we love seeing evidence to the same end. That's part of Facebook's appeal, especially in a polarized and divisive political climate.
But the permeation of fake news through media discourse has focused the public conversation too heavily on confirmation bias as the explanation for Facebook's exploitation of basic human desires. The behavioral roots of our addiction to Facebook run much deeper than view-confirming news feeds, and it's time we brought these other factors back into the conversation.
First, some context. In 2004, there was no such thing as Facebook; MySpace was the dominant social network. But by 2008, one in five people who accessed the Internet visited Facebook's website. By January 2009, Facebook had nearly doubled MySpace's monthly page views and surpassed it as the largest social network in the world — though MySpace retained its U.S. dominance. Shortly thereafter, Facebook stole that too. By 2012, Facebook had reached one billion active users. Today, it boasts over 2.23 billion monthly users. In what seemed like a flash, the company had overtaken the online social media space.
The reason, simply put, is that Facebook is like a drug, capitalizing on our innermost human desires; by designing its business model around conditioning and predicting human behavior, Facebook has put behavioral science at the core of its profits. Sean Parker, Facebook's founding president, said the thinking behind the company's early design was: "How do we consume as much of your time and conscious attention as possible?" (In hindsight, of course, he admits that "God only knows what it's doing to our children's brains.") Like most other social media companies, Facebook has a simple business model: The more time you spend on Facebook, the more ads you see, and the more money the company makes. Capitalizing on human nature is an easy way to guarantee profits in a world of too much content and too many strains on our attention.
As human beings, we want to feel accepted and appreciated by those around us. Social media creates the mechanism to fill this need, referred to as belongingness.
We share something on Facebook and wait for a reaction. If a friend "likes" it, that boosts our self-esteem and prompts us to share more. More favorable comments lead to even more posting, and this cycle of positive reinforcement repeats itself indefinitely. Since Facebook users typically join the platform by virtually connecting with real-life friends, our desire for belongingness offline is translated almost directly into our desire for belongingness online. As our networks grow from just a circle of real-world friends to a circle of real-world friends-of-friends, and even from there to people we meet entirely online, Facebook bolsters our feelings of social acceptance and belonging — the online version of being popular — so we associate its platform with a sense of community.
On the bright side, Facebook offers a way for us to feel connected and valued in society, especially as social media "influence" carries more and more weight offline. The downside is that relationships have now been quantified and gamified: Likes, followers, and friends are the new mechanisms of comparison. Yet even as we grapple with this reality — and with its ethical and mental health implications — we keep engaging with the platform, driven by the sense of belonging it provides. Facebook keeps raking in the money.
Social comparison is nothing new. The idiom "keeping up with the Joneses" originated with a comic strip dating back to 1913, referring to comparison with neighbors as a barometer of social standing. Seeing how we measure up against others occurs not only on Facebook but in almost all aspects of life, whether the comparison is an alma mater, a job title, wealth, physical appearance, or any of myriad other factors. In fact, this natural human tendency to compare oneself with others grew out of the necessity to survive. As humans evolved into complex social creatures, our ability to compare individuals, objects, and behaviors became critical to success.
Jumping forward to modern times, Facebook has seized upon this natural human tendency toward social comparison. We're going to compare ourselves anyway, so the company has built a way for us to do it online — via a platform through which it can mine our information and show us ads (read: make money) in the process.
This desire for belongingness and social comparison isn't relegated to the dopamine hits that come with an accepted friend request or a "like" on a post. By giving us a way to escape reality and construct an entirely new one, Facebook has deliberately built an addictive environment steeped in social comparison and games of self-representation. Users choose how to present themselves to the world online; in many cases, they consciously present themselves quite differently than they do in the real world. Women manipulate images of their bodies. Users carefully curate their profiles to reflect prevailing social norms and dominant life experiences. As Facebook enters the dating space, it will introduce further social comparison — and further incentive for misrepresentation — into its platform, keeping us hooked to the site and generating data for its profit-driving algorithms.
In a survey of Facebook and Twitter users in England, 82% of respondents did not feel accurately represented by their social media accounts, whether because of dissatisfaction with their appearance (body image, personality, and so on) or conscious manipulation of it (Photoshop, facetious captions, and the like) in ways that deviate from reality. But we prefer to stay with the crowd even if it means losing our authenticity; social conformity is a survival tactic long ingrained in our behavior. So, once again, Facebook capitalizes on human nature for its own economic benefit.
To add just a few other examples: research has shown that reinforcement learning is critical to social conformity, and Facebook offers exactly this through a constant stream of notifications. We stay online, curating our image, only to receive more signals about image curation — further reinforcing conformity. Linked to both social conformity and belongingness is the fear of missing out, which, it turns out, is a driving factor in joining Facebook; one study found that fear of missing out is "strongly linked" with use of the platform. As Nir Eyal, bestselling author and Stanford Graduate School of Business lecturer, puts it, "What Facebook wants to create an association with is every time you're bored, every time you have a few minutes." And, of course, confirmation bias and echo chambers play a role in our attachment to the platform, too.
As we become socially dependent on Facebook, we remain online and active. We give up our personal information and see only personally tailored news feeds and targeted ads. Facebook makes money all the while. What this means, though, is that basic policy measures around disclosure of data tracking or transparency of machine-learning algorithms will not be enough. Facebook has capitalized directly on our human nature, and that nature — our need for belongingness, drive for social comparison, tendency toward social conformity, fear of missing out — keeps us wired to the platform. It will take far more robust policy measures than disclosures and corporate transparency to get us unhooked.
Ryan Sherman is a student and independent deep learning researcher working on machine learning’s applications in drug development and safe artificial intelligence.