Facebook’s Distorting Lens: The Danger of Social Deference

Recently I got over my revulsion for Facebook and once again activated an account.  I did it in part because, though I dislike the platform for obvious reasons, I feel it’s important to engage with something so monumentally influential.  It’s important to know firsthand what the atmosphere is like, what effects it has on its users, and what changes happen in the environment and what effects those changes seem to have.  I’m quite familiar with the way it creates echo chambers and epistemic bubbles, and with the draining effect it tends to have on my psyche, but in my recent interactions I have been most upset by what seems to be a lack of autonomy in the social realm.  I feel shuffled from post to post without knowing why and without any sense that I can control what and whom I see.  It’s all the more distressing that on Facebook my social interactions are governed by unknown algorithms.  I am troubled by what seems to be an integral part of Facebook, something I’ll call social deference.

It’s impossible to live in the modern world without deferring to others about a great many things.  We simply can’t know firsthand, from the ground up, all the information we need.  The most obvious form of deference is deference about facts.  When I accept someone’s word on something, I take on what they say as my belief.  We defer to doctors about the safety of medications and treatments, to engineers about the safety of our planes and bridges, and to news organizations about the events of the day.  This sort of thing is both commonplace and necessary: it would be difficult to get out of bed without trusting other people to sort some of our facts for us.

There are, on the other hand, facts about which it seems peculiar to defer.  Several years ago, I proposed the following thought experiment.  Suppose that Google offered an app called Google Morals.  You could enter any question about morality—Should I be a vegetarian? Is it permissible to lie to achieve one’s ends? Is abortion permissible?—and Google Morals would give you the answer.  Set aside for the moment that it is unclear just how the app would work or how it would have access to the moral truths; suppose we had reason to believe it was reliable.  Nevertheless, I maintain, there is something peculiar about deferring to Google Morals, something that isn’t peculiar about deferring to Google Maps in order to learn how to get from Springfield to Capital City.  One is shirking one’s responsibility as a person when one simply takes Google’s word on moral matters.

A good part of the problem with moral deference is that we don’t have access to why Google provides the answers it does.  It wouldn’t be a problem if we could “see the work” and understand why Google delivers the verdicts it does.  In that case we likely wouldn’t be deferring at all—we wouldn’t be accepting Google’s verdict simply because it is Google’s output; we would be revising our beliefs because we understood the reasons behind them.  Understanding why something is true, being able to articulate the ins and outs, matters for some of our beliefs—namely the moral beliefs that make us who we are.
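To make the structure of the complaint vivid, here is a minimal sketch in code (entirely hypothetical, since Google Morals is a thought experiment; every name and verdict below is invented) of the difference between an oracle that hands down bare verdicts and one that shows its work:

```python
# Hypothetical sketch: two imagined moral-oracle interfaces. Neither
# corresponds to any real API; the names and outputs are invented to
# illustrate the difference between bare deference and belief revision
# grounded in reasons.

from dataclasses import dataclass


@dataclass
class Verdict:
    answer: str          # the oracle's output, e.g. "impermissible"
    reasons: list[str]   # the considerations behind it, if exposed


def bare_oracle(question: str) -> Verdict:
    """Google-Morals-style: a verdict with no visible reasoning.
    Accepting this output just because the oracle said so is deference."""
    return Verdict(answer="impermissible", reasons=[])


def transparent_oracle(question: str) -> Verdict:
    """An oracle that 'shows the work': because we can weigh the listed
    reasons ourselves, adopting the verdict need not be deference."""
    return Verdict(
        answer="impermissible",
        reasons=[
            "it treats the other person merely as a means",
            "it erodes the trust the relationship depends on",
        ],
    )
```

The contrast is structural: with the second interface we revise our belief because we grasp the reasons, not merely because the oracle spoke.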

OK, so suppose this is right; what does it have to do with Facebook?  It strikes me that Facebook encourages a sort of deference that is likely just as problematic as moral deference.  Call it social deference.

Suppose that you systematically deferred to others about who was a good friend.  Instead of evaluating someone on their merits, on how they treated you, you simply asked a friend expert, a “friendspert,” whether someone was a good friend.  It’s not just that the friendspert recommends you check someone out who might make a good friend; it’s that you adopt the belief that the person is your friend on the friendspert’s say-so and organize your life accordingly.  This is a sort of social deference—one is allowing one’s social circle to be determined by the word of another.  In some sense one is shirking one’s duties as a friend, offloading onto others important work that really should be done by each of us: evaluating people on their perceived merits and demerits and befriending them based on how they treat you.  There would be something wrong if someone asked “Why are you my friend?” and your answer was “Because the friendspert told me to be.”  Acting that way depreciates friendship to the point that it’s not clear one really has a friend at all.

The friendspert is an extreme case, and though it’s tempting to say that Facebook, with its friend suggestions, acts like a friendspert, that’s probably not quite right.  There is perhaps a little truth to it, but it almost certainly overstates what really goes on when we “friend” someone on Facebook.  It’s not as though the person actually becomes our friend in any robust sense when we click that blue button, and it’s not as though we shut down our independent evaluation of that person and simply defer to Facebook’s algorithm.  We form beliefs about the person and make attachments based on what we see in our feed and how we interact with them.

There is, though, a type of social deference involved in Facebook that might be even more insidious.  We are deferring to an algorithm that shapes how our friends and social circles appear to us.  Whom we see, and which posts we see, is determined by a system that is unknown to us.  To the degree that we let our attachments be shaped by those algorithms, we are guilty of social deference.  We are allowing our connections to other people to be formed by decisions and frameworks that are not our own.  In doing so we are ceding our social autonomy, allowing one of the most essential parts of ourselves—the social part—to be molded by a third party.

Most of us know, at least after adolescence, that we should not judge people simply by what others report about them.  Even if those reports are accurate, the intermediary is apt to distort our picture of other people, thereby shaping our judgments about them.  It is important, indeed it’s our responsibility, to judge people as much as we can without intermediaries shaping our perception of them.  The problem isn’t just that falsehoods and misrepresentations enter the mix.  Even supposing they don’t, it is our responsibility to form our interpersonal relationships—especially our friendships—ourselves.  Forming and nourishing friendships requires a subtle navigation between revealing too much about oneself and not enough, foregrounding some features and not others.  This isn’t dishonest; it’s a recognition that not every fact is relevant to every relationship, and sometimes the order and emphasis of what one reveals says as much about oneself as the information revealed.  (If I start every conversation by announcing my religion or political affiliation, that fact will tell you as much about me as whatever you learn about my faith or politics.)

When we use Facebook, we are introducing an intermediary between ourselves and our social world, and we are trusting it to provide an accurate picture of that world.  In fact, what we get is a distorting lens that highlights some parts of our friends at the cost of others.  Importantly, the algorithms that determine which posts we see are not interested in generating or preserving true friendship, nor are they interested in showing us the truth about people.  They are interested in whatever keeps us clicking, and as such they tend to show us the most provocative parts of our social sphere.  People’s most outrageous opinions are foregrounded, while the features that matter to true friendship recede from view.
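To see the shape of the worry concretely, consider a deliberately simplified sketch of an engagement-ranked feed.  This is not Facebook’s actual ranking system, which is proprietary and vastly more complex; the fields, weights, and function names are all invented for illustration:

```python
# Illustrative sketch of engagement-based ranking. The structural point:
# the objective is predicted engagement, and nothing in it tracks whether
# a post is representative of its author or relevant to friendship.

from dataclasses import dataclass


@dataclass
class Post:
    author: str
    text: str
    predicted_clicks: float  # a model's guess at engagement
    outrage_score: float     # provocative content tends to score high


def engagement_rank(posts: list[Post]) -> list[Post]:
    """Order the feed by predicted engagement, highest first."""
    return sorted(
        posts,
        key=lambda p: p.predicted_clicks + p.outrage_score,
        reverse=True,
    )


def chronological(posts: list[Post]) -> list[Post]:
    """A non-deferential baseline: show everything in arrival order
    and let the reader do their own weighting."""
    return list(posts)
```

On this picture the distortion is built into the objective: a friend’s most provocative post rises to the top not because it is representative of them, but because it keeps us clicking.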

We needn’t rest with abstractions to see the point.  How many of us have seen the political posts of our family members and changed forever how we see them?  How many of us have seen the posts of our friends only to resent them for their self-righteousness or for what appears to be their self-obsession?  Our perspective on our social world is being shaped by hidden algorithms designed to keep users on the site, not by anything that matters to friendship.  This is a kind of social deference, and by engaging in it we are handing responsibility for our relationships over to a source we all know is untrustworthy.  The result is a weakening and cheapening of our relationships.  But we can’t just blame Facebook: it’s our decision to give a third party the power to distort and mediate our relationships, and to that degree we deserve a large share of the blame for abandoning our responsibilities to our friends and our social sphere.