Pigeonholing and Personalization of the Online Experience

The personalization of our online experience – a result of the algorithmic extraction of our personal data, the subsequent curation of what we see, and the boundaries of our own clicking behavior – threatens to pigeonhole us into increasingly narrow categories determined by online platforms and data brokers. Such pigeonholing further constrains what we encounter online: with each stage of narrowing, we click on increasingly limited subsets of what is made available to us. While the amount of information we encounter will, I expect, remain as robust as ever, its content will be constrained by the bubbles to which we’re assigned. One troubling implication is that what we encounter will continue to narrow until the internet’s original promise of ‘opening’ the world produces the opposite result, leaving us more easily predictable consumers and more easily persuaded actors in an increasingly curated spiral of contracting content.
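To make that feedback loop concrete, here is a minimal, purely hypothetical sketch in Python. The topic names, the weighting rule, and the click model are all invented for illustration and do not describe any actual platform; the point is only the shape of the loop: the platform shows more of whatever we clicked, we click within what is shown, and the range of what we ever see contracts.

```python
import random

# Purely illustrative: ten made-up content topics and a reader with mild preferences.
TOPICS = ["politics", "sports", "cooking", "travel", "science",
          "music", "finance", "gaming", "gardening", "film"]
reader_taste = {t: random.uniform(0.5, 1.5) for t in TOPICS}

# The platform starts out weighting every topic equally.
platform_weights = {t: 1.0 for t in TOPICS}

def recommend(k=4):
    """Sample k distinct topics to show, each in proportion to its current weight."""
    pool = dict(platform_weights)
    shown = []
    for _ in range(k):
        choice = random.choices(list(pool), weights=list(pool.values()), k=1)[0]
        shown.append(choice)
        del pool[choice]
    return shown

def click(shown):
    """The reader clicks one of the shown topics, favoring topics they already like."""
    return random.choices(shown, weights=[reader_taste[t] for t in shown], k=1)[0]

clicks = []
for _ in range(50):
    shown = recommend()
    chosen = click(shown)
    clicks.append(chosen)
    platform_weights[chosen] *= 1.5  # every click makes that topic more likely to reappear

early, late = set(clicks[:10]), set(clicks[-10:])
print(f"Distinct topics clicked in rounds 1-10:  {len(early)}")
print(f"Distinct topics clicked in rounds 41-50: {len(late)}")
# In most runs the later window is far less varied than the earlier one:
# the click-driven loop has contracted both what is shown and what gets chosen.
```

Nothing in this toy model is malicious; the narrowing falls out of the optimization itself, which is part of what makes the dynamic easy to miss.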

To see how we are already being categorized, consider one of the many pigeonholing practices of Acxiom, one of the world’s most powerful data brokers:

“Acxiom assigns you a 13-digit code and puts you into one of 70 ‘clusters’ depending on your behavior and demographics… People in cluster 38…are most likely to be African American or Hispanic, working parents of teenage kids, and lower middle class and shop at discount stores. Someone in cluster 48 is likely to be Caucasian, high school educated, rural, family oriented, and interested in hunting, fishing, and watching NASCAR.”[1]
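The mechanics behind this kind of segmentation can be sketched simply. What follows is a hypothetical illustration, not a description of Acxiom’s actual models or codes: observed behavior and inferred demographics are reduced to a handful of numbers, matched against a fixed set of cluster ‘prototypes,’ and from then on the person is represented by whichever label fits best.

```python
# Hypothetical sketch of behavioral segmentation. Cluster names, attributes,
# and the matching rule are invented for illustration only.

CLUSTER_PROTOTYPES = {
    "cluster_A_bargain_parents": {"discount_store_visits": 0.9, "kids_at_home": 1.0,
                                  "rural": 0.2, "outdoor_interest": 0.3},
    "cluster_B_rural_outdoors":  {"discount_store_visits": 0.4, "kids_at_home": 0.6,
                                  "rural": 0.9, "outdoor_interest": 0.9},
    "cluster_C_urban_renters":   {"discount_store_visits": 0.2, "kids_at_home": 0.1,
                                  "rural": 0.0, "outdoor_interest": 0.2},
}

def assign_cluster(profile: dict) -> str:
    """Pigeonhole a consumer profile into whichever prototype it most resembles
    (smallest squared distance across the shared attributes)."""
    def distance(proto):
        return sum((profile.get(k, 0.0) - v) ** 2 for k, v in proto.items())
    return min(CLUSTER_PROTOTYPES, key=lambda name: distance(CLUSTER_PROTOTYPES[name]))

# A stream of observed behavior becomes a single label, and from then on
# the label (not the person) is what gets bought and sold.
observed = {"discount_store_visits": 0.8, "kids_at_home": 1.0,
            "rural": 0.1, "outdoor_interest": 0.4}
print(assign_cluster(observed))  # -> "cluster_A_bargain_parents"
```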

As companies like Acxiom continue to sell our data and the content of our online experience narrows further, we will continue to be treated as categorizable cogs in an increasingly competitive attention economy.

In fact, Acxiom’s own ‘Consumer Data Products Catalog’ gives us a look inside just how they view us:

“Information includes consumers’ interests — derived, the catalog says, “from actual purchases and self-reported surveys” — like “Christian families,” “Dieting/Weight Loss,” “Gaming-Casino,” “Money Seekers” and “Smoking/Tobacco.” Acxiom also sells data about an individual’s race, ethnicity and country of origin. “Our Race model,” the catalog says, “provides information on the major racial category: Caucasians, Hispanics, African-Americans, or Asians.” Competing companies sell similar data.”[2]

It must be admitted that being placed in categories provides us with ads for products we’re more likely to desire, but it’s nonetheless natural to wonder whether these benefits can compete with the costs. As each company learns more about you, it more finely shapes what you see. Beyond limiting the range of information we’re exposed to, this may, as Frischmann and Desai note[3], lead to the standardization of the individual. In other words, we risk becoming subjects in a massive social engineering project. If companies can predict what we want with ever-increasing precision, they may ultimately be able to (at least partially) determine our online behavior by precisely tailoring the options they provide. In short, corporate knowledge of individuals may allow corporations to psychologically pigeonhole us in ways conducive to their own ends rather than ours. Consider the following from Frischmann and Desai:

“Suppose we’d like to induce a group of people to behave identically. We might personalize the inducements. For example, if we’re hoping to induce people to contribute $100 to a disaster relief fund, we might personalize the messages we send them. The same applies if we’re hoping to nudge people to visit the doctor for an annual check-up, or if I’m hoping to get them to click on an advertisement. Effective personalized ads produce a rather robotic response—clicks. Simply put, personalized stimuli can be an effective way to produce homogenous responses.”[4]  
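The pattern Frischmann and Desai describe – different stimuli, identical intended response – can be sketched in a few lines. The profiles, message variants, and response probabilities below are invented; the point is only the shape of the optimization.

```python
# Hypothetical sketch of "personalized stimuli, homogeneous response."
MESSAGE_VARIANTS = [
    "Your neighbors have already given - join them.",
    "Every dollar you give is matched today.",
    "Families like yours are counting on this relief fund.",
]

# Pretend per-person model: estimated probability that each variant produces
# the target behavior (here, a $100 donation) for that person.
predicted_response = {
    "person_1": [0.12, 0.31, 0.08],
    "person_2": [0.27, 0.10, 0.09],
    "person_3": [0.05, 0.11, 0.33],
}

def personalize(person: str) -> str:
    """Send each person whichever message the model predicts they are most
    likely to act on. The stimuli differ; the intended response is identical."""
    scores = predicted_response[person]
    return MESSAGE_VARIANTS[scores.index(max(scores))]

for person in predicted_response:
    print(person, "->", personalize(person))
# Three different messages go out, but each was chosen to produce the same
# behavior: the contribution, the click, or the check-up appointment.
```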

A closely related worry involves the emergence of echo chambers and filter bubbles. The personalization and filtering of our online experience can lead to homophily, or the forming of strong connections to, and preferences for, people who share our beliefs and attitudes. While this can be psychologically comforting, it can also reinforce confirmation bias and lead to the dismissal of opposing ideas.[5] Clearly, this phenomenon is problematic on many fronts, one of which involves the erosion of democracy. A vibrant, well-functioning democratic society requires the free, active and willing exchange of diverse ideas. The outright dismissal of opposing ideas yields pernicious polarization that undercuts both the likelihood of these crucial exchanges and the open-minded willingness to truly consider competing opinions.

One finds oneself in a filter bubble when one is presented with only a limited range of perspectives on any relevant issue.[6] The filtering of our online experience may lead us to mistakenly believe that the information we’re receiving is comprehensive, while in fact leaving us informationally and epistemically sheltered. Alternative, competing ideas are then likely to seem not only foreign but also reasonable targets of scorn and dismissal.

The more any entity knows about you, the more likely it is to be able to persuade you to act in particular ways. This, in fact, is the goal of social engineering, and it is clearly an attractive prospect for any organization seeking to shape behavior. We know that companies – and possibly countries – exploited access to Facebook’s platform and its users’ data by microtargeting individuals during the 2016 Presidential campaign. In addition to Cambridge Analytica, the Russian Internet Research Agency targeted minorities, among others, by creating fake accounts and nudging them either toward voting for a third-party candidate or toward not voting at all. The more companies know about us, the more they can target us (or small, pigeonholed groups of us) directly in order to affect our beliefs and, therefore, our actions.

Nonetheless, the complete determination of our desires, the orchestrated direction of our attention and the erosion of our democracy are not inevitable. It’s important to recognize that we are not without responsibility or control in this brave new world, even if exercising them takes significant reflection and an understanding of how the online universe works.

We would consider it entirely unreasonable for our IRL (i.e., in real life) behavior to be constantly monitored in order to commodify our attention and modify our behavior. While I expect little resistance to this claim, such monitoring is precisely the reality of our online lives. We need to at least consider the possibility that what we see is being limited by organizations seeking to monopolize our attention and direct our behavior. But we also need to realize that our own online behavior is part of what is narrowing our purview. In my very limited wisdom, I would suggest that we seek out opposing viewpoints, alternative news sources, and new experiences, and that we attempt to engage with information that transcends what we find comforting and agreeable. Harder still, we need to truly remain open to changing our opinions in the face of new data.


[1] Lori Andrews, I Know Who You Are and I Saw What You Did, p. 35.

[2] The New York Times, 2012: https://www.nytimes.com/2012/06/17/technology/acxiom-the-quiet-giant-of-consumer-database-marketing.html

[3] Frischmann, B. and Desai, D. “The Promise and Peril of Personalization”. https://cyberlaw.stanford.edu/blog/2018/11/promise-and-peril-personalization

[4] Ibid.

[5] For further discussion see C. Thi Nguyen, “Escape the Echo Chamber”: https://aeon.co/essays/why-its-as-hard-to-escape-an-echo-chamber-as-it-is-to-flee-a-cult

[6] See Eli Pariser’s The Filter Bubble.