The Danger of Manipulated Images

Faces created by NVIDIA’s AI algorithm

A couple of days ago, probably at the suggestion of an AI, I read Sonia Klug’s Medium article AI is Changing How You See the World. The main argument of the piece is that AI-enhanced images lead us to misrepresent reality, and that this, among other things, leads us to believe in unrealistic ideals, or perhaps even to misconstrue the way the world actually is. The problem has diverse manifestations. On the one hand, you have deepfakes that might well persuade us that a public figure said or did something she didn’t; on the other, you have filtered and enhanced pictures that lead us to believe our own blotchy and bumpy faces are decidedly subpar. I’m inclined to agree with the main points in the article, but I’m particularly interested in another idea, clearly presented by Katy Cook, author of the forthcoming Ethical Threats and Emotional Unintelligence in the Tech Industry, who argues that “When this basic function [our ability to rely on what we see] is compromised by deepfakes or manipulated images, our ability to agree on basic facts diminishes and our social cohesion suffers as a result.” I think that’s right, and as Cook hints, this sort of thing might just be another step in the fracturing of our body politic, furthering the damage done by siloed news channels and the bubbles caused by Facebook’s algorithms.

An interesting thought here is that the more we lose our ability to rely on common evidence, the more likely we are to retreat to our ideological corners so that we don’t have to adduce evidence at all. (How often do we object to those who agree with us because their reasons for their belief aren’t airtight? Philosophers aside, that is.) We either don’t talk to those who might debate us, or we avoid talking about the topics of disagreement.

In general, it seems likely that as trust in evidence weakens, so too does the drive to seek evidence. “You can’t trust anything” becomes an excuse to stick with whatever beliefs one already has, or to adopt the beliefs that are most convenient. What makes it particularly insidious in the case of images is that we tend to give credence to what we see; if we lose that trust, we are apt to lose the last bit of hope we have that anyone can be convinced by the truth. At that point, the peddler of convenient truths wins, likely at the cost of the real world.

2 thoughts on “The Danger of Manipulated Images”

  1. I wonder if there will be some type of security measure, not necessarily to prevent the production of manipulated material, but to identify (and categorize) real vs. fake content.

    1. There is an added worry about “retreating to our ideological corners,” viz. the problem of group polarization. When people communicate only with others who share their views, those views tend to become more radical. The lack of exposure to dissenting opinion leads to extremist views.