For those who doubt the very existence of what gets referred to variously as “structural racism,” “institutional racism,” or “systemic racism,” or who downplay its deep-rooted perniciousness and virulence in American culture, the past year has provided many harsh corrective lessons. (Can you still say “post-racial” or “color-blind” without wincing, boys and girls?)
These tutorials have taken a number of forms, many of them visual: a gang of policemen murdering U.S. citizen Eric Garner a few blocks from my house; Univ. of Cincinnati police officer Ray Tensing murdering U.S. citizen Samuel DuBose; North Charleston, S.C. police officer Michael T. Slager murdering U.S. citizen Walter L. Scott … the list goes on.
Not one of these cases came to light through police investigation or unforced confession. We know of them thanks to citizen journalists bearing witness with cellphone cameras, or to press and public demands for the footage captured by increasingly mandatory police dashboard and body cams. Police departments nationwide have forcefully resisted those cameras for years, and now we know why.
But structural racism takes other visual forms as well, specifically in decisions made by those who hold editorial positions in the mass media. Most commonly these involve still photographs rather than videos, manifesting themselves in such ostensibly neutral, impartial, strictly procedural actions as the selection of pictures to illustrate a news story and/or the editorial refining thereof.
Certainly the most notorious instance at the end of the 20th century was Matt Mahurin’s doctoring of the LAPD’s O. J. Simpson mugshot for the cover of the June 27, 1994 issue of TIME. Whether or not you believe Simpson guilty of murdering his ex-wife and her friend (I do), by darkening his features even further Mahurin (who is white) undeniably made him look more ominous, an inherently racist act in that it plays on white fears of dark-skinned people, the darker the scarier. That choice, especially for a cover image, amounted to an editorial pre-judging of his culpability.
Most likely this does not happen out of malevolent calculation, but springs from unconscious bias. The picture editors responsible, almost certainly white liberals, would assuredly never describe themselves as racist; from their consistent responses, they feel both shocked and hurt by that accusation. But their protestations of innocence hardly mitigate the phenomenon; to the contrary, as symptoms they exemplify it.
Were it the consequence of personal prejudice, maliciously inflicted, one could fault the individuals responsible case by case, treating these as isolated incidents. Instead, their obliviousness to the issues involved, as reflected in the statement about the Mahurin-Simpson incident published by TIME’s managing editor, James R. Gaines, claiming that “no racial implication was intended, by Time or by the artist,” speaks volumes.
The fact that TIME did not “intend” the “racial implication” of that cover (whose credit line, printed in tiny type at the bottom of that issue’s page 3, read, “Photo-Illustration for Time by Matt Mahurin”) exemplifies the structural racism institutionalized in that media empire. Because, as Sen. Sam Ervin reminded us during the Watergate hearings, “A person is presumed to intend the natural consequences of his actions.” That Mahurin and Gaines did not realize consciously what they did in making and publishing that cover does not exculpate them; indeed, it condemns them. After all, it’s their professional job to understand the power and signification of images. This performance leaves their competence in question.
Now we have a fresh example (though we’ve had plenty in between): The decisions made by NBC News, CNN, BBC, People magazine and hundreds of other news outlets to pair a formal, official portrait of accused murderer Ray Tensing, smiling and posing proudly in his police uniform with the American flag right behind him, with the mugshot of his victim — a glum Samuel DuBose in a white t-shirt. Underscoring the implicit racism of this juxtaposition, some outlets (CNN among them) even went so far as to use it in stories announcing Tensing’s indictment for murder.
This raised an understandable storm of protest on social media and in various publications. “Dear NBC, BBC, CNN, and others: Mugshots are for criminals and murderers, not their victims,” by Nidhi Prakash, published by Fusion on July 30, can stand for those.
This is what the terms “structural racism,” “institutional racism,” and “systemic racism” mean, as evidenced in visual form. Those who chose and approved these two photos to illustrate this story were unquestionably white people. And, unquestionably, they had absolutely no awareness of their own biases when they made those decisions. To date, none of them have come forward to acknowledge their prejudices, nor have the outlets that gave them platforms for their biases apologized for doing so. They need to own that, in public, by identifying themselves and pledging to seek education and take other steps to check their white-skin privilege.
To do the same with yours, or that of anyone you know who needs to get real, I recommend the series of short videos produced by Race Forward: The Center for Racial Justice Innovation. Start with “What is Systemic Racism?” by Rinku Sen, president of Race Forward, and work your way through the set of nine one-minute presentations. Then read just about anything by Ta-Nehisi Coates. And then bring that to Matt Apuzzo’s August 1, 2015 New York Times report, “Training Officers to Shoot First, and He Will Answer Questions Later.”
But first, here’s Rigel Robinson schooling NBC News on the appropriate visual comparison to make when presenting images of an accused murderer and his victim:
That’s right. Mug shot for the alleged perp, attractive portrait for the victim. That’s what systemic color-blindness would look like. (Can you say “Not quite there yet,” boys and girls?)
If you want to argue for the possibility of artificial intelligence, then you have to acknowledge its inevitable corollary, artificial stupidity. Google developing and debuting Photos, an app whose algorithm mistook black people for gorillas, surely qualifies. Hard to believe that no instances of this happened during Google’s extensive testing of the app.
To what extent could this be coincidence? You decide. I made this screenshot of what appeared in the right-hand sidebar of the BBC’s website as I browsed the news there on June 2, 2015. I do frequently read reports and analyses online concerning such issues as race, literacy, and animal intelligence, and indeed I did go on to read both these stories.
Currently I use Firefox as my browser and Google as my search engine, but since this happened at the BBC site (which I visit regularly), as usual offering me other BBC stories I might find of interest, it seems reasonable to conclude that the BBC’s algorithms paired these two stories, connected by nothing more, perhaps, than my idiosyncratic range of interests.
I Can Be Replaced
Now comes kulturBot 3.0, “a robotic art show reviewer and poet that attends exhibitions and tweets text-captioned photos of the artworks and venue.”
Prof. Frauke Zeller of Ryerson University’s School of Professional Communication in Toronto created kulturBot 3.0, constructing him/her/it from a vacuum, a colander, a lemon juicer, a shopping cart flag, and a Wiffle ball, plus an internet connection for the tweets, a webcam, and a thermal printer for on-the-spot hard copy of its comments. Its apparent height (about 18 inches, I’ll guess, with its webcam positioned just 12 inches from the ground) gives it a rugrat’s perspective, which may help it avoid becoming jaded like some of us senior types.
My esteemed robotic colleague offers responses surely more cryptic and poetic than mine, yet mercifully free of the dreaded “International Art English,” sometimes called Artspeak. Restricted to the 140 characters of the standard tweet, kulturBot 3.0 utters comments like “leap of gymnasts flung across their predatory fingers come! Here is denied the belly of rogue locomotives, alone wi.” and “S the drunk with being alive! Standing on the sea. Then we are on the libraries, fight morality, feminism and therefo!” Exactly.
kulturBot 3.0 also collaborates with living poets and generates daily poems of its own, derived algorithmically from a 19th-century travel narrative by David Thompson. The textual source for its randomly generated critical tweets goes unspecified, but I detect the strong stylistic influence of Frank O’Hara and Peter Schjeldahl.
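Zeller’s team does not specify how kulturBot 3.0 derives its utterances, but text generated “algorithmically” from a source corpus, kept under Twitter’s 140-character cap, is classically done with a Markov chain: map each short run of words to the words that follow it in the corpus, then take a random walk. A minimal sketch, assuming that generic technique (the sample corpus here is just placeholder text echoing the bot’s own tweets, not its actual source):

```python
import random

def build_chain(corpus, order=2):
    """Map each `order`-word tuple in the corpus to the words that follow it."""
    words = corpus.split()
    chain = {}
    for i in range(len(words) - order):
        key = tuple(words[i:i + order])
        chain.setdefault(key, []).append(words[i + order])
    return chain

def generate_tweet(chain, max_len=140, seed=None):
    """Random-walk the chain, then truncate to the classic tweet limit."""
    rng = random.Random(seed)
    key = rng.choice(list(chain))
    out = list(key)
    while len(" ".join(out)) < max_len and key in chain:
        out.append(rng.choice(chain[key]))
        key = tuple(out[-len(key):])
    return " ".join(out)[:max_len]

# Placeholder corpus for illustration only.
corpus = ("then we are on the libraries fight morality feminism and "
          "the drunk with being alive standing on the sea here is "
          "denied the belly of rogue locomotives alone")
print(generate_tweet(build_chain(corpus), seed=1))
```

The word-salad output such a walk produces, syntactically plausible runs that veer off mid-thought, matches the flavor of the tweets quoted above, though the bot’s real method may well differ.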
Great to have some young blood (so to speak) enter the field. If you find my posts overlong, or simply hanker for a quick critical fix, you can follow kulturBot 3.0 on Twitter: @kulturBOT. vlan! indeed. I couldn’t have said it better myself.