With the rise of AI-generated video, there’s a new place on the Internet to check whether the cute puppy video your grandma posted on Facebook is real: the subreddit r/isthisAI. Participants there have not only been discussing the origin of videos, but they’ve also been dipping into philosophy of language. I recently spotted a debate about the applicability of gendered pronouns for the subjects of AI-generated videos.
A user posted a video that seems to picture a young black woman with vitiligo. Half of her face is light, and the hair on that side of her head is blonde. The other half of her face is light brown, and she has dark brown or black hair on that side of her head. The post asks whether the video is AI, since, as the author writes, “She looks fake.” Everyone quickly agrees the video is AI-generated.
What caught my eye was this exchange (replies are indented):
alligatorkingo AI. Its hair blurs when it opens its mouth. Also the sticky notes text is gibberish. Bad bot.
TemporaryElk5205 I know the picture is AI generated, but how about you still refer to the subject as “her.” We refer to fictional subjects with human pronouns all the time. Referring to the girl in the video as “it” when she’s also a black person with vitiligo is a particularly bad look.
alligatorkingo It is not a real person, it is a bot, a package of data. Generate [sic] by AI
12-ft-tall-skeleton We don’t need to humanize the clankers
Linguists and philosophers of language do not agree about what determines when a grammatically gendered pronoun (e.g., “he” or “she”) is appropriately used, or is felicitous. [Some argue it’s sex](https://philpapers.org/rec/BYRPPV), in the sense of biological features like gametes. Some argue gender, whatever that is—and there’s significant debate on this point. Some might distinguish between gender and gender identity. Others argue that gender and sex both license correct application, depending on the nature of the referent.
Misgendering is now a common philosophical topic that arises in discussions about the semantics of gendered pronouns. Usually, it arises in the context of transgender or genderqueer people. There are debates about whether English grammar constitutes a sufficient reason to use “she” for a transgender man (someone assigned female at birth), and there are debates about the normative dimension of using a gendered pronoun someone rejects. Many philosophers argue that, whatever the semantics of gendered pronouns, there are overriding reasons not to misgender someone (call a trans man “she” when he prefers “he”). After all, it’s plausible that the biggest problems with misgendering are not, at root, grammatical (failing to use the proper pronoun given facts about language), but interpersonal or moral. Depending on the situation, someone who misgenders is being impolite, insensitive, or actively harmful.
The Reddit thread, in contrast, is concerned with misgendering an AI image. It’s the first I’ve seen of such a concern, and it made me reflect a bit on what’s going on here and its relationship to existing theories of pronouns.
One of the themes in this exchange is the choices available for referring to the subject of the video. TemporaryElk5205 objects to the use of “it” because alligatorkingo seems to have intentionally selected “it” despite “her” being readily available, as it is for fictional characters. Cameron Domenico Kirk-Giannini and Michael Glanzberg, as well as Elin McCready, have argued that choosing to misgender someone is similar to what Renée Bolinger calls “contrastive choice” in her account of slurring. Even if a slur and a neutral counterpart have the same semantic content, it could be that the slur’s offensiveness lies in the choice of that word over the neutral counterpart. The choice signals that the person endorses what’s associated with the slur. Likewise, misgendering someone signals anything from ignorance of prevailing norms to holding beliefs that trans people are mentally ill or should be excluded from certain spaces, etc.
Here, although “it” is typically used for nonhuman referents, and there is no human person in the video, there is a common practice of using “she” for fictional characters. We frequently refer to computer-generated video game characters as “she.” And that practice even holds for fictional, nonhuman (alien, elvish, etc.) characters. Thus, alligatorkingo’s choice “it” signals a dehumanizing stance, as 12-ft-tall-skeleton points out.
That user also invokes a derogatory term, “clankers,” which is often also characterized as a slur. (If slurs are dehumanizing by nature, there are interesting questions about how a word can dehumanize something not human.) The implicit defense of applying “it” to the generated image in the video is that using “she” and “her” would humanize artificial intelligence, when, as a “clanker” or a “bot,” AI is not human. Returning to the original contrastive choice, by opting out of the usual practice of referring to even fictional characters with grammatically gendered pronouns, alligatorkingo signals strong opposition to AI.
TemporaryElk5205, in contrast, is concerned with how the pronoun is applied to the character, which they distinguish from the AI that generates it. From their perspective, there’s something worrying about applying dehumanizing pronouns to a depiction of a person who is doubly a member of a marginalized group: black and subject to an autoimmune disorder that visibly marks them as different. It’s a “bad look” to dehumanize the subject, even if the subject is fictional. We might think that’s because the pronoun refers to the fictional represented subject and not the “package of data” that causes the representation (or the pixels that appear on the screen).
That difference in pronominal use is also interesting in this case. While alligatorkingo seems to conflate the AI that generates the image and the image itself, charitably, they take “it” to refer to the sequence of pixels displayed on the screen (or the underlying structure that makes them emerge). The two speakers seem to have different ideas about what the pronoun’s referent is and thus, which pronoun is felicitous. That is consistent with Kirk-Giannini and Glanzberg’s observations about the complex situation for gendered pronouns, as they argue that it’s likely that there is a variety of properties that people’s pronoun use is sensitive to, depending on their idiolect.
Relevant to the messiness of the situation is another comment downthread, responding to something that’s been deleted by a moderator:
19468 Frodo isn’t pretending to be a real person… Also, how do you know the bot in the video would identify as a woman, if it were real?
This comment tracks with Dembroff and Wodak’s distinction between gender and gender identity in their treatment of pronouns. While people often talk about gender as the property to which gendered pronouns are sensitive, gender identity can be understood as a person’s identification with a gender group. To Dembroff and Wodak, misgendering is a way of opposing or undermining someone’s gender identity. And, importantly for their larger argument that the singular “they” should replace binary-gender pronouns, gender identity is something we do not have direct epistemic access to. This is precisely what 19468 implies in their retort to the use of “she” for the AI-generated subject. For this person, it seems, pronouns track gender identity.
That the situation is messy for “he,” “she,” and “they” is by now well-established in philosophy of language. It seems that philosophers of language might want to add “it” to the list of pronouns to consider in relationship to misgendering. The situation is likely to continue to change, not just with the rise of AI, but with the rise of reflection on AI in the form of Reddit threads like this one and science fiction. For instance, Martha Wells’s All Systems Red is now an Apple TV+ show featuring an agender robot, the titular Murderbot, who prefers “it/its” pronouns, analogous to singular “they/them.” Perhaps the pronoun will shift some of its associations. Maybe we will see a reclamation of the impersonal pronoun “it” in the years to come (and with it, the derogatory “clanker”)?