I Was a Male Chat-bot: The Turing Test, Artificial Intelligence, and Gender Online


Three summers ago, I made ten dollars an hour plus commission portraying “Jessica”, an online shopping assistant program designed by InQ for the WhiteFence.com website. On WhiteFence.com, a customer can purchase phone, cable, internet service, and other products specific to their address. If any questions about the products or the ordering procedure arose, the customer could initiate an online chat with Jessica simply by clicking on her picture in the upper right-hand corner of the page. Jessica looked the part of an intelligent and congenial assistant: blond hair pulled back, a collared white shirt, and a pair of stylish librarian glasses. However, this image of Jessica rarely resembled the individual who answered questions as Jessica. In fact, in the first two months of the WhiteFence.com account, all of the agents working as Jessica were men between the ages of 20 and 40. The InQ office was filled with Jessicas working on different websites such as bellsouth.com, sprint.com, and vonage.com, all corresponding to roughly similar pictures of a blond, attractive woman ready to answer all your questions. As agents, we were encouraged to maintain our “Jessica” identity at all times. Jessica provided a human face for the website, a form of branding that personalizes an online experience usually marked by anonymity. The overwhelming majority of customers fully bought into the Jessica masquerade, often typing personal testimonies of their trials and tribulations in trying to get their phone connected and appealing to Jessica’s implied sense of personal concern and warmth. Jessica was always sympathetic, but she was also a saleswoman, trained to guide customers to the latest long distance plans and rebates so she could make a fifty-cent commission on each sale.

The way that customers personally related to and trusted Jessica’s authenticity consistently astounded me; I sometimes lost sight of the avatar I impersonated only to be reminded of it by the femininity they projected upon me. Some customers so thoroughly believed this domestication of the internet that they offered their own gestures of intimacy, sometimes referring to me as “Jessie” or even as “Miss Jessica”. Other individuals continued to chat with me long after their purchase was completed, including one notable individual who asked me on a date after a long story about needing to purchase internet service for his new apartment now that his girlfriend had kicked him out. Other lonely hearts were more forward, as Jessica saw her share of lewd comments, come-ons, and outright sexual harassment. Although not a woman in real life, I nonetheless felt more than just a sense of disgust in principle; I felt an affect of violation at these comments, as if she had become an extension of myself. While my mind fell in and out of the mode of gender impersonation, any customers who seized upon gendered power asymmetries in the conversation immediately interpellated me as a subject into Jessica’s body, and I felt the sense of degradation that a real Jessica would have felt, and no doubt, what millions of real women experience regularly in their jobs.

Conscious of the fact that not only was I not truly “Jessica”, but I was not even female, this experience raised questions about how age-old conventions of gender performance have been infused into online communication. While most customers fully trusted that the blond-haired woman named Jessica was on the other end of the conversation, a small percentage doubted not only my identity, but my reality as well. Jessica is given a set “script” of answers to frequently asked questions and detailed product descriptions to send, material that one obviously could not improvise on the spot. In this sense, Jessica is a collaborative artificial intelligence in which the intelligence of the individual and the programmed information of the computer merge. The shift between the discourse markers of corporate language and my own personal construction of phrases raised suspicion in the eyes of the customer. While most customers were content just to have the answers to their questions, a good-sized portion insisted upon verifying my identity (and sometimes upon proving I was an American and not in an Indian call center) before they would allow me to help them. Performing as Jessica, I was burdened with having to prove my reality as a human or an American while still masquerading as a female. For these customers, interactions with Jessica took on the properties of a 21st-century Turing test in which they felt compelled to determine both my gender and my humanity through only my online responses, so as to determine whether or not the entity on the other end possessed a legitimate and trustworthy store of knowledge on cable TV and internet service.

In 1950, the British mathematician and cryptologist Alan Turing published the paper “Computing Machinery and Intelligence”, which has since become the cornerstone of the modern scholarly discourse on Artificial Intelligence. Turing begins his paper with the hotly debated question “can machines think?” He is notably reluctant to define what it is to “think”, wishing to avoid providing a definition that could be used as an inflexible referent gaining credibility on the strength of his reputation alone, or initiating a battle over semantics that would obscure the point of the paper. Turing instead proposes a hypothetical model for determining the intelligence of the machine without debate over the essentials of thought and consciousness, which has subsequently come to be termed the “Turing Test”. The model, which has been proposed in many variations, usually consists of an individual who is placed in a room where he is to give questions and commands to an entity placed in another room. The entity in the other room, which could be a human or a machine, sends back answers that are first interpreted by a human test conductor and passed along to the one who wrote the questions. After several rounds of questions and conversation, the interrogator is asked to determine whether he has been communicating with a machine or a person. In some versions, both a machine and an individual answer the questions at the same time so that a comparison can be made. If the individual who asks the questions cannot determine whether he is talking to a machine, then Turing concludes that the machine can be termed intelligent. For Turing, intelligence in this context is not evaluated in and of itself, but only insofar as it relates to the subjective judgment of the individual. A machine possesses intelligence insofar as the individual cannot tell it apart from the intelligence of a human being.
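The structure of Turing’s test, where only typed text crosses the wall and the verdict rests on the interrogator’s subjective judgment, can be sketched in a few lines of Python. This is an illustrative toy, not Turing’s protocol in any rigorous sense; the scripted entity and the naive judge below are assumptions of my own, included only to show how little the interrogator actually has to go on:

```python
def turing_test(interrogator, hidden_entity, questions):
    """Run one round of a text-only Turing test.

    `hidden_entity` is a callable that answers questions as strings;
    `interrogator` sees only the typed transcript and must return a
    verdict of "human" or "machine".
    """
    transcript = []
    for question in questions:
        answer = hidden_entity(question)  # only text crosses the wall
        transcript.append((question, answer))
    return interrogator(transcript), transcript

# A toy "machine" that answers from canned scripts, like a chat-bot FAQ.
SCRIPTS = {
    "what is your name?": "My name is Jessica.",
    "are you human?": "Of course I am!",
}

def scripted_entity(question):
    # Fall back to a deflection when no script matches.
    return SCRIPTS.get(question.lower(), "Could you rephrase that?")

def naive_interrogator(transcript):
    # A judge who reads repetition or deflection as machine-like.
    answers = [answer for _, answer in transcript]
    if len(set(answers)) < len(answers) or "rephrase" in " ".join(answers):
        return "machine"
    return "human"
```

The point of the sketch is that the verdict is entirely a property of the interrogator’s expectations, which is exactly the subjectivity Turing builds into the test.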

I will not make the error that many scholars in the humanities make by attempting to enter into a debate in the sciences for which they are ill-equipped. Instead, my primary interest in Turing’s model is in how questions of gender identity informed his test of Artificial Intelligence and how his theory understood intelligence and knowledge as embodied phenomena. Turing’s inspiration was a parlor game called “The Imitation Game”, in which individuals guess the gender of a hidden individual based on responses to questions. Turing defines it as follows:

“It is played with three people, a man (A), a woman (B), and an interrogator (C) who may be of either sex. The interrogator stays in a room apart from the other two. The object of the game for the interrogator is to determine which of the other two is the man and which is the woman. He knows them by labels X and Y, and at the end of the game he says either ‘X is A and Y is B’ or ‘X is B and Y is A’. The interrogator is allowed to put questions to A and B thus:

C: Will X please tell me the length of his or her hair?

Now suppose X is actually A, then A must answer. It is A’s object in the game to try and cause C to make the wrong identification. His answer might therefore be:

“My hair is shingled, and the longest strands are about nine inches long.”

In order that tones of voice may not help the interrogator the answers should be written, or better still, typewritten. The ideal arrangement is to have a teleprinter communicating between the two rooms. Alternatively the question and answers can be repeated by an intermediary. The object of the game for the third player (B) is to help the interrogator. The best strategy for her is probably to give truthful answers. She can add such things as “I am the woman, don’t listen to him!” to her answers, but it will avail nothing as the man can make similar remarks”.

Turing’s model relies upon the subject of speculation being closeted. This closet, which can be seen by the interrogator, presupposes the content of either a female body, a male body, or, in the case of the AI test, a machine or computer of some sort. The ability to detect the contents of the closet depends on the player’s ability to visualize a presence, knowing that something must be there to send the notes. Even when the machine is nothing but a box that can produce a tickertape, we project our sense of agency upon it. Because the mind works with sound images and visual signifiers, we cannot possibly imagine pure information without a visualization of authorship or some origin of the words. Therefore, we must attribute some sense of ourselves via personification onto the entity that produces the information in order to understand it.

With the goal of the game being to fool as many people as possible, gender performativity becomes the ultimate modus operandi for victory. Without the context of voice or handwriting, since a neutral individual or teleprinter relays the responses, the only way to prove gender is through the content and phrasing of the information given. Per Turing’s example, if a woman were to have short hair, it would be in her best interest to lie and speak of long hair if she believes the audience would expect a woman to have long hair. Therefore, the actual woman may not be bodily woman enough to correspond to the signifier of woman formulated in the mind of the interrogator, and she must perform to what the interrogator pictures as a woman so as to prove her own authenticity.

Just as Artificial Intelligence uses repeated programmed responses contoured around the expectations of the user to appear natural, so too does a woman’s gender appear natural as she endlessly repeats the same gestures and affects that we have come to associate with authentic femininity. The more a gesture is repeated, the more natural it feels, until that gesture becomes ingrained in the unconscious as instinctual when it is in fact learned behavior. Thus, gender performance is both an unconscious, involuntary process and a tactical employment of signifying acts of masquerade, as advanced by the early psychoanalyst Joan Riviere in her essay “Womanliness as a Masquerade”:

“Womanliness therefore could be assumed and worn as a mask, both to hide the possession of masculinity and to avert the reprisals expected if she was found to possess it — much as a thief will turn out his pockets and ask to be searched to prove that he has not the stolen goods” (36).

Under this definition, the woman performs femininity in order not to call attention to how her actions may disrupt or threaten the masculine agency of the man she addresses. In this dynamic, the woman is defined as “the Other”, a collection of undefined qualities that are merely the opposite of what the male associates with his sense of agency. There is thus no definition of woman as a genuine, independently defined identity, only the false masquerade of typical feminine acts that covers this otherness with familiar gestures. In a similar dynamic, technology and computers occupy a roughly analogous position of otherness. Just as woman in a simple binary is situated as the opposite of “man”, as we think of mankind as masculine, so too do we pose the machine as the other of man. Although we like to believe we have referential qualities for what “man” signifies in opposition to “machine”, the definition of machine lacks signifiers that would signify it in its own right other than the lack of humanity. Therefore, there is no “machine” outside of “man”, just as there is no “woman” outside of her relationship with “man”. The technologies of intelligence employed by the woman and the machine both become technologies of masquerade, under which there is no authentic face, only the expected face projected by the masculine subject onto the veneer of the mask itself. In this relationship, woman and machine are united in the category of “Other” and must perform themselves to the expectations of the assumed male spectator. Thus, the imitation game provides a model through which we can see that the process of signifying intelligence implies a masquerade, and that the test for machine intelligence and the test for gender intelligence are one and the same. There is no difference in the Turing Test between the two imitation games; they are the same test of successfully covering one’s otherness to appeal to the spectator’s expectations of what entails intelligence.

The performance of Jessica online not only represents a contemporary reconceptualization of Turing’s test; it also demonstrates that signifying artificial intelligence and signifying gender online are the same process in the same body. In other words, Jessica can be viewed as half a gender performance and half a performance of humanity. Because these are not separate performances, because the performance of gender is always already a performance of humanity, I was constantly reminded of the work of Donna Haraway and her “Cyborg Manifesto”. For Haraway, the cyborg represents a deconstruction of the binaries present in gender relations, specifically the binary of control and lack of control over one’s body. Haraway argues that “Late twentieth-century machines have made thoroughly ambiguous the difference between natural and artificial, mind and body, self-developing and externally designed, and many other distinctions that used to apply to organisms and machines” (34). The cyborg represents the way in which modern technology has not only influenced the evolution of the individual and society, but has also become inseparable from the human, a melding of DNA and binary code in which we cannot conceptualize our selves without the intervention of technology. The cyborg is formulated when an individual has incorporated technology into their body, and the day-to-day functioning of the body so depends on this technology that it feels like a natural appendage of the body, complete with its own circulation and nerves.
The body starts to feel through technology. This can take the form of physical prostheses like contact lenses or surgical implants, technologies we use to supplement the capacity of our bodies, or even technologies of bodily depiction like the use of Photoshop to retouch pictures, where our standards for bodily strength and beauty become distorted by our ability to create biologically impossible but natural-appearing images of the body.

Under this reading, Jessica can be partially interpreted as a cyborg. While my co-workers and I all joked about being Jessica and pretending to be a woman, it was clear that nobody actually thought of themselves as Jessica, nor did they consciously change their behavior to perform what they would consider gendered speech. Instead, Jessica is best seen as a collaboration between the computer program of Jessica and the individual agent who logs into her identity when chats come in. As part Artificial Intelligence, Jessica the computer program is designed to pop up on the screen of a customer who has stayed on a single page beyond five minutes, as well as to remain a seductive icon of help in the upper right of the web page that one can click on. The agent portraying Jessica had no ability to initiate a chat at all; only Jessica as AI could initiate one, with a prerecorded offer of help under the guise of having been sent by a live person. Once the customer responds, the agent is notified of the request and can begin clicking Jessica’s pre-typed scripts and FAQs to send to the customer, as well as free-typing responses of their own. Through the combination of pre-programmed script and improvised answers, the agent and the program mutually collaborate to create the intelligence of Jessica. On a rudimentary level, Jessica may be considered a cyborg insofar as she is presented to the consumer as neither wholly a person nor wholly a computer program, but instead as a technology of artificial and human intelligence stamped with an attractive blond face so as to pose as purely a product of human intelligence. Neither Jessica the computer program nor the operator logged on as Jessica can do their job without the other; they inhabit one virtual body composed of human flesh and binary code.
The cyborg body of Jessica is composed of a technology of gender which projects the picture of Jessica on the screen into the mind of the customer, constituting an image of whoever chats with them on the other end of the conversation. The pixels of the computer image of Jessica take over the DNA of the real chat operator, who becomes infused with the elements of Jessica in the mind of the consumer.
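The division of labor described above, a program that alone can initiate the chat after five idle minutes, clickable scripts, free-typed replies, and every message prefaced with “Jessica says”, can be sketched roughly as follows. All names, thresholds, and script text here are my own illustrative guesses, not InQ’s actual software:

```python
from dataclasses import dataclass, field

IDLE_THRESHOLD_SECONDS = 300  # pop up after five minutes on one page

# Pre-typed scripts the human agent can click to send (hypothetical).
FAQ_SCRIPTS = {
    "long distance": "We have a great new long distance plan with a rebate!",
    "internet": "Let me help you find internet service for your address.",
}

@dataclass
class JessicaSession:
    transcript: list = field(default_factory=list)

    def maybe_initiate(self, seconds_idle):
        """Only the program can open the chat; the human agent cannot."""
        if seconds_idle >= IDLE_THRESHOLD_SECONDS:
            return self._send("Hi, I'm Jessica! Can I help you find anything?")
        return None

    def reply(self, customer_message):
        """Send a scripted answer if a keyword matches; else free-type."""
        for keyword, script in FAQ_SCRIPTS.items():
            if keyword in customer_message.lower():
                return self._send(script)
        return self._send(self._human_agent(customer_message))

    def _human_agent(self, message):
        # Stand-in for the live operator improvising an answer.
        return "Let me look into that for you."

    def _send(self, text):
        line = f"Jessica says: {text}"  # every message re-genders the sender
        self.transcript.append(line)
        return line
```

The sketch makes the cyborg point concrete: the program controls initiation and the prefix that genders every utterance, while the human supplies only the improvised middle, and neither half produces a message without the other.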

However, Jessica proves somewhat unsatisfactory as a cyborg in that her identity as a cyborg is comprehended only by the agents who portray her. For the customer, Jessica is designed specifically to erase the potentially threatening notion of the artificial intelligence of technology and to allay consumers’ fears of shopping online in a world obsessed with identity fraud and scams. Posing specifically as a gendered human, Jessica represents how individuals online ignore how technology has pervaded our existence and rendered us all cyborgs. The consumer clings to and insists upon knowledge of the “real” existence of Jessica so as to verify the veracity of the information they are given, as if information that came from the program rather than from Jessica herself would mean the computer had developed the ability to formulate it without someone first having given the computer that information. Thus, there is a fear of technology having too much agency, a fear that the image of Jessica assures them has been tamed and domesticated.

The privileging of “real” biological identity offline transports the Turing Test to the 21st century, in which individuals determine intelligence based on the information broadcast over the internet by an individual performing identity. This Turing Test of determining true identity through the use of language is further complicated in Jessica’s interactions with customers. In the dynamic established in chat exchanges between Jessica and the customer, a slightly modified version of the Turing Test is realized. The main adaptation is the ever-increasing role played by cyberspace (thus the machine) as both the means of conversation and the moderator. In the original Turing test, the interpreter of the information submitted from the closet is a human being; in this model, the machine is the moderator. The machine is both the moderator and, possibly, the object submitting information from the closet at the exact same time. While the customer must still visualize the source of the information he is fed, he is already conscious of the presence of machines, as he is communicating through a computer. Therefore, the customer himself must rely upon a certain level of artificial intelligence, in the form of his computer, to access the information in the first place. Despite the lack of consciousness on the part of the customer, the interaction between Jessica and himself cannot be defined as man talking to man, or machine talking to man, but instead as cyborg communicating with cyborg. As I noted earlier, internet communication inherently changes the way in which we communicate because we cannot express certain ideas in words or merely lack written proficiency. Just as gender is affected through online communication as performance, so too is the humanity of the individual typing.
Customers often find typing cumbersome or feel unable to express their questions, and thus they frequently ask to speak to Jessica on the phone. There is some element of their own humanity that they feel becomes depersonalized or incommunicable through computer-mediated communication.

The presence of Jessica’s picture helps to alleviate this anxiety, as she is employed to make it appear as though the customer is talking directly to a human being instead of sending his words out into the anonymity and incomprehensibility of cyberspace. The Turing Test’s setup is realized in the interaction as the customer is presented with the possibility of a jovial woman in the closet sending him the information instead of some anonymous machine. However, despite the fact that all of the WhiteFence.com Jessicas were men, the supervisors placed no premium on the agents performing feminine language. Instead, the standardization of the language came from the scripts on products and ordering procedure formulated by the supervisors. Rather than being consciously gendered, the scripts were imbued with salesman-type phrasing, attempting to make the products appeal to as broad an audience as possible. The use of language centered on a tone of neutrality and a diction at the level of the average consumer, one that could easily be read and understood through text alone. Yet, at the same time, very few customers ever doubted Jessica’s femininity despite the relative neutrality of the scripted answers and the men supplying the free-typed ones. The only context of femininity provided for the chat is the aforementioned little picture of Jessica on the screen and the prefacing of every piece of information with “Jessica says”. This consistency of iteration results first from the use of scripts. Because the scripts are carefully processed and edited, a consistent tone of salesmanship overrides any feminine or masculine tone. Secondly, the consistency derives from the mere repetition of the name Jessica before every piece of information submitted. Every time Jessica speaks, the program genders the statement by reminding the customer that it came from Jessica.
Thus, the language use becomes gendered not by any inherent quality, but merely by the customer expecting to read inflections of gender in it and in the process producing them himself.

This acceptance of Jessica’s gender raises the question of why the female gender is preferable to the male for representing the company and why individual customers trust her information. As I have alluded to before, Jessica functions as a sort of brand name for the website on which she is featured. Jessica provides an image that humanizes a product such as a website, which is composed of complex computer science of which the average consumer lacks knowledge. The employment of a female face to personify a section of cyberspace refers back to the act of masquerade. The female who has been labeled an “other” shares a relationship with cyberspace as “other”. Otherness in this context can be especially threatening for the customer, as the computer functions in ways he cannot comprehend. While Jessica answers questions that the consumer could very well look up on the FAQ page, Jessica’s feminine gender represents the masquerade of difference that masks the feared otherness of technology and woman. The feminine gender is already imbued with the process of masking otherness through performances of sensitivity and empathy rooted subconsciously in the mind of the customer. Here, the otherness of the machine is hidden under the same mask of gender in order to familiarize what cannot be fully understood. As a brand, Jessica as woman inherits a lineage of female icons adorning products and familiarizing them to the consumer. Just like Aunt Jemima, Betty Crocker, or Mrs. Butterworth, Jessica infuses a touch of femininity into the website that she adorns. These corporate icons personify a product that consists of little more than a bag of flour and sugar. The image of the completed pancake on the box is not enough to simplify the abstraction of the product before it is cooked. The end product must have an author, a genial, motherly cook whose know-how produces delicious pancakes that you too can make.
In this same way, Jessica’s simulacrum of bodily presence on the website seizes upon the customer’s need to see embodiment online.

The privileging of the situatedness of intelligence through gender in the eyes of the consumer is evidenced by the one account in the company that does not use the Jessica moniker. InQ’s account with Gamefly.com, a website like Netflix for video games, uses the name “Mike” for its employees. While most of the Gamefly.com agents were in fact men, the customers reacted negatively to and doubted Jessica’s advice because they assumed that a female would not know enough about video games. Jessica was subsequently fired and replaced by Mike, who proved much more effective with the customers despite the fact that the scripts remained the same and the account used the same exact chat agents. The only change in the dissemination of information was changing Jessica’s name to Mike and removing her picture. The Turing Test setup of customer and agent is reflected here as the customer situates intelligence through “masculine know-how”. The customer determines the intelligence of the information he is given based on his own criteria for intelligence. He necessarily presupposes a man in the closet on the other end of cyberspace as the signifier of intelligence. Once he is presented with the suspicion of femininity, the information no longer qualifies as “intelligence” and is thus branded artificial, or, in this case, female. For the customer, intelligence is an embodied phenomenon where the presence of the body indicates an understanding beyond mere instruction. As a male is presupposed to “know how” to play video games (the complicated sequencing of pushing buttons), his bodily presence signifies a know-how that corresponds to the signifier of video-game intelligence in the consumer’s mind. Despite the fact that this process of video game playing cannot be communicated online, its know-how is symbolized by the name Mike, and thus anything he recommends on the website carries an authenticity that a woman who delivers the same advice cannot signify.

Jessica extends the common yet unfortunate practice of using attractive women in customer service positions so as to seduce the wandering eye of the male consumer and lure him into a power dynamic where he thinks he is in control due to his perception of superiority over women. This practice merges with a cyberspace imbued with connotations of the frontier, a wild yet virginal area prime for the conquering male to insert himself. The presence of Jessica as an attractive female further sexualizes the cyberspace frontier, putting a human face on unconscious sexual drive. The presence of Jessica allows the customer to trust the information he is given in that her intelligence is validated through this masquerade of self and machine as woman. Such a process is evident in the sheer amount of sexual harassment that the average Jessica must fend off. Yet this sexual power dynamic is a mere mask on the actual power dynamic at play, which feminizes the consumer through the monopolistic economic control that the cable and phone companies sold on the site hold over the individual. Jessica thus softens and sexualizes the monstrous nature of technology’s power over the consumer.

About Chase Dimock

Chase Dimock teaches Literature and Composition at Broward College. He earned his PhD in Comparative and World Literature with a graduate minor in Gender and Women's Studies at the University of Illinois in 2014. He specializes in 20th century global modernisms and American, French, and German literature with an emphasis in Queer Theory, Feminism, and psychoanalysis. He has taught courses on world literature spanning the ancients to the existentialists as well as courses on gender studies, queer literature, European cultural politics, and representations of the Holocaust. He is originally from Los Angeles, California and holds a BA in creative writing and political science from UC Santa Cruz and an MA in Comparative Literature from the University of Illinois. He contributes articles regularly to As It Ought To Be and The Lambda Literary Review.

3 Responses to I Was a Male Chat-bot: The Turing Test, Artificial Intelligence, and Gender Online

  1. David says:

    Overall I think you bring up some very interesting points but there’s one problem I have with your article, specifically what you write on the Joan Riviere quotation.

    I think the quotation and your exposition on it is useful, but it falls flat to me in accepting manliness as a man’s sense of agency. I think any serious examination of gender roles reveals that both men and women are alienated from their gendered expectations, and therefore both are dehumanized in these power structures.

    Obviously womanliness has gotten the short end of the stick since the end of matriarchal society, but you yourself end the essay with a perfect example of how manliness undermines masculine agency. The market sends the consumer to deal with a fictitious power structure and assert his dominance over Jessica instead of witnessing the actual social relations expressed in his consumption.

    Your conclusion I think is very strong but to develop it further we would have to go the next step and see how current conceptions of manliness and womanhood are themselves the products of our alienated labor. One of Marx’s great insights was recognizing that under capitalism the social relations between people appear to them as the social relations between products. Note the gendered nature of videogames, but the ungendered nature of a man typing words.

  2. Sophie says:

    I want to talk to an intelligent robot like Jabberwacky, but Jabberwacky is female and I want a male robot

  3. Matt Miller says:

    This is funny, because I too worked at InQ back in 2007, but our name was changed to “Mike” because it turns out the male video game players were desperate for some of jessicas goods. I got fired for being on my cell phone too much, which sucked because I was consistently getting the highest sales in the GameFly unit.
