Overview
- Isaac Asimov
- I, Robot
- Chapter-by-Chapter Notes
Isaac Asimov (1920-1992)
Isaac Asimov was a remarkably prolific writer who authored nearly 500 books. He is best known for his science fiction, but he wrote many general-audience guides to science, linguistics, and history, along with some humor. He even wrote "Tales of the Black Widowers," a mystery series. I have always found a detective quality in his science fiction, especially in I, Robot. Think of this novel as a series of vignettes, brief, loosely related episodes that each offer a view of a particular troublesome robot function. Overall, the stories lead to an inevitable (or is it evitable?) conflict.
I use this novel in my Introduction to Technical Communication courses because Asimov's stories offer ample opportunity to think about the role of technology in society. As you read, try to relate Asimov's science fiction technology to contemporary technologies: cell phones and computers are good choices.
Retro SciFi
I like to watch old science fiction shows and films that project into the future. There was a prevailing view that we'd all be traveling in flying cars by the year 2000. We don't have flying cars, but we are on the verge of viable self-driving vehicles. Asimov writes about the science of his day, but he also projects that science into the future. As a scientist himself (he earned a PhD in chemistry), he would have had access to, at the very least, cutting-edge scientific discussions. Asimov tells readers in his autobiography that he was well aware of the significance of harnessing the power of the atom for nuclear energy, for both civil and military purposes. In fact, one of his professors at Columbia University, Harold Urey, was working on the Manhattan Project during WWII. Asimov recounts a story in which Prof. Urey claimed
how pained he was that he could do nothing to help the war effort. Asimov pointed out innocently that perhaps the enriched uranium kept at Columbia may have had something to do with the war effort. Urey reddened and changed the subject. (Strickland, 126)
The Allies knew the Germans were pursuing atomic weapons, so those working on the project were nervous about letting anyone know what they were doing. (Even people working and living around Oak Ridge, TN, in the early 1940s had no idea the military was enriching uranium for atomic weapons.) Asimov didn't know exactly what research was taking place at his own university, but he was definitely aware of the potential of atomic energy. I mention this because, as you no doubt noticed, small atomic devices appear to power the technologies in Asimov's future of I, Robot. This is retro scifi: the past's view of the future. Remember, though, good science fiction isn't about predicting the future. Instead, it's about considering one's moment in time and extrapolating from that moment into a future, speculative setting. Asimov and other science fiction writers project assumptions about what technology will be like in the future based on prevalent, promising contemporary technologies.
Chapter-by-Chapter Discussion
Below I’ve added the Appendix from the article I wrote on using I, Robot in the technical writing classroom. The following breakdown is for Chapters 1-8. Although the article is for other technical writing teachers, it provides a way for readers to interpret the novel and draw comparisons to contemporary technologies.
Appendix 1. Robot/human vignettes
The following examples match the first eight chapters of Asimov's I, Robot; of course, they are not exhaustive. Also, the examples may appear to stress a tools approach, but as moments for discussion, they can easily be incorporated into critical technological awareness or critical literacy. Although the examples stress ethics in technical writing, I hope readers experiment with the multiple topics that may arise from the following situations.
- Robot marketing. I, Robot helps students consider how they would sell twenty-first-century technology. Obviously, the text allows for a reading of audience, and in this case, Asimov's audience is that of a hyper-industrial future. Space exploration is not merely for expanding humanity's knowledge of the galaxy; it is for colonizing far-off places and extracting valuable resources. A New Historicist approach would capitalize on the mercantile, neo-colonial nature of such a world system and offer critiques of globalization. The discussion may even highlight differences between a manufacturing economy and an information economy.
- Labor issues surrounding robots. Because robots are ultra-efficient in the novel, humans almost banned their use on Earth (Asimov, p. 35). Because our contemporary technologies continue to put jobs in jeopardy, students may benefit from examining how robots are used mainly for jobs humans cannot do. By looking at contemporary labor disputes regarding illegal immigration, the novel may be compared to political positions in favor of allowing undocumented workers to obtain visas—these workers do jobs that Americans will not—and positions contrary to allowing undocumented workers to acquire jobs in the United States. Furthermore, the practice of outsourcing manufacturing to lower-paying countries (e.g., sweatshops) may provide an interesting parallel to the novel's "outsourcing" to robots. Such a conversation may spur students to think about the precarious nature of technical jobs that "disappear" because of either automation or a cheaper labor source.
- QT, the existential robot. A rather peculiar robot named QT, or "Cutie," is able to rationalize his existence as superior to that of humans. Because he feels that his kind is bigger, faster, and stronger than humans, he reasons that the "Creator" must have made him in His own image and not the weaker humans (Asimov, p. 63). Because QT does his job well enough, the humans decide not to disassemble him; instead, they allow the peculiar robot to go on believing that he is superior, even though that condition is a minor nuisance for the robot engineers. Alan Cooper (2004) calls users who accept minor nuisances of technology apologists: an apologist learns to work around software bugs because she or he accepts that one must bend to the rules of technology (p. 30). This may lead to more discussion on user-friendly versus user-centered designs in documentation. After all, what service are technical communicators doing if they assume their audience will conform to their expectations? Audience analysis is fundamental to all communication courses, and a more humanist approach would adapt documentation to users instead of conditioning users to just "deal with" the idiosyncratic problems that arise with software or other technologies.
- Redhead stereotypes. Interestingly, the future of I, Robot still portrays redheads as ill-tempered individuals. From Judas Iscariot to the redheaded robot engineer Mike Donovan, popular culture portrays redheads as ill-tempered or simply suspect (Roach, 2005). Because science and technology are often considered truth and not socially constructed "beliefs," Donovan's attitude helps foster discussions about how experts' attitudes may affect how they carry out their work. Scientists and engineers are often considered cold, calculating, objective professionals, so Donovan's disposition, even though a stereotype, helps humanize the experts. Too often science and technology are thought to be purely objective, practical endeavors devoid of human subjectivity. The politics of science and technology are well known (Latour, 1987; Winner, 1986). A discussion on the role of scientists' and engineers' attitudes goes well with the ethical discussions that arise when scientists and engineers have to bow to the pressures of management. For example, the often-cited instance of "when Jerald Mason asked Robert Lund to 'take off his engineer's hat and put on his manager's hat'" and approve the Challenger's ill-fated launch to conform to managerial wishes (Walzer & Gross, 1994, p. 425) supports the idea that attitudes (or, simply, human factors) affect technological policy decisions.
- Views of women in engineering and science. When the engineers are confronted with a robot that tells the humans what they want to hear, Dr. Calvin is tricked into believing that an object of her affection, Milton Ashe, is attracted to her. An analysis of women in science is quite appropriate at this point because Asimov portrays Dr. Calvin as an asexual being for most of the novel; however, after she is lied to about Ashe's affection (p. 118), Dr. Calvin begins "using lipstick . . . [r]ouge, powder, and eyeshadow, too" (p. 121). Her new cosmetic look, combined with the fact that Dr. Calvin never marries, seems to suggest that a woman may have either her job or her man—never both. Although such sexist ideology might be refuted as outdated, we cannot ignore that women are overwhelmingly underrepresented in "physical science, engineering, [and] technology field[s]" (Burger, Creamer, & Meszaros, 2007, p. 6). Discussions on gender are important in technical writing to begin addressing barriers to women entering these careers. Also, drawing on a humanist rationale, the class discussion may help introduce students to feminist critiques of gender, topics that traditionally get left out of science and engineering courses but are integral to humanities courses. Although Dr. Calvin has a prominent role in the novel—she is the narrator—she inhabits a subtly stereotypical role: in the universe of hard-science engineers, Dr. Calvin is a "soft" robopsychologist. That persona reinforces the stereotype that women understand emotions but not cold logic or facts—the supposed realm of male science and engineering.
- Government support of technology and war. Even though the world is at peace in Asimov's novel, the military still invests in research and development. The First Law of Robotics, a governing principle impressioned into the "brains" of all robots, states that "a robot may not injure a human being, or, through inaction, allow a human being to come to harm" (Asimov, pp. 44-45). In order to do a job the military wants done, a few robots are not impressioned with the First Law (Asimov, p. 140). Class discussion can begin with observations on how technologies such as the Internet, or Humvees (later sold commercially as Hummers), began as military applications before becoming consumer products. Besides the possible military-industrial complex critique, this particular vignette supports discussion on ethics when one non-First-Law-impressioned robot gets loose and mixes in with the fully impressioned robots. Dr. Calvin says to destroy all of the robots because it is too dangerous to have a non-First-Law-impressioned robot on the loose. However, at thirty thousand dollars per robot (in 1940s dollars), a cost-benefit analysis has to be done, raising the question, "When exactly do you take the metaphorical Ford Pinto off the market?" Classes should use this example to explore the ethics of both technology and the documentation of technologies. Who's responsible for harmful technologies?
- Scientific and technological races. A rival company wants to sabotage U.S. Robots' "brain robot" in order to keep the firm from coming up with the secret to creating the Hyperatomic Drive, an [interstellar] engine that, if created, "will be the biggest thing in the world," according to a company manager (Asimov, p. 179). The brain robot is a supercomputer that does calculations faster than any other robot, and the brain robot can think. The rival company's brain computer has failed, which sets them back "six years at least to build another [super robot]" (Asimov, p. 176). Because competition is so fierce, the rival company wants to slow U.S. Robots down. This scenario allows for a fruitful discussion examining competition in technology and science. After all, why do we call Watson and Crick's work on DNA "the race for the double helix" or the United States' and the Soviet Union's post-Sputnik attempts to travel into outer space "the space race"? With growing litigation and concern over intellectual property violations—whether they be downloading MP3 audio files or cannibalizing another firm's proprietary software code—discussions on workplace ethics are germane to any technical writing course. The way the rival company tries to sabotage U.S. Robots' brain is an interesting example of ethics and withholding information.
- The fallacy of machine predictability. At the novel's conclusion, the reader learns that the robots "run the show." Although utopia is not quite reached, Asimov's world does look good. More accurately, though, the good is really from a modernist perspective that believes "science and technology" lead humanity on "the path of human progress and efficiency" (Wilson, 2001, p. 73). If robots—beings that think in mathematical formulae—can run the world, what does that say about our society? Is life so predictable that one can punch in numbers and get perfect results? Hardly. Science and technology are not perfect, but the ideology of the industrialized world insists that science and technology will solve our problems. This is not to say science and technology have not improved life; instead, the discussion should explore the social values and practices that stem from holding the view of technology as a panacea. Are all technologies equally valuable? Technologies need to be analyzed just as we analyze communication—from the surface level to the systemic level. A pacemaker "update" is a bit more significant than a software update when we think of the pacemaker's direct benefit to human life. Then again, software updates do make our lives easier if they fix bugs or improve functionality. Introducing this topic may also support critical discussions of planned obsolescence—the dominant model for most, if not all, software.
The above excerpt should be plenty of information for interpreting I, Robot. Make sure you do your Canvas post about I, Robot before Friday, 5/29 at 11:00 pm.
Works Cited
Asimov, Isaac. I, Robot. New York: Spectra-Bantam. (Original work published 1950 by Gnome Press.)
Burger, Carol J., Elizabeth G. Creamer, and Peggy S. Meszaros, eds. Reconfiguring the Firewall: Recruiting Women to Information Technology Across Cultures and Continents. Wellesley, MA: A K Peters, 2007.
Cooper, Alan. The Inmates Are Running the Asylum: Why High Tech Products Drive Us Crazy and How to Restore the Sanity. Indianapolis: Sams, 2004.
Latour, Bruno. Science in Action: How to Follow Scientists and Engineers through Society. Cambridge, MA: Harvard University Press, 1987.
Roach, Marion. The Roots of Desire: The Myth, Meaning, and Sexual Power of Red Hair. New York: Bloomsbury, 2005.
Strickland, Jeffrey. The Men of Manhattan: Creators of the Nuclear Era. Lulu.com, 2011.
Toscano, Aaron. “Using I, Robot in the Technical Writing Classroom: Developing a Critical Technological Awareness.” Computers and Composition, vol. 28, no. 1, 2011, pp. 14-27.
Walzer, Arthur E., and Alan Gross. "Positivists, Postmodernists, Aristotelians, and the Challenger Disaster." College English, vol. 56, no. 4, 1994, pp. 420–433.
Wilson, Greg. "Technical Communication in Late Capitalism: Considering a Postmodern Technical Communication Pedagogy." Journal of Business and Technical Communication, vol. 15, no. 2, 2001, pp. 72–99.
Winner, Langdon. The Whale and the Reactor: A Search for Limits in the Age of High Technology. Chicago: The University of Chicago Press, 1986.