First Monday


Information Ecologies by Bonnie A. Nardi and Vicki L. O'Day

Chapter Two: Framing Conversations about Technology

The seventy-year-old film Metropolis is a reminder that our current questions and concerns about technology have a long history. Many of the particular technologies we experience now are fairly new - voicemail, cellular phones, the Internet, and many more. But the challenge of responding well to technological change goes back at least to the invention of the earliest machines.

There is no question about the imaginative appeal of technology, not just in the cityscape of Metropolis but in our own world today. With the help of technology, we can understand genetic structure, take pictures of stars being born, and perform in utero surgery to save the life of an unborn baby. These are accomplishments that give us a sense of wonder and appreciation for human inventiveness. They celebrate our abilities and extend our connections with the natural world.

When we adopt new technologies, we face uncertainty about how our quality of life may change. The development of new technology affects the nature of work, school, family life, commerce, politics, and war. We might expect that anything with such profound influence on the way we live would be subject to considerable scrutiny and debate. But most of us don't see ourselves as influential participants who can offer informed opinions about the uses of technology. On the contrary, new technologies tend to be mystifying. They resist our attempts to get a grip on what they do and how they work.

As long as we think we do not have enough expertise to engage in substantive discussions about technology, we are effectively prevented from having an impact on the directions it may take. But there are opportunities for discussion and intervention in the process of technological growth and change, and it is important to take advantage of them. We believe that the lack of broad participation in conversations about technology seriously impoverishes the ways technologies are brought into our everyday lives. Our aim is to show how more people can be more fully engaged in important discussions and decisions about technology.

This book is a personal response to the prospect of increasing technological change. Our perspective comes from our experience as researchers in Silicon Valley and as users and consumers of technology. We, Bonnie Nardi and Vicki O'Day, have been trained in (respectively) anthropology and computer science. We have each crossed boundaries into the other's discipline during our years of working in industrial research labs, including those at Hewlett-Packard, Apple Computer, and Xerox. Both of us have designed and implemented computer software, and both of us have conducted empirical studies of how people use technology.

Our empirical studies are ethnographic studies, which means that we go out into the "field" to study situations in which people are going about their business in their own ways, doing whatever they normally do. For us, the field has included offices, libraries, schools, and hospitals. We observe everyday practices and interview people in their own settings over a period of time, to learn more about the complicated and often surprising workings of a particular environment. We bring the insights we develop from ethnographic studies to help in the design of technological tools that will be a good fit for the people who use them.

We consider ourselves critical friends of technology. We want to see more examples of good, useful applications of technology out in the world, like those we have seen in some of our studies. But as we do our fieldwork, read the newspapers, and watch the developments around us, we are sometimes troubled by what we see. Technical developments in everything from telephone menus to cloning and genetic engineering have potentially disturbing effects.

We have noticed that people seem to distance themselves from a critical evaluation of the technologies in their lives, as if these technologies were inevitable forces of nature rather than things we design and choose. Perhaps some of this lack of critical attention is due to the sheer excitement at the novelty and promise of new technology, which makes it easy to move ahead quickly and without reflection. For example, NetDays focused on wiring public schools for Internet access were carried out with good intentions, but we have seen that some of our local schools have had a difficult time coping with the new technology once they have it.

We are troubled when people ignore the human intentionality and accountability behind the use of technological tools. For example, when one of us recently forgot to pay a credit card bill and saw her credit card temporarily disabled as a result, she called her bank to ask it to accelerate the process of turning the card back on. She assumed that a twenty-year history as a good customer would make a difference. The response from each of the three customer service people she talked to was the same: "You know, you're dealing with a computer here." Well, not exactly. We are also dealing with people who solve problems and make decisions, or we should be. Human expertise, judgment, and creativity can be supported, but not replaced, by computer-based tools.

Many people have misgivings about technology, but most of the time we do not express them. Our own specific concerns are unimportant in this discussion. What is important is that each of us develop and use our own critical sensibilities about the technologies that affect us.

This book is addressed to people who work with and around technology. This includes school teachers and school administrators, engineers, salespeople, professors, secretaries, journalists and others in publishing, medical professionals, librarians, people who work in finance and banking, and many more. We believe that our colleagues in technology design will also find this book useful.

For all of our readers, what we hope to accomplish is a shift in perception. To explain what we mean, we can use visual perception as an analogy. Psychologists have studied the way people see, and recent research suggests that there is no conscious perception of the visual world without paying attention to it. That is, you can't see what is in your field of view unless you are prepared to notice and process what your eye takes in. According to Arien Mack and Irvin Rock, who have carried out research in this area, we are subject to "inattentional blindness" when we are not ready to pay attention to something in our field of view [1].

This language about visual perception and inattentional blindness resonates with our experience as researchers who study technology in use. Sometimes we have seen different things in settings we have studied than other technologists or even some of the participants themselves have seen. We believe that to some extent, this is because we were prepared to see and pay attention to different things. We were looking for a multiplicity of viewpoints in the settings we studied, the hidden side effects of technology, people's values and agendas as they deployed technology, the resources they brought to bear in getting their work done, the actual work practices that accomplished the goals of the work, and the social interactions that affected work and technology use.

In other words, some of what goes on in any setting is invisible unless you are open to seeing it. We have noticed two blind spots people seem to have in considering work settings: informal practices that support work activities and unobtrusive work styles that hide valuable, skilled contributions [2].

In any work setting, there are commonly accepted accounts of the regular and sensible ways things get done. They come in the form of written procedures, job descriptions, organizational charts, project planning documents, training materials, and more. While these descriptions are useful in helping people coordinate and carry out their work, they do not always reflect the whole picture of the way things get done. They capture the work activities, roles, and relationships that are most visible, but not the informal practices that may be just as important.

For example, informal collaboration among coworkers is common but little discussed. An individual might own the responsibility for a particular task, but behind this formal responsibility there are many informal consultations and small communications with coworkers that help get the task done. In engineering work groups we studied, engineers asked one another for help in using complicated computer-aided design tools, although coaching was not in anyone's job description.

Sometimes work is invisible because workers are intentionally unobtrusive in their activities. In our research in corporate libraries, for example, we have seen that reference librarians usually carry out their highly technical and skilled search activities behind the scenes, so much so that their own clients are largely unaware of what they do. Since clients may not understand what librarians are doing, they may think that automated services can replace librarians. If they looked at the actual work, they would understand that automated services cannot perform the same tasks.

When new technologies are adopted into a work setting, they usually affect informal and unobtrusive activities as well as formal ones. As we plan the introduction of new technologies or the modification of existing technologies, it is useful to shift our perception and become aware of aspects of work that are usually invisible.

Though we are all subject to inattentional blindness, we can try to be more aware of informal and unobtrusive activities. By preparing to see between and behind the formal, well-advertised roles and processes, we can enlarge our vision. And if we learn to see our own settings differently, we will also be able to see different possibilities for discussion and local action.

The Rhetoric of Inevitability

To achieve a shift in perception and prepare for conversations for action, we must look beyond some of the common rhetoric about technology. As we read and listen to what designers and technology commentators have to say, we are struck by how often technological development is characterized as inevitable. We are concerned about the ascendance of a rhetoric of inevitability that limits our thinking about how we should shape the use of technology in our society.

Some commentators welcome the "inevitable" progress of technology - that is the view of the technophiles, who see only good things in future technological developments. Some decry the inexorable advance of technology - that is the view of dystopians, who wish we could turn our backs on the latest technologies because of their intrusive effects on our social experience.

There are more possibilities for action than these extremes suggest. But to see past this pervasive rhetoric, we first need to bring it clearly into view, so we can recognize it, sensitize ourselves to it, and then move forward to a more fruitful position.

To consider just one of many examples of the rhetoric of inevitability, in an article in Beyond Calculation: The Next Fifty Years of Computing, Gordon Bell and Jim Gray of Microsoft assert that "by 2047 ... all information about physical objects, including humans, buildings, processes and organizations, will be online. This is both desirable and inevitable" [3]. It is instructive to read those two sentences aloud.

Humans are objects. We are in the same category as buildings. In this formulation, any special considerations we might have simply because we are humans (such as rights to privacy) are obliterated. The authors declare that creating a world in which people are objects in a panoramic electronic database is "both desirable and inevitable."

The authors use their authority as virtuoso engineers to tell us what they believe to be inevitable and to suggest how we should feel about it. Bell and Gray's article is not an anomaly. It is one example of many books and articles in which experts describe how technology will be used in the future with a sense of certainty, both about the technology itself and our own acceptance of the benefits it will bring to us [4].

Bell and Gray state, "Only the human brain can process and store information more easily, cheaply and rapidly [than a computer]." The human brain is formulated here as cheap information storage. When people's intellects are reduced to simple computation and storage capabilities, our goals and intentions and opinions are rendered invisible and uninteresting. We are concerned about the way the corporate mind is reaching into the future to define us as physical objects about whom any data can be stored online. Through the rhetoric of inevitability we are being declared nonplayers in the technical future. We are bargain basement commodities.

Another example of the rhetoric of inevitability can be found in the discussions of cloning people, which have featured inevitability as a constant refrain. Immediately after the story about the successful cloning of sheep in Scotland appeared in the newspapers in February 1997, a U.S. government spokesperson said, "Should we stop scientific development in these areas because the capacity [to clone humans] might become available? I don't think that's reasonable, or even possible. I just think that's one of the costs that come along with scientific discovery, and we have to manage it as well as we can. But it's awfully hard to stop it" [5].

The author of these remarks was Harold Shapiro, the chair of the National Bioethics Advisory Commission appointed by President Clinton. Surely someone appointed as a representative of the people's interests to advise the government on the ethics of biotechnology should take a little more time before declaring cloning technology inevitable. Is it not appropriate to have a public conversation about this far-reaching, controversial technology? Here the rhetoric of inevitability protects a scientific establishment that wants to be free of considerations of how its activities might affect the rest of society.

Shapiro was joined by Eric Juengst, a bioethicist at Case Western Reserve University, in declaring that banning future research is like "trying to put a genie back in its bottle" [6]. The rhetoric of inevitability reaches a nadir in Juengst's comment: "Do we want to outlaw it [cloning] entirely? That means of course only outlaws will do cloning."

There must be a better argument to be made about the implications of cloning than that only outlaws will clone if we make it illegal. Let's throw away all our laws, in that case! This is a sad logic, especially from someone described in the newspaper as "one of the nation's leading biomedical ethicists."

Fortunately, the cloning discussion has been more polyphonic than many other technology discussions. In a story about cloning in the San Jose Mercury News, our local newspaper, it was reported that in 1973 the scientific community declared a moratorium on research in which DNA from one species was moved to another species, because there was popular concern about mutant strains of bacteria escaping from laboratories and infecting the entire world. In 1974, scientists urged the federal government to regulate all such DNA technology. Strict guidelines followed. They have been relaxed as the scientific community has taken time to sort through the issues and as public understanding has grown, but the regulations are widely regarded as responsible and socially beneficial steps to have taken at that time [7].

Margaret McLean, director of biotechnology and health care ethics at the Markkula Center for Applied Ethics at Santa Clara University, wrote of the cloning debate, "We ought to listen to our fears." She noted that Dolly the sheep seems to be growing old before her time, possibly due to the aged genetic material from which she was cloned. McLean discussed concerns with attempts to overcontrol a child's future by controlling its genes, by setting expectations that a cloned child might find emotionally unbearable. She argued that we should consider our misgivings and give voice to them. McLean takes on the issue of inevitability squarely, declaring, "I, for one, believe that the possible is not the inevitable" [8].

The developer of the cloning technique himself, Ian Wilmut, voiced opposition on ethical grounds to applying the technology to people. There are already laws in some European countries that ban the cloning of human beings.

We hope that our readers will develop active antennae for sensing the rhetoric of inevitability in all the media that surround us. The cloning discovery and the variable responses to it show that there is not a single story to be told about any technology. Those who declare a technical development "inevitable" are doing so by fiat.

Conversational Extremes: Technophilia and Dystopia

Conversations about technology are often positioned at one of two extremes: uncritical acceptance or condemnation. Writers of both technophile and dystopic works often assume that technological change is inevitable - they just feel very differently about it [9].

These two positions stake out the ends of the continuum, but they leave us with poor choices for action. We want to claim a middle ground from which we can carefully consider the impact of technologies without rejecting them wholesale.

Nicholas Negroponte's book Being Digital is a shining example of the work of a technophile. Negroponte, director of the MIT Media Lab in Cambridge, Massachusetts, populates a new and forthcoming Utopia with electronic butlers, robot secretaries, and fleets of digital assistants [10]. In Negroponte's world, computers see and hear, respond to your every murmur, show tact and discretion, and gauge their interactions according to your individual predilections, habits, and social behaviors. Negroponte's lively future scenarios in which digital servants uncomplainingly do our bidding are always positive, unproblematic, and without social costs. There are some important pieces missing from this vision, though it is certainly engagingly presented.

Technological tools and other artifacts carry social meaning. Social understanding, values, and practices become integral aspects of the tool itself. Perhaps it's easiest to see this clearly by looking at examples of older and more familiar developments, such as the telephone. The telephone is a technological device. It is a machine that sits on a desk or is carried around the house, and it has electronic insides that can be broken. But most of us probably don't think of a telephone as a machine; instead, we think of it as a way of communicating. There is an etiquette to placing a call, answering the phone, taking turns in conversation, and saying good-bye, which is so clear to us that we can teach it to our children. There are implicit rules about the privacy of telephone conversations; we learn not to eavesdrop on others and to ignore what we may accidentally overhear. These conventions and practices are not "designed in" and they do not spring up overnight. They were established by people who used telephones over time, as they discovered what telephones were good for, learned how it felt to use them, and committed social gaffes with them.

Negroponte's scenarios are missing a sense of each technology's evolving social meaning and deep integration into social life. Though these social meanings can't be engineered (as the histories of earlier technologies have shown), we must understand that social impacts are crucially important aspects of technological change. We should be paying attention to this bigger picture, as it emerges from its fuzzy-grained beginnings in high-tech labs to saturate our houses, schools, offices, libraries, and hospitals. It is not enough to speculate about the gadgets only in terms of the exciting functions they will perform.

When we turn to writings in the dystopic vein, we find that concerns with the social effects of technology are voiced. But the concerns are met with a big bucket of cold water - a call to walk away from new technologies rather than use them selectively and thoughtfully.

A recent bestseller in this arena was Clifford Stoll's Silicon Snake Oil [11]. Stoll is an astronomer and skilled computer programmer who is well known for his remarkable success in tracking down a group of West German hackers who broke into the Lawrence Berkeley Laboratory computers in 1986. In Silicon Snake Oil, Stoll shares his concerns about the hype surrounding the Internet for everyday use. He suggests that consumers are being sold a bottle of snake oil by those promoting the Internet and other advanced technologies. In the rush to populate newsgroups, chat rooms, and online bookstores in a search for community, we may find ourselves trading away the most basic building blocks for community that we already have - our active participation in local neighborhoods, schools, and businesses.

This is not an unreasonable fear. Another technology, the automobile, transformed the landscape of cities, neighborhoods, and even houses in ways that profoundly affect the rhythms and social networks of daily life. In the suburban Silicon Valley neighborhood where both of us live, each ranch-style house is laid out with the garage in front, making it the most prominent feature of the house to neighbors or passersby. The downtown shops are a long walk away on busy roads that are not meant for pedestrian traffic. Most people routinely drive many miles to get to work. We can be reminded of what our driving culture costs us by walking for a while in a town or neighborhood built before cars - though this is not an easy exercise for Californians and other Westerners. In these earlier neighborhoods, there are mixtures of houses, apartments, and small shops, all on a scale accessible to people walking by, not shielded from the casual visitor by vast parking areas.

While we share Stoll's belief that the introduction of new technologies into our lives deserves scrutiny, we do not believe that it is reasonable or desirable to turn our backs on technology. It is one thing to choose not to use automated tools for the pure pleasure of doing something by hand - to create beautiful calligraphy for a poem instead of choosing from twenty or thirty ready-made fonts, or to play Monopoly (an activity advocated by Stoll) instead of Myst (a computer game with beautiful graphics). But sometimes the computer is exactly the right tool for the job, and sometimes it is the only tool for the job.

The issue is not whether we will use technologies, but which we will choose and whether we will use them well. The challenge now is to introduce some critical sensibilities into our evaluation and use of technology, and beyond that, to make a real impact on the kinds of technology that will be available to us in the future.

Stoll and Negroponte seem to be diametric opposites. Stoll says faxing is fine; Negroponte offers a withering critique. Stoll asserts that people don't have time to read e-mail; according to Negroponte, Nobel Prize winners happily answer the e-mail of schoolchildren. Stoll tells schools to buy books; Negroponte says computers make you read more and better. But both Negroponte and Stoll are in agreement on one crucial point: the way technology is designed and used is beyond the control of people who are not technology experts. Negroponte asserts that being digital is inevitable, "like a force of nature." What Mother Nature fails to provide will be taken care of by the engineers in the Media Lab. And Stoll describes the digital promises as snake oil - not home brew. Neither Stoll nor Negroponte offers scenarios in which citizens have a say in how we want to shape and use technology.

A Different Approach

Our position in this public conversation about technology lies between the positions exemplified by Stoll and Negroponte in some ways, and completely outside their construction of the argument in others. We share Negroponte's enthusiasm for and fascination with cutting-edge technology development. We share Stoll's concerns about the social impact of technology. But to shun digital technology as Stoll advocates is to miss out on its benefits. Neither does it seem wise to sit back passively waiting for the endless stream of amazing gadgets that Negroponte hypothesizes. It is not necessary to jump on the digital bandwagon. It is dangerous, disempowering, and self-limiting to stick our heads in the sand and pretend it will all go away if we don't look. We believe that much more discussion and analysis of technology and all its attendant issues are needed.

Some of this discussion is fostered by political action books, such as Richard Sclove's Democracy and Technology [12]. Sclove argues for grassroots political action to try to influence official governmental policies on technology. He writes, "[I]t is possible to evolve societies in which people live in greater freedom, exert greater influence on their circumstances, and experience greater dignity, self-esteem, purpose, and well-being."

We are in passionate agreement with this statement. At the same time, we recognize that politics per se - national, regional, or local policy advocacy - is not for everyone. There are other ways to engage with technology, especially at the local level of home, school, workplace, hospital, public library, church, and community center. We all have personal relationships with some of these institutions. We can influence them without having to change broad governmental policy, though that might happen in some cases.

In our research studies, we have seen examples of responsible, informed, engaged interactions among people and advanced information technologies. We think of the settings where we have seen these interactions as flourishing information ecologies. Each of these ecologies is different from the others in important ways. Each has something unique to teach us, just as we learn different things about biology from a coral atoll, a high desert, a coniferous forest. We suggest that these examples be read as stories that model a holistic, ecological approach to technological change. Using the metaphor of an ecology, we will discuss how all of us can find points of leverage to influence the directions of technological change.

About the Authors

Bonnie Nardi is a researcher at AT&T Labs-Research and is the author of A Small Matter of Programming (Cambridge: MIT Press, 1993) and editor of Context and Consciousness (Cambridge: MIT Press, 1996).
e-mail: nardi@research.att.com

Vicki O'Day, formerly a researcher at the Xerox Palo Alto Research Center, is a graduate student in anthropology at the University of California at Santa Cruz.
e-mail: oday@calterra.com

Notes

1. Arien Mack and Irvin Rock, 1998. Inattentional Blindness. Cambridge: MIT Press.

2. The issue of invisible work is explored at length in Computer Supported Cooperative Work: The Journal of Collaborative Computing, volume 8, numbers 1-2 (May 1998), with guest editors Bonnie Nardi and Yrjö Engeström.

3. Gordon Bell and James N. Gray, 1997. "The Revolution Yet to Happen." In: Peter J. Denning and Robert M. Metcalfe (editors). Beyond Calculation: The Next Fifty Years of Computing. New York: Springer-Verlag.

4. See also Michael Dertouzos, 1997. What Will Be: How the New World of Information Will Change Our Lives. San Francisco: Harper San Francisco.

5. "Cloning procedure could bring unthinkable within reach," San Jose Mercury News, 24 February 1997.

6. "Cloning procedure could bring unthinkable within reach," San Jose Mercury News, 24 February 1997.

7. "Cloning procedure could bring unthinkable within reach," San Jose Mercury News, 24 February 1997.

8. Margaret R. McLean, 1998. "Just because we can, should we?" San Jose Mercury News (18 January).

9. Dystopic visions include Jerry Mander, In the Absence of the Sacred (San Francisco: Sierra Club Books, 1991); Sven Birkerts, The Gutenberg Elegies: The Fate of Reading in an Electronic Age (New York: Fawcett Books, 1995); and Neil Postman, Technopoly (New York: Vintage Books, 1993). Technophilia is well represented across the mass media in old-line publications such as Time and newer outlets such as Wired.

10. Nicholas Negroponte, Being Digital (New York: Knopf, 1995).

11. Clifford Stoll, Silicon Snake Oil: Second Thoughts on the Information Highway (New York: Doubleday, 1995).

12. Richard Sclove, Democracy and Technology (New York: Guilford Press, 1995).

Context

This text originally appeared in Information Ecologies: Using Technology with Heart, published in 1999 by MIT Press. The text is copyrighted by Bonnie Nardi and Vicki O'Day and the book is copyrighted by MIT Press. The book is available from MIT Press directly, fine bookstores everywhere, and Amazon.com. The authors manage a Web site for the book at http://www.calterra.com/infoecologies/.



Copyright © 1999, First Monday