"Hackers" are identified as a specific subgroup of computer workers. The history of the hacker community is told. The explicit and implicit ideologies expressed through hacking is analyzed and presented. Computer artifacts of origin both inside and outside the hacker community are compared and the embedded properties of the resulting artifacts are inferred. Hacking is discussed in the context of being a method for system development. Finally, it is argued that this system development method under certain circumstances may yield superior software artifacts.
Contents
Introduction
Tinker, Taylor, Scholar, Hacker
The Hacker Community
The Hacker Ethic
The Virtual Class
Deconstructing Software
Hacking in the Real World
Conclusion
Introduction
"Whenever computer centers have become established, that is to say, in countless places in the United States, as well as in virtually all other industrial regions of the world, bright young men of disheveled appearance, often with sunken glowing eyes, can be seen sitting at computer consoles, their arms tensed and waiting to fire their fingers, already poised to strike, at the buttons and keys on which their attention seems to be as riveted as a gambler's on the rolling dice. When not so transfixed, they often sit at tables strewn with computer printouts over which they pore like possessed students of a cabalistic text. They work until they nearly drop, twenty, thirty hours at a time. Their food, if they arrange it, is brought to them: coffee, Cokes, sandwiches. If possible, they sleep on cots near the computer. But only for a few hours - then back to the console or the printouts. Their rumpled clothes, their unwashed and unshaven faces, and their uncombed hair all testify that they are oblivious to their bodies and to the world in which they move. They exist, at least when so engaged, only through and for the computers. These are computer bums, compulsive programmers. They are an international phenomenon" (Weizenbaum, 1976).
"The hacker is a person outside the system who is never excluded by its rules" (Turkle, 1984).
""Hackers" are computer aficionados who break in to corporate and government computer systems using their home computer and a telephone modem" (Meyer, 1989).
The popular image of the computer hacker seems to be part compulsive programmer preferring the company of computers to people, and part criminal mastermind using his or her technical prowess to perpetrate anti-social acts. But this is at best only half the story. A number of people I know who are proud to be called "hackers" are sensitive, sharing, social and honest.
Part of the confusion surrounding the word "hacker" stems from the fact that it has been applied to at least three distinct communities.
The "original" hackers were computer professionals who, in the mid-sixties, adopted the word "hack" as a synonym for computer work, and particularly for computer work executed with a certain level of craftsmanship. They subsequently started to apply the noun "hacker" to particularly skilled computer workers who took pride in their work and found joy in doing so.
Then, in the seventies, assorted techno-hippies emerged as the computerized faction of the counterculture of the day. These were grassroots activists who believed that technology was power, and, as computers were the supreme manifestation of the power of technology, that computers should be put into the hands of the people. While these activists did not speak of themselves as hackers or identify closely with the master programmers that populated the first wave, the term was thrust upon them in 1984, first when they were celebrated by the publication of Steven Levy's landmark Hackers: Heroes of the Computer Revolution (Levy, 1984), and then again by the first Hacker's Conference, hosted by the Point Foundation and the editors of the Whole Earth Review. What characterized the second wave hackers was that they desperately wanted computers and computer systems designed to be useful and accessible to citizens, and in the process they pioneered public access terminals, computer conferencing, and personal computers.
Finally, in the second half of the eighties the so-called computer underground emerged, appropriated the terms "hacker" and "hacking" and partly changed their meaning. To the computer underground, "to hack" meant to break into or sabotage a computer system, and a "hacker" was the perpetrator of such activities.
Popular media's fascination with things subversive and spectacular has long since ensured that it is the latter rather than the former definition that reigns supreme. However, the strong association between the word "hacker" and the computer criminal has the unfortunate side effect of hiding the "other side" of hacking, the side that involves skilled craftsmen who believe that a computer is more than a means of production - it is, among many other things, an instrument for creation, communication, mastery, artistic expression and political empowerment.
At the outset, however, it should be noted that the three hacking communities are not completely disjoint. The hacker of the sixties was not beyond appreciating lock-picking skills, both those addressing physical locks barring access to computer rooms and those addressing software protection schemes such as password files and encryption, and he also believed that information was born to be free - including the source code he had written and the knowledge he had about the inner workings of various systems. In 1990, when the Electronic Frontier Foundation was set up as a response to Operation Sun Devil (a U. S. Secret Service raid on the computer underground), funding was provided by John Gilmore (of Sun Microsystems), Mitch Kapor (co-creator of Lotus 1-2-3), Steve Wozniak (co-founder of Apple Computer) and other well-to-do second wave hackers. As far as politics go: today's generation-x hackers share with their artisan and activist hacker predecessors a distrust of authority, and a tendency to position themselves outside bourgeois society's norms and values.
Some commentators (Anderson, 1993; Rosteck, 1994) consider hackers (of the anarchist variety) to be radical partisans, much in the same manner as the Russian nihilists of the 19th century were considered part of the radical political movement of their time. Others (Kelly, 1994) have attempted to co-opt hackers as the avant-garde of neo-laissez-faire economic liberalism.
In this essay, I shall try to put some perspective on these two claims. My main purpose, however, is to instigate a discussion on hacking as a valid method for developing software for information systems.
Tinker, Taylor, Scholar, Hacker
In the 1950s, people working with computers had much in common with artists, artisans and craftsmen. There was room for creativity and independence. Management methods of control were not yet developed. There was no clear division of labor. Skilled programmers, like all good craftsmen, had intimate knowledge and understanding of the systems they worked with. A humorous account of the state of affairs in those early days is rendered in Ed Nather's The Story of Mel, which first surfaced as a Usenet message in 1983. When it finally was cast in ink (in Raymond, 1991), it was heralded as "one of hackerdom's great heroic epics".
This did not last. By the mid-sixties, management wanted to bring computer work in line with other industrial activities, which essentially meant that they wanted programming to be part of a managed and controlled process.
To accomplish this, they turned to a more than fifty-year-old fad called "Scientific Management" (Taylor, 1911). Scientific Management was invented by the engineer Frederick Winslow Taylor, and aimed at taking away from workers the control over the actual mode of execution of every work activity, from the simplest to the most complicated. Taylor's argument was that only by doing this could management gain the desired control over productivity and quality.
The methods advocated by Taylor were to increase standardization and specialization of work. In the computer field, this spelled, among other things, the introduction of programming standards, code reviews, structured walkthroughs and miscellaneous programming productivity metrics.
The most profound effect of the application of Taylorist principles to computer work was the introduction of a detailed division of labor in the field. Computer workers found themselves stratified into a strict hierarchy where a "system analyst" was to head a software development team consisting of, in decreasing order of status and seniority, "programmers", "coders", "testers" and "maintainers". Below these on the ladder were a number of new adjunct positions created to serve the software development team: "computer console operators", "computer room technicians", "key punch operators", "tape jockeys" and "stock room attendants". Putting the different grades of workers in different locations further enforced the division of labor. Most corporations in the sixties and seventies hid their mainframes in locked computer rooms, to which programmers had no access. This isolated programmers from technicians, diminishing their social interaction and cutting off the opportunity for the exchange of ideas. It also prevented programmers from learning very much about the workings of the machines they programmed.
As noted in (Braverman, 1974) and (Greenbaum, 1976), at the core of this process was dequalification of computer work, the destruction of programming as a craft, and the disintegration of working communities of programmers - all in order to give management more control over computer workers.
The emergence of hackers as an identifiable group coincides closely in time with the introduction of various Taylorist methods in software development. Many of the most skilled programmers resented what was happening to their trade. One of the things that characterized the early hackers was their almost wholesale rejection of Taylorist principles and practices, and their continued insistence that computer work was an art and a craft, and that quality and excellence in computer work had to be rooted in artistic expression and craftsmanship, not in regulations. So, long before the proponents of the sociotechnical school and "Scandinavian School" system developers questioned the Taylorist roots of modern software development methods (Budde et al., 1992), hackers voted against them with their feet - by migrating to communities where a non-Taylorist stance vis-à-vis computer work was tolerated.
Hacker lore abounds with horror stories about earnest hackers who, due to some misfortune or just some stupid misunderstanding, suddenly find themselves caught in the Taylorist web of some major corporation. The high point of these stories is often to expose some Taylorist principle to scorn and ridicule, as corporate stupidity defeats itself and Taylorist productivity measures (such as line counting) prove easily subverted. Thus, in the folklore, the hacker emerges triumphant, as the moral as well as the actual victor of the skirmish. Many of these stories have since found their way into Scott Adams' comic strip Dilbert, partly based upon Adams' own experiences as "corporate victim assigned to cubicle 4S700R at the headquarters of Pacific Bell" and partly on "true life" e-mail submissions from computer workers out in the field (Adams, 1996).
In real life, things are not always so amusing, and sometimes real anger surfaces when hackers voice their feelings about the destruction of their craft - as in this message, posted to an Internet mailing list known as unix-wizards in February 1989:
Programming standards and code review committees attract all the jerks trying to angle their way from the ranks of us hackers into the Vice-Presidency of the Division. While these characters are deceiving themselves into believing they have a career path, they cause everyone else a good deal of trouble. [...] Structured Programming is often the buzzword for an attempt to routinize and deskill programming work to reinforce the control of hierarchy over the programming process - separate from and sometimes different from, improving quality.
Dahlbom and Mathiassen (1993) - building on Lévi-Strauss - introduce the terms "tinkering" and "engineering" and discuss briefly how these two approaches relate to software development. They argue that engineers are "modern" and tinkerers "illiterate", that engineers work "top down" while tinkerers work "bottom up", and so on. My first reading of Dahlbom and Mathiassen's text left me with the impression that hackers were the "tinkerers" in their terminology, while the "engineers" were those using all of the "scientific" and professional methods. Then the following paragraph hit me like a sledgehammer:
Modern societies have engineers, illiterate societies have bricoleurs or tinkerers. As engineers, we organize our thinking in projects, choosing means and tools once the aim of the project has been decided. As tinkerers, we use what we have, letting our means and tools determine what we do. As engineers, we set our goals first, often having to invent tools to be able to reach them. (ibid., p. 173)
Suddenly I understood what the anger surfacing in unix-wizards was all about! The hacker had been forced by his programming-illiterate boss into using some tools and methods that he considered unsuitable or inadequate for the task at hand. The hacker wanted to work as a professional, as an "engineer", and management had forced him to become a "tinkerer".
The Hacker Community
As Taylorist-inspired software development methods descended upon the corporate world, the hackers entrenched themselves at the campuses of the large technical universities of the U. S. (in particular MIT, Stanford and Carnegie-Mellon), where their non-conformism was tolerated and their special skills appreciated.
Like other outsider communities [1], the hackers developed a strong skepticism toward the products of mainstream society, and they often preferred to develop their own programming tools and computer systems rather than rely on commercial solutions. For example, MIT hackers developed their own operating system (ITS - Incompatible Timesharing System [2]) for the lab's DEC PDP-6 and PDP-10 computers, a system which pioneered advanced concepts such as networking, file-sharing between machines and terminal-independent I/O. The hacking community also tried to develop its own hardware: the MIT Lisp machine, the Stanford University Network (SUN) workstation and the Carnegie-Mellon SPICE computer are examples of efforts in this direction. In the Bay Area, community efforts such as the Homebrew Computer Club designed, built and wrote the software for what became better known as personal computers.
Even today, this tradition continues. The hacker's operating system of choice is Linux, a free [3] Unix-compatible operating system developed as a community effort headed by Linus Torvalds at the University of Helsinki. And most hackers prefer the tools and utilities developed (again as a communal effort) by the GNU [4] project at the Free Software Foundation (in Cambridge, Massachusetts) to their commercial counterparts.
The reason the hackers preferred writing their own software and constructing their own machines was not just rooted in a frontiersman belief in self-reliance. By rejecting the fragmented and controlled world of corporate employment, the hacker community developed its own distinct perspective on computing and computers. This perspective favored open systems, integrated solutions and distributed resources. Whether this perspective emerged from the hackers' work style, or vice versa, is impossible to tell, but the early hackers rapidly adopted a community-oriented style of working. They preferred to work as intimately as possible with technology, and as interactively as possible with each other. This meant that they wanted direct access to the computer, and they also wanted to share files, look at each other's screen images, review and re-use each other's source code, co-author code libraries, and so on. The commercial machines and operating systems of that era, with their batch job submission systems, operator priesthoods and rings of protection, were not suitable for this, and the hackers proceeded to construct the tools that they felt they needed to support their style of working.
The next step in establishing a hacker community was the ARPANet, funded by the Advanced Research Projects Agency of the U. S. Department of Defense (DoD). The main objective behind the ARPANet program was to link computers at scientific laboratories so that researchers could share computer resources (Hafner and Lyon, 1996).
When the ARPANet emerged, it sported an unorthodox architecture that made it radically different from existing communication infrastructures such as the long-distance telephone network. Instead of establishing a (virtual) circuit between two end points and using this circuit to carry communication, messages were routed through the network by splitting them into small, independent and totally self-contained "packets", which were then left to find their own way to their destination, where they were re-assembled into complete messages before being delivered [5]. This solution has a number of interesting implications. It means that the entire network is distributed and self-similar - there is no "center", and therefore no means by which an authority can assume "central control". Given a certain degree of redundancy, it also makes the network very robust against failure: if a portion of the Net breaks down or is blocked for other reasons, the Net will automatically route packets around it (hence the hacker proverb, originally coined by John Gilmore: "The Net interprets censorship as damage, and routes around it.")
Another characteristic of the ARPANet was the ease with which new services could be added. The network provides the infrastructure to transport messages between any two points connected to it. To create a new service, one designs a new set of messages and defines the semantics of those messages by writing one or more computer programs that understand and are able to act upon them. Anyone connected to the network with the new programs installed on their computer can then take advantage of the new service. As the Net itself provides a most eminent infrastructure for disseminating computer programs (which are just one special type of message), it was easy for all interested parties to bootstrap into new services as they became available. A minimal sketch of this layering is given below.
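To make the layering concrete, here is a minimal sketch of such a "service" written in Emacs Lisp (the embedded extension language discussed later in this essay): a message format plus a program that acts upon the messages. The service name, port number and semantics are invented for illustration, and any language with network support would serve equally well.

    ;; A toy "service": clients send lines of text, and the service
    ;; replies with the upper-case equivalent.  The semantics of the
    ;; messages live entirely in this filter function.
    (defun toy-upcase-filter (proc string)
      "Act upon an incoming message by replying with its upper-case form."
      (process-send-string proc (upcase string)))

    ;; Listen for connections on an arbitrary port; GNU Emacs passes
    ;; incoming data from each connection to the filter above.
    (make-network-process
     :name "toy-upcase-service"
     :service 10101
     :server t
     :filter #'toy-upcase-filter)

Anyone on the network who knows the message format and runs a matching client program can use the new service; nothing in the network itself needs to change.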
Hackers were attracted to the ARPANet project. Both the engineering challenges involved and the goal (to enable the sharing of resources, tools and technologies) must have held a strong appeal, as must the distributed architecture and the flexibility and power to create new computer-based services.
Consequently, first wave hackers became some of the most vigorous members of the communities commissioned to develop the ARPANet. This resulted in hacker sensibilities becoming ingrained in the Net, and as the ARPANet became the Internet, the Net became an intrinsic part of hacker culture - and the preferred place of residence of the first wave hacker community.
While the ARPANet from the very beginning nurtured an unorthodox community, access to it was controlled: it was only available to affiliates of research institutions and universities with DoD funding. This made the ARPANet off-limits to members of the grass-roots movements that were the breeding ground of second wave hackers.
The second wave hackers therefore set out to create their own communication infrastructure. First they set up stand-alone Bulletin Board Systems (BBSs), which individuals could reach by dialing in with a modem. Then the boards were connected into homebrew global computer networks such as FIDOnet and PeaceNet. In 1985, Stewart Brand, the editor/publisher of The Whole Earth Catalog and Review, established the Whole Earth 'Lectronic Link (The Well), which rapidly emerged as one of the first and most significant of the online communities of the techno-counterculture of the mid-eighties (Rheingold, 1993). The Well started out as a BBS system, and did not hook up to the Internet (the open successor of the ARPANet) until 1992. But because Brand had elected to build the Well on Unix technology, the effort had sufficient credibility among first wave hackers (who shunned MS-DOS-based BBS systems) to make them participate in the community and help Brand build and run the system. The Well was therefore influential in bringing together hackers of the first and second waves. Brand (with Kevin Kelly, who later emerged as the executive editor of Wired) also organized one of the first "hacker" conferences in 1984, to which he invited prominent first and second wave hackers as well as assorted counterculture celebrities.
As the Net grew, it also helped the hacker community and culture spread beyond its core areas (the large U. S. technical universities and the Bay Area computer counterculture) to become a worldwide phenomenon. Finally, the recent commercial success of the Internet has made hackers skilled in creating distributed applications an appreciated resource in Internet-savvy companies, where some of them proudly display their roots on their business cards.
The Hacker Ethic
"[Tim Berners-Lee] didn't patent the [World Wide Web]. He didn't copyright. He made that openly available. And that's what has fuelled a great deal of the network development, and all the innovative ideas. [...] There is a continuing ethic in the community to give back to the network what it has given you" (Cerf, 1997).
A major impetus behind Steven Levy's work on hackers (Levy, 1984) was his exploration of the hacker value and belief system, which Levy calls the "hacker ethic". While Levy may be accused of romanticizing his subject, his book is nevertheless the best published study of this community so far.
Below, I have paraphrased Levy's "hacker ethic" into a set of imperatives that reflect the hacker mode of operation, followed by a set of position statements that reflect the hacker attitude.
Imperatives:
- reject hierarchies
- mistrust authority
- promote decentralization
- share information
- serve your community (i.e. the hacker community)
Position statements:
- when creating computer artifacts, not only the observable results, but also the craftsmanship of execution matters
- practice is superior to theory
- people should only be judged on merit (not by appearance, age, race or position)
- you can create art and beauty by the means of a computer
Another source for a reading of the hacker ethic is Richard M. Stallman's The GNU Manifesto, which outlines the rationale behind the GNU project and Stallman's own resignation from the MIT AI Lab:
"I consider that the golden rule requires that if I like a program I must share it with other people who like it. Software sellers want to divide the users and conquer them, making each user agree not to share with others. I refuse to break solidarity with other users in this way. I cannot in good conscience sign a nondisclosure agreement or a software license agreement. For years I worked within the Artificial Intelligence Lab to resist such tendencies and other inhospitalities, but eventually they had gone too far: I could not remain in an institution where such things are done for me against my will.
So that I can continue to use computers without dishonor, I have decided to put together a sufficient body of free software so that I will be able to get along without any software that is not free. I have resigned from the AI Lab to deny MIT any legal excuse to prevent me from giving GNU away" (Stallman 1985).
Ten years earlier, the techno-hippies of the somewhat deceptively named People's Computer Company (it was a magazine, not a corporation), the Berkeley Community Memory project, the Portola Institute (later renamed the Point Foundation) and the Midpeninsula Free University tried to voice their feelings about technology in society. None were more vocal than Ted Nelson, who put out a self-published pamphlet, Computer Lib / Dream Machines (later re-published by Microsoft Press), which more or less was a call to arms:
"I have an axe to grind. I want to see computers useful to individuals, and the sooner the better [...] Anyone who agrees with these principles is on my side. And anyone who does not, is not. THIS BOOK IS FOR PERSONAL FREEDOM. AND AGAINST RESTRICTION AND COERCION. A chant you can take to the streets: COMPUTER POWER TO THE PEOPLE! DOWN WITH CYBERCRUD!" [ 6] (Nelson, 1974)
Later, Nelson sets down his ideals for designing computer artifacts, which include objectives such as being:
- populist (equally available to all at low cost)
- open and universal (transcending political, technological, geographical or other boundaries)
- pluralist (support many points of view, including controversial subjects and unpopular and eccentric positions)
The embodiment of Nelson's ideas is Xanadu - a distributed, interlinked hypertext system that Nelson has been working on since 1960. Yet, after nearly 40 years of gestation, Xanadu is still not widely deployed for public use. In fact, it is Tim Berners-Lee's World Wide Web that finally created the docuverse that Nelson envisioned. Xanadu advocates (Pam, 1995) have argued that the World Wide Web is technologically inferior to the design proposed by the Xanadu project. That may very well be true, but Nelson's failure to get people to adopt Xanadu may nevertheless serve to illustrate the hacker idiom that rhetoric is inferior to practice.
There is, however, another observation to be made from the failure of Xanadu when pitted against the World Wide Web.
Ted Nelson is almost the archetypal techno-hippie-cum-entrepreneur. His rhetoric champions such progressive ideals as democracy, justice, tolerance, self-fulfillment and social liberation, but his beliefs are clearly those of free-market capitalism and neo-classical economic liberalism. In a recent presentation of Project Xanadu, Nelson describes it thus:
"[Xanadu] is a complete business system for electronic publishing based on this ideal with a win-win set of arrangements, contracts and software for the sale of copyrighted material in large and small amounts. It is a planned worldwide publishing network based on this business system. It is optimized for a point-and-click universe, where users jump from document to document, following links and buying small pieces as they go" (Pam, 1997).
In fact, if you look beyond the rhetoric, there is very little in Project Xanadu that distinguishes it from other commercial enterprises. The name "Xanadu", the Xanadu software, and the Xanadu group's servicemark, the Flaming-X symbol, are all copyrighted, trademarked and jealously defended by Nelson and his cohorts. And at the very core of the Xanadu system is an incredibly complex scheme for keeping track of ownership of, and extracting royalties for, intellectual property.
Not only did Tim Berners-Lee not bother to copyright or patent the World Wide Web, he also made the source code available for others to experiment with and improve. While this gave us such pieces of misengineering as the blink and frames tags, the open source policy of Tim Berners-Lee and subsequent developers of Web technology appears to me to be the distinguishing quality that made the World Wide Web succeed and the plethora of competing hypertext schemes (of which Xanadu probably was the first, best designed and most functional) fail.
The Virtual Class
Recently, with all the glitz and glamour surrounding high technology and the Internet, hackers have found themselves co-opted by technophiles, neo-classical economic liberalists and purveyors of technological hyperbole. Suddenly, hackers were hailed as the avant-garde of the computer revolution.
At the forefront of this campaign has been Wired. Its pages, crammed with psychedelic graphics and barely readable typefaces, regularly describe hackers (of absolutely all kinds - from digital vandals to ace programmers) as larger-than-life heroes. But Wired is not a revolutionary publication. The magazine's first managing editor, John Battelle (as quoted in Kroker and Weinstein, 1994), made no secret of the fact that Wired does not nurture much faith in the hacker ethic:
"People are going to have to realize that the Net is another medium, and it has to be sponsored commercially and it has to play by the rules of the marketplace. You're still going to have sponsorship, advertising, the rules of the game, because it's just necessary to make commerce work. I think that a lot of what some of the original Net god-utopians were thinking is that there was just going to be this sort of huge anarchist, utopian, bliss medium, where there are no rules and everything is just sort of open. That's a great thought, but it's not going to work. And when the Time Warners get on the Net in a hard fashion it's going to be the people who first create the commerce and the environment, like Wired, that will be the market leaders.""
As pointed out by Winner (1995), Wired's political platform is a mixture of social Darwinism, laissez-faire capitalism and technological determinism, combined with an admiration for self-indulgence, profit-seeking and boundless egos. The magazine's political allegiances are also evident through its ties with Newt Gingrich's conservative think tank, the Progress and Freedom Foundation. For the post-modern techno-yuppies who use their ability to innovate and create original products to free themselves from the mores of regular employment and the corporate world, and to gain considerable autonomy over their pace of work and place of employment, Wired has rapidly become the most authoritative guide to vogue and vocabulary, attitude and artifacts.
Dubbed "symbolic analysts" by Robert Reich (1991) and "the virtual class" by Kroker and Weinstein (1994), these individuals may at first glance be indistinguishable from equally non-conformist technology-loving artisans of the hacker community. The ideological differences between techno-yuppies and hackers are, however pronounced.
The techno-yuppies seem to share a fundamental belief in both the glory and the deterministic nature of computer technology. Technology is going to produce social changes "so profound their only parallel is probably the discovery of fire", proclaims Wired publisher Louis Rossetto (1993) in the first issue of the magazine. "Like a force of nature, the digital age cannot be denied or stopped", chimes in senior columnist Nicholas Negroponte (1995b); it will "flatten organizations, globalize society, decentralize control, and help harmonize people" (Negroponte, 1995a).
In contrast, the hackers' fascination with technology does not stem from a belief that technology will bring about great and revolutionary changes (or make any societal difference whatsoever). Hackers love technology for its own sake. But hackers believe that technology is too good a thing to be proprietary. Therefore, hackers pay considerable attention to the problem of how to make technology available to the public at zero or very little cost. In The GNU Manifesto, Stallman (1985) proposes a number of alternative solutions to this problem. One of them is to have the government impose a "software tax" on the sale of all computers. Revenues generated by this means would be used to fund future software development, with the results made freely available to all citizens at zero or little cost.
While the techno-yuppies usually portray the Internet as a laissez-faire utopia created out of thin air, the hackers who actually provided the skill and labor for its gestation are well aware that the construction and initial running costs of the Internet were funded with public money, first through the Advanced Research Projects Agency (ARPA) and later through the National Science Foundation (NSF).
Sometimes the difference in outlook between the techno-yuppies and the hacker community becomes manifest, as when Stewart Brand - organizer of the first hacker conference - was invited in 1995 to the annual meeting of the conservative Progress and Freedom Foundation:
"Brand patiently waited out countless denigrations of government, relentless rhetorical slander, until at last he broke out: "What about the G.I. Bill, which paid its way in four years and has been a pure profit ever since? [What about] ARPA and computers, ARPA and the Internet, and ARPA and God knows what else?" He wasn't invited back this year" (Hudson 1996).
Neither do most hackers share the techno-yuppies' view of the Internet as a lawless environment where supervision and control are both undesirable and impossible. Instead, there is growing concern in the hacker community that the increasingly commercial nature of the World Wide Web and the robber-baron capitalism of Internet junk mailers are overgrazing their beloved digital commons. After an interim where technological fixes (e.g. cancel 'bots [7]) were implemented and found to be inadequate, stronger regulation of abuses, from both responsible service providers and governments, has been called for (Seminerio and Broersma, 1998). Other hackers have already abandoned the present-day Internet as a great experiment that has turned sour, and have embarked upon the task of creating "Internet 2", which, among other things, is being designed with much better mechanisms for stopping abuses and enforcing policy.
In general, hackers strive to free themselves from the capitalist mode of production. Hackers therefore tend to be associated with universities or projects that have no commercial interests. If necessary, hackers may at times do consulting "for money" just to make ends meet, but their priority is clearly to earn a living in order to create beautiful technology, not the other way around. When interviewed about this, several hackers of my personal acquaintance cite an essay by Alfie Kohn entitled Studies Find Reward Often No Motivator: Creativity and intrinsic interest diminish if task is done for gain (Kohn, 1987) as an explanation for their motivation and lifestyle.
The independence of the typical techno-yuppie is a more elusive thing. As noted by Greenbaum (1995) and Barbrook and Cameron (1996), techno-yuppies are usually well paid and also have considerable autonomy over their pace of work and place of employment. But they are also tied by the terms of the consultancy contracts they enter into, and by the fact that they have no guarantee of continued employment beyond the expiration date of their assignment.
Deconstructing Software
Joseph Weizenbaum, as we have already noted, did not like hackers:
"To hack is, according to the dictionary, "to cut irregularly, without skill or definite purpose; to mangle by or as if by repeated strokes of a cutting instrument". I have already said that the compulsive programmer, or hacker as he calls himself, is usually a superb technician. It seems therefore that he is not "without skill" as the definition will have it. But the definition fits in the deeper sense that the hacker is "without definite purpose": he cannot set before him a clearly defined long-term goal and a plan for achieving it, for he has only technique, not knowledge. He has nothing he can analyze or synthesize; in short, he has nothing to form theories about. His skill is therefore aimless, even disembodied. It is simply not connected with anything other than the instrument on which it may be exercised. His skill is that of a monastic copyist who, though illiterate, is a first rate calligrapher" (Weizenbaum, 1976).
As the real target of Weizenbaum's pamphlet Computer Power and Human Reason is instrumental rationality (the belief that because a task is technically feasible, it should be performed), it is only fitting that he includes an attack on a community that apparently showed little restraint in embracing technology for technology's sake.
But the hackers that Weizenbaum observed and wrote about (the residents of the 9th floor at MIT Tech Square in the early-to-mid-seventies) were prolific, productive and creative implementers. They designed and constructed computer networks, timesharing operating systems, workstations and computer languages. With that in mind, I find Weizenbaum's statement about the hacker's superior technical skill linked to a lack of purpose and aim intriguing. To what extent is Weizenbaum's observation about hackers being "aimless" correct? Is there no sense or purpose to the software artifacts created by hackers?
Unfortunately, when reviewing a computer artifact, one very seldom has access to authoritative material that asserts the purpose or aim behind its creation. Even for commercial products, where advertising copy sometimes purports to give such information, one finds that it usually is just blurb created by the marketing department.
Nevertheless, one has the computer artifacts themselves. I will attempt to "read" these artifacts in order to deconstruct some original design decisions as well as the aim and purpose of the project. Doing this is not without its pitfalls, as my personal preferences and experiences as a user of such artifacts will doubtless mediate the result, but short of being present as an observer during the entire gestation process, I can think of no better method to uncover the aims and purposes behind computer artifacts.
First, let us tabulate a list of software artifacts rooted within the hacker community alongside their corresponding mainstream counterparts, to see if some pattern emerges. While the list to some extent may be viewed as arbitrary, my criterion for inclusion on either side is that the artifact should be fairly generic, widely deployed, and popular with its user community.
Software                  Roots Within the Hacker Community     Outside the Hacker Community
Network Infrastructure    Internet                              ISO/OSI, SNA
Programming Languages     Lisp, C, C++, Perl                    Cobol, Ada
Multimedia Formats        HTML, XML                             Acrobat/PDF
Operating Systems         ITS, Unix, Linux                      VMS, MVS, MS Windows/NT
Window Systems            X.11                                  MS Windows
Text Editors              emacs, vi                             MS NotePad, MS Word
Typesetting               TeX, LaTeX                            MS Word
Software constructed by hackers seems to favor such properties as flexibility, tailorability, modularity and open-endedness, so as to facilitate on-going experimentation. Software originating in the mainstream is characterized by the promise of control, completeness and immutability.
Most of the artifacts originating outside the hacker community are the products of corporate entities. It is, however, interesting to note that the two most dysfunctional and misengineered of them (at least, in this author's opinion) - the Cobol programming language and the ISO/OSI network infrastructure - originated from semi-democratic processes with the best of intentions and ample room for user participation. The team that created Cobol set out to create a programming language that was intended to be more "user-friendly" than any other computer language - including having the quality of being readable by non-programmers. And the ISO/OSI network infrastructure was designed by networking experts and users in concert. Through an elaborate system of meetings, discussions and voting, all concerns were addressed and all opinions were listened to by the committee, before the design decisions were meticulously documented in a series of official documents.
Looking at some of the designs in more detail, it seems clear that the creators of the emacs text editor understood that the range of problems and environments in which an editor might be used would extend well beyond what they themselves could imagine. To deal with these unpredictable situations, the editor has its own embedded production-strength Lisp programming language, which users can employ to modify and extend the editor as necessary. This particular feature has been well exploited by users to create an impressive range of applications needing a powerful text manipulation tool - from software development systems and mail readers to desktop publishing. To save needless duplication of effort, literally thousands of ready-made Lisp plug-in modules for emacs, created by communal effort, have been assembled and can be downloaded from the Net. These can be used as-is, or act as starting points for new modifications.
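To illustrate what this tailorability looks like in practice, here is a minimal sketch of an emacs customization: a new editing command written in the editor's embedded Lisp and grafted onto its keymap at runtime. The command name and key binding are invented for illustration.

    ;; A user-defined editing command, loadable at runtime.
    (defun insert-iso-date ()
      "Insert the current date at point in ISO format."
      (interactive)
      (insert (format-time-string "%Y-%m-%d")))

    ;; Bind the new command to a key of the user's choosing.
    (global-set-key (kbd "C-c d") #'insert-iso-date)

A few lines like these, placed in a personal init file or a shared plug-in module, are all it takes to reshape the tool; the thousands of modules mentioned above are built from exactly this kind of material.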
For typesetting complex text and graphics, the hacker community has provided TeX. TeX works seamlessly with emacs (and with most other text editors, should the user have other preferences), and again provides almost infinite tailorability and flexibility. The built-in rule set (which may be overridden when required) knows a number of important typographical rules and provides output that is pleasing to the eye and adheres to generally accepted typographic conventions.
Contrast TeX with Microsoft Word, a text editor and typesetter rolled into one package. While the latest offering (Word '97) is crammed with all sorts of "features" [8], I find it impossible to adapt it to do what I really want it to do. Even very simple adaptations, such as changing key bindings to get a consistent set of interfaces when switching between tools and environments, seem to be impossible. The only possible reading of this design decision is that Microsoft does not want me to switch between tools and environments, but prefers that I remain locked in its proprietary world.
Having purchased and started to use a product I do not like and do not want to own (Microsoft Word) in order to fulfill a client's (and even some academic conferences') expectations, I have now also resigned myself to choosing one of two equally unpleasant options: either to rely on the mostly immutable built-in rule set and deliver sub-standard looking documents, or to spend a disproportionate amount of my time laying out documents manually. Configuring the typesetting portion of Word to automatically and consistently provide professional-looking typography is, in my opinion, beyond the capabilities of the tool.
MS Word is by no means inextensible. It comes with a powerful macro and scripting capability and permits the use of Microsoft's Visual Basic as an extension language. It is reasonably well integrated with the other products in Microsoft's Office family (Excel, PowerPoint, Access and Outlook), and Microsoft's OLE (Object Linking and Embedding) provides a means to integrate Word (and the other Office products) with third-party applications. This extensibility does not, however, necessarily imply tailorability. The property of extensibility may be interpreted as the expression in software of an imperialist strategy of assimilation and conquest [9]. The property of tailorability, by contrast, is one where the artifact is open to yielding to user and environmental requirements.
Another interesting comparison may be made between HTML (Hyper Text Markup Language) and the Adobe Acrobat PDF format. Both are essentially vehicles for the presentation of hyperlinked multimedia documents across distributed systems. HTML imposes no restrictions on how the markup is interpreted and presented. There are some conventions and guidelines, but the final decisions are essentially left to the implementers, which means that it is possible for hackers to implement any personal preferences they may have, including renderings that substitute plain text for graphical elements to cater to visually disabled users. By comparison, the Adobe Acrobat PDF format is designed to re-create an excruciatingly exact facsimile of the original document, and allows very few deviations from this. Further, the HTML system makes the source code of the document available [10] to the user, to be studied, admired, copied and modified. Nothing like this is possible with Acrobat/PDF, which touts the immutability of the "original" as a major feature.
Similar readings may be made for all the items in the list above, and for many similar pairs of items originating respectively inside and outside the hacker community [11].
Another observation that emerges from studying the table above concerns the attitude towards the end user (i.e. a user with no inclination towards adapting, extending or developing computer artifacts). None of the artifacts originating within the hacker community has the property commonly referred to as "user-friendliness". It almost seems as if the hacker attitude towards end users can be summed up as: "Since it was hard to write, it should be hard to use."
A case in point is emacs. Emacs was originally developed on the ITS operating system running on the 36-bit DEC-10 computer in 1975. More than twenty years later, it is still almost universally loved as the "one true editor" by hackers all over the world (though there is a contingent of ascetics who consider emacs too rich for their taste, and stick to vi). But the apparent attraction of emacs lies in its tailorability and flexibility - not its user interface. It has been adapted to work on every conceivable platform, but even in windowing environments, it more or less presents itself as a command-driven text editor of the glass-TTY era.
Several attempts on my part to turn end users on to emacs, and the subsequent discussions about the merits and failings of the program, made it painfully apparent that the majority of end users despise the emacs editor and environment with the same fervor as I love it. In their opinion, and in particular when compared to MS Word, it is utterly and totally user-hostile. They do not like the "old-fashioned" command-driven interface, the "dull" look & feel, or the rather steep learning curve required to actually attain the tailorability and flexibility I had promised them.
Returning to MS Word, it seems obvious that, from the outset, "user-friendliness" must have been one of the most prominent aims of its design team: it was one of the first substantial office applications to provide a graphical user interface - and ousted WordPerfect from its spot as the world's best-selling office application on that strength alone. Now, MS Word '97 even comes with a company of nine animated cartoon characters, such as "Power Pup", who dispense friendly advice and encouragement to novice users.
Incidentally, while emacs does not include cute cartoon characters, it can be tailored to be just as mouse-intensive as MS Word - if this really is what the user wants. One of the reasons it does not behave like that by default is that the text editor is one of the programs that receives the most intensive use in many computing environments. Having a mouse-driven (rather than a command-driven) interface may increase the risk of the user being afflicted by stress-related problems such as carpal tunnel syndrome. What may be perceived as "user-friendly" (a mouse-intensive, slick, direct-manipulation user interface) at one level turns out to be user-hostile at another. A sketch of such mouse-oriented tailoring follows below.
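As a small illustration, mouse-intensive behavior in emacs is itself just a matter of user-level bindings in the embedded Lisp. The button assignments below are invented for illustration, not defaults.

    ;; Pop up a buffer menu on the right mouse button.
    (global-set-key [mouse-3] #'mouse-buffer-menu)

    ;; On shift-right-click, visit the file whose name is under the
    ;; mouse ("find file at point" ships with GNU Emacs).
    (require 'ffap)
    (global-set-key [S-mouse-3] #'ffap-at-mouse)

The same mechanism that keeps the editor command-driven by default makes this reconfiguration a few lines of user code.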
To understand some of the implications, let's again compare hacking to industrial software production:
Commercial software is generally produced by teams of computer workers churning out the various components (look & feel, functions, database administration, low-level interfaces, etc.) to a pre-set list of specifications. In short, the mode of production is not dissimilar to the piecework performed by the metal workers described in Taylor (1911). As with the metal workers, it is unlikely that the computer workers themselves, or anyone they actually know, will make extensive use of what they create. Even a worker who uses the actual product the component is a part of may find it just as difficult to grasp the role of the component in the finished product as it is for a car owner who also happens to be a metal worker to appreciate how the cogwheel he has machined fits into his car. As a result, the functional properties and qualities of the finished artifact are of little concern to the worker. His or her only purpose is to satisfy the specifications for the component he or she is commissioned to make. Sometimes the myopia this causes leads to serious errors of judgement [12].
A hacker, on the other hand, does not perform well when producing piecework software components based upon pre-set specifications. As we have seen, his preferred working environment is a communal setting very much like that of an artisan or craftsman engaged in pre-industrial production.
Like the workers in programming teams, hackers make use of the components of others. But there is a difference. While the industrial ideal is to treat such components as "black boxes", of which nothing needs to be known except how to interface with them, hackers generally require access to the artifact's interior, and want to understand all parts of the systems they work on, including the components contributed by others. This is how Eric S. Raymond describes his motivation for doing a major rewrite of code contributed to his project by another hacker:
"I had another purpose for rewriting besides improving the code and the data structure design, however. That was to evolve it into something I understood completely" (Raymond,1997).
This does not mean that hackers are oblivious to such staple engineering practices as object-oriented programming and information hiding. These and other sound engineering practices are routinely employed when hackers create complex software systems. But the difference between the hacker's approach and that of the industrial programmer is one of outlook: between an agoric, integrated and holistic attitude towards the creation of artifacts and a proprietary, fragmented and reductionist one.
Hacking in the Real World
"Linux is subversive. Who would have thought even five years ago that a world-class operating system could coalesce as if by magic out of part-time hacking by several thousand developers scattered all over the planet connected only by the tenuous strands of the Internet" (Raymond, 1997).
While there exist a number of studies of hackers as a political, sociological and cultural phenomenon, I know of only one study of the hacker as a programmer: Eric Raymond's paper "The Cathedral and the Bazaar" (Raymond, 1997) [13]. It tracks the development of a software system (fetchmail) implemented by Raymond himself and a large number of collaborators. Raymond's paper is part diary, part descriptive and part prescriptive. Reading it, however, one is struck by the similarity between the system development model described by Raymond and the system development models offered by a number of European information system developers from the mid-eighties (e.g. Floyd, 1989; Mumford, 1995) as alternatives to the waterfall model. The basic ideas (rapid prototyping, iterative development, and strong user participation) are similar.
"I released early and often (almost never less than every ten days, during periods of intense development, once a day)" (ibid.).
"One interesting measure of fetchmail's success is the sheer size of the project beta list [...] At time of writing it has 249 members and is adding two or three a week" (ibid.).
"Users are wonderful things to have, and not just because they demonstrate that you are serving a need, that you've done something right. Properly cultivated, they can become co-developers. [...] Given a bit of encouragement, your users will diagnose problems, suggest fixes, and help improve the code far more quickly than you could unaided" (ibid.).
"Given enough eyeballs, all bugs are shallow. [...] Although debugging requires debuggers to communicate with some coordinating developer, it doesn't require significant coordination between debuggers. Thus [debugging] doesn't fall prey to the same quadratic complexity and management costs that make adding developers problematic" (ibid.).
But, as is evident from the four quotes from Raymond's paper given above, there are also some important differences.
Firstly, what distinguishes the method for software development prescribed and described by Raymond from methods such as STEPS and ETHICS is the absence of formalism (certainly in Raymond's scholarship, but also, one suspects, although to a lesser degree, in the method's execution).
Secondly, leveraging modern tools for automatic system update and the Internet as an infrastructure for user/developer contact, Raymond speeds up his development cycles to a frenzy, co-opts his users as debuggers/developers, and indiscriminately adds everyone who wants to participate to the project beta list. This is different from the carefully metered-out development cycles, the clear division of roles between users and developers, and the representative system for user participation that figure prominently in both the STEPS and ETHICS methods.
Thirdly, both the developers' and the users' desire to participate in the endeavor is more or less taken for granted in STEPS and ETHICS. Raymond acknowledges that securing participation from all parties may pose a problem, and argues that the project coordinator needs to keep some of his or her focus on user and developer motivation, take certain steps to ensure it, and possess personal qualities in this area.
Fourthly, STEPS, ETHICS and similar models are presented as universal approaches that can be used regardless of circumstances (I doubt whether this actually is true, but that discussion is beyond the scope of this paper). Careful reading of Raymond's paper makes it fairly clear that hacking as an approach to constructing software artifacts should not be considered equally and universally applicable. The success of the project as described by Raymond seems to depend on at least three pre-conditions being present:
- The projected system must fill an unfilled personal need for the developer;
- The project needs to secure user participation and maintain continued user support; and,
- The project coordinator/leader must have good interpersonal and communication skills.
The second pre-condition implies that hacking is not an applicable method when developing an information system "from scratch". Since hacking does not involve formal requirements or system specifications, there will, at that point, be little to attract or interest the users. Hence, if the task at hand is to create a new system "from scratch", one should not consider hacking a viable method for software creation, but rely on more conventional methods of system creation.
However, if some previous system exists that may be used as the starting point for developing a new system that will eventually replace it, or if the system development project has evolved to the point where prototypes of the projected system are sufficiently functional to interest users, hacking may be viewed as an alternative method for system development. Given that the right pre-conditions exist, hacking may result in a better system.
The pre-conditions listed above do not stipulate that hacking only works in a computer underground setting, nor do they limit the applicability of the method to the production of "free" software. Also, hacking is not an all-or-nothing proposition. A project may well start out being developed along any number of traditional lines, and then switch to the hacker approach when prototypes or early versions have evolved to the point where hacking becomes viable.
Looking around, I find that hacker-like approaches to software development are adopted in environments where one would least expect it.
In Microsoft's case, many customers are becoming debuggers, as "beta" versions of new products are distributed in massive quantities (literally tens of thousands of copies) over the Internet. Microsoft has also developed closer communication channels between users and developers by having some of its developers participate in on-going discussions about their products on the Internet [14].
Netscape has gone even further down this route. By making the source code of its Navigator Internet browser open and freely available, Netscape is essentially gambling on hacking as a method for making it a superior product.
Conclusion
So far, hacking as a method for the construction of information systems and software artifacts has been excluded from serious study and consideration. The term itself is also poorly understood, surrounded by much prejudice, folklore and mythology. Part of the confusion stems from attempts to hijack the term by a large number of special interest groups, ranging from digital vandals to neo-classical economic liberals. It is, however, possible to see through the confusion and excess baggage, and what remains is a community sharing an attitude to, and a method for, the construction of computer artifacts that has been consistent since the 1960s.
Hacking as a method for system development originated as a grass-roots reaction to attempts to impose an industrial mode of production on the development of software. The qualities emphasized by the implicit and explicit ideologies of this community result in the production of artifacts whose quality and usability characteristics are different from those gestated through an industrial mode of production.
This community has successfully created a number of usable and unique software artifacts - ranging from text editors to the Internet. Lately, large corporate entities such as Microsoft and Netscape have started experimenting with hacker-like approaches in areas such as quality assurance and user/developer communication. Still, hacking as a method for creating information systems and software artifacts has received virtually no attention from the scholarly community interested in methods for system development.
My belief is that "hacking" deserves to be put on the map as a viable method for the construction of information systems and software artifacts. It should be studied alongside other system development methods, and practitioners in the field of system development should be aware of its applicability and able to take advantage of its "bag of tricks" when appropriate.
Acknowledgments
First, thanks to Eline Vedel, for encouraging me to write this piece in the first place, and for being available to discuss it at various points on the way. Also thanks to the participants at the 1997 Oksnøen Symposium on Pleasure and Technology, where an earlier and shorter version of the paper was presented, and to Rick Bryan, Ole Hanseth, Haavard Hegna, Arne Maus and Eric Monteiro, who took the time to read the draft and who provided stimulating criticism, discussion and suggestions.
Any errors in fact or logic that remain, as well as the opinions expressed are, of course, only my own.
About the Author
Gisle Hannemyr received his B.Sc. at the University of Manchester, England in 1977, majoring in computer architecture. Since then, he has been active in both academic research and commercial practice in a number of computer-related fields, including hardware design, AI, system development and computer networking. He is currently pursuing research interests as a fellow of the Department of Informatics, University of Oslo, Norway. E-mail: gisle@hannemyr.no
Notes
1. Witness, for instance, the politics implicit in The Whole Earth Catalog and the activism of its editor/publisher, Stewart Brand.
2. The name is a pun on CTSS - the Compatible Time-Sharing System, developed at MIT for IBM mainframes.
3. The term "free software" in this context denotes four levels of freedom: 1) that it is free of any restrictions that limit its use and application; 2) that it is freely distributable; 3) that it is freely portable between different operating platforms; and, 4) that the source code is available, so users are free to modify and tailor the software.
4. GNU is a recursive acronym standing for GNU's Not Unix. It designates an ongoing effort from 1985 within the hacker community to create a body of "free software" (see above) at least as complete and rich as the Unix operating system and its associated utilities.
5. It has been claimed that this particular design decision was due to the ideas of RAND Corporation scientist Paul Baran (Baran, 1964), who had a strong interest in the survivability of communication systems under nuclear attack. Baran argued strongly for distributed networks and the use of packet messaging as means to achieve this. Bob Taylor, who as director of ARPA's Information Processing Techniques Office oversaw the formation of the ARPANet, has denied that the design of the ARPANet was motivated by anything related to supporting or surviving war. According to Taylor, the driving force behind the ARPANet was simply to enable scientific laboratories all over the U. S. to share computing resources (Hafner and Lyon, 1996).
6. "cybercrud /si:'ber-kruhd/ (coined by Ted Nelson) Obfuscatory tech-talk. Verbiage with a high MEGO [My Eyes Glaze Over] factor. The computer equivalent of bureaucratese" (Raymond, 1991).The irony of using a jargon word of his own invention ("cybercrud") to protest obfuscatory tech-talk probably never occurred to Nelson.
7. A cancel 'bot is an automatic process which monitors messages sent over Usenet and deletes those that meet certain pre-set algorithmic criteria.
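In outline, such a 'bot is no more than a filter loop over the message stream. The sketch below (in Python, purely for illustration; the message fields, spam patterns and cross-post threshold are hypothetical, and a real cancel 'bot would speak NNTP and post "cancel" control messages) shows the idea:

    # A minimal sketch of cancel 'bot logic. All criteria here are
    # hypothetical examples of "pre-set algorithmic criteria".

    SPAM_PATTERNS = ["MAKE MONEY FAST", "$$$"]
    MAX_CROSSPOSTS = 20  # assumed cut-off for excessive cross-posting

    def should_cancel(message):
        """Return True if the message meets the pre-set criteria."""
        if len(message["newsgroups"]) > MAX_CROSSPOSTS:
            return True  # excessively cross-posted
        return any(p in message["body"] for p in SPAM_PATTERNS)

    def run_cancelbot(message_stream, post_cancel):
        """Monitor a stream of messages; issue a cancel for each match."""
        for message in message_stream:
            if should_cancel(message):
                post_cancel(message["message_id"])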
8. Microsoft Word will automatically change a lowercase "i" to a capital one. This is indeed a nice feature when one is writing English text. It is not so convenient that Word continues to make this correction even after you have changed the language of your document to something else.
9. The imperialist underpinnings of Microsoft Windows are apparently so strong that it is the norm for Windows programs to behave like conquerors. When you install a new program on a Windows machine, it will invariably claim for itself whatever it can gain access to. For instance, when installing Microsoft Word on a computer already running Adobe FrameMaker, I found that ownership of all pre-existing FrameMaker documents was automatically and expediently transferred to Word, which effectively rendered them useless (Word isn't capable of opening them). The message was loud and clear: Thou shalt have no other word processors before Microsoft Word.
10. Note that when the Java language, rather than HTML, is used for Web presentations, the intrinsic access to the source code goes away. This is one of the reasons the hacking community has been split over whether Java should be considered an acceptable tool. Also, the chief architect of Java - James Gosling - has had a long and stormy relationship with the hacking community. He was one of the original hackers at Carnegie-Mellon University, and also the main programmer behind the move of the hackers' favorite editor - emacs - from TECO/ITS to C/Unix. As the hacking community migrated from DEC mainframes to Unix, gosmacs, as Gosling's version of emacs became known, became their favorite on that platform. Gosling then enraged the hacking community by retroactively imposing his own copyright on his implementation (which fellow hackers had assumed to be copyright-free) and selling the exclusive rights to it to a commercial software company.
11. There are also some items whose status may be disputed. What about the Apple Macintosh and the IBM PC, for instance? Hackers universally despise both machines. According to Bruce Horn, who was part of the small team that created the Apple Macintosh, most members of the team (with the notable exception of Steve Jobs) were hackers in the original sense of the word. However, it was also stipulated (by Jobs) that the machine should not be extensible and that its inner workings were to be hidden from any user/programmer who had not entered into a contractual relationship with Apple to become an "official" developer. This effectively prevented it from becoming a hacker's favorite. And, of course, Apple's decision to claim copyright on the "look & feel" of the Macintosh (which hackers believed was a mishmash of ideas that were common knowledge and had been part of the industry for years) did not help to endear Apple and the Macintosh to the hacker community. As for the IBM PC: the hardware originated outside the hacker community, in an IBM laboratory in Boca Raton, Florida. It nevertheless sported a number of hacker-favored properties, such as open-endedness, accessibility and flexibility. What the original design lacked, however, was any type of networking support. It was really a personal computer, and therefore unsuitable for the hackers' communal style of working. The hackers stuck to their DEC mainframes, or to networked workstations of their own design.
12. In a recent development of a large back-office computer system by a large and prestigious consultancy known for its strict belief in "specification" and "method" as means to superior software quality, the system as constructed turned out to be useless for its intended purpose because the coders had elected to use a data type known as "double precision floating point" to represent the decimal fractions the specifications called for. Now, while "double precision floating point" is the appropriate data type for decimal fractions in contexts such as physics and engineering, it is a disastrous choice for data that at some point are to be processed by transaction systems. Despite its very promising name, "double precision floating point" is a data type that, on binary computers, is unable to hold an exact representation of many two-digit decimal fractions (e.g. 0.30). This leads to minuscule rounding errors that play havoc with the embedded controls of any properly designed transaction system.
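The effect is easy to demonstrate. The following sketch (in Python, chosen here purely for illustration and unrelated to the project in question) shows both the rounding error and how an exact decimal data type avoids it:

    from decimal import Decimal

    # Binary floating point cannot hold 0.30 exactly:
    print(0.1 + 0.2)         # 0.30000000000000004
    print(0.1 + 0.2 == 0.3)  # False - the kind of minuscule rounding
                             # error that upsets a transaction system

    # A decimal data type represents two-digit fractions exactly:
    print(Decimal("0.10") + Decimal("0.20"))                     # 0.30
    print(Decimal("0.10") + Decimal("0.20") == Decimal("0.30"))  # True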
13. One extraordinary thing about Raymond's paper is that it is on record (Barksdale, 1998) that it was instrumental in making Netscape Corporation disclose the source code of its product and make it openly available for hackers to inspect, change and develop.
14. To what extent it is possible to reap the benefits from this approach while still refusing to hand out source code is unclear. In the University of Oslo multimedia lab, where we among other things work on developing protocols and services for the next generation Internet (Internet 2), we found ourselves stymied in our attempts to use Microsoft operating systems and network browsers for this purpose. Without source code available, it was impossible to make the necessary modifications to the communication stack and device drivers to make the equipment work in an Internet 2 environment.
References
Scott Adams, 1996. The Dilbert Principle. New York: Harper Business, and The Dilbert Zone.
Anthony Jon Lev Anderson, 1993. "Technology and Freedom," at http://www.eff.org/pub/Activism/technology_freedom.paper
Paul Baran, 1964. "On Distributed Communications Networks," IEEE Transactions on Communications Systems, volume 12; and, Paul Baran, 1964. "On Distributed Communications: I. Introduction to Distributed Communications Network." Rand Corp. Memorandum RM-3420-PR, at http://www.rand.org/publications/RM/RM3420/
Richard Barbrook and Andy Cameron, 1996. "The Californian Ideology," at http://www.dds.nl/~n5m/texts/barbrook.htm
Jim Barksdale, 1998. "Software Development for the Greater Good," at http://www25.netscape.com/columns/mainthing/source.html
Christopher B. Browne, 1998. "Linux and Decentralized Development," First Monday, volume 3, number 3 (March), at http://firstmonday.org/issues/issue3_3/browne/index.html
Harry Braverman, 1974. Labor and Monopoly Capital. New York: Monthly Review Press.
R. Budde, K. Kautz, F. Kuhlenkamp and H. Züllighoven, 1992. Prototyping: An Approach to Evolutionary System Development. Berlin: Springer-Verlag.
Vint Cerf, 1997. "Vint Cerf: Father of the Internet," interview conducted by technical editor Leo Laporte and broadcast by MSNBC: The Site (June 3), at http://www.thesite.com/0697w1/iview/iview421_060397.html
Bo Dahlbom and Lars Mathiassen, 1993. Computers in Context: The Philosophy and Practice of Systems Design. Cambridge, Mass.: NCC Blackwell.
Christiane Floyd, 1989. "STEPS to Software Development with Users," Proceedings of ESEC 1989, University of Warwick, Coventry, England, 11-15 September.
Joan Greenbaum, 1976. "Division of Labor in the Computer Field," Monthly Review, volume 28, number 3.
Joan Greenbaum, 1995. Windows on the Workplace. New York: Monthly Review Press.
Katie Hafner and Matthew Lyon, 1996. Where Wizards Stay Up Late: The Origins of the Internet. New York: Simon & Schuster.
David Hudson, 1996. "Digital Dark Ages," San Francisco Bay Guardian, November 6, at http://www.metroactive.com/papers/sonoma/01.02.97/cyberlibertarian-9701.html
Kevin Kelly, 1994. Out of Control. Reading, Mass.: Addison-Wesley.
Alfie Kohn, 1987. "Studies Find Reward Often No Motivator: Creativity and intrinsic interest diminish if task is done for gain," Boston Globe, January 19, and at http://fsf.varesearch.com/philosophy/motivation.html
Arthur Kroker and Michael A. Weinstein, 1994. Data Trash: The Theory of the Virtual Class. New York: St. Martin's Press.
Steven Levy, 1984. Hackers: Heroes of the Computer Revolution. New York: Anchor Press/Doubleday.
Gordon R. Meyer, 1989. "The Social Organization of the Computer Underground," Master's Thesis, DeKalb, Ill.: Northern Illinois University, at http://ftp.eff.org/pub/Privacy/Newin/New_nist/hacker.txt
Enid Mumford, 1995. Effective Systems Design and Requirements Analysis: The ETHICS Approach. Basingstoke: Macmillan.
Nicholas Negroponte, 1995a. "Being Digital - A Book (p)review," Wired, volume 3, number 2 (February).
Nicholas Negroponte, 1995b. Being Digital. London: Hodder & Stoughton.
Ted Nelson, 1974. Computer Lib / Dream Machines. Chicago: Mindful Press.
Andrew Pam, 1995. "Where World Wide Web Went Wrong," Proceedings of the Asia-Pacific World Wide Web '95 Conference, at http://www.glasswings.com.au/xanadu/6w-paper.html
Andrew Pam (editor), 1997. "Xanadu FAQ," at http://www.xanadu.com.au/xanadu/faq.html
Eric S. Raymond, 1991. The New Hacker's Dictionary. Cambridge, Mass.: MIT Press, and at http://www.tuxedo.org/~esr/jargon/jargon.html
Eric S. Raymond, 1998. "The Cathedral and the Bazaar," First Monday, volume 3, number 3 (March), at http://firstmonday.org/issues/issue3_3/raymond/index.html
Robert Reich, 1991. The Work of Nations: Preparing Ourselves for 21st-Century Capitalism. New York: Random House.
Howard Rheingold, 1993. The Virtual Community: Homesteading on the Electronic Frontier. Reading, Mass.: Addison-Wesley.
Louis Rossetto, 1993. "Why Wired," Wired, volume 1, number 1 (January).
Tanja S. Rosteck, 1994. "Computer Hackers: Rebels With a Cause," Montreal: Concordia University, Dept. of Sociology and Anthropology, Honours Seminar - Soci 409/3, at http://www.proac.com/crack/hack/files/hack7.txt
Maria Seminerio and Matthew Broersma, 1998. "Usenet junk e-mail could swamp the system Friday," ZDNet (April 3), at http://www.zdnet.com/zdnn/content/zdnn/0402/303950.html
Richard M. Stallman, 1985. "The GNU Manifesto," at http://www.fsf.org/gnu/manifesto.html
Frederick Winslow Taylor, 1911. "The Principles of Scientific Management," American Magazine, (March-May), at http://melbecon.unimelb.edu.au/het/taylor/sciman.htm
Sherry Turkle, 1984. The Second Self: Computers and the Human Spirit. New York: Simon and Schuster.
Joseph Weizenbaum, 1976. Computer Power and Human Reason. San Francisco: W. H. Freeman.
Langdon Winner, 1995. "Peter Pan in Cyberspace: Wired Magazine's Political Vision," Educom Review, volume 30, number 3 (May/June), at http://www.educause.edu/pub/er/review/reviewarticles/30318.html
Copyright © 1999, First Monday.