First Monday

FM Reviews

Thierry Bardini.
Bootstrapping: Douglas Engelbart, Coevolution, and the Origins of Personal Computing.
Stanford, Calif.: Stanford University Press, 2001.
paper, 284 p., ISBN 0-804-73871-8, US$19.95.
Stanford University Press: http://www.sup.org

Thierry Bardini. Bootstrapping: Douglas Engelbart, Coevolution, and the Origins of
Personal Computing.

Crusaders make unsettling neighbors, by and large. They try to recruit you for the rowdy and undisciplined troops they gather to their causes, and frequently lay waste to the neighborhood before they set off to do the same elsewhere. They generally have no room in their lives for any concern more mundane than their own exalted projects. They can be fearful bores.

Douglas Engelbart appears to be an exception, perhaps because his purpose in life was relatively modest, as crusades go. He wanted to shape the computer and its user into a co-evolutionary unit, where the user would command the computer as a sort of slave, a kind of extension of her own body and mind. His aim was fairly grandiose, like most crusades: he wanted to augment human intelligence by integrating this new tool into its users' lives. This was in direct contrast to the model proposed by other influential workers in the development of personal computing, perhaps the most significant of whom, according to Bardini, was J.C.R. Licklider. As first director (1962-64) of the Information Processing Techniques Office (IPTO) at the U.S. Defense Department's Advanced Research Projects Agency (ARPA), he was in a position to provide the major funding for all projects devoted to the development of computer technology. Licklider's vision of the future of computing was quite different: he envisioned the machine becoming a quasi-independent "colleague" of the "intelligence worker", eventually capable of more or less autonomous cerebration: artificial intelligence, in other words.

Engelbart's plan was to involve computer users with the machines in the process of co-evolution, so that the use of the tool would involve the development of a new kind of human, capable of interacting with the machine in a sort of synergistic symbiosis. The key to this process, in his view, was the mode of interface by which the two symbionts would communicate one with the other - and ultimately with the universe of other computers/users.

Engelbart's idea was to meld the user and the tool in a much more integrated fashion than the familiar vision-dependent graphical user interface with its metaphor of the desktop. He wanted to include the body in the relationship, by making kinesthetic skills akin to those of a trained musician part of the mix. The closest he came to success in this area was in inventing the mouse, but his plans originally included a device called the chord keyset. This was a five-key device based loosely on the telegraph key in which the transmission of information involved specific combinations of keys rather than the distinct meaning assigned to each key of the QWERTY typewriter keyboard. This failed in part because the typewriter model had already established a long hegemony; its use was too familiar to be replaced, and the chord keyset required a long learning curve. I can personally attest to that, having tried to learn a similar tool, the court reporter's stenotype machine, with which it is possible to record speech verbatim much faster than any typist can do. That experience also supports Bardini's main contention: that Engelbart's model of the interface would have made the computer a more flexible and effective tool than the current interface provides. The stenotype was hard to learn but much easier to use.
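
The expressive capacity of such a chording device is easy to work out: with five keys that may each be up or down, there are 2^5 - 1 = 31 distinct non-empty chords. A minimal sketch of that arithmetic (an illustration of the counting only, not a model of Engelbart's actual hardware):

```java
public class ChordCount {
    // Number of distinct non-empty chords on a keyset with n keys:
    // each key is independently up or down, minus the all-keys-up state.
    public static int chords(int keys) {
        return (1 << keys) - 1;
    }

    public static void main(String[] args) {
        System.out.println(chords(5)); // prints 31
    }
}
```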

Another factor that contributed to the keyset's failure was the development of the Apple computer and its graphical user interface. This invention, with its menus and icons, removed a major difficulty with earlier systems, in which modality could and frequently did become a source of frustration. This refers to the fact that all computers receive both instructions and data from their users. In order to switch from one function to the other it is necessary to switch modes, and this, in systems like DOS and CP/M, required remembering long code strings to bring about the change. The menu and the icon render this mode-switching functionally invisible, removing the need to learn the cumbersome coding of the keyset.

Networked computing was central to Engelbart's ideal. This is quite distinct from the AI model, which imagined the human as essentially external to the process. The computer, in a new vision of the golem, would operate on its own. What purposes it might serve in that mode are not detailed here, but that is a side issue to Bardini's main purpose, which is to sketch the history of Engelbart's contribution to the history of personal computing.

The book makes it clear that his efforts to evolve a new being out of the computer/human interactor failed in large part precisely because of the way Engelbart perceived his mission. In his career at Stanford Research Institute he single-mindedly pursued his goal, to the exclusion of much interest in other developments in the field, and came to dominate his subordinates there to an extent that created something of the aura of a prophet leading his flock into a techno-promised land. According to Bardini's account, fleshed out with some help from a thinly-disguised roman à clef by the UFO "researcher" Jacques Vallée, Engelbart made the always-fatal prophetic mistake of permitting competition. In the climate of the late 1960s personal potential movement, Engelbart had instituted by fiat a sort of weekly encounter group meeting for all his personnel from which he entirely excluded himself. These pointless sessions, decreed to fill no less than two hours a week of otherwise potentially productive time, predictably created a mutinous climate among his staff. This was only intensified when Engelbart permitted and then encouraged active input from another prophet: Werner Erhard and his acolytes in Erhard Seminars Training, known as est.

Engelbart's project was already in trouble, largely because of the great difficulty in learning to use his system. Like the stenotype it was very easy to use once you had learned it, but Engelbart had no problem at all with the fact that it could take a year to learn to use his software. Since at that time the only clients for his system were industries, this made his system a very hard sell, which also didn't trouble him greatly. Another problem with Engelbart's project was that it was trying to do two quite different things at once: R&D on computer interfaces and product development for the market. The product development arm was always something of a poor relation, since Engelbart's interest was always and only in research and development.

The story of Engelbart's progress is a long and complex one, full of shifting personnel and corporate and academic alliances. This complexity, no detail of which is spared in Bardini's account, serves in an ironic way to demonstrate the nature of Engelbart's interface. Since the book is about computers and their designers it is packed with acronyms. A passage that demonstrates the difficulty this book poses is the following discussion of "requests for comments", a sort of interoffice mail at Stanford Research Institute, which is in most respects typical:

"SRI-ARC authors contributed eight RFCs [in 1973], and this trend continued in 1974 and 1975, until the NIC moved out of ARC. From 1974 on, NWG contributions from ARC were limited to the work of two staff members who were not part of the early NLS development."

This encoded language is hard to learn and easy to use, exactly like Engelbart's interface. As a non-participant in the development of the personal computer, I found I spent almost as much time in the index of Bardini's book as in the text, though I have also to acknowledge that spelling out all those acronyms might have made the book even harder to read.

A final point that needs mention is Engelbart's difficulty with imagining the user, though to be fair it must be pointed out that all designers have the same problem, at least in the early stages of their work: the user is themselves, since no one else is involved with the product, whatever it might be. In later tests of the product with naïve users, problems inherent in the design process can, to some extent, be ironed out, but Engelbart appears to have ignored the "real" user of his interface, which was a fatal mistake and another arena in which Apple stole his thunder. Jobs and Wozniak designed a computer whose user was everybody, while Engelbart remained committed to the "intelligence worker" as his ultimate customer. This figure was, in large part, himself: a dedicated researcher devoted to development of himself and his tools. Ultimately, it was crusading that killed his idea.

None of this is mere metaphor, for recurring passages in the book make it clear that Engelbart and at least some of his colleagues pursued their goal with a fervor that was nothing short of religious. Bardini makes the point explicit in his discussion of the culminating est episode, where Engelbart's staff turned on him with the rage of disappointed true believers. Like all crusaders, Engelbart eventually had to settle for a lot less than he aimed for. --Ted Daniels End of Review

++++++++++

Antony Bolante.
Premiere 6 for Windows and Macintosh.
Visual Quickstart Guide.
Berkeley, Calif.: Peachpit Press, 2001.
paper, 592 p., ISBN 0-201-72207-0, US$19.99.
Peachpit Press: http://www.peachpit.com/

Antony Bolante. Premiere 6 for Windows and Macintosh.

If you work with electronic digital media, or ever intend to, chances are that you will already have come across Adobe's flagship digital media authoring program, Premiere. With a keen eye on the increasing use of the humble desktop PC to create digital media content, Adobe have recently unveiled the latest major update, version 6. There is now added support for Web video, native DV capture, and real-time audio mixing, along with many additional features that will keep professional editors very happy. The transition from Premiere 5.1 to 6 is easy, and the interface really hasn't changed that much. Many existing features have been fixed or improved, and with the addition of new features the program overall feels much more robust.

Peachpit Press and author Antony Bolante have not been idle in the year or so since VQG Premiere 5.1 was published, as this latest edition attests. The Visual Quickstart Guide Premiere 6 for Windows and Macintosh has now matured into a 562-page guide, a "bible" by any other publisher; and if I found one fault with the VQS Premiere 6 guide, it is that it's not in the Visual Quickpro Guide series, as it certainly should be. Every chapter is laid out in a clear two-column format, one column for text and one for screen shots, and each section is written in an easy-to-follow step-by-step format. For a new user learning the basics of the program, there are no better books than Peachpit Press' Visual Quickstart Guides; by using the guide, the reader can learn how to edit short and long format movies for video, multimedia, and Internet applications.

The VQG Premiere 6 has only one more chapter than its predecessor, 18 as opposed to 17. The "Working with Clips" chapter has been split into two, "Importing and Managing Clips" and "Editing Source Clips", a change that goes a long way toward clarifying the important differences between the two activities. The book is organised to present information as you might encounter it in a real-world project, though it is also the type of book that lets the reader jump straight to a topic of interest without getting bogged down. The "Introduction" examines the application's new features and such matters as deciding on an editing strategy, optimising the workflow, and utilising the media suite, as well as the availability of Macintosh and Windows versions. Chapter One covers the basics of the Premiere 6 workspace, including overviews of the new "History" palette, keyboard shortcuts, context menus, and correcting mistakes. Chapter Two deals with "Starting a Project", from selecting an initial workspace through to deciding which settings to adopt for the current project. The chapters on "Importing and Managing Clips", "Editing Source Clips", "Creating a Program", "Editing in the Timeline", and "Refining the Program" show how to edit an existing project. Incorporating transitions, previewing, adding effects, and motion settings are covered in great detail in Chapters 8, 9, 11, 13, and 14. Audio mixing is covered in Chapter 10, with panning, fading, balancing, and channels all treated. Using the title tool is covered in Chapter 12, though I wonder why the chapter on "Capturing Video" (Chapter 16) appears towards the back of the book; in my opinion, it should have been among the first few chapters. Publishing is discussed in "Creating Output" (Chapter 15), covering exporting, recording, printing, and the "Special Processing Options".
The closing chapter, "Video and Audio Settings", discusses some topics not specific to Premiere, such as video interlacing, NTSC and PAL frame rates, and compression settings and options such as timecode, safe colours, keyframes, and codecs.

The author clearly shows the reader that Premiere 6's integration with other Adobe applications, such as After Effects and Photoshop, is more noticeable than in previous versions: for example, After Effects filters can be applied during video edits, Photoshop files can be dragged and dropped onto the project window, and 'instant composites' can be created that can be viewed in the camera. This continues Adobe's vision of providing a more cohesive interface across all its high-end production tools.

The Visual QuickStart Guide Premiere 6 for Windows and Macintosh makes learning this powerful tool an achievable goal. The U.K. film industry may not be ready for your feature film, but at least you'll be ready for it. This is an ideal book for someone who already owns Premiere but finds the tutorials or the manual too intimidating: a step-by-step guide to professional video editing for the beginner, and for the professional editor wanting to be brought up to speed with the new features.

Peachpit Press should be congratulated for continuing to be the only publisher to consistently publish affordable "manuals" for the novice as well as the professional. The Visual Quickstart guides are becoming a recommended standard for computer-related books; keep up the good work, I say! - Glenn Dalgarno End of Review

++++++++++

John Carlis and Joseph Maguire.
Mastering Data Modeling: A User-Driven Approach.
Boston: Addison-Wesley, 2001.
paper, 416 p., ISBN 0-201-70045-X, US$44.95.
Addison-Wesley: http://www.aw.com/cseng

John Carlis and Joseph Maguire. Mastering Data Modeling: A User-Driven Approach.

Imagine a room full of surgeons, physiotherapists, occupational therapists, makers of artificial limbs (prosthetists), medical researchers, and a data modeler, all trying to agree on what a national amputee database should contain. The surgeons arrive with lists of amputation types and levels. The physios talk about rehabilitation techniques. The nurses discuss the relative merits of plaster of Paris dressing versus soft dressings. The data modeler sits pulling on his beard and twirling his glasses while getting more and more confused and frustrated at the conflicting needs of the users.

This scenario describes my first serious professional experience of data modeling. It seemed a long way from the neat examples and exercises of the textbooks. Getting people to agree on what should be in the model and what should not was a nightmare, especially as there were strong political motives and conflicting professional agendas within the group. Eventually a working model emerged from the meetings, but I wish I had discovered Mastering Data Modeling first!

Mastering Data Modeling aims to help data modelers build constructive modeling relationships with their clients, and to that end it lists six 'good data modeling habits'.

These may seem obvious but sometimes the obvious must be made explicit in order to understand why it is obvious. The authors describe how users' language and vocabulary are distinctive to the users' working culture and how this can cause problems for the modeler. First, people from different cultures within the modeling group will disagree about what is worth recording in the database. Second, the cultures will differ as to the relevant distinctions that will need to be made: some will argue for broader categories than others. Third, the cultures will disagree about vocabulary. Fourth, it is difficult to understand another culture's vocabulary. Finally, members of a culture may not have a precise understanding of the vocabulary they use.

A data modeler has to enter a modeling meeting equipped to deal with these problems when they arise. The authors draw on their broad experience of modeling sessions with clients and describe events at some of these sessions, with a 'moral' to be learned from each one. Some examples of these 'Story Interludes' are:

Jim, a novice modeler, snapped at his clients, saying, "Tell me about our data, not about what you do!" ... little progress was made. Moral: Allow users to express themselves however they wish.

At Company X, a four-day modeling meeting blitz was not very productive ... At company Y, users who had come back after several weeks either had forgotten a lot or had done a lot of hard thinking that had put them out of synch with the others ... At company Z, a big meeting with two dozen users caused problems ... Moral: Have short, frequent meetings with a few users.

As we reverse engineered [a vending machine mechanic's form], we found ... a surprise ... handwritten text splashed diagonally across the form: Refund $1.75. The form was inadequate for his business so he sensibly augmented it. Moral: Interview real users who do real work.

Much of the book contains the details of the 'Logical Data Structure' (LDS) notation, which provides a means for developing data models with users during modeling meetings, and is essentially an entity-relationship notation. Each possible configuration ('shape') that may exist in a valid data model is described and explained, and readers are encouraged to learn these shapes so well that they will instinctively recognize them during modeling meetings. There is a chapter called "Introduction to Mastering Shapes" which includes advice on knowing how shapes are likely to evolve as the model develops and on how to ask the right questions to help users choose between shapes. 'Shape Mastery' also involves knowing the relative frequencies of the shapes: if a model contains too many rare shapes, you know something must be wrong.

A concept developed in the book which is often neglected in more theoretical data modeling text books is that of 'Controlled Evolution' of models, by which the process of model creation is controlled by the modeler without taking it away from the users. The method is explicitly Socratic - Socrates did not directly reveal the truth to people but guided them as they refined their own understanding.

Mastering Data Modeling is an innovative book that treats with humor a subject that is so often stodgy and dull. Computer science students, IT consultants and business managers will find valuable material in this highly recommendable book. - Robert Scovell End of Review

++++++++++

John M. Carroll.
Making Use: Scenario-Based Design of Human-Computer Interactions.
Cambridge, Mass.: MIT Press, 2000.
cloth, 376 p., ISBN 0-262-03279-1, US$39.95.
MIT Press: http://mitpress.mit.edu

John M. Carroll. Making Use: Scenario-Based Design of Human-Computer Interactions.

A recent rummage with AltaVista, using "scenarios" and "Carroll" as search terms, came up with a remarkable 1,036,655 pages. John Carroll is a pioneer and leading exponent of the use of scenarios as a design technique, particularly in the context of software development. Currently Professor of Computer Science, Education & Psychology and Director of the Center for Human-Computer Interaction at Virginia Tech, he has a prodigious output that includes some 13 books and more than 250 technical papers. But Carroll's interests and expertise are not confined to scenarios; from his own Web site:

"I have broad interests in HCI methods. For example, I developed techniques like "usability specifications" and "claims analysis" for making rapid prototyping less of a trial and error method. My perspective on usability engineering is lifecycle oriented. I do not see usability engineering as a fancy term for trial and error with lots of evaluation! My interests in HCI methods particularly in scenario-based design has led to the development of interests in software engineering, and particularly in requirements engineering."

Carroll's latest book is a very welcome addition to the field of information systems design and human-computer interaction. It is described as a technical monograph yet manages to be both engaging and broad in its approach. It is about scenarios, obviously, but it goes beyond that and addresses, up front, the nature of design as a problem-solving activity. For this reason it may appeal to designers in many fields. In a nutshell, scenarios are "concrete stories about use"; they are about understanding users' real needs. Rather than designing software by mechanically listing requirements and functions, the focus is first on understanding the real-life activities that need supporting and then informing the rest of the process with descriptions of those activities. Scenarios are composed of various elements: for example, the setting, populated with agents/actors who have specific goals and objectives that are achieved via sequences of actions and events. By representing the use of a system with a set of scenarios, that use becomes explicit and widens the focus beyond the purely technical.
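
The elements just listed can be made concrete with a small sketch. The following Java fragment is purely illustrative - the class and field names are my own shorthand, not Carroll's notation - and shows how a single scenario bundles a setting, an actor with a goal, and a sequence of actions and events into one "concrete story about use":

```java
import java.util.Arrays;
import java.util.List;

public class ScenarioSketch {
    // One field per scenario element; the names are illustrative shorthand.
    static class Scenario {
        final String setting;       // where the story takes place
        final String actor;         // the agent/actor
        final String goal;          // the actor's objective
        final List<String> actions; // the sequence of actions and events

        Scenario(String setting, String actor, String goal, List<String> actions) {
            this.setting = setting;
            this.actor = actor;
            this.goal = goal;
            this.actions = actions;
        }

        // A scenario is a concrete story about use: render it as one sentence.
        String asStory() {
            return actor + ", at the " + setting + ", wants to " + goal
                + " and so " + String.join(", then ", actions) + ".";
        }
    }

    public static void main(String[] args) {
        Scenario s = new Scenario(
            "library reference desk",
            "a researcher",
            "verify a citation before closing time",
            Arrays.asList("searches the catalogue", "finds no match", "asks the librarian"));
        System.out.println(s.asStory());
    }
}
```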

The book's Preface includes a clear description of the contents and points out that the chapters may be selectively read, although as the author notes, the further into the book you start the more backward references will have been missed. An early, key chapter is devoted to exploring what is meant by "design". Carroll identifies what he sees as characteristic and difficult properties of design (some of which were originally discussed in Reitman's 1965 monograph Cognition and Thought), observing that standard design methods seldom address these issues:

  1. incomplete description of the problem to be addressed;
  2. lack of guidance on possible design moves;
  3. the design goal or solution state cannot be known in advance;
  4. trade-offs among many interdependent elements;
  5. reliance on a diversity of knowledge and skills; and,
  6. wide ranging and ongoing impacts on human activity.
There are clearly parallels here with the idea of resolving "wicked" (as opposed to "tame") problems, an idea first proposed by Rittel and Webber (1973) in the context of social planning; the main problem being that, very often, problems themselves are ill-defined. Carroll, then, sees information system design as sharing similar challenges with other design activities; a notion that I suspect may not have struck many systems "designers".

Other chapters unfold to present arguments supporting the overall approach and suggest scenario-based development models and techniques. Numerous, highly illustrative case studies are included and the book concludes with an interesting survey of current scenario-based work in human-computer interaction and software engineering.

From this brief review, some readers may conclude that scenario-based design is a fairly obvious idea - indeed, as Carroll himself engagingly reveals:

"I am attracted to wide-ranging and somewhat obvious ideas. Scenario-based design is a good example. If you want to design something for people to use, especially an interactive system, it is probably a good idea to explicitly take the nature of that use into account in the design work."

Carroll is exploring and developing an explicit and robust methodology for scenario-based design practice that avoids the mechanistic (and often fruitless) processes that typify so many formal methods. Now, that's not obvious. - Jenny Le Peuple, London Guildhall University End of Review

References

W. R. Reitman, 1965. Cognition and Thought: An Information Processing Approach. New York: Wiley.

H. Rittel and M. Webber, 1973. "Dilemmas in a General Theory of Planning," Policy Sciences, volume 4, number 2, pp. 155-169.

++++++++++

Andrew Deitsch and David Czarnecki.
Java Internationalization.
Sebastopol, Calif.: O'Reilly and Associates, 2001.
paper, 451 p., ISBN 0-596-00019-7, US$39.95.
O'Reilly and Associates: http://www.oreilly.com

Andrew Deitsch and David Czarnecki. Java Internationalization.

From early cave markings to today's World Wide Web, people have struggled to find common denominators that make communication more effective between many very distinct languages. Today's computers have added to this "Babel" effect with ASCII, Unicode, and ISO-based character sets. The diversity of today's spoken, written, and digital languages continues to plague the communications industry, with "native" languages variously displayed left to right, right to left, or top to bottom. It is estimated that the languages of the world, from mankind's earliest attempts at communication to the present, contain approximately 500,000 characters. The Unicode Standard can describe 1,112,064 characters, compared with ASCII's mere 128 unique "slots" (256 in its eight-bit extensions), which is not nearly enough for communication between today's modern technologies.
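
The 1,112,064 figure follows from the Unicode code space itself: code points run from 0 to 0x10FFFF (1,114,112 values), minus the 2,048 surrogate code points reserved for UTF-16. A minimal sketch of that arithmetic, using constants from Java's own Character class:

```java
public class UnicodeCapacity {
    // Assignable Unicode scalar values: all code points minus the surrogate range.
    public static int assignableCodePoints() {
        int total = Character.MAX_CODE_POINT + 1; // 0x110000 = 1,114,112 code points
        int surrogates = Character.MAX_SURROGATE - Character.MIN_SURROGATE + 1; // 2,048
        return total - surrogates;
    }

    public static void main(String[] args) {
        System.out.println(assignableCodePoints()); // prints 1112064
    }
}
```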

If you are writing software that must be "internationalised", then there is no question that you need this book. Aimed at software engineers who design or manage Java software, this concise volume cuts straight to the chase, showing how to get Java applications to work globally for local distribution. With the promise of "Write Once, Run Anywhere!", Java has successfully delivered cross-platform capabilities. However, developers need to be aware of the difficulties and issues raised when attempting to write a single application binary, one that can be compiled at runtime by the application or from user-defined input.

Unfortunately most software is still written in English, and English-only software is already becoming obsolete. Java Internationalization empowers the software developer to begin writing software that is truly multilingual, using Java's very sophisticated internationalisation facilities. Internationalisation makes communication more effective in today's international e-commerce markets and high-speed global economy, and this book provides in-depth coverage of the support for globalised software on the Java platform. Covering the techniques and issues surrounding the creation of software in different languages, Java Internationalization brings Java developers up to speed on the new generation of software development, enabling authors to write software that is no longer limited by language boundaries. One of the real strengths of Java is its support for the Unicode Standard (now at version 3.0, though readers should note that Java 2 supports only version 2.1). Unicode, a system for the interchange, processing, and display of the world's diverse written languages, is thoroughly integrated at just about every level of the platform.

Java Internationalization begins with a truly fascinating history of the world's writing systems and the character sets necessary for outputting characters in software, an excellent overview of the complexity of writing for different languages. "Locales" covers the specifics of obtaining locale information such as country and language, as well as locale-specific variables such as the euro currency character in Europe. "Isolating Locale-Specific Data with Resource Bundles" discusses how to isolate the general objects of a resource bundle and the way authors can include these variables at runtime. "Formatting Messages" deals with the handling of complex message formatting, making the reader aware that there are many unsupported classes and methods to avoid owing to their lack of true "internationalisation" support. "Character Sets and Unicode" provides an overview of character sets and the Unicode standard, explaining issues of unification, normalisation, special characters, and conversion. In "Searching, Sorting, and Text Boundary Detection" there is a thorough description of the techniques and issues surrounding the creation of multilingual software in different languages.
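
The locale sensitivity these chapters describe can be seen in a few lines of standard Java. This sketch (the amount is arbitrary) uses java.text.NumberFormat to format the same price for two markets; the decimal separator and currency symbol come from the locale, not from the programmer:

```java
import java.text.NumberFormat;
import java.util.Locale;

public class LocaleDemo {
    // Format an amount using the currency conventions of the given locale.
    public static String priceFor(double amount, Locale locale) {
        return NumberFormat.getCurrencyInstance(locale).format(amount);
    }

    public static void main(String[] args) {
        System.out.println(priceFor(19.99, Locale.US));      // prints $19.99
        System.out.println(priceFor(19.99, Locale.GERMANY)); // e.g. 19,99 € (locale-supplied format)
    }
}
```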

Java's built-in support for locales, the geographical and language communities a program serves, is treated later in the book. In "Fonts" and "Graphical User Interfaces" the authors show how to format text (and dates) for different markets using built-in Java APIs and features (like resource bundles). "Input Methods" details how developers can write programs that accept input from languages requiring more keys than are available on a standard 105-key keyboard. "Internationalized Web Applications" focuses on internationalising Web sites, offering valuable insights into the presentation aspects of internationalised Internet applications powered by Java, whether with servlets or JSPs. The book concludes with a quick look at the future evolution of internationalisation in Java. The appendices are very extensive, containing listings for "Language Codes", "Character Encodings Supported By Java", "Unicode Character Blocks", a "Programmers Quick Reference", and "Internationalisation Enhancements Across Versions of the JDK".

Java Internationalization does justice to an intriguing area of Java development, one that is increasingly important as more and more software is extended to new global markets no longer limited by language boundaries. This book is a good introduction to a complex topic - a truly absorbing read! - Glenn Dalgarno End of Review

++++++++++

Juanita Ellis and Timothy Speed.
The Internet Security Guidebook: From Planning to Deployment.
San Diego: Academic Press, 2001.
paper, 320 p., ISBN 0-122-37471-1, US$44.95.
Academic Press: http://www.academicpress.com/

Juanita Ellis and Timothy Speed. The Internet Security Guidebook: From Planning to Deployment.

This book is a real gem and an interesting read for everyone, but particularly for network administrators and managers. It is actually designed and intended as an introductory text for network administrators and managers, which is why it has a certain textbook feel. That, however, does not diminish the book's quality - quite the contrary.

What does it mean to say the book is primarily oriented toward network managers and administrators? Basically, that the scope of the book is broader than that of a network engineer's manual. The authors do not go into the technical details of network components and protocols, at least not beyond the level most useful to network managers, or IT managers in general. As soon as technical details and nuances go beyond planning, establishing, and maintaining a secure network as a whole, and as soon as specifications descend into the brand names of technical components, the authors restrain themselves. That is why this book, or textbook if you prefer, makes interesting and above all practical reading on such an important topic as Internet security.

    Reading this book feels like climbing a hill. Readers may know something about the hill they are about to climb - the hill called Internet security - or they may not. Whatever their level of knowledge, they learn as they climb, reach some sort of pinnacle, and then begin their descent. When they come down from the hill and wrap up the book, they are left with a strong foundation on which to continue building their Internet security knowledge and expertise.

    The book starts in true textbook fashion - with basics. After defining what constitutes a business network, the authors proceed with a list of possible threats: the perpetrators, the tools and methods they utilize, and all the things they can do to your network. The authors do not dwell on the fine distinctions among the various types of Internet security violators, dividing all perpetrators into hackers and crackers, since this broad division satisfies the needs of business. The taxonomy is in itself an interesting and revealing topic; interested readers should consult the NSA glossary of Internet security terminology at the SANS Institute site at http://www.sans.org/newlook/resources/glossary.htm or Information Security Magazine's online description of perpetrator subtypes at http://www.infosecuritymag.com/articles/july00/features2a.shtml

    All possible types of Internet security violations are then defined and described, from unauthorized use of computer systems through denial-of-service, sabotage, virus attacks, and so forth. A secure environment then consists of the following security components: policy and procedures; tools (firewalls, encryption software, antiviral software, and A3 - authorization, authentication, and administration - software); and commitment. The authors point out that a secure system is not a finite task, something you do once to be safe and sound forever after. Having and maintaining a secure system is a process rather than a single task. It requires constant vigilance and commitment from all employees. More than that, being a process itself, security should be part of every other process within a company.

    But what if you have run a business for a while with no security procedures, or only insignificant ones, in place? Where do you start now in order to get a complete system? Again, the authors are here to help businesses resolve this quandary. Start with what your business is, what its requirements are, and what service levels have to be met. This leads to risk assessment and risk analysis for your business, after which you are ready to develop strategies and procedures for incident handling, establish customer and employee training, and finally keep the process current and updated. The last item is really important. It reiterates, again and again, the most important of all security notions - security is a process, or rather, security is a state of mind. Perhaps "state of mind" is a bit too strong here, if not paranoid, in which case "security as a process" describes the real situation most accurately.

    Chapter Two provides a security review of a business-oriented trusted network. A trusted network should have Web access through a sort of DMZ, with routers and a firewall controlling access to the network. The authors provide a layout of security policy and operational procedures, and in many respects this chapter is the core of the book. Chapter Four is a natural continuation of Chapter Two, presenting the technical elements of a secure network in more detail, and Chapter Five deals with firewalls as the major tool for protecting your intranet from both the Internet and an extranet. This section is basically an introduction to the theory and practice of firewalls. The authors first introduce the types of firewalls one will encounter: packet filtering, application gateway, and circuit-level gateway. They describe in detail the advantages and disadvantages of each, which of course depend on the particular business one is running. Finally, the authors discuss the actual implementation and maintenance of a firewall. The chapter is quite detailed, covering real-life managerial questions like the buy-or-build dilemma. To help managers make an informed decision, the authors offer a sort of assessment form. The form covers all the questions that need to be answered to determine the specifications of the firewall required, and it can then be given to commercial firewall vendors, eliminating a lot of guesswork. The chapter ends with a list of firewall vendors.
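
    The packet-filtering variety is, at heart, a first-match rule table with a default-deny fallback. A minimal sketch of that idea (the class and the sample rules here are hypothetical, not drawn from the book):

```java
import java.util.List;

// A toy first-match packet filter: each rule matches a source-address
// prefix and a destination port, and either permits or denies the packet.
public class PacketFilter {
    static class Rule {
        final String srcPrefix; final int dstPort; final boolean allow;
        Rule(String srcPrefix, int dstPort, boolean allow) {
            this.srcPrefix = srcPrefix; this.dstPort = dstPort; this.allow = allow;
        }
    }

    private final List<Rule> rules;
    public PacketFilter(List<Rule> rules) { this.rules = rules; }

    // The first rule that matches decides; unmatched traffic is denied.
    public boolean permits(String srcIp, int dstPort) {
        for (Rule r : rules) {
            if (srcIp.startsWith(r.srcPrefix) && dstPort == r.dstPort) {
                return r.allow;
            }
        }
        return false; // default deny: allow only what is explicitly permitted
    }

    public static void main(String[] args) {
        PacketFilter fw = new PacketFilter(List.of(
            new Rule("10.0.", 23, false), // block telnet from the 10.0.x.x net
            new Rule("", 80, true),       // allow HTTP from anywhere
            new Rule("", 443, true)));    // allow HTTPS from anywhere
        System.out.println(fw.permits("192.0.2.7", 80));  // true
        System.out.println(fw.permits("10.0.3.4", 23));   // false
        System.out.println(fw.permits("192.0.2.7", 25));  // false (no match)
    }
}
```

    Application and circuit-level gateways make the same permit/deny decision, but at higher layers of the protocol stack, which is what drives the trade-offs the authors describe.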

    From this point, the most logical way to continue reading is probably to go back to Chapter Three and then on to Chapter Six, where elements of cryptography and authentication and authorization procedures are discussed. Chapter Three introduces Public Key Infrastructure (PKI), which Chapter Seven later discusses in more detail and applies to e-commerce.

    Chapter Eight deals with that ubiquitous symbol of a society based on information technology - e-mail and messaging systems. E-mail has become a necessity in today's world; in fact, business could hardly be imagined without it. Standard e-mail and instant messaging are both discussed, along with the ways their security vulnerabilities can be remedied. Mass mail, as an important business and marketing tool, receives a detailed discussion. The authors deal with a variety of hazards associated with e-mail, like spam and e-mail viruses, and discuss the available remedies. The chapter ends, as always, with e-mail policy advice for managers, and even provides a template that can help you design an acceptable-use policy for messaging in your company.

    After all the elements of a secure network, procedures, and policy have been discussed separately, and now that readers are familiar with the basic concepts and technology of Internet security, everything is put together in Chapter Nine, which discusses extended risk analysis. For a given business, risk is defined as a sum over potential threats, weighing how likely each is and what its overall impact on the business would be. Going through each of these in detail, every security officer can help his or her CEO determine the dollar cost that security breaches could impose on their assets and their business. As always, the chapter ends with a down-to-earth accounting sheet that can help executives assess a quality index for their firm, based on their sales base, the cost of security, and a moving average of the number of incidents in a given time period. Armed with all this information, the reader is then taken to the corporate level: in Chapter Ten the authors discuss how proper security handling has to be based on an appropriate corporate security policy. Detailed discussion and charts to help managers outline a corporate security policy are provided, and the entire job is broken down and described in detail. The process is covered so thoroughly that even security elements that are not part of IT security per se, like physical security, are incorporated, giving the entire plan a sense of completeness.
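
    This style of risk analysis reduces to simple arithmetic: for each threat, multiply its yearly likelihood by the dollar impact of one incident, then sum over all threats. A sketch of that calculation (the threat names and figures below are invented for illustration, not taken from the book):

```java
public class RiskAnalysis {
    // Expected yearly loss from one threat: probability of occurrence
    // in a year times the dollar impact of a single incident.
    public static double expectedLoss(double yearlyProbability, double impact) {
        return yearlyProbability * impact;
    }

    // Overall risk: the sum of expected losses over all identified threats.
    public static double totalRisk(double[][] threats) {
        double total = 0.0;
        for (double[] t : threats) {
            total += expectedLoss(t[0], t[1]);
        }
        return total;
    }

    public static void main(String[] args) {
        double[][] threats = {
            { 0.30,  50_000 }, // virus outbreak: 30% yearly chance, $50k cleanup
            { 0.05, 400_000 }, // major intrusion: 5% chance, $400k damage
            { 0.60,   5_000 }, // minor abuse incidents: 60% chance, $5k cost
        };
        // 0.30*50000 + 0.05*400000 + 0.60*5000 = 38000
        System.out.printf("Expected yearly loss: $%.0f%n", totalRisk(threats));
    }
}
```

    Numbers like these are what let a security officer put a dollar value on prevention when arguing the security budget with a CEO.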

    Well, you have done it: you have gone through the book and followed and implemented all that useful but lengthy advice and all those procedures. Everything works fine until, one dark and stormy night, you are hacked. There is no 100% secure system - no one can promise that - and therefore we should all be ready and prepared for such an eventuality. This is not the end of the world, just a regular, though unpleasant, situation in the world of Internet security. Chapter Eleven of this useful manual then comes to the rescue. Incident handling and procedures are discussed here, not only from the Internet and corporate security point of view, but also as part of a broader security picture. The incident prevention and handling charts and graphs incorporate local and national law enforcement authorities, including the FBI.

    The final chapter is a general recap of the entire project of Internet security design, implementation, and operations. Again, practicality rules, and you can find here advice on how to make use of pilot security programs in your overall security planning. As an example, a full and detailed plan and schedule for the planning and implementation of a PKI project is provided.

    The first appendix lists a variety of security tools and their vendors, and the second is a full report on CERT (the Computer Emergency Response Team), with Web links and other useful information. With a detailed glossary and a references section broken down by security topic, this little manual, or textbook if you want, is a real gem. And, as I have already mentioned, these words should be taken literally - this is a manager's manual, not a technical reference book. It provides all the information one might need to organize and supervise an Internet security project - but for the technical details of particular tools and hardware one needs to go further. Internet security is also quite a broad field, and for many specifics, and particularly to stay on top of the latest developments, one needs to go beyond this book. As a natural continuation of the book, Internet security breaches have been quite deftly covered by InfoWorld at http://www.infoworld.com/articles/tc/xml/01/04/09/010409tcsecurity.xml and http://www.infoworld.com/testcenter/security2001.html, for example.

    Certainly, if you read and apply all the advice from The Internet Security Guidebook, you will find that your business is quite well protected.

    There is one thing that I found quite interesting and amusing about this book. The authors have a distinct style in discussing this serious and, as it might seem, somewhat dull subject of Internet security. They drop subtle, fitting, humorous lines here and there, making the entire read a sheer pleasure. For instance, the authors state that giving out a password to unknown individuals calling over the phone should never happen - ever. If anyone believes that being so loose with passwords is not a serious breach of security, the authors advise such people to e-mail them their credit card and PIN numbers, and the authors will have fun shopping on their credit cards. In another instance, discussing the idea that every security system is intrinsically vulnerable to hacking and that an absolutely secure system is a fundamental impossibility, they suggest that whoever claims their system is absolutely secure should name it Titanic. Or, at another point, when discussing spam, they 'pay tribute to a wonderful meat product' and follow the advice of the Hormel Foods Corporation on how the word should be used - in lower-case letters, since the word SPAM is a trademark owned by Hormel. "Go to your local grocer and purchase some. It's very good and can be prepared in many different ways." Quite a lighthearted and inoffensive way of dealing with a sensitive subject, one must say.

    All in all, this book is a nice, informative, and concise manual on Internet security, designed for managers - and fun to read as well. - Sinisa Dragic End of Review

    ++++++++++

    Nancy J. Johnson (editor).
    Telecommuting and Virtual Offices: Issues and Opportunities.
    Hershey, Pa.: Idea Group Publishing, 2001.
    paper, 264 p., ISBN 1-878-28979-9, US$69.95.
    Idea Group: http://www.idea-group.com

    Nancy J. Johnson (editor). Telecommuting and Virtual Offices: Issues and Opportunities.

    Quite suddenly, it seems, four people in my immediate family have become telecommuters. Because we are all doing it more or less as freelancers, the transition has seemed practically effortless. However, once an organisation takes responsibility, telecommuting presents considerable challenges.

    This collection of essays, case studies and research findings is rich with the 'issues and opportunities', many of them needing further research. The 13 chapters are gathered into three sections concentrating on issues facing the community, the employer, and the employee.

    The North American and European perspectives explored suggest that telecommuting is largely employee-driven, beneficial to employer and employee and community, increasingly affordable, and expected to grow. On the other hand, effective managers will want to face up to many issues. These range from the 'hard' and quantifiable, to 'soft', less tangible ones. They could include contracts, data security, infrastructure, costs, health and safety, supervision, support systems, and career management.

    Simply picturing a 'telecommuter' is no simple matter. A telecommuter may be at home, mobile, working from a satellite office, or a telecentre; may telecommute one day a week or more; may be permanently in this role or only for the duration of a project; may use very different infrastructures to connect to the office; may have low autonomy or be a professional; may work more or less independently; may have chosen or be required as a condition of appointment to work in this way.

    Nearly every page reveals similar depth in this surprisingly complex area. For example, many Internet team-workers will already be aware of strategies for achieving 'user-friendly' English. A section of Kirk St. Amant's essay "Success in the International Virtual Office" discusses a whole list: you should avoid idiomatic expressions, abbreviations, false cognates ('map' in Dutch means folder), complex verbs, and long noun strings. On the other hand, be sure to know the dialect of English spoken by the target culture ('subway' as highway underpass or underground train system?), and ask a native speaker from the target audience to review a document. Next, have you considered how different cultures represent numbers? (Could this warning have saved the Mars Climate Orbiter, famous victim of a metric/imperial muddle?) All this is a mere prelude to consideration of other, more subtle, cultural differences.

    It will be for others to attempt to simplify the lessons to be learned. This book gives a clear message that telecommuting has good outcomes; but organisations have a lot to worry about first. - Catherine Sack End of Review

    ++++++++++

    Jeffrey Veen.
    The Art and Science of Web Design.
    Indianapolis: New Riders Press, 2000.
    paper, 259 p., ISBN 0-789-72370-0, US$45.00.
    New Riders Press: http://www.newriders.com

     Jeffrey Veen. The Art and Science of Web Design

    Buy this book. It is inspiring (in the best sense of the word) and worthy of savoring. Allow yourself to dog-ear its pages and scribble notes in the margins. Each of the eight chapters - neon dividers identify them - provides useful information that can be put to immediate, successful use.

    Veen is currently a jet-setting speaker, but was previously a user experience specialist for Lycos, HotBot, and other online hotspots. He has been involved with the W3C's cascading style sheets group and has a background in traditional newspaper media. In short, Veen knows what he is talking about. The reader has the voice of a mentor in the pages of this book; Veen does not use confusing technological jargon to convey his knowledge.

    Stories help to illustrate Veen's examples; readers will find anecdotes and personal insight throughout. For example, chapter one provides a primer that should be required reading for every new HTML coder. The discourse in this chapter ranges from the history of Standard Generalized Markup Language to that of the World Wide Web. Framing all this is a discussion of the marks that editors use, with an interesting explanation of how header and font codes were translated into tags for HTML coding.

    The first chapter also introduces a conceptual model of words-pictures-code that is used throughout the book in various contexts, giving readers a frame of reference in which words are linked to structure, pictures to presentation, and code to behavior. That is, words/structure provide the basic building blocks (text) of the Web site. Using this model, Veen deftly transitions to a discussion of XML. Further, the conceptual model lets Veen show how collaboration among members of a Web team facilitates communication and prevents wasted time in each area of expertise.

    Veen helps the reader find answers. For example, in chapter two there is a discussion of knowing the rules and knowing how to break them. One of the chapter's "breaks" describes how HotBot ignored conventions, and how much consideration went into deciding which rules would be broken.

    This book has many big ideas and golden nuggets that readers can take away with them to use now and to savor in moments when Web design requires some thought. - Beth Archibald Tang End of Review


    Contents Index

    Copyright ©2001, First Monday