First Monday

FM Reviews

For this column, I have decided to group four titles on communicating effectively through the (mainly) written word. Some of First Monday's readers are authors themselves, so the following books can offer specific help in producing polished and convincing written material.

I will start with a new series of books from Oxford University Press, One Step Ahead, which aims to provide its readers with the tools needed to express ideas, present facts, and share information. The official blurb states that "The One Step Ahead series is for all those who want and need to communicate more effectively in a range of real-life situations. Each title provides up-to-date practical guidance, tips, and the language tools to enhance your writing and speaking." The books in the series include guides on editing and revising text, punctuation, spelling, writing reports, organising meetings and writing press releases, as well as producing long dissertations and tackling academic exams.

I have chosen three titles.

Jo Billingham.
Editing and Revising Text.
Oxford: Oxford University Press, 2002.
paper, 136 p., ISBN 0-198-60413-0, £6.99, US$11.95.

Whether you are just starting to write for other people or are a fairly seasoned communicator, you ought to take a look at this short guide, packed with useful techniques and conventions on the subject of editing text. Editors, in this context, can be either people intent on improving their own piece of work or members of a writing team. The two situations entail quite different approaches when assessing the readability or appropriateness of language. In about 120 pages, Jo Billingham covers style guides, structural and linguistic flow, accuracy, brevity, clarity, the circumstances of editing, and the intended readership, as well as providing a comprehensive collection of checklists. Using the latter can be very beneficial for ensuring that all aspects of editing have been looked at and that the final version is both accurate and consistent. For instance, the checklist for style guides includes entries such as "Abbreviations: USA or U.S.A? And or &: where there's a choice, which do you use? Bullet point lists: are you using capitals at the start, or semicolons at the end? Contractions: can you use it's, or must you use it is? Grammatical points: can you use and or but, etc. at the start of sentence? Can you use split infinitives?" and many pages of other, likewise helpful questions.

In general, beginners will find a lot of good advice and will learn about the many facets of editing text in a professional fashion. For the more experienced writer, the book still offers a handy review of those techniques that make all the difference: what to do with too much information, how to make the tone less (or more) formal, and how to take full advantage of modern computer technology. The concluding resources section lists many good publications dedicated to grammar, punctuation, troublesome words, style, technical writing, proof-reading, copyright issues, and more. Notably absent, however, were books on typography and layout design, which are, after all, also an important element of written textual communication.

Although Billingham's guidance is sound for the most part, I stumbled upon the occasional piece of incomplete, contentious, or slightly inappropriate advice or examples: "It is acceptable to start sentences with 'and' or 'but', to split infinitives, or to end a sentence with a preposition if that is the best way to make the sense clear." Of course, purists would frown upon this statement, so I would have added a cautionary note here. In the section dedicated to common Latin phrases that can be used in certain circumstances, I noticed the glaring omission of prima facie and inter alia, which, after all, are fairly widespread in both writing and speech.

But my most annoying encounter took place where political correctness was given priority over grammatical integrity: "The writer needs to spend some time sorting out their thoughts" and "If the writer is unhappy about your suggestion, respect their wishes ... ." While acknowledging that this is becoming a hotly debated matter, I would not expect to find plainly wrong grammar in a book dedicated to good style and effective communication. A thorny issue requires a more in-depth discussion, and the author seemed to take this ugly formula for granted, as if it should become accepted as a legitimate way of avoiding the masculine personal pronoun. Time will tell, but I, and many others, still feel strongly about it.

In any case, apart from this item, I still think that Editing and Revising Text is a very worthwhile guide. With the advent of the World Wide Web and the omnipresence of e-mail, the English language is going through a very taxing period of its development. Much good is coming out of this process, but a great deal of sloppiness is also starting to creep in; being able to produce readable, accurate, and effective writing is thus all the more important. End of Review


Chris Mounsey.
Essays and Dissertations.
Oxford: Oxford University Press, 2002.
paper, 128 p., ISBN 0-198-60505-6, £6.99, US$11.95.

Devoted to a more advanced type of writing, this book covers the planning and successful completion of long pieces of text, such as dissertations or academic essays. The focus here is on making sure that long essays will conform to certain norms and that they will achieve their goal. The advice given is excellent, and Mounsey's engaging approach turns a, let's face it, rather dull and uninspiring topic into something digestible. The 120-odd pages cover a lot of ground, from understanding the questions of an assignment, to finding the right books on a certain topic; from extracting quotes to complement one's own work, to developing a coherent line of argument; from essay structure, to presentation issues; from advanced bibliographical research, to the various footnote/endnote formats. All of this and more is surveyed succinctly, yet with enough detail to be of real benefit to any academic scholar.

A concluding reference section is dedicated to additional areas of essay writing. For example, in The purpose of the essay, the author offers a visual outline of the three vital elements of any essay: Opinion ("This is your contributions. You do not have to say anything new, but must argue for a particular viewpoint"), Evidence ("This is the result of your research, The evidence you present should lead to the reasons why your opinion is to be believed"), Brevity ("An essay is not trying to say everything about a subject. You need to go into depth about just a little bit of the topic under discussion"). Subsequent sections assist with the process of research, spelling and punctuation, presentation issues, and drafting and redrafting, and even contain a handy essay template, with its Introduction/Body/Conclusion structure.

Targeted perhaps at less experienced academic writers, Essays and Dissertations nevertheless offers a great deal of useful information and is a great companion for anyone who needs to share his or her knowledge, especially through a peer-reviewed process. End of Review


Jane Dorner.
Writing for the Internet.
Oxford: Oxford University Press, 2002.
paper, 128 p., ISBN 0-198-66285-8, £6.99, US$11.95.

The last guide from the One Step Ahead series is devoted exclusively to writing for the Internet. Having said that, the book is really mainly about writing for the Web, with some sections dedicated to e-mail and chat etiquette. Like the other two guides reviewed above, this one is also fairly short, yet packed with abundant information, tips, and good advice.

The Web can be said to have brought publishing to the masses, for anyone with a minimum of disk space on an ISP's server can nowadays disseminate whatever he or she wishes, without having to find a traditional publisher to distribute the material. The downside, however, has been a certain erosion of tested methods of verifying content and presentation. For instance, publishing an article in a scientific journal requires going through several stages of peer review, which ensure that the final version will be up to the expectations of a particular readership. Moreover, the way the material is presented is also subject to certain conventions. On the Web, however, freedom of expression can often lead to interesting content presented incoherently and confusingly. So, it is by learning the 'rules' of online communication that authors can lend more authority and weight to their writing.

Jane Dorner manages to condense a great deal of these rules in her guide and provides a guiding hand to novice and expert alike. Beginning with how the language used on the Internet differs from more conventional types of communication, she goes on to discuss Web page design issues and how to evaluate one's audience and the purpose of the material. The book also features sections on planning and on the nitty-gritty of writing and producing a Web site. Here the author examines the numerous aspects linked to the appearance of a page, collaborative writing, drafting, editing, house styles, and language choice. Towards the end of the book, an interesting short chapter looks at the various types of Web site: personal home pages, interactive brochures, and communities. Although nothing here is revolutionary, it is nevertheless good to be reminded of the 'rules' that can help make communication more effective and appropriate. In this respect, Writing for the Internet definitely achieves its goal.

First Monday habitués will have noticed by now that I often leave the negative points of a book for the end of my reviews; that is the case here, too, although the elements that caught my eye were minor in most cases. For example, in the discussion of punctuation, Dorner writes: "Plain text in emailed newsletters is developing a brand of different keypad combinations to draw the attention - *word* or _word_ both indicate italics. This may develop." The phrase "is developing" struck me as odd, for this convention has been in existence for a long time indeed; it is not a recent phenomenon (although it may well seem so to the many adopters of electronic communication from the late nineties onwards).
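Incidentally, the *word* and _word_ markers Dorner describes are simple enough to handle mechanically. As a minimal sketch (the function name and regular expressions are my own, not taken from the book), a few lines of Python can convert them to HTML emphasis tags:

```python
import re

def emphasis_to_html(text):
    """Convert *word* and _word_ plain-text emphasis markers to <em> tags."""
    # A marker pair wraps a run that starts with a non-space character
    # and contains no further markers of the same kind.
    text = re.sub(r"\*([^*\s][^*]*)\*", r"<em>\1</em>", text)
    text = re.sub(r"_([^_\s][^_]*)_", r"<em>\1</em>", text)
    return text

print(emphasis_to_html("This is *very* important, _really_ important."))
# prints: This is <em>very</em> important, <em>really</em> important.
```

The same approach is what many mail readers and newsletter processors of the period applied when rendering plain text on screen.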

Another incorrect statement appears under the heading E-zines: "E-zines arrive as plain ordinary text, so there are no design crutches to easy reading." Even though this was undoubtedly true until the advent of HTML-rich e-mail, nowadays it is more likely that, if anything, an electronic newsletter will be formatted as a Web page, and in order to receive the 'good old' plain-text-only format one has to ask for it specifically.

Finally, in the chapter titled Technical fundamentals, the author illustrates the structure of a URL. The box is headed by the line "Elements of an Internet address - known as a URL (Uniform Resource Locator)"; strictly speaking, however, a URL is a World-Wide Web address, not an Internet address. The term Internet address could be taken to mean an e-mail address or even (and probably more correctly) an IP number, as this would represent the true Internet identifier of an individual computer. Sadly, even in this book I detected a slight carelessness with terminology, which is unfortunately becoming more widespread as the original nomenclature of computer-mediated communication recedes further and further into the past.
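The distinction is easier to see when a URL is decomposed into its constituent elements. A minimal sketch using Python's standard library (the address itself is a made-up example, not one from the book):

```python
from urllib.parse import urlparse

# Break a URL into the elements a style guide would name:
# scheme, host, path, query, and fragment.
url = "http://www.example.com/reviews/books.html?id=42#top"
parts = urlparse(url)

print(parts.scheme)    # "http"
print(parts.netloc)    # "www.example.com" (a host name, not an IP number)
print(parts.path)      # "/reviews/books.html"
print(parts.query)     # "id=42"
print(parts.fragment)  # "top"
```

Only the host portion ever resolves to an IP number; the URL as a whole names a Web resource, which is exactly why "Internet address" is the wrong label for it.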

But I admit that such annoyances are minor in the context of a work as general and compact as this, even if they come from Oxford University Press (with all the expectations associated with this fact). Writing for the Internet won't teach you how to become a successful Web designer, but it will show you the way towards pertinent and effectual communication in the vast human construct we call Cyberspace. End of Review


James G. Paradis and Muriel L. Zimmerman.
The MIT Guide to Science and Engineering Communication.
Second edition.
Cambridge, Mass.: MIT Press, 2002.
paper, 292 p., ISBN 0-262-66127-6, US$29.95.

Remaining in the area of effective communication, MIT Press has released the second edition of its Guide to Science and Engineering Communication. It is a standard work which has proved to be an essential reference for many scientists, engineers, and students of technical subjects. This new, spiral-bound revision takes into account the many changes brought about by widespread computer technology in both academic and professional life.

The scope of the guide is substantial, with many chapters devoted to producing scientific documentation, either as part of a team or as an individual effort. Nothing is left out: from the organisation and drafting process, to the assessment of the audience; from revisions and design issues, to library research and summarising how the various references apply to a particular project. Each chapter begins with a 'sample' introduction which typifies a real-life situation. For example, the section on developing graphics opens as follows: "You've spent weeks drafting and revising a report. You've carefully considered your audience and defined your work aim. You and your colleagues have read and reread each other's work. Your attention to organization and style has made the final version concise and readable. Looking over the finished report, however, you feel something is missing [...] You're still not finished. You need to illustrate your prose better." And it goes on to discuss the benefits of adding graphics, charts, and graphs to a scientific report, book, or article.

The beauty of this MIT guide is that its language is clear and does not come across as prescriptive. Moreover, the numerous figures included complement the points very effectively, as in the case of the chapter dedicated to outlines, or in the comprehensive review of the many types of graphical charts (bar/column, scatter, semilogarithmic, histograms, etc.). Paradis and Zimmerman always focus on the topic at hand and ensure that their commentary does not become too long-winded and dull, as it easily could with a reference of this kind.

The second part of the book introduces journal articles, proposals, progress reports, oral presentations, computer documentation, CVs and job correspondence, as well as memos, letters, and e-mails; in short, most of what a scientist or engineer will encounter in his or her professional life. The authors reveal great competence and, true to the view that science requires accuracy, they point out many of the subtleties of technical writing which make the area so arresting. Just one of many examples: "Numbered citations. In this citation style, each number refers to an item in the final reference list. Numbers are almost always enclosed in square brackets rather than parentheses, to distinguish citations from equations." Fascinating. I have written papers myself; however, as they did not contain equations, I never thought about the danger of my parentheses clashing with possible equations!

It is more than likely that, when writing this guide, the authors applied the criteria they themselves describe in the book, so it is hard to find many faults. However, I did encounter a few passages where the treatment of the subject could have been improved by revising some of the details. The end of chapter 7, Design of Page and Screen, illustrates the salient differences between paper and electronic documentation, yet it keeps referring to the latter as online, although the meaning it conveys is on-screen. While years ago pointing out such a distinction would have been truly pedantic, the emergence of the Internet in general and the Web in particular has rendered the differentiation necessary. Whereas on-screen is understood as information presented on a computer monitor, online is widely accepted as meaning "information delivered over a remote connection", therefore not stored on a local device or cache.

In the same section, the authors look to the future of document standards and rightly recognise that information delivered in HTML format on CD-ROMs is becoming ever more common. However, I was surprised to find no mention of possibly the most popular electronic document delivery specification: PDF. After all, a considerable number of files are nowadays exchanged using Adobe's portable format.

But these are issues that can easily be passed over; the value of the guide is certainly undisputed. As an amusing aside, the only real typo I found was in a sample application letter on page 283. Here, a figure depicts an application sent to the IBM Director of Personnel. Although its content had been carefully worded and all necessary elements included (research activities, relevant academic training, work experience, and so on), one sentence says: "From your company Web page, I see that IBM is actively involve with robotics and automation control systems [...]" Ouch!

The final, short Handbook of Style and Usage is a little gem in itself, as it tackles some particularly important linguistic and stylistic pitfalls; "write effective paragraphs", "break long sentences into shorter ones", "make choppy writing flow", "use parallel subject headings to reveal logical flow", "emphasize the active voice", "write with economy", "avoid excessive nominalizing", to name but a few. All in all, about 30 pages of sound advice on how to improve the structure of language.

So, it might not be a gripping piece of literature or a thrilling volume of detective stories; however, the MIT Guide to Science and Engineering Communication fulfils its goal admirably. Technical writing is as much about disseminating scientific information as it is about being an effective communicator and convincing persuader, and this book will help you on both fronts. — Paolo G. Cordone. End of Review


Michael Thad Allen and Gabrielle Hecht (editors).
Technologies of Power: Essays in Honor of Thomas Parke Hughes and Agatha Chipley Hughes.
Cambridge, Mass.: MIT Press, 2001.
paper, 520 p., ISBN 0-262-51124-X, US$24.95.

This book is subtitled "Essays in Honor of Thomas Parke Hughes and Agatha Chipley Hughes", and celebrates the contributions made by these two scholars over a period of half a century to the study of the history of technology. It makes sense as a tribute to two fascinating people, and it also stands in its own right as a primer for its field.

Essays by 12 different contributors explore the issues behind the history of technology through studies of particular times, places, and technologies. If the book has a weakness, it is that these are based entirely in the Western tradition, with all of its examples coming from the U.S. and Europe. In terms of time, the essays are confined to the nineteenth and, mostly, twentieth centuries. The authors reflect occasionally on their own situation within the field, and there is more than passing reference to the way in which attitudes to technology have shaped and been shaped by political and social circumstances. The contributors have all benefited directly from Hughes' teaching, and so the history of the history of technology is heavily slanted towards an American perspective.

The contributors all recognise the seminal work undertaken by Thomas Hughes towards an understanding that technology develops within particular social and historical circumstances. Logically it follows that the contributors must be reflexive and consider how their own attitudes to their material have been influenced by their own personal social and historical circumstances. The one circumstance that hangs heavily over all the contributors is the Cold War, within which, so far as can be gleaned from their biographies, they all grew up and were educated. American policy was driven for decades by the demands of technological determinism, an assumption that in order to survive America must produce bigger and better, and the one direction in which to travel was obvious, and could no more be influenced by human desire than the course of the moon. As the politics of the Cold War began to be questioned, so some of the underpinnings of people's attitudes to the technology they were producing came under question, and the anti-deterministic strand of technological history began to assume ascendancy.

The contributors do not examine this in too much detail, which is a pity because there is a rich seam of history to be mined here. Instead each concentrates on a period and place of their own specialisation. The editorial aim is to ground the exposition of the theory in detailed study of particular issues, and the result is a fascinating mix, ranging from the introduction of the telephone in the U.S., through refrigeration and nineteenth century opposition to mechanised street transport in American cities, to French nuclear policy, British wartime operational research, and, inevitably, the gas chambers of the Nazi era. Michael Thad Allen's exposition on "Modernity, the Holocaust and Machines without History" in effect functions as the centrepiece of this series of essays, exploring how attitudes towards the mechanics of massacre were socially grounded and politically worked out. Somewhere in the roots of this history is the foundation stone of the rejection of the idea of the inevitable goodness of Western progress and modernity. Allen examines in detail the rhetoric and the reality surrounding the exterminations and exposes how technological and social issues relate to each other and shape each other in many and complex ways.

In equally powerful ways the same themes about society and technology are examined through the material provided by history. Each is a challenging study in its own right and the book as a whole functions as an absorbing tour of the field. One to buy. — Rob Parsons. End of Review


Erin Jansen.
NetLingo: The Internet Dictionary.
Ojai, Calif.: NetLingo, 2002.
paper, 528 p., ISBN 0-970-63967-8, US$19.95, £14.00, €22.95.


Finding a decent online dictionary is a fairly easy task these days. In fact, so many of them have sprung up in recent years that it would be difficult even to maintain an up-to-date list. But despite this abundance, I was intrigued when I found out about NetLingo. Here, I thought, was something of an interesting "odd one out": whereas a number of such resources have, at some stage in their history, migrated from paper onto the Web, with the rest having existed only in electronic format from day one, NetLingo's authors (compilers would probably be a more apt word), Erin Jansen and Vincent James, have decided to go the other way: they have produced a printed version of what started back in 1995 as "NetLingo", a very useful online reference dedicated to Internet terminology. Thus, it could be said that the book is the companion to a Web site, rather than the other way around. Apparently the request for a real book came from the many NetLingo visitors who realised that consulting a paper-based dictionary can still be handy, especially in situations where no computer is nearby or no direct connection to the Internet is available.

In order to give some raison d'être to her work, Erin Jansen introduces NetLingo by saying: "A variety of computing and technical terms are included in NetLingo, but only as they pertain to the Net. Indeed, one of the main reasons NetLingo exists is because most of the dictionaries I have seen in this genre are too difficult to understand. They're usually full of arcane computer-related definitions that mean very little to a person who is new to computers or who just wants to learn about Internet-related stuff." NetLingo's main aim, then, is to provide a handy reference for people who might be quite new to this medium and who also want to have some fun while learning. Reading through it, I became aware of a distinct conversational style, which sets the book apart from many other references I have seen (for example, the Oxford University Dictionary of the Internet, reviewed here back in the November 2001 column). Thanks to its friendliness and colloquial character, one could actually end up reading NetLingo as if it were a normal book: from cover to cover! And on a more sensory level, I found the unusual paper format somehow inviting: it vaguely reminded me of a Michelin guide or a restaurant directory ... by just looking at it, I simply wanted to pick it up and read it.

But what about the content? Reviewing a dictionary about the Internet is always a thorny affair, especially if it is printed: with new terms being coined nearly on a weekly basis, looking for completeness means knowing a priori that it will be a fruitless task. Nevertheless, NetLingo definitely features a lot of entries and at 500+ pages, this little tome can come in handy in most situations where specific terms or concepts related to the Internet need to be looked up.

Allow me to digress a little at this point: the unfortunate aspect of computer-mediated communication (CMC) is that, although it is now very widespread, many of its conventions are not appreciated by a huge number of recent adopters. For example, just a few days ago I followed a discussion on a private university conferencing system in which a lecturer claimed that "putting the same information in multiple places is called spamming." As many know, however, the exact definition of spam concerns mass-mailing or cross-posting, rather than sending one e-mail about something to one forum and a similar message to another forum where the topic can also be considered relevant. So, my immediate reaction was the realisation that there is a definite perception of "terminology fluidity": anybody can assign to CMC terms whatever meaning he or she wishes, according to how it best fits the situation. And it is here that good education in CMC should begin: by becoming proficient with the terminology already established. The same, of course, applies to the countless acronyms which everybody takes for granted but which remain obscure to the majority of users: ISDN, ISP, HTML, FTP, and so on.

Going back to NetLingo, how good is it at educating us? The book is divided into separate sections: the dictionary itself (over 400 pages), lists of acronyms and expressions, emoticons, straight-on smilies, country codes, and file extensions, plus categorised indices of cyberslang, organisations & initiatives, companies, people, and more.

While commending the effort put into most of these lists, I must confess that I have always found directories of so-called "useful expressions" fairly superfluous, indeed even detrimental to our language. The phrases they present are used mostly in chat rooms, and there is no rational reason why anybody would want to learn that SWMBO signifies "she who must be obeyed" or that TLK2UL8R is "talk to you later" (the latter, incidentally, probably takes less time to type than its cryptic "shortened" version).

Fortunately, abbreviations of this type only take up about 25 pages, which does not add much bulk to the book. The dictionary proper is as exhaustive as it could be. Apart from numerous typical entries such as easter egg or cookies, I found some truly esoteric and probably quite localised terms: barbie bird, for example.

The definitions are typically very accurate and show that a great deal of research took place during preparation of the book (and online version). I was particularly pleased to discover that NetLingo explicitly identifies the difference between Internet and internet (with a lower-case 'i'), a not insignificant distinction which is being unjustly overlooked more and more, even by those who should know better (CMC lecturer!). Moreover, Jansen and James must be among the only people around who still capitalise the word 'web', as in Web page; after all, we are not talking about a spider web here, and it is nice to give the World-Wide Web (note the capital 'W'!) some special status. Overall, then, the dictionary is a true linguistic treasure: it will provide avid readers with the right expression, every time, and at any level.

Having said that, I did find the odd annoyance even with this dictionary. Although no reference can be perfect or totally comprehensive, care should be given when deciding what to put in and what to leave out. For example, while many key players of the industry have been identified (Cerf, Metcalfe, Torvalds, Wozniak, and Moore, to name but a few), I could immediately think of at least one major individual who was not listed: Jon Postel. The lesson is: there is always an inherent danger in compiling authoritative lists, as something will always be forgotten.

More of a personal disappointment was the fact that even NetLingo (like other dictionaries I have consulted) fails to include an entry for mailto:. Think about it: this code element appears on probably nearly every single Web site in cyberspace, yet you will not find its meaning explained anywhere, which is a curious thing.

Although rare, there was the odd definition which was either not fully explained or inaccurate. For example, when discussing alpha and beta releases, no mention is made of the fact that in beta the feature set is frozen, while in alpha it can still be expanded; in the entry on anonymous FTP, the fact that no account is necessary to access data should have been complemented by the important detail that usually the username is 'anonymous' while the password is the user's e-mail address. On AOL it reports that "eventually, the company who owns it, America Online, grew big enough to provide Internet access, acquire a number of Internet sites (including Netscape), and merge with Time Warner." Well, I worked for Netscape Communications, and it was not only a Web site but an actual company!

Finally, although the dictionary is strictly in alphabetical order, I did come across at least one instance where a word was not where I expected it: digerati appeared between digitizer and direct connection. The mystery was solved when I read the first line of the entry: "a.k.a. digiterati".

But I am being deliberately pernickety now, and if you were to judge NetLingo only by my 'negative' comments, you would not do justice to this excellent work. Picking on some very isolated cases was, in fact, necessary to demonstrate that quality and accuracy have been given a very high priority here. In truth, I would recommend NetLingo to anybody who wishes to become more fluent in IT-speak while at the same time enjoying a good read. There is so much to learn once you get started; for once, turn off the computer and become an Internet guru the easy way! — Paolo G. Cordone. End of Review


Tim Kennedy and Mary Slowinski.
SMIL: Adding Multimedia to the Web.
Indianapolis: Sams Publishing, 2001.
paper, 408 p., ISBN 0-672-32167-X, US$39.99/£28.99.

As an instructor and developer of SMIL Level 1 eClass courses, I have a long-term interest in authoring presentations using the Synchronized Multimedia Integration Language (or SMIL for short). I was therefore delighted to receive the recent publication from Sams Publishing, SMIL: Adding Multimedia to the Web.

Since the endorsement of the SMIL 2.0 standard by the W3C in August 2001, SMIL has, at last, become the rich multimedia authoring language for delivering synchronised and integrated multimedia to end users' desktops. The new standard is gaining more and more industry converts every day.

As an indication of the increased complexity of this format, the original SMIL 1.0 specification was 40 pages long (not including the Document Type Definition), whereas version 2.0 has been expanded to a whopping 440+ pages. If you are like me, you will probably be scratching your head asking yourself "where do I begin?". The answer is simple: by reading this book.

The three main parts offer a complete introduction to the art of using SMIL: "Understanding SMIL - The Basics", "Using SMIL - The Specification Modules", and "Using SMIL - The Projects". Part I, prefaced by an introduction, is a gentle walk divided into two distinct paths, "What is SMIL" and "SMIL authoring". Here the book touches on the benefits of a thorough understanding of the language: generally, you will be able to offer a higher level of usability and accessibility, as well as an improved user experience that can further enhance a company's unique brand identity.

The guts of the book are contained in Part II, a rollercoaster ride packed with thrills and spills that will make you gasp for breath. It is divided into 12 sections covering the various SMIL modules: structure, layout, timing and synchronisation, media objects, transition effects, animation, linking, content control, time manipulations, and Real Networks' RealPix and RealText.
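To give a flavour of how those modules fit together, here is a sketch of a minimal SMIL 2.0 document (my own illustration, not an example from the book; the media file names are placeholders), with Python's standard library used simply to confirm it is well formed:

```python
import xml.etree.ElementTree as ET

# A minimal SMIL 2.0 document: the structure module supplies the
# <smil>/<head>/<body> skeleton, the layout module defines a region,
# and the timing module's <par> plays an image and an audio clip in
# parallel. The file names are placeholders, not real media.
SMIL_DOC = """\
<smil xmlns="http://www.w3.org/2001/SMIL20/Language">
  <head>
    <layout>
      <root-layout width="320" height="240"/>
      <region id="main" width="320" height="240"/>
    </layout>
  </head>
  <body>
    <par dur="10s">
      <img src="slide1.jpg" region="main"/>
      <audio src="narration.mp3"/>
    </par>
  </body>
</smil>
"""

root = ET.fromstring(SMIL_DOC)  # raises ParseError if not well formed
print(root.tag)
```

A real presentation would, of course, reference genuine media and be handed to a SMIL-capable player such as RealPlayer rather than merely parsed as XML.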

Whereas SMIL 1.0 was simplicity itself, SMIL 2.0 is more complex and, by the same token, a lot more powerful. The SMIL 2.0 modules offer more elements and attributes than were previously available, but the specification does not end there: SMIL 2.0 adds 'Profiles', which further enhance the optimised playback of a multimedia presentation.

Beginners may feel a little overwhelmed by the advanced information contained in this section, but the authors manage to remain supportive, while keeping the reader focused throughout. In the chapter "Getting Technical: The SMIL Specification", there is an excellent walk-through of some 'browser' compatibility issues that might be encountered when authoring SMIL presentations.

In Part III there are two comprehensive projects which give a better overview of authoring SMIL presentations. The two projects are based on Real Networks' RealPlayer and Microsoft's Internet Explorer functioning as player applications for delivery of the SMIL presentation. It is here that Kennedy and Slowinski offer detailed instructions and examples, bridging the gap between creative inspiration and technical know-how and enabling you to progress quickly up an otherwise very steep learning curve. Each of the projects is aimed at the intermediate developer and shows the underlying code of each effect or action in detail, as it relates to the presentation being built. The code examples are available on the book's companion Web site.

The book is rounded off with an appendix in Part IV covering online resources and a breakdown of the SMIL 2.0 modules, listing each module's elements along with their associated attributes: a very handy reference that will prove very useful when the process of authoring with SMIL begins in earnest. Overall, more than just a reference tool, this is a superbly written book which will help you build up a solid skillset with each passing chapter. Highly recommended. — Glenn Dalgarno. End of Review


Warren G. Kruse II and Jay G. Heiser.
Computer Forensics: Incident Response Essentials.
Boston: Addison-Wesley, 2001.
paper, 416 p., ISBN 0-201-70719-5, US$39.99.

Warren G. Kruse II and Jay G. Heiser. Computer Forensics: Incident Response Essentials.

If the number of security companies advertising their services on the Internet is an even moderately accurate measure, forensic computing might be the new growth industry. Needless to say, this is a highly specialised field and there is little published material on the subject for the interested novice. This volume by Kruse and Heiser is aimed at just such interested novices as well as seasoned professionals in IT and law enforcement ... although not necessarily in that order.

From the outset, this book establishes the authors' key to computer incident investigations: be thoroughly methodical in your approach, and document everything. This will probably be wholly familiar to any judiciary in the readership but will likely be anathema to a large number of techies! In fact, this skill swap runs throughout the book: the technical reader gets an idea of the methodology and sheer depth of paperwork required, and the law enforcement reader gets a crash course in cracking and playing with scripts.

The book is split almost equally between hosts running Windows NT/2000 and Unix/Linux. From a technical standpoint, however, it really is a bit enigmatic. The authors explain how to spoof Sendmail, hide files using NT streams, crack Windows passwords, and other such puerile pursuits. They then trot out explanations of some straightforward topics with sub-titles like "Email Signatures Provide Clues to Online Identities" ... you don't say!

On the software front, we are introduced to tools ranging from Windows file viewers such as Quick View Plus to hardcore forensics tools such as EnCase. Appendices also point the reader to online resources for creating a bootable Linux CD-ROM for forensic purposes. No CD-ROM is bundled with the book, but there are numerous URLs footnoted throughout.

The authors also give credit to Fred Cohen's ForensiX, a package which has subsequently been withdrawn due to fears of prosecution under the DMCA, an issue which seems to have added an additional twist to the whole subject of computer forensics. Curiously, while providing fairly explicit instruction on how to access information on a "suspect" computer, the reader is given no caution regarding individual privacy or data protection legislation. It is always worthwhile reminding ourselves that power without responsibility is a dangerous thing.

Computer Forensics: Incident Response Essentials runs to almost 400 pages but still seems curiously thin. I would have expected a book on this subject to have had decidedly more heft! Anyone skipping expectantly to the appendix entitled "How to become a Unix Guru" will be sadly disappointed, finding only the authors' advice to run Linux and - oh, yes - read a lot of other books! On the other hand, while the legal aspects of the book, such as evidentiary rules, are interesting, they might have limited worth to readers outside law enforcement and/or the United States. To be honest, I'm not qualified to comment, so I won't.

On the whole, Computer Forensics is useful for the methodology it presents. For me, however, it's one of those books that should be either 50 percent thinner or 200 percent thicker. — Rory Beaton. End of Review


Graham Meikle.
Future Active: Media Activism and the Internet.
Annandale, New South Wales: Pluto Australia, 2002.
paper, 280 p., ISBN 1-864-03148-4, $34.95.
Pluto Australia:

Graham Meikle. Future Active: Media Activism and the Internet.

Future Active is a timely and novel contribution to the debates about Internet activism. In this fast-changing field there have been few books that document activists' online political interventions so extensively. It is written in a smooth and engaging style, using accessible language without simplifying the arguments.

The key premise of the book is that there are two competing visions of Internet practice - version 1.0 which is 'open' and version 2.0 which is 'closed'. Version 1.0 is based on a willingness to share files, data, ideas, music, etc. for free, and a desire to engage in "conversational interactivity" (p. 31) as a means to participate, collaborate and create anew. This "intercreativity" is presented as the foundation of the Internet that is now threatened by version 2.0 - the corporate model of control, ownership, and profit generation.

Meikle divides the book into three core themes, examining the nature of interactivity online, the strategies of alternative media, and the practices of tactical media. Throughout these he documents and critiques the variety of tactics and strategies developed and applied by media activists on the Internet. In doing so, Meikle identifies several challenges for activists' continued use of the Internet, especially in relation to productively recasting forms of activism on- and off-line.

The book employs a wide range of international examples, drawn from the U.S., Europe and Australasia, which concentrate on radical political activism (such as McSpotlight, the Electronic Disturbance Theater, rtmark, Indymedia and B92), though mention is made of formal political parties in Chapter 2. To a certain extent the cases he charts have been covered by other authors, and at times there is a lack of reference to this earlier relevant work. However, the storylines of the book are threaded through discussions of several activists, who are introduced in their respective chapters. Using the activists to form the plot, and thus allowing their voices to be clearly heard, not only adds weight to the assertions made but also personalises what can at times seem like abstract technological events. Furthermore, this approach has resulted in a comprehensive analysis of the core themes in this field of debate.

Future Active is not a heavily academic book. While this aids its readability, some might be disappointed by the lack of a theoretical context. Meikle proposes that the Internet should be an open, non-commercialised space and amply illustrates the inventiveness of a variety of political activists to these ends. However, the book is unlikely to convince someone committed to capitalist ideals, and this does not appear to be its aim. Thus much is assumed of the audience's sympathy with the activists' projects.

Meikle focuses on the pioneering aspects of online media activism rather than on how particular groups use the Internet. While this focus on the cutting edge of innovation makes inspiring reading, it leaves open the debate as to whether only an elite few are able to employ the technology in such a way, or whether the Internet is being employed by a variety of activists during their everyday political activities.

In this context it would have been interesting to have had slightly more examination of whether online acts actually had impacts beyond generating mainstream media hype. Although 'effects' are always hard to measure, more in-depth examination of each case study (beyond simply the electronic element) might have strengthened the debate. This drawback is illustrated most clearly in Chapter 6, where the debate about hacktivism seems to focus more on whether it should be called hacktivism (relating to ideas of the power of language) than whether the online attacks so far practised were a good idea, or on the range of alternatives beyond virtual sit-ins. Furthermore, a lack of focus on the context of technology use meant that there is little examination of the difficulties activists faced in their use of the Internet. Although surveillance issues are given some space in Chapter 3 and retaliation from targets is explored in Chapter 6, access issues only merit a cursory mention (on p. 82).

Future Active is not only of interest to those specifically concerned with political activism on the Internet. It raises questions about the functioning of the Internet more broadly in terms of its future as a space for interaction free from commercial interference, and of the value and problematics of political activists' use of media and technology. In all, the book is a contemporary snapshot of radical Internet activism. It is inspiring and provocative, and as Meikle himself concedes, it raises many questions that have yet to be answered. — Jenny Pickerill, Curtin University of Technology, Western Australia. End of Review


Gian-Paolo D. Musumeci and Mike Loukides.
System Performance Tuning.
Second Edition.
Sebastopol, Calif.: O'Reilly, 2002.
paper, 334 p., ISBN 0-596-00284-X, US$39.95, UK£28.50, €45.60.

Gian-Paolo D. Musumeci and Mike Loukides. System Performance Tuning.

I'd like to start by saying what this book is not, nor claims to be. It is not the holy grail of performance tuning. Tuning implies careful listening, and listening is a subjective experience: compare the listening habits of a typical teenage boy with those of a homemaker, male or female, for example. I don't know many early-to-middle-aged fans of satanic rockers Slipknot. Nor is it a reference book - at least not until you've read it all. Performance tuning requires an appreciation of everything that is occurring on your system, and with that in mind ... .

System Performance Tuning covers mainly Sun Solaris (up to version 8) and some variants of Linux; however, the principles described are applicable to most other flavours of Unix. Every aspect of tuning is covered, from the low-level aspects of the hardware through to application algorithms and tuning methodologies. Whilst the coverage isn't exhaustive, it does provide, in an entertaining read, a basis from which to start. The only major omission, I feel, is the lack of coverage of larger systems: Storage Area Networks are mentioned but not treated in any depth, and I couldn't find any mention of Network Attached Storage anywhere.

Performance tuning is often considered an art because there never seems to be an obvious rule or law that will provide a provably best solution. The authors, however, have set out to provide a guide to the principles and tools involved in performance tuning that turns tuning back into a science: there are always rules involved, they are just not necessarily obvious or simple. And this is where the book succeeds. It provides the reader with enough information to build a tuned system.

The authors start by looking at how to approach performance tuning and the tools available to do this. Most of the tools are the standard ones included with the operating system, and each one is explained clearly, including its limitations - a relief to a reviewer coming from an organisation with very strict change controls on its servers. The authors also duly trot out the Heisenberg principle (the act of observing a value alters the value), something that should be borne in mind before top is run.

Once the approach to tuning is covered, the authors turn their attention to benchmarking and testing, an area all too often not treated with the care and attention it deserves. Here Musumeci and Loukides give the subject a thorough going-over, explaining what commercial benchmarks are available and also how to build your own. They also introduce the concept of the Cargo Cult administrator - a magician with no rhyme or reason for what they do - which is important to remember in the last chapter.
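In the "build your own" spirit the authors describe, even a few lines of Python will do for a first benchmark. This sketch is my own illustration, not code from the book, and the workload is a stand-in; it times a representative operation repeatedly and keeps the best run, the one least disturbed by other activity on the system:

```python
import timeit

def workload():
    # Stand-in for the operation you actually care about measuring.
    return sum(i * i for i in range(1000))

# Repeat the timing several times and keep the fastest run; the
# slower runs mostly reflect interference from everything else
# happening on the box at the time.
runs = timeit.repeat(workload, number=100, repeat=5)
best = min(runs)
print(f"best of {len(runs)} runs: {best:.6f} s for 100 calls")
```

Taking the minimum rather than the mean is a deliberate choice here: it is the measurement closest to the undisturbed cost of the operation.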

Most of the book is then taken up with a more in-depth look at the individual components that can bottleneck a system: processors, memory, disks and disk arrays, and finally networks. Each subject is covered in an easy-to-understand fashion, with real-world configurations discussed. Particularly good is the section on networks: networking is increasingly important, and the now greater choice of solutions, with their respective problems and pitfalls, is covered very well here.

Code tuning receives its own chapter. Initially it almost seems out of place, but in real-life situations I have seen much greater gains from application tuning and rewriting than I have ever seen from platform tuning. It rounds the book off nicely, although it is definitely only a primer on code tuning, which is a very large subject. However, I would like to have seen a section on how to convince your developers that their code isn't well tuned! This is a personal gripe on the systems that I am responsible for.

The final chapter has a few instant recipes for tuning under specific circumstances which, whilst often useful, come with an important warning: do not become the Cargo Cult administrator. Set shmsys:shminfo_shmmax=4294967295 is not the solution to all your Solaris 2.6 problems. It's all about applied knowledge.
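For context, the recipe quoted above is a one-line kernel tunable in Solaris's /etc/system file, along these lines (the value is the reviewer's example; applying it blindly is precisely the cargo-cult behaviour the authors warn against):

```
* Fragment of /etc/system on a Solaris box: raise the maximum size
* of a shared memory segment. Whether this actually helps depends
* entirely on the workload; measure before and after changing it.
set shmsys:shminfo_shmmax=4294967295
```

A reboot is required for /etc/system changes to take effect, which is one more reason to understand a tunable before setting it.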

I would recommend this book without hesitation to administrators starting out on their UNIX odyssey. For those who are new to the joys and traumas of tuning, it will give a thorough grounding in the subject, and it will also give someone new to the administration game a good introduction to most of the technology used today. It is less well suited, however, to someone with an existing system that isn't performing as well as expected: there will never be a substitute for building the system correctly in the first place. The book will still help, but not as much as it would have done had it been read before the system was built.

To get the most out of this book, you must question everything. Tuning is different for every situation, and if you apply the knowledge gained you will at least be on the right road. Overall, the book is well written and researched. Just remember to read it before you build your system. It's also useful for hitting the developers around the head with when they turn out badly tuned code! — Mark Dorman. End of Review


Marty Poniatowski.
HP-UX Virtual Partitions.
Upper Saddle River, N.J.: Prentice Hall, 2002.
paper, 1040 p., ISBN 0-130-35212-8, US$49.99.
Prentice Hall:

Marty Poniatowski. HP-UX Virtual Partitions.

If you speak to most UNIX vendors these days, the latest great thing in their sales pitch is consolidation. Having spent the last few years selling us blade servers for serving Web pages, midrange machines for back-end databases, and smaller machines in between to run application layers, now we are meant to throw all that away and invest in what are effectively mainframes with UNIX on board.

These come in many flavours: IBM sell the p690 "Regatta", Sun has the F12K and F15K Sun Fire servers, Fujitsu Siemens are in on the act with their Primepower range, and HP has the Superdome. The one thing all these have in common is that once you have bought one, you will want to partition it to look like blade servers, application servers and database servers. This is done logically by some vendors, like IBM, and physically by others, such as Sun. The HP offering is the vPar, or virtual partition.

This book is ostensibly a manual for a system administrator who has to set up a server using vPars, but it goes further and has some useful information hidden away where you least expect to find it. In the past I have, along with other colleagues, been very critical of books from this author: his books are always very thick, but when you read them the content is interspersed with UNIX manual pages. This has often led to the criticism that he is simply using man output to pad out a book otherwise thin on content. He sets out his reasons for including the man pages in this book, although I do find his reason - sysadmins reading the book at home and wanting to look up a command - a little odd. Most sysadmins I know don't read technical books at home, and those that do often have a UNIX box of some description at home that will have the man pages loaded anyway.

On a more positive note, the man pages are not too intrusive in this book, and the author gives a lot of useful examples of how to get a vPar up and running, and how to manage it once it is up.

As I said earlier, however, there are some gems of information in this book that will be of great use even if you are not implementing vPars; the sections on Ignite-UX and on the use of Veritas Volume Manager instead of HP's own LVM are most useful, although each would merit a book in its own right. His chapter on the use of SAM is very good, but the part I found most useful was the section on building a new kernel, where he goes into some detail on the tunable parameters for the HP-UX kernel. This is information that is usually quite difficult to find in the manuals, and most of the things you could ever think of changing are in here. There are other sections that touch on tuning and performance, which give a broad overview of how to approach problem solving and tuning for HP systems.

Overall, the book does cover HP-UX virtual partitions, but there is much more than just that in it. It will become a useful resource for me for other things as well, and I would say this is probably the only book from this author that I would suggest HP-UX administrators go out and buy. — Peter Scott. End of Review


Keng Siau and Terry Halpin.
Unified Modelling Language: Systems Analysis, Design and Development Issues.
Hershey, Pa.: Idea Group Publishing, 2001.
paper, 288 p., ISBN 1-930-70805-X, US$74.95.
Idea Group:

Keng Siau and Terry Halpin. Unified Modelling Language: Systems Analysis, Design and Development Issues.

This publication consists of 15 papers from different authors that discuss aspects of the Unified Modelling Language (UML). These are presented under the titles of application, evaluation, extension, and formalisation of UML. The book assumes an intermediate to expert knowledge of the UML standard.

The preface to the book outlines the history and basic structure of UML, and then follows with a breakdown of the contents of the 15 chapters. As always with books of this nature, differences are observable in the quality and depth of writing across chapters. The papers presented are well sorted and cover interesting topics, and the arguments are generally detailed and well presented. A word of warning, however, for the casual reader: you may be put off by the nature of the book, as a strong working interest in the development and use of UML is necessary. To the UML converted: this book holds a good set of examples of advanced use of, and identified problems with, the standard.

The four sections of the book have the following contents:

The first section of the book holds three chapters that discuss the application of UML in the areas of World Wide Web application design, the conversion of static UML models to object oriented code, and the movement towards object-oriented (OO) database design. Frameworks are defined for the generation of program code from UML, and the movement from entity-relationship databases to OO ones.

The second section holds three chapters that deal with the evaluation of the language - specifically in the areas of the development of a process model for UML, its support for early reuse decisions and evaluation for the development of high quality software models.

The third section covers four papers with proposed extensions to UML. Extension models are proposed for mobile agents and distributed systems development, and examples for UML improvement are taken from the object role modelling and object constraint language techniques.

The fourth and final section of the book covers the formalisation of UML. The first two papers propose development of formal methods techniques for UML, using Timed Communication Object Z (TCOZ) in the former case, and meta-models (the Maude algebraic language) in the latter. The final paper discusses the abilities of UML in interaction modelling.

It seems an irony that while modularity and efficient design are regarded as a cornerstone of modern software design, Unified Modelling Language and most of the tools for it have proven so cumbersome and hard to extend or refine. This observation runs across the book, with comments in several chapters mentioning both Rational Rose (the current leading UML tool) and UML itself in terms of problems found and solutions proposed. This openness is refreshing in a technology book - the limitations of the language are exposed and explored rather than glossed over, and where no solution can be suggested, problems are outlined and left open at the end of a particular paper.

The UML standard describes a large set of processes and has a diverse set of possible uses. The book mirrors this, with papers relating its use in areas as disparate as distributed systems, formal methods, Web design, and database design, to name but four. This book will prove a useful snapshot of ongoing work in different areas for those experienced developers who work with UML on a frequent basis. However, those interested in learning this language, or investigating its evolution as a standard will want to look elsewhere. — Andrew Butterly. End of Review


Copyright ©2002, First Monday