First Monday

FM Reviews

James W. Cortada and John A. Woods, eds.
The Knowledge Management Yearbook, 1999-2000.
Boston: Butterworth-Heinemann, 1999.
cloth, 536 p., ISBN 0-750-67122-X, US$74.95.
Butterworth-Heinemann: http://www.bh.com/

James W. Cortada and John A. Woods, eds.
The Knowledge Management Yearbook, 1999-2000.

Remember all those articles, journals, chapters, and books you meant to read about knowledge management? Here is that whole bookshelf in one volume. Its five sections are intended by the editors to provide an annual resource of information. It is the first volume of a new series designed not only to be a reference tool but also to present ideas, techniques, and case studies in knowledge management (KM). Volume Two should be in stores soon, at a slightly higher price of US$89.95.

Volume One contains a thin index, but meaty references are to be found throughout. A "clearinghouse in a book," this tome is a compendium that emphasizes the practical implementation of KM, to complement academic and theoretical material available elsewhere. KM practitioners can use this library of best practices, which is a must for benchmarking. It can also serve as a valuable ready-reference tool in the corporate library.

The five sections range from stage-setting theory to show-stopping practices and terminate with laudable references.

Part One, the nature of knowledge and its management, provides the perspective and background needed to understand what is occurring in an organization, and offers a framework for building the expertise of the enterprise. It opens with an essay on the status of KM in companies today. The classics section provides historical background, with a seminal article on the practice of pulling together bits of knowledge held in a "pool" in order to get the bigger picture and understand what is going on. Next are two articles about know-how and documenting know-how in an organization; they stress an understanding of the nature of tacit and explicit knowledge. Finally, KM culture is explained with a series of four articles that provide an international perspective on KM, an excellent overview of KM by David Snowden (a must-read), and a means to increase an organization's intelligence, or OIQ, to improve the strategic use of knowledge.

Knowledge-based strategies comprise the second part. There are four articles in this section, including a well-written extended summary by the authors of a report on creating a KM-based business. It provides the characteristics and success factors of KM leaders (and pinpoints the KM failures). In another article, readers will find the eight key characteristics that make KM projects succeed. Thought is also given to how an effective KM strategy affects business results. Another author discusses the importance of intellectual capital: the competencies of knowledge workers.

Part Three, KM and organizational learning, is the "how-to" section, with over 20 articles to peruse. These articles explain how to overcome problems with organizational learning by using a reflective exercise. Next, the organization of information for easy access is discussed, which, in turn, facilitates the effective transfer of knowledge. Finally, lessons learned and case studies show how KM was implemented.

The fourth part is all about knowledge tools and techniques. It is divided into two subsections: information technology and measuring KM effectiveness. In essence, the articles in this section stress that KM must take hold as an inherent practice, rather than a dictate from the powers that be, before it can organize the corporate memory (a repository of company knowledge). Intranets are used, in part, to point knowledge workers to directories that store information about skills, experience, and learning resources (as part of a performance support system in some cases). These knowledge (data)bases form the core of the KM system and can be used to extract historical data that contributes to business intelligence.

Part Five, a selection of references, contains about a hundred amusing quotes on the nature of knowledge, knowledge management, learning, and wisdom. The quotes are "meant to stimulate [one's] thinking" in these areas, or possibly to provide a chestnut to set the mood for one's own sig file or oeuvre. Also included in this section are a glossary of terms, suggested periodicals and online resources, as well as a compilation in annotated bibliographic format of twenty of the (other) most important articles from 1996 to 1998 that should be in one's to-read pile and then circulated amongst colleagues. The latter resources are prime sources for the foundation of a knowledge management library or resource center.

This book is well worth its price if one stops to consider the time and energy involved in acquiring the articles or listing the resources and then scanning them for practical information. The bottom line is that purchasers will immediately receive their money's worth from Part Five, with the investment further paying off with the fistful of knowledge from Part Three. The reviewer has one minor quibble, though: the preface claims that all material dates from 1997 or 1998, but the "classic articles" section contains an intentional reprint from 1945.

Some quotes from the yearbook:

"Simple [KM] models, based on coherent, common-sense values in an incremental, evolutionary environment, are the way to success." (p. 53)

"Knowledge production cannot be departmentalized from other functions, as is customary in industrial manufacturing. The person who develops a knowledge service is often the best producer of the service and sometimes even the best at selling it to the customer." (p. 172)

"Create communities of competence. This is one of the key knowledge management skills and the one open to the most interpretation." (p. 230)

"Unregulated information flow is disconcerting for people who think that organizational hierarchy will help them retain power and control." (p. 396)

"All human knowledge takes the form of interpretation." -Walter Benjamin

- Beth Archibald Tang

++++++++++

Kevin Dick
XML: A Manager's Guide
Reading, Mass.: Addison-Wesley, 1999.
paper, 204 p., ISBN 0-201-43335-4, US$29.95
Addison Wesley: http://www.awl.com/

Kevin Dick. XML: A Manager's Guide

When is a book like sex?

Since XML is so exacting and formal, a handholding tome like Dick's XML guide is quite welcome. It's a handsome book, easy on the eyes with plenty of interesting things to look at, and some simple XML to play with. Charts, tables, and bulleted lists just when you need them. In this case, however, the old saying about technical computer manuals' being like sex is true: when it's good, it's very good, and when it's not so good, it's better than nothing. More about that later.

Extensible Markup Language (XML), like HTML, grew out of SGML. As in SGML, designers craft their own tags to define their data. XML tags describe the content, unlike HTML tags, which present and format information. The goal of XML is to be easily available (i.e., platform-independent) to the applications that need it; it can also be understood by both humans and machines, and serve hybrid purposes to boot.
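To make the contrast concrete, here is a minimal snippet of my own devising (the element names are hypothetical, not taken from the book): where HTML tags would say how to display this text, XML tags say what each piece of data is.

    <?xml version="1.0"?>
    <businessCard>
      <name>Jane Doe</name>
      <title>Project Manager</title>
      <phone>+1 555 0100</phone>
    </businessCard>

An application receiving this knows it has a name, a title and a phone number, not merely a heading and two lines of text.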

Managers are a step removed from technical work like programming, so a resource that presents information in a way they can use to speak intelligently with their programmers is quite valuable. Author Dick has structured the guide in such a way that different audiences (vendors and their buyers) readily see how XML will benefit them. In addition, executives, project managers/planners, and designers (along with their respective job titles) from both audiences can identify for themselves the areas upon which they need to focus. Given that busy managers will be "reading" this guide, one-line summaries are provided in the margins and bulleted items, along with the executive summaries.

I have issues with some of these one-line summaries, however. To wit:

- XML tags are similar to HTML tags

is fine, but this:

- We'll start with a simple example

missed the point completely. The paragraph in question drew a fine parallel between the "key technical features" of XML and the simple exchange of business cards. A "simple example," yes, but not the main concept.

There are seven chapters in this book. They do not need to be read sequentially, although it is helpful to begin with chapter one, which provides context for the need for XML: situations where existing Internet technologies do not fulfill current requirements.

I would also recommend chapter two, XML basics, as part of your reading. The author explains the mechanisms behind XML, which helps the reader understand its potential and return on investment. The five-part conceptual model is also a bonus, but you'll miss it if you only read the executive summary!

Chapter five, all about the processes and people, is a must; read the whole thing. You will walk away with a keen understanding of how XML affects processes and staffing. By categorizing an application's needs (content, business, or protocol), you'll know which staff you require and what their roles will be. This kind of understanding is imperative when coordination and planning are key, and disruptions can be mitigated with good planning.

Finally, depending on your business needs, chapter six presents XML applications for businesses and chapter seven does the same for vendors. The more technically oriented will want to investigate the remaining two chapters for related standards and XML tools.

In addition to the seven chapters, there is an index and glossary of XML terms. New terms are in bold blue (not to be confused with the bold blue headings) in the text. Some terms are well defined, some not so well. Here's a sample definition for element content: "The content associated with an element. It appears between the start tag and end tag of the element." Hmm. Still, it's better than nothing.

You will likely want to flirt with another book on your shelf. It is understood that there will be a need for the more technical manual or developer's guide next to this volume. Simple examples are provided, but for more detailed understanding of the intricacies and exactness of the language, also refer to such manuals as the XML Bible or Web sites such as XML.com.

The cover claims the book is elegant and deep. I'll give you "deep." It is extremely hard to discuss coding concepts without reverting to programming jargon. Some complexities are rendered understandable for those who are willing to risk delving into the text and referring to charts and tables.

I am particularly impressed by the explanation of the "roots of the connection problem"; the author uses a familiar situation and expands on it so that the reader, techie or not, can readily grasp the gist of the idea. Dick also uses similar boxy images to convey relationships; expanding on their use, especially at the beginning of chapters, would have helped the guide qualify as "elegant" and given some continuity for the sequential reader or a visual cue for the skimmer. For those who are scanning the book, the improved visuals would have really put it over the top. It would also have been helpful to keep the executive summaries consistent; some seem to plug other chapters, while others just miss the mark.

Buy this book, but keep your little black book within arm's reach, too. - Beth Archibald Tang

++++++++++

Patricia Diamond Fletcher and John Carlo Bertot
World Libraries on the Information Superhighway:
Preparing for the Challenges of the New Millennium

Hershey, Penn.: Idea Group Publishing, 2000.
paper, 313 p., ISBN 1-878-28966-7, US$74.95
Idea Group: http://www.idea-group.com

Patricia Diamond Fletcher and John Carlo Bertot.
World Libraries on the Information Superhighway:
Preparing for the Challenges of the New Millennium

The idea, revived in recent news reports, that books could soon be a thing of the past, usurped by electronic publishing, is an indication of one of the challenges facing libraries in the new millennium. Their role as custodian of books is expanding rapidly in order to embrace the disparate products of the digital age. Consequently, it is necessary to face up to the technical difficulties in capturing and preserving documents published in electronic format in an environment where software and hardware standards are rapidly evolving, as well as the practical difficulties in identifying material which is worthy of preservation.

Another threat to the traditional role of libraries, "that they should take responsibility for collecting, recording, providing access to and preserving their own national imprint" (Cameron & Philips: 6) is the difficulty in dealing with the sheer volume of published material. There was an outcry recently when it was reported that the British Library had been discarding books - some of which were known to be the last extant copy of particular works - due to pressure of space. Although storing information in electronic format avoids the major problem of providing adequate shelf space for physical artefacts, the explosion of electronic publishing has led some national libraries to take a selective approach, arguing that much of the material published on the Internet is not worthy of preservation. If such an approach is taken to material published electronically, it seems logical that books published in traditional format should also be subject to criteria of quality.

These are two of the highly topical issues covered in this collection of papers. The collection begins with an article by co-editor Patricia Diamond Fletcher on "Libraries and the Internet: Policy and Practice in the 21st Century", and ends with an overview by co-editor John Carlo Bertot which draws together the "Issues and Lessons Learned" from the various contributions. The variety of perspectives reflects what Fletcher (1), citing Wedgeworth (1998), calls "Understanding of the cultural similarities and differences which directly affect the service mission of libraries".

Each of the seventeen chapters is a self-contained article, with authors including librarians from Australia, the USA, Canada and Portugal, and academics from several countries including Korea, Malaysia and Estonia. Many of the articles report on specific projects, while others look more generally at trends, policies and strategies for the new millennium.

The most urgent trend seems to be the expansion of the role of the library: "The main business of libraries - organizing and storing information and helping clients to use it - is being challenged" (Haigh: 27). It has been claimed that libraries are no longer necessary, as information can be accessed so readily on the Internet. This collection demonstrates not only that libraries still have a role to play in the wired world, but that their role is a rapidly-developing one. They will help to bridge the "digital divide" by providing access to the Net.

They will also play a key role in describing and cataloguing online resources to facilitate access to relevant, accurate and reliable information for both academic researchers and the general public. As Fountain (88) puts it, "With the exponential growth of the Web, we need to change our focus from simply locating information to locating the most relevant information in an efficient and cost-effective manner". One approach to this is described in Wang et al., in their study of the system in use at Queens Borough Public Library. "Just as an experienced librarian will recommend one book over another to a customer seeking information in the more traditional way, the library's Web site can point the user to a set of information" (150). Liebowitz & Adya (170) describe various methods of automating the process of cataloguing, including the use of expert systems and intelligent agents.

Their role will also encompass the preservation of digitally published information as part of each country's national archive, which will entail new approaches to acquisition and storage as well as to cataloguing (Lee: 67). Moreover, the interface through which patrons access information must be easy enough to use that patrons will not be deterred (Lee: 68; Fountain: 90). In the context of the academic library, Schmidt et al., in their report on the University of Queensland Cybrary, state that "Through the Cybrary, students are able to navigate their way to new sources of information. Use of the journal collection has doubled..." (113). To ensure ease of access to information, the library provides information skills training for the University's students.

As well as providing online materials, libraries are faced with the challenge of providing Internet access for those who have no other access. As more information becomes available online, possibly with a concomitant reduction in availability through traditional means, libraries will play an increasingly important role in ensuring equality of access. Indeed, the electronic library is likely to be the local access point to a range of government information resources and services, and so providing access for those socioeconomic groups who have no other means of access will be an essential part of the library's role in the community. Baigent & Moore (130) quote the (UK) Library and Information Commission as saying that "The networking of public libraries will place them at the forefront of the drive to create an educated, informed and ICT-literate society".

One new role for the networked library is to provide online services ranging from access to databases to the provision of rare texts in digitized form. This provides new opportunities for libraries to share resources, and can also obviate the duplication of effort; patrons will have access to materials from several libraries, so it will no longer be necessary for each one to maintain a similar core collection.

New methods of collaboration and cost-sharing are being developed to enhance the scope for cooperation between institutions. This also entails changes in organizational structures, particularly the development of fluid team-based structures to work on projects. Librarians will need to deploy a broad range of skills, particularly in information technology, which will demand retraining for the current generation, and radical changes to pre-service education together with on-going in-service training to keep pace with the rapid changes.

The envisaged expansion of library services will require adequate funding for technology and infrastructure. While the potential for networked collaboration can reduce the drain on resources, there is an urgent need to develop methods of accessing, navigating and indexing information in multimedia formats. Additionally, in their role as national archive, libraries must find ways to preserve the rapidly-changing digital material as a historical record. That involves finding not only hardware and software that will continue to be available or at least compatible with future systems, but also mark-up languages and document formats that will continue to be supported. The technical magnitude of such a task is compounded by the need to ensure the accuracy and authority of Internet-based resources.

There are also several policy issues to be resolved, such as copyright and ownership (Lee: 70). As Haigh (29) points out, "Libraries are not necessarily prepared to undertake costly digital preservation activities such as conversion and migration if they only, in effect, rent the resource".

In the final chapter, co-editor John Carlo Bertot provides an overview entitled "Libraries on the Information Highway: Issues and Lessons Learned". As he expresses it, "A key theme espoused throughout the chapters in this book is the need for libraries to reassess their roles, services and functions in light of the digital information age" (292). This wide-ranging collection provides a great deal of insight on both the technical aspects and the theoretical implications of the situation, as well as several case studies and research projects that provide indications for possible ways forward. - Peter Beech

References

Library and Information Commission, 1997. New Library: the people's network. London: Library and Information Commission, accessed 2 April 1999, at http://www.lic.gov.uk/publications/policyreports/newlibrary/index.html

Robert Wedgeworth, 1998. "A global perspective on the library and information agenda," American Libraries, volume 29, number 6, pp. 60-68.

++++++++++

Rita Lewis with Bill Fishman
Mac OS In A Nutshell
Sebastopol, Calif.: O'Reilly, 2000.
paper, 376 p., ISBN 1-565-92533-5, US$24.95

Rita Lewis with Bill Fishman.
Mac OS In A Nutshell

I will be the first to admit it: reviewing a book like this is not an easy task. Any book aimed at Power Users is a risky endeavour, as such readers tend to verify each and every statement or fact presented in any type of documentation. Nonetheless, Rita Lewis and co-author Bill Fishman decided to produce a Quick Reference on the Macintosh Operating System (Mac OS) that could be used both as a handy guide and as a lookup resource, with the more experienced users in mind.

As a seasoned Macintosh user myself, the first thing this book made me realise was the number of features the Mac OS includes straight out of the box. Perhaps because I have been upgrading incrementally over the years, I never appreciated how powerful and user-friendly this operating system is. For this reason alone, "Mac OS In A Nutshell" is an eye-opener.

But there is more. The book is divided into 19 chapters and 2 appendices, covering the basic interface elements of the Macintosh, files and disks, utilities, accessories, as well as more specific features such as AppleScript, Sherlock and ColorSync. Networking and Multimedia are also covered in detail, with many pages filled with valuable information, tips and specifics. The book is rounded out by a chapter on the iMac and one on the new features of Mac OS 9, plus appendices on error codes and keyboard shortcuts. Intelligent use has been made of boxed notes, tables and figures, especially when some of the more technical details are listed (the printer drivers in the Extensions folder, or the description of all known Mac viruses, for example).

There are extensive references to third-party utilities and programs which can solve the various problems outlined in the chapters, and the fact that uniform resource locators are provided throughout makes the book really invaluable. Of course, this also makes it prone to becoming outdated sooner, as Web addresses often change, but that is inevitable.

In general I felt that the authors put a great deal of effort into writing the book, which is also reasonably priced at about $24 for 380 pages. However, not everything is above criticism, and I did find a few "glitches" in the text. (OK, I know, I must now look like one of those Power Users who adore finding mistakes in other people's creations!)

First of all, it would have been nice to include a paragraph or two about the authors. Nothing about Rita Lewis and Bill Fishman appears anywhere, which is unusual and makes the book feel a little anonymous. Another issue concerns little mistakes, here and there, throughout the book. On page 59, for example, table 3-1 is described as presenting a short overview of the different Mac operating system versions up through Mac OS 8.5 when in fact it goes right up to Mac OS 9. Figure 12-1 depicts the appearance of various font files, but its caption is somewhat incorrect: it describes a printer font from Bitstream as being a screen font, and it states that a PostScript Type 3 printer font is from Bitstream although that font's icon is clearly from the Monotype foundry (they have a characteristic 'M' icon).

If these are minor mistakes, there are also some omissions, such as leaving Sophos out of the list of professional anti-virus utilities on page 138 (I excused the authors for not knowing about VirusBarrier, released just a few days ago, and for still reporting Eudora as coming in two flavours, Pro and Light, when these have since been consolidated into three modes), and skipping the PowerPC 740 chip in the table on page 318.

Perhaps the only real piece of misinformation appears on page 330, where there seems to be a bit of confusion with respect to Human Interface Guidelines and Application Programming Interfaces. The text says: "Apple has always asked developers to follow its Macintosh Human Interface Guidelines very closely when writing Macintosh applications. Mac OS 9 is an example of why Apple has demanded such strict compliance -- any program that strays from this narrow path will crash because Mac OS 9 has rewritten many of the basic operating system Toolbox managers that should not be accessed directly by applications." In fact, what Apple asks is that developers not use undocumented Toolbox calls or low-level routines, which might be subject to change by the engineers in Cupertino and which might cause non-standard applications to crash or corrupt the System. The Human Interface Guidelines, by contrast, indicate how to create a program's interface in such a way as to conform to certain norms (for example, the placement of buttons in dialog boxes, the order of menu items, keyboard shortcuts, etc.).

But what I found really irritating about the book was that the Mac OS interface depicted throughout the chapters sports all sorts of Appearance and Kaleidoscope themes. For example, on pages 286-7 you might find two screen shots of the Internet Setup Assistant side by side, one with a weird-looking theme and the other in the more traditional Platinum appearance. While the author specifically states that she has customised her Macintosh and that "what you see in this book may not completely match what you see on your Mac", I think that for the sake of consistency and clarity, the use of 14 different appearances in a book like this should be avoided at all costs, if only to make it look more authoritative.

Despite all this, however, Mac OS in a Nutshell does provide a great deal of useful information and is extremely well researched and structured, with a logical sequence of topics. I would recommend the book to anybody who is at least a bit familiar with the Macintosh operating system and, of course, to those Power Users who, like me, tend to think that they know it all. - Paolo G. Cordone

++++++++++

Kieran McCorry
Connecting Microsoft Exchange Server
Boston: Digital Press, 1999.
paper, 416 p., ISBN 1-555-58204-4, $34.95.

Kieran McCorry.
Connecting Microsoft Exchange Server

This book is a fascinating look at how to connect multiple servers and sites, possibly using disparate messaging systems and different protocols. The author uses a combination of real-life situations (he gained his experience during his time at Digital and Compaq) and the theory underlying these situations. It is not designed to be a troubleshooting guide for an administrator, although much of the knowledge imparted would be both useful in understanding the problems illustrated and interesting in terms of appreciating why such problems might occur.

McCorry makes much of the term "integration" and, indeed, as the book progresses, it becomes clear that this is in fact what using Microsoft Exchange in today's high-tech messaging world is all about. As is pointed out in the Preface, "Exchange is one of many messaging technologies, and as long as other messaging systems exist so, too, will the need for messaging integration".

The author's enthusiasm for his subject is infectious, and the reader has the impression throughout that "this guy really knows his stuff". Whenever the subject is beginning to get a little tricky, a practical example and usually either a screenshot or a diagram is provided to facilitate understanding. An appendix containing such useful things as sample scripts is helpfully provided at the end. The real beauty of this work is that it shows exactly what happens from the time the message is sent, to the time it is available for reading at the remote end of its journey.

The book takes the reader from the basics of the Exchange organisations available, through explanations of the connectors and protocols, and concludes with the important topic of synchronisation. There are in-depth sections on the MTA (including how to troubleshoot problems), Routing, and the IMS. References are almost exclusively to version 5.5, though mention is made when an earlier release did not include particular functionality. Scenarios are used in which combinations of sites, connectors and protocols are explored, and the topics of optimisation and possible pitfalls are covered in some depth. It is pitched at a level where a reasonable idea of how messaging works is assumed, and in particular, how Exchange works. It would not be suitable for a total novice.

The chapters on X400 (on which Exchange is primarily based) are quite difficult to follow, and much of the information is aimed at those individuals who really want to know the ins and outs of X400, which, as the author admits, is now very firmly in second place to SMTP in terms of popularity. Strangely then, the chapters on SMTP seem to be less meticulous than those on X400.

But these are minor criticisms of a book which I found difficult to put down. McCorry's mix of humour and frank opinions, together with his own experience of the subject, make this a book which I strongly recommend to anyone who is involved in, or will be involved in, any projects concerned with the integration of Exchange into heterogeneous messaging and directory backbones. - Brian Parker.

++++++++++

Shawn P. Wallace
Programming Web Graphics with Perl and GNU Software
Sebastopol, Calif.: O'Reilly, 1999.
paper, 470 p., ISBN 1-565-92478-9, US$32.95.

Shawn P. Wallace. Programming Web Graphics with Perl and GNU Software

Introduction

I'm a skinflint and I don't want to shell out on Dreamweaver. On the other hand, I want to create groovy dynamic Web sites with lots of graphics without taking out masses of bandwidth. I also enjoy the challenge of hacking code in the raw. This book could have been written for me - a feeling I get with many of O'Reilly's superb publications. (I'm not being sycophantic there - these books are a refreshing antidote to the whitespace doorsteps that some other publishers rip us off with.)

Web Graphics Fundamentals

File Formats

Throughout the book, Shawn Wallace deals with three image file formats: GIF, JPEG and PNG. He kicks off by explaining the differences between these formats and the uses they can be put to. There is a useful file format comparison table which explains the differences concisely but in some detail. Subsections advise on the best format to use for circular images, photographs, 'graphical' text, greyscale images, line drawings and animated images. The sections that describe the innards of the file formats are interesting in an anorakky kind of way - there is perhaps more detail than necessary but some useful insights are gleanable from these sections. (Appendix A contains the code for a simple PNG decoder in Perl.)

Serving Graphics

Chapter 2 addresses the issues involved in serving graphics on the Web and looks closely at the various attributes of the IMG tag. There are some useful Perl scripts here, including one that tries to add WIDTH and HEIGHT attributes to a given HTML page for the various image files used by the page - this would be a useful sub for CGI scripts that create graphics files on the fly. At the end of the chapter there are sections covering odds and ends: support for Lynx users (i.e. good and bad ALT choices); the use of OBJECT to give older browser-users a 'traditional' image rather than some groovy multimedia thang that they can't see; a 'rogue's gallery' of dodgy non-standard tag extensions; and a short Perl script for converting true-colour image values to a 216-colour 'Web-safe' palette.
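For flavour, here is a minimal sketch of that idea. It is not Wallace's script: it leans on the Image::Size module from CPAN, and the details (reading filenames from the command line, printing bare IMG tags) are my own simplification.

    use strict;
    use Image::Size;

    # For each image file named on the command line, emit an IMG tag
    # with WIDTH and HEIGHT filled in from the file's header.
    foreach my $file (@ARGV) {
        my ($w, $h) = imgsize($file);
        next unless defined $w;           # skip anything Image::Size can't parse
        print qq(<IMG SRC="$file" WIDTH="$w" HEIGHT="$h">\n);
    }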

Graphics Libraries

There are several free graphics libraries available for a number of platforms (although developers using Windows might question the truth of the word 'most' in the 'Platform' section of the library descriptions). In Chapter 3, the first library described (on the principle that 'AAA Taxis' will always get to the top of the Taxis section of the Yellow Pages) is AAlib, a library for creating ASCII art that Lynx users can view. (Interestingly, the Gimp has an AA plug-in with an option that will generate HTML for inclusion of ASCII art in the ALT field of the IMG tag - this is described in detail in a later chapter. It is truly a Good Thing that Lynx users are not forgotten in a Web graphics tome.)

An exciting inclusion in Chapter 3 is photopc. This is a collection of drivers and tools that allow Linux users to interface with some digital cameras through the serial port. The author discovered that the command-line tools provided were rather more efficient than the bloated Windows and Mac OS software shipped with his cameras.

Graphics on the fly

GD

Having got through these introductory chapters, we come to the solid meat of the book: creating 'on-the-fly' graphics. Wallace looks at the GD module, a Perl port of the gd C graphics library, which allows CGI scripts to mess about with graphics files before serving them to a browser. This chapter is slightly out of date in that GD no longer works with GIF files due to the infamous Unisys patent. Instead, GD handles PNG and JPEG files. However, most of the details in the chapter are not compromised by this. (But see what I say below about GIFgraph...)
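To give a taste of the idiom, here is a minimal GD sketch of my own (reflecting the current, post-patent module, so it emits PNG; it is not an example from the book):

    use strict;
    use GD;

    # Draw a labelled box and emit it as a PNG from a CGI script.
    my $im    = new GD::Image(150, 60);
    my $white = $im->colorAllocate(255, 255, 255);   # first colour allocated = background
    my $black = $im->colorAllocate(0, 0, 0);

    $im->rectangle(5, 5, 144, 54, $black);
    $im->string(gdSmallFont, 15, 22, 'made on the fly', $black);

    binmode STDOUT;
    print "Content-type: image/png\n\n";             # HTTP header, then the image data
    print $im->png;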

PerlMagick

An alternative approach to dynamic graphics manipulation is to use PerlMagick, an object-oriented Perl interface to the ImageMagick libraries. PerlMagick can also be used offline for the batch conversion and manipulation of images (its 'real strength' according to Wallace). Anything that can be done manually with ImageMagick, can be done by writing Perl scripts using PerlMagick. Wallace includes plenty of cool examples, mostly using a picture of a cat that gets squashed, squeezed, boxed-in, shaded, stretched, swirled, and turned into a sine-wave.
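A minimal sketch in the spirit of those cat examples (my own code and filenames, not Wallace's):

    use strict;
    use Image::Magick;

    my $image = Image::Magick->new;
    my $err = $image->Read('cat.jpg');    # PerlMagick methods return an error string on failure
    die $err if $err;

    $image->Scale(geometry => '50%');     # squash the cat to half size
    $image->Swirl(degrees => 90);         # ...then swirl it
    $image->Write('cat-swirled.jpg');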

GIFgraph

Developers who wish to publish charts and graphs on their sites will be interested in Chapter 6 (Charts and Graphs with GIFgraph). GIFgraph expands on GD by implementing methods for pie charts, bar charts, point graphs, line graphs and area charts. Creating these charts with GIFgraph is easy-peasy-lemon-squeezy: you just create a GIFgraph object, set its attributes (x and y labels, max and min values, etc.), shove data into it, and out comes a GIF that can be printed straight to STDOUT (the flow is sketched below). The chapter on GIFgraph was both the high point and the low point of my experience of this book. The high point because the module was new to me and appeared to fit very nicely with a current project. (The only thing missing from GIFgraph is 3-d graphing, except for pie charts, but then this is meant to be a review of the book, not of the software described in it.) The low point came when, having downloaded GIFgraph, the first thing I saw in the README was this:

This is GIFgraph a package to generate GIF charts, using Lincoln Stein's GD.pm. This package is deprecated. I would recommend that you use GD::Graph directly. I also do not recommend the use of GIF images anymore, since Unisys has decided to be childish about their patent on LZW compression.

'C'est la vie,' I thought, but discovered one or two problems arising from this. With my current system, I found I had two choices: to upgrade my libraries to use GD::Graph directly, or to downgrade some libraries to be able to use GIFgraph (the most recent gd and GD libraries do not support GIF). There was no way I was going to do the latter, so I am unable to use Chapter 6, which now seems obsolete because of the Unisys situation. (Is there a petition somewhere...???)
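Happily, the module lives on as GD::Graph with essentially the same interface, so the object/set/plot flow the chapter teaches still applies. Here is a minimal sketch of that flow (my own example, not one from the book), emitting PNG rather than GIF:

    use strict;
    use GD::Graph::bars;

    my @data = (
        [ 'Mon', 'Tue', 'Wed', 'Thu', 'Fri' ],   # x-axis labels
        [  12,    9,     15,    7,     11   ],   # one data series (y values)
    );

    my $graph = GD::Graph::bars->new(400, 300);
    $graph->set(
        x_label => 'Day',
        y_label => 'Hits',
        title   => 'A week of traffic',
    ) or die $graph->error;

    # GIFgraph printed GIF straight to STDOUT; GD::Graph returns a GD
    # object, so we ask it for PNG instead (the Unisys patent again).
    my $gd = $graph->plot(\@data) or die $graph->error;
    binmode STDOUT;
    print $gd->png;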

Gimp

Chapter 7 is a mini Gimp manual. The main use for Gimp is to edit graphics files offline rather than to deliver dynamic graphics on the Web. This chapter is more of an overview than anything else and was the least satisfying of the chapters. However, that may be because I know the Gimp too well already.

Dynamic Graphics Techniques

The final section of the book, Dynamic Graphics Techniques, is a mixed bag covering image maps, GIF animation, a Web graphics cookbook and a brief introduction to PostScript. The first of these deals with the problems involved in using dynamically generated images as image maps. The example given is perhaps the most involved piece of code in the book. This is a 'wander engine' that allows the user to wander around a landscape. Data about the landscape and the objects in it are stored in a very basic SQL database. A CGI script creates an image of each area of the landscape with a number of objects lying around in it. Each object is clickable, so the script has to create a client-side image map for each landscape image.

GIF Animation

Chapter 9 (Moving Pictures: Programming GIF Animation) describes the use of PerlMagick to create GIF animations. Again, this section includes some very useful example Perl scripts, including a full implementation of a basic animation scripting language called GIFscript. This is the kind of thing that puts O'Reilly books far ahead of many competitors!
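The underlying PerlMagick idiom is compact; here is a minimal sketch of my own (not the book's GIFscript), assuming ready-made frame files:

    use strict;
    use Image::Magick;

    # Read the frames, set the timing, and write a looping multi-frame GIF.
    my $anim = Image::Magick->new;
    my $err = $anim->Read('frame1.gif', 'frame2.gif', 'frame3.gif');
    die $err if $err;

    $anim->Set(delay => 20);              # inter-frame delay, in hundredths of a second
    $anim->Set(loop  => 0);               # 0 = loop forever
    $anim->Write('animation.gif');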

Cookbook

The Cookbook chapter includes six thingies of varying usefulness, ranging from a Web page access counter (I hate them!) to a JavaScript rollover menu (included because they are a 'much requested feature'). The most useful thingy is perhaps the set of thumbnailing scripts that will 'ease the tedium of making thumbnails of large groups of images'. In fact, I am using it for a project as we speak...

PostScript

Printing Web pages can be a very uncertain business - it would be much better if a Web site offered the option of a PostScript file in order to provide accurate output. The PostScript Perl modules allow one to do this. Again, Wallace provides some very useful examples of creating individual pages, documents and graphical elements, covering the TextBlock, Document and Elements modules respectively.
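As a sketch of the TextBlock flow, based on my reading of the module's CPAN documentation rather than on the book's listings (the font, sizes and coordinates are my own choices):

    use strict;
    use PostScript::TextBlock;

    my $tb = PostScript::TextBlock->new;
    $tb->addText(
        text    => "Printing Web pages is an uncertain business.\n",
        font    => 'Times-Roman',
        size    => 12,
        leading => 14,                    # baseline-to-baseline spacing
    );

    # Write(width, height, x, y) returns PostScript code for the block,
    # plus any text that did not fit in the given region.
    my ($code, $remainder) = $tb->Write(572, 752, 20, 772);
    print "%!PS-Adobe-3.0\n", $code, "showpage\n";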

Appendices

The first appendix is that PNG decoder script - I'm still not sure of the value of this but perhaps some time it will be the thing that saves the day! Next comes a Gimp reference - basically a description of the various menu options. Finally, Wallace finishes off with a full procedure reference for the Gimp's API - I anticipate this being a well-thumbed part of my copy of Wallace's book.

Finally...

No book is perfect and there were places where I would have liked more detail and places where less would have sufficed. However, a differently skilled reader would want different emphases. Wallace has covered a fairly large range of topics, packages and techniques, and for each one he has given enough information to have a pretty good stab at a working project. Don't buy this book expecting in-depth reference material. The blurb on the back says that it is aimed at 'intermediate and advanced Web programmers', but I would imagine that an 'advanced' developer (and I am NOT one!) would know a lot of this stuff already. However, if you want to get into dynamic Web graphics programming without paying for fancy bloatware, I honestly cannot think of a better book than this to recommend. - Robert P. Scovell-Lightfoot.

Check this stuff out

You can check out some of the examples in the book on Shawn Wallace's Web site.

++++++++++

Peter Wainwright
Professional Apache
Birmingham: Wrox Press, 1999.
paper, 800 p., ISBN 1-861-00302-1, US$40.00

Peter Wainwright.
Professional Apache

Apache is without doubt a great success story of the open-source movement, which aims at producing solid, feature-rich and reliable software by having enthusiasts around the world work together and contribute to its development. That it is the most popular Web server application is, therefore, no surprise. Peter Wainwright's book starts with an easy introduction to the general notion of Web servers. It states what they do and which platforms they run on, and soon focusses on the reasons why Apache is so popular, as well as outlining its basic hardware requirements. It then has a short but interesting section on networking.

Chapter 2 dives straight into the specifics of how to get and install the software in its basic form by explaining the main configuration. There is enough information and practical advice here to provide any novice with a working Apache installation: the basic knowledge of what a Web server is and how it works, an introduction to the configuration and log files, and the various methods of running Apache.

Chapter 3 covers building Apache from source and discusses whether to load modules statically or dynamically. Chapter 4 gives an in-depth look at configuring Apache individually, in order to adapt it to any particular situation and environment; further chapters cover dynamic content, server-side programming, security, performance, monitoring Apache and, finally, extending Apache.
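To give a flavour of what that basic configuration looks like, here is a minimal httpd.conf sketch of my own (the directives are from the Apache 1.3 era the book covers; the names and paths are placeholders):

    # Placeholders throughout; adjust the server name, paths and port to taste.
    ServerName     www.example.com
    Port           80
    ServerAdmin    webmaster@example.com
    DocumentRoot   /usr/local/apache/htdocs
    ErrorLog       logs/error_log
    TransferLog    logs/access_log
    DirectoryIndex index.html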

There is excellent coverage of the Apache modules and their configuration, which is lacking in some of the other Apache books. This book is suitable for anyone using the server software or developing Web applications to run with Apache, as it starts at a basic level suitable for a novice but quickly builds up to an advanced level.

There is something in this book for everyone. Peter Wainwright shows his in-depth knowledge of Apache and explains it clearly using good examples. The book is filled with sound practical advice and includes some useful appendixes. In comparison with the other "reference" books, it improves on the Definitive Guide by being more complete and on the Server Bible by having more in-depth information. - Richard Gale.



Copyright ©2000, First Monday