First Monday

FM Reviews

Rick E. Bruner, Bob Heyman and Leland Harden.
Net Results.2: Best Practices for Web Marketing.
Indianapolis, Ind.: New Riders, 2001.
paper, 343 p., ISBN 0-735-71024-4, US$35.00.
New Riders: http://www.newriders.com

Rick E. Bruner, Bob Heyman and Leland Harden. Net Results.2: Best Practices for Web Marketing.

With this book you will discover many tried and tested Web marketing and promotional techniques. It is aimed at a broad audience, from the corporate business planner with access to thousands of marketing dollars to Internet startups working out of garages.

Net Results is split into two main sections: Section One - Get the Site Right: Web Fundamentals - and Section Two - Audience Development. There is an appendix, too, which is an invaluable listing of Internet resources covering Internet guides, histories, glossaries, usage, searching, listings, directories, Internet bodies and groups, marketing resources, news, service companies and research groups.

Section One includes four chapters: Chapter 1 - Return On Investment Goals, Chapter 2 - Web Value Propositions, Chapter 3 - Design Optimisation, and Chapter 4 - Using Domain Names to Build Your Brand. Section Two is spread across five chapters, starting with Chapter 5 - Find and Be Found: Strategies for Search Engines and Directories, followed by Chapter 6 - Word of Web: E-mail, Permission, and Viral Marketing, Chapter 7 - Building Online Audiences: Affinity Sites and Affiliate Programs, Chapter 8 - Media Savoir-faire: Public Relations for a Digital Age, and lastly Chapter 9 - Paid Media: Making Dollars and Sense of Web Advertising.

The authors prove that you can create an effective and successful marketing strategy by following sound business fundamentals (which are covered extensively throughout the book). Bruner, Harden and Heyman are all pioneers of online marketing strategy with over 20 years of combined experience and, believe me, they have the knowledge! The authors share this expertise with you, giving you the confidence, whether you are a marketing professional or a budding Web site designer or developer, to try out many of the campaign examples that are scattered, like confetti, throughout the book.

Using case studies, Net Results conveys this information clearly and concisely, with the words just flowing off the page. If you are looking for ways to deploy an Internet campaign to increase your profitability and brand identity online, this is the book for you - without a doubt. The authors show that success requires hard work, focus and discipline. By digesting Net Results, you will get there faster and will certainly make fewer mistakes than some of the online businesses that have disappeared over the last eighteen months. This book is a welcome and powerful tool to keep you ahead of the pack.

The authors do not pull their punches. In the course of the book, they comment on the inability of many of the dot.gone companies to learn from previous errors, combined with their arrogance towards their bread-and-butter customers, who were consistently ignored. Buy this book now - that's my advice. - Glenn Dalgarno. End of Review

++++++++++

Corbin Collins.
Trellix Web: Web Site Creation Kit.
Upper Saddle River, N.J.: Prentice Hall, 2001.
paper, 299 p., with CD-ROM, ISBN 0-130-41206-6, US$24.99.
Prentice Hall: http://www.phptr.com

Corbin Collins. Trellix Web: Web Site Creation Kit.

Trellix Web is an ideal package for anyone who uses the Microsoft Windows environment. It will allow users to create, enhance, publish or modify their own high-quality Web sites without having to learn "raw HTML" or CGI, or even know how to FTP their site to a remote server. If you don't want to learn the intricacies of authoring documents for online access using HTML, or to suffer through numerous, and inevitably expensive, teach-yourself-in-a-week books, then Trellix Web could be just what you are looking for. It will enable you to create a professionally designed Web site in minutes, without those major headaches that many budding developers get during the design and development of Web sites.

The novice will find that Trellix Web is incredibly easy to install, prompting the user at every stage for any user-entered preferences and variables that are required for the program to install properly. When the program is first launched, a dialog box will pop up asking you if you would like to have the Web Wizard appear in every session. Users choosing this option are guided through question-and-answer sessions; it's that easy. Through the Web Wizard there are eight templates to choose from: Business, Personal, Photo Album, Organisation, Academic, Fan Site, Hobby How To's and Vacation Travelling. From here the Web Wizard asks a further series of questions, specific to the template chosen in the previous step. Before long your first Web site is ready! It can now be previewed locally or sent to a remote server.

The developers of Trellix Web have researched their product well before its release. By adopting the "step-by-step" approach they provide instructions to guide users through the whole process. To give added confidence, case studies are included and these are very sleek in their design, style and layout.

The Trellix Web Wizard and the Guided Tour are two excellent examples of the thought that the publishers and developers have put into Trellix Web. Users are not limited to the Web Wizard for all of their creative work. They can also add elements that they have created through other modules that are included, as well as from other applications at their disposal. Another feature of the Kit is the "One-Step Publish" feature which allows a site to be uploaded with a click of the mouse. "Web Gems" enable the addition of dynamic elements such as music, animation, forms and image maps. Online e-trading can also be included with a minimum of fuss.

Overall, Trellix Web is an attractive and inexpensive toolkit for developing a Web site from scratch. - Glenn Dalgarno. End of Review

++++++++++

Richard Coyne.
Technoromanticism.
Cambridge, Mass.: MIT Press, 2001, c.1999.
paper, 408 p., ISBN 0-262-53191-7, US$21.95.
MIT Press: http://mitpress.mit.edu/

Richard Coyne. Technoromanticism.

Firstly I should make clear that this is not a technology book. Rather it is a philosophical narrative investigating the romantic myths that surround the digital revolution.

The book is written under the aegis of the Leonardo series, a collaboration between MIT Press and the Leonardo/International Society for the Arts, Sciences and Technology. They aim to publish important texts that discuss and document the promise and problems of the emerging culture. I should also make clear that this is not an easy read. Partly this is because philosophy uses its own lexicon - I needed a dictionary of philosophy to make sure I understood some of the references - and partly because it raises and deals with some issues that we assume to be sacrosanct. It can be uncomfortable to have a mirror held up to some of our "givens".

As the preface sets out, addressing technoromanticism in a critical light not only lessens its hold but also reveals valuable insights into the computer and digital age. Coyne uses narrative as his vehicle to weave a cyclical discussion that returns to the core themes of unity and multiplicity. During our journey we look at how IT narratives attempt to transcend the material realm. We examine the empiricist tradition of realism and its critics and how contemporary narratives of fractured identities challenge technoromanticism. Finally there is a discussion of technoromantic narratives - total immersion environments, digital communities, the world of the cyborg - with an analysis of what they reveal.

I was fascinated by Coyne's ability to bring in analogies for our digital age and parallels in earlier movements. He examines the Arts and Crafts socialists of the 19th century and compares them to our current perceptions of the IT world, before starkly showing why they are not analogous. He treats Oedipal themes in works such as 2001, Blade Runner and Star Wars and provides an enthralling Freudian perspective on games such as Myst and Riven.

Overall this book is very enjoyable but challenging. It covers so much ground and pulls in so many references that it is, at times, dizzying. I'm not part of the book's target market, so it isn't surprising that I found the writing style almost impenetrable in places; it certainly isn't something I could comfortably read over a pint. The issues it raises and deals with are complex and the sheer intellect with which Coyne handles them is breathtaking, but be warned - this is a difficult read. - Nigel Gibson BSc (Hons)(Open). End of Review

++++++++++

Simson Garfinkel.
Database Nation: The Death of Privacy in the 21st Century.
Sebastopol, Calif.: O'Reilly, 2001.
paper, 336 p., ISBN 0-596-00105-3, US$16.95.
O'Reilly: http://www.oreilly.com

Simson Garfinkel. Database Nation: The Death of Privacy in the 21st Century.

Do you know who knows what about you? How much control do you have over your personal information? How much control do other people have over it? Whether we like it or not, our individual and group privacy is being eroded, and the speed of this erosion is increasing with the assistance of technology. For example, in the final chapter of this book, Simson Garfinkel gives examples of how technology is more often designed to invade - rather than protect - privacy.

Database Nation shows how much data can be, and is being, accumulated about us, including details about our life and health, spending patterns and movements. As Simson Garfinkel shows, everyone who collects data can give a seemingly valid reason. These reasons include law enforcement and anti-terrorism, medical treatment and research, and saving money and increasing efficiency for government and private organisations. However, he also shows how little control and regulation there is over the sale and exchange of data, and how each little piece may be - and frequently is - combined with other pieces to form an overall picture. That there can be problems with this created picture is clear. Data may be inaccurate; problems with data verification (or the lack of it) are amongst the points made in this book. Personal information may be misused, and if there are inaccuracies, they can have a major effect on an individual. Garfinkel includes case studies, and I am sure we could all supplement these with cases we have read or heard about. While there may not be one overwhelming Orwellian threat to our privacy, there is a multiplicity of little threats that we should be aware of.

Different chapters in the book cover a wide range of issues, including the overall collection and storage of data; government record-keeping and regulation; identification methods including biometrics, their fallibilities, and assumptions made; how information identifies what you did; surveys and surveillance/monitoring and how stereotyping influences who is watched and when; medical records, genetic testing and prediction; collection and analysis of purchasing patterns, including the use of affinity cards; ownership of personal information, credit scoring and the services credit agencies such as Experian offer to businesses; anti-terrorism and law enforcement monitoring; and, the use of AI-based information modelling and collection. There are also examples of how we can better control our personal information. The final chapter describes a privacy agenda for the U.S., although much would be applicable elsewhere.

Although the book is targeted at an American audience, the arguments are applicable elsewhere. One of the areas discussed is the use of CCTV in public places. Is Great Britain the CCTV capital of the world? If you are a G.B. resident then, after reading what the book has to say, go out and see where the closest cameras are to you.

This is a very readable book; the examples and anecdotes, including the use of historical material, are cogent. Although it is clear that the author has strong views, the book is not simplistic. Garfinkel provides conflicting viewpoints which point to the complexity of the issues involved. There is an annotated bibliography, a list of Web sites and notes for each chapter for those who would like further details.

I do have some minor nit-picks. Some of the examples of how we can control our personal information are rather sketchily outlined, but as different countries have different contacts this is arguably better than giving a detailed method applicable only to one country. I would also have liked to see Lauren Weinstein's Privacy Forum/Privacy Digest (http://www.vortex.com/privacy/) and Peter Neumann's RISKS Digest (archives available at http://catless.ncl.ac.uk/Risks/) in more than just the chapter notes. There is a Web site associated with the book as well; see http://www.databasenation.com.

Do go and buy a copy. Pay by cash - after you've read the book you'll be glad you did. - Stella Page. End of Review

++++++++++

Gerald Grant.
Managing Telecommunications and Networking Technologies in the 21st Century: Issues and Trends.
Hershey, Pa.: Idea Group Publishing, 2001.
paper, 300 p., ISBN 1-878-28996-9, US$74.95.
Idea Group: http://www.idea-group.com/

Gerald Grant. Managing Telecommunications and Networking Technologies in the 21st Century.

Computer networking is starting to escape Western hegemony, as more and more locations across the world avail themselves of the opportunities provided by this rapidly evolving field. Perhaps the most pleasing aspect of this book is the diversity and global reach of the papers published. A wide variety of topics is tackled by examining findings in places such as the United States, United Kingdom, China, Hong Kong and Sub-Saharan Africa. This international scope is refreshing and a timely reminder of the global nature of the telecommunications community as a whole.

The book is divided into three distinct sections: developments in the telecoms field, international experiences in the development of the relevant infrastructure and the organizational challenges and impacts faced by the industry. The publication is well laid out in the customary style for books of this ilk.

The editor, Gerald Grant, readily admits that it "is impossible in one book to address all the issues and trends relating to telecommunications and networking". This much is true but the book still makes an admirable attempt at tackling the subject matter. The issues addressed are diverse and wide-ranging. This book hopes to make a small contribution to understanding some of the technologies, issues and challenges in telecommunications and networking in the early part of the coming century.

The merits of each paper are beyond the scope of this review, but one paper which stood out for me was by Carlson et al., entitled "Organisational Impacts of New Communication Technology". In this paper, the authors compare the implementation of cellular phones in the U.S. and France. It springs to mind simply because it combines the technical and political challenges faced in the field. Not all the submitted papers are of a purely technical nature; some touch more on the social and political ramifications of the field.

The telecommunications and networking field is experiencing tremendous growth in technologies and services. This is at once exciting and challenging: exciting because the new developments open the potential for many new advanced products and services, and challenging because of the complex nature of the technologies themselves and the environments in which they are being deployed.

This book purports to present "some insights into these issues and developments". To this end it is successful but it does so in a very tried and tested manner, a manner which may satisfy the knowledgeable reader but not the novice or casual reader, who might not possess an immediate appreciation of the topics involved. - Declan J. Graham. End of Review

++++++++++

Nolan Hester.
Macromedia Dreamweaver UltraDev 4: Training From The Source.
Berkeley, Calif.: Peachpit, 2001.
paper, 291 p., with CD-ROM, ISBN 0-201-72144-9, US$44.99.
Peachpit Press: http://www.peachpit.com

Nolan Hester. Macromedia Dreamweaver UltraDev 4: Training From The Source.

This book is an essential manual for developing rich and dynamic Web sites. You can recognise right away the influence of both Peachpit Press and Macromedia's developers on this very visual and clearly written guide. It is a credit to both organisations that they combined their resources to inspire the reader to try out Macromedia's powerful Web site authoring tool - Dreamweaver UltraDev 4. Add to this the writing style of Nolan Hester and you have a perfect learning tool.

In the course of this book, the reader will learn how to:

  1. Create Web sites that pull information from many different databases;
  2. Include dynamic data to produce custom and on-the-fly pages;
  3. Build pages that users can interact with, modify, or even update through their browser application window;
  4. Develop effective, precise and powerful record searches using SQL variables;
  5. Pass along user-entered information from page to page using server-side objects and modules;
  6. Set password access and provide page-level security with server behaviours; and
  7. Use the Server Behaviour Builder to further extend Dreamweaver UltraDev's capabilities and increase team productivity.

The format follows a series of step-by-step lessons guiding the reader through the process of developing a database-driven and dynamic Web site. There are eleven lessons in total, starting with setting up connections between multiple Web documents and databases and finishing with developing server behaviours to extend Dreamweaver's boundaries. The first lesson details the basics, such as the minimum requirements, and customising the Dreamweaver UltraDev work area, palettes and user interface. Lesson 2 assists in defining the Web site objectives and the transfer of locally authored content to the remote server where the site will be hosted. Lesson 3 brings in databases and connectivity issues as well as adding recordsets to documents. Lesson 4 helps the reader to understand how to include dynamic data and Lesson 5 details multiple recordset options. Lessons 6 and 7 instruct in the building of insert pages and updating Web site content using server behaviours. Lesson 8 shows how to develop searches with SQL variables and Lesson 9 continues with the display of server objects. Lesson 10 covers security issues and the setting of password access to data and documents. The last lesson encourages the readers to author their own server behaviours, such as writing cookies and storing user supplied data. The accompanying CD-ROM provides all the code and databases necessary to follow the lessons.
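To give a flavour of what Lesson 8 is driving at, here is a minimal sketch - not taken from the book, which works through UltraDev's own server behaviours - of a record search using an SQL variable, written in Python against an in-memory SQLite database. The "products" table and its columns are hypothetical; the point is simply that the user-supplied search term is bound to the query as a variable rather than pasted into the SQL string.

    # A sketch of a parameterised record search; illustrative only,
    # using a hypothetical "products" table.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE products (name TEXT, price REAL)")
    conn.executemany("INSERT INTO products VALUES (?, ?)",
                     [("widget", 9.99), ("gadget", 14.50)])

    search_term = "widg"  # imagine this arriving from a form field
    rows = conn.execute(
        "SELECT name, price FROM products WHERE name LIKE ?",
        (search_term + "%",),  # the search term bound as an SQL variable
    ).fetchall()
    print(rows)  # [('widget', 9.99)]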

The one downside is that, in order to follow the step-by-step lessons, readers will need access to a server with a supported server environment installed.

Unlike many other authoring languages, where you do not need access to a special server to run your code, this is a necessity with server-parsed hypertext documents. If you are running Microsoft Windows 95, 98, ME, NT4 or 2000, simply install the Microsoft Personal Web Server which, although limited in its capability, will provide you with the necessary server environment to parse the authored documents to the user's browser window.

Overall this book is an excellent companion to Macromedia's Dreamweaver UltraDev 4. Although it is aimed mainly at the intermediate user, or at the beginner who has just started experimenting with UltraDev, rather than as a guide to creating dynamic Web sites with any other authoring application, it is still highly commendable. - Glenn Dalgarno. End of Review

++++++++++

Darrel Ince.
Dictionary of the Internet.
Oxford: Oxford University Press, 2001.
paper, 340 p., with CD-ROM, ISBN 0-192-80124-4, £16.99.
Oxford University Press: http://www.oup.co.uk

Darrel Ince. Dictionary of the Internet.

With its newly released first edition of the Dictionary of the Internet, Oxford University Press has finally dedicated an entire volume (with accompanying CD-ROM) to this ever-expanding area of the English language.

Granted, there are already countless (free) online references and glossaries available, many even of excellent quality. However, what sets this 340-page dictionary apart from the rest is that it concentrates specifically on terms relating closely to the Internet, rather than dealing also with the computing industry in general; here you won't find words such as "CPU" or "floppy disk". Instead, the entries are generally of two kinds: those that describe technologies ("Hypertext Transfer Protocol"), techniques ("greek"), acronyms ("URL"), and those explaining the jargon used when online ("IMHO").

Overall, the 3,600-plus entries cover most of what you and I will be likely to encounter when surfing the Web, when reading/writing about information and communication technology, or when taking part in online chats. Darrel Ince is Professor of Computing at the Open University and his explanations are concise, accurate, and well cross-referenced. When going through some of the pages, it becomes clear how rich this linguistic corpus has become in only a few years, although it must also be recognised that a vast number of acronyms and specific terms have come to us all the way from the early days of computing, more than 40 years ago. The volume comprises three appendices: one lists country codes (such as .ie for Ireland or .nz for New Zealand), the second sports a few emoticons, while the third provides a compendium of the most used jargon abbreviations.

On the negative side, I immediately found a few omissions which I think should have been included: the identifiers "mailto:" and "news:", so common as hypertext links on many Web pages, for example. However, this is not a major fault, as it is likely that the only way to keep this reference work updated, including the many new terms that will invariably become part of the Internet parlance in the near future, is by adding a companion Web site. Fortunately, this is exactly what the author has done. The link, http://www.askoxford.com/worldofwords/internet, provides a way to complement the information contained in the book (and on the CD-ROM, too).

With respect to the CD itself, the first thing I noticed was that it seemed to have been produced with only Windows users in mind. The CD is named "010611_1609", which is not exactly a very descriptive title. Moreover, once opened, it reveals one folder and one file: "INTERNET.HTM". On a Macintosh, double-clicking it gives an error, as it has not been associated with any Mac application (specifically a browser), so a Mac-savvy person will know that the file can be dragged into an open browser window to get to the main page. Other, less technical users, however, might wonder whether there is anything wrong with the disc. Under Windows no problems occur.

Once the main page is displayed, searching is easy through a little search field which matches the typed characters, highlighting whatever comes close. Selecting the desired link will display its meaning in a separate frame. Oddly, there seemed to be no obvious way to return to the initial page, as no link was provided; repeated use of the "back" button was the only way. This is something that should be fixed for the next release, as providing "back home" links is one of the first rules of Web usability.

Finally, a word on linguistic consistency and accuracy. Of all the fields of the English language, the one involved with the Internet and the World Wide Web is probably the most anarchic of all. New coinages appear literally on a weekly basis, and nobody appears to have any real say in the appropriateness of new words or in their usefulness, especially when it comes to jargon. In general, it is nice to see a renowned publishing house, such as OUP, taking up the challenge of bringing some order to this exciting linguistic heritage. And it is exactly in this respect that consistency has to be a prime concern. In numerous online publications, books and technical literature, the words "Internet" and "internet" or "Web" and "web" are used interchangeably, as if they meant the same thing. It is here that an authoritative dictionary should show the way. Alas, a few of these slips have managed to remain undetected even here: the back of the dictionary states that it "covers terms associated with the Web itself" but goes on to mention in the same paragraph that it provides "definitions for web jargon ... ." Still on the back, the reference to the included CD says: "Free CD-ROM to download to your PC". Well, downloading a CD to a computer is not something I have ever come across, so perhaps a more accurate phrase could have been used there.

In any case, these quirks aside, I was impressed by the effort; the Dictionary of the Internet is a great reference guide for anybody who wants to fully understand the Internet jargon and who wants to become fluent in the language of our modern digital age. - Paolo G. Cordone. End of Review

++++++++++

Olaf Kirch and Terry Dawson.
Linux Network Administrator's Guide.
Second edition.
Sebastopol, Calif.: O'Reilly, 2000.
paper, 503 p., ISBN 1-565-92400-2, US$30.95.
O'Reilly: http://www.oreilly.com

Olaf Kirch and Terry Dawson. Linux Network Administrator's Guide.

This second edition of the Linux Network Administrator's Guide (commonly abbreviated - perhaps irreverently - to "the NAG") sees Terry Dawson taking over as author and maintainer of what is, by its very subject matter, both worthwhile and an ongoing work-in-progress.

Perhaps one of the problems with a guide like this is simply whether the content can live up to the reader's interpretation of the title. I was initially disappointed with the NAG because it seemed to present a rather narrow definition of "network administration". This is probably because there is such a crossover between the worlds of network and system administration in an operating system that has networking at its core.

The book's treatment of hardware is limited, although it does provide some useful information on serial and parallel port IP should you need to implement them. The topics of network diagnostics and troubleshooting are barely touched on, and most of the everyday network-related utilities - ping, traceroute and such like - are not really described to the degree that a beginner might appreciate or expect.

Perhaps more pertinently, there is no discussion of what many users have come to regard as the mainstay of Internet services, namely the HTTP and FTP daemons. This is surprising, as these services are generally installed by default with most of the popular Linux distributions. We are left to assume that the authors either regarded these subjects as being sufficiently covered in other documents or were perhaps dissuaded by the variety of packages available. That said, there can be little argument against taking Apache and WU-FTP as examples of the art and providing a basic introduction.

In light of this omission, some readers may be surprised at the number of pages dedicated to protocols that are less widely demanded these days. Any administrator who has tried to drop support for a particular service knows how difficult this can be, even if it is only regularly used by a handful of users within a large organisation. To be honest, I find it difficult to imagine that the majority of this book's readers are clamouring to grapple with UUCP ... but, again, it is there if you need it.

However, these issues need to be seen in perspective. It is difficult to be critical of a guide that is written out of respect for an operating system and its community of users; one trivial reason being that the critic might end up on the receiving end of an invitation to fulfill any perceived shortcomings! Being released under the GNU Free Documentation License, the NAG is open to comment and correction from all quarters and this is perhaps its greatest strength. When you pick up the printed version of the NAG, there's reassurance in knowing that it has, to this extent at least, proven to be above reproach (see http://www.oreilly.com/catalog/linag2/errata/index.html).

Like its predecessor, the Second Edition begins with a well-written introduction to TCP/IP networking, supported with an explanation of how this is implemented in Linux. The structure of the book follows a logical process of implementing and configuring the most common network services as well as explaining some related security and maintenance issues.

Within the 150-odd new pages, Terry Dawson introduces Samba, the open source solution for connectivity with Microsoft's SMB file and printer sharing protocol, and provides us with a simple introduction to Internet firewall implementation using each of the technologies that have been incorporated into the Linux kernel - ipfwadm, ipchains and netfilter. The author also briefly discusses IP masquerading and Network Address Translation (NAT); a definite boon for anyone wishing to allow local network access to the Internet via a single, perhaps temporary, connection. These are probably the sections that will be of most interest to domestic users or SOHO admins, and they demonstrate that the NAG is certainly moving with the Linux user base.

An extensive update to the Guide's mail coverage sees the Sendmail chapters thoroughly revised and the chapter on smail supplanted by one on the more powerful Exim, written by project maintainer Philip Hazel. The Usenet section has also been upgraded with a comprehensive treatment of INN, a news server supported by the Internet Software Consortium that can, quite reasonably, be regarded as the industry standard. Other improvements include sections covering Novell IPX compatibility and IP accounting, as well as a general updating and correcting of the original material.

The spiralling popularity of home networks and broadband Internet makes this one volume that is sure to have an ever-increasing readership. Again, one has to admire O'Reilly for publishing a title that can be freely downloaded from the Internet. Most of the added value comes in the editing, indexing and presentation, which are first rate. The remainder comes from the fact that, unlike the online version, it can be read as easily at the console, on the train or in the bath! - Rory Beaton. End of Review

++++++++++

Pippa Norris.
Digital Divide: Civic Engagement, Information Poverty, and the Internet Worldwide.
Cambridge: Cambridge University Press, 2001.
cloth, 285 p., ISBN 0-521-80751-4, US$59.95.
Cambridge University Press: http://www.cup.cam.ac.uk/

Much has been written about the digital divide, from the excellent reports issued by the U.S. Department of Commerce's National Telecommunications and Information Administration (NTIA) - collected at the "Falling Through the Net" site - to those on other parts of the world (see, for example, "Bridging the Digital Divide" on the situation in central and eastern Europe). However, no one source has pulled all of the scattered information together in a concise way until now, with Pippa Norris' excellent and thoughtful book Digital Divide.

In three parts, Norris first describes the digital divide in the United States and in the rest of the world. As she notes, "Like gambling at Rick's bar - some popular accounts are shocked - shocked - to discover social inequalities on the Internet." Unlike those shocking popular accounts, Norris backs up her declarations with plenty of data in this section. Nothing like an array of tables and graphs to truly induce shock treatment! The second part of the book then examines the political uses of the Internet, including digital democracy in its various flavors and interpretations. Norris treats civic society well in this section, especially with her examination of online media, both traditional - newspapers - and "flash" - those online events that burst into flame on a very specific issue and then disappear. In the last part of the book, Norris brings everything together in treating the whole idea of a "democratic divide". Will the Internet lead to more civic engagement, or less? Even for Norris, it is difficult to peer into the future, since the Internet is so distinct from previous "revolutionary" technologies of the past century.

Digital Divide is exciting, thought-provoking, and engaging. I only hope it has some impact among policy makers, politicians, corporations, and foundations as they collectively try to address the many issues in this book. I have already used examples from this book in classes, so I expect it will make its way into the curriculum to influence future generations. If you read only one book on this topic, make it this one. - Edward J. Valauskas. End of Review

++++++++++

David H.M. Spector.
Building Linux Clusters.
Sebastopol, Calif.: O'Reilly, 2000.
paper, 332 p., with CD-ROM, ISBN 1-565-92625-0.
O'Reilly: http://www.oreilly.com

David H.M. Spector. Building Linux Clusters.

In recent years, clustering has become a buzzword in many IT departments. Servers are clustered together with high-availability products as a move towards continuous availability of resources. Large organisations purchase powerful "supercomputers", clusters of processors acting in parallel in order to carry out large numbers of calculations for "number crunching". Until recently, however, this technology has not been available to users in smaller organisations or to the home user.

This book provides an introduction and the software to take a number of relatively inexpensive machines and link them together to build a quite powerful parallel computing cluster. The software is scalable to allow really large clusters to be built, and is easy to configure.

The author starts from first principles and covers some of the history of clustering before going on to some of the important considerations, such as parallelism and networking. One criticism at this point is that my copy of the book included some errors that should have been corrected by the editor or the author; illustrations located in the wrong place certainly detract from the content. The author goes into some detail about the physical considerations of building a cluster, not least of which are where to locate the hardware and what sort of hardware to get. The suggestions include racking, suitable flooring and cooling, since clusters of servers produce a lot of heat.

The next chapters cover installing the software and setting up the cluster. I initially thought the section on installing Linux was in the wrong book, but when I tried to get a cluster up and running, I realised it was there for a reason, and re-reading it solved all the problems I was having. The configuration of the nodes in the cluster is described well and would help a relative novice in UNIX or Linux to understand some useful networking concepts. The description of cluster management tools is clear and instructive and the author goes to the trouble of describing the cluster management database in some detail.

The second part of the book is useful for anyone interested in developing applications for a parallel cluster. It covers some of the tools that can be used for developing and debugging parallel applications. This is followed by a chapter that is intended to cover programming in a parallel environment, but which only scratches the surface of the topic. The author uses examples from his experience as a developer of Multi User Dungeons and gives examples of compilers that can be used in a parallel environment. This section was unfortunately a little light on content, but I applaud the author for trying to condense a huge topic into a single chapter. Building Linux Clusters provides three example programs for use within a cluster, and I had quite a lot of fun playing with these applications. More examples such as these might have been better than the previous chapter, as I found them more useful than the explanation of parallel programming.

The final section is a selection of appendices listing some very useful resources, relevant APIs, installation details and the Cluster Administration Database schema.

Overall the book is an excellent introduction to the use of Beowulf to create a parallel processing cluster. I would have liked to see more mention of high-availability clustering for Linux, especially in the light of the move to port HP's MC/ServiceGuard to Linux; with the involvement of IBM in Linux, I would not be surprised to see a Linux port of HACMP in the near future. - Peter Scott. End of Review

++++++++++

Julie M. Still (editor).
Creating Web-Accessible Databases: Case Studies for Libraries, Museums, and Other Nonprofits.
Medford, N.J.: Information Today, 2001.
cloth, 200 p., ISBN 1-573-87104-4, US$39.50.
Information Today: http://www.infotoday.com/catalog/books.htm

Anyone who takes online databases for granted in this technological age will find this book an eye opener. Julie M. Still has gathered together a varied set of case studies and related articles on the labour-intensive art and science of online database creation and delivery. Herself a librarian, Still includes seven chapters from libraries (six in the U.S., one in the U.K.), one chapter on e-texts, three chapters from more or less commercial organisations (to some extent belying the 'non-profits' of the title), and one chapter on a database of academic research projects. The book is designed to answer a need for printed resources on how to turn a local database into a Web-accessible one.

Still's interests as an historian are also in evidence: clearly, all archives are by definition historical, but in addition to this, history as a subject of academic study is covered in a fascinating account by Melissa Doak (Chapter 4) of the Women and Social Movements in the United States, 1820-1940 Web site at the State University of New York at Binghamton. This site began in 1997 as a means of publishing students' history research projects, and has now evolved into a large and popular collection of primary and secondary sources on a wide range of topics in American women's history.

Contributors to the book range from people working on behalf of organisations to individuals plugging away at something they believe in. The most heartwarming story comes from Brian-John Riggs (Chapter 7), a secondhand book collector. The closure of his local 'book barn' following its move to an online database led Riggs himself to join the same e-network of secondhand and antiquarian booksellers. He recounts the steep learning curves, the doubts and the excitement he experienced as his lifelong dream of running his own 'bookstore' became a reality. Within 24 hours of uploading his catalogue to the database he had received his first enquiry. This is the human face of the Internet at its most engaging.

Contributors relate the uphill process they went through in order to make their resources accessible to the Internet community, sometimes needing to digitise traditional media such as local card indexes or typewritten lists. We read of small beginnings leading to growing popularity and recognition from end users, and it soon becomes clear that there is no end to database creation and maintenance, as many users expect to see something new each time they visit a site.

Problems and setbacks are related as well as successes: this is a strength of the book, as readers are given an opportunity to learn from these pioneers' experiences. For example, Vicky H. Speck (Chapter 6) warns that the process will take longer and cost more than originally envisaged, and that the number you first thought of should be doubled when estimating storage space and processing power. Jeff Strandberg (Chapter 8) warns the reader not to go online just for the sake of it, before their site is ready: a half-baked 'under construction' site will not entice many users to revisit later.

Elizabeth Roderick (Chapter 5) stresses the importance of tracking information, so that database managers can gauge users' interests and respond accordingly. Security is an issue too, including the watermarking of images so that piracy can be detected. On the plus side is the fact that increased access to resources can stimulate ideas for research that people might not otherwise have had, and ready availability without the need to travel speeds up the research process. Also, original documents, once digitised, will be saved from the wear and tear of continuous handling.

Anne T. Keenan (Chapter 9) writes from the point of view of the librarian offering Internet access to the general public in a small local library, and it is interesting to learn that e-mailing, e-shopping, and quick fact location are the three main priorities. Many users are not technologically sophisticated, and need the help of a librarian to get the most out of their searching.

Mary Mark Ockerbloom and John Mark Ockerbloom (Chapter 3) stress the collaborative potential of their two e-text database projects: thousands of people have had an input, whether as volunteers who input text, or simply as reporters of broken links. Amazingly, these two projects have so far operated without funding, thanks to their collaborative nature, which goes to prove what a powerful medium the Internet is for the sharing of knowledge, giving people the opportunity to play a part in the information world and enrich others' lives.

Now to some minor quibbles. One writer mentions a library containing 'census materials and old phone directories dating back to the early 19th century' (p. 28): the latter items would be quite a feat, even in the technologically advanced U.S. And there is a sense sometimes that a purely American readership is being catered to, for example in references to 'the antebellum decades' (p. 56) and 'the antebellum period' (p. 65), which leaves the non-American reader puzzled as to which war is 'the' war, in American terms. Finally, a few screenshot illustrations would have been nice.

What the book makes clear is that a whole range of Web users, from professional to amateur, now have increasing expectations, the free availability of online databases being one of them. Indeed, Speck (Chapter 6) states that younger students are now so Web-orientated that they refuse to look at printed resources! Laura B. Spencer (Chapter 10) points out that user expectations can be unrealistic: an assumption is growing that all desired information should be at one's fingertips, and that no effort is needed in the actual research process. On the contrary: despite the easy availability of materials, time is still needed to sift through them, evaluate them, and think through the research topic. The fact remains that technology does not replace the need for us to use our brains. And as Aurora Ioanid and Vibiana Bowman (Chapter 11) tell us, when searching a centuries-old e-text one has to bear in mind different spellings, such as 'wytt' for 'wit', and 'trew' for 'true': trewely, we must keepe our wyttes about us. Terminology also changes through time: for example, anyone researching social attitudes towards homosexuality a hundred years ago will need to work out which keywords to use if they are going to obtain any search results.
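Ioanid and Bowman's point lends itself to a concrete illustration. The following is a minimal sketch, not drawn from the book, of variant-aware keyword matching in Python; the variant table is hypothetical and far from exhaustive, but it shows the general idea of expanding a modern keyword into historical spellings before searching.

    # Illustrative only: expand modern keywords into period spellings.
    VARIANTS = {
        "wit": ["wit", "witt", "wytt", "wytte"],
        "true": ["true", "trew", "trewe"],
    }

    def matches(text, keyword):
        """True if any listed spelling of the keyword occurs in text."""
        words = text.lower().split()
        return any(v in words for v in VARIANTS.get(keyword, [keyword]))

    passage = "a man of grete wytt and trew herte"
    print(matches(passage, "wit"))   # True
    print(matches(passage, "true"))  # True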

The book ends, appropriately enough, with a look towards the future. Richard Gartner (Chapter 12) argues that software standardisation will become increasingly important as more and more databases are set up. Many formats are mentioned during the course of the book, and it is hard to predict with accuracy how soon any software will become obsolete, but Gartner advocates XML (eXtensible Markup Language) as a universal and, hopefully, long-lived format.

So the demand is there, and the onus is on the owners and managers of databases to fulfil it. Fortunately the overriding ethos of the Internet - that of easy access to free information - is one that many libraries believe in and try to live by. Long may that ethos continue. As Ronald C. Jantz (Chapter 1) says, the situation is both a challenge and an opportunity, while Roderick (Chapter 5) speaks eloquently of 'the critical linkage and balance between change, tradition, innovation, people, and technology' (p. 86). By the end of the book, the message comes across that those in charge of information resources need to keep their skills updated and make their databases Web-accessible in order to remain viable in the brave new world of online searching. - Gill Stoker. End of Review

++++++++++

Jason Cranford Teague.
DHTML and CSS For The World Wide Web.
Visual Quickstart Guide.
Berkeley, Calif.: Peachpit, 2001.
paper, 592 p., ISBN 0-201-73084-7, US$21.99.
Peachpit Press: http://www.peachpit.com

Jason Cranford Teague. DHTML and CSS For The World Wide Web.

I was taken aback when DHTML and CSS For The World Wide Web crashed through the letter box, over an inch thick. A book this size would probably scare the socks off most Web designers and developers seeking to get an insight into the pros and cons of authoring DHTML and CSS hypertext documents, using standards endorsed by the World Wide Web Consortium. This book is a massive and hugely comprehensive reference work, useful in adding another level of usability to an Internet presence.

Whatever you want to do with DHTML and CSS is covered by this guide and, as usual with a Peachpit Press publication, it is full of diagrams, code, screenshots and images supporting the highly instructional text. As it states on the cover, "Teach Yourself DHTML and CSS the quick and easy way", and the author has done a brilliant job of making the text informative and easy to grasp. You'll be writing and experimenting with the examples from the moment you open the book to the minute you set it down on your desktop, within easy reach of your keyboard.

The Guide is divided into four main parts: Part 1 - Cascading Style Sheets; Part 2 - Dynamic HTML; Part 3 - Using DHTML and CSS Tools; and Part 4 - Dynamic Web Sites. Part 4 also contains an invaluable appendix with both CSS and DHTML quick reference guides, browser-safe fonts, tools of the trade and definitive resource listings. This second edition additionally takes into account the "newish" Microsoft Internet Explorer 5.0 and AOL's Netscape Navigator 6, as well as the development of the CSS 1, 2 and 3 Recommended Standards now endorsed by the W3C.

In Part 1 - Cascading Style Sheets, you will find information ranging from CSS to font controls, from colour and background controls to visibility controls. Part 2 - Dynamic HTML covers DHTML, the Document Object Model, dynamic techniques, Netscape Layers and Internet Explorer for Windows. Part 3 - Using DHTML and CSS Tools offers two chapters on authoring: an Adobe GoLive primer and a Macromedia Dreamweaver primer. Part 4 deals with creating a dynamic Web site, Web page layout, importing external content, Web site navigation, special effects and debugging; it concludes with some insight into the future of the dynamic Web.

There is nothing boring about this book; it is a breath of fresh air every time you need to delve into its innards to find the answer you seek. If you are a budding Web designer or developer, the guide will help you bring your users a vastly improved level of interactivity. If you're interested in understanding the Web, or if you find yourself wondering what makes a site dynamic and what a Netscape layer might be, then this book will answer these questions and a whole lot more besides.

If you are an Internet professional, you might be choosier. However, I can vouch that this book offers more than any other book on CSS and/or DHTML that I have purchased in the last eighteen months. My hat is off to Peachpit Press and to Jason Cranford Teague for another worthwhile publication. - Glenn Dalgarno. End of Review

++++++++++

Patricia Wallace.
The Psychology of the Internet.
Cambridge: Cambridge University Press, 1999.
cloth, 264 p., ISBN 0-521-79709-8, US$24.95.
Cambridge University Press: http://www.cup.cam.ac.uk/

Patricia Wallace. The Psychology of the Internet.

Patricia Wallace's survey of psychological issues related to the use of the Internet is simply the best overall book about the topic. Detailed, yet highly accessible, it is written so that anyone with an interest in the topic (and who isn't interested?) can find some useful and helpful fact or story in its pages. I certainly now wonder more seriously about a question that Wallace asks on page 181 - what makes the Internet so compelling?

This book is divided into 12 chapters, examining topics from personalities on the Internet to flaming to pornography to altruism. Frankly, my favorite chapter was simply entitled "The Internet as a Time Sink" - and that's a fact. Each chapter discusses case studies, experiments, and reports on the human use and abuse of the Internet. Much of the literature cited by Wallace has never quite been brought together in this fashion, with Wallace's special perspective and commentary.

I only wish Wallace had attacked some of these topics in greater depth. I would love to read her interpretation of Garrett Hardin's "tragedy of the commons" beyond a mere few pages (pp. 242-244). Her comments on the sensitivity of the commons to misuse - not only the obvious misappropriation of bandwidth, but more importantly the misuse of interpersonal trust - are excellent, but I would have preferred more than a few well-worded paragraphs!

Interested in psychological issues related to the Internet? There's only one book you should read and it is simply Patricia Wallace's The Psychology of the Internet. - Edward J. Valauskas. End of Review



Copyright ©2001, First Monday