First Monday

FM Reviews

Julian Baggini.
Making sense: Philosophy behind the headlines.
Oxford: Oxford University Press, 2002.
cloth, 296 p., ISBN 0-192-80339-5, £14.99, US$26.00.
Oxford University Press: http://www.oup.co.uk

Julian Baggini. Making sense: Philosophy behind the headlines.

Philosophical discourse is often thought to be abstract and irrelevant to our day-to-day lives. In reality, however, philosophy can be applied, with much success, to the evaluation of many spheres of human endeavour. From human rights to the legitimacy of protecting our environment, approaching such matters critically and with philosophical insight can only be beneficial. And so it is with news headlines.

How do we know whether what we read is true or not? How do we define the very notion of truth? Is there one truth at all? The author of Making Sense, Julian Baggini, opens his work with an extended discussion of epistemology, the status and nature of truth itself, and the relationship we have with that truth. Becoming clear about the various concerns related to the existence of truth will help later, when stories from newspapers and television programs are assessed. Here we also learn about the distinction between realist and non-realist views, whereby one accepts that truth exists independently of our knowing it, while the other assumes truth to be constructed out of perceptions and values taken from society and culture; we learn about relativism, about the concept of proof, and about the fallibility of our own knowledge.

Although this initial chapter might, at first, appear daunting because of its sophisticated concepts, it is well worth the effort: the discussions in subsequent chapters will make much more sense once the basic epistemological ideas have been grasped. And it is not that Baggini indulges in overly convoluted language. On the contrary, his exposition is engaging, peppered with easy-to-understand illustrative analogies to complement the most abstract ideas, and always remains anchored to the subject at hand, which is, after all, the role of philosophy in helping us make sense of our world.

The material presented in the rest of the book spans a wide range of topical events, all taken as examples for the analysis of the perceptions, values, and convictions that invariably come into play when we respond to news headlines. One chapter, for example, considers ethics and private life, using former President Bill Clinton's affair with Monica Lewinsky to ask whether it was really the concern of others, and to what extent. Further on, politics and the threat of war are interpreted through the eyes of the objective philosopher: what are the reasons for going to war? Do they provide justification for such a terrible act? What about applying the "just war theory" to find out? And if war is not justified, what about pacifism? Is this moral opposition to all violence reasonable and tenable?

Another chapter is devoted to the role of science and scientists in the context of modern discoveries, particularly when these turn out to be controversial: the current debate on genetically modified food, or the BSE epidemic that swept throughout the United Kingdom in recent years, for instance. Again, Baggini applies philosophical "tools" to expose the real issues. Take harm and responsibility: "Many scientists would say that they are rarely, if ever, responsible for harms or benefits to society. Responsibility lies with those who use scientific findings." By referring to the moral principle of double effect we can test whether the scientists' stance is tenable: "This principle states that someone only does wrong, and is thus morally responsible for any harm caused, if they both caused and intended the harm. [...] In the BSE case, the defence is that scientists are not responsible for the consequences that follow the publication of their findings. If we accept the principle of double effect, we would have to agree that this defence works. Unless the scientists intended to deceive the public as to the risks of BSE, they are not responsible if, in fact, these risks are not properly dealt with."

The beauty of philosophizing about a situation is that it lays bare both the rational motivations and the faults of invalid arguments, or of valid but unsound ones. Too often we simply react instinctively to what we hear or read without attempting to recognise the larger picture; the sensationalism of modern journalism only hampers our efforts to make an informed judgement or take a coherent stance.

The signs are encouraging, however. We still have books, such as this one, that tell us how the truly philosophical individuals

"treat their own views with as much skepticism as those of the people they read about. They are always ready to subject any belief to rational scrutiny, not as a game, but in order to understand more. Their broad outlook and openness to new arguments give their life a kind of freedom and space. They learn a sense of perspective and of humility. They also learn when thinking is appropriate and what kinds of reasoning are suited to different purposes. They do not always expect final answers, but follow Aristotle's advice to expect only that degree of precision which each subject matter allows. [...] The information they acquire through their following of the news informs their own beliefs and opinions at least as much as their philosophical beliefs inform their understanding of the news."

This is applied philosophy at its best. — Paolo G. Cordone End of Review

++++++++++

Darlene J. Burnett and Diana G. Oblinger (editors).
Innovation in student services: Planning for models blending high touch with high tech.
Ann Arbor, Mich.: Society for College and University Planning (SCUP), 2002.
paper, 279 p., ISBN 0-970-04131-4, US$40.00.
SCUP: http://www.scup.org/studentservices/

Darlene J. Burnett and Diana G. Oblinger (editors). Innovation in student services: Planning for models blending high touch with high tech.

The aim of this book is to provide the reader with up-to-date information on the changes that technology and process re-engineering are making to student services. It offers a set of case studies formulated by members of the IBM Best Practice Group, representing over 23 institutions, who share their experiences in developing innovations in student services.

Each part of this publication looks at areas of the student/customer culture that have arisen as institutions try to blend high-touch and high-tech models in the student arena. Higher education has changed over the past few years; to keep student enrollment at high levels, many institutions have evolved to meet the needs of this demanding customer base.

With the advent of technology and the Web, changes in the way we teach and want to be taught have forced a re-evaluation of teaching methodologies and of the delivery of teaching materials. This publication uses case studies to provide a view of innovations and changes that have resulted in a suite of best practices. Hence there is a focus on the need to re-engineer administrative and business operations to offer better services, not only to students but also to parents, faculty, and staff. Many institutions have made radical changes as a result, and the case studies give the reader a variety of solutions for both physical and digital infrastructures.

Although this book concentrates on U.S. institutions, with only a passing reference to work in this area carried out elsewhere, I would recommend it as a reference tool that should be on the bookshelf of administrators. — David Phillips End of Review

++++++++++

Eli Cohen.
Challenges of information technology education in the 21st century.
Hershey, Pa.: Idea Group Publishing, 2002.
cloth, 290 p., ISBN 1-930-70834-3, US$74.95.
Idea Group: http://www.idea-group.com/

Eli Cohen. Challenges of information technology education in the 21st century.

As the cover notes for the book explain, "over the past two decades, Information technology education has not just undergone evolution but more correctly revolution. What is taught and how it is taught has changed immensely". The book itself is a collection of 13 chapters, all contributed by academics in the U.S., New Zealand and Australia, with an introduction by the editor, Eli Cohen. Topics covered include: data modelling; pedagogy and technology; bridging the university-industry gap; information systems; and curriculum development as an ecological process. The chapters are arranged into four thematic categories: examples of how to teach specific topics; teaching techniques and pedagogy; the impact of the Web on IT teaching; and developing an IT curriculum.

Despite the arrangement into thematic sections, most of the chapters are about the pedagogy of teaching IT and the theories of teaching and learning used to underpin curriculum design, content delivery and assessment. Many contributions stress the need for the academy to meet the demands of an IT industry which is hungry for flexible and highly trained individuals who can continue to acquire new skills once in post and are able to self-direct and manage their own learning. Traditional IT teaching is seen in this volume as too grounded in theory and hands-off work, and as stressing the needs of the individual rather than being team-based and founded in collaboration and communication. The contributors stress the need to get IT students working in teams, with their hands placed firmly on the technology, solving real-life problems. This is surely sound advice, and anyone running an IT course would find much here to interest them and to liven up a curriculum in need of some new thinking.

The book sets out a constructivist agenda for IT teaching, stressing the need for individuals to arrive at their own definitions of knowledge, rather than having knowledge imparted to them as a one-way transaction. Scholars interested in pedagogy will hardly be overwhelmed by the originality of this mission statement (it has been around since the 1960s), but I get the sense that Cohen and his counterparts may be struggling with a conservative rump of IT educators who do see education as imparting facts to a passive audience, and for whom this book may come as something of a challenge.

The fact that this book is an edited work with chapters on discrete topics is both its major strength and its major weakness. The strength comes from the fact that if you find a chapter of direct interest (such as problem-based learning or using Management Information Systems in teaching), then you'll get around 25 pages of material organised into a coherent essay — complete with an abstract as if it were a journal article. But if you are browsing for ideas or want to read the book from cover to cover, then you may find the lack of a coherent thread running throughout to be a problem. It is true that all of the chapters fulfil the promise of the collection's title, but an edited collection always suffers from the lack of narrative structure and development of ideas which can be achieved in a monograph. There are also some rather amateurish diagrams in some of the chapters, and it is hard to take an argument seriously when it is illustrated with a few pieces of clip art seemingly selected at random from a Microsoft Office application (and this in a book written by IT teachers!). The typesetting shows problems as well: the kerning and word-spacing in some lines are so awkward as to render them almost unreadable. We could have expected better in a scholarly work.

The potential audience for this book is limited. Those running an IT course would find things of interest here, but I struggle to find another category of reader who would enjoy or benefit from this collection of rather specialised essays. If you are interested in pedagogical issues and wish to read about learning theory, there are books around which do this much better. So the book has a limited audience, but it is a useful contribution to knowledge, and with IT education continuing to grow, we can expect more books of this kind in the future. — Matthew Pearson End of Review

++++++++++

Daniel R. Headrick.
When information came of age: Technologies of knowledge in the age of reason and revolution, 1700-1850.
Oxford: Oxford University Press, 2001.
cloth, 256 p., ISBN 0-195-13597-0, £24.50, US$45.00.
Oxford University Press: http://www.oup.co.uk

Daniel R. Headrick. When information came of age: Technologies of knowledge in the age of reason and revolution, 1700-1850.

The 20th century has long been referred to as 'The Information Age', thanks to the invention and widespread use of the computer. But think back to the 19th century and the enormous energy that went into the keeping of public records and statistics: anyone researching their family history (a hugely popular activity nowadays) will be well aware that in the U.K. the registration of births, marriages and deaths became a legal requirement during the 1830s, while census information began to be gathered at ten-yearly intervals from 1841 onwards. The Victorians produced huge volumes of paper and ink in their drive to document everything and everybody. But then think back again to the parish register, a source of information on christenings, marriages and burials going back to the 16th century. And back still further to the Domesday Book of the 11th century.

So the human need to gather, analyse, process, and disseminate information goes back a very long way, and at every stage the purpose behind it was usually a political one, in the sense that knowledge is power. The Victorians, for example, aware of a rapidly expanding population and increasingly complex world, felt the need to keep track of social trends such as migration from country to town, housing conditions, employment, birth rates and death rates, in order to legislate, control, and generally 'run' the country. These trends have continued, and multiplied geometrically, throughout the 20th century, as technology has given us more ways of collecting, storing, analysing and disseminating information. And today, as the Internet gives more and more people the power to become gatherers, processors and purveyors of information, the whole business has become highly democratic and, at the same time, somewhat chaotic.

Headrick chooses the period 1700 to 1850 as the focus of his book, having identified it as a turning point, a time when a confluence of technologies led to a huge growth in information processing: improvements in map making, the establishment of uniform systems of measurement, a boom in the production of dictionaries and encyclopaedias, the classification of plants, the beginnings of modern chemistry, the development of statistics and their graphical display, the invention of the telegraph, and improvements in postal services. He stops in 1850, at the point where the mechanisation of information began. He covers a large number of inventors, scientists and pioneers in Europe and America, from the famous, such as Linnaeus and Lavoisier, to the less famous who nonetheless made important contributions in their fields.

The book itself is a treasure trove of information, well written and well illustrated, and is full of evidence to dispel the myth that Information Technology began in the 20th century. — Gill Stoker End of Review

++++++++++

Mehdi Khosrow-Pour (editor).
Web-based instructional learning.
Hershey, Pa.: IRM Press, 2002.
paper, 322 p., ISBN 1-931-77704-7, US$59.95.
IRM Press: http://www.irm-press.com/

Mehdi Khosrow-Pour (editor). Web-based instructional learning.

This volume brings together an interesting selection of articles addressing aspects of Web-Based Learning that range from strategies for the use of distance learning technology to infrastructure issues in the "Third World". As such, it constitutes a valuable introduction to the multiple facets — pedagogical and technological — of the use of the Web in higher education, and provides a wide variety of perspectives. While half the authors here are American, there are contributors from a wide variety of countries, mainly based in universities, but also including a few in the corporate sector.

It should be noted that all the content has previously been published in other collections. Two chapters have appeared in earlier books already reviewed by this reviewer: chapter 7 is the same article as chapter 3 of the Beverly Abbey book reviewed at http://www.firstmonday.org/issues/issue6_12/reviews/ and chapter 8 first appeared in the book edited by Discenza et al., reviewed at http://www.firstmonday.org/issues/issue7_6/reviews/. The first four chapters were previously published in Distance learning technologies: Issues, trends and opportunities, reviewed at http://www.firstmonday.org/issues/issue6_3/reviews/, and most of the remaining contributions have previously been published in earlier volumes by the same editor. With 24 short chapters, the book is like a selection of appetisers, though the references provided for each chapter constitute a useful resource for those wanting to explore in more depth.

Theories of learning are not extensively represented here, but there are two very interesting articles on the topic. In chapter 1, having established the theoretical background of constructivism, Valerie N. Morphew addresses the question "what experiences should an instructor provide to help facilitate the act of co-construction?" (p. 5). She surveys a number of practices adopted in traditional classrooms and then briefly discusses how the same experiences that stimulate thinking and facilitate the co-construction of meaning in traditional settings can be made available to the distance learner. In chapter 7, Louis H. Berry focuses on the cognitive effects of Web page design, producing an interesting synthesis of research from various disciplines.

Chapter 2 is concerned with corporate training, and the authors are sensitive to the distinctions between this and general education: "In borrowing what is learned from higher education for application in business, care must be taken to respect the similarities and differences of each environment, so that the integrity of the generalizations is maintained" (p. 17). Their framework is based on the integration of change management, strategic management and project management, and in a nice twist on the pitfalls of technological determinism, they conclude that distance education must not be conceived as a solution waiting for a problem.

The constructivist perspective of chapter 1 is complemented by the emphasis on collaborative activities in chapter 4. Starting from the premise that learning occurs when faculty develop and encourage discussion through the use of social interaction, C. Mitchell Adrian poses the question of how to develop or maintain an environment of social interaction for a distance education program. The discussion focuses on ways in which electronic communication technologies can be combined with concepts taken from Total Quality Management.

Chapter 5 provides a very general overview of issues in Web-based education. Just as Morphew in Chapter 1 is concerned with applying traditional practices to distance settings, A. K. Aggarwal and Regina Bento here argue that "Web-based education can successfully simulate face-to-face teaching models, while adding some unique features made possible by the technology" (p. 59). Despite their support for Web-based education, they are concerned with "allowing Web-based courses to replicate more seamlessly the features of face-to-face instruction" rather than breaking away from the paradigm of face-to-face instruction in order to develop a pedagogy of distance education.

Taken to the extreme, the insistence on using Internet technologies as mechanisms for "delivering" education rather than using their potential for the social constructivist development of knowledge leads to the position expressed by Henry H. Emurian in chapter 9: "Since the inception of the world-wide web, nothing has changed about the way that people learn". This is contradicted in chapter 15, which reports on a study of the relative performance of students who took the traditional version or the Internet version of courses at the Metropolitan State College of Denver. The authors note that "there are significant differences in online learning experiences when one delves more deeply into how mastery of the material is obtained" (p. 180). Partly as a result of this, "the authors believe the findings support the theory that Internet delivered distance education courses require different design" (ibid.)

While several chapters in this collection cover individual technological elements such as audio and video streaming or visual basic programming, relatively little attention is paid to the issues of pedagogical design. Chapter 12, which discusses the "Classroom Component of an Online Learning Community", provides a useful complement to the discussion in chapter 4 of "Developing a Learning Environment"; and in chapter 20 Morgan Jennings, also of Metropolitan State College of Denver, discusses "What Do Good Designers Know That We Don't?". Noting that "a person who is attentive, emotionally involved, and engaged in discovery within a learning environment is more likely to learn and enjoy the experience" (p. 238), Jennings concludes that the aesthetic design of the learning environment may be a critical factor in promoting learning.

Overall, the variety of perspectives brought together in this collection provides a great deal of food for thought, although with 24 articles in the space of 300 pages, there is little room for in-depth discussion. — Peter J. Beech End of Review

++++++++++

Lawrence J. McCrank.
Historical information science: An emerging unidiscipline.
Medford, N.J.: Information Today, 2002.
cloth, 1,500 p., ISBN 1-573-87071-4, US$149.95.
Information Today: http://www.infotoday.com/catalog/books.htm

Lawrence J. McCrank. Historical information science: An emerging unidiscipline.

Historians have commented that the lack of written correspondence will mean that the interpretation of today's events will, perhaps, not be as rich as in the bygone age of letter writing. Yet today more and more is published both on paper and electronically. The work of information scientists, archivists and librarians provides valuable assistance in locating information but there remains the nagging doubt that the all-important paper or book may have been missed. Archivists struggle to deal with electronic material: an 8-inch floppy disk holding data will physically degrade and the data may be stored in a format unique to one system. The volumes of material produced cannot possibly be preserved. On the other hand, how can we decide which items will provide "continuing value" without invoking the prejudices of our age? Historical studies have developed considerably in the last quarter century, becoming on the one hand analytical and on the other populist. Lawrence McCrank's book examines the ways in which history, information science and computer technology might be brought together in a new discipline to support these changes.

McCrank describes the book as a "bibliographic essay" and, not surprisingly, just under half its pages are devoted to appendices, a glossary, a vast bibliography and indices. I understand that the book is an extension of the author's work commissioned by the American Society for Information Science on "History, Computing and Archives" in 1995. The extension has added more current references, although these are in the minority. The book has a strong American bias, with examples drawn from American institutions and government, but it does not neglect the European perspective.

The writer examines the changing way that historians approach their work, looking at the increasingly scientific approach to the analysis of data and the ways in which the vast volumes of data might be organised. More importantly, the book has an educational strand. There is a graduate curricular guide for an Historical Information Science program in the appendices, and the author proposes that this subject area be accorded its own place in academic study. In another sense the book itself is educational: throughout it I found myself wanting to follow up on the examples given, to view the applications mentioned, to read the referenced works, and to gather ideas for projects, applications or research (never enough time!).

Sadly this book may not find its audience easily. Its size and presentation (my postman was astonished at its weight) will put off all but the keenest student. The style is precise and academic, carefully considered and well referenced. Buy a copy for your institution's library and direct your history, information science AND information technology students to it. Select their references carefully and each audience will benefit from the wealth of information, the starting points, and the opportunity to experience academic writing that it offers. And then, if you can, buy a copy for your own bookshelf and dip into it from time to time to re-examine definitions, to find resources, or simply for inspiration. — Wendy Baird End of Review

++++++++++

Evi Nemeth, Garth Snyder, Scott Seebass, and Trent H. Hein.
UNIX system administration handbook.
Third edition.
Upper Saddle River, N.J.: Prentice Hall, 2001.
paper, 896 p., ISBN 0-130-20601-6, US$69.00.
Prentice Hall: http://www.phptr.com

Evi Nemeth, Garth Snyder, Scott Seebass, and Trent H. Hein. UNIX system administration handbook.

This is the latest edition of a book which has gained a formidable reputation amongst systems administrators since its first publication in 1988. The reputation is deserved: the book is one of the most complete resources on this topic you could ever wish to find.

If you were to ask any systems administrator working in a mixed UNIX environment which single book they would most want, the answer would invariably be a "UNIX Admin's Rosetta Stone". There are Web sites that purport to offer similar information; however, they are generally little more than a few pages listing the equivalent commands from each of the UNIX flavours. This book starts at the very beginning and covers the theory and general approach before including subsections covering the specifics of the chosen operating systems — Solaris, HP-UX, Red Hat Linux and FreeBSD. In the preface the authors state that they cover all the major variants of UNIX, so I find the complete lack of AIX coverage surprising. AIX is a major variant of UNIX, with a wide install base in both mixed and single-platform environments; I would have liked to see AIX covered in the book, regardless of the eclectic and arcane nature of the OS.

The tone of the book is friendly and approachable, and the authors come across as experienced colleagues keen to pass on their experience and to help you avoid the pitfalls that they have encountered. They don't simply churn out the contents of the manuals supplied with your UNIX boxes: they begin by looking at the way the system boots and how you control it, and end with system daemons and how you control those. In between, a large section covers e-mail, network connectivity and DNS. After all, as we are often told, UNIX is synonymous with the Internet, so we should expect to see this. There is a very useful section on security, but the most interesting and useful part of all concerns policy and politics.

Policy and politics are often overlooked when carrying out day-to-day admin tasks, and when you have to administer over 150 servers they tend to get pushed into the background as the inevitable housekeeping and user requests come in each day. The authors stress the importance of planning and developing policies for disaster recovery, and go on to give a number of interesting and amusing examples of what happens when policies are not implemented or don't exist at all.

There is no UNIX System Administrator's Rosetta Stone, but this book comes very close. You won't find all the answers here, but you will turn to this book first when you need to start looking for an answer. If I were a systems admin on a desert island, I would want to have this with me. Whether you are a new systems admin or have been doing the job for years, you should have this book in your collection. — Peter Scott End of Review

++++++++++

Derek M. Powazek.
Design for community: The art of connecting real people in virtual places.
Indianapolis, Ind.: New Riders, 2001.
paper, 336 p., ISBN 0-735-71075-9, US$30.00.
New Riders: http://www.newriders.com

Derek M. Powazek. Design for community: The art of connecting real people in virtual places.

The Internet provides us with myriad opportunities to engage with others. The promise of building virtual communities of people, all working online from different locations and connected together by the magic of technology, is frequently invoked in many spheres, especially in commerce, entertainment and education. But there are few books which actually look at what "community" means online and which provide a guide map around this complex area. This book fills that gap, and does it extremely well.

Powazek points out that the definition of community, particularly when applied in online settings, can be problematic. "Community" is often used indiscriminately for any group of individuals, regardless of the nature of their interaction. By these criteria, when you stop at traffic lights on your drive to work you are suddenly part of a community of people stuck at the intersection of the A453 and the B2269; whether you actually feel any affinity with this community is an entirely different matter. Faced with the often sloppy use of "community", Powazek writes "I believe the only context on which to judge a community is a community one" (p. xxi). The use of the individual as the arbiter here is an excellent idea, and the idea of personal involvement and commitment drives the whole book from that point forward.

Design for Community is an unusual book in that it combines features of books which are normally published separately. Firstly, it contains some real thinking about the ways in which interaction is being changed online, and whilst Powazek does not invoke any philosophers, sociologists or psychologists to bolster his arguments, there is a depth of argument here which repays further reading. He is particularly good when writing about the "paranoia problem": the reluctance of people to use a computer for personal interaction. Secondly, the book contains chapters on nuts-and-bolts issues, the "tools for doing the heavy lifting". It explains the technology available for creating and maintaining communities online and compares and contrasts different hardware and software options. Powazek doesn't have the room to go into great detail, but his knowledge is sound, and anyone looking for initial advice could stop off at this book before seeking more detailed "how to" instructions elsewhere. Finally, the book contains many examples of actual communities online, with detailed analysis of how they are managed. These "real-life" examples give the book an added edge; the author's knowledge of these communities is considerable and he writes about them with insight. For instance, writing about Slashdot (slashdot.org), Powazek explains how the site encourages people to create accounts and post responsibly: "If you choose to create an account — in addition to some nice personalisation features — your name is automatically added to all your posts. If you don't create an account, the site names you itself: Anonymous coward" (p. 124).

The book is written in an informal and direct way, and Powazek has a knack for anticipating your next question and providing an answer to it. Although the book is 298 pages long, the clarity of the writing makes it feel much shorter, and it is easy to underestimate how much material is covered simply because of the immediacy of the writing. Powazek often comes up with interesting ideas which challenge much conventional wisdom about site design. For instance, he has a rule called "bury the post button". Making posting difficult would seem nonsense to most Web designers chanting the accessibility mantra, but he defends his rule by stating that the farther the post button is from the front door of the site, the better the conversation gets. Troublemakers (trolls looking to lay down flame-bait, in moderating parlance) are dissuaded from posting, as are those who have no real interest in the community and just want a place to say "Hi". Insights like these make the book worth the cover price, and Powazek continues to challenge many commonly held notions as the chapters unfold.

There are many potential users of this book. Anyone engaged in Web design who has been asked to provide "community-like" features on a site should read it for its clear and well thought out philosophy of involvement and interaction on the Web. It may well give you some ammunition to go back to a client and argue for a different approach, but if you do decide to go ahead with an online community, then there is much in the book to draw on. Likewise, people wishing to set up community-based sites should read this book. Finally, scholars of Internet culture and those wishing to research the ways in which group interaction is being reconstructed by digital technology will also find much of benefit here. — Matthew Pearson End of Review

++++++++++

Patricia L. Rogers (editor).
Designing instruction for technology enhanced learning.
Hershey, Pa.: Idea Group Publishing, 2002.
cloth, 286 p., ISBN 1-930-70828-9, US$74.95.
Idea Group: http://www.idea-group.com/

Patricia L. Rogers (editor). Designing instruction for technology enhanced learning.

This book is an edited collection of chapters about using technology in education. It is ambitious in that it covers all stages of education, from primary (elementary) school through secondary education to college and university. The book is divided into five sections. The first two give an overview of the field of instructional design and its foundations. The next two cover designing for learners in primary and secondary education, and for learners in higher education, whilst the final section is about designing for learning environments. The majority of contributors are from the U.S., with additional chapters by contributors from Iceland, Canada and the U.K.

Many of the contributions to this volume grow out of real-life classroom applications of technology, and the contributors are keen to share their experiences (both good and bad) with their readers. Some may find this aspect of the collection a real boon, as they can read about real-life situations and gain ideas for their own practice. Others may be less satisfied by the emphasis on reporting rather than on examining the theoretical issues which underpin the change in pedagogy that accompanies the introduction of computers into education.

The theme which runs throughout the volume is the need for educational practices which are constructivist in nature. Constructivism is a philosophical approach to the definition of knowledge which holds that knowledge is not "out there" as absolute facts waiting to be learned, but is rather a set of ideas and concepts which an individual holds about the world, created inside the learner's head by their own cognitive agency. Constructivism has become the orthodoxy in almost all arenas of education (at least when teaching and learning are being discussed by academics), and this book continues to add to its supremacy as a way of thinking about teaching and learning. But not all the examples in the book really show constructivist principles at work. One chapter covers designing technology-enhanced learning in the elementary school and shows how children can use the computer to draw pictures before transferring them to a Word document. This is all well and good, and I don't doubt the increase in motivation when students are introduced to this way of working, but the computer here is merely being used as an alternative to pen and paper (and not necessarily a superior alternative), whilst the underlying pedagogical model remains unchanged and to a great extent unexamined. It is easy to criticise individual essays for failing to examine all of the complex pedagogical issues involved in using technology in education, but taken as a whole, the essays in this volume are not outstanding contributions to the field, and some of them appear to be there to make up the numbers.

This book is largely American in its focus and scope. References throughout are American in usage, and the use of "instruction" in the title is typical in this respect (Europeans would use "teaching" instead). This American focus is not really a problem, but teachers from other countries wanting to benefit from the book may find themselves a little distanced from the content by some of the terms and concepts.

The audience for this book is potentially large. School teachers and college and university lecturers could potentially benefit from reading the material here. Certainly reading the volume from cover to cover would give a sense of how teachers in the U.S. are using technology and may well give a practitioner some useful ideas. But ultimately the book tries to do too much and too little at the same time. Too much in that it attempts to cover education from the primary stage right through to university, and too little in that each chapter is too short for the reader to get a real sense of how to use these ideas in their teaching in a practical way. — Matthew Pearson End of Review

++++++++++

Sam Williams.
Free as in freedom: Richard Stallman's crusade for free software.
Sebastopol, Calif.: O'Reilly, 2002.
paper, 240 p., ISBN 0-596-00287-4, £15.95, EUR25.60, US$22.95.
O'Reilly: http://www.oreilly.com

Sam Williams. Free as in freedom: Richard Stallman's crusade for free software.

I first picked up this book out of interest in the origins of the GPL (General Public License) and the person credited as the father of the concept. I am not sure what I expected to find, but what I found did surprise me.

Sam Williams' writing style and portrayal of Richard Stallman make the whole book interesting, and not what it could have been: a straight reference book. The anecdotes and the different perspectives offered by events, family and friends not only give you the reasons behind the GNU project; they also provide an excellent insight into the driving forces around Stallman, his frustrations, and the hates that lead to what seems, at times, like isolation in a high-tech world.

The processes of and problems with software patents and claims on lines of code are covered in this book which, although at times one-sided, does give a different viewpoint from that of major players such as Microsoft, and shows the importance of Stallman's crusade.

Free as in freedom illustrates both the good and the bad sides of this modern-day Zorro. In it Williams has been able to capture accurately Stallman's fascinating persona and his determination to challenge the establishment and, by force of character, to change the world's perception of how software is produced. A very good read. — David Phillips End of Review



Copyright ©2003, First Monday