FM Reviews
First Monday


Pierre Baldi.
The Shattered Self: The End of Natural Evolution.
Cambridge, Mass.: MIT Press, 2001.
paper, 245 p., ISBN 0-262-02502-7, US$24.95.

Pierre Baldi. The Shattered Self: The End of Natural Evolution.

Despite their relative inaccessibility and difficulty, both genetic engineering and natural evolution have, in recent times, been catapulted to prominence by the interest generated in the Human Genome Project, human cloning, and related sensational events. As a consequence, there is now a plethora of books devoted to the relationship between human behaviour and genetic makeup. Among them, one stands out, partly thanks to its somewhat provocative and disquieting contents: The Shattered Self, by Pierre Baldi, Professor of Information and Computer Science and of Biological Chemistry and Director of the Institute for Genomics and Bioinformatics at the University of California.

The subtitle - The End of Natural Evolution - immediately alerts the reader that we are dealing here with something of a major milestone, a turning point in human history. In the author's words, "our notions of self, life and death, intelligence, and sexuality are very primitive and on the verge of being profoundly altered on the scale of human history. It is this shattering - what causes it and its meaning - that forms the central thread of this book. This shattering is brought about by scientific progress in biology, computer science, and resulting technologies, such as biotechnology and bioinformatics."

Beginning with a tour of molecular biology, Baldi explores genome sequencing and how this discipline has progressed in recent times; there follows a discussion of the mechanisms of human reproduction, including cloning. An interesting point is made: if sexual pleasure developed as a means to ensure that reproduction would take place, will it survive the fact that we have devised methods whereby reproduction does not take place, despite the pleasure involved? Will sex lose its appeal as it becomes genetically redundant? Other themes explored are the brain and its 'computing' power, artificial life, and some of the important ethical issues.

Baldi's style of writing is engaging and authoritative. There is so much that comes together in a book such as this one. Copious end notes provide extra details for even the most avid and enquiring reader. On the negative side, I have found that, in order to reinforce his points, the author at times makes use of obscure arguments in which the conclusion does not seem to follow from the premises: "Many bioethics texts share the same conservative punch line: we ought to be extremely careful and proceed very slowly with biotechnology, because we must preserve our notion of humanity and of who we are. Of course, I agree that we ought to be careful. But preserving our humanity is not a good reason. It amounts to saying that we should be careful exploring the oceans, because we have to preserve our notion that the Earth is flat. But flat the Earth is not. We should be careful because the oceans are dangerous. But we may well have to abandon our concept of a flat Earth if we want to align ourselves with reality." Somehow, the idea of a flat Earth (an obviously wrong assumption) is likened to the notion of what we are. Is the latter notion also so obviously wrong? And, if yes, why?

The book should be read keeping in mind that many of the scenarios discussed might become reality not in a few years' time, but on the order of one or even several centuries. Thus, the danger of dealing with circumstances so far off in the future is that one might start dismissing certain ideas as pure speculation. There are bound to be technical, ideological, and cultural barriers that must be overcome, but the more we extrapolate, the more unclear it becomes what humans will need to tackle in order to enjoy the many possibilities that biotechnology and genetics will offer (immortality is a good example).

While reading the book, I could not help but reflect on one particular point: why would we want to carry out some of the modifications to ourselves or to our environment that are presented here? It is clear why cloning and artificial fertilisation might have benefits; in other cases, however, it seems to me that what is proposed is more a technological show-off than a justified enhancement of the human condition. For example, in a section called "The New Music", Baldi writes:

"Music can be represented, composed, and recorded in digital media in many ways, from standard sheet notation, to sound waves, to the physical modelling of particular instruments. The digital record can be edited and manipulated down to the level of single bits, just like genomic DNA. Sound can then be synthesized back through a synthesizer, but also in many other possible ways, including some that are not yet available but could be realised within the next century, such as reproducing the movement and pressure of a bow on a violin. In fact, if I had to make a prediction, it would be that music will become increasingly computerised and its mechanical aspects will progressively disappear. Within a few hundred years, traditional instrumentalists could vanish almost entirely, and something derived from the computers of today could become a kind of universal instrument connected to the Internet."

I just wonder why such a thing would be an improvement, and I cannot help disagreeing strongly with the author's prediction: it is not without reason that synthesizers and all the most sophisticated electronics have not yet managed to come even close to the sound produced by an original Stradivari violin, made of wonderful wood back in the seventeenth century. Why would we humans want to get rid of natural artifacts simply in order to follow some sort of computerised path? I am not clear about this, and I suspect that we will not necessarily follow that path. Somehow The Shattered Self suffers a little from a common syndrome whereby advancements in computing are seen as reflecting the general tendency of other technological and scientific fields. Not too many years back we were predicting that, by now, colonies of human beings would be living happily on some of the nearest planets. However, reality has proved much more difficult than our at times presumptuous self-confidence in our capabilities would have led us to believe.

But I don't want to sound too pessimistic and skeptical. A great deal of the phenomena Baldi explores will most likely have an impact on future generations. I only wish the author had dealt more with the reasoning behind our wish to change ourselves in such radical ways. By all means, this is a book that will encourage any intelligent reader to start thinking about who we are and what we want to be. - Paolo G. Cordone End of Review


Daniel J. Barrett and Richard Silverman.
SSH, The Secure Shell: The Definitive Guide.
Sebastopol, Calif.: O'Reilly, 2001.
paper, 558 p., ISBN 0-596-00011-1, US$39.95, CA$58.95, UK£28.50.

Daniel J. Barrett and Richard Silverman. SSH, The Secure Shell: The Definitive Guide.

Security of the corporate network is an issue that many organisations spend a lot of time and effort trying to get right. There are a large number of tools available to the systems administrator that can, in the wrong hands, allow a hacker to gain access to your data. Someone with snoop or sniffer running can trace any telnet session with ease. The obvious solution is therefore to encrypt your data transmissions, and this is what SSH does.
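In practice, the switch the review describes is a one-for-one command substitution. The following sketch uses placeholder host and account names of my own (not examples from the book), and only prints the pairing rather than opening any real connections:

```shell
# Placeholder host and account names; nothing here contacts a real server.
insecure_login="telnet server.example.com"   # password crosses the wire in clear text
secure_login="ssh admin@server.example.com"  # whole session encrypted
insecure_copy="ftp server.example.com"
secure_copy="scp data.tar admin@server.example.com:/tmp/"

printf '%s  ->  %s\n' "$insecure_login" "$secure_login"
printf '%s  ->  %s\n' "$insecure_copy" "$secure_copy"
```

The point is that the secure equivalents are drop-in replacements: the same login and file-transfer workflows, with the cleartext protocols swapped out.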

This book has been difficult to review, not because of any difficulty in reading the book; the authors have made what could be a dry textbook immensely readable. The difficulty I have had reviewing it was that I took it to work to help me with a task I was carrying out, and my fellow Sys Admins fell on it with great interest. It took me some time to get hold of it for long enough to go through it in the detail needed for a review.

As I said earlier, the book could have been just another dry text but, in common with all of O'Reilly's titles, the authors have gone to great pains to explain the subject without being boring. Where a diagram would make a point clearer, there is a diagram. The book starts with a basic overview of the ssh protocol and ssh features, such as secure network login and the scp protocol to use instead of ftp. The concepts of keys and agents are discussed, and comparisons are made with other similar technologies.

The structure of the book leads the reader through a logical series of stages: basic client configuration and use, a more in-depth look at SSH, compiling, installing and configuring a server, key management, advanced client configuration, more in-depth server configuration, and how to get X Windows and SSH to work together. Barrett and Silverman have included a very useful section detailing a recommended setup for SSH, and a further section of case studies covering more complex issues such as unattended batch jobs, FTP forwarding and using Kerberos with SSH.
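To give a flavour of the client-side configuration those chapters walk through, a minimal per-host entry in ~/.ssh/config might look like the following (the host and user names here are placeholders of my own, not an excerpt from the book):

```
# Hypothetical ~/.ssh/config fragment; host and user names are invented.
Host build
    HostName build.example.com
    User admin
    ForwardX11 yes       # tunnel X Windows traffic over the encrypted channel
    Compression yes      # helpful over a slow dial-up link
```

With an entry like this in place, `ssh build` picks up the remote host name, user and options automatically, which is exactly the kind of convenience the advanced client configuration chapters are about.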

There is a short section on troubleshooting and FAQs that I would have preferred to be somewhat larger. The information there is very useful, and answered one or two questions I had. Finally, the authors point out that SSH is available for platforms other than UNIX, and describe some commercially available products for Macintosh and NT as well as UNIX.

The versions of SSH covered include SSH1, SSH2 and OpenSSH. This is more than adequate coverage; the latest version I am using is version 3 of SSH2. The authors discuss the differences between these versions in great detail.

The Secure Shell is probably one of the most useful books I have come across in the last twelve months. It contains all the detail I required to set up a secure infrastructure for auditing the server installations we have in the organisation. We, as a team, are now working on using SCP rather than FTP to manage Oracle archive log shipping between sites for standby databases, and when I dial into the office from home, SSH gives me a more secure route into the servers than telnet ever did.
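The archive-log shipping mentioned above reduces, at its simplest, to copying each new log file to the standby host with scp. The sketch below is my own illustration, not an example from the book: the paths, host name and function are hypothetical, and the function prints each transfer rather than running it, so removing the leading echo would turn the dry run into the real thing.

```shell
# Hypothetical paths and host names; prints each scp command instead of
# executing it (a dry run).
ship_logs() {
    arch_dir="$1"
    standby="$2"
    for log in "$arch_dir"/*.arc; do
        [ -e "$log" ] || continue          # glob matched nothing: no logs yet
        echo scp -p "$log" "$standby"      # -p preserves file timestamps
    done
}

# Example (placeholder names):
#   ship_logs /u01/oradata/arch oracle@standby.example.com:/u01/oradata/arch
```

With host keys and agent forwarding set up as the book recommends, a loop like this can run unattended from cron, which is precisely the "unattended batch jobs" case study the authors address.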

The most telling fact is that I had a terrible job getting the book back from my colleagues to review. If you are looking to secure your network, or just want to find out more about secure transports, then this is the book to buy.

Now if I can only find out who's got it now ... - Peter Scott End of Review


Jeffrey Dean.
LPI Linux Certification in a Nutshell.
Sebastopol, Calif.: O'Reilly, 2001.
paper, 570 p., ISBN 1-565-92748-6, US$39.95, UK£28.50.

Jeffrey Dean. LPI Linux Certification in a Nutshell.

Anyone who has been around a bit in the IT industry soon realises that the things prospective employers really lap up are those bits and pieces of professional certification and accreditation that have, traditionally, been picked up at the expense of previous employers. These days, however, fewer employers seem prepared to foot the bill of certification courses, particularly if they have suffered high staff turnover in the recent past.

This, then, leaves it up to the over-worked, under-paid and forward-thinking employee to decide which course of instruction and certification will provide the biggest pay-off, in terms of both productivity and remuneration. There are now several options for Linux certification; this book - LPI Linux Certification in a Nutshell - aims to guide the reader through preparation for the Linux Professional Institute Level 1 examinations.

This is no small task as the experiential divide between any two readers could be considerable. Keeping the interest of the experienced user while not sailing over the head of the novice is an accomplishment for any technical writer and this is where the author of this title really earns his pay.

LPI Linux Certification in a Nutshell is split into two main sections, each covering one of the two exams which make up the first level of LPI certification. Exam 101, and the first section of the book, is primarily concerned with ensuring that the candidate has a firm grasp of the basic Unix commands, understands the GNU/Linux filesystem and standard device files, and can successfully boot and shut down a system as well as perform basic system administration. The second section - and Exam 102 - deals more heavily with system administration, including installation and configuration of a GNU/Linux system with a variety of hardware, networking, XFree86 setup, basic system security and an introduction to Unix shell scripting.

Sound a dawdle? I have to admit that I scoffed and went straight to the sample questions ... almost choked, then went back and began to actually read the book!

Let me qualify that. It isn't that the questions are overly taxing - at least not to someone who has been running Unix systems for a while - but it was a good indicator of how heavily I lean on the man pages! Thankfully the multiple-choice questions, although Linux-specific, seem to show some consideration for readers who are more familiar with, say, BSD, and are not any trickier than they need be. By any measure, the sample questions in this book are much tougher than those offered on the LPI Web site, and should probably serve as a warning.

Mr. Dean's approach is methodical and is to be highly commended. Each sub-section of the exam is dealt with individually and he conveys the material well in an authoritative but sometimes - surprisingly - throwaway style. The only reason this title does not offer competition to the publisher's Linux in a Nutshell is the depth of coverage; Dean sticks to the information required for the LPI exams and doesn't run on with pages and pages of obscure command-line switches.

The style throughout is friendly yet informative and Dean seems to have an unconscious knack of answering questions just as they are forming in the reader's head. Some of the content, particularly that on networking and system hardware, actually puts other "definitive" Linux works to shame. Perhaps the most valuable aspect, however, is the frequent "box-outs" which highlight the most pertinent topics to revise for the examination itself. This is the key buying point, as success in the examination will be dependent on how well you know the material in this syllabus; against that, almost any amount of hands-on experience will pale.

Although O'Reilly's Nutshell series is intended as a set of "Desktop Reference" manuals, I have to recommend this one as a good all-round read: not only as a primer for LPI certification, but as an excellent introductory text on GNU/Linux. In all, this is a valuable addition to O'Reilly's already packed stable of Linux titles and I look forward to more from the author. - Rory Beaton End of Review


Ollivier Dyens.
Metal and Flesh - The Evolution of Man: Technology Takes Over.
Cambridge, Mass.: MIT Press, 2001.
paper, 178 p., ISBN 0-262-04200-2, US$24.95.

Ollivier Dyens. Metal and Flesh - The Evolution of Man: Technology Takes Over.

This is a collection of writings on the emergence of cultural biology. It consists of three sections in the form of loosely connected thoughts revolving around two central themes: how technology transforms our perception of the world, and how culture is taking on a life of its own - what Dyens calls 'cultural biology'.

In the first section, "The Crater in the Yucatan", Dyens talks about the profound transformations that are underway. These have the effect that biology is no longer of significance, being replaced by culture. According to Richard Dawkins' theory of evolution, as described in The Selfish Gene, to which Dyens refers, life is only relevant as a vehicle for genes to survive and replicate. Indeed, as Dawkins goes on, there are even more fundamental units called 'replicators' whose goal is to disseminate through reproduction. These are the building blocks for everything: genes, viruses and ideas. Dyens, however, goes further by saying that the vehicle for replicators needn't be biological. Media environments such as the Internet and telecommunications enable replicators to spread without an organic being. As an example, Pamela Anderson's persona, like her well-known assets, is not real but televisual and cultural.

In the next section, "More or Less Alive," Dyens asks us to re-assess the very notion of aliveness. Can viruses, for example, be considered living? Doyne Farmer, an artificial life scientist, suggests that a living being must have, amongst other features:

  • Self-reproduction
  • Information storage
  • Metabolism
  • Functional interactions
  • The ability to evolve

They must also have the ability to defend themselves and to read an environment and their enemies, i.e. the ability to manipulate representations. By this measure, viruses can be considered to be 'alive'. Similarly, non-biological entities such as expert systems, computer viruses or artificial intelligence could conceivably be considered to be 'alive'.

In the last section, "The Rise of Cultural Bodies", he describes how the physical body can be transformed completely. He uses examples from literature and history: The Island of Dr Moreau by H.G. Wells, George Orwell's 1984, Kafka's Metamorphosis, and Nazi ideology. A common feature of all these writings is the pain inflicted on the victims, who are used as a way of disseminating an ideology over time.

Overall, I thought this was a very well researched book with a large number of references. It is certainly thought-provoking and complex.

I found the book quite hard to begin with, but this became easier as I progressed; the various ideas are quite interlinked and I got used to the writing style.

The structure of the book is slightly different from the usual, and this probably accounted for the initial difficulty as topics don't follow on from each other sequentially. The two main themes (how technology transforms our perception of the world and how culture is taking a life of its own) are explored from different perspectives in the three sections.

I don't think the ideas Dyens covers are revolutionary; they have already been considered by the likes of H.G. Wells, George Orwell, Kafka and Richard Dawkins. What he has done is pull together their ideas, extend them, and apply them to a modern, twenty-first-century context.

Where the book is lacking, in my opinion, is that it does not consider the effect of conflict between societies which may have developed at different rates - for example, the developed versus the developing world, or deprived sections of society which may not be so heavily influenced by technology or a global culture. There is also the risk that, by inhabiting cyber worlds, we might neglect other aspects of our world, such as the environment or the needs of those less fortunate than us.

In conclusion, I would say that I am glad I read this book, as it gave me a new perspective; it is complex, but challenging, stimulating and thought-provoking. - Kamal Khan End of Review


Scott Hawkins.
APACHE Web Server Administration and e-Commerce Handbook.
Upper Saddle River, N.J.: Prentice Hall, 2001.
paper, 384 p., with CD-ROM, ISBN 0-130-89873-2, US$40.49.

Scott Hawkins. APACHE Web Server Administration and e-Commerce Handbook.

The APACHE Web Server Administration and e-Commerce Handbook is aimed at APACHE administrators of all skill levels. At the time of writing, the APACHE server application is installed on over 61.5 percent of all Web servers worldwide. Thanks to the open source software movement it costs little or nothing to obtain, and it consistently outperforms the available competition.

The Handbook is intended primarily as a general and varied tutorial, combined with a useful reference for the APACHE Web server administrator. The reader will need some familiarity with computer systems, but not necessarily a networking background. The main theme throughout is e-commerce, although the reader will find very extensive appendices covering a variety of peripheral information, such as name resolution, TCP/IP, and regular expressions, all of which are handy for creating a functioning and stable Web server environment. As the author Scott Hawkins points out early on, the book is biased towards the UNIX/Linux operating system, as this was his main development and testing platform. However, much of the book has also been tested on Windows NT, Windows 9x and the UNIX-based Mac OS X.

The Handbook also covers topics such as electronic payment and database interaction for e-commerce-driven Web sites. Many chapters are in the form of structured essays on various topics, such as virtual hosting; moreover, they can be read in any order and still make perfect sense, even to a lay person. But do not think that the Handbook is aimed primarily at non-professionals. Seasoned 'admins' reading this review also need to brush up on new implementations of the APACHE distribution: modules are frequently modified, amended, removed or included in the next release.

There are three main parts: Part I - The Basics, Part II - Advanced Administration, and Part III - e-Commerce. The appendices effectively serve as a Part IV, as they are extensive and cover a lot of information concisely. The material within the appendices is either background information, such as UNIX concepts, or deals with directive syntax issues which don't quite fit into the rest of the Handbook. There are many examples of commands and configuration directives, accompanied by concise instructions and, now and again, sample output. Another bonus is the inclusion of the PHP command reference.

Part I covers the basic concepts and techniques of APACHE administration, walking the reader through the process of obtaining an APACHE distribution, installing the server, and configuring the environment to suit the administrator's requirements. Part II brings in more advanced techniques and features of APACHE. Once readers have got their heads around the basics, they are free to delve into the more advanced topics that make up Part II, such as hosting multiple sites, proxy servers and caching, logs and monitoring, security, dynamic content, performance tuning, URL rewriting and module construction. Part III is focused on e-commerce-driven sites, database connectivity, and the implementation of mechanisms for collecting payments made over the Web.
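To give a flavour of the directive syntax the Handbook documents, a minimal name-based virtual host of the kind Part II's multiple-site hosting chapter covers might look like this (a sketch of my own for the Apache 1.3 line; the server name and paths are placeholders, not an excerpt from the book):

```
# Hypothetical httpd.conf fragment; server name and paths are invented.
NameVirtualHost *:80

<VirtualHost *:80>
    ServerName   www.example.com
    DocumentRoot /var/www/example
    ErrorLog     logs/example-error_log
    CustomLog    logs/example-access_log common
</VirtualHost>
```

Each additional site on the same IP address gets its own VirtualHost block, with Apache selecting among them by the Host header the browser sends.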

The Handbook is accompanied by a CD-ROM containing an extensive APACHE Toolkit plus e-commerce templates that will enable the reader to just jump in and get some very welcome hands-on experience. Also on the CD-ROM are APACHE binaries with source code, MySQL, PHP4 and Mod_perl examples.

All in all, Scott Hawkins and the publisher have done an excellent job of creating this comprehensive guide. Whether you are a total novice to the APACHE server or a professional administrator, you will find much that is not covered by any other APACHE Guide. It is both a complete APACHE tutorial and a valuable reference work. - Glenn Dalgarno End of Review


Tim Jordan.
Cyberpower: The Culture and Politics of Cyberspace and the Internet.
New York: Routledge, 2000, c. 1999.
paper, 264 p., ISBN 0-415-17078-8, US$27.95.

Tim Jordan. Cyberpower: The Culture and Politics of Cyberspace and the Internet.

Jordan is Senior Lecturer in the Department of Sociology at the University of East London and sets the scene by discussing three theories of power: Weber's Power as a possession, Barnes' Power as a social order and Foucault's Power as domination.

Each chapter is structured so that the key concepts are explained as the chapter is introduced. This makes the book easy to dip into: although individual chapters refer back to the main theories, they stand alone - a great device for those of us unable to read the book in a single sitting.

Using what he refers to as "Myths of Cyberspace", for example the story of Blair Newman's Cybersuicide, Jordan clearly illustrates the sources and types of power available to the individual and groups in cyberspace. By mapping what are, in some instances, well known stories against the sociological theories Jordan manages to explain his thesis in an easily understood and compelling manner. His language, while technical, is easily understood and does not obstruct the flow of his arguments. This is exemplified by the description of cyberpower at the beginning of the final chapter:

"Cyberpower is the form of power that structures culture and politics in Cyberspace and on the Internet. It consists of three interrelated regions: the individual, the social and the imaginary. Cyberpower of the individual consists of avatars, virtual hierarchies and informational space and results in cyberpolitics. Power here appears as a possession of individuals. Cyberpower of the social is structured by the technopower spiral and the informational space of flows and results in the virtual elite. Power here appears in the form of domination. Cyberpower of the imaginary consists of the utopia and dystopia that make up the virtual imaginary. Power here appears as the constituent of social order. All three regions are needed to map Cyberpower in total and no region is dominant over any other." (p. 208).

While most of us might perceive these elements of our online life, Jordan manages to elucidate them in a coherent way that I have certainly found clarifies many of my perceptions.

His analysis of power and politics in the different types of online community - MUDs, Usenet and the Web - is compared and contrasted with the imagined cyberspaces of Gibson and Quarterman. Here, for example, he compares descriptions of fictional characters with the characters adopted by users in MUDs; already we see the parallels, although there is no discussion of how much each of these worlds influences the other. Are MUD avatars perhaps selected and described after reading Neuromancer?

Jordan summarises his book with the reassuring thought that cyberspace will not be dominated by the cyber elites any more than it will offer the libertarian ideal of individual empowerment. I'm not sure that reading this book will help many of us to guard against the first or move towards the second. Nevertheless, for an educated and informative guide to the processes that make the politics of the Internet work and the theories behind them, this is a great book which clearly maps the power structures and explains how they operate. - Nigel Gibson BSc (Hons) End of Review


Jeannine M.E. Klein.
Building Enhanced HTML Help With DHTML and CSS.
Upper Saddle River, N.J.: Prentice Hall, 2001.
paper, 380 p., ISBN 0-130-17929-9, US$35.99.

Jeannine M.E. Klein. Building Enhanced HTML Help With DHTML and CSS.

Building Enhanced HTML Help With DHTML and CSS was not the book I expected from just looking at the cover. The title might mislead you at first, especially if you only consider the references to HTML, DHTML and CSS, and you might be disappointed, for it is primarily a guide to creating state-of-the-art online Help applications for the Microsoft Windows operating system. Once I had realised my error, I began to see the possibilities that an Enhanced HTML Help system might have within the Windows environment, as I have often found myself using the GUI of just such a Help system.

The accessibility of the writing makes the topics covered easy to digest, even for an experienced user of DHTML and CSS. The book does not set out to discuss Enhanced HTML Help from a programming point of view but from an author's perspective; as such, it does not discuss the way developers integrate the HTML Help file within an application's interface. Although the book covers the fundamentals of authoring with HTML, DHTML and CSS, its primary aim is not to teach the reader these languages in depth. The reader will, however, gain enough of the language basics to be able to author a simple Enhanced HTML Help file. The book provides everything needed to create an attractive and useful compiled project using Microsoft's HTML Help authoring application. Along the way the reader will also acquire the necessary skills for developing HTML Help documents that have been styled with CSS and made dynamic by the use of DHTML.

If you are a complete beginner you will discover:

  • how to structure the elements of the compiled HTML Help system
  • how to bring in functionality
  • some tricks and tips for using the HTML Help Workshop
  • the inner workings of the HTML Help navigation pane
  • fifty of the most used CSS properties for paragraphs, styling and layout
  • pre-canned JavaScript code for pop-ups and text rollovers.

On the other hand, this book will also provide the experienced reader with a framework that brings all the elements and different aspects together in a single volume. So if you already have an understanding of the HTML Help system you have not been left out, as there is much that you may find interesting: for example, Microsoft's site-map format for tables of contents and indexes, and window definitions, their meanings, and parameters with recommended values. You will learn how to open HTML Help code and tweak contents which are not normally accessible via the HTML Help Workshop, find work-arounds for the known HTML Help bugs, and avoid some of the pitfalls that befall modularly designed HTML Help systems.

The developed end-product might be deployed as an online user assistance module, as a stand-alone tool or e-book for the author's own purposes, or as part of a corporate online Help resource catering for hundreds or even thousands of staff members. Because J.M.E. Klein has over ten years' experience developing such online Enhanced Help solutions for corporate environments, readers will get insights from a true pioneer of building effective online Enhanced Help applications, based entirely upon proven methodologies and real-world solutions from which basic building blocks can be developed.

No previous knowledge is required beyond a familiarity with the Windows platform, experience of using online HTML Help applications, and some proficiency in the hypertext languages. So whether you are a technical writer, content coordinator, documentation manager or developer, you will find this an ideal sourcebook. From project files to final distribution to your audiences, it will help you or your company reap the many benefits of an effective online HTML Help system. - Glenn Dalgarno End of Review


Jane Margolis and Allan Fisher.
Unlocking the Clubhouse: Women in Computing.
Cambridge, Mass.: MIT Press, 2002, c. 1999.
cloth, 180 p., ISBN 0-262-13398-9, US$24.95/UK£16.95.

Jane Margolis and Allan Fisher. Unlocking the Clubhouse: Women in Computing.

It can't be easy to write up a piece of rigorous, academic research into something practical and readable, but the authors of this book have done just that. They strike a balance between a scholarly presentation of research findings and a fascinating account of students' experiences of studying Computer Science (CS) at the Carnegie Mellon School of Computer Science (SCS). The purpose of the research was to discover why more female students do not choose CS for their degree subject, and why so many of those who do choose it don't complete the course.

Margolis, Fisher and their research associate, Faye Miller, spent four years, from 1995 to 1999, conducting over 260 interviews with more than 100 students, male and female. By seeing some of the students more than once they were able to plot the highs and lows of confidence and interest. Their research methods are set out in the book's Appendix, providing a framework for others to carry out similar research, whether in terms of gender, race or class.

There are also some illuminating examples from prior research to strengthen the discussion. We are reminded that boys are conditioned from an early age to be adventurous and take risks, while girls are taught to be cautious and not stray too far. A 1968 "roaming radius" study of mothers and their four-year-old children is quoted: girls were regularly called back, while boys were allowed to stray much further afield. Similarly, a late-1970s study found that a boy of 10 or 12 might travel 2,452 yards before turning back, while a girl of the same age would return after only 959 yards. No wonder so many female students are hesitant in later life, and avoid taking the risks and meeting the challenges which are so vital to learning, whether in computing or in any other subject.

One thing that makes the book compelling reading is the way the individual students are presented as real people, not just statistics. Some of them are described quite vividly: there's one who rollerblades into the office for his interview (p. 21), another who turns up dressed in 'goth-nerd' style (p. 46). The interviews were taped, and the many brief quotations, with all the colloquialisms and speech oddities left in, are one of the most entertaining aspects of the book. Here is one example, from a female student:

"When I have free time, I don't spend it reading machine learning books or robotics books like these other guys here. It's like, "Oh, my gosh, this isn't for me." It's their hobby. They all start reading machine learning books or robotics books or build a little robot or something, and I'm not like that at all. In my free time, I prefer to read a good fiction book or learn how to do photography or something different, whereas that's their hobby, it's their work, it's their one goal. I'm just not like that at all. I don't dream in code like they do." (p. 5)

The point that emerges time and time again is that women have diverse and wide-ranging interests while men tend to focus on one thing. We can probably all think of someone who disproves this generalisation, but on the whole it does seem to be true. A female CS teacher remembers her time at college: just because the boys stayed up all night computing and the girls didn't, it didn't mean that the girls were any less dedicated or any less 'cut out' for the subject:

"If you are looking for this type of obsessive behavior, then you are looking for a typically young, male behavior. While some girls will exhibit it, most won't. But it doesn't mean that they don't love computer science!" (p. 75)

Another difference is that female students seem more interested in how computing connects with other fields, such as medical research, and want to use technology to make the world a better place. According to one researcher:

"The feminine take on technology looks right through the machine to its social function, while the masculine view is more likely to be focused on the machine itself." (pp. 55-56)

This finding has implications for curriculum development, where the focus has traditionally been on programming for its own sake, in isolation from any meaningful social context.

It emerges from the study that many young girls grow up with an interest in computers: not surprisingly, the way this interest is met by parents and teachers has an enormous influence. Technophobic mothers emerge as negative role models - though this in itself can act as a spur for any daughter who decides she does not want to be like her Mom. More surprising is the fact that even the most enlightened and computer-literate parents choose to put the family computer in the son's bedroom, thereby creating access problems for a daughter who would rather like to use it too. And one father who installed the computer in his bedroom, on his side of the bed, justified his choice as follows:

"I paid for it; I bought it. ... I'm the father, and I make the rules around here." (p. 23)

A lot can go wrong at school too. A teacher might hold a parents' evening to explain the CS curriculum, and find that only parents of boys attend. A computing teacher might belittle a female pupil in front of a class of predominantly male pupils. And at school and college alike, male students might turn the computer lab into a male preserve, assume an air of knowing everything when they don't, and make remarks about the females being the 'token girls' in the class, just there for political correctness. There is even a level of sexual harassment in remarks like the following, recalled by a female student:

"Girls ... they just bring you girls here to make our computer science department look better. ... They don't really expect you to be able to code, but if you need help, you got the goods to get help from any guy you want." (p. 84)

Girls need a vast amount of confidence in order to withstand such comments. But this in itself is a key issue: girls lose confidence more easily than boys, and loss of confidence leads to loss of interest in the subject. Girls apparently place too much emphasis on what they don't know, and dismiss or minimise what they do know. When girls do well they put it down to luck, whereas when they fail they blame their own perceived lack of ability; boys tend to retain their confidence even when they fail, putting it down to external factors (p. 114). Reinforcing this is a general attitude of lower expectations for female students.

In the western world there are plenty of women who teach computer-related subjects such as word processing, spreadsheets, databases, Web design and e-conferencing. But Margolis and Fisher stress that they are looking at CS in terms of innovation: writing programs, designing new systems, working at the cutting edge of technology. It's not enough to know how to use computers: women should be involved in design and creation too. The gender distinction in a thirty-year-old children's book, I'm Glad I'm a Boy! I'm Glad I'm a Girl! still remains in people's attitudes today: "boys invent things and girls use things that boys invent" (p. 2).

Since the thalidomide tragedy of the early 1960s, when it took a female scientist to point out that drugs taken by the mother can affect the foetus, it has been recognised that a female presence among the experts adds a different and valuable perspective. Similarly, the absence of women engineers led to the first generation of airbags in cars being designed with adult males in mind, resulting in deaths of women and children that could have been avoided. Male engineers designed artificial heart valves for male-sized hearts. And so the list goes on. If women are not part of the design team, the results will be unbalanced, as borne out by the continued existence of violent computer games. On the other hand, it has to be admitted that without 'obsessive' male behaviour computers and the Internet would probably not exist at all. That being said, it's time to look to the future: the way is now open as never before for women to make a contribution. The cover design of Unlocking the Clubhouse says it all: a close-up view of the Enter key on a computer keyboard presents an open invitation to girls and women to join the club and make a difference.

For this to happen, the teaching of CS needs to adapt itself to the needs of female students, rather than the female students being shaped and fitted into the existing structure. In Chapter 7 of the book, Margolis and Fisher describe a summer school programme they ran from 1997 to 1999 for 240 high school CS teachers. The aim was to find ways of bringing more girls into CS by changing the way they are recruited, and changing the design of the curriculum itself. The programme was a great success: in a sample of five schools the proportion of girls studying CS increased from an average of 13 percent before to an average of 30 percent after (p. 126). Similarly, a graph showing enrollment trends for women entering SCS at Carnegie Mellon shows a dramatic rise from seven percent in 1995 to 42 percent in 2000 (p. 137, Figure 8.1).

However, as Margolis and Fisher point out, there's no 'quick fix': changes take time, and have to be sustained. The image in the popular imagination is probably still the male 'geek' caricature: brilliant, heroic, but socially maladjusted, as reflected in the sub-title of Robert X. Cringely's bestseller Accidental Empires: How the Boys of Silicon Valley Make Their Millions, Battle Foreign Competition, and Still Can't Get a Date (1996). The absence of women in Cringely's book serves as a reminder of just how marginal women have so far been in the development of modern computing. It's time not just to unlock the clubhouse, but also to knock down a few walls and rearrange the furniture. - Dr. Gill Stoker End of Review


Jef Raskin.
The Humane Interface: New Directions for Designing Interactive Systems.
Boston: Addison-Wesley, 2000.
paper, 235 p., ISBN 0-201-37937-6, US$24.95.

Jef Raskin. The Humane Interface: New Directions for Designing Interactive Systems.

The area of computer user-interface design is, as far as end-users are concerned, possibly one of the least exciting topics, despite being the one most relevant to them. On a daily basis we battle against poorly-designed widgets and interface elements, often having to behave the way the computer tells us to when performing an action. Yet we should be able to come up with something better, if only we stopped for a minute and started thinking about it.

Jef Raskin, one of the creators of the interface for the Apple Macintosh, has written a wonderful book in which he reflects on the current state of interface design and on how many of the elements we take for granted do not, after all, live up to their raison d'être: usability. From the very beginning, you get the impression that Raskin is someone who really knows a great deal about GUIs. He is familiar with the underlying psychological and cognitive theories that must be taken into consideration when designing a sound interaction mechanism which does not get in the way of users, yet allows them to carry out their work most effectively.

In eight chapters, Raskin introduces the concept of human-centered design and the definition of a humane interface, and covers cognetics, meanings, modes, quantification, unification and navigation. The last chapter identifies and addresses some issues that are not directly related to the front-end but are, as Raskin calls them, Interface Issues Outside the User Interface. Here, programming environments, the seemingly unstoppable growth of cable diversity, and ethical questions related to building interfaces are examined.

Some sections are fairly technical, as in the case of interface timing and the measurement of interface efficiency. However, most of the book is a pleasure to read, if only because it exposes the many failures of modern operating systems created by the major industry players. In this respect, one of the most enlightening illustrations relates to the use of icons:

"Icons contribute to visual attractiveness of an interface and, under the appropriate circumstances, can contribute to clarity; however, the failings of icons have become clearer with time. For example, both the Mac and Windows 95 operating systems now provide aids to explain icons: When you point at the icon, a small text box appears that tells you what the icon stands for. The obvious reaction, which I have observed repeatedly when users first see this facility, is Why not just use the words in the first place? Why not indeed? Instead of icons explaining, we have found that icons often require explanation."

Once you reach the book's last page, it becomes clear that, in general, too little attention is given to the user interface; good design is probably difficult because not enough time is spent analysing software products for usability. A paradox, of course, as the UI is the front-end which mediates between that very product's functionality and its users.

So, what can be done to improve the situation? In a section called "Intuitive and Natural Interfaces", Raskin examines an example of an alternative to desktop-oriented graphical interfaces: ZoomWorld. This particular zooming interface paradigm (ZIP) allows the user to locate information as if floating above it, with the possibility of zooming in or out, depending on whether one needs to see the details or the overall picture. Such an interface, called PAD++, is a good starting point for anybody who wants to explore ways of interacting with a computer that are not based on the usual metaphors.

The message coming out of the book is: it is the responsibility of developers and designers to provide a system that frees the user from any unnecessary constraints. One of the best examples of such an approach to be found within the book is the following: "If a control must always (or never) be operated, don't provide it." There's a lot behind that sentence.

I am inclined to think that The Humane Interface should be compulsory reading for most computer users, for we, as users, ought to be experts on the matter. Allowing only the industry players to take care of it can be detrimental:

"There is a tendency in the computer industry to conform whether or not it has a productive outcome. ... Conforming and having a standard design are very important ... because it takes the user less time to get up to speed ... but if by conforming or standardizing, you are creating uselessness, then you have failed in your design." This is exactly one of the points made a few years back in Grudin's well-known article "The Case Against User Interface Consistency". Evidently, Grudin's analysis has not yet permeated the industry: a standard should be abandoned when it is demonstrably harmful to productivity or user satisfaction. - Paolo G. Cordone End of Review


Mai-lan Tomsen.
Killer Content: Strategies for Web Content and E-Commerce.
Boston: Addison-Wesley, 2000.
paper, 240 p., ISBN: 0-201-65786-4, US$34.95.

Mai-lan Tomsen. Killer Content: Strategies for Web Content and E-Commerce.

If phrases like "diversification of revenue streams" and "monetizing value exchange" resonate, then this is the book for you. I did my best to ignore that unappealing and overused "killer" word in the title, but it was not long (p. 18) before another hackneyed phrase, "content is king", made its entrance, "killer app" having sidled in even earlier via the preface. It took quite a while, then, to persevere, overcome my (perhaps irrational) aversion to the vocabulary, and investigate the content.

Taking the reader through a litany of business models and strategies for Web-based commerce, the book includes a perfunctory history of the Net. This "history" reveals somewhat disparagingly that ARPANET was only about "information exchange", and that with the advent of browsers the Web was able to evolve into its arguably higher level of existence as a medium for "value exchange". The basic premise is that value exchange is the relationship between Web site and visitor based around content, and the book outlines how to optimise this relationship in the context of e-commerce. The first part explores the definition of value exchange via the framework of various business models. Part two is more practical in that it focuses on specific strategies and techniques, such as the use of cookies, the handling of online payments and so on.

Apparently targeted at both Web publishers and Web users, this well-structured little book turns out to be quite a useful introduction to e-commerce, if you can get past the 'business speak'. I particularly liked Chapter 6, "Designing Web Information Structure", which, although brief, at least gives a nod to the importance of good usability in general and effective navigation in particular. Like many similar books, it ran the risk of being out of date before the printing ink was dry, being heavily reliant on case studies, some of which laud Web sites that have since met their demise. This is of course inevitable given the subject matter, but could perhaps have been alleviated by adopting a rather more critical approach both to the sites offered as exemplars and to the strategies and techniques described. - Jenny Le Peuple End of Review


Copyright ©2002, First Monday
