Host, by Peter James

ISBN: 9780575056190
Amazon ID: B004BDOJYU

This is one of Peter's earlier books (published in 1993, well before the well-known Roy Grace series) and while it's a pretty good story, there are some things in it that one wouldn't expect to encounter in his later books.

For example:

Someone correctly calculates a quantity of data to be "approximately one hundred million megabytes". Whether you regard a megabyte as 1000 x 1000 bytes or 1024 x 1024 bytes, the approximation still holds.

In the next sentence uttered by the same person, this has become "ten million gigabytes": suddenly 100x as much.

Later on, this data has been transferred to a "terabyte tape". It's not entirely clear whether the tape holds exactly one terabyte or simply some number of terabytes, but for what it's worth, a thousand terabytes (one petabyte) sits (geometrically) precisely in the middle of the two figures quoted previously.
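
The arithmetic is easy to check; here is a quick Python sketch of the three figures, assuming decimal (SI) units throughout:

    MB = 10**6
    GB = 10**9
    TB = 10**12

    figure_1 = 100_000_000 * MB  # "one hundred million megabytes" = 1e14 bytes = 100 TB
    figure_2 = 10_000_000 * GB   # "ten million gigabytes"         = 1e16 bytes = 10,000 TB

    print(figure_2 / figure_1)                # 100.0  -> the second figure is 100x the first
    print((figure_1 * figure_2) ** 0.5 / TB)  # 1000.0 -> geometric mean is 1,000 TB (a petabyte)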

On the other hand, I do appreciate the artistic / technical / financial licence with which the author in 1993 discusses "terabyte tapes", well before the days in which a terabyte is more or less the smallest hard disk you can buy (I'm writing this in 2021), SD cards of ¼ terabyte are sold for trivial prices in supermarkets, and terabyte SSDs are not unusual in domestic computers.

This book was published in 1993, the year before Thinking Machines Corporation (in business since 1983, with the corporate motto "We're building a machine that will be proud of us") declared bankruptcy.

The "logic" the person uses is that "if it was just noise, sooner or later it would form repeating patterns. If it was [genuine] information, there would be no repeating pattern, because it would all be different."

I find this intriguing, especially in the light of Claude Shannon's work on information entropy, which is now the basis of compression algorithms. These manage to compress meaningful information (thus demonstrating that it contains repeating patterns which can be replaced by shorter symbols), yet cannot to any useful extent compress random data (today's equivalent of "electrical noise"), precisely because there are no repeating patterns to exploit. In other words, the opposite of the character's reasoning.
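
A minimal Python sketch of the effect, using the standard-library zlib module (the exact byte counts will vary, but the asymmetry will not):

    import os
    import zlib

    # Meaningful text is full of repeating patterns, so it compresses well.
    text = b"The quick brown fox jumps over the lazy dog. " * 1000
    print(len(text), "->", len(zlib.compress(text)))    # 45000 -> a few hundred bytes

    # Random bytes (today's "electrical noise") contain no patterns to
    # exploit, so they barely compress; they may even grow slightly.
    noise = os.urandom(45000)
    print(len(noise), "->", len(zlib.compress(noise)))  # 45000 -> roughly 45000 or a bit more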

One of the main characters in the book, who claims to believe that a dead person's brain can be restored to life, either by freezing the body and later repairing the cause of death, or as a copy running in a suitable computer, remains obtuse for an astonishingly long time and refuses to believe it when this actually happens. The lengths to which this person goes to deny what is plainly going on beggar belief.

Pretty late in the book, one of the characters, who has a doctorate in neuroscience and now works in artificial intelligence, encounters a signed document with "tall thin handwriting, slightly uneven: he could see the hint of mental instability in the [person's] signature". Quite impressive for someone with no graphological training.

The eBook version also suffers unduly from having clearly been scanned from printed text and inadequately proof-read. Given that the author himself claims in the foreword that this was the first electronic book ever to be published, I can't help wondering whether the original electronic text was badly proof-read, or whether the current edition simply wasn't created from it and has acquired its own typographical errors.

