Retrocomputing Advice: Linksys ProConnect KVM2KIT

If you have a retrocomputing hobby, and you don’t have a lot of room (like most of us), it can be difficult to leave your stuff set up so you can enjoy it when you need to take a break.

One way to mitigate that problem is to use a KVM switch, so you can have PCs from multiple eras sharing the same desk, keyboard, mouse, and monitor.

I actually received my first Linksys ProConnect KVM2KIT in a gift of hand-me-down hardware and discovered that it is just about the most perfect KVM you could want for retrocomputing.

First, it switches two PS/2 ports and a VGA monitor. The only thing it’s missing is audio.

Second, you switch between the two PCs by double-tapping Scroll Lock, so you can hide the unit itself under or behind your desk, avoiding the anachronistic-looking box or button that something like the Belkin Flip F1DG102P would leave marring your setup.

Third, it requires no external power source, being powered by the PS/2 keyboard connector.

Fourth, it’s not that expensive on eBay.

Fifth, if you need it, you can set it to automatically cycle between its two inputs at a configurable rate… useful when you are doing something else while babysitting some install wizards.

As mentioned, it does lack audio switching, so you’ll have to come up with a solution for that on your own (two pairs of speakers, feeding the speaker out from one PC into the line in from another, a separate switch box, a mixer, etc.) but, aside from that, I have no complaints.

It Just Works™ and makes for a very comfortable and unobtrusive way to put two retrocomputers on one desk. Pair it with an S-Video or composite-to-VGA converter that supports VGA pass-through when not in use and is OK with more esoteric signals, and you could add a console or an 8-bit micro to the mix.

Posted in Retrocomputing | Leave a comment

Incognito Mode for Zsh

Incognito Mode. We all use it from time to time, but, if you’re a heavy terminal user, you might use your terminal as often and in as many diverse ways as your browsers.

…so I thought, why not make an incognito mode for Zsh?

With this script, typing anonsh will reconfigure an open zsh instance in several ways:

  • HISTFILE will be unset so you still have a command history, but new entries won’t get copied from RAM to disk and will get erased when you exit the shell.
  • If you’re using the zsh_hardstatus_precmd function from my custom prompt, it will be redefined to show anonsh instead of the command name or working directory.
  • If the %1~ token for displaying the current path is found in your PS1/PS2/PS3/PS4 prompt variables (or the $base_prompt from my custom .zshrc), then it will be replaced with a literal ... so that clear can’t leave anything problematic behind.
  • clear will be redefined so that it also uses every method I could find to ask your terminal to clear its scrollback buffer, as well as asking GNU screen to empty its own scrollback buffer if $TERM is set to screen.
  • Zsh will be configured to call clear before exiting to clear any scrollback that might otherwise hang around if you are using termcapinfo xterm* ti@:te@ in your .screenrc.
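The post links to the script rather than quoting it, but the behavior in the list above could be sketched roughly like this. Everything below is an illustrative assumption, not the actual script: the real anonsh presumably tries more scrollback-clearing escape sequences, and details like the 15000-line scrollback restore value are placeholders.

```shell
# Hypothetical sketch of an "anonsh" function (NOT the actual script):
# stop persisting history, scrub the cwd from the prompt, and make
# `clear` also wipe terminal/screen scrollback.
anonsh() {
    # New history entries stay in RAM only and vanish when the shell exits.
    unset HISTFILE

    # Replace the %1~ (current directory) prompt token with a literal
    # "..." so `clear` can't leave a sensitive path on screen.
    PS1="${PS1//\%1~/...}"

    # Redefine `clear`: ESC[3J is the xterm "erase saved lines" request;
    # a real implementation would try other emulators' sequences too.
    clear() {
        printf '\033[H\033[2J\033[3J'
        case "$TERM" in
            screen*)
                # Ask GNU screen to drop its own scrollback as well
                # (shrink the buffer to zero lines, then restore it;
                # 15000 here is an arbitrary placeholder value).
                command -v screen >/dev/null 2>&1 &&
                    screen -X scrollback 0 &&
                    screen -X scrollback 15000
                ;;
        esac
    }

    # In zsh, the special zshexit function runs when the shell exits,
    # so any lingering scrollback gets cleared on the way out.
    if [ -n "${ZSH_VERSION-}" ]; then
        zshexit() { clear; }
    fi
}
```

Note that since HISTFILE is only unset, not truncated, any history written to disk before invoking anonsh is left untouched; only new entries are kept out of the file.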

Enjoy. 🙂

Posted in Geek Stuff | Leave a comment

Fanfiction – Invincible

Today’s fic is a little Harry Potter disaster movie slash Battlestar Galactica crossover named Invincible by Darth Marrs.

The basic concept is that the plot of the movie 2012 has come to the Harry Potter setting, but the radiation involved happens to resonate with magic and become much more lethal to magical plants and animals, humans included.

Nothing magical will be able to survive those radiation levels even if arks are built, and the magical governments of the world are determined to prevent a panic, so they keep it a secret. However, because Hermione was one of the researchers first called in to identify the problem, Harry learns of it and, inspired by a memory of a friend showing him Space Battleship Yamato, he sets out on a desperate effort to buy a decommissioned aircraft carrier, the HMS Invincible, and convert it into a spaceborne ark.

Of course, things can never be that simple. The purebloods are convinced of a goblin prophecy that will allow them to survive in the goblin caves, but the goblins will only allow wizards in if Harry’s project is stopped. The purebloods see this as an opportunity to cleanse all the muggleborns and halfbloods from their society, so Harry’s efforts are quickly declared illegal, even as he allies with the Americans to share technology and magical innovation so they can build their own pair of ships. (Spoiler: There is no prophecy. The goblin religion forbids them from leaving Earth and they’re determined to take wizardkind with them into oblivion as one final act of spite.)

This is where one of my reasons for marking this story down a bit comes in: It’s tagged and presented as a Harry Potter – Battlestar Galactica crossover with elements of 2012 but, for the entire first half of the story, it’s a Harry Potter – 2012 crossover without a hint of Battlestar Galactica in sight. Having my expectations primed that way, I had trouble relaxing and enjoying the first half because I kept wanting to “rush through the introduction that has gone on far too long and get to the actual story”.

We’re talking “Battlestar Galactica doesn’t come in until chapter 16 of 33” in a novel-length story, 140,082 words long.

Still, for what it is, the first half is well-written… I just had trouble enjoying it for the same reason I rushed through more or less all of David Brin’s Sundiver. Improperly set expectations. I won’t mark it down too much for that… but I can’t ignore it entirely in my final scoring.

Anyway, the second half of the story starts with them discovering some survivors from Helena Cain’s illegal looting and pressganging of surviving civilian ships.

What follows is a story about Harry and company meeting up with Colonial survivors, dealing with the Cylons, and finally settling in an O’Neill cylinder built and abandoned by the Lords of Kobol.

Since all but one of my remaining problems with the story seem to stem from a common mismatch between my own philosophical outlook and that of the author, I think I’ll address them together in one go before we get to the good parts:

I get a general sense from this story of a setting which implicitly penalizes the human drive to grow beyond what we are, but, more importantly, a story where the author sees nothing wrong with that… a setting which inherently pits science and magic at odds with each other as means for the betterment of our species, and casts science as wanting.

Most notably, it shows in the core rationale for forcing the colonials to choose between allying with wizardkind or trying to make peace with the Cylons in the canonical way: Cylons have no souls, and any descendants of human-form cylons, no matter how distant, will forever lose access to magic. (Specifically, the choice is cast as between immortality and magic.)

I could buy something like the teleporters that led to the situation in the My Little Pony sci-fi story Just Like Magic of Old, where the technology is flawed and the cylon resurrection process breaks the connection with whatever the source of magic is. Then, it’s a simple cautionary tale about trying to run before you’ve learned to walk, and nothing against science itself or our ability to better ourselves through dedication to studying the world.

…but saying cylons have no souls and having prophecy indicate that interbreeding with them means abandoning magic forever has an undertone of “once a machine, always a machine”, that humans are the only “true” sapients and non-human sapients are impostors, not worthy of having the same aspirations.

It also feels like an opportunity for a slippery slope into exactly the mindset the colonials have which caused all the trouble in the first place and gives me a sense that the author might have that religious mindset that sees it as desirable that we be forever trapped as children under the rule of some immortal parent, rather than being able to achieve adulthood and sit as equals with the highest thinking powers in the universe.

(“Only God can truly create life” and so on. That belief that there’s some essence to the universe that we cannot replicate is a central part of why, despite China having 5,000 years of unbroken history, it was Europe that much more recently developed science and leapfrogged China enough for Britain to dictate terms to them and get a 99-year lease on Hong Kong.)

…but then what do I know? I’m a physicalist for the same reasons Professor Shelly Kagan lays out, and I see it as unpalatable for the existence of the soul in fiction to have any implications beyond the ability to outlive one’s physical form… the ideal way to squash the implications being to have souls arise organically as an inevitable side-effect of sapience.

More generally, I’m always brought back to the Dresden Codak page “Caveman Science Fiction” which casts the recurring “you go too far” trope popular culture uses toward science in light of innovations we take so for granted that nobody will question that they’re a good thing.

This same theme also feels related to another detail I find distasteful: The magic used for things like expansion charms acts like radiation on muggles, further amplifying that “it’s one or the other” element.

Aside from feeling like a needless “this universe throws roadblocks in front of merging the best of magic and science for the betterment of all humanity” decision and opening up plot holes surrounding why all the non-human life in their expanded spaces isn’t dying off too, it reminds me of how one of the purposes of Bible passages like Leviticus 19:19 and Deuteronomy 22:9-11 is to support the concept that we are a special creation, separate from other animals, by establishing a recurring theme that, whether it’s animals, seeds, or clothing, different things must be kept separate… a mindset that I find harmful now, given how much we know about the nature of reality and how much trouble we have with baseless discrimination in society.

(A call-back to our base instinct to separate things we might feel empathy toward into “us” and “the other” and feel no empathy toward the latter.)

As for my last remaining problem? It feels like the story is incomplete as it relates to cylons. I’ll explain further as part of explaining what I did like, so let’s move on to that… and most of what I like has to do with the novel ideas:

First, using magic to solve the problems with an Alcubierre warp drive, resulting in a solution that is both novel and inherently depends on the mixing of magic and science for success.

Second, I like that this is the first story I can remember where it’s entire ships of wizards who meet the colonials, rather than just one or two, so they have far more ability to provide aid to the ships they encounter.

Third, the origins of the Lords of Kobol. (Wizards who lived where the Black Sea now is during the last ice age and fled during the previous occurrence of the 2012 plot… using magical sleeper ships because they lacked the scientific knowledge to go faster than light, and having lost their magic because they were so callous that they did things like slaughtering unicorns with impunity.)

…which is the first (though more minor) facet of what I meant when I said the tale of the cylons felt incomplete. Maybe it was mentioned briefly enough that I missed it while reading, but I don’t remember getting a clear explanation of how the Lords of Kobol went from wizards slaughtering unicorns to a population exclusively made of squibs and muggles. The story even has the characters notice that there had to be more to it than that. (Humans have a tendency to start to skim-read when we feel we know what’s going on, so it’s important for an author to spend time on something proportional to its importance.)

That leads into the next creative part: That the seers in the colonial population are squibs, that chamalla brings out marginal magical ability in the same way that an unnamed cactus-based drug (peyote?) does among Earth seers, and that Laura Roslin may be the most powerful seer in the surviving colonial population. I think this might have been touched on before in other crossovers I’ve read, but never in a way this distinctive and memorable.

Likewise, the idea that they’re a population full of squibs is also used to produce a very clever explanation for Messenger Six: Gaius Baltar is a squib and she’s a Class IV Demon… a malicious manifestation of his guilt born of what little magic he has, which has gained some measure of independent sapience but, because she’s still a part of him, she can’t just be killed.

Now, I’ll get into the cylons. While I find the mechanism a bit distasteful (stunning spells are lethal to cylons and are used to kill all the cylons on the fleet simultaneously so none can escape to act as saboteurs once they realize they’ve been discovered), I do like the novelty of having the Final Five wake up among the cylons and touch off the cylon civil war that way.

The problem I hinted at before is that it’s unfinished in that respect. The civil war removes the cylons as a threat, and… what? Again, unless I missed something, there’s still a population of psychologically human people out there who have every right to live, but who are without souls and who could meet and start interbreeding with souled humans at some point in the future. Either you applaud the extinction of thinking, feeling beings who fought their monstrous kin for the right to live free (which is a morally reprehensible thing to do) or you’ve just kicked resolving that social and political conflict down the road.

Finally, I really like the addition of and detail spent on the O’Neill cylinder. It’s far too rare to find those in fiction and it really lends a sense of how wondrous the sci-fi side of things can be. (For some reason, even the grandest stuff in series like Stargate: SG-1, Mass Effect, and Halo has a sense of being too familiar to do that… not to mention how they’re intrinsically tied to settings that lean toward action and martial conflict rather than the sense of adventure that, aside from Star Trek, seems to be relegated to the comparative obscurity of classic sci-fi novels.)

So, in the end, what do I think about it? I’d give it a 4.2 out of 5. It’s got some very creative elements and, despite its flaws, I enjoyed it. It might have been a 4.3 if I didn’t have so much trouble with the first half because of improperly set expectations, and, had the other flaws not been present, it could easily have been a 4.6.

Posted in Fanfiction | 1 Comment

The Existence of God(s) as Cosmic Horror (Pre-Draft)

I have a somewhat interesting perspective on the possible existence of higher powers… one I haven’t seen someone else really focus on… and I thought it’d make a nice appendix to the book on fiction that I’ve been accumulating notes for over the last two decades.

(If nothing else, as an example of a worldview you may not hold, so you can practice recognizing the defining traits of a worldview and embody yours in your writing more effectively.)

Now, given what I specifically plan to focus on, it’s going to take me several cycles of writing out a draft to tease insight out of my brain, then distilling the points that are sufficiently on-topic back into point-form notes, and then repeating… possibly waiting a week in the process to let my perspective on the writing shift… but the first such “just let things flow” draft was interesting enough that, despite being all over the place, I was encouraged to share it, so here we go.

(Bear in mind that I imagined what are in-line quotes in this post to be floated asides. I just don’t feel it’s worth the effort to try to implement that before I renovate my blog template.)


It is not uncommon for Christians to look at dictators who have been claimed to be atheists, and to see them as people convinced that they exist beyond all law. That morality is made-up, that, with no absolute authority, “all is permitted”, and that, like a spoiled child, their wants and desires are the most important thing in the world.

Now, a much longer article could be written about how that is rarely the case, and how you are unlikely to find someone who believes themselves to be evil yet manages to achieve power. Adolf Hitler’s Nazi party was outwardly religious and Hitler himself held many crackpot beliefs. Josef Stalin’s beliefs are unclear, but it is known that he saw the Russian Orthodox Church as a political rival.

However, you can find tons of atheist rebuttals like that. This is about why a healthy-minded individual might not only see gods as being as fictitious as unicorns, Santa Claus, and the Tooth Fairy, but see the cosmology thus described as explicitly evil. (There. I said it.)

First, think about why one would want an absolute authority. They want that certainty in their life… but that assumes said authority is deserving of being absolute.

When I do good I feel good, when I do bad I feel bad, and that’s my religion.

Abraham Lincoln as Quoted in Herndon’s Lincoln (1890), p. 439

Search any religious text on Earth, and you will not find any ideas or morals beyond what could have been produced by the most forward-thinking person of the period in which it was produced. Worse, the older books often enshrine ideas we now consider so bad that we must pick and choose which passages to obey and which to ignore… hardly an absolute authority, and evidence that our morals come from somewhere else.

And why does this same God tell me how to raise my children when he had to drown his?

Robert Green Ingersoll, Some Mistakes of Moses (1879), Section XVIII, “Dampness”

Rather than finding absolute authority comforting, I find it a terrifying prospect. Imagine that you were a slave in some ancient society, and the religious books said that was your lot in life. If they’re the works of fallible humans, then there is hope for a more equitable future… but if they’re truly the word of some godly force, guiding the universe, then you have no hope. Existence itself will bend to keep your people oppressed.

Worse, though, think about the state of existence. If gods exist, then the absolute authorities that people desire seem perfectly OK with their thousands, millions, or billions of years of attention having left us to clean up countless sources of needless suffering, such as Guinea worm disease (an African parasitic worm that can grow over two and a half feet long and must be slowly drawn out of the body over several days), the Loa loa worm (a parasite that sometimes burrows into human eyeballs), leprosy, and countless other horrifying conditions.

Mother is the name for God in the lips and hearts of little children.

William Makepeace Thackeray, Vanity Fair, Vol. II, ch. 2

Wanting to delegate ultimate authority to some immaterial super-adult is a very human perspective, but also a very childish one. “Better to enshrine the flawed, unjust laws mankind’s bronze-age ancestors were able to conceive into the very fabric of the cosmos itself, than to accept the terrifying uncertainty of being responsible for our own fates”. I, on the other hand, find it far more terrifying that all the ills around me are because some supreme being chose them to be that way… that we’re trapped in this cycle of suffering and injustice because some dictator who can never be overthrown wills it to be.

If the universe has no gods, and is uncaring and amoral, then there is hope because every cause of our suffering is small and mortal and can be overcome. In general, humans instinctively fear the unknown, so anyone who is comfortable with part of the status quo will act to preserve it out of fear that the alternative will leave them worse off. Multiply that by several billion, and mix in our instinctive predisposition toward dominance hierarchies, and it makes all the sense in the world that evil continues to exist.

To me, the prospect of organized religion being correct is a far more terrifying cosmic horror than anything Lovecraft ever wrote, because Lovecraft wrote about beings and forces that rarely noticed us and only menaced us when we got caught underfoot… religion makes claims of powers that, ostensibly, have had thousands or millions or billions of years of active interest and chose this for us.

Religion is an insult to human dignity. With or without it you would have good people doing good things and evil people doing evil things. But for good people to do evil things, that takes religion.

Steven Weinberg, Freethought Today, April, 2000

Why should I find it comforting that some ultimate authority with suspiciously human characteristics supposedly put us into this mess and has plans for us, when evil people seem to find perfectly good justifications to do evil things anyway, and good people have become more moral than the “moral authority”?

Beyond what Mr. Weinberg says, though, religion is an insult to human dignity because of how readily it steals the credit for our successes, shames us for the instincts supposedly put into us by whatever sadistic puppet-master made us to populate his cock-fighting pit, and attempts to stifle the true shining feature that distinguishes us from all other species on this Earth: Our insatiable curiosity and unflinching desire to investigate the true nature of our universe.

The whole idea that Adam and Eve were expelled from the Garden because they dared seek knowledge tells us everything we need to know about the founders of religion and its true purpose.

CDenic on YouTube

That said, I’ve gotten off-topic. The interesting core part of my perspective which I haven’t seen others touch on is that “if you think on it enough, any setting where the power that motivates the universe flows through a thinking divinity will become cosmic horror”.

Fundamentally, I find it inherently horrifying to imagine a setting where there can never be any hope of humans being truly free, unbeholden to the whims of some higher power… and, frankly, I find myself suspicious of the upbringing of anyone who longs for that. It just sounds a little too much like someone who’s developed a trauma bond with an abusive father and, having been forced out into the world, seeks to replace him.

I think anyone who left their home to escape a controlling father can probably understand why the idea of a celestial counterpart one can’t escape even in death might be horrifying.

There. I’ve said my piece. Even if you’re an atheist, you probably didn’t think of the Lovecraft comparison, so, whoever you are, sit and cogitate on it. Expanding the mind always makes for better authors.


In later drafts, I want to dial in more closely on the subjective psychological aspects of my perspective, how they interact with the feel, tone, and atmosphere of a narrative, how they relate to fantasy and other settings where tangible, empirically demonstrable power is available to mortals at the will of higher powers, and possibly how it interacts with how equally horrifying I find the prospect of nanotechnology, given what understanding I have of the world as-is from my expertise with computer programming.

(You think COVID-19 is scary? Imagine that it can leave you trapped within your body as it’s remote-controlled by someone else.)

As a matter of fact, I’ve already started another “draft, then distill cycle” even before starting to distill the first one, and here’s what’s come of it so far:


Imagine for a moment that you’re a child with a capricious, controlling father. All your life, he has done things that he says you’ll thank him for when you’re older, even though something deep inside you says “This is unfair. This is wrong!”

Now imagine that you’ve been told you must obey him, even as an adult, and that his decisions will be no less inscrutable.

Now, imagine that his name is God… the ultimate inscrutable father figure, rewarding and punishing you, seemingly at random, always telling you to trust him because “It’s for the best”. A father figure who you will be beholden to for eternity, even in death.

Take away the name “God” for a moment. Doesn’t that seem like a terrifying prospect? A capricious father who you can never escape?

If nothing else, it should be a damning indictment of human social instinct that there exist religious women in Saudi Arabia.

Religion is rooted in the primal assumption that an intelligent force lies behind observed events, wrapped in efforts to make that misconception less terrifying. To welcome gods into your worldview in this age when science has explained so much is to welcome a sweetened medicine long after it has proven not only non-efficacious, but actively harmful to your health, because you are addicted to the sweetener.

P.S. To anyone who thinks I “hate” God, please read what I wrote more closely. This is my objection to the idea of God being real. I could expound at equal length on the idea of Harry Potter being real, but that doesn’t mean I believe the Wizarding World exists.

Posted in Web Wandering & Opinion | Leave a comment

Novel – The Wiz Biz

…and since I seem to be reviewing books in reverse order from when I read and took notes on them, today’s review will be The Wiz Biz by Rick Cook… an omnibus of Wizard’s Bane and The Wizardry Compiled and another book that spent some time in the Baen Free Library but isn’t there anymore.

(And, since it’s an omnibus, prepare for a long review.)

The first book is a story about a human computer wiz (named Walter Irving “Wiz” Zumwalt) who gets summoned into a fantasy land where the good guys are losing… he’s destined to be the hero, despite everyone (including him) thinking that he was summoned by mistake.

Eventually, he realizes that magic is kind of like programming, and it becomes clear to the reader that there’s a bit more to his success than that. While his programming knowledge is technically what allows him to win, what really makes him the hero is that he’s an Outside-Context Problem to the villains who, until now, were inexorably winning the war.

It is a classic 80s “male geek becomes a hero” story, and the first volume doesn’t pass the Bechdel Test as far as I can remember, but it does pass the Sexy Lamp Test (at least one female character who’s important enough to the story that you couldn’t replace her with a sexy lamp bearing a post-it note), so it’s more a matter of demographics than bad writing. (There are at least half a dozen female characters who have meaningful conversations and are crucial to the story, but the main character, who happens to be male, is so central to the story that it’s hard to find any scenes where people talk without him coming up, female or otherwise. It’s just that kind of story.)

The story also has an interesting approach to the interaction between the hero and Moira, the lead female: It flat-out lampshades the clichĂ©s. He’s attracted to her because the wizard who gave his life to summon him cast an infatuation spell and he comes right out and says she’s not his usual type (he prefers willowy blondes). Not only does she not like him for most of the story, she has every right to that view because he’s as likely to get them both killed as an unattended toddler. Worse, because he’s an adult male, he’s used to being competent and that makes him even more of a liability.

As I mentioned earlier, his “secret power” is that he’s a walking Outside-Context Problem, so I’d like to go a little more into that.

First, he’s such an unexpected thing that the bad guys assume their future-scrying must be on the fritz. Beyond that, much of the driving conflict revolves around him struggling to recognize that he actually is relevant and how so, rather than just possibly being a misfire of the summoning.

Second, the story likes to draw parallels between computer hackers and wizards and, more narrowly, the secret of his success is simple: All the other wizards are programming raw machine code and a good wizard is someone who can do so without accidentally corrupting or deleting themself. On the other hand, his post-secondary education taught him how to think like a computer scientist. (Something he finally realizes two thirds of the way through the first book.)

Like most good novels (and like far too few pieces of fanfiction), it does a good job of combining sections where a lot happens in a short amount of time with sections where a lot of time is glossed over in very little prose, and doing it without the reader getting confused.

…and, like George O. Smith’s Venus Equilateral (another omnibus I highly recommend), this is a story where, once the character realizes his role, you get a strong sense that the author has real-life experience. George O. Smith was a radio engineer, and Rick Cook clearly has training in the kinds of low-level programming Wiz gets up to. That in and of itself is fascinating, and it’s the essence of one of the meanings of the phrase “write what you know”.

I have high-level programming experience, but it still would have taken me some thought and research to come up with this. We’re just so used to modern processor architectures being boring and same-y… even compared to what actually existed around the middle of the 20th century. (Even the stuff that still exists is likely to surprise higher-level programmers who didn’t encounter alternatives at school, such as AVR chips using a modified Harvard architecture rather than a Von Neumann one.)

…of course, reveals never go well with technology-like things, and, when he demonstrates his first magic program, it has a bug that results in just about every bad thing that could happen without the magic going truly out of control. (It destroys his progress with the female lead, alerts the enemy to their position, gets orders issued to have their little sanctuary burnt to the ground, etc.)

…but, at the same time, it’s also used to explore his motivations and bring about character development before he even realizes the magnitude of his screw-up, and, again, it’s told from that delightful perspective of interpreting what’s going on through the lens of computing before it became ubiquitous.

There’s also a recurring theme of his approach to programming being so alien that, when perfectly reasonable assumptions are made about what a magic user is, it leads people to the wrong conclusions about Wiz… especially the bad guys.

The secret to a good fantasy crossover is to ensure that your “modern” character isn’t any smarter than (or even necessarily as smart as) the natives, but has knowledge or life experience that provides a truly unorthodox solution to the problems without making the natives feel incompetent.

I also like the irony in tweaking the old “hero rescues the girl” trope by having the mass of bad-guy henchmen take Moira by surprise and capture her specifically because, as a magic-user, she’s the biggest threat, so she obviously must be the hero (leaving Wiz free to become the actual hero because he has no magical aptitude they can sense at all).

Finally, sometimes formula exists for a reason, but a good author will always find ways to play that to their advantage. I got a real kick out of how, at the point where a movie would have something like a training montage, Wiz has his “I’m not going to take it any more” moment and discovers the local equivalent to energy drinks. (Vile stuff that tastes like coffee you could stand a spoon up in and works the same.) What could be more fitting for a programmer? 😛

…which brings me to one of the things I find most personally noteworthy about the story… how it feels.

I’m not sure if it has a proper name, but there’s an atmosphere that I’ve felt from various Interactive Fiction (text adventure) games and Legend Entertainment games I sampled, as well as from amateur fiction in subcultures that originated on Usenet and wound up on sites like Sapphire’s Place and The Transformation Story Archive. As far as I can intuit, what I’m picking up on is the ambient feel of 1980s and early 1990s college/university geek culture which got flooded out by the original Eternal September.

Some of the plot elements I’ve described feel familiar to 80s movies, but this goes beyond that and I think it has to do with prose being better at communicating certain elements to someone who didn’t live through them.

(Speaking of which, I’d appreciate any suggestions for how to find more of it in an efficient way, given that I’m a little too young to have experienced it firsthand.)

Now, before I move onto the second volume, Wizardry Compiled, I’d like to touch on something that is best explored across both books together: How they handle female characters.

I did say that the first book fails the Bechdel test, since there’s no scene where two female characters talk to each other without Wiz coming up but, despite that, it has a nice amount of depth for the type of story it is.

For example, there’s a scene where Wiz encounters a fleeing caravan of Fae refugees and it’s a conversation with a brownie mother that makes him realize that he’s unintentionally become a peddler of horrible weapons… but that’s just one scene. Let’s talk Moira.

As I mentioned, Moira is the love interest who starts out hating Wiz, but there’s a very nicely couched reveal involved. It turns out that she doesn’t just resent Wiz for the obvious reason, but also for deeper reasons that have to do with her own past. (And I like how it lampshades that pattern: There’s actually a scene where Wiz has to keep shooting down her excuses until, finally, she has to admit the real reason just as much to herself as to him.)

Beyond that, she’s also integral to the second book, which is structured into two independent stories: One following Wiz, who’s out in the world and uncontactable, and one following those back in the capital, including Moira, who takes a leap into the unknown to visit our world and hire more programmers. (A part of her arc involving her dissatisfaction at now living in the shadow of the man who is revolutionizing magic.)

…yeah. The first arc, originally published as Wizard’s Bane, is what most people would write and call things done. He learns how to do magic like a programmer, defeats the bad guys, gets the girl, decides to stick around, and is set on a path to revolutionize the world. The second book is all about how “happily ever after” isn’t so simple.

(And the conversation which allows the second volume to pass the Bechdel test is thematically related, with Moira helping one of the hired programmers, Judith, come to terms with the loss of her childhood fantasies about dragons.)

The Wizardry Compiled is all the stuff that would normally be boiled down into a mere epilogue. It takes place two years later and covers Wiz training wizards to think like programmers when teaching was never his strength; navigating the politics of the wizarding hierarchy when politics is even less so; overcoming the tendency to neglect everything else for his work before he loses Moira; and Moira making what is effectively first contact between the fantasy setting and our world and importing more programmers to help with what he started, all while a conspiracy festers between good guys and bad guys to preserve the status quo and prevent their own obsolescence.

It’s dedicating an entire book to this “epilogue fodder” which takes the story from merely “good with some great scenes surrounding the invention of a magic toolchain” to “classic and legendary” in my eyes. In fact, the first chapter of it feels sort of like an epilogue that realized it had more to say.

It begins with a very nicely chosen quote, and continues to introduce the chapters with good quotes throughout:

You can always tell a really good idea by the enemies it makes

programmers’ axiom

Like the first book, it continues the theme that the biggest hazard of magic is unintended side-effects but, unlike the first, this one has the unintended side-effects stem from unintended ways that humans will intentionally use what Wiz has given them, rather than what the “computer” will do with what humans ask.

In essence, it’s shifted from bugs in the code that is magic to bugs in the end users who run the code that is knowledge. The first book touched on that in the form of the Black League (i.e. the villains), but now it’s focused on the casual cruelty of ordinary humans against those not of their tribe.

In keeping with the style, it makes use of the same out-of-the-box wit as the first volume, with moments like Wiz surviving a death trap because “even death traps need regular maintenance”, a fire-breathing dragon accidentally giving itself steam burns, flawed magic code being literally buggy, and a different bit with a dragon which reminds me of a scene in the 1996 Steve Martin movie, Sgt. Bilko.

The best part, in my opinion, is the part of the second book beginning about half-way through when they hire more programmers from our world. Aside from that being entertaining in itself, something about the mindset required for writing them made the cuts back to Wiz significantly punchier.

It’s a shame that Rick Cook didn’t fully grasp what he had, though. The sequels to these two volumes feel too much like cargo-cult copies of the first two: similar, but with an unsatisfying shallowness to them, focused more on the programmer-culture references and less on the deeper technical and social commonalities between magic/wizards and programming/Silicon Valley.

I think it’s that he got so into the appeal of the latter half of the second book that he overcompensated and jettisoned the deeper aspects that made the first book work and made the second one an even better balance. Even the second half of the second book feels like it’s toeing the line on that “too much shallow humour” front. (I also noticed the second book starting to show hints of the “shallow humour and action starting to crowd out deeper elements, which get squashed into the end” pattern I vaguely remember observing in the later books.)

Still, both volumes in The Wiz Biz are definitely excellent and I don’t want to fault them just because I’m able to see room for improvement. (It’d be a pretty sad existence if I couldn’t just relax and enjoy things.) It’s just that, if you do try to analyze the experience, you can see how the first book was more serious, while the second book slides from where the first book left off, to an optimal balance of deep insight and shallow humour, then fails to settle there and starts to hint at what the sequels would be before it ends.

Get the first two, but don’t get the later ones. They’ll feel like a disappointment.

That said, people do say to try to get the two books separately if you can, because The Wiz Biz was edited sloppily. I did notice the odd typo, but my main problem with the omnibus is that it doesn’t always use proper scene breaks, which means I have to occasionally stop and rewind to make sense of what’s going on.

In the end, whichever version you get, these are classic fiction that I’d highly recommend. 5 out of 5.

Posted in Geek Stuff, Reviews | Leave a comment

On The Unix-Haters Handbook

A few months ago, I wrote a forum post explaining the context in which The Unix-Haters Handbook existed since, seen through the lens of today’s Linux, it does come across as somewhat ridiculous.

Eric S. Raymond did write a retrospective review back in 2008, with the benefit of having been involved with its authors when it was written (and I certainly recommend reading it, since he touches on things I didn’t for lack of experience). Still, I think it feels a little bit too eager to argue “that’s all fixed now”… probably because ESR is coming at UNIX from an “old hand who is used to the technicals” side of things.

Since I wrote my own explanation before I discovered ESR’s post, and I came at it more from the direction of what a user from 2020 would take for granted, I thought there was still value in not letting my own explanation fade into obscurity, so here’s a polished-up version.

Because it began as a series of notes taken as I read the book, it’s more or less in the same order, even if I didn’t bother to dig up an exact reference to each page I’m responding to.


The Unix-Haters Handbook is a book from 1994 which is best described as “a frustrated rant about how bad UNIX is, mostly by people who experienced or wrote more comfortable systems that got displaced or never caught on”.

I agree that the style definitely gets people’s backs up, to the detriment of its message (something ESR also pointed out), but, to be fair to it, a lot of it was true about certified UNIX implementations back in 1994 when it came out and, to people who aren’t used to the warts, some of it still is.

ESR’s The Art of UNIX Programming (free to read) and Keith Packard’s A Political History of X (YouTube video) both agree on what a mess was made of things by vendor competition on some fronts combined with vendor apathy on others. (Imagine an ecosystem where every single vendor was Canonical and, by the time they gave up on their own Upstarts and Mirs, Linux was coming out and irrelevance was inevitable.)

If The Art of UNIX Programming describes the beauty of the UNIX design as a goal, The Unix-Haters Handbook describes how horrendously vendors were botching the implementation of it in the 80s and early 90s. This snip from a quoted e-mail on the topic of platform compatibility really was fitting:

The evaluation process consisted largely of trying to get their test program, which was an early prototype of the product, to compile and run on the various *nixes. Piece of cake, sez I. But oops, one vendor changed all the argument order around on this class of system functions. And gee, look at that: A bug in the Xenix compiler prevents you from using byte-sized frobs here; you have to fake it out with structs and unions and things. Well, what do you know, Venix’s pseudo real-time facilities don’t work at all; you have to roll your own. Ad nauseam.

I don’t remember the details of which variants had which problems, but the result was that no two of the five that I tried were compatible for anything more than trivial programs! I was shocked. I was appalled. I was impressed that a family of operating systems that claimed to be compatible would exhibit this class of lossage. But the thing that really got me was that none of this was surprising to the other *nix hackers there! Their attitude was something to the effect of “Well, life’s like that, a few #ifdefs here, a few fake library inter-face functions there, what’s the big deal?”

The Unix-Haters Handbook, Page 12

Yes, there are some bits that I disagree with, such as that the brevity of commonly-used commands like rm and mv is too cryptic to be acceptable. However, for the most part, they were right. (Though late to the party, as ESR points out.)

Dumb Deletion

At the time, shells didn’t ask “Are you sure?” if you accidentally lifted Shift a moment too late and typed rm *>o (delete all files in the current directory and redirect stdout to a file named o) rather than rm *.o (delete all files in the current directory with a .o extension).

I agree that’s a problem, but they blame it on programs being unable to access a raw form of the command line to implement their own “Are you sure?”. Given that they later bemoan allowing applications to bring their own termcap and curses, I’m going to have to disagree with them.

Windows has shown that expecting every application to carry along argument-parsing code has enough problems to make that idea “using a sledgehammer to swat a fly” so, as far as pre-checks go, I think making rm a shell built-in that asks “Are you sure?” is good enough… especially when it’s intended to double as a scripting primitive. Right assessment of the problem, wrong proposed solution.
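As a sketch of what such a shell-level pre-check could look like (this is a hypothetical wrapper for illustration, not how any historical shell actually did it), a shell function can intercept rm before the real command ever sees the expanded glob:

```shell
# Hypothetical wrapper: confirm before handing anything to the real rm.
rm() {
    printf 'About to delete: %s\nAre you sure? [y/N] ' "$*"
    read -r answer
    case "$answer" in
        [Yy]*) command rm "$@" ;;   # 'command' bypasses this function
        *)     echo "Aborted." ;;
    esac
}
```

Modern shells get much the same effect from alias rm='rm -i', which is exactly the kind of built-in guard rail being asked for here.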

Now, one thing they bring up which is, if anything, a more reasonable complaint today than in 1994, is the lack of an option to hook in Trash/Recycle Bin support at a low enough level that commands like rm reliably send files to it.

I don’t think it’s fair to fault UNIX specifically when no major operating system in 2020 does that, but, with today’s democratized computing and hundreds of gigabytes of storage, the lack of a fully reliable way to undo accidental file deletion is glaring.

Tools like rdiff-backup and Apple’s Time Machine certainly help to reduce the scale of the problem, but they are still incremental backup tools that take periodic snapshots, rather than hooks into file deletion which catch everything, so the core complaint remains valid.

Cocked-up Command Lines

First, let me be clear that this was when csh-family shells were common. That alone should explain a lot of the frustration.

For all their weird legacy quirks, modern bash and zsh are significantly more user-friendly than what people were using in 1994, and that’s not even counting options that break scripting compatibility, like fish and Nushell.

Second, at the time, unices were a horrendous mess of commands using inconsistent argument styles. You can still see that in the single-dash long options used in X.org programs and, yes, they are very annoying.

POSIX, as “too little, too late” as it was, and the standardization of Linux and BSD on single dashes being used for single-character short options helped a lot here, as did conventions like those summed up in The Art of UNIX Programming, Chapter 10, Section 5.

Third, this was before the innovation of -- as a standard way to say “no more option flags. What follows are unarguably positional arguments”, which made it needlessly ugly and annoying to properly handle the potential for file or directory names beginning with -.

Making that even worse, at the time, - did double duty as both “no more options” and “stdin/stdout as a pseudo-filename”, with no consistency between commands beyond “read the manpage”.
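For illustration, here’s the ambiguity that the double-dash resolves (behaviour shown is GNU coreutils; the filename is invented):

```shell
cd "$(mktemp -d)"
touch -- '-n'                # a perfectly legal filename that looks like a flag
rm '-n' 2>/dev/null \
    || echo "rm parsed '-n' as an option, not a filename"
rm -- '-n'                   # '--' ends option parsing; '-n' must be a file
```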

It is perfectly reasonable to complain about that state of things and to expect standardized, OS-provided argument parsing to be the proper solution.

That said, we’re still innovating on argument parsing over 20 years later, so I’d argue that the best solution would have been an optional reference implementation with an alluring API. That, as part of a command-line analogue to Apple’s Macintosh HIG or IBM’s CUA, would have served as a strong incentive to follow convention without stifling innovation.
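POSIX did eventually standardize a modest version of that idea inside the shell itself: the getopts builtin. A sketch (the command name and flags are made up for illustration):

```shell
# A toy command built on getopts, the POSIX shell's shared option parser.
frob() {
    OPTIND=1                          # reset parser state between calls
    verbose=0 outfile=""
    while getopts 'vo:' opt; do
        case "$opt" in
            v) verbose=1 ;;
            o) outfile="$OPTARG" ;;
            *) return 2 ;;            # getopts already printed the error
        esac
    done
    shift $((OPTIND - 1))             # drop parsed options, keep positionals
    echo "verbose=$verbose outfile=$outfile args=$*"
}

frob -v -o result.txt input.txt
```

Because the parsing is shared, every command built this way agrees on clustered flags, option arguments, and error messages for free.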

UNIX’s greatest strength has always been its ability to evolve to meet new challenges.

Unhelpful Help

While I haven’t used any of the unices touched on in The Unix-Haters Handbook, I fully believe the authors’ accounts of how UNIX commands behaved when presented with incorrect input. That kind of lack of attention to anything but the ideal case is sadly common in software development.

Thankfully, in my experience, the GNU userland tools are generally much better about that… probably because enough coders tripped over problems and decided to make sure nobody else would. That’s the beauty of open source.

However, their complaint that not all shell builtins have manpages points to something worse. Sure, it’s bad if a program is too sensitive to the very human failure to fully read all the provided documentation, but how is a novice supposed to read the documentation in the first place if there’s no unified way to do so and no obvious way to identify which reference to consult?

Things are better on that front now… but not perfect. Google aside, try typing man cd on Ubuntu Linux 16.04 LTS and you’ll still get No manual entry for cd, because not all shell builtins have corresponding command-line utilities with manpages.

It also really didn’t help that, from what I’ve seen, early-90s commands did have godawful equivalents to what GNU exposes as --help and that, prior to GNU info and Google, man was all there was for commands which had outgrown needing just a short page.

While I’ve been spoiled by HTML and search engines and Zeal, GNU info basically was the solution to all their complaints on that front.

Bad command or filename? No… just bad commands

Let’s start out gently. tar used to allow you to accidentally backup nothing. Now, GNU tar says tar: Cowardly refusing to create an empty archive. Bad old days, meet modern sanity.

It’s still true that, if you type cat - - -, it will take three presses of Ctrl+D to get your terminal back with no feedback on what’s going on, but I’m going to rule on cat‘s side here. It’s more a shell scripting primitive than a user command and you asked it to read three files from stdin. (Ctrl+D sends an “end of file” signal.) It would be worse to make the command more inconsistent.

That said, it wouldn’t even occur to someone to do that anymore. Ctrl+C is the standard way people are taught to break out of a command, so I can only assume that either Ctrl+C was less robust back then or whatever training led them to Ctrl+D was deficient.

Likewise, back when people used RCS for revision control, you could accidentally type indent rather than ident and mangle a non-C file. indent will still do that (I tested it on a Python file) but, thankfully, it’s not installed by default, and nobody of note uses RCS anymore, so there’s no reason to type ident anyway.

…but it’s not all sunshine and roses. If you mistakenly type cc -o hello.c hello, even modern-day GCC is stupid enough to blank out your source file before doing any sanity checking.

(I don’t have time to take responsibility for things like “Is this still a problem on a newer version?”, but somebody should probably report that.)

Also, I’ll have to disagree with them on how cd and symlinks interact. It would make no sense for .. to take you down a route you didn’t come up, just because you crossed a symlink somewhere along the way.

Netadmin-Pattern Baldness

I won’t go into too much detail on the two chapters dedicated to Sendmail, Usenet, and UUCP.

Yes, old Sendmail versions were braindead and had an infamously cryptic configuration syntax. Thankfully, aside from the classic stories it left behind, like The case of the 500-mile email, Sendmail has largely given way to Postfix and Exim.

Yes, Usenet had its problems, but it’s basically dead outside its use as a paid-for channel for illicit file-sharing and Gmane‘s proxy for putting an NNTP frontend on mailing lists. (Though the latter use does a pretty good job of cutting away the flawed parts and using the good parts.)

Yes, UUCP was a monstrous pain. Nobody uses it anymore.

Terminal Stupidity

As ESR pointed out, the central thrust of The Unix-Haters Handbook with regard to terminal handling isn’t really something that can be laid at UNIX’s feet.

These are people pining for Apple-like ecosystems where it was reasonable to demand that the terminal or display hardware play a role in auto-configuration, while UNIX was developed to deal with a dog’s breakfast of terminal hardware that could be extremely dumb, and there was limited ability to dictate terms to terminal makers.

Yes, configuring terminals was a mess and, yes, box-drawing before Unicode-aware ncurses was sub-par, but most of that was beyond UNIX’s ability to fix while retaining compatibility with all the hardware people wanted to use it with. These days, terminal emulators all claim to be some extended form of a DEC VT100 with Unicode support, and everything more or less Just Works™.

That said, it’s not all perfect. I can personally attest to what a mess things can become if you want to combine something like GNU Screen or tmux with support for more than 16 colors… urxvt 88-color mode? xterm-compatible 256-color mode? True/hex color mode? You basically have to just manually test your terminal emulator and then force-override the definition Screen exposes to the application.
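As a concrete example of that force-override workaround, here’s roughly what it looks like in tmux (these option names are from current tmux; the right values depend on what your emulator actually survives when you test it by hand):

```
# ~/.tmux.conf: tell applications inside tmux what the outer terminal was
# empirically verified to support, instead of trusting auto-negotiation.
set -g default-terminal "tmux-256color"
set -ga terminal-overrides ",xterm-256color:Tc"   # advertise 24-bit colour
```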

Beyond that, however, I think it’s a perfectly reasonable complaint that UNIX chose to require that something like termcap be linked into every application, rather than building something akin to GNU Screen into the terminal driver layer to abstract away the differences… especially if you’ve experienced such a design elsewhere on other systems. More difficult to innovate with, perhaps, but also more robust for the cases it does support. (It reminds me of software mixing support in ALSA vs. OSSv4… but Linux has always been an infamous mess when it comes to audio.)

Weighty Windowing

As ESR points out, it’s likely this chapter was mostly written by someone who poured his blood, sweat, and tears into a superior alternative to X11, only to see it lose out for being proprietary. That said…

X11R4

The biggest complaint about X11 itself is the same complaint people had about Emacs. In 1994, a network-transparent windowing system written in C simply couldn’t compete with the lean footprint of a system like Windows 3.1. This quote says it all:

X has had its share of $5,000 toilet seats—like Sun’s Open Look clocktool, which gobbles up 1.4 megabytes of real memory! If you sacrificed all the RAM from 22 Commodore 64s to clock tool, it still wouldn’t have enough to tell you the time. Even the vanilla X11R4 “xclock” utility consumes 656K to run. And X’s memory usage is increasing.

The UNIX-Haters Handbook, Page 123

That said, performance wasn’t the only complaint about the core X server. They were talking about X11R4, which definitely had its problems.

Heck, I started using Linux with XFree86 implementing X11R6, and, coming from Windows, it was shameful what an annoyance it was to get it configured and working properly… and I know there are warts I’ve forgotten.

It wasn’t until surprisingly recently that they sat down and implemented the autoconfiguration that we enjoy from X.org today.

What’s worse, X11R4 existed during an era with multiple competing proprietary X distributions, rather than everyone sharing X.org. Imagine every complaint you have about the nVidia binary driver turned up to eleven.

Motif

They also complain about Motif, which is valid. This is back before GTK+ and Qt. Motif was trying to be a clone of Windows widgetry, except dog-slow.

The funny thing is, as much as I’m not a fan of how we got there, we have the design they proposed for dividing the workload between client and server… we download JavaScript or WebAssembly into a web browser to implement single-page web applications, so the division of responsibility can be decided on an application-by-application basis.

Of course, for native applications, X11 spent so long twiddling its thumbs on ensuring coherent rendering during window resizing that modern GTK and Qt got fed up and now do everything client-side and push a pixmap to the X server for the window contents.

Configuration

They also complain about configuring X… and it’s all valid.

If you’ve never tried using xauth, consider yourself lucky… especially now that the man page has an examples section and we have Google in case that’s not enough. Just use ssh -X or ssh -Y for X remoting and save your sanity.

As for their complaints about xrdb, yeah. Good aspirations, shoddy execution… and I get to say that. I use things like Tk, xterm, and urxvt on my 2020 desktop, so I got to see the contrast first-hand.

Sub-Par Scripting

They spend a lot of time ragging on shell scripting… and it’s warranted.

First, let me repeat: This was the era of csh-family shells.

Second, this was before things like Python were ubiquitous, so you had to press shell scripting into service in roles it really isn’t very well suited to. Perl existed, but it was young and I’m guessing they hadn’t learned of it yet.

(And let me say that I have experience trying to write shell scripts that are properly portable to GNU-based Linux, busybox-based Linux, FreeBSD, OpenBSD, and macOS. Even if shellcheck had been around, it doesn’t warn you when you use non-portable options to the commands you’re calling, and POSIX is uncomfortably limited. It was not fun.)

They also make the good point that shell scripting forces a command’s user-visible output to become an API that you can’t ever change without breaking things… sort of like how Python’s standard library is where code goes to die, or how Firefox had to get rid of its legacy “override arbitrary browser internals” addon API because it was an albatross around their neck.

Finally, modern find has -L. Try to empathize with them for having an archaic version of find that simply will not follow symlinks no matter how many goats you sacrifice. Plus, find is a mess, interface-wise. That’s why fd got written.
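A quick demonstration with modern find (the directory names are invented):

```shell
cd "$(mktemp -d)"
mkdir real && touch real/target.txt
ln -s real link                    # a symlinked directory

find . -name target.txt            # default: does not descend into the symlink
find -L . -name target.txt         # -L follows symlinks: finds both paths
```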

I could have put find under “bad commands”, but shell scripting is so anemic that find really serves double duty as far too many things that other languages would have via built-in primitives or a standard library.

“My Code Compiles, So It Should Run”

Going into the chapters which complain about C and C++, it’s important to have a little context:

  • The UNIX-Haters Handbook was published in 1994
  • The first ANSI C standard was ratified in 1989, only five years earlier.
  • C++ was originally known as “C with classes” and Cfront, the original C++ compiler, was a transpiler that converted C++ to C.
  • The second edition of The C Programming Language by Brian W. Kernighan and Dennis M. Ritchie has this to say about the difference between pre- and post-standardized C:

For most programmers, the most important change is a new syntax for declaring and defining functions. A function declaration can now include a description of the arguments of the function; the definition syntax changes to match. This extra information makes it much easier for compilers to detect errors caused by mismatched arguments; in our experience, it is a very useful addition to the language.

The C Programming Language, Page 2

As a programmer in 2020, the idea that C once didn’t let you declare your arguments, and that one of its creators said “in our experience, it is a very useful addition” helps to drive home how different C programming was in the late 80s and early 90s, even before you realize that there were no free C and C++ IDEs for UNIX in the modern sense, no IntelliSense, etc.

Combine this with how even GCC in 2020 can lag behind standards sometimes, and how much vendor fragmentation and site upgrade schedules can hold things back, and you start to get a feel for how painful C could be.

As for C++, these are people who came from languages like Smalltalk, which invented object-oriented programming as we know it. What did you expect them to think of C++’s claims to being comparable, when it began as “C with classes”? It’s leaky abstractions piled on leaky abstractions, and that’s being kind.

Now, to be fair to C and C++, I do not agree with the degree to which they seem to love “distribute the program as a memory image. Allow stopping a running program, editing the code in situ and resuming execution, etc.” programming. As an experimental/exploratory programming tool, that’s nice, but I think it’s completely unsuited to developing applications to be distributed, let alone infrastructure. The 2020 programming ecosystem seems to agree since, outside the limited form that has been reinvented by browser developer toolboxes, it’s more or less nonexistent.

Sisyphus, Sysadmin

Have you ever had to work on a system with such lazy checks in its tooling that the partitioning tools would let you create overlapping partitions? That was UNIX.

Have you ever seen the developers of your system send out an e-mail asking for resubmission of all bug reports since such and such a time because they lost them? That happened to 4BSD because dump was too disruptive and tar was more or less useless.

It’s important to remember that UNIX wasn’t always the shining titan of uptime and reliability that it’s become. In fact, even over half a decade later, with GNU tools, I still vaguely remember how much manual tuning, configuring, and babysitting a Linux system needed… and that was an improvement. There’s good reason that some Linux YouTubers comment on how Linux has become so much better, even just since they got into it, as to be almost unrecognizable.

How many people remember having to fsck ext2? I vaguely do, and I believe them when they say that UFS was much worse. We take filesystems which journal at least their metadata for granted these days.

Also, you certainly can’t blame them for lamenting the lack of a logical volume management system like LVM.

Got Root?

What about their complaints about UNIX’s security model? Yes! A simple split between regular users and root, plus SUID, is a pretty braindead security model and this is probably one of the biggest places where I have to disagree with ESR.

ESR argues that it’s bad, but nobody did better… Really? I find it hard to believe that something as simple as splitting root up into a set of capability flags and splitting SUID up into a matching set of extended file attributes was hard to think of in 2008 or even in 1994… but that’s what we finally have with POSIX Capabilities, including the ability to opt a process and all its descendants out of UID 0’s magical privileges should they manage to attain it.

Sure, I disagree with the authors on whether it’s a good idea for processes to inherit permissions from their parents, but UNIX’s permissions were definitely underpowered.

(Though, on the other hand, Windows NT actually goes too far in the opposite direction, with a system of ACLs that is too fine-grained and, as a result, complicates things to the point where people are too likely to just grant all permissions in a frustrated last effort to make things work.)

NFS… ’nuff said.

Let me give some context here. I’ve never used NFS on my Linux systems. Do you know why? Because it was braindead even in the mid 2000s.

I hear that NFS 4 finally fixed a lot of the braindeadness, but I’m firmly habituated to using Samba (an implementation of Microsoft’s SMB/CIFS family of file-sharing protocols) between my Linux machines.

Why? Mainly because, until NFS 4 was production-ready, what I could use had no authentication! Whatever UID and GID you claimed to be, those are the permissions you got.

(Or, if that wasn’t the case, they sure thought it was some deep secret. I remember researching for ages because I couldn’t believe a security model so braindead would have survived to the end of the 1980s, let alone the early 2000s.)

It also used a firewall-unfriendly mess of ports and had a fragile stateless design… both things NFS 4 fixed by taking inspiration from alternatives including SMB/CIFS. I’ll leave The UNIX-Haters Handbook to explain the other problems.

The list goes on. In the end, I suppose the lesson is to never underestimate the power of vendors to botch things, to never judge the past by the present, and to recognize how much we take for granted in the world of computing today.

Posted in Web Wandering & Opinion | Leave a comment

Comparative Lovecraft and “The Colour Out of Space”

Lately, I’ve been trying to clear out my backlog of purchased audiobooks while I get other things done… and one of the things I bought was a Groupees bundle of HorrorBabble H.P. Lovecraft readings.

So far, The Colour Out of Space has been my favourite Lovecraft short story… a fact that I apparently share with Mr. Lovecraft himself.

You can legally enjoy it for free in both textual and audio form… though I prefer the audio version I’ve linked, as read by Ian Gordon for HorrorBabble. It really helps to keep you immersed when you can focus on visualizing the events being described without also having to dedicate effort to reconstructing the intonation and cadence of the narrator’s voice from the text.

Given that it’s a horror story, where presentation is key, I don’t want to say too much about the plot, so I’ll just borrow the beginning of Wikipedia‘s synopsis:

An unnamed surveyor from Boston, telling the story in the first-person perspective, attempts to uncover the secrets behind a shunned place referred to by the locals of Arkham as the “blasted heath.”

Instead, I’d like to talk about why I enjoy it so thoroughly. (Though, given how much I say, I’d appreciate it if you read/listen to the story first, then come back to finish this. I worry that analyzing the story in such depth will have an “explaining the joke ruins it” effect.)

First, many of Lovecraft’s stories, such as The Call of Cthulhu, have an element of xenophobia to them which hasn’t aged well. (The Rats in the Walls is uniquely egregious among the stories I’ve read so far. Not willing to settle for Lovecraft’s usual stuff, like implying that swarthiness is a sign of disreputability, it’s an otherwise excellent story where one of the central characters is the main character’s cat… named “N***er-Man” and, yes, that is the word you think it is.)

(Speaking of which, I suggest searching YouTube for a documentary named “H.P. Lovecraft: Fear of the Unknown” for a good introduction to who he was and why he’s so significant to the history of horror and science fiction. To oversimplify it, he changed the cultural zeitgeist, like all classic authors.)

The Colour Out of Space avoids this tendency toward xenophobia by taking place in the backwoods of New England with no human antagonists or henchmen. A minor thing compared to some of the other points I raise, but it definitely helps.

(At The Mountains of Madness also does well on this front, which makes sense as it was one of the last things he wrote before his untimely death from cancer, but it deserves its own blog post, if I find time.)

Second, Lovecraft’s fondness for uncertainty doesn’t throw the reality of the threat into question. For all its beautiful world-building, I wasn’t a fan of The Shadow Out of Time because, in the end, the main character is left unsure whether the experiences were real or a sign they’re relapsing into insanity. I found that a very unsatisfying, anti-climactic experience. The Colour Out of Space is, without a doubt, a real thing within its setting.

(Not that I need a story to go that far. The Case of Charles Dexter Ward worked despite access to the horrifying setting being lost; all it took was having a second person to verify that the main character hadn’t imagined the whole thing.)

Third, and most distinctively, it truly feels like a cosmic horror story to me. While I tend to find Lovecraft’s writings fascinating, it’s rare for them to actually evoke a stirring of cosmic horror.

The problem is that, so often, his stories achieve the horror in a way which offers some kind of escape hatch.

For example, in The Dunwich Horror, the monster “is a Cosmic Horror”, but, in the end, it’s defeated using magic. While the twist is great, it still defuses the sense of horror for me for two reasons:

  1. If they can defeat the threat once, then they should be able to defeat it again.
  2. “Magic” is sort of a “get out of needing to understand free” card. By definition, magic is something you feel you can use without understanding it because, if you understood it, you’d just call it another branch of science.

(That said, Dunwich is an example of something The Colour Out of Space does not do. H.P. Lovecraft has a recurring fondness for ending the last paragraph of his stories on a strong, punchy revelation that drives home the horror of the story’s central concept. The Mound does it, as do Out of the Aeons, The Shadow Out of Time, Polaris and others. It’s something to look forward to, but The Colour Out of Space’s last sentence isn’t punchy in that way.)

The Shadow over Innsmouth is another example of that “defusing the horror” problem. While I enjoyed the certainty of having the story told in flashback, after the government proved the hero wasn’t crazy by coming in to address the problem, it also means the problem has a workable solution, which undercuts the lurking sense of horror. Sure, there’s still a threat to an individual, and there are horrifying elements, but it doesn’t have that inexorability I admire The Colour Out of Space for.

Out of the Aeons is good, but, like The Call of Cthulhu, the threat is too distant. All physical evidence of the problem lurked under the ocean for eons before surfacing, and then returned to the ocean. It’s too easy to feel that it’ll probably stay under the ocean for eons more before coming up for another ultimately ineffective burp of momentary crisis.

Rather than in the middle of the sea, The Colour Out of Space takes place in a relatively settled part of rural America, and the ending does a powerful job of driving home that the threat is still lurking.

The Call of Cthulhu also shares a similar problem to At The Mountains of Madness if you’re going for horror… the threats are too comprehensible. Yes, they’re powerful, but, despite R’lyeh’s non-Euclidean geometry, it’s still just aliens, and it feels like, with the march of science, we’re likely to have a way to fight back effectively by the time they surface. (That said, At The Mountains of Madness makes for a wonderful adventure story akin to something like Jules Verne’s Journey to the Center of the Earth.)

The Colour Out of Space works so beautifully because it drives home that this is the realm of science, not magic, but, despite that, it utterly baffles the scientists called in to comprehend it. At the same time, the description presented to the reader is very convincing as something that is genuinely incomprehensible in its nature, yet comprehensible in its effects.

(As opposed to the sense of the author saying “take my word for it” that you get with some supposedly genius characters, where willing suspension of disbelief is pressed into service beyond its ideal scope. It’s self-evident that characters like Sherlock Holmes, or L and Light Yagami from Death Note, are geniuses, and you feel that. Characters like Hermione Granger from Harry Potter, on the other hand, come across as if the author is cheer-leading their intelligence: they only seem intelligent when the author specifically sets out to evidence it, rather than it being a property of their being which affects their thoughts and behaviour more subtly at all times.)

The other big reason The Colour Out of Space works so well is that there’s no “defeat the threat” moment. It’s more like the Portal games in that the protagonists survive without triumphing, thus avoiding the need to demonstrate that humans can win.

All in all, I highly recommend that you find 80 minutes when you can listen to an audiobook and give it a listen. Of Lovecraft’s works that I’d read as horror, it’s certainly the best… and if you enjoyed it, try some of the others I named. Lovecraft’s writing style takes a bit of getting used to, being strongly based on the archaic writing styles used in the antique books he grew up reading, but hearing it in audiobook form helps a lot, and his stories have a very distinctive atmosphere to them once you get into them.

P.S. It also was the inspiration for the most memorable part of a piece of Ranma 1/2 fanfiction from the rec.arts.anime.creative era named Bliss by Mike Loader and Lara Bartram. Given that it is still a memorable story to me, and that it won second place in the April 1999 TASS awards and 10th place in the 1999 annual TASS awards, I may re-read and review that too.

UPDATE: You may also want to watch Fredrik Knudsen’s Down the Rabbit Hole livestream special, Lovecraft & Junji Ito.

Posted in Reviews | 2 Comments