The Existence of God(s) as Cosmic Horror (Pre-Draft)

I have a somewhat interesting perspective on the possible existence of higher powers… one I haven’t seen someone else really focus on… and I thought it’d make a nice appendix to the book on fiction that I’ve been accumulating notes for over the last two decades.

(If nothing else, as an example of a worldview you may not hold, so you can practice recognizing the defining traits of a worldview and get better at embodying your own in your writing.)

Now, given what I specifically plan to focus on, it’s going to take me several cycles of writing out a draft to tease insight out of my brain, then distilling the points that are sufficiently on-topic back into point-form notes, and then repeating… possibly waiting a week in the process to let my perspective on the writing shift… but the first such “just let things flow” draft was interesting enough that, despite being all over the place, I was encouraged to share it, so here we go.

(Bear in mind that I imagined what are in-line quotes in this post to be floated asides. I just don’t feel it’s worth the effort to try to implement that before I renovate my blog template.)


It is not uncommon for Christians to look at dictators who have been claimed to be atheists and to see them as people convinced that they exist beyond all law: that morality is made up, that, with no absolute authority, “all is permitted”, and that, like spoiled children, they treat their own wants and desires as the most important thing in the world.

Now, a much longer article could be written about how that is rarely the case, and how you are unlikely to find someone who believes themselves to be evil yet manages to achieve power. Adolf Hitler’s Nazi party was outwardly religious and Hitler himself held many crackpot beliefs. Josef Stalin’s beliefs are unclear, but it is known that he saw the Russian Orthodox Church as a political rival.

However, you can find tons of atheist rebuttals like that. This is about why a healthy-minded individual might not only see gods as being as fictitious as unicorns, Santa Claus, and the Tooth Fairy, but see the cosmology thus described as explicitly evil. (There. I said it.)

First, think about why someone would want an absolute authority. They want that certainty in their life… but that assumes said authority deserves to be absolute.

When I do good I feel good, when I do bad I feel bad, and that’s my religion.

Abraham Lincoln as Quoted in Herndon’s Lincoln (1890), p. 439

Search any religious text on Earth, and you will not find any ideas or morals beyond what could have been produced by the most forward-thinking person of the period in which it was produced. Worse, for the older books, they often enshrine ideas we now consider to be so bad that we must pick and choose which of the passages to obey and which to ignore… hardly an absolute authority and evidence that our morals come from somewhere else.

And why does this same God tell me how to raise my children when he had to drown his?

Robert Green Ingersoll, Some Mistakes of Moses (1879), Section XVIII, “Dampness”

Rather than finding absolute authority comforting, I find it a terrifying prospect. Imagine that you were a slave in some ancient society, and the religious books said that was your lot in life. If they’re the works of fallible humans, then there is hope for a more equitable future… but if they’re truly the word of some godly force, guiding the universe, then you have no hope. Existence itself will bend to keep your people oppressed.

Worse, though, think about the state of existence. If gods exist, then the absolute authorities that people desire seem perfectly OK with their thousands, millions, or billions of years of attention having left us to clean up countless sources of needless suffering, such as guinea worm disease (an African parasitic worm that can grow over two and a half feet long and must be slowly drawn out of the body over several days), Loa loa worm (a parasite that sometimes burrows into human eyeballs), Leprosy, and countless other horrifying conditions.

Mother is the name for God in the lips and hearts of little children.

William Makepeace Thackeray, Vanity Fair, Vol. II, ch. 2

Wanting to delegate ultimate authority to some immaterial super-adult is a very human perspective, but also a very childish one. “Better to enshrine the flawed, unjust laws mankind’s bronze-age ancestors were able to conceive into the very fabric of the cosmos itself, than to accept the terrifying uncertainty of being responsible for our own fates”. I, on the other hand, find it far more terrifying that all the ills around me are because some supreme being chose them to be that way… that we’re trapped in this cycle of suffering and injustice because some dictator who can never be overthrown wills it to be.

If the universe has no gods, and is uncaring and amoral, then there is hope because every cause of our suffering is small and mortal and can be overcome. In general, humans instinctively fear the unknown, so anyone who is comfortable with part of the status quo will act to preserve it out of fear that the alternative will leave them worse off. Multiply that by several billion, and mix in our instinctive predisposition toward dominance hierarchies, and it makes all the sense in the world that evil continues to exist.

To me, the prospect of organized religion being correct is a far more terrifying cosmic horror than anything Lovecraft ever wrote, because Lovecraft wrote about beings and forces that rarely noticed us and only menaced us when we got caught underfoot… religion makes claims of powers that, ostensibly, have had thousands or millions or billions of years of active interest and chose this for us.

Religion is an insult to human dignity. With or without it you would have good people doing good things and evil people doing evil things. But for good people to do evil things, that takes religion.

Steven Weinberg, Freethought Today, April 2000

Why should I find it comforting that some ultimate authority with suspiciously human characteristics supposedly put us into this mess and has plans for us, when evil people seem to find perfectly good justifications to do evil things anyway, and good people have become more moral than the “moral authority”?

Beyond what Mr. Weinberg says, though, religion is an insult to human dignity because of how readily it steals the credit for our successes, shames us for the instincts supposedly put into us by whatever sadistic puppet-master made us to populate his cock-fighting pit, and attempts to stifle the true shining feature that distinguishes us from all other species on this Earth: Our insatiable curiosity and unflinching desire to investigate the true nature of our universe.

The whole idea that Adam and Eve were expelled from the Garden because they dared seek knowledge tells us everything we need to know about the founders of religion and its true purpose.

CDenic on YouTube

That said, I’ve gotten off-topic. The interesting core part of my perspective which I haven’t seen others touch on is that “if you think on it enough, any setting where the power that motivates the universe flows through a thinking divinity will become cosmic horror”.

Fundamentally, I find it inherently horrifying to imagine a setting where there can never be any hope of humans ever being truly free, unbeholden to the whims of some higher power… and, frankly, I find myself suspicious of the upbringing of anyone who longs for that. It just sounds a little too much like someone who’s developed a trauma bond with an abusive father and, having been forced out into the world, seeks to replace him.

I think anyone who left their home to escape a controlling father can probably understand why the idea of a celestial counterpart one can’t escape even in death might be horrifying.

There. I’ve said my piece. Even if you’re an atheist, you probably didn’t think of the Lovecraft comparison, so, whoever you are, sit and cogitate on it. Expanding the mind always makes for better authors.


In later drafts, I want to dial in more closely on the subjective psychological aspects of my perspective, how they interact with the feel, tone, and atmosphere of a narrative, how they relate to fantasy and other settings where tangible, empirically demonstrable power is available to mortals at the will of higher powers, and possibly how it interacts with how equally horrifying I find the prospect of nanotechnology, given what understanding I have of the world as-is from my expertise with computer programming.

(You think COVID-19 is scary? Imagine that it can leave you trapped within your body as it’s remote-controlled by someone else.)

As a matter of fact, I’ve already started another “draft, then distill cycle” even before starting to distill the first one, and here’s what’s come of it so far:


Imagine for a moment that you’re a child with a capricious, controlling father. All your life, he has done things that he says you’ll thank him for when you’re older, even though something deep inside you says “This is unfair. This is wrong!”

Now imagine that you’ve been told you must obey him, even as an adult, and that his decisions will be no less inscrutable.

Now, imagine that his name is God… the ultimate inscrutable father figure, rewarding and punishing you, seemingly at random, always telling you to trust him because “It’s for the best”. A father figure who you will be beholden to for eternity, even in death.

Take away the name “God” for a moment. Doesn’t that seem like a terrifying prospect? A capricious father who you can never escape?

If nothing else, it should be a damning indictment of human social instinct that there exist religious women in Saudi Arabia.

Religion is rooted in the primal assumption that an intelligent force lies behind observed events, wrapped in efforts to make that misconception less terrifying. To welcome gods into your worldview in an age when science has explained so much is to welcome a sweetened medicine long after it has proven not only non-efficacious but actively harmful to your health, because you are addicted to the sweetener.

P.S. To anyone who thinks I “hate” God, please read what I wrote more closely. This is my objection to the idea of God being real. I could expound at equal length on the idea of Harry Potter being real, but that doesn’t mean I believe the Wizarding World exists.


Novel – The Wiz Biz

…and since I seem to be reviewing books in reverse order from when I read and took notes on them, today’s review will be The Wiz Biz by Rick Cook… an omnibus of Wizard’s Bane and The Wizardry Compiled and another book that spent some time in the Baen Free Library but isn’t there anymore.

(And, since it’s an omnibus, prepare for a long review.)

The first book is a story about a human computer wiz (named Walter Irving “Wiz” Zumwalt) who gets summoned into a fantasy land where the good guys are losing… he’s destined to be the hero, despite everyone (including him) thinking that he was summoned by mistake.

Eventually, he realizes that magic is kind of like programming, but it becomes clear to the reader that there’s a bit more to it than that. While his programming knowledge is technically what allows him to win, what really makes him the hero is that he’s an Outside-Context Problem to the villains who, until now, were inexorably winning the war.

It is a classic 80s “male geek becomes a hero” story, and the first volume doesn’t pass the Bechdel Test as far as I can remember, but it does pass the Sexy Lamp Test (at least one female character who’s important enough to the story that you couldn’t replace her with a sexy lamp bearing a post-it note), so it’s more a matter of demographics than bad writing. (There are at least half a dozen female characters who have meaningful conversations and are crucial to the story, but the main character, who happens to be male, is so central to the story that it’s hard to find any scenes where people talk without him coming up, female or otherwise. It’s just that kind of story.)

The story also has an interesting approach to the interaction between the hero and Moira, the lead female: It flat-out lampshades the clichés. He’s attracted to her because the wizard who gave his life to summon him cast an infatuation spell and he comes right out and says she’s not his usual type (he prefers willowy blondes). Not only does she not like him for most of the story, she has every right to that view because he’s as likely to get them both killed as an unattended toddler. Worse, because he’s an adult male, he’s used to being competent and that makes him even more of a liability.

As I mentioned earlier, his “secret power” is that he’s a walking Outside-Context Problem, so I’d like to go a little more into that.

First, he’s such an unexpected thing that the bad guys assume their future-scrying must be on the fritz. Beyond that, much of the driving conflict revolves around him struggling to recognize that he actually is relevant and how so, rather than just possibly being a misfire of the summoning.

Second, the story likes to draw parallels between computer hackers and wizards and, more narrowly, the secret of his success is simple: All the other wizards are programming raw machine code and a good wizard is someone who can do so without accidentally corrupting or deleting themself. On the other hand, his post-secondary education taught him how to think like a computer scientist. (Something he finally realizes two thirds of the way through the first book.)

Like most good novels (and like far too few pieces of fanfiction), it does a good job of combining sections where a lot happens in a short amount of time with sections where a lot of time is glossed over in very little prose, and doing it without the reader getting confused.

…and, like George O. Smith’s Venus Equilateral (another omnibus I highly recommend), this is a story where, once the character realizes his role, you get a strong sense that the author has real-life experience. George O. Smith was a radio engineer, and Rick Cook clearly has training in the kinds of low-level programming Wiz gets up to. That in and of itself is fascinating and it’s the essence of one of the meanings of the phrase “write what you know”.

I have high-level programming experience, but it still would have taken me some thought and research to come up with this. We’re just so used to modern processor architectures being boring and same-y… even compared to what actually existed around the middle of the 20th century. (Even the stuff that still exists is likely to surprise higher-level programmers who didn’t encounter alternatives at school, such as AVR chips using a modified Harvard architecture rather than a Von Neumann one.)

…of course, reveals never go well with technology-like things, and, when he demonstrates his first magic program, it has a bug that results in just about every bad thing that could happen without the magic going truly out of control. (It destroys his progress with the female lead, alerts the enemy to their position, gets orders issued to have their little sanctuary burnt to the ground, etc.)

…but, at the same time, it’s also used to explore his motivations and bring about character development before he even realizes the magnitude of his screw-up, and, again, it’s told from that delightful perspective of interpreting what’s going on through the lens of computing before it became ubiquitous.

There’s also a recurring theme of his approach to programming being so alien that, when perfectly reasonable assumptions are made about what a magic user is, it leads people to the wrong conclusions about Wiz… especially the bad guys.

The secret to a good fantasy crossover is to ensure that your “modern” character isn’t any smarter than (or even necessarily as smart as) the natives, but has knowledge or life experience that provides a truly unorthodox solution to the problems without making the natives feel incompetent.

I also like the irony in tweaking the old “hero rescues the girl” trope by having the mass of bad guy henchmen take Moira by surprise and capture her specifically because she’s the biggest threat as a magic-user, so she obviously must be the hero. (and leave Wiz to become the hero because he has no magical aptitude they can sense at all.)

Finally, sometimes formula exists for a reason, but a good author will always find ways to play that to their advantage. I got a real kick out of how, at the point where a movie would have something like a training montage, Wiz has his “I’m not going to take it any more” moment and discovers the local equivalent to energy drinks. (Vile stuff that tastes like coffee you could stand a spoon up in and works the same.) What could be more fitting for a programmer? 😛

…which brings me to one of the things I find most personally noteworthy about the story… how it feels.

I’m not sure if it has a proper name, but there’s an atmosphere that I’ve felt from various Interactive Fiction (text adventure) games and Legend Entertainment games I sampled, as well as from amateur fiction in subcultures that originated on Usenet and wound up on sites like Sapphire’s Place and The Transformation Story Archive. As far as I can intuit, what I’m picking up on is the ambient feel of 1980s and early 1990s college/university geek culture which got flooded out by the original Eternal September.

Some of the plot elements I’ve described will feel familiar from 80s movies, but this goes beyond that, and I think it has to do with prose being better at communicating certain elements to someone who didn’t live through them.

(Speaking of which, I’d appreciate any suggestions for how to find more of it in an efficient way, given that I’m a little too young to have experienced it firsthand.)

Now, before I move onto the second volume, Wizardry Compiled, I’d like to touch on something that is best explored across both books together: How they handle female characters.

I did say that it fails the Bechdel test for not having a scene where two female characters talk to each other without Wiz coming up but, despite that, it has a nice amount of depth for the type of story it is.

For example, there’s a scene where Wiz encounters a fleeing caravan of Fae refugees and it’s a conversation with a brownie mother that makes him realize that he’s unintentionally become a peddler of horrible weapons… but that’s just one scene. Let’s talk Moira.

As I mentioned, Moira is the love interest who starts out hating Wiz, but there’s a very nicely couched reveal involved. It turns out that she doesn’t just resent Wiz for the obvious reason, but also for deeper reasons that have to do with her own past. (And I like how it lampshades that pattern: There’s actually a scene where Wiz has to keep shooting down her excuses until, finally, she has to admit the real reason just as much to herself as to him.)

Beyond that, she’s also integral to the second book, which is structured into two independent stories: One following Wiz, who’s out in the world and uncontactable, and one following those back in the capital, including Moira, who takes a leap into the unknown to visit our world and hire more programmers. (A part of her arc involving her dissatisfaction at now living in the shadow of the man who is revolutionizing magic.)

…yeah. The first arc, originally published as Wizard’s Bane, is what most people would write and call things done. He learns how to do magic like a programmer, defeats the bad guys, gets the girl, decides to stick around, and is set on a path to revolutionize the world. The second book is all about how “happily ever after” isn’t so simple.

(And the conversation which allows the second volume to pass the Bechdel test is thematically related, with Moira helping one of the hired programmers, Judith, come to terms with the loss of her childhood fantasies about dragons.)

The Wizardry Compiled is all the stuff that would normally be boiled down into a mere epilogue. It takes place two years later and covers Wiz training wizards to think like programmers when teaching was never his strength; navigating the politics of the wizarding hierarchy when politics is even less so; overcoming the tendency to neglect everything else for his work before he loses Moira; and Moira making what is effectively first contact between the fantasy setting and our world, importing more programmers to help with what he started… all while a conspiracy festers between good guys and bad guys to preserve the status quo and prevent their own obsolescence.

It’s dedicating an entire book to this “epilogue fodder” which takes the story from merely “good with some great scenes surrounding the invention of a magic toolchain” to “classic and legendary” in my eyes. In fact, the first chapter of it feels sort of like an epilogue that realized it had more to say.

It begins with a very nicely chosen quote, and continues to introduce the chapters with good quotes throughout:

You can always tell a really good idea by the enemies it makes

programmers’ axiom

Like the first book, it continues the theme that the biggest hazard of magic is unintended side-effects but, unlike the first, this one has the unintended side-effects stem from unintended ways that humans will intentionally use what Wiz has given them, rather than what the “computer” will do with what humans ask.

In essence, it’s shifted from bugs in the code that is magic to bugs in the end users who run the code that is knowledge. The first book touched on that in the form of the Black League (ie. villains) but now it’s focused on the casual cruelty of ordinary humans against those not of their tribe.

In keeping with the style, it makes use of the same out-of-the-box wit as the first volume, with moments like Wiz surviving a death trap because “even death traps need regular maintenance”, a fire-breathing dragon accidentally giving itself steam burns, flawed magic code being literally buggy, and a different bit with a dragon which reminds me of a scene in the 1996 Steve Martin movie, Sgt. Bilko.

The best part, in my opinion, is the part of the second book beginning about half-way through when they hire more programmers from our world. Aside from that being entertaining in itself, something about the mindset required for writing them made the cuts back to Wiz significantly punchier.

It’s a shame that Rick Cook didn’t fully grasp what he had though. The sequels to these two volumes feel too much like cargo cult copies of the first two… similar, but with an unsatisfying shallowness to them and focused more on the programmer culture references and less on the deeper technical and social commonalities between magic/wizards and programming/silicon valley.

I think it’s that he got so into the appeal of the latter half of the second book that he overcompensated and jettisoned the deeper aspects that made the first book work and made the second one an even better balance. Even the second half of the second book feels like it’s walking the line on that “too much shallow humour” front. (I also noticed the second book starting to show hints of the “shallow humour and action starting to crowd out deeper elements, which get squashed into the end” pattern I vaguely remember observing in the later books.)

Still, both volumes in The Wiz Biz are definitely excellent and I don’t want to fault them just because I’m able to see room for improvement. (It’d be a pretty sad existence if I couldn’t just relax and enjoy things.) It’s just that, if you do try to analyze the experience, you can see how the first book was more serious, while the second book slides from where the first book left off, to an optimal balance of deep insight and shallow humour, then fails to settle there and starts to hint at what the sequels would be before it ends.

Get the first two, but don’t get the later ones. They’ll feel like a disappointment.

That said, people do say to try to get the two books separately if you can, because The Wiz Biz was edited sloppily. I did notice the odd typo, but my main problem with the omnibus is that it doesn’t always use proper scene breaks, which means I have to occasionally stop and rewind to make sense of what’s going on.

In the end, whichever version you get, these are classic fiction that I’d highly recommend. 5 out of 5.


On The Unix-Haters Handbook

A few months ago, I wrote a forum post explaining the context in which The Unix-Haters Handbook existed since, seen through the lens of today’s Linux, it does come across as somewhat ridiculous.

While Eric S. Raymond did write a retrospective review back in 2008, and it has the benefit of him having been involved with its authors when it was written (and I certainly recommend reading it, since he touches on things I didn’t for lack of experience), I still think it feels a little bit too eager to argue “that’s all fixed now”… probably because ESR is coming at UNIX from an “old hand who is used to the technicals” side of things.

Since I wrote my own explanation before I discovered ESR’s post, and I came at it more from the direction of what a user from 2020 would take for granted, I thought there was still worth in not letting my own explanation fade into obscurity, so here’s a polished-up version.

Because it began as a series of notes taken as I read the book, it’s more or less in the same order, even if I didn’t bother to dig up an exact reference to each page I’m responding to.


The Unix-Haters Handbook is a book from 1994 which is best described as “a frustrated rant about how bad UNIX is, mostly by people who experienced or wrote more comfortable systems that got displaced or never caught on”.

I agree that the style definitely gets people’s backs up, to the detriment of its message (something ESR also pointed out), but, to be fair to it, a lot of it was true about certified UNIX implementations back in 1994 when it came out and, to people who aren’t used to the warts, some of it still is.

ESR’s The Art of UNIX Programming (free to read) and Keith Packard’s A Political History of X (YouTube video) both agree on what a mess was made of things by vendor competition on some fronts combined with vendor apathy on others. (Imagine an ecosystem where every single vendor was Canonical and, by the time they gave up on their own Upstarts and Mirs, Linux was coming out and irrelevance was inevitable.)

If The Art of UNIX Programming describes the beauty of the UNIX design as a goal, The Unix-Haters Handbook describes how horrendously vendors were botching the implementation of it in the 80s and early 90s. This snip from a quoted e-mail on the topic of platform compatibility really was fitting:

The evaluation process consisted largely of trying to get their test program, which was an early prototype of the product, to compile and run on the various *nixes. Piece of cake, sez I. But oops, one vendor changed all the argument order around on this class of system functions. And gee, look at that: A bug in the Xenix compiler prevents you from using byte-sized frobs here; you have to fake it out with structs and unions and things. Well, what do you know, Venix’s pseudo real-time facilities don’t work at all; you have to roll your own. Ad nauseam.

I don’t remember the details of which variants had which problems, but the result was that no two of the five that I tried were compatible for anything more than trivial programs! I was shocked. I was appalled. I was impressed that a family of operating systems that claimed to be compatible would exhibit this class of lossage. But the thing that really got me was that none of this was surprising to the other *nix hackers there! Their attitude was something to the effect of “Well, life’s like that, a few #ifdefs here, a few fake library interface functions there, what’s the big deal?”

The Unix-Haters Handbook, Page 12

Yes, there are some bits that I disagree with, such as that the brevity of commonly-used commands like rm and mv is too cryptic to be acceptable. However, for the most part, they were right. (Though late to the party, as ESR points out.)

Dumb Deletion

At the time, shells didn’t ask “Are you sure?” if you accidentally lifted Shift a moment too late and typed rm *>o (delete all files in the current directory and redirect stdout to a file named o) rather than rm *.o (delete all files in the current directory with a .o extension).
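The failure mode is easy to reproduce safely in a scratch directory. A sketch (the filenames here are made up for the demo; behavior assumes a POSIX-ish shell like bash):

```shell
# Set up a scratch directory with some object files and one innocent bystander.
cd "$(mktemp -d)"
touch a.o b.o keep.txt

# The typo: lifting Shift too early turns ".o" into ">o". The shell expands
# "*" to every file, then treats ">o" as a redirection, so this actually runs
# "rm a.o b.o keep.txt" with stdout sent to a new, empty file named "o".
rm *>o

ls   # only "o" is left; keep.txt is gone with no prompt and no warning
```

Note that rm itself never even sees the `>o`; by the time it runs, the shell has already committed the damage, which is why the book's authors wanted a pre-check somewhere earlier in the pipeline.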

I agree that’s a problem, but they blame it on programs being unable to access a raw form of the command line to implement their own “Are you sure?”. Given that they later bemoan allowing applications to bring their own termcap and curses, I’m going to have to disagree with them.

Windows has shown that expecting every application to carry along argument-parsing code has enough problems to make that idea “using a sledgehammer to swat a fly” so, as far as pre-checks go, I think making rm a shell built-in that asks “Are you sure?” is good enough… especially when it’s intended to double as a scripting primitive. Right assessment of the problem, wrong proposed solution.
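For what it's worth, something close to that pre-check did end up in rm itself rather than in the shell. A sketch of the relevant options (verified behavior is for GNU rm; other implementations may differ):

```shell
# Per-file confirmation: -i prompts before every removal.
cd "$(mktemp -d)"
touch precious

echo n | rm -i precious   # answer "n" to the prompt: the file survives
ls precious               # still there

# GNU rm also offers -I, which prompts only once when removing more than
# three files or recursing, so it stays tolerable as a scripting primitive.
```

Many distributions ship `alias rm='rm -i'` for root accounts for exactly this reason.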

Now, one thing they bring up which is, if anything, a more reasonable complaint today than in 1994, is the lack of an option to hook in Trash/Recycle Bin support at a low enough level that commands like rm reliably send files to it.

I don’t think it’s fair to fault UNIX specifically when no major operating system in 2020 does that, but, with today’s democratized computing and hundreds of gigabytes of storage, the lack of a fully reliable way to undo accidental file deletion is glaring.

Tools like rdiff-backup and Apple’s Time Machine certainly help to reduce the scale of the problem, but they are still incremental backup tools that take periodic snapshots, rather than hooks into file deletion which catch everything, so the core complaint remains valid.
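The shape of such a deletion hook can be sketched in miniature as a wrapper that moves files aside instead of unlinking them. (The `soft_rm` name and trash path are hypothetical; real tools like trash-cli also record origin paths, timestamps, and handle name collisions.)

```shell
# Hypothetical "soft delete": move targets into a trash directory instead of
# unlinking them, so an accidental deletion is reversible.
TRASH="${TRASH:-$HOME/.demo-trash}"

soft_rm() {
    mkdir -p "$TRASH"
    mv -- "$@" "$TRASH"/   # "--" so dash-named files can't be read as options
}

# Restoring is just moving the file back out of "$TRASH" by hand.
```

Of course, the book's complaint is precisely that this only catches programs that opt in; anything calling unlink() directly bypasses it, which is why a hook "at a low enough level" matters.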

Cocked-up Command Lines

First, let me be clear that this was when csh-family shells were common. That alone should explain a lot of the frustration.

For all their weird legacy quirks, modern bash and zsh are significantly more user-friendly than what people were using in 1994, and that’s not even counting options that break scripting compatibility, like fish and Nushell.

Second, at the time, unices were a horrendous mess of commands using inconsistent argument styles. You can still see that in the single-dash long options used in X.org programs and, yes, they are very annoying.

POSIX, as “too little, too late” as it was, and the standardization of Linux and BSD on single dashes being used for single-character short options helped a lot here, as did conventions like those summed up in The Art of UNIX Programming, Chapter 10, Section 5.

Third, this was before the innovation of -- as a standard way to say “no more option flags. What follows are unarguably positional arguments”, which made it needlessly ugly and annoying to properly handle the potential for file or directory names beginning with -.

Making that even worse, at the time, - did double duty as both “no more options” and “stdin/stdout as a pseudo-filename”, with no consistency between commands beyond “read the manpage”.
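A quick demonstration of how `--` resolves the ambiguity for any tool that follows the convention (behavior shown is for a modern GNU/POSIX rm):

```shell
cd "$(mktemp -d)"
touch ./-f   # a perfectly legal, perfectly evil filename

rm -f        # parsed as the -f flag with no operands; the file survives
ls           # "-f" is still here

rm -- -f     # "--" ends option parsing, so "-f" is now a positional filename
ls           # gone

# The portable pre-"--" workaround: give the name a path prefix.
touch ./-f
rm ./-f
```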

It is perfectly reasonable to complain about that state of things and to expect standardized, OS-provided argument parsing to be the proper solution.

That said, we’re still innovating on argument parsing over 20 years later, so I’d argue that the best solution would have been an optional reference implementation with an alluring API. That, as an implementation of part of a command-line analogue to Apple’s Macintosh HIG or IBM’s CUA, would have served as a strong incentive to follow convention without stifling innovation.
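The shell world did eventually get a small version of that reference implementation in the POSIX `getopts` builtin. A minimal sketch of a toy command (the `parse_demo` name and its flags are invented for illustration) that takes a bare flag, an option with an argument, and positional operands:

```shell
# A toy command-line parser built on the POSIX getopts builtin.
parse_demo() {
    OPTIND=1 verbose=0 outfile=""
    while getopts "vo:" opt; do
        case "$opt" in
            v) verbose=1 ;;          # bare flag
            o) outfile="$OPTARG" ;;  # option taking an argument
            *) return 2 ;;           # getopts already printed a diagnostic
        esac
    done
    shift $((OPTIND - 1))            # drop the parsed options
    echo "verbose=$verbose outfile=$outfile args=$*"
}

parse_demo -v -o out.txt a b   # -> verbose=1 outfile=out.txt args=a b
```

Every script that uses it gets consistent short-option clustering and `--` handling for free, which is exactly the "incentive to follow convention" argument in miniature.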

UNIX’s greatest strength has always been its ability to evolve to meet new challenges.

Unhelpful Help

While I haven’t used any of the unices touched on in The Unix-Haters Handbook, I fully believe the authors’ accounts of how UNIX commands behaved when presented with incorrect input. That kind of lack of attention to anything but the ideal case is sadly common in software development.

Thankfully, in my experience, the GNU userland tools are generally much better about that… probably because enough coders tripped over problems and decided to make sure nobody else would. That’s the beauty of open source.

However, their complaint that not all shell builtins have manpages cuts deeper. Sure, it’s bad if a program is too sensitive to a very human failure to fully read all provided documentation, but how is a novice supposed to read the documentation if there’s no unified way to do so and no obvious way to identify which reference to consult?

Things are better on that front now… but not perfect. Google aside, try typing man cd on Ubuntu Linux 16.04 LTS and you’ll still get No manual entry for cd, because not all shell builtins have corresponding command-line utilities with manpages.
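The closest thing to a unified starting point today is asking the shell what a name actually is before reaching for man:

```shell
# POSIX 'command -V' reports whether a name is a shell builtin, a
# function, or an external utility, which at least tells a novice
# which reference to consult.
cd_kind=$(command -V cd)
ls_kind=$(command -V ls)
```

It’s not documentation, but “cd is a shell builtin” at least points you at your shell’s manual rather than a manpage that doesn’t exist.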

It also really didn’t help that, from what I’ve seen, early-90s commands did have godawful equivalents to what GNU exposed as --help and that, prior to GNU info and Google, man was all there was for commands which had outgrown needing just a short page.

While I’ve been spoiled by HTML and search engines and Zeal, GNU info basically was the solution to all their complaints on that front.

Bad command or filename? No… just bad commands

Let’s start out gently. tar used to allow you to accidentally back up nothing. Now, GNU tar says tar: Cowardly refusing to create an empty archive. Bad old days, meet modern sanity.

It’s still true that, if you type cat - - -, it will take three presses of Ctrl+D to get your terminal back with no feedback on what’s going on, but I’m going to rule on cat’s side here. It’s more a shell scripting primitive than a user command and you asked it to read three files from stdin. (Ctrl+D makes the terminal report end-of-file.) It would be worse to make the command more inconsistent.

That said, doing that wouldn’t even occur to most people anymore. Ctrl+C is the standard way to break out of a command that people are taught, so I can only assume that either Ctrl+C was less robust back then or whatever training led them to Ctrl+D was deficient.
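You can see the “three files from stdin” behaviour harmlessly with a pipe, where end-of-file is persistent instead of needing three Ctrl+D presses:

```shell
# Each '-' tells cat to read standard input as one of its input files.
# From a pipe, EOF is permanent, so the second and third reads return
# immediately instead of waiting for Ctrl+D like they would on a terminal.
cat_out=$(printf 'one\n' | cat - - -)
```

The output is just “one”… the confusing part on a terminal is purely that each fresh read of stdin waits for its own end-of-file.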

Likewise, back when people used RCS for revision control, you could accidentally type indent rather than ident and mangle a non-C file. indent will still do that (I tested it on a Python file) but, thankfully, it’s not installed by default, and nobody of note uses RCS anymore, so there’s no reason to type ident anyway.

…but it’s not all sunshine and roses. If you mistakenly type cc -o hello.c hello, even modern-day GCC is stupid enough to blank out your source file before doing any sanity checking.

(I don’t have time to take responsibility for things like “Is this still a problem on a newer version?”, but somebody should probably report that.)

Also, I’ll have to disagree with them on how cd and symlinks interact. It would make no sense for .. to take you down a route you didn’t come up, just because you crossed a symlink somewhere along the way.

Netadmin-Pattern Baldness

I won’t go into too much detail on the two chapters dedicated to Sendmail, Usenet, and UUCP.

Yes, old Sendmail versions were braindead and had an infamously cryptic configuration syntax. Thankfully, aside from the classic stories it left behind, like The case of the 500-mile email, even modern Sendmail has given way to Postfix and Exim.

Yes, Usenet had its problems, but it’s basically dead outside its use as a paid-for channel for illicit file-sharing and Gmane’s proxy for putting an NNTP frontend on mailing lists. (Though the latter use does a pretty good job of cutting away the flawed parts and using the good parts.)

Yes, UUCP was a monstrous pain. Nobody uses it anymore.

Terminal Stupidity

As ESR pointed out, the central thrust of The Unix-Haters Handbook with regard to terminal handling isn’t really something that can be laid at UNIX’s feet.

These are people pining for Apple-like ecosystems where it was reasonable to demand that the terminal or display hardware play a role in auto-configuration, while UNIX was developed to deal with a dog’s breakfast of terminal hardware that could be extremely dumb, and there was limited ability to dictate terms to terminal makers.

Yes, configuring terminals was a mess and, yes, box-drawing before Unicode-aware ncurses was sub-par, but most of that was beyond UNIX’s ability to fix while retaining compatibility with all the hardware people wanted to use it with. These days, terminal emulators all claim to be some extended form of a DEC VT100 with Unicode support, and everything more or less Just Works™.

That said, it’s not all perfect. I can personally attest to what a mess things can become if you want to combine something like GNU Screen or tmux with support for more than 16 colors… urxvt 88-color mode? xterm-compatible 256-color mode? True/hex color mode? You basically have to just manually test your terminal emulator and then force-override the definition Screen exposes to the application.

Beyond that, however, I think it’s a perfectly reasonable complaint that UNIX chose to require that something like termcap be linked into every application, rather than building something akin to GNU Screen into the terminal driver layer to abstract away the differences… especially if you’ve experienced such a design elsewhere on other systems. More difficult to innovate with, perhaps, but also more robust for the cases it does support. (It reminds me of software mixing support in ALSA vs. OSSv4… but Linux has always been an infamous mess when it comes to audio.)

Weighty Windowing

As ESR points out, it’s likely this chapter was mostly written by someone who poured his blood, sweat, and tears into a superior alternative to X11, only to see it lose out for being proprietary. That said…

X11R4

The biggest complaint about X11 itself is the same complaint people had about Emacs. In 1994, it wasn’t reasonable to try to compete with the aesthetics of a system like Windows 3.1 using a network-transparent system written in C. This quote says it all:

X has had its share of $5,000 toilet seats—like Sun’s Open Look clocktool, which gobbles up 1.4 megabytes of real memory! If you sacrificed all the RAM from 22 Commodore 64s to clock tool, it still wouldn’t have enough to tell you the time. Even the vanilla X11R4 “xclock” utility consumes 656K to run. And X’s memory usage is increasing.

The UNIX-Haters Handbook, Page 123

That said, performance wasn’t the only complaint about the core X server. They were talking about X11R4, which definitely had its problems.

Heck, I started using Linux with XFree86 implementing X11R6, and, coming from Windows, it was shameful what an annoyance it was to get it configured and working properly… and I know there are warts I’ve forgotten.

It wasn’t until surprisingly recently that they sat down and implemented the autoconfiguration that we enjoy from X.org today.

What’s worse, X11R4 existed during an era with multiple competing proprietary X distributions, rather than everyone sharing X.org. Imagine every complaint you have about the nVidia binary driver turned up to eleven.

Motif

They also complain about Motif, which is valid. This is back before GTK+ and Qt. Motif was trying to be a clone of Windows widgetry, except dog-slow.

The funny thing is, as much as I’m not a fan of how we got there, we have the design they proposed for dividing the workload between client and server… we download JavaScript or WebAssembly into a web browser to implement single-page web applications, so the division of responsibility can be decided on an application-by-application basis.

Of course, for native applications, X11 spent so long twiddling its thumbs on ensuring coherent rendering during window resizing that modern GTK and Qt got fed up and now do everything client-side and push a pixmap to the X server for the window contents.

Configuration

They also complain about configuring X… and it’s all valid.

If you’ve never tried using xauth, consider yourself lucky… especially now that the man page has an examples section and we have Google in case that’s not enough. Just use ssh -X or ssh -Y for X remoting and save your sanity.

As for their complaints about xrdb, yeah. Good aspirations, shoddy execution… and I get to say that. I use things like Tk, xterm, and urxvt on my 2020 desktop, so I got to see the contrast first-hand.

Sub-Par Scripting

They spend a lot of time ragging on shell scripting… and it’s warranted.

First, let me repeat: This was the era of csh-family shells.

Second, this was before things like Python were ubiquitous, so you had to press shell scripting into service in roles it really isn’t very well suited to. Perl existed, but it was young and I’m guessing they hadn’t learned of it yet.

(And let me say that I have experience trying to write shell scripts that are properly portable to GNU-based Linux, busybox-based Linux, FreeBSD, OpenBSD, and macOS. Even if shellcheck had been around, it doesn’t warn you when you use non-portable options to the commands you’re calling, and POSIX is uncomfortably limited. It was not fun.)
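To give one concrete example of the kind of trap I mean, one that shellcheck can’t save you from because every individual line is valid:

```shell
# echo's handling of flag-like arguments differs between shells and
# systems: given the string '-n', some echos print it, while others
# silently treat it as "suppress the newline" and print nothing.
tricky='-n'
unsafe=$(echo "$tricky")         # result varies by shell and OS
safe=$(printf '%s\n' "$tricky")  # always prints the string itself
```

That’s why “use printf, never echo, for arbitrary data” is one of the first rules of portable shell.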

They also make the good point that shell scripting forces a command’s user-visible output to become an API that you can’t ever change without breaking things… sort of like how Python’s standard library is where code goes to die, or how Firefox had to get rid of its legacy “override arbitrary browser internals” addon API because it was an albatross around their neck.

Finally, modern find has -L. Try to empathize with them for having an archaic version of find that simply will not follow symlinks no matter how many goats you sacrifice. Plus, find is a mess, interface-wise. That’s why fd got written.

I could have put find under “bad commands”, but shell scripting is so anemic that find ends up serving as far too many things that other languages would provide via built-in primitives or a standard library.
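The symlink complaint is easy to demonstrate with modern find, where -L makes the old behaviour opt-out (the paths below are throwaway temp files):

```shell
# Build a directory reachable only through a symlink.
base=$(mktemp -d)
mkdir "$base/real"
touch "$base/real/data.txt"
ln -s "$base/real" "$base/link"

# Without -L, find does not follow the symlinked starting point...
plain=$(find "$base/link" -name 'data.txt' | wc -l)

# ...with -L, it follows symlinks and actually finds the file.
followed=$(find -L "$base/link" -name 'data.txt' | wc -l)
```

Their find had no equivalent of that -L at all, no matter how many goats were sacrificed.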

“My Code Compiles, So It Should Run”

Going into the chapters which complain about C and C++, it’s important to have a little context:

  • The UNIX-Haters Handbook was published in 1994
  • The first ANSI C standard was ratified in 1989, only five years earlier.
  • C++ was originally known as “C with classes” and Cfront, the original C++ compiler, was a transpiler that converted C++ to C.
  • The second edition of The C Programming Language by Brian W. Kernighan and Dennis M. Ritchie has this to say about the difference between pre- and post-standardized C:

For most programmers, the most important change is a new syntax for declaring and defining functions. A function declaration can now include a description of the arguments of the function; the definition syntax changes to match. This extra information makes it much easier for compilers to detect errors caused by mismatched arguments; in our experience, it is a very useful addition to the language.

The C Programming Language, Page 2

As a programmer in 2020, the idea that C once didn’t let you declare your arguments, and that one of its creators said “in our experience, it is a very useful addition” helps to drive home how different C programming was in the late 80s and early 90s, even before you realize that there were no free C and C++ IDEs for UNIX in the modern sense, no IntelliSense, etc.

Combine this with how even GCC in 2020 can lag behind standards sometimes, and how much vendor fragmentation and site upgrade schedules can hold things back, and you start to get a feel for how painful C could be.

As for C++, these are people who came from languages like Smalltalk, which invented object-oriented programming as we know it. What did you expect them to think of C++’s claims to being comparable, when it began as “C with classes”? It’s nothing but leaking abstractions piled on leaking abstractions and that’s being kind.

Now, to be fair to C and C++, I do not agree with the degree to which they seem to love “distribute the program as a memory image. Allow stopping a running program, editing the code in situ and resuming execution, etc.” programming. As an experimental/exploratory programming tool, that’s nice, but I think it’s completely unsuited to developing applications to be distributed, let alone infrastructure. The 2020 programming ecosystem seems to agree since, outside the limited form that has been reinvented by browser developer toolboxes, it’s more or less nonexistent.

Sisyphus, Sysadmin

Have you ever had to work on a system with such lazy checks in its tooling that the partitioning tools would let you create overlapping partitions? That was UNIX.

Have you ever seen the developers of your system send out an e-mail asking for resubmission of all bug reports since such and such a time because they lost them? That happened to 4BSD because dump was too disruptive and tar was more or less useless.

It’s important to remember that UNIX wasn’t always the shining titan of uptime and reliability that it’s become. In fact, even over half a decade later, with GNU tools, I still vaguely remember how much manual tuning, configuring, and babysitting a Linux system needed… and that was an improvement. There’s good reason that some Linux YouTubers comment on how Linux has become so much better, even just since they got into it, as to be almost unrecognizable.

How many people remember having to fsck ext2? I vaguely do, and I believe them when they say that UFS was much worse. We take filesystems which journal at least their metadata for granted these days.

Also, you certainly can’t blame them for lamenting the lack of a logical volume management system like LVM.

Got Root?

What about their complaints about UNIX’s security model? Yes! A simple split between regular users and root, plus SUID, is a pretty braindead security model and this is probably one of the biggest places where I have to disagree with ESR.

ESR argues that it’s bad, but nobody did better… Really? I find it hard to believe that something as simple as splitting root up into a set of capability flags and splitting SUID up into a matching set of extended file attributes was hard to think of in 2008 or even in 1994… but that’s what we finally have with POSIX Capabilities, including the ability to opt a process and all its descendants out of UID 0’s magical privileges should they manage to attain it.
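For the curious, here’s roughly what that looks like in practice on modern Linux. (The setcap line is illustrative only, since it needs root, and the server name is made up.)

```shell
# Capabilities split root's power into individual flags. For example,
# granting a server only the right to bind low ports, instead of SUID root:
#
#   setcap cap_net_bind_service=+ep ./myserver   # illustrative; needs root
#
# Every process carries its capability sets, visible in /proc on Linux:
capeff=$(awk '/^CapEff:/ { print $2 }' /proc/self/status)
```

An unprivileged process typically shows an all-zero effective mask there, while UID 0 shows a full one… which is exactly the “set of flags” model the authors could have asked for.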

Sure, I disagree with the authors on whether it’s a good idea for processes to inherit permissions from their parents, but UNIX’s permissions were definitely underpowered.

(Though, on the other hand, Windows NT actually goes too far in the opposite direction, with a system of ACLs that is too fine-grained and, as a result, complicates things to the point where people are too likely to just grant all permissions in a frustrated last effort to make things work.)

NFS… ’nuff said.

Let me give some context here. I’ve never used NFS on my Linux systems. Do you know why? Because it was braindead even in the mid 2000s.

I hear that NFS 4 finally fixed a lot of the braindeadness, but I’m firmly habituated to using Samba (an implementation of Microsoft’s SMB/CIFS family of file-sharing protocols) between my Linux machines.

Why? Mainly because, until NFS 4 was production-ready, what I could use had no authentication! Whatever UID and GID you claimed to be, those are the permissions you got.

(Or, if that wasn’t the case, they sure thought it was some deep secret. I remember researching for ages because I couldn’t believe a security model so braindead would have survived to the end of the 1980s, let alone the early 2000s.)

It also used a firewall-unfriendly mess of ports and had a fragile stateless design… both things NFS 4 fixed by taking inspiration from alternatives including SMB/CIFS. I’ll leave The UNIX-Haters Handbook to explain the other problems.

The list goes on. In the end, I suppose the lesson is to never underestimate the power of vendors to botch things, to never judge the past by the present, and to recognize how much we take for granted in the world of computing today.

Posted in Web Wandering & Opinion

Comparative Lovecraft and “The Colour Out of Space”

Lately, I’ve been trying to clear out my backlog of purchased audiobooks while I get other things done… and one of the things I bought was a Groupees bundle of HorrorBabble H.P. Lovecraft readings.

So far, The Colour Out of Space has been my favourite Lovecraft short story… a fact that I apparently share with Mr. Lovecraft himself.

You can legally enjoy it for free in both textual and audio form… though I prefer the audio version I’ve linked, as read by Ian Gordon for HorrorBabble. It really helps to keep you immersed when you can focus on visualizing the events being described without also having to dedicate effort to reconstructing the intonation and cadence of the narrator’s voice from the text.

Given that it’s a horror story, where presentation is key, I don’t want to say too much about the plot, so I’ll just borrow the beginning of Wikipedia’s synopsis:

An unnamed surveyor from Boston, telling the story in the first-person perspective, attempts to uncover the secrets behind a shunned place referred to by the locals of Arkham as the “blasted heath.”

Instead, I’d like to talk about why I enjoy it so thoroughly. (Though, given how much I say, I’d appreciate it if you read/listen to the story first, then come back to finish this. I worry that analyzing the story in such depth will have an “explaining the joke ruins it” effect.)

First, many of Lovecraft’s stories, such as The Call of Cthulhu, have an element of xenophobia to them which hasn’t aged well. (The Rats in the Walls is uniquely egregious among the stories I’ve read so far. Not willing to settle for Lovecraft’s usual stuff, like implying that swarthiness is a sign of disreputability, it’s an otherwise excellent story where one of the central characters is the main character’s cat… named “N***er-Man” and, yes, that is the word you think it is.)

(Speaking of which, I suggest searching YouTube for a documentary named “H.P. Lovecraft: Fear of the Unknown” for a good introduction to who he was and why he’s so significant to the history of horror and science fiction. To oversimplify it, he changed the cultural zeitgeist, like all classic authors.)

The Colour Out of Space avoids this tendency toward xenophobia by taking place in the backwoods of New England with no human antagonists or henchmen. A minor thing compared to some of the other points I raise, but it definitely helps.

(At The Mountains of Madness also does well on this front, which makes sense as it was one of the last things he wrote before his untimely death from cancer, but it deserves its own blog post, if I find time.)

Second, Lovecraft’s fondness for uncertainty doesn’t throw the reality of the threat into question. For all its beautiful world-building, I wasn’t a fan of The Shadow Out of Time because, in the end, the main character is left unsure whether the experiences were real or a sign they’re relapsing into insanity. I found that a very unsatisfying, anti-climactic experience. The Colour Out of Space is, without a doubt, a real thing within its setting.

(Not that I need a story to go that far. The Case of Charles Dexter Ward worked despite access to the horrifying setting being lost and all it took was having a second person to verify that the main character hadn’t imagined the whole thing.)

Third, and most distinctively, it truly feels like a cosmic horror story to me. While I tend to find Lovecraft’s writings fascinating, it’s rare for them to actually evoke a stirring of cosmic horror.

The problem is that, so often, his stories achieve the horror in a way which offers some kind of escape hatch.

For example, in The Dunwich Horror, the monster “is a Cosmic Horror”, but, in the end, it’s defeated using magic. While the twist is great, it still defuses the sense of horror for me for two reasons:

  1. If they can defeat the threat once, then they should be able to defeat it again.
  2. “Magic” is sort of a “get out of needing to understand free” card. By definition, magic is something you feel you can use without understanding it because, if you understood it, you’d just call it another branch of science.

(That said, Dunwich is an example of something The Colour Out of Space does not do. H.P. Lovecraft has a recurring fondness for ending the last paragraph of his stories on a strong, punchy revelation that drives home the horror of the story’s central concept. The Mound does it, as do Out of the Aeons, The Shadow Out of Time, Polaris and others. It’s something to look forward to, but The Colour Out of Space’s last sentence isn’t punchy in that way.)

The Shadow over Innsmouth is another example of that “defusing the horror” problem. While I enjoyed the certainty of having the story told in flashback after the government proved that the hero wasn’t crazy by coming in to address the problem, it also means that the problem has a sufficient solution to avoid having to deal with the lurking sense of horror. Sure, there’s a lurking threat to an individual, and there are horrifying elements, but it doesn’t have that inexorability that I admire The Colour Out of Space for.

Out of the Aeons is good, but, like The Call of Cthulhu, the threat is too distant. All physical evidence of the problem lurked under the ocean for eons before surfacing, and then returned to the ocean. It’s too easy to feel that it’ll probably stay under the ocean for eons more before coming up for another ultimately ineffective burp of momentary crisis.

Rather than in the middle of the sea, The Colour Out of Space takes place in a relatively settled part of rural America, and the ending does a powerful job of driving home that the threat is still lurking.

The Call of Cthulhu also shares a similar problem to At The Mountains of Madness if you’re going for horror… the threats are too comprehensible. Yes, they’re powerful, but, despite R’lyeh’s non-Euclidean geometry, it’s still just aliens and it feels like, with the march of science, we’re likely to have a way to effectively fight back by the time they surface. (That said, At The Mountains of Madness makes for a wonderful adventure story akin to something like Jules Verne’s Journey to the Center of the Earth [1] [2] [3] [4].)

The Colour Out of Space works so beautifully because it drives home that this is the realm of science, not magic, but, despite that, it utterly baffles the scientists called in to comprehend it. At the same time, the description presented to the reader is very convincing as something that is genuinely incomprehensible in its nature, yet comprehensible in its effects.

(As opposed to the sense of the author saying “take my word for it” that you feel with some supposedly genius characters, where willing suspension of disbelief is pressed into service beyond its ideal scope. It’s self-evident that characters like Sherlock Holmes, or L and Light Yagami from Death Note are geniuses and you feel that. Characters like Hermione Granger from Harry Potter, on the other hand, feel like the author is cheer-leading their intelligence, but it feels like they’re only intelligent when the author specifically tries to evidence it, rather than it being a property of their being which affects their thoughts and behaviour more subtly at all times.)

The other big reason The Colour Out of Space works so well is that there’s no “defeat the threat” moment. It’s more like the Portal games in that the protagonists survive without triumphing, thus avoiding the need to demonstrate that humans can win.

All in all, I highly recommend that you find 80 minutes when you can listen to an audiobook and give it a listen. Of Lovecraft’s works that I’d read as horror, it’s certainly the best… and if you enjoyed it, try some of the others I named. Lovecraft’s writing style takes a bit of getting used to, being strongly based on the archaic writing styles used in the antique books he grew up reading, but hearing it in audiobook form helps a lot, and his stories have a very distinctive atmosphere to them once you get into them.

P.S. It also was the inspiration for the most memorable part of a piece of Ranma 1/2 fanfiction from the rec.arts.anime.creative era named Bliss by Mike Loader and Lara Bartram. Given that it is still a memorable story to me, and that it won second place in the April 1999 TASS awards and 10th place in the 1999 annual TASS awards, I may re-read and review that too.

UPDATE: You may also want to watch Fredrik Knudsen’s Down the Rabbit Hole livestream special, Lovecraft & Junji Ito.

Posted in Reviews

A Verbal Middle Finger to WordPress

I just lost a fair bit of work from WordPress “helping” me, so I need to vent.

Think for a moment. What is the #1 thing that should define a good text editing experience… something so fundamental that you take it for granted… I’ll give you a hint. Autocorrect screw-ups are a weaker form of it failing.

Give up? It shouldn’t #$%*ing lose or corrupt your work!

WordPress deserves an award for how flagrantly it violates this principle and, if I can ever find the time to satisfy my obsessive dedication to preventing dead links, I’m going to switch to a static generator. (Partly because I blog infrequently enough that it feels like all I ever do is apply updates. Static HTML can’t have exploits in it.)

Let’s start simple. You might have noticed that I like to use all the semantic elements HTML gives me for making my posts as readable as possible, including tables and especially definition lists.

Have you ever tried to edit a table or definition list in WordPress? …especially with the legacy TinyMCE editor? There’s no option for them, so you have to switch into the raw HTML view and edit by hand.

That wouldn’t be so bad, except that there’s no “preview without publishing” option in raw HTML mode when you’re editing an existing post, except for switching back to WYSIWYG mode… and WYSIWYG mode normalizes your work into gibberish with no Undo option if you don’t get your HTML tags just right.

Better hope you made a backup of your HTML in an external text editor before previewing it.

…and that was before the new Blocks-based editor. If you’re using Blocks, you get a helpful little “Invalid HTML. What do you want to do?” popup with two buttons… they’re kinda cryptic and one of them basically means “convert from legacy TinyMCE block to custom HTML block without any apparent option to Undo”.

Better make a backup of that HTML, hit Reload to “undo”, and then paste the HTML into the TinyMCE/legacy block’s HTML mode to restore your work.

OK, so maybe I’m just too much of a technical writer and I should try using Blocks the way it was intended… I think you can guess by now that it’s no picnic either.

Want to edit stuff naturally in the new Blocks editor? Good luck. The interaction between cursor motion and selection means that trying to select a paragraph of scratch text without reaching for the mouse will snap the selection to the entire block, including text outside your intended span.

(Admittedly, this is something I dread having to do right in my own project, which also has editable text spanning multiple contenteditable elements… but at least I’m diligent enough to be aware of the problem and dedicated enough to try to figure out some kind of solution.)

…and that’s not even counting the idiocy that is how the Up arrow likes to jump from the second line of a paragraph block to the end of the previous block when you’re trying to position the cursor at the beginning of a paragraph. The Blocks editor is so braindead that it boggles my mind that nobody else noticed this.

(Nor the more minor annoyance that the heuristics for determining when to finalize one Undo action and start another are inferior to the desktop text editors I use, so I wind up undoing too much.)

…but I saved the worst for last. The pièce de résistance that inspired this whole rant. I just lost a big edit to an old list post because, for reasons I can’t even fathom, switching from TinyMCE to raw HTML reverted it to the saved version. No “Are you sure?”. I just did what I always do to switch to HTML to fix up my markup and… that’s odd. Where are my changes?

OK. Maybe I should have un-published the post into a draft for a moment, even though I despise 404ing links even temporarily… the “Save Draft” button is unreliable. Wait too long with a WordPress tab open and clicking “Save Draft” will leave you stuck on “[Icon] Saving”.

Now, admittedly, I do, on rare occasions, have HTTP requests get stuck on setting up the connection. I’m assuming it’s some kind of packet loss during the initial handshake and TCP has to wait for it to time out and resend the packet. Maybe it’s that WordPress is much more prone to that somehow.

…but the usual solution is to click the Submit button again… a button that WordPress hides. …and there’s no “click Stop and then reload”, because WordPress has no analogue to the Stop button!

Wanna use the preview function but have your browser set to only allow new windows/tabs to be opened by an explicit middle-click or context-menu choice? Too bad. The Preview button can’t be middle-clicked.

I’m on WordPress because I’ve been on WordPress since December of 2004, and my workaround, when I don’t get lulled into using the UI the intended way, is to just copy my post into gVim, edit it there, and then copy it back.

Given that the “can’t trust WordPress to protect your text” aspect of this seems fundamental to how the authors of WordPress approach the interaction between author, text, and tool, I think you can see why, as soon as I have time, I want to translate all my posts into Markdown, switch the site to something like Pelican, and say good riddance to bad rubbish.

UPDATE: Apparently it’s also prone to forcing a “convert to visual and resolve broken HTML” when losing focus so I can copy-paste text from another tab. Ffffff……

Posted in Geek Stuff

Novel – World War: In The Balance

Now for a book that has been mentioned before as a source for some of the fanfics I’ve reviewed:

World War: In The Balance by Harry Turtledove

The basic concept is that, during World War 2, a bunch of reptilian aliens who call themselves The Race arrive with plans for conquest. However, unlike in the usual old tropes where aliens are either hyper-competent or completely incompetent, these aliens are blinded by their preconceptions. More specifically, while technologically superior, they make Imperial China’s cautious, bureaucratic approach to progress look reckless, and they expected to find us still fighting with knights on horseback.

The main thing I can say about this story is “slow but steady”. Aside from the concept, it’s not a story characterized by creative ideas that make you stand up and take note, or graced with a plethora of quotable passages, but, if you give it time to grow on you, it’s like a nutritious but familiar staple. It’s something I find enjoyable and satisfying, but it took until around chapter six for me to feel that it had consistently made its characters interesting to read about. To continue the food metaphor, it’s tasty and something to look forward to, but not exciting like going to a restaurant.

I think the problem is that it feels like Turtledove was determined to make the story about his characters first and foremost, even if that meant turning down other opportunities presented by the concept. …and the characters are the kind who start OK and grow into being good, rather than starting out good and growing into being amazing.

The worst part of that is probably how much time the first chapter spends on a couple of American baseball players when it was supposed to be hooking my interest. They just don’t start to get interesting until after the war comes to them and, even then, they’re potentially the least interesting of the characters. My best guess is that they’re supposed to be relatable, but I’m not American, not into sports and, besides, this is supposed to be a blend of sci-fi and historical fiction.

It’s technically a first contact story, but it doesn’t embrace one group’s reactions to revelations about the other in the way first contact stories do when they truly commit to “being about” that. (Compare Turtledove’s short story The Road Not Taken, which is sort of the progenitor of Worldwar, for something which does embrace that aspect… or the web-original series The Deathworlders. Having aliens who are so far behind us in everything except faster-than-light travel is intrinsically more fantastic, so it takes less effort to make engaging.)

It’s technically historical fiction, yet I find Eric Flint’s writing much more engaging on that front. For all the moments which feel satisfying, like a village in Communist China celebrating an air raid that destroys only the compound where the corrupt officials live, the banter between two British radar technicians, or an interaction between a white man and a black man in the American South of the 1940s, it feels diluted compared to Flint.

Ultimately, the story spends a lot of “screen time” trying to make the characters relatable when it should recognize that it already has that and focus on what’s different instead. The whole point is to show the unfamiliar in a way that connects it to the familiar, not vice-versa. (Compare Eric Flint’s 1632 or Leo Frankowski’s The Cross-Time Engineer.)

You clearly love history, Mr. Turtledove. I understand not feeling that “jumping right into the action” is appropriate for the tone you want to establish, but literature is the interesting parts of life condensed. If you want to show a realistic pace of events, then you need to compensate in the chapters before your characters have proved themselves interesting by packing in other “little-known but obvious in hindsight” details. That’s one of the meanings of the phrase “write what you know”: write what you know, because others don’t know it and find it fascinating.

The baseball players are a particularly bad example of this, since I found them dull aside from the moment where one was able to translate “Espíritu Santo” because Spanish was close enough to Italian. I would have much preferred it if the other “ordinary human” groups were given more time, like the German soldier on the eastern front or the Jew in the Polish ghetto. They feel like they have more relevance. (Even then, I’d rather be reading scenes of the aliens reacting to Earth or of humans reacting to them, like the ground-based radar operators who first see evidence of them or the pilots who survive their first attack. That’s the most concentrated expression of what a first-contact story revolves around.)

It doesn’t help that Turtledove’s style isn’t very punchy, with the most memorable/quotable line I found being when one German soldier reassured another who had jumped at a sound from the sky by saying it was “Just one of the Ivans’ flying sewing machines”… which is amusing mental imagery.

(Sure, there was only one major quotable passage in Mother of Demons, but that book had a lot of philosophical meat to it and introduced me to the concept of r/K selection theory… things which this story lacks.)

That said, I do like the recurring theme that humanity manages to hold on by the skin of our teeth because we were damn lucky. The Race commander decides to start out with EMP weapons to fry all our solid-state circuitry… during the brief window of a few decades when we were using vacuum tubes. The Germans manage to take out the landing ships holding the main stockpile of nuclear weapons… with a giant piece of artillery (that really existed) that fires shells too robust to be taken out by their interceptor missiles and so archaic that they never anticipated it.

I also like the recurring focus on a world caught wanting to cling to old national disputes and racial identities in a situation where they must come together to survive a truly alien common adversary. …and the way it manages to work prejudices and slurs into dialogue in a fashion that doesn’t feel objectionable; it’s simply being honest about how such people would have behaved in that era.

That evolves nicely and, around the mid-point, you start to see characters, both human and Race, beginning to question things about their worldviews that they previously took as gospel, and starting to relate to each other.

It just feels like, in its weakest moments, the writing is… not dry, but too wedded to reminding the reader of how mundane things continue to happen in war, in a way which just comes across as feeling either slow-paced or padded.

It doesn’t help that the book is like the first volume of a three-part Lord of the Rings set… it stops at the end of an arc, but that doesn’t conclude the subplot it belongs to, let alone the larger arc the earlier chapters primed you to expect the book to follow. For a book that was so slow to get started, and so reluctant to feel punchy once it did, having an arc so long that it spans multiple volumes feels like Turtledove’s editor was asleep on the job.

(Bear in mind that I’m not saying all that content needed to be cut, but, if Turtledove didn’t want to tighten it up, I’d probably suggest doing something more like what Anne McCaffrey did with most of her Pern books, and writing multiple books that explore the same period of world-changing events by following different characters.)

All in all, I’d give it a 4.0 out of 5. It’s solid, well-done writing, it stays engaging once it gets past that initial hump, and it’s clear that Harry Turtledove knows his stuff when it comes to history, but I expect so much more from a professionally published book.

Posted in Reviews

A simple Clipboard/Drag-and-Drop Test Tool

While working on one of my projects, I found myself needing to inspect a raw, “all offered mimetypes” view of what was on the clipboard.

I remembered that I used to do this with some PyGTK demos, but they were never as feature-complete as I wanted, I don’t know whether they’ve been ported to GTK 3.x, and, besides, I use Qt these days.

…so, I wrote the utility I wanted and here it is in case it helps anyone else. It does clipboard, selection, and drag-and-drop.
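If you just want the core trick rather than the full utility, here’s a minimal sketch, assuming PyQt5 (the post doesn’t say which Qt binding the actual tool uses): everything Qt hands you for the clipboard, the selection, or a drop is a QMimeData, so one helper that walks `formats()` covers all three cases.

```python
# Hedged sketch, assuming PyQt5; names here are illustrative, not the
# actual tool's API.
from PyQt5.QtCore import QMimeData
from PyQt5.QtWidgets import QApplication


def list_mime_formats(mime: QMimeData) -> list:
    """Return every MIME type the given QMimeData object offers."""
    return [str(fmt) for fmt in mime.formats()]


def dump_clipboard(app: QApplication) -> None:
    """Print each clipboard format and the size of its raw payload."""
    mime = app.clipboard().mimeData()
    for fmt in list_mime_formats(mime):
        payload = bytes(mime.data(fmt))  # QByteArray -> bytes
        print(f"{fmt}: {len(payload)} bytes")
```

After constructing a `QApplication`, calling `dump_clipboard(app)` prints the raw “all offered mimetypes” view; the same `list_mime_formats` helper works on the QMimeData delivered to a widget’s `dropEvent`, or on the primary selection via `app.clipboard().mimeData(QClipboard.Selection)`.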

Posted in Geek Stuff