Novel – Decision at Doona

While working on another blog post (still to come), I needed a point of comparison, so I decided to re-read one of my old favourites.

Decision at Doona by Anne McCaffrey

At its core, this 1969 novel is a soft sci-fi first-contact story: humans and an alien species known as the hrrubans each miss the other in their surveys and wind up colonizing the same planet at the same time… for the same reason.

Both species are horrendously overpopulated and, to cope, have pacified their cultures to the point where youth suicides are up and apathy is causing a labour shortage. The colony on Doona/Rrala is an experiment in returning to an earlier stage of their cultural development.

According to Ms. McCaffrey’s Wikipedia page, the idea for it actually came to her when, at a school play, she watched a teacher tell her four-year-old son to quiet down.

The core driving conflict is that, in both cases, their governments have a “no contact” policy which would pull everyone back to the overcrowded tedium they thought they’d escaped. For the humans, the big conflict comes from trying to do the wrong things for the right reasons. Specifically, this is humanity’s second first contact, and an innocent mistake in their first went so disastrously wrong that the other species committed mass suicide. The result was a deep and lasting sense of guilt and a law forbidding contact with indigenous populations under any circumstance… and, as far as the humans are aware, the hrrubans are not space-faring.

(Of course, the grin-inducing part is that the hrrubans are actually more technologically advanced, to the point where their village wasn’t found on surveys because they have matter-transmitter technology and brought the whole thing home for the winter.)

I’ve always enjoyed Anne McCaffrey’s work and I’ve always enjoyed first-contact stories, so, on that front, the only thing I would have wished for is more time spent looking at humans from the perspective of hrruban characters. It’s also got two sequels, the latter of which involves humans and hrrubans engaging in first contact with a third species together.

…So, what are the noteworthy characteristics?

First, I like how it introduces things. The first chapter gives the colonization plans from the perspective of the hrruban government officials… but it’s so familiar that, if you skipped past the list of characters at the beginning of the book, it’s easy to mistake titles like “First Speaker” for fancy sci-fi human titles until chapter 8, when they refer to the aliens as “bareskins”. It’s a nice touch for a story intentionally built around how much the two species have in common.

I also like the world-building in details like the hrrubans having developed a treatment for the weak, porous local wood which involves boiling the sap of a local tree until it turns into a penetrating varnish which reinforces and seals it.

(And, while it’s in one of the sequels rather than the original, I just have to mention the pun that stuck in my memory for the two decades when I was too distracted by other things to read any Anne McCaffrey: a pilot named Mr. Horstmann who called his ship The Apocalypse just so he could make a Dad Joke everywhere he went.)

Now, about the downsides…

The first downside is one I don’t consider a downside, but others might. I like Anne McCaffrey, but my brother doesn’t because he feels that all her stories feel a bit too much like rural contemporary stories transplanted into other settings. I don’t get that impression strongly enough for it to be a problem, but you might feel differently.

The second downside is one that will only get worse over time, and Decision at Doona is probably harder hit by it than her other stories, being both sci-fi and family-centric. She did a better job than various male authors I’ve read from the same period, and the book clearly wouldn’t pass John W. Campbell’s dictate, as editor-in-chief of Astounding Science Fiction (later Analog), that humanity always be depicted as superior to aliens, but Ms. McCaffrey couldn’t completely rid her futuristic family-man main character and his wife of 1969-isms.

That shouldn’t be as big an issue if you’re reading a story about a setting like Pern which has regressed to pre-industrial pseudo-fantasy, or one of her other series with an independent woman as the lead. However, reading Doona in 2020, it’s noticeable how important it is to the main character’s psyche to remain “in charge” as a father, and all I can remember about what his wife actually does is that she takes care of the kids and cooks.

To be fair, the book is only 245 pages long and there is a fairly sharp drop-off in how much focus is given to secondary characters. It could partly be that having the main character and his son as primary characters, with the wife relegated to the secondary cast, exacerbates the implausibility of a woman of the future having no apparent noteworthy traits outside the core traditionally feminine pursuits.

To compound that impression, there was a scene which I was certain was alluding to him having spanked his six-year-old son (whose incorrigibility is very significant to the plot). Later, though, that gets called into question when one of the girls who was watching the boy winds up suffering from alien poison ivy on steroids and the line “She’s never hurt anywhere in her life. How do you explain pain to her?” comes up.

It also occurs to me that, given how many of her books have female leads, McCaffrey may have been accidentally overcompensating in writing Doona, and folded a few too many 1969-isms back in while trying to prevent her male lead from coming across as feminine. It would make sense, given that Doona is the earliest McCaffrey story with a male lead that readily comes to mind.

But, whatever the reason, “the squeaky wheel gets the grease.” I don’t want to give an overinflated impression of how much these flaws grabbed my attention while I was reading. It’s still an enjoyable read and I didn’t notice them at all as a teenager. Likewise, there will no doubt be many people who grew up with fathers like the main character. …I just think it’s something that will only get more discordant for future generations, as more and more families still don’t have interstellar travel but have already reached the point where men don’t see dominance as an essential part of their self-identity.

All in all, I’d give it 4.3 out of 5 and, without the anachronistic elements, it’d be a 4.5 for how much I’ve enjoyed my various reads of it over the years. It’s definitely something to try if you like Anne McCaffrey’s work, and the length and style are well-suited to a teenager looking to grow beyond Young Adult fiction.


Novel – Mother of Demons

Here’s a book I read on the Baen Free Library ages ago and finally got around to buying and reading in print at the beginning of December.

Mother of Demons by Eric Flint (sadly, no longer in the Free Library)

Superficially, it’s another “failed colony” story, similar to the premise used by Anne McCaffrey’s Pern series and Marion Zimmer Bradley’s Darkover series for creating a fantasy story in a fundamentally sci-fi setting.

Darkover is still on my TODO list, but I can already tell that I tend to prefer this kind of story. I normally read fantasy despite the simplified morality rather than because of it, and, if a cosmology includes gods, I’m prone to seeing it as a hint of dystopianism… a sign that the children will forever be ruled by their parents. It may also be that my outlook on the world harmonizes better with authors who are drawn to writing sci-fi underneath their fantasy.

I love that the book has that mindset. It’s rich with knowledge, and even the title embodies it: as the story points out, outside of the Abrahamic religions, “demon” essentially means “powerful outsider” without automatically implying “evil”.

On that note, this isn’t really a fantasy story. Rather, it’s a blend of historical fiction, fantasy, and sci-fi influences: a sci-fi setup where the technology was lost, seen through the eyes of bronze-age land squids and human PhDs… and that’s what makes it good. It’s engaging to see the humans from the perspective of the aliens, and it’s engaging to read the insights of the highly educated human characters from an author like Eric Flint.

(If you’ve ever read Eric Flint’s Ring of Fire series (beginning with 1632), you’ll know what I mean when I say that this mixes in sociology, history, tactics, and religion… Eric Flint clearly loves to study history, then share that love in his writing.)

It does have the odd moment of humour, but it’s generally serious. A notable example would be this exchange between the main two surviving human adults:

Julius immediately named their hut “Sodom and Gomorrah.” And he demonstratively refused to come near it, fearing, or so he claimed, the wrath of God.
“You don’t even believe in God!” Indira had once protested.
Julius chewed his lip. “No, I don’t. But you never know. And if He does exist, He has two outstanding characteristics. Judging, at least, from the Old Testament.”
“Which are?”
“He’s the most hot-tempered, narrow-minded, mean-spirited, intolerant, anal-compulsive, bigoted redneck who ever lived. And, what’s more to the point, He’s a lousy shot.”
“It’s true!” he insisted, in the face of Indira’s laughter. “Read the Book yourself. Somebody pisses Him off, does He nail ’em right between the eyes like Buffalo Bill? Hell, no! He drowns everything. Or He blasts whole cities, or drops seven lean years on entire nations.”

— Mother of Demons by Eric Flint, p. 111

The core premise hinted at by the title, which gets revealed as the story progresses, revolves around Indira (the historian and one of only a handful of surviving adults) seeing events unfold through a historian’s eyes while being tormented by her inability to accept that, no matter what she does, she will be instrumental in shaping history and a great deal of suffering is on the horizon.

The simplest metaphor I can give for her dilemma is that a cannibalistic analogue to the Roman army is on the march, and the leader of the next generation of humans has the potential to found an analogue to the Mongol Empire… but, at the same time, that’s not necessarily as bad as it may sound, because the Mongols got a lot of bad press in the West. (The Mongol Empire was a beacon of religious tolerance and appreciation for academic knowledge… they just also pissed off medieval Europe with their pragmatism in the art of war and how effective it made them against chivalrous knights.)

Philosophy is a significant focus of the story… more so in the later chapters than the earlier ones. It starts out sparse, but then builds and builds, even as the history, which was never sparse, also builds and carries the readers along for the ride.

I’ve always been too fascinated by deep technical knowledge to dedicate proper effort to being a student of the softer sciences but, just in how it’s presented, this comes across as an amazing work on the history alone.

The story also makes clever use of “the colonists are dependent on a local species whose spiritual beliefs preclude dissection” to justify limiting how much time is spent describing the aliens’ biology.

That said, for all that I praise the story, I wish the second part had been structured differently, because it feels like it’s told almost exclusively in flashback (regardless of actual temporal progression) and that left me having trouble caring about the humans, because I wanted to go back to the alien characters.

On the other hand, when the time spent on the humans is good, it’s good. I love that it touches on the distinction between K- and r-type breeding strategies, and it was very satisfying to read an argument (p. 199) for why it should be inevitable for humans and aliens to have similar emotions, as a result of similar survival pressures needing to be processed by the brain and surfaced to the conscious mind.

To be honest, I suspect that my issue with the time spent on humans might be in part because I’m an aspie, so others may not have as big a problem with the structure as I did. I have mentioned before that, when I read Poul Anderson’s Tau Zero (nominated for a Hugo award), I skimmed through most of it, bored out of my mind as I waited for the soap opera to give way to “good writing”.

As it approaches the end, there’s a lot of focus on military tactics, which helps to make things interesting in such a fractious setting. It’s a rare story where I enjoy a battle scene in print because I’m sharing it with a character who’s more or less critiquing it. (p. 296)

I suppose, if there were one thing I’d have to pick as the best part of the story, it’d probably be how masterfully Eric Flint piles on the fantasy-style “invent tons of words and cultural elements” world-building without it ever feeling onerous to me. That’s a skill far too often underestimated (it’s one of the few areas I most strongly feel I don’t understand the rules of) and, speaking subjectively, it’s a test even professional authors don’t always pass. (For example, I’ve been meaning to read The Left Hand of Darkness by Ursula K. Le Guin and The Bone Doll’s Twin by Lynn Flewelling, but I just can’t find a time when I’m in the right mood to get into them.)

By fanfiction standards, it’d definitely be a 5 out of 5, but I’ve been slacking on print fiction for so long that I worry my sense of how good print works can be is out of touch with my current level of insight. As such, I’ll rate it a 4.8 to leave a little room above it for what I anticipate encountering when I read or re-read some of the other books in my collection. (The question is whether stories which avoid the issues I had with this one are common enough to merit reserving an entire rating increment, or rare enough that they should be distinguished by my equivalent of a “nominated for/won an award” category.)

Either way, give it a read.


A virtualenv indicator that works everywhere

NOTE: While none of the significant elements of this approach require zsh, the formatting syntax and mechanism for updating a prompt dynamically differ between zsh and bash. See the end for a version with zsh-specific bits stripped out.

For a while, I’ve been using a variation on this zsh prompt tweak to get a pretty indication that I’m in a virtualenv. However, I was never quite satisfied with it for two reasons:

  1. It only works for virtualenvs activated through virtualenvwrapper.
  2. It goes away if I launch a child shell… which is when I’m most likely to be confused and needing an indicator.

The solution was obvious: Instead of using a virtualenvwrapper hook, put something in my .zshrc which will detect virtualenvs opened through any means.

For those who just want something to copy-paste, here’s what I came up with:

zsh_virtualenv_prompt() {
    # If not in a virtualenv, print nothing
    [[ "$VIRTUAL_ENV" == "" ]] && return

    # Support both ~/.virtualenvs/<name> and <name>/venv
    local venv_name="${VIRTUAL_ENV##*/}"
    if [[ "$venv_name" == "venv" ]]; then
        venv_name=${VIRTUAL_ENV%/*}
        venv_name=${venv_name##*/}
    fi

    # Distinguish between the shell where the virtualenv was activated and its
    # children
    if typeset -f deactivate >/dev/null; then
        echo "[%F{green}${venv_name}%f] "
    else
        echo "<%F{green}${venv_name}%f> "
    fi
}

setopt PROMPT_SUBST PROMPT_PERCENT

# Display a "we are in a virtualenv" indicator that works in child shells too
VIRTUAL_ENV_DISABLE_PROMPT=1
RPS1='$(zsh_virtualenv_prompt)'

First, notice the use of VIRTUAL_ENV_DISABLE_PROMPT. This is because activate will prepend a less attractive indicator to PS1 that also goes away in child shells.

(Just make sure you remove any PS1="$_OLD_VIRTUAL_PS1" you might have added to postactivate or you’ll have no prompt after typing workon projname and be very confused.)

Second, note the use of PROMPT_SUBST. This is actually shared with my code for adding git branch information to PS1, PS2, and PS3 because profiling showed it to be faster than using a precmd function.

Third, note the single quotes for RPS1. They’re necessary to defer the invocation of $(zsh_virtualenv_prompt) so PROMPT_SUBST can see it.

I also added a couple of convenience features:

  • I have had a history of virtualenvwrapper not getting along with Python 3.x, so some of my projects have their virtualenvs at ~/src/<name>/venv rather than ~/.virtualenvs/<name>. This script will display <name> in the prompt either way.
  • If I’m in a child shell where the deactivate function isn’t available, the prompt will show <foo> rather than [foo] to make me aware of that.

Aside from that, it’s just the ordinary effort of avoiding disk I/O or $() in something that’s going to run every time the prompt is displayed, plus structuring the function so the most common code path executes the fewest statements.

While this StackOverflow answer cautions against using VIRTUAL_ENV to detect virtualenvs, its reasoning doesn’t apply here, because it’s talking about detecting whether your Python script is running under the influence of a virtualenv, regardless of whether activate was used to achieve that. The purpose of this indicator, on the other hand, is specifically to detect the effects of activate so I don’t run something like manage.py runserver or pip install in the wrong context.

Bash Version

bash_virtualenv_prompt() {
    # If not in a virtualenv, print nothing
    [[ "$VIRTUAL_ENV" == "" ]] && return

    # Support both ~/.virtualenvs/<name> and <name>/venv
    local venv_name="${VIRTUAL_ENV##*/}"
    if [[ "$venv_name" == "venv" ]]; then
        venv_name=${VIRTUAL_ENV%/*}
        venv_name=${venv_name##*/}
    fi

    # Distinguish between the shell where the virtualenv was activated and its
    # children
    if typeset -f deactivate >/dev/null; then
        echo "[${venv_name}] "
    else
        echo "<${venv_name}> "
    fi
}

# Display a "we are in a virtualenv" indicator that works in child shells too
VIRTUAL_ENV_DISABLE_PROMPT=1
PS1='$(bash_virtualenv_prompt)'"$PS1"

It’s almost identical to the zsh version, but the following conveniences, which zsh provides for free, are left to the reader in the bash version:

  • Implementing a right-aligned chunk of the prompt which stays properly positioned if you resize your terminal.
  • Using tput to retrieve the colour-setting escape sequences for your terminal and then caching them in a variable, so you’re neither hard-coding for a specific terminal type nor performing multiple subprocess calls each time you display your prompt. (A rough sketch of the caching idea follows.)
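For the second of those, the core idea is just to pay the tput cost once at shell startup and reuse the result. Here’s a minimal sketch; the variable names are purely illustrative:

# Cache the escape sequences once, instead of calling tput on every
# prompt redraw. "tput setaf 2" is green foreground, "tput sgr0" is reset.
if [[ -z "$_PROMPT_GREEN" ]]; then
    _PROMPT_GREEN="$(tput setaf 2)"
    _PROMPT_RESET="$(tput sgr0)"
fi

# ...then, inside bash_virtualenv_prompt, something like:
#     echo "[${_PROMPT_GREEN}${venv_name}${_PROMPT_RESET}] "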


How to skip the fortune command when your shell is slow to start

A.K.A. How to get and compare timestamps without external commands in shell script (and without even invoking subshells in Zsh)

I love the fortune command. It’s a charming little addition to each new tab I open… until something (like a nightly backup) has blown away my disk cache or a runaway memory leak is causing thrashing. Then, it’s just a big delay in getting to what I want to do.

The obvious solution to any non-shell programmer is to time everything and invoke fortune only if it’s not already taking too long, but shell script complicates that by having so few builtins. We don’t want to invoke an external process, because that would defeat the point of making fortune conditional, and we don’t want to invoke a subshell because, if we’re thrashing because of memory contention, that’ll also make things worse.

It turns out that bash 4.2 and above can get us half-way there by using a subshell to invoke the printf builtin with the %(%s)T token, but Zsh has a clever little solution that even reuses code that we’re going to need anyway: prompt substitutions!

Here’s the gist of how to pull it off:

# Top of .zshrc
local start_time="${(%):-"%D{%s}"}"

# -- Do all my .zshrc stuff here

local end_time="${(%):-"%D{%s}"}"
if (( end_time - start_time < 2 )); then
    if (( $+commands[fortune] )); then
        fortune
    fi
else
    echo "Skipping fortune (slow startup)"
fi

This is a standard “subtract the start time from the end time to get how long it took, then compare it to a threshold” check, so the only part that should need to change in bash is using start_time="$(printf '%(%s)T' -1)" (a sketch of the full bash variant appears further down). For now, let’s pick apart how the Zsh version works:

  1. We start with a bog-standard ${VAR:-DEFAULT} parameter expansion; however, unlike bash, Zsh does consider ${:-always default} to be valid syntax.
  2. The (%) on the left-hand side is a special magic flag, similar to the (?i) syntax used for inline flag-setting in some regular expression engines. It enables prompt expansion of both the (nonexistent) variable’s contents and the fallback value.
  3. %D{...} is Zsh’s prompt expansion placeholder for putting strftime (man strftime(3)) timestamps into your prompt.
  4. %s is the strftime token for “seconds since the epoch”
  5. You have to quote the %D{...} or the ${...} consumes the closing curly brace too eagerly.
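If you want to see that expansion in isolation, you can paste it straight into an interactive zsh session (the format strings below are just examples):

# Prompt-expanding a "default value" that is really just a %D{...} token
print -r -- "${(%):-"%D{%s}"}"              # e.g. 1588888888
print -r -- "${(%):-"%D{%Y-%m-%d %H:%M}"}"  # e.g. 2020-05-07 14:30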

That’s the big magic thing: a way to write an equivalent to C’s time() function in pure Zsh script, with no use of $(...) or zmodload, and, since we’re using prompt expansion to do it, the only thing we might not already have needed to load into memory is the code for the %D{...} expansion token.

(Unfortunately, there’s no way to get sub-second precision with this approach, so the only two useful threshold values for a well-optimized zshrc are probably “1 second” and “2 seconds”.)
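For completeness, here’s roughly what the bash 4.2-and-up variant mentioned above might look like. (A sketch, not a drop-in: bash has no $commands hash, so the existence check falls back to command -v, and the -1 argument to %(%s)T explicitly requests the current time.)

# Top of .bashrc -- requires bash 4.2+ for the %(...)T format
start_time="$(printf '%(%s)T' -1)"

# -- Do all my .bashrc stuff here

end_time="$(printf '%(%s)T' -1)"
if (( end_time - start_time < 2 )); then
    command -v fortune >/dev/null && fortune
else
    echo "Skipping fortune (slow startup)"
fi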

Now for that odd (( $+commands[fortune] )) way of checking for the presence of the fortune command. What’s up with that?

Well, it’s actually a micro-optimization that I use in my zsh-specific scripts. According to this guy’s tests, it runs in half the time the other options take and, in my own runs of his test scripts, I found that, depending on the circumstances, it could take as little as one tenth the time of the others, and that the others vary wildly relative to each other. (On runs where $+commands is 7 to 10 times as fast as type and which, hash is sometimes twice as fast as type or which and sometimes half as fast.)
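For reference, the slower but portable checks it’s being compared against look something like this (any of them would do the job, just not as quickly):

# Builtin lookups; the $(...) form pays for a subshell on top
type fortune >/dev/null 2>&1 && fortune
command -v fortune >/dev/null 2>&1 && fortune
[ -n "$(command -v fortune)" ] && fortune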

Normally, this would be a moot point because any of the portable ways of checking for the existence of a command via a subshell and a builtin would be far too quick for it to matter (i.e. I do it just for the heck of it) but, in this case, it felt appropriate.

(Another unnecessary micro-optimization that I didn’t use here was preferring [[ ]] over [ ] in my zshrc scripts. My tests found a million runs of [[ "$PWD" == "$HOME" ]] to take about 1.4 seconds, while a million runs of [ "$PWD" = "$HOME" ] took about 4.2 seconds.)
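If you want to reproduce that comparison yourself, the timing loop is nothing fancier than this (a hypothetical reconstruction; the absolute numbers will vary by machine):

# Compare [[ ]] and [ ] over a million iterations each
time (for (( i = 0; i < 1000000; i++ )); do [[ "$PWD" == "$HOME" ]]; done)
time (for (( i = 0; i < 1000000; i++ )); do [ "$PWD" = "$HOME" ]; done)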


On-Demand Loading for your .zshrc or .bashrc

Recently, I’ve been trying to make my coding environment snappier, and one thing I was never happy with was how slow my .zshrc is.

Now, don’t get me wrong, I’m not one of those people using oh-my-zsh with a ton of plugins and seeing 15-second waits for my shell to start… but I do want a new tab to be ready in a second or less.

So, I slapped zmodload zsh/zprof onto the top of my .zshrc, opened a new tab, and ran zprof | less …and 50% of the wait was in sourcing virtualenvwrapper, which I don’t feel like reinventing.

Time to take a lesson from the improvements I’ve been making to my .vimrc. Specifically, the { 'on': ['CommandA', 'CommandB'] } option hanging off the end of various lines for my plugin loader.

A little experimentation later and I came up with this construct:

function init_virtualenvwrapper {
    # Don't do anything if it's already loaded
    type virtualenvwrapper_workon_help &>/dev/null && return

    # ------------------------------------------------
    # normal stuff to load virtualenvwrapper goes here
    # ------------------------------------------------
}

for cmd in workon mkproject mkvirtualenv; do
    function $cmd {
        unset -f "$0"
        init_virtualenvwrapper
        "$0" "$@"
    }
done

For those not familiar with shell scripting, I’ll clarify.

For each shell function or command that I want to trigger deferred loading, I create a function with the same name that does the following:

  1. “Delete” itself so it won’t interfere with what virtualenvwrapper is going to set up. (You want to do this first so you don’t accidentally remove the real command that virtualenvwrapper is about to create.)
  2. Call init_virtualenvwrapper to load the real command. (init_virtualenvwrapper starts by checking for some side-effect of having been run before and exits early if that’s the case. This keeps mkproject from re-doing what workon already did, or vice-versa.)
  3. Call the actual command and pass through any arguments.

Doing this means that:

  1. Your .zshrc or .bashrc startup time only pays the price for declaring a few shell functions. (And, if that gets too heavy for some reason, you could move init_virtualenvwrapper into another file and source it on demand.)
  2. Your first call to a wrapped command like workon will take longer. (e.g. if it was adding two seconds to your shell start time, then your first call to it will take two seconds longer.)
  3. Subsequent calls to that or any other command sharing the same init_virtualenvwrapper will be as quick as usual.

Unfortunately, this design is actually Zsh-specific, which sucks for me because this is a file I share between .zshrc and .bashrc:

  1. Bash doesn’t support using a variable for a function name, so you can’t use a for loop. You’ll just get `$cmd': not a valid identifier.
  2. In my testing, bash functions don’t set $0 to the function name, so "$0" "$@" will actually execute something like bash "$@", bringing you back to where you started… while zsh doesn’t set the FUNCNAME array variable that bash uses.

So, if you want to support both, here’s the most concise form I was able to put together:

function init_virtualenvwrapper {
    local _cmdname="$1"
    shift
    unset -f "$_cmdname"

    # Don't do anything if it's already loaded
    if ! type virtualenvwrapper_workon_help &>/dev/null; then
        # ----------------------------------------
        # normal stuff to load virtualenvwrapper
        # ----------------------------------------
        : # no-op placeholder so the otherwise-empty "if" body still parses
    fi

    "$_cmdname" "$@"
}

function workon {
    init_virtualenvwrapper "${FUNCNAME[0]:-$0}" "$@"
}
function mkproject {
    init_virtualenvwrapper "${FUNCNAME[0]:-$0}" "$@"
}
function mkvirtualenv {
    init_virtualenvwrapper "${FUNCNAME[0]:-$0}" "$@"
}

Anyway, I hope this helps to inspire anyone else who’s suffering from slow shell startup times.

UPDATE: And now, shortly after writing that, I’ve discovered that someone else went to the trouble of using eval to provide a nice API on top of this trick and put it up on GitHub as sandboxd. From that name, I can see why I didn’t find it before.


So… What Does Your Government Do With Culture?

So often, asking people about culture is like asking a fish “How’s the water?” (The answer you’ll get is “What’s water?”), but it’s still useful to ask the question and, sometimes, you get interesting answers.

This time, I’m wondering about ways your country’s government has promoted the enrichment of culture, and I think the best way to jog people’s memories is to give a bunch of examples of the kind of thing I’m talking about.

Everyone’s at least heard of government grant programs in the abstract, and Canada does have those. They’ve been instrumental in the creation of indie games I love, like Guacamelee, and shows like Mayday, a long-running docudrama series nominated for many awards which, unlike so many American ones, remembers that air crash investigations are detective stories first and human drama second.

Can you think of any “thanks to” credits for other government agencies or programs that show up in the credits of your favourite shows or on the websites of your favourite games?

…but, still, that’s kind of an obvious way to do it. What about stuff that’s less overtly “government promoting culture”?

Next down the progression of obviousness, there’s public broadcasting. Like the U.K., Canada has a public broadcaster (the CBC). It does produce excellent content of its own, such as the radio programs Quirks & Quarks and Because News (also available as podcasts), and it has adapted well to the Internet era (in addition to podcasts and the like, they also do print articles now), but it was actually ahead of its time. For quite a while before YouTube came around, you used to be able to watch complete archives of shows like Royal Canadian Air Farce on the CBC website in RealVideo format.

The U.K. actually has more than one public broadcaster. For everyone who knows about the BBC, how many of you know that Channel 4 (of Time Team fame) is also government-owned?

… but encountering your public broadcaster while channel-surfing is still too obvious. Let’s go deeper.

For example, since 1961, the CBC has been part of a partnership with House of Anansi Press and the University of Toronto to produce the Massey Lectures. If you like TED Talks, check them out. (I especially recommend Doris Lessing’s Prisons We Choose To Live Inside from 1985. It’s an amazing talk about human psychology that’s more relevant than ever, you can listen to it online for free, and, to my embarrassment, I didn’t know about it until the print version was assigned to me as reading in university.)

Can you think of anything your government contributes resources to along these lines? Recurring cultural events?

…how about PSAs that go beyond just being practical and help to spread culture? When I was a child, I don’t remember CBC television having commercials… though it’s possible they just had a reduced supply of them. The important thing is, their shows were formatted to leave room for a normal number of commercials. …so how did they fill that time?

Some other channels, such as the Family Channel (a kids channel which used to be commercial-free), filled the time with random pop music videos, but CBC did something a little more appropriate… they filled the time with shorts provided by other government-backed cultural enterprises like the National Film Board of Canada, and Canadian Heritage Minutes. Anyone who grew up in Canada is likely to fondly remember these things, so I’d say it was hugely effective.

Like so many kids, I forgot most of what I learned in history class, but I still remember the amusing origin of the name Canada, the Halifax explosion, and the origin of Winnie the Pooh.

Likewise, what kid would know about Wade Hemsworth’s music if not for classic animated shorts like The Log Driver’s Waltz and The Blackfly Song produced by The National Film Board of Canada? (Not to mention the classic cartoon version of The Cat Came Back?)

Can you think of anything this engaging that your government actively produced or did they stick to purely functional pieces like Duck and Cover? (I’ll also accept stuff that isn’t distinctive to your local culture, but demonstrates that PSAs can be entertainment in their own right, such as Australia’s Dumb Ways to Die.)

Anyway, now we get to the stuff you take most for granted.

When I was a kid and I occasionally saw American money, I’d think “Huh. American money is ugly.” Much later, my father brought home a bunch of European coins. To my surprise, it turns out that it’s not that American money is ugly… it’s that Canadian money is uncommonly artistic. Of all the money I saw, the only other country with comparably beautiful coinage was Ireland. Don’t believe me? Scroll down to the pictures on these pages. (Note: I linked to pre-Euro Irish currency because that’s what I saw.)

The U.S., the U.K., Belgium, France, Switzerland, Germany… every other coin I saw had some boring piece of patriotic imagery or maybe a coat of arms, while Canadian and Irish coins were beautiful expressions of the culture of the nation in question. (Don’t believe me? Here’s the Canadian 50 dollar bill from 2004… the imagery commemorating The Famous Five wasn’t on a special commemorative bill. That was the normal fifty.)

It makes it look as if all those cultures have deep insecurities, so they’re “compensating for something” with their patriotic imagery. It’s such a given that I love my country that, for most of my life, I never understood the point of putting up a Canadian flag on a non-government property. Why plaster the same patriotic imagery everywhere like graffiti when you could instead be expressing yourself, either by creating your own art or by displaying other people’s art which speaks to your sense of aesthetics?

…but enough of that tangent. Another example would be the Canadian flag. No tiny details like on Mexico’s flag, but still with more artistry recognizable to the common person than all those flags made of coloured rectangles and/or stars. (I’m not singling the U.S. out here. Look at France, the U.K., Russia, and countless other countries.)

It just seems like countries get boring when the topic of government art comes around. The only other flags that readily come to mind as having that kind of elegance are the Japanese and South Korean flags and the fern iconography that showed up in the New Zealand flag referendums.

…or, for that matter, look at the elegance of the T-130 wordmark used on official Canadian government documents and signage, such as the equally elegant T-605 primary identification signs.

If you’re going to see something in so many places, why is it so rare for government decision-makers to grasp that it should be as aesthetically satisfying as possible?

It took me years to notice these things, so I’m really curious to see examples where Canada is the boring one and I’m just taking it for granted that thing X and thing Y are boring. (Does anyone have a really interesting national anthem? Canada’s seems to be just as boring as the American and Australian ones.)


Forcing Firefox to Open CBZ Files Properly

If you’ve ever downloaded a .cbz file using Firefox and then tried to click it in the downloads panel, you might have noticed that Firefox ignores the association for the .cbz extension and instead opens it as a .zip file. (This isn’t the only situation where this happens and I filed a bug about it a year ago.)

I never got around to looking into why it doesn’t make the same mistake with .odt documents, which are also Zip files with a specialized extension, but I think you can see why I wouldn’t like it.

Here’s a quick little script that can be set as the Zip handler on a KDE-based desktop. It hands over to Ark under normal circumstances but, if it receives a file with a .cbz extension, it pops up a Yes/No dialog offering to open it in Comix instead.
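The script itself isn’t reproduced here, so, purely as a rough illustration of the idea (not the original script; it assumes kdialog, ark, and comix are all on the PATH):

#!/bin/sh
# Rough illustration of the handler described above: hand everything to
# Ark, but offer to open .cbz files in Comix first.
for f in "$@"; do
    case "$f" in
        *.cbz|*.CBZ)
            if kdialog --yesno "Open $(basename "$f") in Comix instead of Ark?"; then
                comix "$f" &
                continue
            fi
            ;;
    esac
    ark "$f" &
done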

Anything which obeys the system’s associations properly will never trigger the dialog, because .cbz files will be associated with Comix directly, and the script won’t pop up the dialog if fed a .zip file, so it’s about the best solution one can have without fixing Firefox.

This approach could also be easily extended to application/octet-stream to work around the other situation where I’ve seen this causing problems. (Patreon serving up image files with the wrong mimetype, if I remember correctly.)
