A Balanced Exploration of Arguments For and Against Ape-like Cryptids

I was watching some old documentaries on YouTube, and one on Bigfoot made some points I hadn’t considered when concluding that it was just a mix of myth and hoax.

Don’t get me wrong, I haven’t become a believer, but I now don’t feel that there’s enough evidence to solidly disbelieve either. (Luckily, I don’t really have an emotional connection to either outcome, and that ambivalence makes it easy to sit on the fence.)

Arguments Against

So, why was I so certain that Bigfoot didn’t exist before? Mostly because I had the following strong arguments, which grouped Bigfoot in with other things that everyone agrees are fake:

First (and most convincingly), now that everyone has a cameraphone in their pocket, we should be seeing a lot more evidence. Evidence for real phenomena doesn’t stay flat as the number of people with the will and means to gather it grows exponentially.

The BBC had a whole series, Nature’s Weirdest Events, built entirely around this idea and the fact that, with the massive number of people carrying cameras now, even very rare occurrences are being conclusively recorded.

Second, in 2017, researchers finally did DNA analysis on purported Yeti remnants (fur samples caught on vegetation in valleys, scalps kept as relics, etc.) and every single one was conclusively determined not to be from a primate. Some were potentially from an unknown type of bear, but the only surprising positive identification was that, judging by the fur samples, the Himalayan goral has a larger range than we thought.

Third, oral histories (in any culture) tend to blend myth and reality in ways that mislead those who try to extract reality from them without being experts, and Bigfoot believers invariably brush this under the rug.

Fourth, it’s very rare for large animals to remain undiscovered. The countless new species we continue to discover tend to be insects, small reptiles and amphibians, plants, and fungi.

Finally, Bigfoot believers tend to set up a false dichotomy that, with all this evidence, it must either be true or some grand conspiracy to commit a hoax. However, we humans have a history of cognitive biases producing odd results.

It doesn’t have to be a global conspiracy to engage in a hoax if our neurobiology is inherently biased toward developing the concept of “wild men of the wilderness” in the same way that it’s inherently biased toward inventing gods to explain nature.

When people see something they’re not certain about, they tend to fill in the details based on whatever cultural zeitgeist is floating around in their heads. One of the books I read (probably Carl Sagan’s The Demon-Haunted World) went into detail on how the post-World War 2 UFO craze caused a rapid shift in interpretations of existing events like sleep paralysis.

Arguments For

So, with all that strongly leaning away from Bigfoot being real, what did the documentary say that cast it back into doubt?

First, it pointed out two things that I had never seen mentioned in any other documentary, and which hadn’t occurred to me before:

  1. It’s rare to find animal remains in the wild, because animals tend to retreat to die and, when it’s not dead livestock in the middle of a wide-open pasture (with no cover and possibly a guard donkey or llama), scavengers are very efficient.
  2. Animals like Bigfoot and the Yeti are said to live in areas which are still rugged and sparsely-populated to this day.

As the anthropologist they interviewed said, there are at least 100 bears for every Bigfoot and he’s never found a bear that died a natural death, just lying there in the wild.

That ties in with an important detail I already did know: Some animals are very shy. Heck, even primates we know about can present supporting evidence. Western scientists didn’t acquire specimens of the mountain gorilla and bonobo until 1902 and the 1920s, respectively.

Beyond that, from observations of other shy species we know, it’s perfectly plausible to conclude that some species are so shy that they’re likely to go extinct from competition for habitat before we ever encounter them. It’s also plausible that Europeans, with a more industrialized and less in-tune-with-nature lifestyle, produce a larger “radius of deterrence” than aboriginal peoples do.

Likewise, both South America and Australia have humans and thick forests, yet the range of places where creatures like Bigfoot, the Yeti, the Almas, and so on appear seems to better match the ability of non-human species to spread than our much greater ability to do so. That’s a point against the “myths emerging independently due to a bias in human neurobiology” hypothesis.

There were a couple of other points made in the documentary regarding pre-Columbian evidence (some native carvings that seem too consistently ape-like and a claim that Leif Erikson encountered a “forest spirit” that doesn’t fit the description of a Native American), but I had to discard them since I couldn’t find any solid online corroboration and I don’t care enough to go poking through offline sources.

Third Option: Maybe Bigfoot once existed but not now

Everybody seems to be so caught up in taking sides that I’ve never seen anyone address this possibility. Maybe all the 20th century stuff is hoaxes and wishful thinking, but there really is something to the native myths.

Difficulty in testing it aside, it’s certainly a reasonable hypothesis. Tribal history doesn’t really care about precisely recording how long ago the most recent encounter was, and there is strong evidence of pre-industrialized humans driving animals to extinction even when we don’t outright hunt them. (Jared Diamond’s The Third Chimpanzee mentions that, for example.)

All in all, I think you can see why I adopted the more scientist-ish viewpoint of “What do I believe? I believe that the evidence is inconclusive.”

Posted in Geek Stuff

Fanfiction – Being Harry Potter

Kirinin really has a knack for writing fanfiction that has a special spark to it. I first recognized that when I was looking for Ranma ½ fanfiction and read The Pit. This time, it’s a Harry Potter – Draco Malfoy body-swap fic with a difference:

Being Harry Potter by Kirinin

The basic concept is that Draco Malfoy wakes up as Harry Potter in year 6, with no idea what happened.

I don’t want to say much about the plot, but I will say that it follows Draco and he discovers the “who” and “why” in chapter 2… but it turns out to be a big and very novel plot twist.

This is not your ordinary “they have to lay low and slowly gain an understanding of each other as they wait for a solution” fic. (And I do really love how the foreshadowed reason for the “why” ends up backfiring on the “who”.)

Instead, it’s a tantalizing blend of “lay low because I’m too ‘set adrift’ to think of a better plan”, stuff I don’t want to spoil, and scenes that could be in a more ordinary body-swap fic, but most authors wouldn’t think of them.

Now, as a more specific example which isn’t a spoiler, I do really like the interaction between Snape and “Harry” during a detention. Draco’s preconceptions about Harry do evoke some interesting character-exploring behaviour from both of them.

There’s also a delightfully clever “No, wait. Bad mental image.” moment in chapter 6, though I can’t quote it because it depends on details which would constitute a spoiler.

I let my note-taking slip somewhere around the middle of reading it, and I don’t have time to re-read it right now, so I can’t go into my usual level of detail beyond that point. Still, Draco’s character evolves very nicely throughout the story, and there are some very elegant little touches (such as the role a certain jacket ends up playing).

I also found the manner in which Voldemort met his end to have a particular poetic cleverness to it.

I will mention that there is a moment when a male-male relationship comes up and the character being into guys had no basis in canon. While I have no problem with that sort of thing in and of itself, I do feel like the way it was introduced detracts from the narrative as a whole. That said, it’s quite minor so, if you just treat it as one flawed scene and keep reading, it should quickly get carried away as you get back into the flow of the story.

Before I put this review on the back burner, somewhere around the middle of the story, I estimated that I’d probably give it a 4.7 out of 5 rating because I wasn’t feeling that sense of certainty I normally associate with a 5 out of 5. While I’d probably need to re-read it in one sitting to be sure, that does seem about right.

Posted in Fanfiction

Resources for Reverse-Engineering 16-bit Applications

See Also: My list of tips and resources for writing new DOS and Win16 apps

While offering some advice, I got a little caught up in the research I was doing, so here’s a list of the resources I found for picking apart 16-bit x86 applications.

Reference Materials


Unpackers

In the days of DOS and Windows 3.x, executable packers and code protectors like PKLite and PackWin were a common means of saving precious disk space, as well as deterring casual inspection of an executable by novices with tools like DEBUG.COM.

They still see use to this day, but the exponential decline in the cost of storage space, combined with the risk of false positives from virus scanners, has reduced the demand for this sort of software. (Aside from JavaScript minifiers, for which an effect similar to a decompiler can be achieved using a code beautifier.)

When it comes to reverse-engineering, packed/protected executables must be unpacked/unprotected before utilities which operate on the on-disk form of the program will return useful results.
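As a quick sketch of how you might spot one common case before reaching for an unpacker, the following checks a DOS MZ executable for the PKLite copyright string. (Treat the layout as an assumption: PKLite is commonly documented as placing a “PKLITE” string just past the standard 28-byte MZ header, but other packers, and deliberately obscured executables, won’t match this heuristic at all.)

```python
def looks_pklite_packed(path):
    """Heuristic: does this look like a PKLite-compressed DOS executable?

    Assumption: the common PKLite layout places a "PKLITE" copyright
    string shortly after the standard 28-byte (0x1C) MZ header.  Other
    packers, and modified executables, need different signature checks.
    """
    with open(path, "rb") as f:
        header = f.read(0x40)
    if header[:2] not in (b"MZ", b"ZM"):
        return False  # not a DOS MZ executable at all
    return b"PKLITE" in header[0x1C:]
```

If it matches, an unpacker can usually restore the original executable before you point a disassembler at it; if it doesn’t, the file may still be packed by something else.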

UNP for DOS
A now-unmaintained but open-source utility which will handle the vast majority of packed executables for DOS and Windows 3.x.
Universal Extractor for Windows
A tool which combines code and algorithms from a great many unpacking tools not covered by UNP into one convenient package… including various tools for unpacking installers without running them.
UPX for all major platforms
The most popular executable packer today and also capable of unpacking its own creations, assuming they haven’t been modified to obscure their nature.
Archive of exetools.com
The Wayback Machine’s October 2002 archive of this site contains a huge list of more esoteric unpacking tools.


Decompilers

Decompilers attempt to retrieve something higher-level than assembly language from a program.

This may be the normal state of things if the language doesn’t compile to machine code, or it may be a convenience accomplished by looking for patterns of machine instructions that are known to come from specific higher-level constructs like for loops and function calls.

(For languages which compile to machine code, you get something halfway between assembly language and the original source code, because the compiler threw out various higher-level details which would be needed to perfectly reconstruct the original source.)
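As a toy illustration of that pattern-matching idea, here’s a sketch that “lifts” one hypothetical mnemonic sequence back into C-like pseudocode. (Real decompilers work on actual machine code and track data flow; the instruction listing, register names, and pattern here are simplified inventions for illustration only.)

```python
import re

# Hypothetical, simplified mnemonic pattern for a counting loop.
LOOP_PATTERN = [
    r"xor (\w+), ?\1",     # zero a register -> counter = 0
    r"\w+:",               # loop label
    r"inc (\w+)",          # counter += 1
    r"cmp (\w+), ?(\d+)",  # compare counter against the limit
    r"jl \w+",             # jump back while counter < limit
]

def lift_counting_loop(listing):
    """Return C-like pseudocode if the listing matches the loop pattern."""
    if len(listing) != len(LOOP_PATTERN):
        return None
    matches = [re.fullmatch(p, line) for p, line in zip(LOOP_PATTERN, listing)]
    if not all(matches):
        return None
    counter = matches[0].group(1)
    limit = matches[3].group(2)
    return f"for ({counter} = 0; {counter} < {limit}; {counter}++) {{ ... }}"
```

Feeding it `["xor ax, ax", "top:", "inc ax", "cmp ax, 10", "jl top"]` yields `for (ax = 0; ax < 10; ax++) { ... }` — and any listing the pattern doesn’t recognize falls through, which is exactly why real decompiler output degrades to raw-looking code for constructs it has no pattern for.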

Reko for Windows
According to the old SourceForge page (which has screenshots), this open-source decompiler’s list of noteworthy supported formats includes MS-DOS, Win32 and AmigaOS.
The release notes on GitHub mention adding support for NE-format EXEs (Windows 3.x and OS/2), SEGA Genesis, SEGA Dreamcast, NeoGeo, partial support for Atari-TOS, and improving support for MacOS Classic (though I couldn’t find the initial mention of adding it).
DoDi’s Visual Basic 3/4 Decompiler for Windows
The site refers to this freeware tool as a Visual Basic 4 decompiler, but the tool is originally for decompiling Visual Basic 3 and, if the newer version is limited to Visual Basic 4, the site also provides links for earlier versions specifically intended for Visual Basic 3.
DisC – Decompiler for Turbo C 2.0/2.01 (source)
A decompiler specifically for programs built using Borland Turbo C 2.0 and 2.01 (which is offered as a free download by Embarcadero).


Disassemblers

Disassemblers are what you need if you want to inspect the code in its entirety when it’s not running.

They’re generally more reliable than decompilers, and preserve the correspondences to the machine-code form necessary to design patches.

Compared to debuggers, they tend to have more advanced searching features (to make up for the lack of ability to find things by running the code until you hit a breakpoint), but can’t deal with especially dynamic code where the behaviour is most easily understood by running it.

All Formats

semblance (source)
An open-source, non-interactive disassembler for MZ (DOS), NE (Win16), and PE (Win32) executables.
Doesn’t explicitly mention which platforms are supported, but it works on Linux and I suspect it’ll build on any POSIX-compliant platform (eg. MacOS, Cygwin, etc.).
TatraDAS (GUI for Win32, CLI for Linux)
Open-source disassembler with support for MZ, NE, PE, and .com executables. Unlike semblance, doesn’t seem to dump non-code segments.
CLI version is claimed to be OS independent, but I haven’t verified that claim.

DOS MZ Format

IDA Pro Freeware v5.0 for Windows
It’s hard to beat the IDA Pro disassembler, and the 5.0 release, which is free for non-commercial use, can disassemble DOS MZ and 32-bit Windows PE binaries.
The ScummVM developers have requested and received permission to host a copy to ensure it will always be available to people wanting to add new game engine re-implementations to ScummVM. If my link breaks, check the HOWTO-Reverse Engineering page on the ScummVM wiki.
In my testing, this ran just fine on non-Windows platforms under Wine. (See also PlayOnLinux and PlayOnMac as easy ways to manage Wine versions and application prefixes.)

Windows 3.x NE Format

URSoft W32Dasm (A.K.A. Win32Dasm) for Windows 9x
Don’t let the name fool you. This tool disassembles both 16-bit and 32-bit Windows EXEs and, from what I remember in my high school years, it was by far the most comfortable option around.
Sadly, it appears to have been the result of a one-man operation with a website run on his ISP’s hosting, and it vanished from the web some time between February 2002 and May 2003, with its most recent version listing a 1998 copyright date in the About box.
When I was poking around, I found it not at all difficult to find the most recent registered version, but I couldn’t find a citation to back up the claims made by some sites that the last release is freeware that never received a patch to remove the warning about not sharing it, rather than so-called abandonware. (It could go either way. There is precedent for official freeware re-releases with warnings left intact and the author didn’t bother to update the copyright date in the image in question from 1995 to 1998.) As such, I will not provide a link to it.
From what I remember, it can be finicky about which Wine versions it will work under. (It won’t crash… it’ll just fail to load its desired monospace font and display useless placeholder glyphs.)
Windows CodeBack [2] [3] for DOS 3.x+
The Wine developer wiki recommends this disassembler for Win16 executables.
If my links ever go dead, it won’t be difficult to find as wcb105a.zip in various archives of old shareware and freeware. Like IDA Pro Freeware, it’s free for non-commercial use, and the features withheld for paying users are ones you’re not likely to need.
However, my quick tests seem to indicate that it’s a purely non-interactive, command-line disassembler.
In my testing, this ran perfectly well in DOSBox 0.74.
An open-source, non-interactive disassembler for NE (Win16) executables with supplementary features for Major BBS modules.


Debuggers

Debuggers are what you need for inspecting code while it’s running. They take some getting used to, but their versatility is unmatched.

More advanced debuggers even support “reverse debugging”, allowing you to get the program to a point of interest, like the manifestation of a bug, then rewind its execution to explore how you wound up there.

DeGlucker for DOS
A protected-mode debugger which claims to be more powerful than Turbo Debugger and Soft ICE. Version 0.5 alpha, released in May of 2000, is closed-source, but freeware.
Also offered is version 0.4 from January 1999, released with TASM x86 assembly source code under informally stated “do what you want but leave the attribution” terms after the original author no longer had time to maintain it.
Insight for DOS
A GPL-licensed real-mode debugger that gets suggested frequently. I don’t know whether it has any advantages over DeGlucker aside from being open-source and being usable on older, more limited hardware.
Open Watcom C/C++ for DOS and Windows
Open Watcom includes a complete set of tools, including DOS and Windows debuggers (WD.EXE and WDW.EXE) and a resource editor capable of poking around inside EXE files built by other compilers (WRE.EXE).
While Open Watcom hasn’t made use of the support it inherited for Windows 3.1 builds of the installer, all installers will install the DOS and Windows tools (even the Linux installer includes them), so you can use the DOS installer under Windows 3.1 as long as you don’t mind manually setting up the Windows bits.
WineDbg for POSIX (Linux, MacOS, etc.)
Wine‘s built-in debugging support can be used in two different ways:
  • Using the built-in command-line interface
  • Using an open-source graphical frontend like gdbgui, DDD, or kdbg via the gdbserver protocol.
DOSBox Debugger
The DOSBox emulator contains a debugger which can be enabled in special builds.
Win32 builds are provided, as well as instructions for making debug-enabled builds for other platforms.


Radare2 is hard to classify because it does so much. Wikipedia summarizes it as “a complete framework for reverse-engineering and analyzing binaries”.
It supports DOS, Win32, Java, Gameboy, Gameboy Advance, Nintendo DS, Nintendo 3DS, Commodore VICE, WebAssembly, Android, and XBox binaries, among others.
According to the website, it can disassemble, assemble, debug, patch, be scripted in multiple languages, run on all major platforms, and more.
You’ll probably want to use the Cutter GUI with it.


If you’re looking to poke at Win9x-era stuff too (eg. Win32 PE binaries), some of the listed tools support them and here are some additional tools you could try which were omitted from the previous lists for supporting neither MZ nor NE:

PE (Win32/Win64)

  • x64dbg (Open-source debugger)
  • OllyDbg (Shareware debugger; registration is free but not needed for full functionality.)
  • Snowman [2] (Open-source decompiler, can run as an x64dbg plugin)
  • Boomerang (Open-source decompiler)
  • IDA Pro Freeware 7.0 (Non-commercial-only like v5.0. Supports 64-bit PE but drops MZ. Also adds Linux and MacOS versions.)
  • Ghidra (Open-sourced by the NSA. I’m unclear what formats other than PE are supported, but I’ve seen it described as being like IDA Pro but more free or like Radare but more mature.)
  • Borg Disassembler (claimed to be freeware, but source is offered too)
  • Frida (Dynamic instrumentation framework, describes itself as “Greasemonkey for native apps”)

Macromedia Director and Flash

  • Swifty Xena Pro for Windows (Extract SWFv4-6 and Macromedia Director v6-7 resource bundles from single-EXE projectors)
  • dirOpener (extract resources from Macromedia Director resource bundles (v8.5 and earlier). The original site is only on the Wayback Machine and they pulled the download links, but you can still use it as reference for the filenames to google. May need the SWA Decompression Xtra from the full Shockwave Player installer for some content.)
  • JPEXS [2] (a complete open-source reverse-engineering suite for Flash files)
Posted in Geek Stuff

Fanfiction – In Love of Quidditch

Now for a bit of compare and contrast with a story that has some interesting characteristics.

In Love of Quidditch by Secondary Luminescence

This story has a premise very similar to that of one of the best-written Harry Potter stories I’ve ever read… The Pureblood Pretense. In fact, judging by various plot elements, it’s actually a fanfic that does to The Pureblood Pretense what The Pureblood Pretense does to Tamora Pierce’s Song of the Lioness Quartet.

I don’t really want to call it a fix-fic, because I feel that doing so would be disrespectful to the amount of effort that went into changing things up. It’d be better to call it a different take on the same concept. An AU of an AU, so to speak. It’s not as original, even when you only consider the elements they don’t have in common, but it’s clear that Secondary Luminescence really tried their best. (The sequel has problems, but I’ll be covering it in a separate review.)

Now, given that Pureblood Pretense is richer and more engaging than some of the print novels I’ve read, while In Love of Quidditch is a good but otherwise ordinary fanfic, I’ll try not to compare them too much on how they achieved what they aimed for. Instead, I’ll focus mostly on what they intended to do and accomplishments that were attainable for both stories.

What makes In Love of Quidditch an interesting story to analyze is the way Secondary Luminescence used finesse to solve the large-scale weaknesses in The Pureblood Pretense’s plot, which murkybluematter just powered through on raw skill. (The latter being made even more impressive by the explanation I got from murkybluematter that those flaws exist because they weren’t taking it as seriously in the beginning when said flaws were established.)

Like The Pureblood Pretense, this story applies the broad strokes of Tamora Pierce’s Song of the Lioness Quartet to the Harry Potter setting: In an alternate wizarding world where a discriminatory law forbids her from pursuing her dream, a female Harry Potter trades places with a male relative in order to go to Hogwarts, disguised as a boy: In Pretense, studying potions under Professor Snape despite Hogwarts being for purebloods only. In this, playing quidditch despite women being forbidden from flying brooms.

In Pretense, Harry has a complete and loving family, which makes the deception more difficult to pull off while, in Quidditch, only James Potter and Remus Lupin are left, James cut off contact with Remus when the kids were 5, and Lily’s death has left him so distant and buried in his work as to be somewhat neglectful… making the deception much easier. (Though that does increase the risk of going in the other direction and making the story start to feel like hardship porn.)

Given that the most obvious form of discrimination was already taken by Pureblood Pretense, sports is a surprisingly well-considered alternative. There is a great deal of (under-taught) history surrounding sexist double-standards and outright prohibitions against women in the world of sports… not to mention that, for all it has appeared in artwork from time to time, riding a broom side-saddle as used to be expected of women on horses isn’t very practical.

That said, there is one major flaw which becomes obvious in hindsight later in the story. In Pureblood Pretense, Voldemort is still alive and reworked into the leader of the political party responsible for limiting Hogwarts to purebloods. It’s a clean and elegant way to justify a lot of other changes, and to shift many of the conflicts in the direction of political manoeuvring. In Love of Quidditch, on the other hand, makes a brief mention in chapter 1 that he hasn’t been sighted in six years and then reveals Quirrellmort, fairly unchanged and vulnerable to a similar “protection in Harry’s skin” despite Lily’s death being in childbirth, without any of the foreshadowing to justify that specific approach to things. This is sloppy compared to how well put together the rest of the story is and probably should have been my first hint at the flaws which crop up in the sequel.

Like both Pretense and Lioness, Quidditch focuses on the experiential aspects of the gender-bending only to the bare minimum necessary to hand-wave or justify bits of the plot. For example, mentioning her getting used to the feel of wearing boxers to set up how a prepubescent girl manages to get away with changing among the boys in the locker room.

(People who write gender-bending stories have a tendency to make the gender-bending itself the focus of the story and, while I do enjoy that too, it would be detrimental to the story in cases like this. Also, more generally, there’s a tendency for authors to lose perspective and have the story err too much on the side of gratifying what I refer to as their “non-sexual fiction kink”… a base interest that, while non-erotic, evokes similar emotional fixations. In other words, a strong antonym for “pet peeve”.)

As I mentioned in my review of it, the biggest weakness of The Pureblood Pretense is that “Rigel Black” is the type of literary hero who becomes a Mary Sue if written poorly. The success of the plot depends on her being the kind of prodigy that, while they exist (I was reading and understanding computer magazines at age 6), tend to feel uncomfortably convenient unless the author is so good that you’re having too much fun to question things. (Thankfully, murkybluematter is that good.)

In Love of Quidditch does this differently, finding alternative plot points that are easier to justify. Here, the closest thing to being a hard-to-write prodigy that this Harry has is a love of flying stronger than in canon, a twin brother who was left terrified of it by a childhood accident, and a father who is so distant that, in the second book, it takes him a week to notice that she radically changed her hairstyle. (Even though it matters so much to him that, once he does notice, he grounds her until she agrees to use a hair regrowth potion… which she never does.)

Like canon Harry Potter, she’d do well in any house, but, unlike in The Pureblood Pretense, she not only refuses Slytherin because it would draw her father’s attention, she insists that the Sorting Hat use her memories of her brother to put her where it would have put him. Disgruntled, it shelves its second choice of Ravenclaw and puts her in Gryffindor. Also, like canon Harry Potter, she does accomplish things that hint at being magically powerful. (In canon, casting a patronus capable of driving off over a hundred dementors in his third year. In this story, casting a reparo powerful enough to fix the bathroom the troll trashed in her first year.) Both of these help to avoid the risk of her accomplishments feeling contrived when combined with her stated reasoning for studying like mad: To keep James Potter from having reason to come to Hogwarts and discover the deception.

Furthermore, unlike “Rigel Black”, she doesn’t keep her secret from everyone for years on end. Within the first few chapters, the Weasley Twins have figured it out (no doubt, thanks to the Marauder’s Map) and agreed to keep her secret in exchange for help in pulling off some of their pranks and getting material from the library. (After all, they did get banned from it for destroying some of the old newspapers that had her picture in them.)

By chapter 6, Madam Pomfrey has also found out, but her oaths prevent her from telling anyone. During the encounter with Quirrellmort, the deception is also revealed to Cedric Diggory, who is with Harry in this version of events; he promises to keep her secret after it’s all over and done, and becomes more of a main character going forward.

This is a perfect example of the difference in approach. The Pureblood Pretense does what could be contrived, but you get so into the story that you don’t notice. This has the hero sometimes getting outplayed, but in ways where it makes perfect sense for it to further the plot, rather than ending it.

Now, there is a spot where the fic really convinced me of its intention to be an homage. In chapter 7, Harry’s efforts to be too perfect a student for James Potter to have reason to come to Hogwarts have caught Professor McGonagall’s attention. As a result, she gives Harry a book titled “Transfiguration Lessons for the Newfound Prodigy” and “his” thoughts on the matter are “She knew she was good, but she wouldn’t have ever called herself a prodigy.” That and the more subtle detail that, like Rigel, Harry’s effort to avoid drawing attention has backfired in some sense, are both elegant nods to murkybluematter’s work.

In a way, you could say that it’s like a fix-fic, but on a much more abstract level. It takes the concept of The Pureblood Pretense, and then does a good job of finding much less demanding ways to go about all the bits that make “Rigel Black” walk the line between a rough literary hero and an exceedingly polished Mary Sue.

Pretense definitely has a richer and more engaging cast of characters, but Quidditch isn’t exactly slacking and spends a fair bit of time exploring minor characters too. I especially enjoy how the Weasley twins wind up taking about as big a role in this as Ron did in canon and how Cedric Diggory starts to take on a significant role near the end. (The choice of the twins is quite the elegant one when you think about it. Not only are they Harry’s teammates, but it makes a surprising amount of sense for Ron to be rebuffed from befriending Harry by the amount of interest the Twins show in “him”.)

Some things are still a bit of a stretch, such as Harry learning sufficient legilimency so quickly after finding someone who can teach her, but it’s not immersion-breaking… especially in the context of Harry being a candidate for Ravenclaw and a voracious student. (And, also, it’s a side effect of borrowing a plot point from Pureblood Pretense which, in my opinion, cut off or postponed more productive elements of the narrative.)

On that note, now for the problems.

First, the two most minor: I do think Hagrid’s accent is written a bit too thickly. I have to slow down and concentrate to make sense of what he’s saying on occasion, and you don’t really want to pull the reader out of their immersion in the story like that. Also, the story uses “sorcerer’s stone” rather than “philosopher’s stone”, which is a pet peeve of mine. Regardless of what American publishers say, the philosopher’s stone was the name of the real-life goal of real-life alchemy that Rowling was referencing.

Next, something that does work, but, as I mentioned, feels like it’s stifling stuff that the author had the potential to develop: While the mechanism and motive are different, it borrows the “sickness that traps people in their minds” idea from Pureblood Pretense (which it does make sufficiently fresh and interesting). The problem is how the side effect of doing so squashes the developing teacher-favoured student relationship between Harry and Professor McGonagall. Yes, it does make it easier to avoid Harry’s skill level feeling contrived, but I think what was lost was worth more than what was gained… especially when it could have instead been handled by reducing the progress Harry was making without the help a bit, so the results of the added help would be easier to justify to the readers.

This is the root of the biggest problem, as well as the core issue that causes so much trouble in the sequel: Secondary Luminescence seems to have trouble making the broad strokes of the plot truly original, instead borrowing bits from canon or from The Pureblood Pretense and reworking the details enough to keep them interesting.

In this book, the biggest problem manifests in the climax. For all the mystery built around the crisis of the year, the climax winds up being Quirrellmort, when there’s practically no mention of Voldemort or Quirrell in the story up to that point. That could still work… but not when you just assume details like Harry’s blood protection, which depended on canon events that didn’t happen in this version of the story.

All in all, while it does stick closer to the source material than would be best for it, it mostly manages to form its own identity, leaving a sense that it’s adapting ideas from The Pureblood Pretense in the same way that The Pureblood Pretense adapts ideas from the Song of the Lioness Quartet.

It may not have murkybluematter’s “5 out of 5 is an understatement” writing skill, but it’s still got an uncommon amount of novelty worked into the events and character relationships. Even with the “comes out of left field” aspect of some of the details from the climax, Secondary Luminescence has done a good job of switching up the “flavour text” enough to make familiar events interesting again.

(Including one change which is noteworthy enough that I want to mention it. When you’re writing an Alternate Universe fic like this one, you’re allowed some leeway to bend the “single divergence point” rule of good fanfiction as long as the changes feel sufficiently unimportant. In this case, it’s used to have Harry and friends assume that Fluffy is male but then have Hagrid correct them on that. It’s a nice little way to poke at the human tendency to make assumptions about gender based on preconceptions and unrelated characteristics as long as you don’t take such opportunities often enough for it to become gratuitous.)

I’d give it a 4.3 out of 5… and it would have been a 4.5 out of 5 with a fix to the climax. (And it definitely helps that, if you explain it to someone else, there’s no need for any “Trust me. The author makes it work.” justifications.)

Finally, to address the elephant in the room: yes, the title does annoy the crap out of me. My inner pedant can’t help but scream that “In Love of Quidditch” is wrong and it should be “For Love of Quidditch”, dammit!

Posted in Fanfiction | Leave a comment

How to Copy-Paste YouTube Comments With Formatting in Firefox

TL;DR: Copy more. Extending your selection outside the comment body will prevent the bug from triggering.

For the last little while, I’d been suffering from an annoying bug where, if I tried to copy and paste YouTube comments, I’d lose all the line breaks and have one big ugly wall of text.

I finally decided to go to Bugzilla and report it, only to discover that someone else had reported it two years ago and there appeared to be no progress, so I decided to see if I could figure out what was going on… and not only did I narrow it down to something actionable, I realized why I had only started to suffer from it much more recently.

YouTube serves up its comments in a custom HTML element named <yt-formatted-string> and, in the DOM inspector, it looked like the element contained exactly the problem text I was getting when I tried to copy and paste.

At first, this had me worried, as visions of custom rendering and esoteric bugs danced through my head but, as I continued to poke around, I noticed two things:

First, it actually did have the line breaks… but as raw, plaintext newlines (\n, when viewed with a setting that shows all characters). That prompted a suspicion, which revealed the second thing…

They’re using white-space: pre-wrap to ask the browser to render that “plaintext” as it’s meant to be.

It was then I had a bit of a “No. It couldn’t be that simple.” moment.

Sure enough, when I popped over to jsBin and added these active ingredients, I was able to reproduce the bug:

  div { white-space: pre-wrap; }

  <div>Line one
Line two
Line three</div>
Firefox has a bug with copying and pasting any text that’s set to white-space: pre-wrap;!

…but then why didn’t I trigger it before? I actually discovered that completely by accident, when I got a bit sloppy and impatient. If you begin your text selection outside the pre-wrapped element, then it copies properly!

For YouTube, I used to copy the entire comment, including the header, in one go, and then edit out all the cruft that got picked up in between the username and the content, which protected me from the bug.

In hindsight, that does make sense. This isn’t the only circumstance I’ve run into where Firefox may not give the clipboard what you expect, depending on where you begin and end your selection. (I’ve also seen it happen when copy-pasting fragments of <p> tags from a normal website into the contenteditable element in a form that serves as my fanfiction quotes bin, with both sides rendered by Firefox.)

EDIT: And I’ve now tracked down and reported that other bug too.

Posted in Web Wandering & Opinion | 2 Comments

Working around serde_json bug #464 (serializing pre-epoch mtimes)

You may not know this, but Rust’s serde serialization/deserialization library can panic if you happen to feed it a file modification time prior to the UNIX epoch.

I discovered this when it killed an application I was writing and, in a lapse of reason, reported it on serde_json rather than the Serde core, where the problem code resides. I then went on to file a bug on Rust’s standard library about the API in question being too easy to misuse like this, with the docs not drawing sufficient attention to that hazard.

However, while I’m waiting for them to resolve the problem, I still need to actually get work done, so I cooked up a custom timestamp type that I can #[derive(Serialize, Deserialize)] on instead. That felt simpler, cleaner, and more concise than overriding the default serialization for something like Rust’s SystemTime type, which serializes to a non-primitive JSON type.

Here’s the code I came up with, in case anyone else needs it:

Posted in Geek Stuff | Leave a comment

Why you should ALWAYS practice defensive programming

Take a look at this Rust snippet for a moment and tell me whether you can find the reason it’s not safe to run.

After all, Rust is a language that makes comfortable safety its claim to fame, so it should be pretty obvious, right?

if direntry.path_is_symlink() {
    return Entry::Symlink {
        path: direntry.path().to_owned(),
        target: read_link(direntry.path()).unwrap(),
    };
}
Surprisingly enough, there are actually two ways this can fail, and both can be insulated against by this change… assuming it’s acceptable for the Symlink variant to store an Option<PathBuf> rather than a PathBuf:

if direntry.path_is_symlink() {
    let target = read_link(direntry.path()).ok();
    if target.is_none() {
        println!("path_is_symlink() but read_link() failed: {}",
                 direntry.path().display());
    }

    return Entry::Symlink {
        path: direntry.path().to_owned(),
        target,
    };
}
(In this case, the error resolution strategy is “surface the error in an e-mail from cron via println! so I can fix it when I wake up and send a ‘best effort’ entry to the index that this code populates”.)

The hint lies within that .unwrap() or, more correctly, in the nature of what’s being unwrapped.

Problem 1: Rust’s ownership model can only protect you from misusing things it can control

The first code snippet is susceptible to a race condition because there’s nothing Rust can do to stop another program on the system from deleting the symlink in between the direntry.path_is_symlink() and the read_link(direntry.path()) calls.

No matter how much validation you do, paths are inherently weak references to resources you don’t have an exclusive lock on. Either handle potential errors each time you dereference them (i.e. every time you feed them to a system call) or, if possible in your situation, open a file handle, which the kernel can impose stronger guarantees on.
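As a sketch of the “handle errors at each dereference” style (the function name is mine, purely illustrative): rather than checking for existence and then reading, attempt the read and interpret the error, so there is no window for the filesystem to change between a check and a use.

```rust
use std::fs;
use std::io;
use std::path::Path;

// Check-then-use is racy:
//     if path.exists() { fs::read(path)? }   // file may vanish in between
// Instead, do the operation and classify the failure afterwards.
fn read_if_present(path: &Path) -> io::Result<Option<Vec<u8>>> {
    match fs::read(path) {
        Ok(bytes) => Ok(Some(bytes)),
        // "Not there" is an expected outcome, not an error.
        Err(e) if e.kind() == io::ErrorKind::NotFound => Ok(None),
        // Anything else (permissions, I/O errors) still surfaces.
        Err(e) => Err(e),
    }
}
```

The same pattern generalizes: every syscall that takes a path gets its own error handling, because each one is a fresh dereference of a weak reference.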

In my case, opening a handle to share between the two calls was not possible, because the initial query is done by the ignore crate, but I then have to call read_link myself. (I’ll probably file a feature request for this, since it seems like an odd oversight in ignore’s metadata support.)

(This is what makes proper use of the POSIX access(2) function such a niche thing. File permissions can change between when you check and when you try to take advantage of them, so it’s only really proper to use it as a way to bail out early when, otherwise, you’re at risk of performing some heavy or time-consuming task, only to fail for lack of permissions when it comes time to make use of the results.)

I’ve actually written a more thorough blog post on this approach to looking at types of data.

Problem 2: Programmers are fallible

The direntry object comes from a WalkDir iterator from the ignore crate and, for some reason, in my project, the root of the tree always returns true for direntry.path_is_symlink() even though it’s not a symlink.

Likewise, my very first test with an EXE parser named goblin panicked, because I threw an EXE at it that the author hadn’t anticipated being possible. (I’ve reported the bug and it has since been fixed.)

No matter how good a programming language is, you and the people who write your dependencies are still using the good old Human Brain, which has been stuck at version 1.0, bugs and all, for at least 25,000 years.

As of this writing, I haven’t yet determined whether the symlink bug is in my code or ignore, but it does present a perfect example of why I wrap each unit of processing in except Exception as err: (Python) or std::panic::catch_unwind(|| { ... }) (Rust) in my projects, even if I feel confident that I’ve handled or prevented each failure case with more specific code.
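That wrapping can be sketched like this (process and run_batch are stand-in names for whatever your unit of work is): each item runs inside std::panic::catch_unwind, so one unanticipated panic marks that item as failed instead of aborting the whole batch.

```rust
use std::panic;

// Stand-in for one transactional unit of work; panics on unexpected input,
// the way a buggy parser or an unhandled edge case would.
fn process(n: u32) -> u32 {
    if n == 3 {
        panic!("unexpected input: {}", n);
    }
    n * 2
}

// Wrap each unit in the most general handler available. A None result
// marks an item that panicked; everything else still completes.
fn run_batch(items: &[u32]) -> Vec<Option<u32>> {
    items
        .iter()
        .map(|&n| panic::catch_unwind(|| process(n)).ok())
        .collect()
}
```

(In real code you’d log the panic payload rather than discard it with .ok(), but the shape is the same: the blast radius of a surprise is one item, not the run.)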

In short, you want a thorough belt-and-suspenders approach to safety:

  1. Expected Errors: Study up on the ways that calls can fail, so you can avoid unnecessarily throwing away progress or giving vague diagnostic information with a coarse-grained recovery strategy. (I actually tripped over just this problem with serde_json in the same project, where it was difficult to diagnose and report a bug because the panic message didn’t contain enough information.) Despite that, this is still one of Rust’s biggest strengths. For example, I never realized that getcwd could fail in over a decade of using it in Python.
  2. Unexpected Errors: Identify your transactional unit of work (eg. a step in a processing pipeline that writes its intermediate products to disk, a single thumbnail in a thumbnailer, a single file in a downloader, etc.) and wrap it up in the most general error handler you can write.

As your knowledge grows, the scope of expected errors will grow (filesystem race conditions being a good example of something people don’t usually learn early on), but you’ll never anticipate everything. (Nor should you. There’s a curve of diminishing returns at play, and you need to balance robustness against programmer time.)

Suppose you want to leave your computer to thumbnail thousands of images overnight. You don’t want to spend weeks perfecting code for a single night of use, but would you rather wake up to find thousands of thumbnails and a handful of error messages in the log, or a few thumbnails and a panic message dating to not long after you fell asleep?

UPDATE: It turns out that the bug is in ignore and I’ve reported it.

Posted in Geek Stuff | Leave a comment