This site was switched over to markdown-based authoring almost two years ago, but the back end was always a bit sluggish. Naturally I cache the converted data, so this wasn't a big issue until now.
Yesterday I reworked everything with a new CSS base (purecss.io, quite nice) and a few silly glyphicons from fontawesome.io, just because I can. Now the site should work almost properly on mobile kit, and it's still all pure CSS and no Eczemascript whatsoever.
While experimenting and hacking that stuff up I noticed that some pages took a really long time to prime. As it turns out, good old standard Text::Markdown is horribly slow: a number of my source articles took 5+ seconds to convert, each, and these are mostly very simple files. Can't have that.
So, today I completely reworked the back end with redis as an optional cache across processes /and/ a better markdown renderer.
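The caching idea is simple enough to sketch. This is not the site's actual code (and it's Python rather than perl, purely for illustration): key the rendered HTML by a digest of the source, and any store with get/set works - an in-process dict, or redis.Redis when you want it shared across processes. The names `render_cached` and `DictCache` are made up for this sketch.

```python
import hashlib

def render_cached(source, cache, render):
    """Return rendered HTML for `source`, consulting `cache` first.

    `cache` only needs get/set (a dict-like wrapper here, or
    redis.Redis across processes). Keying on a digest of the source
    means stale entries simply stop being hit when an article changes.
    """
    key = "md:" + hashlib.sha256(source.encode("utf-8")).hexdigest()
    html = cache.get(key)
    if html is None:
        html = render(source)       # the slow part: only on a miss
        cache.set(key, html)
    return html

class DictCache:
    """Stand-in exposing the get/set subset used above."""
    def __init__(self):
        self.store = {}
    def get(self, key):
        return self.store.get(key)
    def set(self, key, value):
        self.store[key] = value
```

With redis the key survives across worker processes and restarts, so each article is converted once, not once per process.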
Markdown is not exactly standardized, and discount is the only practical alternative (for me), but that's primarily a C library and a command line tool. There's a perl wrapper for the library, Text::Markdown::Discount, but that thing is utter garbage (no access to the options, internal gotchas in the code, etc.).
And discount is weird; it's got all those 'useful extensions' (*snort*) to the markdown syntax, most of which suck and many of which are on by default. Yay!
So, in the end I resorted to fork+exec'ing a discount process for every conversion, and even that takes only about 4 milliseconds on average... not 5+ seconds as before.
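The fork+exec approach boils down to: feed the source to the renderer on stdin, read the HTML from stdout, let the child die. A minimal sketch, again in Python for illustration; `markdown` is the name discount's CLI usually installs under, but treat the command (and any flags you'd add to tame those extensions) as something to adjust for your own install:

```python
import subprocess

def render_markdown(source, cmd=("markdown",)):
    """Convert markdown by exec'ing an external renderer per call.

    One short-lived child per conversion: source goes in on stdin,
    HTML comes back on stdout. `cmd` is a stand-in - point it at
    discount's `markdown` binary, plus whatever flags you need.
    """
    proc = subprocess.run(
        list(cmd),
        input=source.encode("utf-8"),
        stdout=subprocess.PIPE,
        check=True,              # raise if the renderer fails
    )
    return proc.stdout.decode("utf-8")
```

A few milliseconds of process startup per page is nothing once the result is cached anyway.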
Anyway, long story short, now it works properly. Still, I have to say it: ASS. A_NonStandardStandards_S, too - but then most of the Standard Standards are not much better.
(And should you be unfamiliar with the phrase "down, not across" - that's the ASR motto, referring to the effective way to slit your wrists.)