Along with all the fun, creative stuff
it enables you to do,
programming sometimes requires you to carry out
boring and repetitive editing operations.
If those operations are uniformly applicable,
it's straightforward to automate them
using regular expressions
and a tool like sed,
or :%s/foo/bar/g in vim-speak.
But sometimes a regex can't express
the pattern you want to match against,
and on those occasions
vim macros can come to the rescue.

Functional programming idioms give you ways
to control concurrency
without recourse to third-party libraries.
This post contrasts two such patterns
that enable you to process data
either concurrently or serially,
with back-off and retry logic
in the event of errors occurring.
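
As a rough sketch of what such patterns can look like (this is not the post's actual code, and the function names here are invented), serial and concurrent processing with exponential back-off and retry can be built from plain promises:

```javascript
// Minimal sketch: retry an async operation with exponential back-off,
// then use it to process items either serially or concurrently.
const delay = ms => new Promise(resolve => setTimeout(resolve, ms));

async function withRetry (operation, retries = 3, backoff = 100) {
  try {
    return await operation();
  } catch (error) {
    if (retries === 0) {
      throw error;
    }
    await delay(backoff);
    // Double the back-off interval before the next attempt
    return withRetry(operation, retries - 1, backoff * 2);
  }
}

async function processSerially (items, action) {
  const results = [];
  for (const item of items) {
    // Each item waits for the previous one to finish
    results.push(await withRetry(() => action(item)));
  }
  return results;
}

// Concurrent variant: every operation is started up front
const processConcurrently = (items, action) =>
  Promise.all(items.map(item => withRetry(() => action(item))));
```

The serial version awaits each item before starting the next; the concurrent version kicks everything off at once and gathers the results with `Promise.all`.
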
For Christmas I bought myself a Garmin Forerunner 645,
after becoming irritated to the point of distraction
by my Polar Flow M430.
Now that I've used the Garmin
for a few weeks,
this is my compare-and-contrast review
of the two devices.

Recently I sat my WSET level 2
and did pretty well.
I scored 90% in the exam,
which is a pass with distinction.
Based on that experience,
these are my tips
for anyone else planning
to take the course.

About a month ago,
we had an outage
caused by a slow-running query in MySQL.
This particular slow query
wasn't spotted when it was deployed
because it depended on data
inserted by client browsers
and the related preference in Firefox
was not enabled at that point.
A few weeks after it shipped,
the client pref was flipped on
and as the table grew,
it slowed MySQL down more and more
until the whole of Firefox Accounts went down.
And of course,
because Sod's Law
is one of the fundamental forces of nature,
this happened late on a Friday night.
There were some complicating factors
that slowed down diagnosis,
but it's also fair to say
we could have caught it at source
with an EXPLAIN of the offending query
during code review.
Because of that,
I decided to try and automate
EXPLAIN checks for our MySQL queries.
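
The automation itself isn't described here, but as an illustrative sketch (the function name and thresholds below are invented), a check over parsed EXPLAIN output might flag suspect query plans like this:

```javascript
// Hypothetical sketch, not the real implementation. Assumes each row
// of MySQL's tabular EXPLAIN output has been parsed into an object
// with `type`, `key` and `rows` properties.
function findBadQueryPlans (explainRows, maxRows = 10000) {
  return explainRows.filter(row =>
    // `type: 'ALL'` indicates a full table scan
    row.type === 'ALL' ||
    // No index was chosen for this table
    row.key === null ||
    // The optimiser expects to examine too many rows
    row.rows > maxRows
  );
}
```

In CI, you could run EXPLAIN for each query against a schema-only database and fail the build whenever a check like this returns anything.
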
Refactoring boilerplate code is always easy
in dynamically typed languages,
but sometimes takes a bit more effort
when constrained by strong typing.
This is something I was puzzling over recently,
when the penny dropped for me
about how Rust's macros
can be used to bridge the gap.

Let's say you have a huge amount of JSON data
and you want to parse values from it in Node.js.
Perhaps it's stored in a file on disk
or, more trickily,
it's on a remote machine
and you don't want to download the entire thing
just to get some data from it.
And even if it is on the local file system,
the thing is so huge that reading it into memory
and calling JSON.parse
will crash the process
with an out-of-memory exception.
Today I implemented a new method
for my async JSON-parsing lib, BFJ,
which was designed with exactly this type of scenario in mind.
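
This is not BFJ's actual API (its real methods wrap a proper incremental parser), but the principle can be sketched in a few lines: consume the stream chunk by chunk, tracking nesting depth and string state, and emit each top-level array element as soon as it's complete, so only one item is ever buffered in memory:

```javascript
// Minimal sketch of the streaming idea (not BFJ itself): extract the
// elements of a top-level JSON array from a stream of chunks without
// ever holding the whole document in memory.
function createArrayItemExtractor (onItem) {
  let depth = 0, inString = false, escaped = false, buffer = '';

  return chunk => {
    for (const char of chunk) {
      if (inString) {
        // Inside a string, commas and brackets have no significance
        if (escaped) {
          escaped = false;
        } else if (char === '\\') {
          escaped = true;
        } else if (char === '"') {
          inString = false;
        }
        buffer += char;
        continue;
      }
      if (char === '"') {
        inString = true;
        buffer += char;
      } else if (char === '[' || char === '{') {
        depth += 1;
        // Don't buffer the outer array's opening bracket
        if (depth > 1) buffer += char;
      } else if (char === ']' || char === '}') {
        depth -= 1;
        if (depth >= 1) {
          buffer += char;
        } else if (buffer.trim()) {
          // End of the outer array: emit the final item
          onItem(JSON.parse(buffer));
          buffer = '';
        }
      } else if (char === ',' && depth === 1) {
        // Top-level comma: one complete item is buffered
        onItem(JSON.parse(buffer));
        buffer = '';
      } else {
        buffer += char;
      }
    }
  };
}
```

Because state persists between calls, chunks can split an item anywhere, exactly as a file or network stream would deliver them.
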
"Bad performance hurts user engagement"
is a sentiment that feels intuitively true
but can be hard to sell in product conversations.
The best way to persuade a non-believer
is to point them at hard evidence
and for that you need
to do a few things
with your performance data.