Premature Optimization (LLO Archive)
Created 2025-11-30, last modified 2025-11-30
Part of my archive of Layover Linux Official posts on Tumblr.
2025-09-05
Doing so much work in C these days (at least in the hobby part of my life, not professionally) has put interesting and useful pressure on my perspective about premature optimization.
When I was a youngster, there was a lot of stuff I took as received wisdom that I didn't yet have the experience to question. "Premature optimization is the root of all evil? I'll take your word for it!" I was writing slow code in slow languages, but it was usually fast enough for my purposes.
As I got older, I started to encounter more and more situations where I needed a particular piece of code to be fast, but an architectural mistake stood in the way of improvement. That's where I started to get disillusioned with the idea of delaying any consideration of performance. I also got really good at asking "in a perfect world, what's the solution that requires the least amount of machine effort?" immediately followed by "well, what's in the way of doing that?" This was the era of napkin math, of specialized DB tables and indexes, and of getting practice looking at problems outside the box of "object in code corresponds to object in real life."
This is basically where I've lived for a while. Still using slow-ish languages, but armed with the knowledge that you can get more horsepower out of architectural decisions and concepts than from language choice... and that you should think about performance in your big picture before you've painted yourself into a corner. And it's felt like growth; I'm still convinced it is.
But now that I'm using C, I'm finally understanding where the advice about premature optimization was probably coming from, particularly since Knuth wrote it back in 1974, when serious systems work meant low-level languages and hand-tuned assembly. I'm used to languages where you quickly run out of meaningful optimizations, as long as you get the design right. But in C, you build the world tailored to your needs, and with that freedom comes more potential (and perceived responsibility) to optimize. The skill ceiling is so high it's overwhelming. And design still matters, arguably even more so, because C is very detail-oriented, and a change to the design means many tiny, intricate changes across many files.
A couple of times now, working on Prone, I've had to back out a micro-optimization (like single-alloc strings) because it made the code too hard to work with for what it is: a prototype that will only ever graduate to production use "someday." I've actually found it useful to consciously say "screw optimal, how do I do this as simply and uniformly as possible?" It's a new and foreign feeling to me.
What I've found useful is a willingness to build new alternatives next to old systems, and eventually cut over and delete the original. I've found that trying to update an existing structure or approach is usually maladaptive - I want a new type, with a new name, that I migrate to intentionally, and clean up the names afterwards. And I've found that the only thing I should optimize for right now is to write a clean prototype, because the problem I'm solving isn't yet strictly defined enough to fully geek out about the fastest way to do it. I still have time to make design decisions later that facilitate speed. But I'm exploring too much new territory to care about anything other than MY velocity right now, and that's okay.

