Much Ado About Consting (LLO Archive)
Created 2025-11-30, last modified 2025-11-30
Part of my archive of Layover Linux Official posts on Tumblr.
2025-06-21
I used to be pretty scared to do anything significant in C. It was useful to be able to read snippets of other people's code and follow what it meant, but C has a reputation in many circles: that the absolute power it grants you is a hazard to your health.
Now that I'm writing significant amounts of C, and for a project with a non-trivial scope (writing my own programming language with genuinely novel features), I strongly disagree with this narrative about C, and where that narrative leads language designers and lay programmers. C is hazardous, but none of the hazard comes from the power.
If you're not a C programmer, I invite you to watch a little something, if you have 15 minutes. It's Dan Saks arguing for a specific style in C++ variable declarations. That probably sounds boring, but the point of the style is to avoid hazards. And it turns out, one of the most fundamental parts of C and C++,
int foo = 4;
// [~~~~~]
// ^ this part
is easy to get wrong.
Okay, sure, maybe not in that example. You're all very smart, gold star stickers for everybody. But how hard do you have to think about this? What does this mean?
const char *const *x[2048];
I mean, that's scary, right? Variable declaration is something you'll end up having to do a billion times over; it's like breathing. The idea that such a core piece of the language might betray you, well, it's giving implicit casts in JavaScript levels of wat. How are we supposed to get anything done when there are so many footguns in unavoidable, basic parts of the language? A lot of C programmers avoid const entirely because, even though immutability could make their programs safer, C has made immutability difficult to use correctly.
It gets worse, actually. You probably thought that int example was pretty easy. How many bits is int again? How about char? Did you know that both are platform-dependent, and that without using stdint.h types you'll get different widths on different architectures? Hopefully you'll never have to deal with a platform where chars are more than 8 bits, but the C standard accommodates them.
I think it's reasonable to see this stuff as a C outsider - heck, even an insider! - and find yourself sweating nervously through your clothes. It only gets worse if you remember that C is the foundation that almost everything else on your computer is built from. Unless you're doing some exciting and unusual things, the machine you're reading this post on has a kernel written in C.
Cryogenics as a Competitive Sport
Before we go any further down the "quarantine C" radicalization pipeline, let's take a second to talk about immutability in other languages. Generally speaking, from a programmer perspective, when you want immutability at all, you want deep immutability. This is probably closest to most people's intuition, and provides the simplest and most powerful protection. You would expect, in Python for example, this:
x = [1,2,3]
y = { "x": x, "other_stuff": 12}
freeze(y)
y["other_stuff"] = 11 # Raises exception. You can't change frozen stuff!
y["new key"] = 10 # And `y` _is_ frozen. So you can't modify it.
y["x"].append(4) # Not even the stuff inside it.
In fact, there is an open PEP in conversation to add this hypothetical freeze function... but it hasn't been accepted, and a previous attempt in 2005 failed. Even though deep immutability is the main kind of immutability you'd want for catching/preventing errors, it's actually hard to do in Python. Often your best option is to use frozen dataclasses, and nest those - which requires avoiding the ubiquitous list and dict types, or anything else mutable. That's... yeah, pretty fragile.
Lots of languages don't support deep freeze, but they do support a shallow freeze, like JavaScript's Object.freeze. Or take a look at Ruby:
# Example borrowed from https://www.rubyguides.com/2016/01/ruby-mutability/
animals = %w( cat dog tiger )
animals.freeze
animals << 'monkey'
# RuntimeError: can't modify frozen Array
# Freezing `animals` doesn't freeze the stuff inside `animals`. Surprise!
animals[1][0] = 't'
# => ["cat", "tog", "tiger"]
This is arguably counterintuitive, and may lead to more logic errors than it solves. And logic errors are scary - even the most memory-safe language is still trying to do what you ask. If you want to make a music program, but you type the source code for a raytracer, even Rust will not save you from asking for the wrong thing - it can only catch inconsistencies in your story, like a bourbon-soaked pulp detective. That "second pair of eyes" is useful, but it can't be a substitute for you as the author knowing what you mean, and whether your source code matches your intentions.
When you look through this lens, it becomes clear that C is actually making a very common mistake of language (and/or standard library) design. There are, theoretically, use cases for nuanced "this bit is const, but this stuff it contains is not" kinds of scoping, sure, but in general practice, that's a mother of an antipattern. C could have been designed, hypothetically, for const to work in a deep immutability way, and it actually might have been easier for C compilers to implement, but it would not make C any more memory-safe. You can already cast away const, and that's a merit of the language. What gives, man?
Footguns and Calibers
C has certain non-negotiable strengths, which have cemented its position in modern tech stacks. It's low-level, it's libertine, and it decouples data structures from functions. It's hated for the same list of reasons.
Now it's true that C is hazardous. And it's true that in C, lots of logic errors are also memory errors, because the programmer is in charge of memory (and capable of making mistakes about it). In fact, I don't even believe in the "get good" rhetoric that a lot of C veterans have. C is just plain harder than it oughtta be.
But when I ask myself what I'd prefer to do low-level programming in, the answer is always "a better C." I don't want to make it high-level, or paternalistic, or object-oriented. Those aren't the things that are broken about C, and so much modern language design - by misdiagnosing the flaws in C - is trying to fix the parts that aren't broken. We're talking about a language that lets you venture into dangerous territory, because sometimes you need to - a language that shouldn't stop you, but should equip you to face danger with confidence and clarity.
What sucks about C is all the ways it undermines your ability to develop a trustworthy mental model of how your code works. Variable declarations are hard to read. Volatile doesn't do what you'd expect. Signed integer overflow is undefined behavior. In fact, a ton of things are undefined behavior, and compilers are allowed to interpret UB in source code however they like. The linker model leads to hellish nightmares. Namespaces don't exist. The built-in types are platform-dependent in ways that are fair to describe as evil. What I need to emphasize is that a low-level, libertine, procedural language doesn't automatically need to make these mistakes. A C that doesn't suck, but still has the core strengths, is actually quite easy to imagine.
The real problem is pairing an assault on human reasoning (which is what these mistakes would add up to in any language, even a high-level one) with the stakes of low-level programming. It's a language full of footguns, where those guns are lent more consequence by working with large-caliber rounds. The answer is not smaller ammunition. The answer is a language without all these damn footguns in it.
I think the closest thing to what I'm looking for at this level of the tech stack, right now, is probably Zig. I'd like to play with it someday and have a better-informed opinion. Having tried Rust, I can tell you, that isn't the answer - it's extremely paternal in ways that hurt the readability (and writability) of source code. There are a few things it does quite well - uniquely so - but I think its niche is narrower than people expect, and the current cultural obsession with memory safety (rewrite everything in Rust!!!) is coming from a place of well-intentioned ignorance. Most of my effort to make my own programming language (Prone is not as low-level as C or Zig, but still cares about performance) comes from a goal of not having to write things in Rust anymore. I'm done suffering, it's time to heal.
I have a similar complaint about Functional Programming. Side effects are not bad, they're usually the point of programming. What we care about is something closer to the "Principle of Least Privilege" - the idea that code is easier to think about when its permissions tightly match its purpose. For example, my code handling untrusted user input should have different permissions from my code to edit /etc/shadow. Functions being "pure" vs "impure" is actually a uselessly low resolution for the kind of protections I want. I don't necessarily get the strictness I want by using a language built around a strictness I find asinine. MINIX 3, despite being written in C, is what the kids call "bae" and "so real for this." But I will credit Haskell for getting immutable data right, at least :D
Aren't you just being a contrarian?
Au contraire, mon ami! The point is not to be controversial (though that might be an unmanaged side effect). The point is that we're supposed to learn from history, we're supposed to be analytical about what's been done, so we can make better choices as we make new things. I think we, as an industry, are... weirdly bad at this? We're perfectly capable of seeing all the CVEs and zero-day vulnerabilities in C code and saying "there's gotta be some important lesson there." But when it comes to deciding what the evidence ought to mean, and which direction it ought to move us as developers, I think we're really susceptible to accepting easy - but incorrect - answers.
I don't think this comes from bad faith. But I do think that within any given chamber, slogans echo louder than critical thinking. This warrants some skepticism and care from us, I think. I think that thought-terminating cliches are a known footgun that we'll never fully be rid of, and we'll need manual vigilance against them until the end of time.

