The Compiler Isn't Your Babysitter (And Neither Is ChatGPT)

We've built ourselves a comfortable little dystopia where the tools think for us, and we've convinced ourselves that's progress. AI writes your code! Rust's borrow checker catches your memory bugs! You ship it! Except nobody's fucking thinking anymore. We've traded understanding for convenience, and the result is exactly what you'd expect... software that technically compiles but fundamentally sucks.

Want an example? How about Windows 11?
Windows 11 has introduced a lot of changes under the hood, and none of them have made the OS any better...
"Windows 11! Now with Rust, Typescript, and 30% Vibe Coded!"
Yeah, Microsoft has been replacing C, C++, and C# code with Rust and TypeScript written by LLMs, and the result is a handful of new bugs and worse performance.

The Borrow Checker as a Crutch

Look, Rust's borrow checker is brilliant engineering. It catches entire categories of bugs that would've haunted C codebases for decades. But... it's not magic, and treating it like magic is how you end up with codebases that compile but make senior engineers weep. I've watched junior devs treat the borrow checker like some kind of divine oracle. They don't understand why their code is wrong, they just keep tweaking shit until the compiler stops screaming. Does it work? Who knows! It compiles, doesn't it?

The borrow checker doesn't understand your business logic. It doesn't know if you're building the right thing. It just knows whether your pointers are behaving. And if you can't explain why it's complaining without consulting Stack Overflow, you're not writing Rust, you're playing compiler roulette.
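To make that concrete, here's a toy sketch (mine, not from any real codebase) of the kind of error people shotgun-fix without reading. The rejected version sits in the comments, along with the actual reason the checker complains:

```rust
fn first_then_grow(scores: &mut Vec<i32>) -> i32 {
    // This version would NOT compile:
    //   let first = &scores[0];   // immutable borrow starts here
    //   scores.push(40);          // ERROR: can't borrow `scores` as mutable
    //   *first                    // ...while the immutable borrow is still live
    //
    // The checker isn't being pedantic: `push` may reallocate the Vec's
    // buffer, which would leave `first` dangling. That's the *why*.

    // The fix: finish with the borrow (here, copy the value out) before mutating.
    let first = scores[0];
    scores.push(40);
    first
}

fn main() {
    let mut scores = vec![10, 20, 30];
    let first = first_then_grow(&mut scores);
    println!("first = {}, len = {}", first, scores.len()); // first = 10, len = 4
}
```

Understanding that one sentence about reallocation is the difference between writing Rust and playing compiler roulette.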

The AI Slop Epidemic

Then there are the AI assistants. Copilot, Cursor, Opencode, whatever flavor of autocomplete-on-steroids you're using this week. They're impressive, sure. They're also turning an entire generation of developers into vibe coders who can't code or debug without the internet... Don't believe me? Just check Twitter next time Cloudflare or AWS is down and watch the panic: "Cursor is down, I can't work."

Their workflow goes like this:
Type a comment describing what you want. Watch the AI hallucinate some code. Ship it. No need to actually read it, understand the control flow, or, god forbid, write a test. The robot wrote it, so it must be correct, right? And if it's not, just blame the robot; it's not your fault.

I've seen production code with AI-generated functions that nobody on the team actually understood. They just trusted the output, merged it, and moved on. When it inevitably broke in production, debugging turned into an archaeological expedition through someone else's alien logic.

AI-generated code is like that friend who's confidently incorrect about everything. It'll give you buffer overflows, race conditions, SQL injection vulnerabilities, and hardcoded API keys with the same cheerful confidence it uses to write a Hello World program. And because you didn't write it yourself, you don't have that intuitive sense of where the bodies are buried... So when it crashes, you're left staring at a pile of gibberish wondering why your perfectly safe code turned into a ticking time bomb.

The Death of Manual Verification

Here's what kills me... somewhere along the way, we decided that automated tools meant we could skip thinking entirely. The borrow checker catches memory bugs, so why worry about memory layout? Why worry about any other class of bug? Memory bugs are everything, right? Right? The AI wrote tests, so why read them? The fuzzer will find the edge cases, so why think about them upfront?

The borrow checker won't catch that you're printing clients' credit card numbers to the console... but a hacker will. AI assistants don't understand your system architecture or the weird hardware quirks you're working around. Fuzzers only find what they're programmed to look for; they won't explore every possible path in your code, and they definitely won't catch that race condition that only shows up on the first Friday of the month.
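Here's a trivial sketch of the point (the `Payment` struct and `redact` helper are hypothetical, invented for illustration): both `println!` lines below are equally "safe" as far as the compiler cares. Only a human knows one of them is a compliance incident.

```rust
// "Memory safe" is not "secure": all of this compiles without a single warning.
struct Payment {
    card_number: String,
    amount_cents: u64,
}

// Minimal redaction helper (hypothetical; real systems use structured
// logging with field-level redaction, not string slicing).
fn redact(card: &str) -> String {
    let last4 = &card[card.len().saturating_sub(4)..];
    format!("****-{}", last4)
}

fn main() {
    let p = Payment {
        card_number: "4111111111111111".into(),
        amount_cents: 1999,
    };

    // The borrow checker is perfectly happy with this line too:
    // println!("charging {} to {}", p.amount_cents, p.card_number); // PCI nightmare

    // Only code review catches that this is the one you actually want:
    println!("charging {} to {}", p.amount_cents, redact(&p.card_number));
}
```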

You know what catches those bugs? A developer who actually understands what the fuck their code is doing.

The Knowledge Gap They Don't Want to Talk About

There's a divide forming, and it's getting wider. On one side, you've got developers who learned on C, who've debugged segfaults and mastered GDB, Valgrind, and the rest of the debugging toolbox. They've read assembly. They've traced memory leaks through Valgrind output. They understand that tools are helpers, not replacements for thinking.

On the other side, you've got the Rust cult and the vibe coders. The Rust people treat the borrow checker like an infallible god, and the Rust docs as their holy book. The vibe coders assume AI output is the answer to all their problems, and they don't even bother to read it. Neither group seems particularly interested in understanding the low-level details, because the tools handle that, right?

When your production system shits the bed in the middle of the night, the borrow checker isn't going to debug it for you. ChatGPT isn't going to explain why your "perfectly safe" code is deadlocking, or why your Rust code took down the whole internet with an unwrap() after loading a bad config file (yeah, the famous Cloudflare Rust unwrap() incident). You need to actually understand memory models, concurrency primitives, and how the compiler transformed your high-level abstractions into machine code.
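The failure mode there is dead simple. Here's a hypothetical config loader (the field name and fallback value are mine, not from any real postmortem), with the unwrap() shortcut in the comments next to the boring version that degrades instead of panicking:

```rust
use std::collections::HashMap;

// Hypothetical config: a map of string keys to string values.
fn load_limit(config: &HashMap<String, String>) -> u32 {
    // The "vibe" version:
    //   config.get("limit").unwrap().parse().unwrap()
    // One missing or malformed field and the whole process panics.

    // The boring version: handle the failure, fall back to a documented default.
    config
        .get("limit")
        .and_then(|v| v.parse().ok())
        .unwrap_or(100) // explicit fallback instead of a 3 a.m. page
}

fn main() {
    let good: HashMap<String, String> =
        [("limit".to_string(), "250".to_string())].into();
    let bad: HashMap<String, String> = HashMap::new(); // the "bad config file"

    println!("{}", load_limit(&good)); // 250
    println!("{}", load_limit(&bad));  // 100, instead of a crash
}
```

Nothing exotic: just knowing that unwrap() is a loaded gun, and deciding on purpose what happens when the data is garbage.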

But how can you debug assembly if you've never been taught to read it? How can you reason about memory layout if you've only ever worked with garbage-collected languages like Java or Python, or a babysat language like Rust? How can you question the tools if you don't understand what they're doing under the hood? If a mechanic can only paint cars and has no idea how an engine works, can you really call him a mechanic?

CS Education Is An Absolute Failure

Universities are churning out TypeScript developers who couldn't explain what RAM is if their diploma depended on it. I'm not exaggerating; I've interviewed these people. (Not all of them, but a lot of them.) They can build a React app, sure, but ask them how a variable is stored in memory and you get an awkward silence.

This is insane. If you're writing code that runs on computers, you should understand, at minimum, how computers work. Not at the transistor level, necessarily (that would be a big plus), but you should be able to read and understand assembly. You should understand stack vs heap. You should know what a cache miss costs. You should be able to look at LLVM IR and not immediately have an aneurysm.
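Stack vs heap isn't abstract trivia either; you can see the distinction in a few lines. A minimal sketch (the exact addresses vary per run and OS, so don't read anything into the specific numbers):

```rust
fn addresses() -> (usize, usize) {
    // Stack: a fixed-size value living directly in this function's frame.
    let on_stack: [u8; 4] = [1, 2, 3, 4];

    // Heap: Box asks the allocator for memory; the pointer sits on the
    // stack, the actual bytes sit on the heap.
    let on_heap: Box<[u8; 4]> = Box::new([1, 2, 3, 4]);

    (
        &on_stack as *const _ as usize,
        &*on_heap as *const [u8; 4] as usize,
    )
}

fn main() {
    let (stack_addr, heap_addr) = addresses();
    // The two values live in different regions of the address space:
    println!("stack: {:#x}", stack_addr);
    println!("heap:  {:#x}", heap_addr);
}
```

A developer who has printed addresses like this once in their life will never again be mystified by "why is my recursive function blowing the stack" or "why is this allocation slow."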

These aren't esoteric skills for embedded developers and compiler writers. They're not old, useless knowledge that nobody needs anymore. They're foundational knowledge that every developer needs when shit goes wrong. And shit always goes wrong.

We need to teach students to write assembly, even if it's something simple like the PIC16 with its tiny instruction set. And we need to teach them to read x86 and ARM assembly, to show how their high-level code maps to machine instructions. We need to drill into their heads that tools are assistants, not substitutes for understanding. And that the more you understand a tool, the better you are at using it.

The Real Problem, and Why There Is No Solution

Here's the thing: I don't hate Rust. I don't hate AI assistants. These tools are genuinely useful when used correctly. The problem isn't the tools themselves, it's how they're being sold and adopted. Vendors market them as silver bullets. "Use Rust and your code will be safe!" "Use AI and you'll code 10x faster!" (Well, I guess some vibe coders have gone from 0.01x to 0.1x engineers, but that's not the point.) What the marketing doesn't mention is that these tools only work correctly when wielded by people who already understand the fundamentals. They augment expertise, they don't create it.

The enshittification of modern software isn't because our tools are bad. It's because we've built a culture where developers outsource thinking to compilers and chatbots, where "it compiles" is considered sufficient, where understanding how things actually work is treated as optional knowledge for 'greybeards'. And where, sadly, the knowledge gap between the new 'devs' who started with those tools and real engineers is growing wider by the day.

The solution isn't to go back to writing everything in C and assembly, mostly because most new devs don't have the skills to do so. It's to use modern tools while maintaining the discipline and understanding that makes good developers good in the first place. Let the borrow checker catch your memory bugs, but understand why they were bugs. Use AI to speed up boilerplate, but read and verify every line it generates. Employ fuzzers and sanitizers, but don't let them replace manual testing and code review. And most importantly, don't let the borrow checker or the AI replace your brain.

Tools should augment human intelligence, not replace it. The moment we forget that is the moment we've already lost. And from what I see on Twitter and GitHub... it's probably too late for that.