The Omnissiah
Picture a future where we've forgotten how our own machines work. The systems still run (sometimes), but no one alive understands them anymore. So we stop engineering and start praying, muttering incantations at servers, taping blessings to motherboards, swinging incense around data centres in the hope that the spirits inside will keep the lights on. It sounds absurd, but it's also a logical endpoint of a world already filling with AI black boxes nobody can explain. Warhammer 40k's tech-priests are increasingly looking like a forecast.
Having spent over a decade building software, I can say with some confidence that this current wave of AI development is the most tectonic shift I've witnessed in the role of the software engineer. The lines between product and engineering are blurring fast. While that has obvious upsides for businesses, it carries a quieter risk for anyone entering the pipeline today. It pulls them away from the finer details of the craft, the hard-won lessons that traditionally shape a senior engineer.
Software engineering has always been built on more than syntax. It rests on principles like keeping things simple, naming things well and understanding that every line of code is a liability as much as an asset. Experienced engineers apply these almost without thinking. As we drift further from the implementation, these principles start arriving via textbook rather than lived experience; they become a form of trivia instead of instinct.
That raises an uncomfortable question. How do we truly understand the systems we're building if so much of the building is delegated? We may become fluent in what the thing is meant to do, but if what we're building and how it's built have quietly decoupled, can we really claim responsibility for the final artefact?
We've already seen the early tremors. High-profile bugs, outages, embarrassing failures, not always directly caused by AI, but accelerated by a culture that moves faster and comprehends less.
The standard rebuttal is that we just need to catch these issues before they reach production with comprehensive test suites and guardrails around AI output. Those things help. They are also sticking plasters on a deeper wound. It's a bit like bowling with the bumpers up, except we've handed every bowler a Super Bowl 3000 cannon that fires the ball down the lane at 300 miles an hour. The bumpers are technically still there - they are also about to become shrapnel.
The worry isn't just that bugs slip past those fragile defences. It's that when something does go wrong, we'll no longer have the understanding required to figure out why.
This has always been the role of the software engineer. If we can't take responsibility for the artefacts we create, what is our role going forward?
Companies are coalescing around the term "product engineer", a hybrid role that collapses business and engineering into one fast-moving position. Same job, new coat of paint. Software engineers have always needed a grounding in both code and business domain. In that regard, nothing has changed.
The longer-term issue is different. We currently benefit from the pre-existing knowledge of experienced engineers. What happens to the profession if we can no longer comprehend the systems we build? Some will argue we've simply moved further up the abstraction chain. We don't write assembly anymore, after all. I think that grossly oversimplifies the problem. Moving up a level of abstraction has historically meant using deterministic compilers and interpreters, tools built by people with a deep comprehension of the layer beneath them. Those layers of comprehension form the toolchain.
AI inserts itself at every level of that chain at once. In theory, I could use it to write assembly as readily as JavaScript - that isn't obscuring my comprehension of the implementation, it is replacing it.
The most pressing threat here is the apprenticeship problem. Senior engineers exist because junior engineers spent years failing at things, sitting through code reviews, slowly absorbing instinct from people a few steps further along. Seniority wasn't conferred by time served, it was earned through small, often painful lessons. If that pipeline shortcuts straight to "describe what you want and accept the output", where do the next decade's seniors come from?
The optimistic answer is that AI can be a sort of sparring partner rather than a ghostwriter. Treat its output as a first draft you're obligated to understand line by line. Use it to accelerate the boring parts whilst still doing the thinking yourself. That's a reasonable model for an individual engineer with the discipline to work that way.
The cynic in me suspects this isn't really a choice individuals get to make. Businesses gravitate toward whatever reduces cost and increases margin, and those incentives rarely align with the long-term interests of a profession. If one engineer slows down to properly understand what they're shipping while another accepts the AI's output and moves three times as fast, the market has a fairly predictable opinion about which of them looks more valuable on a quarterly review. The careful engineer becomes a luxury. The careless one becomes the template.
This isn't unique to software. It plays out in any field where the appearance of productivity can be decoupled, however briefly, from the underlying competence required to sustain it. Software is just one of the earlier dominoes.
There's something else worth saying here, which is that solving problems is genuinely enjoyable. That is the nature of engineering, and for a lot of engineers, a collapse into pure product and business work simply won't scratch the same itch. We don't apply our skills and critical thinking only because we have to, though in some deeper sense perhaps we do. We do it because we want to. Wrestling with a problem until it yields is the part of the job that makes the job interesting in the first place.
This isn't unique to engineering either. For an artist, image generation isn't a replacement for the act of making something. The output might look superficially similar, but the thing being lost is the part the artist actually showed up for. Strip the thinking out of knowledge work and you don't just lose the expertise. You lose the reason a lot of people chose the work to begin with.
You might be reading this hoping I'll arrive at a neat conclusion. I don't think I have one. In the short term, there are still plenty of facets of working with AI where experience and comprehension remain invaluable. The engineers who already have the instincts can wield these tools far more effectively than those who don't.
The trouble is what happens after that. If the endpoint really is a world in which knowledge work amounts to prompting a system and accepting its output, the issue extends well beyond software engineering. Every profession that relies on the slow accumulation of judgement faces the same quiet erosion. The expertise doesn't vanish all at once. It simply stops being replenished.
The tech-priests of the 41st millennium didn't arrive in a single generation. They arrived because, slowly and unremarkably, the people who understood the machines were replaced by people who only knew how to operate them. Each generation knew slightly less than the one before, and each had slightly less reason to care, because the machines mostly still worked. Until one day the distinction between maintenance and ritual quietly dissolved, and nobody was left who could tell the difference.
We're a long way from incense and prayer scrolls in the server room. We're not, I think, as far as we'd like to believe.