Design for Change

What did all the organizations I worked for have in common? A desire to change—and to change quickly!

Starting with startups—it’s obvious: we need to build that new product fast. MMP, MVP, ARR—ASAP!

But large organizations aren’t that different—the need for speed is still there. And when speed isn’t enough, it triggers the need for change. Every company operating in the digital space (and every large company does) wants to innovate quickly for customers: to stay competitive, meet customers’ expectations for frictionless experiences, and deliver quality products and differentiating capabilities before competitors win the market.

What does that mean for architects? Design for easy change. The ability to change easily creates speed.

Of course, this shouldn’t lead to over-engineering or designing for every unknown future requirement. The guiding principle should be the opposite: the simpler the system, the easier it is to change.

Architecture at Different Speeds

A typical situation in older, non-“digitally native” companies is that “systems of record” (SoR) are legacy and hard to change. On top of them, a new digital experience layer is created as a set of modern, customer-facing applications.

This sounds promising, but in practice, systems of record will still slow the organization down. It may be relatively easy to use mechanisms such as CDC to extract data from legacy systems into the experience layer. But if core business processes still live in the SoR, then writing data back will most likely require changes to legacy systems.
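To make the asymmetry concrete, here is a minimal sketch of projecting change-data-capture (CDC) events from a system of record into an experience-layer read model. The event shape is a simplified, Debezium-style envelope; the field names (`op`, `key`, `after`) and the in-memory store are illustrative assumptions, not any specific product's schema:

```python
# Read model for the experience layer: a projection of the SoR,
# keyed by record id. In practice this would be a database or cache.
experience_layer: dict[str, dict] = {}

def apply_cdc_event(event: dict) -> None:
    """Apply one change event captured from the legacy system of record."""
    op = event["op"]   # "c" = create, "u" = update, "d" = delete
    key = event["key"]
    if op in ("c", "u"):
        experience_layer[key] = event["after"]  # upsert the latest state
    elif op == "d":
        experience_layer.pop(key, None)         # record removed in the SoR
```

Note that this only covers the easy direction: reads. Any write originating in the experience layer must still travel back through the legacy system's own processes, which is exactly where the slow architecture reasserts itself.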

This isn’t impossible to handle, but the company needs to be aware that creating a new digital experience layer won’t solve all problems, and old systems will still require investment to keep up with change. Even if the organization is committed to decommissioning old systems and moving all processes into new solutions, it won’t happen overnight. Architectures operating at two different speeds will have to coexist for some time, with a high risk of degrading the user experience during the transition.

Speed in the AI Era

The Architect Elevator book was written before the AI revolution. But now, with AI dramatically accelerating code creation, are IT processes and systems ready to handle the unprecedented pace of change? How good is our CI/CD? Can we delegate implementation tasks to AI agents with confidence?

A quote from the book feels especially relevant now:

“Propose to a development team that they let you delete 20 arbitrary lines from their source code. Then, they’ll run their tests—if they pass, they’ll push the code straight into production. From their reaction, you’ll know immediately whether their source code has sufficient test coverage.”

An AI agent would be more than happy to delete those 20 lines. Of course, we can use AI to improve test coverage. But do we have the requirements specified? If not, AI to the rescue again: we can use it to reverse-engineer requirements from the codebase, review whether the derived requirements make sense, and then fix issues based on feedback gathered by LLM bots. Happy days may be coming, but there’s likely a bumpy road ahead.
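The book's thought experiment can even be run literally, as a crude one-shot mutation test. The sketch below is my own illustration (function names, the `test_cmd` parameter, and the backup strategy are assumptions, not from the book): delete 20 arbitrary lines, run the suite, and treat a passing suite as a red flag.

```python
import random
import shutil
import subprocess
from pathlib import Path

def delete_random_lines(source: Path, count: int = 20, seed: int = None) -> None:
    """Delete `count` arbitrary lines from a source file, in place."""
    rng = random.Random(seed)
    lines = source.read_text().splitlines(keepends=True)
    doomed = set(rng.sample(range(len(lines)), min(count, len(lines))))
    source.write_text("".join(l for i, l in enumerate(lines) if i not in doomed))

def mutation_smoke_test(source: Path, test_cmd: list) -> bool:
    """Return True if the test suite FAILS after the deletion.

    A failing suite is the good outcome: it means the tests actually
    noticed that 20 lines went missing. A passing suite means the
    coverage would not stop an overeager AI agent either.
    """
    backup = source.with_suffix(source.suffix + ".bak")
    shutil.copy(source, backup)
    try:
        delete_random_lines(source)
        result = subprocess.run(test_cmd)
        return result.returncode != 0
    finally:
        shutil.copy(backup, source)  # always restore the original file
        backup.unlink()
```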

There are two main scenarios. In the first, we bet that AI will actually become good enough to fix all the legacy issues. In that scenario, though, the economy changes significantly, and most current digital companies could be out of business, replaced by AI giants. Decision makers who believe in that scenario might be better off investing in land to grow their own potatoes than in IT transformations.

Alternatively, if we bet that AI won’t become that good, then before using AI to accelerate new-feature development, we should first use it to build confidence around making changes with AI tools—or design systems from scratch with easy, confident change in mind.