June 19th, 2024

Software design gets worse before it gets better

The "Trough of Despair" in software design signifies a phase where design worsens before improving. Designers must manage expectations, make strategic decisions, and take incremental steps to navigate this phase successfully.

The article discusses the concept of the "Trough of Despair" in software design, emphasizing that improvement in design always involves a period where things get worse before they get better. The software designer's role is to envision a better design and navigate through this challenging phase by determining the size of steps taken towards improvement. The article highlights the importance of managing expectations during this phase and making strategic decisions to reach a better design efficiently. It also touches on the factors influencing the duration and extent of the "Trough of Despair" and how designers can influence the transformation process to achieve a successful outcome. The key takeaway is that software design involves careful planning, incremental steps, and managing the trade-offs between risk, progress, and organizational reactions to design changes.

29 comments
By @Rhapso - 4 months
To boil it down simply:

People are vaguely good and competent; they leave systems in a locally-optimal state.

In general, only changes that are "one step" away are considered, and they always leave things worse when you are currently in a locally optimal state.

A multi-step solution will require a stop in a worse, lower-energy state on the way to a better one.

Monotonic-only improvement is the path to getting trapped. Take chances, make mistakes, and get messy.
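
A minimal sketch of this in Python (the landscape, numbers, and function names below are invented purely for illustration): greedy one-step improvement stalls at the nearest local optimum, while a search that tolerates a bounded stretch of worse states reaches the higher peak.

    def quality(x: int) -> float:
        # Toy one-dimensional "design quality" landscape:
        # a small peak at x=2 and a taller peak at x=8.
        peaks = [(2, 5.0), (8, 9.0)]
        return max(height - abs(x - pos) for pos, height in peaks)

    def greedy(x: int) -> int:
        # Only accept single steps that immediately improve quality.
        while True:
            step = max((x - 1, x + 1), key=quality)
            if quality(step) <= quality(x):
                return x  # stuck: every one-step change makes things worse
            x = step

    def tolerate_trough(x: int, max_worse_steps: int = 4) -> int:
        # Allow a bounded run of worsening steps, keeping the result only
        # if it climbs out to something better than where we started.
        best = greedy(x)
        for direction in (-1, 1):
            probe = best
            for _ in range(max_worse_steps):
                probe += direction         # walk down into the trough
                candidate = greedy(probe)  # then climb whatever hill is there
                if quality(candidate) > quality(best):
                    best = candidate
        return best

    if __name__ == "__main__":
        print(greedy(0), quality(greedy(0)))                    # 2 5.0 (local peak)
        print(tolerate_trough(0), quality(tolerate_trough(0)))  # 8 9.0 (global peak)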

By @mattgreenrocks - 4 months
The multi-dimensional nature of this problem makes it extremely fascinating to me, even twenty years into my career.

There's a certain dopamine hit you get from voluntarily trudging into the trough of despair, pushing the Sisyphean boulder of better design uphill in the optimistic belief that you can do better, and then actually arriving at something slightly better.

Way more fun than conceptualizing programming as duct-taping libraries, frameworks, and best practices together.

By @bloaf - 4 months
So there is a pretty obvious analogy in chemistry: activation energy.

https://en.wikipedia.org/wiki/Activation_energy

The ELI5 version is that atoms are all trying to find a comfy place to be. Typically, they make some friends and hang out together, which makes them very comfy, and we call the group of friend-atoms a molecule. Sometimes there are groups of friendly atoms that would be even comfier if they swapped a few friends around, but losing friends and making new friends can be scary and seem like it won't be comfy, so it takes a bit of a push to convince the atoms to do it. That push is precisely activation energy, and the rearrangement won't happen without it (modulo quantum tunneling, but this is the ELI5 version).

In the software world, everyone is trying to make "good" software. Just like atoms in molecules, our ideas and systems form bonds with other ideas and systems where those bonds seem beneficial. But sometimes we realize there are better arrangements that weren't obvious at the outset, so we have to break apart the groupings that formed originally. That act of breakage and reforming takes energy, and is messy, and is exactly what this author is writing about.
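
As a rough sketch of that analogy (a made-up double-well energy function and parameters, not from the comment or the article), here is a Metropolis-style random walk in Python: with essentially no "push" (temperature near zero) the walker stays trapped in the first comfy well, while with enough push it can cross the barrier and settle in the comfier, lower-energy well.

    import math
    import random

    def energy(x: float) -> float:
        # Double-well landscape: a shallow well near x = -1 and a deeper
        # well near x = +1, separated by a barrier around x = 0.
        return (x * x - 1.0) ** 2 - 0.5 * x

    def settle(x: float, temperature: float, steps: int = 20_000) -> float:
        # Metropolis rule: downhill moves are always accepted; uphill moves
        # are accepted with probability exp(-dE / T). The barrier height is
        # the activation energy the walker has to be pushed over.
        for _ in range(steps):
            candidate = x + random.uniform(-0.1, 0.1)
            d_e = energy(candidate) - energy(x)
            if d_e <= 0 or random.random() < math.exp(-d_e / temperature):
                x = candidate
        return x

    if __name__ == "__main__":
        random.seed(0)
        print(round(settle(-1.0, temperature=0.01), 2))  # stays trapped near -1
        print(round(settle(-1.0, temperature=0.30), 2))  # typically ends up near +1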

By @wannacboatmovie - 4 months
What is "better software" though?

On one hand you have groups like the OpenBSD team that work on Mostly Boring Things and make serious inroads at improving the software quality of the Mostly Boring Components that power the hidden bits of the Internet, which go relatively unnoticed.

On the other hand, you have "improvements" from Apple and everyone else that involve an ever-changing shell game of moving around UI widgets perpetuated by UI designers on hallucinogens.

Are browsers like Chrome, which are elaborate ad-dispensing machines, really improvements over the browsers of yore? IE 4 may have sucked by modern standards, but it also didn't forward every URL I visited to Google.

I've been around since the beginnings of the WWW and it's reached the point where I am struggling to understand how to navigate these software "improvements". For the first time I have felt like my elderly parents using technology. I haven't gotten stupider; the software has become more difficult to use. It has now become some sort of abstract art rather than a tool for technologists.

By @breckenedge - 4 months
At first glance I didn’t like this article (due to a long history of poorly executed redesigns for design’s sake) so I gave it a few minutes and reread it, and now I like it.

Sometimes when reviewing people’s redesigns, I can’t see the beautiful thing that they’re envisioning, only the trough. And over the years I’ve noticed that a lot of redesigns never make it out of the trough. I like the idea of doing small things quickly, and I think that’s good, but it’s also technical debt if the redesign never results in a benefit.

By @dkarl - 4 months
Metaphors get abused in this article in a confusing way, and I don't think it explains why the quality curve goes downward at first -- the initial drop in quality is compared to an initial capital investment? what? -- but I agree with the truth of it.

I think the article could be a lot shorter and easier to understand if it simply said that the current design is in a local maximum, and you have to work your way incrementally out of the local maximum to reach a different local maximum. I think programmers would get that metaphor a lot more easily than the "buying widgets for a new factory" metaphor.

I do like how the article puts the spotlight on designing the process of change: picking the route, picking the size of the steps, and picking the right spot to announce as the goal. That gives me a lot of food for thought about the changes my team is contemplating right now.

By @xyst - 4 months
Good design and implementation require skilled people. You don't get either with bottom-of-the-barrel pay grades.

Something I have noticed in this industry is that big companies think they can outsource their staffing issues and "save on labor". But in the end they pay more for management of outsourced assets, the inevitable maintenance of poorly designed and implemented software, delays in delivery, and of course the churn and burn of hiring and firing contractors. Then they end up redoing everything with local talent, with an eighth of the team in half the time.

It only took 3-4 years to realize this, but this is what the "trough of despair" really looks like.

By @kazinator - 4 months
I don't understand this article at all, or what it means by "worse." It clearly defines what better means: that the architecture is such that implementing a feature is no harder than it absolutely has to be. So if it has to get worse first, does that mean we initially make it harder to implement the desired features? Why would we do that? Are we counting a half-done, under-construction state? It's harder to implement a feature now than before because we partially wrecked the old architecture, and the new architecture isn't done yet? Or is it because we're accounting all this re-architecting work towards the cost of the first new feature? The first toilet install is hard because we have to do the whole plumbing in the building and redo the sewage pipe out to the street; the second toilet is easy?

By @exabrial - 4 months
That’s the lesson we learned: implement the simplest thing first, with just a few basic principles like separation of concerns. Humans are terrible at predicting where a system will expand in the future. Therefore, stay out of your own way by not overbuilding!

By @diffxx - 4 months
I mainly agree although I think that the trough of despair often comes after an initial bump. At first when designing the new system, you pluck the low hanging fruit of improvement for a small subset of the system. There is no dip yet -- things are just getting better. But when you start migrating the rest of the system, you inevitably do hit that dip and descend into the trough of despair before climbing back out.

The art is to design things in such a way that a minimum amount of time is spent in the trough.

By @quantum_state - 4 months
Interesting discussion … it appears that the nonlinear nature of modifying software, by a dev team with incomplete tacit knowledge of the underlying design, makes it inevitable that things end up in a state of ruin: small changes become very costly and risky, etc.

By @dagss - 4 months
What so often happens is that you make a plan like this, then business priorities change, things take longer than expected, people leave or join, and then you wish you had never started...

By @piinbinary - 4 months
The same applies to tech debt: https://jeremymikkola.com/posts/2022_01_29_tech_debt_gets_wo...

(Yes there's a typo in the url. It bugs me, too)

prior discussion: https://news.ycombinator.com/item?id=30128627

By @danybittel - 4 months
My experience is the exact opposite. To implement a new feature, I usually first refactor, make space for the new feature, improve existing design. (uphill). Then I implement the new feature as pristine and clear as possible (top). Then I face the reality, integration tests fail, I add edge cases I forgot, (downhill). And I end up at the bottom, ready to push that abomination to git and forget about it.
By @deterministic - 4 months
Not my experience at all. My original design decisions rarely change fundamentally. And whatever small changes I decide to implement, I implement step-by-step with each step being a refactored improvement.

It probably helps that I have 30+ years of experience and always pick architectures I have used before on successful projects.

By @monksy - 4 months
Firstly, Hey Kent, how have you been?

Secondly: I think this may be reflective of someone who hasn't sat down and realized the environment that they're in. Creating a poor architecture or approach on the first go is usually a sign of dysfunction or inexperience.

Inexperience: It's more that the individual hasn't sat down and realized that the initial approaches are inappropriate, and should be designing first before pushing forward. Experience means fleshing out a lot of these details before coding anything, and getting the protocols and conflicts resolved months before they happen. (This is where I see a Staff+ being responsible and assisting in the development of the project.)

Dysfunctional environment: Our culture in software engineering has forgone formal design before creating a solution. Typically, most development is dictated by "creating a microservice" first and then trying to design as you go along. This coding-first approach is so aggressive that many even forgo testing. Why does this exist? Partly because of incentives from business/management to deliver fast, and distrust in the survivability of the product.

---

That being said: am I promoting a "perfect design" first (as I've been accused of doing)? No, iteration will happen. Requirements will change, but if you're applying strong tooling and good coding practices, rearranging your architecture shouldn't be as big of an issue as it currently is.

By @junto - 4 months
I’d be interested in the tail end of the graph. I assume that the longer the software is in operation, the more complex and worse it gets. From an anecdotal perspective, that’s my experience anyway having worked on some legacy projects in my time.
By @niyyou - 4 months
Genuine question: is this a property of software design only? Think about construction: for any change in the architecture, one has to demolish things and make a mess. I'd argue that's a general property of change.

By @uoaei - 4 months
Absolutely. There's a gap between code which expresses/communicates its intentions and code which achieves the same goals while being much more streamlined and suited to the constructs of the language.
By @advael - 4 months
Needed this today. I think engineers sometimes go crazy and try to greenfield something, anything, because building stuff requires it being in a nonfunctional state for a while, and that is hard enough on its own. But when what you're working on is something someone is relying on, the (understandable, but often very counterproductive) friction around that can make it a really daunting and frustrating process, given the inevitability of the trough.

By @richrichie - 4 months
By @layer8 - 4 months
I wouldn’t refer to the intermediate states as a software design. The “trough” is the transition from an old design to a new design. But it’s not a software design itself, and therefore not a worse design. It’s just the fact that if you can’t go from the old to the new design in one go (a typical situation), then you’ll have an intermediate phase where the code base is in a more inconsistent and/or complex state, and therefore objectively worse if it were to stay in that interim state. But it’s not the software design that gets worse before it gets better, unless you’re doing some design exploration (hopefully not in production).
By @begueradj - 4 months
I was wondering who could have written such a wise article title, before I found out it's Kent Beck himself ...
By @kachapopopow - 4 months
I always think of projects as something you constantly iterate on; rewriting, in my experience, has always been a mistake.

By @gavinhoward - 4 months
This matches my software development process:

https://gavinhoward.com/2022/10/technical-debt-costs-more-th...

Not everyone can do this, however.

By @sylware - 4 months
Why does that make me think about x11->wayland core?
By @sharas- - 4 months
Kent said some things, drew some kumbaya diagrams...

By @behnamoh - 4 months
This also applies to iOS:

    iOS 6 = good
    iOS 7 = bad
    iOS 8 = better

And now:

    iOS 16 = good
    iOS 17 = bad
    iOS 18 = even worse (and ugly)
    iOS 19 = hopefully good