August 18th, 2024

Algorithms We Develop Software By

The article explores software development methodologies that improve coding efficiency, emphasizing daily feature work, code rewriting, the "gun to the head" heuristic, and effective navigation of problem spaces.

The article discusses various software development methodologies and heuristics that can enhance coding efficiency and quality. One method highlighted is starting work on a feature at the beginning of the day and, if unfinished, deleting the work and starting anew the next day, while retaining unit tests. This approach, reminiscent of Extreme Programming, encourages clean solutions through iterative refinement. Another heuristic is to "write everything twice," which suggests that rewriting code can lead to higher quality outcomes and better retention of coding patterns. The author also mentions a technique called the "gun to the head" heuristic, which challenges engineers to devise quicker solutions by breaking their initial time estimates, often leading to more efficient plans. The article emphasizes the importance of pathfinding in problem-solving, likening software engineering to various search algorithms, and suggests that improving as an engineer involves becoming adept at navigating problem spaces effectively.
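
The search-algorithm analogy can be made concrete with a toy sketch (the graph, heuristic, and names below are illustrative, not from the article): a greedy best-first search that always expands whichever intermediate state looks closest to the goal, much as an engineer picks the most promising next step toward a working feature.

```python
import heapq

def best_first_search(graph, start, goal, estimate):
    """Greedy best-first search: always explore the state that the
    heuristic `estimate` scores as closest to the goal."""
    frontier = [(estimate(start), start, [start])]
    seen = set()
    while frontier:
        _, state, path = heapq.heappop(frontier)
        if state == goal:
            return path
        if state in seen:
            continue
        seen.add(state)
        for nxt in graph.get(state, []):
            if nxt not in seen:
                heapq.heappush(frontier, (estimate(nxt), nxt, path + [nxt]))
    return None  # goal unreachable from start

# Toy "problem space": nodes are intermediate states of a task.
graph = {
    "idea": ["prototype", "big-design"],
    "prototype": ["tests", "big-design"],
    "big-design": ["tests"],
    "tests": ["shipped"],
}
# Hypothetical heuristic: fewer estimated remaining steps = more promising.
steps_left = {"idea": 3, "prototype": 2, "big-design": 2, "tests": 1, "shipped": 0}
print(best_first_search(graph, "idea", "shipped", steps_left.get))
```

Swapping in a different heuristic changes which path the search commits to first, which is the article's point about navigating problem spaces: the quality of your "estimate of remaining work" shapes the route you take.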

- Starting work on features daily and deleting unfinished work can lead to cleaner solutions.

- Rewriting code can improve quality and retention of coding patterns.

- The "gun to the head" heuristic can help engineers find quicker solutions by challenging their initial estimates.

- Effective software engineering involves navigating problem spaces and finding optimal solutions.

- Iterative methods and heuristics can significantly enhance coding efficiency and quality.

AI: What people are saying
The comments reflect a variety of perspectives on software development methodologies discussed in the article.
  • Many commenters support the idea of rewriting code, with some suggesting writing it multiple times to achieve the best results.
  • The "gun to the head" heuristic is debated, with some viewing it as a useful thought exercise while others criticize its application in project management.
  • There is a consensus on the importance of understanding abstractions and avoiding analysis paralysis in software design.
  • Several commenters emphasize the need for balance between speed and quality in coding practices.
  • Some highlight the significance of experience in recognizing when to refactor or rewrite code effectively.
20 comments
By @nkozyra - 3 months
Write everything (generally, new features) twice has turned out to be a really good strategy for me, but it doesn't sit well with bizdev or project managers and tends to be perceived as unnecessary slowness.

But if you plow through a feature and get it "working," much of that work goes into cleaning up the logic and refactoring during your first pass. What rewriting allows you to do is crystallize the logic flow you developed the first time and start cherry-picking in a more linear fashion to meet the blueprint. It also tends to reduce the urge (and need) for larger-scale refactorings later on.

By @from-nibly - 3 months
> "gun to your head, you have to finish in 24 hours, what do you do?"

PSA: if you are a project manager/owner or in some other similar position, you do not get to ask this. This is a personal educational exercise, not a way to get stuff done faster.

By @pkoird - 3 months
Good code, in my opinion, is written by appropriately selecting suitably contained abstractions. The problem with this, and the article does try to talk about it, is that to select appropriate abstractions, you need to know the "entire" thing. Which is to say, you need knowledge of something that isn't there yet.

In other engineering disciplines, such as civil engineering or architecture, this problem is solved by a good blueprinting paradigm like CAD layouts, but I find a distinct lack of this in software[1]. Ergo this advice, which is a rephrasing of "know first and build later". But it is equally easy to lose oneself in what's called analysis paralysis, i.e. getting stuck finding the best design instead of implementing a modest one. In the end, this is what experience brings to the table, I suppose: balance.

[1] The closest I can think of are various design diagrams, like class diagrams.

By @simpaticoder - 3 months
Very interesting suggestions, all worth trying. Having a very capable coworker can help here, because they can show you what can be done in a short amount of time. Specifically, I've noticed that some devs get "winded" by a change and want to take a break before moving on; others simply continue. This ability can be improved with practice, both within and across contexts. Doing things quickly is valuable for many intrinsic reasons that are often overlooked because we decry the poor extrinsic ones. As with car repair, the odds that you forget how to reassemble the car scale with the time the repair takes. Similarly, if you can execute a feature in a day (especially a complex one that requires changes to many parts of a repo, and/or more than one repo), that is much less risky than taking many days or weeks. (Getting there requires a firm command of your toolset, in the same way a mechanic understands his tools or a musician understands her instrument. It also requires that externalities be systematically smooth; I'm thinking particularly of a reliable, locally repeatable, fast CI/CD process.)

(The calculus here is a little different when you are doing something truly novel, as long periods of downtime are required for your brain to understand how the solution and the boundary conditions affect each other. But for creating variations of a known solution to known boundary conditions, speed is essential.)

By @a1o - 3 months
> Write everything twice

There's an enhancement to a piece of software I use/maintain that I wrote once and lost (the PC I wrote it on went kaput, and since I was writing offline I also had no backup). It was an entire weekend of coding where I got very in the zone and happily coded.

After I lost that piece of code, I never could muster the will to write it again. Whenever I try to start that specific enhancement, I get distracted and can't focus, because I also can't remember the approach I took to get it working, and I'm too lazy to figure out again how it was done. It's been two years now.

By @jesse__ - 3 months
This is one of the best "programming advice" posts I've ever read, right up there with the grug brained developer.
By @vanjajaja1 - 3 months
> If, after a few days, you can't actually implement the feature, think of what groundwork, infrastructure, or refactoring would need to be done to enable it. Use this method to implement that, then come back to the feature

Really good; this is key. Building a 'vocabulary' of tools and sticking to it will keep your velocity high. Many big tech companies lose momentum because they don't.

By @Etheryte - 3 months
I really like the footnote that indirectly says that sometimes you just need to spin up a background thread to figure something out. It resonates heavily with my experience, to the point where I feel like a lot of the value my experience brings is identifying this class of problems faster. You stumble onto it, recognize it's the think-about-it-passively type, and move on to other things in the meanwhile. It would be easy to bang your head on it and get nowhere; sometimes you just need to let it sit for a bit.
By @halfcat - 3 months
Dan Abramov talks about WET (write everything twice) [1] as generally a good approach, primarily because you often don’t know the right abstraction up front, and a wrong abstraction is way worse than a copy/paste.

He has some good visuals that illustrate how incorrectly dependent and impossible to unwind wrong abstractions can become.

[1] https://youtu.be/17KCHwOwgms

By @rmnclmnt - 3 months
> Write everything twice

I’d say « Write everything three times », because it usually takes three versions to get it right: the first is under-engineered, the second is over-engineered, and the third is hopefully just-right-engineered.

By @hintymad - 3 months
I remember seeing somewhere a popular list of the top 10 algorithms used in systems, and it's kinda depressing to realize that the most recent algorithm on the list, the skip list, was invented roughly 30 years ago, and every single one of them is taught in an introductory data structures course. That is, we most likely do not need to study the internals of these algorithms, nor implement them in production. Over a long stretch of history, smart and selfless engineers have already encapsulated the algorithms into well-abstracted and highly optimized libraries and frameworks.

Of course, there are exceptions. ClickHouse implemented dozens of variations of hash tables just to squeeze out as much performance as possible. The algorithms used in ClickHouse came from many recent papers that are heavy and deep on math, which few people can even understand. That said, that's the exception rather than the norm.

Don't get me wrong. Having a stable list of algorithms is arguably a hallmark of modern civilization, and everyone benefits from it. It's just that I started studying CS in the early 2000s, and at that time we still studied Knuth because knowing algorithms in depth was a core advantage for ordinary programmers like me.
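
The skip list mentioned above does fit comfortably in a classroom exercise, which is part of why these classics now live in libraries. A minimal sketch (search and insert only, no deletion; class and method names are mine), assuming the standard coin-flip level scheme:

```python
import random

class _Node:
    def __init__(self, key, level):
        self.key = key
        self.forward = [None] * level  # next node at each level

class SkipList:
    """Minimal skip list: sorted keys with expected O(log n) search."""
    MAX_LEVEL = 8

    def __init__(self):
        self.head = _Node(None, self.MAX_LEVEL)  # sentinel head node
        self.level = 1

    def _random_level(self):
        # Coin flips: each extra level kept with probability 1/2.
        lvl = 1
        while lvl < self.MAX_LEVEL and random.random() < 0.5:
            lvl += 1
        return lvl

    def search(self, key):
        node = self.head
        # Descend level by level, moving right while keys stay smaller.
        for i in reversed(range(self.level)):
            while node.forward[i] is not None and node.forward[i].key < key:
                node = node.forward[i]
        node = node.forward[0]
        return node is not None and node.key == key

    def insert(self, key):
        # Record the rightmost node visited at each level ("update" path).
        update = [self.head] * self.MAX_LEVEL
        node = self.head
        for i in reversed(range(self.level)):
            while node.forward[i] is not None and node.forward[i].key < key:
                node = node.forward[i]
            update[i] = node
        lvl = self._random_level()
        self.level = max(self.level, lvl)
        new = _Node(key, lvl)
        for i in range(lvl):  # splice the new node in at each of its levels
            new.forward[i] = update[i].forward[i]
            update[i].forward[i] = new

sl = SkipList()
for k in [5, 1, 9, 3]:
    sl.insert(k)
print(sl.search(3), sl.search(4))  # membership checks
```

The randomized levels are what make it probabilistically balanced without any rebalancing code, which is exactly the kind of subtlety a production library handles (and tunes) so application code doesn't have to.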

By @layer8 - 3 months
By @justinl33 - 3 months
> start over each day

This reminds me of "spaced repetition" in learning theory. Drilling the same problem from scratch is a great way to get better at iterating through your rolodex of mental models, but so many people prioritize breadth because they think it is the only way to generalize to new problems.
By @ww520 - 3 months
I usually won't rewrite the whole thing twice, but I will rewrite parts of it multiple times. At the very least, the second time around I format things and add comments to make them easier to understand. Code should be written for comprehension.
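
A toy before/after in the spirit of that comment (the example is mine, not the commenter's): the first pass is correct but opaque; the second pass keeps the behavior and rewrites for comprehension.

```python
def f(xs):
    # First pass: works, but the intent is buried in slicing tricks.
    return sorted(set(xs))[-3:][::-1]

def top_three_unique(values):
    """Second pass: same behavior, written to be understood.

    Deduplicate, then return the (up to) three largest values, biggest first.
    """
    unique = set(values)
    return sorted(unique, reverse=True)[:3]

print(f([1, 5, 2, 5, 3]))                # [5, 3, 2]
print(top_three_unique([1, 5, 2, 5, 3])) # [5, 3, 2]
```

Nothing about the logic changed; only the names and structure did, which is the cheap half of "write everything twice."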
By @DrScientist - 3 months
I find the following approach quite useful.

1. First, write down a bunch of ideas for how I might tackle the problem, including lists of stuff I might need to find out.

2. Look at ways to break the task down into chunks that are complete-able in a session.

3. Implement, in a way that the code is always 'working' at the end of a session.

4. Always do a brain dump into a comment/readme at the end of the session - to make it easy to get going again.

By @aleph_minus_one - 3 months
> Another heuristic I've used is to ask someone to come up with a solution to a problem. Maybe they say it'll take 4 weeks to implement. Then I say "gun to your head, you have to finish in 24 hours, what do you do?"

Pretend to be capable of doing this, and in the short moment when the other person is not attentive, grab the gun and kill him/her. This satisfies the stated criteria:

> The purpose here is to break their frame and their anchoring bias. If you've just said something will take a month, doing it in a day must require a radically different solution.

> The purpose of the thought experiment isn't to generate the real solution.

:-)

---

Lesson learned from this: if you can't solve the problem that the manager asks you for, a solution is to kill the manager (of course you should plan this murder carefully so that you don't become a suspect).

:-) :-) :-)

By @gregors - 3 months
"You have 24 hrs" and "write everything twice"... they go hand in hand, don't they? You're definitely going to rewrite it if you slap code out there.
By @steve918 - 3 months
I like the "gun to the head" heuristic, but I would probably rephrase it to something like "you have only 24 hours to solve this, or the world will come to an end".
By @mgaunard - 3 months
Most software has a finite lifetime of a few years. You rewrite everything eventually.

What you should be worried about is the code that hasn't been rewritten in ten years.

By @mempko - 3 months
"Write everything twice" is a great heuristic. Extreme programming and unit tests are a dumb and wasteful technique; you end up cornering yourself.