February 12th, 2025

Do You Use a Debugger?

Sandor Dargo's blog post reflects on his journey with debugging in C++, highlighting the transition from using debug statements to employing debuggers due to increasing code complexity in new job environments.


The blog post by Sandor Dargo discusses the use of debuggers in C++ programming, reflecting on the author's personal journey with debugging practices. Initially, Dargo did not use a debugger due to a lack of knowledge and relied on debug statements to trace code execution. Over time, as he became more proficient in writing clean code and implementing test-driven development, he found that debug statements were often sufficient. However, after changing jobs and encountering a more complex codebase, he realized that relying solely on tests and debug statements was inadequate. The complexity of the new system necessitated the use of a debugger to effectively understand and resolve bugs. Dargo emphasizes the importance of adapting to the demands of different codebases and acknowledges that while some developers may not use debuggers, the intricacies of modern software often require their use for efficient debugging.

- The author initially avoided using a debugger, relying on debug statements instead.

- As coding skills improved, the author found debug statements sufficient for simpler codebases.

- A transition to a more complex codebase highlighted the limitations of not using a debugger.

- The post underscores the necessity of adapting debugging techniques to the complexity of the code being worked on.

- The author reflects on the evolving nature of software development and debugging practices.

5 comments
By @PaulHoule - 2 months
Plenty. The difference is I'm using commercial tools (PyCharm, WebStorm, and IntelliJ) that have visual debugging that works. Now gdb from the command line is great for what it is, but a lot of people who use open source toolchains find that "it just doesn't work" when it comes to debugging.

Myself, I think tests and debugging go together like peanut butter and jelly. That is, you use tests to set up a situation you want to debug, then you run the debugger. When you're all done you have a test that proves the bug was fixed.

It might be less of a problem in the C++ world but I really like being able to use the debugger to investigate a situation without touching the source code at all, since if you start hacking on the source code to add printfs and add other debugging hacks you often wind up checking in things accidentally that you shouldn't have, particularly when you're also making real changes to fix the bug.

An arXiv paper [1] investigates those accidental commits and reveals that people do a lot of printf debugging of async callbacks, since those can be hard to debug; they breakpoint just fine in most environments, but you never really know whether they are going to run, and I think a lot of times they get called N times for harmless things before you get the one you want to break on. (Maybe it's just me, but I've never had good luck with conditional breakpoints.)

[1] https://arxiv.org/abs/2501.09892
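For reference, the "called N times before the interesting one" case is what gdb's conditional breakpoints and ignore counts are meant to handle; a sketch of the commands (with a hypothetical callback name and hit count):

```
(gdb) break my_callback if call_count == 42   # stop only when the condition holds
(gdb) ignore 1 41                             # or: skip the first 41 hits of breakpoint 1
(gdb) run
```

Whether this works well in practice depends on the condition being cheap to evaluate and on there being a variable (like `call_count` here) that distinguishes the interesting invocation.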

By @sevensor - 2 months
Very language dependent. Long build times mean you may need a debugger. Debugging prints make you think harder in advance and can actually force you to structure your code with greater clarity. The fact that you have to understand what they mean across two different contexts, writing and running, slows things down but also improves your mental picture of the codebase. Debugging prints don’t get enough credit.
By @marssaxman - 2 months
I used debuggers all the time when I used to work on interactive GUI applications. I am sure I could still remember how, if I had a need, but in the meantime I've spent years working on hardware drivers, embedded firmware, distributed systems, ML pipelines, and other such environments where debuggers would be unhelpful or impossible. Mostly I debug by looking at logs, and by thinking over their implications; you get used to it.

Debugging is not really about the software, after all; it's a process of discovering the limits of your own understanding and identifying the flaws in your mental models. You make hypotheses and test them, and it doesn't matter so much what instruments you use to perform the tests, so long as you know how to think about the process systematically.

By @eternityforest - 2 months
Ability to use the debugger is one of my top priorities; I will often redesign an entire plan if it breaks the debugger (either because it uses DSLs, or it spans too many languages, or it's not runnable locally on the desktop at all).

Same for version control, linting, and formatting: I pretty much ignore technologies that don't integrate with all the modern dev tools most of us expect to have.