It's common knowledge that in programming you spend most of your time maintaining code and fixing problems rather than adding new features. Often, via debugging, you find what the issue is, but don't know how to fix it. Other times you encounter a Heisenbug or a Loch Ness Monster. But usually you're trying to find something in a large codebase, and you have no idea where to look.
In general, these are the steps (your mileage may vary) when you need to fix a code-related issue:
1. Gather information about the issue.
2. Think about how the issue might reproduce and try reproducing it.
3. Make a hypothesis - what could cause the bug to appear?
4. Go through your code and try to find where it's broken, revising your hypothesis along the way.
5. Loop steps 3 and 4 until you find that piece of broken code.
6. Figure out a solution. That can be a hard problem on its own.
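The hypothesize-search-refine loop at the heart of this process can be sketched in code. This is only an illustration; `hypothesize`, `locate` and `refine` are hypothetical stand-ins for the actual investigation work you'd do by hand:

```python
def debug_issue(issue, hypothesize, locate, refine, max_rounds=10):
    """Drive the hypothesize -> search -> refine loop until a culprit is found."""
    hypothesis = hypothesize(issue)
    for _ in range(max_rounds):
        culprit = locate(hypothesis)      # step through the suspect code
        if culprit is not None:
            return culprit                # last step: now figure out a fix
        hypothesis = refine(hypothesis)   # wrong guess; adjust and retry
    return None                           # out of ideas: gather more information

# Toy usage: the "bug" is found once the third hypothesis is checked.
found = debug_issue(
    issue="crash on save",
    hypothesize=lambda issue: 0,
    locate=lambda h: "off-by-one in save loop" if h == 2 else None,
    refine=lambda h: h + 1,
)
print(found)  # off-by-one in save loop
```

The point of writing it down like this is that the loop is explicit: each round either finds the broken code or produces a better hypothesis, instead of poking around at random.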
Common knowledge is not that common
I find that a surprising number of programmers I've worked with don't know how to use a systematic approach for steps 3-5.
They don't have a proper debugging workflow. Hell, some of them don't have a workflow, period.
People don't know how to trace their code or use breakpoints and watches. Instead they rely on random prints with
Console.WriteLine statements, or some language equivalent.
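To illustrate the difference with a contrived Python example (the `average` function is made up): the commented-out prints are the scatter-and-rerun approach, while a single `breakpoint()` call (built into Python 3.7+) pauses execution so you can inspect any variable and step line by line.

```python
def average(values):
    # The scatter-prints approach: add a print, rerun, add another, rerun...
    # print("DEBUG values:", values)
    # print("DEBUG total:", sum(values))

    # The debugger approach: pause here once, then inspect anything,
    # evaluate expressions, and step through the code interactively.
    # breakpoint()  # uncomment to drop into pdb

    return sum(values) / len(values)

print(average([3, 4, 8]))  # 5.0
```

With prints you commit in advance to what you think is worth looking at; with a breakpoint you decide while the program is paused.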
For entry-level developers this is normal and, frankly, expected - they don't have the experience. The more bugs you encounter and fix, the better you get at it. Sadly, I've also seen an inability to debug code in developers with multiple years of professional development under their belt.
Herein lies the problem: developers hinder themselves by not knowing how to properly combat issues. This costs time and money, strains nerves, and in the long run can add up to health problems and breakdowns. It's plainly frustrating to search for a bug for 3+ consecutive days. All of this negatively affects programmers.
How do we fix this? My suggestion is to introduce debugging courses and make them mandatory in Computer Science, Software Engineering and other programming-related majors. Everybody needs to know how to solve problems.
How would classes help, if debugging is learned with practice?
I was introduced to the art of debugging in a 6th grade general computing after-school course. It turned little whiz-kids into programming gods using Turbo Pascal over the course of several years. Whenever I ran into trouble, my teachers would show me how to trace my code and find my errors - what Step In and Step Over do, how to use watches and so on.
This gave me an advantage in my later school and university years, where my classmates would struggle without knowing debugging concepts. After someone explained and demonstrated tracing, my classmates would do better on their homework. Teachers in both school and university tried to incorporate debugging into their projects and homework, but it wasn't really a focus. It was more along the lines of "If you don't know why your program doesn't work, just debug it - like this".
Yes, debugging is learned with practice. You may have spent years and years programming and still find new tricks, like remote debugging Phonegap apps using Safari.
Debugging classes help with exposure - the more programmers know the basics, the higher-quality code our industry can produce. We need to teach developers to handle problems more efficiently, and this translates into personal life as well. Oh, something went wrong? Now I have a systematic approach to finding the cause and fixing it.
Furthermore, such a course could also be called "Solving Problems 101", presenting abstract patterns and concepts, and could be taught even in high school. Everybody benefits from having patterns for dealing with issues. All science fields, especially Engineering and Mathematics, are particularly amenable to pattern-based problem resolution.
So what's next?
Go ahead and learn more about debugging. Find new tips and tricks, patterns, workflow shortcuts. In the upcoming posts I'll delve into how to use certain IDE tools, how to trace a program and how to use patterns to locate faulty code.
Meanwhile, I've outlined some rough ideas about what a debugging course should contain.
Check out the list, read up on any unfamiliar topics and work on sharpening your skills. We all benefit from that.
Sample debugging course contents
- How code is actually executed - line by line, callbacks, etc. (usually covered in some way by other programming courses)
- Call stack
- Types of breakpoints
  - Exception breakpoints
  - Adding breakpoints to library functions
  - Breakpoints in low-level debugging, e.g. assembly
- How to use stepping
  - Step in
  - Step over
  - Step out
- Handling data
  - REPL / expression execution
- Debugging multithreaded/multiprocess apps
- Post-mortem debugging
  - Logging
    - to files
    - to remote services
  - Remote logging
  - Crash reports
  - Crash dumps
- Bug reports
  - Reading and submitting
- Best practices and patterns
  - Locating issues in unfamiliar code
  - Top-down/recursive approach to finding bugs in modular systems
  - Finding information about unfamiliar topics
  - When to actually use random print statements
- And most importantly, lots of practice and real-world examples in different IDEs and languages.
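As a taste of one of the topics above, here is a minimal sketch of post-mortem debugging with Python's built-in pdb module; `parse_price` is a made-up function used only for illustration:

```python
# Post-mortem debugging: instead of rerunning with prints, you inspect
# the stack of the crash itself, with all local variables still intact.
import pdb
import traceback

def parse_price(raw):
    return float(raw)  # raises ValueError on malformed input

try:
    parse_price("12,50")  # comma instead of dot - a classic locale bug
except ValueError:
    traceback.print_exc()
    # In an interactive session, pdb.post_mortem() would now drop you
    # into the crashed frame, where `raw` is still there to inspect:
    # pdb.post_mortem()
```

The same idea scales up to crash dumps: a process that died in production can be opened in a debugger later, long after the crash happened.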
TLDR: We need to expose every programmer, as early as possible, to basic and advanced debugging concepts, so we can do better as an industry and as individuals.
Update: To clarify, I'm not berating or dismissing console logging for debugging. It has its purposes - in embedded programming, when your IDE cannot properly map source code to stack frames, or when breakpoints would actually hinder your progress, among other cases. Using it where better methods are available, or randomly, is what I consider bad.
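For those legitimate cases, the deliberate alternative to random prints is a proper logging facility. A small sketch using Python's standard logging module (the `charge` function is invented for illustration): messages get levels that can be filtered, carry context, and can stay in production code.

```python
import logging

# Configure once at startup; raise the level to WARNING in production
# to silence the debug chatter without touching the call sites.
logging.basicConfig(
    level=logging.DEBUG,
    format="%(asctime)s %(levelname)s %(name)s: %(message)s",
)
log = logging.getLogger("payment")

def charge(account, amount):
    log.debug("charging account=%s amount=%s", account, amount)
    if amount <= 0:
        log.warning("rejected non-positive amount=%s", amount)
        return False
    return True

charge("acct-1", 10.0)   # logged at DEBUG level
charge("acct-1", -5.0)   # logged at WARNING level
```

Unlike a stray print, these calls don't need to be hunted down and deleted before shipping - you just turn the level down.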