When I left college, I thought the key to being a great programmer was to write perfect (in both function and form) code the first time. Any problems that appeared were because I wasn't good enough, or careful enough, or couldn't think far enough ahead, or didn't spend enough time planning. If only I could work out all the problems on a whiteboard, I would be a great programmer.
I started my programming career working on a debugger. After a couple years of that, I was convinced that the most important skill for a successful programmer was debugging. I accepted that I couldn't write perfect code the first time, and neither could anyone else. Spending all my time on the whiteboard never produced working code, but it sure took a long time. Instead, I'd just write something that seemed plausible, and then debug. There was no way to write great code like I wished, but I could get really good at fixing broken code.
That worked. I was able to create shipping software, while only taking 2x or 3x as long as planned. I was able to fix almost all of the important bugs before shipping, without introducing too many other bugs along the way. (I wince to write that!)
Then I read Fowler's Refactoring. It completely changed my thinking. It taught me that, while I couldn't write simple, clear, clean, maintainable code the first time, I could get there. The reason I couldn't get my design right on the whiteboard the first time was that you can't tell if it's right until you see it in code. Only then can you decide what's wrong and how to fix it. The key seemed to be to get into code as soon as possible, while making it as cheap & safe as possible to fix the design of your code once it's written.
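To make that concrete, here's a hypothetical before/after sketch of one of the simplest refactorings in Fowler's catalog, Extract Function (all the names here are my own invention, not from any real codebase). The behavior doesn't change at all; the design only improves once the code exists and you can see what's tangled together:

```python
# Before: one function mixes the arithmetic with the formatting,
# so neither can be inspected or tested on its own.
def invoice_line_before(qty: int, unit_price: float) -> str:
    return f"{qty} x {unit_price:.2f} = {qty * unit_price:.2f}"

# After: the total is extracted into its own, separately testable unit.
def line_total(qty: int, unit_price: float) -> float:
    return qty * unit_price

def invoice_line(qty: int, unit_price: float) -> str:
    return f"{qty} x {unit_price:.2f} = {line_total(qty, unit_price):.2f}"
```

The point isn't that the first version was unwritable on a whiteboard; it's that you only notice the two jobs hiding in one function after it's in front of you, and cheap, safe refactoring is what lets you act on that.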
That led me to Extreme Programming, where I learned that by writing well-factored code, where the classes are small, simple, and easy to understand, I could also have simple, easy-to-write, and easy-to-pass unit tests. When you code like that, a failing unit test points you to the root of the problem very quickly. When I do it right, tracking down problems is so easy that I rarely touch the debugger.
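A tiny illustration of why the failure points at the root cause (again a made-up example, not code from anywhere in particular): when the unit under test does exactly one thing, the assertion that fails names the one thing that's broken.

```python
def parse_version(text: str) -> tuple:
    """Parse a 'major.minor.patch' string into a tuple of ints."""
    major, minor, patch = text.split(".")
    return (int(major), int(minor), int(patch))

def test_parse_version():
    # If either assertion fails, there's only one small function to
    # suspect -- no stepping through layers in a debugger required.
    assert parse_version("1.2.3") == (1, 2, 3)
    assert parse_version("10.0.7") == (10, 0, 7)

test_parse_version()
```

Contrast that with a failure deep inside a large, tangled routine, where the test tells you *that* something broke but a debugging session is needed to learn *what*.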
Now, I'm no angel of code. I rarely pair program, and I don't always write my unit tests first. I blame my parents. Oh, and the fact that my employer and my tools aren't as supportive as I could wish. And sure enough, I pay for it. Then I whip the debugger out, and apply those old debugging skills, while wishing I had just written better unit tests.
There's another place where debugging is critical: in production. When you're looking at a problem in the wild, being able to diagnose it quickly in a debugger is very powerful. I can learn to write programs that are good at self-diagnosis where I can predict weakness, but for the unpredictable problems, I think I'll always need a debugger.