No! Debuggers make your code worse!
Let me support this rash statement by telling you a little tale of first-hand experience in which I accidentally learnt something profound.
I took on a contract as a Delphi coder, and my first assigned task was to write a template engine conceptually similar to a reporting engine - using Java, a language with which I was unfamiliar.
Bizarrely, the employer was quite happy to pay me contract rates to spend months becoming proficient with a new language, but wouldn't pay for books or debuggers. I was told to download the compiler and learn using online resources (Java Trails were pretty good).
The golden rule of arts and sciences is that whoever has the gold makes the rules, and I proceeded as instructed. I got my editor macros rigged up so I could launch a compile with a single keystroke, and I used regexes to parse the compiler output and put my cursor on the reported location of compile errors, and so I had a little IDE with everything but a debugger.
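To make the "parse the compiler output with regexes" part concrete, here is a minimal sketch in Java itself. It assumes the modern `javac` diagnostic format (`File.java:line: error: message`); the class and method names are my own invention, and a real setup would live in editor macros rather than a Java program.

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class ErrorLocator {
    // Capture groups: file name, line number, optional severity, message.
    // Assumes javac's "Foo.java:12: error: message" diagnostic format.
    static final Pattern DIAGNOSTIC =
        Pattern.compile("^(\\S+\\.java):(\\d+): (?:(error|warning): )?(.*)$");

    /** Returns {file, line, message} for a diagnostic line, or null if it isn't one. */
    static String[] parse(String line) {
        Matcher m = DIAGNOSTIC.matcher(line);
        if (!m.matches()) return null;
        return new String[] { m.group(1), m.group(2), m.group(4) };
    }

    public static void main(String[] args) {
        String[] hit = parse("Foo.java:42: error: ';' expected");
        // An editor macro would open hit[0] and move the cursor to line hit[1].
        System.out.println(hit[0] + ":" + hit[1] + " -> " + hit[2]);
    }
}
```

The editor-side half is just "open file, go to line"; the regex does all the work of turning a wall of compiler output into a clickable error list.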
To trace my code I used the good old-fashioned technique of inserting writes to the console that logged my position in the code and the state of any variables I cared to inspect. It was crude, it was time-consuming, it had to be pulled out once the code worked, and it sometimes had confusing side-effects (e.g. forcing initialisation earlier than it would otherwise have occurred, resulting in code that only worked while the trace was present).
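For readers who have never worked this way, a minimal sketch of the technique in Java might look like the following. The class, the `trace` helper, and the template method are all hypothetical illustrations, not code from the project; the one real trick is routing trace output to `System.err` so it can be stripped out (or silenced with one flag) without touching the program's actual output.

```java
public class TraceDemo {
    // One switch so every trace line can be silenced (and later deleted) together.
    static final boolean TRACE = true;

    // Logs a position in the code plus any state worth inspecting.
    static void trace(String where, Object... state) {
        if (!TRACE) return;
        StringBuilder sb = new StringBuilder("[TRACE] ").append(where);
        for (Object s : state) sb.append(" | ").append(s);
        System.err.println(sb);
    }

    // A toy stand-in for a template-engine method, instrumented by hand.
    static String fillTemplate(String template, String name) {
        trace("fillTemplate:enter", "template=" + template, "name=" + name);
        String result = template.replace("{name}", name);
        trace("fillTemplate:exit", "result=" + result);
        return result;
    }

    public static void main(String[] args) {
        System.out.println(fillTemplate("Hello, {name}!", "world"));
    }
}
```

Running it prints the trace lines to stderr and `Hello, world!` to stdout. The pain of sprinkling, reading, and later removing these lines is exactly the pressure that pushes methods toward being small and deterministic.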
Under these conditions my class methods grew shorter and ever more sharply defined, until typically each one did exactly one well-defined operation. They also tended to be specifically designed for easy testing, with simple and completely deterministic output, so I could test them independently.
The long and the short of it is that when debugging is more painful than designing, the path of least resistance is better design.
What turned this from an observation into a certainty was the success of the project. Suddenly there was budget, and I had a "proper" IDE with an integrated debugger. Over the course of the next two weeks I noticed a reversion to prior habits, with "sketch" code made to work by iterative refinement in the debugger.
Having noticed this, I recreated some earlier work using a debugger in place of thoughtful design. Interestingly, working without the debugger had slowed development only slightly, and the code it produced was of vastly better quality, particularly from a maintenance perspective.
Don't get me wrong: there is a place for debuggers. Personally, I think that place is in the hands of the team leader, to be brought out in times of dire need to figure out a mystery, and then taken away again before people lose their discipline.
People won't want to ask for it, because that would be an admission of weakness in front of their peers; and the act of explaining the need and the surrounding context may well induce peer insights that solve the problem - or, better yet, designs free of the problem altogether.