views: 939
answers: 19

There are a lot of homework questions here on SO.

I would guess that 90%+ can be solved by stepping through the code in a debugger, and observing program/variable state.

I was never taught to use a debugger. I simply printed out the GDB manual, read it, and stepped through its examples. When I used Visual Studio for the first time, I remember thinking, Wow! How much simpler could this get: click to set a breakpoint, mouse over a variable for the value, press a key to step, the immediate window, debug.print, etc...
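
For the curious, here is a minimal sketch of what such a first GDB session can look like; the toy program, the function name, and the exact command sequence are made up for illustration:

```c
/* buggy.c -- a toy program to practice stepping through.
 * Build with debug symbols:  gcc -g buggy.c -o buggy
 */
#include <stdio.h>

static int sum_to(int n)
{
    int total = 0;
    for (int i = 1; i <= n; i++)
        total += i;
    return total;
}

int main(void)
{
    int result = sum_to(5);
    printf("sum_to(5) = %d\n", result);
    return 0;
}

/* A first GDB session, roughly the command-line equivalents of the
 * Visual Studio actions mentioned above:
 *
 *   $ gdb ./buggy
 *   (gdb) break sum_to      # click to set a breakpoint
 *   (gdb) run
 *   (gdb) next              # press a key to step
 *   (gdb) print total       # mouse over a variable for the value
 *   (gdb) continue
 */
```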

At any rate, are students "taught" to use a debugger? If not, why not? (Perhaps a better question is, why can't they learn to use a debugger themselves... maybe they need to be told that there is such a tool that can help them...)

How long does it take to learn to use a debugger?

A: 

In high school we were taught to debug by writing stuff out to the console.

In college, we were taught a mix of that plus using a debugger.

The tools have only gotten easier to use, so I am really not sure why it is not taught.

Bryan Batchelder
+20  A: 

I don't think the problem is teaching. Using a modern graphical debugger is not rocket science (at least not for most user-mode programs running on a single computer). The problem is with the attitudes of some people. In order to use a debugger effectively, you should:

  • Admit that it's your fault and that select() isn't broken.
  • Have the perseverance to spend a couple of nights debugging, without forgetting the previous point.
  • Accept that there's no specific algorithm to follow: you have to make educated guesses and reason effectively from what you see.

Not many non-programmers have these attitudes. At college, I have seen many friends who give up after a relatively short period of time and bring me some code and tell me the computer is doing something wrong. I usually tell them I trust their computer more than them (and this hurts some feelings, but that's the way it is).

Mehrdad Afshari
Point number 1 is definitely the hardest for some of the "developers" I've met. Meh.
Erik Forbes
Yes; I've made good money off of bets like "I bet $5 that GCC isn't broken". After losing the bet to me, most people don't repeat that mistake.
Chris Arguin
@Chris good thing you were betting for GCC, not javac in the 1.2/1.3 days, otherwise you might not have made out so well.
Suppressingfire
I can't understand how anyone could fail to see that it's their fault. They wrote the code, the compiler works for everyone else; how else could the code not perform correctly?
Callum Rogers
Actually, there is an algorithm ... it's called the Scientific Method. You observe, spot something that does not jibe with expectations, form a hypothesis of what might cause it, test your hypothesis, compare with expectations, lather, rinse, repeat.
Peter Rowell
http://thinkreason.net/wp-content/uploads/2009/07/then-a-miracle-occurs-cartoon.png
Mehrdad Afshari
@Peter, there's no algorithm for forming a hypothesis though.
Matthew Crumley
@Matthew: Well, hypothetically there is. :-) You're right, of course, but this is where a little bit of mystical programmerness comes into it. Programmers seem to be good at spotting patterns, even fairly subtle ones. Given an apparent pattern you can speculate about what might cause it, which is mostly what a hypothesis is. Besides, when all else fails there's always Wolf Fence in Alaska. :-) See http://portal.acm.org/citation.cfm?id=358690.358695 for one description of it.
Peter Rowell
+4  A: 

This is a good question to ask the faculty at your school.

At my university, they gave a very brief example of debugging, then pointed us to the "help" files and the books.

Perhaps they don't teach it because there is sooo much stuff to cover and so little time for the lecturers. The professors aren't going to hold everybody's hand.

Thomas Matthews
That's too bad. If they aren't teaching debugging then they aren't teaching programming.
Jon B
@Jon B: How much programming should a college be teaching? In the Computer Science program I was in, we generally figured it was up to the student to learn to program (although we did have some introductory courses).
David Thornley
@Jon B: That's because they're not teaching programming. They're teaching Computing Science.
mbarnett
+3  A: 

I'll put in a cautionary note on the other side. I learned to program with Visual Basic and Visual C (mid 80s), and the debuggers were built-in and easy to use. Too easy, in fact... I generally didn't think about how to solve a problem; I just ran it in the debugger and adjusted the behavior. Oh, that variable is one too high... I must have to subtract one here!

It wasn't until I switched to Linux, with the not-quite-as-easy gcc/gdb combo, that I began to appreciate design and thinking about your code first.

I'll admit, I probably go too far the other way now. I use a debugger to analyze stack traces and that's about it. There should be a middle ground between analyzing the problem and stepping through it in a debugger. Certainly people should be shown all the tools available to them.

Chris Arguin
I don't make a lot of friends when I say it (what with working in a Microsoft shop), but I think tools like VS make you a worse developer. Falling back on crutches too often leaves certain kinds of thinking unexercised, until they pretty much atrophy.
Matt Briggs
@Matt I agree and disagree with the statement. I wouldn't want to say to someone "You'll let your legs atrophy if you drive instead of walk" when they need to go cross country.
Jim Leonardo
A: 
Zannjaminderson
+2  A: 

I was taught to use a debugger in college. Not much, and late (it should be almost the second thing taught), but they taught me.

Anyway, it's important to teach how to DEBUG, not only how to "use a debugger". There are situations where you can't debug with gdb (e.g. trying to debug a program running 10 concurrent threads) and you need a different approach, like the old-fashioned printf. I can certainly agree with you that one usually learns and makes use of debugging techniques much later than the first time one could use them.
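
As a minimal sketch of that old-fashioned printf approach for a threaded program (the worker function, the thread count, and the trace format here are made up for illustration):

```c
/* trace.c -- printf-style tracing when a debugger gets in the way.
 * Build with:  gcc -pthread trace.c -o trace
 */
#include <pthread.h>
#include <stdio.h>

static void *worker(void *arg)
{
    long id = (long)arg;
    /* Each trace line carries the thread id so the interleaved output
     * can still be reconstructed afterwards. */
    fprintf(stderr, "[worker %ld] starting\n", id);
    /* ... real work would go here ... */
    fprintf(stderr, "[worker %ld] done\n", id);
    return NULL;
}

int main(void)
{
    pthread_t threads[10];

    for (long i = 0; i < 10; i++)
        pthread_create(&threads[i], NULL, worker, (void *)i);
    for (int i = 0; i < 10; i++)
        pthread_join(threads[i], NULL);
    return 0;
}
```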

Khelben
+3  A: 

In my high school and university, most of the people in the classes didn't really care about programming at all.

Tyler Smith
+4  A: 

If by students you mean Computer Science students, I think the answer is fairly obvious. The subject matter for courses is generally theory, with the programming language / framework / library there as an aid. The professor can't go very far in depth on a particular tool, since it would take away from time he is teaching networking or systems or whatever. Maybe if there were a course called "Real World Programming" or something like that, they'd cover debuggers, but in general I don't see too much wrong with expecting students to read the language / tool documentation in order to accomplish the coursework.

danben
+2  A: 

From a practical standpoint, you most likely (due to policy or technical restrictions) cannot use a debugger on a production application. Not leaning on the debugger as too much of a crutch encourages you to add the proper amount of logging to your application.
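
As a rough sketch of what that logging might look like in C (the log_msg helper, the level names, and the output format are made up for illustration):

```c
/* log.c -- a bare-bones logging helper of the kind that stands in
 * for a debugger on a production box. */
#include <stdarg.h>
#include <stdio.h>
#include <time.h>

static void log_msg(const char *level, const char *fmt, ...)
{
    /* Timestamp every line so entries can be correlated later. */
    time_t now = time(NULL);
    char stamp[32];
    strftime(stamp, sizeof stamp, "%Y-%m-%d %H:%M:%S", localtime(&now));

    va_list ap;
    va_start(ap, fmt);
    fprintf(stderr, "%s [%s] ", stamp, level);
    vfprintf(stderr, fmt, ap);
    fputc('\n', stderr);
    va_end(ap);
}

int main(void)
{
    log_msg("INFO",  "processing order %d", 42);
    log_msg("ERROR", "payment gateway returned %d", 503);
    return 0;
}
```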

Chris Ballance
Is there a reason you can't generate a dump when something goes wrong? If policy prevents you from doing so, then that same policy will prevent you from doing a useful amount of logging.
Anon.
Aren't logging and debugging two separate things? Logging is there to facilitate and enhance the debugging?
roygbiv
Another reason for not using debuggers on production apps is that they're generally compiled with optimization, and that breaks the correspondences that a source-level debugger relies on.
David Thornley
@roygbiv logging exceptions and transactions properly can minimize the need for debugging. closely related in use.
Chris Ballance
Debugging dumps (and mini dumps) is something that happens VERY often on large production systems. In fact the rate of minidumps produced by a given system can be something that indicates the priority of an issue for many systems I've interacted with big and small. :-)
Dave Quick
I disagree that "you cannot use a debugger on a production application"... a lot of production problems can be reproduced on a dev. box and you can use the debugger there.
JoelFan
@JoelFan Do you always have an exact full copy of your production data and server configuration on your local dev box? Production problems cannot always be reproduced locally.
Chris Ballance
+5  A: 

Debuggers were introduced in my second year Intro to C course, if I recall correctly. Of course the problem most students were struggling with at that point was getting their work to compile, which a debugger will not help with. And once their ten line command line program compiles and then crashes, well, they already have some printfs right there. Fighting to master GDB is overkill.

In my experience, it's fairly rare to actually deal with a code base large enough to make more than a cursory familiarization with a debugger worth the time investment in most Comp. Sci curriculums. The programs are small and the problems you face are more along the lines of figuring out the time-space complexity of your algorithm.

Debuggers become much more valuable on real world projects, where you have a lot of code written by different people at different times to trace through to figure out what keeps frotzing foo before the call to bar().

mbarnett
sometimes code written by you at different times is "hard" enough to use a debugger on (it's like when you're staring at the screen and wondering who the crap wrote this!? and then you look at the header and see it was all you)
Earlz
@earlz I've played that game a few times.
Richo
A: 

There are more than a few questions here, to my mind. Here are a few that are asked and a few that I'd infer:

I was taught in BASIC and Pascal initially, usually with an interpreter that made it easier to run the program till something blew up. We didn't have breakpoints or many of the fancy things there are now for tracing through code, though this would have been from 1983-1994 using a Commodore 64, Watcom BASIC, and Pascal on a Mac.

Even in my later university years, we didn't have a debugger. If our code didn't work, we used print statements or did manual tracing; in terms of time, this would have been 1995-1997.

One caveat with a debugger is that for something like Visual Studio, do you have any idea how long it could take to go through every feature it has for debugging? That could take years in some cases, I think. This is without getting into all the build options and other things that it can do that one might use eventually. Another point is that for all the good things a debugger gives, there is something to be said for how complex things can get, e.g. at a breakpoint in VS there are the call stack, local variables, watch windows, memory, disassembly, and other things that one could want to examine while execution is halted.

The basics of using a debugger could be learned in a week or so, I think. However, to get to the point of mastering what a debugger does, how much goes on when code is executing, and where it is executing (there are multiple places where things can run these days, like GPUs alongside the CPU), would take a lot longer, and I'd question how many people have that kind of drive, even in school.

JB King
+1  A: 

I found that there is a lot of negative attitude towards debuggers among academics and seasoned systems programmers. I have come up against one quite talented programmer who claimed that "Debuggers don't work, I just use log files." Fair enough, for multi-threaded server apps you must have logging, but there's no denying a debugger is useful for the 99% of code that is not multi-threaded.

In answer to your question: yes, debuggers should be covered in a programming syllabus, but as one of the tools for debugging a program. Tracing and logging are important as well.

Igor Zevaka
+1  A: 
Dave Quick
A: 

Your question is kind of similar to "Why aren't students taught software testing?" I'm sure they do in some places, but typically universities/colleges stick to teaching the 'interesting' theoretical computer science stuff and tend not to teach the practical tools. It's like how, if you're taking English in school, they teach you how to write, not how to use MS Word (yeah, I'm sure there are some Word courses, but you get my point).

Jeremy Raymond
A: 

I wasn't taught to use a debugger in my undergraduate degree, because you cannot use a debugger on a deck of punch cards. Even print statements are a luxury if you have a 15 minute turnaround on "jobs", etcetera.

I'm not saying that people should not be taught to use debuggers. Just that it is also important to learn to debug without this aid, because:

  1. it will help you understand your code better if you don't have to rely on a debugger, and
  2. there are situations where a sophisticated debugger won't be available.

On the latter point, I can also remember debugging a boot prom on an embedded device using a (rather expensive) logic analyzer to capture what was happening on the address / data lines.

Stephen C
+1  A: 

Not entirely related, but people need to use debuggers not just for debugging but to understand working code.

JoelFan
@JoelFan - If you have to use a debugger to understand working code, that code needs to be rewritten.
Stephen C
@Stephen C - not necessarily... suppose you are new to a project and it's a huge code base... start it up in the debugger and get a feel for the flow between the different parts of the code... perfectly acceptable!
JoelFan
+2  A: 

Because there is no textbook on debugging, period.

In fact, it is very hard to create a teaching situation where students get an incentive to use a debugger. Typical assignments are too simple to really require a debugger. Greg Wilson raised that topic at last year's SUITE workshop, and the consensus was that it is very hard to get students to use a debugger. You can tell them about debugging, but creating a situation where they will actually feel the pain of resorting to the debugger is hard.

Maybe a lecture on game cracking could motivate students to step through the game with a debugger? At least, that is how I taught myself to use a debugger as a 12-year-old :)

Adrian
A: 

The same reason students aren't taught version control, or unit testing, or shell scripting, or text editing, or documentation writing, or even (beyond intro courses) programming languages. The class is about computer science, usually a single concept or family of concepts, not programming. You're expected to learn what you need.

This isn't unique to computer science. My chemistry classes (I also have a chemistry degree) didn't teach me how to use any chemistry lab equipment, either. You learned that by hanging around in the lab and watching other students and asking the grizzled old profs who hung out there.

Ken
+1  A: 

Wagering a hypothesis based on my experience as a TA and aspiring CS prof:

It would actually confuse the kids who have little to no programming experience more than it would help (bear with me here...)

Now first of all: I completely agree that teaching how to use a debugger would be a great thing, but I think the barrier to doing so stems from the greater, systematic problem that software engineering and computer science are not separate majors. Most CS programs will require 2-4 classes where learning to code is the focus. After these, coding ability is required but not the topic of the class.

To my main point: it's very hard to teach something under the guise of "you don't get this now, but do it because it'll be useful later." You can try, but I don't think it really works. This is an extension of the idea that people only really learn by doing. Going through the motions but not understanding why is not the same as doing.

I think kids learning to code for the first time aren't going to understand why using a debugger is more effective than inserting print lines. Think about a small to medium sized script you code: would you use the debugger barring odd behavior or some bug you couldn't work out quickly? I wouldn't, seems like it would just slow me down. But, when it comes to maintaining the huge project I work on every day, the debugger is invaluable beyond a doubt. By the time students get to the portion of the curriculum that requires big projects, they're not in a class that focuses on general coding anymore.

And all of this brings me to my awesome idea that I think every CS prof should use when teaching how to code: instead of exclusively asking the kids for projects, every now and then give them a big piece of complex code and ask them to fix the bugs. This would teach them how to use a debugger.

john