views: 188
answers: 6

I am writing a cross-platform C++ program for Windows and Unix. On the Windows side, the code compiles and runs without problems. On the Unix side, it compiles, but when I try to run it I get a segmentation fault. My initial hunch is that there is a problem with pointers.

What are good methodologies to find and fix segmentation fault errors?

P.S. to moderators: if you feel this question is too subjective, I give permission to change it to a CW.

+8  A: 

Compile the application with the -g option so the binary contains debug symbols. Then run gdb from the console, load your application, execute the run command to start it, and do whatever triggers the segmentation fault. Once it happens, type bt in the gdb console to get a stack trace at the point of the segmentation fault.
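For illustration, here is a minimal crashing program together with the build and gdb commands described above (the file name crash.cpp is just a placeholder):

```cpp
// crash.cpp -- a deliberately broken example to demonstrate the gdb workflow.
// Build with debug symbols:   g++ -g crash.cpp -o crash
// Debug:                      gdb ./crash
//   (gdb) run     <- reproduce the segmentation fault
//   (gdb) bt      <- print the stack trace at the point of the crash
#include <cstdio>

void broken(int* p) {
    *p = 42;                     // writes through a null pointer -> SIGSEGV on Unix
}

int main() {
    int* p = 0;                  // never points to valid memory
    broken(p);
    std::printf("never reached\n");
    return 0;
}
```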

Svisstack
+1  A: 

On Unix you can use valgrind to find issues. It's free and powerful. If you'd rather do it yourself you can overload the new and delete operators to set up a configuration where you have 1 byte with 0xDEADBEEF before and after each new object. Then track what happens at each iteration. This can fail to catch everything (you aren't guaranteed to even touch those bytes) but it has worked for me in the past on a Windows platform.
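A rough sketch of that guard-word idea, using a 4-byte 0xDEADBEEF pattern (as the comment below points out, the pattern is 4 bytes, not 1). The Header layout and its checks are invented for illustration and ignore several real-world details:

```cpp
// guarded_new.cpp -- simplified sketch of guard words around each allocation.
// It ignores operator new[]/delete[], the nothrow variants, thread safety and
// strict alignment guarantees, so treat it as an illustration, not a drop-in tool.
#include <cassert>
#include <cstdlib>
#include <cstring>
#include <new>

namespace {
    const unsigned int GUARD = 0xDEADBEEF;

    struct Header {          // stored immediately before the user's block
        std::size_t  size;   // needed so delete can find the back guard
        unsigned int guard;  // front guard word
        unsigned int pad;
    };
}

void* operator new(std::size_t size) {
    unsigned char* raw = static_cast<unsigned char*>(
        std::malloc(sizeof(Header) + size + sizeof(GUARD)));
    if (!raw) throw std::bad_alloc();

    Header h; h.size = size; h.guard = GUARD; h.pad = 0;
    std::memcpy(raw, &h, sizeof(h));                                  // front guard
    std::memcpy(raw + sizeof(Header) + size, &GUARD, sizeof(GUARD));  // back guard
    return raw + sizeof(Header);
}

void operator delete(void* p) {
    if (!p) return;
    unsigned char* raw = static_cast<unsigned char*>(p) - sizeof(Header);

    Header h;
    std::memcpy(&h, raw, sizeof(h));
    unsigned int back;
    std::memcpy(&back, raw + sizeof(Header) + h.size, sizeof(back));

    // If either guard was overwritten, something scribbled outside its block.
    assert(h.guard == GUARD && "front guard smashed");
    assert(back    == GUARD && "back guard smashed");
    std::free(raw);
}
```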

wheaties
well this would be 4 bytes rather than 1... but the principle is fine.
Jonas Wagner
May I link to my [non-intrusive heap debugger](http://stackoverflow.com/questions/2835416)? :-)
FredOverflow
Go for it. We're all about helping others here so anything that can help should be added.
wheaties
+6  A: 

Sometimes the crash itself isn't the real cause of the problem; perhaps the memory got smashed at an earlier point, but it took a while for the corruption to show itself. Check out valgrind, which has lots of checks for pointer problems (including array bounds checking). It'll tell you where the problem starts, not just the line where the crash occurs.
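As an illustration, here is the kind of bug where valgrind flags the overflow itself rather than the later crash (the build and valgrind invocation in the comments are typical, but flags may vary):

```cpp
// overflow.cpp -- the out-of-bounds write corrupts the heap long before any crash.
// A typical run:   g++ -g overflow.cpp -o overflow
//                  valgrind ./overflow
// valgrind reports an invalid write at the line of the overflow itself.
#include <cstring>
#include <iostream>

int main() {
    char* name = new char[8];
    std::strcpy(name, "this string is far too long");  // writes past the 8-byte buffer
    std::cout << name << std::endl;                    // may appear to work for a while
    delete[] name;                                     // heap metadata may already be smashed
    return 0;
}
```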

paleozogt
+4  A: 

Before the problem arises, try to avoid it as much as possible:

  • Compile and run your code as often as you can. It will make it easier to locate the faulty part.
  • Try to encapsulate low-level / error-prone routines so that you rarely have to work directly with memory (pay attention to the design of your program).
  • Maintain a test suite. Having an overview of what currently works, what no longer works, etc. will help you figure out where the problem is (Boost.Test is one possible solution; I don't use it myself, but its documentation can help you understand what kind of information a test suite should report). A minimal sketch follows this list.
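A minimal Boost.Test suite, using the header-only variant, might look like the following; the function greet and the test names are hypothetical:

```cpp
// tests.cpp -- minimal Boost.Test example (header-only variant).
#define BOOST_TEST_MODULE MySuite
#include <boost/test/included/unit_test.hpp>

#include <string>

// Hypothetical function under test.
std::string greet(const std::string& who) { return "hello " + who; }

BOOST_AUTO_TEST_CASE(greet_appends_name)
{
    BOOST_CHECK_EQUAL(greet("world"), "hello world");
}

BOOST_AUTO_TEST_CASE(greet_handles_empty_input)
{
    BOOST_CHECK_EQUAL(greet(""), "hello ");
}
```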

Use appropriate tools for debugging. On Unix:

  • GDB can tell you where your program crashes and will let you see in what context.
  • Valgrind will help you detect many memory-related errors.
  • With GCC you can also use mudflap. It can detect some errors that Valgrind doesn't, and its performance overhead can be lower (a build sketch follows this list).
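A possible mudflap build for a small off-by-one bug might look like this; the flags are quoted from memory, so check the documentation of your GCC version:

```cpp
// stack_overrun.cpp -- a stack-array overrun, which plain valgrind/memcheck
// usually misses but mudflap's bounds checking can flag.
// A possible build (verify the flags for your GCC version):
//     g++ -g -fmudflap stack_overrun.cpp -lmudflap -o stack_overrun
int main() {
    int values[4] = { 0, 1, 2, 3 };
    int sum = 0;
    for (int i = 0; i <= 4; ++i)   // off-by-one: reads values[4]
        sum += values[i];
    return sum;
}
```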

Finally, I would recommend the usual things: the more readable, maintainable, clear and neat your program is, the easier it will be to debug.

Ugo
+1  A: 

I don't know of any methodology for fixing things like this, and I don't think it would be possible to come up with one, because the very issue at hand is that your program's behavior is undefined (I don't know of any case where a SEGFAULT wasn't caused by some sort of UB).

There are all kinds of "methodologies" to avoid the issue before it arises. One important one is RAII.
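As a concrete example of RAII, here is a small sketch (pre-C++11 style) where the destructor, not the programmer, releases the resource; FileHandle and the file name are invented for illustration:

```cpp
// RAII sketch: the resource is released in the destructor, so every exit
// path (normal return or exception) cleans up, and nothing dangles.
#include <cstdio>
#include <stdexcept>
#include <string>

class FileHandle {
public:
    explicit FileHandle(const std::string& path)
        : file_(std::fopen(path.c_str(), "r"))
    {
        if (!file_) throw std::runtime_error("cannot open " + path);
    }
    ~FileHandle() { std::fclose(file_); }   // always runs when the object goes out of scope

    std::FILE* get() const { return file_; }

private:
    FileHandle(const FileHandle&);            // non-copyable: copying would double-close
    FileHandle& operator=(const FileHandle&);

    std::FILE* file_;
};

// Usage: no matter how read_config exits, the FILE* is closed exactly once.
void read_config() {
    FileHandle cfg("settings.ini");           // hypothetical file name
    // ... use cfg.get() ...
}                                             // cfg's destructor closes the file here
```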

Besides that, you just have to throw your best psychic energies at it.

Noah Roberts
A: 

Yes, there is a problem with pointers. Very likely you're using one that's not initialized properly, but it's also possible that you're messing up your memory management with double frees or some such.

To avoid uninitialized pointers as local variables, try declaring them as late as possible, preferably (and this isn't always possible) when they can be initialized with a meaningful value. Convince yourself, by examining the code, that they will have a value before they're used. If you have difficulty with that, initialize them to a null pointer constant (usually written as NULL or 0) and check them before use.

To avoid uninitialized pointers as member values, make sure they're initialized properly in the constructor, and handled properly in copy constructors and assignment operators. Don't rely on an init function for memory management, although you can for other initialization.
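A sketch of what that looks like for a class that owns a raw pointer, following the rule of three (the class and member names are invented):

```cpp
#include <algorithm>   // std::swap
#include <string>

class Widget {
public:
    Widget() : name_(new std::string("unnamed")) {}        // member initialized, never left dangling
    Widget(const Widget& other)
        : name_(new std::string(*other.name_)) {}           // deep copy, not a shared pointer value
    Widget& operator=(Widget other) {                        // copy-and-swap handles self-assignment
        std::swap(name_, other.name_);
        return *this;                                        // 'other' deletes the old pointer on exit
    }
    ~Widget() { delete name_; }

private:
    std::string* name_;   // owned; a smart pointer or a plain member would be simpler still
};
```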

If your class doesn't need copy constructors or assignment operators, you can declare them as private member functions and never define them. That will cause a compiler error if they're explicitly or implicitly used.
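In pre-C++11 code that idiom looks like the following (C++11 and later would use deleted functions instead); Connection is a made-up example:

```cpp
class Connection {
public:
    Connection() : socket_fd_(-1) {}
    // ...
private:
    // Declared but never defined: any attempt to copy fails to compile
    // (or, if copied from within the class, fails to link).
    Connection(const Connection&);
    Connection& operator=(const Connection&);

    int socket_fd_;
};
```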

Use smart pointers when applicable. The big advantage here is that, if you stick to them and use them consistently, you can completely avoid writing delete and nothing will be double-deleted.
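For example, with a reference-counted smart pointer such as boost::shared_ptr (std::tr1::shared_ptr or C++11's std::shared_ptr behave the same way), no delete appears anywhere and the object is destroyed exactly once:

```cpp
#include <boost/shared_ptr.hpp>   // or <tr1/memory> / <memory>, depending on your toolchain
#include <iostream>

struct Session {
    Session()  { std::cout << "session opened\n"; }
    ~Session() { std::cout << "session closed\n"; }
};

int main() {
    boost::shared_ptr<Session> a(new Session);
    boost::shared_ptr<Session> b = a;   // both share ownership; the reference count is now 2
    a.reset();                          // nothing is destroyed yet
    return 0;                           // last owner (b) goes away: destroyed exactly once
}
```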

Use C++ strings and container classes whenever possible, instead of C-style strings and arrays. Consider using .at(i) rather than [i], because that will force bounds checking. See if your compiler or library can be set to check bounds on [i], at least in debug mode. Segmentation faults can be caused by buffer overruns that write garbage over perfectly good pointers.
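A small illustration of the difference: with [i] the overrun is undefined behavior, while .at(i) throws a catchable exception that points straight at the faulty index:

```cpp
#include <iostream>
#include <stdexcept>
#include <vector>

int main() {
    std::vector<int> v(4, 0);

    // v[10] = 1;            // undefined behavior: may corrupt memory or segfault later

    try {
        v.at(10) = 1;        // bounds-checked: throws instead of corrupting memory
    } catch (const std::out_of_range& e) {
        std::cerr << "caught: " << e.what() << std::endl;
    }
    return 0;
}
```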

Doing those things will considerably reduce the likelihood of segmentation faults and other memory problems. They will doubtless fail to fix everything, and that's why you should use valgrind now and then when you don't have problems, and valgrind and gdb when you do.

David Thornley