I was writing a function in C++ the other day and it occurred to me that the compiler could do a lot more to help me guard against mistakes. The essentials of my code were like this -
#include <cassert>

void method(SomeType* p)
{
    assert(p != 0);
    p->something();
}
And it was called like this
SomeType* p = NULL;
if (SomeCondition)
{
    p = some_real_value;
}
method(p);
Clearly it's possible for p to be null at run time, and therefore for the assertion in the method to fail in a debug build. My mistake.
However it seems possible that the compiler could have caught this at compile time and issued a warning saying that it has detected a possibility that the assertion could be violated.
Ok, this is a simple case, and it would be fairly easy for the compiler to spot that the pointer could be NULL at that point, based on some flow analysis of the program and tracking of the possible ranges of variables at each point.
I know that for many asserts it would likely be too difficult to determine whether they could be violated, but even if the compiler could only tell me about a small number of cases where I've written code that might violate an assertion, it would help make my programs that much safer.
I'm thinking that it would help with things like off-by-one errors in array indexing too, for example inside a loop:
assert(index >= 0 && index < array_size);
I'm thinking that in many cases the compiler could prove at compile time that the index variable could possibly be outside those bounds and issue a warning.
I realise that this is likely to be far too much work for a compiler to do normally, but perhaps there are some tools that can perform this kind of analysis? I've not been able to find anything with Google, but I was wondering if anything of this kind exists? Or is it just too hard to do well enough to be useful, perhaps?