Although, as Jeffrey Kemp says, high-quality programmers produce high-quality code, you can use your build process as a way to help ensure that your programmers (of whatever quality they are) do not have lapses of concentration.
One primary method is to ensure that your compilations are done with appropriately stringent compiler warning levels. What that means depends on the language, but for C and GCC (GNU Compiler Collection), you might use:
-Werror -Wall -Wmissing-prototypes -Wstrict-prototypes
This says that any warning will be treated as an error (so the build fails until the warning is fixed), and enables warnings for a wide range of common problems. In particular, -Wmissing-prototypes insists that non-static functions have a prototype in scope before being used or defined, and that static functions have a prototype or definition in scope before being used. -Wstrict-prototypes forbids the declaration notation 'extern int function();', which in C (but not C++) means a function returning an int with an indeterminate - not empty - argument list; it also forbids K&R (non-prototype) function definitions. If you're really up to speed, consider adding -Wextra.
Other languages have other issues. In Java, you probably want to compile with the -deprecation option (and the -Xlint warnings turned on), and so on.
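A hypothetical Java example (the class and method names are mine): calling a method marked @Deprecated produces a warning under -deprecation, and adding -Werror turns that warning into a build failure, just as with GCC.

```java
// Compile with: javac -deprecation -Xlint:all -Werror Demo.java
// The call to oldApi() below triggers a deprecation warning,
// which -Werror then turns into a compilation failure.
public class Demo {
    @Deprecated
    static int oldApi(int x) {
        return x + 1;
    }

    public static void main(String[] args) {
        System.out.println(oldApi(41));
    }
}
```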
Then ensure that coding issues are fixed - and fixed properly, not by scattering casts around willy-nilly - before the build passes.
One advantage of ensuring that these problems are fixed by the very people whose changes stopped the previously clean-compiling code from compiling cleanly is that those people are too busy fixing their code to write more code that doesn't compile properly. But code review by competent, stringent and patient pedagogues is also important; the guilty parties have to be taught how to deal with the mess they make.