Hi,
I wonder why other languages do not support this feature. My understanding is that C/C++ code is platform dependent, so preprocessor directives are one way to make the same code compile and execute across various platforms. And there are many other uses besides portability: for example, you can put all your debug printf's inside #if DEBUG ... #endif, so that when you make a release build those lines of code are not compiled into the binary.
But in other languages, achieving this (the debug-stripping part) is difficult, or maybe impossible, I'm not sure. All the code gets compiled into the binary, increasing its size. So my question is: why do Java and other modern compiled languages not support this kind of feature, which lets you include or exclude pieces of code from the binary in such a handy way?
The major languages that don't have a preprocessor usually have a different, often cleaner, way to achieve the same effects.
Having a text preprocessor like cpp is a mixed blessing. Since cpp doesn't actually know C, all it does is transform text into other text. This causes many maintenance problems. Take C++, for example, where many uses of the preprocessor have been explicitly deprecated in favor of better features like:
- For constants, const instead of #define
- For small functions, inline instead of #define macros
The C++ FAQ calls macros evil and gives multiple reasons to avoid using them.
Other languages do support this feature, by using a generic preprocessor such as m4.
Do we really want every language to have its own text-substitution-before-execution implementation?
Because modern compilers are smart enough to remove dead code in almost any case, manually feeding the compiler conditional code this way is no longer necessary. That is, instead of:
#include <iostream>

#define DEBUG

int main()
{
#ifdef DEBUG
    std::cout << "Debugging...";
#else
    std::cout << "Not debugging.";
#endif
}
you can do:
#include <iostream>

const bool debugging = true;

int main()
{
    if (debugging)
    {
        std::cout << "Debugging...";
    }
    else
    {
        std::cout << "Not debugging.";
    }
}
and you'll probably get the same, or at least similar, code output.
Edit/Note: In C and C++, I'd absolutely never do this -- I'd use the preprocessor, if nothing else because it makes it instantly clear to the reader of my code that a chunk of it isn't supposed to be compiled under certain conditions. I am saying, however, that this is why many languages eschew the preprocessor.
Preprocessor directives are archaic and using them is often bad practice. Just use if (debug) {...}. Binary size is a moot point unless you're writing embedded code or something similar; 90% of the time, the compiler will remove "dead" or "unreachable" code regardless.
Note that macros/preprocessing/conditionals/etc. are usually considered a compiler/interpreter feature rather than a language feature, because they are usually completely independent of the formal language definition and might vary from one compiler implementation to another for the same language.
A situation in many languages where conditional compilation directives can be better than if-then-else runtime code is when compile-time statements (such as variable declarations) need to be conditional. For example
$if debug
array x
$endif
...
$if debug
dump x
$endif
only declares/allocates/compiles x when x is needed, whereas
array x
boolean debug
...
if debug then dump x
probably has to declare x regardless of whether debug is true.
The C pre-processor can be run on any text file, it need not be C.
Of course, if run on another language, it might tokenize in weird ways, but for simple block structures like #ifdef DEBUG, you can put that in any language, run the C pre-processor on it, then run your language specific compiler on it, and it will work.
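For example, given an input file in some made-up configuration language (invented here just to illustrate):

```
#ifdef DEBUG
log_level = verbose
#endif
greeting = hello
```

running the preprocessor with -DDEBUG keeps the log_level line, while running it without drops the line entirely; with GNU cpp, the -P option suppresses the #line markers it would otherwise insert into the output.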
Other languages also have better dynamic binding. For example, we have some code that we cannot ship to some customers for export reasons. Our C libraries use #ifdef statements and elaborate Makefile tricks (which is pretty much the same thing). The Java code uses plugins (à la Eclipse), so we simply don't ship that code.
You can do the same thing in C through the use of shared libraries... but the preprocessor is a lot simpler.
Because decreasing the size of the binary:
- Can be done in other ways (compare the average size of a C++ executable to a C# executable, for example).
- Is not that important when you weigh it against being able to write programs that actually work.
Many modern languages actually have syntactic metaprogramming capabilities that go way beyond CPP. Pretty much all modern Lisps (Arc, Clojure, Common Lisp, Scheme, newLISP, Qi, PLOT, MISC, ...), for example, have extremely powerful (Turing-complete, actually) macro systems, so why should they limit themselves to the crappy CPP-style macros, which aren't even real macros, just text snippets?
Other languages with powerful syntactic metaprogramming include Io, Ioke, Perl 6, OMeta, Converge.
A better question to ask is why C resorted to a preprocessor to implement these sorts of metaprogramming tasks in the first place. It isn't a feature so much as a compromise with the technology of the time.
The pre-processor directives in C were developed at a time when machine resources (CPU speed, RAM) were scarce and expensive. The pre-processor provided a way to implement these features on slow machines with limited memory. For example, the first machine I ever owned had 56 KB of RAM and a 2 MHz CPU. It still had a full K&R C compiler available, which pushed the system's resources to the limit, but was workable.
More modern languages take advantage of today's more powerful machines to provide better ways of handling the sorts of meta-programming tasks that the pre-processor used to deal with.