views: 1191

answers: 12
I believe that the usage of preprocessor directives like #if UsingNetwork is bad OO practice - other coworkers do not. I think that, when using an IoC container (e.g. Spring), components can be easily configured if programmed accordingly. In this context either a property IsUsingNetwork can be set by the IoC container or, if the "using network" implementation behaves differently, another implementation of that interface should be implemented and injected (e.g.: IService, ServiceImplementation, NetworkingServiceImplementation).
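
For illustration, a minimal sketch of that approach (the hand-rolled constructor injection stands in for whatever container is actually used, and the Client class is invented for the example):

    // The consumer depends only on the abstraction.
    public interface IService
    {
        void Send(string message);
    }

    // Plain implementation for the non-networking case.
    public class ServiceImplementation : IService
    {
        public void Send(string message) { /* handle locally */ }
    }

    // Alternative implementation with the networking behaviour.
    public class NetworkingServiceImplementation : IService
    {
        public void Send(string message) { /* handle over the network */ }
    }

    public class Client
    {
        private readonly IService _service;

        // The IoC container decides which implementation gets injected;
        // no #if is needed anywhere in the consuming code.
        public Client(IService service)
        {
            _service = service;
        }

        public void DoWork()
        {
            _service.Send("hello");
        }
    }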

Can somebody please provide citations of OO gurus or references in books which basically read "Preprocessor usage is bad OO practice if you try to configure behaviour which should be configured via an IoC container"?

I need these citations to convince my coworkers to refactor...

Edit: I do know and agree that using preprocessor directives to switch target-platform-specific code during compilation is fine, and that is what preprocessor directives are made for. However, I think that runtime configuration should be used rather than compile-time configuration to get well-designed and testable classes and components. In other words: using #defines and #if's beyond what they are meant for will lead to hard-to-test code and badly designed classes. Has anybody read something along these lines and can give me a reference I can point to?

+7  A: 

"The preprocessor is the incarnation of evil, and the cause of all pain on earth" -Robert (OO Guru)

Robert Gould
I accidentally voted up, then recognized that Robert Gould did write poetry but nothing with regard to object orientation... Now I don't have enough reputation to vote down *sigh*
tobsen
Sorry you mistook my joke. Anyways my name is Robert Gould and you're free to quote me :)
Robert Gould
Tobsen, if you click on the up or down arrow a second time, it will rescind whatever vote you had cast. And Mr. Gould, I would expect a poet to know that your != you're. Tsk!
Phantom Watson
Thanks Phantom Watson. I don't think Robert Gould is related to the poetry writer Robert Gould who died in 1709. Nevertheless, maybe Robert Gould could write a limerick containing "don't use", "preprocessor" and "IoC" ;-)
tobsen
dang, beat me to it! but i was going to pretend to quote Jon Skeet
Steven A. Lowe
+12  A: 

IMHO, you're talking about C and C++, not about OO practice in general. And C is not object-oriented. In both languages the preprocessor is actually useful. Just use it correctly.

I think this answer belongs to the C++ FAQ: [29.8] Are you saying that the preprocessor is evil?

Yes, that's exactly what I'm saying: the preprocessor is evil.

Every #define macro effectively creates a new keyword in every source file and every scope until that symbol is #undefd. The preprocessor lets you create a #define symbol that is always replaced independent of the {...} scope where that symbol appears.

Sometimes we need the preprocessor, such as the #ifndef/#define wrapper within each header file, but it should be avoided when you can. "Evil" doesn't mean "never use." You will use evil things sometimes, particularly when they are "the lesser of two evils." But they're still evil :-)

I hope this source is authoritative enough :-)

jetxee
Sorry, but I was talking about defines in C#...
tobsen
Are there #defines in C#? I didn't know that! Java does not have them (intentionally).
jetxee
Yes, there are - and today I found out that people are using them for things an Inversion of Control container should be responsible for. Here are the MSDN pages for #define http://snurl.com/alr65 and #if http://snurl.com/alr7x
tobsen
Defines in C# are the same as defines in C++ as far as evilness goes.
Brian
+1  A: 

The support for preprocessing in C# is minimal... verging on useless. Is that evil?

Is the Preprocessor anything to do with OO? Surely it's for build configuration.

For instance, I have a lite version and a pro version of my app. I might want to exclude some code from the lite version without having to resort to maintaining two very similar copies of the code.

I might not want to ship a lite version which is the pro version with different runtime flags.
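
A rough sketch of the sort of thing Tony describes, assuming a PRO conditional symbol defined only in the pro build configuration (the class and method names are invented for the example):

    public class Exporter
    {
        public void Export(string data)
        {
            SaveAsText(data);              // available in both editions
    #if PRO
            SaveAsEncryptedArchive(data);  // compiled into the pro build only
    #endif
        }

        private void SaveAsText(string data) { /* ... */ }

    #if PRO
        // This method does not even exist in the lite binary.
        private void SaveAsEncryptedArchive(string data) { /* ... */ }
    #endif
    }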

Tony

Tony Lambert
IMO one should be able to configure behaviour instead of baking it into the software via preprocessing. E.g. you built your software using #define No_Logging. You deliver your software, and suddenly you need to log. What now? Rebuild instead of reconfiguring? That's evil indeed, isn't it?
tobsen
What if you sell a lite version and a pro version... surely you don't want to ship the pro version to people who buy the lite one?
Tony Lambert
"I might not want to ship a lite version which is the pro version with different runtime flags." Sure. You just deliver the lite.dll and a configuration file which tells your IoC container to instantiate the objects from that assembly. If the user goes pro, just give him pro.dll and a changed config
tobsen
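
A rough sketch of what that comment describes, assuming the concrete type name lives in the application's config file (the factory class and the serviceType key are invented for illustration; IService is the interface from the question):

    using System;
    using System.Configuration;

    public static class ServiceFactory
    {
        // Reads the concrete type name from the config file, e.g.
        // <add key="serviceType" value="Pro.NetworkingServiceImplementation, pro" />
        // and instantiates it from whichever assembly is deployed.
        public static IService Create()
        {
            string typeName = ConfigurationManager.AppSettings["serviceType"];
            Type type = Type.GetType(typeName, true);   // true = throw if not found
            return (IService)Activator.CreateInstance(type);
        }
    }
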
+1  A: 

In c# / VB.NET I would not say its evil.

For example, when debugging Windows services, I put the following in Main so that, when in Debug mode, I can run the service as an application.

    <MTAThread()> _
    <System.Diagnostics.DebuggerNonUserCode()> _
    Shared Sub Main()


#If DEBUG Then

        'Starts this up as an application.

        Dim _service As New DispatchService()
        _service.ServiceStartupMethod(Nothing)
        System.Threading.Thread.Sleep(System.Threading.Timeout.Infinite)

#Else

        'Runs as a service.

        Dim ServicesToRun() As System.ServiceProcess.ServiceBase
        ServicesToRun = New System.ServiceProcess.ServiceBase() {New DispatchService}
        System.ServiceProcess.ServiceBase.Run(ServicesToRun)

#End If

    End Sub

This is configuring the behavior of the application, and is certainly not evil. At the very least, it's not as evil as trying to debug a service startup routine.

Please correct me if I read your OP wrong, but it seems that you are complaining about others using a preprocessor when a simple boolean would suffice. If that's the case, don't damn the preprocessor; damn those using it in such a fashion.

EDIT: Re: first comment. I don't get how that example ties in here. The problem is that the preprocessor is being misused, not that it is evil.

I'll give you another example. We have an application that does version checking between client and server on startup. In development, we often have different versions and don't want to do a version check. Is this evil?
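
For what it's worth, a minimal sketch of that version-check case (class and method names are invented for the example):

    using System;

    public static class StartupChecks
    {
        public static void EnsureCompatibleVersions(Version clientVersion,
                                                    Version serverVersion)
        {
    #if !DEBUG
            // Only release builds enforce the match; in development the
            // client and server versions frequently differ on purpose.
            if (clientVersion != serverVersion)
                throw new InvalidOperationException(
                    "Client and server versions do not match.");
    #endif
        }
    }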

I guess what I am trying to say is that the preprocessor is not evil, even when changing program behavior. The problem is that someone is misusing it. What is wrong with saying that? Why are you trying to dismiss a language feature?

StingyJack
Well, in this somewhat special case involving services it might be okay. For a more general (= not involving MS base classes ;-) ) example please read the comment I left on Lambert's post...
tobsen
Actually I'd really want to say "Preprocessor usage is bad OO practice if you try to configure behaviour which should be configured via an IoC container". I can say that. But they won't believe me until I either show them the corresponding FxCop rule or I can cite someone...
tobsen
To me this falls under the category "magic code." When it is compiled in Release mode it suddenly behaves differently... For you this is no problem, but if someone else is doing something with the code, he would not expect this behavior.
Johan
OOHHH I made "magic" =)
StingyJack
+8  A: 

Preprocessor directives in C# have very clearly defined and practical use cases. The ones you're specifically talking about, called conditional directives, are used to control which parts of the code are compiled and which aren't.

There is a very important difference between not compiling parts of code and controlling how your object graph is wired via IoC. Let's look at a real-world example: XNA. When you're developing XNA games that you plan to deploy on both Windows and XBox 360, your solution will typically have at least two platforms that you can switch between in your IDE. There will be several differences between them, but one of those differences will be that the XBox 360 platform will define a conditional symbol XBOX360 which you can use in your source code with the following idiom:

#if (XBOX360)
// some XBOX360-specific code here
#else
// some Windows-specific code here
#endif

You could, of course, factor out these differences using a Strategy design pattern and control via IoC which one gets instantiated, but the conditional compilation offers at least three major advantages:

  1. You don't ship code you don't need.
  2. You can see the differences between platform-specific code for both platforms in the rightful context of that code.
  3. There's no indirection overhead. The appropriate code is compiled, the other isn't and that's it.
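
For comparison, a rough sketch of the Strategy alternative mentioned above (the storage example and type names are invented for illustration):

    // The platform-neutral code depends only on this abstraction.
    public interface IStorageStrategy
    {
        void SaveGame(byte[] data);
    }

    public class WindowsStorage : IStorageStrategy
    {
        public void SaveGame(byte[] data) { /* Windows-specific code */ }
    }

    public class Xbox360Storage : IStorageStrategy
    {
        public void SaveGame(byte[] data) { /* XBox 360-specific code */ }
    }

    // The composition root (or IoC container) picks the implementation at
    // runtime, but both classes are compiled and shipped on both platforms,
    // which is exactly the trade-off listed in the three points above.
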
Vojislav Stojkovic
And I totally agree with you: when it comes to things which are platform-dependent, conditional directives may be a good way to handle that. But with the code I read, it was clear that behaviour was configured via preprocessor directives instead of configuration.
tobsen
Now I need material I can cite when talking to the developers and explaining why it would be better to refactor it. They will ask me "who says 'Preprocessor usage is bad OO practice if you try to configure behaviour'?" and I want to reply "besides me, OO guru xy writes about it here [insertUrlPlease]".
tobsen
Ooops, sorry. I didn't mean to preach. Your question sounded to me like a personal doubt or request for clarification; I didn't realize you were hunting for material you need to convince someone else that they were doing a bad thing.
Vojislav Stojkovic
+2  A: 

IMO it is important to differentiate between #if and #define. Both can be useful and both can be overused. My experience is that #define is more likely to be overused than #if.

I spent 10+ years doing C and C++ programming. In the projects I worked on (commercially available software for DOS / Unix / Macintosh / Windows) we used #if and #define primarily to deal with code portability issues.

I spent enough time working with C++ / MFC to learn to detest #define when it is overused - which I believe to be the case in MFC circa 1996.

I then spent 7+ years working on Java projects. I cannot say that I missed the preprocessor (although I most certainly did miss things like enumerated types and templates / generics which Java did not have at the time).

I've been working in C# since 2003. We have made heavy use of #if and [Conditional("DEBUG")] for our debug builds - but #if is just a more convenient, and slightly more efficient way of doing the same things we did in Java.
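
As an illustration of the [Conditional("DEBUG")] usage mentioned above (the logging helper is invented for the example):

    using System;
    using System.Diagnostics;

    public static class DebugLog
    {
        // Calls to this method are stripped out by the C# compiler in any
        // compilation that does not define the DEBUG symbol, so release
        // builds pay no cost for the logging calls scattered in the code.
        [Conditional("DEBUG")]
        public static void Write(string message)
        {
            Console.WriteLine("[debug] " + message);
        }
    }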

Moving forward, we have started to prepare our core engine for Silverlight. While everything we are doing could be done without #if, it is less work with #if, which means we can spend more time adding features that our customers are asking for. For example, we have a value class which encapsulates a system color for storage in our core engine. Below are the first few lines of code. Because of the similarity between System.Drawing.Color and System.Windows.Media.Color, the conditional using alias at the top (switched on the SILVERLIGHT symbol) gets us a lot of functionality in normal .NET and in Silverlight without duplicating code:

using System;
using System.Collections.Generic;
using System.Text;
using System.Diagnostics;
#if SILVERLIGHT
using SystemColor = System.Windows.Media.Color;
#else
using SystemColor = System.Drawing.Color;
#endif

namespace SpreadsheetGear.Drawing
{
    /// <summary>
    /// Represents a Color in the SpreadsheetGear API and provides implicit conversion operators to and from System.Drawing.Color and / or System.Windows.Media.Color.
    /// </summary>
    public struct Color
    {
        public override string ToString()
        {
            //return string.Format("Color({0}, {1}, {2})", R, G, B);
            return _color.ToString();
        }

        public override bool Equals(object obj)
        {
            return (obj is Color && (this == (Color)obj))
                || (obj is SystemColor && (_color == (SystemColor)obj));
        }
        ...

The bottom line for me is that there are many language features which can be overused, but this is not a good enough reason to leave these features out or to make strict rules prohibiting their use. I must say that moving to C# after programming in Java for so long helps me to appreciate this because Microsoft (Anders Hejlsberg) has been more willing to provide features which might not appeal to a college professor, but which make me more productive in my job and ultimately enable me to build a better widget in the limited time anybody with a ship date has.

Joe Erickson
I am glad that you enjoy working with C# - so do I. And as with the answer Vojislav Stojkovic posted, I do agree that #if and #defines are totally okay when used to distinguish functionality between platforms. In my case there is only one target platform. I'll edit the question again.
tobsen
+2  A: 

Using #if instead of an IoC container or some other mechanism for controlling different functionality based on configuration is probably a violation of the Single Responsibility Principle, which is key to 'modern' OO design. Here is an extensive series of articles about OO design principles.

Since the parts in the different sections of the #if by definition concern themselves with different aspects of the system, you are now coupling the implementation details of at least two different components into the dependency chain of your code that uses the #if.

By refactoring those concerns out, you have created a class that, assuming it is finished and tested, will no longer need to be cracked open unless the common code is broken.

In your original case, you'll need to remember the existence of the #if and take it into account any time any of the three components change with respect to possible side-effects of a breaking change.
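
To make the suggested refactoring concrete, a rough before/after sketch (the UsingNetwork symbol comes from the question; the transport classes are invented for illustration):

    using System;

    // Before: one class mixes both concerns and the choice is baked in at
    // compile time, so the class has more than one reason to change.
    public class DataSender
    {
        public void Send(byte[] payload)
        {
    #if UsingNetwork
            Console.WriteLine("sending {0} bytes over the network", payload.Length);
    #else
            Console.WriteLine("writing {0} bytes to the local queue", payload.Length);
    #endif
        }
    }

    // After: each concern lives in its own class behind a single interface,
    // and the IoC container decides which implementation is injected.
    public interface ITransport
    {
        void Send(byte[] payload);
    }

    public class NetworkTransport : ITransport
    {
        public void Send(byte[] payload)
        {
            Console.WriteLine("sending {0} bytes over the network", payload.Length);
        }
    }

    public class LocalQueueTransport : ITransport
    {
        public void Send(byte[] payload)
        {
            Console.WriteLine("writing {0} bytes to the local queue", payload.Length);
        }
    }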

Steve Mitcham
Like Daniel Daranas's answer, this helps me in convincing my colleagues - however I really need some reference to an important blog or, even better, a book.
tobsen
I updated my comments with a link to an extensive article that leads to other resources that should give you all the ammo you need.
Steve Mitcham
Thank you very much indeed!
tobsen
+3  A: 

One problem with the preprocessor's #ifdefs is that each one effectively doubles the number of compiled versions that, in theory, you should test thoroughly so that you can say that your delivered code is correct.

  #ifdef DEBUG
  //...
  #else
  //...
  #endif

Ok, now I can produce the "Debug" version and the "Release" version. This is ok for me, I always do it, because I have assertions and debug traces which are only executed in the debug version.

If someone comes and writes (real life example)

  #ifdef MANUALLY_MANAGED_MEMORY
  ...

And they write a pet optimization which they propagate to four or five different classes, then suddenly you have FOUR possible ways to compile your code.

If you add just one more piece of #ifdef-dependent code, you'll have EIGHT possible versions to generate (in general, n independent symbols yield 2^n configurations), and, what's more disturbing, FOUR of them will be possible release versions.

Of course runtime if()'s, like loops and whatever, create branches that you have to test - but I find it much more difficult to guarantee that every compile-time variation of the configuration remains correct.

This is the reason why I think, as a policy, all #ifdef's except the one for Debug/Release version should be temporary, i.e. you're doing an experiment in development code and you'll decide, soon, if it stays one way or the other.

Daniel Daranas
That provides additional points to support my arguments. Thanks.
tobsen
+1  A: 

Preprocessor code injection is to the compiler what triggers are to the database. And it's pretty easy to find such assertions about triggers.

I mainly think of #define being used to inline a short expression because it saves the overhead of a function call. In other words, it's premature optimization.

le dorfier
Great comparison. I just googled and - maybe because I haven't had too much contact with triggers yet - found this website: http://jyte.com/cl/database-triggers-are-evil The explanation fits, even though people tend to disagree ;) Have you got better sources?
tobsen
+2  A: 

Bjarne Stroustrup provides his answer (in general, not specific to IoC) here

So, what's wrong with using macros?

(excerpt)

Macros do not obey the C++ scope and type rules. This is often the cause of subtle and not-so-subtle problems. Consequently, C++ provides alternatives that fit better with the rest of C++, such as inline functions, templates, and namespaces.

James Curran
+8  A: 

Henry Spencer wrote a paper called #ifdef Considered Harmful.

Also, Bjarne Stroustrup himself, in chapter 18 of his book The Design and Evolution of C++, frowns on the use of the preprocessor and wishes to eliminate it completely. However, Stroustrup also recognizes the necessity of the #ifdef directive and conditional compilation and goes on to illustrate that there is no good alternative for it in C++.

Finally, Pete Goodliffe, in chapter 13 of his book Code Craft: The Practice of Writing Excellent Code, gives an example how, even when used for its original purpose, #ifdef can make a mess out of your code.

Hope this helps. However, if your co-workers won't listen to reasonable arguments in the first place, I doubt book quotes will help convince them ;)

Vojislav Stojkovic
Thanks for coming back and bringing the books with you ;-)
tobsen
A: 

One quick point to tell your coworkers is this: because the preprocessor does plain textual substitution, macros used inside mathematical statements can silently break operator precedence.

tkotitan