I've been working on a foundational C++ library for some time now, and I've had a variety of ideas that could really simplify the process of writing and managing code. One of these is the concept of introducing some macros to help simplify statements that appear very often, but are a bit more complicated to write than they should be.

For example, I've come up with this basic macro to simplify the most common type of for loop:

#define loop(v,n) for(unsigned long v=0; v<n; ++v)

This would enable you to replace those clunky for loops you see so much of:

for (int i = 0; i < max_things; i++)

With something much easier to write, and even slightly more efficient:

loop (i, max_things)

Is it a good idea to use conventions like this? Are there any problems you might run into with different types of compilers? Would it just be too confusing for someone unfamiliar with the macro(s)?

+2  A: 

I would say it depends upon whether you expect anyone else to ever have to make sense of your code. If it's only ever going to be you in there, then I don't see a problem with the macros.

If anyone else is ever going to have to look at this code, then the macros are going to cause problems. The other person won't know what they are or what they do (no matter how readable and obvious they seem to you) and will have to go hunting for them when they first run across them. The result will be to make your code unreadable to anyone but yourself - anyone using it will essentially have to learn a new language and program at the same time.

And since the chances of it being only you dealing with the code are pretty much nil if you intend the code to be a library used by more than just you, I'd go with: don't.

Daniel Bingham
The same can be said for function calls. Other people will have no idea what they do (no matter how readable and obvious), and will have to go hunting for them every time (on the assumption that they have no memory whatsoever). So never write sub-routines or classes: always do everything in `main`. Macros have their flaws, but this isn't one of them. If they're used throughout a code-base, then maintainers of that code base just learn what they do, same as they learn what commonly-used functions and types do.
Steve Jessop
@Steve Jessop There's a key difference between a macro and a function in this case. A function is a language feature that is a subroutine. It doesn't change the language's basic syntax; rather, it encapsulates a section of code that needs to be generalized and run a lot. Macros used like this completely change the basic language syntax. Developers expect to hunt for functions - they do not expect or want to hunt for macros and have to learn them. In doing so they basically have to learn a new language - rather than simply learn a program.
Daniel Bingham
True, but irrelevant since that doesn't make it impossible for people to remember what they do. In particular if the developers are paid, then it would be astonishingly unprofessional of them to refuse to learn what commonly-used macros in the code base do.
Steve Jessop
John Dibling
@Steve Jessop Sure it would be unprofessional of them to refuse to learn them, but that doesn't mean they have to like it. And the act of learning them takes that much more time. Also, might I ask why you're picking on me rather than the 14-times-upvoted first poster, who said the exact same thing I did in fewer words and with a humorous anecdote?
Daniel Bingham
Ironic that for example Ruby is designed specifically to make it easier for programmers to design their own private languages which nobody else understands. Rails being a prime example. The reason people keep wanting to "change the C or C++ language" is that those languages don't do what they want...
Steve Jessop
@John Dibling I would agree with you about that, and never use macros in my own code. However, they are in the language and I don't see why it should be a problem if he wants to experiment with them to speed his own development. Provided he's the only one who's ever going to have to deal with it. Since I find that unlikely though, I still recommend against it. Funny how elaborating a bit more makes the difference between a whole ton of cheers and getting picked on and down votes. I said basically the same thing - I just added some nuance.
Daniel Bingham
@Steve Jessop In the instance a language doesn't do what you want, I would say you need to go looking for another language rather than trying to redesign an existing one.
Daniel Bingham
The first person said it's a bad idea, which is an opinion. You said other programmers will have to go hunting for them "every time", which is a testable statement of fact which I believe to be false based on past experience (admittedly not so much of C++ - it's possible that C++ programmers are worse at it than the people I know, but surely their ability is above 0).
Steve Jessop
@Steve Jessop Semantics. I edited the phrasing - better now?
Daniel Bingham
@Daniel: "go looking for another language" in an ideal world, yes, but in practice C++'s slightly crummy facilities for looping aren't grounds for switching languages - the reasons for using C++ on a given project are usually technical. In practice, I'd attempt something with Boost.Foreach and a custom type of Boost.Range before writing my own macro.
Steve Jessop
If you're trying to persuade me to take off the downvote, I can't: it isn't mine. I still don't agree that programmers should be treated as though they are as dumb as you say.
Steve Jessop
@Steve Jessop Fair enough, so you feel that there are times when macro use is justified. Might I suggest you take your argument to the comments of the upvoted post? I fear your very good points will be lost down here deep in the comments of my down voted answer. (Sorry, that sounds vaguely sarcastic - it's not meant to be. I mean it, your points are good.)
Daniel Bingham
@Steve Jessop I know the down vote isn't yours and I'm not trying to persuade you to take it off. I just can't resist a good debate. I do feel that this discussion is somewhat subjective. I also feel that having it down here on my answer doesn't help anybody, because the one that says macros === bad is getting upvoted like crazy with little disagreement or contest. You've nearly convinced me that macros aren't the evil my knee-jerk reaction would have them be. But there are others not getting the message ;)
Daniel Bingham
Thing is, I could say "sometimes it's a good idea", and that poster would say, "aha, but I put in special weasel-words 'generally a bad idea' to make sure I didn't need to have that argument". I guess he got upvoted because he didn't say anything controversial: "avoid macros unless *astonishingly* useful or necessary" is fairly uncontroversial. Boost.Foreach and C++0x range for are probably about as good as you'll get in C++, and I don't hate `for` as much as the questioner does...
Steve Jessop
There you go, I've attempted to start an argument in the top answer. Let's see if dkackman is as up for it as you are :-)
Steve Jessop
@Steve Jessop True - I'm not much of the politician. Funny how much difference adding a "generally" can make. I'll have to keep that in mind for the future. Anyway, thanks for the debate and good luck with Dkackman ;)
Daniel Bingham
It's ok; I can tell where the real discussion is going on. I was *expecting* the vast majority to shoot my suggestion down with little thought in the matter. I was mostly interested to see if anyone was actually in favor of this. I'm adding a comment to my original post, if y'all want to take a look at it. ;)
DoctorT
Regarding the wording: to be honest, although this question has specific details, it's essentially a duplicate of the general pattern "should I use a macro for ...". Several people will say "Aaargh! Macros! Run for cover!", with good reason. I suspect it's largely chance or speed who gets to the top and gets all the rep for saying it :-)
Steve Jessop
Yeah, and I'm fully aware that *some* macros would undoubtedly cause large numbers of problems, and make code much harder to debug. However, seeing how simple my `loop` macro is, I seriously doubt it would cause any debugging problems, unless people are misusing it. If anything, it might actually simplify debugging... because typing all those characters in a usual for loop... people can make mistakes. I know when I'm looking at for loops I usually just gloss over them, and probably won't notice that the guy typed `x > 5;` instead of `x < 5` unless I look really closely.
DoctorT
The d/v was mine. The message that was conveyed to me when I read this post was "macros are OK, just be careful." You say "it depends upon whether you expect anyone else to ever have to make sense of your code." But I think it depends on more than that. It depends on whether or not you (as a macro-user) are interested in becoming a better programmer. Programmers who write macros to ease their programming without carefully considering all the problems macros introduce are destined to remain mediocre programmers. The world has enough of those. Your post seemed to give licence to them.
John Dibling
I'll remove the d/v not because I now agree with the post, but because the debate within the comments is more useful than any post in this thread.
John Dibling
+24  A: 

IMHO this is generally a bad idea. You are essentially changing well known and understood syntax to something of your own invention. Before long you may find that you have re-invented the language. :)

dkackman
+1 Absolutely don't do this. Macros make understanding and debugging code much harder for future developers. Try to debug macro heavy code in Visual Studio and you'll learn a new definition of pain.
Morinar
<sarcasm>But I think most developers like relearning something they are already familiar with. They really don't have much to do or learn already.</sarcasm>
cplotts
+1 "Simplifying" macros are a very bad idea. You end up with a secret language that nobody else knows. I pity the poor maintenance programmer who has to learn DoctorT++ just to track down your bugs.
John Dibling
-1 because this answer is too inflated and doesn't actually provide a good reason not to do what the OP is doing. As someone else commented in another, similar answer, this same argument can easily be reduced to the absurd: don't write functions. The whole purpose of functions and macros is reuse of commonly used constructs and ease of maintenance by putting these common constructs in one area; simplifying code in other words. The problems with macros are that they often have side effects that are hard to catch. That's why they're hard to debug.
Noah Roberts
Further, DSELs (Domain Specific Embedded Languages) have proven to be very useful in many cases. See Spirit, the TMP book, etc...
Noah Roberts
Bah, after a long discussion I have been mostly convinced that Macros are not as evil as they are often made out to be (and in my experience have been). But now it's too late to change my vote. See excellent points made by Noah Roberts and Steve Jessop.
Daniel Bingham
"Generally a bad idea", or "a bad idea in this specific instance"? There's a fair amount of information in the question on which to make a call.
Steve Jessop
I'm tempted to upvote just for the link to "better_c.h". Better C is out there ... spread the word!
Dan Moulding
Yeah, that link made me lol pretty hard. :)
DoctorT
@Dan and @DoctorT - the single best #define in the history of mankind: #define do_nothing
dkackman
@Noah: You're right. Reducing "don't reinvent the language" to "don't write functions" IS absurd. Macros are evaluated before the code is even compiled, defeating the type system and scoping rules. Debugging macros is difficult or impossible even with modern debuggers because the code doesn't actually exist. Subtle syntax errors caused by malformed macros often generate compiler errors that give no hint as to the actual problem. Variable and function names can be silently replaced, yielding hopefully grossly incorrect behavior. The list goes on. None of these problems apply to functions.
John Dibling
+1  A: 

Getting rid of the for loops is generally a good idea -- but replacing them with macros is not. I'd take a long, hard look at the standard library algorithms instead.

Jerry Coffin
A: 

It's a question of where you're getting your value. Is typing those 15 extra characters in your loops really what's slowing your development down? Probably not. If you've got multiple lines of confusing, unavoidable boilerplate popping up all over the place, then you can and should look for ways to avoid repeating yourself, such as creating useful functions, cleaning up your class hierarchies, or using templates.

But the same optimization rules apply to writing code as to running it: optimizing small things with little effect is not really a good use of time or energy.

Ipsquiggle
-1: NO. Stuff that sucker in a FUNCTION.
John Dibling
Good point. I generalized my advice a little and removed mention of macros.
Ipsquiggle
+2  A: 

No, not a good idea.


  int max = 23;
  loop(i, ++max)...

It is, however, a good idea to refactor commonly used code into reusable components and then reuse instead of copy. You should do this through writing functions similar to the standard algorithms like std::find(). For instance:


template < typename Function >
void loop(size_t count, Function f)
{
  for (size_t i = 0; i < count; ++i) f();
}

This is a much safer approach:


int max = 23;
loop(++max, boost::bind(....));
Noah Roberts
But you're sacrificing processor time to do that, and it's not really any easier to type in the first place. You'd be better off just using the usual `for` syntax. And yes, using the macro in a way it was not intended would cause problems. That's a good point. This is something that is so commonly used though, that I think once people get the hang of it, there would be no problem. And obviously this can't be used to replicate more complicated uses of `for`. `for` would still be used in those instances.
DoctorT
Premature optimization is the root of all evil. What data do you have to make the statement that there's a sacrifice of processor time? How much time? Is the time you save by using a less-than-safe construct worth the hassles such use inevitably causes? As to the fact that for would still be necessary... of course it would be. You didn't say you wanted something to replace ALL uses of for, just common ones. Perhaps my particular implementation does not address your particular needs. This means you need to come up with one that does.
Noah Roberts
Well, you're calling a function, which passes a pointer, calls another function through that pointer which then runs the loop. In the macro I outlined, it's just `for` loop. So yeah, there's a very small amount of overhead, though it's probably not enough to make a difference. Don't get me wrong, your suggestion was very interesting, while still being a valid way to approach the problem. It's the only answer on this page I've voted up. Though, I'd be curious to understand the meaning behind your statement "premature optimization is the root of all evil".
DoctorT
Am I? Have you looked at the assembler code generated by the compiler to verify that any of your assumptions are indeed correct? At what optimization levels are your assumptions true, and at what levels are they false? The point is that you should really measure before you assume that some construct is going to cause "overhead".
Noah Roberts
It's all templates, and therefore there is no *technical* reason why it shouldn't be possible to inline it all. - However, current C++ provides little easy means to create the Function. You can easily end up writing lots of (possibly obfuscated) code for that. (That `boost::bind(...)` may be easily starting off something very alarming.)
UncleBens
Yes, writing `3.times { |i| puts i }` in Ruby is great, but just isn't achievable in all languages. To write the equivalent in C++03 with the above `loop` template would be more code than just the `for` loop, and IMO harder to read/comprehend than the for loop. Even in C++0x, lambda syntax isn't exactly elegant.
Steve Jessop
@UncleBens - yep, trying to make sure you never, ever make a function call and insisting that there never be any pointer reference so the less that brilliant compilers can optimize it all is going to result in some very obfuscated code much of the time. That's why I always insist someone actually go measure the performance hit of a construct before they claim it has "overhead". More often than not, even if there is "overhead" (1ns out of 100), you'll find that you don't actually care.
Noah Roberts
+2  A: 

I think you've provided one strong argument against this macro with your example usage. You changed the loop iterator type from int to unsigned long. That has nothing to do with how much typing you want to do, so why change it?

That cumbersome for loop specifies the start value, end value, type and name of the iterator. Even if we assume the final part will always be ++name, and we're happy to stick to that, you have two choices - remove some of the flexibility or type it all out every time. You've opted to remove flexibility, but you also seem to be using that flexibility in your code base.

Steve Jessop
The reason I replaced `int` with `unsigned long` in the macro, is because there is no need for signed integers if you are starting at zero and incrementing. The only reason `int` is being used in the `for` example is because that is the most common syntax programmers usually use. The only functional difference between the two is that `unsigned long` is slightly *better* for the task, but is too long for people to actually write out when writing for loops.
DoctorT
The change might well provoke compiler warnings, if `max_things` has signed type, so I would not say that it's innocuous.
Steve Jessop
That's a good point, Steve. It might be better to make it an int in that case. I do plan on using `unsigned long` in other places though, particularly for storing the maximum values of arrays and such (which would be the primary candidates for the second parameter of `loop`). Casting the value to `unsigned long` within the macro might be another possibility.
DoctorT
A: 

Steve Jessop makes a good point. Macros have their uses. If I may expound upon his statements, I would go so far as to say that the argument for or against macros comes down to "it depends". If you make your macros without careful thought, you risk making future maintainers' lives harder. On the other hand, using the wxWidgets library requires using library-provided macros to connect your code with the GUI library. In this case, the macros lower the barrier of entry for using the library: magic whose innards are irrelevant to understanding how to work with the library is hidden away from the user. The user is saved from having to understand things they really don't need to know about, and it can be argued that this is a "good" use of macros. Also, wxWidgets clearly documents how these macros are supposed to be used. So make sure that what you hide isn't something that is going to need to be understood by someone else coming in.

Or, if it's just for your own use, knock yourself out.

Joshua
Yes, I will probably be using some kind of mechanism like that. If nothing else, all the syntax-changing elements I introduce into the code will be carefully documented and explained in an easily accessible location that everyone involved in the project will know about.
DoctorT
Just to be clear, in the case of wxWidgets the macros are technically used to change the syntax of c++, but they are meant to hide some scary invocation methods from the user that are necessary in many places (but the details of which would only serve to confuse users). I just wanted to make sure that that point was made clearly. Good luck!
Joshua
Yeah, I'm definitely going to fully research something like this before I consider using it. I don't want it to cause unexpected problems in the code formatting, and I similarly want to avoid becoming dependent on a component for a particular interface.
DoctorT
What is your library going to do? Do you have a project page up somewhere?
Joshua
I'm mostly writing it to learn; I'm not sure how much of it will actually be used in future projects. There's not enough written besides to actually post it anywhere as of yet. I keep going back and re-writing parts of it for better performance and/or usability. Damn my perfectionism!
DoctorT
A: 

Apart from the maintenance/comprehension problems mentioned by others, you'll also have a hard time setting breakpoints in and single-stepping through macro code.

One area where I think macros might be acceptable would be for populating large data structures with constants/literals (when it can save an excessive amount of typing). You normally would not single-step through such code.

Emile Cormier
A: 

In Unix, I find that by the time I want to create an alias for a command I use all the time, the command is on my fingers, and I'd have a harder time remembering the syntax of my alias than the original command.

The same applies here -- by the time you use an idiom so much that you want to create a macro for it, the idiom will be on your fingers and the macro will cause you more pain than just typing out the code.

JohnMcG
That's still looking at the short term, though. If you spend a small amount of effort now to re-learn to use said idiom, it may very well save you a lot of time in the long run. Not that that's necessarily the case in this situation, but I don't think reluctance to try something new just because you're used to something different is a good reason not to try it.
DoctorT
@DoctorT If your concern is saving typing time, then use a macro in your EDITOR which inserts the for loop. It's much more important for code to be easy to read than for it to be easy to type.
Stephen C. Steel
It's not *only* about saving typing time. Sure, if you don't know what `loop` does, you're not going to be able to read it properly. But on the other hand, if you *do* know what `loop` does, it's easier to read just as much as it's easier to type.
DoctorT
Not for people who know the C++ language, it isn't. It is a much safer bet that a reader knows what a C++ for loop does than what your loop macro does. BTW -- I think there are better reasons not to do this, but this is one I hadn't heard yet.
JohnMcG