For example, it seems impossible to implement delegation techniques without creating compiler warnings. This really makes no sense, because:

  if (self.myDelegate != nil) {
      // Check at runtime whether the (optional) delegate method is implemented
      BOOL callDelegate = [self.myDelegate respondsToSelector:@selector(fooDidHappen:withBar:)];
      if (callDelegate) {
          [self.myDelegate fooDidHappen:foo withBar:bar];
      }
  }

As you can see, I ask whether the delegate responds to that selector. But in the if-block, which is only entered when the delegate does respond, I get a silly warning that the delegate may not respond to it. Of course the compiler can't know that it does, because this is only resolved at runtime. However, it's very bad practice to keep working with compiler warnings in Xcode, so I would like to tell the compiler to just ignore that one.

In PHP, for example, you can write @someFunction(foo); and the @ makes sure that the call doesn't produce any warning at all. So is there a compiler directive or command that can be put around that part of the code to get rid of the warning?

+2  A: 

In another question people told you exactly how to fix that warning: you declare a formal protocol for the delegate and add that protocol to the delegate's property declaration.
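As a sketch of that fix (the protocol name here is hypothetical, and the argument types are assumed to be plain objects), a formal protocol with an optional method looks like this:

@protocol MyClassDelegate <NSObject>
@optional
- (void)fooDidHappen:(id)foo withBar:(id)bar;
@end

// The property is then typed against the protocol, so the compiler
// knows the selector and its argument types and emits no warning:
@property (nonatomic, assign) id <MyClassDelegate> myDelegate;

With that in place the respondsToSelector: check still works for the @optional method, and the warning disappears.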

Having said that, if you want to suppress the warning you can use the GCC diagnostics pragma:

//Turn the warning off
#pragma GCC diagnostic ignored "-Wundeclared-selector"
if (self.myDelegate != nil) {
  BOOL callDelegate = [self.myDelegate respondsToSelector:@selector(fooDidHappen:withBar:)];
  if (callDelegate) {
    [self.myDelegate fooDidHappen:foo withBar:bar];
  }
}

//Turn the warning back on
#pragma GCC diagnostic warning "-Wundeclared-selector"

Note that this requires GCC 4.2.1+ or Clang (and Clang has what I feel is an improved version, but I may be biased since I wrote the patch for it).
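Newer Clang versions also offer a push/pop variant that restores the previous diagnostic state instead of unconditionally re-enabling the warning; a minimal sketch, reusing the same hypothetical selector:

#pragma clang diagnostic push
#pragma clang diagnostic ignored "-Wundeclared-selector"
if ([self.myDelegate respondsToSelector:@selector(fooDidHappen:withBar:)]) {
    [self.myDelegate fooDidHappen:foo withBar:bar];
}
#pragma clang diagnostic pop

That way you don't have to know (or care) whether the warning was enabled to begin with.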

Louis Gerbarg
That actually won't always work. The compiler needs to know the type of the arguments to be able to correctly compile the method invocation itself.
bbum
Well, he should just fix the issue, which is why I linked back to the other post. I assume the compiler just infers "v@:@@" by looking at the types of the args. It could be wrong, which would cause a runtime error, but you can always suppress the warning; it just isn't a good idea ;-)
Louis Gerbarg