Returning `nil` is an error here since you're returning an integer primitive, not an object. (You're getting a cast warning because `nil` is actually a `#define` that evaluates to `((void *)0)`, which is a null pointer, not an integer zero.) The best option for Objective-C code that interfaces with Cocoa is probably `NSNotFound`, which is defined as `NSIntegerMax` and is used throughout Cocoa to signify that a given value (an index, a location, etc.) does not exist in the receiver. (Another option is to use `-1`, which is more common in plain C code. What works best depends on what the calling code expects and can handle.)
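For example, here's a minimal sketch of what that looks like in practice. (The `ItemStore` class and its `indexOfItem:` method are hypothetical, modeled on the contract of `-[NSArray indexOfObject:]`.)

```objc
#import <Foundation/Foundation.h>

@interface ItemStore : NSObject
@property (nonatomic, copy) NSArray *items;
- (NSUInteger)indexOfItem:(id)item;
@end

@implementation ItemStore
// Returns the index of the first matching item, or NSNotFound if the
// item isn't present -- the same contract as -[NSArray indexOfObject:].
- (NSUInteger)indexOfItem:(id)item {
    NSUInteger count = [self.items count];
    for (NSUInteger i = 0; i < count; i++) {
        if ([self.items[i] isEqual:item]) {
            return i;
        }
    }
    return NSNotFound;
}
@end
```

The caller then checks for `NSNotFound` explicitly:

```objc
NSUInteger index = [store indexOfItem:@"widget"];
if (index == NSNotFound) {
    // handle the missing item
}
```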
Although `NSNotFound` is a signed value, it's big enough that you're highly unlikely to run into a range issue. (`NSIntegerMax` is approximately half of `NSUIntegerMax`, and very few people get remotely close to 2,147,483,647 objects, let alone twice that many, in 32-bit land. In 64-bit, forget about it: you'll run out of physical RAM in your machine long before you run out of integers for indexes.)
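You can see the headroom for yourself; this little program (just an illustration, not tied to any particular code) prints the relevant limits:

```objc
#import <Foundation/Foundation.h>

int main(void) {
    @autoreleasepool {
        // NSNotFound is NSIntegerMax, which fits comfortably even when
        // stored in an NSUInteger (whose maximum is roughly twice as large).
        NSLog(@"NSIntegerMax  = %ld", (long)NSIntegerMax);
        NSLog(@"NSUIntegerMax = %lu", (unsigned long)NSUIntegerMax);
        NSLog(@"NSNotFound    = %lu", (unsigned long)NSNotFound);
    }
    return 0;
}
```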
Speaking of which, the Cocoa convention is to use `NSUInteger` (rather than `NSInteger`) for indexes. An unsigned integer cannot be negative, which offers some sanity protection on index values; among other things, it makes accidental integer overflow/underflow easier to catch. If this is a custom data source method (as it seems to be), I'd strongly suggest switching to unsigned integers. (It may help to remember that `NSInteger` and `NSUInteger` occupy the same number of bytes; they just interpret the bits differently, so you won't "waste" any space by switching types.)
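As a sketch (with hypothetical method names), the switch is just a change of type in the declaration, and you can verify the "same size" claim directly:

```objc
#import <Foundation/Foundation.h>

// Hypothetical data source: with NSUInteger, a negative index can't even
// be expressed, so one whole class of bugs disappears at the type level.
@interface DataSource : NSObject
- (NSUInteger)numberOfItems;         // was: - (NSInteger)numberOfItems;
- (id)itemAtIndex:(NSUInteger)index; // was: - (id)itemAtIndex:(NSInteger)index;
@end

int main(void) {
    @autoreleasepool {
        // Same storage, different interpretation of the bits:
        NSLog(@"sizeof(NSInteger)  = %zu", sizeof(NSInteger));  // 4 on 32-bit, 8 on 64-bit
        NSLog(@"sizeof(NSUInteger) = %zu", sizeof(NSUInteger)); // identical
    }
    return 0;
}
```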