I'm using Core Plot as the graphing component of my iPhone app, and I have been working with NSDecimal values a lot.
One of the methods in their code looks like this:
-(void)plotPoint:(NSDecimal *)plotPoint forPlotAreaViewPoint:(CGPoint)point
{
    NSDecimal x;
    // do some calculations on x
    plotPoint[CPCoordinateX] = x;
}
where CPCoordinateX is defined as follows:
typedef enum _CPCoordinate {
    CPCoordinateX = 0, ///< X axis
    CPCoordinateY = 1, ///< Y axis
    CPCoordinateZ = 2  ///< Z axis
} CPCoordinate;
The line I don't understand is:
plotPoint[CPCoordinateX] = x;
How can an NSDecimal be assigned to with a subscript like this?
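From what I can tell (this is only my understanding, not something I've found stated in the Core Plot docs), because the parameter is declared as NSDecimal *, it can point at the first element of a plain C array and be subscripted like one. Here is a minimal Foundation-only sketch of that idea; the FillPoint helper is made up for illustration:

#import <Foundation/Foundation.h>

// Hypothetical helper mirroring the Core Plot signature: it receives a pointer
// to the first element of a caller-owned NSDecimal array and writes into it by index.
static void FillPoint(NSDecimal *plotPoint)
{
    plotPoint[0] = [[NSDecimalNumber decimalNumberWithString:@"1.5"] decimalValue]; // x
    plotPoint[1] = [[NSDecimalNumber decimalNumberWithString:@"2.5"] decimalValue]; // y
}

int main(void)
{
    @autoreleasepool {
        NSDecimal point[2];   // one slot per coordinate
        FillPoint(point);     // the array decays to an NSDecimal * when passed

        NSLog(@"x = %@, y = %@",
              NSDecimalString(&point[0], nil),
              NSDecimalString(&point[1], nil));
    }
    return 0;
}

Is that what is going on here?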
In my code, I'm trying to call this method like so:
NSDecimal dec = CPDecimalFromInteger(0);
[plotSpace plotPoint:&dec forPlotAreaViewPoint:point];
NSDecimalNumber *newx = [[NSDecimalNumber alloc] initWithDecimal:dec];
NSDecimal x = dec[CPCoordinateX];
//NSLog(@"converted at: %@", newx);
but I'm getting a compile error:
error: subscripted value is neither array nor pointer
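My best guess, and it is only a guess, is that the method wants a pointer to an array with one NSDecimal per coordinate rather than a single value, so the call would look something like the fragment below (reusing the plotSpace and point from above), but I haven't been able to confirm that:

NSDecimal plotPoint[2];   // one slot for CPCoordinateX and one for CPCoordinateY
[plotSpace plotPoint:plotPoint forPlotAreaViewPoint:point];

NSDecimal x = plotPoint[CPCoordinateX];
NSDecimalNumber *newx = [[NSDecimalNumber alloc] initWithDecimal:x];
NSLog(@"converted at: %@", newx);
[newx release];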
Can someone please explain this to me?