I'm having trouble with
+ (NSDecimalNumber *)decimalNumberWithString:(NSString *)numericString locale:(NSDictionary *)locale
I want to provide very high-precision values programmatically, so that no floating-point error creeps in from the start, and the only option Apple gives me seems to rely on a wonky locale.
The documentation is pretty cryptic about it:
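To make the motivation concrete, here is roughly the kind of thing I want to do (purely illustrative, the values are made up):

#import <Foundation/Foundation.h>

int main(void) {
    @autoreleasepool {
        // A double literal is already a binary approximation before it ever
        // reaches NSDecimal; the string initializer parses the decimal digits
        // directly, which is exactly why I want to feed it strings.
        NSDecimalNumber *fromDouble =
            [NSDecimalNumber decimalNumberWithDecimal:
                [[NSNumber numberWithDouble:0.1] decimalValue]];
        NSDecimalNumber *fromString =
            [NSDecimalNumber decimalNumberWithString:@"0.1"];
        NSLog(@"from double: %@ / from string: %@", fromDouble, fromString);
    }
    return 0;
}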
Parameters: ... locale A dictionary that defines the locale (specifically the NSDecimalSeparator) to use to interpret the number in numericString.
Discussion The locale parameter determines whether the NSDecimalSeparator is a period (as is used, for example, in the United States) or a comma (as is used, for example, in France).
Well, after searching for NSDecimalSeparator in the docs, I found nothing. Searching the net, I found that it is apparently "deprecated". So currently I do something dangerous like this:
// Current workaround: drag in a full en_US locale just to get a period as decimal separator
NSLocale *usLoc = [[NSLocale alloc] initWithLocaleIdentifier:@"en_US"];
NSDecimalNumber *num = [NSDecimalNumber decimalNumberWithString:str locale:usLoc];
So I wonder: if they really need this wonky locale for something as crucial as this (I add the strings programmatically, there is no user input), couldn't I somehow create my own locale? The parameter wants an NSDictionary, so here is the idea:
Could I take the dictionary that backs the locale from -initWithLocaleIdentifier:@"en_US", copy it into an NSMutableDictionary, and then just edit that %?&§! NSDecimalSeparator entry?
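In code, something like this is what I have in mind (completely untested, and I'm only guessing that NSLocaleDecimalSeparator from NSLocale.h is the key the docs mean by NSDecimalSeparator):

// Untested sketch of the idea: a hand-built dictionary that only pins down
// the decimal separator, instead of a full NSLocale.
// NSLocaleDecimalSeparator is my assumption for the key the documentation
// calls NSDecimalSeparator.
NSDictionary *periodSeparator = @{ NSLocaleDecimalSeparator : @"." };
NSDecimalNumber *num =
    [NSDecimalNumber decimalNumberWithString:@"1234.56789"
                                      locale:periodSeparator];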
And another thing that gives me a headache: why does the parameter ask for an NSDictionary when I am passing an NSLocale object? Or is my code wrong? (Not tested, since my app is currently totally screwed up ;-) )