Hi everybody, I've noticed some weird behaviour in iPhone OS when working with decimal values. The simulator parses them from strings correctly, but when I test the app on my actual iPhone the decimal part gets lost.
In particular, I store values as strings in a dictionary and retrieve them like this:
Code:
// Read the stored string and convert it back to an NSNumber
NSString *thickStr = [dictionary objectForKey:@"thickness"];
NSNumber *thickNum = [[[self class] numberFormatter] numberFromString:thickStr];
[self setSpessore:thickNum];
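To make the symptom concrete, here is a minimal check (the value 2.5 is just an illustrative example, but it matches what I see):
Code:
// Illustrative repro: the simulator logs 2.5, but on my iPhone the
// decimal part is gone and it logs 2
NSNumber *test = [[[self class] numberFormatter] numberFromString:@"2.5"];
NSLog(@"parsed value = %@", test);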
The numberFormatter class method used above is defined as follows:
Code:
// Shared formatter, lazily created on first use
+ (NSNumberFormatter *)numberFormatter
{
    static NSNumberFormatter *_formatter;
    if (_formatter == nil)
    {
        _formatter = [[NSNumberFormatter alloc] init];
        [_formatter setNumberStyle:NSNumberFormatterDecimalStyle];
        [_formatter setFormatterBehavior:NSNumberFormatterBehavior10_4];
        // Return NSDecimalNumber instances rather than plain NSNumbers
        [_formatter setGeneratesDecimalNumbers:YES];
    }
    return _formatter;
}
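The only difference between the device and the simulator I can think of is the region settings: in many European locales the decimal separator is a comma rather than a period, and NSNumberFormatter respects the current locale. As an untested guess, a variant that pins the formatter to a fixed locale would look like this:
Code:
// Guess: pin the locale so "." is always treated as the decimal separator
+ (NSNumberFormatter *)numberFormatter
{
    static NSNumberFormatter *_formatter;
    if (_formatter == nil)
    {
        _formatter = [[NSNumberFormatter alloc] init];
        [_formatter setNumberStyle:NSNumberFormatterDecimalStyle];
        [_formatter setGeneratesDecimalNumbers:YES];
        // en_US_POSIX always uses "." as the decimal separator,
        // regardless of the user's region settings
        NSLocale *posix = [[[NSLocale alloc] initWithLocaleIdentifier:@"en_US_POSIX"] autorelease];
        [_formatter setLocale:posix];
    }
    return _formatter;
}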
As it stands, though, it doesn't work! The app on the iPhone keeps converting the string to a plain integer, dropping the decimal part, while the same app in the iPhone Simulator works fine!
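If it helps with the diagnosis, this is what I plan to log on both the device and the simulator to compare locales (all standard NSNumberFormatter/NSLocale accessors):
Code:
// Compare these between device and simulator;
// a mismatch would explain the different behaviour
NSNumberFormatter *fmt = [[self class] numberFormatter];
NSLog(@"locale = %@", [[fmt locale] localeIdentifier]);
NSLog(@"decimal separator = %@", [fmt decimalSeparator]);
NSLog(@"grouping separator = %@", [fmt groupingSeparator]);
Any idea what's going on? Thanks!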