views: 9868
answers: 3

Possible Duplicates:
How to convert a single char into an int
Character to integer in C

Can anybody tell me how to convert a char to int?

char c[] = {'1', ':', '3'};
int i = (int)c[0];
printf("%d", i);

When I try this, it prints 49 instead of the 1 I expected.

A: 

The standard function atoi() will likely do what you want.
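For example, a minimal sketch of how that might look here (as the comments below note, atoi() expects a null-terminated string, so the digit is first copied into a small buffer; the buffer is illustrative, not part of the original answer):

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    char c[] = {'1', ':', '3'};
    char buf[2] = {c[0], '\0'};  /* atoi() needs a null-terminated string */
    int i = atoi(buf);
    printf("%d\n", i);           /* prints 1 */
    return 0;
}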

Frans Bouma
atoi takes a char* not a char.
Mehrdad Afshari
Isn't that just semantics? A char* gives a char as well, just pass the address. But ok, I didn't look at the specifics, my C is starting to get kind of rusty.
Frans Bouma
Nope, to be correct, it expects a NULL-terminated char*, not just an address.
Mehrdad Afshari
@Frans,Noldorin: I would consider it very dangerous to "just" pass a pointer to a char array which isn't NULL terminated to a function expecting a NULL terminated string. In the example, remove the colon from the char array and the code will read uninitialized memory.
Tobi
+4  A: 
int result = charValue - '0';

so your code would be:

printf("%d", i - '0');
Mehrdad Afshari
Why did you answer and vote to close? Maybe you should mark this as CW.
Zifre
I answered before someone mentioned it's a dupe.
Mehrdad Afshari
+13  A: 

In the old days, when we could assume that most computers used ASCII, we would just do

int i = c[0] - '0';

But in these days of Unicode, it's not a good idea. It was never a good idea if your code had to run on a non-ASCII computer.

Edit: Although it looks hackish, evidently it is guaranteed by the standard to work. Thanks @Earwicker.
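Since the standard guarantees that '0' through '9' have consecutive values (see the last comment below), the same subtraction extends to converting a whole digit string by hand. A sketch, not part of the original answer; digits_to_int is a made-up name:

#include <ctype.h>
#include <stdio.h>

/* Convert a null-terminated string of decimal digits to an int,
   relying on '0'..'9' being consecutive in the execution charset. */
static int digits_to_int(const char *s)
{
    int value = 0;
    while (isdigit((unsigned char)*s))
        value = value * 10 + (*s++ - '0');
    return value;
}

int main(void)
{
    printf("%d\n", digits_to_int("123"));  /* prints 123 */
    return 0;
}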

Paul Tomblin
Not working. I expect the answer to be 1, but it gives a 6-digit number; garbage, I think.
Cute
Sorry, I changed it to c[0] since that's what the example code uses.
Paul Tomblin
How could Unicode affect the representation of a digit in a char variable?
David Sykes
@David, you're probably right. I was just thinking that there might be things in the Unicode character set that look like digits but aren't. But really, the problem is other character sets, since the standard does not guarantee that 0-9 are contiguous integer values.
Paul Tomblin
@Paul, actually the C standard does guarantee that. Paragraph 2.2.1 "In both the source and execution basic character sets, the value of each character after 0 in the above list of decimal digits shall be one greater than the value of the previous." So your answer is perfectly valid in a conforming C implementation.
Daniel Earwicker