tags:
views: 4500
answers: 7
+4  A: 
int i = 32;
char c = (char)i;
John JJ Curtis
This doesn't work; it shows weird characters.
Yassir
Shows? So is this converting or displaying? This is the perfect answer for your current question, so you need to edit your question telling us what it is you're really looking for.
GMan
@GMan Good point!
Secko
I guess the question was edited later on, so the accepted answer looks to be correct now.
John JJ Curtis
+3  A: 

Try looking up sprintf. Also, for numbers that fit in 8 bits (values below 256), you can cast explicitly from int to char.

stanigator
+1  A: 

Why doesn't this work for you (assuming C is the language in question)?

char c;
int x;
...
c = (char)x;

Ira Baxter
A: 

There's the good old:

int n = 123;
char c[20];
sprintf(c, "%d", n);

It's nasty because it's unclear how long the array c should be, but it's extremely common in C code in the wild.

Daniel Earwicker
"how long should the array c be?" (sizeof(int)*CHAR_BIT-1)/3 + 3 is long enough. But I do wonder whether there should just have been a constant for it somewhere.
Steve Jessop
+4  A: 

Are you looking for the char equivalent of a single-digit number? For example, converting 5 to '5'? If so, you can do the following, assuming of course the value is between 0 and 9.

char dig = (char)(((int)'0')+i);
JaredPar
the character codes for '0' through '9' are guaranteed to be sequential and properly ordered by the ISO C standard.
Daniel Earwicker
@Earwicker, thanks (updated answer).
JaredPar
Remember, in C, the only difference between char and int is that one is 8-bit and one is 32-bit. Thus, it can simply be written: `char dig = '0'+i;`
Zarel
That's an assumption. Integers don't have to be 32-bit.
GMan
What does this have to do with bit widths? I think if both char and int have the same width, then casting to `int` won't guarantee success either. In fact, it's exactly when char and int have the *same* width that `'0'+i` is more portable (because it will automatically convert the `'0'` to `unsigned int` then, instead of risking an overflow when casting to `int`). But seriously, I've never seen this to be the case, and I expect it never will be (for this, `char` and `int` would have to have the same width, and the code for '0' would have to overflow int, since char would be unsigned - weird!).
Johannes Schaub - litb
@litb Although it may never be the case, the casts increase the probability of failure from zero to something greater than zero. That should always be avoided -- especially when it involves so much extra typing.
Dingo
A: 

If you don't want to use any of the other answers, you can try:

int intNum = 1;
char chString[2];

//convert to string
itoa(intNum, chString, 10); //stdlib.h
Secko
Are 2 characters really enough?
GMan
@GMan In this case yes. Anyway it's just an example.
Secko
+2  A: 

In response to your edit, just use printf and specify an integer:

int i = 10;
printf("%d", i);

The reason is that the integer 0 represents the number zero, while the ASCII character "0" is not at code zero, but at 48.

This is why Jared's Answer will work: char dig = (char)(((int)'0')+i);

It takes "0" (which is 48), and adds your number to it. 0 + 48 is 48, so 0 becomes "0". If you are converting a 1, it will be 48 + 1, or 49, which corresponds to "1" on the ASCII chart.

This only works for numbers 0 through 9, as "10" is not an ASCII character, but rather "1" followed by a "0".

GMan