I am looking to get a random letter using something like

char ch = 'A' + randomNumber;  // randomNumber is an int from 0 to 25

But that gives a "loss of precision" compilation error (the same happens if randomNumber is a byte). I guess with Unicode the above is a gross oversimplification, but how should I do it?

This works but seems a bit clumsy:

char ch = "ABCDEFGHIJKLMNOPQRSTUVWXYZ".charAt(randomNumber);
+1  A: 

The problem arises from trying to assign an int to a char.

In the expression 'A' + randomNumber, the char 'A' is promoted to int, so the whole expression is an int. Since an int is 32 bits and a char is only 16 bits, assigning the result back to a char can potentially lose information, hence the "loss of precision" error at compile time.
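A minimal sketch of what the compiler sees (the concrete value is just for illustration):

int randomNumber = 7;          // any non-constant int from 0 to 25
char ch = 'A' + randomNumber;  // does not compile: the right-hand side is an int,
                               // and int does not implicitly narrow to char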

coobird
+1  A: 

If you know that you're going to be in the appropriate range, just cast:

char ch = (char) ('A' + randomNumber);
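Putting it together, a complete runnable sketch (assuming java.util.Random as the source of the 0-to-25 value; the class name is just for illustration):

import java.util.Random;

public class RandomLetter {
    public static void main(String[] args) {
        Random rnd = new Random();
        int randomNumber = rnd.nextInt(26);     // uniform int from 0 to 25
        char ch = (char) ('A' + randomNumber);  // cast the int result back to char
        System.out.println(ch);                 // prints one of A..Z
    }
}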
Jon Skeet
Thanks, this seems to be the cleanest way to get what I want so it is obvious at a glance what it is doing.
Griff
+5  A: 

char ch = (char) (new Random().nextInt('Z' - 'A' + 1) + 'A');  // java.util.Random

You can replace 'A' and 'Z' with any characters you like to get a different range.
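For example, the same pattern yields a lowercase letter or a decimal digit (a small sketch; both endpoints are inclusive, and reusing one Random instance is preferable to constructing one per call):

import java.util.Random;

Random rnd = new Random();                               // reuse a single instance
char lower = (char) (rnd.nextInt('z' - 'a' + 1) + 'a');  // 'a'..'z'
char digit = (char) (rnd.nextInt('9' - '0' + 1) + '0');  // '0'..'9'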

Christian Strempfer
A: 

How about this? The casting is ugly, but there are no compilation errors, and it generates a uniformly random capital letter:

int rand = (int) (Math.random() * 26);  // uniform int from 0 to 25
int i = 65 + rand;                      // 65 is the code point of 'A'
char c = (char) i;
System.out.println(c);
Michael Easter