What sort of suggestions are you looking for exactly? efficiency? correctness? You do mention unit testing ... I think there could definitely be an improvement there.
I actually helped develop an online game and its shuffling mechanism. I don't really suspect performance is much of an issue, as most algorithms you find are by and large the same. I would suggest the following, however:
a. Create a random interface
public interface IRandom
{
byte NextRandomByte ();
}
Anything that consumes this interface can now be mocked or unit tested in a controlled manner or environment. You do not really want to be unit testing truly random algorithms - you won't be able to verify your data!
As for why return a byte, a byte is likely the smallest unit of randomness you could want. Not only that, but if given a means of generating a single random byte, generating a sequence of them and concatenating them together is an easy way of generating an even wider range of random data.
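To make that concrete, here is a rough sketch of what implementations might look like - one backed by the framework's cryptographic RNG for production, and a scripted fake for tests (the class names CryptoRandom and FakeRandom are just my own placeholders):
using System.Security.Cryptography;

// production source: wraps the framework CSPRNG
public sealed class CryptoRandom : IRandom
{
    private readonly RandomNumberGenerator generator = RandomNumberGenerator.Create ();

    public byte NextRandomByte ()
    {
        byte[] buffer = new byte[1];
        generator.GetBytes (buffer);
        return buffer[0];
    }
}

// test double: replays a scripted sequence of bytes, so tests are deterministic
public sealed class FakeRandom : IRandom
{
    private readonly byte[] sequence;
    private int index;

    public FakeRandom (params byte[] sequence)
    {
        this.sequence = sequence;
    }

    public byte NextRandomByte ()
    {
        return sequence[index++ % sequence.Length];
    }
}
The fake is the piece that makes shuffle logic assertable in a unit test, since you control every byte it hands out.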
Of course, you will have to be wary of introducing bias to your data ...
b. Ensure quality of data by reducing bias over arbitrary intervals. Assuming the underlying data is uniformly random, taking it modulo any interval that is NOT a factor of 256 will introduce bias. Consider this:
// 250 is not a factor of 256!
byte a = (byte)(random.NextRandomByte () % 250); // values 0-5 are biased!
In the preceding snippet, values 0-5 have a 2/256 probability of coming up, while values 6-249 have only a 1/256 probability. That is a significant bias over time. One approach is to check the number coming from the generator, and discard it if it falls outside an acceptable range:
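If you want to see that bias without trusting my arithmetic, you can fold all 256 possible byte values onto the 250 residues and count the collisions (a quick throwaway check, not production code):
// count how many of the 256 possible byte values map to each residue mod 250
int[] hits = new int[250];
for (int value = 0; value <= byte.MaxValue; value++)
{
    hits[value % 250]++;
}
// hits[0] through hits[5] come out as 2, hits[6] through hits[249] as 1 - hence the 2/256 vs 1/256 split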
// continually generate random data until it is satisfactory
byte r = random.NextRandomByte ();
while (r >= 250)
{
    r = random.NextRandomByte ();
}
byte a = (byte)(r % 250); // r is guaranteed to be on [0, 250), no longer biased
"Acceptable range" may be determined by finding the greatest multiple of your interval that can be represented by your value type. A more generalized form
byte modulo; // specified as parameter
byte biasThreshold = (byte)((byte.MaxValue / modulo) * modulo);
byte unbiasedValue;
do
{
    // generate values until one falls below the threshold
    unbiasedValue = random.NextRandomByte ();
} while (unbiasedValue >= biasThreshold);
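Wrapped up as a helper, with the final modulo applied after the rejection loop, that might look something like this (the method name is my own):
public static byte GetUnbiasedByte (IRandom random, byte modulo)
{
    // greatest multiple of modulo representable in a byte; values at or above it are rejected
    byte biasThreshold = (byte)((byte.MaxValue / modulo) * modulo);
    byte value;
    do
    {
        value = random.NextRandomByte ();
    } while (value >= biasThreshold);
    return (byte)(value % modulo);
}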
And if you want values wider than a byte, simply concatenate several random bytes together:
int modulo; // specified as parameter
int biasThreshold = (int.MaxValue / modulo) * modulo;
int unbiasedValue;
do
{
    // generate four random bytes
    byte a = random.NextRandomByte ();
    byte b = random.NextRandomByte ();
    byte c = random.NextRandomByte ();
    byte d = random.NextRandomByte ();
    // concatenate them, masking the sign bit so the result stays non-negative
    unbiasedValue = ((a & 0x7F) << 24) | (b << 16) | (c << 8) | d;
} while (unbiasedValue >= biasThreshold);
c. Consume! Place your algorithms or helpers in stateless extension or static classes, like
// forgive my syntax, recalling from memory
public static class IRandomExtensions
{
    public static int GetUnbiasedInteger (this IRandom random, int modulo) { ... }
    public static uint GetUnbiasedUnsignedInteger (this IRandom random, uint modulo) { ... }
    public static long GetUnbiasedLong (this IRandom random, long modulo) { ... }
    public static ulong GetUnbiasedUnsignedLong (this IRandom random, ulong modulo) { ... }
    ...
}
public static class IEnumerableExtensions
{
    public static IEnumerable<T> Shuffle<T> (this IEnumerable<T> items, IRandom random)
    {
        // shuffle away!
        ...
    }
}
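To give a feel for the bodies, here is one possible sketch that fills in GetUnbiasedInteger using the concatenation-and-rejection approach from (b), and implements Shuffle as a Fisher-Yates shuffle on top of it (again, just one way to do it):
using System.Collections.Generic;

public static class IRandomExtensions
{
    public static int GetUnbiasedInteger (this IRandom random, int modulo)
    {
        // greatest multiple of modulo representable by a non-negative int
        int biasThreshold = (int.MaxValue / modulo) * modulo;
        int value;
        do
        {
            // concatenate four random bytes, masking the sign bit so the result stays non-negative
            value = ((random.NextRandomByte () & 0x7F) << 24)
                  | (random.NextRandomByte () << 16)
                  | (random.NextRandomByte () << 8)
                  |  random.NextRandomByte ();
        } while (value >= biasThreshold);
        return value % modulo;
    }
}

public static class IEnumerableExtensions
{
    public static IEnumerable<T> Shuffle<T> (this IEnumerable<T> items, IRandom random)
    {
        // Fisher-Yates: walk backwards, swapping each slot with a uniformly chosen slot at or before it
        List<T> buffer = new List<T> (items);
        for (int i = buffer.Count - 1; i > 0; i--)
        {
            int j = random.GetUnbiasedInteger (i + 1); // uniform on [0, i]
            T temp = buffer[i];
            buffer[i] = buffer[j];
            buffer[j] = temp;
        }
        return buffer;
    }
}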
Deciding whether to implement these as methods on your interface or as external methods [as I've done] is up to you - but keep in mind, making them member methods forces implementors to repeat or duplicate code. Personally, I like extensions. They are very clean. And sexy.
int randomNumber = random.GetUnbiasedInteger (i - 1);
List<int> shuffledNumbers = numbers.Shuffle (random).ToList ();
Clearly all of the preceding is optional, but it facilitates unit testing and improves the overall quality of your random data.
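As a small illustration of the unit-testing payoff: with the FakeRandom sketch from earlier, the helpers become completely deterministic, so you can assert exact values (this assumes the GetUnbiasedInteger sketch above, with the first byte being most significant):
using System.Diagnostics;

// bytes 0, 0, 0, 7 concatenate to the integer 7 under the sketch above
IRandom random = new FakeRandom (0, 0, 0, 7);
int value = random.GetUnbiasedInteger (5);
Debug.Assert (value == 2); // 7 % 5

// likewise, a shuffle driven by a scripted source always produces the same permutation,
// so its output can be compared against a known expected ordering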
Random and "fair" dice is a very interesting topic in general. If you are at all interested, I strongly recommend you Google it sometime and perform some research. :)