I want to write a function which randomizes the order of a sequence of alphabetic characters. For example, the sequence:

A B C D E F G . . .

...might be changed to:

Z L T A P ...

...which, if passed to the same function again, could result in:

H R E I C ....

Any suggestions?

A: 

This sounds like homework, but either way:

http://stanford.edu/~blp/writings/clc/shuffle.html

David
I forgot to say this is not homework; I am just practising.
gcc
I've always wondered about doing this only n times, where n is the size of the array; it just doesn't seem random enough.
tomdemuyt
Ya, this particular algorithm might not be the best for small arrays. But then, humans have a demonstrably different definition of "random" than the universe does, so YMMV :) (I'm trying to find an article I recently read on the subject, but thus far to no avail.)
David
Your suggestion is not the best way because the algorithm depends on the random() function.
gcc
@gcc: You've accepted the answer with the Fisher-Yates algorithm. In what way does it not depend on a random() function?
Secure
A: 

You mean randomise the alphabet? I wrote something similar in PHP a few days ago. The logic was the following:

  1. Let S1 be a string containing the alphabet characters "ABC...XYZ".
  2. Let S2 be an empty string.
  3. While strlen(S1) > 0, choose a random character C from S1. Append C to S2 and remove C from S1.
  4. Return S2.

The result is a randomly shuffled set of characters, created with minimal CPU load (if the string has 26 characters, the inner loop only needs 26 iterations).
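Here is a minimal C sketch of those four steps (C rather than the original PHP, since the question is tagged gcc); the buffer names s1 and s2 simply mirror the steps above, and removing the chosen character by overwriting it with the current last one keeps each removal O(1):

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>
    #include <time.h>

    /* Steps 1-4: repeatedly pick a random character from s1,
       append it to s2, and remove it from s1. Removal swaps the
       picked slot with s1's current last character. */
    int main(void)
    {
        char s1[] = "ABCDEFGHIJKLMNOPQRSTUVWXYZ";
        char s2[sizeof s1];
        size_t len = strlen(s1);
        size_t out = 0;

        srand((unsigned)time(NULL));          /* seed once per run */
        while (len > 0) {
            size_t i = (size_t)rand() % len;  /* pick a random character */
            s2[out++] = s1[i];                /* append it to s2 */
            s1[i] = s1[--len];                /* remove it from s1 */
        }
        s2[out] = '\0';
        puts(s2);
        return 0;
    }

Picking and swap-removing like this is, in effect, the Fisher-Yates shuffle from the accepted answer, just built into a second buffer.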

mingos
Your algorithm may fail because, at some moment, the function will send the same alphabet back to you. How will you know that the old one has not changed?
gcc
Sorry for replying so late. The chance of it outputting an unchanged string is 1 in 403,291,461,126,605,635,584,000,000 (that is 26!, the number of permutations of 26 distinct characters). I don't even know what such a large number is called. You can safely assume that even if your computer continually generated new random permutations of this string at a rate of 1 billion per second, you would never hit a duplicate in the CPU's entire life span. At this rate, assuming the permutations never repeat, the computer would still need to run non-stop for 12,779,535,234 years (if my algebra isn't off) to exhaust the possibilities.
mingos
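As a side note, both figures can be reproduced with a few lines of C. The long double type is used because 26! overflows 64-bit integers, and the 1-billion-per-second rate is the assumption from the comment above:

    #include <stdio.h>

    /* Sanity-check the figures above: 26! permutations, consumed
       at 1e9 per second, expressed in years. */
    int main(void)
    {
        long double perms = 1.0L;
        for (int i = 2; i <= 26; i++)
            perms *= i;                      /* 26! ~ 4.0329e26 */

        long double seconds = perms / 1e9L;  /* at 1e9 shuffles/sec */
        long double years = seconds / (365.25L * 24 * 3600);
        printf("26! = %.6Le\nyears = %.6Le\n", perms, years);
        return 0;
    }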
A: 

Have a look at the Fisher-Yates shuffle algorithm, and in particular the modern version of it.
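One way the modern (Durstenfeld) version might look in C, using plain rand() since the question is tagged gcc; note that rand() % (i + 1) has a slight modulo bias, which is usually acceptable for a practice exercise:

    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    /* Modern Fisher-Yates: walk the array from the end, swapping each
       element with a randomly chosen element at or before it. */
    static void shuffle(char *a, size_t n)
    {
        if (n < 2)
            return;
        for (size_t i = n - 1; i > 0; i--) {
            size_t j = (size_t)rand() % (i + 1);
            char tmp = a[i];
            a[i] = a[j];
            a[j] = tmp;
        }
    }

    int main(void)
    {
        char alphabet[] = "ABCDEFGHIJKLMNOPQRSTUVWXYZ";
        srand((unsigned)time(NULL));             /* seed once per run */
        shuffle(alphabet, sizeof alphabet - 1);  /* exclude the '\0' */
        puts(alphabet);
        return 0;
    }

Each run prints one permutation; passing the result through shuffle() again yields another, which is exactly the behaviour the question asks for.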

Mark Byers