If the URLs don't change very often, you can run a somewhat complicated job periodically (nightly?) that assigns an integer to each record based on the distinct sites present.
What you can do is write a routine that parses the domain out from a URL (you should be able to find a snippet that does this nearly anywhere).
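As a sketch of that routine, assuming Python, the standard urllib.parse module plus a naive "keep the last two hostname labels" rule matches the examples below (www.example.com and use.perl.org collapse to example.com and perl.org). A real implementation should consult the public suffix list to handle domains like example.co.uk:

```python
from urllib.parse import urlparse

def parse_domain(url):
    """Extract the registrable domain from a URL.

    Naive approach: keep only the last two labels of the hostname,
    so www.example.com becomes example.com and use.perl.org becomes
    perl.org.  This is wrong for suffixes like .co.uk; use the
    public suffix list in production code.
    """
    host = urlparse(url).hostname or ""
    return ".".join(host.split(".")[-2:])

print(parse_domain("http://www.example.com/some/file"))  # example.com
print(parse_domain("http://use.perl.org/"))              # perl.org
```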
Then, you create a temporary table that contains each unique domain, plus a number.
Then, for every record in your URLs table, you look up the domain in your temp table, assign that record the number stored there, and add the count of distinct domains to that temp table's number.
Then for the rest of the day, sort by the number.
Here's an example with the five records you used in your question:
URLs:
http://www.example.com/some/file NULL
http://www.example.com/some/other/file NULL
http://stackoverflow.com/questions/ask NULL
http://stackoverflow.com/tags NULL
http://use.perl.org/ NULL
Temp table:
example.com 1
stackoverflow.com 2
perl.org 3
Then for each URL, you look up the value in the temp table, and add 3 to it (because there are 3 distinct domains):
Iteration 1:
URLs:
http://www.example.com/some/file 1
http://www.example.com/some/other/file NULL
http://stackoverflow.com/questions/ask NULL
http://stackoverflow.com/tags NULL
http://use.perl.org/ NULL
Temp table:
example.com 4
stackoverflow.com 2
perl.org 3
Iteration 2:
URLs:
http://www.example.com/some/file 1
http://www.example.com/some/other/file 4
http://stackoverflow.com/questions/ask NULL
http://stackoverflow.com/tags NULL
http://use.perl.org/ NULL
Temp table:
example.com 7
stackoverflow.com 2
perl.org 3
et cetera, until you end up with:
http://www.example.com/some/file 1
http://www.example.com/some/other/file 4
http://stackoverflow.com/questions/ask 2
http://stackoverflow.com/tags 5
http://use.perl.org/ 3
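The whole procedure can be sketched in Python (parse_domain here is a naive stand-in; any domain parser will do, and in a database you'd do this with an UPDATE loop rather than in memory):

```python
from urllib.parse import urlparse

def parse_domain(url):
    # Naive domain extraction: last two labels of the hostname.
    host = urlparse(url).hostname or ""
    return ".".join(host.split(".")[-2:])

def round_robin_order(urls):
    """Assign each URL a sort key so that sorting by the key
    interleaves the domains round-robin style."""
    # Step 1: number each distinct domain 1..n (the "temp table").
    counters = {}
    for url in urls:
        counters.setdefault(parse_domain(url), len(counters) + 1)
    n = len(counters)  # count of distinct domains

    # Step 2: hand each URL its domain's current number, then
    # bump that domain's counter by n.
    keyed = []
    for url in urls:
        d = parse_domain(url)
        keyed.append((counters[d], url))
        counters[d] += n

    # Step 3: sort by the assigned number.
    return [url for _, url in sorted(keyed)]

urls = [
    "http://www.example.com/some/file",
    "http://www.example.com/some/other/file",
    "http://stackoverflow.com/questions/ask",
    "http://stackoverflow.com/tags",
    "http://use.perl.org/",
]
for u in round_robin_order(urls):
    print(u)
```

On the five records above this assigns the keys 1, 4, 2, 5, 3, so the sorted output cycles example.com, stackoverflow.com, perl.org, example.com, stackoverflow.com.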
With a lot of records this will be slow, and it won't cope well with frequent inserts and deletions, but the result is a flawless round-robin ordering.