One section of our website presents paged, randomized content. The first time a new user visits the site, they are assigned a new random seed, which is passed in URLs and, for persistence, also stored in a cookie. The problem is that the seed in URLs confuses Googlebot (and other indexing services); it complains that there are too many URLs pointing to the same content. We could stop passing the seed in URLs, but even if we relied only on cookies, it seems to me that at some point we would still have to decide whether the visitor is an indexing spider or a human in order to present the content in a non-randomized fashion.
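For concreteness, the flow is roughly the following (a simplified Flask-style sketch, not our actual code; names like `/browse`, `seed`, and the placeholder items are purely illustrative):

    import random
    from flask import Flask, request, make_response

    app = Flask(__name__)

    ITEMS = [f"item-{i}" for i in range(100)]  # placeholder content
    PAGE_SIZE = 10

    @app.route("/browse")
    def browse():
        # Reuse the seed from the URL or cookie; otherwise assign a new one.
        seed = (request.args.get("seed")
                or request.cookies.get("seed")
                or str(random.randrange(2**32)))

        # Shuffle deterministically for this visitor, then page through the result.
        ordering = ITEMS[:]
        random.Random(seed).shuffle(ordering)
        page = int(request.args.get("page", 1))
        items = ordering[(page - 1) * PAGE_SIZE : page * PAGE_SIZE]

        resp = make_response("\n".join(items))
        resp.set_cookie("seed", seed)  # persist the seed across visits
        return resp

So every visitor sees the same set of items, just in a per-visitor random order, and the paging only makes sense if the seed is carried along somehow.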
My main question is: how bad would it be, in this case, to detect the most common indexing spiders and serve them the content in a non-randomized fashion? I know that the number-one rule of search engine optimization is not to optimize and, if anything, to optimize for users and make sure the content is the same for everybody. But in this case we would not actually be changing the content or hiding anything.
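The kind of check I have in mind looks something like the sketch below (the crawler list is illustrative, and I realize User-Agent strings can be spoofed, so this would only catch the common, well-behaved bots):

    import random

    KNOWN_CRAWLERS = ("googlebot", "bingbot", "slurp", "duckduckbot",
                      "baiduspider", "yandexbot")

    def is_crawler(user_agent: str) -> bool:
        """Return True if the User-Agent header looks like a known indexing bot."""
        ua = (user_agent or "").lower()
        return any(bot in ua for bot in KNOWN_CRAWLERS)

    def ordering_for(items, user_agent, seed):
        """Crawlers get a stable canonical order; humans get their seeded shuffle."""
        if is_crawler(user_agent):
            return sorted(items)               # same order on every crawl, no seed needed
        shuffled = list(items)
        random.Random(seed).shuffle(shuffled)  # per-visitor randomized order
        return shuffled

The crawlers would still see exactly the same items on exactly the same URLs, just in one stable order instead of a randomized one.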
Has anybody faced the same problem? What are the best practices for dealing with this issue?