I want to protect my website from site copiers. I have looked at the Ajax Toolkit NoBot, but unfortunately it does not meet my requirements.
Below are my requirements.
- Only about 0.5% of my pages have post-backs; the rest look like static pages. So detection should happen at the initial request, not at post-back.
- At the same time, I want to allow search engine crawlers. What is the best way to detect search bots? Is checking the user agent not the right way? (A rough sketch of what I'm considering is below this list.)
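For example, I understand that Google and Bing both document verifying their crawlers with a reverse DNS lookup followed by a forward-confirm lookup, rather than trusting the user agent string. My site is ASP.NET, but here is a rough Python sketch of the idea as I understand it (the suffixes in `TRUSTED_SUFFIXES` are just the ones I know of from their documentation):

```python
import socket

# Hostname suffixes documented by Google and Bing for their crawlers.
TRUSTED_SUFFIXES = ('.googlebot.com', '.google.com', '.search.msn.com')

def is_verified_crawler(ip: str) -> bool:
    """Verify a claimed search bot: reverse DNS, then a forward-confirm lookup."""
    try:
        host, _, _ = socket.gethostbyaddr(ip)  # reverse lookup: IP -> hostname
    except socket.herror:
        return False
    if not host.endswith(TRUSTED_SUFFIXES):    # hostname must be in the crawler's domain
        return False
    try:
        forward_ips = socket.gethostbyname_ex(host)[2]  # forward lookup: hostname -> IPs
    except socket.gaierror:
        return False
    return ip in forward_ips  # forward result must include the original IP
```

Is this the right approach for my case, given that detection has to happen on the very first request?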
Also, is it possible to obfuscate the page content by padding extra words (my site URL, etc.) into the middle of the content, where those padded words are not displayed on my website? At the same time, these padded words should not be easy to remove with either jQuery (client-side) or HTMLDocument (server-side) code. Roughly what I have in mind is sketched below.
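Here is a rough Python sketch of the idea, not real code from my site: scatter hidden decoy spans through the text, with a class name randomized on every request so a scraper cannot strip them with one fixed jQuery or HTMLDocument selector. The `DECOYS` strings, `example.com`, and the 5% insertion rate are all made up for illustration:

```python
import random
import secrets

# Hypothetical watermark strings to pad into the content.
DECOYS = ['visit example.com', 'content copyright example.com']

def watermark(words: list[str]) -> str:
    """Scatter hidden decoy spans through the text, with a per-request
    random class name so no fixed selector can strip them all."""
    cls = 'w' + secrets.token_hex(4)  # different class name on every request
    out = []
    for word in words:
        out.append(word)
        if random.random() < 0.05:    # sprinkle decoys at random positions
            out.append(f'<span class="{cls}">{random.choice(DECOYS)}</span>')
    style = f'<style>.{cls}{{display:none}}</style>'  # hide decoys for real browsers
    return style + ' ' + ' '.join(out)
```

But I realise a scraper could still drop any `display:none` spans after parsing the CSS, so I am not sure how robust this can be made.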
Abstract ideas are also welcome. If your answer is simply "no, it's not possible", please don't post it; instead, please suggest whatever approaches might work.