views:

63

answers:

2

I have a large database of links that I want to protect against people who would copy them. Is there anything I can do other than forcing users to enter a CAPTCHA before each link?

+1  A: 

You can output the links encoded with ROT13, then use JavaScript to put them back to normal. That way, scrapers must support JavaScript in order to steal your links, which should cut down on the number of eligible scrapers.

Bonus points: replace ROT13 with something harder, and obfuscate your 'decode' JavaScript.
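A minimal sketch of that approach in browser JavaScript; the `obf` class and `data-href` attribute are made-up conventions for illustration, not anything standard:

    // Server emits links like:
    //   <a class="obf" data-href="uggc://rknzcyr.pbz/cntr">example</a>
    // where data-href is the real URL passed through ROT13.
    function rot13(s) {
        return s.replace(/[a-zA-Z]/g, function (c) {
            var base = c <= 'Z' ? 65 : 97;
            return String.fromCharCode((c.charCodeAt(0) - base + 13) % 26 + base);
        });
    }

    // Restore the real hrefs once the page has loaded.
    document.addEventListener('DOMContentLoaded', function () {
        var links = document.querySelectorAll('a.obf');
        for (var i = 0; i < links.length; i++) {
            links[i].href = rot13(links[i].getAttribute('data-href'));
        }
    });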

mkoryak
What would make it really hard to crack is alternating between several encode/decode algorithms: sometimes the URL would be ROT13, other times base64, or something else again. Unpredictable encodings are hard for a scraper to recognize, while you always know which method you used.
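A sketch of that idea, assuming the server tags each link with a data-enc attribute naming the scheme it used (the attribute names and the scheme set here are invented):

    // <a data-enc="b64" data-href="aHR0cDovL2V4YW1wbGUuY29t">...</a>
    // <a data-enc="rev" data-href="moc.elpmaxe//:ptth">...</a>
    var decoders = {
        b64: function (s) { return atob(s); },                        // base64
        rev: function (s) { return s.split('').reverse().join(''); } // reversed
    };

    document.querySelectorAll('a[data-enc]').forEach(function (a) {
        var decode = decoders[a.getAttribute('data-enc')];
        if (decode) a.href = decode(a.getAttribute('data-href'));
    });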
Teun D
+1  A: 

The JavaScript suggestion could work, but you would render your page inaccessible to anyone using assistive technologies like screen readers, as well as to anyone without JavaScript.

Another possible option would be to generate a cryptographic nonce. This technique is currently used to protect against CSRF attacks, but it can also ensure that a scraper has to request a page from your site before it can follow a link. This approach may not be appropriate if you support hotlinking, but if you just want to make sure that someone went to your site first, it could work.
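A rough sketch of the nonce idea using plain Node.js; the endpoints and the in-memory store are assumptions for illustration, not a prescribed design:

    const http = require('http');
    const crypto = require('crypto');

    const issued = new Set(); // nonces handed out with served pages

    http.createServer((req, res) => {
        const url = new URL(req.url, 'http://localhost');

        if (url.pathname === '/') {
            // Embed a fresh nonce in each outbound link on the page.
            const nonce = crypto.randomBytes(16).toString('hex');
            issued.add(nonce);
            res.setHeader('Content-Type', 'text/html');
            res.end('<a href="/go?id=42&nonce=' + nonce + '">example</a>');
        } else if (url.pathname === '/go') {
            // Redirect only if the nonce came from a page we served;
            // delete() also makes each nonce single-use.
            if (issued.delete(url.searchParams.get('nonce'))) {
                res.writeHead(302, { Location: 'http://example.com/target-42' });
                res.end();
            } else {
                res.writeHead(403);
                res.end('Bad or reused nonce');
            }
        } else {
            res.writeHead(404);
            res.end();
        }
    }).listen(8080);

In a real deployment the nonce store would need expiry and would have to live somewhere shared across servers, but the shape of the check is the same.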

Another somewhat crude option would be to use referrers. These can be easily faked, but checking them might stop some of the dumber scrapers. This also requires that you know where your users came from before they hit your site.
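A sketch of the referrer check, again in plain Node.js; yoursite.example stands in for your own domain, and as noted the header is trivial to forge:

    const http = require('http');

    http.createServer((req, res) => {
        const referer = req.headers['referer'] || '';
        // Serve the redirect only if the request claims to come from
        // one of our own pages. Easily faked, but filters naive bots.
        if (referer.indexOf('http://yoursite.example/') === 0) {
            res.writeHead(302, { Location: 'http://example.com/target' });
            res.end();
        } else {
            res.writeHead(403);
            res.end('Direct access not allowed');
        }
    }).listen(8081);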

Can you let us know whether you allow hotlinking, or whether users come to your site before following a protected link? We might be able to give better advice that way.

Jarret R
Users will always come from my site before going to a link.
Yegor