tags:
views: 314
answers: 7

My Application can perform 5 business functions. I now have a requirement to build this into the licensing model for the application.

My idea is to ship a "keyfile" with the application. The file should contain some encrypted data about which functions are enabled in the app and which are not. I want it to be semi hack-proof too, so that not just any idiot can figure out the logic and "crack" it.

The decrypted version of this file should contain for example:

   BUSINESS FUNCTION 1 = ENABLED
   BUSINESS FUNCTION 2 = DISABLED.... etc

Please can you give me some ideas on how to do this? Thank you in advance.

A: 

ROT-13!

Edit:

ROT-13 is a simple substitution cipher in which each letter is replaced by the letter 13 places away from it in the alphabet, so applying it twice gives back the original text. (Note: alternatively, you can subtract 13 from the ASCII value of each character to support more than just [A-Z0-9].)

For more info see Wikipedia.
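As a rough illustration, a ROT-13 pass over the keyfile text could look something like this in C# (the class and method names are mine, and only the letters are rotated; everything else passes through unchanged):

    using System;
    using System.Text;

    static class Rot13
    {
        // Rotates each ASCII letter 13 places; applying the same method twice
        // restores the original text, so it serves as both encrypt and decrypt.
        public static string Transform(string input)
        {
            var sb = new StringBuilder(input.Length);
            foreach (char c in input)
            {
                if (c >= 'a' && c <= 'z')
                    sb.Append((char)('a' + (c - 'a' + 13) % 26));
                else if (c >= 'A' && c <= 'Z')
                    sb.Append((char)('A' + (c - 'A' + 13) % 26));
                else
                    sb.Append(c);   // digits, '=' and spaces are left as-is
            }
            return sb.ToString();
        }
    }

    // Rot13.Transform("BUSINESS FUNCTION 1 = ENABLED") -> "OHFVARFF SHAPGVBA 1 = RANOYRQ"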

tster
ROT-* is neither secure nor something that could help the poster, because you didn't provide additional information about what ROT is or how to implement it in C#.
tobsen
For the record, using ROT-13 would probably stop 99% of users from being able to give themselves free access to your features, and it would be the fastest to implement, saving you the most money. Unless this is big-buck software, ROT-13 would be sufficient for this kind of thing.
tster
Yes please, tell us what ROT-13 is?
JL
OK, while it's true ROT-13 is not the most secure mechanism on earth, the answer does not deserve to be voted down; it's still a valid option for requirements that don't call for hefty security. It's still better, after all, than plain text... and still bound to confuse a few packet sniffers.
JL
+1  A: 

You could achieve this fairly easily using Rijndael; however, the problem is that in your current design the code will contain your key. This basically means someone will disassemble your code to find the key and boom, goodbye protection. You could slow this process down by also obfuscating your code, but again, if they want to get it, they will get it.

However, this aside, to answer your question, this code should work for you:

http://www.dotnetspark.com/kb/810-encryptdecrypt-text-files-using-rijndael.aspx
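In case the link goes dead, here is a rough sketch of the same idea using the framework's RijndaelManaged class; the hard-coded key, IV and file name are purely illustrative (and hard-coding them is exactly the weak point described above):

    using System;
    using System.IO;
    using System.Security.Cryptography;
    using System.Text;

    static class KeyFileCrypto
    {
        // Illustrative only: a key/IV compiled into the assembly can be recovered by disassembly.
        private static readonly byte[] Key = Encoding.ASCII.GetBytes("0123456789ABCDEF0123456789ABCDEF"); // 256-bit key
        private static readonly byte[] IV  = Encoding.ASCII.GetBytes("0123456789ABCDEF");                 // 128-bit block

        public static void EncryptToFile(string plainText, string path)
        {
            using (var rijndael = new RijndaelManaged())
            using (var encryptor = rijndael.CreateEncryptor(Key, IV))
            using (var fs = new FileStream(path, FileMode.Create))
            using (var cs = new CryptoStream(fs, encryptor, CryptoStreamMode.Write))
            using (var writer = new StreamWriter(cs))
                writer.Write(plainText);
        }

        public static string DecryptFromFile(string path)
        {
            using (var rijndael = new RijndaelManaged())
            using (var decryptor = rijndael.CreateDecryptor(Key, IV))
            using (var fs = new FileStream(path, FileMode.Open))
            using (var cs = new CryptoStream(fs, decryptor, CryptoStreamMode.Read))
            using (var reader = new StreamReader(cs))
                return reader.ReadToEnd();
        }
    }

    // KeyFileCrypto.EncryptToFile("BUSINESS FUNCTION 1 = ENABLED\r\nBUSINESS FUNCTION 2 = DISABLED", "app.key");
    // string settings = KeyFileCrypto.DecryptFromFile("app.key");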

Kyle Rozendo
You should salt the key with some other data so that the actual encryption key used by the app is different for each machine or user. Otherwise, if someone pays for all the features, they can just give their key file to everyone, and now everyone has the features.
tster
A: 

Use any cryptography method to implement this. Just check out the 'System.Security.Cryptography' namespace.

That namespace provides many encryption and decryption functions to protect secret data.

Another way to implement this is via the registry: you can store the data in the Windows registry. It is better to encrypt the data before storing it in the registry.
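For instance, one way to follow that suggestion is to let DPAPI (the ProtectedData class in the same namespace, in System.Security.dll) do the encryption before writing to the registry; the subkey and value names below are just placeholders:

    using Microsoft.Win32;
    using System.Security.Cryptography;
    using System.Text;

    static class LicenseRegistryStore
    {
        private const string SubKey = @"Software\MyApp\License";   // placeholder location

        public static void Save(string settings)
        {
            // DPAPI ties the encryption to the current user's credentials,
            // so the value cannot simply be copied to another account or machine.
            byte[] cipher = ProtectedData.Protect(
                Encoding.UTF8.GetBytes(settings), null, DataProtectionScope.CurrentUser);

            using (var key = Registry.CurrentUser.CreateSubKey(SubKey))
                key.SetValue("Features", cipher, RegistryValueKind.Binary);
        }

        public static string Load()
        {
            using (var key = Registry.CurrentUser.OpenSubKey(SubKey))
            {
                var cipher = (byte[])key.GetValue("Features");
                byte[] plain = ProtectedData.Unprotect(cipher, null, DataProtectionScope.CurrentUser);
                return Encoding.UTF8.GetString(plain);
            }
        }
    }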

Anoop
JL can't really ship the registry with his application.
MusiGenesis
+4  A: 

While it could definitely be done using Rijndael, you could also try an asymmetric approach to the problem. Require the application to decrypt the small settings file on startup using a public key, and only send out new configuration files encrypted using the private key.

Depending on the size of your configuration file, this will cause a performance hit on startup compared to the Rijndael algorithm, but even if the client decompiles the program and gets your public key, it's not going to matter as far as the config file is concerned, since they won't have the private key to make a new one.

Of course, none of this accounts for the especially rogue client who decompiles your program and removes all the checking whatsoever... but chances are this client won't pay for your product no matter what you do, which puts you in a position of diminishing returns and is a whole new question altogether.
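A small sketch of the vendor-side setup under this approach, assuming RSA with XML-serialized keys (the file names are mine). Note that in .NET this "encrypt with the private key, decrypt with the public key" pattern is normally realised as signing with the private key and verifying with the public key; the last answer below shows the verification side:

    using System.IO;
    using System.Security.Cryptography;

    class KeyPairSetup
    {
        static void Main()
        {
            using (var rsa = new RSACryptoServiceProvider(2048))
            {
                // Keep this to yourself; it is what produces the configuration files.
                File.WriteAllText("private.xml", rsa.ToXmlString(true));

                // Ship only this with the application; it can verify but never produce.
                File.WriteAllText("public.xml", rsa.ToXmlString(false));
            }
        }
    }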

Streklin
+2  A: 

Probably the easiest secure solution is to actually use online activation of the product. The client would install your product and enter his key (or other purchase identification; if the purchase is made online this could all be integrated, whereas if you are selling a boxed product, a key is more convenient).

You then use this identification to determine what features are available and send back an encrypted "keyfile" (as you term it), but also a custom key (it can be randomly generated; both the key and the key file would be stored on your server, associated with that identification).

You then need to make sure the key file doesn't work on other computers; you can do this by having the computer send back its machine ID and using that as added salt.
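A sketch of that added salt, assuming PBKDF2 for the derivation and (purely for illustration) the machine name as the machine ID; pick whatever hardware identifiers suit your scenario:

    using System;
    using System.Security.Cryptography;
    using System.Text;

    static class MachineKey
    {
        // Derives a per-machine encryption key from the purchase identification.
        public static byte[] Derive(string purchaseId)
        {
            byte[] machineId;
            using (var sha = SHA256.Create())
                machineId = sha.ComputeHash(Encoding.UTF8.GetBytes(Environment.MachineName));

            var kdf = new Rfc2898DeriveBytes(purchaseId, machineId, 10000);
            return kdf.GetBytes(32);   // e.g. a 256-bit key for the symmetric step
        }
    }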

Adam Luter
+2  A: 

I've been pondering using custom-built assemblies for the purpose of application licensing. The key file approach is inherently flawed. Effectively, it's a bunch of flags saying "Feature X is enabled, Feature Y is not". Even if we encrypt it, the application will have all the functionality built in, along with the method to decrypt the file. Any determined hacker is unlikely to find it terribly hard to break this protection (though it may be enough to keep the honest people honest, which is really what we want).

Let's assume this approach of encrypted "Yay/Nay" feature flags is not enough. What would be better is to actually not ship the restricted functionality at all. Using dynamic assembly loading, we can easily put just one or two core functions from each restricted feature into another assembly and pull them in when needed. These extra "enablement" assemblies become the keyfiles. For maximum security, you can sign them with your private key, and not load them unless they're well signed.
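A rough sketch of the loading side under this scheme; the expected public key token, assembly path and type name are placeholders, and checking the token is only a quick sanity check on top of the runtime's own strong-name verification:

    using System;
    using System.Linq;
    using System.Reflection;

    static class FeatureLoader
    {
        // Public key token of your strong-name key pair (placeholder bytes).
        private static readonly byte[] ExpectedToken = { 0xB0, 0x3F, 0x5F, 0x7F, 0x11, 0xD5, 0x0A, 0x3A };

        public static object LoadFeature(string assemblyPath, string typeName)
        {
            AssemblyName name = AssemblyName.GetAssemblyName(assemblyPath);
            byte[] token = name.GetPublicKeyToken();

            // Refuse "enablement" assemblies that do not carry our key.
            if (token == null || !token.SequenceEqual(ExpectedToken))
                throw new InvalidOperationException("Feature assembly is not signed with our key.");

            Assembly assembly = Assembly.LoadFrom(assemblyPath);
            return Activator.CreateInstance(assembly.GetType(typeName, true));
        }
    }

    // Hypothetical usage:
    // var feature2 = FeatureLoader.LoadFeature("Feature2.Enablement.dll", "MyApp.Features.Feature2");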

Moreover, for each customer, your build and licensing process could include some hard-to-find, customer-specific data that effectively ties each enablement assembly to that customer. If they choose to distribute them, you can track them back easily.

The advantage of this approach over simple Yay/Nay key files is that the application itself does not include the functionality of the restricted modes. It cannot be hacked without at least a strong idea of what these extra assemblies do - if the hacker removes their loading (as they would remove the keyfile), the code just can't function.

Disadvantages of this approach include patch releases, which is somewhat mitigated by keeping the code in the keyfile assemblies simple and compact (yet critical). Custom construction of an assembly for each customer may also be tricky, depending on your distribution scenario.

Adam Wright
+1  A: 

I find the Perforce-style protection scheme the easiest to implement and use, while at the same time being quite hack-proof. The technique uses a plain-text file with a validation signature attached as the last line. For example:

----(file begin)
key1: value1
key2: value2
expires: 2010-09-25
...
keyN: valueN
checksum: (base64-encoded blob)
---- (file end)

You would choose an asymmetric (public/private key) encryption algorithm plus a hashing algorithm of your choice. Generate your reference public/private key pair and include the public key in your program. Then write a small utility program that takes an unsigned settings file and signs it: compute the digital signature for the contents of the file (read the settings file, compute its hash, encrypt this hash using the private key) and attach it (e.g. base64-encoded) as the "checksum" in the last line.

Now when your program loads the settings file, it reads the embedded public key and validates the digital signature (read the file contents, strip the last line, compute the hash; compare this value against the checksum from the last line, base64-decoded and run through asymmetric decryption using the embedded public key). If the validation succeeds, you know the settings file has not been tampered with.

I find the advantages to be that the settings are in plain text (so, for example, the customer can see when the license expires or what features they paid for); however, changing even a single character in the file will result in the digital signature check failing. Also, keep in mind that you are now not shipping any private knowledge with your program. Yes, hackers can reverse-engineer your program, but they will only find the public key. To be able to sign an altered settings file, they will have to find the private key. Good luck doing that unless you're a three-letter agency... :-).
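A minimal sketch of both halves of the scheme, assuming RSA with XML-serialized keys and using the framework's SignData/VerifyData calls (which perform the hash-then-encrypt step described above in one go); the "checksum: " prefix and the line-ending handling are simplified for illustration:

    using System;
    using System.IO;
    using System.Security.Cryptography;
    using System.Text;

    static class SignedSettings
    {
        // Vendor side: append the "checksum" line. The private key never ships with the program.
        public static void Sign(string path, string privateKeyXml)
        {
            string body = string.Join(Environment.NewLine, File.ReadAllLines(path)) + Environment.NewLine;
            using (var rsa = new RSACryptoServiceProvider())
            {
                rsa.FromXmlString(privateKeyXml);
                byte[] signature = rsa.SignData(Encoding.UTF8.GetBytes(body), new SHA1CryptoServiceProvider());
                File.WriteAllText(path, body + "checksum: " + Convert.ToBase64String(signature) + Environment.NewLine);
            }
        }

        // Client side: strip the last ("checksum") line and verify it against the rest of the file.
        public static bool Verify(string path, string publicKeyXml)
        {
            string[] lines = File.ReadAllLines(path);
            string last = lines[lines.Length - 1];
            if (!last.StartsWith("checksum: "))
                return false;

            byte[] signature = Convert.FromBase64String(last.Substring("checksum: ".Length));
            string body = string.Join(Environment.NewLine, lines, 0, lines.Length - 1) + Environment.NewLine;

            using (var rsa = new RSACryptoServiceProvider())
            {
                rsa.FromXmlString(publicKeyXml);
                return rsa.VerifyData(Encoding.UTF8.GetBytes(body), new SHA1CryptoServiceProvider(), signature);
            }
        }
    }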

Milan Gardian