I guess this is a math question and not a programming question, but what is a good way to create a formula that has diminishing returns?
Here are some example points showing how I want the curve to look:
f(1)   = 1
f(1.5) = 0.98
f(2)   = 0.95
f(2.5) = 0.9
f(3)   = 0.8
f(4)   = 0.7
f(5)   = 0.6
f(10)  = 0.5
f(20)  = 0.25
Notice that as the input gets higher, the returned percentage keeps dropping. Is there a way to model a function with a smooth curve that accurately fits points like these?
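To illustrate what I mean by "fitting a smooth curve", here is a rough Python sketch. It assumes SciPy is available, and the rational form 1 / (1 + a*(x - 1)^b) is just one candidate family I pulled out of the air, not something I know to be the right shape:

```python
import numpy as np
from scipy.optimize import curve_fit

# Sample points from above.
xs = np.array([1, 1.5, 2, 2.5, 3, 4, 5, 10, 20], dtype=float)
ys = np.array([1, 0.98, 0.95, 0.9, 0.8, 0.7, 0.6, 0.5, 0.25])

def f(x, a, b):
    # One candidate diminishing-returns family: equals 1 at x = 1
    # and decays toward 0 as x grows.
    return 1.0 / (1.0 + a * (x - 1.0) ** b)

# Least-squares fit of a and b to the sample points; p0 and bounds are guesses.
(a, b), _ = curve_fit(f, xs, ys, p0=[0.1, 1.0], bounds=([0.0, 0.1], [10.0, 5.0]))

print(f"a = {a:.3f}, b = {b:.3f}")
for x, y in zip(xs, ys):
    print(f"f({x:g}) = {f(x, a, b):.3f}  (wanted {y})")
```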
Another way to put it is with a real example. You know how Diablo II has Magic Find? Magic find has diminishing returns: if you have 100%, your effective magic find is still 100%, but the more you stack, the less each additional point is worth. So much so that if you had 1200%, your effective magic find might only be around 450%. So they have a function like:
actualMagicFind(magicFind) = // some way to reduce magic find
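Something along these lines is the kind of shape I imagine, written as a Python sketch. The mf * k / (mf + k) form and the constant k = 600 are pure assumptions on my part (it doesn't match my "100% stays 100%" example exactly), not Blizzard's actual formula:

```python
def actual_magic_find(magic_find: float, k: float = 600.0) -> float:
    # Rational diminishing returns: roughly linear for small values,
    # flattening out toward k as magic_find grows. k is a made-up constant.
    return magic_find * k / (magic_find + k)

print(actual_magic_find(100))   # ~85.7 (small values are only slightly reduced)
print(actual_magic_find(1200))  # 400.0 (large values are squashed hard)
```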