Alright, this is probably gonna be a pretty simple question to answer. I haven't had a math class dealing with logarithms in a few years, so I apologize in advance.

I have a USB controller that I'm using to move the mouse on screen with the left joystick. The controller returns a double between 0.00 and 1.00 depending on how far you push the joystick in a given direction (0.00 centered, 1.00 pushed all the way over). Right now I adjust the mouse speed by multiplying that returned value by a given speed (returned double * speed), which gives me a linear response.

For the sake of accuracy when clicking things on screen, I'd like the response to be more logarithmic: really slow when the stick is barely pushed, with the speed ramping up as you push it farther. That way I get good speed for moving across the screen while keeping good sensitivity for fine movements. I just need help with the formula, as I'm sure it's pretty simple. I'm working in Java. Right now my code is:
double value;            // value given by the controller, 0.00 (center) to 1.00 (fully pushed)
int speed = 25;
value += value * speed;  // linear scaling, i.e. value * (speed + 1)
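In case it helps to show what I'm picturing, here's a rough sketch of the kind of curve I have in mind. I'm only guessing that squaring the value (a power curve via Math.pow) gives the slow-near-center, fast-near-edge feel I described, so the exponent here is a placeholder rather than a formula I'm sure about:

// rough sketch, assuming a squared (power) curve; the exponent 2 is just a guess
double value = 0.25;                 // example joystick reading, 0.00 - 1.00
int speed = 25;
double curved = Math.pow(value, 2);  // small pushes stay small, a full push still reaches 1.0
double output = curved * speed;      // then scale by the speed like before

But I don't know whether a power curve like that counts as "logarithmic" or whether there's a better standard formula for this, which is really what I'm asking.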
Either way, I take the final value and use it to move the mouse. Thanks, Brayden