This is a bit messy, but it's simple and it works well:
First decide what intervals should be allowed between ticks. Because of the craziness of the time system (base 10? base 60? how long is a month?) there is no particularly good algorithm to generate this list - just pick natural intervals that people are familiar with and hardcode them into your program (see the sketch after the list below):
...etc...
every 0.1 second
every 1 second
every 5 seconds
every 15 seconds
every 1 minute
every 5 minutes
every 15 minutes
every 1 hour
every 2 hours
every 4 hours
every 8 hours
every day, midnight
every 7 days, midnight
every month start
every quarter start
every year start
every 10 years
...etc...
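As a rough sketch (in Python, since the question isn't tied to a language), the hardcoded table could just be a list of interval lengths in seconds. The calendar-sized steps are approximated here; a real implementation would snap those to actual month/quarter/year boundaries:

```python
# Candidate tick intervals, finest first, in seconds.
# Month/quarter/year/decade are rough approximations; snap them to
# real calendar boundaries when actually placing the ticks.
TICK_INTERVALS = [
    0.1,                     # every 0.1 second
    1,                       # every second
    5,                       # every 5 seconds
    15,                      # every 15 seconds
    60,                      # every minute
    5 * 60,                  # every 5 minutes
    15 * 60,                 # every 15 minutes
    60 * 60,                 # every hour
    2 * 60 * 60,             # every 2 hours
    4 * 60 * 60,             # every 4 hours
    8 * 60 * 60,             # every 8 hours
    24 * 60 * 60,            # every day, midnight
    7 * 24 * 60 * 60,        # every 7 days, midnight
    30 * 24 * 60 * 60,       # ~every month start
    91 * 24 * 60 * 60,       # ~every quarter start
    365 * 24 * 60 * 60,      # ~every year start
    10 * 365 * 24 * 60 * 60, # ~every 10 years
]
```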
Then, given a specific axis width and a specific time interval to display, simply iterate over your list, calculating how many ticks each scale would produce and how far apart those ticks would be in pixels. Both values fall out of simple division. Choose the scale that gives the largest number of ticks without letting them get too close together. This naive linear scan should give perfectly adequate performance; you could use a binary search rather than iterating over the whole list if you wanted to optimize it, but it's probably not worth the effort.
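Continuing the sketch above (the 50-pixel minimum spacing is an arbitrary value picked for illustration), the selection loop could look something like this:

```python
def pick_tick_interval(axis_width_px, time_span_seconds, min_spacing_px=50):
    """Return the finest interval whose ticks stay at least min_spacing_px
    apart, i.e. the scale with the most ticks that aren't too crowded."""
    for interval in TICK_INTERVALS:  # finest first
        num_ticks = time_span_seconds / interval
        if num_ticks <= 0:
            continue  # degenerate time span; skip to avoid dividing by zero
        spacing_px = axis_width_px / num_ticks
        if spacing_px >= min_spacing_px:
            return interval
    # Even the coarsest interval is too dense; use it anyway.
    return TICK_INTERVALS[-1]

# Example: an 800 px wide axis showing 3 hours of data -> 15-minute ticks.
interval = pick_tick_interval(800, 3 * 60 * 60)
```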
It's a bit annoying, but I don't know of a better way unless you can find a library that does this for you. I'm not aware of one that offers this function, but there must be tons of open source projects that do something similar, so you could grab code from one of them if you don't want to write this yourself.