I am writing a test tool which places a large amount of load on a network service. I would like this tool to start with little load and gradually increase it over time. I am sure there is some trigonometry that can do this sort of calculation in one line of code, but I am not a math guru (yet). Is there some sort of library (or simple algorithm) which can help with this calculation?
The code would ideally take a few arguments:
- algorithm to use (determines how quickly the value increases)
- starting value
- ending value (maximum)
- time (amount of time between starting and ending value)
- step (granularity in milliseconds)
So every [step] an event would be raised indicating what the value is at that point in time (a rough sketch of what I have in mind is below).
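I am not tied to any particular language; this is just a minimal Python sketch of the shape I am imagining, where `ramp`, `curve`, and `on_value` are placeholder names I made up:

```python
import math
import time

def ramp(curve, start, end, duration_ms, step_ms, on_value):
    """Call on_value(value) every step_ms milliseconds, moving from
    start to end over duration_ms according to the supplied curve.

    curve maps normalized time t in [0, 1] to a normalized value in
    [0, 1], e.g. a quadratic or sine ease-in."""
    elapsed = 0
    while elapsed <= duration_ms:
        t = elapsed / duration_ms            # normalized time, 0..1
        on_value(start + (end - start) * curve(t))
        time.sleep(step_ms / 1000.0)
        elapsed += step_ms

# Example curves: both start slowly and accelerate toward the end.
ease_in_quad = lambda t: t * t
ease_in_sine = lambda t: 1 - math.cos(t * math.pi / 2)

# Ramp from 10 to 500 over 60 seconds, raising an "event" every 250 ms.
ramp(ease_in_quad, 10, 500, 60_000, 250, lambda v: print(f"load: {v:.1f}"))
```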
This is just my ideal implementation, though, so I am open to suggestions.
Any input would be greatly appreciated, thank you :)
EDIT:
Let me be clearer: the amount by which the value increases is not linear; it follows a curve.
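For example, with a quadratic ease-in the value stays close to the starting load for most of the ramp and only climbs steeply near the end (purely illustrative, the numbers are made up):

```python
# Illustrative only: compare a linear ramp with a quadratic ease-in.
start, end = 10, 500
for t in (0.0, 0.25, 0.5, 0.75, 1.0):        # normalized time
    linear = start + (end - start) * t
    quad = start + (end - start) * t ** 2    # slow start, fast finish
    print(f"t={t:.2f}  linear={linear:6.1f}  quadratic={quad:6.1f}")
```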