I need to find the total timing for executions of a program over different inputs. The program reads some data and writes it to another file. The data values and the data size are different every time.
I want to find out how long it will take, in general, for any size of data.
Is there an algorithm for estimating this based on the total timing of a single execution of the program?
For example, if I know that for a single execution:
- a.program's execution time is 1.2 s
- it creates a 100 KB file
Can I find out how long n executions will take, each on a different data size?
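To make the question concrete, here is a rough sketch of what I mean by timing the program over a few data sizes and then naively extrapolating. The command ./a.program, its --size flag, and the input sizes are placeholders for illustration, not my real program:

    import subprocess
    import time

    # Placeholder input sizes in KB; my real inputs vary.
    sizes_kb = [100, 200, 400, 800]

    timings = {}
    for size in sizes_kb:
        # "./a.program --size N" stands in for my actual command line.
        start = time.perf_counter()
        subprocess.run(["./a.program", "--size", str(size)], check=True)
        timings[size] = time.perf_counter() - start

    for size, elapsed in timings.items():
        print(f"{size} KB -> {elapsed:.2f} s")

    # Naive estimate for n executions at some new size, assuming the time
    # scales linearly with data size (which is exactly what I am not sure of).
    n = 10
    new_size_kb = 500
    per_run = timings[100] * (new_size_kb / 100)
    print(f"Estimated total for {n} runs at {new_size_kb} KB: {n * per_run:.2f} s")

Is measuring a handful of sizes like this and extrapolating the right approach, or is there a better way to predict the time for arbitrary data sizes from one measurement?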