I have a text file:
DATE 20090105
1 2.25 1.5
3 3.6 0.099
4 3.6 0.150
6 3.6 0.099
8 3.65 0.0499
DATE 20090105
DATE 20090106
1 2.4 1.40
2 3.0 0.5
5 3.3 0.19
7 2.75 0.5
10 2.75 0.25
DATE 20090106
DATE 20090107
2 3.0 0.5
2 3.3 0.19
9 2.75 0.5
DATE 20090107
On each day I have:
Time Rating Variance
I want to work out, for each time, the average variance over all the days.
The file is massive and this is just a small edited sample, which means I don't know the exact earliest time (it's around 2600) or the latest time (it may be around 50000).
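From what I have read so far, I imagine the file could be read a day at a time with something like the rough sketch below. The class name VarianceParser, the readDays method and the file name are just placeholders I have made up, and I am assuming the file is well formed (every day opens and closes with a DATE line):

    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.io.IOException;
    import java.util.ArrayList;
    import java.util.List;
    import java.util.TreeMap;

    public class VarianceParser {

        // One day's data: time -> list of variances recorded at that time.
        // A TreeMap keeps the times sorted, which should help later when
        // working out how long each value lasts.
        static List<TreeMap<Integer, List<Double>>> readDays(String fileName) throws IOException {
            List<TreeMap<Integer, List<Double>>> days = new ArrayList<>();
            TreeMap<Integer, List<Double>> currentDay = null;

            try (BufferedReader in = new BufferedReader(new FileReader(fileName))) {
                String line;
                while ((line = in.readLine()) != null) {
                    line = line.trim();
                    if (line.isEmpty()) {
                        continue;
                    }
                    if (line.startsWith("DATE")) {
                        // A DATE line either opens a new day or closes the current one.
                        if (currentDay == null) {
                            currentDay = new TreeMap<>();
                        } else {
                            days.add(currentDay);
                            currentDay = null;
                        }
                    } else {
                        // Data line: time rating variance
                        String[] parts = line.split("\\s+");
                        int time = Integer.parseInt(parts[0]);
                        double variance = Double.parseDouble(parts[2]); // parts[1] is the rating
                        currentDay.computeIfAbsent(time, t -> new ArrayList<>()).add(variance);
                    }
                }
            }
            return days;
        }
    }

Is reading it line by line like this sensible for a massive file, or should I be doing something else?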
So, for example, if at some time I only have one value across all the days, then that value is the average variance at that time.
At time t=2: on the first day the variance is 1.5, because the value recorded at t=1 lasts until t=3; on the second day it is 0.5; and on the third day there are two entries at t=2, so it is (0.5 + 0.19) / 2 = 0.345. The average variance over all the days at t=2 is then the sum of those values divided by the number of values at that time: (1.5 + 0.5 + 0.345) / 3 ≈ 0.782.
For the last time recorded in a day, its value is taken to last for 1 time unit.
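Following that rule, my rough idea is: average any duplicate times within a day first, spread each value over the times it covers, and then divide the per-time totals by the number of days that contributed a value at that time. The sketch below assumes the days come from the readDays method in my sketch above; VarianceAverager and averageByTime are again just names I have made up:

    import java.util.List;
    import java.util.Map;
    import java.util.TreeMap;

    public class VarianceAverager {

        // Average variance per time across all days, using the carry-forward rule:
        // a value holds from its time up to (but not including) the next recorded
        // time in the same day, and the last entry of a day holds for 1 time unit.
        static TreeMap<Integer, Double> averageByTime(List<TreeMap<Integer, List<Double>>> days) {
            TreeMap<Integer, Double> sums = new TreeMap<>();
            TreeMap<Integer, Integer> counts = new TreeMap<>();

            for (TreeMap<Integer, List<Double>> day : days) {
                Integer[] times = day.keySet().toArray(new Integer[0]);
                for (int i = 0; i < times.length; i++) {
                    // Several rows at the same time are averaged within the day first.
                    List<Double> vals = day.get(times[i]);
                    double dayValue = 0;
                    for (double v : vals) {
                        dayValue += v;
                    }
                    dayValue /= vals.size();

                    // This value covers [times[i], next time); the last entry covers 1 unit.
                    int end = (i + 1 < times.length) ? times[i + 1] : times[i] + 1;
                    for (int t = times[i]; t < end; t++) {
                        sums.merge(t, dayValue, Double::sum);
                        counts.merge(t, 1, Integer::sum);
                    }
                }
            }

            TreeMap<Integer, Double> averages = new TreeMap<>();
            for (Map.Entry<Integer, Double> e : sums.entrySet()) {
                averages.put(e.getKey(), e.getValue() / counts.get(e.getKey()));
            }
            return averages;
        }
    }

If the logic is right, then calling averageByTime on the result of readDays for the sample above should give roughly 0.782 at time t=2, but I am not sure this is the best way to do it in Java.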
I'm just wondering how I would go about this properly.
As a complete beginner I'm finding it quite complicated. I'm a university student, but term has finished and I'm trying to learn Java to help out with my Dad's business over the summer, so any help towards a solution is greatly appreciated.