This can be done in linear time with two pointers (left and right) sweeping over a sorted list of call events, containing both call-start and call-end events.
The left pointer starts at the earliest start event. The right pointer advances to the latest event not later than (left pointer's time + 1 hour). We also maintain a sum variable holding the total call time falling inside the current window.
At each iteration we move the left pointer to the next event (updating the sum accordingly), then advance the right pointer (again updating the sum) until it reaches its new end position (a little less than one hour later).
The maximum value of the sum is obtained at the busy-hour window.
I didn't give details on how to update the sum variable when moving the pointers, but I believe it can be done in constant time; one possible way is shown in the sketch after the edit below.
The advantage of this algorithm is that it does not have the granularity problem, and that it is linear in the number of events.
--EDIT--
Actually this is an O(N log N) algorithm, since the input table does not provide the events pre-sorted as I assumed. We need to generate the sorted list of events first.
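For concreteness, here is a minimal Python sketch of the idea; it is my own reading, and the name busy_hour, the window parameter, and the (start, end) input format are made up for the example. It assumes the sum being maximized is the total call time clipped to the window. Two small deviations from the prose above: the two pointers become four index counters (starts/ends at each window edge), and the left edge also visits event times minus one hour, since the maximum of the clipped sum can occur there. The constant-time update relies on the fact that between two consecutive event positions the sum changes linearly, with slope equal to (calls active at the right edge) minus (calls active at the left edge).

```python
def busy_hour(calls, window=3600.0):
    """Return (t, traffic): the window [t, t + window] that maximizes
    the total call time falling inside it.  `calls` is an iterable of
    (start, end) pairs, e.g. in seconds."""
    calls = list(calls)
    if not calls:
        return None, 0.0

    # The O(N log N) sorting step mentioned in the edit above.
    starts = sorted(s for s, _ in calls)
    ends = sorted(e for _, e in calls)

    # The sum is piecewise linear in t, so its maximum is attained when
    # one of the two window edges coincides with a call event.
    events = set(starts) | set(ends)
    cands = sorted(events | {t - window for t in events})

    # Compute the sum directly for the first candidate window, then slide.
    t = cands[0]
    total = sum(max(0.0, min(e, t + window) - max(s, t)) for s, e in calls)
    best_t, best = t, total

    # Four pointers counting events <= each window edge; each advances at
    # most N times over the whole sweep, so updates are amortized O(1).
    sl = el = sr = er = 0
    n = len(calls)
    for nxt in cands[1:]:
        while sl < n and starts[sl] <= t:
            sl += 1
        while el < n and ends[el] <= t:
            el += 1
        while sr < n and starts[sr] <= t + window:
            sr += 1
        while er < n and ends[er] <= t + window:
            er += 1
        # Between consecutive candidates the sum changes linearly:
        # slope = (calls active at right edge) - (calls active at left edge).
        total += ((sr - er) - (sl - el)) * (nxt - t)
        t = nxt
        if total > best:
            best_t, best = t, total

    return best_t, best
```

For example, with three calls it picks the hour starting at t = 0, which carries 4500 call-seconds (1800 from the first call plus 2700 from the second, clipped to the window):

```python
>>> busy_hour([(0, 1800), (900, 4500), (7000, 7200)])
(0, 4500.0)
```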