views: 77
answers: 2

Hello All,

I recently built a query in SQL that looks at our payment log and finds the average number of payments made per hour over a period of time. I'm sure there are third-party reporting applications much better suited to this type of calculation, but just for fun, I'm curious what other methods you all might use (in T-SQL) to get the data I'm looking for. I know it's kind of a dumb question, so I won't be offended if you down-vote me. But for anyone else out there who is bored like me, feel free to post your other solutions.

The table is set up with a [CreateDate] column, which holds the DATETIME value of when the payment posts to the account. I use this to determine which hour the payment was made in.

CREATE TABLE #TempTimes
(
    [Time] DATETIME
)

DECLARE @numDays INT
SET @numDays = 10
DECLARE @time DATETIME
SET @time = DATEADD(dd, DATEDIFF(dd, 0, GETDATE() - @numDays), 0)

WHILE @time < GETDATE()
BEGIN
    INSERT #TempTimes
    VALUES (@time)

    SET @time = DATEADD(hour, 1, @time)
END
GO
/*
I have to join in a table with all of the hours for the time span I'm querying against, 
because otherwise, the AVG function won't calculate in the hours where no payments were made. 
*/

SELECT DATEPART(hour, [Time]) [Hour], AVG(CAST([Count] AS DECIMAL)) [Average]
FROM 
(
    SELECT [Time], ISNULL([Count], 0) AS [Count]
    FROM #TempTimes tt
    LEFT JOIN 
    (
     SELECT DATEADD(hour, DATEDIFF(hour, 0, [CreateDate]), 0) [hour], COUNT(*) [Count]
     FROM [dbo].[PaymentLog]
     GROUP BY DATEADD(hour, DATEDIFF(hour, 0, [CreateDate]), 0)
    ) t1 ON t1.[hour] = tt.[time]
) t2
GROUP BY DATEPART(hour, t2.[Time])
GO

DROP TABLE #TempTimes
GO
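
For what it's worth, the hour buckets can also be generated set-based instead of with the WHILE loop. A sketch using a recursive CTE (assumes SQL Server 2005 or later, and the same #TempTimes table as above):

```sql
DECLARE @numDays INT
SET @numDays = 10
DECLARE @start DATETIME
SET @start = DATEADD(dd, DATEDIFF(dd, 0, GETDATE() - @numDays), 0)

;WITH Hours ([Time]) AS
(
    SELECT @start
    UNION ALL
    SELECT DATEADD(hour, 1, [Time])
    FROM Hours
    WHERE DATEADD(hour, 1, [Time]) < GETDATE()
)
INSERT #TempTimes ([Time])
SELECT [Time]
FROM Hours
OPTION (MAXRECURSION 0) -- default cap of 100 levels would stop after ~4 days
```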

CJ

+1  A: 

IIUC you are filtering from the main table using a join with the temp table.

Why not drop the join and just use a WHERE clause? If you want to force all the hours to show up, do the join after the rest of the work, or use a union of some kind.

BCS
Good point. I could UNION the table and then do a SUM([Count]) and GROUP BY on all the rows later to get the data together. That would probably be less work for the proc.
regex
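
Spelled out, that SUM-then-AVG idea might look something like this (a sketch; it reuses #TempTimes and the column names from the question):

```sql
SELECT [Hour], AVG(CAST([Total] AS DECIMAL)) AS [Average]
FROM
(
    -- one row per hour bucket: the real counts plus a zero row for every hour,
    -- summed so empty hours contribute 0 to the average
    SELECT DATEPART(hour, [Time]) AS [Hour], [Time], SUM([Count]) AS [Total]
    FROM
    (
        SELECT DATEADD(hour, DATEDIFF(hour, 0, [CreateDate]), 0) AS [Time], COUNT(*) AS [Count]
        FROM [dbo].[PaymentLog]
        GROUP BY DATEADD(hour, DATEDIFF(hour, 0, [CreateDate]), 0)

        UNION ALL

        SELECT [Time], 0
        FROM #TempTimes
    ) u
    GROUP BY [Time], DATEPART(hour, [Time])
) t
GROUP BY [Hour]
```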
+1  A: 

I think this is a reasonably nifty method. It avoids the use of an Hours table (or temp table like you're using). The downside is that it only shows values for hours where a payment was made. Let the front end put in all the 0s. :)

CREATE TABLE dbo.Payments
(
    payment_id INT IDENTITY NOT NULL,
    create_date DATETIME  NOT NULL,
    CONSTRAINT PK_Payments PRIMARY KEY CLUSTERED (payment_id)
)
GO

INSERT INTO dbo.Payments (create_date)
SELECT '2009-02-25 12:00:00.000' UNION
SELECT '2009-02-25 12:45:00.000' UNION
SELECT '2009-02-25 12:30:00.000' UNION
SELECT '2009-02-25 13:10:00.000' UNION
SELECT '2009-02-25 13:22:00.000' UNION
SELECT '2009-02-25 14:09:00.000' UNION
SELECT '2009-02-25 14:40:00.000'
GO

SELECT
    CAST(CONVERT(VARCHAR(13), T1.create_date, 121) + ':00:00.000' AS DATETIME),
    COUNT(T3.payment_id) + 1
FROM
    dbo.Payments T1
LEFT OUTER JOIN dbo.Payments T2 ON
    T2.create_date >= CAST(CONVERT(VARCHAR(13), T1.create_date, 121) + ':00:00.000' AS DATETIME) AND
    T2.create_date <  T1.create_date
LEFT OUTER JOIN dbo.Payments T3 ON
    T3.create_date >= T1.create_date AND
    T3.create_date <  DATEADD(hh, 1, CAST(CONVERT(VARCHAR(13), T1.create_date, 121) + ':00:00.000' AS DATETIME)) AND
    T3.payment_id <> T1.payment_id
WHERE
    T2.payment_id IS NULL
GROUP BY
    CAST(CONVERT(VARCHAR(13), T1.create_date, 121) + ':00:00.000' AS DATETIME)
GO
Tom H.
Using string manipulation as a proxy for datetime manipulation is extremely inefficient...
Dems
You could certainly use other methods to truncate off the minutes/seconds/milliseconds. This was just the easiest to code and understand imo. The basic idea is still the same though.
Tom H.
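
For reference, the hour truncation can be done with pure datetime arithmetic (the same DATEADD/DATEDIFF trick the question already uses), avoiding the string round-trip:

```sql
-- equivalent to CAST(CONVERT(VARCHAR(13), create_date, 121) + ':00:00.000' AS DATETIME),
-- but without converting to a string and back
SELECT DATEADD(hh, DATEDIFF(hh, 0, create_date), 0)
FROM dbo.Payments
```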
Very clever indeed. However, your COUNT(*) + 1 will not work if there is only one payment made in that hour; it shows up as 2 :( I tried to find a way to make it work, but the query just got too hairy. Thanks for giving me something to do for the rest of the day though :)
regex
I just corrected it to handle that situation.
Tom H.