views:

601

answers:

3

I have been asked to perform a performance test using SQL Server 2008. As part of this, I am comparing the insert speed of IDENTITY primary key columns declared as INT and as BIGINT. I have a simple routine that creates 100,000 rows for each type and times the inserts. The script looks like this:

SET NOCOUNT ON

CREATE TABLE TestData
(
    PK  INT IDENTITY PRIMARY KEY,
    Dummy INT
)

DECLARE @Rows INT
DECLARE @Start  DATETIME

SET @Rows = 100000
SET @Start = GETDATE()

WHILE @Rows > 0
BEGIN
    INSERT INTO TestData (Dummy) VALUES (@Rows)
    SET @Rows = @Rows - 1
END

SELECT @Start, GETDATE(), DATEDIFF(MS, @Start, GETDATE())

DROP TABLE TestData

For testing BIGINT identities, I use a very slightly modified version:

SET NOCOUNT ON

CREATE TABLE TestData
(
    PK  BIGINT IDENTITY PRIMARY KEY,
    Dummy INT
)

DECLARE @Rows INT
DECLARE @Start  DATETIME

SET @Rows = 100000
SET @Start = GETDATE()

WHILE @Rows > 0
BEGIN
    INSERT INTO TestData (Dummy) VALUES (@Rows)
    SET @Rows = @Rows - 1
END

SELECT @Start, GETDATE(), DATEDIFF(MS, @Start, GETDATE())

DROP TABLE TestData

To my surprise, the BIGINT version runs appreciably faster than the INT version: on my test machine the INT version takes about 30 seconds and the BIGINT version about 25 seconds. Granted, the machine has a 64-bit processor, but it is running 32-bit Windows and 32-bit SQL Server 2008.
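
For anyone trying to reproduce this, one variation worth testing (an editor's sketch, not part of the original scripts) is wrapping the loop in a single explicit transaction. With 100,000 separate autocommitted INSERTs, the per-statement transaction log flushes tend to dominate the timing and can swamp any difference between key widths:

```sql
-- Sketch (assumption: same test harness as above, SQL Server 2008):
-- one explicit transaction around the loop, so each INSERT does not
-- force its own log flush.
SET NOCOUNT ON

CREATE TABLE TestData
(
    PK  INT IDENTITY PRIMARY KEY,
    Dummy INT
)

DECLARE @Rows INT
DECLARE @Start DATETIME

SET @Rows = 100000
SET @Start = GETDATE()

BEGIN TRANSACTION
WHILE @Rows > 0
BEGIN
    INSERT INTO TestData (Dummy) VALUES (@Rows)
    SET @Rows = @Rows - 1
END
COMMIT TRANSACTION

SELECT @Start, GETDATE(), DATEDIFF(MS, @Start, GETDATE())

DROP TABLE TestData
```

If the INT/BIGINT gap shrinks or disappears under this version, the original difference was probably logging or I/O noise rather than key width.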

Can anyone else reproduce, confirm or contest these results, or point out whether I have missed something?

A: 

Just a guess: did you ever try testing the BIGINT version first and the INT version afterwards? Database servers like to keep things in memory to optimize similar operations...
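
One way to rule out this kind of caching between runs (an editor's sketch; requires sysadmin rights, and should never be run on a production server) is to flush the caches before each test:

```sql
-- Write dirty pages to disk, then drop clean buffer pages and cached
-- plans, so each run starts from a cold cache.
CHECKPOINT;
DBCC DROPCLEANBUFFERS;
DBCC FREEPROCCACHE;
```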

Stefan Steinegger
Nope, tried it both ways around, restarting the services in-between. Have you tried it and found opposing results?
BlackWasp
No I didn't try until now, it was really just a guess. I can try it at work.
Stefan Steinegger
I tried it on SQL Server 2005; it runs in exactly the same time.
Stefan Steinegger
Weird - must be my server.
BlackWasp
A: 

I tried that on my SQL Server 2008 instance. INT takes 14 seconds; BIGINT takes 18 seconds.

A: 

To take it a step further, do the same thing with a VARCHAR, such as this:

SET NOCOUNT ON

CREATE TABLE TestData
(
    PK          VARCHAR(8) PRIMARY KEY,
    Dummy       INT
)

DECLARE @Rows   INT
DECLARE @Start  DATETIME

SET @Rows = 100000
SET @Start = GETDATE()

WHILE @Rows > 0
BEGIN
    INSERT INTO TestData (PK, Dummy) VALUES (CONVERT(VARCHAR(8), @Rows), @Rows)
    SET @Rows = @Rows - 1
END

SELECT @Start, GETDATE(), DATEDIFF(MS, @Start, GETDATE())

DROP TABLE TestData

I would expect this to be much slower, since the script has to supply the "identity" value itself and perform string conversions. I also chose VARCHAR(8) to roughly match the width of a BIGINT (strictly, a VARCHAR(8) takes up to 10 bytes: 8 characters plus a 2-byte length overhead). Yet, in my tests, this runs faster than the INT test from above.

What I take from this is that inserting rows into an empty table will be pretty fast no matter what you throw at it. The performance implications down the road (other indexes on the table, inserting rows when the table already holds a lot of data, and so on) are probably a much more important consideration.
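
The key width does show up on disk, even if it doesn't show up in these timings. A quick way to compare (an editor's sketch: run it just before the DROP TABLE in any of the scripts above) is to check the table's allocated space for each variant:

```sql
-- Reports row count, reserved space, and index size for the test table,
-- making the INT vs BIGINT vs VARCHAR(8) storage difference visible.
EXEC sp_spaceused 'TestData';
```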

khutch