Hi there,
I've finally gotten my insert batch to work, and now I've been fiddling with the batch size, but I can't see any performance difference between a value of 50 and a value of 10000. That seems very odd to me, but I don't know what's happening behind the scenes, so it might be normal behavior.
I'm inserting 160k rows into a table, and the mean time across the batch sizes I tested is 115 +/- 2 seconds. Without batching it takes 210 seconds, so I'm quite satisfied with the improvement. The target table is:
CREATE TABLE [dbo].[p_DataIdeas](
    [wave] [int] NOT NULL,
    [idnumber] [int] NOT NULL,
    [ideaID] [int] NOT NULL,
    [haveSeen] [bit] NOT NULL CONSTRAINT [DF_p_DataIdeas_haveSeen] DEFAULT ((0)),
    CONSTRAINT [PK_p_DataIdeas] PRIMARY KEY CLUSTERED
    (
        [wave] ASC,
        [idnumber] ASC,
        [ideaID] ASC
    ) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
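
For context, the batched insert I'm describing follows the usual SqlDataAdapter/UpdateBatchSize pattern, roughly like the sketch below. This is a simplified sketch rather than my exact code: the connection string and the code that fills the DataTable with the 160k rows are left out, and the method name is just for illustration.

using System.Data;
using System.Data.SqlClient;

static void InsertIdeas(DataTable ideas, string connectionString, int batchSize)
{
    using (var connection = new SqlConnection(connectionString))
    using (var adapter = new SqlDataAdapter())
    {
        // Parameterized INSERT that the adapter reuses for every added row in the DataTable.
        var insert = new SqlCommand(
            "INSERT INTO dbo.p_DataIdeas (wave, idnumber, ideaID, haveSeen) " +
            "VALUES (@wave, @idnumber, @ideaID, @haveSeen)", connection);
        insert.Parameters.Add("@wave", SqlDbType.Int, 0, "wave");
        insert.Parameters.Add("@idnumber", SqlDbType.Int, 0, "idnumber");
        insert.Parameters.Add("@ideaID", SqlDbType.Int, 0, "ideaID");
        insert.Parameters.Add("@haveSeen", SqlDbType.Bit, 0, "haveSeen");

        // Required when batching: don't try to map results back into the row.
        insert.UpdatedRowSource = UpdateRowSource.None;

        adapter.InsertCommand = insert;
        adapter.UpdateBatchSize = batchSize;   // the value I've been varying (50 ... 10000)

        // Sends the pending inserts to SQL Server in batches of UpdateBatchSize rows.
        adapter.Update(ideas);
    }
}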
I read What to look for when setting UpdateBatchSize, and the answer was simply to test a couple of different values. I can understand that, but shouldn't it be possible to calculate, or at least guesstimate, a good value if you know the table design, the SQL statement, and the data that's about to be inserted?
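
So far, "testing" has just meant timing the same load with a few different values, along these lines. Again this is only a sketch building on the method above: BuildIdeasTable and connectionString are placeholders, and the target table is assumed to be truncated between runs so each test starts from the same state.

int[] candidateSizes = { 1, 50, 500, 5000, 10000 };

foreach (int size in candidateSizes)
{
    DataTable ideas = BuildIdeasTable();   // placeholder: produces the 160k-row DataTable
    var watch = System.Diagnostics.Stopwatch.StartNew();

    InsertIdeas(ideas, connectionString, size);

    watch.Stop();
    Console.WriteLine("UpdateBatchSize = {0,6}: {1:F1} s", size, watch.Elapsed.TotalSeconds);
}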
Are there any best practices out there that someone can recommend?