I am looping through all my databases and aggregating the results into an aggregates database.
In my loop I call
EXEC master.dbo.xp_cmdshell 'osql -E -i C:\whatever.SQL'
As the loop progresses, each xp_cmdshell call takes longer and longer to execute. If I stop the loop and run a single aggregate for one database, it executes quickly.
Is there anything I can add to my external SQL script to make it run faster? Maybe something to commit and free the records before the next loop? Or should I add some kind of pause after every loop?
I want to use an external SQL file because it contains many update statements and it's more manageable for me.
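To illustrate the "commit" idea, here is roughly what I picture at the end of whatever.SQL (just a sketch; the BEGIN TRAN/COMMIT wrapper and the CHECKPOINT are guesses on my part, not what the file currently contains):

BEGIN TRAN
    -- ...the many UPDATE statements...
COMMIT TRAN
CHECKPOINT  -- force dirty pages to disk before the next iteration starts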
Here's how I loop:
UPDATE dbFoo.dbo.tblBar SET Processed = 0
GO

-- Declarations added for completeness; the data types are placeholders
DECLARE @aRow int, @aFoo1 varchar(50), @aFoo2 varchar(50), @aFoo3 varchar(50), @aRetVal varchar(50)

WHILE EXISTS (SELECT ID FROM dbFoo.dbo.tblBar WHERE Processed = 0)
BEGIN
    -- Pick the lowest-numbered row that still needs processing
    SELECT @aRow = MIN(ID) FROM dbFoo.dbo.tblBar WHERE Processed = 0

    SELECT @aFoo1 = Foo1 FROM dbFoo.dbo.tblBar WHERE ID = @aRow
    SELECT @aFoo2 = Foo2 FROM dbFoo.dbo.tblBar WHERE ID = @aRow
    SELECT @aFoo3 = Foo3 FROM dbFoo.dbo.tblWhatever WHERE Foo = @aFoo1

    EXEC RunPreAgg @Foo1 = @aFoo1, @Foo2 = @aFoo2, @Foo3 = @aFoo3, @RetVal = @aRetVal OUTPUT
    SELECT returning = @aRetVal

    -- Mark the row done so the loop can advance
    UPDATE dbFoo.dbo.tblBar SET Processed = 1 WHERE ID = @aRow
END
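If a pause is the right fix, I assume it would go at the bottom of the loop body, something like this (the one-second delay is an arbitrary guess):

WAITFOR DELAY '00:00:01'  -- breathe between iterations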
Then the RunPreAgg stored procedure basically does this:
IF db_id('db' + @Foo1 + '_' + @Foo2) IS NOT NULL
BEGIN
    -- This bat file creates the SQL file
    SELECT @sql = 'master.dbo.xp_cmdshell ''' + @path + 'wwwRunPreAgg.bat ' + @Foo1 + ' ' + @Foo2 + ' ' + @Foo3 + ''''
    EXEC( @sql )

    -- Execute the generated SQL file
    SELECT @sql = 'master.dbo.xp_cmdshell ''osql -E -o ' + @path + 'output\tmp' + @Foo1 + '_' + @Foo2 + '.txt -i ' + @path + 'tmp' + @Foo1 + '.SQL'''
    EXEC( @sql )

    -- This erases the SQL file
    SELECT @sql = 'master.dbo.xp_cmdshell ''' + @path + 'wwwCleanup.bat ' + @Foo1 + ' ' + @Foo2 + ''''
    EXEC( @sql )

    SET @retval = 'Done!'
END
ELSE
BEGIN
    SET @retval = 'Err: No DataBase'
END
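For reference, with hypothetical values @path = 'C:\PreAgg\', @Foo1 = 'A', and @Foo2 = 'B', the middle step builds and runs:

master.dbo.xp_cmdshell 'osql -E -o C:\PreAgg\output\tmpA_B.txt -i C:\PreAgg\tmpA.SQL'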
The variable names are changed to protect the innocent. The code works fine; I just need to optimize it.