views:

450

answers:

14

I was just speaking to a colleague of mine. He was walking with a hop in his step on the way to the coffee machine.

I asked him, "What's with the 'swarmy' walk?" He said, "I just reduced a two-hour-long query down to 40 seconds! It feels so good."

He had altered a stored procedure that was using cursors, introducing a temp table refactored from the original dataset. I will email him soon to get more info on the actual implementation.

But ultimately, he was buzzing.

Question is, what SQL optimisation sticks in your mind and has made you buzz whilst speeding up slow-performing queries?

+1  A: 

Sorry, I don't tend to get a buzz from that sort of thing; most situations have been pretty basic: monitoring the performance of queries and adding indexes to speed them up.

Now, increasing the speed of "real" code that I've written by changing data structures and algorithms within the class, that's where I get my buzz (and my reputation as the go-to man for performance fixes at work).

paxdiablo
"real" code as opposed to sql? That's mean.
Learning
Related to the buzzing: I agree, when you have been doing it for a long time, the feeling fades.
Ferdeen
@Learning, wasn't meaning to disparage SQL, it's just that it's mostly composed of one-liner queries where optimization is fairly straight-forward. Large complicated code-bases ("real") give me more of a sense of achievement.
paxdiablo
@paxdiablo ...ahhh for the days of one-liner queries in SQL... :-) I think it very much depends where you are and what you do, but SQL can get as complicated as 'real' code very quickly. Too quickly!
Colin Asquith
+1  A: 

Hey, on the iPhone, which uses SQLite, I straight away reduced my database processing time from 40 seconds to 2 seconds with the use of exclusive write transactions... I was super happy doing this

As this was my first experience of SQL on an embedded device, it was quite different from the usual server-related stuff (indexes, normalization, etc.).
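The gain comes from batching many writes under a single write lock instead of paying for a commit per statement. A minimal sketch of the idea, with a hypothetical items table:

BEGIN EXCLUSIVE TRANSACTION;

-- Without the explicit transaction, SQLite would acquire the write lock,
-- write, and commit once per INSERT; here that happens once for the batch.
INSERT INTO items (id, name) VALUES (1, 'first');
INSERT INTO items (id, name) VALUES (2, 'second');
-- ...hundreds more inserts...

COMMIT;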

As far as servers go, indexes are a real blessing. Also, if you take a bit of pain and get rid of as many NULLs as you can in your tables, you will be surprised at the performance gains - not many developers focus on NULLs; they usually go with indexes and other well-documented stuff.

A few other less-exploited ways: using XML to process multiple batch inserts/updates/deletes in one go instead of doing one insert at a time - in SQL Server 2005 this can be super cool.
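A minimal sketch of the XML approach, assuming a hypothetical dbo.Customers table (the xml type and nodes() method are SQL Server 2005 features):

DECLARE @xml xml
SET @xml = N'<rows>
  <row id="1" name="Alice" />
  <row id="2" name="Bob" />
</rows>'

-- One set-based INSERT shreds the whole batch, instead of one
-- round trip and one INSERT per row.
INSERT INTO dbo.Customers (CustomerId, Name)
SELECT r.value('@id',   'int'),
       r.value('@name', 'nvarchar(100)')
FROM @xml.nodes('/rows/row') AS t(r)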

Raj
+6  A: 

I have to say it was when I learned how to create and use covered indexes. Now, THAT was a performance booster.
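In SQL Server terms, a minimal sketch with a hypothetical Orders table and columns:

CREATE NONCLUSTERED INDEX IX_Orders_Customer_Covering
ON dbo.Orders (CustomerId, OrderDate)
INCLUDE (TotalAmount)   -- INCLUDE requires SQL Server 2005+

-- Answered entirely from the index (no key lookups back into the
-- base table), because every referenced column is in the index:
SELECT CustomerId, OrderDate, TotalAmount
FROM dbo.Orders
WHERE CustomerId = 42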

Otávio Décio
+1 when used properly, these can offer amazing improvements in performance
Michael Haren
I didn't know about covered indexes - I've just read an article about them and can't wait to try the technique! (I am the 'colleague with the hop in my step' in Ferd's original question ;-) )
Hugh Mullally
Yeah, they are great, just be careful with large tables as they can become quite enormous and do more harm than good if used improperly. Like all things performance, you need to test a representative workload before and after.
Michael Haren
+2  A: 

It's always nice to take a poorly written, cursor-laden query and eliminate cursors, cut the code by half, and improve performance many-fold.

Some of the best improvements are in clarity (and often result in nice performance boosts, too).
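The usual shape of the fix, sketched with a hypothetical Orders table: the row-by-row cursor loop collapses into one set-based statement.

-- Before: one UPDATE per row
DECLARE @id int
DECLARE c CURSOR FOR
    SELECT OrderId FROM dbo.Orders WHERE Status = 'Open'
OPEN c
FETCH NEXT FROM c INTO @id
WHILE @@FETCH_STATUS = 0
BEGIN
    UPDATE dbo.Orders SET Status = 'Expired' WHERE OrderId = @id
    FETCH NEXT FROM c INTO @id
END
CLOSE c
DEALLOCATE c

-- After: one statement, one pass, and far easier to read
UPDATE dbo.Orders SET Status = 'Expired' WHERE Status = 'Open'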

Michael Haren
Just thought of another question - when are cursors useful? Everyone seems to refactor them into something more optimal.
Ferdeen
Found this: http://stackoverflow.com/questions/287445/why-do-people-hate-sql-cursors-so-much
Ferdeen
A: 

I answered this on a previous question ("Biggest performance improvement you’ve had with the smallest change?"); however, it's such a simple improvement, yet one so often overlooked, that it bears repeating:

Indexes!

CraigTP
+3  A: 

Using SQL Server's BULK INSERT to reduce several hours of inherited INSERT code to less than a minute.
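A sketch of the idea, with a hypothetical staging table and file path:

-- One minimally-logged bulk load replacing thousands of single INSERTs
BULK INSERT dbo.StagingOrders
FROM 'C:\data\orders.csv'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n',
    TABLOCK   -- helps enable minimal logging under the right recovery model
)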

David
A: 

It's all about indexes. And avoiding stupid things that make them useless.
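The classic "stupid thing" is wrapping an indexed column in a function, which stops the index being used. A hypothetical sketch:

-- With an index on OrderDate, this forces a scan: the function
-- hides the column from the index.
SELECT * FROM dbo.Orders WHERE YEAR(OrderDate) = 2008

-- Rewritten as a range on the bare column, the same index can be seeked.
SELECT * FROM dbo.Orders
WHERE OrderDate >= '20080101' AND OrderDate < '20090101'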

unbeknown
A: 

Well, we had a similar thing where we had a slow query on an Open Freeway site. The answer wasn't so much optimising the query as optimising the server it was on. We increased the cache limit and cache size so that the server would not run the query so often.

This massively increased the speed of the system and ultimately made the client happy! :)

Not quite the calibre of the original post's optimisation skills, but it definitely made us buzz!

Tisch
A: 

Splitting one ridiculously long stored procedure, which did a great deal of "if it's after 5 pm, return this bit of SQL" and took in excess of 20 seconds to run, into a set of stored procedures called by one controlling SP, which got the times down to sub-second responses.
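A sketch of that shape, with hypothetical procedure names; each branch becomes its own small procedure with its own cached plan:

CREATE PROCEDURE dbo.GetReport
AS
BEGIN
    -- The controller only routes; the real work (and the real
    -- optimisation surface) lives in the small sub-procedures.
    IF DATEPART(hour, GETDATE()) >= 17
        EXEC dbo.GetReport_AfterHours
    ELSE
        EXEC dbo.GetReport_BusinessHours
END
GO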

Jonathan
A: 

Changing the order of conditions inside the WHERE clause so that it filters on the most discriminating condition first (while at the same time removing indexes from non-discriminating columns like gender).

Dan
The query optimizer should do this for you.. what engine was this on?
Brimstedt
What DBMS are you using? You shouldn't need to manually re-order WHERE conditions if your optimiser is doing its job properly.
LukeH
MS SQL 2000. Anyway, I don't see how the engine can optimize this. If you want to short-circuit evaluation of some expression containing AND, you should write the code that does so. Even when not short-circuiting, how can the server know which field is more discriminating if the fields are of the same type?
Dan
A: 

One word: dynamic queries.

If you're searching with large numbers of optional parameters, you can leave the unused ones out of the SQL string. This has sped up my queries dramatically, and with relative ease.

CREATE PROCEDURE dbo.qryDynamic
(
    @txtParameter1 nvarchar(255),
    @txtParameter2 nvarchar(255)
)
AS
BEGIN

    DECLARE @SQL nvarchar(2500)
    DECLARE @txtJoin nvarchar(50)

    SET @txtJoin = ' WHERE '

    SET @SQL = 'SELECT     qry_DataFromAView.*
       FROM         qry_DataFromAView'

    -- Append each condition only when its parameter was actually supplied
    IF @txtParameter1 IS NOT NULL
    BEGIN
        SET @SQL = @SQL + @txtJoin + ' (Field1 LIKE N''%'' + @dynParameter1 + N''%'') '
        SET @txtJoin = ' AND '
    END

    IF @txtParameter2 IS NOT NULL
    BEGIN
        SET @SQL = @SQL + @txtJoin + ' (Field2 LIKE N''%'' + @dynParameter2 + N''%'') '
        SET @txtJoin = ' AND '
    END

    SET @SQL = @SQL + ' ORDER BY Field2'

    -- sp_executesql keeps the query parameterised (plan reuse, no injection)
    EXEC sp_executesql @SQL,
        N'@dynParameter1 nvarchar(255), @dynParameter2 nvarchar(255)',
        @dynParameter1 = @txtParameter1,
        @dynParameter2 = @txtParameter2

END
GO
Paul
A: 

Back in the day, I worked on a CICS/DB2 system, written in COBOL. A lot of our queries were doing full table scans (and slow) even though we had all the proper indexes and WHERE clauses.

It turned out (and I may have this backwards; it's been 15 years) that the problem was that we were using PIC S9(n) COMP in WORKING-STORAGE for the query parameters, but DB2 wanted PIC S9(n) COMP-3. Because of the wrong data type, DB2 had to do a full table scan in order to convert the values in the database to the value we were passing in. We changed our variable definitions and the queries were then able to use the indexes, which dramatically improved our performance.
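The same class of problem still bites today: a parameter whose type doesn't match the column forces an implicit conversion that can defeat the index. A hypothetical T-SQL sketch:

-- Assume AccountNumber is varchar(20) with an index on it.
DECLARE @acct nvarchar(20)
SET @acct = N'12345'

-- nvarchar parameter vs. varchar column: the column side is implicitly
-- converted, which can turn an index seek into a full scan.
SELECT * FROM dbo.Accounts WHERE AccountNumber = @acct

-- Matching the column's declared type lets the index be used directly.
DECLARE @acct2 varchar(20)
SET @acct2 = '12345'
SELECT * FROM dbo.Accounts WHERE AccountNumber = @acct2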

Patrick Cuff
A: 

I had a warm glow after being able to use a Cross Tab query to scrap oodles (technical term) of processing and lookups...

Usually it's simple things like adding indexes or only getting the data you need, but when you find a problem that fits an answer you've seen before... good times!
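A classic CASE-based cross tab, sketched against a hypothetical Sales table: one pass over the data replaces a separate lookup query per bucket.

SELECT Region,
       SUM(CASE WHEN Quarter = 1 THEN Amount ELSE 0 END) AS Q1,
       SUM(CASE WHEN Quarter = 2 THEN Amount ELSE 0 END) AS Q2,
       SUM(CASE WHEN Quarter = 3 THEN Amount ELSE 0 END) AS Q3,
       SUM(CASE WHEN Quarter = 4 THEN Amount ELSE 0 END) AS Q4
FROM dbo.Sales
GROUP BY Region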

Colin Asquith
A: 

(Halfway off topic)

I rewrote a 3000 line stored procedure into LINQ2SQL/C#. The stored procedure juggled lots of data between a bunch of unindexed temp tables. The LINQ2SQL version read the data into a couple of Dictionaries and ILookups and then I joined the data manually with plain old C# code.

The stored procedure took about 20 seconds and the LINQ2SQL/C# version took 0.2 seconds.

Albin Sunnanbo