I know that all of the domain aggregate functions are slow, but I am looking for the least of the evils.

Which will be the faster of the following two options?

  1. Create a query that groups by the unique value and sums the total, then use DLookup to get my sum from that query.
  2. Do a DSum on the main table with criteria matching the GROUP BY columns from option 1.

Edit 1
My question is a generic one that I have wondered about in many situations, but below is the specific situation under consideration right now.

On my time entry detail subform I need to show how much material has been used on this job. The control source for the textbox looks like this:

=DLookUp("[SumOfPrice]","tm_materialsUsedByPart","[Part]=" & [cmbPart])

tm_materialsUsedByPart is currently a query that is summing up all the materials used.

Would it be faster to use a DSum that has essentially the same criteria, or is it faster to execute the query and then grab the single row that matches my criteria?

+1  A: 

There are two other options: correlated subquery or derived table. A correlated subquery would entail declaring a column in your query to itself be a query:

Select ...
 , ( Select Sum(Foo)
     From Bar
     Where Bar.FK = Gamma.PK ) As Total
From Gamma

A derived table, in Access, would involve creating a saved query that does all the totals and then joining that query into your main query. Although it is possible to build the totals query on the fly in the main query, my experience is that Access is happier with a saved query and a standard join. If you show us more about what you are trying to accomplish, you might get some more specific answers.
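As a sketch of the derived-table approach (the query and table names here are hypothetical), the saved totals query might be:

Select Part, Sum(Price) As SumOfPrice
From tblMaterials
Group By Part;

Saved as, say, qryPartTotals, it is then joined into the main query with a standard join:

Select M.*, T.SumOfPrice
From tblMaterials As M
Inner Join qryPartTotals As T
  On T.Part = M.Part;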

EDIT

IMO, the fastest solution would be to include the total, via a saved query, in the source for the form. I.e., not use either DLookup or DSum but instead include the total as part of the form's RecordSource by joining to a query that calculates the total.

Second, to actually know which function will perform best, you would need to do some performance tests against your data. My guess is that the performance will be comparable between the two. Even with DLookup against a stored query, Access should be smart enough to inject the filtering criteria and in effect get a comparable execution plan to using DSum.

Lastly, if you are going to use a domain aggregate function and the only purpose of the query used in DLookup is to serve totals for this form, then IMO, it makes more sense to use DSum since it makes your intent for the use of the domain aggregate function clearer to the reader.

Thomas
I added edit 1. Hopefully this will make it more clear what I am asking.
Icode4food
I can't add the field to the record source of the form because the form data needs to be editable. Having a GROUP BY query in the record source will not allow the data to be edited.
Icode4food
@jeffreymb - You do not need a group by in the record source. If you use a saved query, only the saved query will have the group by. The main query will not. If you use a correlated subquery in the record source, only it will have a group by and not the main query.
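To sketch what Thomas describes with a correlated subquery in the record source (table and field names here are hypothetical): only the inner subquery aggregates, so the outer query itself has no GROUP BY:

Select T.*
 , ( Select Sum(M.Price)
     From tblMaterials As M
     Where M.Part = T.Part ) As SumOfPrice
From tblTimeEntry As T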
Thomas
+1  A: 
=DLookUp("[SumOfPrice]","tm_materialsUsedByPart","[Part]=" & [cmbPart])

Without seeing the SQL for tm_materialsUsedByPart, we can only guess at what it's doing. Maybe it reads an entire table, or set of JOINed tables, and uses a GROUP BY to aggregate Sum(Price) as SumOfPrice.

It should be faster to DSum against the underlying table, particularly if the [Part] field is indexed.

=DSum("[Price]", "tblRowSource", "[Part]=" & Me.cmbPart)

That way you ask the database engine to only read the rows which contain the [Part] value you want. Your materialsUsedByPart query requires reading all the rows, then you extract the single group value you want. Don't do that. Ask the database engine to read the fewest rows possible to get you the information you need.

Edit: I was wrong that DSum against the base table would be faster than DLookup against an aggregate query. As @Thomas suggested, the query plan is the same for both cases in my simple test.

Here is my query, qryMinutesPerClient:

SELECT Time_Sub.CLIENT_ID, Sum(Time_Sub.MINUTES) AS SumOfMINUTES
FROM Time_Sub
GROUP BY Time_Sub.CLIENT_ID;

Then

DLookup("SumOfMINUTES","qryMinutesPerClient","CLIENT_ID = 11111")

results in this query plan:

- Inputs to Query -
Table 'Time_Sub'
- End inputs to Query -

01) Restrict rows of table Time_Sub
      using rushmore
      for expression "Time_Sub.CLIENT_ID=11111"
02) Group result of '01)'

And with

DSum("MINUTES","Time_Sub","CLIENT_ID = 11111")

the plan is the same:

- Inputs to Query -
Table 'Time_Sub'
- End inputs to Query -

01) Restrict rows of table Time_Sub
      using rushmore
      for expression "CLIENT_ID=11111"
02) Group result of '01)'

The most important performance issue involves "using rushmore", which is possible because my CLIENT_ID field is indexed. Without that index, Rushmore can't be used in the query plan, and both DLookup and DSum approaches are noticeably slower than with Rushmore.
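If the field were not already indexed, an index could be added in table design view or with DDL along these lines (the index name is hypothetical):

CREATE INDEX idxClientID ON Time_Sub (CLIENT_ID);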

HansUp
So MS Access isn't intelligent enough to use the index even the way that I am currently doing it? [Part] is indexed. Also, you had a good point about what the `tm_materialsUsedByPart` query does, it is doing a single JOIN.
Icode4food
I like @Thomas' suggestion to include the totals in your form's record source, and that's what I would do as long as the record source can be read-only. If you need to edit record source values in your time entry detail subform, use DSum().
HansUp
Access doesn't do anything. But the Jet/ACE database engine does. You can Google SHOWPLAN if you want to find out how it optimizes queries, but that won't help with figuring out how the domain aggregate functions work. In general, Jet/ACE (and the domain aggregate functions) will use all indexes, with a few exceptions (such as NOT used with subqueries, where sometimes it uses the index on only one side of the comparison).
David-W-Fenton