Have you ever seen any of these error messages?

-- SQL Server 2000

Could not allocate ancillary table for view or function resolution.
The maximum number of tables in a query (256) was exceeded.

-- SQL Server 2005

Too many table names in the query. The maximum allowable is 256.

If yes, what have you done?

Given up? Convinced the customer to simplify their demands? Denormalized the database?


@(everyone wanting me to post the query):

  1. I'm not sure if I can paste 70 kilobytes of code in the answer editing window.
  2. Even if I can, it won't help, since those 70 kilobytes of code reference 20 or 30 views that I would also have to post; otherwise the code will be meaningless.

I don't want to sound like I am boasting here, but the problem is not in the queries. The queries are optimal (or at least almost optimal). I have spent countless hours optimizing them, looking for every single column and every single table that could be removed. Imagine a report with 200 or 300 columns that has to be filled with a single SELECT statement (because that's how it was designed a few years ago, when it was still a small report).

+1  A: 

I have never come across this kind of situation, and to be honest the idea of referencing > 256 tables in a query fills me with mortal dread.

Your first question should probably be "Why so many?", closely followed by "What bits of information do I NOT need?" I'd be worried that the amount of data being returned from such a query would begin to impact the performance of the application quite severely, too.

ZombieSheep
Imagine a customer who wants to see a list (in a grid) of the items in their stores, along with a lot of related information (e.g. about the first order of every item, about the last order, about the first delivery, about the last delivery, about the customers who buy it, about the costs of delivery, ...). Believe me, it is possible; I've seen it, and I have had to cope with it. :) Yes, the impact on performance can be severe in such situations.
Marek Grzenkowicz
A: 

Are you using SQL Server 2000 SP3?

Stu
SQL Server 2000 SP4 and SQL Server 2005 SP2.
Marek Grzenkowicz
A: 

I'd like to see that query, but I imagine it's some problem with some sort of iterator, and while I can't think of any situations where it's possible, I bet it's from a bad while/case/cursor or a ton of poorly implemented views.

Shawn Simon
No iterator of any kind. Just a single SELECT statement.
Marek Grzenkowicz
A: 

Could you possibly create some views?

Zerofiz
Views won't help. Tables used in views count towards the limit, too.
Marek Grzenkowicz
A: 

@chopeen: Man, I work for a large videogames retailer's European headquarters in Dublin. We have hundreds of thousands of SKUs and I don't think we even have 256 tables in our database! Well, we do... but we sure as hell could never need to reference them all at once. What software is used to manage all of this?

Rob Burke
A single table is enough to run into this kind of problem - try something like this: SELECT * FROM dbo.FOOBAR F001 CROSS JOIN dbo.FOOBAR F002 CROSS JOIN dbo.FOOBAR F003 ... CROSS JOIN dbo.FOOBAR F300
Marek Grzenkowicz
Software that's used to manage all of this: Query Analyzer + Visual Source Safe.
Marek Grzenkowicz
+1  A: 

@chopeen You could change the way you're calculating these statistics and instead keep a separate table of per-product stats: when an order is placed, loop through the products and update the appropriate records in the stats table. This would shift a lot of the calculation load to the checkout page, rather than running everything in one huge query when generating a report. Of course, there are some stats that aren't going to work as well this way, e.g. tracking customers' next purchases after purchasing a particular product.
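
A minimal sketch of this idea in T-SQL; the dbo.ProductStats table, its columns, and @ProductId are hypothetical names, not anything from the original schema:

-- hypothetical pre-aggregated per-product stats table
CREATE TABLE dbo.ProductStats (
    ProductId     INT PRIMARY KEY,
    OrderCount    INT NOT NULL DEFAULT 0,
    LastOrderDate DATETIME NULL
);

-- run at checkout for each product in the order, so the report
-- later reads precomputed values instead of joining everything;
-- assumes a stats row already exists for the product
DECLARE @ProductId INT;
SET @ProductId = 42;

UPDATE dbo.ProductStats
SET    OrderCount    = OrderCount + 1,
       LastOrderDate = GETDATE()
WHERE  ProductId = @ProductId;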

pix0r
+5  A: 

For SQL Server 2005, I'd recommend using table variables and partially building the data as you go.

To do this, create a table variable that represents the final result set you want to send to the user.

Then find your primary table (say the orders table in your example above) and pull that data, plus a bit of supplementary data that is only, say, one join away (customer name, product name). You can do an INSERT INTO ... SELECT to put this straight into your table variable (SELECT ... INTO cannot target a table variable).

From there, iterate through the table and, for each row, run a bunch of small SELECT queries that retrieve all the supplemental data you need for your result set. Insert these values into each column as you go.

Once complete, you can then do a simple SELECT * from your table variable and return this result set to the user.

I don't have any hard numbers for this, but in three distinct instances I have worked on to date, doing these smaller queries has actually been faster than doing one massive SELECT query with a bunch of joins.
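
A minimal sketch of this pattern; dbo.Orders, dbo.Customers, dbo.Deliveries and their columns are hypothetical stand-ins, and a correlated UPDATE stands in for the per-row loop:

DECLARE @Report TABLE (
    OrderId      INT PRIMARY KEY,
    CustomerName NVARCHAR(100),
    LastDelivery DATETIME NULL
);

-- seed the table variable from the primary table plus one easy join
INSERT INTO @Report (OrderId, CustomerName)
SELECT o.OrderId, c.Name
FROM dbo.Orders o
JOIN dbo.Customers c ON c.CustomerId = o.CustomerId;

-- fill each remaining column with its own small query
UPDATE r
SET LastDelivery = (SELECT MAX(d.DeliveredOn)
                    FROM dbo.Deliveries d
                    WHERE d.OrderId = r.OrderId)
FROM @Report r;

-- return the finished result set to the user
SELECT * FROM @Report;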

Dillie-O
A: 

Post the query :D

Also, I feel like one of the possible problems could be having a ton (read 200+) of name/value tables, which could be condensed into a single lookup table.

Shawn Simon
A: 

I agree with modesty; posting the query [sanitized] could help us suggest improvements for your specific case rather than a general "fix the DB".

Rob Burke
A: 

[image: a t-shirt]

Kevin
+1  A: 

This would happen all the time when writing Reporting Services Reports for Dynamics CRM installations running on SQL Server 2000. CRM has a nicely normalised data schema which results in a lot of joins. There's actually a hotfix around that will up the limit from 256 to a whopping 260: http://support.microsoft.com/kb/818406 (we always thought this a great joke on the part of the SQL Server team).

The solution, as Dillie-O alludes to, is to identify appropriate "sub-joins" (preferably ones that are used multiple times) and factor them out into temp tables or table variables that you then use in your main joins. It's a major PIA and often kills performance. I'm sorry for you.
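
A minimal sketch of that factoring; #OrderTotals and the dbo.OrderLines/dbo.Products names are hypothetical:

-- factor a sub-join the report repeats into a temp table
SELECT ol.ProductId, SUM(ol.Quantity) AS TotalQty
INTO #OrderTotals
FROM dbo.OrderLines ol
GROUP BY ol.ProductId;

-- the main query now joins one temp table instead of repeating
-- the underlying joins (and their table count) every time
SELECT p.ProductId, p.Name, t.TotalQty
FROM dbo.Products p
JOIN #OrderTotals t ON t.ProductId = p.ProductId;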

@Kevin, love that tee -- says it all :-).

friism
A: 

I had this same problem... my development box runs SQL Server 2008 (where the view worked fine), but on production (SQL Server 2005) it didn't. I ended up creating views to avoid this limitation, using the new views as part of the query in the view that threw the error.

Kind of silly considering the logical execution is the same...
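
A minimal sketch of that workaround; every table and view name here is hypothetical:

-- wrap part of the failing query in its own view...
CREATE VIEW dbo.vOrderSummary AS
SELECT o.OrderId, o.CustomerId, SUM(ol.Quantity) AS TotalQty
FROM dbo.Orders o
JOIN dbo.OrderLines ol ON ol.OrderId = o.OrderId
GROUP BY o.OrderId, o.CustomerId;
GO

-- ...and reference the new view from the query that threw the error
CREATE VIEW dbo.vReport AS
SELECT c.Name, s.TotalQty
FROM dbo.Customers c
JOIN dbo.vOrderSummary s ON s.CustomerId = c.CustomerId;
GO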

LeRoy DeNooyer
It's weird that it helped - as far as I know, tables used in views count towards the limit. Are you using indexed views?
Marek Grzenkowicz
A: 

Please check your table fields.

om Prakash
What do you mean? What should I check?
Marek Grzenkowicz