We have a query to remove some rows from the table based on an id field (primary key). It is a pretty straightforward query:
delete from OUR_TABLE where ID in (123, 345, ...)
The problem is that the number of IDs can be huge (e.g. 70k), so the query takes a long time. Is there any way to optimize this?
(We are using Sybase, if that matters.)
...
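A common first step (sketched below with a hypothetical #ids_to_delete temp table - untested, adjust to your schema) is to stage the IDs in an indexed temp table and delete via a join rather than a 70k-item IN list, optionally batching with Sybase's SET ROWCOUNT so the transaction log stays manageable:

-- load the IDs into a temp table, however they are produced
select id into #ids_to_delete from ...
create index ix_ids on #ids_to_delete (id)

set rowcount 5000           -- delete in batches of 5000 rows
while 1 = 1
begin
    delete OUR_TABLE
    from OUR_TABLE, #ids_to_delete
    where OUR_TABLE.ID = #ids_to_delete.id
    if @@rowcount = 0 break
end
set rowcount 0              -- reset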
Let's say I have a table with many fields linked to values from other "value tables". Naturally, I declare foreign key constraints on each and every one of them to enforce integrity.
What if the number of such fields eventually ends up in the range of 20-30? Will that somehow "slow" table operations, or not really?
ADDED: Value tables are ex...
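For a sense of where the cost sits, here is a sketch with made-up names - each FK means an extra lookup into the referenced table on INSERT/UPDATE, plus a reverse check against this table when a value-table row is deleted:

create table main_record (
    id        int primary key,
    color_id  int not null references color_values (id),
    size_id   int not null references size_values (id),
    -- ...more of the same, up to 20-30 such columns...
    status_id int not null references status_values (id)
)
-- indexing the FK columns keeps the reverse checks cheap:
create index ix_main_color on main_record (color_id)

Reads are generally unaffected; the constraints only cost you on writes.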
Hello, I'm going to build a microblogging web service (for school, so don't blast me for the lack of a new idea), and I worry that the DB could often be overloaded (users can follow other users, or even tags, so I suppose the SELECTs will be heavy - e.g. fetching the 20 latest messages that match all the tags and users someone follows).
My idea is to create another table, an...
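For reference, the naive version of that feed query might look like the sketch below (table and column names are made up); whether a precomputed table is needed depends on how something like this performs once the join keys and the timestamp are indexed:

select m.*
from messages m
join follows f on f.followee_id = m.user_id
where f.follower_id = 42        -- the viewing user
order by m.created_at desc
limit 20                        -- MySQL/Postgres-style LIMIT assumed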
IQueryable<WebEvent> mySearch =
eventDC.GetBooks()
.Where(p => p.Price.Any(d => d.EventDatetime.Month == fromDate.Month
&& d.EventDatetime.Year == fromDate.Year))
.WithGroup(groupId)
.OrderBy(p => p.Price.Where(r => r.Datetime >= fromDate)
.Or...
To make a long story short, one part of the application I'm working on needs to store a somewhat large volume of data in a database for another part of the application to pick up later on. Normally this would be under 2,000 rows, but it can occasionally exceed 300,000. The data only needs to be stored temporarily and can be deleted afterwards.
...
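A pattern that often fits this hand-off shape (hypothetical names; exact syntax depends on the DBMS) is a dedicated staging table with minimal indexes, bulk-loaded by the producer and truncated once the consumer is done - TRUNCATE being far cheaper than a row-by-row DELETE:

create table staging_handoff (
    batch_id int           not null,  -- which hand-off this row belongs to
    payload  varchar(4000) not null   -- whatever the producer stores
)
-- producer: bulk-insert the rows for a batch
-- consumer: read them back, then wipe the table in one cheap operation
truncate table staging_handoff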
How often do you perform regular maintenance, such as stress-testing your application and/or tuning your database indexes?
E.g., do you tune (defrag, reorganise or rebuild) your database indexes once a week, every six months, or only after large volumes of data have been loaded? And do you stress test your application afte...
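On SQL Server, for instance, the two flavours of index tuning mentioned look like this (dbo.Orders is a made-up table; the usual rule of thumb is to reorganise at moderate fragmentation and rebuild at heavy fragmentation):

alter index all on dbo.Orders reorganize   -- lighter, online operation
alter index all on dbo.Orders rebuild      -- heavier, fully resets fragmentation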
Hi there,
This is the first time I'm approaching an extremely high-volume situation: an ad server based on MySQL. The query it uses involves a lot of JOINs and is generally just slow. (This is Rails ActiveRecord, btw.)
sel = Ads.find(:all, :select => '*', :joins => "JOIN campaigns ON ads.campaign_id = campai...
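A typical first step with JOIN-heavy MySQL queries is to make sure every join key is indexed and then read the plan with EXPLAIN - a sketch using the column names visible in the query (verify against the actual schema):

create index idx_ads_campaign_id on ads (campaign_id);
explain select ads.* from ads join campaigns on ads.campaign_id = campaigns.id;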
Hi,
I have a query that I'm running against two equivalent databases hosted on separate MS SQL 2005 servers. I want to measure the query's execution time on both servers, so I tried the following:
SET STATISTICS TIME ON
GO
SELECT TOP 10000 *
FROM table
GO
SET STATISTICS TIME OFF;
GO
And got the following result:
SQL Server parse and...
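An alternative sketch for getting a single comparable number from each server (your_table is a placeholder; GETDATE() and DATEDIFF are available on SQL 2005):

declare @t0 datetime
set @t0 = getdate()
select top 10000 * from your_table       -- the query being measured
select datediff(ms, @t0, getdate()) as elapsed_ms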
So, it seems to me that a query on a table with 10k records and a query on a table with 10 million records are almost equally fast if they both fetch roughly the same number of records and make good use of simple indexes (an auto-increment, record-id type of indexed field).
My question is, will this extend to a table with close to 4 billio...
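For intuition on why that tends to hold: a B-tree index with a fanout of, say, 500 keys per page (an illustrative number, not a measurement) needs roughly log_500(N) levels - about 2 for 10k rows, 3 for 10 million, and only about 4 for 4 billion - so each extra order of magnitude in table size adds at most one extra page read per lookup.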