query-optimization

Optimizing MySQL query to avoid using "Using filesort"

I need your help to optimize this query to avoid "Using filesort". The job of the query is to select all the articles that belong to a specific tag. The query is: select title from tag, article where tag = 'Riyad' AND tag.article_id = article.id order by tag.article_id The table structures are the followi...
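
A possible fix (a sketch, not from the original thread; the index name is illustrative): with a composite index on tag(tag, article_id), MySQL can satisfy both the equality filter and the ORDER BY directly from the index, so EXPLAIN should no longer report "Using filesort".

    ALTER TABLE tag ADD INDEX idx_tag_article (tag, article_id);

    SELECT article.title
    FROM tag
    JOIN article ON article.id = tag.article_id
    WHERE tag.tag = 'Riyad'
    ORDER BY tag.article_id;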

SQL query performance optimization (TimesTen)

Hi community, I need some help with TimesTen DB query optimization. I made some measurements with a Java profiler and found the code section that takes most of the time (this code section executes the SQL query). What is strange is that this query becomes expensive only for some specific input data. Here’s the example. We have two tables that ...

Optimizing ROW_NUMBER() in SQL Server

We have a number of machines which record data into a database at sporadic intervals. For each record, I'd like to obtain the time period between this recording and the previous recording. I can do this using ROW_NUMBER as follows: WITH TempTable AS ( SELECT *, ROW_NUMBER() OVER (PARTITION BY Machine_ID ORDER BY Date_Time) AS Orde...
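
For comparison, on SQL Server 2012 or later the same gap can be computed without the ROW_NUMBER self-join by using LAG (a sketch; the source table name Recordings is a placeholder, since the excerpt is cut off before naming it):

    SELECT Machine_ID,
           Date_Time,
           DATEDIFF(second,
                    LAG(Date_Time) OVER (PARTITION BY Machine_ID ORDER BY Date_Time),
                    Date_Time) AS SecondsSincePrevious
    FROM Recordings;

Either form benefits from an index on (Machine_ID, Date_Time).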

Which fieldtype is best for storing PRICE values?

Hi there, I am wondering what's the best "price field" in SQL Server for a shop-like structure? Looking at this overview: http://www.teratrax.com/sql_guide/data_types/sql_server_data_types.html We have data types called money, smallmoney, then we have decimal/numeric, and lastly float and real. Name, memory/disk-usage and value ranges: M...
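
For exact monetary amounts the usual recommendation is decimal/numeric with an explicit precision and scale: it stores exact values (unlike float/real, which are approximate), and unlike money it lets you choose the scale. A minimal sketch with an illustrative table:

    CREATE TABLE Product (
        ProductId int IDENTITY(1,1) PRIMARY KEY,
        Name      nvarchar(100) NOT NULL,
        Price     decimal(19, 4) NOT NULL   -- exact storage; 4 decimal places, adjust as needed
    );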

Database query optimization

Ok my Giant friends, once again I seek a little space on your shoulders :P Here is the issue: I have a Python script that is fixing some database issues, but it is taking way too long. The main update statement is this: cursor.execute("UPDATE jiveuser SET username = '%s' WHERE userid = %d" % (newName,userId)) That is getting called ab...
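
One commonly suggested alternative (a sketch, not from the original thread): load the (userid, new name) pairs into a work table in one bulk operation and apply them with a single set-based UPDATE instead of one statement per row. The work-table name is illustrative, and the UPDATE ... FROM join syntax shown is PostgreSQL/SQL Server style; other engines spell the join-update differently.

    CREATE TABLE username_fix (
        userid  integer PRIMARY KEY,
        newname varchar(255) NOT NULL
    );

    -- bulk-load the pairs into username_fix (COPY, LOAD DATA, executemany, ...), then:
    UPDATE jiveuser
    SET username = f.newname
    FROM username_fix AS f
    WHERE jiveuser.userid = f.userid;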

Composite primary keys in N-M relation or not?

Let's say we have 3 tables (actually I have 2 at the moment, but this example might illustrate the thought better): [Person] ID: int, primary key Name: nvarchar(xx) [Group] ID: int, primary key Name: nvarchar(xx) [Role] ID: int, primary key Name: nvarchar(xx) [PersonGroupRole] Person_ID: int, PRIMARY COMPOSITE OR NOT? Group...
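
One common pattern (a sketch, assuming SQL Server given the nvarchar columns): make the composite of the three foreign keys the primary key of the link table, and add a secondary index for lookups that start from the other side.

    CREATE TABLE PersonGroupRole (
        Person_ID int NOT NULL REFERENCES Person(ID),
        Group_ID  int NOT NULL REFERENCES [Group](ID),
        Role_ID   int NOT NULL REFERENCES Role(ID),
        PRIMARY KEY (Person_ID, Group_ID, Role_ID)
    );

    -- optional: supports queries that start from the group rather than the person
    CREATE INDEX IX_PGR_Group ON PersonGroupRole (Group_ID, Person_ID);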

Rewriting query in order to remove FIND_IN_SET?

My MySQL query looks like this: SELECT pages.*, showcase.*, project.* FROM pages INNER JOIN showcase ON showcase.pid = pages.uid AND showcase.deleted != 1 INNER JOIN project ON FIND_IN_SET(project.uid, showcase.projects) WHERE pages.deleted != 1 AND pages.pid = 14 AND pages.dokTy...
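
The usual way to get rid of FIND_IN_SET (a sketch, not from the original thread) is to normalize the comma-separated showcase.projects column into a junction table so the project join can use an index. This assumes showcase has a uid primary key, which the excerpt does not show.

    CREATE TABLE showcase_project (
        showcase_uid int NOT NULL,
        project_uid  int NOT NULL,
        PRIMARY KEY (showcase_uid, project_uid)
    );

    SELECT pages.*, showcase.*, project.*
    FROM pages
    INNER JOIN showcase         ON showcase.pid = pages.uid AND showcase.deleted != 1
    INNER JOIN showcase_project ON showcase_project.showcase_uid = showcase.uid
    INNER JOIN project          ON project.uid = showcase_project.project_uid
    WHERE pages.deleted != 1 AND pages.pid = 14;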

Efficiently select top row for each category in the set

I need to select the top row for each category from a known set (somewhat similar to this question). The problem is how to make this query efficient over a large number of rows. For example, let's create a table that stores temperature recordings in several places. CREATE TABLE #t ( placeId int, ts datetime, temp int, PRI...
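
On the sample table, one common shape (a sketch; the index name is illustrative) is to index (placeId, ts DESC) and take ROW_NUMBER per place ordered by ts descending, so the newest recording per place can be read off the top of each index range:

    CREATE INDEX IX_t_place_ts ON #t (placeId, ts DESC);

    WITH ranked AS (
        SELECT placeId, ts, temp,
               ROW_NUMBER() OVER (PARTITION BY placeId ORDER BY ts DESC) AS rn
        FROM #t
    )
    SELECT placeId, ts, temp
    FROM ranked
    WHERE rn = 1;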

Query Optimization using WHERE IN

I'm wondering if someone can explain how IN is evaluated? Well, ultimately I'm trying to find out why this query is slow and how to optimize it. I waited over 3 minutes, and when I cancelled the query it had only returned 1000 rows, which doesn't seem like it should take that long. SELECT t2.* FROM report_tables.roc_test_results as ...
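
Since the query is cut off before the IN clause, here is only a generic, hedged illustration: when IN wraps a subquery, rewriting it as EXISTS (or a JOIN) against an indexed column often gives the optimizer a better plan. The filter table and key_col below are placeholders, not the real schema.

    SELECT t2.*
    FROM report_tables.roc_test_results AS t2
    WHERE EXISTS (SELECT 1
                  FROM some_filter_table AS t1
                  WHERE t1.key_col = t2.key_col);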

MySQL select specific cols slower than select *

My MySQL is not strong, so please forgive any rookie mistakes. Short version: SELECT locId,count,avg FROM destAgg_geo is significantly slower than SELECT * from destAgg_geo. prtt.destAgg is a table keyed on dst_ip (PRIMARY). mysql> describe prtt.destAgg; +---------+------------------+------+-----+---------+-------+ | Field | Type ...
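
One hedged thing to check (not from the original thread): whether the narrower query can be answered from an index at all. A covering index over the three requested columns lets MySQL serve it without touching the full rows, which EXPLAIN reports as "Using index". The index name is illustrative, and count/avg are backtick-quoted here to avoid clashing with the function names.

    ALTER TABLE destAgg_geo ADD INDEX idx_loc_count_avg (locId, `count`, `avg`);

    EXPLAIN SELECT locId, `count`, `avg` FROM destAgg_geo;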

CREATE VIEW for MySQL for the last 30 days

I know I am writing queries wrong, and when we get a lot of traffic our database gets hit HARD and the page slows to a crawl... I think I need to write queries based on a CREATE VIEW covering the last 30 days from CURDATE()?? But I'm not sure where to begin, or whether this will be a MORE efficient query for the database. Anyways, here is a sample qu...
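
For what it's worth, a MySQL view is just a stored query, not a cached result, so it only helps readability; the date filter still needs an index on the date column to be cheap. A minimal sketch with placeholder table and column names, since the sample query is cut off:

    CREATE VIEW recent_orders AS
    SELECT *
    FROM orders
    WHERE order_date >= CURDATE() - INTERVAL 30 DAY;

    -- the range filter can only use an index if the date column has one
    ALTER TABLE orders ADD INDEX idx_order_date (order_date);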

Web page database query optimization

I am putting together a web page which is quite 'expensive' in terms of database hits. I don't want to start optimizing at this stage - though with me trying to hit a deadline, I may end up not optimizing at all. Currently the page requires 18 (that's right eighteen) hits to the db. I am already using joins, and some of the queries are ...

Multiple keys/indexes/constraints when joining three tables

I'm getting more and more confused as I try to sort out the ambiguities of these terms. I have a query that is taking longer than necessary simply because I cannot get the key on one table to work for the other joins. I have only one column that is "Unique" in t1; there are others which are 73.8% unique, and I cannot figure out h...
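
A generic, hedged illustration only (the real schema is cut off): each join predicate usually wants an index on the joined column of the table being looked up, and the one fully unique column can carry a UNIQUE index so the optimizer knows at most one row will match. Table and column names below are placeholders.

    CREATE UNIQUE INDEX ux_t1_code ON t1 (code);
    CREATE INDEX ix_t2_code ON t2 (code);
    CREATE INDEX ix_t3_code ON t3 (code);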

Does anybody have any suggestions on which of these two approaches is better for a large delete?

Approach #1:

    DECLARE @count int
    SET @count = 2000
    DECLARE @rowcount int
    SET @rowcount = @count
    WHILE @rowcount = @count
    BEGIN
        DELETE TOP (@count) FROM ProductOrderInfo
        WHERE ProductId = @product_id AND bCopied = 1 AND FileNameCRC = @localNameCrc
        SELECT @rowcount = @@ROWCOUNT
        WAITFOR DELAY '000:00:00.400'

Approach #2:

    DECLARE @c...

MySQL inserts & updates optimized

This is an optimization question, mostly. I have many forms on my sites that do simple inserts and updates (nothing complicated). But several of the forms' input fields are not required and may be left empty (again, nothing complicated). However, my SQL query will have all columns in the statement. My question: is it best to optimi...
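
A hedged sketch of the two usual options (illustrative schema, not from the original post): pass every column and supply NULL for the empty fields, or list only the columns the user filled in and let the defaults apply. The performance difference is normally negligible next to indexing and round-trip costs.

    CREATE TABLE contact_form (
        id       int AUTO_INCREMENT PRIMARY KEY,
        name     varchar(100) NOT NULL,
        email    varchar(255) NOT NULL,
        phone    varchar(30)  NULL DEFAULT NULL,
        comments text         NULL
    );

    -- option 1: every column listed, empty fields passed as NULL
    INSERT INTO contact_form (name, email, phone, comments)
    VALUES ('Ada', 'ada@example.com', NULL, NULL);

    -- option 2: only the columns that were actually filled in
    INSERT INTO contact_form (name, email)
    VALUES ('Ada', 'ada@example.com');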

Date arithmetic using integer values

Problem: String concatenation is slowing down a query: date(extract(YEAR FROM m.taken)||'-1-1') d1, date(extract(YEAR FROM m.taken)||'-1-31') d2 This is realized in code as part of a string, which follows (where the p_ variables are integers, provided as input by end users): date(extract(YEAR FROM m.taken)||''-'||p_month1||'-'||p_day...
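
If this is PostgreSQL (the ||-concatenation and extract syntax suggest it), one way to avoid building strings at all, on 9.4 or later, is make_date(), which assembles a date directly from integer year/month/day values; the table name below is a placeholder.

    -- instead of date(extract(YEAR FROM m.taken)||'-'||p_month1||'-'||p_day1)
    SELECT make_date(extract(YEAR FROM m.taken)::int, p_month1, p_day1) AS d1
    FROM measurement m;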

SQL server virtual memory usage and performance

Hello, I have a very large DB used mostly for analytics. The performance overall is very sluggish. I just noticed that when running the query below, the amount of virtual memory used greatly exceeds the amount of physical memory available. Currently, physical memory is 10GB (10238k bytes) whereas the virtual memory returns significantly...

2 SELECTs or 1 JOIN query?

I have 2 tables: book ( id, title, age ) ----> 100 million rows author ( id, book_id, name, born ) ----> 10 million rows Now, supposing I have a generic id of a book. I need to print this page: Title: mybook authors: Tom, Graham, Luis, Clarke, George So... what is the best way to do this? 1) Simple join like this: Select...
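
For a single known book id, either approach only has to touch one book row plus its authors, provided the author-side join column is indexed; without that index the join has to scan the 10-million-row author table. A hedged sketch (index name illustrative, id 12345 a placeholder):

    CREATE INDEX idx_author_book ON author (book_id);

    SELECT b.title, a.name
    FROM book b
    JOIN author a ON a.book_id = b.id
    WHERE b.id = 12345;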

Data access from a single table in SQL Server 2005 is too slow

Following is the script of the table. Accessing data from this table is too slow. SET ANSI_NULLS ON GO SET QUOTED_IDENTIFIER ON GO CREATE TABLE [dbo].[Emails]( [id] [int] IDENTITY(1,1) NOT NULL, [datecreated] [datetime] NULL CONSTRAINT [DF_Emails_datecreated] DEFAULT (getdate()), [UID] [nvarchar](250) COLLATE Latin1_Ge...
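
Since the visible part of the script only defines the table, a hedged first step is to make sure there is a primary key / clustered index on id and an index matching whatever the slow query filters on; the UID index below is purely illustrative, assuming lookups by UID.

    ALTER TABLE dbo.Emails
        ADD CONSTRAINT PK_Emails PRIMARY KEY CLUSTERED (id);

    CREATE NONCLUSTERED INDEX IX_Emails_UID ON dbo.Emails (UID);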

SQL UPDATE: a calculation is used multiple times, can it just be calculated once?

Using: UPDATE `play` SET `counter1` = `counter1` + LEAST(`maxchange`, FLOOR(`x` / `y`) ), `counter2` = `counter2` - LEAST(`maxchange`, FLOOR(`x` / `y`) ), `x` = MOD(`x`, `y`) WHERE `x` > `y` AND `maxchange` > 0 As you can see, LEAST(maxchange, FLOOR(x / y) ) is used multiple times, but it should always have th...
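
One workaround that is often suggested (a sketch; it relies on assignments in a single-table UPDATE being evaluated left to right, which the MySQL manual describes but which is worth verifying on your version): capture the value in a user variable in the first assignment and reuse it.

    UPDATE `play`
    SET `counter1` = `counter1` + (@delta := LEAST(`maxchange`, FLOOR(`x` / `y`))),
        `counter2` = `counter2` - @delta,
        `x` = MOD(`x`, `y`)
    WHERE `x` > `y` AND `maxchange` > 0;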