I have a list of TV shows stored in one table. Another table stores show genres (action, romance, comedy).
Most shows have more than one genre, so a single tv_genre column holding the genre ID isn't an option.
I could create a lookup table which would store tv show id + genre id, and I could insert 1 row for e...
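That lookup (junction) table is the standard way to model a many-to-many relationship. A minimal sketch, with hypothetical table and column names:

CREATE TABLE tv_show_genre (
    tv_show_id INT NOT NULL,   -- references the TV show table's id
    genre_id   INT NOT NULL,   -- references the genre table's id
    PRIMARY KEY (tv_show_id, genre_id)
);
-- one row per (show, genre) pair, so a show with three genres gets three rows
INSERT INTO tv_show_genre (tv_show_id, genre_id) VALUES (1, 2), (1, 5), (1, 7);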
UPDATE: I found a solution. See my Answer below.
My Question
How can I optimize this query to minimize my downtime? I need to update over 50 schemas with the number of tickets ranging from 100,000 to 2 million. Is it advisable to attempt to set all fields in tickets_extra at the same time? I feel that there is a solution here that I'm ...
I have this, which I built programmatically:
(( cat1:bobo AND ( ( cat2:jojo ) OR ( cat2:coco ) ) ))
For the sake of debugging, I am looking for a good method that would reduce it to the fewest parentheses needed:
cat1:bobo AND ( cat2:jojo OR cat2:coco )
I'm on C#, but if you have a good technique you've seen, I will port it...
I'm working on selecting locations (city, state) out of a database. The problem is that the query is running a tad slow and I'm not sure how to speed it up. For example:
SELECT CONCAT_WS(', ', city, state) as location, AVG(latitude), AVG(longitude)
FROM places
WHERE city='New York' AND state='NY'
GROUP BY location
There's going to ...
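Two things commonly help with a query like this, though both are assumptions about your schema: a composite index covering the WHERE clause, and grouping by the raw columns rather than the concatenated alias so the optimizer can lean on that index. A sketch:

CREATE INDEX idx_places_city_state ON places (city, state);

SELECT CONCAT_WS(', ', city, state) AS location, AVG(latitude), AVG(longitude)
FROM places
WHERE city = 'New York' AND state = 'NY'
GROUP BY city, state;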
I learned a trick a while back from a DBA friend to speed up certain SQL queries. I remember him mentioning that it had something to do with how SQL Server compiles the query, and that the query path is forced to use the indexed value.
Here is my original query (takes 20 seconds):
select Part.Id as PartId, Location.Id as LocationId
F...
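Without the full query it's hard to say exactly which trick your friend meant, but one SQL Server technique that matches "the query path is forced to use the indexed value" is a table hint. A sketch only, with a hypothetical index name and join column, not necessarily the right fix here:

select Part.Id as PartId, Location.Id as LocationId
from Part with (index (IX_Part_LocationId))   -- IX_Part_LocationId is a hypothetical index name
join Location on Location.Id = Part.LocationId;  -- join column assumed for illustration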
Hi,
this is a follow-up to http://stackoverflow.com/questions/1242223/mysql-find-rows-matching-all-rows-from-joined-table
Thanks to this site, the query runs perfectly.
But now I had to extend the query to search for both artist and track. This has led me to the following query:
SELECT DISTINCT `t`.`id`
FROM `trackwords` AS `tw`
IN...
Assuming the following:
/*
drop index ix_vouchers_nacsz on dbo.vouchers;
drop index ix_vouchers_nacsz2 on dbo.vouchers;
create index ix_vouchers_nacsz on dbo.Vouchers(
FirstName, LastName,
Address, Address2, City,
State, Zip, Email
);
create index ix_vouchers_nacsz2 on dbo.Vouchers(
Email, FirstName, LastName,
Address, Address2,...
I have a table called users with roughly 250,000 records in it. I have another table called staging with around 75,000 records in it. Staging only has one column, msisdn. I want to check to see how many rows in staging are not present in users.
I have the following query, which I have tested on a small data subset, and it seems to work ...
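For reference, the two usual ways to express "rows in staging that are not in users" are an anti-join and NOT EXISTS. A sketch, assuming users also has an msisdn column:

SELECT COUNT(*)
FROM staging s
LEFT JOIN users u ON u.msisdn = s.msisdn
WHERE u.msisdn IS NULL;

-- equivalent formulation with NOT EXISTS
SELECT COUNT(*)
FROM staging s
WHERE NOT EXISTS (SELECT 1 FROM users u WHERE u.msisdn = s.msisdn);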
Using PHP and MySQL,
I have grabbed an array of Facebook user ids from Facebook.
Now I want to find the corresponding username in my application for each id in this array.
Clearly, in my application the user table contains unique username and unique fb_uid values.
My rudimentary understanding of programming led me to 2 ways:
1) use a loop and r...
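The usual alternative to looping one query per id is a single query with an IN () list. The placeholder ids and the users table/column names below are assumptions based on the description:

SELECT fb_uid, username
FROM users
WHERE fb_uid IN (1111, 2222, 3333);  -- build this list from the Facebook id array, binding each value

In PHP the list would typically be built with one prepared-statement placeholder per array element rather than string concatenation.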
Test case schema and data follow as:
create table tmp
(
vals varchar(8),
mask varchar(8)
);
insert into tmp values ('12345678',' ');
insert into tmp values ('12_45678',' _ ');
insert into tmp values ('12345678',' _ ');
insert into tmp values ('92345678',' ');
insert into tmp values ('92345678',' _ ')...
Hey, I'm building a website that requires about 5 MySQL "SELECT * FROM" queries on each page, and I'd like to know if that can slow the page's download speed in any way.
...
Assuming that there's a table called tree_node, that has the primary key called id and has a nullable column called parent_id and the table embeds a tree structure (or a forest), how can one compute the depth of a node in the tree/forest in an efficient way in SQL?
...
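On engines that support recursive common table expressions (PostgreSQL, SQL Server, MySQL 8+), one common way is to walk down from the roots with a recursive CTE. A sketch computing the depth of every node (SQL Server omits the RECURSIVE keyword):

WITH RECURSIVE node_depth AS (
    SELECT id, parent_id, 0 AS depth
    FROM tree_node
    WHERE parent_id IS NULL          -- roots have depth 0
    UNION ALL
    SELECT c.id, c.parent_id, p.depth + 1
    FROM tree_node c
    JOIN node_depth p ON c.parent_id = p.id
)
SELECT id, depth
FROM node_depth;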
Hi,
Please have a look at the SQL code below.
DECLARE @RET TABLE(OID BIGINT NOT NULL,rowid bigint identity);
DECLARE @ResultTbl TABLE(OID BIGINT,sOID BIGINT,partkey bigint);
DECLARE @PATOID as VARCHAR(4000)
SET @PATOID = '95,96,192,253,110,201,201,83,87,88,208,208,208,208'
INSERT INTO @RET SELECT OID FROM dbo.FGETBIGINTLIST(@PATOI...
I have the following SQL code:
select val.PersonNo,
val.event_time,
clg.number_dialed
from vicidial_agent_log val
join
call_log clg on date_add('1970-01-01 02:00:00', interval clg.uniqueid second) = val.event_time
order by val.event_time desc
limit 100;
which executes and returns rows in les...
Consider the Following object model (->> indicates collection):
Customer->Orders
Orders->>OrderLineItems->Product{Price}
The app is focused on processing orders, so most of the time tables showing all the orders that match certain criteria are used in the UI. 99% of the time I am only interested in displaying the Sum of LineTotals, no...
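For reference, the aggregate such a UI needs can be computed on the fly with a query along these lines (table and column names are assumptions drawn from the object model above):

SELECT o.CustomerId, o.OrderId, SUM(li.Quantity * p.Price) AS OrderTotal
FROM Orders o
JOIN OrderLineItems li ON li.OrderId = o.OrderId
JOIN Product p ON p.ProductId = li.ProductId
GROUP BY o.CustomerId, o.OrderId;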
On a very heavy traffic LAMP server I'm using a memory table to keep track of several data items as counters. It is implemented like this:
$query = "INSERT INTO daily_info_mem SET di_num=1 ,di_type=9, di_date = current_date(), di_sid= $sid_int ,di_name='user_counter' ON DUPLICATE KEY UPDATE di_num=di_num+1";
The index sets unique di_t...
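For ON DUPLICATE KEY UPDATE to behave as a counter, the table needs a UNIQUE key covering exactly the columns that identify one counter row. A sketch, assuming (only assuming, since the original column list is cut off) that the key spans di_type, di_date, di_sid and di_name:

ALTER TABLE daily_info_mem
    ADD UNIQUE KEY uq_daily_info (di_type, di_date, di_sid, di_name);
-- with this key in place, the INSERT above creates the row on the first hit
-- and increments di_num on every later hit for the same (type, date, sid, name)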
I happened to develop a module in Drupal and, due to what seemed like limitations of Views, had to use custom SQL. This ran me into some problems with node revisions, and I came to the conclusion
that in Drupal it's best to use its native methods for working with any data. Otherwise, data integrity problems may arise.
And even with the desire to optim...
Hi everyone,
As an exercise (read: interview question) in index optimisation, I need a query which is slow on the standard AdventureWorks database in SQL 2005. All the queries I've tried take about 1 second, and I would prefer a query which takes multiple seconds so that it can be optimised effectively.
Can anyone here create such...
Hi,
I'm using Firebird and have created a table called EVENTS. The columns are:
id (INT) | name (VARCHAR) | category (INT) | website (VARCHAR) | lat (DOUBLE) | lon (DOUBLE)
Now a user wants to search for events in a certain radius around him, but entered only two or three letters of his home city. So we've got, let's say, 200 possible c...
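Once the candidate cities are narrowed down, the radius part is usually handled with a bounding-box filter on lat/lon before any exact distance math. A rough sketch with placeholder coordinates and a 10 km radius; the COS()/PI() built-ins assume Firebird 2.1 or later:

SELECT id, name, lat, lon
FROM EVENTS
WHERE lat BETWEEN 40.7 - (10.0 / 111.0)
              AND 40.7 + (10.0 / 111.0)
  AND lon BETWEEN -74.0 - (10.0 / (111.0 * COS(40.7 * PI() / 180)))
              AND -74.0 + (10.0 / (111.0 * COS(40.7 * PI() / 180)));
-- 111.0 is roughly the number of kilometres per degree of latitude;
-- 40.7 / -74.0 stand in for the user's position and 10.0 for the radius in km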
An ex-coworker of mine wrote the following UPDATE as part of a data import script and it takes nearly 15 minutes to complete on a table of 92k rows.
UPDATE table
SET name = (
SELECT TOP 1 old_name FROM (
SELECT
SUM(r) rev,
number,
name,
intermediate_number,
interm...