Hi, this query pops up in my slow query log:
SELECT
COUNT(*) AS ordersCount,
SUM(ItemsPrice + COALESCE(extrasPrice, 0.0)) AS totalValue,
SUM(ItemsPrice) AS totalValue,
SUM(std_delivery_charge) AS totalStdDeliveryCharge,
SUM(extra_delivery_charge) AS totalExtraDeliveryCharge,
this_.type ...
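A first diagnostic step here would typically be to run the statement through EXPLAIN to see whether the aggregation can use an index. The FROM clause and GROUP BY below are hypothetical, since the excerpt is cut off; only the column names come from the query above.
-- Hypothetical table name and GROUP BY; the goal is only to see which
-- index, if any, the COUNT/SUM scan uses.
EXPLAIN
SELECT this_.type,
       COUNT(*) AS ordersCount,
       SUM(ItemsPrice + COALESCE(extrasPrice, 0.0)) AS totalValue
FROM orders this_
GROUP BY this_.type;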
Does having a separate file for each table improve InnoDB performance in MySQL? Are there any other such performance-tuning tips for MySQL
...
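The setting in question is innodb_file_per_table. A minimal sketch of checking and enabling it follows; whether it can be changed at runtime depends on the MySQL version, and its main benefit is per-table disk-space management rather than raw query speed.
-- Check whether per-table tablespaces are enabled:
SHOW VARIABLES LIKE 'innodb_file_per_table';
-- Enable it; on versions where the variable is dynamic this works at runtime,
-- otherwise set innodb_file_per_table=1 under [mysqld] in my.cnf and restart.
-- Either way it only affects tables created (or rebuilt) afterwards.
SET GLOBAL innodb_file_per_table = ON;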
We currently have a WinForms application that we want to slowly migrate to a web application.
One screen is a time sheet entry system that uses DataWindow and is very slow and buggy.
Anyway, the time sheet screen has five sections that are saved in real time. A finished time sheet needs 2-5 of these sections.
Currently, the system run...
I have a denormalized table product with about 6 million rows (~ 2GB) mainly for lookups. Fields include price, color, unitprice, weight, ...
I have BTREE indexes on color etc. Query conditions are dynamically generated from the web, such as
select count(*) from product where color=1 and price > 5 and price <100 and weight > 30 ... et...
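A common starting point for this kind of filter, sketched here assuming the table and column names from the excerpt: a composite index that leads with the equality column, then checking the plan with EXPLAIN (MySQL can use such an index up to and including the first range column).
-- Hypothetical composite index: equality column first, then one range column.
ALTER TABLE product ADD INDEX idx_color_price (color, price);

-- See which index the optimizer actually picks for a generated query:
EXPLAIN SELECT COUNT(*) FROM product
WHERE color = 1 AND price > 5 AND price < 100 AND weight > 30;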
MySQL Workbench reports a value called "Key Efficiency" in association with server health. What does this mean and what are its implications?
From MySQL.com, "Key Efficiency" is:
...an indication of the number of key_read_requests that resulted in actual key_reads.
OK, so what does that mean? What does it tell me about how I'm s...
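For reference, the number is derived from the server's key-cache counters; a minimal sketch of inspecting the raw values behind it (Key Efficiency is roughly 1 - Key_reads / Key_read_requests, i.e. the MyISAM key-cache hit rate):
-- Counters behind the Workbench graph:
SHOW GLOBAL STATUS LIKE 'Key_read%';
-- The cache whose hit rate is being measured:
SHOW VARIABLES LIKE 'key_buffer_size';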
I have a 10M-row table product with fields like color (int), price (float), weight (float), unitprice (int), etc ... Now web users dynamically generate queries to look up data from this table with arbitrary conditions (color is always one of them) and an ORDER BY, such as
select * from product where color=1 and price >5 and price <220 and .....
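One sketch of how such queries are usually indexed, assuming price is the ORDER BY column and a LIMIT is applied (neither is shown in the excerpt): a composite index on the mandatory equality column plus the sort column lets MySQL return rows already in the requested order instead of filesorting.
-- Hypothetical index; color is the mandatory equality, price the sort/range column.
ALTER TABLE product ADD INDEX idx_color_price (color, price);

-- ORDER BY price and LIMIT 50 are assumptions for illustration.
EXPLAIN SELECT * FROM product
WHERE color = 1 AND price > 5 AND price < 220
ORDER BY price
LIMIT 50;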
Hi
For my iPhone app I'm creating some rotating gears with the help of some subclassed UIViews.
I have created subclasses that rotate themselves triggered by a timer.
In one place I have one of these subclasses nested within another one (so rotation within rotation; think of the moon rotating around the earth and around its own axis). It all rotates fine an...
Hi all,
I have a table with several columns and a unique RAW column. I created a unique index on the RAW column.
My query selects all columns from the table (6 million rows).
When I look at the cost of the query, it is too high (51K), and it is still using an INDEX FULL SCAN. The query does not have any filter conditions; it is a plain select * from...
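Since every row and every column is being returned, a full table scan is usually the cheaper plan; a sketch (with a hypothetical table name) of comparing the two plans and refreshing statistics:
-- Hypothetical table name my_table with alias t; force a full scan and compare costs.
EXPLAIN PLAN FOR SELECT /*+ FULL(t) */ * FROM my_table t;
SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);

-- Stale statistics can also push the optimizer toward the index (SQL*Plus syntax):
EXEC DBMS_STATS.GATHER_TABLE_STATS(USER, 'MY_TABLE');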
I'm trying to write a function to rotate an image matrix using the loop-tiling technique. However, I'm running into some issues with getting it to work properly.
EDIT:
Here's my updated code that works, but only when n is a multiple of the block size. How would I go about handling varying matrix sizes? Right now, I'm just using square ...
Hi,
I have a table that has more than 380 million records...
I have a stored procedure which:
1. Deletes some records from it.
2. Inserts something into it.
The whole procedure takes around 30 minutes to execute; of that, the DELETE takes 28 minutes.
The DELETE is a simple statement -> Delete a where condition_1 AND condition_2 AND c...
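One thing worth checking, sketched here with placeholder column names standing in for condition_1 and condition_2 (and taking "a" as the table name from the excerpt): whether the DELETE's WHERE clause is backed by an index, since otherwise every execution scans the 380-million-row table.
-- Placeholder index and column names; on a table this size building the index
-- is itself a long one-off operation, but it keeps each DELETE from scanning everything.
CREATE INDEX ix_a_delete_filter ON a (cond1_col, cond2_col);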
I have a site - http://www.tubeloop.com/ - that creates YouTube playlists in a loop. The codebase uses three jQuery plugins I've created: a YouTube API wrapper, a YouTube player wrapper, and a radial menu. The site is a mashup and it's extremely heavy on the client side, as requests to YouTube, Facebook, and Meebo are all made th...
If I have a table with a huge amount of data, and I do an incremental delete instead of a one-time delete, what is the benefit?
One-time delete
DELETE table_1
WHERE BID = @BID
AND CN = @CN
AND PD = @PD;
Incremental Delete
While (1=1)
Begin
DELETE TOP (100000) FROM table_1
WHERE BID = @BID
AND CN = @CN ...
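For reference, the batched form normally needs an exit test so the loop stops once nothing is left; the @@ROWCOUNT check below is an assumption (the excerpt is cut off before that point), but it is the usual shape of the pattern:
WHILE (1 = 1)
BEGIN
    -- Delete one batch; smaller transactions keep locks and log growth bounded.
    DELETE TOP (100000) FROM table_1
    WHERE BID = @BID AND CN = @CN AND PD = @PD;

    -- Stop when the last batch deleted nothing.
    IF @@ROWCOUNT = 0
        BREAK;
END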
I was looking at a very slow SQL query (originating from a Java app using Hibernate deployed in JBoss 5.1). This particular query returned about 10K records but still took 40s or more.
I ended up sniffing the traffic to the database (Wireshark has a dissector for TNS) and found something unexpected. When data was coming from the serve...
I am using a large number of global temporary tables to generate huge reports against an Oracle 10g database. Each report uses, say, 4 to 5 global temporary tables (GTTs). But as far as I understand the concept of GTTs, the data is created on the fly per session for a different set of parameters.
For example, in my scenario,...
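That is indeed how GTTs behave: the table definition is shared, but each session only ever sees the rows it inserted itself. A minimal sketch with hypothetical names, using ON COMMIT PRESERVE ROWS so the data survives the commits a report typically makes:
-- Hypothetical scratch table for one report; rows are private to the session
-- that inserted them and are removed when that session ends.
CREATE GLOBAL TEMPORARY TABLE report_scratch (
    report_id NUMBER,
    line_no   NUMBER,
    amount    NUMBER
) ON COMMIT PRESERVE ROWS;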
I have simple methods to export a DataTable to XLS using strings. The number of columns is 5 - 30, and the number of rows might be from 1 to 1000. Sometimes there is a problem with performance, and I would appreciate advice on what I can change in my code. I'm using .NET 4.0
public string FormatCell(string columnName, object value)
{
StringB...
I have a performance-sensitive .NET application.
It can run up to 32 threads in parallel. We need to improve it even further. Increasing the thread count may not really help, since the number of failed processing attempts increases as we increase the number of threads.
My question here is,
What are the options...
Hi,
I have a function to resize a bitmap that is called several times (real-time image processing), and the following code is quite slow, so my app's performance is bad. Does anyone know another, faster way of resizing an image with CF?
Image bmp = new Bitmap(size.Width, size.Height);
using (var g = Graph...
Hi,
I'm working on my CouchDB with Python, because I love Python. The problem is: on my machine (it's a Seagate Dockstar) it's quite slow. How can I increase the speed?
a) I've tried to use psyco. It's not available for that platform.
b) I've tried to put the imports outside my function definitions. This doesn't work since the couchpy th...
I was attempting to evaluate various Rails server solutions. First on my list was an nginx + passenger system. I spun up an EC2 instance with 8 gigs of RAM and 2 processors, installed nginx and passenger, and added this to the nginx.conf file:
passenger_max_pool_size 30;
passenger_pool_idle_time 0;
rails_framework_spawner_idle_time 0;...
Hello everyone,
I am writing a generic hash map in C++ which uses chaining to deal with collisions.
Say I have a hash map with 11 buckets, and I insert 8 items. The hash function distributes them as follows:
bucket[0] = empty
bucket[1] = 2 elements
bucket[2] = empty
bucket[3] = 1 element
bucket[4] = 1 element
bucket[5] = 3 eleme...