MySQL questions are some of my favorites on StackOverflow.
Unfortunately, things like this:
SELECT foo, bar, baz, quux, frozzle, lambchops FROM something JOIN somethingelse ON 1=1 JOIN (SELECT * FROM areyouserious) v ON 0=5 WHERE lambchops = 'good';
make my eyes bleed.
Also, attempts at describing your schema often go like this:
...
Hello,
I have a performance issue in a bottleneck section in my code. Basically it's a simple nested loop.
Profiling the issue reveals that the program spends a lot of time just incrementing both of the loop counters (++) and testing for termination (i/j < 8).
Watching the assembly output I see that both counters don't get registers an...
I am using the following MySQL query in a PHP script on a database that contains over 370,000,000 (yes, three hundred and seventy million) rows. I know that it is extremely resource intensive and it takes ages to run this one query. Does anyone know how I can either optimise the query or get the information in another way that's quicker?...
My goal is to maximise performance. The basics of the scenario are:
I read some data from SQL Server 2005 into a DataTable (1000 records x 10 columns)
I do some processing of the data in .NET; all records have at least 1 field changed in the DataTable, but potentially all 10 fields could be changed
I also add some new records into the...
Is there any optimization library in C#?
I have to optimize a complicated equation in Excel; the equation has a few coefficients, and I have to optimize them according to a fitness function that I define. So I wonder whether there is a library that does what I need?
...
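The question asks about C#, but the shape of an answer is language-agnostic: hand the coefficients and the fitness function to a general-purpose minimizer. A rough sketch in Python with SciPy (assuming SciPy is available; the stand-in equation, target data, and starting point are made up):

import numpy as np
from scipy.optimize import minimize

xs = np.linspace(0, 1, 50)
target = 3 * xs**2 + 2 * xs + 1          # pretend these values came from the spreadsheet

def fitness(coeffs):
    # squared error between the candidate coefficients and the target curve
    a, b, c = coeffs
    predicted = a * xs**2 + b * xs + c
    return float(np.sum((predicted - target) ** 2))

result = minimize(fitness, x0=[0.0, 0.0, 0.0], method="Nelder-Mead")
print(result.x)                          # should land close to [3, 2, 1]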
I want to calculate the page load time; this means from second 0 (when a little jQuery snippet was loaded) to second x, when the whole page is loaded.
I wonder if anyone has experience with this; ideas on how to implement it correctly would also be appreciated.
Please, I don't need an extension; I already have Firebug. I need a JS solution.
th...
I currently have an 'item' table and a 'pair' table. The pair table simply contains two columns, each of which holds a primary key from the item table.
A common query is to find a number of items that are featured in the least number of pairs.
SELECT id,COUNT(*) AS count FROM item i LEFT JOIN pair p ON (i.id = p.id1 OR i.id = p.id2) GROU...
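For reference, the usual rewrite is to turn the OR join into a UNION ALL so each side of the pair is counted (and can be indexed) separately. Sketched with SQLite from Python purely so it runs standalone; table and column names come from the question, the data is made up:

import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE item (id INTEGER PRIMARY KEY);
CREATE TABLE pair (id1 INTEGER, id2 INTEGER);
INSERT INTO item VALUES (1),(2),(3);
INSERT INTO pair VALUES (1,2),(1,3);
""")

# Each pair contributes one row per side via UNION ALL, then a LEFT JOIN back
# to item so unpaired items still appear with a count of 0.
rows = conn.execute("""
SELECT i.id, COUNT(p.id1) AS cnt
FROM item i
LEFT JOIN (SELECT id1 FROM pair
           UNION ALL
           SELECT id2 FROM pair) p ON i.id = p.id1
GROUP BY i.id
ORDER BY cnt ASC;
""").fetchall()
print(rows)   # e.g. [(2, 1), (3, 1), (1, 2)] -- items in the fewest pairs first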
This is related to the queries I'm running from this question, namely:
SELECT CONCAT_WS(', ', city, state) AS location, AVG(latitude), AVG(longitude)
FROM places
WHERE state='NY'
AND city='New York'
GROUP BY
state, city
I've been looking at phpMyAdmin and they have one value red-flagged, Handler_read_rnd_next. ...
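Handler_read_rnd_next being red-flagged usually just means queries are doing table scans instead of using an index; a composite index on (state, city) is the typical fix for a query like the one above. A small SQLite stand-in from Python (query adapted since SQLite lacks CONCAT_WS; data made up) showing the plan change:

import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE places (city TEXT, state TEXT, latitude REAL, longitude REAL);
INSERT INTO places VALUES ('New York', 'NY', 40.7, -74.0),
                          ('Albany',   'NY', 42.7, -73.8);
""")
query = """
SELECT city || ', ' || state AS location, AVG(latitude), AVG(longitude)
FROM places
WHERE state = 'NY' AND city = 'New York'
GROUP BY state, city;
"""
print(conn.execute("EXPLAIN QUERY PLAN " + query).fetchall())   # typically a SCAN of places

conn.execute("CREATE INDEX idx_state_city ON places (state, city);")
print(conn.execute("EXPLAIN QUERY PLAN " + query).fetchall())   # typically SEARCH ... USING INDEX idx_state_city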
I've read this about structure padding in C:
http://bytes.com/topic/c/answers/543879-what-structure-padding
and wrote this code following the article; as I understand it, it should print the size of 'struct pad' as 16 bytes and the size of 'struct pad2' as 12.
I compiled this code with gcc, with different levels of optimization, even the ...
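Without seeing the original structs it's hard to say why the numbers differ, and worth noting that optimization level doesn't change struct layout (only packing pragmas and alignment attributes do). The effect itself is easy to poke at from Python via ctypes, which lays structures out with C rules; the members below are hypothetical, not the ones from the article:

import ctypes

class Pad(ctypes.Structure):               # char, int, char
    _fields_ = [("a", ctypes.c_char),
                ("b", ctypes.c_int),
                ("c", ctypes.c_char)]

class Pad2(ctypes.Structure):              # same members, reordered: int, char, char
    _fields_ = [("b", ctypes.c_int),
                ("a", ctypes.c_char),
                ("c", ctypes.c_char)]

print(ctypes.sizeof(Pad))    # typically 12: 1 + 3 padding + 4 + 1 + 3 trailing padding
print(ctypes.sizeof(Pad2))   # typically 8:  4 + 1 + 1 + 2 trailing padding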
I have a class that I wrote, and it seems bigger than it should be. It doesn't extend anything, and has very little going on - or so I thought - but each one is taking up just under 100 bytes (thanks back2dos). I guess that I don't have a very good understanding of what really affects how much memory an object takes up in AS3...
Sometimes I run a Postgres query and it takes 30 seconds. Then, I immediately run the same query and it takes 2 seconds. It appears that Postgres has some sort of caching. Can I somehow see what that cache is holding? Can I force all caches to be cleared for tuning purposes?
Note: I'm basically looking for a postgres version of the fol...
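A sketch of what inspecting that cache can look like, assuming psycopg2 and a reachable database (the connection string, table, and query below are placeholders). EXPLAIN (ANALYZE, BUFFERS) reports how much of a query was served from shared buffers ("shared hit") versus read in, and the pg_buffercache extension lists what currently sits in those buffers. There is no SQL command to flush shared buffers, so cold-cache benchmarking usually means restarting Postgres and dropping the OS page cache.

import psycopg2

conn = psycopg2.connect("dbname=mydb")
cur = conn.cursor()

# Per-query view: how much came from cache vs. disk for this execution.
cur.execute("EXPLAIN (ANALYZE, BUFFERS) SELECT * FROM places WHERE state = 'NY';")
for (line,) in cur.fetchall():
    print(line)

# Global view: which relations occupy shared buffers right now
# (requires: CREATE EXTENSION pg_buffercache;)
cur.execute("""
    SELECT c.relname, count(*) AS buffers
    FROM pg_buffercache b
    JOIN pg_class c ON b.relfilenode = pg_relation_filenode(c.oid)
    GROUP BY c.relname
    ORDER BY buffers DESC
    LIMIT 10;
""")
print(cur.fetchall())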
What is the more efficient approach for using hashmaps?
A) Use multiple smaller hashmaps, or
B) store all objects in one giant hashmap?
(Assume that the hashing algorithm for the keys is fairly efficient, resulting in few collisions)
CLARIFICATION: Option B implies segregation by primary key -- i.e. no additional lookup is necess...
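For whatever it's worth, this is easy to measure directly. A quick, unscientific sketch in Python (dicts standing in for hashmaps; the sizes and key shapes are made up) comparing one big map against several smaller ones keyed by a category:

import timeit

N = 100_000
big = {("cat%d" % (i % 10), i): i for i in range(N)}     # one map, composite keys

small = {}                                               # ten maps, one per category
for i in range(N):
    small.setdefault("cat%d" % (i % 10), {})[i] = i

def lookup_big():
    return sum(big[("cat%d" % (i % 10), i)] for i in range(0, N, 7))

def lookup_small():
    return sum(small["cat%d" % (i % 10)][i] for i in range(0, N, 7))

print("one big map:    ", timeit.timeit(lookup_big, number=20))
print("many small maps:", timeit.timeit(lookup_small, number=20))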
I'm trying to tune the performance of my application. And I'm curious what methods are taking the longest to process, and thus should be looked over for any optimization opportunities.
Are there any existing free tools that will help me visualize the call stack and how long each method is taking to complete? I'm thinking something that ...
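If the application in question happened to be Python, the standard-library answer is cProfile plus pstats (other stacks have equivalents, typically a sampling profiler with a call-tree view). A minimal sketch with made-up methods:

import cProfile
import pstats

def slow_method():
    return sum(i * i for i in range(200_000))

def fast_method():
    return sum(range(1_000))

def main():
    for _ in range(10):
        slow_method()
        fast_method()

profiler = cProfile.Profile()
profiler.enable()
main()
profiler.disable()

stats = pstats.Stats(profiler)
stats.sort_stats("cumulative").print_stats(10)   # top 10 methods by cumulative time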
I'm trying to optimize 'in the small' on a project of mine.
There's a series of array accesses that are individually tiny, but profiling has revealed that these array accesses are where the vast majority of my program is spending its time. So, time to make things faster, since the program takes about an hour to run.
I've moved the fol...
I have a 2 GB MySQL table with 500k rows and I run the following query on a system with no load.
select * from mytable
where name in ('n1', 'n2', 'n3', 'n4', ... bunch more... )
order by salary
It takes a filesort and between 50 and 70 seconds to complete.
When removing the order by salary and doing the sorting in the application,...
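A sketch of the "sort in the application" workaround the question ends with, written against SQLite from Python purely so it runs standalone (the table, columns, and name values are stand-ins for the real MySQL setup). Dropping ORDER BY lets the database return rows in whatever order the IN (...) lookup produces, and the client sorts the much smaller result set by salary:

import sqlite3
from operator import itemgetter

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE mytable (name TEXT, salary INTEGER);
INSERT INTO mytable VALUES ('n1', 50000), ('n2', 42000), ('n3', 61000), ('n9', 10);
""")

names = ("n1", "n2", "n3", "n4")
placeholders = ",".join("?" * len(names))
rows = conn.execute(
    f"SELECT name, salary FROM mytable WHERE name IN ({placeholders})", names
).fetchall()

rows.sort(key=itemgetter(1))          # sort by salary client-side, no filesort on the server
print(rows)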
In a Django view I'm doing something like this:
lists = Stuff.objects.exclude(model2=None)
for alist in lists:
    if alist.model2.model3.id == target_id:
        addSomeStuff
The slowness comes from going from model (database row) to model in the if statement.
This actually takes nearly a second to run whe...
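The usual fix for this pattern is to push the comparison into the database and fetch the related rows in one query, rather than hitting the database once per object inside the loop. A sketch using the model names from the snippet above (target_id is assumed to be defined elsewhere in the view):

lists = (Stuff.objects
         .exclude(model2=None)
         .filter(model2__model3__id=target_id)   # compare in SQL, not in Python
         .select_related("model2__model3"))      # join up front instead of per-row queries

for alist in lists:
    addSomeStuff                                 # whatever the original loop body does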
I have a few regular expressions which are run against very long strings. However, the only part of the string which concerns the RE is near the beginning. Most of the REs are similar to:
\\s+?(\\w+?).*
The REs capture a few groups near the start, and don't care what the rest of the string is. For performance reasons, is there a way t...
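Assuming Python's re module is the engine in play, two cheap ways to keep the work near the start of the string (the pattern is tweaked slightly for illustration):

import re

long_string = "  foo" + " bar" * 500_000          # made-up long input

# re.match is anchored at the start, and dropping the trailing ".*" (which only
# forces the engine to sweep the rest of the string) stops the match early.
m = re.match(r"\s+?(\w+)", long_string)
print(m.group(1) if m else None)                  # foo

# Alternatively, slice off a prefix first if the interesting part is always
# within the first N characters:
m = re.match(r"\s+?(\w+)", long_string[:100])
print(m.group(1) if m else None)                  # foo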
Hi all,
I have a normal select query that takes nearly 3 seconds to execute (Select * from users). There are only 310 records in the user table.
The configuration of my production server is:
SQL Server Express Edition
Server Configuration: Pentium 4 HT, 3 GHz, 2 GB RAM
Column Name    Type    NULL    COMMENTS
Column Name    Type    NULL...
Does anyone know of an optimal way of detecting a 37-bit sequence in a chunk of binary data? Sure, I can do a brute-force compare using windowing (just compare starting with index 0 + the next 36 bits, increment, and loop until I find it), but is there a better way? Maybe some hashing search that returns a probability that the s...
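A sliding-window compare doesn't have to re-read 37 bits at every offset: keep the last 37 bits in an integer accumulator and shift one new bit in at a time. A sketch in Python with a made-up pattern and data:

def find_bit_sequence(data: bytes, pattern: int, pattern_bits: int = 37) -> int:
    """Return the bit offset of the first match, or -1 if not found."""
    mask = (1 << pattern_bits) - 1
    window = 0
    bits_seen = 0
    for byte in data:
        for bit in range(7, -1, -1):              # MSB-first within each byte
            window = ((window << 1) | ((byte >> bit) & 1)) & mask
            bits_seen += 1
            if bits_seen >= pattern_bits and window == pattern:
                return bits_seen - pattern_bits   # bit offset where the match starts
    return -1

pattern = 0b1010101010101010101010101010101010101   # 37 bits, made up
data = bytes([0x00, 0x0A, 0xAA, 0xAA, 0xAA, 0xAA, 0xA0, 0x00])
print(find_bit_sequence(data, pattern))            # 12 with this made-up data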
I have automatically generated code (around 18,000 lines, basically a wrap of data) and about 2,000 more lines of code in a C++ project. The project has link-time optimization turned on, along with /O2 and fast-code optimization. To compile the code, VC++ 2008 Express takes an incredibly long time (around 1.5 hours). After all, it has only 18,000 ...