I want to create a large HashMap but the put() performance is not good enough. Any ideas?
Other data structure suggestions are welcome but I need the lookup feature of a Java Map:
map.get(key)
In my case I want to create a map with 26 million entries. Using the standard Java HashMap the put rate becomes unbearably slow after 2-3 milli...
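The question doesn't show how the map is constructed; a common cause of put() slowing down at this scale is the backing table repeatedly growing and rehashing, which pre-sizing avoids. A minimal sketch, assuming the default load factor and made-up key/value types (the real keys and values are not shown above):

import java.util.HashMap;
import java.util.Map;

public class PresizedMapSketch {
    public static void main(String[] args) {
        int expectedEntries = 26_000_000;   // figure taken from the question
        float loadFactor = 0.75f;           // HashMap's documented default

        // Pre-size so the backing table never has to grow and rehash mid-fill.
        int initialCapacity = (int) Math.ceil(expectedEntries / loadFactor);
        Map<Long, Long> map = new HashMap<>(initialCapacity, loadFactor);

        // Hypothetical fill loop; the real data is not shown in the question.
        for (long i = 0; i < 1_000_000; i++) {
            map.put(i, i * 2);
        }
        System.out.println(map.size());
    }
}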
The goal of the assignment that I'm currently working on for my Data Structures class is to create a game of Quantum Tic Tac Toe with an AI that plays to win.
Currently, I'm having a bit of trouble finding the most efficient way to represent states.
Overview of current Structure:
AbstractGame
Has and manages AbstractPlayers (game.nextP...
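The structure overview above is cut off, so the following is a purely hypothetical sketch of one compact way a Quantum Tic Tac Toe position could be encoded (a bitmask per square for the spooky marks, plus a collapse table); all class and field names are illustrative and not taken from the question:

import java.util.Arrays;

// Hypothetical compact state encoding; names are illustrative only.
public final class QttStateSketch {
    // marks[square] is a bitmask: bit m is set if move number m left a
    // spooky mark in that square (moves alternate X, O, X, O, ...).
    private final int[] marks = new int[9];
    // collapsed[square] holds the move number that classically owns the
    // square after a collapse, or -1 while the square is still superposed.
    private final int[] collapsed = {-1, -1, -1, -1, -1, -1, -1, -1, -1};

    /** Record move number moveNo as a spooky pair in squares a and b. */
    public void placeSpookyPair(int moveNo, int a, int b) {
        marks[a] |= 1 << moveNo;
        marks[b] |= 1 << moveNo;
    }

    /** Collapse move number moveNo into a single square. */
    public void collapse(int moveNo, int square) {
        collapsed[square] = moveNo;
    }

    // equals/hashCode over the two arrays let an AI memoise visited states.
    @Override public boolean equals(Object o) {
        return o instanceof QttStateSketch
                && Arrays.equals(marks, ((QttStateSketch) o).marks)
                && Arrays.equals(collapsed, ((QttStateSketch) o).collapsed);
    }
    @Override public int hashCode() {
        return 31 * Arrays.hashCode(marks) + Arrays.hashCode(collapsed);
    }
}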
I believe there are algorithm implementations out there (e.g. a C++ implementation of a particular sorting algorithm) which might not be as efficient as they could be.
I would like to write a research paper discussing how such an implementation might be improved. This could be in any programming language, however C, C++, Python, Jav...
Is there a way in MySQL to COUNT(*) from a table where, if the number is greater than x, it will stop counting there? Basically, I only want to know if the number of records returned from a query is more or less than a particular number. If it's more than that number, I don't really care how many rows there are; if it's less, tell me the ...
I have 4 blocks of jQuery that look like this:
$('#aSplashBtn1').click(function(){
    $('#divSliderContent div').hide();
    $('#divSplash1').fadeIn('slow');
    return false;
});
$('#aSplashBtn2').click(function(){
    $('#divSliderContent div').hide();
    $('#divSplash2').fadeIn('slow');
    return false;
});
$('#aSplashBtn3').click(function(){...
I'm looking for a way to optimise the following:
SELECT
(SELECT SUM(amount) FROM Txn_Log WHERE gid=@gid AND txnType IN (3, 20)) AS pendingAmount,
(SELECT COUNT(1) FROM Txn_Log WHERE gid = @gid AND txnType = 11) AS pendingReturn,
(SELECT COUNT(1) FROM Txn_Log WHERE gid = @gid AND txnType = 5) AS pendingBlock
where @gid is ...
Hi,
I am writing an application for embedded Linux where 5% of processor time goes into reading a file and 95% into processing it. Can I get some performance improvement if I read the file in one thread and keep processing it in another thread?
I am reading from an MMC card which has DMA support. The file size is 20 MB and it is divided in chun...
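The question doesn't say which language the application is written in; purely as an illustration of the one-thread-reads, one-thread-processes split being asked about, here is a sketch in Java using a bounded queue (the chunk size and file name are made up):

import java.io.FileInputStream;
import java.io.IOException;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class ReaderProcessorSketch {
    // Chunk size and file name are made up for illustration.
    private static final int CHUNK_SIZE = 64 * 1024;
    private static final byte[] POISON = new byte[0]; // signals end of file

    public static void main(String[] args) throws Exception {
        BlockingQueue<byte[]> queue = new ArrayBlockingQueue<>(8);

        Thread reader = new Thread(() -> {
            try (FileInputStream in = new FileInputStream("data.bin")) {
                byte[] buf = new byte[CHUNK_SIZE];
                int n;
                while ((n = in.read(buf)) > 0) {
                    // Copy so the processor owns its chunk; blocks if it falls behind.
                    queue.put(java.util.Arrays.copyOf(buf, n));
                }
                queue.put(POISON);
            } catch (IOException | InterruptedException e) {
                throw new RuntimeException(e);
            }
        });

        Thread processor = new Thread(() -> {
            try {
                byte[] chunk;
                while ((chunk = queue.take()) != POISON) {
                    process(chunk);   // the 95% of the work runs here
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });

        reader.start();
        processor.start();
        reader.join();
        processor.join();
    }

    private static void process(byte[] chunk) {
        // Placeholder for the real processing step.
    }
}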
Hi All,
I am running the following query in a stored procedure and it is taking 30 milliseconds to execute. Can anyone help me optimize this query?
The table definition is:
Create Table Customer
(
CustID int not null auto_increment,
CustProdID int,
TimeStamp DateTime,
primary key(CustID)
);
Update Customer
Set TimeStamp =...
I have an insert that uses a condition checking for a NOT IN. There are about 230k rows in the NOT IN subquery.
INSERT INTO Validate.ItemError (ItemId, ErrorId, DateCreated)
(
SELECT ItemId, 10, GetUTCDate()
FROM Validate.Item
INNER JOIN Refresh.Company
ON Validate.Item.IMCompanyId = Refresh.Company.IMCompanyId
...
Concern about the maintainability of legacy SQL statements is constantly on my mind. This is especially true when SCRUM is used, where the code has no single owner; that is, everyone must be able to repair and maintain each piece. Optimizing SQL procedures usually means converting them into set-based commands and using special operators. I need tips to keep the code work...
Hi,
I am consuming a high-rate data stream and doing the following steps to store data in a MySQL database. For each newly arriving item:
(1) Parse incoming item.
(2) Execute several "INSERT ... ON DUPLICATE KEY UPDATE"
I have used INSERT ... ON DUPLICATE KEY UPDATE to eliminate one additional round-trip to the database.
While trying...
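The schema and client language aren't shown above, so purely as an illustrative sketch, this is roughly what step (2) could look like from Java over JDBC; the table, column, and connection details are made up:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class UpsertSketch {
    // Table and column names are hypothetical; the question does not show the schema.
    private static final String UPSERT =
            "INSERT INTO item_stats (item_id, hits) VALUES (?, 1) "
          + "ON DUPLICATE KEY UPDATE hits = hits + 1";

    public static void upsert(Connection conn, long itemId) throws SQLException {
        try (PreparedStatement ps = conn.prepareStatement(UPSERT)) {
            ps.setLong(1, itemId);
            ps.executeUpdate();   // one round-trip per item, as described above
        }
    }

    public static void main(String[] args) throws SQLException {
        // Connection URL and credentials are placeholders.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:mysql://localhost/stream", "user", "password")) {
            upsert(conn, 42L);
        }
    }
}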
I'm working on some MATLAB code which is processing large (but not huge) datasets: 10,000 784-element vectors (not sparse), and calculating information about them, which is stored in a 10,000x10 sparse matrix. In order to get the code working I did some of the trickier parts iteratively, doing loops over the 10k items to process them, an...
How many GCC optimization levels are there?
I tried gcc -O1, gcc -O2, gcc -O3, and gcc -O4
If I use a really large number, it won't work.
However, I have tried
gcc -O100
and it compiled.
How many optimization levels are there?
...
I know that -O1 automatically turns on certain flags. These flags can be turned on manually though. If I don't specify -O1, it should still be possible to get -O1 optimization by specifying all the flags that -O1 turns on.
I tried
-fthread-jumps -fcprop-registers -fguess-branch-probability
but it still does not do -O1 optimization....
I want to compile with optimization -O1, but there is a certain flag that it turns on that I do not want to use. How do I turn it off?
...
There is a lot of discussion about the lack of macros in some languages, and the inefficiencies that can arise. Perhaps the most common example is the guard before a log statement.
To what extent can current and future optimisation be relied upon to do the right thing and obviate the need for a macro, in this example and in general?
// ...
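The code example above is cut off; as an illustration of the kind of guard being discussed, here is a minimal Java sketch using java.util.logging (the method and message are made up):

import java.util.logging.Level;
import java.util.logging.Logger;

public class LogGuardSketch {
    private static final Logger LOG = Logger.getLogger(LogGuardSketch.class.getName());

    static void handle(Object request) {
        // Without the guard, the argument string is built even when FINE is disabled.
        // The guard (or a macro, in languages that have them) avoids that cost;
        // the question asks whether the optimiser can be trusted to do this for us.
        if (LOG.isLoggable(Level.FINE)) {
            LOG.fine("handling request " + request);
        }
    }

    public static void main(String[] args) {
        handle("example");
    }
}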
Hi,
I have a many-to-many table in MySQL - it can look something like:
A | B
1 | 1
1 | 2
1 | 3
3 | 4
3 | 5
4 | 1
4 | 2
4 | 5
etc.
You'll notice that the numbers are not always contiguous (A = 1, 3, 4, ...). I am looking for the fastest way to turn this into a matrix table, ignoring columns and rows with no data (e.g. A = 2). What I have...
I have a query that pulls 5 records from a table of ~10,000. The order clause isn't covered by an index, but the where clause is.
The query scans about 7,700 rows to pull these 5 results, and that seems like a bit much. I understand, though, that the complexity of the ordering criteria complicates matters. How, if at all, can I reduce ...
Hello everybody!
I am not an expert in MySQL, and I have searched a lot about the following problem without finding a solution.
So, I'm using a MySQL table with this structure:
CREATE TABLE photos (
file varchar(30) NOT NULL default "",
place tinytext NOT NULL default "",
description tinytext NOT NULL default "",
type char(1) ...
Hello,
First of all, I am an autodidact, so I don't have great know-how about optimization and such. I created a social networking website.
It contains 29 tables right now. I want to extend its functionality by adding things like yellow pages, events, etc. to make it more like a portal.
Now the question is: should I simply add the tables ...