I need to store large amounts of data on disk in blocks of approximately 1KB. I will be accessing these objects in a way that is hard to predict, but where patterns probably exist.
Is there an algorithm or heuristic I can use that will rearrange the objects on disk based on my access patterns to try to maximize sequential access, and thus ...
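One heuristic in this vein: record block-to-block transition counts and periodically re-lay out the blocks so that frequently consecutive pairs become physically adjacent. A minimal C# sketch, with every name invented for illustration:

```csharp
using System;
using System.Collections.Generic;

class AccessPatternLayout
{
    // transitions[a][b] = how often block b was read immediately after block a
    readonly Dictionary<int, Dictionary<int, int>> transitions =
        new Dictionary<int, Dictionary<int, int>>();
    int? lastBlock;

    public void RecordAccess(int block)
    {
        if (lastBlock.HasValue)
        {
            Dictionary<int, int> next;
            if (!transitions.TryGetValue(lastBlock.Value, out next))
                transitions[lastBlock.Value] = next = new Dictionary<int, int>();
            int count;
            next.TryGetValue(block, out count);
            next[block] = count + 1;
        }
        lastBlock = block;
    }

    // Suggest a new physical order: from each unplaced block, greedily
    // follow its most frequent successor so hot sequences become adjacent.
    public List<int> SuggestOrder(IEnumerable<int> allBlocks)
    {
        var placed = new HashSet<int>();
        var order = new List<int>();
        foreach (var start in allBlocks)
        {
            int current = start;
            while (placed.Add(current))
            {
                order.Add(current);
                Dictionary<int, int> next;
                if (!transitions.TryGetValue(current, out next)) break;

                int? best = null;
                int bestCount = 0;
                foreach (var kv in next)
                    if (!placed.Contains(kv.Key) && kv.Value > bestCount)
                    {
                        best = kv.Key;
                        bestCount = kv.Value;
                    }
                if (!best.HasValue) break;
                current = best.Value;
            }
        }
        return order;
    }
}
```

This is only a greedy approximation; finding a truly optimal linear layout is a hard ordering problem, so heuristics like this are the practical route.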
Hi,
I have a 1GB file containing pairs of string and long.
What's the best way of reading it into a Dictionary, and how much memory would you say it requires?
File has 62 million rows.
I've managed to read it using 5.5GB of RAM.
Say 22 bytes of overhead per Dictionary entry; that's roughly 1.4GB.
A long is 8 bytes, so that's 500MB.
Average string len...
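For a rough cross-check of those numbers: 62M entries × 22 bytes ≈ 1.4GB of Dictionary overhead and 62M × 8 bytes ≈ 0.5GB of longs, but each key is also a separate string object (very roughly 20 bytes of header plus 2 bytes per character), which is where much of the remaining memory goes. A minimal loading sketch, assuming one tab-separated pair per line and an invented file name:

```csharp
using System;
using System.Collections.Generic;
using System.IO;

class Loader
{
    static void Main()
    {
        // Pre-size to the known row count so the Dictionary never rehashes;
        // growth both costs CPU and briefly holds old and new bucket arrays.
        var map = new Dictionary<string, long>(62_000_000);

        using (var reader = new StreamReader("pairs.txt"))
        {
            string line;
            while ((line = reader.ReadLine()) != null)
            {
                int tab = line.IndexOf('\t');
                map[line.Substring(0, tab)] = long.Parse(line.Substring(tab + 1));
            }
        }
        Console.WriteLine(map.Count);
    }
}
```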
I have a table like so:
keyA keyB data
keyA and keyB together are unique, are the primary key of my table and make up a clustered index.
There are 5 possible values of keyB but an unlimited number of possible values of keyA. keyB generally increments.
For example, the following data can be ordered in 2 ways depending on which key c...
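To make the two orderings concrete, here is an illustrative sketch (table name, column types, and constraint names are all invented) of the two ways this clustered primary key can be declared:

```csharp
// Illustrative DDL only; each declaration dictates a different physical row order.
static class ClusteredKeyVariants
{
    // Rows stored sorted by keyA first, then by keyB within each keyA.
    public const string ByAThenB =
        "CREATE TABLE T (keyA INT NOT NULL, keyB INT NOT NULL, data VARCHAR(100), " +
        "CONSTRAINT PK_T PRIMARY KEY CLUSTERED (keyA, keyB))";

    // Rows fall into at most 5 keyB groups, sorted by keyA inside each group;
    // better for scans that lead with keyB, worse for seeks that lead with keyA.
    public const string ByBThenA =
        "CREATE TABLE T (keyA INT NOT NULL, keyB INT NOT NULL, data VARCHAR(100), " +
        "CONSTRAINT PK_T PRIMARY KEY CLUSTERED (keyB, keyA))";
}
```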
Good day,
We just moved from ASP.NET 1.1 to ASP.NET 2.0. We are using AJAX UpdatePanels.
In an Apress book (Pro ASP.NET 2008), I've read that when you use an UpdatePanel, you don't reduce the amount of bandwidth sent, because the entire page is still sent.
That in mind, I've also read on many websites that it is better to use mult...
I'm testing a web application for browser memory leaks using Quick Test Professional (QTP) 9.5 and Internet Explorer 6. PerfMon works for monitoring the memory usage over time, but its data has to be synchronized to the testing results to find out which steps trigger the browser memory leak. Since QTP's scripting language is VBScript,...
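The question is about VBScript, but the underlying pattern - sample a counter on a timer and write timestamped values that can be lined up with the QTP log afterwards - looks like this in C# with System.Diagnostics.PerformanceCounter (the category, counter, and instance names are the standard PerfMon ones; the output file and sampling window are invented):

```csharp
using System;
using System.Diagnostics;
using System.IO;
using System.Threading;

class MemorySampler
{
    static void Main()
    {
        // "Private Bytes" of the iexplore process: the usual leak indicator.
        var counter = new PerformanceCounter("Process", "Private Bytes", "iexplore");
        using (var log = new StreamWriter("iexplore-memory.csv"))
        {
            log.WriteLine("timestamp,privateBytes");
            for (int i = 0; i < 600; i++)          // ~10 minutes at 1 sample/sec
            {
                log.WriteLine($"{DateTime.Now:O},{counter.NextValue()}");
                log.Flush();
                Thread.Sleep(1000);
            }
        }
    }
}
```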
Does combining an Enterprise Messaging solution with Web Services result in a real performance gain over simple HTTP requests over sockets?
(if implementation details help: I'm interested in JMS with a SOAP web service)
...
In Windows there is PerfMon to monitor various performance aspects (called counters) of the system.
Is there a PerfMon-like tool for Linux?
In particular, I'm interested in...
CPU usage (total/per process/in kernel)
Memory usage (total/per process/in kernel)
...Is it possible to store this information in files for future analysis?
...
What is the best tool (open-source or commercial) currently available that lets me send customized requests to a web server and get back a response, to check the performance?
I will be sending it a load of more than 20K requests per second, but I need to get numbers for each call made. Also, the numbers might be in microseconds or nanoseconds. Ho...
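Whatever tool is chosen, per-call numbers at microsecond resolution generally come from a high-resolution timer rather than DateTime. A minimal C# sketch using Stopwatch (the endpoint URL is invented):

```csharp
using System;
using System.Diagnostics;
using System.Net;

class CallTimer
{
    static void Main()
    {
        // Stopwatch.Frequency is in ticks/second; on most hardware this gives
        // sub-microsecond granularity (check Stopwatch.IsHighResolution).
        long start = Stopwatch.GetTimestamp();

        using (var client = new WebClient())
            client.DownloadString("http://localhost/ping");   // assumed endpoint

        long elapsedTicks = Stopwatch.GetTimestamp() - start;
        double micros = elapsedTicks * 1_000_000.0 / Stopwatch.Frequency;
        Console.WriteLine($"{micros:F1} us");
    }
}
```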
What is a baseline and what is a benchmark? What is the best definition for each, and how do you baseline a set of numbers and benchmark another set?
...
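In common usage the baseline is the reference set of measurements you record first, and a later benchmark run is judged against it. One simple, hedged way to do that with two sets of numbers is to summarize each as mean and standard deviation and compare (the numbers below are made up):

```csharp
using System;
using System.Linq;

class BaselineCompare
{
    static (double Mean, double StdDev) Summarize(double[] xs)
    {
        double mean = xs.Average();
        double variance = xs.Sum(x => (x - mean) * (x - mean)) / xs.Length;
        return (mean, Math.Sqrt(variance));
    }

    static void Main()
    {
        double[] baseline  = { 102, 98, 101, 99, 100 };   // reference run (ms)
        double[] benchmark = { 121, 119, 122, 118, 120 }; // candidate run (ms)

        var b = Summarize(baseline);
        var c = Summarize(benchmark);
        Console.WriteLine($"baseline:  {b.Mean:F1} ms (sd {b.StdDev:F1})");
        Console.WriteLine($"benchmark: {c.Mean:F1} ms (sd {c.StdDev:F1})");
        Console.WriteLine($"slowdown:  {c.Mean / b.Mean:F2}x");
    }
}
```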
My web server has a lot of dependencies for sending back data when it gets a request. I am testing one of these dependency applications within the web server. The application is decoupled from the main web server, and queries reach it only through the APIs it exposes.
My question is: if I wish to check these APIs in a multithr...
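A minimal multithreaded harness for this kind of check: N threads each fire M calls at the exposed API and record per-call latencies (the API stub, thread count, and call count below are all invented):

```csharp
using System;
using System.Collections.Concurrent;
using System.Diagnostics;
using System.Linq;
using System.Threading;

class ParallelApiCheck
{
    // Stand-in for the real exposed API call.
    static void CallApi() => Thread.Sleep(5);

    static void Main()
    {
        const int threads = 8, callsPerThread = 100;
        var latenciesMs = new ConcurrentBag<double>();

        var workers = Enumerable.Range(0, threads).Select(_ => new Thread(() =>
        {
            for (int i = 0; i < callsPerThread; i++)
            {
                var sw = Stopwatch.StartNew();
                CallApi();
                latenciesMs.Add(sw.Elapsed.TotalMilliseconds);
            }
        })).ToArray();

        foreach (var t in workers) t.Start();
        foreach (var t in workers) t.Join();

        Console.WriteLine(
            $"calls: {latenciesMs.Count}, avg: {latenciesMs.Average():F2} ms, " +
            $"max: {latenciesMs.Max():F2} ms");
    }
}
```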
Which design do you think runs faster on PostgreSQL?
A. Making a 15-column table of varchars and the like, but putting all TEXT columns in a separate table with a foreign-key link back to this table. And let's imagine you want to search for the record with an ID of "4" but then pull all the rows back, including the stuff from the TEXT columns in...
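For reference, design A as described would look roughly like the following (all object names invented), along with the join that pulling a full record back would cost:

```csharp
// Illustrative PostgreSQL DDL and query, held as strings.
static class SplitTextDesign
{
    public const string Schema = @"
        CREATE TABLE main (
            id   SERIAL PRIMARY KEY,
            col1 VARCHAR(255)
            -- ... ~14 more short columns ...
        );
        CREATE TABLE main_text (
            main_id INTEGER REFERENCES main(id),
            body    TEXT
        );";

    // Fetching record 4 with its TEXT payload requires the join:
    public const string FetchById = @"
        SELECT m.*, t.body
        FROM main m
        LEFT JOIN main_text t ON t.main_id = m.id
        WHERE m.id = 4;";
}
```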
I am working on an Adobe Flex app which needs to parse a relatively large XML file. At the moment it is only 35MB, but in an ideal world it would get much larger in the future.
Edit: I have no control over the XML file.
I am essentially dropping its contents right into an SQLite database, so I could use the SimpleXML class to turn it into an obj...
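The platform here is Flex, but the memory-saving technique itself is to stream the XML instead of materializing it as one object tree. As an illustration of that technique only, here is a C# sketch using XmlReader and Microsoft.Data.Sqlite (element names, schema, and file names are invented):

```csharp
using System.Xml;
using Microsoft.Data.Sqlite;

class XmlToSqlite
{
    static void Main()
    {
        using (var conn = new SqliteConnection("Data Source=data.db"))
        {
            conn.Open();
            using (var ddl = conn.CreateCommand())
            {
                ddl.CommandText = "CREATE TABLE IF NOT EXISTS items (name TEXT)";
                ddl.ExecuteNonQuery();
            }

            // XmlReader keeps only the current node in memory, so the file
            // is never fully loaded no matter how large it grows.
            using (var insert = conn.CreateCommand())
            using (var reader = XmlReader.Create("big.xml"))
            {
                insert.CommandText = "INSERT INTO items (name) VALUES ($name)";
                var p = insert.CreateParameter();
                p.ParameterName = "$name";
                insert.Parameters.Add(p);

                while (reader.Read())
                {
                    if (reader.NodeType == XmlNodeType.Element && reader.Name == "item")
                    {
                        p.Value = reader.GetAttribute("name") ?? "";
                        insert.ExecuteNonQuery();
                    }
                }
            }
        }
    }
}
```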
I'm looking for a benchmark (and results on other PCs) which would give me an idea of the development performance gain I could get by upgrading my PC, also the benchmark could be used to justify the upgrade to my boss.
I use Visual Studio 2008 for my development, so I'd like to get an idea of by what factor the build times would be impr...
I'm looking for a way to easily load test and benchmark some of our SQL (using ADO.NET, nothing fancy like LINQ or PLINQ) that has to be performant when running under high parallel load.
I've thought of using the new parallel extensions CTP and specifically Parallel.For / Parallel.ForEach to simply run the SQL over 10k iterations or so...
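A minimal sketch of that idea with Parallel.For and plain ADO.NET (the connection string and query are placeholders):

```csharp
using System;
using System.Data.SqlClient;
using System.Diagnostics;
using System.Threading.Tasks;

class SqlLoadTest
{
    // Placeholder connection string.
    const string ConnString = "Server=.;Database=TestDb;Integrated Security=true";

    static void Main()
    {
        var sw = Stopwatch.StartNew();

        // Each iteration opens its own connection; ADO.NET pooling makes
        // this cheap and keeps the test close to real web-request behavior.
        Parallel.For(0, 10_000, i =>
        {
            using (var conn = new SqlConnection(ConnString))
            using (var cmd = new SqlCommand("SELECT COUNT(*) FROM SomeTable", conn))
            {
                conn.Open();
                cmd.ExecuteScalar();
            }
        });

        Console.WriteLine($"10k queries in {sw.Elapsed.TotalSeconds:F1}s");
    }
}
```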
In a comment I read
Just as a side note, it's sometimes faster to drop the indices of your table and recreate them after the bulk insert operation.
Is this true? Under which circumstances?
...
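The usual reasoning: every inserted row must also be written into each nonclustered index, so for a large enough batch it can be cheaper to pay for one sequential index rebuild at the end. A hedged SQL Server sketch (all object and file names invented):

```csharp
using System.Data.SqlClient;

class BulkInsertWithIndexRebuild
{
    // Placeholder connection string.
    const string ConnString = "Server=.;Database=TestDb;Integrated Security=true";

    static void Main()
    {
        using (var conn = new SqlConnection(ConnString))
        {
            conn.Open();
            Exec(conn, "DROP INDEX IX_Orders_Customer ON Orders");

            // The bulk load itself (could also be SqlBulkCopy from client code).
            Exec(conn, "BULK INSERT Orders FROM 'C:\\data\\orders.csv' " +
                       "WITH (FIELDTERMINATOR = ',')");

            // One sequential rebuild instead of millions of incremental updates.
            Exec(conn, "CREATE INDEX IX_Orders_Customer ON Orders (CustomerId)");
        }
    }

    static void Exec(SqlConnection conn, string sql)
    {
        using (var cmd = new SqlCommand(sql, conn)) cmd.ExecuteNonQuery();
    }
}
```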
I'm using MySQL to set up a database of stock options. There are about 330,000 rows (each row is 1 option). I'm new to SQL so I'm trying to decide on the field types for things like option symbol (varies from 4 to 5 characters), stock symbol (varies from 1 to 5 characters), company name (varies from 5 to 60 characters).
I want to optimi...
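A hedged starting point for columns like these is VARCHAR sized to the observed maximum, since MySQL stores only the actual string length plus one length byte (for values up to 255 characters). Illustrative DDL with invented names:

```csharp
// Illustrative MySQL DDL held as a string; table and column names invented.
static class OptionsSchema
{
    public const string CreateTable = @"
        CREATE TABLE options (
            option_symbol VARCHAR(5)  NOT NULL,  -- 4-5 chars observed
            stock_symbol  VARCHAR(5)  NOT NULL,  -- 1-5 chars observed
            company_name  VARCHAR(60) NOT NULL   -- 5-60 chars observed
        );";
}
```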
Hi, I have a single large table which I would like to optimize.
I'm using MS SQL Server 2005. I'll try to describe how it is used, and if anyone has any suggestions I would appreciate it very much.
The table is about 400GB, has 100 million rows and 1 million rows are inserted each day.
The table has 8 columns, 1 data col and 7 columns us...
As an example, these are some of the things I always do when starting a new machine:
Install 'Visor' - gives you an always available HUD style terminal window via F1.
Install 'Clix' - run a million system customization command line instructions.
Install 'Default App' - self-explanatory.
Set 'Terminal.app' to open and be hidden automa...
I have a community site which has around 10,000 listings at the moment. I am adopting a new URL strategy, something like
example.com/products/category/some-product-name
As part of the strategy, I am implementing a site map. Google already has a good index of my site, but the URLs will change. I use a PHP framework which accesses the DB for...
I have a star-schema-type database, with fact tables that have many foreign keys to dimension tables. The number of records in each dimension table is small - often fewer than 256, and always fewer than 64K. The fact tables typically have hundreds of thousands of records, so I want to maximize join speed.
I'd like to use tinyints and...
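An illustration of the idea (all names invented): size each dimension key to the dimension's row count, and every fact-table foreign key shrinks with it, packing more fact rows per page:

```csharp
// Illustrative SQL Server DDL held as strings; all names invented.
static class StarSchemaKeys
{
    public const string Dimension = @"
        CREATE TABLE dim_status (
            status_id TINYINT PRIMARY KEY,   -- < 256 rows fits in 1 byte
            label     VARCHAR(50)
        );";

    public const string Fact = @"
        CREATE TABLE fact_sales (
            status_id TINYINT NOT NULL REFERENCES dim_status(status_id),
            -- narrower keys mean more rows per page, so faster joins and scans
            amount    DECIMAL(12,2)
        );";
}
```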