huge

Database size is huge

I have the following problem. We have a database that stores binaries in the database. We knew the database could get big, so we removed all the binaries from it and ran the "shrink" task, hoping the database would end up much smaller. These are the results: before removal the size was 20 gigabytes; after...

PHP/MySQL - SELECT from various tables, with a list of indexes

Hi folks... I have a huge question. Here is the scenario: I'm developing a timeclock system, and I have these tables: -punch (id_punch,date) -in1 (id_in1,time,id_punch,...) -in2 (id_in2,time,id_punch,...) . -in6 (id_in6,time,id_punch,...) -out1 (id_out1,time,id_punch,...) -out2 (id_out2,time,id_punch...
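The in1..in6 / out1..out6 layout described above is usually the root of this kind of difficulty: with a single normalized entries table carrying a direction and a slot number, one JOIN replaces queries over twelve tables. A sketch of that normalized schema, using SQLite as a stand-in for MySQL (the schema is a suggestion, not the asker's actual design; names are illustrative):

```python
import sqlite3

conn = sqlite3.connect(':memory:')
# One 'entry' table instead of in1..in6 / out1..out6: direction says
# whether it is a punch-in or punch-out, slot numbers the pair (1..6).
conn.executescript('''
    CREATE TABLE punch (id_punch INTEGER PRIMARY KEY, date TEXT);
    CREATE TABLE entry (
        id_punch  INTEGER REFERENCES punch(id_punch),
        direction TEXT CHECK (direction IN ('in', 'out')),
        slot      INTEGER CHECK (slot BETWEEN 1 AND 6),
        time      TEXT
    );
''')
conn.execute("INSERT INTO punch VALUES (1, '2010-08-18')")
conn.executemany('INSERT INTO entry VALUES (?,?,?,?)', [
    (1, 'in', 1, '08:00'), (1, 'out', 1, '12:00'),
    (1, 'in', 2, '13:00'), (1, 'out', 2, '17:00'),
])

# A single JOIN now returns the whole day in order.
rows = conn.execute('''
    SELECT p.date, e.direction, e.slot, e.time
    FROM punch p JOIN entry e USING (id_punch)
    ORDER BY e.time
''').fetchall()
print(rows[0])  # -> ('2010-08-18', 'in', 1, '08:00')
```

With this shape, "all punches on a date" or "hours worked per slot" become one query each instead of a UNION over many near-identical tables.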

Challenges in remotely running big RIA application

I have a big rich-internet-application file (qooxdoo,js,html). The users use their browser to point to the web server and run it. The problem is that it takes a long time for the users to load the application every time they visit the site. Is there a way to somehow "bundle" and save the application locally and have the user refer to i...

Breadth-first search on a huge graph with little RAM

I currently have a graph that has about 10 million nodes and 35 million edges. For now the complete graph is loaded into memory at program start. This takes a couple of minutes (it is Java after all) and needs about half a gigabyte of RAM. For now it runs on a machine with a dual core processor and 4 gigabytes of RAM. When the graph is ...
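A common way to shrink a graph of that size is a CSR (compressed sparse row) layout: two flat integer arrays instead of per-node objects, which cuts memory dramatically and also adapts well to disk-backed storage. The asker's code is Java; this toy Python sketch just illustrates the layout and a BFS over it:

```python
from array import array
from collections import deque

# Toy CSR graph: node i's neighbours are targets[offsets[i]:offsets[i+1]].
# Flat typed arrays avoid per-node object overhead entirely.
offsets = array('l', [0, 2, 3, 4, 4])   # 4 nodes
targets = array('l', [1, 2, 3, 3])      # edges: 0->1, 0->2, 1->3, 2->3

def bfs(start, n):
    dist = array('l', [-1]) * n         # -1 means "not yet visited"
    dist[start] = 0
    queue = deque([start])
    while queue:
        u = queue.popleft()
        for i in range(offsets[u], offsets[u + 1]):
            v = targets[i]
            if dist[v] == -1:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

print(list(bfs(0, 4)))  # -> [0, 1, 1, 2]
```

For 10M nodes / 35M edges this is roughly one 4-byte int per node plus one per edge, i.e. well under the half gigabyte the object graph costs now.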

Processing an XML file with huge data

Hi, I am working on an application which has the below requirements: 1. Download a ZIP file from a server. 2. Uncompress the ZIP file and get the content (which is in XML format) into a String. 3. Pass this content into another method for parsing and further processing. Now, my concern here is that the XML file may be of huge si...
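The usual answer to this class of problem is to stream the XML straight out of the ZIP entry instead of materialising it as one big String. A minimal Python sketch of the idea (the asker's platform may well be Java, where ZipInputStream plus a SAX/StAX parser plays the same role; the entry and tag names here are illustrative):

```python
import io
import zipfile
import xml.etree.ElementTree as ET

# Build a small in-memory ZIP as a stand-in for the downloaded archive.
buf = io.BytesIO()
with zipfile.ZipFile(buf, 'w') as zf:
    zf.writestr('data.xml', '<items><item id="1"/><item id="2"/></items>')
buf.seek(0)

ids = []
with zipfile.ZipFile(buf) as zf:
    # zf.open() yields a file-like stream, so iterparse processes the
    # document incrementally -- no giant String ever exists in memory.
    with zf.open('data.xml') as f:
        for event, elem in ET.iterparse(f):
            if elem.tag == 'item':
                ids.append(elem.get('id'))
                elem.clear()   # release the parsed subtree immediately

print(ids)  # -> ['1', '2']
```

The key design point is that step 2 and step 3 are fused: the parser consumes the decompressed stream directly, so memory stays proportional to one element, not the whole file.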

How to read a database record with a DataReader and add it to a DataTable

Hello, I have some data in an Oracle database table (around 4 million records) which I want to transform and store in an MSSQL database using ADO.NET. So far I have used (for much smaller tables) a DataAdapter to read the data out of the Oracle database and add the DataTable to a DataSet for further processing. When I tried this with my huge t...

Querying a huge list of words from a file against 3 million MySQL records

What is the best way to query each word present in a huge word list against 3 million MySQL records? The number of words in the word-list file is approximately 10,000. Iterating over each word works but consumes a huge amount of time. Is there a better way to optimize this? ...
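Rather than issuing one query per word, batching the words into IN (...) lists cuts the round trips from 10,000 to a handful; an index on the queried column matters at least as much. A sketch using SQLite as a stand-in for MySQL (table and column names are illustrative):

```python
import sqlite3

# Toy stand-in for the 3-million-row table.
conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE records (word TEXT)')
conn.execute('CREATE INDEX idx_word ON records(word)')  # crucial at scale
conn.executemany('INSERT INTO records VALUES (?)',
                 [('alpha',), ('beta',), ('gamma',)])

words = ['alpha', 'gamma', 'missing']   # stand-in for the 10,000-word file

def matched(words, batch_size=1000):
    found = set()
    for i in range(0, len(words), batch_size):
        batch = words[i:i + batch_size]
        placeholders = ','.join('?' * len(batch))   # '?,?,?,...'
        rows = conn.execute(
            f'SELECT DISTINCT word FROM records'
            f' WHERE word IN ({placeholders})', batch)
        found.update(w for (w,) in rows)
    return found

print(sorted(matched(words)))  # -> ['alpha', 'gamma']
```

Another common variant is loading the word list into a temporary table and doing a single JOIN, which lets the database plan the whole lookup at once.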

How to create a huge Informix database?

Can anyone provide me with a script for creating a huge database (for example, 2 GB of data) in IBM Informix Dynamic Server (IDS) version 11.50.FC4 on a Linux RHEL 64-bit machine? ...

Informix huge DB creation

How do I create a huge database in Informix IDS 11.50? ...

Python: reading a huge file using linecache vs. normal file access with open()

Hi, I am in a situation where multiple threads are reading the same huge file, with multiple file pointers to the same file. The file will have at least 1 million lines. Each line's length varies from 500 to 1500 characters. There won't be any "write" operations on the file. Each thread will start reading the same file from a different line. Whi...
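For a file this size, linecache is usually the wrong tool: it reads and caches the entire file in memory. A lighter pattern is to index the byte offset of each line start in one pass, then give every thread its own file handle and seek() to its starting line. A minimal sketch (the 100-line file stands in for the million-line input; names are illustrative):

```python
import os
import tempfile
import threading

# Create a toy file standing in for the huge input.
path = os.path.join(tempfile.mkdtemp(), 'big.txt')
with open(path, 'wb') as f:
    for i in range(100):
        f.write(f'line {i}\n'.encode())

# One sequential pass builds an index: offsets[n] = byte where line n starts.
offsets = [0]
pos = 0
with open(path, 'rb') as f:
    for line in f:
        pos += len(line)
        offsets.append(pos)

results = {}

def read_from(line_no, count):
    # Each thread opens its OWN handle, so seeks never interfere.
    with open(path, 'rb') as f:
        f.seek(offsets[line_no])
        results[line_no] = [f.readline().decode().rstrip()
                            for _ in range(count)]

threads = [threading.Thread(target=read_from, args=(n, 2)) for n in (0, 50)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(results[50])  # -> ['line 50', 'line 51']
```

The index costs one int per line (a few megabytes for a million lines), after which any thread reaches any line in a single seek instead of scanning from the top.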

What is meant by normalization in huge pointers?

Hi, I have a lot of confusion understanding the difference between a "far" pointer and a "huge" pointer; I searched all over Google for an answer and could not find one. Can anyone explain the difference between the two? Also, what is the exact normalization concept related to huge pointers? Please do not give me the following...
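For context on the question above: in 16-bit x86 real mode a linear address is segment*16 + offset, so many segment:offset pairs alias the same byte. Far pointers are compared bit-for-bit (so two aliases look unequal), while huge pointers are normalized, keeping the offset in 0..15 so every address has exactly one canonical form. A small Python sketch of that arithmetic:

```python
# Real-mode x86 addressing: linear = segment * 16 + offset.
def linear(seg, off):
    return (seg << 4) + off

# Normalization: push everything except the low nibble into the segment,
# so offset is always in 0..15 and each address has one unique form.
def normalize(seg, off):
    lin = linear(seg, off)
    return lin >> 4, lin & 0xF

# Two different-looking pointers to the same byte...
a, b = (0x1234, 0x0010), (0x1235, 0x0000)
print(linear(*a) == linear(*b))        # -> True  (same physical byte)
print(normalize(*a) == normalize(*b))  # -> True  (one canonical form)
```

This is why huge-pointer comparison and arithmetic work across 64 KB boundaries where far pointers do not; the compiler normalizes after every operation.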

Read a line containing (large) N reals to an array in Fortran

I've read() down past a header of an input file, and read the value of L on the way. Now I come to a line of L^2 consecutive reals, which I need to input to the elements of an allocatable array A(L,L). Trying character *100 :: buffer read (1,10) buffer 10 format(a(L*10)) results in Error: Syntax error in ...

Improving speed when loading a list from a web service

This is a continuation of this question: The problem is simple. I need to call methods from a REST web service which controls several tables. One table is a snapshot table which contains records with huge XML files. Each XML file is basically a backup from another database. This backup XML is then sent to customers who use the data as re...

Problem running a heavy application

I developed a "heavy" application (700 MB!). With an "APK installer" application, I can install it on the Nexus One SD card (the Froyo "installLocation" option). My application is heavy because of the videos it contains (located in the /raw directory). The problem I have is that it crashes when launched, with this error: 08-18 11:22:16.179:...

Drawing large graph with graphviz

Hello guys, I have generated a large .dot file of my Facebook friends' graph with fb-map. It has 287 nodes and almost 2000 edges. I'm using dot and neato to generate a .png image with the overlap="orthoyx" parameter, but it doesn't give a nice effect: too many overlapping edges. Do you know any set of options to manage such hug...

Concept for archiving huge database

I need a concept for how I can archive all data in the D-Pool which is older than one year. At the moment we have more than 3 million records in the D-Pool. Because of this huge volume of data, searches and filters over the database take far too long, since most searches run over the whole D-Pool, but in most cases I am only inter...
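The usual concept here is a parallel archive table plus a periodic INSERT ... SELECT / DELETE of rows past the cutoff, executed in one transaction so no row is lost or duplicated. A sketch using SQLite as a stand-in for the real database (table names only loosely mirror the "D-Pool" and are illustrative):

```python
import sqlite3

conn = sqlite3.connect(':memory:')
conn.executescript('''
    CREATE TABLE d_pool         (id INTEGER PRIMARY KEY, created TEXT, payload TEXT);
    CREATE TABLE d_pool_archive (id INTEGER PRIMARY KEY, created TEXT, payload TEXT);
''')
conn.executemany('INSERT INTO d_pool VALUES (?,?,?)', [
    (1, '2009-01-01', 'old'),
    (2, '2010-08-01', 'recent'),
])

cutoff = '2009-09-01'   # "one year ago", hard-coded for the demo
with conn:  # one transaction: copy then delete, atomically
    conn.execute('INSERT INTO d_pool_archive '
                 'SELECT * FROM d_pool WHERE created < ?', (cutoff,))
    conn.execute('DELETE FROM d_pool WHERE created < ?', (cutoff,))

live = [r[0] for r in conn.execute('SELECT id FROM d_pool')]
old  = [r[0] for r in conn.execute('SELECT id FROM d_pool_archive')]
print(live, old)  # -> [2] [1]
```

Day-to-day searches then hit only the shrunken live table; the archive stays queryable (or can be moved to cheaper storage) for the rare historical lookup.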

Java XML Parser for huge files

Hi, I need an XML parser to parse a file that is approximately 1.8 GB, so the parser should not load the whole file into memory. Any suggestions? ...

Download server with ASP.NET: how to accomplish long-running synchronous requests?

My ASP.NET application is a download application (no pages) which reads huge binary files (1-2 GB -> over 1 hour of download time, with resume support) from the local network and streams them to web clients (each request -> one large binary response, so there's no text/html response at all). I use an HTTP handler (.ashx) instead of an (.aspx) page...