Our publishing workflow includes Windows and Linux machines (there are some Macs too, but not in the critical-path workflow). Many texts include both English and Khmer and are marked up in XML.
XML Copy Editor is the best cross-platform open-source XML editor I've discovered. It uses the Scintilla editing component, which is general...
I have large (hundreds of MB or more) files that I need to read blocks from using C++ on Windows. Currently the relevant functions are:
errorType LargeFile::read( void* data_out, __int64 start_position, __int64 size_bytes ) const
{
if( !m_open ) {
// return error
}
else {
seekPosition( start_position );
DWORD bytes_read;
BOOL...
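For context, the underlying seek-and-read is essentially the Win32 pattern sketched below; the standalone readBlock here is a simplified, hypothetical stand-in for the member function above, and SetFilePointerEx is what makes positions past 4 GB work:

#include <windows.h>

// Hypothetical sketch: position a file handle at a 64-bit offset and read a block.
// SetFilePointerEx takes the offset as a LARGE_INTEGER, so offsets past 4 GB are fine.
bool readBlock( HANDLE file, void* data_out, __int64 start_position, DWORD size_bytes )
{
    LARGE_INTEGER pos;
    pos.QuadPart = start_position;
    if( !SetFilePointerEx( file, pos, NULL, FILE_BEGIN ) )
        return false;

    DWORD bytes_read = 0;
    // Note: ReadFile's count is a DWORD, so one call reads at most 4 GB - 1 bytes;
    // larger blocks have to be read in a loop.
    if( !ReadFile( file, data_out, size_bytes, &bytes_read, NULL ) )
        return false;

    return bytes_read == size_bytes;
}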
Hi all,
first of all: I'm not a programmer, never was, although I have learned a lot during my professional career as a support consultant.
Now my task is to process, and create some statistics about, a constantly written and rapidly growing XML-like log file. It's not valid XML, because it does not have a proper <root> element, e.g. the l...
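One workable angle (a sketch, not a full parser): since the individual records are well-formed even though the file as a whole is not, scan it line by line and pick out records by their top-level tag. The tag name "logEntry" and the file name below are assumptions; substitute whatever the log actually uses:

#include <fstream>
#include <iostream>
#include <string>

int main()
{
    std::ifstream log( "app.log" );   // assumed file name
    std::string line;
    long entries = 0;
    while( std::getline( log, line ) )
    {
        // Count each top-level record by its opening tag.
        if( line.find( "<logEntry" ) != std::string::npos )
            ++entries;
    }
    std::cout << "records seen: " << entries << "\n";
}

For a one-shot job you could instead wrap the whole content in a synthetic <root>...</root> pair so a standard XML parser accepts it, but for a constantly growing file a streaming scan like this avoids rereading everything.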
Hi,
What would be the best approach to parsing a delimited file when the columns are not known in advance?
The file format is Rightmove v3 (.blm), the structure looks like this:
#HEADER#
Version : 3
EOF : '^'
EOR : '~'
#DEFINITION#
AGENT_REF^ADDRESS_1^POSTCODE1^MEDIA_IMAGE_00~ // can be any number of columns
#DATA#
agent1^the...
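In other words, the column list only becomes known once #DEFINITION# has been read. A minimal sketch of that approach in C++ (file name assumed, and the '^'/'~' delimiters hardcoded here, though a robust parser would take them from the EOF/EOR lines in #HEADER#):

#include <fstream>
#include <iostream>
#include <sstream>
#include <string>
#include <vector>

// Split a string on a single-character delimiter.
std::vector<std::string> split( const std::string& s, char delim )
{
    std::vector<std::string> out;
    std::string field;
    std::istringstream in( s );
    while( std::getline( in, field, delim ) )
        out.push_back( field );
    return out;
}

int main()
{
    std::ifstream blm( "feed.blm" );   // assumed file name
    std::string line;
    std::vector<std::string> columns;
    bool in_definition = false, in_data = false;

    while( std::getline( blm, line ) )
    {
        if( line == "#DEFINITION#" ) { in_definition = true; continue; }
        if( line == "#DATA#" )       { in_definition = false; in_data = true; continue; }

        if( in_definition && !line.empty() )
        {
            // The definition row ends with the EOR character '~';
            // strip it, then split on the EOF character '^'.
            columns = split( line.substr( 0, line.find( '~' ) ), '^' );
        }
        else if( in_data && !line.empty() )
        {
            std::vector<std::string> values =
                split( line.substr( 0, line.find( '~' ) ), '^' );
            // Pair each value with its column name by position.
            for( size_t i = 0; i < values.size() && i < columns.size(); ++i )
                std::cout << columns[i] << " = " << values[i] << "\n";
        }
    }
}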
How can I get the file size of a file in C when the file size is greater than 4gb?
ftell returns a 4-byte signed long, limiting it to 2 GB. stat has a field of type off_t, which is also 4 bytes (not sure about the sign), so at most it can tell me the size of a 4 GB file.
What if the file is larger than 4 GB?
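A sketch of the usual fix on POSIX systems: compile with _FILE_OFFSET_BITS defined to 64, which makes off_t (and therefore stat's st_size) 64-bit even on 32-bit platforms. On Windows, the _stati64/_ftelli64 family plays the same role.

// Build with -D_FILE_OFFSET_BITS=64, or define it before any #include,
// so that off_t and stat report 64-bit sizes on 32-bit POSIX systems.
#define _FILE_OFFSET_BITS 64

#include <stdio.h>
#include <sys/stat.h>

int main( int argc, char* argv[] )
{
    if( argc < 2 ) return 1;

    struct stat st;
    if( stat( argv[1], &st ) != 0 ) return 1;

    // With 64-bit off_t this prints correct sizes well past 4 GB.
    printf( "%lld bytes\n", (long long)st.st_size );
    return 0;
}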
...
My firm was delivered a 20+ GB .sql file in response to a request for data from the gov't. I don't have many options for getting the data in a different format, so I need options for how to import it in a reasonable amount of time. I'm running it on a high-end server (Win 2008 64-bit, MySQL 5.1) using Navicat's batch execution tool. It's b...
Hi,
I am trying to upload a large file to Google Docs.
I came across Resumable Upload.
But how can I implement it in Google API v2?
Thanks
...
Hi,
I am trying to send mail with a large attachment (up to 1 MB or 2 MB).
But sending the mail (to Google Apps) fails. The code is:
MailItemEntry[] entries = new MailItemEntry[1];
String EmlPath = "C:\\testemail.eml";
String msg = File.ReadAllText(EmlPath);
entries[0] = new MailItemEntry();
entries[0].Rf...
Is there a way to read a large text file (~60 MB) into memory at once (like a compiler flag to increase the program's memory limit)? Currently, ifstream's open function causes a segmentation fault while trying to read this file.
ifstream fis;
fis.open("my_large_file.txt"); // Segfaults here
The file just consists of rows of the form
number...
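For reference, 60 MB should fit comfortably in an ordinary process, so the crash is more likely in the code that consumes the stream than in open itself. A minimal sketch that slurps the whole file with explicit checks (file name taken from the snippet above):

#include <fstream>
#include <iostream>
#include <string>

int main()
{
    std::ifstream fis( "my_large_file.txt", std::ios::binary );
    if( !fis ) {
        std::cerr << "open failed\n";
        return 1;
    }

    // Size the buffer up front, then read the whole file in one call.
    fis.seekg( 0, std::ios::end );
    std::string contents( static_cast<size_t>( fis.tellg() ), '\0' );
    fis.seekg( 0, std::ios::beg );
    fis.read( &contents[0], contents.size() );

    std::cout << "read " << fis.gcount() << " bytes\n";
}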
Hi all, I'm trying to transfer a large file (1 GB+) using UDP (in packets) through an AIR application. I'm transferring a ByteArray by taking chunks of packets from a FileStream. But it gives
'Error #1000: The system is out of memory'
at the sender side after a certain number of packets are sent, and by this time the downloaded file size at the server side is 25...
Hi, I have large amounts of data (a few terabytes) and accumulating... They are contained in many tab-delimited flat text files (each about 30 MB). Most of the task involves reading the data and aggregating (summing/averaging plus additional transformations) over observations/rows based on a series of predicate statements, and then saving th...
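Since each 30 MB file streams easily, one approach is a single pass per file that filters rows and accumulates sums and counts per key. A minimal sketch, with the file name, column positions, and predicate all placeholder assumptions:

#include <fstream>
#include <iostream>
#include <map>
#include <sstream>
#include <string>
#include <vector>

int main()
{
    std::ifstream in( "data.tsv" );   // assumed file name
    std::string line;
    std::map<std::string, std::pair<double, long>> acc;   // key -> (sum, count)

    while( std::getline( in, line ) )
    {
        // Split the row on tabs.
        std::vector<std::string> fields;
        std::istringstream row( line );
        std::string f;
        while( std::getline( row, f, '\t' ) )
            fields.push_back( f );
        if( fields.size() < 3 ) continue;

        // Placeholder predicate and column layout: keep rows whose
        // column 1 equals "A", group by column 0, sum column 2.
        if( fields[1] != "A" ) continue;
        acc[fields[0]].first  += std::stod( fields[2] );
        acc[fields[0]].second += 1;
    }

    // Emit the per-key average.
    for( const auto& kv : acc )
        std::cout << kv.first << "\t"
                  << kv.second.first / kv.second.second << "\n";
}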
Is there a GUI program that can read large MySQL dumps (200 MB+), or really any large text file?
Most modern editors, it seems, can't handle large files because they like to load the whole file into memory.
I want to open it on Ubuntu (Linux), but I would also like to read it on Windows.
...
I'm trying to read a 17 MB Excel file (2003) with PHPExcel 1.7.3c, but it crashes while loading the file, after exceeding the 120-second limit I have.
Is there another library that can do it more efficiently? I have no need for styling; I only need it to support UTF-8.
Thanks for your help
...
Hi,
I have a database with tables holding billions of rows in a single table for a month, and I have data for the past 5 years. I tried to optimize the data in all possible ways, but the latency is not decreasing. I know there are some solutions like horizontal sharding and vertical sharding, but I am not sure about any op...
Hello,
We need to write software that would continuously (i.e. new data is sent as it becomes available) send very large files (several TB) to several destinations simultaneously. Some destinations have a dedicated fiber connection to the source, while some do not.
Several questions arise:
We plan to use TCP sockets for this task. Wh...
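For concreteness, the core of each transfer is just a read-and-send loop per connection; a minimal single-destination sketch over POSIX TCP sockets (address, port, file name, and chunk size are all placeholder assumptions):

#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>
#include <fstream>

int main()
{
    // Connect to one destination (assumed address/port).
    int sock = socket( AF_INET, SOCK_STREAM, 0 );
    sockaddr_in dest{};
    dest.sin_family = AF_INET;
    dest.sin_port   = htons( 9000 );
    inet_pton( AF_INET, "192.0.2.10", &dest.sin_addr );
    if( connect( sock, (sockaddr*)&dest, sizeof dest ) != 0 ) return 1;

    // Stream the file in fixed-size chunks; TCP handles ordering and
    // retransmission, so the sender never holds the whole file in memory.
    std::ifstream file( "big_file.bin", std::ios::binary );
    char buf[64 * 1024];
    while( file.read( buf, sizeof buf ) || file.gcount() > 0 )
    {
        ssize_t n = file.gcount();
        for( ssize_t sent = 0; sent < n; )
        {
            ssize_t w = send( sock, buf + sent, n - sent, 0 );
            if( w <= 0 ) return 1;   // connection lost
            sent += w;
        }
    }
    close( sock );
}

Several simultaneous destinations would mean one such loop per connection, via threads or non-blocking I/O.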
We've got a file-based program we want to convert to use a document database, specifically MongoDB. Problem is, MongoDB is limited to 2GB on 32-bit machines (according to http://www.mongodb.org/display/DOCS/FAQ#FAQ-Whatarethe32bitlimitations%3F), and a lot of our users will have over 2GB of data. Is there a way to have MongoDB use more t...
I have a criteria page in my ASP.NET application. When the user clicks the report button, the results are first bound to a DataGrid in a new page, and then that page is exported to an Excel file by changing the content type.
That normally works, but when a large amount of data comes back, a System.OutOfMemoryException is thrown.
Does anyone know a way t...
Hi,
I wrote a PHP script to dynamically pack files selected by the client into a zip file and force a download. It works well except that when the number of files is huge (like over 50,000), it takes a very long time for the download dialog box to appear on the client side.
I thought about improving this using a cache (these files are not c...
Note that I can't first store the file locally -- it's too big.
This (obnoxious) page (scroll all the way to the bottom) seems to give an answer but I'm having trouble disentangling the part that's specific to tape drives:
http://webcache.googleusercontent.com/search?q=cache:lhmh960w2KQJ:www.experts-exchange.com/OS/Unix/SCO_Unix/Q_2424...
I have a number of rather large binary files (fixed-length records, the layout of which is described in another, textual, file). Data files can get as big as 6 GB. Layout files (COBOL copybooks) are small, usually less than 5 KB.
All data files are concentrated on a GNU/Linux server (although they were generated on a mainframe)....
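For files that size, the usual pattern is to stream fixed-length records through a modest buffer rather than load the whole file. A sketch, with the file name assumed and the record length a placeholder for whatever the copybook describes:

#include <fstream>
#include <iostream>
#include <vector>

int main()
{
    const std::size_t record_len = 250;   // assumed: taken from the parsed copybook
    std::ifstream data( "datafile.bin", std::ios::binary );   // assumed file name

    std::vector<char> record( record_len );
    long long count = 0;
    while( data.read( record.data(), record_len ) )
    {
        // Each iteration holds exactly one fixed-length record;
        // field offsets and lengths would come from the copybook layout.
        ++count;
    }
    std::cout << count << " records\n";
}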