large-files

How to Transfer Large File from MS Word Add-In (VBA) to Web Server?

Overview: I have a Microsoft Word Add-In, written in VBA (Visual Basic for Applications), that compresses a document and all of its related content (embedded media) into a zip archive. After creating the zip archive, it turns the file into a byte array and posts it to an ASMX web service. This mostly works. Issues: The main issue ...

Computing MD5SUM of large files in C#

I am using the following code to compute the MD5SUM of a file - byte[] b = System.IO.File.ReadAllBytes(file); string sum = BitConverter.ToString(new MD5CryptoServiceProvider().ComputeHash(b)); This works fine normally, but if I encounter a large file (~1 GB) - e.g. an ISO image or a DVD VOB file - I get an Out of Memory exception. Though, I ...
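The out-of-memory error comes from `ReadAllBytes` pulling the whole file into one array; hashing incrementally keeps memory flat (in C#, `HashAlgorithm.ComputeHash` also accepts a `Stream`). The same streaming pattern, sketched in Python for illustration:

```python
import hashlib

def md5_of_file(path, chunk_size=1 << 20):
    """Hash a file incrementally so only one ~1 MB chunk is in memory at a time."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()
```

Memory use is bounded by the chunk size regardless of file size, so a 1 GB ISO hashes in constant memory.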

Cloud HUGE data storage options?

Hi, does anyone have a good suggestion on how to do video recording? We have a camera that can record and then stream live video to a server. So this means we can have thousands of cameras sending data 24x7 for recording. We will store data for over 7 / 14 / 30 days depending on the package. Per day, if a camera is sending data to the se...

Rejecting large files in git

We have recently started using git and had a nasty problem when someone committed a large (~1.5 GB) file, which then caused git to crash on various 32-bit OSes. This seems to be a known bug (git mmaps files into memory, which doesn't work if it can't get enough contiguous space), which isn't going to get fixed any time soon. The easy (for ...
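One way to stop this at the server is a hook that rejects pushes containing oversized blobs. A sketch of an `update` hook in Python (assumes git is on the server's PATH; the 50 MB limit and hook layout are assumptions for illustration):

```python
#!/usr/bin/env python3
"""Sketch of a git `update` hook that rejects pushes containing huge blobs."""
import subprocess
import sys

LIMIT = 50 * 1024 * 1024  # arbitrary cutoff, in bytes

def blob_sizes(old, new):
    """Yield (size, sha) for every blob reachable from `new` but not `old`."""
    rev_range = new if set(old) == {"0"} else f"{old}..{new}"
    objects = subprocess.run(["git", "rev-list", "--objects", rev_range],
                             capture_output=True, text=True, check=True).stdout
    shas = "\n".join(line.split()[0] for line in objects.splitlines())
    out = subprocess.run(
        ["git", "cat-file",
         "--batch-check=%(objecttype) %(objectsize) %(objectname)"],
        input=shas, capture_output=True, text=True, check=True).stdout
    for line in out.splitlines():
        otype, size, sha = line.split()
        if otype == "blob":
            yield int(size), sha

def too_big(sizes, limit=LIMIT):
    """Pure decision logic: return the (size, sha) pairs over the limit."""
    return [(size, sha) for size, sha in sizes if size > limit]

if __name__ == "__main__":
    ref, old, new = sys.argv[1:4]
    offenders = too_big(blob_sizes(old, new))
    for size, sha in offenders:
        print(f"blob {sha} is {size} bytes (limit {LIMIT})", file=sys.stderr)
    sys.exit(1 if offenders else 0)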

Importing Wikipedia database dump - kills Navicat - anyone got any ideas?

OK guys, I've downloaded the Wikipedia XML dump and it's a whopping 12 GB of data :\ for one table, and I wanted to import it into a MySQL database on my localhost - however, it's a humongous 12 GB file and obviously Navicat is taking its sweet time importing it, or, more likely, it's hung :(. Is there a way to include this dump or at least pa...

Very large uploads with PHP

Hi guys, I want to allow uploads of very large files into our PHP application (hundreds of megs - 8 gigs). There are a couple of problems with this, however. Browser: HTML uploads have crappy feedback; we need to either poll for progress (which is a bit silly) or show no feedback at all. The Flash uploader puts the entire file into memory befor...
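A common workaround regardless of language is chunking: split the file client-side, POST each piece with its index, and reassemble server-side; progress is simply chunks-done over total. The split/reassemble logic, sketched in Python (the 5 MB chunk size is an arbitrary assumption to tune against your server limits):

```python
import os

CHUNK = 5 * 1024 * 1024  # 5 MB per chunk; tune to the server's post size limit

def iter_chunks(path, chunk_size=CHUNK):
    """Client side: yield (index, bytes) pairs; each would become one small POST."""
    with open(path, "rb") as f:
        index = 0
        while True:
            data = f.read(chunk_size)
            if not data:
                return
            yield index, data
            index += 1

def reassemble(chunks, out_path):
    """Server side: append received chunks in index order into the final file."""
    with open(out_path, "wb") as out:
        for _, data in sorted(chunks, key=lambda c: c[0]):
            out.write(data)
```

This also gives you resumability for free: a failed upload only re-sends the missing chunk indices.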

XML streaming with XProc.

Hi all, I'm playing with XProc, the XML pipeline language, and http://xmlcalabash.com/. I'd like to find an example for streaming large XML documents. For example, given the following huge XML document: <Books> <Book> <title>Book-1</title> </Book> <Book> <title>Book-2</title> </Book> <Book> <title>Book-3</title> </Book> <...

Parsing Huge XML Files in PHP

I'm trying to parse the dmoz content/structures XML files into MySQL, but all existing scripts to do this are very old and don't work well. How can I go about opening a large (1 GB+) XML file in PHP for parsing? ...
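Tree-building parsers (SimpleXML, DOM) load everything; the streaming alternative in PHP is a pull parser like XMLReader, which visits one node at a time. The same pattern with Python's `iterparse`, clearing each element once it has been handled so memory stays flat:

```python
import io
import xml.etree.ElementTree as ET

def collect_titles(xml_stream):
    """Stream through <Book> elements without keeping the whole tree in memory."""
    titles = []
    for event, elem in ET.iterparse(xml_stream, events=("end",)):
        if elem.tag == "Book":
            titles.append(elem.findtext("title"))
            elem.clear()  # free the subtree we just processed
    return titles
```

Usage: `collect_titles(io.BytesIO(b"<Books><Book><title>Book-1</title></Book></Books>"))` returns `["Book-1"]`; with a real file, pass an open file object instead.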

Partially load large text file with different encodings

I am writing a Java text component and am trying to partially load a large text file from the middle (for speed reasons). My question: if the text is in some multi-byte encoding, like UTF-8, Big5, GBK, etc., how can I align the bytes so that I can correctly decode the text? ...
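For UTF-8 specifically this is easy, because continuation bytes are self-marking (they all match the bit pattern 10xxxxxx): after seeking to an arbitrary offset, skip forward past continuation bytes to the next character boundary. (Big5/GBK lack this property, so they need scanning from a known boundary.) A minimal sketch in Python:

```python
def align_utf8(data, offset):
    """Advance `offset` to the next UTF-8 character boundary at or after it.

    Continuation bytes have the form 0b10xxxxxx, i.e. (byte & 0xC0) == 0x80,
    so we skip them; lead bytes and ASCII are already boundaries.
    """
    while offset < len(data) and (data[offset] & 0xC0) == 0x80:
        offset += 1
    return offset
```

In the Java component, the same mask test on the raw bytes lets you adjust the seek position before handing the buffer to a decoder.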

.Net Transforming large XML docs with XSL

Question: What is the best way to transform a large XML document (>200 MB) using XSL in .Net? Background: I have an application that feeds me large data files; I cannot change the format. In the past I have been able to transform smaller data files with no issues. Originally I was working with the XML as strings and was running out of ...

Seeking and reading large files in a Linux C++ application

I am running into integer overflow using the standard ftell and fseek options inside of G++, but I guess I was mistaken because it seems that ftell64 and fseek64 are not available. I have been searching and many websites seem to reference using lseek with the off64_t datatype, but I have not found any examples referencing something equal...
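The portable fix in C/C++ is `fseeko`/`ftello` compiled with `-D_FILE_OFFSET_BITS=64` (which makes `off_t` 64-bit), rather than the nonstandard `fseek64`; `lseek` with a 64-bit `off_t` works the same way. As a quick illustration of the target behavior, Python's `seek`/`tell` are already 64-bit, so an offset far past the 2 GB mark round-trips without overflow even on a tiny file:

```python
import os
import tempfile

def demo_large_offset():
    """Show that file offsets are 64-bit: seek to 5 GB on a tiny file."""
    fd, path = tempfile.mkstemp()
    try:
        with os.fdopen(fd, "wb") as f:
            f.write(b"small file")
        with open(path, "rb") as f:
            f.seek(5 << 30)      # 5 GB, far beyond a 32-bit off_t
            pos = f.tell()       # no overflow: position is a 64-bit value
            data = f.read(1)     # past EOF, so the read is simply empty
        return pos, data
    finally:
        os.unlink(path)
```

Seeking past EOF is permitted on POSIX; nothing is written, so the demo file stays tiny.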

Version control system for huge files?

I am looking for a quick but not-so-dirty way to do snapshots of a bunch of files totaling about 80 gigs. The issue here is that many of the files are around 1 GB large. What is the best free version control system for this type of thing? I know ZFS is an option, but I'd rather try something else first. ...

how do I download a large file (via HTTP) in .NET

I need to download a LARGE file (2GB) over HTTP in a C# console app. Problem is, after about 1.2GB, the app runs out of memory. Here's the code I'm using: WebClient request = new WebClient(); request.Credentials = new NetworkCredential(username, password); byte[] fileData = request.DownloadData(baseURL + fName); As you can see... I'm...
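`DownloadData` buffers the whole response into one array, which is exactly what runs out of memory; the fix is to read the response stream in chunks and write each chunk to disk as it arrives (in .NET, `HttpWebResponse.GetResponseStream` plus a copy loop). The copy pattern, sketched in Python (the 64 KB chunk size is an arbitrary choice):

```python
import shutil
import urllib.request

def stream_copy(src, dst, chunk_size=64 * 1024):
    """Copy a readable stream to a writable one, one chunk at a time."""
    shutil.copyfileobj(src, dst, length=chunk_size)

def download_to_file(url, out_path):
    """Stream an HTTP response straight to disk; memory stays at one chunk."""
    with urllib.request.urlopen(url) as response, open(out_path, "wb") as out:
        stream_copy(response, out)
```

Peak memory is one chunk, independent of the 2 GB download size.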

Suggestions for uploading very large (> 1GB) files

I know that such types of questions exist on SF, but they are very specific; I need a generic suggestion. I need a feature for uploading user files which could be larger than 1 GB. This feature will be an add-on to the existing file-upload feature present in the application, which caters to smaller files. Now, here are some of the opt...

programming files of size larger than 2 GB using C#.Net

How do I write large content to disk dynamically using C#? Any advice or reference is appreciated. I am trying to create a file (custom format and extension) and write to it. The user will upload a file, and its contents are converted to a byte stream and written to the file (filename.hd). The indexing of the uploaded files is done in a...

Control to view a file with a large amount of text

Is there a TextBox-like WinForms control that can show a large amount of text (hundreds of megabytes) in read-only mode? Of course it should work without loading the whole file into memory at once. I'm trying to implement this myself, using a standard TextBox, processing scroll and keyboard events and reading the amount of text necessar...
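The seek-on-demand approach you describe is the standard one for virtualized viewers: keep the file on disk and fetch only the byte window the visible area needs. A minimal windowed read, sketched in Python (byte-based; multi-byte encodings additionally need boundary alignment before decoding):

```python
def read_window(path, offset, length):
    """Return `length` bytes starting at `offset` without loading the file."""
    with open(path, "rb") as f:
        f.seek(offset)
        return f.read(length)
```

The scroll handler then maps scrollbar position to a byte offset and re-renders from the returned window, so a hundreds-of-megabytes file costs only one screen's worth of memory.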

Best way to process large XML in PHP

I have to parse large XML files in php, one of them is 6.5 MB and they could be even bigger. The SimpleXML extension as I've read, loads the entire file into an object, which may not be very efficient. In your experience, what would be the best way? ...

What is the fastest way to create a checksum for large files in C#

Hi, I have to sync large files across some machines. The files can be up to 6 GB in size. The sync will be done manually every few weeks. I can't take the filename into consideration because they can change anytime. My plan is to create checksums on the destination PC and on the source PC and then copy all files with a checksum which ar...

Read large file into sqlite table in objective-C on iPhone

I have a 2 MB file, not too large, that I'd like to put into an SQLite database so that I can search it. There are about 30K entries in CSV format, with six fields per line. My understanding is that SQLite on the iPhone can handle a database of this size. I have taken a few approaches, but they have all been slow (> 30 s). I've...
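The usual culprit for slow bulk inserts in SQLite is one implicit transaction per INSERT: each commit forces a sync to storage. Wrapping the whole load in a single transaction with a prepared statement typically turns tens of seconds into well under one. Sketched with Python's `sqlite3` bindings (the table and column names are made up; on the iPhone the equivalent is `BEGIN`/`COMMIT` around `sqlite3_prepare_v2`/`sqlite3_step`):

```python
import csv
import io
import sqlite3

def load_csv(conn, csv_text):
    """Bulk-load CSV rows inside one transaction via a prepared statement."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS entries (a TEXT, b TEXT, c TEXT,"
        " d TEXT, e TEXT, f TEXT)")
    rows = csv.reader(io.StringIO(csv_text))
    with conn:  # one transaction for the whole batch
        conn.executemany("INSERT INTO entries VALUES (?, ?, ?, ?, ?, ?)", rows)
```

`executemany` reuses one compiled statement for every row, and the single enclosing transaction means only one sync at the end.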

search & replace on 3000 row, 25 column spreadsheet

I'm attempting to clean up data in this (old) spreadsheet and need to remove things like single and double quotes, HTML tags and so on. Trouble is, it's a 3000 row file with 25 columns and every spreadsheet app I've tried (NeoOffice, MS Excel, Apple Numbers) chokes on it. Hard. Any ideas on how else I can clean this thing up for import ...
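At 3000 rows by 25 columns the data is small; it's the GUI apps that choke, so a short script sidesteps them entirely. A sketch that strips quotes and HTML tags from a CSV export (the regex is a simplistic assumption about your markup; adjust for your actual data):

```python
import csv
import io
import re

TAG = re.compile(r"<[^>]+>")  # naive tag matcher; fine for simple markup

def clean_cell(cell):
    """Drop HTML tags and stray single/double quotes from one cell."""
    return TAG.sub("", cell).replace('"', "").replace("'", "").strip()

def clean_csv(text):
    """Run every cell of a CSV document through clean_cell."""
    out = io.StringIO()
    writer = csv.writer(out)
    for row in csv.reader(io.StringIO(text)):
        writer.writerow([clean_cell(c) for c in row])
    return out.getvalue()
```

Export the spreadsheet to CSV first, run it through `clean_csv`, and import the result; the csv module streams row by row, so file size is a non-issue.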