large-files

How Can I Get Around this EOutOfMemory Exception When Encoding a Very Large File?

I am using Delphi 2009 with Unicode strings. I'm trying to encode a very large file to convert it to Unicode: var Buffer: TBytes; Value: string; Value := Encoding.GetString(Buffer); This works fine for a 40 MB Buffer, which gets doubled in size and comes back in Value as an 80 MB Unicode string. When I try this with a 300 MB Buffer...

using php to download files, not working on large files?

Hi all, I'm using PHP to download files, rather than having the file itself open in a new window. It seems to work OK for smaller files, but does not work for large files (and I need this to work on very large files). Here's the code I have to download the file: function downloadFile($file) { if (file_exists($file)) { /...
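
A rough sketch of the usual fix for this: send the file in small chunks with explicit headers and flush the output as you go, so PHP never holds the whole file in memory. The function body below is a guess at one possible implementation, not the asker's actual code, and the 8 KB chunk size is arbitrary.

<?php
// Sketch: stream the file in chunks instead of reading it all at once,
// so large downloads don't exhaust memory_limit or the output buffer.
function downloadFile($file)
{
    if (!file_exists($file)) {
        return false;
    }

    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="' . basename($file) . '"');
    header('Content-Length: ' . filesize($file));

    // Disable output buffering so each chunk is sent to the client immediately.
    while (ob_get_level() > 0) {
        ob_end_clean();
    }

    $fp = fopen($file, 'rb');
    while (!feof($fp)) {
        echo fread($fp, 8192); // 8 KB at a time
        flush();
    }
    fclose($fp);
    return true;
}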

Serving Large Protected Files in PHP/Apache

I need to serve up large files (> 2gb) from an Apache web server. The files are protected downloads, so I need some kind of way to authorize the user. The CMS I'm using uses cookies checked against a MySQL database to verify the user. On the server, I have no control over max_execution_time, and limited control over memory_limit. My tec...
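
One pattern often suggested for this situation, sketched below under the assumption that mod_xsendfile can be installed: let PHP do only the cookie/MySQL authorization, then hand the transfer back to Apache, so max_execution_time and memory_limit never come into play. The authorization function and file path are placeholders.

<?php
// Sketch: authorize in PHP, then let Apache stream the file via mod_xsendfile.
// userIsAuthorized() stands in for the CMS's cookie-vs-MySQL check, and the
// path below is hypothetical (it should live outside the web root).
if (!userIsAuthorized()) {
    header('HTTP/1.1 403 Forbidden');
    exit;
}

$file = '/srv/protected/big-download.zip';

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($file) . '"');
header('X-Sendfile: ' . $file);   // Apache takes over; the PHP script ends here
exit;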

Reading from large, continuously growing file with BufferedReader

The task I have is to (somewhat efficiently) read line-by-line through a very large, continuously growing file. Here's basically what I'm doing now: BufferedReader rd = //initialize BufferedReader String line; while(true){ while((line=rd.readLine())==null){ try{ Thread.sleep(1000); }catch(InterruptedExcep...

Getting a 2GB file inside PHP?

I need to download a very large file via PHP; the last time I did it manually via HTTP it was 2.2 GB in size and took a few hours to download. I would like to automate the download somehow. Previously I have used file_put_contents($filename, file_get_contents($url)); Will this be OK for such a large file? I will want to untar th...
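
file_put_contents(file_get_contents()) pulls the entire download into memory before writing anything, so a 2 GB file will usually hit the memory limit (and, on 32-bit PHP builds, the 2 GB size ceiling). A minimal sketch of the common alternative, assuming allow_url_fopen is enabled; the URL and target path are placeholders:

<?php
// Sketch: copy the remote stream to disk in small blocks so the whole 2 GB
// file never sits in memory at once.
$url      = 'http://example.com/huge-archive.tar';   // placeholder
$filename = '/tmp/huge-archive.tar';                  // placeholder

$src = fopen($url, 'rb');
$dst = fopen($filename, 'wb');

// stream_copy_to_stream() reads and writes in chunks internally.
stream_copy_to_stream($src, $dst);

fclose($src);
fclose($dst);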

File downloads incomplete over slow connection

I have a 15MB file on a website (Apache webserver) that downloads fine on reasonable speed connections, but is almost always incomplete on slower connections (28KBytes/sec, for example). The size of the incomplete file is random, from 2 to 13 MB. I have verified the behavior in both Safari and Firefox, on a connection with negligible l...

Upload large files using Ruby

I'm wondering what the best pattern is for allowing large files to be uploaded to a server using Ruby. I've found "Rails and Large, Large file Uploads: Looking at the alternative", but it doesn't give any concrete solutions. I don't want to use Rails, since I'm working on a simple upload server that'll run in standalone mode. I'm guessing tha...

Problems with HUGE XML files

I have 16 large XML files. When I say large, I am talking gigabytes. One of these files is over 8 GB. Several of them are over 1 GB. These are given to me by an external provider. I am trying to import the XML into a database so that I can shred it into tables. Currently, I stream 10,000 records at a time out of the file into mem...

How do you save Large Objects in Access2007?

I am importing data (PDF, XLS, DOC, TXT) into an Access 2007 database, then I convert it to Base64 (called strData below). If the original is about 150K or less, I can save the converted file (about 250K characters) into a memo field. However, if the original is larger than that, the code gives me "run-time error '2498': An ex...

Estimating the word count of a file without reading the full file

I have a program to process very large files. Now I need to show a progress bar to indicate the progress of the processing. The program works at the word level: it reads one line at a time, splits it into words, and processes the words one by one. So while the program runs, it knows the count of words processed. If somehow it knows the wor...
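
One common way to get an estimate without reading the whole file is to sample it: count the words in the first chunk, derive an average of bytes per word, and scale by the total file size. A minimal PHP sketch of that idea; the function name and the 1 MB sample size are arbitrary choices, not part of the asker's program.

<?php
// Sketch: estimate the total word count from a sample at the start of the file.
// The estimate can be refreshed later from the bytes actually processed.
function estimateWordCount($path, $sampleBytes = 1048576)
{
    $sample = file_get_contents($path, false, null, 0, $sampleBytes);
    $wordsInSample = str_word_count($sample);
    if ($wordsInSample === 0) {
        return 0;
    }
    $bytesPerWord = strlen($sample) / $wordsInSample;   // average word + whitespace size
    return (int) round(filesize($path) / $bytesPerWord);
}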

OutOfMemoryException - Loading Extremely Large Images

I'm trying to load an extremely large image (14473x25684), but I'm running into a memory limitation. Here's a simple program to demonstrate the problem: static void Main(string[] args) { string largeimage = @"C:\Temp\test_image.jpg"; // 14473x25684 Image i = Bitmap.FromFile(largeimage); // OutOfMemoryException was unhandled } ...

Write very large array to file in PHP

I've got a client with a Magento shop. They are creating a txt file to upload to Google Base, which contains all of their products, but due to the quantity of products (20k), the script bombs out once it's taken up about 1 GB. It's being run via cron. Is there a way to either zip or segment the array, or write it to the file as it's creat...
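
A rough sketch of the "write it as it's created" approach, which keeps memory use flat regardless of catalogue size: open the feed file once and append one row per product instead of accumulating a 20k-element array. The product loader and output path below are placeholders, not Magento's actual API.

<?php
// Sketch: append each product row to the feed file as it is produced,
// instead of building the whole array in memory first.
$fp = fopen('/var/export/googlebase.txt', 'w');       // hypothetical output path

foreach (loadProductsInBatches(500) as $product) {    // placeholder for the Magento calls
    $row = array(
        $product['sku'],
        $product['name'],
        $product['price'],
        $product['url'],
    );
    fputcsv($fp, $row, "\t");                          // one tab-separated line per product
}

fclose($fp);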

Very large images in web browser

We would like to display very large (50 MB plus) images in Internet Explorer. We would like to avoid compression, as compression algorithms are not what CSI would have us believe, and the resulting files are too lossy. As a result, we have come up with two options: Silverlight Deep Zoom or a Flash-based solution (such as Zoo...

c handle large file

I need to parse a file that could be many GB in size. I would like to do this in C. Can anyone suggest any methods to accomplish this? The file that I need to open and parse is a hard drive dump that I get from my Mac's hard drive. However, I plan on running my program on 64-bit Ubuntu 10.04. Also, given the large file size, the ...

View large text files online in web browser (logs, linux)

Does anybody know of a simple Linux app (PHP preferred) to view large files online in a browser? Like phpMyAdmin (MySQL), but for files. My PHP application writes some debug, error, and informational events to logs. I know I can log to a database, but to me, appending to a file seems like the more reliable (because simpler) solution for th...
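
For viewing, the usual trick is not to load the whole log at all: seek to near the end and print only the last chunk, the way tail does. A minimal PHP sketch of that idea, with the log path and chunk size as placeholders:

<?php
// Sketch: show only the tail of a large log file instead of reading it all.
$logFile   = '/var/log/myapp/debug.log';   // placeholder path
$tailBytes = 64 * 1024;                    // show roughly the last 64 KB

$size = filesize($logFile);
$fp   = fopen($logFile, 'rb');
fseek($fp, max(0, $size - $tailBytes));

header('Content-Type: text/plain; charset=utf-8');
echo fread($fp, $tailBytes);
fclose($fp);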

Does fread fail for large files?

I have to analyze a 16 GB file. I am reading through the file sequentially using fread() and fseek(). Is it feasible? Will fread() work for such a large file? ...

Read a long string into memory

Hi, I have a very large string, and when I read it in Java, I get an out-of-memory error. I actually need to read the whole string into memory, then split it into individual strings and sort them by value. What is the best way to do this? Thanks ...

C# serialize large array to disk

I have a very large graph stored in a single dimensional array (about 1.1 GB) which I am able to store in memory on my machine which is running Windows XP with 2GB of ram and 2GB of virtual memory. I am able to generate the entire data set in memory, however when I try to serialize it to disk using the BinaryFormatter, the file size get...

How to show feedback while streaming large files with WCF

I'm sending large files over WCF and I'm using transferMode="Streamed" in order to get this working, and it is working fine. The thing is, sometimes those files are just too big, and I want to give the client some sort of feedback about the progress. Does anybody have a good solution/idea on how to accomplish this? EDIT: I don't comman...

Better way to store large files in a MySQL database?

I have a PHP script that you can upload very large files with (up to 500MB), and the file's content is stored in a MySQL database. Currently I do something like this: mysql_query("INSERT INTO table VALUES('')"); $uploadedfile = fopen($_FILES['file']['tmp_name'], 'rb'); while (!feof($uploadedfile)) { $line = mysql_escape_string(fget...
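
A hedged sketch of one way to keep memory use flat: insert an empty row, then append the upload to the BLOB column in fixed-size chunks with a prepared statement (mysqli here, rather than the old mysql_* functions in the excerpt). Table and column names are placeholders, each chunk still has to fit under max_allowed_packet, and many setups avoid the problem entirely by storing a file path instead of the content.

<?php
// Sketch: append the uploaded file to a BLOB column chunk by chunk,
// so neither PHP nor any single query ever holds the whole 500 MB at once.
$db = new mysqli('localhost', 'user', 'pass', 'uploads');   // placeholder credentials

$db->query("INSERT INTO files (content) VALUES ('')");
$id = $db->insert_id;

$stmt = $db->prepare('UPDATE files SET content = CONCAT(content, ?) WHERE id = ?');

$fp = fopen($_FILES['file']['tmp_name'], 'rb');
while (!feof($fp)) {
    $chunk = fread($fp, 512 * 1024);            // 512 KB per statement
    $stmt->bind_param('si', $chunk, $id);
    $stmt->execute();
}
fclose($fp);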