I mean 100+ MB big; such text files can push the envelope of editors.

I need to look through a large XML file, but cannot if the editor is buggy.

Any suggestions?

+20  A: 

I've found that UltraEdit32 does pretty well loading large text files (including XML).

Gil Milow
I just opened a 150 MB SQL dump in UEStudio, and after a short loading pause, it worked fine. Scrolling was a bit jerky, but not bad for a file with 2.4 million lines.
I regularly use UltraEdit for very large files, binary and text, and it always works well.
+4  A: 

My normal standby is Notepad++, but in this case I have to post specifically to recommend against it. It can handle reasonably large files okay in most cases, but it really struggles with large Xml data.

Something else worth noting: many so-called text editors will treat Xml as more than just text. They'll do validation, folding, and try to create a DOM, resulting in a memory image much larger than the file itself.

Notepad++ is doing something like this, but other editors may do it as well.

Joel Coehoorn
+3  A: 

BBEdit on the Mac will handle them just fine.

Otherwise VIM (or vim -R if you don't need to edit it) will handle it just fine as well.

+1 for BBEdit. It's my editor of choice for everything except Objective-C (Xcode wins for Obj-C, because that's what it was *built* for).
Dave DeLong
BBEdit has a file size limit of 384 MB. In addition, it loads the whole file into memory, just like most other editors, which means opening large files is slow.
+91  A: 

I'm assuming that you're on Windows, so I'll recommend gVim. Where Notepad++ will choke on very large files, VIM has chowed through those puppies with little problem.

010Editor on Windows will open GIANT (think 5GB) files in binary mode and allow you to edit and search the text.

Other suggestions are SlickEdit, Emacs, and Large Text File Viewer.

Text editors with 2GB limit: Notepad++, Jujuedit, TextPad

VIM, or Emacs... pick your poison, both will handle any file you throw at them. I personally prefer Emacs, but both will beat notepad without so much as a hiccup.
Mike Stone
Emacs has a maximum buffer size, dependent on the underlying architecture (32 or 64 bits). I think that on 32 bit systems you get "maximum buffer size exceeded" error on files larger than 128 MB.
Rafał Dowgird
+1 for Vim. It's my editor of choice.
Steve Rowe
I just tried Notepad++ with a 561MB log file and it said it was too big
I regularly open ~600mb files with gVIM ...
I've been asked in the past to edit a couple of plain text files in the multi-GB range, which our users had tried to edit with MS Word... well, most of you will know what happened. I just opened the file in vim and searched and replaced, with the user sitting next to me, in a matter of seconds (after that huge file was finally read in, of course).
@Rafal Interesting! Looks like on 64bit it is ~1024 petabytes. The reason has to do with the fact that emacs has to track buffer positions (such as the point)
I'll second gVim for huge files. I just edited a 950 MB text file with no problems (but it took a while to open and save). When I tried that same file in Notepad2, Windows became concerned with the size of my pagefile, and started resizing it.
Christian Davén
But be careful: vim will only work as long as the files in question have enough line breaks. I once had to edit a ca. 150 MB file without any line breaks, and had to resort to gedit because vim couldn't handle it.
If you are going to use (g)vim, then to improve performance you may want to turn off some features, such as syntax highlighting (:syntax off), the swapfile (:set noswapfile) and undo (:set undolevels=-1).
Dave Kirby
Notepad++ is a great,free product I use a lot - but it doesn't handle very large files that well.
Dan Diplo
I wonder if 5 GB text files really exist... if you don't mind, may I ask where, in the practical world, we are forced to use/edit these bulky text files? (An alternative would be to break the file into a few smaller ones; larger files of any type usually make the system struggle to perform.)
infant programmer
@Rafal: emacs buffer size can be boosted with emacs 23. I don't recall offhand how to do it.
Paul Nathan
I tried them all. gVim sucked in that it didn't even tell you it was loading a file - it took forever to load (only 200k, 5 million lines). SlickEdit opened the entire file in about 3 seconds. Getting the trial license was a pain, though. Thank you for listing these.
Emacs definitely has a buffer size problem on 32-bit.
Joseph Garvin
I want an editor that mmap()s the file and reads only the parts I am looking at... even gvim seems to load the whole thing into memory first, and even resizing the window freezes it while it thinks...
Joe Koberg
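The mmap() approach Joe describes can be sketched in a few lines of Python (a hypothetical illustration, not any editor's actual implementation; `find_in_file` is a made-up name): map the file and let the OS page in only the regions that are actually touched, instead of reading the whole thing into memory.

```python
import mmap

# Sketch: search a large file without reading it all into memory.
# mmap pages in only the regions that are actually accessed.
def find_in_file(path, needle):
    with open(path, "rb") as f:
        with mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as mm:
            return mm.find(needle)  # byte offset of first match, or -1
```

A viewer built this way stays responsive because scrolling only faults in the pages around the visible window.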

On Windows I've used Notepad++. I don't know if I've edited files quite that large, but certainly many megs.

I'm not sure of the limit, but Visual Studio 2005 should handle it, and it will allow you to view it as a table (assuming the XML is regular).

Dan Hewett
Notepad++ is good (at least not bad) with files smaller than 100 MB. Beyond that level, Notepad++ does not handle them well.
Yin Zhu
+9  A: 

Here's another vote to NOT use Notepad++. We work with huge XML files at my job, and Notepad++ will choke on them every time.

Surprisingly Wordpad performs better on these types of files than Notepad++. I've also had success with UltraEdit although I'm downloading gVIM now to see how it performs.

If you are just looking to validate a large file, I've asked that question here and gotten some good responses (XMLStarlet is a nice command-line app).
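A well-formedness check doesn't require an editor or a DOM at all. As a minimal sketch (the function name `is_well_formed` is my own, not from any tool mentioned here), Python's streaming `iterparse` can walk a file of any size in constant memory:

```python
import xml.etree.ElementTree as ET

# Sketch: check well-formedness of a large XML file without building the
# whole tree in memory; each element is freed as soon as it closes.
def is_well_formed(path):
    try:
        for _, elem in ET.iterparse(path, events=("end",)):
            elem.clear()  # discard children so memory stays flat
        return True
    except ET.ParseError:
        return False
```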

Dan Cramer
Notepad++ struggles on even smallish files if you try to insert text at the start - somebody didn't do a good job on the data model design.
Martin Beckett
How big is huge? I opened a 300mb log file in notepad++ earlier and didn't have any problems viewing/scrolling/searching though it did take maybe 10 seconds or so to open.
I just tried opening a 95mb sql dump in notepad++ and the whole thing locked up.
Ryan Elkins
+3  A: 

I've opened 20+ meg log files in Emacs without it breaking a sweat; I can't imagine it would falter at 100+ meg files. There are builds of it for Windows too.


On a whim, I just tested a very simple generated XML file in Emacs... a 140 MB file, and it handled it beautifully. Syntax coloring and everything worked fine, with a slight delay when opening the file, but no more than a few seconds. Same with going to the end of the file... otherwise, absolutely no problems.

Mike Stone
Log files are one thing: Xml is another. You have to create DOM that goes with the file, and I've seen lots of editors choke on even modest xml.
Joel Coehoorn
We are talking about Emacs here, not notepad or some wimpy average windows app ;-) Emacs is a very solidly built editor that has been around since the days when micro performance mattered. It was a beast then, but now it is pretty lightweight, and very efficient comparatively.
Mike Stone
emacs23 (even in fundamental mode) will choke on just a 2MB XML file that is all one line. After reformatting to multiple lines with GNU textutils, emacs is still painfully sluggish with xml-mode syntax highlighting.
+2  A: 

I've opened and browsed 100MB text files with SlickEdit.

As much as I hate to say anything nice about SlickEdit, editing files much larger than available memory is something it is very good at.
Mark Bessey
+8  A: 

Okay, I've tried it with Visual Studio, Emacs and gVim (64 bit).

Emacs chokes, VS opens it but is too sluggish, and gVim kicks ass. I just tried an intentionally generated 500 meg file in gVim, and it opens that just fine without much trouble :)

vim is good. vim is best
This may help:
Jeremy Stein
LargeFile.vim (as pointed out by Jeremy Stein) is useful. I was about to add a comment about it. It basically disables the swapfile whenever the file is too big.
Denilson Sá
Vi and its LargeFile plugin FTW!
Pascal Thivent
+4  A: 

My vote is for EditPad. There are lite and pro versions with not much difference between them. I regularly open files of >>100Mb. Plus it lets you select columns of text!

+1  A: 

gVIM all the way - I opened a 1 gig text file on windows, there was an initial delay of about 15 secs, but after that it was as smooth as anything. gVIM on Unix was smoother & quicker than on windows.

+4  A: 

I tried to open a 3 GB log file in gVim... I stopped the process as it took too long. While the file was being opened, the *.swp file kept growing... I guess it would have grown to about the same size as the file itself in the end... I didn't want this. Solution:

:set noswapfile might help speed things up.

I got this from a nice article from Peter Chen

+4  A: 

If you're running Windows, TheGun (6144 bytes of MASM goodness) is awesome for this sort of thing - I've opened corrupt mbox files of many hundreds of megabytes without a hitch:

Another one you may want to consider is Programmer's File Editor (PFE) which is "capable of opening enormous files (limited only by the total amount of virtual memory available)":

For its size, TheGun is excellent.
Umber Ferrule
Editors should be able to read files larger than the virtual memory at a minimum. It's just a bit of fseek() and fread()
Charlie Somerville
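Charlie's fseek()/fread() point is the core of every chunked viewer mentioned in this thread: you only need the bytes for the screenful you are showing. A minimal sketch in Python (the function name `read_window` and the 64 KB window size are my own choices):

```python
# Sketch of the seek/read windowing idea: a viewer only needs the bytes
# for the window being displayed, never the whole file in memory.
def read_window(path, offset, size=64 * 1024):
    with open(path, "rb") as f:
        f.seek(offset)       # like fseek(): jump to the window start
        return f.read(size)  # like fread(): read one window, no more
```

Scrolling then just means calling this with a different offset; memory use is bounded by the window size regardless of file size.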
+10  A: 

I've been using EmEditor and it handles huge text files with no problem. (hundreds of MB and up)

EmEditor is the most efficient editor for large files that I've seen. It was specifically designed for this. Most importantly, it doesn't load the entire file into memory like most editors.
+1  A: 

I have used TextPad for opening a 500+ meg XML file. It worked very well - the file opened without any glitch.

+1  A: 

Nano works just fine.


I tried the following programs: gVim, Notepad++, SQL Work Bench, and 'The Gun'.

Out of all of them, 'The Gun' seems to work the best.

+1  A: 

EditPlus works fine for multi-hundred-megabyte files. Been using it for more years than I care to remember.

Trevor Harrison
+2  A: 


OK, now *that's* minimalist :-)
+2  A: 

Disk-based file editing: (Windows only)

+20  A: 

Why are you using editors to just look at a (large) file?

Under *nix or cygwin, just use less ("less is more", only better, since you can back up). Searching and navigating under less is very similar to vim, but there is no swap file and little RAM used.

+1, I recently had some really huge XML files (1+ gigabyte) that I needed to look at. I'm on Windows, and vim, emacs, Notepad++ and several other editors all completely choked on the file, to the point where my system almost became unusable when trying to open it. After a while I realized how unnecessary it was to actually attempt to open the file in an -editor- when I just needed to -view- it. Using cygwin (and some clever grep/less/sed magic) I easily found the part I was interested in and could read it without any hassle.
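The same "view, don't edit" workflow works in any language that can stream a file line by line. A hedged Python sketch of the grep step (the function name `grep_lines` is hypothetical; the point is that memory use stays constant no matter how big the file is):

```python
import re

# Sketch: stream a huge file line by line, yielding only matching lines,
# so memory use stays constant regardless of file size.
def grep_lines(path, pattern):
    rx = re.compile(pattern)
    with open(path, "r", errors="replace") as f:
        for lineno, line in enumerate(f, 1):
            if rx.search(line):
                yield lineno, line.rstrip("\n")
```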

XMLMax will do it, AND if the XML file is not well-formed, it will locate the error, show it to you, let you fix it, save it and reload it for you. It should load a 100 MB file in ten seconds. You can Google to find it.

Bill Conniff

You can edit huge files with PilotEdit easily. I tried it on a 7 GB file.


vim and vless (alias vless='/usr/share/vim/vim72/macros/')

Kartik Mistry
+7  A: 

One thing is how big a file your editor can theoretically edit. But another thing is whether it is fast enough to realistically edit that file.

Most editors take the easy way out and simply load the whole file into memory. This means that you cannot edit files larger than the largest free memory block. This gets even worse if the editor converts an ASCII file into Unicode internally, which doubles the size. With editors like this, just opening the file may take several minutes, and even if you can load the file, editing operations may be so slow that you really cannot do anything.

For editing huge files, VEDIT is the best choice. It is marketed as the fastest editor on Earth, and that is probably true. In addition, VEDIT uses very little memory and does not create huge tmp files, no matter how big the files you are editing are. The standard (32-bit) version of VEDIT can edit files up to 2 GB (but you can edit larger files by using the built-in splitter function). VEDIT Pro64 can directly handle files of any size.

UltraEdit is OK, too, but it is not as fast as VEDIT and you may need to change configuration and sacrifice backup and undo for editing large files.

I just opened a (copy of) my Outlook .pst file (297 MB) in VEDIT. Opening the file took approximately 0.1 seconds! Searching for a string found near the end of the file took 8.0 sec in normal mode and 1.1 sec in read-only mode. Inserting and deleting characters were instantaneous, as was undo. Saving the file took 11 seconds.

Opening the same file in UltraEdit took 9.8 sec in normal mode and about 1.0 sec with tmp files disabled. Searching took 11.5 sec (using read-only mode had no effect on this). Inserting and deleting characters were instantaneous, but undoing a single-character insert took 26 seconds. Saving the file took 16 seconds.

I tried to open the file in Notepad, but it crashed (probably because the file is binary). Opening a 92 MB text file took 3 minutes.

Attempt to load the file on Eclipse default editor caused error "out of java heap space". The same happened even with the 92 MB file.

For more information about many text editors, see:
Wikipedia: Comparison of text editors

+1 excellent numeric comparisons
To clarify, albeit a bit pedantically: Editors using UTF-16 (a specific encoding) will use twice as much memory for files that are not encoded in UTF-16. A Unicode editor using UTF-8 internally opening a UTF-8 file will not use twice as much memory. Nor will a UTF-16 editor opening a UTF-16 file. "Unicode" != "twice as much RAM"
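The clarification above is easy to verify directly: for ASCII content, UTF-16 stores two bytes per character while UTF-8 stores one, so an editor that converts to UTF-16 internally doubles its footprint only when the file wasn't UTF-16 to begin with.

```python
text = "x" * 1_000_000  # one million ASCII characters

utf8 = text.encode("utf-8")
utf16 = text.encode("utf-16-le")

# UTF-8: one byte per ASCII char; UTF-16: two bytes per ASCII char.
assert len(utf8) == 1_000_000
assert len(utf16) == 2 * len(utf8)
```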
+1  A: 

I've opened Wikipedia dumps (you may guess the size). So far the best option here is gVim. But as it is an XML file, you can sneak a peek into it (check well-formedness, count entries) with tools like Apache Xalan.

Kuroki Kaze

I've always been fond of JWrite.

Link : JWrite from MWA Software

Jason Slocomb

JujuEdit opens up to 2GB in an instant, scrolls quickly, edits quickly and is free.

Very Large File Support - edit files up to 2GB in size, and browse them instantly with special "Open From Disk" mode.

Very Big Undo Buffer - virtually unlimited undo buffer (preserved after save so that saved changes can still be undone while the file is still open)


Programmer's File Editor is also quite amazing. It's from 1999 but it opens a 700MB file in about 2 seconds and saves it in about 3 seconds. Scrolling and editing is super fast too.


Ordinarily, I use GNU Emacs (v23) for text editing or lightweight coding. But I have to admit GNU Emacs sucks when it comes to large, huge, gigantic files: you have to configure and compile it yourself to edit files in the magnitude of megabytes, and even if you do, it's not really recommended. For large files, in text mode GNU Emacs is sluggish at best and sometimes hangs; in hex mode, it almost always hangs (and I resort to gvim + xxd for hex-editing not-so-large files, and dedicated hex editors for really large files). I guess this is not some peculiarity of my computer but a design issue (feature? bug?) with GNU Emacs.


HiEditor, the best I've found for this job.


I use EmEditor to handle my very large files (more than 10 GB and sometimes a few hundred million rows) and it opens them in a breeze... no problems so far.


TextPad is the cheapest and best option to open files nearing GBs in size.

Abdullah Akbar
+1  A: 

XML Copy Editor is a fast, open-source XML editor designed to handle huge files.

Windows and Linux binaries (as well as source) are available and I heard that new versions will also include Mac releases.

Great XML editor — I've used it for over three years.


Old question, I know, but I stumbled on this thread. The best program I have found for opening (viewing, not editing) large files - such as database dump files and traces in excess of 10 GB, way in excess of available system or virtual memory - is "Large Text File Viewer".

It loads the file in chunks, so you are able to scan through the file without loading the whole file into memory.

It is freeware, written by Tie-Qi (TQ) Chen.

Latest version 5.2 was released 5 June 2009, so it looks like it is still a live project.

+3  A: 

I can attest to Cream.

Loaded, edited and saved an 850mb file for me in less than a minute. Since it's based on gVim, all the other testimonials for gVim would apply.

Failures: Notepad++, Large Text File Viewer, Notepad2

+1  A: 

I find a regular web browser has no trouble opening large files. I just opened a 220,047 KB file in old IE6 without a problem.


If you just want to view a large file rather than edit it, there are a couple of freeware programs that read files a chunk at a time rather than trying to load the entire file into memory. I use these when I need to read through large (> 5 GB) files.

Large Text File Viewer by swiftgear

Big File Viewer by Team Walrus.

You'll have to find the link yourself for that last one, because I can only post a maximum of one hyperlink, being a newbie.


I usually use TextPad for anything over 1GB.

Sometimes I have to go hunting through debug logs that are 3-4 GB in size, and TextPad is the only thing I have found that will open them (and quickly as well, while still being able to perform searches and normal editing).

I think most decent text editors will open files of a couple of hundred MB. Hell, Windows Notepad will, if you give it a few minutes to chew away at it and are not expecting to use the PC for anything else in the meantime.

But for anything over a few hundred MB, I suggest TextPad.

(Please note, my normal editor of choice for anything else is Notepad++, but it struggles with anything over 300 MB or so.)


I use EmEditor for large files. It opens and handles them faster than any other text editor I've used, and I work with some BIG files (upwards of 5GB).


If you just want to view the file, V File Viewer is an excellent tool; I opened an 8.5 GB file with it earlier today.

I have to say, I've been searching today for a text editor that will handle what I call "large" files, and so far all the applications I've tried have choked. It makes me laugh to see people describing 100 MB files as "large". I deal with a lot of data: clients will send me a flat text file containing, say, an entire transaction database covering a year or more of transactions, and that sort of thing. I regularly deal with file sizes of over 2 GB, and right now I am looking at a text file which, as I said, is 8.5 GB, but which is mostly crap I'm not interested in. Anyone know of anything that will handle that?

As to why I would want to: well, we frequently get data provided to us with high/low ASCII characters embedded, which cause our data processing program (SAS) to fail to read the data in (such as end-of-file characters or embedded line feed characters, etc.). In order to fix this, I usually open the data in a text editor and strip the special chars out.
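The cleanup step described above doesn't need an editor that can hold the whole file. One possible streaming pass, sketched in Python under my own assumptions (the function name is made up, and the exact byte set to strip depends on what breaks the downstream tool; here it removes control bytes other than tab, LF and CR):

```python
KEEP = {0x09, 0x0A, 0x0D}  # tab, LF, CR survive the cleanup
BAD = bytes(b for b in range(0x20) if b not in KEEP)

# Sketch: strip stray control bytes from a huge file, processing it in
# fixed-size chunks so it never has to fit in memory.
def strip_control_bytes(src, dst, chunk_size=1 << 20):
    with open(src, "rb") as fin, open(dst, "wb") as fout:
        while True:
            chunk = fin.read(chunk_size)
            if not chunk:
                break
            fout.write(chunk.translate(None, BAD))  # delete BAD bytes
```

Memory use is bounded by the chunk size, so an 8.5 GB input is no harder than an 8.5 MB one, just slower.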