I mean 100+ MB big; such text files can push the envelope of editors.
I need to look through a large XML file, but cannot if the editor is buggy.
Any suggestions?
I've found that UltraEdit32 does pretty well loading large text files (including XML).
My normal standby is Notepad++, but in this case I have to post specifically to recommend against it. It handles reasonably large files okay in most cases, but it really struggles with large XML data.
Something else worth noting: many so-called text editors will treat XML as more than just text. They'll do validation and folding, and try to build a DOM, resulting in a memory image much larger than the file itself.
Notepad++ does something like this, but other editors may do it as well.
BBEdit on the Mac will handle them just fine.
Otherwise VIM (or vim -R if you don't need to edit it) will handle it just fine as well.
I'm assuming that you're on Windows, so I'll recommend gVim. Where Notepad++ will choke on very large files, VIM has chowed through those puppies with little problem.
010Editor on Windows will open GIANT (think 5GB) files in binary mode and allow you to edit and search the text.
Other suggestions are SlickEdit, Emacs, and Large Text File Viewer.
On Windows I've used Notepad++. I don't know if I've edited files that large, but certainly many megabytes. http://notepad-plus.sourceforge.net/uk/site.htm
I'm not sure of the limit, but Visual Studio 2005 should handle it, and it will allow you to view it as a table (assuming the XML is regular).
Here's another vote to NOT use Notepad++. We are working with huge XML files at my work and Notepad++ will choke on them every-time.
Surprisingly Wordpad performs better on these types of files than Notepad++. I've also had success with UltraEdit although I'm downloading gVIM now to see how it performs.
If you are just looking to validate a large file, I've asked that question here and gotten some good responses (XMLStarlet is a nice command-line app):
http://stackoverflow.com/questions/40663/validating-a-huge-xml-file
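Besides command-line tools like XMLStarlet, a streaming well-formedness check can be sketched with Python's standard library. The helper below is an assumption of mine, not from the thread: `iterparse` reads the file incrementally instead of building a full DOM, so even a multi-gigabyte file never needs to fit in RAM.

```python
# Streaming well-formedness check for huge XML files, using the standard
# library's incremental parser rather than loading the file into memory.
import xml.etree.ElementTree as ET

def is_well_formed(path):
    """Return (True, None) if the XML parses, else (False, error message)."""
    try:
        for _event, elem in ET.iterparse(path, events=("end",)):
            elem.clear()  # discard each element once seen, keeping memory flat
        return True, None
    except ET.ParseError as exc:
        return False, str(exc)
```

On a malformed file the `ParseError` message includes the line and column of the first error, which is enough to jump there in an editor.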
I've opened 20+ MB log files in Emacs without it breaking a sweat, so I can't imagine it would falter at 100+ MB files. There are builds of it for Windows too.
UPDATE:
On a whim, I just tested a very simple generated XML file in Emacs: a 140 MB file, and it handled it beautifully. Syntax coloring and everything worked fine, with a slight delay when opening the file, but no more than a few seconds. Same with going to the end of the file. Otherwise, absolutely no problems.
Okay, I've tried it with Visual Studio, Emacs and gVim (64 bit).
Emacs chokes, VS opens it but is too sluggish, and gVim kicks ass. I just tried an intentionally generated 500 MB file in gVim, and it opens that one fine too without much trouble :)
I tried to open a 3 GB log file in gVim, but stopped the process as it took too long. While the file was opening, the *.swp file kept growing; I guess it would have ended up about the same size as the file itself. I didn't want this. Solution:
:set noswapfile might help speed things up.
I got this from a nice article from Peter Chen
If you're running Windows, TheGun (6144 bytes of MASM goodness) is awesome for this sort of thing - I've opened corrupt mbox files many hundreds of megabytes without a hitch:
http://www.movsd.com/thegun.htm
Another one you may want to consider is Programmer's File Editor (PFE) which is "capable of opening enormous files (limited only by the total amount of virtual memory available)":
I've been using EmEditor and it handles huge text files with no problem. (hundreds of MB and up)
I tried the following programs: gVim, Notepad++, SQL Work Bench, and 'The Gun'.
Out of all of them, 'The Gun' seems to work the best.
EditPlus works fine for multi-hundred-megabyte files. Been using it for more years than I care to remember.
Why are you using editors to just look at a (large) file?
Under *nix or Cygwin, just use less ("less is more", only better, since you can back up). Searching and navigating under less is very similar to vim, but there is no swap file and little RAM is used.
XMLMax will do it, AND if the XML file is not well-formed, it will locate the error, show it to you, let you fix it, save it, and reload it for you. It should load a 100 MB file in ten seconds. You can Google to find it.
You can edit huge files with PilotEdit easily. I edited a 7 GB file with it.
vim and vless (alias vless='/usr/share/vim/vim72/macros/less.sh')
One thing is how big a file your editor can theoretically handle. Another is whether it is fast enough to realistically edit that file.
Most editors take the easy way out and simply load the whole file into memory. This means that you cannot edit files larger than the largest free memory block. It gets even worse if the editor converts the ASCII file into Unicode, which doubles the size. With editors like this, just opening the file may take several minutes, and even if you can load it, any editing operation may be so slow that you really cannot do anything.
For editing huge files, VEDIT is the best choice. It is marketed as the fastest editor on Earth, and that is probably true. In addition, VEDIT uses very little memory and does not create huge tmp files, no matter how big the files you are editing. The standard (32-bit) version of VEDIT can edit files up to 2 GB (but you can edit larger files by using the built-in splitter function). VEDIT Pro64 can directly handle files of any size.
UltraEdit is OK, too, but it is not as fast as VEDIT and you may need to change configuration and sacrifice backup and undo for editing large files.
I just opened a copy of my Outlook .pst file (297 MB) in VEDIT. Opening the file took approximately 0.1 seconds! Searching for a string found near the end of the file took 8.0 sec in normal mode and 1.1 sec in read-only mode. Inserting and deleting characters were instantaneous, as was undo. Saving the file took 11 seconds.
Opening the same file in UltraEdit took 9.8 sec in normal mode and about 1.0 sec with tmp files disabled. Searching took 11.5 sec (using read-only mode had no effect on this). Inserting and deleting characters were instantaneous, but undoing a single character insert took 26 seconds. Saving the file took 16 seconds.
I tried to open the file in Notepad, but it crashed (probably because the file is binary). Opening a 92 MB text file took 3 minutes.
Attempting to load the file in the Eclipse default editor caused the error "out of java heap space". The same happened even with the 92 MB file.
For more information about many text editors, see:
Wikipedia: Comparison of text editors
I've opened Wikipedia dumps (you can guess the size). So far, the best option here is gVim. But since it is an XML file, you can sneak a peek into it (check well-formedness, count entries) with tools like Apache Xalan.
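The "count entries" idea can be sketched without any external tool. This is my own sketch, not from the thread, and the tag name "page" is an assumption (it happens to match Wikipedia dumps; substitute your own record element). The key trick is clearing each element as it completes, so memory stays bounded even for multi-gigabyte dumps.

```python
# Count record elements in a huge XML file with bounded memory, by
# streaming the parse and freeing each finished subtree immediately.
import xml.etree.ElementTree as ET

def count_elements(path, tag):
    """Count end-tag occurrences of `tag`, ignoring any XML namespace prefix."""
    count = 0
    for _event, elem in ET.iterparse(path, events=("end",)):
        # iterparse yields qualified names, e.g. "{ns}page" for namespaced XML
        if elem.tag == tag or elem.tag.endswith("}" + tag):
            count += 1
        elem.clear()  # drop the finished subtree so memory does not grow
    return count
```

Without the `elem.clear()` call, the tree built behind the scenes would grow to roughly the size of the whole file, defeating the point of streaming.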
I've always been fond of JWrite.
Link : JWrite from MWA Software
JujuEdit opens up to 2GB in an instant, scrolls quickly, edits quickly and is free.
http://www.jujusoft.com/software/edit/index.html
Very Large File Support - edit files up to 2GB in size, and browse them instantly with special "Open From Disk" mode.
Very Big Undo Buffer - virtually unlimited undo buffer (preserved after save so that saved changes can still be undone while the file is still open)
Programmer's File Editor is also quite amazing. It's from 1999 but it opens a 700MB file in about 2 seconds and saves it in about 3 seconds. Scrolling and editing is super fast too.
Ordinarily, I use GNU Emacs (v23) for text editing or lightweight coding. But I have to admit GNU Emacs sucks when it comes to large, huge, gigantic files: you have to configure and compile it yourself to edit files in the magnitude of megabytes, and even if you do, it's not really recommended. For large files, in text mode GNU Emacs is sluggish at best and sometimes hangs; in hex mode, it almost always hangs (and I resort to gVim + xxd for hex-editing not-so-large files, and dedicated hex editors for really large files). I guess this is not some particularity of my computer but a design issue (feature? bug?) with GNU Emacs.
I use EmEditor to handle my very large files (more than 10 GB and sometimes a few hundred million rows) and it opens them in a breeze. No problems so far.
TextPad is the cheapest and best option to open files nearing GBs in size.
XML Copy Editor is a fast, open-source XML editor designed to handle huge files.
Windows and Linux binaries (as well as source) are available and I heard that new versions will also include Mac releases.
Great XML editor — I've used it for over three years.
Old question, I know, but I stumbled on this thread. The best program I have found for opening (viewing, not editing) large files - such as database dump files and traces in excess of 10 GB, way in excess of available system or virtual memory - is "Large Text File Viewer".
It loads the file in chunks, so you are able to scan through the file without loading the whole thing into memory.
It is freeware, written by Tie-Qi (TQ) Chen.
Latest version 5.2 released 5 June 2009, so looks like it is still a live project.
I can attest to Cream. http://cream.sourceforge.net/
It loaded, edited, and saved an 850 MB file for me in less than a minute. Since it's based on gVim, all the other testimonials for gVim would apply.
Failures: Notepad++, Large Text File Viewer, Notepad2
I find a regular web browser has no trouble opening large files. I just opened a 220,047 KB file in old IE6 without a problem.
If you just want to view a large file rather than edit it, there are a couple of freeware programs that read files a chunk at a time rather than trying to load the entire file into memory. I use these when I need to read through large (> 5 GB) files.
Large Text File Viewer by swiftgear http://www.swiftgear.com/ltfviewer/features.html
Big File Viewer by Team Walrus.
You'll have to find the link yourself for that last one, because I can only post a maximum of one hyperlink as a newbie.
I usually use TextPad for anything over 1GB.
Sometimes I have to go hunting through debug logs that are 3-4 GB in size, and TextPad is the only thing I have found that will open them (and quickly as well, while still being able to perform searches and normal editing).
I think most decent text editors will open files of a couple of hundred MB. Hell, Windows Notepad will, if you give it a few minutes to chew away at it and don't expect to use the PC for anything else in the meantime.
But for anything over a few hundred MB, I suggest TextPad.
(Please note, my normal editor of choice for anything else is Notepad++, but it struggles with anything over 300 MB or so.)
If you just want to view the file, V File Viewer is an excellent tool; I opened an 8.5 GB file with it earlier today.
I have to say, I've been doing some searching today for a text editor that will handle what I call "large" files, and so far all the applications I've tried have choked. It makes me laugh to see people describing 100 MB files as "large". I deal with a lot of data; clients will send me a flat text file with, say, an entire transaction database covering a year or more of transactions, and that sort of thing. I regularly deal with file sizes of over 2 GB, and right now I'm looking at a text file which, as I said, is 8.5 GB, but which is mostly crap I'm not interested in. Does anyone know of anything that will handle that?
As to why I would want to: we frequently get data with high/low ASCII characters embedded, which cause our data processing program (SAS) to fail to read the data in (such as end-of-file characters or embedded line feed characters, etc.). To fix this, I usually open the data in a text editor and strip the special characters out.
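That final cleanup step (stripping high/low ASCII characters before feeding the data to SAS) doesn't actually need an editor at all; it can be done as a streaming byte filter. This is my own sketch, and the kept range (tab, newline, CR, printable ASCII 0x20-0x7E) is an assumption; adjust the table for your data.

```python
# Streaming filter that strips non-printable "high/low ASCII" bytes,
# copying the file a chunk at a time so it works on multi-gigabyte inputs.

KEEP = set(b"\t\n\r") | set(range(0x20, 0x7F))  # tab, newlines, printable ASCII
DELETE = bytes(b for b in range(256) if b not in KEEP)

def strip_special_chars(src, dst, chunk_size=1 << 20):
    """Copy src to dst, dropping bytes outside the printable-ASCII range."""
    with open(src, "rb") as fin, open(dst, "wb") as fout:
        while True:
            chunk = fin.read(chunk_size)  # 1 MB at a time; RAM use stays flat
            if not chunk:
                break
            fout.write(chunk.translate(None, DELETE))
```

Because the filter never needs the whole file in memory, an 8.5 GB input is no harder than an 8.5 KB one; it just takes longer.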