I am looking for a text editor that will be able to load a 4+ gigabyte file into it. Textpad doesn't work. I own a copy of it and have been to its support site; it just doesn't do it. Maybe I need new hardware, but that's a different question. The editor needs to be free or, if it's going to cost me, then no more than $30. For Windows.
For Windows, Unix, or Mac? On the Mac or *nix you can use command line or GUI versions of emacs or vim.
For the Mac: TextWrangler handles big files well. I'm not versed enough in the Windows landscape to help out there.
Emacs can handle huge file sizes and you can use it on Windows or *nix.
Jeff Atwood has a post on this here: http://www.codinghorror.com/blog/archives/000229.html
He eventually went with Edit Pad Pro, because "Based on my prior usage history, I felt that EditPad Pro was the best fit: it's quite fast on large text files, has best-of-breed regex support, and it doesn't pretend to be an IDE."
It's really tough to handle a 4 GB file as such. I used to handle larger text files, but I never loaded them into my editor. I mostly used UltraEdit in my previous company; now I use Notepad++, but I would extract just those parts which I needed to edit. (In most cases, the files never needed an edit.)
Why do you want to load such a big file into an editor? When I handled files of this size, I used GNU Core Utils. The most common operations I performed on those files were head (to get the top 250k lines, etc.), tail, split, sort, shuf, uniq, and so on. They're really powerful.
There are a lot of things you can do with GNU Core Utils. I would definitely recommend those instead of a new editor.
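As a rough sketch of that workflow (the file name `huge.log` and the sizes here are just placeholders), the coreutils calls mentioned above look like this:

```shell
# Peek at the start and end of the file without ever loading all of it
head -n 250000 huge.log > top.txt   # first 250k lines into a small file
tail -n 1000 huge.log               # last 1000 lines to the terminal

# Split the file into 500 MB pieces that any editor can open
split -b 500m huge.log part_

# Count distinct lines, most frequent first, with no editor at all
sort huge.log | uniq -c | sort -rn | head
```

Each of these tools streams the file rather than reading it into memory, which is why they cope with sizes that choke a typical editor.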
What OS and CPU are you using? If you are using a 32-bit OS, then a process on your system physically cannot address more than 4GB of memory. Since most text editors try to load the entire file into memory, I doubt you'll find one that will do what you want. It would have to be a very fancy text editor that can do out-of-core processing, i.e. load a chunk of the file at a time.
You may be able to load such a huge file if you use a 64-bit text editor on a computer with a 64-bit CPU and a 64-bit operating system. You also have to make sure that you have enough space in your swap partition or your swap file.
Why do you want to load a 4+ GB file into memory? Even if you find a text editor that can do that, does your machine have 4 GB of memory? And unless it has a lot more than 4 GB in physical memory, your machine will slow down a lot and go swap file crazy.
So why do you want to load a 4+ GB file? If you want to transform it, or do a search and replace, you may be better off writing a small, quick program to do it.
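In fact, for search and replace you often don't even need a program of your own: sed processes a file one line at a time, so file size is irrelevant. The file name and pattern below are made up for illustration:

```shell
# Rewrite every occurrence of ERROR to WARN, streaming line by line;
# sed keeps only one line in memory, so a 4 GB input is no problem.
sed 's/ERROR/WARN/g' big.log > big_fixed.log

# GNU sed can also edit "in place" (it writes to a temp file internally)
sed -i 's/ERROR/WARN/g' big.log
```

Note that `-i` is a GNU sed extension; on BSD/macOS sed the flag takes an argument (`-i ''`).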
When I'm faced with an enormous log file, I don't try to look at the whole thing, I use Free File Splitter
Textpad also works well at opening files that size. I have done it many times when dealing with extremely large log files in the 3–5 GB range. Also, using grep to pull out the worthwhile lines and then looking at just those works great.
The question would need more details.
Do you want just to look at a file (e.g. a log file) or to edit it?
Do you have more memory than the size of the file you want to load or less?
For example, TheGun, a very small text editor written in assembly language, claims to "not have an effective file size limit and the maximum size that can be loaded into it is determined by available memory and loading speed of the file. [...] It has been speed optimised for both file load and save."
To get around the memory limit, I suppose one could use memory mapping. But then, if you need to edit the file, some clever method is needed, like storing the local changes in memory and applying them chunk by chunk when saving. That might be inefficient in some cases (a big search/replace, for example).
I've had to look at monster (runaway) log files (20+ GB). I used the free version of hexedit (http://www.download.com/3001-2352_4-10823211.html), which can work with files of any size. It is also open source. Cool?
Just for the record, and to complete this list:
You may want to check "Large Text Viewer", which does... pretty much what it says it does ;) (on Windows)
- No installation is needed.
- The executable is only 568KB!
- Search features (regexp supported)
Instead of loading a gigantic log file in an editor, I'm using Unix command line tools like grep, tail, gawk, etc. to filter the interesting parts into a much smaller file, and then I open that.
On Windows, try Cygwin.
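A typical filtering pipeline of that kind might look like the following (the log name `app.log` and the `req-4711` pattern are purely illustrative; `awk` here stands in for gawk, which is GNU awk):

```shell
# Keep only the lines mentioning one request ID, then take the last 10,000;
# the editor only ever has to open the small result file.
grep 'req-4711' app.log | tail -n 10000 > interesting.log

# Or pull out just two fields from each matching line with awk
awk '/req-4711/ { print $1, $5 }' app.log > fields.txt
```

Since every stage streams its input, the pipeline's memory use stays flat no matter how large app.log is.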
I am amazed at the number of people who challenge you on wanting to open a large text file. It's only because they have become accustomed to using tools that won't do the job properly. Think about it. If you said you wanted to open a file of 1 MB, only a few people would challenge you on it. But since you want to open a file of 4 GB, and they can't, they think that you should work around to their way of doing it.
Yes, you can do a lot with the unix/linux tools, but for what you are looking to do, an editor or viewer seems perfect. The fact that there are not many of those tools that can do that is no reason that you shouldn't want to do it that way. I am amazed by how much of that I see on help boards on the internet. So many people can't really provide a useful answer to a problem, so they challenge the need to do that in the first place.
Open a file to see what's in it using an editor. Now that's a concept that anyone ought to be able to get behind.
If you just want to view a large file rather than edit it, there are a couple of freeware programs that read files a chunk at a time rather than trying to load the entire file into memory. I use these when I need to read through large (> 5 GB) files.
Large Text File Viewer by swiftgear http://www.swiftgear.com/ltfviewer/features.html
Big File Viewer by Team Walrus.
You'll have to find the link yourself for that last one, because I can only post a maximum of one hyperlink as a newbie.