views:

444

answers:

3

I have several large files, each of which I want to chunk/split into a predefined number of parts.

Is there an efficient way to do it in Unix (e.g. via awk/sed/perl)?

Also, each file can have a different number of lines.

File1.txt 20,300,055 lines
File2.txt 10,033,221 lines
etc...
+7  A: 

If you just want to split each file into files of a fixed number of lines or bytes, you can use the split command.
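
A minimal sketch of both forms, assuming the File1.txt name from the question and, for the second command, a reasonably recent GNU coreutils split:

    # fixed number of lines per piece; output goes to File1.part.aa, File1.part.ab, ...
    split -l 5000000 File1.txt File1.part.

    # GNU split can also produce a predefined number of parts directly,
    # without breaking a line across two parts
    split -n l/10 File1.txt File1.part.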

Thomas Padron-McCarthy
+2  A: 

I found this. You may need to work out the number of parts to split it into first.
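
Since the question fixes the number of parts up front, the thing to work out is the line count per part. A sketch (assuming a POSIX shell and the File1.txt name from the question) that derives it with wc -l and hands it to split:

    N=10                                  # predefined number of parts
    total=$(wc -l < File1.txt)            # total line count
    per_part=$(( (total + N - 1) / N ))   # round up so we get at most N parts
    split -l "$per_part" File1.txt File1.part.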

NawaMan
+2  A: 

You can use csplit, which can split by context. Check the man/info page of csplit for more info.
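
Besides context patterns, csplit also accepts a repeated line-number pattern, which fits this question better. A sketch, assuming GNU csplit (the '{*}' repeat-until-EOF count is a GNU extension) and a placeholder chunk size:

    # split before line 5,000,000 and then repeatedly at the same interval;
    # -k keeps the pieces already written if the last repetition runs out of input
    csplit -k -f File1.part. File1.txt 5000000 '{*}'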

ghostdog74