views: 2936
answers: 6

Hello everyone!

I need to concatenate 3 files using C#: a header file, a content file, and a footer file. But I want to do this as cool as it can be done.

cool = really small code or really fast (non-assembly code).

+4  A: 
File.ReadAllText(a) + File.ReadAllText(b) + File.ReadAllText(c)


Not sure what constitutes cool, but it seems cool these days to use LINQ for everything:

string.Join("", Array.ConvertAll(new[] { "file1", "file2", "file3" }, File.ReadAllText));
Jimmy
Disclaimer: this will not work for large files. See Mehrdad Afshari's code for what to do then.
Jimmy
This looks like it will need a lot of memory to hold all three files in memory as strings, not to mention the intermediate string resulting from adding the first two and the final string created by adding on the last.
Brian Ensink
Jimmy you beat me to it with your own disclaimer! :)
Brian Ensink
1) Concatenating strings is not good practice. You should use a StringBuilder. 2) This is not a good solution for binary files.
TcKs
@TcKs: 1) Inline concatenation here is done by String.Concat(a, b, c) in one operation, with lower overhead than a StringBuilder. 2) I assumed the "header/content/footer" files were text.
Jimmy
+7  A: 
void CopyStream(Stream destination, Stream source) {
    int count;
    byte[] buffer = new byte[BUFFER_SIZE];
    while ((count = source.Read(buffer, 0, buffer.Length)) > 0)
        destination.Write(buffer, 0, count);
}


CopyStream(outputFileStream, fileStream1);
CopyStream(outputFileStream, fileStream2);
CopyStream(outputFileStream, fileStream3);
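
A possible way to open the streams and wire up those calls, as a sketch only (the file names are placeholders, and BUFFER_SIZE is a constant the answer leaves up to you):

// BUFFER_SIZE is assumed to be a class-level constant you define yourself (e.g. 32 KB).
using (FileStream outputFileStream = File.Create("out.txt"))    // placeholder output path
using (FileStream fileStream1 = File.OpenRead("header.txt"))    // placeholder input paths
using (FileStream fileStream2 = File.OpenRead("content.txt"))
using (FileStream fileStream3 = File.OpenRead("footer.txt"))
{
    CopyStream(outputFileStream, fileStream1);
    CopyStream(outputFileStream, fileStream2);
    CopyStream(outputFileStream, fileStream3);
}

On newer framework versions, Stream.CopyTo does essentially the same read/write loop for you.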
Mehrdad Afshari
I don't find this so smart. What is a good BUFFER_SIZE? Nobody knows. Compared to File.ReadAllText(a) + File.ReadAllText(b) + File.ReadAllText(c), this looks like premature optimization to me.
nes1983
Depends on the size of your files. You would not like to concatenate a few hundred megabytes using the ReadAllText method.
gimpf
I agree. But in such cases I would really leave the task to the experts who wrote a library or, as comes to mind, the cat program. I would certainly not read it chunkwise with some chunk size a fair dice roll gave me.
nes1983
In fact, this is the traditional method of copying a stream. A good buffer size depends on the specific situation, but it probably has to be a multiple of the block size at the very least. Remember, file sizes are among the things that easily go beyond the 32-bit integer max value ;)
Mehrdad Afshari
Yeah, I've seen this method. But is it smart to pretend to be low-level in a language like C#? To get this right, you have to consider context switches, library I/O buffers, etc. Buffer copying is low-level stuff and should be done at the low level, if you REALLY need it, which isn't the usual case.
nes1983
@Nico, Smartness and simplicity are relative.
Mehrdad Afshari
I got too much into this, hm? I'm sorry, I didn't mean to offend you.
nes1983
@Nico, I don't know how you interpreted my statement. I meant that the amount of simplicity and smartness depends on the situation. In this special case (using files), the performance cost of loading the whole files can be extremely critical, which cannot always be ignored just because you're coding in C#.
Mehrdad Afshari
In most cases, Jimmy's way of doing it is fine. If you have large files and need speed, reading from the output of "cat file1 file2 file3" is probably going to be faster than having the buffer on the C# side.
nes1983
It might be, but it has some other problems: creating the process, introducing platform dependence, relying on a third-party program, and making things more complex. You have to make these trade-offs, and the "right" solution depends on the specific scenario.
Mehrdad Afshari
I kept wondering why you think this is so "low-level". I thought it might be worth mentioning that FileStreams are internally buffered, so the array here is not doing any actual file buffering to reduce low-level syscalls. It's just reducing method calls, and it's not that low level.
Mehrdad Afshari
Yeah, but this is kind of my point: using these buffers looks low-level and complicated, but actually it sits above the library buffer, which reads from the OS buffer, which reads from the disk buffer. If I want to use buffers, I do it low-level. If I want to attach files at a high level, I do f1 + f2 + f3.
nes1983
+1  A: 

I don't know C# so well; can you call the shell from there? Can you do something like

system("type file1 file2 file3 > out");

?

I believe that, asymptotically, this ought to be very fast.
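
In C#, a rough sketch of that shell call might look like this (assuming cmd.exe's built-in type command; the file names are placeholders):

using System.Diagnostics;

var psi = new ProcessStartInfo("cmd.exe", "/c type file1 file2 file3 > out")
{
    UseShellExecute = false,   // run cmd.exe directly, no shell window
    CreateNoWindow = true
};
using (Process p = Process.Start(psi))
    p.WaitForExit();           // block until the concatenation finishes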

niko

nes1983
in windows "type file1 file2 file3 > out" works.
Jimmy
Thanks, I'll tweak the answer.
nes1983
A: 

If you are in a Win32 environment, the most efficient solution could be the Win32 API function "WriteFile". There is an example in VB 6, but rewriting it in C# is not difficult.
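
A rough C# sketch of that P/Invoke approach (this is not the VB 6 example referenced; the file names and error handling are illustrative assumptions):

using System;
using System.ComponentModel;
using System.IO;
using System.Runtime.InteropServices;
using Microsoft.Win32.SafeHandles;

class NativeConcat
{
    // P/Invoke declaration for the Win32 WriteFile function.
    [DllImport("kernel32.dll", SetLastError = true)]
    static extern bool WriteFile(SafeFileHandle hFile, byte[] lpBuffer,
        uint nNumberOfBytesToWrite, out uint lpNumberOfBytesWritten, IntPtr lpOverlapped);

    static void Main()
    {
        string[] inputs = { "header.txt", "content.txt", "footer.txt" }; // placeholder names

        // The FileStream only creates the file and hands us a SafeFileHandle;
        // all writes below go straight through the Win32 API.
        using (var output = new FileStream("out.txt", FileMode.Create, FileAccess.Write))
        {
            foreach (string path in inputs)
            {
                byte[] data = File.ReadAllBytes(path);
                uint written;
                if (!WriteFile(output.SafeFileHandle, data, (uint)data.Length, out written, IntPtr.Zero))
                    throw new Win32Exception(Marshal.GetLastWin32Error());
            }
        }
    }
}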

TcKs
+2  A: 

Another way... how about letting the OS do it for you?

ProcessStartInfo psi = new ProcessStartInfo("cmd.exe", 
     String.Format(@" /c copy {0} + {1} + {2} {3}", 
         file1, file2, file3, dest));
psi.UseShellExecute = false;
Process process = Process.Start(psi);
process.WaitForExit();

HTH
Kev

Kev
+1  A: 

You mean 3 text files?? Does the result need to be a file again?

How about something like:

string contents1 = File.ReadAllText(filename1);
string contents2 = File.ReadAllText(filename2);
string contents3 = File.ReadAllText(filename3);

File.WriteAllText(outputFileName, contents1 + contents2 + contents3);

Of course, with a StringBuilder and a bit of extra smarts, you could easily extend that to handle any number of input files :-)
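
For example, a minimal sketch of that extension (the method name and file list are just placeholders):

using System.IO;
using System.Text;

static void ConcatTextFiles(string outputFileName, params string[] inputFiles)
{
    var sb = new StringBuilder();
    foreach (string file in inputFiles)
        sb.Append(File.ReadAllText(file));      // append each file's text in order

    File.WriteAllText(outputFileName, sb.ToString());
}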

Cheers, Marc

marc_s