With all due respect to Rob Kennedy, he's misrepresenting how C/C++ handle multidimensional arrays.
Consider int foo[10][10] and int foo[100]. In both cases foo names a single contiguous block of memory containing sizeof(int)*100 bytes, and in most expressions it decays to a pointer to that block.
There are subtle differences between int *pointer and int foo[10][10], especially with respect to sizeof() and certain compiler-specific error checking. But, when desired, we can treat them interchangeably.
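To see the sizeof() difference concretely, here's a minimal sketch (the names are mine):

    #include <iostream>

    int main()
    {
        int foo[10][10];
        int *p = &foo[0][0];  // both refer to the same 100 ints

        std::cout << sizeof(foo) << '\n';  // 100 * sizeof(int): the whole block
        std::cout << sizeof(p) << '\n';    // just the size of one pointer

        p[57] = 42;                        // row-major layout: same element as foo[5][7]
        std::cout << foo[5][7] << '\n';    // prints 42
        return 0;
    }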
That said, leachrode can't pass "[10][10]" as argument 2 of getline(). He can pass 10*10*sizeof(foo), where foo is myArray's element type. He could also pass sizeof(myArray), if myArray is actually declared as an array rather than a pointer.
The variable myArray does NOT necessarily need to be of type char. However, none of the bytes in the myArray data block can have the value '\n' if that getline() invocation is going to work.
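So an invocation along these lines would compile and usually run, but it's fragile (a sketch; inputFile and myArray are assumed from the earlier posts):

    // Reads raw bytes until sizeof(myArray)-1 characters are stored or a
    // '\n' byte shows up in the data, whichever comes first. getline()
    // also appends a '\0', so it can never fill the entire array. Both
    // quirks are why read() is the better tool here.
    inputFile.getline( (char *)myArray, sizeof(myArray) );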
Personally, I'd go with binary storage. E.g. (assuming inputFile is an iostream object, with the underlying file opened with ios::binary):
inputFile.write( (const char *)myArray, sizeof(myArray) );  // dump the raw bytes
inputFile.read( (char *)myArray, sizeof(myArray) );  // slurp them back in
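Put together, a complete round trip might look like this (a sketch; the file name is made up):

    #include <fstream>

    int main()
    {
        int myArray[10][10] = { { 1, 2, 3 } };

        // Write the raw bytes out...
        std::ofstream out( "myarray.dat", std::ios::binary );
        out.write( (const char *)myArray, sizeof(myArray) );
        out.close();

        // ...and slurp them back in.
        std::ifstream in( "myarray.dat", std::ios::binary );
        in.read( (char *)myArray, sizeof(myArray) );
        return 0;
    }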
*HOWEVER*, that does assume you are reading and writing on binary-compatible systems. We are making assumptions about how your compiler laid that array out in memory. Different compilers may do things differently, and endianness, padding, and alignment issues can create cross-platform nightmares.
It would be better, from a portability and future-maintenance perspective, to read/write each array element individually! Code-wise it's not much more complicated, and speed-wise, iostreams are buffered and you're bound by disk I/O anyway...
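For example, plain text serialization with << and >> sidesteps the endianness, padding, and '\n' problems entirely. A sketch along those lines (again, the file name is made up):

    #include <fstream>

    int main()
    {
        int myArray[10][10] = { { 1, 2, 3 } };

        // Write each element as whitespace-separated text.
        std::ofstream out( "myarray.txt" );
        for ( int r = 0; r < 10; ++r )
        {
            for ( int c = 0; c < 10; ++c )
                out << myArray[r][c] << ' ';
            out << '\n';
        }
        out.close();

        // Read them back in the same order.
        std::ifstream in( "myarray.txt" );
        for ( int r = 0; r < 10; ++r )
            for ( int c = 0; c < 10; ++c )
                in >> myArray[r][c];
        return 0;
    }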
Reference: http://www.cplusplus.com/reference/iostream/
Afterthought: IF myArray is not of an elementary type (int[], double[]), but instead represents an array of objects (with virtual members) or pointers, you'll need a means of serializing them. Pointers won't point to valid memory when they are read back in.
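A common pattern is to give each class its own save/load routines that write the members by value, e.g. (a hypothetical sketch; the Point class is mine):

    #include <iostream>

    struct Point
    {
        int x, y;

        // Serialize by value; never write the raw object, which may
        // contain a vtable pointer or heap addresses.
        void save( std::ostream &os ) const { os << x << ' ' << y << '\n'; }
        void load( std::istream &is ) { is >> x >> y; }
    };

Then loop over the array calling save() on the way out and load() on the way back in.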