In your shoes, I would consider the standard scientific data formats, which are much less space- and time-consuming than ASCII, and which (while maybe not quite as bit-efficient as pure, machine-dependent binary formats) still offer standard, documented, portable, and fast libraries to ease the reading and writing of the data.
If you store data in pure binary form, the metadata is crucial to make any sense of the data later (are these numbers single or double precision, or integers, and of what length? what are the arrays' dimensions?), and issues with archiving and retrieving the data together with its metadata can, and in practice occasionally do, make perfectly good datasets unusable -- a real pity and waste.
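To make the danger concrete, here's a small sketch (in Python with NumPy, purely as an illustration -- the same applies in any language) of how raw binary bytes "parse" happily even when your metadata guess is wrong:

```python
import os
import tempfile

import numpy as np

# A 3x4 array of double-precision floats.
data = np.arange(12, dtype=np.float64).reshape(3, 4)

path = os.path.join(tempfile.mkdtemp(), "data.bin")
data.tofile(path)  # raw bytes only: no dtype, no shape, no byte order

# Reading back *requires* out-of-band metadata. Guess the precision
# wrong and the bytes still decode -- into twice as many garbage values:
wrong = np.fromfile(path, dtype=np.float32)

# Only with the correct dtype AND the remembered shape do you recover
# the original array:
right = np.fromfile(path, dtype=np.float64).reshape(3, 4)

print(wrong.size)                      # 24 meaningless singles
print(np.array_equal(right, data))     # True
```

Nothing in the file itself tells you which of the two readings is correct -- that knowledge lives entirely in the (easily lost) metadata.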
CDF, in particular, is "a self-describing data format for the storage and manipulation of scalar and multidimensional data in a platform- and discipline-independent fashion", with many libraries and utilities to go with it. As alternatives, you might also consider NetCDF and HDF -- I'm less familiar with those (and with their tradeoffs, such as flexibility vs size vs speed) but, seeing how widely they're used by scientists in many fields, I suspect any of the three formats could give you very acceptable results.
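If it helps to see what "self-describing" buys you, here's a minimal sketch using NumPy's `.npy` format as a stand-in (the real CDF/NetCDF/HDF libraries go much further, with named dimensions, units, and arbitrary attributes, but the core idea -- the file carries its own metadata -- is the same):

```python
import os
import tempfile

import numpy as np

data = np.linspace(0.0, 1.0, 6, dtype=np.float64).reshape(2, 3)

path = os.path.join(tempfile.mkdtemp(), "data.npy")
np.save(path, data)  # the header records dtype, shape, and byte order

# No out-of-band knowledge needed: the file describes itself.
loaded = np.load(path)
print(loaded.dtype)   # float64
print(loaded.shape)   # (2, 3)
```

Contrast this with the raw-binary case: here a reader years later, on a different machine, recovers the exact array with no extra documentation.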