views: 1531
answers: 4

I have a text file with 4 columns, each column containing 65536 data points. The elements in each row are separated by commas. For example:

X,Y,Z,AU
4010.0,3210.0,-440.0,0.0
4010.0,3210.0,-420.0,0.0
etc.

So I have 65536 rows, each row holding 4 data values as shown above. I want to convert it into a matrix. I tried importing the data from the text file into an Excel file, because that would make it easy to create a matrix, but I lost more than half the data in the process.

+5  A: 

The easiest way to do it would be to use MATLAB's csvread function.
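A minimal sketch, assuming the file is named file.txt (a hypothetical name) and that the one-line header needs to be skipped:

```matlab
% csvread(filename, R, C) starts reading at row offset R and column
% offset C (both zero-based), so R = 1 skips the header line.
data = csvread('file.txt', 1, 0);   % 65536-by-4 numeric matrix
```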

There is also this tool which reads CSV files.

You could also do it yourself without too much difficulty: just loop over each line in the file, split it on commas, and store the values in your array.
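The loop described above could look something like this (a sketch, assuming the file is named file.txt and has exactly one header line):

```matlab
fid = fopen('file.txt', 'rt');
fgetl(fid);                          % discard the header line
data = zeros(65536, 4);              % preallocate the result matrix
row = 1;
line = fgetl(fid);
while ischar(line)                   % fgetl returns -1 at end of file
    data(row, :) = str2double(strsplit(line, ','));
    row = row + 1;
    line = fgetl(fid);
end
fclose(fid);
```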

David Johnstone
+4  A: 

Instead of messing with Excel, you should be able to read the text file directly into MATLAB (using the functions FOPEN, FGETL, FSCANF, and FCLOSE):

fid = fopen('file.dat','rt');  % Open the data file
headerChars = fgetl(fid);      % Read the first line of characters
data = fscanf(fid,'%f,%f,%f,%f',[4 inf]).';  % Read the data into a
                                             % 65536-by-4 matrix
fclose(fid);  % Close the data file
gnovice
+1  A: 

If all the entries in your file are numeric, you can simply use a = load('file.txt'). It should create a 65536-by-4 matrix a. It is even easier than csvread. (Note that you would first have to delete the header line, since load expects purely numeric data.)
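A one-line sketch, assuming the header line has already been stripped from a file named file.txt:

```matlab
% load on an ASCII file returns its numeric contents as a matrix;
% this assumes file.txt contains only the numeric rows (no header).
a = load('file.txt');   % 65536-by-4 double matrix
```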

Dima
A: 

Suggest you familiarize yourself with dlmread and textscan.

dlmread is like csvread, but because it can handle any delimiter (tab, space, etc.), I tend to use it rather than csvread.
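For this file, a dlmread sketch (assuming the name file.txt) would be:

```matlab
% dlmread(filename, delimiter, R, C) uses zero-based offsets,
% so R = 1 skips the header line.
data = dlmread('file.txt', ',', 1, 0);   % 65536-by-4 matrix
```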

textscan is the real workhorse: it has lots of options, it works on open files, and it is a little more robust at handling "bad" input (e.g. nonnumeric data in the file). It can be used like fscanf in gnovice's suggestion, but I think it is faster (don't quote me on that though).
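A textscan sketch for this file, again assuming the name file.txt:

```matlab
fid = fopen('file.txt', 'rt');
% Four floating-point fields per row, comma-delimited, skipping
% the one-line header via the 'HeaderLines' option.
c = textscan(fid, '%f%f%f%f', 'Delimiter', ',', 'HeaderLines', 1);
fclose(fid);
data = [c{:}];   % concatenate the four columns into a 65536-by-4 matrix
```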

Jason S
