tags:

views: 1362

answers: 4

Hi,

I want to create a CSV file using Perl and write some data to it. Is there any way to do that?

Thank You

+10  A: 

You could use Class::CSV.

use Class::CSV;

my $csv = Class::CSV->new(
  fields         => [qw/userid username/]
);

$csv->add_line([2063, 'testuser']);
$csv->add_line({
  userid   => 2064,
  username => 'testuser2'
});

$csv->print();   # prints the CSV to STDOUT

# 2063,testuser
# 2064,testuser2
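If you want the data in an actual file rather than on STDOUT, a minimal sketch (reusing the $csv object above and a made-up output.csv filename):

# Class::CSV's string() returns the accumulated CSV data as one string,
# so you can write it to any file handle you open yourself.
open my $fh, '>', 'output.csv' or die "Cannot open output.csv: $!";
print {$fh} $csv->string();
close $fh or die "Cannot close output.csv: $!";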

Edit: For more libraries, you can search CPAN.

A: 

Check out this tutorial on writing to files: http://www.comp.leeds.ac.uk/Perl/filehandling.html

Tarski
But writing a CSV file is more about proper handling of field separators and escaping characters than about file handling.
+15  A: 

We usually use Text::CSV_XS (which the above-mentioned Class::CSV is based on).

UPDATE: Commenters below also suggest using Text::CSV, which will load Text::CSV_XS or, if that isn't available, fall back on Text::CSV_PP, which has no XS dependency and may be easier to install.
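To answer the original question directly, here is a minimal sketch of writing some rows to a CSV file with Text::CSV; the output.csv filename and the sample rows are made up for illustration:

use Text::CSV;

my $csv = Text::CSV->new({ binary => 1, eol => "\n" })
    or die "Cannot use Text::CSV: " . Text::CSV->error_diag();

open my $fh, '>', 'output.csv' or die "Cannot open output.csv: $!";

# print() quotes and escapes each field as needed, then appends eol.
$csv->print($fh, [ 'userid', 'username' ]);
$csv->print($fh, [ 2063, 'testuser'  ]);
$csv->print($fh, [ 2064, 'testuser2' ]);

close $fh or die "Cannot close output.csv: $!";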

DVK
This is the best option. Text::CSV_XS is highly optimized and full-featured, with a smooth API. We use it all over. The author is also, in my eyes, the most responsive on all of CPAN; he takes pride in the module and provides top-notch support. Text::CSV_PP is also a great fallback if you can't compile the _XS version, and it's there if you need it later. The right thing will just happen these days if you simply use Text::CSV.
Evan Carroll
Agreed with that last sentence: Just use Text::CSV and it will load up Text::CSV_XS if it's available or fall back to Text::CSV_PP if the system doesn't have the XS version installed.
Dave Sherohman
+3  A: 

There is more than one way to do it.

  • DBD::CSV - Use SQL to read and write CSV files (see the sketch after this list).
  • Text::CSV - Build and parse CSV files a line at a time. This is pretty much the gold standard for CSV manipulation.
  • POE::Filter::CSV - Provides a CSV filter for your POE component IO.
  • Data::Tabular::Dumper::CSV - Dump a table directly into a CSV file (objects with the same interface can dump to XML or MS Excel files).
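For example, a rough sketch of the DBD::CSV route; the table and column names here are invented for illustration:

use DBI;

# Connect to a directory of CSV files; each table is one file in f_dir.
my $dbh = DBI->connect('dbi:CSV:', undef, undef, {
    f_dir      => '.',
    csv_eol    => "\n",
    RaiseError => 1,
});

$dbh->do('CREATE TABLE users (userid INTEGER, username CHAR(64))');

my $sth = $dbh->prepare('INSERT INTO users (userid, username) VALUES (?, ?)');
$sth->execute(2063, 'testuser');
$sth->execute(2064, 'testuser2');

$dbh->disconnect;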

There are many others on CPAN, too.

Of course, you could ignore all this and just use a loop, a join and a print:

my @table = (
  [ 'a', 'b', 'c'],
  [  1,   2,   3 ],
  [  2,   4,   6 ],
);

for my $row ( @table ) {
    print join( ',', @$row ), "\n";
}

Just run your script and redirect the output to a file, and bang! You are done. Or you could get crazy, open a file handle, and print directly to a file.

But then you won't handle multi-line fields, or embedded commas, or any of the many other oddities that can pop up in this sort of data processing task. So be careful with this approach.
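To make that concrete, here is a small sketch of what goes wrong with a field that contains a comma, and how Text::CSV quotes it (the sample data is made up):

use Text::CSV;

my @row = ( 2065, 'Smith, John' );

# Naive join: the embedded comma makes the row look like three fields.
print join( ',', @row ), "\n";    # 2065,Smith, John

# Text::CSV quotes the field so it parses back into two fields.
my $csv = Text::CSV->new({ binary => 1 });
$csv->combine(@row) or die $csv->error_diag();
print $csv->string(), "\n";       # 2065,"Smith, John"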

My recommendations:

  • If you are building a POE application, use POE::Filter.
  • If you are a SQL/DBI monster and dream in SQL, or may need to replace CSV output with a real database connection, use DBD::CSV.
  • If you have simple data and are cobbling together a cruddy little throw-away script to format some data for your spreadsheet to digest, use print and join--but only for code with an expected life span of less than 2 hours.
  • If you want to dump a blob of data into a CSV or XML use Data::Tabular::Dumper::CSV.
  • If you are writing something that needs to be stable, maintainable, and fast, and you need maximum control over input and output, use Text::CSV. (Note that POE::Filter::CSV, Data::Tabular::Dumper::CSV, and DBD::CSV all use Text::CSV or Text::CSV_XS for back-end processing.)
daotoad