tags:

views: 65

answers: 3

We need a CSV viewer that can handle 10MM-15MM rows in a Windows environment, where each column has some filtering capability (regex or plain text search is fine).

+3  A: 

I strongly suggest using a database instead and running queries (e.g., with Access). With proper SQL queries you should be able to filter on the columns you need to see, without handling such huge files all at once. You may need to have someone write a script to load each row of the CSV file (and future CSV file changes) into the database.
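A minimal sketch of such an import script, using Python's standard library and SQLite (the sample data and column names are made up for illustration; for real use you would read from the actual CSV file and a file-backed database):

```python
import csv
import io
import sqlite3

# Sample CSV data standing in for the user's large file (hypothetical columns).
sample = "id,name,city\n1,Alice,Boston\n2,Bob,Berlin\n3,Carol,Boston\n"

conn = sqlite3.connect(":memory:")  # use a file path for a persistent database
reader = csv.reader(io.StringIO(sample))
header = next(reader)

# Build the table from the CSV header, then bulk-insert the remaining rows.
cols = ", ".join(f'"{c}" TEXT' for c in header)
placeholders = ", ".join("?" for _ in header)
conn.execute(f"CREATE TABLE rows ({cols})")
conn.executemany(f"INSERT INTO rows VALUES ({placeholders})", reader)
conn.commit()

# Per-column text filtering then becomes a simple SQL query.
matches = conn.execute("SELECT name FROM rows WHERE city LIKE ?", ("Bos%",)).fetchall()
print(matches)  # [('Alice',), ('Carol',)]
```

For tens of millions of rows, wrapping the `executemany` in a single transaction (as above) keeps the import fast, and an index on the columns you filter most often keeps queries responsive.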

Kimball Robinson
[CSVFix](http://code.google.com/p/csvfix/) can generate SQL statements from CSV files.
Brian Nixon
Not what I was hoping for, but probably the best answer anyways :)
Eric Petroelje
A: 

I don't want to be the end user of that app. Store the data in SQL. Surely you can define criteria to query on before generating a .csv file. Give the user an online interface with the column headers and filters to apply. Then generate a query based on the selected filters, providing the user only with the lines they need.

This will save many people time, headaches, and eye strain.

We had this same issue and used a "report builder" to define the criteria for the reports before actually generating the downloadable CSV/Excel file.

The Mirage
A: 

As others have suggested, I would also choose a SQL database. It's already optimized to perform queries over large data sets. There are a couple of embedded databases available, such as SQLite or Firebird (embedded).

http://www.sqlite.org/

http://www.firebirdsql.org/manual/ufb-cs-embedded.html

You can import a CSV file into a SQL database with just a few lines of code and then build SQL queries, instead of writing your own solution for filtering large tabular data.
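Since the question also asks for regex filtering, it's worth noting that SQLite exposes a `REGEXP` operator but ships without an implementation behind it; from Python you can register one yourself. A sketch (table and data are hypothetical, and Python's `re` semantics are assumed to be acceptable):

```python
import re
import sqlite3

conn = sqlite3.connect(":memory:")

# SQLite's "X REGEXP Y" syntax calls a user function regexp(Y, X),
# i.e. pattern first, value second -- so we register one accordingly.
conn.create_function(
    "REGEXP", 2,
    lambda pattern, value: re.search(pattern, value or "") is not None,
)

conn.execute("CREATE TABLE rows (name TEXT, email TEXT)")
conn.executemany(
    "INSERT INTO rows VALUES (?, ?)",
    [("Alice", "alice@example.com"),
     ("Bob", "bob@test.org"),
     ("Carol", "carol@example.com")],
)

# Regex filtering on a column, expressed as ordinary SQL.
hits = conn.execute(
    "SELECT name FROM rows WHERE email REGEXP ?", (r"@example\.com$",)
).fetchall()
print(hits)  # [('Alice',), ('Carol',)]
```

This gives per-column regex search over the imported data without any custom viewer code.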

dwich