views: 391 · answers: 6

Hi All,

I have a CSV file with 550,000+ rows that I need to import into Access, but when I try, it throws an error saying the file is too large (1.7 GB). Can you recommend a way to get this data into Access?

Thanks,

Darryl

+1  A: 

Try linking instead of importing ("Get External Data" -> "Link Tables" in Access 2003). That leaves the data in the CSV file and reads from the file directly, in place. Linking doesn't impose a size limit (at least not anywhere near 1.7 GB). It may limit some of your read/update operations, but it will at least get you started.

daniel
+1  A: 

I'd either try the CSV ODBC connector, or otherwise import it first into a less limited database (MySQL, SQL Server) and import it into Access from there.

It seems that some versions of Access have a hard 2 GB limit on MDB files, so you might run into trouble with that anyway.

Good luck.
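The "stage it in a less limited database" route can be sketched in a few lines. This is a minimal sketch only: sqlite3 stands in for the real target server (SQL Server, MySQL, ...), and the table name, column names, and the tiny in-memory sample CSV are all made up for illustration; with pyodbc or mysql-connector the pattern is the same.

```python
import csv
import io
import sqlite3

# Stand-in for the 550k-row CSV file; in practice you would
# open the real file with open("yourfile.csv", newline="").
sample = io.StringIO("id,name\n1,alpha\n2,beta\n3,gamma\n")

# sqlite3 stands in here for a less limited server-side database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE staging (id INTEGER, name TEXT)")

reader = csv.reader(sample)
next(reader)  # skip the header row
conn.executemany("INSERT INTO staging (id, name) VALUES (?, ?)", reader)
conn.commit()

row_count = conn.execute("SELECT COUNT(*) FROM staging").fetchone()[0]
print(row_count)  # 3 rows in the toy sample
```

Once the data is in the staging database, Access can link to it over ODBC rather than holding the 1.7 GB itself.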

extraneon
Yep - went for MS SQL instead - thanks.
Gogster
A: 

I remember that Access has a size limitation of around 2 GB. Moving to the free SQL Server Express (limited to 4 GB) or the free MySQL (no size limitation) could be easier.

Philippe Grondier
Jet 4 and ACE are limited to 2 GB; Jet 3.x (Access 97) was 1 GB.
David-W-Fenton
A: 

You can also use an ETL tool. Kettle is an open source one (http://kettle.pentaho.org/) and really quite easy to use. To import a file into a database requires a single transformation with 2 steps: CSV Text Input and Table Output.

Chris J
A: 

Another option would be to do away with the standard import functions and write your own. I have done this once before, when some specific logic needed to be applied to the data before import. The basic structure is:

Open the file. Get the first line. Loop through it until the end of the line; when you find a comma, move on to the next field. Put the record into the database. Get the next line, and repeat.

I wrapped it up in a transaction that committed every 100 rows, as I found that improved performance in my case; whether it helps will depend on your data.

However, I would say that linking the data, as others have suggested, is the best solution; this is just an option if you absolutely must have the data in Access.
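The loop above, with a commit every 100 rows, can be sketched as follows. This is a hedged illustration, not the answerer's actual code: the original would have been Access VBA, so sqlite3 and the table/column names here are stand-ins, and a tiny generated CSV replaces the real 1.7 GB file so the sketch is self-contained.

```python
import csv
import io
import sqlite3

BATCH_SIZE = 100  # commit every 100 rows, as described above

# Small in-memory CSV standing in for the real 550k-row file.
sample = io.StringIO("\n".join(f"{i},item{i}" for i in range(250)))

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (id INTEGER, label TEXT)")

pending = 0
for row in csv.reader(sample):
    # The csv module handles the field-splitting-on-commas step.
    conn.execute("INSERT INTO records VALUES (?, ?)", row)
    pending += 1
    if pending % BATCH_SIZE == 0:
        conn.commit()  # flush a full batch of 100 rows
conn.commit()  # commit any remaining partial batch

total = conn.execute("SELECT COUNT(*) FROM records").fetchone()[0]
print(total)
```

Batching commits amortizes the per-transaction overhead; committing every single row is usually the slowest option for bulk loads.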

Kevin Ross
+1  A: 

Why are you using Access for huge files? Use SQL Server Express or Firebird instead.

drorhan
While I'm a big Jet/ACE/Access promoter, I have to say that any data set that exceeds 1GB makes me nervous. The table overhead for 1.7GBs of data is likely to put you right up against the 2GB hard file-size limit even after a compact.
David-W-Fenton