Hi,
Consider the following Python code, which uses a psycopg2 cursor
object (some column names were changed or omitted for clarity):
filename = 'data.csv'
file_columns = ('id', 'node_id', 'segment_id', 'elevated',
                'approximation', 'the_geom', 'azimuth')
with open(filename) as f:
    self._cur.copy_from(file=f, table=self.new_table_name,
                        columns=file_columns)
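For context, copy_from's defaults are tab-delimited fields with \N for
NULLs (sep='\t', null='\\N'), so the input file must match that layout
unless other values are passed. A minimal sketch that writes a small
sample file in the expected format (the row values here are made up,
including the geometry hex string):

```python
import csv

# copy_from reads tab-separated fields with \N for NULL by default;
# write two hypothetical rows in that layout.
rows = [
    (1, 10, 100, 't', 'f', '0101000020E6100000', 45.0),
    (2, 11, 101, 'f', '\\N', '0101000020E6100000', 90.0),
]
with open('data.csv', 'w', newline='') as f:
    writer = csv.writer(f, delimiter='\t')
    writer.writerows(rows)
```

A file produced this way can then be handed to copy_from as in the
snippet above.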
- The database is located on a remote machine on a fast LAN.
- Using \COPY from psql in a bash session works very fast, even for
large (~1,000,000-line) files.
This code is very fast for 5,000 lines, but when data.csv grows
beyond 10,000 lines, the program freezes completely.
Any thoughts / solutions?
Adam