I am trying to find a fast way, i.e. fast and not a lot of code, to get CSV data into a Postgres database. I am reading the file in Python with csv.DictReader, which works fine. Then I need to somehow generate code that takes the dicts and puts them into a table. I want to do this automatically, as my tables often have hundreds of variables. (I don't want to read directly into Postgres because in many cases I must transform the data first, and Python is good for that.)
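Conceptually I think I need something like the sketch below, though I'm not sure it is right. It derives the column list from the CSV header and builds one parameterized INSERT to reuse for every row (assumptions: the header names are valid column names, the target table already exists, and psycopg2's sql module is used for identifier quoting):

import csv
from psycopg2 import sql

def load_csv(cur, table, path):
    with open(path, newline='') as f:
        reader = csv.DictReader(f)
        cols = reader.fieldnames
        # Build "INSERT INTO t (c1, c2, ...) VALUES (%s, %s, ...)" once,
        # quoting the identifiers safely, then reuse it for every row
        stmt = sql.SQL("INSERT INTO {} ({}) VALUES ({})").format(
            sql.Identifier(table),
            sql.SQL(', ').join(map(sql.Identifier, cols)),
            sql.SQL(', ').join(sql.Placeholder() * len(cols)))
        for row in reader:
            # any Python-side transformation of the values would go here
            cur.execute(stmt, [row[c] for c in cols])

Something like load_csv(dict_cur, 'statesp020', '/home/matthew/Downloads/us_gis_data/statesp020.csv') followed by a commit is what I have in mind (the table name is just a guess).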
This is some of what I have got:
import csv
import sys
import psycopg2
import psycopg2.extras

csvReader = csv.DictReader(open('/home/matthew/Downloads/us_gis_data/statesp020.csv', newline=''), delimiter=',')

ConnectionString = "host='localhost' dbname='mydb' user='postgres' password='######'"
try:
    connection = psycopg2.connect(ConnectionString,
                                  connection_factory=psycopg2.extras.DictConnection)
    print("connecting")
except psycopg2.Error:
    sys.exit("did not work")
# Create a test table with some data
dict_cur = connection.cursor()
#dict_cur.execute("CREATE TABLE test (id serial PRIMARY KEY, num integer, data varchar);")
x = 0
for i in range(1, 50):
    x = x + 1
    print(x)
    dict_cur.execute("INSERT INTO test (num, data) VALUES (%s, %s)", (x, 3.6))
connection.commit()
### how do I create the table and insert the values using the DictReader?
dict_cur.execute("SELECT * FROM test")
for k in range(0,x+1):
rec = dict_cur.fetchone()
print rec['num'], rec['data']
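Since speed matters, I have also been wondering whether I could generate the table itself from the same header (every column as text, to be refined later) and then bulk-load with COPY instead of row-by-row INSERTs. A rough sketch of that idea, with the same caveats (header names trusted as identifiers, table name is a placeholder):

import csv
import io

def create_and_copy(cur, table, path):
    with open(path, newline='') as f:
        reader = csv.DictReader(f)
        cols = reader.fieldnames
        # Create every column as text; real types would need a mapping
        # or ALTER TABLE later. Header names are quoted but not escaped.
        cur.execute("CREATE TABLE {} ({})".format(
            table, ", ".join('"{}" text'.format(c) for c in cols)))
        # Stream the (possibly transformed) rows through an in-memory CSV
        # and COPY them in one round trip, which should be much faster
        # than one INSERT per row
        buf = io.StringIO()
        writer = csv.writer(buf)
        for row in reader:
            writer.writerow([row[c] for c in cols])
        buf.seek(0)
        cur.copy_expert("COPY {} FROM STDIN WITH (FORMAT csv)".format(table), buf)

The appeal is that only the header drives the SQL, so hundreds of columns cost no extra code. Is this a reasonable direction?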