I have an application where our clients upload a CSV file to our server. We then process the file and put its data into our database. We're running into some character-set issues, especially when we're dealing with JSON: in particular, characters that weren't converted to UTF-8 are breaking IE on JSON responses.

Is there a way to convert the uploaded CSV file to UTF-8 before we start processing it? Is there a way to determine the character encoding of an uploaded file? I've played with iconv a bit, but we're not always sure what encoding the uploaded file will have. Thanks.
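
For context, the kind of per-line conversion we've been playing with looks roughly like this, with the source encoding hard-coded as a guess, which is exactly the part we can't rely on:

require 'iconv'

# "ISO-8859-1" is just a placeholder guess -- we don't actually know
# what encoding the uploaded file uses.
File.open("path/to/uploaded.csv").each_line do |line|
  utf8_line = Iconv.conv('UTF-8', 'ISO-8859-1', line)
  # process utf8_line...
end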

A: 

This solution might not be ideal, but it should do the job.

First, the ingredients:

  • chardet (sudo gem install chardet)
  • fastercsv (sudo gem install fastercsv)

Now the actual code (not tested):

require 'rubygems'
require 'UniversalDetector' # provided by the chardet gem
require 'fastercsv'
require 'iconv'

file_to_import = File.open("path/to/your.csv")

# Guess the encoding from a sample -- here, the first 100 bytes
sample = file_to_import.read(100)
chardet = UniversalDetector::chardet(sample)
if chardet['confidence'] > 0.7
  charset = chardet['encoding']
else
  raise 'You better check this file manually.'
end

# Rewind, otherwise each_line would pick up where the sample read left off
file_to_import.rewind
file_to_import.each_line do |l|
  # Convert each line to UTF-8, then parse it as a single CSV row
  converted_line = Iconv.conv('utf-8', charset, l)
  row = FasterCSV.parse(converted_line)[0]
  # do the business here
end
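
If you'd rather physically convert the upload into a UTF-8 copy before any further processing (as the question asks), the same idea works file-to-file; a rough, equally untested variation using the charset detected above:

# Write a UTF-8 copy of the upload; the output path is just an example.
File.open("path/to/your.utf8.csv", "w") do |out|
  file_to_import.rewind
  file_to_import.each_line do |l|
    out.write(Iconv.conv('utf-8', charset, l))
  end
end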
Milan Novota
Is 100 characters enough?
r-dub
Just change it to whatever works for the files you are working with. You can analyze the whole file if it's reasonably small.
Milan Novota
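
For what it's worth, the whole-file variant Milan mentions would look roughly like this (same assumed chardet API as in the answer, and only sensible for files small enough to read into memory):

contents = File.read("path/to/your.csv")
chardet = UniversalDetector::chardet(contents)
charset = chardet['encoding'] if chardet['confidence'] > 0.7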