I am using the following method to read the content of a CSV file into a DataTable:
/// <summary>
/// Reads data from a CSV file into a DataTable.
/// </summary>
/// <param name="filePath">Path to the CSV file</param>
/// <returns>DataTable filled with the data read from the CSV file</returns>
public DataTable ReadCsv(string filePath)
{
    if (string.IsNullOrEmpty(filePath))
    {
        log.Error("Invalid CSV file name.");
        return null;
    }

    try
    {
        DataTable dt = new DataTable();
        string folder = FileMngr.Instance.ExtractFileDir(filePath);
        string fileName = FileMngr.Instance.ExtractFileName(filePath);
        string connectionString = string.Concat(
            @"Driver={Microsoft Text Driver (*.txt; *.csv)};Dbq=", folder, ";");

        using (OdbcConnection conn = new OdbcConnection(connectionString))
        {
            string selectCommand = string.Concat("select * from [", fileName, "]");
            using (OdbcDataAdapter da = new OdbcDataAdapter(selectCommand, conn))
            {
                da.Fill(dt);
            }
        }
        return dt;
    }
    catch (Exception ex)
    {
        log.Error("Error loading CSV content", ex);
        return null;
    }
}
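For context, a call to this method looks roughly like the following (the CsvImporter class name and the file path are only illustrative; they are not part of the code above):

CsvImporter importer = new CsvImporter();
DataTable table = importer.ReadCsv(@"C:\Data\Example.csv");
if (table != null)
{
    // Columns and rows are inferred by the text driver from the file header and schema.ini.
    Console.WriteLine("Columns: " + table.Columns.Count + ", Rows: " + table.Rows.Count);
}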
This method works when the CSV file is UTF-8 encoded and there is a schema.ini file that looks something like this:
[Example.csv]
Format=Delimited(,)
ColNameHeader=True
MaxScanRows=2
CharacterSet=ANSI
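If it helps, a schema.ini like the one above can also be written from code into the same folder as the CSV file before the connection is opened; here is a rough sketch (the WriteSchemaIni helper name is just illustrative, not part of my existing code):

private void WriteSchemaIni(string folder, string fileName)
{
    // The Microsoft Text Driver looks for schema.ini in the same directory as the CSV file.
    string[] lines =
    {
        "[" + fileName + "]",
        "Format=Delimited(,)",
        "ColNameHeader=True",
        "MaxScanRows=2",
        "CharacterSet=ANSI"
    };
    System.IO.File.WriteAllLines(System.IO.Path.Combine(folder, "schema.ini"), lines);
}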
However, if the CSV file contains German characters (umlauts such as ä, ö, ü, ß) and is saved with Unicode encoding, the method does not read the data correctly.
What modifications can I make to the above method so that it reads Unicode CSV files? If that is not possible with this approach, what CSV-reading code would you suggest instead?