I'm currently coding a file reader for fixed-width tables in VB.NET, and the compiled application seems to be sucking down memory like there's no tomorrow. I'm working with a series of ~50 megabyte files, but after running through several of them, the process grows to 200+ megabytes of RAM, which is way more than it should need.
I've done some poking around, and I think the issue is the call to NewRow(), but don't take my word for it.
Does anyone have some tips for optimizing this? If the problem really is the NewRow() call, is there a way to clear those rows out between files?
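To make that last part concrete, the kind of cleanup I have in mind between files looks something like the sketch below. I haven't verified that any of this actually releases the memory:

'Sketch only: "table" here is whatever LoadFixedWidthFileToDataTable returned
table.Rows.Clear()   'drop all the rows
table.Dispose()      'then discard the table itself
table = Nothing
GC.Collect()         'heavy-handed, but just to see if the memory comes back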
The code follows:
Function LoadFixedWidthFileToDataTable(ByVal filepath As String, ByRef Colnames() As String, ByRef colwidth() As Integer) As DataTable
    Dim filetable As New DataTable
    For Each name As String In Colnames
        filetable.Columns.Add(name)
    Next
    Dim loadedfile As StreamReader
    Try
        loadedfile = New StreamReader(filepath)
    Catch io As IOException
        MsgBox(io.Message)
        Return Nothing
    End Try
    'Note: this reads and discards the file's first line before the loop below
    Dim line As String = loadedfile.ReadLine
    'This detached row is never added to the table; each loop pass makes its own
    Dim filerow As DataRow = filetable.NewRow
    Dim i As Integer = 0
    While Not loadedfile.EndOfStream
        line = loadedfile.ReadLine
        filerow = filetable.NewRow
        i = 0
        For Each colsize As Integer In colwidth
            Try
                filerow(i) = line.Substring(0, colsize)
                line = line.Remove(0, colsize)
            Catch ex As ArgumentOutOfRangeException 'If the line doesn't match array params
                Exit For
            End Try
            i += 1
        Next
        filetable.Rows.Add(filerow)
    End While
    loadedfile.Close()
    Return filetable
End Function
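For context, the per-file call pattern looks roughly like this. The directory, column names, and widths are placeholders for this post, not my real layout:

'Rough shape of the calling loop (paths, names, and widths are made up):
Dim names() As String = New String() {"Field1", "Field2", "Field3"}
Dim widths() As Integer = New Integer() {10, 25, 8}
For Each filename As String In Directory.GetFiles("C:\data", "*.txt")
    Dim table As DataTable = LoadFixedWidthFileToDataTable(filename, names, widths)
    If table IsNot Nothing Then
        '...use the table, then let it fall out of scope...
    End If
Next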