I need to read a text file line by line (log files from a server), and the files are big (about 150-200 MB). I am using StreamReader, and it works great for "small" files like 12 MB, but not for files this big. After some time the file does load and show in my DataGridView, but it takes a huge amount of memory. I am also using bindingSource.Filter on this DataGridView (a textbox where, as the user types, one column is compared against the text and rows that don't contain it are hidden), and with big files that is useless too. So I want to ask what the best solution is for me.
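For context, the kind of filter expression involved is a RowFilter-style string (BindingSource.Filter forwards to the underlying view's RowFilter). This is a minimal sketch, not my actual code; the column name "Message" and the BuildFilter helper are illustrative:

```csharp
using System;
using System.Data;

class FilterDemo
{
    // Builds a RowFilter-style expression of the kind BindingSource.Filter
    // accepts; single quotes are doubled so user input cannot break it.
    public static string BuildFilter(string column, string text)
    {
        if (string.IsNullOrEmpty(text))
            return null; // empty filter = show all rows
        return string.Format("{0} LIKE '%{1}%'", column, text.Replace("'", "''"));
    }

    static void Main()
    {
        var table = new DataTable();
        table.Columns.Add("Message", typeof(string));
        table.Rows.Add("server started");
        table.Rows.Add("connection error");
        table.Rows.Add("server stopped");

        // Same mechanism the grid uses: the filter is evaluated against
        // every row of the in-memory table -- which is why it gets slow
        // once the whole 200 MB file has been loaded.
        var view = new DataView(table) { RowFilter = BuildFilter("Message", "server") };
        Console.WriteLine(view.Count); // prints 2
    }
}
```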

I looked around and found some solutions, but I need help deciding which is best for me and with implementing it (or whether there is something else entirely):

  1. Load the data in the background and show it in real time. I am not really sure how to do that, and I don't know what to do about filtering with this solution.
  2. Maybe somehow upgrade the StreamReader? Or write my own method for reading lines from the file with binary readers?
  3. I found something about memory-mapped files in C# 4.0, but I can't use 4.0. Could this feature help?

Thanks for any help.

A: 

Why not use pagination?

Maybe the algorithm below will help; I am not sure, but it's a thought.

1- Get the file size.

2- Divide the file size by 10 MB to get the total number of pages.

3- Read the data in 10 MB chunks (how exactly to do that, I'm not sure yet).

4- When the user moves from one page to another, load another 10 MB.

Yes, it will have a shortcoming in the case of filtering.
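The steps above could be sketched like this (the chunk size is a parameter for illustration; note that a fixed byte offset can split a line in half, so a real implementation would scan forward to the next newline before handing the chunk to the grid):

```csharp
using System;
using System.IO;

static class ChunkPager
{
    // Steps 1-2: total pages = ceiling(file size / chunk size).
    public static int GetTotalPages(string path, int chunkSize)
    {
        long size = new FileInfo(path).Length;   // step 1: get the file size
        return (int)((size + chunkSize - 1) / chunkSize);
    }

    // Steps 3-4: read exactly one chunk when the user changes page,
    // so at most one chunk is ever in memory.
    public static byte[] ReadChunk(string path, int page, int chunkSize)
    {
        using (var fs = new FileStream(path, FileMode.Open, FileAccess.Read))
        {
            fs.Seek((long)page * chunkSize, SeekOrigin.Begin);
            var buffer = new byte[chunkSize];
            int read = fs.Read(buffer, 0, buffer.Length);
            Array.Resize(ref buffer, read);      // last page may be shorter
            return buffer;
        }
    }
}
```

With a 10 MB chunk size (`10 * 1024 * 1024`), a 200 MB log comes out to 20 pages.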

EDIT

1- You can also create a web service if you don't want to download the whole file.

2- In the web service you can make use of the LogParser component by Microsoft. You can program against the LogParser API.

3- You can also fire SQL-like queries (such as SELECT statements) using LogParser.

See this Download Link.
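For example, with the LogParser command-line tool and its TEXTLINE input format you can run a SQL-like query directly against a plain log file (the file name and pattern here are illustrative):

```
LogParser.exe -i:TEXTLINE "SELECT Text FROM server.log WHERE Text LIKE '%error%'"
```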

saurabh
I like my filtering, and if I use pagination I think the filtering will be much more complicated, but I think it's the best way. I just want to know if there is another way. Thanks to you both.
Bibo
When I want to use filtering, I would have to load the file again and compare strings, right?
Bibo
Yup, that's why I mentioned that it has some shortcomings with filtering.
saurabh
A: 

Load the data in the background and show it in real time. I am not really sure how to do that, and I don't know what to do about filtering with this solution.

This is no help. It will still consume a lot of memory in the background thread.

Maybe somehow upgrade the StreamReader? Or write my own method for reading lines from the file with binary readers?

Still no help: once you read the whole file into memory it will, well, consume memory.

I think you get the point. Don't load the whole file into memory: load only chunks of it. Use paging. You cannot show 200 MB worth of data on a single screen anyway, so load only the portion you need to show on the screen. Basically you need to implement the following function:

public IEnumerable<string> ReadFile(int page, int linesPerPage, out int totalLines)
{
    ...
}

The Skip and Take extension methods could be helpful here.
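A minimal sketch of that function (with the file path added as a parameter), kept .NET 3.5-friendly since the question rules out 4.0, where File.ReadLines would otherwise do the lazy enumeration:

```csharp
using System.Collections.Generic;
using System.IO;
using System.Linq;

static class FilePager
{
    // Lazy line iterator: only one line is held in memory at a time
    // while enumerating, and the reader is disposed when enumeration ends.
    static IEnumerable<string> ReadAllLines(string path)
    {
        using (var reader = new StreamReader(path))
        {
            string line;
            while ((line = reader.ReadLine()) != null)
                yield return line;
        }
    }

    // The function above, sketched with Skip/Take. Counting totalLines
    // costs one extra full pass over the file.
    public static List<string> ReadFile(string path, int page,
                                        int linesPerPage, out int totalLines)
    {
        totalLines = ReadAllLines(path).Count();
        return ReadAllLines(path)
               .Skip(page * linesPerPage)   // skip the earlier pages
               .Take(linesPerPage)          // keep exactly one page
               .ToList();                   // materialize; releases the file
    }
}
```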

Darin Dimitrov
And what should I do about filtering? As I mentioned, I would have to load the file again and compare strings, right? When the user changes (adds, deletes) a letter, do I have to start loading the file again and decide which rows to show and which not?
Bibo
You could add an additional parameter to the ReadFile function that allows you to filter lines.
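A sketch of that suggestion: the same lazy reader with a filter parameter applied before paging, so page numbers index into the filtered lines (two passes over the file, one to count matches and one to fetch the page; nothing but one page is ever materialized):

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;

static class FilteredPager
{
    // Lazy line iterator, same as in the unfiltered sketch.
    static IEnumerable<string> ReadAllLines(string path)
    {
        using (var reader = new StreamReader(path))
        {
            string line;
            while ((line = reader.ReadLine()) != null)
                yield return line;
        }
    }

    // ReadFile with the suggested filter parameter.
    public static List<string> ReadFile(string path, int page, int linesPerPage,
                                        string filter, out int totalLines)
    {
        Func<IEnumerable<string>> matching = () =>
            string.IsNullOrEmpty(filter)
                ? ReadAllLines(path)
                : ReadAllLines(path).Where(l => l.Contains(filter));

        totalLines = matching().Count();              // pass 1: count matches
        return matching().Skip(page * linesPerPage)   // pass 2: fetch one page
                         .Take(linesPerPage)
                         .ToList();
    }
}
```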
Darin Dimitrov
Ok, thanks, I will try it.
Bibo
A: 

Okay, so I am implementing paging: I read 5k lines of the text file, then after clicking a button the next lines, and so on. I am using BaseStream.Position to save the position where reading should resume, but I would like to use some function that saves the number of lines instead; mainly, I want a method that starts reading from an exact line, but I can't find anything like that for StreamReader. Is there something like that?
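For reference, as far as I can tell StreamReader has no built-in way to start at an exact line; a line-indexed sketch has to read and discard the preceding lines (O(n), but no byte bookkeeping), which is why I used BaseStream.Position in the first place:

```csharp
using System.Collections.Generic;
using System.IO;

static class LineSeeker
{
    // Read `count` lines starting at zero-based line `startLine`.
    // There is no line-based Seek on StreamReader, so the lines before
    // the start point are read and thrown away.
    public static List<string> ReadLinesFrom(string path, int startLine, int count)
    {
        var result = new List<string>();
        using (var reader = new StreamReader(path))
        {
            for (int i = 0; i < startLine; i++)
                if (reader.ReadLine() == null)
                    return result;               // file is shorter than startLine

            string line;
            while (result.Count < count && (line = reader.ReadLine()) != null)
                result.Add(line);
        }
        return result;
    }
}
```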

Bibo