I have an app which I created to copy users (10,000+) from one domain and create each user in another domain with custom properties.

The app seems to run fine until it hits about 1.7 GB of memory. I know there is a limit of 2 GB per process on 32-bit machines, but I am running this on a copy of Windows Server 2008 x64 with 24 GB of RAM.

My app does not crash but completes before it should (around 3,000 users). Maybe a memory limitation is not my problem here, but it was the first thing that stood out when comparing my app to a simpler app that just counts the users as it loops through.

I have my project set to "Any CPU" and it shows in Task Manager without the *32 flag.

Can anyone help me to understand what is going on?

Thanks

+1  A: 

Does any single data structure in your program exceed 2 GB? The .NET runtime can access more than 2 GB, but no single object can be larger than 2 GB in size. For example, you can't allocate an array of bytes that's larger than 2 GB.

This can trip you up when using .NET collections, in particular the generic Dictionary and HashSet collection types.

Are you getting the out of memory exception when trying to add something to a collection?
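
As an illustration of that per-object limit, here is a minimal sketch (the array size is arbitrary and not taken from the question); on the .NET versions current when this was asked, the allocation fails even with plenty of free physical RAM:

class SingleObjectLimitSketch
{
    static void Main()
    {
        try
        {
            // ~2.4 GB in a single object: over the per-object limit even on x64.
            // (Later runtimes can allow this via gcAllowVeryLargeObjects.)
            long[] huge = new long[300000000];
            System.Console.WriteLine(huge.Length);
        }
        catch (System.OutOfMemoryException)
        {
            System.Console.WriteLine("Single-object limit hit, not the total process limit.");
        }
    }
}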

Jim Mischel
Also, correct me please if I'm wrong, but not only can the .NET runtime not allocate more than 2 GB per object, it also needs to fit an object into a contiguous block of memory, so sometimes even 2 GB might not be possible. I'm not totally sure about this; Jon Skeet could shed some light on it :)
lubos hasko
You are correct. You'll need a contiguous block of memory, so it's unlikely you'll get close to 2 GB on a 32-bit system. There's also some allocation overhead. The largest byte[] you can allocate, for example, is 2,147,483,591 bytes. (Experimental result.)
Jim Mischel
A: 

The bulk of my application is looping through all 10,000+ users using the DirectorySearcher; then for each DirectoryEntry found I create an account using the following:

using (DirectoryEntry newUser = root.Children.Add("LDAPPATH", "user")) // Add takes a relative name and a schema class
{
      // Assign properties to the newUser
      newUser.CommitChanges();
      // Write the new username to a file using a StreamWriter
}

I then close the StreamWriter and display the number of users that have been created.

I have the PageSize of the DirectorySearcher set to 110000 to get all the results. My app just completes as if it had finished, but only reports 3500 users created.

I created a simple test using the same code for the DirectorySearcher, and instead of creating the accounts I just incremented a counter. The counter reached the number I expected, so I think it must be something with the account creation or the logging to file.
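
For reference, a minimal sketch of what such a loop might look like with explicit disposal (the filter, attribute names, and log path are placeholders, not taken from the original post). Failing to dispose the SearchResultCollection and each DirectoryEntry is a common cause of steadily growing memory in System.DirectoryServices code, because they hold unmanaged ADSI resources:

using System.DirectoryServices;
using System.IO;

class CopyUsersSketch
{
    static void Run(DirectoryEntry sourceRoot, DirectoryEntry destRoot, string logPath)
    {
        using (StreamWriter log = new StreamWriter(logPath))
        using (DirectorySearcher searcher = new DirectorySearcher(sourceRoot, "(objectClass=user)"))
        {
            searcher.PageSize = 1000; // paging fetches results in chunks

            // SearchResultCollection holds unmanaged handles and must be disposed.
            using (SearchResultCollection results = searcher.FindAll())
            {
                foreach (SearchResult result in results)
                {
                    string name = (string)result.Properties["sAMAccountName"][0];

                    // Children.Add takes a relative name and a schema class.
                    using (DirectoryEntry newUser = destRoot.Children.Add("CN=" + name, "user"))
                    {
                        newUser.Properties["sAMAccountName"].Value = name;
                        newUser.CommitChanges();
                        log.WriteLine(name);
                    }
                }
            }
        }
    }
}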

A: 

What is your source for the records? Can you use a connected DataReader type model rather than a disconnected DataTable/DataSet type model to avoid keeping more than the current record in memory?
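
As an illustration (the original post doesn't say what the record source is; the SQL table and column below are hypothetical), a connected reader streams one record at a time instead of materializing the whole result set:

using System.Data.SqlClient;

class StreamRowsSketch
{
    static void Process(string connectionString)
    {
        using (SqlConnection conn = new SqlConnection(connectionString))
        using (SqlCommand cmd = new SqlCommand("SELECT UserName FROM Users", conn))
        {
            conn.Open();
            // A SqlDataReader keeps only the current row in memory,
            // unlike a DataTable/DataSet, which loads the entire result set.
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    string userName = reader.GetString(0);
                    // Process one user at a time here.
                }
            }
        }
    }
}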

I doubt you just happen to have a server with 24 GB of RAM lying around not doing anything else. It must have some job to do that requires the RAM be available for that purpose. Just because you have that much RAM on the machine doesn't necessarily mean it's available, or that it's a good idea to rely on it.

Joel Coehoorn
Very good point...
Mark Brittingham
Not a very good point, because the question was about memory limits. He is trying to understand why his current implementation doesn't work.
lubos hasko
A: 

Whether or not you should be hitting a limit, it sounds to me like you're holding onto references to some objects that you don't really need.

If you don't know why you're consuming so much memory, you might want to try profiling your memory usage to see if you really need all of it. There are many options, but the Microsoft CLR Profiler is free and would do what you need.
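
As a quick first check before a full profiler run, a hypothetical sketch that logs the managed heap size as the loop progresses (note that GC.GetTotalMemory only reports managed memory, so unmanaged allocations, such as those made by DirectoryServices COM objects, won't show up here):

using System;

class MemoryLogSketch
{
    static long processed;

    // Call this once per user processed; it logs every 'interval' users.
    static void LogEvery(int interval)
    {
        processed++;
        if (processed % interval == 0)
        {
            // Forcing a collection gives a rough view of live managed memory;
            // a steadily climbing number suggests objects are being retained.
            long bytes = GC.GetTotalMemory(true);
            Console.WriteLine("{0} users processed, ~{1:N0} MB managed heap",
                              processed, bytes / (1024 * 1024));
        }
    }
}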

Jon Norton