
C# Programming

Confused about memory usage.

I have an application (windows service) that in the beginning reads data
rapidly into a collection of large object trees.  Each object tree in
the collection is about 100mb and typically there are 6-20 object trees
in the collection.  When the application starts and the data is being
read in, I watch the memory generations via the Process Explorer
(basically .NET CLR Memory perf counters).  What I see is a bit
confusing.  Most of the memory gets promoted from Gen0 to Gen1 to Gen2
(and to the Large Object Heap) in fairly fast succession.  When all
the database information has been read in, the memory usage settles at
the following numbers:

Gen0 1,048,576
Gen1 67,044
Gen2 571,410,644
LOH  71,653,000
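As background on why part of the data shows up in the LOH counter: the CLR allocates objects of roughly 85,000 bytes or more directly on the Large Object Heap rather than in Gen0, and GC.GetGeneration reports LOH objects as generation 2.  A minimal sketch (the array sizes here are illustrative, not taken from the service):

```csharp
using System;

class LohSketch
{
    static void Main()
    {
        var small = new byte[80_000]; // below the ~85,000-byte threshold: normal Gen0 allocation
        var large = new byte[90_000]; // above the threshold: allocated directly on the LOH

        Console.WriteLine(GC.GetGeneration(small)); // 0 -- no collection has happened yet
        Console.WriteLine(GC.GetGeneration(large)); // 2 -- the LOH is logically part of Gen2
    }
}
```

So a 100mb object tree made of many small nodes mostly lands in the ordinary generations, while only its large arrays (if any) land in the LOH.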

Then, after about 10 minutes, I attempt to release the memory by
setting all the root objects to null, followed by GC.Collect():

for (int i = 0; i < collection.Count; i++)
        collection[i] = null;

Console.WriteLine("Before " + GC.GetTotalMemory(false));
GC.Collect();
GC.WaitForPendingFinalizers();
Console.WriteLine("After " + GC.GetTotalMemory(true));


I know the root objects are collected, because their finalizers
write entries to the log.  Now the memory looks like this:

Gen0 1,048,576
Gen1 2,940
Gen2 518,267,836
LOH  71,556,328

So basically, Gen2 decreased a bit, while none of the other generations
changed much.
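One detail that can account for Gen2 barely moving here: because these objects have finalizers (they log when collected), each one survives the collection in which it becomes unreachable and is reclaimed only on a later collection, after its finalizer has run.  A small sketch of that two-step reclamation (the Node class is a stand-in for one of the tree objects; the long weak reference is just a way to observe liveness):

```csharp
using System;

class Node
{
    ~Node() { } // a finalizer, like the ones logging in the service
}

class FinalizerSketch
{
    static void Main()
    {
        // trackResurrection: true keeps the weak reference valid until the
        // object's memory is actually reclaimed, not merely until it has
        // been queued for finalization.
        var weak = new WeakReference(new Node(), trackResurrection: true);

        GC.Collect();                   // pass 1: Node is queued for finalization, memory not yet freed
        Console.WriteLine(weak.IsAlive);

        GC.WaitForPendingFinalizers();  // let the finalizer run
        GC.Collect();                   // pass 2: the memory is reclaimed
        Console.WriteLine(weak.IsAlive);
    }
}
```

This is why a single GC.Collect() after nulling the roots is not enough to see finalizable objects leave the heap.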

In addition, I check the memory usage via the GC.GetTotalMemory method.
Before the first object is collected, it reports memory usage of
506,804,036 bytes.  With every GC.Collect()/GC.WaitForPendingFinalizers()
call, the value decreases by about 5-10mb.  After the final GC is
forced, the memory is reported at 435,156,904.  I am not sure where this
number comes from, but it does not jibe with what the performance
counters show.

So I have a couple of questions here:

1.  Given that my root objects get collected, why does the memory usage
not decrease?

2.  Where does GC.GetTotalMemory get its numbers from, and why do they
not match what's in the heap?

3.  Where do I go from here as far as debugging the memory leak (other
than SOS)?
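Short of SOS, one lightweight way to check whether a particular tree is actually reclaimable is to hold a WeakReference to it, drop the strong references, force a full collection, and test IsAlive; if it stays alive, something still roots it.  A sketch of that idea (IsCollected is a hypothetical helper, not a framework API):

```csharp
using System;

static class LeakProbe
{
    // Hypothetical helper: forces a full collection and reports whether
    // the weakly referenced object was reclaimed.
    public static bool IsCollected(WeakReference weak)
    {
        GC.Collect();
        GC.WaitForPendingFinalizers();
        GC.Collect(); // second pass reclaims finalizable objects
        return !weak.IsAlive;
    }

    static void Main()
    {
        object tree = new byte[1024]; // stand-in for one object tree
        var weak = new WeakReference(tree);
        tree = null;                  // drop the strong reference

        // If this prints False, some other reference still roots the tree.
        Console.WriteLine(IsCollected(weak));
    }
}
```

Attaching a weak reference to each root before nulling it would narrow down which trees are still being held, without needing a debugger session.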


Never mind.  One call to MS Support and a lesson in WinDbg/SOS debugging
techniques located my bug.
