[Mono-dev] question: huge amounts of RAM
daniel at sones.de
Wed Sep 21 07:20:54 EDT 2011
Well, actually we're about to introduce some changes to our import
functionality that will be as careful as possible not to create
intermediate objects, only objects that persist. So it will mainly be a
growing operation in terms of the number of object instances.
We have already checked that the data structures we use don't impose an
Int32 limit on us when growing. Which of the two garbage collectors would
be best in your opinion? Would someone from the Mono-dev team want to come
and test with us?
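For illustration only (this is our own sketch, not the actual sones code, and all names in it are hypothetical): one way a grow-only data structure can avoid Int32 limits is to track its element count as Int64 and back the storage with fixed-size chunks, so that no single array or List<T> ever approaches the Int32.MaxValue element bound:

```csharp
using System;
using System.Collections.Generic;

// Hypothetical sketch: a grow-only store with an Int64 element count,
// backed by fixed-size chunks so no individual array hits Int32 limits.
class ChunkedStore<T>
{
    const int ChunkSize = 1 << 20;               // 1M elements per chunk (assumption)
    readonly List<T[]> chunks = new List<T[]>();
    long count;                                   // Int64 count, not Int32

    public long Count { get { return count; } }

    public void Add(T item)
    {
        if (count % ChunkSize == 0)
            chunks.Add(new T[ChunkSize]);         // grow by whole chunks only
        chunks[(int)(count / ChunkSize)][count % ChunkSize] = item;
        count++;
    }

    public T this[long index]
    {
        get { return chunks[(int)(index / ChunkSize)][index % ChunkSize]; }
    }
}
```

Growing by whole chunks also matches the "only objects that stay" goal above: no intermediate arrays are created and discarded during resizing, which keeps pressure off the garbage collector during the import.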
From: mono-devel-list-bounces at lists.ximian.com
[mailto:mono-devel-list-bounces at lists.ximian.com] On Behalf Of Mark Probst
Sent: Wednesday, 21 September 2011 11:52
To: Daniel Kirstenpfad
Cc: mono-devel-list at lists.ximian.com
Subject: Re: [Mono-dev] question: huge amounts of RAM
On Wed, Sep 21, 2011 at 11:42 AM, Daniel Kirstenpfad <daniel at sones.de> wrote:
> When speaking of huge I mean 1 to 2 terabytes of RAM.
> Since we do not own machines with such extreme amounts of memory we
> have been a limited time-frame assigned on such a machine to test our
> software on. So we want to make sure upfront that we actually will be
> able to use that amount of RAM
Do you care about the length of GC pause times? I mean the absolute
length, not the relative amount of time the GC consumes. (If your program
does anything but batch processing, you probably do.)
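For context on "the two garbage collectors": Mono at that time shipped the conservative Boehm collector as the default and the newer generational SGen collector as an option. A minimal sketch of selecting and tuning SGen from the command line (the executable name and the heap/nursery sizes below are illustrative assumptions, not recommendations):

```shell
# Run under the generational SGen collector instead of the default Boehm GC
mono --gc=sgen MyImport.exe

# SGen accepts tuning options via the MONO_GC_PARAMS environment variable;
# the sizes here are placeholders, not tested recommendations
MONO_GC_PARAMS=nursery-size=64m,max-heap-size=512g mono --gc=sgen MyImport.exe
```

For a mostly grow-only workload, pause behavior rather than throughput is usually the deciding factor between the two, which is what the question above is getting at.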