> Uh. Surely, to extrapolate, one needs to know how the growth
> behaves? Do you know? Is it exponential, as the graphs suggest?
> Such an extrapolation strikes me as 'shaky at best', we might
> be better off simply including the numbers for the shortened
> benchmark and explaining that it had to be reduced.
See comments below...
> >By the way: When enabling "GC before each measurement", those memory
> >peaks went away. This coincides much more with the measurements we had
> >done for our SC paper
> >(http://bodden.de/n/index.php?option=com_docman&task=doc_download&gid=53&Itemid=38).
>
> What do you mean by "went away"? Do you mean the drops
> disappeared (so you had a consistent upwards trend), or did
> memory usage stay reasonably constant?
The latter. It stayed around 1-2 MB with no upwards trend, at least for
the first few hundred iterations.
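
For reference, this is the kind of measurement the quoted mail describes:
force a collection before sampling the heap, so that transient garbage does
not show up as a peak. A minimal sketch, not our actual harness;
runBenchmarkIteration() is a hypothetical placeholder for the workload:

    import java.util.Locale;

    public class HeapSampler {

        // Used heap in bytes after requesting a full GC, so transient
        // garbage does not distort the reading. Note that gc() is only
        // a request to the JVM, not a guarantee.
        static long usedHeapAfterGc() {
            Runtime rt = Runtime.getRuntime();
            rt.gc();
            try { Thread.sleep(100); }      // give the collector a moment
            catch (InterruptedException e) { Thread.currentThread().interrupt(); }
            return rt.totalMemory() - rt.freeMemory();
        }

        public static void main(String[] args) {
            for (int i = 0; i < 500; i++) {
                runBenchmarkIteration();    // hypothetical workload under test
                long used = usedHeapAfterGc();
                System.out.printf(Locale.US, "iteration %d: %.2f MB%n",
                                  i, used / (1024.0 * 1024.0));
            }
        }

        static void runBenchmarkIteration() {
            // placeholder for one benchmark step
        }
    }
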
> Looking at the paper
> you link to, I was surprised to find your claim that J-LO
> shows practically no memory overhead on JHotDraw, since I
> seem to remember having to provide it with a large heap when
> I was experimenting with it. My recollections are hazy at
> best, though...
> What is the real situation?
If the implementation does what it is intended to do (i.e. there is no
bug), the memory overhead should be on the order of (size of formula) *
(number of bindings). That means that when you test with a large number
of shapes in JHotDraw, you may see high (hopefully linearly growing)
memory consumption. The tests for the SC paper were done with only 3
shapes; hence the consumption stayed so low, I reckon. However, so far
I have never run into a situation where I had to give it more heap space.
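To make the claim concrete: a monitor in this style keeps, per variable
binding, state proportional to the formula size, so total memory grows
roughly as the product of the two. A minimal sketch of that bookkeeping,
not J-LO's actual code; the class names and the subformula count are
assumptions:

    import java.util.HashMap;
    import java.util.Map;

    public class FormulaMonitor {

        // Assumed size of the monitored LTL formula (number of subformulae).
        static final int SUBFORMULA_COUNT = 12;

        // Per-binding state: one flag per subformula of the formula.
        static final class BindingState {
            final boolean[] active = new boolean[SUBFORMULA_COUNT];
        }

        // One entry per bound object, e.g. one per shape in JHotDraw.
        private final Map<Object, BindingState> states = new HashMap<>();

        // Memory use is O(SUBFORMULA_COUNT * states.size()): for a fixed
        // formula it grows linearly with the number of bound objects.
        void onEvent(Object boundObject, int subformulaIndex) {
            states.computeIfAbsent(boundObject, k -> new BindingState())
                  .active[subformulaIndex] = true;
        }
    }

With only 3 shapes that table stays tiny, which would explain the low
numbers in the SC measurements; with many shapes it should grow linearly.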
I will give it another run on cardinal right away and see how far it
gets by tomorrow morning. Then I should be able to give you more details
about the actual behaviour.
Eric