Re: Garbage collection and Memory [message #2300 is a reply to message #2105] |
Thu, 09 June 1994 17:33  |
geomagic
In article <1994Jun9.220014.28022@noao.edu> eharold@corona.sunspot.noao.edu (Elliotte Harold) writes:
In article <thompson.770745164@serts.gsfc.nasa.gov>, thompson@serts.gsfc.nasa.gov (William Thompson) writes:
|> hevans@estwm0.wm.estec.esa.nl (Hugh Evans) writes:
|>
|> >I have discovered that after using Wave for an extended period that it slowly
|> >grabs more and more memory, even if new variables are not created, until
|> >finally it runs out of core memory. Whereas by saving the session and
|> >restarting it, the previous operation that crashed on a memory allocation
|> >problem will complete successfully.
|>
|>
|> It also strikes me that you could save the session, use .RNEW to clear out all
|> the memory, and restore it.
|>
> But will this allow you to start up in the middle of a program? i.e.
> can I Control-C a program; save,/all; save ,/routines; .RNEW; and then
> restore everything and .continue from where I left off?
>
> This is not an idle question. After a day carefully breaking up some
> matrix calculations into 1-2 MB pieces that wouldn't stretch the
> memory of my machine, a few hours into the run the sysadmin dropped by
> to warn me that my process was taking up 30 megabytes! This really
> makes me wonder if there's any way to deal with data sets that are
> larger than available memory. Would it help if I cleared temporary
> variables and arrays every pass through my main loops?
Buy more memory. Seriously, if you look at your staff time costs of
dinking around trying to contort your software to fit a big problem
into a small amount of memory, it's not cost effective to NOT buy
more memory. With 32 MB of memory for most workstations costing between
$1200 and $1500, it's not worth wasting lots of time playing games with
exotic memory tricks.
It might help to allocate the biggest chunk of memory that any
intermediate calculation requires, instead of creating and deleting
variables. That might reduce memory fragmentation.
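A minimal PV-Wave/IDL sketch of that preallocation idea (the array
names and sizes below are made up for illustration, not from the
original post):

```idl
; Sketch (hypothetical sizes): allocate the largest block any pass of
; the loop will need once, up front, and reuse it, instead of creating
; and deleting temporary variables inside the loop, which can leave
; the heap fragmented.
nmax  = 1024L                          ; biggest dimension any pass requires
work  = fltarr(nmax, nmax)             ; single up-front allocation

sizes = [256L, 512L, 1024L]            ; per-pass block sizes (example values)
for i = 0, n_elements(sizes)-1 do begin
    n = sizes[i]
    work[0:n-1, 0:n-1] = 0.0           ; reuse the same storage each pass
    ; ... fill and operate on work[0:n-1, 0:n-1] here ...
endfor
end
```

The point is that the interpreter never has to find a fresh contiguous
block each pass; it keeps reusing the one it got at startup.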
Dan O'Connell
geomagic@seismo.do.usbr.gov
Seismotectonics Group, U.S. Bureau of Reclamation
Denver Federal Center, P.O. Box 25007 D-3611, Denver, CO 80225
"We do custom earthquakes (for food)"
or
"Just more roadkill on the information superhighway"
/\
/ \
/ \ /\ /\
/\ / \ / \ / \ /\ /\/\ /\/\
___/ \ /\/\/\/ \ / \ /\ / \ / \/ \/ \ /\_______
\/ \ / \ / \/ \/ \/
\/ \/