comp.lang.idl-pvwave archive
Messages from Usenet group comp.lang.idl-pvwave, compiled by Paulo Penteado

Re: Garbage collection and Memory [message #2291 is a reply to message #2105], Fri, 10 June 1994 07:17
In article <GEOMAGIC.94Jun9173354@moe.seismo.do.usbr.gov>, geomagic@seismo.do.usbr.gov (Dan O-Connell) writes:
|> Buy more memory. Seriously, if you look at your staff time costs of
|> dinking around trying to contort your software to fit a big problem
|> into a small amount of memory, it's not cost effective to NOT buy
|> more memory. With 32 MB of memory for most workstations costing between
|> $1200-$1500, it's not worth wasting lots of time playing games with
|> exotic memory tricks.
|>

The problem is that in my field (astronomy)
it's always been and probably always will be VERY easy to produce data
sets that overwhelm available memory. The data we collect seems to grow
much faster than the memory capacity of our computers. The current
project I'm working on would really like 512 MB, about the maximum you
can shove in a Sparc. Buying that (which is not totally out of the
question) would cost around $20,000. That's the same order of magnitude
as my annual salary, so it's actually quite cost-effective for me to spend
a few days playing tricks with memory for data sets of this size.

It isn't too much trouble to fit the code into a 128 MB
machine. I actually have access to a Sparc with 128 MB but this
machine is shared among multiple astronomers, all of whom want to
run their own 128 MB (or larger) jobs. Thus as a grad student I'm
one of the first to get shoved off the machine when load is high.
Therefore it becomes very important to me to fit my code into as little
actual memory as possible.
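
The kind of memory trick being alluded to is essentially chunked processing: keep only one slice of a large on-disk data set resident at a time, so peak memory stays near the chunk size rather than the full data set. A minimal sketch in Python with NumPy (purely illustrative — the thread is about IDL/PV-WAVE, and the file name, shape, and helper below are made up for the example):

```python
import numpy as np

# Illustrative sketch: reduce a large on-disk 2-D array chunk by
# chunk, so only ~chunk_rows rows are in RAM at any moment instead
# of the whole array. Path and shape here are hypothetical.
def chunked_mean(path, shape, dtype=np.float32, chunk_rows=1024):
    """Mean over axis 0 of a large 2-D array stored on disk."""
    data = np.memmap(path, dtype=dtype, mode="r", shape=shape)
    total = np.zeros(shape[1], dtype=np.float64)
    for start in range(0, shape[0], chunk_rows):
        # Only this slice is copied into memory.
        chunk = np.asarray(data[start:start + chunk_rows])
        total += chunk.sum(axis=0)
    return total / shape[0]

if __name__ == "__main__":
    # Tiny demo file so the sketch is runnable end to end.
    demo = np.arange(12, dtype=np.float32).reshape(4, 3)
    demo.tofile("demo.dat")
    print(chunked_mean("demo.dat", (4, 3), chunk_rows=2))
```

The same idea scales down a 512 MB job to whatever slice size the shared machine can spare, at the cost of a pass over the disk per reduction.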

Of course different analyses may apply if the professionals
working on your code are paid more than the average grad student,
or if the data sets are not astronomically large and don't quadruple
every time someone makes a better CCD. I'm not particularly familiar with
geological seismology. How fast does your data grow?

--

Elliotte Rusty Harold National Solar Observatory
eharold@sunspot.noao.edu Sunspot NM 88349