Re: QUESTION: is there a command in IDL which could clean up memory pieces? [message #62986]
Tue, 21 October 2008 22:58
litongmu
On Oct 21, 6:27 pm, RussellGrew <russell.g...@gmail.com> wrote:
> Do you really need all the CDFs open at once?
>
> Do you require all 5000 records from a given CDF, or can you be more
> selective?
>
> Can you not open a CDF, get the data you need, process it, close the
> CDF, open the next one [using the same variables as before], etc.?
>
> Cheers.
That is how I did it; sorry I did not express myself clearly. I process
the files one by one, and each file contains about 5000 records.
However, after a certain number of files, the code stops. Sorry for the
confusion.
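
For reference, the one-file-at-a-time pattern Russell describes might
look like this in IDL (a minimal sketch; the file pattern and the
variable name 'Epoch' are placeholders, not from the original posts):

  files = FILE_SEARCH('*.cdf', COUNT=nfiles)   ; hypothetical file pattern
  FOR i = 0L, nfiles - 1 DO BEGIN
     id = CDF_OPEN(files[i])                   ; open one file at a time
     CDF_CONTROL, id, VARIABLE='Epoch', GET_VAR_INFO=info
     CDF_VARGET, id, 'Epoch', data, REC_COUNT=info.maxrec + 1
     ; ... process 'data' for this file here ...
     CDF_CLOSE, id                             ; close before the next file
  ENDFOR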
Re: QUESTION: is there a command in IDL which could clean up memory pieces? [message #62989 is a reply to message #62987]
Tue, 21 October 2008 15:52
litongmu
On Oct 21, 12:16 pm, pgri...@gmail.com wrote:
> liton...@gmail.com wrote:
>> Hi all,
>
>> I am wondering if there is a command in IDL that could consolidate
>> all the memory fragments into one big contiguous block that IDL
>> could use.
>> Recently I have been running code to read and process CDF files.
>> There are many files (more than 1000), and the code cannot finish
>> them in one run because of a memory problem.
>
> While it is possible you are having a memory fragmentation issue,
> it seems more likely you are just running out of memory. Can you tell
> us how much memory you use at the time of the failure?
> Use: help,/mem
>
> Cheers,
> Paolo
>
>> I know that in MATLAB there is a command called 'pack', which
>> consolidates memory fragments. So I am wondering if there is a
>> similar counterpart in IDL. I searched the manuals for a long time,
>> but had no luck.
>> I appreciate your help.
>
>> tongmu
>
It is different. If I run my code on my desktop, the memory usage is
about 300 MB at the time of failure, which, as you mentioned, could
mean it is simply running out of memory. But if I use my laptop, the
memory usage is about 30 MB or less.
However, I call a subroutine to open and read one CDF file at a time,
and each file holds only about 5000 records, so there should be enough
memory for each file.
Another reason I believe it is caused by memory fragmentation is that
my code resizes some arrays several times, which could generate lots
of memory fragments.
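
If the resizing is repeated array concatenation, preallocating once can
avoid the fragmentation. A minimal sketch, assuming the number of
records per file is known up front (names and sizes are hypothetical):

  nrec = 5000L                    ; records per file, known up front
  buffer = DBLARR(nrec)           ; preallocate a single block once
  FOR i = 0L, nrec - 1 DO BEGIN
     ; buffer = [buffer, value]   ; growing like this reallocates and
     ;                            ; copies the whole array every pass
     buffer[i] = DOUBLE(i)        ; filling in place reuses one block
  ENDFOR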
Re: QUESTION: is there a command in IDL which could clean up memory pieces? [message #63063 is a reply to message #62986]
Wed, 22 October 2008 20:32
Andrew Cool
On Oct 22, 3:58 pm, liton...@gmail.com wrote:
> On Oct 21, 6:27 pm, RussellGrew <russell.g...@gmail.com> wrote:
>
>> Do you really need all the CDFs open at once?
>
>> Do you require all 5000 records from a given CDF, or can you be more
>> selective?
>
>> Can you not open a CDF, get the data you need, process it, close the
>> CDF, open the next one [using the same variables as before], etc.?
>
>> Cheers.
>
> That is how I did it; sorry I did not express myself clearly. I
> process the files one by one, and each file contains about 5000
> records. However, after a certain number of files, the code stops.
> Sorry for the confusion.
Sounds like a memory leak due to poor coding. Define "a certain number
of files": does it fail after the same number every time?
Define "the code stops" too: is there actually an error message, and
can you tell us what it is?
You could try inserting HELP, /MEMORY at various strategic places in
your code to watch the memory use.
You could also try http://www.dfanning.com/programs/undefine.pro to
make unnecessary variables "vanish" and free up memory.
Andrew
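
A minimal sketch of the instrumentation Andrew suggests (the array size
is hypothetical; undefine.pro is the program linked above):

  PRO check_memory
     HELP, /MEMORY                ; baseline report
     data = FLTARR(1000, 5000)    ; hypothetical ~20 MB working array
     HELP, /MEMORY                ; heap usage should jump by about 20 MB
     data = 0B                    ; shrink to a scalar to release the array,
                                  ; or: undefine, data   (with undefine.pro)
     HELP, /MEMORY                ; allocation should drop back down
  END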