IDL - freeing up used memory? [message #84377]
Wed, 29 May 2013 09:55
Andy Sayer
Hi all,
I'm running into a memory issue, and I'm not sure whether it is caused by my IDL code or by the machine the code runs on; I'm hoping someone might be able to help me out. To start with: I'm using IDL 7.1.1 on CentOS, invoked from the command line.
I've got a long piece of code which loops over a bunch of files, and for each file:
- reads the data in
- performs some operations
- stores the results in my 'output arrays'
and then moves on to the next file in the list. I define all my 'output' arrays up front, at the start of the code (before the loop). After dealing with a file, the associated interim variables (i.e. the data read directly from the file, consisting of several hundred thousand floating-point values) are set to a scalar integer zero to release their memory. I am also careful to close file units once I'm done with them. A stripped-down sketch of the loop is below.
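For concreteness, this is roughly the structure (the routine and variable names here are placeholders, not my real code):

; output arrays defined once, before the loop
output_a = fltarr(n_files, n_values)
output_b = fltarr(n_files, n_values)

for i = 0L, n_files-1 do begin
   ; read the data in (several hundred thousand floats per file)
   openr, lun, file_list[i], /get_lun
   data = read_one_file(lun)        ; placeholder for my reading routine
   free_lun, lun                    ; release the file unit

   ; perform some operations and store the results
   result = process_data(data)      ; placeholder for the processing step
   output_a[i, *] = result.a
   output_b[i, *] = result.b

   ; set the interim variables to a scalar integer to free their memory
   data = 0
   result = 0
endfor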
If I do a help,/memory between files in my loop, the heap memory in use is fairly constant (~130,000,000 bytes, i.e. around 130 MB) and the reported maximum is typically 10-20% higher than that. This system has a lot of RAM and I don't think this process can be using anywhere near that limit. So there is no apparent accumulation of junk between files in the loop.
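To be explicit, the check between files is just this (the MEMORY() function is the programmatic equivalent of help,/memory, in case the exact numbers matter):

; at the end of each iteration
help, /memory
print, 'heap in use (bytes): ', memory(/current), $
       '   high-water mark: ', memory(/highwater)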
However, despite this, after some number of files I get this type of message:
% Unable to allocate memory: to make array.
Cannot allocate memory
The line at which this error is issued is the one that creates the structure used to hold the several hundred thousand floating-point values read from the file (i.e. it fails at the 'read the data in' stage rather than at the 'perform some operations on it' stage).
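In other words, the failing line is essentially an allocation of this form (the tag names and record count are made up, but the total size is of the right order, i.e. only a few MB):

; build the structure array that will hold one file's worth of data
n_records = 300000L
data = replicate({tag1: 0.0, tag2: 0.0, tag3: 0.0}, n_records)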
If I start my process from the offending file, it works fine (and if I start a few files earlier or later in the chain, the code instead falls over correspondingly earlier or later). This suggests that, despite the help,/memory output, there is some issue with the available memory decreasing as I go from file to file.
I've tried using the heap memory diagnosis/cleanup tools like heap_gc (e.g. http://www.idlcoyote.com/fileio_tips/memleak.html ) but am not using any pointers, and so that doesn't seem to do anything.
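That is, I've tried adding something like this at the end of each iteration, and (as expected, given the lack of pointers) it never reports anything being collected:

; garbage-collect any unreferenced heap (pointer/object) variables
heap_gc, /verbose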
I've found posts (such as http://cow.physics.wisc.edu/~craigm/idl/archive/msg01590.html ) saying that these issues can be the result of the available memory becoming fragmented. That sounds more plausible to me. Various posts report problems with various IDL version/OS combinations, and David has a related page on memory issues in IDL 8 when using the Workbench. However, I'm not using the Workbench or IDL 8, so I don't think that applies to my situation.
So... does anyone know what could be going on here? Is my memory getting fragmented, and if so, is there any way to fix that? Is there some way besides help,/memory to see whether my memory usage is really stable between iterations? Any suggestions would be appreciated!
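The only OS-level check I can think of is something along these lines between iterations (an untested sketch: it assumes a Linux /proc filesystem, and that the shell launched by SPAWN is a direct child of the IDL session, so that $PPID gives IDL's process ID):

; get the PID of the IDL process via the spawned shell's parent
spawn, 'echo $PPID', pid_str
; total virtual size of the process; if this keeps growing while
; help,/memory stays flat, that would point towards fragmentation
spawn, 'grep VmSize /proc/' + strtrim(pid_str[0], 2) + '/status', vmsize
print, vmsize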
As an addendum, while I've been using IDL a while, I am not a computer scientist and have neither root access nor physical access to the machines this code is being executed on, both of which may limit my ability to poke into some aspects of the problem.
Thanks,
Andy