comp.lang.idl-pvwave archive
Messages from Usenet group comp.lang.idl-pvwave, compiled by Paulo Penteado

memory leak with HDF5? [message #44925] Thu, 21 July 2005 07:30
peter.albert@gmx.de
Hi everybody,

I am new to this group, and I am experiencing a strange memory leak
when reading HDF5 files with IDL 6.1 (on an IBM AIX machine). If I
run the following code fragment, with "files" being an array of
filenames of HDF5 files that all contain a "Data/Data1" dataset:


for i = 0, n_files - 1 do begin
   file_id = h5f_open(files[i])                 ; open the i-th file
   nd = h5g_get_nmembers(file_id, "Data")       ; count members of group "Data"
   dataset_id = h5d_open(file_id, "Data/Data1")
   dataset = h5d_read(dataset_id)
   h5d_close, dataset_id
   h5f_close, file_id
endfor

then the core image of the IDL process grows by approximately 400 KB
on each pass through the loop, which means that after a sufficiently
large number of files I get the following error:

% Unable to allocate memory: to make array.
Not enough space


I have to admit that I do not know exactly what the "core image of the
IDL process" actually is, but that is the term the man page of the Unix
"ps" command uses ... :-) I put the following line just before the
"endfor" statement:

spawn, "ps axu | grep palbert | grep idl | grep -v grep"

which showed me, among other things, the size of the core image, and
that size increased steadily.

I also added a "help, /memory" there, of course, but the number it
reports stays constant, so it is not a case of IDL accumulating more
and more variables.

Now, the funny thing is, if I exclude the

nd = h5g_get_nmembers(file_id, "Data")

command, then the core size increases much more slowly.
I have no idea what is going on here.

Moreover, if I open and close the same file again and again, the memory
usage does not grow at all.

??? I am completely lost.
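
The only workaround I can think of (untested, just a sketch, and it
assumes the leaked memory is held inside the HDF5 library itself rather
than in IDL variables, which would be consistent with "help, /memory"
staying flat) is to periodically shut the HDF5 library down and bring
it back up, so that anything it caches internally gets released:

for i = 0, n_files - 1 do begin
   file_id = h5f_open(files[i])
   nd = h5g_get_nmembers(file_id, "Data")
   dataset_id = h5d_open(file_id, "Data/Data1")
   dataset = h5d_read(dataset_id)
   h5d_close, dataset_id
   h5f_close, file_id
   ; every 100 files, close the HDF5 library (freeing any identifiers
   ; or buffers it holds internally) and reinitialize it
   if (i mod 100) eq 99 then begin
      h5_close
      h5_open
   endif
endfor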

I would really like to run my code without crashing after a few hundred
files, so if anyone has an idea what is happening here, any comment
would be greatly appreciated.

Best regards,

Peter