Re: memory leak with HDF5? [message #44882]
Mon, 25 July 2005 13:59
eddie haskell
Messages: 29   Registered: September 1998
Junior Member
Hello Peter,
I tried the same experiment here but got slightly different results.
Running IDL 6.1.1 on both 32- and 64-bit AIX, I did see an increase in
process size (as indicated by the spawned ps command), but only after
every 5-10 files instead of after every file, and even then the increase
was smaller than what you are seeing. I tested with 1000 different files
and did not appear to be in danger of running out of memory anytime soon.
Whilst not an immediate solution, waiting a little while for IDL 6.2
(which is shipping any minute now) might solve your problem. When I run
the test in IDL 6.2 I see a small increase in process size for the first
5-10 files and then no increase after that.
Cheers,
eddie
peter.albert@gmx.de wrote:
> Hi everybody,
>
> I am new to this group, and I am experiencing a strange memory leak
> when reading HDF5 files with IDL 6.1 (on an IBM AIX machine). If I
> run the following code fragment, with "files" being an array of
> filenames of HDF5 files which all contain a "Data/Data1" dataset:
>
>
> for i = 0, n_files - 1 do begin
>    file_id = h5f_open(files[i])
>    nd = h5g_get_nmembers(file_id, "Data")
>    dataset_id = h5d_open(file_id, "Data/Data1")
>    dataset = h5d_read(dataset_id)
>    h5d_close, dataset_id
>    h5f_close, file_id
> endfor
>
> then the core image of the IDL process increases by approx. 400 kB with
> each iteration of the loop, which means that after a sufficiently large
> number of files I get the following error:
>
> % Unable to allocate memory: to make array.
> Not enough space
>
>
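For what it is worth, one experiment that might narrow this down is the
same loop with the group opened and closed through an explicit identifier,
plus an occasional call to H5_CLOSE so the HDF5 library can release any
internal buffers it still holds. This is only a sketch: the variable names
and the 100-file interval are illustrative, it may not change the growth
at all, and I believe (but have not checked on AIX) that the library is
re-initialised automatically by the next HDF5 call after H5_CLOSE.

   ; sketch only: same loop as above, but with the group opened and
   ; closed explicitly, and with the HDF5 library shut down every
   ; 100 files so it can free whatever it is holding internally
   for i = 0, n_files - 1 do begin
      file_id = h5f_open(files[i])

      group_id = h5g_open(file_id, "Data")
      nd = h5g_get_nmembers(file_id, "Data")
      h5g_close, group_id

      dataset_id = h5d_open(file_id, "Data/Data1")
      dataset = h5d_read(dataset_id)
      h5d_close, dataset_id
      h5f_close, file_id

      ; periodically flush and close the HDF5 library itself
      if (i + 1) mod 100 eq 0 then h5_close
   endfor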
> I have to admit that I do not exactly know what "core image of the IDL
> process" actually means, but that is what the man page of the Unix "ps"
> command calls it ... :-) I put the following line just before the
> "endfor" statement:
>
> spawn, "ps axu | grep palbert | grep idl | grep -v grep"
>
> which showed me, among other information, the size of the core image,
> and it just kept increasing.
>
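In case it helps to log those numbers rather than eyeball the raw ps line,
spawn can return its output in a variable. Something along these lines
could go inside the loop; it assumes the usual "ps axu" column order
(USER PID %CPU %MEM VSZ RSS ...), which may well differ on AIX:

   ; capture the ps output and pull out the VSZ and RSS columns so
   ; the growth can be logged per file (only the first matching
   ; process line is used)
   spawn, "ps axu | grep palbert | grep idl | grep -v grep", ps_out
   if ps_out[0] ne '' then begin
      fields = strsplit(ps_out[0], /extract)
      print, i, long(fields[4]), long(fields[5]), $
         format = '("file ", i5, "   vsz ", i10, "   rss ", i10)'
   endif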
> I also put a "help, /memory" there, of course, but that number stayed
> constant, so it is not IDL holding on to more and more variables.
>
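The same information can be logged programmatically with the MEMORY()
function, which counts only IDL's own dynamic allocations; printing it
next to the ps numbers makes the gap between IDL's heap and the process
size explicit (again just a sketch, the format string is arbitrary):

   ; IDL's own dynamic memory in bytes; if this stays flat while the
   ; ps numbers keep climbing, the growth is happening outside IDL's
   ; variable heap (presumably inside the HDF5 library)
   idl_bytes = memory(/current)
   print, i, idl_bytes, format = '("file ", i5, "   idl heap ", i12, " bytes")'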
> Now, the funny thing is, if I exclude the
>
> nd = h5g_get_nmembers(file_id, "Data")
>
> command, then the core size increases much more slowly.
> I have no idea what is going on here.
>
> Moreover, if I open the same file again and again, the process size
> does not grow at all.
>
> ??? I am completely lost.
>
> I would really like to run my code without crashing after a few hundred
> files, so if anyone has an idea what is happening here, any comment
> would be greatly appreciated.
>
> Best regards,
>
> Peter
>
Re: memory leak with HDF5? [message #45095 is a reply to message #44882]
Mon, 01 August 2005 03:52
peter.albert@gmx.de
Messages: 108   Registered: July 2005
Senior Member
Hi Eddie,
sorry for my late reply, I was away for a few days. Thanks for at least
confirming that the problem exists and that I am not just making some
silly mistake here. Well, I'll be patient then ...
Best regards,
Peter