reading multiple HDF files [message #61338]
Tue, 15 July 2008 04:41
julia.walterspiel
Hi!
I'm a newbie to IDL (I used to program in MATLAB, but the place I work
at only has IDL licences) and I need to get some things done here
asap.
I've been struggling with reading HDF files from MODIS for quite some
time now and I definitely need some good input!
Here's the thing:
I downloaded a bunch (hundreds) of MODIS data files (e.g. MOD_06, Cloud
product, daily data, years 2000-2008) in HDF format, from which I need
to extract the SDS "Cloud_Fraction" and plot it as a time series for a
specific geographical region (Switzerland). I managed to read the
files with the program hdf_read by David Fanning, and I have a vague
idea of what to do in order to read in multiple HDF files (I guess this
is done with a FOR loop).
HOWEVER: I simply cannot imagine how to link single SDSs
(e.g. "Cloud_Fraction") from multiple HDF files, and I don't even
want to think about displaying them as a time series or on a map of a
geographic region.
Can anybody give me some good hints? Anything would be greatly
appreciated!
many many thanks
Julia
Re: reading multiple HDF files [message #61379 is a reply to message #61338]
Wed, 16 July 2008 08:54
julia.walterspiel
Yeah, it was a different problem, a beginner problem... sorry for
wasting your time, folks! It works now:
################################################################
PRO read_multiple_hdf

   hdf_file_path = FILE_SEARCH('I:\zue\doc\ks\Temp_Satellite_Data\Temp_MODIS\MOD_08_jointAerosolWaterVaporCloudProduct\*.hdf')
   homer = intarr(6, 2, 99)

   ;****************************************************************
   ;********** Begin the for loop to read all selected hdf files ***
   for i=0,5 do begin
   ;for i=0,n_elements(hdf_file_path)-1 do begin
      hdfid = HDF_SD_START(hdf_file_path(i), /READ)
      varnames = HDF_SD_VARDIR(hdfid)
      index = hdf_sd_nametoindex(hdfid, 'Cloud_Fraction_Mean_Mean')   ;varnames[Cloud_Fraction_Mean_Mean]
      if (index eq -1) then message, string(varnames, format='("Specified Variable not found: ", a)')
      varid = hdf_sd_select(hdfid, index)
      HDF_SD_GETDATA, varid, data   ;- The Cloud Fraction data will be stored in the variable "data"
      HDF_SD_ENDACCESS, varid
      HDF_SD_END, hdfid
      print, i
      print, data
      homer[*,*,i] = data[*,*]
   endfor

END
################################################################
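As a possible next step towards the time series, something like the following could follow the loop (just a rough sketch: the hard-coded file count and the spatial average over the whole grid are assumptions, and the grid cells actually covering Switzerland would still have to be picked out):

; Sketch only: collapse each stacked grid in "homer" to one value per file.
nfiles = 6                               ; assumed to match the loop above
ts = fltarr(nfiles)
for i = 0, nfiles-1 do begin
   ts[i] = mean(float(homer[*, *, i]))   ; spatial mean of one day's grid (still scaled counts)
endfor
plot, indgen(nfiles), ts, $
      xtitle='File index (day)', ytitle='Mean Cloud_Fraction (scaled counts)'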
Re: reading multiple HDF files [message #61381 is a reply to message #61338]
Wed, 16 July 2008 08:27
David Fanning
David Fanning writes:
> Here is the code I used to test this. I can run as many as
> I like with the HDF_SD_END in the code, but only 32 without
> it. :-)
Here is my theory. Suppose you open a file with
a command like this:
OPENR, lun, filename, /Get_LUN
If you close the file like this:
CLOSE, lun
You have closed the file, but you have not released the
logical unit number to be used over again. Since there
are only 32 of these "managed" logical unit numbers, if
you were opening files in a loop, you would run out after
32 iterations. You need to also free the logical unit
number to be used over again:
FREE_LUN, lun
(Fortunately, in this case FREE_LUN also closes the file, so you
don't need to use both of these commands. But I see a LOT of
people using CLOSE when they should be using FREE_LUN.)
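In other words, the loop pattern should look something like this (a minimal sketch; the search pattern is just a placeholder):

; Sketch: open many files in a loop, releasing the unit each time.
files = FILE_SEARCH('*.dat', COUNT=nfiles)
for i = 0, nfiles-1 do begin
   OPENR, lun, files[i], /GET_LUN
   ; ... read from lun here ...
   FREE_LUN, lun     ; closes the file AND releases the managed LUN
   ; CLOSE, lun alone would leave the unit allocated, and the pool of
   ; ~32 managed unit numbers would run dry after 32 iterations
endfor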
That is what is happening here, I think.
HDF_SD_ENDACCESS is equivalent to CLOSE
HDF_SD_END is equivalent to FREE_LUN
So, when you are working with HDF files, you *must* use both.
Cheers,
David
--
David Fanning, Ph.D.
Fanning Software Consulting, Inc.
Coyote's Guide to IDL Programming: http://www.dfanning.com/
Sepore ma de ni thui. ("Perhaps thou speakest truth.")
Re: reading multiple HDF files [message #61382 is a reply to message #61338]
Wed, 16 July 2008 08:21
David Fanning
julia.walterspiel@gmail.com writes:
>
>> Did you try closing the file, as in my last suggestion?
>> This is pretty much what I expected without that close.
>>
>
>
> yes I did.
> I put the HDF_SD_END inside the loop and outside.
> Placing it outside gave me the 32-loops.
> Placing it inside gave me nothing but
> IDL> read_multiple_hdf
> Loaded DLM: HDF.
> Compiled module: HDF_SD_VARDIR.
>
> I'm sorry, this is my very first time dealing with IDL; maybe I'm
> overlooking something completely obvious and simple?
Here is the code I used to test this. I can run as many as
I like with the HDF_SD_END in the code, but only 32 without
it. :-)
;###############################################################
PRO read_multiple_hdf
hdf_file_path='G:\data\96108_08.hdf'
;****************************************************************
;********** Begin the for loop to read all selected hdf files ***
for I=0,100 do begin
hdfid = HDF_SD_START(hdf_file_path)
varnames = HDF_SD_VARLIST (hdfid)
index = hdf_sd_nametoindex(hdfid,'ScanRate')
if (index eq -1) then message, string(varnames)
varid = hdf_sd_select(hdfid, index)
HDF_SD_GETDATA, varid, data
hdf_sd_endaccess, varid
HDF_SD_END, hdfid
Print, I
endfor
END
;##################################################################
--
David Fanning, Ph.D.
Fanning Software Consulting, Inc.
Coyote's Guide to IDL Programming: http://www.dfanning.com/
Sepore ma de ni thui. ("Perhaps thou speakest truth.")
Re: reading multiple HDF files [message #61387 is a reply to message #61338]
Wed, 16 July 2008 07:46
David Fanning
julia.walterspiel@gmail.com writes:
> you can all relax :)
> closing and restarting IDL, I noticed that the error occured at loop
> number 32. Checking the file at this location didn't bring up anything
> strange, so I tried to make the loop start at i=32, got the error
> immediately, closed and reopened IDL, did the same thing again and
> tadaaa got the error at loop number 64 this time (and yes, also at
> loop no. 96)
> checking again, it turns out that IDL is only able to loop through the
> program 32 times before it gives the error.
> And then you have to close IDL completely before doing the next
> "cycle". I guess this is an IDL limitation when it comes to large
> files (HDF files ARE large...)
>
> is there a way to continuously reset the memory? (some IDL batch job)
> if not, I guess this leaves me with saving the data in 32-loop-
> increments (which is not such a pain considering that it's only 100
> files - at least this time)?
Did you try closing the file, as in my last suggestion?
This is pretty much what I expected without that close.
Cheers,
David
--
David Fanning, Ph.D.
Fanning Software Consulting, Inc.
Coyote's Guide to IDL Programming: http://www.dfanning.com/
Sepore ma de ni thui. ("Perhaps thou speakest truth.")
Re: reading multiple HDF files [message #61392 is a reply to message #61338]
Wed, 16 July 2008 06:37
David Fanning
julia.walterspiel@gmail.com writes:
> well, it works, it doesn't, then it does, then it doesn't...
> ???????
>
> here's the small code, if one of you could look at it quickly? I'm
> pretty sure the error will strike you immediately. However, it
> doesn't strike ME.
This is completely off the wall. I don't know anything
about this, but I've thought all morning long that this
was about losing access to logical unit numbers. Maybe
you could try putting an HDF_SD_END in the code at the
end of your loop. At least it would make me feel better
about it. :-)
Cheers,
David
--
David Fanning, Ph.D.
Fanning Software Consulting, Inc.
Coyote's Guide to IDL Programming: http://www.dfanning.com/
Sepore ma de ni thui. ("Perhaps thou speakest truth.")
Re: reading multiple HDF files [message #61393 is a reply to message #61338]
Wed, 16 July 2008 06:27
julia.walterspiel
well, it works, it doesn't, then it does, then it doesn't...
???????
here's the small code, if one of you could look at it quickly? I'm
pretty sure the error will strike you immediately. However, it
doesn't strike ME.
PRO read_multiple_hdf

   hdf_file_path = FILE_SEARCH('I:\zue\doc\ks\Temp_Satellite_Data\Temp_MODIS\MOD_08_jointAerosolWaterVaporCloudProduct\*.hdf')

   ;****************************************************************
   ;********** Begin the for loop to read all selected hdf files ***
   for i=0,n_elements(hdf_file_path)-1 do begin
      hdfid = HDF_SD_START(hdf_file_path(i))
      varnames = HDF_SD_VARDIR(hdfid)
      index = hdf_sd_nametoindex(hdfid, 'Cloud_Fraction_Mean_Mean')   ;varnames[Cloud_Fraction_Mean_Mean]
      if (index eq -1) then message, string(varnames, format='("Specified Variable not found: ", a)')
      varid = hdf_sd_select(hdfid, index)
      HDF_SD_GETDATA, varid, data   ;- The Cloud Fraction data will be stored in the variable "data"
      hdf_sd_endaccess, varid
      ; *********** get the filename only, without path
      filename_short = file_basename(hdf_file_path[i])
      ; to be continued....
   endfor

END
Re: reading multiple HDF files [message #61431 is a reply to message #61379]
Thu, 17 July 2008 08:40
MarioIncandenza
I. I see several people have recommended Liam Gumley's HDF routines,
but not the best one of all: SDS_READ
(http://www.ssec.wisc.edu/~gumley/sds_read.html). I have found this to
work with every MODIS product I've tried and several other kinds of
data, and it does everything for you. It has a GUI for easy file
exploration, but works great in scripts too. It's called like so:

IDL> sds_read, <HDF_FILE>, <DUMMY>, /INFO                      ; see a list of available SDS (GUI) and get metadata printed out
IDL> sds_read, <HDF_FILE>, <DATA>, /READ_ALL, SDS=<SDS_NAME>   ; pull in an SDS from the HDF file (no GUI)
II. Now, as for your application ending with:
homer[*,*,i]= data[*,*]
Your initial post suggested you were reading MODIS Level 2 data, which
have different geolocation for each granule. Thus, while you can
theoretically stack the data as you did there, you're missing a lot of
information you'll need to actually interpret the data. Maybe I'm
misunderstanding what you're trying to do, or maybe you're using MODIS
L3 data, where the reprojected data will actually stack properly.
III. For diagnosing memory limitation issues, you'll want MEMTEST. I
can no longer remember if this is a built-in or not, but if it's not
built-in, you'll want it in your library. It gives a concise rundown
of memory available and fragmentation.
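If MEMTEST isn't at hand, a quick-and-dirty check is to print the heap usage on each pass of the loop (a minimal sketch, not a replacement for MEMTEST's rundown):

; Sketch: watch for steadily growing memory use across iterations.
for i = 0, n_elements(hdf_file_path)-1 do begin
   ; ... open, read, and close the HDF file here ...
   help, /memory     ; prints the dynamic memory currently in use
endfor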
Good luck!
--Edward H.
Re: reading multiple HDF files [message #61563 is a reply to message #61379]
Fri, 18 July 2008 11:43
MarioIncandenza
On Jul 16, 8:54 am, julia.waltersp...@gmail.com wrote:
> hdfid = HDF_SD_START(hdf_file_path(i), /READ)
> varnames = HDF_SD_VARDIR(hdfid)
> index = hdf_sd_nametoindex(hdfid, 'Cloud_Fraction_Mean_Mean')
> varid = hdf_sd_select(hdfid, index)
> HDF_SD_GETDATA, varid, data   ;- The Cloud Fraction data will be stored in the variable "data"
> HDF_SD_ENDACCESS, varid
> HDF_SD_END, hdfid
I would recommend sds_read.pro
(http://www.ssec.wisc.edu/~gumley/sds_read.html). Using SDS_READ, the
above code could be reduced to:

IDL> sds_read, hdf_file_path[i], data, sds='Cloud_Fraction_Mean_Mean', /read_all
Also, I note this is a Level 2 product you are reading, so stacking
the data, while probably possible, may not give you what you want,
since each scene is separately geo-referenced.
Re: reading multiple HDF files [message #61670 is a reply to message #61563]
Thu, 24 July 2008 02:21
julia.walterspiel
>
> Also, I note this is a Level 2 product you are reading, so stacking
> the data, while probably possible, may not give you what you want,
> since each scene is separately geo-referenced.
Hi Ed,
For this project I'm using MODIS L3 data (the MOD_08 product), so I won't
have a problem with separate geo-referencing. However, I'm intending to
do similar work on MOD14, MOD15 and MOD35, so I will keep your info in
mind!
thanks!
juls