Re: reading multiple FITS files from a directory [message #84542]
Thu, 06 June 2013 12:36
Craig Markwardt
On Wednesday, June 5, 2013 9:07:23 PM UTC-4, wno...@gmail.com wrote:
> On Wednesday, June 5, 2013 5:47:35 PM UTC-7, wno...@gmail.com wrote:
>> Hi,
>>
>> Completely new to IDL here. Let's say I have a folder called 'Data' with a bunch of FITS files. How would I go about reading all the FITS files from the folder and storing them in an array of structures? These are images from a CCD, by the way, if that helps. I can read a single file using MRDFITS, but is there a good way to read a bunch of files from a folder or directory? Any help would be greatly appreciated. Thanks.
>
> OK, as soon as I ask, I come up with something on my own. It looks like I can use FINDFILE to put all the files in a string array, then I could just use a FOR loop to read each single one. Now I'm thinking what I really would like to do is distinguish between them based on their headers. Off to study FITS some more. Feel free to ignore the post, unless of course you have some sage advice. I'm all ears. I've read through several posts and found them extremely helpful. Thanks to everyone who contributes. Hopefully I can get this stuff down well enough to actually contribute myself.
The optional third argument to MRDFITS() returns the header. For example:

  img = MRDFITS('myfile.fits', 0, header)

The header is simply an array of strings. You can use FXPAR() to query it:

  obj = FXPAR(header, 'OBJECT')

Then decide whether you want to keep the image:

  if obj EQ 'MY_FAVORITE_OBJECT' then begin
     ... save image data for later ...
  endif

You will probably want to save the header data too, because it contains lots of useful metadata.
Craig
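[Editorial sketch: putting the pieces of this thread together, one possible way to loop over a directory and keep only matching images. The 'Data' directory name comes from the original question, while the OBJECT value is a placeholder; the LIST-based accumulation requires IDL 8 or later and is just one way to collect the matches.]

  files = FILE_SEARCH('Data', '*.fits', COUNT=nfiles)
  keep = LIST()
  for i = 0, nfiles-1 do begin
     ; read the primary HDU and its header
     img = MRDFITS(files[i], 0, header, /SILENT)
     obj = FXPAR(header, 'OBJECT')
     ; keep image, header, and filename together if the object matches
     if STRTRIM(obj, 2) EQ 'MY_FAVORITE_OBJECT' then $
        keep.Add, {file: files[i], header: header, image: img}
  endfor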
Re: reading multiple FITS files from a directory [message #84557 is a reply to message #84542]
Wed, 05 June 2013 20:36
David Fanning
wnolan4@gmail.com writes:
> OK, as soon as I ask, I come up with something on my own. It looks like I can use FINDFILE to put all the files in a string array, then I could just use a FOR loop to read each single one. Now I'm thinking what I really would like to do is distinguish between them based on their headers. Off to study FITS some more. Feel free to ignore the post, unless of course you have some sage advice. I'm all ears. I've read through several posts and found them extremely helpful. Thanks to everyone who contributes. Hopefully I can get this stuff down well enough to actually contribute myself.
Use File_Search, rather than FindFile. You will find *all* the files you
are looking for that way. :-)
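[Editorial sketch: a minimal FILE_SEARCH call for this case; the 'Data' directory name is taken from the original question.]

  ; recursively find FITS files under Data; COUNT reports how many matched
  files = FILE_SEARCH('Data', '*.fits', COUNT=nfiles)
  if nfiles EQ 0 then MESSAGE, 'No FITS files found in Data'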
Cheers,
David
--
David Fanning, Ph.D.
Fanning Software Consulting, Inc.
Coyote's Guide to IDL Programming: http://www.idlcoyote.com/
Sepore ma de ni thue. ("Perhaps thou speakest truth.")
Re: reading multiple FITS files from a directory [message #84558 is a reply to message #84557]
Wed, 05 June 2013 18:07
wnolan4
On Wednesday, June 5, 2013 5:47:35 PM UTC-7, wno...@gmail.com wrote:
> Hi,
>
> Completely new to IDL here. Let's say I have a folder called 'Data' with a bunch of FITS files. How would I go about reading all the FITS files from the folder and storing them in an array of structures? These are images from a CCD, by the way, if that helps. I can read a single file using MRDFITS, but is there a good way to read a bunch of files from a folder or directory? Any help would be greatly appreciated. Thanks.

OK, as soon as I ask, I come up with something on my own. It looks like I can use FINDFILE to put all the files in a string array, then I could just use a FOR loop to read each single one. Now I'm thinking what I really would like to do is distinguish between them based on their headers. Off to study FITS some more. Feel free to ignore the post, unless of course you have some sage advice. I'm all ears. I've read through several posts and found them extremely helpful. Thanks to everyone who contributes. Hopefully I can get this stuff down well enough to actually contribute myself.