comp.lang.idl-pvwave archive
Messages from Usenet group comp.lang.idl-pvwave, compiled by Paulo Penteado

Re: Dealing with Large data arrays, reducing memory and ASSOC [message #54466 is a reply to message #54465] Thu, 14 June 2007 06:41
From: bill.dman (Junior Member; Messages: 17; Registered: June 2007)
On Jun 14, 8:33 am, Ambrosia_Everlovely
<ambrosia_everlov...@hotmail.com> wrote:
> Hi,
> I have a fairly large datacube, DC(x,y,t) = DC(512,512,2048), and I want
> to perform an FFT in the t direction. I can do
> FFTDC = fft(DC, -1, dim=3), but this takes an excessive amount of memory
> (19 GB + 50 GB virtual) and slows the whole system down.
> Since this must be a fairly common task among astronomers, can anyone
> provide - or link to - a small IDL routine that will let me use ASSOC
> or otherwise reduce the memory? I have also tried TEMPORARY, but it
> doesn't seem to help at all.
>
> Thank you!

Assuming you are using single precision, you can limit the memory needed to
about 6 GB (2 GB for the 512x512x2048 float cube plus 4 GB for its complex
result):

fftdc = complexarr(512,512,2048)
for i=0,511 do for j=0,511 do fftdc[i,j,0] = fft(dc[i,j,*],-1)

This should help if your machine has more than 6 GB available, though note
that it performs 512*512 separate 2048-point FFTs, which is slower than
transforming many pixels at once.
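Since the question asks about ASSOC specifically, here is a minimal sketch of
an out-of-core variant that keeps memory well under 1 GB. It assumes the cube
is stored on disk as raw single-precision floats in [x,y,t] order (t slowest),
so that record k of the input file is the 512x512 image at time step k; the
file names and the block size nb are placeholders, not anything from the
original post:

; Out-of-core FFT along t using ASSOC, processing nb y-rows per pass.
nx = 512 & ny = 512 & nt = 2048
nb = 32                                         ; y-rows per pass (tune to taste)

openr, inlun,  'dc_cube.dat', /get_lun
openw, outlun, 'fftdc_cube.dat', /get_lun

plane  = assoc(inlun,  fltarr(nx, ny))          ; input record k = time slice k
outrec = assoc(outlun, complexarr(nx, nb, nt))  ; one transformed slab per record

slab = fltarr(nx, nb, nt)
for j0 = 0, ny-1, nb do begin
   for k = 0, nt-1 do begin
      p = plane[k]                              ; read one time slice from disk
      slab[*, *, k] = p[*, j0:j0+nb-1]          ; keep only this block of y-rows
   endfor
   outrec[j0/nb] = fft(slab, -1, dimension=3)   ; FFT along t for the whole block
endfor

free_lun, inlun, outlun

With nb = 32 the peak working set is roughly 400 MB per pass (the float slab
plus its complex transform), at the cost of re-reading each time slice
ny/nb = 16 times; a larger nb trades memory for less I/O. As for TEMPORARY: it
only avoids copies in expressions like a = temporary(a) + 1, and it cannot
help here because FFT must allocate a full complex output array regardless.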