comp.lang.idl-pvwave archive
Messages from Usenet group comp.lang.idl-pvwave, compiled by Paulo Penteado

Re: Dealing with Large data arrays, reducing memory and ASSOC [message #54462 is a reply to message #54460] Thu, 14 June 2007 09:32
JD Smith
On Thu, 14 Jun 2007 08:08:44 -0500, Kenneth Bowman wrote:

> In article <1181824433.145388.26020@d30g2000prg.googlegroups.com>,
> Ambrosia_Everlovely <ambrosia_everlovely@hotmail.com> wrote:
>
>> [quoted text muted]
>
> I would just do it in slices:
>
> dct = COMPLEXARR(512,512,2048)
> FOR j = 0, 511 DO dct[*,j,*] = FFT(REFORM(dc[*,j,*]), -1, DIM = 2)
>
> This does access memory in nearly the worst possible way. If you are
> going to be doing this a lot, you might want to consider rearranging the
> data so that t is the first dimension:
>
> dct = COMPLEXARR(2048,512,512)
> FOR k = 0, 511 DO dct[0,0,k] = FFT(REFORM(x[*,*,k]), -1, DIM = 1)
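
(The rearrangement step Ken alludes to can be done with TRANSPOSE; a
minimal sketch, assuming dc is the original 512 x 512 x 2048 array and
x is the time-first copy used in the loop above. Note the transposed
copy temporarily doubles memory use.)

  ; Move t (dimension 3) to the front: permutation [2,0,1] makes
  ; x a 2048 x 512 x 512 array.
  x = TRANSPOSE(dc, [2,0,1])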

I'd be interested to hear whether this "in order" type of array
re-arrangement results in a real speedup. I had always assumed it
does, but in recent testing on a very different problem I found
little or no gain, to my surprise.
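
A quick way to check is to time both loop orders on the same data; a
minimal sketch, assuming smaller (hypothetical) test dimensions so the
original and transposed copies both fit in memory:

  nx = 256 & ny = 256 & nt = 1024           ; hypothetical test sizes
  dc  = COMPLEXARR(nx, ny, nt)              ; t last
  dct = COMPLEXARR(nx, ny, nt)

  t0 = SYSTIME(/SECONDS)
  FOR j = 0, ny-1 DO dct[*,j,*] = FFT(REFORM(dc[*,j,*]), -1, DIM = 2)
  PRINT, 't last:  ', SYSTIME(/SECONDS) - t0, ' seconds'

  x  = TRANSPOSE(dc, [2,0,1])               ; t first: nt x nx x ny
  xt = COMPLEXARR(nt, nx, ny)
  t0 = SYSTIME(/SECONDS)
  FOR k = 0, ny-1 DO xt[0,0,k] = FFT(REFORM(x[*,*,k]), -1, DIM = 1)
  PRINT, 't first: ', SYSTIME(/SECONDS) - t0, ' seconds'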

JD