comp.lang.idl-pvwave archive
Messages from Usenet group comp.lang.idl-pvwave, compiled by Paulo Penteado

Re: Where vs Histogram vs ?? [message #32613 is a reply to message #32526] Thu, 17 October 2002 20:21
From: Craig Markwardt
Andrew Cool <andrew.cool@dsto.defence.gov.au> writes:
> At the moment I'm doing something like this :-
>
> start_year = 2000
> end_year = 2002
> start_day = 120
> end_day = 133
> start_half_hr = 0
> end_half_hr = 47
> WRF = 1
> FREQ = 2
> start_beam = 0
> end_beam = 3
> nominated_parameter = 2
>
> index = Where(!database.year GE start_year AND $
>               !database.year LE end_year AND $
>               !database.day GE start_day AND $
>               !database.day LE end_day AND $
>               !database.beam GE start_beam AND $
>               !database.beam LE end_beam AND $
>               !database.half_hr GE start_half_hr AND $
>               !database.half_hr LE end_half_hr AND $
>               !database.WRF EQ WRF AND $
>               !database.FREQ EQ FREQ AND $
>               !database.parameter(nominated_parameter) NE bad_data_value)

I'll be the broken record, and agree with everybody else that
structure access is slow.

I think this could be much faster to access as, *gasp*, a common block.
If each parameter were an array variable in a common block, you would
save the considerable time spent extracting the fields from the
structures in each comparison.
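
A minimal sketch of what I mean, assuming the database is unpacked into
plain arrays (the common block and array names here are only illustrative):

  ; Each field becomes an ordinary array variable shared via a common block.
  COMMON database_arrays, year, day, half_hr, beam, wrf, freq, parameter

  ; Comparisons then run directly on the arrays, with no structure field
  ; extraction on every access.
  index = WHERE(year GE start_year AND year LE end_year AND $
                day GE start_day AND day LE end_day, count)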

You also definitely want to add a field which is the Julian day (with the
half-hour folded in as a day fraction), since that reduces the date/time
test from three fields to one, and I think it will save space. Or, are
you *really* interested in data from days 120-133 in years 2000, 2001,
and 2002 combined?
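
If a single contiguous date range is really what you want, something like
this would do it (JULDAY is standard IDL; the variable names follow the
illustrative arrays above):

  ; One double-precision time stamp per record: the Julian day at the start
  ; of the year, plus the day-of-year, plus the half-hour as a day fraction.
  jday = JULDAY(1, 1, year) - 1d0 + day + half_hr/48d0

  ; The date/time cut is then a single pair of comparisons.
  jd_lo = JULDAY(1, 1, start_year) - 1d0 + start_day + start_half_hr/48d0
  jd_hi = JULDAY(1, 1, end_year) - 1d0 + end_day + end_half_hr/48d0
  index = WHERE(jday GE jd_lo AND jday LE jd_hi, count)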

Finally, if you can, try to thin the array first by applying the most
stringent selection. For example, if you are only looking in a narrow
date range, first extract only the records from that date range, then go
back and apply the other criteria to that subset.
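
Roughly, using the illustrative arrays and Julian dates from the sketches
above (the hard-coded 1 and 2 stand in for the WRF and FREQ selections):

  ; Pass 1: cut on the most selective criterion (the date range).
  index = -1L
  idate = WHERE(jday GE jd_lo AND jday LE jd_hi, ndate)

  ; Pass 2: apply the remaining tests only to the survivors.
  IF ndate GT 0 THEN BEGIN
      isub = WHERE(beam[idate] GE start_beam AND beam[idate] LE end_beam AND $
                   wrf[idate] EQ 1 AND freq[idate] EQ 2, nsub)
      IF nsub GT 0 THEN index = idate[isub]
  ENDIF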

With 15 million samples, anything you do will take quite a bit of
time. However, I regularly do operations on 3-million-sample arrays,
and it isn't *too* bad.

Hope that helps!

Craig


--
----------------------------------------------------------------------------
Craig B. Markwardt, Ph.D. EMAIL: craigmnet@cow.physics.wisc.edu
Astrophysics, IDL, Finance, Derivatives | Remove "net" for better response
----------------------------------------------------------------------------