Re: working with very large data sets [message #14629 is a reply to message #14628]
Thu, 18 March 1999 00:00
R.Bauer
Steve Carothers wrote:
> I am working with a 71 MB data file on a UNIX server that has 256 MB of
> RAM and 500 MB of virtual memory. I'm not doing much data manipulation
> before I plot the data, but it doesn't take much manipulation to exceed
> the memory allocation. I understand the benefits of chunking up the data,
> but I would really like to keep all the data together for plotting
> purposes. I think my PV-Wave script could run properly if I could figure
> out how to minimize or eliminate memory fragmentation. When I'm done with
> a variable I set it equal to 0 to free up the memory. However, if I
> understand the manual correctly, this will not free up contiguous memory,
> which is what I need. DELSTRUCT and DELVAR might help me, but they can't
> be used inside a script, only at the prompt. I have a feeling I'll be
> forced to chunk up the data.
>
> Also, is there a way to remove a set of records from an array of
> structures, if the records to delete are known, without using the "where"
> command and without creating a temporary variable in memory?
>
> Any advice would be appreciated.
>
> Steve
Have a look at the TEMPORARY function, e.g.

a = bytarr(2e5)        ; 200000-element byte array
b = temporary(a) + 1b  ; the expression can reuse a's memory instead of
                       ; allocating a second array; a is undefined afterwards
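
For the record-removal part of your question, here is a minimal sketch of
how TEMPORARY can at least cut down the copying. It assumes "keep" already
holds the indices of the records to retain (however you obtained them); the
names data, keep and subset are made up for illustration.

subset = data(keep)       ; pulls the kept records into a second array
data = temporary(subset)  ; reassigns without making yet another copy;
                          ; subset becomes undefined and the old contents
                          ; of data are released by the reassignment

The subscript still has to allocate the subset next to the original for a
moment, so this does not remove the temporary array entirely; it only
avoids a further copy when the result is assigned back.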
R.Bauer