comp.lang.idl-pvwave archive
Messages from Usenet group comp.lang.idl-pvwave, compiled by Paulo Penteado

working with very large data sets [message #14631] Thu, 18 March 1999
Steve Carothers (Messages: 5, Registered: March 1999)
I am working with a 71 MB data file on a UNIX server that has 256 MB of RAM and 500
MB of virtual memory. I'm not doing much data manipulation before I plot
the data, but it doesn't take much manipulation to exceed the memory
allocation. I understand the benefits of chunking up the data, but I would
really like to keep all the data together for plotting purposes. I think my
PV-Wave script could run properly if I could figure out how to minimize or
eliminate memory fragmentation. When I'm done with a variable I set it
equal to 0 to free up its memory. However, if I understand the manual
correctly, this will not free up contiguous memory, which is what I need.
Delstruct and delvar might help me, but they can only be used at the prompt,
not inside a script. I have a feeling I'll be forced to chunk up the data.
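For concreteness, the kind of chunked processing I'm trying to avoid would look something like this sketch (file name, record type, and chunk size are all illustrative; this assumes a flat binary file of floats):

```idl
; Sketch: read and process a large binary file of floats in chunks.
; 'data.dat' and chunk_size are placeholder values, not my real setup.
n_total = 71L * 1024L * 1024L / 4L   ; rough float count in a ~71 MB file
chunk_size = 1000000L
get_lun, lun
openr, lun, 'data.dat'
buf = fltarr(chunk_size)
n_read = 0L
while n_read lt n_total do begin
    n = (n_total - n_read) < chunk_size   ; last chunk may be short
    if n lt chunk_size then buf = fltarr(n)
    readu, lun, buf
    ; ... process or plot buf here ...
    n_read = n_read + n
endwhile
free_lun, lun
buf = 0   ; release the buffer, as described above
```

This keeps the working set to one chunk at a time, but of course the chunks are no longer together for plotting.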

Also, is there a way to remove a set of records from an array of structures,
given the indices of the records to delete, without using the "where" command
and without creating a temporary variable in memory?
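For reference, the usual "where"-based idiom I'm trying to avoid looks something like this (names and sizes are illustrative):

```idl
; Sketch: drop records 2, 5, and 7 from an array of structures by
; keeping everything else. Note the subscript operation on the last
; line creates a temporary copy, which is exactly the problem.
recs = replicate({x:0.0, y:0.0}, 10)
kill = [2, 5, 7]                     ; indices of records to delete
mask = bytarr(n_elements(recs)) + 1b
mask(kill) = 0b
recs = recs(where(mask))             ; 7 records remain
```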

Any advice would be appreciated.

Steve
