comp.lang.idl-pvwave archive
Messages from Usenet group comp.lang.idl-pvwave, compiled by Paulo Penteado

Re: Big arrays, reducing data [message #53090] Wed, 21 March 2007 18:43
Eric Hudson
On Mar 21, 5:26 pm, "Jean H." <jghas...@DELTHIS.ucalgary.ANDTHIS.ca>
wrote:
> Eric Hudson wrote:
>> Hi,
>
>> I have what I hope is an easy question and then probably a hard one.
>
>> 1) I need to make some big arrays (ideally 16000^2 elements or more)
>> but find I often get "unable to allocate memory" errors. Is there
>> some way of determining (at run time) the largest array that I can
>> make? In C, for example, I'd try to allocate the memory and check for
>> whether it was allocated, then cut the array size if it wasn't. Is
>> there an equivalent technique in IDL?
>
> There is the memTest procedure made by ITTVIS that displays the 10
> biggest arrays that you can store. I have modified this procedure so
> you can retrieve a) the size of the biggest array you can allocate and
> b) the TOTAL available memory.
>
> Here is a copy of the code... note that I only made small modifications
> to the header, as I did not intend to distribute this code.
> The calling sequence is:
> biggestArrayInBits = availableMemory()
> or biggestArrayInBits = availableMemory(TotalAvailableMemoryInBits)
>
Just what I needed, thanks!

Eric
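
[Editor's note: the availableMemory code itself was not preserved in this
archive. As a rough sketch of the C-style probe-and-check technique Eric
asks about in part 1), the same idea can be done in IDL with CATCH: try
to allocate, and halve the request on failure. The function name
largest_float_array and its keyword are hypothetical, not the memTest /
availableMemory code from the thread.]

```idl
; Hypothetical sketch: find (approximately) the largest FLOAT array
; that can be allocated, by probing with CATCH and halving on failure.
function largest_float_array, start_elements=start_elements
  compile_opt idl2
  ; default starting request: 16000^2 elements, as in the original post
  if n_elements(start_elements) eq 0 then start_elements = 16000LL * 16000LL
  n = long64(start_elements)
  while n gt 0 do begin
    catch, err
    if err ne 0 then begin
      catch, /cancel
      n = n / 2              ; allocation failed: try half the size
      continue
    endif
    a = fltarr(n, /nozero)   ; may raise "unable to allocate memory"
    catch, /cancel
    a = 0                    ; release the probe array
    return, n                ; this many elements fit
  endwhile
  return, 0LL                ; nothing could be allocated
end
```

The halving probe only brackets the answer to within a factor of two; a
bisection between the last failure and the last success would tighten it,
at the cost of more large temporary allocations.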
