comp.lang.idl-pvwave archive
Messages from Usenet group comp.lang.idl-pvwave, compiled by Paulo Penteado

Re: writing large 3D data file fails [message #68235] Thu, 08 October 2009 07:26
penteado
Messages: 866
Registered: February 2018
Senior Member
Administrator
On Oct 8, 8:32 am, Nigel Wade <n...@ion.le.ac.uk> wrote:
> I can't test your exact array because I don't have sufficient RAM, that
> array is over 50GB and I only have 32GB.
>
> However, attempting to write a smaller array (fltarr(4008,4008,200),
> which by my reckoning is about 12GB) causes a segmentation violation. The
> resulting file in my case is actually empty.
>
> IDL> volume=fltarr(4008,4008,200)
> IDL> help,/memory
> heap memory used: 12852030500, max: 12916286829, gets:      459,
> frees:      142
> IDL> GET_LUN, lun
> IDL> OPENW, lun,'bigfile'
> IDL> WRITEU, lun, volume
> Segmentation fault
>
> # ls -l bigfile
> -rw-r--r--  1 root root 0 Oct  8 12:25 bigfile
>
> I don't think WRITEU likes very big files. Maybe it's not built with
> largefile support, and internally uses a 32-bit file pointer. I can't see
> why it would be, being a 64-bit application, but what else might cause the
> error?
>
> --
> Nigel Wade

I do not know if it will be relevant to this case, but there is a page
in the IDL help that specifically talks about some issues that may
occur with large files. It is at

IDL Programmers' Guides > Application Programming > Part II:
Components of the IDL Language > Files and Input/Output > Reading and
Writing Very Large Files
Re: writing large 3D data file fails [message #68244 is a reply to message #68235] Thu, 08 October 2009 04:32
Nigel Wade
Messages: 286
Registered: March 1998
Senior Member
I can't test your exact array because I don't have sufficient RAM, that
array is over 50GB and I only have 32GB.

However, attempting to write a smaller array (fltarr(4008,4008,200),
which by my reckoning is about 12GB) causes a segmentation violation. The
resulting file in my case is actually empty.

IDL> volume=fltarr(4008,4008,200)
IDL> help,/memory
heap memory used: 12852030500, max: 12916286829, gets: 459,
frees: 142
IDL> GET_LUN, lun
IDL> OPENW, lun,'bigfile'
IDL> WRITEU, lun, volume
Segmentation fault

# ls -l bigfile
-rw-r--r-- 1 root root 0 Oct 8 12:25 bigfile


I don't think WRITEU likes very big files. Maybe it's not built with
largefile support, and internally uses a 32-bit file pointer. I can't see
why it would be, being a 64-bit application, but what else might cause the
error?

--
Nigel Wade
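[Editor's note: Nigel's 32-bit file pointer hypothesis fits the numbers in this thread remarkably well. If the byte count of a write wraps modulo 2^32, the original 4008x4008x865 float array would leave behind a file of roughly the size Dorthe reported. A quick check of the arithmetic, with Python used purely as a calculator and the dimensions taken from the thread:]

```python
# If WRITEU tracked the transfer size in a 32-bit counter, the count
# would wrap modulo 2**32; see what that predicts for this array.
nx, ny, nz = 4008, 4008, 865        # dimensions of the fltarr in the thread
nbytes = nx * ny * nz * 4           # 4 bytes per 32-bit float
wrapped = nbytes % 2**32            # what a wrapped 32-bit counter retains
print(nbytes)   # 55581661440 -> about 51.8 GB, the full array
print(wrapped)  # 4042053888  -> about 3.8 GB, near the observed ~3.5 GB
```

[This is only arithmetic, not a diagnosis, but the near-match between the wrapped size and the ~3.5 GB file Dorthe saw is suggestive.]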
Re: writing large 3D data file fails [message #68245 is a reply to message #68244] Thu, 08 October 2009 02:13
dorthe
Messages: 12
Registered: April 2007
Junior Member
On Oct 8, 1:32 am, Nigel Wade <n...@ion.le.ac.uk> wrote:
> On Thu, 08 Oct 2009 01:10:09 -0700, Dorthe Wildenschild wrote:
>> On Oct 7, 9:05 am, David Fanning <n...@dfanning.com> wrote:
>>> Dorthe Wildenschild writes:
>>>> I have a fltarr of 4008x4008x865 voxels that I'm trying to write to a
>>>> file using
>
>>>> GET_LUN, lun
>>>> OPENW, lun, '/nfs/blahblah.dat'
>>>> WRITEU, lun, volume
>>>> CLOSE, lun
>>>> FREE_LUN, lun
>
>>>> this normally works like a charm for writing a simple binary data
>>>> file, but for this large dataset, I can't get it to work. The file
>>>> that gets written is way too small (about 3.5 GB - if I write it as
>>>> a netCDF it is ~21 GB, which is more like the right size)
>
>>>> Any ideas what goes wrong here?
>
>>> My guess would be a 32-bit operating system. :-)
>
>>> Cheers,
>
>>> David
>
>>> --
>>> David Fanning, Ph.D.
>>> Coyote's Guide to IDL Programming (www.dfanning.com) Sepore ma de ni
>>> thui. ("Perhaps thou speakest truth.")
>
>> can't be, the system is 64 bit, Linux - with 64 GB of memory, so should
>> be OK (and it is, I don't get any errors)
>
> What is the NFS server filesystem/OS and NFS version? (the /nfs sort of
> implies it's NFS mounted). I've never tried read/write multi-GB files
> over NFS, but there could be issues there.
>
> --
> Nigel Wade

it's just the naming structure for our various unix-based RAID
storage - they write fine normally; I wrote the 21 GB netCDF file
just fine
Re: writing large 3D data file fails [message #68247 is a reply to message #68245] Thu, 08 October 2009 01:32
Nigel Wade
Messages: 286
Registered: March 1998
Senior Member
On Thu, 08 Oct 2009 01:10:09 -0700, Dorthe Wildenschild wrote:

> On Oct 7, 9:05 am, David Fanning <n...@dfanning.com> wrote:
>> Dorthe Wildenschild writes:
>>> I have a fltarr of 4008x4008x865 voxels that I'm trying to write to a
>>> file using
>>
>>> GET_LUN, lun
>>> OPENW, lun, '/nfs/blahblah.dat'
>>> WRITEU, lun, volume
>>> CLOSE, lun
>>> FREE_LUN, lun
>>
>>> this normally works like a charm for writing a simple binary data
>>> file, but for this large dataset, I can't get it to work. The file
>>> that gets written is way too small (about 3.5 GB - if I write it as
>>> a netCDF it is ~21 GB, which is more like the right size)
>>
>>> Any ideas what goes wrong here?
>>
>> My guess would be a 32-bit operating system. :-)
>>
>> Cheers,
>>
>> David
>>
>> --
>> David Fanning, Ph.D.
>> Coyote's Guide to IDL Programming (www.dfanning.com) Sepore ma de ni
>> thui. ("Perhaps thou speakest truth.")
>
>
> can't be, the system is 64 bit, Linux - with 64 GB of memory, so should
> be OK (and it is, I don't get any errors)

What is the NFS server filesystem/OS and NFS version? (the /nfs sort of
implies it's NFS mounted). I've never tried read/write multi-GB files
over NFS, but there could be issues there.

--
Nigel Wade
Re: writing large 3D data file fails [message #68248 is a reply to message #68247] Thu, 08 October 2009 01:10
dorthe
Messages: 12
Registered: April 2007
Junior Member
On Oct 7, 9:05 am, David Fanning <n...@dfanning.com> wrote:
> Dorthe Wildenschild writes:
>> I have a fltarr of 4008x4008x865 voxels that I'm trying to write to a
>> file using
>
>> GET_LUN, lun
>> OPENW, lun, '/nfs/blahblah.dat'
>> WRITEU, lun, volume
>> CLOSE, lun
>> FREE_LUN, lun
>
>> this normally works like a charm for writing a simple binary data
>> file, but for this large dataset, I can't get it to work. The file
>> that gets written is way too small (about 3.5 GB - if I write it as a
>> netCDF it is ~21 GB, which is more like the right size)
>
>> Any ideas what goes wrong here?
>
> My guess would be a 32-bit operating system. :-)
>
> Cheers,
>
> David
>
> --
> David Fanning, Ph.D.
> Coyote's Guide to IDL Programming (www.dfanning.com)
> Sepore ma de ni thui. ("Perhaps thou speakest truth.")


can't be, the system is 64 bit, Linux - with 64 GB of memory, so
should be OK (and it is, I don't get any errors)
Re: writing large 3D data file fails [message #68255 is a reply to message #68248] Wed, 07 October 2009 11:01
JohnSmith
Messages: 4
Registered: September 2009
Junior Member
"Dorthe Wildenschild" <dorthe@engr.orst.edu> wrote in message
news:adc5396b-772f-47cc-9208-cb932e18b0fa@33g2000vbe.googlegroups.com...
One more question:

I have a fltarr of 4008x4008x865 voxels that I'm trying to write to a
file using

GET_LUN, lun
OPENW, lun, '/nfs/blahblah.dat'
WRITEU, lun, volume
CLOSE, lun
FREE_LUN, lun

this normally works like a charm for writing a simple binary data
file, but for this large dataset, I can't get it to work. The file
that gets written is way too small (about 3.5 GB - if I write it as a
netCDF it is ~21 GB, which is more like the right size)

Any ideas what goes wrong here?
Thanks,
Dorthe

*******************************


Isn't your close,lun messing up your free_lun?
Re: writing large 3D data file fails [message #68259 is a reply to message #68255] Wed, 07 October 2009 09:05
David Fanning
Messages: 11724
Registered: August 2001
Senior Member
Dorthe Wildenschild writes:

> I have a fltarr of 4008x4008x865 voxels that I'm trying to write to a
> file using
>
> GET_LUN, lun
> OPENW, lun, '/nfs/blahblah.dat'
> WRITEU, lun, volume
> CLOSE, lun
> FREE_LUN, lun
>
> this normally works like a charm for writing a simple binary data
> file, but for this large dataset, I can't get it to work. The file
> that gets written is way too small (about 3.5 GB - if I write it as a
> netCDF it is ~21 GB, which is more like the right size)
>
> Any ideas what goes wrong here?

My guess would be a 32-bit operating system. :-)

Cheers,

David

--
David Fanning, Ph.D.
Coyote's Guide to IDL Programming (www.dfanning.com)
Sepore ma de ni thui. ("Perhaps thou speakest truth.")
Re: writing large 3D data file fails [message #68302 is a reply to message #68244] Tue, 13 October 2009 02:58
dorthe
Messages: 12
Registered: April 2007
Junior Member
On 8 Oct, 13:32, Nigel Wade <n...@ion.le.ac.uk> wrote:
> I can't test your exact array because I don't have sufficient RAM, that
> array is over 50GB and I only have 32GB.
>
> However, attempting to write a smaller array (fltarr(4008,4008,200),
> which by my reckoning is about 12GB) causes a segmentation violation. The
> resulting file in my case is actually empty.
>
> IDL> volume=fltarr(4008,4008,200)
> IDL> help,/memory
> heap memory used: 12852030500, max: 12916286829, gets:      459,
> frees:      142
> IDL> GET_LUN, lun
> IDL> OPENW, lun,'bigfile'
> IDL> WRITEU, lun, volume
> Segmentation fault
>
> # ls -l bigfile
> -rw-r--r--  1 root root 0 Oct  8 12:25 bigfile
>
> I don't think WRITEU likes very big files. Maybe it's not built with
> largefile support, and internally uses a 32-bit file pointer. I can't see
> why it would be, being a 64-bit application, but what else might cause the
> error?
>
> --
> Nigel Wade

Thanks for trying to help - I really don't know what's wrong, but
instead of working out the kinks of IDL, I may just write the volume
as 3 or 4 smaller sections... a bit sad, though.
Cheers,
Dorthe
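[Editor's note: the workaround Dorthe describes - writing the volume in a few smaller sections - can be sketched as below. This is a hedged illustration in Python, with numpy's tofile standing in for WRITEU and toy dimensions so it runs quickly; the slab thickness and file location are arbitrary choices. Note that numpy arrays are row-major, so the slabs here run along the first axis; in column-major IDL the equivalent would be slabs along the last dimension, e.g. volume[*,*,z0:z1].]

```python
import os
import tempfile

import numpy as np

# Toy stand-in for the 4008x4008x865 fltarr; small so the sketch runs fast.
volume = np.arange(10 * 8 * 8, dtype=np.float32).reshape(10, 8, 8)

nslab = 3  # slab thickness along the first axis (an arbitrary choice)
path = os.path.join(tempfile.mkdtemp(), "bigfile.dat")

# Write one slab per call so no single write sees the full array size.
with open(path, "wb") as f:
    for i0 in range(0, volume.shape[0], nslab):
        volume[i0:i0 + nslab].tofile(f)  # contiguous slab in C order

# The chunked file is byte-identical to writing the whole array at once.
data = np.fromfile(path, dtype=np.float32).reshape(volume.shape)
assert np.array_equal(data, volume)
```

[Because each slab is contiguous in the array's native memory order, concatenating the slabs reproduces exactly the file a single whole-array write would have produced - so downstream readers need no changes.]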