"OPENW: Error opening file. Unit: (...) Too many links" - What does this mean? (Possible I/O problem) [message #85113] |
Fri, 05 July 2013 10:00  |
Jasdeep Anand
Messages: 16 Registered: August 2011
|
Junior Member |
|
|
Hey all,
I'm writing a program that reads and writes several thousand ASCII files in several loops. This code has worked reliably when the total number of files was about 90,000. However, I've recently increased the number of loop iterations so that I create many more files (~500,000 in all). When I do this, the program starts off fine until it reaches a certain point (at about the 175,000th file), when I suddenly get this message:
% OPENW: Error opening file. Unit: (...)
File: (...)
Too many links
% Execution halted at: $MAIN$ (...)
I have not seen this error before. At first I thought that the file unit was being reused too often, so I made sure every OPENW statement has a corresponding FREE_LUN statement once I'm finished writing to a file (I'm still using the same variable, "lun", for all these units, since I thought FREE_LUN would let me reuse it for the next file), but I still get the same message at the same point. What else could be the problem here? I haven't managed to find a similar instance of this problem anywhere else, and the IDL documentation isn't helping much.
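For reference, the open/write/free pattern described above looks roughly like this in IDL. This is a minimal sketch; `filenames`, `data`, and `n_files` are placeholders for whatever the real program uses:

```idl
; Sketch of the pattern described in the post (names are placeholders).
; OPENW with /GET_LUN allocates a unit from the 100-128 pool;
; FREE_LUN closes the file and returns the unit to the pool,
; so the same variable can be reused for the next file.
FOR i = 0L, n_files - 1L DO BEGIN
  OPENW, lun, filenames[i], /GET_LUN
  PRINTF, lun, data[i]
  FREE_LUN, lun
ENDFOR
```

With this pairing in place, the unit pool itself should never run out, which is consistent with the error persisting even after the FREE_LUN fix.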
Thanks,
Jas.
Re: "OPENW: Error opening file. Unit: (...) Too many links" - What does this mean? (Possible I/O problem) [message #85118 is a reply to message #85113]
Sun, 07 July 2013 04:33
skymaxwell@gmail.com
Messages: 127 Registered: January 2007
Senior Member
On Friday, July 5, 2013, 21:00:48 UTC+4, Jas wrote:
> Hey all,
>
> I'm writing a program that reads and writes several thousand ASCII files in several loops. This code has traditionally worked when the total number of files was about 90,000. However, I've recently increased the loops involved so that I create many more (~ 500,000 in all). When I do this, the program starts off fine, until up to a certain point (at about the 175,000th file), when I suddenly get this message:
>
> % OPENW: Error opening file. Unit: (...)
> File: (...)
> Too many links
> % Execution halted at: $MAIN$ (...)
>
> I have not seen this error before. At first I thought that the file unit was being reused too much, so I tried to make every OPENW statement have a corresponding FREE_LUN statement when I was finished writing to a different file (I'm still using the same variable, "lun" for all these units, because I thought that FREE_LUN would mean I could reuse it again for another file), but I still get the same message at the same point in time. What else could be the problem here? I've not managed to find a similar instance of this problem occurring anywhere else and the IDL documentation isn't helping much.
>
> Thanks,
> Jas.
Hi. From the IDL help on the GET_LUN procedure:
"The file unit number obtained is in the range 100 to 128"
I think you need to check your LUN numbers and verify that all files are being closed correctly.
I can also guess that if you have loops, you should check the type of the counter variables there. For example, if you have
FOR I=0,10000000 DO BEGIN
change it to
FOR I=0,10000000LL DO BEGIN
Re: "OPENW: Error opening file. Unit: (...) Too many links" - What does this mean? (Possible I/O problem) [message #85130 is a reply to message #85113]
Tue, 09 July 2013 03:27
Carsten Lechte
Messages: 124 Registered: August 2006
Senior Member
On 05/07/13 19:00, Jas wrote:
> files was about 90,000. However, I've recently increased the loops involved
> so that I create many more (~ 500,000 in all).
...
> % OPENW: Error opening file. Unit: (...) File: (...) Too many links %
> Execution halted at: $MAIN$ (...)
That looks like the underlying file system complaining that the maximum number
of entries in a directory has been reached ("Too many links" is the operating
system's EMLINK error). Depending on which filesystem and operating system you
use, it may be possible to increase this limit.
However, it might be better to reconsider generating so many files in the
first place. If you really need so many separate files, you could group them
into 500 directories with about 1,000 files each.
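The grouping idea could be sketched in IDL like this, assuming sequentially numbered output files; the directory and file name formats are made up for illustration:

```idl
; Sketch: spread n_files outputs over subdirectories of ~1000 files
; each, so no single directory hits the filesystem's entry limit.
; Names ("group_NNNN", "out_NNNNNN.txt") are placeholders.
files_per_dir = 1000L
FOR i = 0L, n_files - 1L DO BEGIN
  ; Integer division maps files 0-999 to group_0000, 1000-1999 to group_0001, etc.
  subdir = STRING(i / files_per_dir, FORMAT='("group_", I4.4)')
  IF ~FILE_TEST(subdir, /DIRECTORY) THEN FILE_MKDIR, subdir
  fname = FILEPATH(STRING(i, FORMAT='("out_", I6.6, ".txt")'), ROOT_DIR=subdir)
  OPENW, lun, fname, /GET_LUN
  PRINTF, lun, data[i]
  FREE_LUN, lun
ENDFOR
```

For 500,000 files this yields 500 directories of 1,000 files each, well below typical per-directory limits.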
chl