comp.lang.idl-pvwave archive
Messages from Usenet group comp.lang.idl-pvwave, compiled by Paulo Penteado

shmmap questions [message #82366] Sun, 09 December 2012 22:14
Russell Ryan
Hi gang,

I'm working on a Markov Chain Monte Carlo code to model some images, specifically using the parallel tempering algorithm. Since this algorithm is very computationally expensive, I'm using the IDL_IDLBridge to parallelize the process (basically running several chains in parallel). But to compute the likelihoods I need a large array of model luminosities, and if I simply passed a copy to each bridge I'd quickly run out of memory. So now I pass the data through shared memory, using shmmap. Everything works when the model array is smallish, but the code gives the message:

% SHMMAP: Existing file too short for desired mapping.

if I make the model array large. I can't say exactly how large "large" is, but I wondered whether this is related to the post on D. Fanning's site: http://www.idlcoyote.com/code_tips/bridge.html . I'm doing this on Mac OS X, and I noticed that the kernel parameter kern.sysv.shmmax was set to 128 MB. I raised that limit to 1 GB with "sudo sysctl -w kern.sysv.shmmax=1073741824", but I still get the error, and I'm not sure where to go next. Any ideas?
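
For concreteness, here is a minimal sketch of the shmmap/bridge pattern I mean. The segment name, array dimensions, and the chain routine run_chain are placeholders rather than my actual code, and depending on the platform the SHMMAP calls may also need the OS_HANDLE, /SYSV, or /POSIX keywords so that both processes attach to the same OS-level segment:

; --- sketch only: names, sizes, and run_chain are placeholders ---
dims = [100000L, 7L]                          ; stand-in for the model-grid size

; Parent session: create the named segment and fill it once.
SHMMAP, 'MODEL_LUM', DIMENSION=dims, /DOUBLE
shared = SHMVAR('MODEL_LUM')
shared[0] = RANDOMU(seed, dims[0], dims[1], /DOUBLE)   ; stand-in for the real luminosities

; Child process: map the same segment by name instead of copying the data.
bridge = OBJ_NEW('IDL_IDLBridge')
bridge->SetVar, 'dims', dims
bridge->Execute, "SHMMAP, 'MODEL_LUM', DIMENSION=dims, /DOUBLE"
bridge->Execute, "models = SHMVAR('MODEL_LUM')"
bridge->Execute, 'run_chain, models', /NOWAIT          ; run_chain is hypothetical

; ... poll bridge->Status() until the chain finishes, then clean up:
OBJ_DESTROY, bridge
SHMUNMAP, 'MODEL_LUM'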

Any help is greatly appreciated!
Russell