Re: getting procedures to use proper color index via LOADCT, x [message #10644]
Wed, 24 December 1997 00:00
grunes
In article <349E1DA1.DDBA7E22@linmpi.mpg.de> Kevin Ivory <Kevin.Ivory@linmpi.mpg.de> writes:
> From: Kevin Ivory <Kevin.Ivory@linmpi.mpg.de>
> Subject: Re: getting procedures to use proper color index via LOADCT, x
> Date: Mon, 22 Dec 1997 08:58:25 +0100
> dmarshall@ivory.trentu.ca wrote:
>> But then my plots only ever show up in various shades of red.
> device, decomposed=0
> From the IDL online help:
> Set this keyword to 0 to cause the least-significant 8 bits of the color
> index value to be interpreted as a PseudoColor index. This setting allows
> users with DirectColor and TrueColor displays to use IDL programs written
> for standard, PseudoColor displays without modification.
> Set this keyword to 1 to cause color indices to be interpreted as 3, 8-bit
> color indices where the least-significant 8 bits contain the red value,
> the next 8 bits contain the green value, and the most-significant 8 bits
> contain the blue value. This is the way IDL has always interpreted pixels
> when using visual classes with decomposed color.
That would be nice, but it doesn't work on the SGI workstations here, so I
would guess it might not work on some other X-windows systems either; I
haven't had a chance to try. Of course, I am using an old version of IDL,
and I have never played with the .Xdefaults file. But Fanning claims it
works on PCs.
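
For reference, the kind of sequence being suggested looks roughly like this
(a minimal sketch; color table 39 and color index 150 are arbitrary examples
I picked, not anything specific to the original poster's code):

  device, decomposed=0          ; treat COLOR values as 8-bit PseudoColor indices
  loadct, 39                    ; load a color table, e.g. table 39
  plot, findgen(10), color=150  ; 150 now indexes into the loaded table

On a display that honors DECOMPOSED=0, the COLOR keyword then picks an entry
from the current color table instead of being split into red/green/blue bits,
which is why plots stop coming out in shades of red.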
-------------------------------------------------------------
Mitchell R Grunes, grunes@imsy1.nrl.navy.mil. Opinions are mine alone.