Re: pseudo color and true color [message #7862]
Wed, 22 January 1997 00:00
Peter Mason
On Sun, 19 Jan 1997, A. Scott Denning wrote:
> I am running an IDL application under Solaris 2.5 from a Windows NT
> machine using the X/win32 X server.
>
> The Unix box has an 8-bit graphics board. The IDL code is built using
> color tables and pseudo color through and through. But the NT box has a
> 24-bit video card, and the X server software doesn't allow IDL to run in
> pseudocolor mode (yes, I've tried setting it with DEVICE).
>
> When I display my plots (color-filled contours and images) on the PC, I
> get indistinguishable shades of purple instead of lovely rainbows
> (colors 0 through 25 in a color space of 16 million).
>
> How can I get around this behavior? Can I somehow generate true color
> images if and only if I'm running my code via the PC X server?
I don't think you can have mixed screen depths on a Windows platform
(the way you can under Unix), so your X-server software is probably
always locked to the Windows screen depth.
The simplest way out might be to use the IDL command:
DEVICE,DECOMPOSED=0
before you render graphics.
(Walid Aita gave me this tip some time back.)
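For example, something like this (a rough sketch: the !D.N_COLORS test
is just one simple way to apply the fix only when you find yourself on
the 24-bit display, and the contour data here is made up):

; !D.N_COLORS is at most 256 on an 8-bit pseudocolor visual but
; 16777216 on a 24-bit truecolor one, so use it to branch.
IF !D.N_COLORS GT 256 THEN DEVICE, DECOMPOSED=0
LOADCT, 13                                  ; rainbow color table
z = DIST(50)                                ; made-up demo data
CONTOUR, z, /FILL, NLEVELS=25, C_COLORS=INDGEN(25)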
This is supposed to make IDL (on a direct/truecolor display) interpret
color values as 8-bit indices into the color table, much as in
pseudocolor mode. But you might still have problems with image commands
like TV, TVSCL and TVRD() - these may require changes to the source.
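If TV does play up, one workaround is to push the image through the
color table yourself and display the result as 24-bit data. A sketch
(assuming "img" is a BYTE image already scaled to the table):

TVLCT, r, g, b, /GET                        ; read back the current table
rgb = [[[r[img]]], [[g[img]]], [[b[img]]]]  ; apply it -> [nx,ny,3] array
TV, rgb, TRUE=3                             ; pixels interleaved over dim 3
; (Similarly, TVRD(TRUE=1) reads the window back as a 24-bit image.)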
You might also try setting the screen depth to 256 colors on the NT box
(i.e., run Windows itself in an 8-bit screen mode, if your card supports
one). The X server should then come up in 8-bit mode.
Of course, this will spoil things for any other programs on your NT box
that want 24-bit color. It's also a pain to change, since it requires a
reboot - at least under NT 3.51 (I don't know about NT 4).
Peter Mason