16-bit color has 65,536 possible colors, while 24-bit has millions and 32-bit has billions.
I like using 24-bit color, because it allows all RGB combinations of color. With 16-bit, the computer will render 0/0/0 (black) and 0/0/1 as the same. Of course, as you noted, your eye cannot tell the difference between these.
(People with good eyesight can distinguish more than 65,000 colors, but no one can distinguish all the 24-bit combinations.)
If your machine has a 24-bit setting, try it. If not, live with 16-bit … since the higher level degrades performance.
Don McCahill wrote:
16-bit color has 65,536 possible colors, while 24-bit has millions and 32-bit has billions.
The 24 and 32 bit settings have the same number of colors: both give around 16.7 million (also called true color). The only difference between 24 and 32 bit is that the 32-bit setting has an 8-bit alpha channel that some 3D programs and games can use for different things (more realistic explosions in games and such).
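The arithmetic is easy to check; here is a minimal Python sketch of the color counts (assuming the usual 8-bits-per-channel layout for the 24/32-bit modes):

```python
# Distinct colors at each common display depth.
colors_16 = 2 ** 16   # 65,536
colors_24 = 2 ** 24   # 16,777,216 -- "true color"
colors_32 = 2 ** 24   # same 16.7 million: the extra 8 bits are an alpha
                      # channel, not more color information

print(colors_16, colors_24, colors_24 == colors_32)
```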
I like using 24-bit color, because it allows all RGB combinations of color. With 16-bit, the computer will render 0/0/0 (black) and 0/0/1 as the same. Of course, as you noted, your eye cannot tell the difference between these.
A simple gradient from black to white in Photoshop should convince most people that 16 bit (65536 colors) isn’t suited for image editing. When the video card is set to 16 bit, you’ll see banding in the gradient. If there doesn’t seem to be any difference between the gradient in 16 or 24/32 bit, then the monitor isn’t suited for image editing, in my opinion.
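The banding is easy to reproduce numerically. A small Python sketch, treating the 16-bit mode as keeping roughly 5 bits per channel (an assumption; the exact channel split varies by card):

```python
# A 256-step black-to-white gradient, as drawn at 8 bits/channel (24/32-bit).
gradient = list(range(256))

# Simulate a 16-bit display by keeping only the top 5 bits of each level.
quantized = [(v >> 3) << 3 for v in gradient]

print(len(set(gradient)))   # 256 distinct gray levels -> smooth ramp
print(len(set(quantized)))  # 32 distinct levels -> visible bands
```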
—
Regards
Madsen.
Phosphor wrote:
For Photoshop, leave it at 24 bit. For gaming, bump it up.
Richard doesn’t have the 24 bit setting, as far as I understand his first post in this thread. He has 16 and 32 bit ("medium" or "high"). I haven’t got the 24 bit setting either. It depends on the graphic card and/or the graphic card driver.
<http://home19.inet.tele.dk/who_knows/matrox/colors.png>
—
Regards
Madsen.
Ahhh…it’s all coming back to me now, Thomas. I work on Windows systems so rarely that I forgot how the controls were outfitted in the display control panel.
In that case then:
16 bits = 5 bits for red, 6 for green, and 5 for blue (the common RGB565 layout, with no alpha channel).
32 bits = 8 bits/channel for R, G, and B, plus 8 bits for the alpha channel.
In this case then, ALWAYS stay with the 32-bit display setting. The only slowdown would be from a really antiquated system.
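For reference, pixel packing at the two depths can be sketched like this (assuming the common RGB565 and ARGB8888 layouts; the actual bit order varies by card and driver):

```python
def pack_rgb565(r, g, b):
    """Pack 8-bit-per-channel RGB into a 16-bit RGB565 pixel (no alpha)."""
    return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)

def pack_argb8888(r, g, b, a=255):
    """Pack into a 32-bit pixel: 8 bits each for alpha, red, green, blue."""
    return (a << 24) | (r << 16) | (g << 8) | b

print(hex(pack_rgb565(255, 255, 255)))    # 0xffff
print(hex(pack_argb8888(255, 255, 255)))  # 0xffffffff
```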
Phosphor wrote:
In this case then, ALWAYS stay with 32 bit display setting. The only slowdown would be from a really antiquated system.
I agree.
—
Regards
Madsen.
Thomas Madsen wrote:
I haven’t got the 24 bit setting either.
I forgot to mention that I also have a setting inside the Matrox Parhelia driver called '1 billion (10-bit GigaColor)'. I just can't figure out how and when it works (I have read the documentation from Matrox, but I still can't figure it out).
I have heard some people say that the setting produces 10-bit color in every application, but I don't think they're right. If that were the case, I wonder why I would need to install a so-called 'GigaColor Viewer' in order to see the difference (which I can't anyway). I must admit that I can't see any difference in applications like Photoshop whether the 10-bit setting is turned on or off, either.
—
Regards
Madsen.
I always thought the Parhelia 10-bit cards were beneficial mostly for working with video, for whatever reason.
So far, Photoshop doesn't take advantage of 10-bit/channel data anyway.
At least, I don't think it does. 8 bit/channel: yes. 16 bit/channel: yes. Anything in between: no.
It would seem the video card discussion has reached a conclusion, and I'd like to steer it in another direction, if I may. What difference is achieved by setting your image to 8-bit vs 16-bit color? What are the compatibility issues involved, and what makes one better than the other, and for what purpose?
-ninja
16-bit (48-bit total) images allow for less destructive editing.
This can be proven by comparing histograms after editing steps.
However, human eyes also prove that it's a rare image where one will perceive the difference.
16-bit is quite important in some scientific applications where finer nuances of information are recorded by instrumentation (rather than by human eyes).
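The histogram argument can be illustrated with a toy round trip in Python (a hypothetical darken-then-brighten, not Photoshop's actual math):

```python
def round_trip(v, scale=0.4):
    # Darken by 60%, rounding to an integer, then brighten back and round
    # again, as an integer-based editor must at every step.
    return round(round(v * scale) / scale)

# Editing directly in 8 bits/channel: levels are permanently merged.
survivors_8 = {min(round_trip(v), 255) for v in range(256)}

# Editing in 16 bits/channel, then converting down to 8 bits at the end.
survivors_16 = {min(round_trip(v), 65535) >> 8 for v in range(0, 65536, 257)}

print(len(survivors_8))   # 103 of 256 levels survive -> comb-shaped histogram
print(len(survivors_16))  # all 256 levels survive
```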
Mac
For most images, you won't notice. Few images use all 65,536 colors at once; most contain just a couple thousand.
As for 32-bit being slower with your video card: well, 32-bit is way faster on mine than 16-bit is. So I guess it just depends on your system and the quality of your video card.
BTW I have Matrox G550.
R
Depends on the video card (or onboard video chip).
Some have 16/24/32 bit.
Mac
nospam wrote:
Well, 32-bit is way faster on mine than 16-bit is.
Same thing here actually.
—
Regards
Madsen.
Phosphor wrote:
8 bit/channel: yes. 16 bit/channel: yes. Anything in between: no.
It doesn’t mean that all my images turn into 10 bit/channel when I turn the GigaColor setting on in the Parhelia driver. 🙂 It means (according to Matrox) that instead of 24-bit (8 bits/channel) I’ll get 30-bit (10 bits/channel) on my monitor. The result should be less chance of banding when viewing high-bit images in applications like Photoshop, for instance.
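The numbers behind the "1 billion" name are straightforward (a quick sketch; "GigaColor" itself is Matrox's marketing term):

```python
steps_8 = 2 ** 8     # 256 shades per channel on a 24-bit desktop
steps_10 = 2 ** 10   # 1024 shades per channel in 30-bit "GigaColor" mode

print(steps_10 // steps_8)  # 4 -> each 8-bit step splits into 4 finer steps
print(steps_10 ** 3)        # 1073741824 -> the "1 billion" colors
```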
<http://www.matrox.com/mga/products/tech_info/pdfs/parhelia/giga_color.pdf> (See the example with the black to green gradient on page 5).
I just can’t see the difference. Maybe it’s because my eyes are getting weak, my monitor is burned out, I’m using the wrong test pictures or just plain stupidity. I don’t know but it’s not that important. 24 bit color isn’t so bad either 🙂
—
Regards
Madsen.
I’m glad someone else can’t see the difference using the Parhelia Gigacolor. I understand the Matrox information, about it allowing 10 bits per channel right through the graphics pipe, but, even using the Gigacolor viewer, I saw absolutely no difference on the desktop or in Photoshop.
I posted a note asking about it ages ago on the Matrox forums and got no replies. Every now and then I turn it on to have a play, but then turn it off again…
cheers
Klaas
It all depends:
Reduce the color depth and resolution of the display.
InDesign can redraw the screen more quickly if the screen color depth and resolution are set to low values. To run InDesign in Windows, you can set the color depth as low as 256 colors (8-bit color) and the screen resolution as low as 800×600 pixels.
– Adobe InDesign <http://www.adobe.com/support/techdocs/22eca.htm>
Thomas, did you compare your Parhelia to other graphics cards? Did you by any chance have a G550, so you could tell us whether the crisper display Matrox claims for the Parhelia is accurate?
Pierre,
I went from a G550 to the Parhelia, and, to my eyes the claim of a crisper display is accurate.
My primary monitor runs at 1600×1200 and small text is very readable, whereas there was a slight blurring with the G550.
For me, along with what I saw as more powerful dual-monitor support, the crisper display was what got me to buy the Parhelia (even though, at release, they cost $1000 Australian here).
cheers
Klaas
the horrible reputation of their drivers…
The drivers have improved considerably since release. Be warned, though, 3D performance is average at best – if you are heavy into 3D applications or games, you should look elsewhere – if you are after 2D performance, you won’t complain.
Ah! you are Australian?
Born in Holland, grew up in New Zealand (from about the age of one), been living in Australia for the last 15 years …
New Zealander by citizenship (about to get dual citizenship as an Australian).
cheers
Klaas
Thanks Thomas!
I bet that the other users that claim to see a difference are not Photoshoppers nor photographers, am I right? I think that Chris Cox expressed big doubts about the Gigacolor plug-in, given that they had not even contacted Adobe about it…
Ah Klaas, I was not so far off…
Pierre Courtejoie wrote:
Thanks Thomas!
You’re welcome.
I bet that the other users that claim to see a difference are not Photoshoppers nor photographers, am I right?
Yes, you’re right.
Even if I can’t see a difference, I can only recommend a Parhelia card. It has excellent image quality, and even at 1920×1200 at 85 Hz, which is the recommended resolution for my monitor, everything is crystal clear. At that resolution the Radeon 8500 wasn’t very good, and the Matrox G400 Max was also a little bit fuzzy.
I think that Chris Cox expressed big doubts about the Gigacolor plug-in, given that they had not even contacted Adobe about it…
Okay. The Gigacolor thing is probably just a sales trick then.
—
Regards
Madsen.
I have read about PCs being able to display faster when in 32-bit mode; this article offers a reason:
<http://www.mustek.com/Class/resol_clrdepth.html>
A quote from the article:
"Some video cards have a 32-bit mode, but this is merely a speed trick. Computers move information in 16-bit or 32-bit chunks. When a video card allows 32-bit mode, it is storing 24-bits of color information in 32 digits merely to make data transfers faster. The extra 8-bits contain null information."
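The alignment point in that quote can be illustrated with pixel byte offsets (a minimal sketch, not tied to any particular card):

```python
# In a packed 24-bit framebuffer each pixel is 3 bytes, so most pixels do not
# start on a 4-byte boundary; padded 32-bit pixels always do, which lets the
# hardware move them in single aligned word accesses.
offsets_24 = [x * 3 % 4 for x in range(4)]  # start-of-pixel offset mod 4
offsets_32 = [x * 4 % 4 for x in range(4)]

print(offsets_24)  # [0, 3, 2, 1] -> mostly misaligned
print(offsets_32)  # [0, 0, 0, 0] -> always aligned
```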