Is it just me, or is 16-Bit color banding not actually visible on Intel GMA X4500s?
I happen to have an older computer lying around with a G41-powered mobo, and I can't, for the life of me, see the banding artifacts anymore, no matter what.
It looks as if it were truly bypassing D3D.REN's downsampling of 24-bit filtered textures.
If you're wondering what exactly I'm talking about (and you might be, if you've never played Blood 2 on contemporary hardware), look at the attached picture:
Upper Left: the original 8-Bit texture.
Upper Right: the original 8-Bit texture, bilinearly filtered with 24-Bit color precision.
Lower Left: the 24-Bit, bilinearly filtered version downsampled to 16-Bit color (most legacy video cards; see the sketch below).
Lower Right: the 24-Bit, bilinearly filtered version downsampled to 16-Bit color (some Voodoo cards).
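For anyone curious why the downsampling causes banding in the first place, here's a rough C sketch of what storing a 24-bit filtered texel in a 16-bit (RGB565) framebuffer does to a smooth gradient. This is just my illustration of the quantization math, not what D3D.REN actually does internally:

```c
#include <stdint.h>
#include <stdio.h>

/* Quantize an 8-bit-per-channel color down to RGB565 and expand it
   back, mimicking what happens when a 24-bit filtered texel is stored
   in a 16-bit framebuffer. The low bits thrown away here are exactly
   what produces the stepped banding in the "Lower Left" panel. */
static void quantize_rgb565(uint8_t r, uint8_t g, uint8_t b,
                            uint8_t *qr, uint8_t *qg, uint8_t *qb)
{
    uint16_t packed = (uint16_t)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));

    /* Expand back to 8 bits per channel with a plain shift
       (no low-bit replication), the cheap-and-cheerful approach. */
    *qr = (uint8_t)(((packed >> 11) & 0x1F) << 3);
    *qg = (uint8_t)(((packed >>  5) & 0x3F) << 2);
    *qb = (uint8_t)(( packed        & 0x1F) << 3);
}

int main(void)
{
    /* A smooth 24-bit gradient collapses into visible steps at 16 bits:
       eight consecutive red values all land in the same RGB565 bucket. */
    for (unsigned r = 120; r < 128; r++) {
        uint8_t qr, qg, qb;
        quantize_rgb565((uint8_t)r, 0, 0, &qr, &qg, &qb);
        printf("24-bit r=%3u -> 16-bit r=%3u\n", r, (unsigned)qr);
    }
    return 0;
}
```

The Voodoo panel presumably looks different because those cards dither before quantizing, trading the hard steps for noise; my guess for the X4500 is that it simply filters (or renders outright) at full 32-bit precision and never hits this quantization step at all.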
I'm getting exactly the "Upper Right" kind of image on this computer, while I still get the "Lower Left" kind of image with my trusty GF6.
Maybe this is commonplace today and I'm making too big a deal out of it, but honestly, I've never had Blood 2 look this good before.
The downside is that it's not worth the switch, given the number and severity of glitches this piece-of-junk video adapter introduces.
Can anyone confirm?