Author Topic: Paletted 8bit -> 24bit shading.  (Read 58978 times)

Offline Bloax

  • Colonel
  • ****
  • Posts: 322
  • do you want to be any of those things
Re: Paletted 8bit -> 24bit shading.
« Reply #60 on: May 09, 2015, 06:42:07 pm »
Employing something like the COLORMAP used in IDEngine1 for palette entries, except with full RGB support for the different light levels of the respective palette entry*, would likely be a good solution to this problem.

* - the actual COLORMAP implementation uses palette entries; rendered as raw index values it would visually look like a grayscale image, with the pixel brightness indicating the palette entry number. This means we can't do anything we want, since ultimately we're constrained by the palette. :^(

So as an example, here's the stock Doom colormap:

Each pixel in the topmost row represents the raw palette indices from 0 (leftmost) to 255 (rightmost), with every row downwards being a drop in light level.
Do note all the duplicate colors, caused by the palette lacking enough colors to represent the darker shades.
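For illustration, here's a rough Python sketch of how such a colormap can be built from a palette: every row darkens each palette color and then snaps it back to the nearest palette entry, which is exactly where the duplicate colors come from. The names and the light-level count are assumptions for the sketch, not Doom's actual code.

```python
# Rough sketch of building a Doom-style COLORMAP: for each light level,
# darken every palette color, then snap it back to the nearest palette
# entry. The snapping step is why darker rows collapse into duplicates.

def build_colormap(palette, num_light_levels=32):
    colormap = []
    for light in range(num_light_levels):
        scale = 1.0 - light / num_light_levels  # row 0 = full brightness
        row = []
        for r, g, b in palette:
            target = (r * scale, g * scale, b * scale)
            # Nearest palette entry by squared RGB distance.
            row.append(min(range(len(palette)),
                           key=lambda i: sum((c - t) ** 2
                                             for c, t in zip(palette[i], target))))
        colormap.append(row)
    return colormap

def shade(colormap, palette, pixel_index, light):
    """Shaded color of a paletted pixel at a given light level."""
    return palette[colormap[light][pixel_index]]
```

With a tiny 3-entry palette you can already watch distinct shades collapse onto the same entry as the rows get darker.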

Here's a very quick & dirty "fade to black" approach in 24 bits (not restricted to just the palette entries).
But here's the thing - in 24 bits we can do anything we want at all of those different light levels.

(also includes a different blue because the default one sucks)
Which of course also means that we can do some color tinting as we approach the darker shades;
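As a hedged sketch of that tinting idea: each light level scales the base color freely (no palette snapping) and leans it toward a tint color as the shade deepens. The blend curve and the default blue tint are invented values for illustration.

```python
# Invented sketch of a 24-bit "fade to black with a tint": the scale
# fades linearly to black, while the tint strength grows quadratically
# toward the darkest levels. Curve and tint color are made up.

def shade_24bit(color, light, num_levels=32, tint=(0, 0, 40)):
    scale = 1.0 - light / num_levels   # linear fade to black
    t = (light / num_levels) ** 2      # tint strength grows when darker
    return tuple(round(c * scale * (1 - t) + tc * t)
                 for c, tc in zip(color, tint))
```

At full brightness the color passes through untouched; at the darkest level everything converges on the tint.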


And I'm bringing this up because it seems like a way to do what you're looking for, while also providing a funky tool for modding.

Offline Dioxine

  • Commander
  • *****
  • Posts: 5458
  • punk not dead
Re: Paletted 8bit -> 24bit shading.
« Reply #61 on: May 09, 2015, 11:39:56 pm »
As much as I love pixel art, using 24-bit color to create darker shades - including the tinted blue, but also all sorts of ambient light - out of the basic 256-color palette would certainly give awesome effects. So I'm all for it; it doesn't violate the spirit of the original while allowing for much richer graphics.

Offline Tarvis

  • Colonel
  • ****
  • Posts: 111
Re: Paletted 8bit -> 24bit shading.
« Reply #62 on: May 31, 2015, 08:12:28 pm »
Quote from: Bloax on May 09, 2015, 06:42:07 pm

Employing something like the COLORMAP used in IDEngine1 for palette entries, except with full RGB support for the different light levels of the respective palette entry*, would likely be a good solution to this problem.

* - the actual COLORMAP implementation uses palette entries; rendered as raw index values it would visually look like a grayscale image, with the pixel brightness indicating the palette entry number. This means we can't do anything we want, since ultimately we're constrained by the palette. :^(

And I'm bringing this up because it seems like a way to do what you're looking for, while also providing a funky tool for modding.
X-COM already uses a similar palette method. That's the gradients you see in earlier posts.

The idea is to not have to use the palette. This is analogous to Doom's GL ports allowing 32-bit PNGs, with the same pitfall: in the GL renderer you lose the original shading look, since it doesn't use the colormap at all.

As for my opinion, this is a question that almost every palette-based game engine upgrade project reaches at some point (Doom, Quake, Duke 3D, and so on). There is simply no way to perfectly replicate the original shading while taking non-paletted sprites into account. The best you can do is approximate it with shaders, curves, or forcing darker shades to certain step intervals.

I think there should simply be a video option for choosing for yourself: 8-bit, which uses the vanilla lighting and maps 24/32-bit graphics to the nearest palette color, or 24/32-bit, which allows any color despite a less authentic lighting method. This is what most other projects do.
« Last Edit: May 31, 2015, 08:16:49 pm by Tarvis »
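The 8-bit fallback Tarvis describes boils down to nearest-color quantization. A minimal sketch, using plain squared RGB distance (a real implementation might weight the channels perceptually):

```python
# Sketch of the "snap 24/32-bit art to the palette" option: each pixel
# of a truecolor sprite is mapped to the closest palette entry so the
# vanilla colormap lighting can still be applied afterwards.

def nearest_palette_index(palette, color):
    return min(range(len(palette)),
               key=lambda i: sum((a - b) ** 2 for a, b in zip(palette[i], color)))

def quantize_sprite(palette, pixels):
    """Map a flat list of RGB pixels onto palette indices."""
    return [nearest_palette_index(palette, p) for p in pixels]
```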

Offline liberation

  • Captain
  • ***
  • Posts: 75
Re: Paletted 8bit -> 24bit shading.
« Reply #63 on: May 31, 2015, 09:30:51 pm »
Quote from: Tarvis on May 31, 2015, 08:12:28 pm

X-COM already uses a similar palette method. That's the gradients you see in earlier posts.

The idea is to not have to use the palette. This is analogous to Doom's GL ports allowing 32-bit PNGs, with the same pitfall: in the GL renderer you lose the original shading look, since it doesn't use the colormap at all.

As for my opinion, this is a question that almost every palette-based game engine upgrade project reaches at some point (Doom, Quake, Duke 3D, and so on). There is simply no way to perfectly replicate the original shading while taking non-paletted sprites into account. The best you can do is approximate it with shaders, curves, or forcing darker shades to certain step intervals.

I think there should simply be a video option for choosing for yourself: 8-bit, which uses the vanilla lighting and maps 24/32-bit graphics to the nearest palette color, or 24/32-bit, which allows any color despite a less authentic lighting method. This is what most other projects do.

I remember that when the EDGE port added 32-bit textures, we started using dynamic lights rather than sector lighting. Not sure how dynamic lighting would work in X-COM, considering you would have to spawn lights on all of the maps.

Offline Tarvis

  • Colonel
  • ****
  • Posts: 111
Re: Paletted 8bit -> 24bit shading.
« Reply #64 on: May 31, 2015, 10:03:17 pm »
I wasn't referring to the light sources; I meant how the graphics are actually darkened by lighting.

In EDGE, GZDoom, etc. you don't get reds turning into browns and such, because they don't use the colormap for lighting. The map doesn't get darker further away from the player. The original software lighting shading is lost or only approximated. You can see an example here.

The same would happen for OpenXcom in 32-bit. That's the discussion: allowing non-paletted sprites means losing the paletted shading method.

If you go back and look at the UFO TTS screenshot, you can see that darker tiles look a bit flatter and paler than they would with the 8-bit lighting.

It's not a really big deal, but just like Doom there is a certain charm to it that is lost.
« Last Edit: May 31, 2015, 10:07:41 pm by Tarvis »
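The difference can be shown with a toy example: GL-style lighting only scales RGB, so a red stays a proportionally darker red, while colormap lighting snaps the darkened color back to a limited palette and can shift its hue toward brown. The palette and numbers here are invented.

```python
def gl_darken(color, scale):
    """GL-style lighting: multiply each channel; hue is preserved."""
    return tuple(round(c * scale) for c in color)

def colormap_darken(color, scale, palette):
    """Software-style lighting: darken, then snap to the nearest palette
    color - which is how reds can drift into browns."""
    target = gl_darken(color, scale)
    return min(palette,
               key=lambda p: sum((a - b) ** 2 for a, b in zip(p, target)))
```

With a palette of only red, brown, and black, a darkened red has no darker red to land on and gets pulled to brown.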

Offline darkestaxe

  • Colonel
  • ****
  • Posts: 254
  • Emissary of the Brain
Re: Paletted 8bit -> 24bit shading.
« Reply #65 on: June 05, 2015, 11:02:52 am »
I would point out that DOOM's lighting always looked weird because of the way the software renderer handled it. In the comparison Tarvis linked you can see it: there are odd bits of color in various places, and dark areas don't look dark so much as discolored.

TFTD is a different case. TFTD doesn't mash up the colors to approximate shading; it uses 8-bit shading to create an underwater effect. DOOM's 8-bit shading artifacts are more like an expertly handled accident compared to TFTD's 8-bit shading feature.

Another way to put it: OpenGL shades DOOM textures correctly where previously they were shaded incorrectly, though approximated carefully enough to look good. OpenGL shades TFTD incorrectly where previously it was shading exactly as desired and intended.

OR

DOOM actually looks better with an OpenGL renderer, albeit not as nostalgic. TFTD looks wrong in 24-bit and maybe a bit weird, regardless of nostalgia.

Offline Bloax

  • Colonel
  • ****
  • Posts: 322
  • do you want to be any of those things
Re: Paletted 8bit -> 24bit shading.
« Reply #66 on: June 23, 2015, 06:21:43 pm »
Doom's palette and colormap are really bad, and it does look better in OpenGL with a certain lighting hack to imitate the software depth fog - the thing that causes things close to the camera to be lightened up.
It would definitely look better paletted with a 24-bit colormap, though.

I can imagine something like this working: render the entire scene in fullbright to capture the palette colors into one texture, and render all the lighting into a separate texture. Then modify the fullbright texture's colors by matching each color onto the appropriate X index of the colormap, using the brightness from the lighting texture as the Y position on the colormap texture.

But that's wishful thinking. :-)
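For what it's worth, the two-pass idea above can be sketched on the CPU with made-up data structures: `fullbright` holds a palette index per pixel, `lighting` holds a light-level row per pixel, and `colormap[light][index]` is a 24-bit RGB colormap. The real thing would live in a shader, not in Python.

```python
# Final compose pass: for each pixel, use its palette index as the X
# coordinate and its light level as the Y coordinate into the colormap.

def compose(fullbright, lighting, colormap):
    return [[colormap[light][index]
             for index, light in zip(fb_row, li_row)]
            for fb_row, li_row in zip(fullbright, lighting)]
```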

Offline kikimoristan

  • Commander
  • *****
  • Posts: 647
Re: Paletted 8bit -> 24bit shading.
« Reply #67 on: July 17, 2015, 12:57:28 pm »
If you wanna get rid of palettes, go 32-bit. I think it's better for effects. You only see 24 bits max, but I think 32 bits makes effects better - I can't remember why. I think it has to do with having more data to do math on, generating better gradients, transparency etc.

Probably the simplest way to do it atm is to map the 8-bit image to a 32-bit surface just before the effects do shading/transparency, then present to screen. The game would still be 8-bit but would support transparency and nice gradients/shadows/lighting.
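That pipeline might look roughly like this; the structures are illustrative, not OpenXcom's actual surface types.

```python
# Sketch of the approach described above: expand the 8-bit indexed frame
# to 32-bit RGBA right before post effects, so shading and transparency
# can work in full color while game logic stays paletted.

def expand_to_rgba(indexed_pixels, palette, alpha=255):
    """Convert a flat list of palette indices to RGBA tuples."""
    return [(*palette[i], alpha) for i in indexed_pixels]

def apply_shadow(rgba_pixels, factor=0.5):
    """Example 32-bit effect: darken every pixel, leaving alpha intact."""
    return [(round(r * factor), round(g * factor), round(b * factor), a)
            for r, g, b, a in rgba_pixels]
```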

Offline yrizoud

  • Commander
  • *****
  • Posts: 1014
Re: Paletted 8bit -> 24bit shading.
« Reply #68 on: July 17, 2015, 02:41:01 pm »
tollworkout, you're missing the entire point of the thread. Look again at the very first post, at the two rows of colors in the attached picture:
Top is what XCOM does with a palette,
Bottom is what you'd get if the source image is arbitrary 24-bit.
They are different, and the second one is just not acceptable.

Offline kikimoristan

  • Commander
  • *****
  • Posts: 647
Re: Paletted 8bit -> 24bit shading.
« Reply #69 on: July 17, 2015, 04:42:53 pm »
Quote from: yrizoud on July 17, 2015, 02:41:01 pm

tollworkout, you're missing the entire point of the thread. Look again at the very first post, at the two rows of colors in the attached picture:
Top is what XCOM does with a palette,
Bottom is what you'd get if the source image is arbitrary 24-bit.
They are different, and the second one is just not acceptable.

I read the first post. I getcha - the title confused me a bit. The 256-color version definitely looks better :/
« Last Edit: July 17, 2015, 04:46:08 pm by tollworkout »

Online Yankes

  • Global Moderator
  • Commander
  • *****
  • Posts: 3350
Re: Paletted 8bit -> 24bit shading.
« Reply #70 on: September 05, 2015, 02:23:49 am »
I made depth shading for AndO3131:
https://github.com/Yankes/OpenXcom/tree/ShaderDraw32bit
Is this close enough to TFTD?

Btw, I made a sandbox where I can experiment with palettes:
https://jsfiddle.net/hgcufgL7/14/

Offline AndO3131

  • Colonel
  • ****
  • Posts: 137
Re: Paletted 8bit -> 24bit shading.
« Reply #71 on: September 05, 2015, 09:58:00 am »
I think it's a very good result. Here are screenshots comparing the gazer alien. The first one is without shading - the difference is visible at first glance :).

Offline SupSuper

  • Lazy Developer
  • Administrator
  • Commander
  • *****
  • Posts: 2162
Re: Paletted 8bit -> 24bit shading.
« Reply #72 on: September 05, 2015, 10:28:26 pm »
Looks more transparent than tinted.

Offline Bloax

  • Colonel
  • ****
  • Posts: 322
  • do you want to be any of those things
Re: Paletted 8bit -> 24bit shading.
« Reply #73 on: September 06, 2015, 12:50:21 am »
That's what happens when the tint is "blend to this color" rather than any sort of modification of the base color. ;)
It is certainly promising though.

Online Yankes

  • Global Moderator
  • Commander
  • *****
  • Posts: 3350
Re: Paletted 8bit -> 24bit shading.
« Reply #74 on: September 06, 2015, 01:55:49 am »
In the AndO3131 version I made an error that makes the unit look flatter than it should be.

I made some screenshots showing that this isn't simply a blend.

I did 3 things:
1. calculate the effective shade of the pixel based on the summed value of its colors,
2. subtract some linear values based on the new shade,
3. finally, blend with the background color.

These 3 steps allowed me to recreate most of the behavior of the TFTD palette.
You can toy with my implementation here: https://jsfiddle.net/hgcufgL7/14/ and see the difference from the original colors (the upper row is calculated, the lower is the original, for each depth line).
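The three steps read roughly like this in Python; the constants are invented for the sketch, and the actual curves in the linked branch and fiddle differ.

```python
# Rough rendering of the three steps above: derive an effective shade
# from the summed channels, subtract a linear amount driven by it, then
# blend with a depth/background color. Constants are made up.

def tftd_like_shade(color, depth_color, blend=0.3, subtract_scale=0.2):
    r, g, b = color
    # Step 1: effective shade from the summed channel values (0..765).
    shade = (r + g + b) / 765.0
    # Step 2: subtract a linear value based on the new shade.
    sub = subtract_scale * (1.0 - shade) * 255
    darkened = [max(0.0, c - sub) for c in (r, g, b)]
    # Step 3: blend with the depth/background color.
    return tuple(round(c * (1 - blend) + dc * blend)
                 for c, dc in zip(darkened, depth_color))
```

Against a bluish depth color, bright pixels keep their relative shade while the whole result is pulled toward the water tint, which is closer to TFTD's look than a plain blend.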