I've spent some time reverse engineering the MxPalette class. For the most part it tells us things we already know: when an MxBitmap is loaded, an MxPalette function reads all of its palette entries except the first and last 10, which are instead retrieved from the Win32 API GetSystemPaletteEntries(). This means those entries aren't specific to the game at all; they come straight from Windows' global system palette. It's still not clear why the game does this. Are there bitmaps that need those system colors? And if so, why wouldn't the bitmaps themselves just be paletted with those entries? Most of the bitmaps already contain them, so forcing them doesn't make much sense...
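To make the behavior concrete, here's a minimal sketch of that load path. The entry layout is assumed to match Win32's PALETTEENTRY (8-bit R, G, B plus a flags byte), and the real GetSystemPaletteEntries() call is stubbed out so the sketch builds anywhere; the names LoadPalette and GetSystemPaletteStub are mine, not the game's.

```cpp
#include <array>
#include <cstdint>

// Stand-in for Win32's PALETTEENTRY (peRed, peGreen, peBlue, peFlags).
struct PaletteEntry {
    uint8_t red, green, blue, flags;
};

// Stub for GetSystemPaletteEntries(); a real Windows build would query the
// global system palette through the Win32 API here.
std::array<PaletteEntry, 256> GetSystemPaletteStub() {
    std::array<PaletteEntry, 256> sys{};
    // ...filled by Windows with the current system colors...
    return sys;
}

// Sketch of the observed MxPalette behavior: start from the bitmap's own
// palette, then force the first 10 and last 10 entries (indices 0-9 and
// 246-255) to the system palette's values.
std::array<PaletteEntry, 256> LoadPalette(
        const std::array<PaletteEntry, 256>& bitmapPal) {
    std::array<PaletteEntry, 256> out = bitmapPal;
    std::array<PaletteEntry, 256> sys = GetSystemPaletteStub();
    for (int i = 0; i < 10; i++) {
        out[i] = sys[i];              // first 10: system colors
        out[246 + i] = sys[246 + i];  // last 10: system colors
    }
    return out;
}
```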
It's almost certainly possible to disable the system palette override. That would give us back 20 of the 256 entries, about 7.8% more color precision, though it's unclear whether it would have any adverse effects. It may be worth experimenting with, and I'll post back here if I find anything interesting.
Entries 140 and 141 are also confirmed to be "special", especially 141. Entry 141 appears to be specifically "settable", presumably to the sky color, and entry 140 is sometimes just set to the same color. Honestly, this is still murky and needs more investigation to work out why it's set up this way.
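If 141 really is a settable sky-color slot, the mechanism might look something like the sketch below. Be warned that everything here is hypothetical: the function name SetSkyColor and the idea that entry 140 is optionally mirrored from 141 are my guesses at the behavior, not confirmed decompiled code.

```cpp
#include <cstdint>

struct PaletteEntry {
    uint8_t red, green, blue, flags;
};

// Hypothetical sketch of a "settable" entry 141. The mirroring of entry 140
// is an assumption based on it sometimes matching 141 in practice.
void SetSkyColor(PaletteEntry (&pal)[256], uint8_t r, uint8_t g, uint8_t b,
                 bool mirrorInto140) {
    pal[141].red = r;
    pal[141].green = g;
    pal[141].blue = b;
    if (mirrorInto140) {
        pal[140] = pal[141];  // entry 140 sometimes observed to match 141
    }
}
```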
Something worth noting: each palette entry has a "flags" member (which probably also serves to pad the three 8-bit RGB channels out to 32 bits, as in Win32's PALETTEENTRY), and these flags are set differently depending on the entry types described above:
- The first and last 10 entries (from the system palette) are given the flags 0x80.
- Entry 141 (presumably the sky color) is given the flags 0x44.
- The remaining entries (custom file-specific) are given the flags 0x84.
It's hard to say how these flags are used just yet, but it's interesting that they correspond exactly to the different types of entries found so far. The fact that entry 140 has different flags from entry 141 could also be a clue to how the two are used differently.
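The flag assignment above boils down to a simple mapping from entry index to flag byte. A sketch, using the index ranges from the findings (the helper name FlagsForEntry is mine, and the meaning of the individual bits is still unknown):

```cpp
#include <cstdint>

// Per-entry flag byte as observed in MxPalette, keyed purely on entry index.
uint8_t FlagsForEntry(int index) {
    if (index < 10 || index >= 246) return 0x80;  // system-palette entries
    if (index == 141)               return 0x44;  // "sky color" entry
    return 0x84;  // remaining file-specific entries (including 140)
}
```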