As I mentioned in The Game Plan, I'm staying laser-focused on building on top of Cxbx and doing whatever I can to get the graphics working properly. Last year, when I first started thinking about this, I actually spoke to a member of the Cxbx development team who had mentioned on Twitter that Burnout 3 was a "lost cause", at least for the foreseeable future. I asked why, and he explained some of the biggest problems facing the game, and Cxbx as a whole.
The Xbox's variant of D3D8 has a unique feature called push buffers. Rather than making a ton of D3D calls that each go back and forth between the CPU and GPU, you can record all of those actions into a push buffer and send it to the GPU in one go. Theoretically, this cuts out a lot of overhead, especially because push buffers are generated and stored in the native instruction format of the NV2A (the Xbox's GPU). Push buffers can be generated both at runtime and at compile time.
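The record-then-submit idea can be sketched in plain C++. Everything here is hypothetical (the command IDs and word layout are made up, not the real NV2A methods); it only illustrates the difference between N individual driver calls and one batched submission:

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Hypothetical command IDs -- NOT the real NV2A method numbers.
enum Cmd : uint32_t { CMD_SET_TEXTURE = 1, CMD_SET_BLEND = 2, CMD_DRAW = 3 };

// Immediate mode would issue one driver call per command. A push
// buffer instead records every command into a flat block of words
// that the GPU can consume on its own.
struct PushBuffer {
    std::vector<uint32_t> words;

    void record(Cmd cmd, uint32_t arg) {
        words.push_back(cmd);
        words.push_back(arg);
    }
};

// One submission instead of N calls. A real implementation would
// DMA pb.words to the GPU here; this stub just reports the size.
std::size_t submit(const PushBuffer& pb) {
    return pb.words.size();
}
```

Because the recorded words are already in the GPU's native format, replaying a buffer is nearly free for the CPU, which is exactly why a game like B3 leans on them so heavily.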
The problem is, push buffers don't exist on Windows. They can't really exist in the same capacity because, unlike on the Xbox, you don't know what GPU any given Windows user will have, so you can't pre-generate native instructions for it. A push buffer full of NV2A instructions means next to nothing to the average Windows PC.
Okay, well if Cxbx is already injecting a bunch of code to replace Xbox functions, can't you replace the push buffer functions too (at least for the stuff generated at runtime)?
Well, that leads to another big roadblock for Cxbx: the XDK used aggressive optimization, especially for later Xbox games like B3. I guess MS really wanted to squeeze as much performance out of the Xbox as possible, because a ton of Xbox APIs get forcibly inlined into each game's code, including many of the push buffer functions. A lot of later games also use LTCG (link-time code generation), where API code linked into the game is altered even further to improve performance.
Both of these make it almost impossible to automatically identify and replace those functions, since their code may take on different forms, not just between games, but between each place a function is used within the same game. This is what I was alluding to in The Game Plan about how Cxbx faces unique challenges trying to wrap every game in the Xbox library.
The only real way to solve this for every game would be to add a push buffer interpreter - something that reads NV2A instructions and reproduces the intended results on another platform. In other words, an Xbox GPU emulator, which doesn't really fit an injection-only/HLE approach.
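To give a feel for what such an interpreter would have to do, here's a greatly simplified sketch of the core loop. The real NV2A command stream also encodes subchannels, jumps, and non-incrementing methods; this only handles the basic "method header followed by N arguments" shape, and the bit positions are my reading of community documentation, not anything authoritative:

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// One decoded GPU command: a method address plus its arguments.
struct Command {
    uint32_t method;
    std::vector<uint32_t> args;
};

// Walk the buffer, splitting it into (method, args) pairs. A full
// interpreter would dispatch each method to an equivalent call on
// the host GPU instead of just collecting them.
std::vector<Command> interpret(const std::vector<uint32_t>& words) {
    std::vector<Command> out;
    std::size_t i = 0;
    while (i < words.size()) {
        uint32_t header = words[i++];
        uint32_t count  = (header >> 18) & 0x7FF; // argument count
        uint32_t method = header & 0x1FFC;        // method address
        Command cmd{method, {}};
        for (uint32_t n = 0; n < count && i < words.size(); ++n)
            cmd.args.push_back(words[i++]);
        out.push_back(cmd);
    }
    return out;
}
```

The decoding part is the easy bit; the hard part is the dispatch table behind it, which has to faithfully reproduce hundreds of NV2A methods on hardware that works nothing like it.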
Since I'm focusing solely on one single game, I'm not bound by trying to find automated ways to do this. Instead of working towards emulating the Xbox's GPU, my plan is to rewrite/reinject pieces of the B3 code so that it hopefully no longer uses push buffers at all. Again, I consider this a PC port, not emulation, so if I have the option of reimplementing parts of the Xbox vs. shifting away from it, I'll pick the latter where possible. I also think that's going to be the faster approach.
So far I've been making good progress finding the functions responsible for drawing the 3D scenery, and I think I've even found some of the push buffer related functions. It can be hard to tell what code is B3's and what code is D3D8's because they're so aggressively inlined, but I've been able to match some code patterns to Xbox D3D8 code, so I think I'm on the right track.
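Matching code patterns like this is essentially wildcard signature scanning: you look for a known byte sequence, but mark the bytes that vary between call sites (inlined immediates, relocated addresses) as wildcards. A minimal sketch of the general technique - the byte values in the usage below are placeholders, not real B3 or D3D8 code:

```cpp
#include <cstddef>
#include <cstdint>
#include <optional>
#include <vector>

// Find 'pattern' inside 'code'. mask[j] == false makes byte j a
// wildcard, so inlined constants that differ per call site still
// match. Returns the offset of the first hit, if any.
std::optional<std::size_t> find_pattern(const std::vector<uint8_t>& code,
                                        const std::vector<uint8_t>& pattern,
                                        const std::vector<bool>& mask) {
    if (pattern.empty() || pattern.size() > code.size())
        return std::nullopt;
    for (std::size_t i = 0; i + pattern.size() <= code.size(); ++i) {
        bool hit = true;
        for (std::size_t j = 0; j < pattern.size(); ++j) {
            if (mask[j] && code[i + j] != pattern[j]) {
                hit = false;
                break;
            }
        }
        if (hit) return i;
    }
    return std::nullopt;
}
```

The more aggressively a function gets inlined, the more of its signature has to become wildcards, which is exactly why this approach stops scaling across the whole Xbox library - and why it's still workable when you only care about one game.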
I'll update this thread as I continue to make progress. Larger discoveries (those that can be neatly contained in one post) will continue to get their own threads, but more casual posts like this one will go in here.
And I also have to talk about my secret weapon(s) at some point...