VirGL for old versions of Mac OS X?
Posted: Wed Jun 07, 2023 4:34 am
So far I've been treating "retro GPU passthrough" as a kind of preservation project which allows people to get old OSes running to their fullest extent without needing a whole PC from that era. But there's a snag: it still requires some old hardware. Not a whole computer, but a GPU, which will almost certainly die one day (in fact, it's arguably one of the most failure-prone components in a PC).
While I like hardware, I also very much understand the only way to truly preserve something is to emulate it. No physical object lasts forever, but data and software can, simply because they can transcend the medium they're originally stored on.
So while my HD 2400 Pro still doesn't fully work in OS X, and I've been entertaining reverse engineering the old ATI kexts to figure out what's wrong, I've been wondering if there are perhaps better solutions that wouldn't rely on anyone else needing to have this specific card.
I briefly considered emulating a real Mac version of the HD 2400 (or something similar), and while I'm sure that would be an intellectually stimulating project, my guess is it would take years (assuming I didn't give up after a single month).
But there is a Linux technology that's rather interesting. When an OS wants to use OpenGL, it needs to open a rendering context on a GPU. VirGL allows a guest VM to open an OpenGL context on the host GPU instead. I've been testing it out and it actually seems to work pretty well. Since we're talking about older versions of OS X here (pre-Metal), where the graphics stack is entirely OpenGL, this could, at least in theory, make for a pretty good solution: no extra hardware needed, and none of the overhead of emulating an entire GPU.
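For anyone who wants to try this with a Linux guest, a minimal QEMU invocation looks something like the sketch below. It assumes a reasonably recent QEMU built with virglrenderer support; `guest.img` is just a placeholder disk image name, and the memory size is arbitrary:

```shell
# Launch a Linux guest with VirGL-accelerated OpenGL.
# -device virtio-vga-gl exposes a virtio GPU with VirGL enabled
# (older QEMU versions used "-device virtio-vga,virgl=on" instead),
# and gl=on tells the display backend to use an OpenGL context.
qemu-system-x86_64 \
  -enable-kvm \
  -m 4G \
  -device virtio-vga-gl \
  -display gtk,gl=on \
  -drive file=guest.img,format=raw
```

Inside the guest, `glxinfo | grep renderer` should then report a "virgl" renderer rather than llvmpipe software rendering.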
The only problem is that this technology currently only works with Linux guests. It's not that it can't work on other OSes. Windows primarily uses DirectX, so OpenGL passthrough isn't that useful there (a driver was apparently being worked on but seems to have been abandoned), and Apple has never publicly documented how to write a 3D accelerator driver (pretty much entirely so people can't get a good experience in a VM).
But none of that means it's impossible; it just means it requires a little reverse engineering. There's also already an open-source framebuffer kext for running OS X in VMware/QEMU (IOFramebuffer, unlike the 3D side, is publicly documented), so at least some of the work is already done.
I'm not saying I'm going to start work on this any time soon (already got my hands full with other projects), but I wanted to hear if anyone else had any thoughts, and say that it's something I might be interested in working on one day.