Let me argue the opposite approach for a bit: virtualizing the VGA
access.

There are several advantages to this: 1) It would provide support for
OSes other than Windows ( how does running Windows within X within X
within X sound? ). 2) Full-screen mode could be modified, as appropriate,
to bypass the virtualization so apps could still run in
full-screen. 3) With a "shallow" virtualization there would be no
noticeable loss in performance.

Point 3 is the probable sticking point. The basic problem is mapping one
video system into another. By translating the VGA registers and the like
into a simplified set ( perhaps do what some of the Commodore emulators
do and have a sub-thread perform a scanline sweep to determine what to
display ), I don't think performance would be impacted that much. This
would also avoid having to worry about the amount of memory in the
video buffer, etc.

The virtualization I'm thinking of would implement a modest but capable
"video set", and the guest OS would then use a driver written for that
set.

The problems I see with not virtualizing are that you'll basically be
unable to run plex86 more than once. This would arise from the issue of
overriding management of the video card's RAM. There would also be
problems with running it windowed under different color settings, etc.
Doing partial virtualization in windowed mode would run into color
translation, bitmap plane issues, etc.
