Re: [Dri-devel] Dual-head direct 3D rendering working
On Tue, 2003-03-04 at 11:36, Michel Dänzer wrote:
> > I have not provided a diff because it is quite a hack and very system
> > specific at the moment. Effectively, I forced the virtual size to be
> > 2048x768, hacked the RADEONDoAdjustFrame() function to fix the views as
> > I wanted them, used the default cloning code to set up the second
> > monitor, and removed all the conditionals that were preventing
> > dual-head + DRI from working.
>
> Do those apply in this case as well? Are you using two entities?

I should have been more specific: I removed the conditionals that were
preventing Xinerama + DRI from working. As I mentioned before, if Xinerama
is disabled, the desktop ends at 1024 rather than at 2048.

--Jonathan Thambidurai

---
This SF.net email is sponsored by: Etnus, makers of TotalView, the debugger
for complex code. Debugging C/C++ programs can leave you feeling lost and
disoriented. TotalView can help you find your way. Available on major UNIX
and Linux platforms. Try it free. www.etnus.com
___
Dri-devel mailing list
[EMAIL PROTECTED]
https://lists.sourceforge.net/lists/listinfo/dri-devel
Re: [Dri-devel] Dual-head direct 3D rendering working
On Wed, Mar 05, 2003 at 11:06:45AM -0500, Jonathan Thambidurai wrote:
> On Tue, 2003-03-04 at 11:36, Michel Dänzer wrote:
> > > I have not provided a diff because it is quite a hack and very
> > > system specific at the moment. [...]
> >
> > Do those apply in this case as well? Are you using two entities?
>
> I should have been more specific: I removed the conditionals that were
> preventing Xinerama + DRI from working. As I mentioned before, if
> Xinerama is disabled, the desktop ends at 1024 rather than at 2048.

BTW, how does this behave with a desktop manager, like the GNOME 2.2
desktop for example, which can use Xinerama hints to draw itself
correctly? You don't touch Xinerama at all in the driver, do you? You
just use the Virtual and ViewPort options (in the configuration file) to
set up your screens accordingly, right? Does it not cause problems when
you move the viewport inside the virtual space? Or did you disable that?

Friendly,

Sven Luther
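For reference, the Virtual and ViewPort entries Sven mentions are
standard XF86Config Display-subsection keywords; a setup like the one
described above might look roughly like this (identifiers are
placeholders, and the clone option is omitted since its name varies by
driver and version):

```
Section "Screen"
    Identifier "Screen0"
    Device     "Radeon0"
    Monitor    "Monitor0"
    DefaultDepth 24
    SubSection "Display"
        Depth    24
        Modes    "1024x768"
        Virtual  2048 768    # desktop spans both heads
        ViewPort 0 0         # this head's view into the virtual area
    EndSubSection
EndSection
```

A second head showing the right half would then view offset 1024 0 into
the same virtual area.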
Re: [Dri-devel] Dual-head direct 3D rendering working
On Die, 2003-03-04 at 01:52, Jonathan Thambidurai wrote:
> I am pleased to report that, thanks to the guidance Jens Owen gave in a
> previous message, I have made 3D work on two heads simultaneously
> (IIRC, the ATI Windows XP drivers didn't do this).

Coolness.

> I have not provided a diff because it is quite a hack and very system
> specific at the moment. Effectively, I forced the virtual size to be
> 2048x768, hacked the RADEONDoAdjustFrame() function to fix the views as
> I wanted them, used the default cloning code to set up the second
> monitor, and removed all the conditionals that were preventing
> dual-head + DRI from working.

Do those apply in this case as well? Are you using two entities?

> Does anyone have any ideas for a more elegant implementation of this
> functionality, especially where the config file is concerned? This is
> the first real code I have done in the X server, and any input would be
> appreciated.

This is being discussed on the [EMAIL PROTECTED] list in the thread
'Multiple video consoles'.

-- 
Earthling Michel Dänzer (MrCooper) / Debian GNU/Linux (powerpc) developer
XFree86 and DRI project member / CS student, Free Software enthusiast
Re: [Dri-devel] Dual-head (also S3 savage Duoview)
Hello,

... As you may have noticed, I have started a (sub)thread with David
Dawes on this subject on the xfree86 list.

Friendly,

Sven Luther
Re: [Dri-devel] Dual-head (also S3 savage Duoview)
On Thu, Feb 27, 2003 at 02:14:37AM +0100, Michel Dänzer wrote:
> On Mit, 2003-02-26 at 18:16, Alex Deucher wrote:
> > --- Sven Luther [EMAIL PROTECTED] wrote:
> > > [ video memory management ]
> > > How is it done right now? Is a part of the on-chip memory reserved
> > > for the framebuffer and XAA, and another part free for 3D use?
> >
> > Not sure. I'm not familiar with the memory manager either. I seem to
> > recall some drivers have the (compile time) option of allocating more
> > or less to 2D vs 3D. I believe it was the mga driver, and the issue
> > was not having enough memory for Xv because 3D had reserved too much.
>
> Some drivers (tdfx and radeon at least) only reserve offscreen memory
> for 3D when it's actually used; the amount is still static though.

Do they get it from the OS memory manager, or do they use another trick?

Friendly,

Sven Luther
Re: [Dri-devel] Dual-head (also S3 savage Duoview)
On Thu, Feb 27, 2003 at 02:12:24AM +0100, Michel Dänzer wrote:
> On Mit, 2003-02-26 at 21:11, Sven Luther wrote:
> > [...] because the DRI is just rendering to the framebuffer, it
> > doesn't know whether you are displaying it or not, and doesn't even
> > care. The only issue is with the size limits of the 3D engine; like
> > Michel said, the Radeon 3D engine is limited to 2Kx2K, which would
> > mean a maximum resolution of 1024x768, or 1280x1024 if you stack the
> > screens vertically. I don't know the Radeon specs, but I guess it
> > should be possible to work around this by changing the base address,
> > at least in the vertical stacking case; in the horizontal case, the
> > screen stride may become a problem.
>
> But then it's no longer a shared framebuffer, insofar as the 3D parts
> need to support it explicitly.

Well, sure ... That said, I am not sure I agree with you here. I don't
really know how the ATI cards work, but as I see it, you have the
framebuffer and its screen stride (is this one also limited to 2048, or
only the coordinates?), and then you have the window you are rendering
into. I have the feeling that, provided the screen stride can be big
enough, it would be enough to set the screen base to the top left corner
of the 3D window, and then render into it. This would be part of the
context information, or whatever. We would still have a limit of a
maximum 2048x2048 OpenGL window, but well, if your hardware can't handle
more, there is no chance to do more. This would allow this scheme with
minimal support from the 3D parts.

Friendly,

Sven Luther
Re: [Dri-devel] Dual-head (also S3 savage Duoview)
On Don, 2003-02-27 at 09:33, Sven Luther wrote:
> On Thu, Feb 27, 2003 at 02:14:37AM +0100, Michel Dänzer wrote:
> > Some drivers (tdfx and radeon at least) only reserve offscreen memory
> > for 3D when it's actually used; the amount is still static though.
>
> Do they get it from the OS memory manager, or do they use another
> trick?

See RADEONDRITransitionTo{2,3}d().

-- 
Earthling Michel Dänzer (MrCooper) / Debian GNU/Linux (powerpc) developer
XFree86 and DRI project member / CS student, Free Software enthusiast
Re: [Dri-devel] Dual-head (also S3 savage Duoview)
On Thu, Feb 27, 2003 at 06:58:42PM +0100, Michel Dänzer wrote:
> On Don, 2003-02-27 at 09:33, Sven Luther wrote:
> > Do they get it from the OS memory manager, or do they use another
> > trick?
>
> See RADEONDRITransitionTo{2,3}d().

Ok, thanks. They do use the OS memory manager.

Friendly,

Sven Luther
Re: [Dri-devel] Dual-head (also S3 savage Duoview)
On Mon, Feb 24, 2003 at 01:28:15PM -0800, Alex Deucher wrote:
> Right now for these chips you set up the entity as shareable and then
> divide your framebuffer into two or more framebuffers, one for each
> CRTC. Each instance of the driver then works using its framebuffer.
> Each head is distinguished using the screen 0/1/etc. flag in the
> XF86Config file. (Entity functions and operations for this sort of
> thing are not so well documented, at least not that I could find.)
> Each pScrn gets its framebuffer size and address mapped to the part of
> video RAM you allocated for each framebuffer.

Yes, and you have to divide the fb memory in two, one part for each head,
or something such, and each head will have its separate offscreen memory
manager, possibly using different screen strides. It should be possible
to allocate memory for both framebuffers and have the offscreen memory
manager be part of the EntRec, or something such, that is common to both
heads. I haven't looked at it much yet. Also, this would need to be
coordinated with Ian's work on the texmem memory manager, in particular
the open point on XAA memory integration.

How is it done right now? Is a part of the on-chip memory reserved for
the framebuffer and XAA, and another part free for 3D use?

> For chips with dual CRTCs you could do just what the diagram below
> shows and create a single framebuffer with two sorts of virtual
> framebuffers. I'm not sure how this would work with the existing
> entity code. I guess as far as XF86Config goes you have to create a
> custom flag. As Sven said, this is I think what the Matrox and Nvidia
> drivers do. They can also announce Xinerama for the benefit of window
> managers and such (even though it's technically not two pScrns). You
> could add the stuff from the device specific EntRecs to device
> specific Recs. Then each pScrn would be responsible for not only the
> framebuffer base and address but also the primary and secondary
> virtual framebuffers and addresses. The main framebuffer would hold
> the whole contents of the screen, and each virtual framebuffer would
> basically be a viewport into the total screen. I haven't had time to
> think this through thoroughly and I'm already starting to have
> questions... I dunno... food for thought I guess.

Notice that the current entity sharing stuff does distinguish between the
primary head and the secondary head, so you could just test if you are on
the primary head, and do the offscreen memory allocation and (double)
framebuffer reservation on the secondary head (the one that gets
initialized second), so we know the size of both framebuffers. Naturally,
this info should be shareable, but since I think it will not change, it
is also ok to have it in the device specific Rec.

Now, then the only thing that would differ would be the viewport, which
should not cause problems. Note that we maybe need to separate the
modeinit stuff into one part which will be responsible for setting up the
framebuffers, and another part for setting up the viewport (the video
mode on both heads, that is). This way, you would still have two screen
sections in the config file; the only difference from traditional dual
head would be that we specify some flag to have it handled like that.

Now, let's see what configuration options we would like for such a setup.
The following things come to mind (I don't know if all the cards, or even
mine, support all of these, but assuredly future boards will).

o Traditional dual head.
  = No additional flag is used.
o Double buffer dual head (what I just described).
  = We use the Double or whatever flag; we can also use an additional
    (optional) flag or argument to give the head mapping (after all,
    there is no reason why head 0 should be the primary head, and vice
    versa).
o Mirrored viewports.
  = We use a mirror flag; both heads will be set to the same viewport.
o Zoomed window.
  = One of the heads will have a viewport corresponding to a subpart of
    the other, with optional zooming if the sizes don't correspond.

Maybe the two latter could be merged, with some clever option handling or
such. Are there other things I am missing here?

Also, as a later point, it would be nice if these things could be changed
dynamically, maybe as a response to some special keystroke, like they
have on laptops (or do these keystrokes work even if there is no driver
support for them?).

Notice that this should work fine for most boards, even those having an
off-chip second RAMDAC, or a third one like the Matrox Parhelia. There
may be some issues, though, with the accel engine of some chips which can
only handle a 2048 width, but I guess these sizes will increase in the
future; anyway, we could stack the screens vertically, thus not reaching
too big a screen stride, and have the function which sets up each screen
move the base address or something such. This would be a problem if you
want to see both screens side by side, though. Now, this was the easy
part; what are the problems we are facing? [...]
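The configuration options proposed above might be expressed in a screen
section roughly like this (the option names here are invented purely for
illustration; no driver implements them):

```
Section "Screen"
    Identifier "Screen1"
    Device     "Card0-Head1"
    Monitor    "Monitor1"
    # Hypothetical options for the proposal above:
    Option "SharedFramebuffer"           # the "Double" flag
    Option "HeadMapping" "1,0"           # optional primary/secondary swap
    # Option "Mirror"                    # mirrored viewports
    # Option "Zoom" "640x480+100+100"    # zoomed window into the other head
EndSection
```

Keeping these as per-screen options preserves the traditional two-screen
config layout, with only the flags differing from a classic dual head
setup.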
Re: [Dri-devel] Dual-head (also S3 savage Duoview)
--- Sven Luther [EMAIL PROTECTED] wrote:
> On Mon, Feb 24, 2003 at 01:28:15PM -0800, Alex Deucher wrote:
> > Right now for these chips you set up the entity as shareable and
> > then divide your framebuffer into two or more framebuffers, one for
> > each CRTC. [...]
>
> Yes, and you have to divide the fb memory in two, one part for each
> head, or something such, and each head will have its separate
> offscreen memory manager, possibly using different screen strides.
> [...] Also, this would need to be coordinated with Ian's work on the
> texmem memory manager, in particular the open point on XAA memory
> integration.
>
> How is it done right now? Is a part of the on-chip memory reserved for
> the framebuffer and XAA, and another part free for 3D use?

Not sure. I'm not familiar with the memory manager either. I seem to
recall some drivers have the (compile time) option of allocating more or
less to 2D vs 3D. I believe it was the mga driver, and the issue was not
having enough memory for Xv because 3D had reserved too much.

> > For chips with dual CRTCs you could do just what the diagram below
> > shows and create a single framebuffer with two sorts of virtual
> > framebuffers. [...] The main framebuffer would hold the whole
> > contents of the screen, and each virtual framebuffer would basically
> > be a viewport into the total screen. [...]
>
> Notice that the current entity sharing stuff does distinguish between
> the primary head and the secondary head, so you could just test if you
> are on the primary head, and do the offscreen memory allocation and
> (double) framebuffer reservation on the secondary head (the one that
> gets initialized second), so we know the size of both framebuffers.
> Naturally, this info should be shareable, but since I think it will
> not change, it is also ok to have it in the device specific Rec.

Would this work with the current shareable entity stuff? It seems like
that would predicate two separate instances of the driver; in this case
we would only want one, right? One instance driving two heads.

> Now, let's see what configuration options we would like for such a
> setup. [...]
>
> o Mirrored viewports.
>   = We use a mirror flag; both heads will be set to the same viewport.
> o Zoomed window.
>   = One of the heads will have a viewport corresponding to a subpart
>     of the other, with optional zooming if the sizes don't correspond.
>
> Maybe the two latter could be merged, with some clever option handling
> or such. Are there other things I am missing here?

Maybe you could make the zoomed mode part of the mirror mode, but specify
the viewport in the screen section of the XF86Config.

> Also, as a later point, it would be nice if these things could be
> changed dynamically [...]
Re: [Dri-devel] Dual-head (also S3 savage Duoview)
On Wed, 26 Feb 2003, Sven Luther wrote:
> Yes, and you have to divide the fb memory in two, one part for each
> head, or something such, and each head will have its separate offscreen
> memory manager, possibly using different screen strides.

Side note: I know that what people are mostly talking about is having two
separate displays with different contents, but please, if you're thinking
about this, try to make the solution generic enough that you can have two
separate displays with the _same_ backing store content at different
resolutions and different pointers.

Yeah, not all chips support this, but many do (and probably all that
support multiview support this subset), and it's invaluable for laptops
that have small LCDs. In particular, it should be possible to have the
pointer associated with the LCD, and scroll around on the LCD while the
CRT output (i.e. usually a projector) shows the whole picture (obviously
without scrolling or any pointer).

Right now, as far as I can tell, XFree86 cannot do this sanely. You can
have two separate X servers for the different outputs, or you can have
the exact _same_ output on both CRT controllers, but you can't make the
two displays look like separate windows into the same area.

And it really sounds like the DRI dual-head is not that conceptually
different from this. The only issue is whether you share the frame buffer
or not. So you have several cases:

 - shared framebuffer, shared CRT control
 - shared framebuffer, but separate CRT control (and mouse focus or
   whatnot)
 - separate framebuffers, and separate CRT control (and mouse focus)

Is this what you call mirrored viewports?

Linus
Re: [Dri-devel] Dual-head (also S3 savage Duoview)
On Wed, Feb 26, 2003 at 09:16:53AM -0800, Alex Deucher wrote:
> --- Sven Luther [EMAIL PROTECTED] wrote:
> > How is it done right now? Is a part of the on-chip memory reserved
> > for the framebuffer and XAA, and another part free for 3D use?
>
> Not sure. I'm not familiar with the memory manager either. I seem to
> recall some drivers have the (compile time) option of allocating more
> or less to 2D vs 3D. I believe it was the mga driver, and the issue
> was not having enough memory for Xv because 3D had reserved too much.

Yes, I too am faced with the choice of how to allocate memory, although I
have not yet gotten to DRI support, and it may also depend on the
specific graphics hardware, like Michel said.

> > Notice that the current entity sharing stuff does distinguish
> > between the primary head and the secondary head, so you could just
> > test if you are on the primary head, and do the offscreen memory
> > allocation and (double) framebuffer reservation on the secondary
> > head (the one that gets initialized second), so we know the size of
> > both framebuffers. [...]
>
> Would this work with the current shareable entity stuff? It seems like
> that would predicate two separate instances of the driver; in this
> case we would only want one, right? One instance driving two heads.

No, I think it would work ok; I would need to test though, and I don't
have much time for it right now. When we are in the chipset probe
function, we set the entity as shareable and allocate the private EntRec,
as well as handle some special cases there. Once we are in PreInit, or
even in ScreenInit, we can do finer tests, and postpone the memory
reservation and such until we have info on both screens. We are not
showing anything before ModeInit is called anyway, and both ScreenInits
are called successively, if I remember well. Some care and checking is
needed, though.

> > o Mirrored viewports.
> >   = We use a mirror flag; both heads will be set to the same
> >     viewport.
> > o Zoomed window.
> >   = One of the heads will have a viewport corresponding to a
> >     subpart of the other, with optional zooming if the sizes don't
> >     correspond.
> >
> > Maybe the two latter could be merged, with some clever option
> > handling or such. Are there other things I am missing here?
>
> Maybe you could make the zoomed mode part of the mirror mode, but
> specify the viewport in the screen section of the XF86Config.

Mmm, and how do you set the zoom values? You know, I think this could
work, as the info in the screen section is used to call the modeinit
function. There are also X/Y mirrored modes, and rotated modes, but I
guess not all hardware can do those, and they would be difficult to
implement.

> > Also, as a later point, it would be nice if these things could be
> > changed dynamically, maybe as a response to some special keystroke,
> > like they have on laptops (or do these keystrokes work even if
> > there is no driver support for them?).
>
> I think several driver authors have brought this up before. I don't
> think there was a way for the driver to intercept keystrokes. Thomas
> Winischhofer brought it up on one of the XFree86 MLs, but I'm not sure
> if it was ever resolved.

Ok, ...

> > Now, this was the easy part; what are the problems we are facing?
> >
> > o We need to make Xinerama aware of this.
> > o Xv support (and possibly HW cursor). This works fine for most 2D
> >   drawing and probably 3D drawing, but the video or cursor overlays
> >   will not know about it. We will still need to do those per screen,
> >   which may not be possible on all dual head boards.
>
> Yeah. Most dualhead boards have two HW cursors; some also have two
> overlays, so those could be set up screen specifically. However, for
> cards with one overlay, would it be possible to use the overlay on
> either head? Say, if more of the window is on head 1, use the overlay
> with that CRTC; if more is on head 2, then use it with that CRTC, and
> specify head 1 as the default for cases with an even divide. I know
> most boards with one overlay can usually choose which head to use it
> on. In fact the Matrox driver for BeOS works that way; the source is
> even available.

I don't really know; some allow the overlay to be used only in single
head mode, I guess.

> > o 3D memory management. Ideally we would use Ian's new [...]
Re: [Dri-devel] Dual-head (also S3 savage Duoview)
--- Linus Torvalds [EMAIL PROTECTED] wrote:
> On Wed, 26 Feb 2003, Sven Luther wrote:
> > Yes, and you have to divide the fb memory in two, one part for each
> > head, or something such, and each head will have its separate
> > offscreen memory manager, possibly using different screen strides.
>
> Side note: I know that what people are mostly talking about is having
> two separate displays with different contents, but please, if you're
> thinking about this, try to make the solution generic enough that you
> can have two separate displays with the _same_ backing store content
> at different resolutions and different pointers.
> [...]
> So you have several cases:
>
>  - shared framebuffer, shared CRT control
>  - shared framebuffer, but separate CRT control (and mouse focus or
>    whatnot)
>  - separate framebuffers, and separate CRT control (and mouse focus)
>
> Is this what you call mirrored viewports?

I think we are all talking about the same thing. What you described above
might be covered by the zoomed mode (you could zoom in or out), which is
kind of a subset of the mirrored mode. In that case the second CRTC would
be a viewport into the same framebuffer, only with a different
resolution.

I agree that XFree86 doesn't really have a good interface for this at the
moment; it tends to be too entity specific.

Alex
Re: [Dri-devel] Dual-head (also S3 savage Duoview)
On Wed, Feb 26, 2003 at 09:40:18AM -0800, Linus Torvalds wrote: On Wed, 26 Feb 2003, Sven Luther wrote: Yes, and you have to divide the fb memory in two, one for each head, or something such, and each head will have its separate offscreen memory manager, possibly using different screen strides. Side note: I know that what people are mostly talking about is having two separate displays with different contents, but please, if you're thinking about this, try to make the solution generic enough that you can have two separate displays with the _same_ backing store content at different resolutions and different pointers. Yes, this was the spirit of the proposal, see below for details. You cannot have different depth modes though. Yeah, not all chips support this, but many do (and probably all that support multiview support this subset), and it's invaluable for having laptops that have small LCD's. In particular, it should be possible to have the pointer associated with the LCD, and scroll around on the LCD while the CRT output (ie usually a projector) shows the whole picture (obviously without scrolling or without any pointer). See below. Right now, as far as I can tell, XFree86 can not do this sanely. You can have two separate X servers for the different outputs, or you can have the exact _same_ output on both CRT controllers, but you can't make the two displays look like separate windows into the same area. Well, XFree86 Does not do it, and there is no way you can configure it, but the drivers can be made to handle such things, even dynamically i think. After all at least the matrox proprietary driver, and i hear the nvidia one does already. And it really sounds like the DRI dual-head is not that conceptually different from this. The only issue is whether you share the frame buffer or not. Yes, because the DRI is just rendering to the framebuffer, it doesn't know if you are displaying it or not, and doesn't even care. 
The only issue is with size limits of the 3D engine, like Michel said, with the Radeon 3D engine being limited to 2Kx2K, which would mean a maximum resolution of 1024x768 or 1280x1024 if you stapple the screen vertically. I don't know the radeon specs, but i guess it should be possible to work around this by changin the base address, at least in the vertical stapling case, in the horizontal case, screen stride may become a problem. So you have several cases: - shared framebuffer, shared CRT control - shared framebuffer, but separate CRT control (and mouse focus or whatnot) Mmm, didn't think about mouse focus. - separate framebuffers, and separate CRT control (and mouse focus) That is the traditional dual head, where the separate displayes can later be joined via Xinerama. Is this what you call mirrored viewports? Yes, sort of. o you can have the traditional dualhead, with two separate framebuffers each with only one viewport. This is what is currently used, and the only way to do dualhead with two single head graphic boards. o you can have shared framebuffer dual head, with two viewports on the same framebuffer, this is what was shown in the original diagram. o then you can choose to have one viewport be a mirror of the second. I believe most dual head cards boot into such modes. If one screen as a bigger resolution than the other (1024x768 LCD screen and 800x600 video projector for example), then one of the modes (usually the bigger one) can be set to be a zoomed version of the other, if your hardware supports this. o finally you can have zoomed viewports, when one is the main viewport, and the second show a zoomed version of a windows (or subset) of the other. I think in the matrox window driver this can be set dynamically, where you select a window, and it get zoomed on the second head. I don't know, but i suppose that the second display follows, even if the window is redimensioned or something such. 
The first is the current situation and cannot be emulated by the others, but I think a more flexible framework could encompass the other three. Basically, you would specify the framebuffer size and the corresponding viewports for each head separately and independently. Dynamic changing of viewports could maybe be done by an extension of the RandR extension, which already does something similar for resolution switching, but I have not looked at it yet. This would still only be somewhat hacky right now, but could be nicely formalized for 5.0. Was this what you had in mind, or do you need some other functionality? Friendly, Sven Luther --- This SF.net email is sponsored by: Scholarships for Techies! Can't afford IT training? All 2003 ictp students receive scholarships. Get hands-on training in Microsoft, Cisco, Sun, Linux/UNIX, and more. www.ictp.com/training/sourceforge.asp
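Sven's shared-framebuffer proposal boils down to programming each CRTC's display base address from an independent viewport origin inside one virtual screen. A minimal sketch of that idea; the type and field names here are hypothetical illustrations, not taken from any actual driver:

```c
#include <assert.h>
#include <stddef.h>

/* One viewport per head, positioned inside the shared virtual screen. */
typedef struct {
    int x, y;          /* viewport origin inside the virtual screen */
    int width, height; /* mode size shown on this head */
} Viewport;

typedef struct {
    int virt_width, virt_height; /* virtual (shared) screen size */
    int bytes_per_pixel;
    int pitch;                   /* scanline stride in bytes */
    Viewport head[2];
} SharedFb;

/* Byte offset a CRTC's base-address register would be programmed with:
 * start of the viewport's top-left pixel within the framebuffer. */
static size_t crtc_base_offset(const SharedFb *fb, int head)
{
    const Viewport *v = &fb->head[head];
    return (size_t)v->y * (size_t)fb->pitch
         + (size_t)v->x * (size_t)fb->bytes_per_pixel;
}
```

With a 2048x768 virtual screen at 32 bpp and the second head's viewport at x=1024, the second CRTC simply scans out from byte offset 1024*4 of the same framebuffer; mirroring is the degenerate case where both viewports share the same origin.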
Re: [Dri-devel] Dual-head (also S3 savage Duoview)
On Mit, 2003-02-26 at 21:11, Sven Luther wrote: [...] because the DRI is just rendering to the framebuffer, it doesn't know if you are displaying it or not, and doesn't even care. The only issue is with size limits of the 3D engine; like Michel said, the Radeon 3D engine is limited to 2Kx2K, which would mean a maximum resolution of 1024x768, or 1280x1024 if you stack the screens vertically. I don't know the Radeon specs, but I guess it should be possible to work around this by changing the base address, at least in the vertical stacking case; in the horizontal case, the screen stride may become a problem. But then it's no longer a shared framebuffer in so far as the 3D parts need to support it explicitly. -- Earthling Michel Dänzer (MrCooper)/ Debian GNU/Linux (powerpc) developer XFree86 and DRI project member / CS student, Free Software enthusiast ___ Dri-devel mailing list [EMAIL PROTECTED] https://lists.sourceforge.net/lists/listinfo/dri-devel
Re: [Dri-devel] Dual-head (also S3 savage Duoview)
On Mit, 2003-02-26 at 18:16, Alex Deucher wrote: --- Sven Luther [EMAIL PROTECTED] wrote: [ video memory management ] How is it done right now? Is a part of the on-chip memory reserved for the framebuffer and XAA, and another part free for 3D use? Not sure. I'm not familiar with the memory manager either. I seem to recall some drivers have the (compile time) option of allocating more or less to 2D vs 3D. I believe it was the mga driver, and the issue was not having enough memory for Xv because 3D had reserved too much. Some drivers (tdfx and radeon at least) only reserve offscreen memory for 3D when it's actually used; the amount is still static though. -- Earthling Michel Dänzer (MrCooper)/ Debian GNU/Linux (powerpc) developer XFree86 and DRI project member / CS student, Free Software enthusiast
Re: [Dri-devel] Dual-head
On Mon, Feb 24, 2003 at 07:58:55AM -0700, Jens Owen wrote: A short cut to this whole thing would be to work on getting a second head supported on a single X11 screen. Then 3D comes for free: http://www.tungstengraphics.com/dri/Simple_Xinerama_DH.txt This solution provides Xinerama functionality without actually using the Xinerama wrapper. Mmm, I am curious about this: how does it get handled in the XFree86 configuration file? Also, I guess this would facilitate the memory management in dual-head configurations. In general, dual-head configuration is pretty poor in the current X server, at least in the documented part. There is no way of specifying mirrored or zoomed window output, or of changing the configuration dynamically, but this would probably not be achievable without an extension, or maybe could be incorporated with the RandR stuff. I think Matrox did something like that in their proprietary drivers. Friendly, Sven Luther
Re: [Dri-devel] Dual-head
Sven Luther wrote: On Mon, Feb 24, 2003 at 07:58:55AM -0700, Jens Owen wrote: A short cut to this whole thing would be to work on getting a second head supported on a single X11 screen. Then 3D comes for free: http://www.tungstengraphics.com/dri/Simple_Xinerama_DH.txt This solution provides Xinerama functionality without actually using the Xinerama wrapper. Mmm, I am curious about this: how does it get handled in the XFree86 configuration file? Possibly by adding a secondary monitor line to the screens section. Also, I guess this would facilitate the memory management in dual-head configurations. Yes, the driver would need to handle how the memory is shared. In general, dual-head configuration is pretty poor in the current X server, at least in the documented part. There is no way of specifying mirrored or zoomed window output, or of changing the configuration dynamically, but this would probably not be achievable without an extension, or maybe could be incorporated with the RandR stuff. If you want things to be dynamic, you will need to stay away from X's notion of a second X11 screen. That is static and persistent by definition. However, a secondary monitor on the primary screen could be as dynamic as you want to make it. I think Matrox did something like that in their proprietary drivers. There are other proprietary drivers that have implemented similar functionality. However, I haven't seen anything in open source. -- /\ Jens Owen/ \/\ _ [EMAIL PROTECTED] /\ \ \ Steamboat Springs, Colorado
Re: [Dri-devel] Dual-head
http://www.tungstengraphics.com/dri/Simple_Xinerama_DH.txt This solution provides Xinerama functionality without actually using the Xinerama wrapper. Could you please point me to the code which needs to be worked on for the above solution so that I know where to start? Would I have to work with Xlib and the OpenGL libraries, or might the work be limited to the drivers? No, you don't. :) You can start an additional server and switch between the two; gdm makes that particularly easy. Thanks for the information. --Jonathan Thambidurai
Re: [Dri-devel] Dual-head
A short cut to this whole thing would be to work on getting a second head supported on a single X11 screen. Then 3D comes for free: http://www.tungstengraphics.com/dri/Simple_Xinerama_DH.txt This solution provides Xinerama functionality without actually using the Xinerama wrapper. Wouldn't this cause problems with the window manager (maximizing windows etc.) and for fullscreen games? Part of the purpose of Xinerama is to provide hints to apps and WMs for screen layout purposes. Personally I am quite happy with having multiple static screens in the traditional X11 style and would very much like to see this working first (for separate gfx adapters and/or multiple CRTCs of a single adapter).
Re: [Dri-devel] Dual-head
On Mon, Feb 24, 2003 at 08:35:13AM -0700, Jens Owen wrote: Sven Luther wrote: On Mon, Feb 24, 2003 at 07:58:55AM -0700, Jens Owen wrote: A short cut to this whole thing would be to work on getting a second head supported on a single X11 screen. Then 3D comes for free: http://www.tungstengraphics.com/dri/Simple_Xinerama_DH.txt This solution provides Xinerama functionality without actually using the Xinerama wrapper. Mmm, I am curious about this: how does it get handled in the XFree86 configuration file? Possibly by adding a secondary monitor line to the screens section. Yes, sure, but this cannot do anything like mirroring or window zooming, only standard static dual head. Also, I guess this would facilitate the memory management in dual-head configurations. Yes, the driver would need to handle how the memory is shared. Yes, shared between both heads, but also between 2D and 3D, although I don't know how the DRI handles this right now. In general, dual-head configuration is pretty poor in the current X server, at least in the documented part. There is no way of specifying mirrored or zoomed window output, or of changing the configuration dynamically, but this would probably not be achievable without an extension, or maybe could be incorporated with the RandR stuff. If you want things to be dynamic, you will need to stay away from X's notion of a second X11 screen. That is static and persistent by definition. However, a secondary monitor on the primary screen could be as dynamic as you want to make it. Mmm, but you would need a fixed screen stride that is the sum of both heads, or something like that. I guess using a virtual screen is the way to go, but there may be problems with regard to how apps (in particular desktop managers) handle it, I guess. I think Matrox did something like that in their proprietary drivers. There are other proprietary drivers that have implemented similar functionality. However, I haven't seen anything in open source. 
Mmm, I would be interested in working on this, though I need to find some time. Are there other people who are interested in this as well? Friendly, Sven Luther
Re: [Dri-devel] Dual-head
On Mon, 2003-02-24 at 17:36, Steven Newbury wrote: A short cut to this whole thing would be to work on getting a second head supported on a single X11 screen. Then 3D comes for free: http://www.tungstengraphics.com/dri/Simple_Xinerama_DH.txt This solution provides Xinerama functionality without actually using the Xinerama wrapper. Wouldn't this cause problems with the window manager (maximizing windows etc.) and for fullscreen games? Part of the purpose of Xinerama is to provide hints to apps and WMs for screen layout purposes. The Xinerama API could be provided for this as well. AFAIK the proprietary nvidia drivers do that. -- Earthling Michel Dänzer (MrCooper)/ Debian GNU/Linux (powerpc) developer XFree86 and DRI project member / CS student, Free Software enthusiast
Re: [Dri-devel] Dual-head
On Mon, 2003-02-24 at 16:09, Sven Luther wrote: On Mon, Feb 24, 2003 at 07:58:55AM -0700, Jens Owen wrote: A short cut to this whole thing would be to work on getting a second head supported on a single X11 screen. Then 3D comes for free: http://www.tungstengraphics.com/dri/Simple_Xinerama_DH.txt This solution provides Xinerama functionality without actually using the Xinerama wrapper. Mmm, I am curious about this: how does it get handled in the XFree86 configuration file? I guess it would have to be handled in the driver until 5.0 or whenever the driver model is rethought. Also, I guess this would facilitate the memory management in dual-head configurations. Indeed, as it's essentially a single screen. Integrating this with RandR could get interesting, to say the least, though... -- Earthling Michel Dänzer (MrCooper)/ Debian GNU/Linux (powerpc) developer XFree86 and DRI project member / CS student, Free Software enthusiast
Re: [Dri-devel] Dual-head
On Mon, 2003-02-24 at 16:27, Jonathan Thambidurai wrote: http://www.tungstengraphics.com/dri/Simple_Xinerama_DH.txt This solution provides Xinerama functionality without actually using the Xinerama wrapper. Could you please point me to the code which needs to be worked on for the above solution so that I know where to start? Would I have to work with Xlib and the OpenGL libraries, or might the work be limited to the drivers? Only the DDX, as Jens pointed out. Most prerequisites are already in place in the radeon driver; one would 'just' have to tie up the loose ends. -- Earthling Michel Dänzer (MrCooper)/ Debian GNU/Linux (powerpc) developer XFree86 and DRI project member / CS student, Free Software enthusiast
Re: [Dri-devel] Dual-head
On Mon, Feb 24, 2003 at 06:05:21PM +0100, Michel Dänzer wrote: On Mon, 2003-02-24 at 16:09, Sven Luther wrote: On Mon, Feb 24, 2003 at 07:58:55AM -0700, Jens Owen wrote: A short cut to this whole thing would be to work on getting a second head supported on a single X11 screen. Then 3D comes for free: http://www.tungstengraphics.com/dri/Simple_Xinerama_DH.txt This solution provides Xinerama functionality without actually using the Xinerama wrapper. Mmm, I am curious about this: how does it get handled in the XFree86 configuration file? I guess it would have to be handled in the driver until 5.0 or whenever the driver model is rethought. I have no problem with doing this in the driver, as long as every driver does it the same way. The drivers will be doing the work anyway. Also, I guess this would facilitate the memory management in dual-head configurations. Indeed, as it's essentially a single screen. Integrating this with RandR could get interesting, to say the least, though... The complicated bits would be when both screens do not have the same resolution. Although I guess using a virtual screen is the way to go, I have doubts that current desktop managers will understand that we are not using part of the screen. Funny that there is discussion about this on the DRI list, while when I asked on the xfree86 list some time ago, nobody cared. Friendly, Sven Luther
Re: [Dri-devel] Dual-head
On Mon, 2003-02-24 at 18:11, Sven Luther wrote: On Mon, Feb 24, 2003 at 06:05:21PM +0100, Michel Dänzer wrote: On Mon, 2003-02-24 at 16:09, Sven Luther wrote: On Mon, Feb 24, 2003 at 07:58:55AM -0700, Jens Owen wrote: A short cut to this whole thing would be to work on getting a second head supported on a single X11 screen. Then 3D comes for free: http://www.tungstengraphics.com/dri/Simple_Xinerama_DH.txt This solution provides Xinerama functionality without actually using the Xinerama wrapper. Mmm, I am curious about this: how does it get handled in the XFree86 configuration file? I guess it would have to be handled in the driver until 5.0 or whenever the driver model is rethought. I have no problem with doing this in the driver, as long as every driver does it the same way. The drivers will be doing the work anyway. Duplicating a lot of work in the drivers which would be better centralized in the driver-independent infrastructure. Although I guess using a virtual screen is the way to go, I have doubts that current desktop managers will understand that we are not using part of the screen. That's what Xinerama is for. Funny that there is discussion about this on the DRI list, while when I asked on the xfree86 list some time ago, nobody cared. Must have missed that; the major motivation behind this would be 3D acceleration though. Unfortunately, I just recalled a possibly major problem: AFAIK the 3D engine can only render to a rectangle up to 2048 pixels wide and high. That would be pretty limiting, in particular when the heads are side by side. -- Earthling Michel Dänzer (MrCooper)/ Debian GNU/Linux (powerpc) developer XFree86 and DRI project member / CS student, Free Software enthusiast
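Michel's 2048-pixel render limit makes the viability of a dual-head layout a simple arithmetic check: the combined desktop rectangle must fit inside the engine's maximum render rectangle. A rough sketch; the 2048 figure is taken from this discussion, and the helper is illustrative rather than actual driver code:

```c
#include <assert.h>
#include <stdbool.h>

/* Reported Radeon 3D engine maximum render rectangle, per the thread. */
enum { RENDER_LIMIT = 2048 };

/* Combined desktop size for two heads, placed side by side or stacked
 * vertically, checked against the engine limit. */
static bool fits_3d_limit(int w1, int h1, int w2, int h2, bool side_by_side)
{
    int w = side_by_side ? w1 + w2 : (w1 > w2 ? w1 : w2);
    int h = side_by_side ? (h1 > h2 ? h1 : h2) : h1 + h2;
    return w <= RENDER_LIMIT && h <= RENDER_LIMIT;
}
```

Two 1024x768 heads side by side just fit (2048x768), while two 1280x1024 heads exceed the limit side by side (2560 wide) but fit when stacked vertically (1280x2048), matching the resolutions Sven quoted earlier in the thread.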
Re: [Dri-devel] Dual-head
On Mon, Feb 24, 2003 at 06:35:47PM +0100, Michel Dänzer wrote: On Mon, 2003-02-24 at 18:11, Sven Luther wrote: On Mon, Feb 24, 2003 at 06:05:21PM +0100, Michel Dänzer wrote: On Mon, 2003-02-24 at 16:09, Sven Luther wrote: On Mon, Feb 24, 2003 at 07:58:55AM -0700, Jens Owen wrote: A short cut to this whole thing would be to work on getting a second head supported on a single X11 screen. Then 3D comes for free: http://www.tungstengraphics.com/dri/Simple_Xinerama_DH.txt This solution provides Xinerama functionality without actually using the Xinerama wrapper. Mmm, I am curious about this: how does it get handled in the XFree86 configuration file? I guess it would have to be handled in the driver until 5.0 or whenever the driver model is rethought. I have no problem with doing this in the driver, as long as every driver does it the same way. The drivers will be doing the work anyway. Duplicating a lot of work in the drivers which would be better centralized in the driver-independent infrastructure. Yes, but can it be done differently before 5.0? Although I guess using a virtual screen is the way to go, I have doubts that current desktop managers will understand that we are not using part of the screen. That's what Xinerama is for. Yes, I did see that in your other mails. Funny that there is discussion about this on the DRI list, while when I asked on the xfree86 list some time ago, nobody cared. Must have missed that; the major motivation behind this would be 3D acceleration though. What about monitor plug and play, or when you want to do a presentation, plugging in the video projector without having to restart X? Unfortunately, I just recalled a possibly major problem: AFAIK the 3D engine can only render to a rectangle up to 2048 pixels wide and high. That would be pretty limiting, in particular when the heads are side by side. Mmm, is this a limitation of the DRI 3D engine, or a limitation of the Radeon 3D engine? 
My card can do 8Kx8K, so this would be no problem, and I am sure future hardware will also go in that direction. Friendly, Sven Luther
Re: [Dri-devel] Dual-head
On Mon, Feb 24, 2003 at 06:35:47PM +0100, Michel Dänzer wrote: On Mon, 2003-02-24 at 18:11, Sven Luther wrote: On Mon, Feb 24, 2003 at 06:05:21PM +0100, Michel Dänzer wrote: On Mon, 2003-02-24 at 16:09, Sven Luther wrote: On Mon, Feb 24, 2003 at 07:58:55AM -0700, Jens Owen wrote: A short cut to this whole thing would be to work on getting a second head supported on a single X11 screen. Then 3D comes for free: http://www.tungstengraphics.com/dri/Simple_Xinerama_DH.txt This solution provides Xinerama functionality without actually using the Xinerama wrapper. Mmm, I am curious about this: how does it get handled in the XFree86 configuration file? I guess it would have to be handled in the driver until 5.0 or whenever the driver model is rethought. I have no problem with doing this in the driver, as long as every driver does it the same way. The drivers will be doing the work anyway. Duplicating a lot of work in the drivers which would be better centralized in the driver-independent infrastructure. But couldn't you put it in the common directory as a set of helper functions or something like that, which only need to be called by the drivers? The DDX driver would need to be able to program the video outputs anyway; what else is really needed? The Xinerama hinting stuff? The RandR-like dynamic configuration? What I think is most important is that common configuration options be defined before every driver goes at it in its own way. Friendly, Sven Luther
Re: [Dri-devel] Dual-head
On Mon, 2003-02-24 at 11:12, Jens Owen wrote: I believe you can implement this at the 2D DDX driver level. Getting the configuration file semantics worked out would probably be the first step. Then get the driver to read those semantics and initialize the secondary display pipeline. Have you worked with XFree86 DDX drivers before? I have not worked with the DDX drivers before. I have downloaded ddx.PS from the XFree86.org FTP site to understand it. If there is any other relevant documentation that may be helpful, please point me to it. --Jonathan Thambidurai
Re: [Dri-devel] Dual-head (also S3 savage Duoview)
I too am curious about that. I've been working on adding duoview support to the s3 savage driver. I talked to Tim Roberts a while back and I figured out the basics of how it works. I've got a semi-working driver, but alas, I think I need some help getting the second CRTC to output to the VGA port. Right now it seems to think it's connected to the LCD. Such is the case of writing drivers blind. Do any of you have access to savage MX/IX docs (or know where I could find some) and would be willing to answer some questions? I'm working on a Savage IX on an IBM T20. BTW, here's my code as well as a binary and a sample XF86Config file: http://www.botchco.com/alex/new-savage/savage/ Also, to any of you would-be developers out there, it's actually not as hard as it seems. I mean, docs are nice (and often necessary), but you can figure out a lot just by looking at code from other drivers. After fiddling with duoview for a few weeks I have a pretty good understanding of 2D. I haven't messed with 3D... yet. It's sort of really daunting for a while and then all of a sudden you get a moment of clarity and it all starts to make sense. Which brings me to the subject of this email. I've been giving this some thought... The thing is there isn't really a sample of this to work from at the moment, but once the first one got converted, I think it would be pretty easy to convert others (at least for single chips with dual CRTCs). Here are my thoughts on it, from my perspective anyway; keep in mind I'm still pretty green. Right now for these chips you set up the entity as shareable and then divide your framebuffer into two or more framebuffers, one for each CRTC. Each instance of the driver then works using its framebuffer. Each head is distinguished using the screen 0/1/etc. flag in the XF86Config file. (Entity functions and operations for this sort of thing are not so well documented, at least not that I could find.) 
Each pscrn gets its framebuffer size and address mapped to the part of video RAM you allocated for each framebuffer. For chips with dual CRTCs you could do just what the diagram below shows and create a single framebuffer with two sort-of virtual framebuffers. I'm not sure how this would work with the existing entity code. I guess as far as XF86Config goes you have to create a custom flag. As Sven said, this is I think what the Matrox and NVIDIA drivers do. They can also announce Xinerama for the benefit of window managers and such (even though it's technically not two pscrns). You could add the stuff from the device-specific EntRecs to device-specific Recs. Then each pscrn would be responsible for not only the framebuffer base and address but also the primary and secondary virtual framebuffers and addresses. The main framebuffer would hold the whole contents of the screen, and each virtual framebuffer would basically be a viewport into the total screen. I haven't had time to think this through thoroughly and I'm already starting to have questions... I dunno... food for thought I guess. If and when I finish duoview support, I'd like to work on this, or at least help out. Also, for anyone out there who's working on 3D support for the savage IX/MX, I'd be happy to help out with testing and whatnot. Thanks, Alex -- On Mon, Feb 24, 2003 at 07:58:55AM -0700, Jens Owen wrote: A short cut to this whole thing would be to work on getting a second head supported on a single X11 screen. Then 3D comes for free: http://www.tungstengraphics.com/dri/Simple_Xinerama_DH.txt This solution provides Xinerama functionality without actually using the Xinerama wrapper. Mmm, I am curious about this: how does it get handled in the XFree86 configuration file? Also, I guess this would facilitate the memory management in dual-head configurations. In general, dual-head configuration is pretty poor in the current X server, at least in the documented part. 
There is no way of specifying mirrored or zoomed window output, or of changing the configuration dynamically, but this would probably not be achievable without an extension, or maybe could be incorporated with the RandR stuff. I think Matrox did something like that in their proprietary drivers. Friendly, Sven Luther
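For the traditional shared-entity setup Alex describes (one chip, two CRTCs, distinguished by the screen 0/1 flag), the XF86Config wiring might look roughly like this. The identifiers, BusID, and monitor names are made up for illustration; Monitor and Modes details are omitted, and specifics vary per driver:

```
Section "Device"
    Identifier  "savage-crtc0"
    Driver      "savage"
    BusID       "PCI:1:0:0"
    Screen      0               # first CRTC of the shared entity
EndSection

Section "Device"
    Identifier  "savage-crtc1"
    Driver      "savage"
    BusID       "PCI:1:0:0"     # same chip, second CRTC
    Screen      1
EndSection

Section "Screen"
    Identifier   "Screen0"
    Device       "savage-crtc0"
    Monitor      "LCD"
    DefaultDepth 16
EndSection

Section "Screen"
    Identifier   "Screen1"
    Device       "savage-crtc1"
    Monitor      "CRT"
    DefaultDepth 16
EndSection

Section "ServerLayout"
    Identifier  "dual head"
    Screen  0   "Screen0"
    Screen  1   "Screen1" RightOf "Screen0"
EndSection
```

This is the two-pscrn style; the single-screen Xinerama-like approach discussed in the thread would instead need the custom flag Alex mentions, since there would be only one Screen section for both CRTCs.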
Re: [Dri-devel] Dual-head
Let me begin with the following: I downloaded the texmem branch from DRI CVS. It fails to compile, giving the error:

./config/imake/imake -I./config/cf -s ./config/makedepend/Makefile.proto -f ./config/makedepend/Imakefile -DTOPDIR=../.. -DCURDIR=./config/makedepend
./config/imake/imake: No such file or directory
./config/imake/imake: Cannot exec /lib/cpp. Stop.
./config/imake/imake: Exit code 1. Stop.
make[1]: *** [config/makedepend/Makefile.proto] Error 1
make[1]: Leaving directory `/usr/local/src/XFree86.workon/xc/xc'
make: *** [World] Error 2

That said, I am using a DRI texmem CVS checkout from Nov 1, 2002. I checked the function I was looking at (RADEONScreenInit) and didn't notice any significant differences from the current version. I had a look at radeon_driver.c. The first thing that drew my attention was the conditional that was prefixed by /* Xinerama has sync problem with DRI, disable it for now */ Does this imply that everything is set up already and there is some bug that must be ironed out? My problem, however, comes earlier in the file. DRI is disabled for me at the part that says "Static buffer allocation failed". For the initialization of each screen, the log file says it needs at least (about) 9 MB of memory. I have only 16 MB onboard, though the AGP aperture goes up to 64 MB (I'm using the texmem branch). Has the dynamic allocation of back and depth buffers been implemented yet? I don't think there is much else I can do if I don't have enough memory (or would it not be too difficult for me to implement this myself?). --Jonathan Thambidurai
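The "about 9 MB" figure is consistent with the static allocation covering front, back, and depth buffers for a 1024x768 mode, assuming 32-bit color and a 32-bit depth buffer. A back-of-the-envelope sketch of my own arithmetic, not the driver's actual allocator (which also aligns pitches, so this is a floor):

```c
#include <assert.h>

/* Rough per-screen memory for the statically allocated front, back and
 * depth buffers, assuming every buffer uses the same bytes per pixel. */
static long static_buffers_bytes(long w, long h, long bytes_pp, long nbuffers)
{
    return w * h * bytes_pp * nbuffers;
}
```

1024 x 768 x 4 bytes x 3 buffers gives 9,437,184 bytes, i.e. exactly 9 MB, which lines up with the log's "at least (about) 9 MB"; two screens at 9 MB each clearly cannot fit in 16 MB of VRAM.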
Re: [Dri-devel] Dual-head
On Die, 2003-02-25 at 01:24, Jonathan Thambidurai wrote: I had a look at radeon_driver.c. The first thing that drew my attention was the conditional that was prefixed by /* Xinerama has sync problem with DRI, disable it for now */ Does this imply that everything is set up already and there is some bug that must be ironed out? I'm afraid not. Even if the server worked with DRI and Xinerama enabled (AFAIK the driver explicitly disables the DRI with multiple screens anyway), that wouldn't mean 3D acceleration would work, because the 3D driver has no notion of multihead yet. That's why multiple heads as viewports into a single screen would by far be the easiest way to get working 3D acceleration with multihead. My problem, however, comes earlier in the file. DRI is disabled for me at the part that says "Static buffer allocation failed". For the initialization of each screen, the log file says it needs at least (about) 9 MB of memory. I have only 16 MB onboard, though the AGP aperture goes up to 64 MB (I'm using the texmem branch). Has the dynamic allocation of back and depth buffers been implemented yet? No. The radeon driver now only reserves the additional space when there are 3D contexts, but they still take up the same space. The driver now has an option to disable the back buffer, but I'm not sure that's very funny. -- Earthling Michel Dänzer (MrCooper)/ Debian GNU/Linux (powerpc) developer XFree86 and DRI project member / CS student, Free Software enthusiast
Re: [Dri-devel] Dual-head
On Mon, 2003-02-24 at 03:16, Jonathan Thambidurai wrote: On my Mobility Radeon 7500, I would like to have accelerated 3D working on one screen (primary, I assume) of a Xinerama dual-head setup. As it stands, direct rendering is completely disabled upon XFree86 startup whenever two heads are used. If I want to run any 3D game or application with accelerated graphics, I have to exit the X server and currently-running desktop environment completely and restart X with only a single head enabled. No, you don't. :) You can start an additional server and switch between the two; gdm makes that particularly easy. -- Earthling Michel Dänzer (MrCooper)/ Debian GNU/Linux (powerpc) developer XFree86 and DRI project member / CS student, Free Software enthusiast