[EMAIL PROTECTED] wrote:
Our experiences differ, then. It works just fine if you have the RANDR
extension set up and configured. Try running your application with the
J2D_TRACE_LEVEL=4 environment variable set and see if it prints out any
errors.
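
For reference, a typical invocation looks like this (the jar name is just a
placeholder for your application):

    # enable verbose Java2D tracing for this run
    J2D_TRACE_LEVEL=4 java -jar YourApp.jar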

No errors reported.
We have only seen a handful of people using full-screen mode on Linux - in
part because it takes a lot to configure a Linux system so that it works.
There are a lot of variables (Xinerama, RANDR, XRender extensions,
compositing window managers) which affect the ability of applications
(not just Java) to enter FSEM.
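
If you want a quick, self-contained check of whether FSEM works on a given
setup, a minimal sketch like this will do (the frame setup and the timing
are just placeholders):

    import java.awt.GraphicsDevice;
    import java.awt.GraphicsEnvironment;
    import javax.swing.JFrame;

    public class FsemCheck {
        public static void main(String[] args) throws InterruptedException {
            GraphicsDevice dev = GraphicsEnvironment
                    .getLocalGraphicsEnvironment()
                    .getDefaultScreenDevice();
            JFrame frame = new JFrame(dev.getDefaultConfiguration());
            frame.setUndecorated(true);
            if (dev.isFullScreenSupported()) {
                dev.setFullScreenWindow(frame);  // enter FSEM
                Thread.sleep(3000);              // stay full-screen briefly
                dev.setFullScreenWindow(null);   // exit FSEM
            } else {
                System.out.println("FSEM not supported on this device");
            }
            frame.dispose();
        }
    }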

Could you be so kind as to share your experience with using those xorg.conf
options for FSEM & vsync? I'm sure that way many more Linux users could enjoy
what MS Windows and Mac OS X users do. (Just setting the RandR option on had
no effect on vsync, by the way.)

  I use the default xorg.conf on my Ubuntu 8.04, on
  a system with an Nvidia chip. Note that I _don't_ use
  NVIDIA's proprietary drivers, and
  the default window manager is used.

  I have the following relevant extensions (as reported
  by xdpyinfo):
    DOUBLE-BUFFER
    RANDR

  In this configuration full screen exclusive mode
  (including display mode changing) is supported and works
  fine. However, the OpenGL pipeline couldn't be enabled;
  the proprietary drivers are required for that.
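
  (For completeness: the OpenGL pipeline is requested via the
  sun.java2d.opengl system property; the capitalized value "True" makes it
  print whether the pipeline was actually enabled. The jar name is just a
  placeholder.)

      java -Dsun.java2d.opengl=True -jar YourApp.jar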

  The vsync doesn't seem to be working (in 6u10) in this config.

  Also, in this config I can use the shaped windows support
  introduced in 6u10, but _not_ window opacity and
  non-opaque windows (the latter is reported as supported
  but doesn't work; I would need to start Compiz for that).
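
  The 6u10 translucency features are exposed through the unsupported
  com.sun.awt.AWTUtilities class. A minimal sketch of checking and using
  them (this is a private Sun API, so treat it as illustrative only; the
  frame setup is a placeholder):

      import com.sun.awt.AWTUtilities;
      import com.sun.awt.AWTUtilities.Translucency;
      import java.awt.geom.Ellipse2D;
      import javax.swing.JFrame;

      public class TranslucencyCheck {
          public static void main(String[] args) {
              JFrame frame = new JFrame();
              frame.setUndecorated(true);
              frame.setSize(300, 300);

              // shaped window - works here even without a compositing WM
              if (AWTUtilities.isTranslucencySupported(
                      Translucency.PERPIXEL_TRANSPARENT)) {
                  AWTUtilities.setWindowShape(frame,
                          new Ellipse2D.Float(0, 0, 300, 300));
              }

              // uniform window opacity - needs a compositing WM (e.g. compiz)
              if (AWTUtilities.isTranslucencySupported(
                      Translucency.TRANSLUCENT)) {
                  AWTUtilities.setWindowOpacity(frame, 0.8f);
              }

              frame.setVisible(true);
          }
      }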

  If I do install the proprietary drivers, then the
  OpenGL pipeline works fine, but for some inexplicable reason
  I lose the ability to enter/exit full-screen mode (the
  proprietary drivers probably disable the RANDR extension,
  or something to that effect).

  This is what I mean when I say that it's very
  hard to configure a Linux system properly so that
  everything works; something is always broken.

I think in 5 we used DBE (the double-buffering extension) for implementing
BufferStrategy; maybe something changed in 6 - I can't think of what, though.
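
(For context, the vsync effect in question shows up in the standard
BufferStrategy render loop - a minimal sketch, with the actual drawing
omitted:)

    import java.awt.Frame;
    import java.awt.Graphics;
    import java.awt.image.BufferStrategy;

    public class VsyncLoop {
        public static void main(String[] args) {
            Frame f = new Frame("vsync test");
            f.setSize(640, 480);
            f.setVisible(true);
            f.createBufferStrategy(2);          // double-buffered
            BufferStrategy bs = f.getBufferStrategy();
            while (true) {
                Graphics g = bs.getDrawGraphics();
                // ... draw the frame here ...
                g.dispose();
                bs.show();  // on 1.5/Linux this reportedly blocked until vblank
            }
        }
    }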
This is very disconcerting. One wonders what's not going to work next time.

   You're the first to complain about this.

I'm sorry, I can't understand the response. Do you mean there is actually no
such problem, or do you mean something else? Remember, I was referring to the
fact that vsync works fine with 1.5 on Linux but not with 1.6 anymore (save
for my possibly wrong/missing xorg.conf settings, of course).

  What I meant to say is that we can't test every possible
  configuration, especially on Linux; we rely on our users
  to tell us if something is wrong. Then we can prioritize
  our efforts depending on how many people are affected.

  In this case, since you're the first one to report this
  issue, it's either not widespread, or no one cares.

  In either case we can't devote too much effort to finding
  out what's wrong. Note that this behavior - the vsync-ing
  BufferStrategy - is neither documented nor supported; it's
  just something of a side effect.

  If you feel that this change in behavior is something that
  needs to be addressed, I encourage you to contact the IcedTea
  folks and see if they could come up with a fix which we can
  incorporate into the OpenJDK source base.

  Thanks,
    Dmitri
