Re: Gradients w/o BGRA_PRE

2015-02-27 Thread Jim Graham
My suggestion would be to add BYTE_RGBA_PRE to the PixelFormat enums, 
have ES2 claim that it supports it, and have the gradient code then test 
for either BYTE_BGRA_PRE or BYTE_RGBA_PRE support and use the appropriate 
indices depending on the result...
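
A minimal, self-contained sketch of this suggestion follows. The enum
values and the isFormatSupported() check only mirror Prism's
PixelFormat/ResourceFactory naming; treat the exact names and signatures
as assumptions, not the real API:

    public class GradientFormatSketch {

        enum PixelFormat { BYTE_BGRA_PRE, BYTE_RGBA_PRE }

        // Stand-in for a ResourceFactory capability query; plain ES2/WebGL
        // without the BGRA extension would report false for BYTE_BGRA_PRE.
        static boolean isFormatSupported(PixelFormat fmt) {
            return fmt == PixelFormat.BYTE_RGBA_PRE;
        }

        public static void main(String[] args) {
            boolean bgra = isFormatSupported(PixelFormat.BYTE_BGRA_PRE);
            // Component offsets within each 4-byte pixel of the gradient
            // texture; green and alpha sit at the same offset either way.
            int r = bgra ? 2 : 0;
            int g = 1;
            int b = bgra ? 0 : 2;
            int a = 3;

            byte[] pixel = new byte[4];
            // Store one opaque, premultiplied red stop.
            pixel[r] = (byte) 255;
            pixel[g] = 0;
            pixel[b] = 0;
            pixel[a] = (byte) 255;
            System.out.printf("layout=%s bytes=[%d,%d,%d,%d]%n",
                    bgra ? "BGRA" : "RGBA",
                    pixel[0] & 0xFF, pixel[1] & 0xFF,
                    pixel[2] & 0xFF, pixel[3] & 0xFF);
        }
    }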


...jim

On 2/27/15 2:29 PM, Michael Heinrichs wrote:

Hi Jim,

after disabling the test and switching the channels in insertInterpColor(), 
gradients are rendered correctly. It is a hack, but it works for now. :) Thanks 
for the hint.

The format supported by default in ES 2.0 is actually BYTE_RGBA and not 
INT_RGBA. I think it makes sense to use a byte array in both cases as you 
initially suggested.

- Michael




On 23 Feb 2015, at 20:58, Jim Graham wrote:

Oh dear, it is a bit worse than that.  The texture creation code doesn't even 
have an enum constant to specify BYTE_RGBA_PRE.  It has BYTE_BGRA_PRE (which is 
used by this code with a byte array) and INT_RGBA_PRE (which would involve more 
than simply index twiddling to support).

So, if INT_RGBA was universally supported, the code could be refactored to use that.  If 
we have to do it conditionally, that could be done as well as I said below, but it would 
involve factoring "how do we store these components and into which array?" 
rather than just index twiddling...
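
As a hedged illustration of that factoring, the storage question can be
hidden behind a tiny writer interface so the interpolation code never
touches indices or packing directly. The names below are hypothetical,
and the 0xRRGGBBAA packing in the int case is an assumption about what an
INT_RGBA_PRE layout would look like:

    interface GradientPixelWriter {
        // Premultiplied components, each in 0..255.
        void write(int pixelIndex, int r, int g, int b, int a);
    }

    final class ByteBgraPreWriter implements GradientPixelWriter {
        private final byte[] data;
        ByteBgraPreWriter(byte[] data) { this.data = data; }
        @Override public void write(int i, int r, int g, int b, int a) {
            int p = i * 4;
            data[p]     = (byte) b;
            data[p + 1] = (byte) g;
            data[p + 2] = (byte) r;
            data[p + 3] = (byte) a;
        }
    }

    final class IntRgbaPreWriter implements GradientPixelWriter {
        private final int[] data;
        IntRgbaPreWriter(int[] data) { this.data = data; }
        @Override public void write(int i, int r, int g, int b, int a) {
            // Assumed packing: red in the high byte, alpha in the low byte.
            data[i] = (r << 24) | (g << 16) | (b << 8) | a;
        }
    }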

...jim

On 2/23/2015 11:53 AM, Jim Graham wrote:

Ah, I see.  I was looking at the code that uploaded the pixels which
behaved as I described, but I didn't check the texture creation code,
which does look like it will reject it as you indicate.

It looks like insertInterpColor, which computes the image data for the
texture from the gradient colors, could handle either BGRA or RGBA just
by changing the indices it uses.  Is RGBA universally supported on
Linux/OGL2/D3D?  If so, then we could switch to that.  Otherwise we'd
have to factor in conditional BGRA vs RGBA tests into the appropriate
places (probably just remember which was supported by the
ResourceFactory in the initGradientTextures() function and have the
insert() function test and modify its indices as appropriate)...

 ...jim

On 2/23/2015 1:51 AM, Michael Heinrichs wrote:

Hi Jim,

thanks for your reply. Right now I do not see anything.
PaintHelper.initGradientTextures() eventually calls
ES2Texture.create(), which checks if the requested pixel format is
supported. This test fails in my case, because WebGL is ES2 only and
the extension is not supported. But when I disable this test and
pretend that BGRA is RGBA, I see exactly what you describe: the red
and the blue channel are swapped.

Thanks,
Michael



On 23 Feb 2015, at 00:55, Jim Graham wrote:

Hi Michael,

What error are you seeing, or is it just rendering incorrectly?

Looking at the code in ES2Texture.uploadPixels() it looks like ES2
might support BGRA via an extension.  Perhaps we've only encountered
platforms with that extension so far.  Otherwise, if I read the code
correctly it looks like we will just pretend the incoming data is in
RGBA format, which would mean that the rendering would happen, but
the red and blue components would be swapped.  Is that what you are
seeing?
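
A tiny self-contained demonstration of that failure mode (no GL or Prism
calls involved): bytes laid out as B,G,R,A but read back as R,G,B,A make
red and blue trade places.

    public class ChannelSwapDemo {
        public static void main(String[] args) {
            // Opaque red stored in BGRA byte order: B=0, G=0, R=255, A=255.
            byte[] bgra = { 0, 0, (byte) 255, (byte) 255 };
            // Interpreting the same bytes as RGBA yields opaque blue.
            int r = bgra[0] & 0xFF, g = bgra[1] & 0xFF;
            int b = bgra[2] & 0xFF, a = bgra[3] & 0xFF;
            System.out.printf("as RGBA: r=%d g=%d b=%d a=%d (blue, not red)%n",
                    r, g, b, a);
        }
    }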

...jim

On 2/21/2015 5:06 AM, Michael Heinrichs wrote:

Hi,

I am experimenting with JavaFX on top of WebGL. Right now I am stuck
implementing gradients, but maybe somebody from this list can help.

WebGL usually does not support the pixel format BGRA_PRE. From my
understanding, the ES2 renderer should work with such a
configuration, too. But when I try to use a gradient, at some point
PaintHelper.initGradientTextures() is called, which requires the
pixel format BGRA_PRE. Obviously something about my configuration is
wrong, but I cannot figure out where the alternative implementation
resides. How should the implementation of gradients work if BGRA_PRE
is not available?

Thanks,
Michael







Re: Gradients w/o BGRA_PRE

2015-02-27 Thread Michael Heinrichs
Hi Jim,

after disabling the test and switching the channels in insertInterpColor(), 
gradients are rendered correctly. It is a hack, but it works for now. :) Thanks 
for the hint.

The format supported by default in ES 2.0 is actually BYTE_RGBA and not 
INT_RGBA. I think it makes sense to use a byte array in both cases as you 
initially suggested.

- Michael



> On 23 Feb 2015, at 20:58, Jim Graham wrote:
> 
> Oh dear, it is a bit worse than that.  The texture creation code doesn't even 
> have an enum constant to specify BYTE_RGBA_PRE.  It has BYTE_BGRA_PRE (which 
> is used by this code with a byte array) and INT_RGBA_PRE (which would involve 
> more than simply index twiddling to support).
> 
> So, if INT_RGBA was universally supported, the code could be refactored to 
> use that.  If we have to do it conditionally, that could be done as well as I 
> said below, but it would involve factoring "how do we store these components 
> and into which array?" rather than just index twiddling...
> 
>   ...jim
> 
> On 2/23/2015 11:53 AM, Jim Graham wrote:
>> Ah, I see.  I was looking at the code that uploaded the pixels which
>> behaved as I described, but I didn't check the texture creation code,
>> which does look like it will reject it as you indicate.
>> 
>> It looks like insertInterpColor, which computes the image data for the
>> texture from the gradient colors, could handle either BGRA or RGBA just
>> by changing the indices it uses.  Is RGBA universally supported on
>> Linux/OGL2/D3D?  If so, then we could switch to that.  Otherwise we'd
>> have to factor in conditional BGRA vs RGBA tests into the appropriate
>> places (probably just remember which was supported by the
>> ResourceFactory in the initGradientTextures() function and have the
>> insert() function test and modify its indices as appropriate)...
>> 
>> ...jim
>> 
>> On 2/23/2015 1:51 AM, Michael Heinrichs wrote:
>>> Hi Jim,
>>> 
>>> thanks for your reply. Right now I do not see anything.
>>> PaintHelper.initGradientTextures() eventually calls
>>> ES2Texture.create(), which checks if the requested pixel format is
>>> supported. This test fails in my case, because WebGL is ES2 only and
>>> the extension is not supported. But when I disable this test and
>>> pretend that BGRA is RGBA, I see exactly what you describe: the red
>>> and the blue channel are swapped.
>>> 
>>> Thanks,
>>> Michael
>>> 
>>> 
 On 23 Feb 2015, at 00:55, Jim Graham wrote:
 
 Hi Michael,
 
 What error are you seeing, or is it just rendering incorrectly?
 
 Looking at the code in ES2Texture.uploadPixels() it looks like ES2
 might support BGRA via an extension.  Perhaps we've only encountered
 platforms with that extension so far.  Otherwise, if I read the code
 correctly it looks like we will just pretend the incoming data is in
 RGBA format, which would mean that the rendering would happen, but
 the red and blue components would be swapped.  Is that what you are
 seeing?
 
...jim
 
 On 2/21/2015 5:06 AM, Michael Heinrichs wrote:
> Hi,
> 
> I am experimenting with JavaFX on top of WebGL. Right now I am stuck
> implementing gradients, but maybe somebody from this list can help.
> 
> WebGL usually does not support the pixel format BGRA_PRE. From my
> understanding, the ES2 renderer should work with such a
> configuration, too. But when I try to use a gradient, at some point
> PaintHelper.initGradientTextures() is called, which requires the
> pixel format BGRA_PRE. Obviously something about my configuration is
> wrong, but I cannot figure out where the alternative implementation
> resides. How should the implementation of gradients work if BGRA_PRE
> is not available?
> 
> Thanks,
> Michael
> 
>>> 



Re: Using JavaFX on VMWare / Linux

2015-02-27 Thread Kevin Rushforth
We neither test nor recommend using the j2d pipeline for onscreen 
rendering, so that should be:


java -Dprism.order=es2,sw ...
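
To confirm which pipeline actually gets picked, the run can be repeated
with verbose Prism output (the exact startup message varies by release):

java -Dprism.order=es2,sw -Dprism.verbose=true ...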

-- Kevin


Jörg Wille wrote:

Hi Adam,
VMWare is not officially supported, but as my test shows, only DirectX
rendering does not work. If your application does not have a high
graphics workload, you can force software rendering by starting your
application like this:
java -Dprism.order=es2,j2d -jar app.jar
(This tries the rendering pipelines in the given order and leads to
software rendering on Windows and OpenGL on Mac and Linux.)
Or, if you package your app with the javapackager, you can add this line
to package.cfg:
jvmarg.1=-Dprism.order=es2,j2d

- Joerg
  


Re: Using JavaFX on VMWare / Linux

2015-02-27 Thread Kevin Rushforth
Chien is correct that this is not a supported config. However, I know of 
cases where it has been used successfully without HW acceleration 
enabled (not sure whether it would work with HW acceleration, but there 
would be more risk in doing that).


-- Kevin


Chien Yang wrote:

Hi Adam,

I would like to inform you that VMware is not a certified hypervisor 
for Oracle JDK 8 and JRE 8, and hardware rendering is not supported in 
guest systems on Oracle VM, VirtualBox and Hyper-V Server 2012. Please 
see this link for detailed information:


http://www.oracle.com/technetwork/java/javase/certconfig-2095354.html

- Chien


On 2/26/15 10:43 PM, Adam Granger wrote:

The company I work at mandates that Linux development is done on a Red
Hat 6.x guest within VMware Player/Workstation on top of a Windows XP
host.

Previous debugging has led me to believe, correct me if I'm wrong, that
the drivers in the guest used to support OpenGL via VMware expose the
guest's hardware/driver strings such as "vmware" etc., not the host's
hardware/driver. Sorry, I've not got the details with me right now,
they're at work; I will update later...

I'd like to know:

   1) Is acceleration expected to work on the guest if the host has a
supported GPU configuration?
   2) How does the whitelist/blacklist system in Prism work in this
case?
 - I believe it sees "vmware", then assumes "vmware" isn't
"nvidia" etc. and gives up.
 - How can it see the host's physical GPU hardware/driver
correctly to make an informed choice?
   3) Is it possible to disable/override the white/blacklist system,
albeit at risk?

Kind regards,

Adam.








Re: In(Sanity) Testing Mondays

2015-02-27 Thread Victor D'yakov

Vadim,

Please add Leif for Controls.

Thanks,
Victor

On 27.02.2015 17:30, Vadim Pakhnushev wrote:

Reminder, Monday is our weekly sanity testing.

You can find your testing assignment at:

https://wiki.openjdk.java.net/display/OpenJFX/Sanity+Testing

Also please remember that the repo will be locked from 1am PST until 
1pm PST.


Happy testing!

Thanks,
Vadim




In(Sanity) Testing Mondays

2015-02-27 Thread Vadim Pakhnushev

Reminder, Monday is our weekly sanity testing.

You can find your testing assignment at:

https://wiki.openjdk.java.net/display/OpenJFX/Sanity+Testing

Also please remember that the repo will be locked from 1am PST until 1pm 
PST.


Happy testing!

Thanks,
Vadim


Re: Using JavaFX on VMWare / Linux

2015-02-27 Thread Jörg Wille
Hi Adam,
VMWare is not officially supported, but as my test shows, only DirectX
rendering does not work. If your application does not have a high
graphics workload, you can force software rendering by starting your
application like this:
java -Dprism.order=es2,j2d -jar app.jar
(This tries the rendering pipelines in the given order and leads to
software rendering on Windows and OpenGL on Mac and Linux.)
Or, if you package your app with the javapackager, you can add this line
to package.cfg:
jvmarg.1=-Dprism.order=es2,j2d

- Joerg