I see. You just want to be able to detect the GPU on a system to determine which rendering techniques to use in your application. Take a look at D3DPipeline.findDefaultResourceFactory() and ES2Pipeline.findDefaultResourceFactory(). You should be able to make the needed change to pass the GPU information (in printDriverInformation()) back to your application. You can follow the pattern of is3DSupported() in the class.
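For illustration, a rough sketch of what the application-facing side of such a patch could look like. The ConditionalFeature.SCENE3D check is existing public API; getGPUModel() is a hypothetical accessor that the patch would add to com.sun.prism.GraphicsPipeline, and the "HD Graphics 4000" match is only an example:

    import javafx.application.ConditionalFeature;
    import javafx.application.Platform;

    public final class GpuProbe {

        // Decide whether to enable expensive effects. The SCENE3D check is
        // public JavaFX API; getGPUModel() assumes a hypothetical accessor
        // added to com.sun.prism.GraphicsPipeline by a local patch.
        public static boolean useExpensiveEffects() {
            if (!Platform.isSupported(ConditionalFeature.SCENE3D)) {
                return false; // software pipeline or 3D disabled
            }
            String model = com.sun.prism.GraphicsPipeline.getPipeline()
                    .getGPUModel(); // hypothetical patched accessor
            // Fall back on GPUs known to struggle with shader effects.
            return model == null || !model.contains("HD Graphics 4000");
        }

        private GpuProbe() {}
    }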
If you like, you might want to file a JIRA for this enhancement request so that you don't have to make a similar patch for future JavaFX releases.
- Chien
On 8/6/2014 12:32 PM, Mike Hearn wrote:
I want to use it for rendering :) I just want to fall back to e.g. a
color adjust instead of a GaussianBlur.
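For completeness, the fallback itself is straightforward once some capability flag is available; a minimal sketch using the public effect classes, where canUseBlur stands in for however the application decides (the class and method names here are made up for illustration):

    import javafx.scene.Node;
    import javafx.scene.effect.ColorAdjust;
    import javafx.scene.effect.GaussianBlur;

    final class EffectFallback {

        // Apply a real blur where the GPU can handle it, otherwise a cheap
        // brightness tweak that approximates the "dimmed" look.
        static void applyDimEffect(Node node, boolean canUseBlur) {
            if (canUseBlur) {
                node.setEffect(new GaussianBlur(10));
            } else {
                ColorAdjust dim = new ColorAdjust();
                dim.setBrightness(-0.3);
                node.setEffect(dim);
            }
        }

        private EffectFallback() {}
    }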
On Wed, Aug 6, 2014 at 8:17 PM, Chien Yang <chien.y...@oracle.com> wrote:
Yes, we know that it is a perfectly fine entry-level GPU capable of supporting JavaFX graphics requirements. I shouldn't have used the word "bad" card. What I'm saying is that you will have to add it to your blacklist if you don't want JavaFX to use it for rendering due to poor framerate.
- Chien
On 8/6/2014 9:57 AM, Mike Hearn wrote:
The card isn't bad per se, it's just the HD4000 integrated graphics chip that older MacBooks ship with. It's just that I'm very picky about my framerates :)
On Wed, Aug 6, 2014 at 6:49 PM, Chien Yang <chien.y...@oracle.com> wrote:
There isn't public Java API support for what you want to do. However, if you are willing to patch JavaFX in your own build, you can add the bad card to the GLGPUInfo blackList[] in the GLFactory class of the specific platform if you are using the es2 pipe. You will need to dig down into the native C++ code if you need to support the Windows d3d pipe; this will be a little more work, see D3DBadHardware.h for the entries.
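For a rough idea only, an es2 blacklist entry in the platform GLFactory subclass might look something like the sketch below; the GLGPUInfo constructor, field names, and the exact file to edit are internal details that should be checked against your own OpenJFX sources rather than taken from here:

    // Sketch of an es2 blacklist entry in the platform-specific GLFactory
    // subclass. GLGPUInfo is assumed here to pair a vendor string with a
    // model/renderer prefix -- verify against the actual class before use.
    GLGPUInfo blackList[] = {
        new GLGPUInfo("Intel", "HD Graphics 4000")  // hypothetical HD4000 entry
    };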
Hope this helps.
- Chien
On 8/5/2014 11:39 PM, Peter Penzov wrote:
Hi All,
I'm interested in how I can get the model of the GPU card using Java. Can you show me a basic example?
BR,
Peter
On Wed, Aug 6, 2014 at 3:02 AM, Jim Graham <james.gra...@oracle.com> wrote:
If there is a card that can't keep up with what we want it to do then we should probably be dealing with that on our end as well, whether by disabling 3D on that card or by blacklisting it and just falling back to the sw pipeline. We already do that with a number of embedded GPUs...
...jim
On 8/1/14 2:27 AM, Mike Hearn wrote:
    Scott is correct about the determining of the SW pipeline. To add to that, if knowing whether you are running on SW is sufficient
Unfortunately for the Intel HD4000 card that some older laptops have, it technically supports 3D but struggles to do basic shader effects at 60fps when running at high pixel densities. I think I posted about this problem before. Simpler animations work better (just) but I'd prefer to only fall back to that when necessary.
    I think the suggestion about starting out assuming that animation will be OK and then backing off is a good one, if it is practical for your application.
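As an aside, a minimal sketch of that back-off idea using only public API; the sampling window, the 45fps threshold, and the downgrade callback are placeholders for whatever the application actually wants to do:

    import javafx.animation.AnimationTimer;

    // Starts out optimistic and downgrades effects if the measured frame
    // rate stays low over a sampling window. Call start() once the scene is
    // showing, e.g. new FrameRateWatchdog(app::disableFancyEffects).start();
    final class FrameRateWatchdog extends AnimationTimer {
        private long windowStart = -1;
        private int frames;
        private final Runnable downgrade; // app-specific fallback hook

        FrameRateWatchdog(Runnable downgrade) {
            this.downgrade = downgrade;
        }

        @Override
        public void handle(long now) {              // 'now' is in nanoseconds
            if (windowStart < 0) {
                windowStart = now;
            }
            frames++;
            long elapsed = now - windowStart;
            if (elapsed >= 2_000_000_000L) {        // ~2 second window
                double fps = frames * 1e9 / elapsed;
                if (fps < 45) {                     // arbitrary threshold
                    downgrade.run();
                    stop();                         // decision made
                } else {
                    windowStart = now;              // keep sampling
                    frames = 0;
                }
            }
        }
    }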
Given that I'll be bundling a JVM with the app anyway, I think it'd be easier and give a better UX to just patch JavaFX to expose this data using an API specific to my app. It obviously has it, because when running with Prism debug logging the info is printed.
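As a stop-gap before any patch, that output can be turned on for a quick look at what Prism reports; a minimal sketch, assuming the prism.verbose property is set before the toolkit initializes (passing -Dprism.verbose=true on the command line is the equivalent, and arguably more reliable, route):

    import javafx.application.Application;
    import javafx.scene.Scene;
    import javafx.scene.layout.StackPane;
    import javafx.stage.Stage;

    public class PrismInfoDemo extends Application {

        public static void main(String[] args) {
            // Must be set before the JavaFX toolkit starts; equivalent to
            // -Dprism.verbose=true on the command line.
            System.setProperty("prism.verbose", "true");
            launch(args);
        }

        @Override
        public void start(Stage stage) {
            // Pipeline and driver details are printed to stdout during startup.
            stage.setScene(new Scene(new StackPane(), 200, 150));
            stage.show();
        }
    }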