Because we (Oracle) have very strict rules about publishing binaries. We are not allowed to publish anything that hasn't been approved for release, even an experimental test program.

-- Kevin


On 11/14/2025 10:07 AM, Christopher Schnick wrote:

May I ask why that is not possible?

On 14/11/2025 19:04, Kevin Rushforth wrote:
While that might be easier, it is not possible for us to provide such a binary.

We could simplify it a bit by providing instructions to download the JMODs (rather than the SDK) from jdk.java.net/direct3d12 and use jlink to create a JDK that includes JavaFX. That way the custom options only need to be added one time to jlink; running "javac" and "java" would need no options related to finding the JavaFX modules or enabling native access.
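
For illustration, such a jlink step might look roughly like this (a sketch only - the jmods locations and the exact module list are assumptions, and --add-options is used here to bake the native-access flag into the generated runtime):

jlink --module-path "<path_to_jdk>/jmods;<path_to_javafx_jmods>" --add-modules javafx.controls,javafx.graphics,javafx.media,jdk.jsobject --add-options="--enable-native-access=javafx.graphics" --output jdk-with-javafx

After that, running "jdk-with-javafx/bin/java renderperf/RenderPerfTest.java ..." would not need any module-path or native-access options.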

-- Kevin


On 11/14/2025 9:50 AM, Christopher Schnick wrote:

I will have to check whether I have time to look into this, but I am wondering whether it would be possible for these tests to require less manual setup.

Isn't this a case where something like jlink could easily solve all these steps, so that the user would just have to run one of several launcher scripts in a prebuilt runtime image? I would argue it would be faster to build a jlink image and provide a download link than to write detailed instructions so that users run the test exactly as they should.

On 14/11/2025 15:10, Lukasz Kostyra wrote:

Hello all,

I received feedback on the previous call-for-performance-testing email that, instead of using the Bash test script on Windows (and hoping you have Cygwin/MinGW installed), it would be easier to integrate the testing and CSV output functionality into RenderPerfTest. I made those changes and they are now available on the jfx-sandbox direct3d12 branch (you WON’T find them in the main repo yet): https://github.com/openjdk/jfx-sandbox/tree/direct3d12/tests/performance/animation/RenderPerfTest/src/renderperf

Any changes made in response to feedback on RenderPerfTest will show up on that branch automatically, so it is indeed a better solution if there is more feedback :)


*_New steps for running tests:_*

 1. Download RenderPerfTest from the above link (it has to be the jfx-sandbox repo, direct3d12 branch) - best to download the entire “renderperf” folder as a ZIP, as it contains extra resources needed for the test app.
 2. Get a JavaFX Direct3D 12 build - either download the EA2 SDK from [ https://jdk.java.net/javafxdirect3d12/ ] or build it from scratch from the direct3d12 branch [ https://github.com/openjdk/jfx-sandbox/tree/direct3d12 ] (make sure to *build with -PCONF=Release*; at the time of writing there is no functional difference between the sandbox repo and the EA2 build).
 3. RenderPerf can be run with (fill in the parts in angle brackets yourself):
    *java --upgrade-module-path="<path_to_jfx_sdk>/lib" --add-modules=javafx.base,javafx.controls,javafx.graphics,jdk.jsobject,javafx.media --enable-native-access=javafx.graphics -Dprism.order=<backend> renderperf/RenderPerfTest.java --output-csv -r <runs>*
    Where:
    *<path_to_jfx_sdk>* - path to the directory where the JavaFX SDK is located (has to be where the JavaFX bin and lib folders reside)
    *<backend>* - short-hand for which Prism backend to use
    *<runs>* - how many times each test case should run; RenderPerf will average the FPS results from these runs

 4. Running RenderPerf like above will produce
    *RenderPerf_results-<backend>-<date>-<time>.csv* in your
    current directory.

*_Examples:_*

 1. For a D3D baseline test, run:
    *java --upgrade-module-path="<path_to_jfx_sdk>/lib" --add-modules=javafx.base,javafx.controls,javafx.graphics,jdk.jsobject,javafx.media --enable-native-access=javafx.graphics -Dprism.order=d3d renderperf/RenderPerfTest.java --output-csv -r 3*

 2. For a D3D12 test, run:
    *java --upgrade-module-path="<path_to_jfx_sdk>/lib" --add-modules=javafx.base,javafx.controls,javafx.graphics,jdk.jsobject,javafx.media --enable-native-access=javafx.graphics -Dprism.order=d3d12 renderperf/RenderPerfTest.java --output-csv -r 3*

*_Notes:_*

* Closing the test window during the warm-up stage will stop the test run early.

* Similarly to the test script, RenderPerfTest defines default object counts at the beginning as *Map<String, Integer> defaultObjectCounts*, which are used when running all the tests. The same remarks as for the test script apply: it would be preferred to keep these numbers as they are, but if needed (a test times out while loading, or the frame rate on the baseline D3D run is very low, below 15 FPS) they can be lowered, as long as they remain consistent between the D3D and D3D12 runs.
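
  As a purely hypothetical illustration of the kind of edit this means (the test names and counts below are made up - keep the entries that RenderPerfTest.java actually ships with unless you need to lower them):

  import java.util.Map;

  class ObjectCountsSketch {
      // Illustrative only: RenderPerfTest declares a similar map near the top of the class.
      // Lowering a value reduces how many objects that test case draws.
      static final Map<String, Integer> defaultObjectCounts = Map.of(
              "SomeHeavyCase", 5_000,    // hypothetical: lowered because the D3D baseline dropped below 15 FPS
              "SomeLightCase", 10_000);  // hypothetical: left at its original value
  }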

Thanks once again for your help!

-Lukasz

*From:* openjfx-dev <[email protected]> *On Behalf Of* Lukasz Kostyra
*Sent:* Thursday, 13 November 2025 15:50
*To:* [email protected]
*Subject:* JavaFX Direct3D 12 - Call for performance testing help

Hello openjfx-dev,

Because Windows is very open hardware-wise, it is difficult to prepare the backend for all the hardware combinations out there, especially from a performance perspective. To make sure JavaFX performance does not degrade compared to the old D3D backend, we would like to call for volunteers to help performance-test the new backend. These tests will give us a general idea of how the backend behaves on different hardware and which areas of the backend to focus on as we move forward with the optimization effort for Direct3D 12.

At this point we have tested the backend quite extensively on Intel integrated GPUs and have done some testing on a machine with a recent discrete Nvidia GPU. We are primarily looking for testing of the Direct3D 12 backend *on a system running an AMD discrete GPU*, but any hardware combination is welcome - the more the merrier :). Also note that these tests *require a Windows machine*, as the D3D12 backend is Windows-only.

We run performance testing using the RenderPerfTest JavaFX app located in the JavaFX repository under "tests/performance/animation/RenderPerfTest". I wrote a Bash script to use with this app for performance-testing the backend. The script runs all available demos in RenderPerfTest with a set number of objects per test, averages the FPS results from each run, and outputs the results to a CSV file.

*_How to run perf tests:_*

 1. Download the test script -
    https://gist.github.com/lukostyra/bc354a5fd845b82805ffb3380caebe9a
 2. Get a JavaFX Direct3D 12 build - either download the EA2 SDK from [ https://jdk.java.net/javafxdirect3d12/ ] or build it from scratch from the direct3d12 branch [ https://github.com/openjdk/jfx-sandbox/tree/direct3d12 ] (make sure to *build with -PCONF=Release*; at the time of writing there is no functional difference between the sandbox repo and the EA2 build).
 3. Put the script in the same directory as *renderperf/RenderPerfTest.java* - I usually copy the contents of tests/performance/animation/RenderPerfTest/src into a separate directory where the script is located. The script will look in the current directory specifically for a "*renderperf*" directory containing a "*RenderPerfTest.java*" source file.
 4. Ensure nothing else is running on your system.
 5. Perform a baseline test run with the D3D backend:
    *./run_renderperf_all.sh -j <path_to_jfx_sdk> -b d3d -r 3*
    When the script is done, this should result in a "*RenderPerf_results_d3d-<date>-<time>.csv*" file. This will let us establish how your machine performs on RenderPerf’s test cases with the old D3D backend.
 6. Perform a perf test run for the D3D12 backend:
    *./run_renderperf_all.sh -j <path_to_jfx_sdk> -b d3d12 -r 3*
    This should result in a "*RenderPerf_results_d3d12-<date>-<time>.csv*" file. The results will be used as a comparison to the D3D backend.

Once you are done, reply to this email thread with details of the hardware you ran the tests on and the contents of both CSV files, specifying which came from which test run - I’m pretty sure *attachments will be stripped when sending an email to the mailing list*, so to make sure the results reach us please paste them inline. Also include the console output of running any JavaFX app on the D3D12 backend with "-Dprism.verbose=true".
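
For example, one way to capture that output is to run RenderPerfTest itself with verbose output switched on (any JavaFX app will do; this is illustrative only - adjust the path and module list to your setup):

java --upgrade-module-path="<path_to_jfx_sdk>/lib" --add-modules=javafx.base,javafx.controls,javafx.graphics,jdk.jsobject,javafx.media --enable-native-access=javafx.graphics -Dprism.order=d3d12 -Dprism.verbose=true renderperf/RenderPerfTest.java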

Note that these tests will run for quite a while. A single run of a test case takes 15 seconds - 5 seconds of warm-up and 10 seconds of the actual test run where FPS is measured. Testing every test case, 3 runs each, for a single backend takes approximately 40 minutes - make sure your computer won't lock or turn off the screens during that time.

*_Test script reference:_*

*./run_renderperf_all.sh -j/--jfx <path_to_jfx_sdk> [-b/--backend <backend>] [-r/--runs <runs>]*

Where:

*-j, --jfx* - required; path to the directory where the JavaFX SDK is located (has to be where the JavaFX bin and lib folders are located)

*-b, --backend* - optional, defaults to "d3d12"; short-hand for which Prism backend to use. Whatever is set here is forwarded to the "-Dprism.order" property.

*-r, --runs* - optional, defaults to 3; how many times each test case should run. The script will average the FPS results from these runs.

Notes:

* Closing the currently running test during the warm-up stage (the first 5 seconds after a test case starts) should stop the test script early.

* The script starts by declaring an array of test cases and object counts to run. We recommend leaving those as they are; however, if there are tests that do not load because of a "Timeout" message (the script will then exit early) or that perform significantly worse on the D3D backend (below 20 FPS), you can lower the object count and retry.

Let me know if you have any problems with running the tests. And, in advance, thanks for your help!

Regards,

Lukasz

