Thanks! I found this setting very useful.

--Semyon

On 9/30/2016 6:51 AM, Philip Race wrote:

BTW I just noticed in the source code that there is an environment variable that disables the check:

see windows/native/libawt/java2d/d3d/D3DPipelineManager.cpp
static BOOL bNoHwCheck = (getenv("J2D_D3D_NO_HWCHECK") != NULL);

If that is set, it does everything except return the code that means this
is bad hardware .. it seems Dmitri added this from the beginning.
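
For illustration only, here is a minimal, self-contained sketch of how that kind of override typically short-circuits a blacklist check; the function and the lookup below are simplified placeholders, not the actual D3DPipelineManager code:

#include <cstdlib>

// Simplified illustration (placeholder names): an env-variable flag read once
// at startup; the hardware lookup still runs, but the "bad hardware" verdict
// is never reported back to the caller.
static bool bNoHwCheck = (std::getenv("J2D_D3D_NO_HWCHECK") != nullptr);

static bool IsAdapterBlacklisted() {
    bool listedAsBad = false; // placeholder for the real badHardware[] lookup
    if (bNoHwCheck) {
        return false; // suppress the failure code that would disable D3D
    }
    return listedAsBad;
}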

I overlooked this in the past because I was focused on the system property settings.

-phil.


On 9/9/16, 8:14 AM, Philip Race wrote:
No .. that would be an incompatible change that might surprise a lot of people.

-phil.

On 9/9/16, 12:31 AM, Semyon Sadetsky wrote:
I cannot reproduce JDK-8039444. It is about a very specific hardware configuration with two different video cards.

I didn't find any evaluation or justification, either in JIRA or in the review on the alias, for the 8039444 resolution that d3d should be disabled for all Intel video cards. Why?

It may be disabled by default, but at least the sun.java2d.d3d=true option could enable it, no?

--Semyon


On 9/9/2016 4:58 AM, Philip Race wrote:
Please consider https://bugs.openjdk.java.net/browse/JDK-8039444
and the various bugs that were closed as a duplicate of that bug.
I don't think you can easily show this fix resolves all of these ..

-phil.


On 9/8/16, 5:12 PM, Semyon Sadetsky wrote:
I have two laptops, an Intel i5 and an i7. Both work with d3d normally. Some visual defects will be corrected by this fix. And I still don't see why d3d is disabled for all Intel hardware without any possibility to switch it on?

--Semyon

On 09.09.2016 02:10, Philip Race wrote:
The following is just for testing, right? It should not be in this webrev
as part of what you propose to push ..
--
src/java.desktop/windows/native/libawt/java2d/d3d/D3DBadHardware.h

@@ -49,13 +49,10 @@
// all versions must fail ("there's no version of the driver that passes")
 #define NO_VERSION D_VERSION(0xffff, 0xffff, 0xffff, 0xffff)

 static const ADAPTER_INFO badHardware[] = {

-    // All Intel Chips.
-    { 0x8086, ALL_DEVICEIDS, NO_VERSION, OS_ALL },
-
     // ATI Mobility Radeon X1600, X1400, X1450, X1300, X1350
     // Reason: workaround for 6613066, 6687166
     // X1300 (four sub ids)
     { 0x1002, 0x714A, D_VERSION(6,14,10,6706), OS_WINXP },
     { 0x1002, 0x714A, D_VERSION(7,14,10,0567), OS_VISTA },

---
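
(For readers skimming the diff: each badHardware[] entry pairs a PCI vendor/device id with a driver-version threshold and an OS mask. The self-contained sketch below restates the removed Intel entry with the field meanings spelled out; the names and sentinel values are placeholders inferred from the header's own comments, not the real definitions.)

#include <cstdint>

// Illustration only: a simplified model of the removed blanket entry.
struct AdapterInfoSketch {
    uint32_t vendorId;   // PCI vendor id: 0x8086 is Intel, 0x1002 is ATI
    uint32_t deviceId;   // a specific device id, or a wildcard sentinel
    uint64_t version;    // driver-version threshold; NO_VERSION = nothing passes
    uint32_t osMask;     // which Windows versions the entry covers
};

static const uint32_t ALL_DEVICEIDS_SKETCH = 0xFFFFFFFF; // wildcard (placeholder value)
static const uint64_t NO_VERSION_SKETCH    = ~0ULL;      // "no version of the driver passes"
static const uint32_t OS_ALL_SKETCH        = 0xFFFFFFFF; // every OS (placeholder value)

// The removed entry: blacklist every Intel device, on every OS, for every driver.
static const AdapterInfoSketch removedIntelEntry =
    { 0x8086, ALL_DEVICEIDS_SKETCH, NO_VERSION_SKETCH, OS_ALL_SKETCH };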

-phil.

On 9/8/16, 4:06 PM, Semyon Sadetsky wrote:
I have reworked the fix so that it does not affect ATI and NVidia.

http://cr.openjdk.java.net/~ssadetsky/8146042/webrev.05/

--Semyon


On 9/9/2016 12:20 AM, Semyon Sadetsky wrote:


On 08.09.2016 22:57, Philip Race wrote:

Can you provide something like a rationale for why these particular values might work? Otherwise this seems like a fix that can't be reviewed, only tested.
So the testing will be important. If you can be sure it passes
on ATI, Nvidia, and Intel then we can take it .. otherwise we should defer this.
I suppose those fudge factors were obtained experimentally. I'm not sure any rationale is possible here. The fix was simply tested on different hardware. I hope that after this fix D3D may be enabled again for Intel APUs.
Currently it is blacklisted by 8039444.

--Semyon

IIRC Semyon will need to change the code to bypass the check
for Intel hardware. There is no env. variable or system property to do this.

-phil.

On 9/8/16, 3:47 AM, Sergey Bylokhov wrote:
On 05.09.16 13:36, Semyon Sadetsky wrote:
At last I could get an NVidia machine (special thanks to Yuri).

The updated fix should improve the situation on NVidia. For that, the single common height/width fudge factor was split into two separate factors.

http://cr.openjdk.java.net/~ssadetsky/8146042/webrev.04/
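
As a rough sketch of the general idea, splitting a single shared adjustment into per-axis constants lets width and height be tuned independently when the rounding error behaves differently in each direction. All names and values below are invented for illustration and are not taken from the webrev:

#include <cmath>

// Illustration only: hypothetical per-axis fudge factors replacing one shared
// value; the real constants in the webrev were tuned experimentally on GPUs.
static const float FUDGE_FACTOR_W = 0.46875f;
static const float FUDGE_FACTOR_H = 0.53125f;

static int roundedWidth(float w)  { return (int) std::floor(w + FUDGE_FACTOR_W); }
static int roundedHeight(float h) { return (int) std::floor(h + FUDGE_FACTOR_H); }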

Can you please confirm that the fix and the test work if d3d is enabled on the Intel video card? I recall that d3d was disabled on Intel, so to check that we probably need to force d3d manually.

On 3/18/2016 9:12 AM, Semyon Sadetsky wrote:
Hi Phil,

Sergey wrote that it fails on NVidia cards. I could play with the fudge factor
values, but I don't have an NVidia-based video card to test with.

--Semyon

On 3/17/2016 11:05 PM, Phil Race wrote:
Semyon,

Any update on this?
FWIW I used jprt to build your patch, as I am having Windows build problems,
and it passed on my ATI card.

-phil.


On 03/01/2016 04:37 PM, Sergey Bylokhov wrote:
On 15.01.16 9:59, Semyon Sadetsky wrote:
Hi Phil & Sergey,

I have an integrated Intel i5 GPU and cannot test other hardware. On a Mac Retina display the screen capture doesn't return an exact pixel-to-pixel image but a scaled one, so the Mac platform should be excluded from testing:
http://cr.openjdk.java.net/~ssadetsky/8146042/webrev.01/

I ran the latest test (webrev.03) on my NVidia card, and it fails after the fix but passed before =(. I have no ATI card to check. Also, I cannot check the fix on an Intel card, because I cannot enable d3d on it.


--Semyon

On 1/14/2016 9:23 PM, Phil Race wrote:
This fudge factor was last adjusted in
https://bugs.openjdk.java.net/browse/JDK-6597822
way back before the D3D pipeline was released, and the comments seem to indicate that
there was a fair amount of testing on different hardware.

I don't know why this seems to be in unspecified, hardware-dependent territory,
but it seems quite possible that this could just as easily introduce a different
artifact on some other hardware.

What exactly are you testing on? And I think it needs to include at least
one NVidia and one AMD/ATI card.

-phil.

On 1/14/2016 10:09 AM, Semyon Sadetsky wrote:
Hello,

Please review the fix for jdk9.

bug: https://bugs.openjdk.java.net/browse/JDK-8146042
webrev: http://cr.openjdk.java.net/~ssadetsky/8146042/webrev.00/

The root cause is incorrect coordinate rounding in the D3D renderer. To fix the issue, one of the fudge factors was adjusted.
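
As a minimal, purely illustrative sketch of what a rounding fudge factor does in this kind of renderer (the constant, the function, and the value below are assumptions for illustration, not the actual D3D pipeline code): a small bias is added before a floating-point device coordinate is snapped to an integer pixel, so that values sitting almost exactly on a pixel boundary round consistently instead of producing one-pixel gaps or overlaps.

#include <cmath>

// Illustration only: hypothetical rounding bias applied before truncation.
static const float PIXEL_FUDGE = 0.0625f; // placeholder value

static int snapToPixel(float deviceCoord) {
    return (int) std::floor(deviceCoord + PIXEL_FUDGE);
}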

Another issue mentioned in the JIRA ticket has been split out as a separate bug.

--Semyon