Hi All,

> On 9/28/2016 2:02 AM, Artem Smotrakov wrote:
> > Currently the tests silently quit, which looks like they pass. This
> > makes people think that everything went smoothly, but actually
> > nothing was tested.
> >
> I did not get the idea. It looks like, if no NSS is installed, the test
> would be ignored; if NSS is installed, the test actually gets run. If no
> NSS is installed, the test should quit silently and test nothing,
> because nothing can be tested. That's the expected common behavior
> when testing a specific configuration.

The problem is that the tests report they passed when they were actually 
skipped. I have no objections to skipping tests, but it would be better to 
indicate somehow how many tests were skipped, and why.
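
To illustrate the point, a test could report an explicit skipped status instead of returning quietly when NSS is absent. This is only a minimal sketch of the idea, not the real jtreg API or the actual PKCS11Test code; the class name, method, and messages are hypothetical:

```java
// Hypothetical sketch: make a skip visible in the test result instead of
// letting the test quit silently and look like a pass.
public class Pkcs11SkipSketch {

    // Returns a status string describing what actually happened.
    static String runTest(boolean nssInstalled) {
        if (!nssInstalled) {
            // Report the skip explicitly so it shows up in the results.
            return "SKIPPED: NSS library not found on this platform";
        }
        // ... the actual PKCS11 test logic would run here ...
        return "PASSED";
    }

    public static void main(String[] args) {
        System.out.println(runTest(false));
        System.out.println(runTest(true));
    }
}
```

With a visible "SKIPPED" status, a run on a host without NSS can no longer be mistaken for 60+ green tests.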

> 
> If you don't want the test to quit silently, I would prefer to check
> the platform and install the NSS libs rather than update these test
> cases. Exposing problems in the testing environment configuration is
> not the job of this test.

There are 60+ tests related to PKCS11. For two years they have been "passing" 
on 3 unsupported platforms, on hosts where usually no NSS libraries were 
installed. How can we rely on these results?

> 
> > I would prefer to update PKCS11Test to report a failure in case of
> > unexpected platform.
> Then you need to know all the expected platforms. The test is not only
> run in certain known environments (for example, the Mach5 or JPRT
> platforms); it can also be run in third-party environments (by OpenJDK
> contributors). I think it is a hard job, if not impossible, to know
> all the expected platforms exactly.

From my experience, the most common environment issue is a lack of GUI 
libraries. Fortunately, in that case tests simply fail due to unsatisfied 
dependencies. If they passed due to an uncertain environment, we could end up 
with completely untested UI functionality and a green status.

> 
> Xuelei

Thank you,
Denis.
