Re: Travis in solid red mode again

2020-02-07 Thread Matt Caswell



On 03/02/2020 19:19, Matthias St. Pierre wrote:
> Mission accomplished, Travis just turned green again on master  :-)

Every one of the last 12 master jobs has failed to complete
successfully due to a Travis timeout. Apart from the build that is
timing out (enable-msan), all the other builds in the job are green.

Matt

> 
> https://travis-ci.org/openssl/openssl/builds/645563965?utm_source=github_status&utm_medium=notification
> 
> 
> Thanks to Matt, Richard and everybody else who participated in fixing it.
> 
> 
> Matthias
> 
> 
> 


Re: Travis in solid red mode again

2020-02-03 Thread Matthias St. Pierre

Mission accomplished, Travis just turned green again on master  :-)

https://travis-ci.org/openssl/openssl/builds/645563965?utm_source=github_status&utm_medium=notification

Thanks to Matt, Richard and everybody else who participated in fixing it.


Matthias





Re: Travis in solid red mode again

2020-02-03 Thread Salz, Rich
  
  >  Could we add a CI check for PRs that confirms that the target branch is
green in travis?

Looking at 
https://docs.travis-ci.com/user/notifications/#conditional-notifications and 
https://config.travis-ci.com/ref/notifications/email it seems like it should be 
fairly easy to configure builds to do "send email to openssl-commits when builds 
on master fail"
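[Editor's note: based on the Travis CI docs linked above, such a conditional notification could look roughly like the following `.travis.yml` fragment. This is a sketch, not the project's actual configuration; the full list address and the exact condition syntax are assumptions to check against the current docs.]

```yaml
# Sketch of a .travis.yml notifications section (hypothetical, not OpenSSL's
# real config): mail the commits list only when a build on master fails;
# stay quiet for pull requests and for green builds.
notifications:
  email:
    recipients:
      - openssl-commits@openssl.org   # assumed list address
    if: branch = master AND type != pull_request
    on_success: never    # no mail for passing builds
    on_failure: always   # mail every failing build on master
```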


 



Re: Travis in solid red mode again

2020-02-03 Thread Salz, Rich
>Oh, actually, maybe I'm wrong. Is it just Appveyor that posts failures
and not Travis?
  
Yes.  You need to configure Travis to mail failures, as I pointed out in my 
earlier message.




Re: Travis in solid red mode again

2020-02-01 Thread Matt Caswell



On 01/02/2020 10:39, Dr Paul Dale wrote:
> I thought I was subscribed but don’t seem to see the failures.
> I do get the (very many) PR activity emails….

Oh, actually, maybe I'm wrong. Is it just Appveyor that posts failures
and not Travis?

Matt

> 
> 
> Pauli
> -- 
> Dr Paul Dale | Distinguished Architect | Cryptographic Foundations 
> Phone +61 7 3031 7217
> Oracle Australia
> 
> 
> 
> 
>> On 1 Feb 2020, at 8:35 pm, Dr. Matthias St. Pierre
>> <matthias.st.pie...@ncp-e.com> wrote:
>>
>>> -Original Message-
>>> On 01/02/2020 00:34, Salz, Rich wrote:

>>>>> Could we add a CI check for PRs that confirms that the target branch is
>>>>> green in travis?
>>>>
>>>> Looking at
>>>> https://docs.travis-ci.com/user/notifications/#conditional-notifications
>>>> and https://config.travis-ci.com/ref/notifications/email
>>>> it seems like it should be fairly easy to configure builds to do "send
>>>> email to openssl-commits when builds on master fail"
>>>
>>> We already have that.
>>
>> Maybe we should make it a requirement that at least all OTC members
>> subscribe to openssl-commits?
>>
>> Matthias
>>
> 


Re: Travis in solid red mode again

2020-02-01 Thread Matt Caswell



On 01/02/2020 11:38, Dr. Matthias St. Pierre wrote:
>> I thought I was subscribed but don’t seem to see the failures.
> 
>> I do get the (very many) PR activity emails….
> 
>  
> 
> You are right, those “[openssl]  update” mails generate a lot
> 
> of noise. Do we really need them? If not, we could just deactivate them.

I regularly look at these, so would like to keep them.

Matt


> 
> Alternatively, we could move the CI failures to a separate openssl-ci list.
> 
>  
> 
> Matthias
> 
>  
> 


RE: Travis in solid red mode again

2020-02-01 Thread Dr. Matthias St. Pierre
> I thought I was subscribed but don’t seem to see the failures.
> I do get the (very many) PR activity emails….

You are right, those “[openssl]  update” mails generate a lot
of noise. Do we really need them? If not, we could just deactivate them.
Alternatively, we could move the CI failures to a separate openssl-ci list.

Matthias



Re: Travis in solid red mode again

2020-02-01 Thread Dr Paul Dale
I thought I was subscribed but don’t seem to see the failures.
I do get the (very many) PR activity emails….


Pauli
-- 
Dr Paul Dale | Distinguished Architect | Cryptographic Foundations 
Phone +61 7 3031 7217
Oracle Australia




> On 1 Feb 2020, at 8:35 pm, Dr. Matthias St. Pierre
> <matthias.st.pie...@ncp-e.com> wrote:
> 
>> -Original Message-
>> On 01/02/2020 00:34, Salz, Rich wrote:
>>>
>>>> Could we add a CI check for PRs that confirms that the target branch is
>>>> green in travis?
>>>
>>> Looking at
>>> https://docs.travis-ci.com/user/notifications/#conditional-notifications
>>> and https://config.travis-ci.com/ref/notifications/email
>>> it seems like it should be fairly easy to configure builds to do "send
>>> email to openssl-commits when builds on master fail"
>> 
>> We already have that.
> 
> Maybe we should make it a requirement that at least all OTC members subscribe 
> to openssl-commits?
> 
> Matthias
> 



RE: Travis in solid red mode again

2020-02-01 Thread Dr. Matthias St. Pierre
> -Original Message-
> On 01/02/2020 00:34, Salz, Rich wrote:
> >
> >   >  Could we add a CI check for PRs that confirms that the target branch is
> > green in travis?
> >
> > Looking at
> > https://docs.travis-ci.com/user/notifications/#conditional-notifications
> > and https://config.travis-ci.com/ref/notifications/email
> > it seems like it should be fairly easy to configure builds to do "send
> > email to openssl-commits when builds on master fail"
> 
> We already have that.

Maybe we should make it a requirement that at least all OTC members subscribe 
to openssl-commits?

Matthias



Re: Travis in solid red mode again

2020-02-01 Thread Matt Caswell



On 01/02/2020 00:34, Salz, Rich wrote:
>   
>   >  Could we add a CI check for PRs that confirms that the target branch is
> green in travis?
> 
> Looking at
> https://docs.travis-ci.com/user/notifications/#conditional-notifications and
> https://config.travis-ci.com/ref/notifications/email it seems like it should
> be fairly easy to configure builds to do "send email to openssl-commits when
> builds on master fail"

We already have that.

Matt


Re: Travis in solid red mode again

2020-01-31 Thread Matt Caswell



On 31/01/2020 22:55, Matt Caswell wrote:
> 
> 
> On 31/01/2020 17:07, Matthias St. Pierre wrote:
>>
>> After a much too short green period, Travis has turned to solid red mode
>> again:
>>
>> https://travis-ci.org/openssl/openssl/builds?utm_medium=notification&utm_source=github_status
>>
>>
>> This fact causes a lot of pain for reviewers, because every failing job
>> needs to be checked manually.
>> I would be happy to fix some of the failures if I could, but in most
>> cases I have no clue what the problem is,
>> because I can't reproduce the issue locally. Could we try to fix them in
>> a joint effort?  For your convenience,
>> I added a summary of all failing jobs of Build #31837 below.
> 
> Trial fix for at least some of these issues in #10989. We'll see what
> travis makes of it.

It fixed most of the test failures but not the krb5 one. I have another
trial fix in #10992 (which includes the fix in #10989).

Matt


> 
>>
>> Since this is not the first time that this happens, I think it is
>> overdue that the OTC discusses those constant
>> build failures and proposes some measures to raise the priority for
>> fixing CI errors. One drastic possibility
>> would be to allow only CI-fixing pull requests to be merged as long as
>> master and the stable branches are red.
>> But I am sure there is a golden middle way between such an extreme
>> sanction and our current indifference.
>>
> 
> Part of the issue is that the master branch travis failures are not
> visible in the PRs because the master branch runs the full tests
> (including the extended tests), whereas the PRs don't run the extended
> tests by default. However running the extended tests takes too long.
> 
> Could we add a CI check for PRs that confirms that the target branch is
> green in travis?
> 
> Matt
> 
> 
>>
>> Regards,
>>
>> Matthias
>>
>>
>>
>> 
>>
>>
>> Last Successful Build: #31583 (10 days ago!)
>> https://travis-ci.org/openssl/openssl/builds/639939072?utm_medium=notification&utm_source=github_status
>>
>>
>> Recent Build: #31837
>> https://travis-ci.org/openssl/openssl/builds/644238908?utm_medium=notification&utm_source=github_status
>>
>>
>>
>> SSL TESTS
>> =
>>
>> *
>> https://travis-ci.org/openssl/openssl/jobs/644238919?utm_medium=notification&utm_source=github_status
>>
>> *
>> https://travis-ci.org/openssl/openssl/jobs/644238926?utm_medium=notification&utm_source=github_status
>>
>> *
>> https://travis-ci.org/openssl/openssl/jobs/644238929?utm_medium=notification&utm_source=github_status
>>
>>
>>>   Test Summary Report
>>> ---
>>>   80-test_ssl_new.t    (Wstat: 256 Tests: 30 Failed: 1)
>>>     Failed test:  4
>>>     Non-zero exit status: 1
>>>   80-test_ssl_old.t    (Wstat: 256 Tests: 6 Failed: 1)
>>>     Failed test:  2
>>>     Non-zero exit status: 1
>>
>>
>>
>> BORINGSSL AND KRB5
>> ==
>>
>> *
>> https://travis-ci.org/openssl/openssl/jobs/644238927?utm_medium=notification&utm_source=github_status
>>
>>
>>> Test Summary Report
>>> ---
>>
>>> 95-test_external_boringssl.t (Wstat: 256 Tests: 1 Failed: 1)
>>>    Failed test:  1
>>>    Non-zero exit status: 1
>>> 95-test_external_krb5.t (Wstat: 256 Tests: 1 Failed: 1)
>>>    Failed test:  1
>>>    Non-zero exit status: 1
>>
>>
>>
>> 95-test_external_boringssl.t:
>> ---
>>
>>     grep 'go.*FAILED' t
>>     go test: FAILED (SSL3-Client-ClientAuth-RSA)
>>     go test: FAILED (SSL3-Server-ClientAuth-RSA)
>>     go test: FAILED (ClientAuth-Sign-RSA-SSL3)
>>     go test: FAILED (ClientAuth-InvalidSignature-RSA-SSL3)
>>     go test: FAILED (CertificateVerificationSucceed-Server-SSL3-Sync)
>>     go test: FAILED
>> (CertificateVerificationSucceed-Server-SSL3-Sync-SplitHandshakeRecords)
>>     go test: FAILED
>> (CertificateVerificationSucceed-Server-SSL3-Sync-PackHandshakeFlight)
>>     go test: FAILED (CertificateVerificationSucceed-Server-SSL3-Async)
>>     go test: FAILED
>> (CertificateVerificationSucceed-Server-SSL3-Async-SplitHandshakeRecords)
>>     go test: FAILED
>> (CertificateVerificationSucceed-Server-SSL3-Async-PackHandshakeFlight)
>>
>>
>>
>> 95-test_external_krb5.t:
>> ---
>>
>>     LD_LIBRARY_PATH=`echo -L../../../lib | sed -e "s/-L//g" -e "s/
>> /:/g"` KRB5_CONFIG=../../../config-files/krb5.conf LC_ALL=C ./t_nfold
>>     N-fold
>>     Input:    "basch"
>>     192-Fold: 1aab6b42 964b98b2 1f8cde2d 2448ba34 55d7862c 9731643f
>>     Input:    "eichin"
>>     192-Fold: 65696368 696e4b73 2b4b1b43 da1a5b99 5a58d2c6 d0d2dcca
>>     Input:    "sommerfeld"
>>     192-Fold: 2f7a9855 7c6ee4ab adf4e711 92dd442b d4ff5325 a5def75c
>>     Input:    "MASSACHVSETTS INSTITVTE OF TECHNOLOGY"
>>     192-Fold: db3b0d8f 0b061e60 3282b308 a5084122 9ad798fa b9540c1b
>>     RFC tests:
>>
>>     ...
>>
>>     Generating random keyblock: . . . OK
>>     Creating opaque key from keyblock: . . . OK
>>     Encrypting (c): . . . Failed: Cannot allocate 

Re: Travis in solid red mode again

2020-01-31 Thread Matt Caswell



On 31/01/2020 17:07, Matthias St. Pierre wrote:
> 
> After a much too short green period, Travis has turned to solid red mode
> again:
> 
> https://travis-ci.org/openssl/openssl/builds?utm_medium=notification&utm_source=github_status
> 
> 
> This fact causes a lot of pain for reviewers, because every failing job
> needs to be checked manually.
> I would be happy to fix some of the failures if I could, but in most
> cases I have no clue what the problem is,
> because I can't reproduce the issue locally. Could we try to fix them in
> a joint effort?  For your convenience,
> I added a summary of all failing jobs of Build #31837 below.

Trial fix for at least some of these issues in #10989. We'll see what
travis makes of it.

> 
> Since this is not the first time that this happens, I think it is
> overdue that the OTC discusses those constant
> build failures and proposes some measures to raise the priority for
> fixing CI errors. One drastic possibility
> would be to allow only CI-fixing pull requests to be merged as long as
> master and the stable branches are red.
> But I am sure there is a golden middle way between such an extreme
> sanction and our current indifference.
> 

Part of the issue is that the master branch travis failures are not
visible in the PRs because the master branch runs the full tests
(including the extended tests), whereas the PRs don't run the extended
tests by default. However running the extended tests takes too long.

Could we add a CI check for PRs that confirms that the target branch is
green in travis?

Matt
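
[Editor's note: the check Matt suggests could be sketched as a small shell helper that asks the Travis API whether the target branch's last build passed. This is a hypothetical sketch, not an existing OpenSSL CI script; the Travis API v3 endpoint and the JSON field names are assumptions to verify against the Travis docs.]

```shell
#!/bin/sh
# Sketch: make a PR check fail when the target branch's last Travis build
# was red. Reads Travis API v3 branch JSON on stdin and reports green/red.
check_branch_green() {
    # The v3 branch payload is assumed to embed last_build.state,
    # e.g. "passed" or "failed"; a crude grep is enough for a yes/no gate.
    if grep -q '"state": *"passed"'; then
        echo green
    else
        echo red
        return 1
    fi
}

# Live usage (requires network; the repo slug is URL-encoded as owner%2Frepo):
#   curl -s -H "Travis-API-Version: 3" \
#       "https://api.travis-ci.org/repo/openssl%2Fopenssl/branch/master" \
#       | check_branch_green
```

Wired up as a required status on PRs (or a cron stage), the non-zero exit when the branch is red would block merges until a CI-fixing PR lands.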


> 
> Regards,
> 
> Matthias
> 
> 
> 
> 
> 
> 
> Last Successful Build: #31583 (10 days ago!)
> https://travis-ci.org/openssl/openssl/builds/639939072?utm_medium=notification&utm_source=github_status
> 
> 
> Recent Build: #31837
> https://travis-ci.org/openssl/openssl/builds/644238908?utm_medium=notification&utm_source=github_status
> 
> 
> 
> SSL TESTS
> =
> 
> *
> https://travis-ci.org/openssl/openssl/jobs/644238919?utm_medium=notification&utm_source=github_status
> 
> *
> https://travis-ci.org/openssl/openssl/jobs/644238926?utm_medium=notification&utm_source=github_status
> 
> *
> https://travis-ci.org/openssl/openssl/jobs/644238929?utm_medium=notification&utm_source=github_status
> 
> 
>>  Test Summary Report
>> ---
>>  80-test_ssl_new.t    (Wstat: 256 Tests: 30 Failed: 1)
>>    Failed test:  4
>>    Non-zero exit status: 1
>>  80-test_ssl_old.t    (Wstat: 256 Tests: 6 Failed: 1)
>>    Failed test:  2
>>    Non-zero exit status: 1
> 
> 
> 
> BORINGSSL AND KRB5
> ==
> 
> *
> https://travis-ci.org/openssl/openssl/jobs/644238927?utm_medium=notification&utm_source=github_status
> 
> 
>> Test Summary Report
>> ---
> 
>> 95-test_external_boringssl.t (Wstat: 256 Tests: 1 Failed: 1)
>>   Failed test:  1
>>   Non-zero exit status: 1
>> 95-test_external_krb5.t (Wstat: 256 Tests: 1 Failed: 1)
>>   Failed test:  1
>>   Non-zero exit status: 1
> 
> 
> 
> 95-test_external_boringssl.t:
> ---
> 
>     grep 'go.*FAILED' t
>     go test: FAILED (SSL3-Client-ClientAuth-RSA)
>     go test: FAILED (SSL3-Server-ClientAuth-RSA)
>     go test: FAILED (ClientAuth-Sign-RSA-SSL3)
>     go test: FAILED (ClientAuth-InvalidSignature-RSA-SSL3)
>     go test: FAILED (CertificateVerificationSucceed-Server-SSL3-Sync)
>     go test: FAILED
> (CertificateVerificationSucceed-Server-SSL3-Sync-SplitHandshakeRecords)
>     go test: FAILED
> (CertificateVerificationSucceed-Server-SSL3-Sync-PackHandshakeFlight)
>     go test: FAILED (CertificateVerificationSucceed-Server-SSL3-Async)
>     go test: FAILED
> (CertificateVerificationSucceed-Server-SSL3-Async-SplitHandshakeRecords)
>     go test: FAILED
> (CertificateVerificationSucceed-Server-SSL3-Async-PackHandshakeFlight)
> 
> 
> 
> 95-test_external_krb5.t:
> ---
> 
>     LD_LIBRARY_PATH=`echo -L../../../lib | sed -e "s/-L//g" -e "s/
> /:/g"` KRB5_CONFIG=../../../config-files/krb5.conf LC_ALL=C ./t_nfold
>     N-fold
>     Input:    "basch"
>     192-Fold: 1aab6b42 964b98b2 1f8cde2d 2448ba34 55d7862c 9731643f
>     Input:    "eichin"
>     192-Fold: 65696368 696e4b73 2b4b1b43 da1a5b99 5a58d2c6 d0d2dcca
>     Input:    "sommerfeld"
>     192-Fold: 2f7a9855 7c6ee4ab adf4e711 92dd442b d4ff5325 a5def75c
>     Input:    "MASSACHVSETTS INSTITVTE OF TECHNOLOGY"
>     192-Fold: db3b0d8f 0b061e60 3282b308 a5084122 9ad798fa b9540c1b
>     RFC tests:
> 
>     ...
> 
>     Generating random keyblock: . . . OK
>     Creating opaque key from keyblock: . . . OK
>     Encrypting (c): . . . Failed: Cannot allocate memory <===ALLOC
> FAILURE==
>     Aborted (core
> dumped) <===CORE DUMPED==
>     Makefile:684: recipe for target 'check-unix' failed
>     make[5]: *** [check-unix] Error 134
>     make[5]: Leaving directory
>