On 7/10/24 18:14, Ilya Maximets wrote:
> On 7/10/24 17:41, Phelan, Michael wrote:
>>
>>> -----Original Message-----
>>> From: Ilya Maximets <[email protected]>
>>> Sent: Monday, July 8, 2024 4:41 PM
>>> To: Phelan, Michael <[email protected]>
>>> Cc: [email protected]; ovs-dev <[email protected]>; Aaron Conole
>>> <[email protected]>; Chaudron, Eelco <[email protected]>
>>> Subject: Re: Intel CI not running?
>>>
>>> On 7/8/24 11:02, Phelan, Michael wrote:
>>>>> -----Original Message-----
>>>>> From: Ilya Maximets <[email protected]>
>>>>> Sent: Thursday, July 4, 2024 9:14 PM
>>>>> To: Phelan, Michael <[email protected]>
>>>>> Cc: [email protected]; ovs-dev <[email protected]>; Aaron
>>>>> Conole <[email protected]>; Chaudron, Eelco <[email protected]>
>>>>> Subject: Re: Intel CI not running?
>>>>>
>>>>> On 7/4/24 20:46, Ilya Maximets wrote:
>>>>>> On 7/4/24 13:04, Phelan, Michael wrote:
>>>>>>> Hi Ilya,
>>>>>>> The CI got stuck running make check on a patch.  I have solved the
>>>>>>> issue, so reports should be back to normal now.
>>>>>>
>>>>>> Thanks for checking!  Though I see only two reports were sent out in
>>>>>> the past 7 hours, with a few hours between them.
>>>>>> Did it get stuck again?
>>>>>
>>>>> Got another report.  Looks like we're getting reports at a rate of one
>>>>> per 3.5 hours.  That doesn't sound right.
>>>>
>>>> We have added make check to the job parameters, so that has increased
>>>> the duration of the testing.
>>>
>>> 'make check' in GitHub CI takes about 7 minutes.  And machines there are not
>>> very fast.  It shouldn't take hours.
>>>
>>> In GitHub Actions we're running it with TESTSUITEFLAGS='-j4' RECHECK=yes
>>>
>>
>> I've added these flags, so that should speed up the tests.

If you have more cores, you may also increase the -j4 to -j<ncores>.
This may increase the rate of flaky failures, but RECHECK should handle them.
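
For reference, here is roughly what that invocation looks like; this is only
a sketch, assuming a Linux host where $(nproc) reports the available core
count:

  # Run the testsuite in parallel across all cores and automatically
  # retry failed tests once, which absorbs most flaky failures.
  make check TESTSUITEFLAGS="-j$(nproc)" RECHECK=yes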

>>>>
>>>> It seems like test number 98 ('barrier module') from make check is a bit
>>>> finicky and stalls occasionally as well.
>>>
>>> Hmm. I've never seen this test getting stuck.  Could you try to reproduce
>>> this by running './tests/ovstest test-barrier -v' manually?  It would also
>>> be great to know where exactly it is getting stuck, e.g. by running under gdb.
>>
>> I haven't been able to reproduce the issue manually with this command, but
>> it is consistently getting stuck when running on CI, which is strange.
>>
>> Any other suggestions on how to debug this?
> 
> You could try to attach gdb to a running process that is stuck
> to see where it is waiting and what it is waiting for.
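
For example, something like the following should show where it is blocked
(the pgrep pattern is just an illustration; match whatever the stuck process
looks like on the CI host, and substitute the real pid):

  # Locate the stuck test process and dump backtraces of all its threads.
  pgrep -af 'ovstest test-barrier'
  gdb -p <pid> -batch -ex 'thread apply all bt'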

In addition, we're also receiving reports with all tests skipped
for some reason:
  https://mail.openvswitch.org/pipermail/ovs-build/2024-July/040128.html

> 
>>
>> Kind regards,
>> Michael.
>>
>>>
>>> Best regards, Ilya Maximets.
>>>
>>>>
>>>>>
>>>>>>
>>>>>> Best regards, Ilya Maximets.
>>>>>>
>>>>>>>
>>>>>>> Thanks,
>>>>>>> Michael.
>>>>>>>> -----Original Message-----
>>>>>>>> From: Ilya Maximets <[email protected]>
>>>>>>>> Sent: Thursday, July 4, 2024 10:47 AM
>>>>>>>> To: Phelan, Michael <[email protected]>
>>>>>>>> Cc: [email protected]; ovs-dev <[email protected]>; Aaron
>>>>>>>> Conole <[email protected]>; Chaudron, Eelco
>>>>>>>> <[email protected]>
>>>>>>>> Subject: Intel CI not running?
>>>>>>>>
>>>>>>>> Hi, Michael!  We don't seem to have received reports from Intel CI
>>>>>>>> since some time on Monday.  The last report was:
>>>>>>>>
>>>>>>>>
>>>>>>>> https://mail.openvswitch.org/pipermail/ovs-build/2024-July/039755.html
>>>>>>>>
>>>>>>>> Could you please check?
>>>>>>>>
>>>>>>>> Best regards, Ilya Maximets.
>>>>>>
>>>>
>>
> 
