When the test runs on emulated hardware, the performance counter
(TSC) may be emulated and return the same value for consecutive reads.
So allow a latency of zero, even though it should not happen on real
hardware.

Signed-off-by: Stephen Hemminger <step...@networkplumber.org>
---
 app/test/test_latencystats.c | 4 +---
 1 file changed, 1 insertion(+), 3 deletions(-)
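
Note: as a rough illustration only (not the librte_latencystats code
itself), a latency sample is essentially the difference between two TSC
reads taken with rte_rdtsc(). On an emulated or coarse-grained TSC both
reads can return the same value, so a zero sample is legitimately
possible:

#include <stdio.h>
#include <inttypes.h>
#include <rte_cycles.h>

int
main(void)
{
	/* Two back-to-back TSC reads standing in for TX and RX timestamps. */
	uint64_t t_tx = rte_rdtsc();
	uint64_t t_rx = rte_rdtsc();

	/* On an emulated TSC both reads may be identical, so this can be 0. */
	uint64_t latency = t_rx - t_tx;

	printf("latency = %" PRIu64 " cycles\n", latency);
	return 0;
}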

diff --git a/app/test/test_latencystats.c b/app/test/test_latencystats.c
index 676a99d385..ad0237ce78 100644
--- a/app/test/test_latencystats.c
+++ b/app/test/test_latencystats.c
@@ -192,9 +192,7 @@ static int test_latency_packet_forward(void)
        }
 
        TEST_ASSERT(values[4].value > 0, "No samples taken");
-       TEST_ASSERT(values[0].value > 0, "Min latency should not be zero");
-       TEST_ASSERT(values[1].value > 0, "Avg latency should not be zero");
-       TEST_ASSERT(values[2].value > 0, "Max latency should not be zero");
+
        TEST_ASSERT(values[0].value < values[1].value, "Min latency > Avg latency");
        TEST_ASSERT(values[0].value < values[2].value, "Min latency > Max latency");
        TEST_ASSERT(values[1].value < values[2].value, "Avg latency > Max latency");
-- 
2.47.2
