Hi MXNet community,

Recently, a few other contributors and I focused on fixing examples in our 
repository that were not working out of the box as expected: 
https://github.com/apache/incubator-mxnet/issues/12800
https://github.com/apache/incubator-mxnet/issues/11895
https://github.com/apache/incubator-mxnet/pull/13196

Some of the examples broke after API changes, and the breakage went unnoticed 
until a user reported it. While the community is actively working on fixes, 
regressions are likely to recur within days if we don’t have a proper 
mechanism to catch them.

So, I would like to propose enabling nightly/weekly tests for the examples, 
similar to what we already have for tutorials, to catch such regressions. 
The tests would only check the basic functionality of each example: small 
examples could run to completion, while long training examples could run for 
just a few epochs.
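To make the idea concrete, here is a minimal sketch of what such a smoke-test runner could look like. The example paths and the `--num-epochs` flag below are illustrative assumptions, not actual repo contents; the real list would be derived from the examples directory and each script's own CLI:

```python
# Hypothetical nightly smoke-test runner for repository examples.
# Example paths and CLI flags are assumptions for illustration only.
import subprocess
import sys

def run_example(script_args, timeout=1800):
    """Run one example as a subprocess; return True if it exits cleanly.

    A timeout guards against examples that hang rather than fail.
    """
    try:
        result = subprocess.run(
            [sys.executable] + script_args,
            timeout=timeout,
        )
        return result.returncode == 0
    except subprocess.TimeoutExpired:
        return False

# Small examples run end-to-end; long training examples are capped at a
# few epochs via their CLI flags (flag names here are assumptions).
EXAMPLES = [
    ["example/mnist/train_mnist.py", "--num-epochs", "1"],
    ["example/image-classification/train_cifar10.py", "--num-epochs", "2"],
]

if __name__ == "__main__":
    failures = [args[0] for args in EXAMPLES if not run_example(args)]
    if failures:
        print("Failed examples:", failures)
        sys.exit(1)
```

A runner like this could be wired into the existing nightly CI pipeline alongside the tutorial tests, so an API change that breaks an example surfaces within a day instead of waiting for a user report.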

Any thoughts from the community? Any other suggestions for addressing this?

Regards,
Ankit Khedia
