Ramkumar Ramachandra wrote on Sat, Jul 24, 2010 at 13:16:02 +0530:
> Daniel Shahaf writes:
> > Where is the log message?
> 
> My proposed svn:log
> * subversion/tests/cmdline/svnrdump_tests_data: Add a new directory to
>   keep test data for svnrdump.
> * subversion/tests/cmdline/svnrdump_tests_data/asf-0.dump: Add a
>   dumpfile of the zeroth revision of the ASF repository.
> * subversion/tests/cmdline/svnrdump_tests.py: Add a new asf_0 test to
>   test the dumpfile along with a run_test helper function.
> 

Good start.  But you don't have a first sentence before the bullets (which we
want for most non-small changes), and you aren't using the standard
(function_name) syntax.  So, perhaps along these lines:

[[[
Add a unit test for svnrdump dumping r0.

* subversion/tests/cmdline/svnrdump_tests_data: Add a new directory to
    keep test data for svnrdump.

* subversion/tests/cmdline/svnrdump_tests_data/asf-0.dump: Add a
    dumpfile of the zeroth revision of the ASF repository.

* subversion/tests/cmdline/svnrdump_tests.py:
  (build_repos, run_test):  New helper functions.  ("Factored out from
    XXX()", if applicable)
  (asf_0):  New test ("for XXX", if you like).
  (test_list):  Run the new test.
]]]

> > Why are you using r0 *of the ASF repository* --- is anything special about 
> > that
> > repository?
> 
> No, but I plan to build up on this to verify 1k, 10k, 100k
> ... revisions of different repositories.
> 

First of all, please don't verify a 100k dumpfile during every 'make check'.
It would take too long.

Second, is it really necessary to have all of these?  I'd assume that all the
edge cases you need to test can fit quite comfortably in a 50-revision dumpfile.

(Specifically, I think that having 10k and 100k tests is unnecessary given
that you have a 1k test, and that the 1k test itself could be much reduced.
It's not the number of revisions that counts!)

> > Should we have run_and_verify_svnrdump()?
> 
> Excellent suggestion. I'll get started working on it as soon as I
> commit this patch.
> 
> -- Ram
