Hi gang:
I have had some post-release fun with the CDS Invenio CVS logs and the
release history. In this first episode, let us look at the code
kwalitee of the latest two releases:
>>> CDS Invenio 0.92.1
================================================================================
CDS Invenio Python Code Kwalitee Check 2008-03-27 21:37:13
================================================================================
Module          #LOC #UnitT #RegrT #T/1kLOC #MissDoc #PyChk/1kSRC PyLintScore
----------- -------- ------ ------ -------- -------- ------------ -----------
bibclassify      835      0      0     0.00        1       14.371     5.73/10
bibconvert      2842      5      0     1.76        2       38.705     7.54/10
bibedit         7210     31      2     4.58        6        6.657     7.19/10
bibformat      15545     25      9     2.19        0        2.509     7.99/10
bibharvest      4870      5      6     2.26       14        6.776     5.94/10
bibindex        4267      3      2     1.17       34        0.469     6.95/10
bibmatch         629      0      0     0.00        4       19.078     6.78/10
bibrank         6276      8      4     1.91       55       18.642     6.22/10
bibsched        1216      0      0     0.00       38        4.934     7.74/10
bibupload       1514      0     22    14.53        0        1.321     8.94/10
elmsubmit       6010      3      0     0.50      142       18.469     5.74/10
miscutil        3133     25      4     9.26        2        2.873     8.08/10
webaccess       6315      4      2     0.95        3        6.334     6.90/10
webalert        1880      0      1     0.53       37        1.596     6.49/10
webbasket       4487      0      1     0.22        8        2.006     8.02/10
webcomment      3226      0      3     0.93        2        6.200     4.77/10
webmessage      2353      0      1     0.42        1        1.275     8.58/10
websearch      13590     28     60     6.48       64        5.592     6.96/10
websession      6729      3      6     1.34       46        6.985     5.35/10
webstyle        1267      3      0     2.37       10        9.471     7.37/10
websubmit      20664      0      5     0.24      298       17.083    -3.11/10
----------- -------- ------ ------ -------- -------- ------------ -----------
TOTAL         114858    143    128     2.36      767        9.264     6.48/10
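A note on reading these tables: the #T/1kLOC column is simply the number
of unit plus regression (and, in 0.99.0, web) test cases normalised per
1000 lines of code, e.g. for bibconvert above (5+0)*1000/2842 = 1.76.
Here is a minimal sketch of that calculation, with function and argument
names of my own choosing rather than those of the actual kwalitee script:

    def tests_per_kloc(n_unit_tests, n_regr_tests, n_loc, n_web_tests=0):
        """Return the number of test cases per 1000 lines of code."""
        return (n_unit_tests + n_regr_tests + n_web_tests) * 1000.0 / n_loc

    print("%.2f" % tests_per_kloc(5, 0, 2842))  # bibconvert in 0.92.1 -> 1.76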
>>> CDS Invenio 0.99.0
================================================================================================================
CDS Invenio Python Code Kwalitee Check                                                       2008-03-27 20:55:43
================================================================================================================
Module          #LOC #UnitT #RegrT  #WebT #T/1kLOC #MissDoc #PyChk/1kSRC PyLintScore PyLintDetails
----------- -------- ------ ------ ------ -------- -------- ------------ ----------- -------------------------
bibclassify     1062      0      3      0     2.82        1       14.124     4.67/10 0F 0E 27W 11R 214C
bibconvert      2941      5      3      0     2.72        2       39.102     6.87/10 0F 0E 99W 18R 126C
bibedit         9160     31      2      0     3.60        5        5.895     7.95/10 0F 4E 101W 82R 749C
bibformat      15752     27     15      1     2.73        0        7.174     8.16/10 0F 0E 249W 126R 988C
bibharvest      4826      7      6      0     2.69       11       20.928     6.24/10 0F 0E 142W 73R 688C
bibindex        4602      7      2      0     1.96       32       21.730     6.99/10 0F 0E 158W 56R 839C
bibmatch         629      0      0      0     0.00        4       19.078     6.76/10 0F 0E 65W 3R 42C
bibrank         5759      8      8      0     2.78       43       25.004     6.35/10 0F 5E 269W 63R 1044C
bibsched        1570      0      0      0     0.00       40        7.006     7.38/10 0F 0E 42W 19R 143C
bibupload       1767      0     42      0    23.77        1        3.396     9.06/10 0F 0E 62W 23R 181C
elmsubmit       5960      3      0      0     0.50      141       17.953     5.88/10 1F 11E 169W 48R 716C
miscutil        4705     82      6      0    18.70       11        4.038     8.26/10 0F 0E 60W 33R 345C
webaccess       8234     22      2      0     2.91        8        9.351     7.20/10 0F 0E 155W 90R 662C
webalert        1896      0      1      0     0.53       34        2.110     6.61/10 0F 0E 26W 24R 236C
webbasket       4545      0      4      0     0.88        8        1.980     7.66/10 0F 1E 107W 85R 303C
webcomment      3438      0      4      0     1.16        2        6.399     5.71/10 0F 0E 67W 55R 505C
webjournal      3984      0      0      0     0.00        7       14.056     6.29/10 0F 0E 183W 30R 310C
webmessage      2412      0      1      0     0.41        1        3.731     8.46/10 0F 0E 32W 30R 72C
websearch      14864     30     89      1     8.07       85       19.779     7.26/10 0F 2E 539W 241R 1718C
websession      7774      3     10      0     1.67       64        5.403     7.22/10 0F 9E 130W 107R 793C
webstyle        3122      3      0      0     0.96        9        7.047     7.22/10 0F 2E 68W 43R 175C
websubmit      28026      0      7      6     0.46      270       17.056    -0.34/10 0F 104E 1288W 331R 5599C
----------- -------- ------ ------ ------ -------- -------- ------------ ----------- -------------------------
TOTAL         137028    228    205      8     3.22      779       13.209     6.72/10 1F 138E 4038W 1591R 16448C
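A note on the new PyLintDetails column: it counts pylint messages per
category, F(atal), E(rror), W(arning), R(efactor) and C(onvention); the
bibclassify row above thus means no fatal errors, no errors, 27 warnings,
11 refactoring suggestions and 214 coding convention remarks.  A rough
sketch of how such a tally can be obtained, assuming pylint's plain text
output where every message carries an id such as W0612 or C0111 (an
illustration of the idea, not necessarily how the kwalitee script does it):

    import re
    import subprocess
    from collections import defaultdict

    def pylint_message_counts(path):
        """Tally pylint messages per category (F/E/W/R/C) for path."""
        # Run pylint and capture its plain text report.
        output = subprocess.Popen(['pylint', path], stdout=subprocess.PIPE,
                                  universal_newlines=True).communicate()[0]
        # Count message ids (e.g. W0612) by their leading category letter.
        counts = defaultdict(int)
        for category in re.findall(r'\b([FEWRC])\d{4}\b', output):
            counts[category] += 1
        return dict(counts)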
>>> Differences from 0.92.1 to 0.99.0, Python code only:
Module         D#LOC D#tests
----------- -------- -------
bibclassify     +227      +3
bibconvert       +99      +3
bibedit        +1950       =
bibformat       +207      +9
bibharvest       -44      +2
bibindex        +335      +4
bibmatch           =       =
bibrank         -517      +4
bibsched        +354       =
bibupload       +253     +20
elmsubmit        -50       =
miscutil       +1572     +59
webaccess      +1919     +18
webalert         +16       =
webbasket        +58      +3
webcomment      +212      +1
webjournal     +3984       0
webmessage       +59       =
websearch      +1274     +32
websession     +1045      +4
webstyle       +1855       =
websubmit      +7362      +8
----------- -------- -------
TOTAL         +22170    +170
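For the record, the deltas above are plain per-module differences of the
number of lines of code and of the total number of test cases (unit plus
regression plus web), with modules new in 0.99.0 counted from zero.  A
little sketch of the bookkeeping, again with names of my own making:

    def kwalitee_diff(old, new):
        """Per-module (D#LOC, D#tests) differences between two releases.

        Both arguments map module name -> (loc, total_tests); modules
        absent from old (e.g. webjournal) are counted from (0, 0).
        """
        diffs = {}
        for module, (loc, tests) in new.items():
            old_loc, old_tests = old.get(module, (0, 0))
            diffs[module] = (loc - old_loc, tests - old_tests)
        return diffs

    # E.g. old['miscutil'] = (3133, 25 + 4) and new['miscutil'] = (4705, 82 + 6)
    # give (+1572, +59), as in the table above.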
>>> Conclusions?
- the three best-tested modules are (same as in 0.92.1):
    BibUpload, MiscUtil, WebSearch
- the three least-tested modules are (almost the same as in 0.92.1):
    BibMatch, BibSched, WebJournal[*]
- the three most coding-style-compliant modules are (same as in 0.92.1):
    BibUpload, WebMessage, MiscUtil
- the three least coding-style-compliant modules are (almost the same as in 0.92.1):
    WebSubmit, BibClassify[*], WebComment
  [*] newcomer to the respective list
- we have 170 more test cases now, which is good, but still only three
  modules are in (or close to) the comfort zone of ~10 tests per 1 kLOC:
  BibUpload (23.77), MiscUtil (18.70) and WebSearch (8.07).  (Anyhow, the
  massive deployment of the new web test suite should improve things in
  this department real soon now!)
- the three apparently biggest code additions happened in:
    WebSubmit, WebJournal, BibEdit
- the three apparently smallest code changes (in fact net removals)
  happened in:
    BibRank, ElmSubmit, BibHarvest
Is that really so with the code additions? Stay tuned for the next
episode of this post-release fun series ;-)
Best regards
--
Tibor Simko ** CERN Document Server ** <http://cds.cern.ch/>