Hi all,

I'm a developer at the University of Amsterdam, and we are planning to run uPortal 3.1 in production in a few months. Naturally, we would like the system to be as stable as possible, and since my colleagues and I are big fans of automated code-review tools, we ran the automated (byte-code) review tool FindBugs on uPortal.

Not all issue categories reported by FindBugs are equally easy or rewarding to fix. I picked the '(possible) null pointer dereferences' category, since it usually contains issues that are both easy and rewarding to fix, and spent an afternoon addressing them. I have attached my efforts to a Jira issue:

http://www.ja-sig.org/issues/browse/UP-2443

This was mainly an exercise to get a feel for the maintainability of the code, but we intend to contribute more fixes like this.
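To illustrate the kind of issue in this category (a hypothetical example, not one of the actual fixes attached to UP-2443): FindBugs flags code paths where a value that may be null is dereferenced without a check, and the usual fix is an explicit null guard.

```java
import java.util.HashMap;
import java.util.Map;

public class NullGuardDemo {

    // Before the fix, 'titles.get(key).trim()' would throw a
    // NullPointerException whenever the map has no entry for the key.
    // The guard below is the typical one-line remedy FindBugs suggests.
    static String titleFor(Map<String, String> titles, String key) {
        String title = titles.get(key);              // may be null
        return (title == null) ? "" : title.trim();  // guarded dereference
    }

    public static void main(String[] args) {
        Map<String, String> titles = new HashMap<String, String>();
        titles.put("home", "  Home  ");
        System.out.println("[" + titleFor(titles, "home") + "]");
        System.out.println("[" + titleFor(titles, "missing") + "]");
    }
}
```

Running this prints `[Home]` and `[]` instead of crashing on the missing key.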

Other promising candidates include:

'Method may fail to close database resource' (33 reported FindBugs issues)
'Method may fail to clean up stream or resource' (40 reported FindBugs issues)
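For both of these categories, the standard fix is to release the resource in a finally block (or, on Java 7+, a try-with-resources statement), so the close happens even when an exception is thrown mid-operation. A minimal sketch of the pattern, using a temporary file rather than an actual uPortal database connection:

```java
import java.io.File;
import java.io.FileWriter;
import java.io.IOException;
import java.io.Writer;

public class CloseInFinally {

    // Close the Writer in a finally block so the handle is released
    // even if write() throws -- the shape FindBugs expects for
    // 'may fail to clean up stream or resource' findings.
    static long writeHello(File target) throws IOException {
        Writer out = new FileWriter(target);
        try {
            out.write("hello");
        } finally {
            out.close();  // runs on both normal and exceptional exit
        }
        return target.length();
    }

    public static void main(String[] args) throws IOException {
        File tmp = File.createTempFile("findbugs-demo", ".txt");
        tmp.deleteOnExit();
        System.out.println("bytes written: " + writeHello(tmp));
    }
}
```

The same try/finally shape applies to JDBC Connections, Statements and ResultSets, which is what most of the 33 database-resource findings boil down to.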

although, in my experience, FindBugs reports very few false positives, so it may not even be necessary to 'zoom in' on specific categories: most reported issues are worth fixing regardless of category.

How do others feel about this 'bottom-up' way of fixing potential issues in the uPortal code base?

-Ernst-Jan Verhoeven


--
You are currently subscribed to [email protected] as: 
[email protected]
To unsubscribe, change settings or access archives, see 
http://www.ja-sig.org/wiki/display/JSG/uportal-dev
