Hi Lex,

I just started a code review group for the cosmology group in the physics 
department at UC Davis (our first meeting was yesterday). I thought it went 
really well (I was presenting so I may have a biased view), but in general 
people seemed really excited about it. This is a little different from the pair 
reviewing you were mentioning, but maybe you can port some of the ideas. Here 
are some of the things I thought about and decisions I made:
- No faculty (initially). I find people don't like putting their code in front 
of others, and I wanted to make sure everyone was comfortable, including those 
still looking for an advisor. We may revisit this in the future, as some 
faculty have expressed interest in joining. We voted as a group to allow 
research staff.
- I emphasized that this is not just a time for the presenter to learn, but for 
everyone to share their knowledge.
- I encouraged all levels to attend. I find that beginners, who do more 
googling, can come up with functionality I've never seen, or may approach a 
problem in an unexpected way. Everyone benefits when there is a variety of 
people in the room.
- No more than 200 lines of code to present.
- Contrary to Fernando's guidelines, I am not expecting people to have the time 
to review the code prior to the meeting. I don't want people to stay away 
because they haven't had time to read the code.
- I limited the group to the cosmology group rather than the whole physics 
department. This cuts down on the amount of explanation required, because we 
are often speaking the same language even if the projects are different.
- We are meeting weekly for an hour. Our code will live in a GitHub repository, 
in directories by presenter and then by date. I am giving a brief Git/GitHub 
tutorial next meeting so everyone can contribute. We will use a Google Doc to 
sign up to present.
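
As a concrete sketch of that repository layout (all names here are made up for 
illustration, not taken from our actual repo):

```shell
# One directory per presenter, then one per meeting date;
# the snippet to be reviewed (at most 200 lines) goes inside.
mkdir -p review-repo/azalee/2015-11-18
touch review-repo/azalee/2015-11-18/snippet.py
ls review-repo/azalee/2015-11-18
```

Presenters would then git add and commit their dated directory before their 
slot, which is exactly what the Git/GitHub tutorial will cover.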

Best of luck!
Azalee



> On Nov 19, 2015, at 6:45 AM, Jonathan Strootman 
> <[email protected]> wrote:
> 
> Choosing what to review is often a difficult choice. Limiting the scope is 
> essential, but sometimes not very effective. 
> 
> If you find that reviews of single functions are not yielding much 
> benefit, try expanding your review scope to "slices" of code. This could be 
> something like "review that my code properly parses these inputs".
> 
> That is a consideration for later, though. Concentrating on creating a 
> culture of peer code review comes first, and it takes energy to build that 
> culture where it didn't exist before.
> 
> It warms my geeky, coder heart to see these kinds of efforts!
> 
> Cheers,
> Jonathan Strootman
> 
> On Wed, Nov 18, 2015 at 5:41 PM Ariel Rokem <[email protected]> wrote:
> Hi Lex, 
> 
> Fernando Perez posted a few guidelines for code review ("the lab meeting for 
> code"), here: 
> 
> http://fperez.org/py4science/code_reviews.html
> 
> Cheers, 
> 
> Ariel
> 
> On Wed, Nov 18, 2015 at 11:09 AM, Greg Wilson 
> <[email protected]> wrote:
> Hi Lex,
> 
> I think this is a great idea - we'd really like a blog post during/after 
> about how it goes.
> 
> 1. I've attached a copy of Jason Cohen's chapter from "Making Software" about 
> recent (as of 2011) empirical results on code review - it's a quick read, and 
> I think it does a good job of explaining why code review is more than just 
> fashionable.
> 
> 2. The complaint I always had from students when I was teaching code review 
> in software engineering classes was that they didn't know what to look for, 
> even when given checklists.  I tried to address this by doing a review live 
> in front of them (and then getting another programmer to come in and do a 
> review on a different piece of code, thinking aloud).  It seemed to make a 
> big difference to their comfort levels.
> 
> 3. An experiment I *didn't* try, but wanted to, was this:
> 
> a) you pick the piece of code and review it, but don't share the review with 
> the students
> 
> b) they each review the same piece of code independently
> 
> c) their score is based on how closely their review matches yours, i.e., 100% 
> - (false positives) - (false negatives).
> 
> I've seen this done in grading short essays, and again, it gives people an 
> idea of what to look for so that they don't feel they've been thrown in at 
> the deep end.
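> 
> A quick sketch of that scoring scheme (the function and issue labels are 
> hypothetical, just to make the arithmetic concrete; a false positive is an 
> issue the student flagged that the instructor did not, a false negative is 
> an instructor-flagged issue the student missed):
> 
> ```python
> def review_score(reference, student):
>     """Score a student's review against the instructor's, as a percentage."""
>     reference, student = set(reference), set(student)
>     false_positives = len(student - reference)   # flagged, but not real
>     false_negatives = len(reference - student)   # real, but missed
>     return 100 - false_positives - false_negatives
> 
> # Instructor found issues 1-4; student flagged 1, 2, and a spurious 9:
> # 100 - 1 false positive - 2 false negatives = 97.
> print(review_score({1, 2, 3, 4}, {1, 2, 9}))
> ```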
> 
> Hope this is helpful - really look forward to hearing how it goes,
> Greg
> 
> 
> On 2015-11-18 4:02 PM, Lex Nederbragt wrote:
>> Hi,
>> 
>> I have a 20% position at the informatics department here in Oslo (the rest 
>> is at biology), where I (co)supervise Master and PhD students from the 
>> Biomedical Informatics group. The group has about a dozen MSc students, 
>> involved in a range of different projects, but most will have a coding 
>> component. We recently discussed whether we could have the students review 
>> each other's code ('Code peer review'? 'Peer code review'?) as a way to 
>> improve their coding skills, in a hopefully low-threshold way (rather than 
>> have the professors look at it).
>> 
>> About the format, we feel students should show each other small bits of code 
>> (not all of it at once), e.g. one not-too-long function that makes for a 
>> doable review exercise: understanding the context and what the code is 
>> supposed to do, and perhaps even testing it, should not take too long for 
>> the reviewer. The students can do the entire review process as a pair, or 
>> the reviewer looks at the code before they meet to prepare questions and 
>> suggestions for improvement. Once in a while we could ask students to 
>> demonstrate an example of what they found to the whole group, e.g. a piece 
>> of code before-and-after.
>> 
>> We were wondering whether others have tried such an approach, or whether it 
>> sounds hopelessly ambitious...
>> 
>> Thanks in advance,
>> 
>>      Lex Nederbragt
>> 
>> --
>> Lex Nederbragt
>> Centre for Ecological and Evolutionary Synthesis (CEES)
>> Dept. of Biosciences, University of Oslo
>> P.O. Box 1066 Blindern 0316 Oslo, Norway
>> Ph. +47 22844132 +47 48028722 Fax. +47 22854001
>> Email [email protected]
>> http://flxlex.flavors.me/
>> 
>> 
>> _______________________________________________
>> Discuss mailing list
>> [email protected]
>> http://lists.software-carpentry.org/mailman/listinfo/discuss_lists.software-carpentry.org
> -- 
> Dr. Greg Wilson    | [email protected] 
> <mailto:[email protected]>
> Software Carpentry | http://software-carpentry.org 
> <http://software-carpentry.org/>
