This one is also getting a "how professors can assess the OSS work done 
by their students" spin, and was consequently tagged with the Education tag.

-------- Original Message --------
Subject: Sumana's OSCON 2010 proposal
Date: Tue, 2 Feb 2010 03:01:21 -0500 (EST)
From: Sumana Harihareswara <suma...@panix.com>
To: m...@redhat.com

Hard conversations: HOWTO assess performance in distributed FOSS
organizations, and why

by Sumana Harihareswara

Description:
How can you manage performance in a distributed, open source organization
without being a pointy-haired boss? This is your chance to engineer hard
conversations, read (and write) the company's culture, and help engineers
bootstrap their own personal & professional growth.


Abstract:
If we all were angels, we'd have no need of government -- or of a
performance appraisal structure.  Performance assessments provide a
structure for helping us have those awkward conversations we need in order
to grow.  This is especially hard when:

* everyone involved is a techie without management training
* people rarely see each other face-to-face
* the work is open source, especially R&D, with vague goals and
deliverables

How can you ensure that your organization consistently provides for
appropriate praise & criticism, and builds personnel files to help
performers grow?  I describe a schedule, architecture, and guidelines for
performance appraisal, suitable for an open source distributed
organization, as implemented at my previous firm.

Some key components:

* Flat, top-down metrics like lines of code are destined to fail.  Use
this opportunity to get buy-in -- read and write the company's culture.
Your most ninja rockstar developer may be down on herself for spending so
much time mentoring others instead of generating code, or a prima donna
may think it's beneath him to write documentation or test cases.  These
misalignments pop up more frequently in distributed workforces than in
colocated ones; address them systematically.

* Self- and peer evaluations should balance length and ease of use (I
suggest some usability guidelines and a mix of quantitative and
qualitative questions).

* Benchmark to help everyone understand the company's expectations, and to
get everyone on the same page.  Multiple criteria and concrete examples
for each criterion show that the organization values mentorship, open
source citizenship, collegiality, and initiative as well as raw coding
skill.

* Scrum-style conversations: a liaison regularly asks what you have
accomplished, what you aim to do next, and how the organization can help
you (fix blockers).

* Confidentiality is key.  I suggest access control rules and a separation
of responsibilities among employees/contractors, managers, and evaluators.

* I suggest ways to responsibly gather performance feedback from external
reviewers, including open source peers and clients.


You will get astounding rewards from appropriate, consistent criticism and
praise of a worker's performance.  It's difficult, but it's worth it, and
I'll show you how to make sure your whole organization benefits.
_______________________________________________
tos mailing list
tos@teachingopensource.org
http://teachingopensource.org/mailman/listinfo/tos
