Going back to the root of this discussion, Sheila had proposed the question:

+ What are we trying to accomplish from the user's perspective, and why?

We've agreed that the target user for Scooby 0.3 is an employee at OSAF who needs to update their PTO on the office calendar.

For Example:
+ Joe is an employee at OSAF
+ OSAF requires employees to put PTO on a shared office calendar
+ With the exception of the office manager, employees should not change other employees' PTO.
+ Being an employee, this is just part of Joe's job and something he is obliged to do.
+ OSAF has some desire to make this easy for Joe; if it's hard to do, people won't do it.

So let's think about this scenario again with some of the security ideas people have been talking about on the list.

Idea 1: Security by obscurity
Being an employee also implies a silent agreement among employees that Joe would not give their information out to random people, publish it on the web somewhere, or engage in any of the malicious spamming scenarios already discussed on the list.

Idea 2: Accountability - Everyone must identify themselves in order to participate. Joe may have created his own account, or IT creates an account for him. In order to add his PTO he must log into that account. Joe is now accountable for all the changes he makes. If he does something that breaks company policy, IT blocks his account and takes action, because they know who to take action against.
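To make the accountability idea concrete, here is a minimal sketch (in Python, with hypothetical names like CalendarServer and make_change that are not Cosmo's actual API): every change goes through a logged-in account, gets recorded against that account, and IT can block the account when policy is broken.

```python
# Illustrative only: a server where every change is tied to an account.
class CalendarServer:
    def __init__(self):
        self.blocked = set()    # accounts IT has disabled
        self.audit_log = []     # (account, action) pairs -- who did what

    def make_change(self, account: str, action: str) -> bool:
        if account in self.blocked:
            return False        # blocked users can no longer participate
        self.audit_log.append((account, action))
        return True

    def block(self, account: str):
        # Because every change is attributed, IT knows whom to act against.
        self.blocked.add(account)

server = CalendarServer()
assert server.make_change("joe", "add PTO: June 5-9")
server.block("joe")
assert not server.make_change("joe", "edit someone else's PTO")
```

The point is not the data structures but the property: no anonymous changes, so every action can be traced back to a person.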

Idea 3: ACLs - It's the job of IT to handle setting things up for Joe and ensuring he won't be able to do anything he is not allowed to do.

When Joe first starts working at OSAF, he is assigned an e-mail address. If the office policy is for employees to put their PTO on the office calendar, Joe would also be assigned permissions. Joe might not have full read-write permission to edit everything on the calendar: he will be able to view everyone's PTO, but he may only be allowed to edit his own. The level of permissions for Joe would be set by IT based on the company's policy.
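The permission scheme above can be sketched in a few lines (again in Python, with invented names; "pat" stands in for whoever the office manager is): everyone can read all PTO entries, but only an entry's owner or the office manager can edit it.

```python
# Illustrative sketch of the ACL policy, not Cosmo's real permission model.
OFFICE_MANAGER = "pat"  # hypothetical office-manager account

def can_read(user: str, entry_owner: str) -> bool:
    # Company policy: every employee may view everyone's PTO.
    return True

def can_edit(user: str, entry_owner: str) -> bool:
    # Only the owner of the PTO entry, or the office manager, may change it.
    return user == entry_owner or user == OFFICE_MANAGER

assert can_read("joe", "mary")          # Joe can see Mary's PTO...
assert not can_edit("joe", "mary")      # ...but cannot change it
assert can_edit("joe", "joe")           # he can change his own
assert can_edit("pat", "mary")          # the office manager can change anyone's
```

Whatever the real implementation looks like, the ACL is just this table of who-may-do-what, and IT sets it once per the company's policy.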

So when looking at the 0.3 target user, an employee at a small company collaborating with other employees within the company, the security policy is really set by the company, and the employees would most likely have to comply, no matter how high or low the security level is.

In all three scenarios, you still need someone to set up and maintain the Cosmo server. That is to say, someone in the workgroup plays the role of IT.

------
Planning for the Target Users at 1.0

Now let's think of it in terms of a visiting 'Casual Collaborator' target user, someone who is outside of the organization and would communicate very infrequently.

For example, Jeremy is a contributor for the Scooby project. He is not an employee at OSAF. Let's say OSAF is hosting a Scooby/Cosmo sprint week. Ted invites all contributors to come into the office and work with the team in person, and he only asks that they tell him when they are coming in.
Assuming that contributors use Scooby to tell Ted when they are coming:
+ How much collaboration does the visiting 'Casual Collaborator' really need to have?
   + Is it as simple as a yes/no answer?
+ Does the user really need the ability to type in a lengthy response?
+ Do users expect to see a calendar, or a workflow that guides them through the task of telling Ted when they are coming in? After all, they are not regular users and may have no interest in using Scooby as a calendar.
+ Is just having read access enough for this person to participate?

I believe the 'Casual Collaborator' is more complicated than just the security discussion, because it's possible to accommodate the 'Casual Collaborator' even with a security model that involves ACLs and accountability. In fact, ACLs may even help make the 'Casual Collaborator' experience a better one: even in a system with accountability and ACLs, you can structure an experience for the casual collaborator that drives adoption.

---------

One final thought. If we are consciously making the decision to trade low security for higher adoption for now, when does enforcing more security become a higher priority? The last thing anybody would want is a last-minute, reactionary decision to bolt on a security layer that has not been thought through.

-Priscilla




_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

Open Source Applications Foundation "Design" mailing list
http://lists.osafoundation.org/mailman/listinfo/design
