Re: [Zope3-dev] The Zope Software Certification Program and Common Repository Proposal

2006-02-23 Thread Stephan Richter
On Wednesday 22 February 2006 10:58, Dominik Huber wrote:
 Do you have other solutions for this problem?

Honestly, I had not thought about this case, but it is clearly a valid use 
case.

What about this structure?

repos/main/
    NAMESPACE/
        branches/
        tags/
        trunk/
            ...configure stuff here...
    NAMESPACE.PACKAGE1/
        branches/
        tags/
        trunk/
    NAMESPACE.PACKAGE2/
        branches/
        tags/
        trunk/

This approach is also extendable to the use case below:

repos/main/
    NAMESPACE/
        branches/
        tags/
        trunk/
            ...configure stuff here...
    NAMESPACE.PACKAGE/
        branches/
        tags/
        trunk/
    NAMESPACE.SUBNAMESPACE/
        branches/
        tags/
        trunk/
            ...configure stuff here...
    NAMESPACE.SUBNAMESPACE.PACKAGE1/
        branches/
        tags/
        trunk/
    NAMESPACE.SUBNAMESPACE.PACKAGE2/
        branches/
        tags/
        trunk/

What do you think? I think this is the right way to do it, since it flattens 
the namespace tree in a very logical way.
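
On the Python side, such a flattened layout relies on NAMESPACE being a
namespace package whose contents can come from several checkouts. A minimal
sketch of the mechanics using the stdlib pkgutil module (directory names here
are invented stand-ins, not part of the proposal):

```python
import os
import pkgutil
import sys
import tempfile

# Simulate two flat checkouts, each contributing a directory to the
# same "ns" namespace package (all names hypothetical).
root = tempfile.mkdtemp()
portions = []
for checkout in ("NAMESPACE", "NAMESPACE.PACKAGE1"):
    pkg_dir = os.path.join(root, checkout, "ns")
    os.makedirs(pkg_dir)
    with open(os.path.join(pkg_dir, "__init__.py"), "w") as f:
        f.write("# namespace package\n")
    sys.path.insert(0, os.path.join(root, checkout))
    portions.append(pkg_dir)

# Inside each ns/__init__.py one would write:
#     __path__ = pkgutil.extend_path(__path__, __name__)
# Calling extend_path directly shows that both checkout directories end
# up on the package's search path, so the flat checkouts merge:
merged = pkgutil.extend_path([portions[0]], "ns")
assert all(p in merged for p in portions)
```

This is roughly how a zope.*-style namespace is stitched together from
independently checked-out packages.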

 Another smaller question: Is it intended to provide nested namespaces such
 as repos/main/NAMESPACE.SUBNAMESPACE.PACKAGE? This pattern is
 convenient if a few packages are tied together by their release cycles.

Yes, sure. There is no limit to the namespace depth.

Regards,
Stephan
-- 
Stephan Richter
CBU Physics & Chemistry (B.S.) / Tufts Physics (Ph.D. student)
Web2k - Web Software Design, Development and Training
___
Zope3-dev mailing list
Zope3-dev@zope.org
Unsub: http://mail.zope.org/mailman/options/zope3-dev/archive%40mail-archive.com



Re: [Zope3-dev] The Zope Software Certification Program and Common Repository Proposal

2006-02-23 Thread Dominik Huber

Stephan Richter wrote:

On Wednesday 22 February 2006 10:58, Dominik Huber wrote:

Do you have other solutions for this problem?

Honestly, I had not thought about this case, but it is clearly a valid use 
case.

What about this structure?

repos/main/
    NAMESPACE/
        branches/
        tags/
        trunk/
            ...configure stuff here...
    NAMESPACE.PACKAGE1/
        branches/
        tags/
        trunk/
    NAMESPACE.PACKAGE2/
        branches/
        tags/
        trunk/

[...]

What do you think? I think this is the right way to do it, since it is a very 
logical procedure to flatten the namespace tree.
  
We came up with the same solution first, but our problem appears within 
the following use case.

A few developers share code at the application level (not package level!) 
for different dedicated customer projects. They have trouble setting up 
identical dev environments (packages with the same revisions etc.), so 
they want to set up a dedicated application repository that ties different 
packages together, using svn:externals extensively. In the end, developers 
should check out this application repository (comparable to the Zope3/trunk) 
for joint development at the application level. The only way they could set 
the svn:externals property using the above layout is the following:


repos/main/
    APPLICATION1/
        branches/
        tags/
        trunk/
            src/
                - svn:externals: NAMESPACE url
                - svn:externals: NAMESPACE/PACKAGE1 url
                - svn:externals: NAMESPACE/PACKAGE2 url

This scheme will work, but it has the disadvantage that they still cannot 
share application-specific configurations of NAMESPACE across further customer 
projects, because they cannot link repos/main/APPLICATION1/trunk/src/NAMESPACE 
via svn:externals.

The solution we came up with was to put the '...configure stuff here...' 
*files* of the NAMESPACE not directly into the namespace itself, but into a 
dedicated *folder*, for example 'configure':

repos/main/
    NAMESPACE/
        branches/
        tags/
        trunk/
            configure/
                ...configure stuff here...

That way we can move the svn:externals to the NAMESPACE folder of the APPLICATION1 repository: 


repos/main/
    APPLICATION1/
        branches/
        tags/
        trunk/
            src/
                NAMESPACE/
                    __init__.py
                    - svn:externals: configure url
                    - svn:externals: PACKAGE1 url
                    - svn:externals: PACKAGE2 url

This enables the developers to share this NAMESPACE configuration with 
further customer projects, for example the following PROJECT01:

repos/main/
    PROJECT01/
        branches/
        tags/
        trunk/
            src/
                - svn:externals: NAMESPACE url*

*url = repos/main/APPLICATION1/trunk/src/NAMESPACE
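
Concretely, the svn:externals definition set on
repos/main/APPLICATION1/trunk/src/NAMESPACE/ might then read as follows; the
server URLs are placeholders (the actual ones are elided as 'url' above), in
the classic "subdirectory URL" externals format:

```
configure   http://svn.example.org/repos/main/NAMESPACE/trunk/configure
PACKAGE1    http://svn.example.org/repos/main/NAMESPACE.PACKAGE1/trunk
PACKAGE2    http://svn.example.org/repos/main/NAMESPACE.PACKAGE2/trunk
```

One would set this with `svn propset svn:externals -F externals.txt .` and
commit; a checkout of APPLICATION1/trunk then pulls in the configure folder
and both packages automatically.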

Do you see our intent? We think this could simplify joint development, so 
we might include such a 'configure folder' convention, or a 'never put any 
configure stuff directly inside a namespace' rule, in the ZSCP and 
CRP proposal.

Regards,
Dominik





Re: [Zope3-dev] The Zope Software Certification Program and Common Repository Proposal

2006-02-23 Thread Benji York

Dominik Huber wrote:
We came up with the same solution first. But our problem appears within 
the following use case.


A few developers share code at the application level (not package 
level!) for different dedicated customer projects. They have trouble 
setting up identical dev environments (packages with the same revisions 
etc.).


I need to carefully reread your message and reply with more detail 
later, but I'll make a drive-by comment first. :)


Ultimately, reproducible developer buildouts can never be represented 
solely with Subversion (or likely any RCS).  If you've been able to 
structure your projects such that it works, that's great, but that use 
case shouldn't be a driving force behind the common repository layout.

--
Benji York
Senior Software Engineer
Zope Corporation



Re: [Zope3-dev] The Zope Software Certification Program and Common Repository Proposal

2006-02-23 Thread Dominik Huber

Benji York wrote:

Dominik Huber wrote:
We came up with the same solution first. But our problem appears 
within the following use case.


A few developers share code at the application level (not package 
level!) for different dedicated customer projects. They have trouble 
setting up identical dev environments (packages with the same revisions 
etc.).


I need to carefully reread your message and reply with more detail 
later, but I'll make a drive-by comment first. :)


Ultimately, reproducible developer buildouts can never be represented 
solely with Subversion (or likely any RCS).  If you've been able to 
structure your projects such that it works, that's great, but that use 
case shouldn't be a driving force behind the common repository layout.
Yes, I know that. I'm fully aware that our suggestion is a kind of 
workaround, but it works fairly well in practice for a wide range of 
applications.


Packages are one story of development. Other stories are exemplary 
applications that depend on a certain set of packages. Such a set represents 
important fine-tuning (matching revisions or tags etc.); in other words, such 
a set is knowledge about package combinations that could be shared between 
different projects relying on the same exemplary application. It might lower 
the barrier for new contributors and early adopters - IMO that's fairly 
important too. That's my driving force :)


Regards,
Dominik

--
Dominik Huber

Perse Engineering GmbH
Jurastrasse 9a
CH-5406 Baden

E-Mail: [EMAIL PROTECTED]
Telefon: ++41 44 586 6886





Re: [Zope3-dev] The Zope Software Certification Program and Common Repository Proposal

2006-02-21 Thread Reinoud van Leeuwen
On Mon, Feb 20, 2006 at 04:31:03PM -0500, Stephan Richter wrote:

Hi Stephan,

This seems to me a great step forward but I am missing something.
The quality is measured by a number of metrics, but it seems that nowhere 
is it actually measured whether the software does what it is supposed to do, 
whether it is clear how it works, and whether it is something you would 
advise other people to use.
This is one of the major problems I've had in the past with things like 
the Perl CPAN repository: you can find lots of modules that seem to solve 
your problem, but usually you only discover what a module is really about 
once you've invested a lot of time in it.

Would there be room for a voting or feedback step in the process, where 
people who have tried the module could enter a rating?


 1.2. Goals
 [...]
   * Quality Packages

 There is a natural desire for any developer to know what they are getting
 into when they are using a certain package, a baseline of quality that can
 be expected. While the Zope 3 community has some ideas of what that
 baseline is for the core, it is not well defined and applied uniformly.
 This proposal defines clear quality guidelines.

[...]

 2.4. Quality Metrics

 The certification is meaningless without a precise definition of the tasks
 that have to be accomplished for each certification level. This section
 provides a list of concrete items that have to be fulfilled for each
 certification level.
 
 Legend:
 
 - x: A metric is required for the certification level.
 - A: The metric check can be conducted automatically.
 - Q: The metric check can be conducted quickly by human inspection.
 - D: The metric check would be difficult to conduct by human inspection.
 
 +------------------------------------------+-------+------+------+------+------+
 | Metric                                   | Check | List | Le 1 | Le 2 | Le 3 |
 +==========================================+=======+======+======+======+======+
 | Package Meta-Information (see sec. 2.5)  | A     |  x   |  x   |  x   |  x   |
 +------------------------------------------+-------+------+------+------+------+
 | Test Coverage                            | A     |  0%  | 90%  | 95%  | 95%  |
 +------------------------------------------+-------+------+------+------+------+
 | Tests verified by Automated Test Runner  | A     |  x   |  x   |  x   |  x   |
 +------------------------------------------+-------+------+------+------+------+
 | Doctest-based Testing                    | A,Q   |      |  x   |  x   |  x   |
 | (or list reason for not using doctests)  |       |      |      |      |      |
 +------------------------------------------+-------+------+------+------+------+
 | Tests pass on all supported platforms    | A,Q   |      |  x   |  x   |  x   |
 +------------------------------------------+-------+------+------+------+------+
 | Minimal Documentation                    | A,Q   |  x   |  x   |  x   |  x   |
 | (``README.txt`` file)                    |       |      |      |      |      |
 +------------------------------------------+-------+------+------+------+------+
 | Complete Documentation                   | Q     |      |  x   |  x   |  x   |
 | (Text files cover all of API)            |       |      |      |      |      |
 +------------------------------------------+-------+------+------+------+------+
 | Extensive Documentation                  | D     |      |      |      |  x   |
 | (lots of samples, API docs, tutorial)    |       |      |      |      |      |
 +------------------------------------------+-------+------+------+------+------+
 | Documentation available online [1]       | Q     |      |      |  x   |  x   |
 +------------------------------------------+-------+------+------+------+------+
 | Documentation available in Zope's apidoc | Q     |      |      |  x   |  x   |
 +------------------------------------------+-------+------+------+------+------+
 | Common package structure                 | A,Q   |  x   |  x   |  x   |  x   |
 +------------------------------------------+-------+------+------+------+------+
 | Follows Zope Coding Style Guide          | A,D   |      |  x   |  x   |  x   |
 +------------------------------------------+-------+------+------+------+------+
 | Conform to user interface guidelines     | D     |      |      |  x   |  x   |
 | (if applicable to package)               |       |      |      |      |      |
 +------------------------------------------+-------+------+------+------+------+
 | Complete dependency list                 | A     |      |  x   |  x   |  x   |
 +------------------------------------------+-------+------+------+------+------+
 | Standard installation method             | A,Q   |      |      |  x   |  x   |
 +------------------------------------------+-------+------+------+------+------+
 | Release(s) with version number           | A,Q   |      |      |  x   |  x   |
 +------------------------------------------+-------+------+------+------+------+

Re: [Zope3-dev] The Zope Software Certification Program and Common Repository Proposal

2006-02-21 Thread Martijn Faassen

Lennart Regebro wrote:
[snip]

tests (in doctest format)

This seems like a very random requirement to me. I'd like to see
tests that can be run with the standard test-runner; otherwise I don't
see a reason to restrict it. I find doctest great for testing docs
and testing longer use cases. Otherwise I don't like it at all, and
see absolutely no reason to force people to only use doctests.


While I like doctests, I think this is a good point. Tests that work 
with the standard test-runner are indeed a valid minimum requirement. 
Doctests might give you a plus, but they are also in the documentation 
domain, and that gives you a plus anyway.



Packages of this level are considered fit for the Zope 3 core with the
reservation of the core developers to provide or require small improvements.

I'm not sure I understand what you say here. You say that level one
packages are almost good enough to be Zope 3 core, and that the other
levels are good enough to be Zope 3 core, even though they are not?


Perhaps we should leave out talk about inclusion into the Zope 3 core 
for now. After all, the Zope 3 core is going to become less core-ish in 
the future, and whether a package is included depends on more than just 
whether it conforms to the list of requirements - we may want to adopt a 
package that is less conformant but provides great features (and bring it 
up to spec) over one that is very conformant but feature-wise isn't very 
interesting for the core.



 [1] For small packages it will suffice if the documentation is available
 via a Web site of the repository. For projects having a homepage, the
 documentation *must* be available there.





When you say Web site of the repository, do you mean svn access via
http? Because there could be more: we could give each project a small
auto-generated website which contains documentation and releases, in
the style of codespeak. This would force every project to keep the
documentation in the same format, suitable for automatic generation
into HTML and other formats, which I guess is something we would like
anyway. In that case, documentation could live on this project page,
and if you have any other homepage for the project, you could just
link there.


I think auto-generating a website for a project is a great idea! This 
would indeed encourage uniformity in the documentation.


The risk is that someone will have to implement code that does this, 
unless we can recycle the codespeak code.


Regards,

Martijn



Re: [Zope3-dev] The Zope Software Certification Program and Common Repository Proposal

2006-02-21 Thread Stephan Richter
On Tuesday 21 February 2006 05:59, Reinoud van Leeuwen wrote:
 This seems to me a great step forward but I am missing something.
 The quality is measured by a number of metrics, but it seems that nowhere
 is it actually measured whether the software does what it is supposed to
 do, whether it is clear how it works, and whether it is something you
 would advise other people to use.
 This is one of the major problems I've had in the past with things like
 the Perl CPAN repository: you can find lots of modules that seem to solve
 your problem, but usually you only discover what a module is really about
 once you've invested a lot of time in it.

Right, I hear you. In fact, this is one of the reasons we decided against 
a name like Zope Software Quality Program. With the proposed process, we 
cannot guarantee 100% that a package is good. However, there are a couple 
of safeguards:

(1) If you write doctests as a narrative text file, you really have to think 
hard about the functionality your package provides. I cannot stress it 
enough: doctest text files are *key* to the success of the certification 
program.
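
As an illustration of what such a narrative text file looks like, and how it
doubles as a test, here is a minimal self-contained sketch using the stdlib
doctest module (the file name and the tiny document are invented for the
example):

```python
import doctest
import os
import tempfile

# A tiny narrative document whose examples are executable.
README = """\
The ``add`` helper
==================

``add`` returns the sum of two numbers:

    >>> from operator import add
    >>> add(2, 3)
    5
"""

# Write the narrative file and run it the way a test runner would.
with tempfile.TemporaryDirectory() as tmp:
    path = os.path.join(tmp, "README.txt")
    with open(path, "w") as f:
        f.write(README)
    result = doctest.testfile(path, module_relative=False)

print((result.failed, result.attempted))  # -> (0, 2)
```

The same README.txt serves as the package's Minimal Documentation and as part
of its test suite.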

(2) At least in the Common Repository, people will read check-in messages.

(3) At higher certification levels, other people must support the package. 
This is also not 100% bulletproof, but it is something.

Overall, I also expect that the community will have little tolerance for 
malicious attempts to break the system. If someone detects foul play, all 
they have to do is complain on the mailing lists.

 Would there be room for a voting or feedback step in the process, where
 people who have tried the module could enter a rating?

Ah, rating and feedback. :-) This was discussed in the pre-proposal phase as 
well. The problem with feedback and rating is that to do it right, it 
requires a lot of resources that we do not have. Here is a scenario:

1. A user U trashes a package because an important feature F is missing in 
package P. So far so good. It is his right to do so.

2. The package P authors see the comment and fix it in the code. Very cool, 
the process works.

3. But then user U must retract his comment. What if he is not available? 
Not so good. The alternative would be to ask a certification manager, who 
might know nothing about the issue and would need a lot of time to review 
the complaint and solution. Not so good.

While rating and feedback are good, doing them right in a process like the 
ZSCP costs a lot of resources.

Having said that, I see the need to address the issue. Here are two 
suggestions:

1. Add a Future Possibilities section to the proposal collecting ideas for 
later iterations of the process. This would allow us to address some of the 
common concerns and say: If we have time and resources, this can be done.

2. There is already a provision in the process that a package can receive a 
warning. Currently the ZSCP states that a warning can only be issued when a 
package does not fulfill the quality metrics for a given release.

I could add another provision that a warning can be issued if X community 
members and 1 certification manager verify a bad package. Each warning 
carries an arbitrary comment that could describe the reason for the warning. 
This way we can use the existing communication channels (mailing list and 
IRC) for feedback and still have a way to formalize it. I guess in this 
case we would also need a resolve action that could resolve a warning.

What do you think?

Regards,
Stephan
-- 
Stephan Richter
CBU Physics & Chemistry (B.S.) / Tufts Physics (Ph.D. student)
Web2k - Web Software Design, Development and Training



Re: [Zope3-Users] Re: [Zope3-dev] The Zope Software Certification Program and Common Repository Proposal

2006-02-21 Thread Chris McDonough
I hate to cross-post this, but would it be possible to limit this discussion 
to a single list (e.g. zope3-dev, maybe)? I'm interested in this topic, but 
my mail client isn't smart enough to filter it out to only one place, and 
I'm sure there are a lot of other people with the same issue.


- C

On Feb 21, 2006, at 9:45 AM, Stephan Richter wrote:


[snip]






Re: [Zope3-dev] The Zope Software Certification Program and Common Repository Proposal

2006-02-21 Thread Stephan Richter
On Tuesday 21 February 2006 06:09, Lennart Regebro wrote:
 First, about the IP: The idea that we can use the same certification
 process for different repositories and different code owners is
 interesting. In that case, there could be a common
 listing/certification site, covering several repositories that all are
 part of the certification process. I like that better than saying
 that the code has to be owned by the ZF to be certified, because that
 more or less kills the idea of certification in the first place... :)

Right. This is the goal. While I have not stated this publicly yet, I am 
also committed to developing this Web site. In fact, I have already started 
by writing code to process the proposed data in both directions.

 Onto other things:
  tests (in doctest format)

 This seems like a very random requirement to me. I'd like to see
 tests that can be run with the standard test-runner; otherwise I don't
 see a reason to restrict it. I find doctest great for testing docs
 and testing longer use cases. Otherwise I don't like it at all, and
 see absolutely no reason to force people to only use doctests.

Why is it random? It is taken straight from the conventions now used in Zope 3 
for all new development. The rationale behind it is that you are forced to 
document and reason about all the cases the software handles. This, in turn, 
makes you think much harder about your features and makes it easier for other 
developers to understand your code.

If several other people agree with you (and not too many disagree), I will 
change the proposal. But I think it would lower the documentation quality a 
lot.

  Packages of this level are considered fit for the Zope 3 core with the
  reservation of the core developers to provide or require small
  improvements.

 I'm not sure I understand what you say here. You say that level one
 packages are almost good enough to be zope 3 core, and that the other
 levels are good enough to be Zope3 core, even though they are not?

All I am saying is that level 1 (and also the levels above, of course) is good 
enough for the Zope 3 core. The second clause says: if we accept the package 
for the core, we reserve the right to make small improvements. For example, 
level 1 does not require the documentation to be available via apidoc, but 
core packages must provide their docs in apidoc; thus we would have to add 
this when adding the package to the core.

I simplified the sentence structure a bit.

  [1] For small packages it will suffice if the documentation is available
  via a Web site of the repository. For projects having a homepage,
  the documentation *must* be available there.

 When you say Web site of the repository do you mean svn access via
 http?

Yes.

 Because there could be more, we could give each project a small 
 auto-generated website which contains documentation and releases, in
 the way of codespeak. This would force every project to keep the
 documentation in the same format, suitable for automatic generation
 into HTML and other formats, which I guess is something we would like
 anyway. In that case, documentation could be on this project-page
 and if you have any other homepage for the project, you could just
 link there.

Certified packages must follow the Zope 3 coding style guide, which requires 
the package documentation to be in ReST. So HTML generation is a given; in 
fact, the apidoc book module does just that.

The idea is not to make the barrier too high for small packages without 
homepages. On the other hand, the ZSCP site should *not* become a software 
development site. It will only provide pointers to other resources. It is 
very important to keep the focus of the proposal as narrow as possible in 
order to make it feasible to implement. This will not be the last revision of 
the process; if we see the need to provide even more information, we can add 
it later.

  Achieving the first status of being a listed package is
  an automated process.

 I'm not 100% clear on the listed status. A listed package is still
 in the repository, is that right? So anybody with repository access
 can create the package and create the meta data and then list it, is
 that the idea?

Right, this is the general idea. I am not sure whether it will be fully 
automated in the first iteration, but that is certainly the eventual goal. The 
point of a listed package is to say: Look, here I am, and I am trying very 
hard to fulfill the quality guidelines and become certified.

Regards,
Stephan
-- 
Stephan Richter
CBU Physics & Chemistry (B.S.) / Tufts Physics (Ph.D. student)
Web2k - Web Software Design, Development and Training



Re: [Zope3-dev] The Zope Software Certification Program and Common Repository Proposal

2006-02-21 Thread Lennart Regebro
On 2/21/06, Stephan Richter [EMAIL PROTECTED] wrote:
 On Tuesday 21 February 2006 05:59, Reinoud van Leeuwen wrote:
  This seems to me a great step forward but I am missing something.
  The quality is measured by a number of metrics, but it seems nowhere is
  actually measured if the software does what it is supposed to do, if it is
  clear how it works and whether it is something you would advise other
  people to use.
  This is one of the major problems I've had in the past with things like
  the Perl CPAN repository: you can find lots of modules that seem to solve
  your problem, but usually you discover what a module is really about when
  you've invested a lot of time in it.

 Right, I hear you. In fact, this is one of the reasons that we decided against
 a name like Zope Software Quality Program. With the proposed process, we
 cannot guarantee 100% that the package is good. However, there are a couple
 of safeguards:

 (1) If you write doctests as a narrative text file, you really have to think
 hard about the functionality your package provides. I cannot stress it
 enough, doctest text files are *key* to the success of the certification
 program.

More importantly here, doctests give people a good impression of what your 
package can do. So I would like to change the doctest requirement (which says 
that tests should be in doctest format unless that is not possible) to these 
documentation requirements:

1. Having at least one reasonably complete usage example.

2. All code examples in the documentation should be in doctest format
and included as part of the standard test run.

(Maybe you had these in there and I forgot about them, in that case
all is fine. :) )
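
Requirement 2 can be met with the stdlib unittest/doctest integration, so
documentation examples run under the standard test-runner; a sketch (the
`test_suite` convention follows Zope 3, the `double` helper is invented):

```python
import doctest
import sys
import unittest

def double(x):
    """Return twice ``x``.

    The usage example below is documentation and test at once:

        >>> double(21)
        42
    """
    return 2 * x

def test_suite():
    # Expose every docstring example in this module as a unittest suite,
    # so the standard test-runner picks it up like any other test.
    return doctest.DocTestSuite(sys.modules[__name__])

result = unittest.TextTestRunner(verbosity=0).run(test_suite())
assert result.wasSuccessful()
```

Running the suite executes the `>>> double(21)` example exactly as any other
test case.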
--
Lennart Regebro, Nuxeo http://www.nuxeo.com/
CPS Content Management http://www.cps-project.org/



Re: [Zope3-dev] The Zope Software Certification Program and Common Repository Proposal

2006-02-21 Thread Stephan Richter
On Tuesday 21 February 2006 10:33, Lennart Regebro wrote:
 1. Having at least one reasonably complete usage example.

 2. All code examples in the documentation should be in doctest format,
 and included as a part of the standard test-run.

 (Maybe you had these in there and I forgot about them, in that case
 all is fine. :) )

Yes, they are in there. One requirement is called Minimal Documentation 
(README.txt file) and another Doctest-based Testing (or list reason for not 
using doctests). And the standard test-run part is covered by Tests 
verified by Automated Test Runner.

I am glad I was able to resolve your concerns. :-)

Regards,
Stephan
-- 
Stephan Richter
CBU Physics & Chemistry (B.S.) / Tufts Physics (Ph.D. student)
Web2k - Web Software Design, Development and Training



Re: [Zope3-dev] The Zope Software Certification Program and Common Repository Proposal

2006-02-21 Thread Lennart Regebro
Thanks for the answer. I only have one remaining comment, then, about testing:

On 2/21/06, Stephan Richter [EMAIL PROTECTED] wrote:
 Why is it random? It is taken straight from the conventions now used in Zope 3
 for all new development. The rationale behind it is that you are forced to
 document and reason all the cases the software handles.

Most testing I do is not about handling cases at all; it's about
testing specific functionality, making sure that old bugs don't pop
up again, and things like that. Most of these tests would, in doctest
format, provide no documentation at all. For my calendar, many of the
tests do things like migrating data from the old calendar product or
making sure that the attendee source for CPS does what it should, but
they have no educational use in themselves; the API is tested in
CalCore and CalZope, and that is indeed in doctest format (at least
partly). We also have tests to check that the translations are
consistent, and the like. Many tests require setup/teardown
functionality, which is hard to do in the inherently linear format
of doctests. For example, the upgrade tests need to create old
calendars during setup. Each test then fills the calendars with
different types of data and migrates the calendar. This migration
includes installing the new software and replacing a local
utility, which then needs to be undone before the next test. I would
either need to duplicate this code many times, or test all cases in
one fat migration step, which would make it much harder to figure out
exactly what failed.

Simply put: I agree doctest is good for testing use cases and
testing documentation. I don't agree that it is any good for testing
anything else.

It is indeed good practice to start your development by writing use
cases, expanding them into doctests, and using these to drive
development. But once the code has matured past that stage, I see no
reason to require that all tests be doctests, or that you should
need to convince people why doctests are impossible. It may possibly
increase the documentation quality marginally, but it may likewise
lower the testing quality.

--
Lennart Regebro, Nuxeo http://www.nuxeo.com/
CPS Content Management http://www.cps-project.org/
___
Zope3-dev mailing list
Zope3-dev@zope.org
Unsub: http://mail.zope.org/mailman/options/zope3-dev/archive%40mail-archive.com



Re: [Zope3-dev] The Zope Software Certification Program and Common Repository Proposal

2006-02-21 Thread Stephan Richter
On Tuesday 21 February 2006 11:02, Lennart Regebro wrote:
 On 2/21/06, Stephan Richter [EMAIL PROTECTED] wrote:
  Why is it random? It is taken straight from the conventions now used in
  Zope 3 for all new development. The rationale behind it is that you are
  forced to document and reason about all the cases the software handles.

 Most testing I do is not about handling cases at all, it's about
 testing specific functionality, making sure that old bugs don't pop
 up again and stuff like that. Most of these tests would in doctest
 format provide no documentation at all.

Ah, that is in the eye of the beholder. I would argue that bug fixes, for 
example, should either be tested by integrating text into the documentation or 
be tested using a docstring test, as SchoolTool commonly does. The idea is 
not to say: This test has been written to test the fix for issue 12234, but 
rather to write something like: A special case is XYZ, which works as follows:

Any test should provide documentation.

 For my calendar, many of the 
 tests do things like migrate data from the old calendar
 product, making sure that the attendee source for CPS does what it
 should, but it has no educational use in itself,

Well, a doctest could explain the migration test and what has changed. 
SchoolTool, for example, tests its migration scripts using doctests.
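As a generic illustration of the style (a sketch with a hypothetical migrate_event function over plain dicts, not the actual SchoolTool or CPS data structures):

```python
def migrate_event(old):
    """Migrate an old-style event record to the new schema.

    The doctest doubles as a description of the data structure change:
    the old calendar stored a recurrence period string, while the new
    one stores an interval in months.

    >>> old = {'title': 'Staff meeting', 'period': 'period_quarterly'}
    >>> new = migrate_event(old)
    >>> new['title']
    'Staff meeting'
    >>> new['recurrence']['interval']
    3
    """
    intervals = {'period_monthly': 1, 'period_quarterly': 3,
                 'period_yearly': 12}
    return {'title': old['title'],
            'recurrence': {'interval': intervals[old['period']]}}

if __name__ == '__main__':
    import doctest
    doctest.testmod()
```

The point is that anyone reading the docstring learns what changed between the two schemas, not merely that some test passes.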

 the API for it is 
 tested in CalCore and CalZope and that is indeed in DocTest format (at
 least partly), we also have tests to check that the translations are
 consistent, and stuff like that.

But for translation consistency you have a different test setup anyways. 
Doctests are for testing code, not translations or similar things.

 Many tests require setup/teardown 
 functionality, which gets hard to do in the inherently linear format
 of doctests.

I used to think this way too, until Jim convinced me that showing all the setup 
and teardown steps is actually useful to the reader. It shows how much is 
involved in getting a particular piece of code running. Alternatively, you 
can of course define setup and teardown functions when instantiating the 
suite.
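A minimal sketch of that suite-level approach, using only the standard library (the doctest text, fixture functions, and ``registry`` name here are hypothetical; a real project would ship the doctest as a ``.txt`` file):

```python
import doctest
import os
import tempfile
import unittest

# A stand-in for an external doctest file such as migration.txt.
DOCTEST_TEXT = """\
The examples below rely on a ``registry`` name that the suite's setUp
function installs, so the file itself stays linear and focused:

>>> registry['answer']
42
"""

def setUp(test):
    # Called before the examples in the file run; anything placed in
    # test.globs is visible to them as a plain name.
    test.globs['registry'] = {'answer': 42}

def tearDown(test):
    test.globs.clear()

def make_suite(path):
    # module_relative=False lets us pass an absolute path; the same
    # setUp/tearDown keywords also work for doctest.DocTestSuite.
    return doctest.DocFileSuite(path, module_relative=False,
                                setUp=setUp, tearDown=tearDown)

if __name__ == '__main__':
    fd, path = tempfile.mkstemp(suffix='.txt')
    with os.fdopen(fd, 'w') as f:
        f.write(DOCTEST_TEXT)
    try:
        unittest.TextTestRunner().run(make_suite(path))
    finally:
        os.remove(path)
```

Whether hiding the fixture this way helps or hurts the documentation value is exactly the trade-off under discussion.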

 For example, the upgrade tests need to create old 
 calendars at the setup. Each test then fills the calendars with
 different types of data, and migrates the calendar. This migration
 includes installation of the new software and replacing of a local
 utility, which then needs to be undone before the next test. I would
 either need to duplicate this code many times, or test all cases in
 one fat migration step, which would make it much harder to figure out
 exactly what failed.

SchoolTool solves this by simply storing an old ZODB database and then running 
the generation scripts, because this is what happens in the real world anyway.

 It is indeed good practice to start your development by doing
 use cases, expanding them into doctests, and using these to drive
 development. But once the code has matured past that stage, I see no
 reason to require that all tests be doctests, or that you should
 need to convince people why doctests are impossible. It may possibly
 increase the documentation quality marginally, but it may likewise
 lower the testing quality.

I don't think our testing quality has decreased in Zope 3 since we switched 
to pure doctests.

Regards,
Stephan
-- 
Stephan Richter
CBU Physics & Chemistry (B.S.) / Tufts Physics (Ph.D. student)
Web2k - Web Software Design, Development and Training



Re: [Zope3-dev] The Zope Software Certification Program and Common Repository Proposal

2006-02-21 Thread Benji York

Lennart Regebro wrote:
 Thanks for the answer. I only have one remaining comment, then, about
  testing:

 On 2/21/06, Stephan Richter [EMAIL PROTECTED] wrote:

 Why is it random? It is taken straight from the conventions now
 used in Zope 3 for all new development. The rationale behind it is
 that you are forced to document and reason about all the cases the
 software handles.

 Most testing I do is not about handling cases at all, it's about
 testing specific functionality, making sure that old bugs don't pop
  up again and stuff like that. Most of these tests would in doctest
 format provide no documentation at all.

Sure, they don't provide /user/ documentation, but it's been my
experience that using doctests for regression and unit tests provides
better /maintainer/ documentation, greatly enhancing their
maintainability.

 For my calendar, many of the tests do things like migrate
 data from the old calendar product, making sure that the attendee
 source for CPS does what it should, but it has no educational use in
 itself

Again, not for the average user, but if the hypothetical six-month bug
were found and those tests were decently written doctests, it would be
much easier for the maintainer to follow what the test was doing.

 Many tests require setup/teardown functionality, which gets hard to
 do in the inherently linear format of doctests.

That's what the setUp and tearDown methods of the test suite are for.

 Simply put:  I agree doctest is good for testing use cases, and
 testing documentation. I don't agree that it is any good for testing
  anything else.

I (and I think others) will disagree.  Doctest encourages a style of
testing that is superior to old-style unit tests.

 It  may possible increase the documentation quality marginally,

The intent isn't (necessarily) to increase documentation, but to
encourage tests that are easy to understand and maintain.

 but it may likewise lower the testing quality.

I don't see why doctest-formatted unit, functional, regression, or other
tests would be of lower quality.  It certainly hasn't been my
experience.
--
Benji York
Senior Software Engineer
Zope Corporation



Re: [Zope3-dev] The Zope Software Certification Program and Common Repository Proposal

2006-02-21 Thread Lennart Regebro
On 2/21/06, Stephan Richter [EMAIL PROTECTED] wrote:
  Most testing I do is not about handling cases at all, it's about
  testing specific functionality, making sure that old bugs don't pop
  up again and stuff like that. Most of these tests would in doctest
  format provide no documentation at all.

 Ah, that is in the eye of the beholder. I would argue that bug fixes, for
 example, should either be tested by integrating text into the documentation or
 be tested using a docstring test, as SchoolTool commonly does. The idea is
 not to say: This test has been written to test the fix for issue 12234, but
 rather to write something like: A special case is XYZ, which works as follows:

Well, possibly, although I'm not convinced.

 Any test should provide documentation.

That depends on what you mean by provide. Of course you need a
comment explaining why you do the test, and what it tests, but that does
not mean that the test makes sense anywhere in what would normally be
called the documentation.

  For my calendar, many of the
  tests do things like migrate data from the old calendar
  product, making sure that the attendee source for CPS does what it
  should, but it has no educational use in itself,

 Well, a doctest could explain the migration test and what has changed.

Nothing changed. It's two different calendar products. It's
basically an import/export from the old calendar to the new. There is
nothing to explain.

 But for translation consistency you have a different test setup anyways.

Eh, no... I don't think I follow you here. It's a test, it's run by
the test runner. If you are referring to the setUp code, then I
actually don't understand what that has to do with it.

 Doctests are for testing code, not translations or similar things.

Well, then we at least agree on one exception to the rule. :-)

I don't agree doctest is for testing code. I think it's for testing
documentation.

  Many tests require setup/teardown
  functionality, which gets hard to do in the inherently linear format
  of doctests.

 I used to think this way too until Jim convinced me that showing all the setup
 steps and teardown is actually useful to the reader. It shows how much is
 involved in getting a particular piece of code running. Alternatively, you
 can of course define setup and teardown functions when instantiating the
 suite.

That's not the problem; the problem is that I need to run the
setup/teardown many times. I could split the tests I have now so that
each test is a separate doctest file, and use the setup/teardown
during instantiation, but is it really reasonable to have, say, ten
doctest files that do very similar things, all saying things like "The
calendar can migrate from the old calendar even if it has French
accents", and then having the code for this (which is rather long)?
That's not much of a doctest. It is definitely not any sort of
documentation. And it also means that the setup and teardown steps
are NOT useful to the reader. In fact, it is my now repeatedly stated
opinion that several of CPSSharedCalendar's tests are not useful to any
reader. They make sure it actually works; they don't provide any
insight into how to use CPSSharedCalendar. You use it by clicking
around on a website, not by writing Python code.

I think you are stuck in library-development mode. ;-) But this
proposal isn't only made for libraries, right? It seems to me that
several of the requirements have to do with site integration and such,
things you get for finished products that you can install and use.
They need to be documented of course, and part of that documentation
can very well be in doctest format, but most of the documentation that
is needed (and currently sadly missing) is for end-users. A doctest
requirement wouldn't help the documentation in any reasonable way, and
it would slow down the test-writing.


And why should I pretend that the migration tests are documentation of
how to write a migration, when it is in fact a test that migration
works even with non-ASCII characters? There is today no documentation
of how to write migration code from the old calendar to the new,
because that code is already written. I don't need to document how to do it
again. :-)

 SchoolTool solves this by simply storing an old ZODB database and then running
 the generation scripts, because this is what happens in the real world anyway.

Well, I solve it by not using doctests, which I like better, because
it enables me to easily add and change the tests as I find something
new that needs testing, without having to change a binary file which
can't be diffed. And how do you fit the ZODB into the documentation? :-)

Tests are not always documentation. It's nice if they can be both, but
I don't think it's a good idea to require it to be both, because quite
often, it isn't.

--
Lennart Regebro, Nuxeo http://www.nuxeo.com/
CPS Content Management http://www.cps-project.org/

Re: [Zope3-dev] The Zope Software Certification Program and Common Repository Proposal

2006-02-21 Thread Lennart Regebro
On 2/21/06, Benji York [EMAIL PROTECTED] wrote:
 Again, not for the average user, but if the hypothetical six-month bug
 were found and those tests were decently written doctests, it would be
 much easier for the maintainer to follow what the test was doing.

Maybe an example can help. Because I don't understand why...

def testQuarterlyRecurringEvents(self):
    caltool = self.portal.portal_cpscalendar
    mgrcal = caltool.getHomeCalendarObject(manager_id)

    event = Event('quarterly',
                  title="This is a quarterly event",
                  attendees=None,
                  from_date=DateTime(2005, 4, 1, 8, 0),
                  to_date=DateTime(2005, 4, 1, 10, 0),
                  event_type='event_recurring',
                  recurrence_period='period_quarterly')
    mgrcal._setObject('quarterly', event)

    self._upgrade()

    sm = zapi.getUtility(IStorageManager, context=self.portal)
    event = sm.getEvent('quarterly')
    recurrence = event.recurrence

    self.failUnless(interfaces.IMonthlyRecurrenceRule.providedBy(recurrence))
    self.failUnlessEqual(recurrence.interval, 3)
    self.failUnlessEqual(recurrence.until, None)

...is harder to understand than...

  The migration also handles quarterly recurring events

    >>> caltool = self.portal.portal_cpscalendar
    >>> mgrcal = caltool.getHomeCalendarObject(manager_id)
    >>> event = Event('quarterly',
    ...               title="This is a quarterly event",
    ...               attendees=None,
    ...               from_date=DateTime(2005, 4, 1, 8, 0),
    ...               to_date=DateTime(2005, 4, 1, 10, 0),
    ...               event_type='event_recurring',
    ...               recurrence_period='period_quarterly')
    >>> mgrcal._setObject('quarterly', event)
    >>> if 'install_cpssharedcalendar' not in self.portal.objectIds():
    ...     script = ExternalMethod('install_cpssharedcalendar', '',
    ...                             'CPSSharedCalendar.install',
    ...                             'install')
    >>> self.portal._setObject('install_cpssharedcalendar', script)
    >>> script = self.portal['install_cpssharedcalendar']
    >>> script()
    >>> transaction.commit()
    >>> if 'migrate_cpscalendar' not in self.portal.objectIds():
    ...     script = ExternalMethod('migrate_cpscalendar', '',
    ...                             'CPSSharedCalendar.upgrade',
    ...                             'migrate_from_cpscalendar')
    >>> self.portal._setObject('migrate_cpscalendar', script)
    >>> script = self.portal['migrate_cpscalendar']
    >>> script()
    >>> sm = zapi.getUtility(IStorageManager, context=self.portal)
    >>> event = sm.getEvent('quarterly')
    >>> recurrence = event.recurrence
    >>> interfaces.IMonthlyRecurrenceRule.providedBy(recurrence)
    True
    >>> recurrence.interval
    3
    >>> recurrence.until
    None

...for a maintainer. I also completely fail to see how the latter
format gives anybody any extra insight, or how this provides any sort
of documentation.

(btw, through all this, I assume that Jim's fix for the doctest
debugging problem that he mentioned did work, and that you now can
insert an import pdb; pdb.set_trace() in the middle of the doctests.
Right?)

--
Lennart Regebro, Nuxeo http://www.nuxeo.com/
CPS Content Management http://www.cps-project.org/



Re: [Zope3-dev] The Zope Software Certification Program and Common Repository Proposal

2006-02-21 Thread Stephan Richter
On Tuesday 21 February 2006 11:59, Lennart Regebro wrote:
  Well, a doctest could explain the migration test and what has changed.

 Nothing changed. It's two different calendar products. It's
 basically an import/export from the old calendar to the new. There is
 nothing to explain.

If nothing changed, then you need no tests. Of course things changed. The data 
structures changed. And the migration test is a wonderful opportunity to 
document those data structure changes.

 are NOT useful to the reader. In fact, it is my now repeatedly stated
 opinion that several of CPSSharedCalendars tests are not useful to any
 reader. They make sure it actually works, they don't provide any
 insight into how to use CPSSharedCalendar. You use it by clicking
 around on a website, not by writing python code.

Of course, the tests are not useful for the end user, but they are useful for 
other programmers. Let's say tomorrow you decide to leave Nuxeo and another 
programmer has to work on your code. It will be much easier to read doctests 
than unit tests.

 I think you are stuck in library-development mode. ;-)

I don't think so. I am developing SchoolTool and the team there is totally 
committed to doctests. There are no code-based unit tests whatsoever. Not that I 
think that SchoolTool's tests are always perfect, but they get pretty close.

 But this 
 proposal isn't only made for libraries, right? It seems to me that
 several of the requirements have to do with site integration and such,
 things you get for finished products that you can install and use.
 They need to be documented of course, and part of that documentation
 can very well be in doctest format, but most of the documentation that
 is needed (and currently sadly missing) is for end-users. A doctest
 requirement wouldn't help the documentation in any reasonable way, and
 it would slow down the test-writing.

Doctests can be used to address both types of users. I usually find that 
functional tests are for end users and unit/integration doctests for 
developers.

 And why should I pretend that the migration tests are documentation of
 how to write a migration, when it is in fact a test that migration
 works even with non-ASCII characters? There is today no documentation
 of how to write migration code from the old calendar to the new,
 because that code is already written. I don't need to document how to do it
 again. :-)

You are not documenting how to write migration code, but you document how you 
do a particular migration and what data is being migrated. This is important 
to several audiences.

  SchoolTool solves this by simply storing an old ZODB database and then
  run the generation scripts, because this is what happens in the real
  world anyways.

 Well, I solve it by not using doctests, which I like better, because
 it enables me to easily add and change the tests as I find something
 new that needs testing, without having to change a binary file which
 can't be diffed. And how do you fit the ZODB into the documentation? :-)

Yeah, but the cost is that you keep an old setup around, which is not 
realistic.

 Tests are not always documentation. It's nice if they can be both, but
 I don't think it's a good idea to require it to be both, because quite
 often, it isn't.

Tests *should* always be documentation. That's the goal of using doctests. XP 
clearly wants tests and documentation to be the same thing and task. Doctests 
have given us the technology to fulfill this requirement.

Regards,
Stephan
-- 
Stephan Richter
CBU Physics & Chemistry (B.S.) / Tufts Physics (Ph.D. student)
Web2k - Web Software Design, Development and Training



Re: [Zope3-dev] The Zope Software Certification Program and Common Repository Proposal

2006-02-21 Thread Benji York

Lennart Regebro wrote:

On 2/21/06, Benji York [EMAIL PROTECTED] wrote:


Again, not for the average user, but if the hypothetical six-month bug
were found and those tests were decently written doctests, it would be
much easier for the maintainer to follow what the test was doing.


Maybe an example can help. Because I don't understand why...


snip code

I'm not saying that the *format* makes them easier to maintain, I'm 
saying that the format *encourages* the developer to make more 
maintainable tests.  If your second example had the customary 
interstitial paragraphs talking about what's going on, it would be easier 
for a maintainer to understand.



(btw, through all this, I assume that Jim's fix for the doctest
debugging problem that he mentioned did work, and that you now can
insert an import pdb; pdb.set_trace() in the middle of the doctests.
Right?)


Yep.
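For readers following along, the idiom is simply an extra example line in the doctest (a minimal, self-contained sketch; the set_trace call is commented out here so the example can run unattended, and would be uncommented to drop into the debugger at that point):

```python
import doctest

SAMPLE = """\
A debugging session can be started from the middle of a doctest by
adding a pdb call as a regular example line:

>>> data = [3, 1, 2]
>>> # import pdb; pdb.set_trace()  # uncomment to step through from here
>>> sorted(data)
[1, 2, 3]
"""

# Parse and run the doctest text programmatically, the same way the
# test runner would execute a doctest file.
parser = doctest.DocTestParser()
test = parser.get_doctest(SAMPLE, {}, 'sample', 'sample.txt', 0)
runner = doctest.DocTestRunner(verbose=False)
results = runner.run(test)
print(results.failed, results.attempted)
```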
--
Benji York
Senior Software Engineer
Zope Corporation



Re: [Zope3-dev] The Zope Software Certification Program and Common Repository Proposal

2006-02-21 Thread Lennart Regebro
On 2/21/06, Stephan Richter [EMAIL PROTECTED] wrote:
 If nothing changed, then you need no tests. Of course things changed. The data
 structures changed. And the migration test is a wonderful opportunity to
 document those data structure changes.

*sigh* Could you please try to read what I wrote? There is no change.
They are two completely different products, one old, one new. The
data structure did not change; it was completely and fully replaced.
There are no similarities at all, and hence you can't document any
change. You can only document the old structure (done in the old
product) and the new structure (done in the new product).

 Of course, the tests are not useful for the end user, but they are useful for
 other programmers. Let's say tomorrow you decide to leave Nuxeo and another
 programmer has to work on your code. It will be much easier to read doctests
 than unit tests.

No it won't. See my answer to Benji.

 Doctests can be used to address both type of users. I usually find that
 functional tests are for end users and unit/integration doctests for
 developers.

It's perfectly possible to write functional doctests; I assumed you
meant functional tests should be doctests too.

 You are not documenting how to write migration code, but you document how you
 do a particular migration and what data is being migrated.

Uhm, no. Not via the tests, no. The migration code that actually does
the migration, and which is therefore the interesting part of this, is
in a Python script. It's reasonably documented, sure, but the tests do
not, and cannot, give any insight into this, as all the tests do is
call this method.

  Well, I solve it by not using doctests, which I like better, because
  it enables me to easily add and change the tests as I find something
  new that needs testing, without having to change a binary file which
  can't be diffed. And how do you fit the ZODB into the documentation? :-)

 Yeah, but the cost is that you keep an old setup around, which is not
 realistic.

Huh? Ah, you are still talking about upgrading from one version to
another. Well, yes, in that case the ZODB might be a good solution.

  Tests are not always documentation. It's nice if they can be both, but
  I don't think it's a good idea to require it to be both, because quite
  often, it isn't.

 Tests *should* always be documentation.

No, you are wrong, and I have now explained why in several emails and
given an example. I have nothing further to add.

--
Lennart Regebro, Nuxeo http://www.nuxeo.com/
CPS Content Management http://www.cps-project.org/



Re: [Zope3-dev] The Zope Software Certification Program and Common Repository Proposal

2006-02-21 Thread Stephan Richter
On Tuesday 21 February 2006 12:14, Lennart Regebro wrote:
 ...for a maintainer. I also completely fail to see how the latter
 format gives anybody any extra insight, or how this provides any sort
 of documentation.

Of course this does not provide any benefit, because you did not document the 
steps at all. Here is how I would document this:

  The migration also handles quarterly recurring events. The first step is to
create a calendar:

    >>> caltool = self.portal.portal_cpscalendar
    >>> mgrcal = caltool.getHomeCalendarObject(manager_id)

and add an event to it that is repeating quarterly:

    >>> event = Event('quarterly',
    ...               title="This is a quarterly event",
    ...               attendees=None,
    ...               from_date=DateTime(2005, 4, 1, 8, 0),
    ...               to_date=DateTime(2005, 4, 1, 10, 0),
    ...               event_type='event_recurring',
    ...               recurrence_period='period_quarterly')
    >>> mgrcal._setObject('quarterly', event)

[In the following part I do not understand why you have an if statement. That 
smells fishy for a test. In fact, if the if statement is false, the test will 
fail.]
We also need to make sure that the CPS Shared Calendar tool is registered with 
the CMF.

    >>> if 'install_cpssharedcalendar' not in self.portal.objectIds():
    ...     script = ExternalMethod('install_cpssharedcalendar', '',
    ...                             'CPSSharedCalendar.install',
    ...                             'install')
    >>> self.portal._setObject('install_cpssharedcalendar', script)
    >>> script = self.portal['install_cpssharedcalendar']
    >>> script()
    >>> transaction.commit()

Note: It was important to commit the transaction at this point, so that the 
objects are assigned an oid and are correctly indexed.

[This part of the test will also fail if the condition is false; or even 
worse, if the condition is false but the previous condition was true, it 
will fail in unexpected ways, because you reuse the same script variable.]

If a migration script is provided, then install it as a tool as well.

    >>> if 'migrate_cpscalendar' not in self.portal.objectIds():
    ...     script = ExternalMethod('migrate_cpscalendar', '',
    ...                             'CPSSharedCalendar.upgrade',
    ...                             'migrate_from_cpscalendar')
    >>> self.portal._setObject('migrate_cpscalendar', script)
    >>> script = self.portal['migrate_cpscalendar']
    >>> script()

[I have no clue what the storage manager has to do with the calendar. This 
would require some explanation as well.]

Once all the tools are registered, we can use the storage manager utility to 
retrieve the event:

    >>> sm = zapi.getUtility(IStorageManager, context=self.portal)
    >>> event = sm.getEvent('quarterly')

As you can see, the event is recurring every three months (or quarterly):

    >>> recurrence = event.recurrence
    >>> interfaces.IMonthlyRecurrenceRule.providedBy(recurrence)
    True
    >>> recurrence.interval
    3
    >>> recurrence.until
    None

Some comments about the code above. While writing the documentation, I noticed 
that

(1) this test makes no sense, since you are never using the variable `mgrcal`,

(2) a lot of magic is happening, because it is not obvious at all how the 
event goes from the calendar to the storage manager utility.

Even if this test is correct as shown, the two points above would need a lot 
of explaining for someone to understand what's going on there.


 (btw, through all this, I assume that Jim's fix for the doctest
 debugging problem that he mentioned did work, and that you now can
 insert an import pdb; pdb.set_trace() in the middle of the doctests.
 Right?)

This was fixed ages ago. :-)

Regards,
Stephan
-- 
Stephan Richter
CBU Physics & Chemistry (B.S.) / Tufts Physics (Ph.D. student)
Web2k - Web Software Design, Development and Training