Re: Changing the Proton build system to accommodate jni bindings

2013-01-25 Thread Ken Giusti
Thank you Phil, for providing that summary.  And, although I don't want to toss 
around a little gasoline (petrol), I have to ask:


 === System Tests ===
 
 Returning to another discussion point, note that the proton/tests
 folder
 will contain both Python and Java test suites, each of which can be
 run
 against either proton-c or proton-j. Their division of responsibility
 is
 something that will emerge over time, and does not need to be fully
 resolved right now.
 


I'd like to know if the intent is to keep both the Java and Python test suites 
synchronized.  That is, when I write a new python test, am I expected to 
provide a similar test in Java?

If we hold off on that decision, the suites will diverge and getting them 
re-synced will be painful.

-K

- Original Message -
 As promised, here is a proper write-up of how we're planning to
 modify the
 Proton build system.
 
 
 === Requirements ===
 I've updated the Proton build system requirements wiki page:
 https://cwiki.apache.org/confluence/display/qpid/Proton+build+system+requirements
 
 
 === Proposed solution ===
 
 The proposed solution changes Proton in two important ways.
 
 1. proton will be built and distributed as a single project.
 This means that the released source tarball will be created from the
 top
 level, not from the proton-c sub-folder.
 
 To support this, a CMakeLists.txt file will be created at the top
 level.
 Therefore, an existing end user wishing to build proton-c will follow
 the
 usual steps when they receive the tar (e.g. tar xf proton.tar ; mkdir
 proton/build ; cd proton/build ; cmake .. ; make install).
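 [Editor's sketch: a minimal top-level CMakeLists.txt along the lines proposed above. The subdirectory names and the option are assumptions for illustration, not the actual layout.]

```cmake
# Hypothetical top-level CMakeLists.txt for the unified proton tree.
cmake_minimum_required(VERSION 2.8)
project(proton)

# The existing C build is pulled in unchanged, so the familiar
# "cmake .. ; make install" commands keep working from the top level.
add_subdirectory(proton-c)

# The Java side is optional so that Maven-less or JVM-less
# environments can still build proton-c alone.
option(BUILD_JAVA "Build proton-j and the JNI binding" ON)
if(BUILD_JAVA)
  find_package(Java)
  if(JAVA_FOUND)
    add_subdirectory(proton-j)
  endif()
endif()
```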
 
 
 2. Both proton-c and proton-j will be buildable and testable using
 cmake,
 although proton-j will retain Maven support.
 
 Expanding our cmake build to include proton-j solves two problems,
 namely:
 (i) Satisfying the JNI binding's dependency on proton-api.jar and
 libqpid-proton.so.
 (ii) Allowing RHEL and Fedora users to build proton-j despite the
 lack of a
 recent Maven version on their platforms.
 
 The cmake Java build will assume that the developer has already
 downloaded
 dependencies (e.g. bouncycastle.jar for SSL support), and will not
 emulate
 Maven's ability to fetch them from remote repositories. This support
 could
 be added in the future if it is deemed necessary.
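 [Editor's sketch: one way a Maven-free Java build could look is via CMake's UseJava module; every path, target name, and variable below is an illustrative assumption, not the actual plan.]

```cmake
# Hypothetical fragment building proton-j's jars with CMake instead of Maven.
find_package(Java REQUIRED)
include(UseJava)

# Compile the API jar from the checked-in sources.
file(GLOB_RECURSE PROTON_API_SOURCES
     proton-j/proton-api/src/main/java/*.java)
add_jar(proton-api ${PROTON_API_SOURCES})

# Pre-downloaded dependencies are referenced explicitly rather than
# fetched from a remote repository, e.g. (BCPROV_JAR supplied by the user):
# add_jar(proton-j ${PROTON_J_SOURCES}
#         INCLUDE_JARS proton-api ${BCPROV_JAR})
```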
 

 
 Phil
 
 
 On 24 January 2013 21:05, Phil Harvey p...@philharveyonline.com
 wrote:
 
  In case anyone thinks World War 3 is about to break out, an
  approach
  emerged during the Proton task force meeting today that is probably
  acceptable to all the main protagonists.
 
  A brief summary follows. I've tried to avoid too many value
  judgements in
  the summary, to avoid fanning the embers.
 
  - We'll add a cmake file at the top level of the proton project,
  which
  will be able to build all of the Java and C code in one pass. This
  will be
  useful both for building the JNI bindings without undue
  contortions, and
  for building the entire project in Maven-less environments. The
  Maven pom
  files will remain, for use in more mainstream Java deployments.
 
  - No duplication of the proton-api code.
 
  - The source tarball will in the future be generated from the top
  level,
  rather than from the proton-c folder. This avoids the issues
  previously
  discussed whereby JNI bindings can't be built because proton-api is
  missing from the tarball. The new top level cmake file will mean
  that the
  commands required to build proton-c will be unchanged.
 
  I'll write up some proper notes this evening if I get a chance, and
  will
  update the wiki.
 
  Going back to Rajith's point earlier in this discussion, it is of
  course
  important that decisions like this are debated and agreed in
  public, so
  rest assured this will all be written up and Jira'd properly so
  that
  everyone has the opportunity to comment.
 
  Phil
  On Jan 24, 2013 3:01 PM, Rob Godfrey rob.j.godf...@gmail.com
  wrote:
 
  On 24 January 2013 15:49, Rafael Schloming r...@alum.mit.edu
  wrote:
   On Wed, Jan 23, 2013 at 6:44 PM, Rob Godfrey
   rob.j.godf...@gmail.com
  wrote:
  
   Firstly I think it would be helpful if you made clear the
   requirements
  you
   consider to be essential, nice to have,  unimportant and/or
  detrimental.
  
   On 23 January 2013 20:17, Rafael Schloming r...@alum.mit.edu
   wrote:
  
On Wed, Jan 23, 2013 at 8:01 AM, Keith W
keith.w...@gmail.com
  wrote:
   
 
  [snip]
 
Given the above workflow, it seems like even with a
relatively small
   change
like adding a getter, the scripted portion of the syncing
effort is
  going
to be vanishingly small compared to the manual process of
syncing the
implementations. Perhaps I'm just envisioning a different
workflow
  than
you, or maybe I'm missing some important scenarios. Could you
  describe
   what
workflow(s) you envision and how the sync process would
 impact
  your
productivity?
   
   
   I differ strongly 

Re: Changing the Proton build system to accommodate jni bindings

2013-01-25 Thread Darryl L. Pierce
On Fri, Jan 25, 2013 at 12:07:46PM -0500, Ken Giusti wrote:
 I'd like to know if the intent is to keep both the Java and Python test 
 suites synchronized.  That is, when I write a new python test, am I expected 
 to provide a similar test in Java?
 
 If we hold off that decision for later the suites will diverge and getting 
 them re-synced will be painful.

Are the tests using any sort of framework? I've mentioned in the past (and
have it on the back burner of my todos) that we should look into using
Cucumber for tests. It lets you define functional tests in a
language-agnostic way, as a series of named steps. You then write
language-specific implementations of the steps.

When a language is missing a step definition you get a test failure that
outputs the template for the step so you can cut-and-paste it into the
language's definitions and fill it in.
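[Editor's sketch of the Cucumber approach Darryl describes; the feature text and step names are invented for illustration, not taken from any actual Proton suite.]

```gherkin
# One shared feature file; each language implementation supplies
# its own step definitions for these steps.
Feature: Basic message transfer
  Scenario: Send and receive a small message
    Given a messenger listening on an ephemeral port
    When I send a message with body "hello"
    Then the received message body is "hello"
```

A missing step definition in, say, the Java runner would then fail with a generated step template to be filled in, as described above.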

Perhaps it's time to bring that out for us to talk about?

-- 
Darryl L. Pierce, Sr. Software Engineer @ Red Hat, Inc.
Delivering value year after year.
Red Hat ranks #1 in value among software vendors.
http://www.redhat.com/promo/vendor/





Re: Changing the Proton build system to accommodate jni bindings

2013-01-25 Thread Rafael Schloming
On Fri, Jan 25, 2013 at 9:56 AM, Phil Harvey p...@philharveyonline.com wrote:

 As promised, here is a proper write-up of how we're planning to modify the
 Proton build system.


 === Requirements ===
 I've updated the Proton build system requirements wiki page:

 https://cwiki.apache.org/confluence/display/qpid/Proton+build+system+requirements


 === Proposed solution ===

 The proposed solution changes Proton in two important ways.

 1. proton will be built and distributed as a single project.
 This means that the released source tarball will be created from the top
 level, not from the proton-c sub-folder.

 To support this, a CMakeLists.txt file will be created at the top level.
 Therefore, an existing end user wishing to build proton-c will follow the
 usual steps when they receive the tar (e.g. tar xf proton.tar ; mkdir
 proton/build ; cd proton/build ; cmake .. ; make install).


 2. Both proton-c and proton-j will be buildable and testable using cmake,
 although proton-j will retain Maven support.

 Expanding our cmake build to include proton-j solves two problems, namely:
 (i) Satisfying the JNI binding's dependency on proton-api.jar and
 libqpid-proton.so.
 (ii) Allowing RHEL and Fedora users to build proton-j despite the lack of a
 recent Maven version on their platforms.

 The cmake Java build will assume that the developer has already downloaded
 dependencies (e.g. bouncycastle.jar for SSL support), and will not emulate
 Maven's ability to fetch them from remote repositories. This support could
 be added in the future if it is deemed necessary.


Two comments here. First, I think it's important to be clear that as part
of this proposal the root of the tree will emphasize the cmake build.  As
discussed on the call, the README will headline with the cmake build
instructions with the maven build as more of a secondary footnote. The
reasoning here being that the C code depends significantly more on manual
install testing than the Java, and the source code is really the primary
distribution mechanism for C, whereas most people will probably never
bother with the source when consuming the Java code.

Secondly, I suspect it's probably technically feasible (and possibly quite
natural) to structure the build such that an export of the pure Java
portion of the tree would constitute a completely functioning maven build.
This would leave us with the option to produce two source tarballs if we
wished where the Java source tarball was simply a nested subset of the full
source tarball. I don't know that we necessarily need this, but it might be
good to keep the option open if in the future we find the cmake oriented
source tarball is an obstacle for Java users.



 === System Tests ===

 Returning to another discussion point, note that the proton/tests folder
 will contain both Python and Java test suites, each of which can be run
 against either proton-c or proton-j. Their division of responsibility is
 something that will emerge over time, and does not need to be fully
 resolved right now.


As a final note on this point, I don't think this needs to be resolved to
proceed with the build, however I do think it is an important discussion to
have sooner rather than later if a significant investment in new tests is
planned.

--Rafael


Re: Changing the Proton build system to accommodate jni bindings

2013-01-25 Thread Rob Godfrey
On 25 January 2013 18:07, Ken Giusti kgiu...@redhat.com wrote:
 Thank you Phil, for providing that summary.  And, although I don't want to 
 toss around a little gasoline (petrol), I have to ask:


 === System Tests ===

 Returning to another discussion point, note that the proton/tests
 folder
 will contain both Python and Java test suites, each of which can be
 run
 against either proton-c or proton-j. Their division of responsibility
 is
 something that will emerge over time, and does not need to be fully
 resolved right now.



 I'd like to know if the intent is to keep both the Java and Python test 
 suites synchronized.  That is, when I write a new python test, am I expected 
 to provide a similar test in Java?

 If we hold off that decision for later the suites will diverge and getting 
 them re-synced will be painful.

Since all the Python tests will continue to be run against the Java
implementation there will be no need to implement a test in Java if
you write it in Python.

We believe the Python tests will be runnable across more platforms
than the Java (since JVM support is not available on some hardware /
OS combinations); as such, no Python test should be migrated to Java,
since this would reduce coverage.

For reasons of developer efficiency / tool integration, Java tests
have advantages.  Rather than write unit tests which *only* run
against the Java code, we will be writing system tests that run
against the pure Java and the JNI bindings.  This will give us extra
test coverage of the C implementation (and the JNI binding code) in
addition to testing the Java implementation.
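[Editor's sketch of the "one suite, multiple implementations" idea Rob describes. The environment variable and the registry contents are invented for illustration; the real tree would plug in the proton-c binding and the proton-j entry points.]

```python
import os

# Hypothetical registry of engine factories. In the real suite these
# would construct the proton-c swig binding and the proton-j (pure Java
# or JNI) implementations respectively.
IMPLS = {
    "c": lambda: "proton-c engine",
    "java": lambda: "proton-j engine",
}

def make_engine():
    """Pick the implementation under test from an environment variable,
    so the same test body runs unchanged against every implementation."""
    return IMPLS[os.environ.get("PROTON_IMPL", "c")]()
```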

-- Rob


 -K

 - Original Message -
 As promised, here is a proper write-up of how we're planning to
 modify the
 Proton build system.


 === Requirements ===
 I've updated the Proton build system requirements wiki page:
 https://cwiki.apache.org/confluence/display/qpid/Proton+build+system+requirements


 === Proposed solution ===

 The proposed solution changes Proton in two important ways.

 1. proton will be built and distributed as a single project.
 This means that the released source tarball will be created from the
 top
 level, not from the proton-c sub-folder.

 To support this, a CMakeLists.txt file will be created at the top
 level.
 Therefore, an existing end user wishing to build proton-c will follow
 the
 usual steps when they receive the tar (e.g. tar xf proton.tar ; mkdir
 proton/build ; cd proton/build ; cmake .. ; make install).


 2. Both proton-c and proton-j will be buildable and testable using
 cmake,
 although proton-j will retain Maven support.

 Expanding our cmake build to include proton-j solves two problems,
 namely:
 (i) Satisfying the JNI binding's dependency on proton-api.jar and
 libqpid-proton.so.
 (ii) Allowing RHEL and Fedora users to build proton-j despite the
 lack of a
 recent Maven version on their platforms.

 The cmake Java build will assume that the developer has already
 downloaded
 dependencies (e.g. bouncycastle.jar for SSL support), and will not
 emulate
 Maven's ability to fetch them from remote repositories. This support
 could
 be added in the future if it is deemed necessary.



 Phil


 On 24 January 2013 21:05, Phil Harvey p...@philharveyonline.com
 wrote:

  In case anyone thinks World War 3 is about to break out, an
  approach
  emerged during the Proton task force meeting today that is probably
  acceptable to all the main protagonists.
 
  A brief summary follows. I've tried to avoid too many value
  judgements in
  the summary, to avoid fanning the embers.
 
  - We'll add a cmake file at the top level of the proton project,
  which
  will be able to build all of the Java and C code in one pass. This
  will be
  useful both for building the JNI bindings without undue
  contortions, and
  for building the entire project in Maven-less environments. The
  Maven pom
  files will remain, for use in more mainstream Java deployments.
 
  - No duplication of the proton-api code.
 
  - The source tarball will in the future be generated from the top
  level,
  rather than from the proton-c folder. This avoids the issues
  previously
  discussed whereby JNI bindings can't be built because proton-api is
  missing from the tarball. The new top level cmake file will mean
  that the
  commands required to build proton-c will be unchanged.
 
  I'll write up some proper notes this evening if I get a chance, and
  will
  update the wiki.
 
  Going back to Rajith's point earlier in this discussion, it is of
  course
  important that decisions like this are debated and agreed in
  public, so
  rest assured this will all be written up and Jira'd properly so
  that
  everyone has the opportunity to comment.
 
  Phil
  On Jan 24, 2013 3:01 PM, Rob Godfrey rob.j.godf...@gmail.com
  wrote:
 
  On 24 January 2013 15:49, Rafael Schloming r...@alum.mit.edu
  wrote:
   On Wed, Jan 23, 2013 at 6:44 PM, Rob Godfrey
   rob.j.godf...@gmail.com
  wrote:
  
   Firstly I think it would be helpful 

Re: Changing the Proton build system to accommodate jni bindings

2013-01-24 Thread Rob Godfrey
On 24 January 2013 14:43, Rafael Schloming r...@alum.mit.edu wrote:
 On Wed, Jan 23, 2013 at 6:10 PM, Rob Godfrey rob.j.godf...@gmail.com wrote:

 On 23 January 2013 19:09, Rafael Schloming r...@alum.mit.edu wrote:

  I've added another wiki page that documents the proton release steps as
  best I can remember. I'll update it more during the 0.4 release:
  https://cwiki.apache.org/confluence/display/qpid/Proton+Release+Steps
 
  I think it's important to understand the overall release and testing
  process as it is a significant and perhaps underrepresented factor
 against
  which to measure any proposals. I believe the build system requirements
  documented below are inherently incomplete as they don't recognize the
 fact
  that the C build system is not just a developer productivity tool, it is
  also the installer for our end users. And before anyone says our end
 users
  will just use yum or equivalents, all those packaging tools *also* depend
  on our build system both directly, and because we can't even supply a
  release for packagers to consume without a reasonable amount of direct
  install testing. To a good extent a standard looking C source tarball is
  pretty much the equivalent of a jar or jar + pom file in the Java world,
  it's really the only platform independent means of distribution we have.
 
 
 It would be helpful if you could enumerate requirements which you believe
 to be missing and add them to the existing wiki page.  I don't think anyone
 is suggesting that the make install step should be broken in the source
 tarball, so it's a little unclear to me the problem you are trying to
 highlight above.


 I believe it was suggested at one point that we not have a C source tarball
 but just export the entire tree as a single source tarball. This strictly
 speaking would not break the make install step, however it would have a
 serious impact on our ability to leverage others to test the C impl. Anyone
 downloading this would need to understand a great deal about the dual
 nature of proton and how it is structured just in order to know that they
 can ignore half the tree. Compare that with a standard C source tarball
 where I can hand it off to someone who knows nothing about proton and
 simply tell them to do a make install and then run one test script. Given
 the latter structure to our release artifacts there are *significantly*
 more resources we have access to in order to perform the testing necessary
 to do a quality release.


I'm not sure requiring the ability to read a README file is really
going to have a huge impact on our ability to leverage others. I'm not
sure how widespread CMake use is (it's certainly less familiar to me
than autotools) - certainly I expect people unfamiliar with cmake will
have to read the README anyway.



 I'll take a stab at distilling some requirements out of the above scenario
 and sticking them onto the wiki page, but I actually think the scenario
 itself is more important than the requirements. There's no disagreement
 that it would be nice to have a very standard looking C source tarball with
 minimal dependencies and so forth that can be used in the above manner,
 it's simply the relative priority of the requirement when it conflicts with
 developer convenience that is a source of contention.



  It's also probably worth noting that perhaps the biggest issue with
 system
  tests in Java is not so much imposing maven on proton-c developers, but
 the
  fact that Java may not be available on all the platforms that proton-c
  needs to be tested on. My primary concern here would be iOS. I'm not an
  expert, but my brief googling seems to suggest there would be significant
  issues.
 
 
 So, I think we probably need to consider what sort of tests are required,
 and which languages it is appropriate to write any particular type of test
 in.  For me tests in Java have some advantages over Python tests. Firstly
 they allow interop tests between the two implementations within the same
 process


 Can you elaborate on the benefits of this? It seems to me when it comes to
 interop testing that, to the extent you can get away with it, over the wire
 tests would be preferred. For example you could run proton-c on iOS via
 Java tests running on another system, to say nothing of testing against non
 proton implementations which would necessarily need to be over-the-wire.


The same arguments could be made about our existing tests. However
being able to run in-process makes things much easier to automate - no
need to randomly assign ports, communicate randomly assigned ports to
the other process, etc.
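[Editor's sketch of the port-coordination cost mentioned above: even when the OS hands out a free ephemeral port, that number still has to be communicated to the peer process, which is exactly the overhead in-process tests avoid.]

```python
import socket

def free_port():
    """Ask the OS for an ephemeral port by binding to port 0.
    The returned number must then be passed to the other process
    (and may in principle be reused before the peer binds it)."""
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.bind(("127.0.0.1", 0))
    port = s.getsockname()[1]
    s.close()
    return port
```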


 and secondly they will also be able to be used against any future
 pure JavaScript Proton implementation (something we have planned to do but
 not yet embarked upon).


 This is also true of python tests. In fact the whole point of a python test
 suite is that you can run python, java, and javascript all within 

Re: Changing the Proton build system to accommodate jni bindings

2013-01-24 Thread Rafael Schloming
On Wed, Jan 23, 2013 at 6:44 PM, Rob Godfrey rob.j.godf...@gmail.com wrote:

 Firstly I think it would be helpful if you made clear the requirements you
 consider to be essential, nice to have,  unimportant and/or detrimental.

 On 23 January 2013 20:17, Rafael Schloming r...@alum.mit.edu wrote:

  On Wed, Jan 23, 2013 at 8:01 AM, Keith W keith.w...@gmail.com wrote:
 
   Essential
  
   3. To change proton-api, all that is required is to edit a Java file.
   - Developer productivity
  
 
  This seems to be kind of a leading requirement so to speak, or at least
  it's phrased a little bit oddly. That said I would never argue with it
 for
  most of the Java files, however in the case of the API files I don't see
  how you're ever going to be able to stop after just editing the API.
  Because we have two implementations, we're fundamentally stuck with
  manually syncing the implementations themselves whenever a change to the
  interface occurs. By comparison the highly automatable task of syncing
 the
  API files themselves seems quite small. I'm imagining most changes would
 go
  something like this, say we want to add a getter to the Message
 interface,
  we would need to:
 
 
 I think it's worth considering two different cases

 1) The API change is purely on the Java side... there is no corresponding
 change to the C API.  This may be to add some sort of convenience method,
 or simply a refactoring.

 In this case the developer making the change needs only to work in Java,
 there will be two implementations of the interface to change (in two
 different source locations) but it is all rather trivial.


Is this actually possible? Wouldn't you at least need to build/run the C so
that you know there is actually no impact on the C impl? Even if you're
just calling it differently it could tickle a bug.

2) The API change affects both C and Java.

 In this case either a single developer has to commit to making the change
 in both the C and the Java, or the API change has to have been discussed
 before work commences and Java and C developers will need to work
 together.  If there is a single developer or developers working very
 closely together then I would suggest that the steps would in fact be:

   1. edit the Message interface /  edit the message.h file
   2. write and/or modify a test (and Python binding if necessary)
   3. edit the JNI binding to use the SWIG generated API
   4. edit the C / Pure Java
   5. run the tests against the C / Java
   (6. modify other bindings if necessary)

   repeat steps 4 and 5 until they pass.

 In the case where the C and Java developers are separated by time/distance
 then the build / tests on one side will be broken until the implementation
 catches up.  For the sake of politeness it is probably better to ensure
 that at all points the checked in code compiles even if the tests do not
 pass.  For cases where the changes to the API are additions then it should
 be relatively easy to make the changes in such a way as to simply have any
 tests relating to the new API be skipped. For cases where the C leads the
 Java, the Java implementation can simply throw
 UnsupportedOperationException or some such.  Where the Java leads the C we
 can throw said exception from the JNI binding code and leave the .h file
 unchanged until the C developer is ready to do the work.
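 [Editor's sketch of the stubbing pattern described above, for the case where the Java API leads the C implementation. The class and method names are invented for illustration.]

```java
// Hypothetical JNI-backed implementation whose native counterpart
// does not exist yet: the binding throws until the C work is done,
// letting the build stay consistent while tests for it are skipped.
public class JniMessageStub {
    public String getUserId() {
        // The corresponding C function is not yet implemented, so the
        // binding signals that rather than calling into libqpid-proton.
        throw new UnsupportedOperationException(
            "getUserId: pending C implementation");
    }

    // Helper showing how a test harness could detect the stub and skip.
    public static boolean isPending(JniMessageStub m) {
        try {
            m.getUserId();
            return false;
        } catch (UnsupportedOperationException e) {
            return true;
        }
    }
}
```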

 Only for cases where there is modification to existing APIs does it seem
 that there may be occasions where we could not have a consistent build
 across components, and I would strongly recommend that any change where the
 Java and C are being worked on in such a fashion should take place on a
 branch, with a merge to trunk only occurring when all tests are passing
 against all implementations.


1. edit the Message interface
2. write and/or possibly modify a test
3. edit the java Message implementation
4. run the tests against java, if they don't pass go to step 2
5. now that the java impl passes the tests, run the tests against the C
  impl
6. if the sync check fails on the C build, run the sync script
7. edit the message.h file
8. edit the message.c implementation
9. edit the adapter layer between the C API and the Java interfaces
10. run the tests against the C, if they don't pass go to step 8
11. run the tests against both, just to be sure
12. check in
 
  Given the above workflow, it seems like even with a relatively small
 change
  like adding a getter, the scripted portion of the syncing effort is going
  to be vanishingly small compared to the manual process of syncing the
  implementations. Perhaps I'm just envisioning a different workflow than
  you, or maybe I'm missing some important scenarios. Could you describe
 what
  workflow(s) you envision and how the sync process would impact your
  productivity?
 
 
 I differ strongly in my opinion here. Every time I need to drop out of my
 development environment to run some ad-hoc script then there is 

Re: Changing the Proton build system to accommodate jni bindings

2013-01-23 Thread Keith W
 What are people's views on the relative priority of these requirements?
 Are there any I've missed?  I think answering these questions is a
 prerequisite for agreeing the technical solution.

With the aim of stimulating discussion regarding our requirements and
to reach a consensus, I've classified each of the proposed
requirements into whether I believe each is essential, neutral or
detrimental to the smooth development of Proton.

(proposed requirement numbers from
https://cwiki.apache.org/confluence/display/qpid/Proton+build+system+requirements
)

Essential

3. To change proton-api, all that is required is to edit a Java file.
- Developer productivity

4. To switch to a particular SVN revision, simple SVN commands are run
(e.g. svn switch or svn update)
- Developer productivity

5. proton-c can be built, excluding its JNI binding, without requiring
non-standard tools*
6. proton-c can be built, excluding its JNI binding, from a standalone
checkout of the proton-c directory
- Developer productivity / tool familiarity

Neutral

1. A tarball source release of proton-c can be built by a user
without an external dependency on any other part of proton, e.g.
proton-api.
2. The aforementioned proton-c tarball release can be produced by
performing a simple svn export of proton-c.
- If I were building proton-c for my platform from a tarball, I would
also want to run the tests to be sure proton-c functions correctly.
For this reason I question the usefulness of a proton-c tarball.  I
would want a tarball that included the whole tree including the tests.

7. Proton-c can be built without requiring non-standard tools*
9. Proton-c can be tested without requiring non-standard tools*
 - If we can achieve this without introducing too much complexity,
reinventing too many wheels and the result is portable across all
target platforms.

Detrimental

8. proton-c can be built from a standalone checkout of the proton-c
directory
 - I think that all proton developers who are changing either the C or
Java implementations should be running the system tests before each
commit.  If they are changing system tests then they need to run
against both implementations before each commit.

On 22 January 2013 17:09, Rafael Schloming r...@alum.mit.edu wrote:
 Thanks for posting this, I think it's a very useful step. I'd suggest
 adding another Stakeholder -- someone testing a release artifact. Rob makes
 a good point that the release manager is a distinct view, but I think the
 desire to minimize deltas between the svn tree and the release artifacts is
 most directly motivated by my experience *testing* release artifacts. I
 remember going through qpid releases in the old days and having the very
 unpleasant experience of trying to remember from 8 or 10 months ago how
 exactly stuff worked in the release artifact as compared to the build tree.
 I very much like the fact that with a simple export I can be highly
 confident that my experience of stuff working in my checkout translates
 well to the release artifacts and testing them is a very familiar, quick,
 and easy process.

 Strictly speaking I think the requirement from a release management
 perspective is purely that we can produce releases at the rate we need, so
 it has to be quick and easy and robust to different environments, but I
 wouldn't say the export thing is a requirement of the release manager
 per/se. As many have pointed out we already use a script for this and it
 can remap things quite easily.

 I have more thoughts on the release process, especially as it is somewhat
 expanded now to produce java binaries and will need to expand more to
 include windows stuff, however I need to run an errand at the moment. I'll
 post and/or comment on the page later though.

 --Rafael


 On Tue, Jan 22, 2013 at 11:43 AM, Phil Harvey 
 p...@philharveyonline.com wrote:

 It sounds like we're still a little way away from reaching a consensus.  As
 a step towards this, I would like to clarify the relative priority of the
 various requirements that have come up.  I've therefore created a page on
 the wiki that lists them, with a child page briefly describing the various
 proposals.


 https://cwiki.apache.org/confluence/display/qpid/Proton+build+system+requirements

 What are people's views on the relative priority of these requirements?
 Are there any I've missed?  I think answering these questions is a
 prerequisite for agreeing the technical solution.

 Phil


Re: Changing the Proton build system to accommodate jni bindings

2013-01-23 Thread Rafael Schloming
I've added another wiki page that documents the proton release steps as
best I can remember. I'll update it more during the 0.4 release:
https://cwiki.apache.org/confluence/display/qpid/Proton+Release+Steps

I think it's important to understand the overall release and testing
process as it is a significant and perhaps underrepresented factor against
which to measure any proposals. I believe the build system requirements
documented below are inherently incomplete as they don't recognize the fact
that the C build system is not just a developer productivity tool, it is
also the installer for our end users. And before anyone says our end users
will just use yum or equivalents, all those packaging tools *also* depend
on our build system both directly, and because we can't even supply a
release for packagers to consume without a reasonable amount of direct
install testing. To a good extent a standard looking C source tarball is
pretty much the equivalent of a jar or jar + pom file in the Java world,
it's really the only platform independent means of distribution we have.

It's also probably worth noting that perhaps the biggest issue with system
tests in Java is not so much imposing maven on proton-c developers, but the
fact that Java may not be available on all the platforms that proton-c
needs to be tested on. My primary concern here would be iOS. I'm not an
expert, but my brief googling seems to suggest there would be significant
issues.

--Rafael

On Wed, Jan 23, 2013 at 12:45 PM, Phil Harvey p...@philharveyonline.com wrote:

 In case anyone has missed it, note that Gordon has added some relevant
 comments directly on the wiki pages:


 https://cwiki.apache.org/confluence/display/qpid/Proton+build+system+requirements

 https://cwiki.apache.org/confluence/display/qpid/Proton+build+system+proposals

 Phil


 On 23 January 2013 13:01, Keith W keith.w...@gmail.com wrote:

   What are people's views on the relative priority of these
 requirements?
   Are there any I've missed?  I think answering these questions is a
   prerequisite for agreeing the technical solution.
 
  With the aim of stimulating discussion regarding our requirements and
  to reach a consensus, I've classified each of the proposed
  requirements into whether I believe each is essential, neutral or
  detrimental to the smooth development of Proton.
 
  (proposed requirement numbers from
 
 
 https://cwiki.apache.org/confluence/display/qpid/Proton+build+system+requirements
  )
 
  Essential
 
  3. To change proton-api, all that is required is to edit a Java file.
  - Developer productivity
 
  4. To switch to a particular SVN revision, simple SVN commands are run
  (e.g. svn switch or svn update)
  - Developer productivity
 
  5. proton-c can be built, excluding its JNI binding, without requiring
  non-standard tools*
  6. proton-c can be built, excluding its JNI binding, from a standalone
  checkout of the proton-c directory
  - Developer productivity / tool familiarity
 
  Neutral
 
  1. A tarball source release of proton-c can be built by a user
  without an external dependency on any other part of proton, e.g.
  proton-api.
  2. The aforementioned proton-c tarball release can be produced by
  performing a simple svn export of proton-c.
  - If I were building proton-c for my platform from a tarball, I would
  also want to run the tests to be sure proton-c functions correctly.
  For this reason I question the usefulness of a proton-c tarball.  I
  would want a tarball that included the whole tree including the tests.
 
  7. Proton-c can be built without requiring non-standard tools*
  9. Proton-c can be tested without requiring non-standard tools*
   - If we can achieve this without introducing too much complexity,
  reinventing too many wheels and the result is portable across all
  target platforms.
 
  Detrimental
 
  8. proton-c can be built from a standalone checkout of the proton-c
  directory
   - I think that all proton developers who are changing either the C or
  Java implementations should be running the system tests before each
  commit.  If they are changing system tests then they need to run
  against both implementations before each commit.
 
  On 22 January 2013 17:09, Rafael Schloming r...@alum.mit.edu wrote:
   Thanks for posting this, I think it's a very useful step. I'd suggest
   adding another Stakeholder -- someone testing a release artifact. Rob
  makes
   a good point that the release manager is a distinct view, but I think
 the
   desire to minimize deltas between the svn tree and the release
 artifacts
  is
   most directly motivated by my experience *testing* release artifacts. I
   remember going through qpid releases in the old days and having the
 very
   unpleasant experience of trying to remember from 8 or 10 months ago how
   exactly stuff worked in the release artifact as compared to the build
  tree.
   I very much like the fact that with a simple export I can be highly
   confident that my experience of stuff 

Re: Changing the Proton build system to accommodate jni bindings

2013-01-23 Thread Rafael Schloming
On Wed, Jan 23, 2013 at 8:01 AM, Keith W keith.w...@gmail.com wrote:

 Essential

 3. To change proton-api, all that is required is to edit a Java file.
 - Developer productivity


This seems to be kind of a leading requirement so to speak, or at least
it's phrased a little bit oddly. That said I would never argue with it for
most of the Java files, however in the case of the API files I don't see
how you're ever going to be able to stop after just editing the API.
Because we have two implementations, we're fundamentally stuck with
manually syncing the implementations themselves whenever a change to the
interface occurs. By comparison the highly automatable task of syncing the
API files themselves seems quite small. I'm imagining most changes would go
something like this: say we want to add a getter to the Message interface,
we would need to:

  1. edit the Message interface
  2. write and/or possibly modify a test
  3. edit the java Message implementation
  4. run the tests against java, if they don't pass go to step 2
  5. now that the java impl passes the tests, run the tests against the C
impl
  6. if the sync check fails on the C build, run the sync script
  7. edit the message.h file
  8. edit the message.c implementation
  9. edit the adapter layer between the C API and the Java interfaces
  10. run the tests against the C, if they don't pass go to step 8
  11. run the tests against both, just to be sure
  12. check in
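Steps 1, 3, and 4 of that workflow might look roughly like this in code. This is only a sketch: the Message interface, the getSubject getter, and the class names here are invented for illustration and are not the real proton-api.

```java
// Minimal sketch of steps 1 and 3 above. The Message interface is a
// stand-in for proton-api; the getter is hypothetical.
interface Message {
    String getSubject(); // step 1: add the new (hypothetical) getter
}

// Step 3: the pure-Java implementation is updated to satisfy the new API.
class MessageImpl implements Message {
    private String subject = "";

    @Override
    public String getSubject() {
        return subject;
    }

    void setSubject(String subject) {
        this.subject = subject;
    }
}

public class WorkflowSketch {
    public static void main(String[] args) {
        // Step 4 in miniature: exercise the new method against the Java impl.
        MessageImpl m = new MessageImpl();
        m.setSubject("hello");
        if (!"hello".equals(m.getSubject())) {
            throw new AssertionError("getter round-trip failed");
        }
        System.out.println("java impl passes");
    }
}
```

Steps 5 through 11 would then repeat the same checks against the C implementation through the JNI adapter, which is where the manual syncing cost in the argument above actually lies.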

Given the above workflow, it seems like even with a relatively small change
like adding a getter, the scripted portion of the syncing effort is going
to be vanishingly small compared to the manual process of syncing the
implementations. Perhaps I'm just envisioning a different workflow than
you, or maybe I'm missing some important scenarios. Could you describe what
workflow(s) you envision and how the sync process would impact your
productivity?


 4. To switch to a particular SVN revision, simple SVN commands are run
 (e.g. svn switch or svn update)
 - Developer productivity

 5. proton-c can be built, excluding its JNI binding, without requiring
 non-standard tools*
 6. proton-c can be built, excluding its JNI binding, from a standalone
 checkout of the proton-c directory
 - Developer productivity / tool familiarity

 Neutral

 1. A tarball source release of proton-c can be built by a user
 without an external dependency on any other part of proton, e.g.
 proton-api.
 2. The aforementioned proton-c tarball release can be produced by
 performing a simple svn export of proton-c.
 - If I were building proton-c for my platform from a tarball, I would
 also want to run the tests to be sure proton-c functions correctly.
 For this reason I question the usefulness of a proton-c tarball.  I
 would want a tarball that included the whole tree including the tests.


The proton-c tarball does include the tests directory. The tests directory
is just pure python code, so once you've installed proton-c onto your
system, you can run any of the proton tests just like you would run any
normal python script. As I mentioned in another post, the inclusion of
tests under both proton-c and proton-j is the one deviation in directory
structure from a pure svn export, and even this much is kind of a pain as
there is no way for the README to actually describe things properly without
being broken in either the svn tree or in the release artifact.



 7. Proton-c can be built without requiring non-standard tools*
 9. Proton-c can be tested without requiring non-standard tools*
  - If we can achieve this without introducing too much complexity,
 reinventing too many wheels and the result is portable across all
 target platforms.

 Detrimental

 8. proton-c can be built from a standalone checkout of the proton-c
 directory
  - I think that all proton developers who are changing either the C or
 Java implementations should be running the system tests before each
 commit.  If they are changing system tests then they need to run
 against both implementations before each commit.


Doesn't this conflict pretty directly with 6?


Re: Changing the Proton build system to accommodate jni bindings

2013-01-23 Thread Rob Godfrey
On 23 January 2013 19:09, Rafael Schloming r...@alum.mit.edu wrote:

 I've added another wiki page that documents the proton release steps as
 best I can remember. I'll update it more during the 0.4 release:
 https://cwiki.apache.org/confluence/display/qpid/Proton+Release+Steps

 I think it's important to understand the overall release and testing
 process as it is a significant and perhaps underrepresented factor against
 which to measure any proposals. I believe the build system requirements
 documented below are inherently incomplete as they don't recognize the fact
 that the C build system is not just a developer productivity tool, it is
 also the installer for our end users. And before anyone says our end users
 will just use yum or equivalents, all those packaging tools *also* depend
 on our build system both directly, and because we can't even supply a
 release for packagers to consume without a reasonable amount of direct
 install testing. To a good extent a standard looking C source tarball is
 pretty much the equivalent of a jar or jar + pom file in the Java world,
 it's really the only platform independent means of distribution we have.


It would be helpful if you could enumerate requirements which you believe
to be missing and add them to the existing wiki page.  I don't think anyone
is suggesting that the make install step should be broken in the source
tarball, so it's a little unclear to me what problem you are trying to
highlight above.


 It's also probably worth noting that perhaps the biggest issue with system
 tests in Java is not so much imposing maven on proton-c developers, but the
 fact that Java may not be available on all the platforms that proton-c
 needs to be tested on. My primary concern here would be iOS. I'm not an
 expert, but my brief googling seems to suggest there would be significant
 issues.


So, I think we probably need to consider what sort of tests are required,
and which languages it is appropriate to write any particular type of test
in.  For me tests in Java have some advantages over Python tests. Firstly
they allow interop tests between the two implementations within the same
process and secondly they will also be able to be used against any future
pure JavaScript Proton implementation (something we have planned to do but
not yet embarked upon).  A third issue for me is that when we start to
attempt more granular testing of things such as error handling, I will want
to ensure that the user experience is identical between the pure Java and
JNI binding implementations of the Java Proton API... if the tests are
being run through a second translation into the Python API then this is not
easily verifiable.

As a final aside, on the standard development environment many of us have
to work with, the installed version of Python is too old to support the
current Python client (lack of UUID, etc).

Personally I think the more tests we have the better, and it's more
important to encourage people to write tests than to force the use of a
particular language to write them in.  I'd also suggest that we should be
writing at least some tests for each of the idiomatic bindings.

-- Rob

--Rafael

 On Wed, Jan 23, 2013 at 12:45 PM, Phil Harvey p...@philharveyonline.com
 wrote:

  In case anyone has missed it, note that Gordon has added some relevant
  comments directly on the wiki pages:
 
 
 
 https://cwiki.apache.org/confluence/display/qpid/Proton+build+system+requirements
 
 
 https://cwiki.apache.org/confluence/display/qpid/Proton+build+system+proposals
 
  Phil
 
 
  On 23 January 2013 13:01, Keith W keith.w...@gmail.com wrote:
 
What are people's views on the relative priority of these
  requirements?
Are there any I've missed?  I think answering these questions is a
prerequisite for agreeing the technical solution.
  
   With the aim of stimulating discussion regarding our requirements and
   to reach a consensus, I've classified each of the proposed
   requirements into whether I believe each is essential, neutral or
   detrimental to the smooth development of Proton.
  
   (proposed requirement numbers from
  
  
 
 https://cwiki.apache.org/confluence/display/qpid/Proton+build+system+requirements
   )
  
   Essential
  
   3. To change proton-api, all that is required is to edit a Java file.
   - Developer productivity
  
   4. To switch to a particular SVN revision, simple SVN commands are run
   (e.g. svn switch or svn update)
   - Developer productivity
  
   5. proton-c can be built, excluding its JNI binding, without requiring
   non-standard tools*
   6. proton-c can be built, excluding its JNI binding, from a standalone
   checkout of the proton-c directory
   - Developer productivity / tool familiarity
  
   Neutral
  
   1. A tarball source release of proton-c can be built by a user
   without an external dependency on any other part of proton, e.g.
   proton-api.
   2. The aforementioned proton-c tarball release can be produced by
   performing a 

Re: Changing the Proton build system to accommodate jni bindings

2013-01-23 Thread Rob Godfrey
Firstly I think it would be helpful if you made clear the requirements you
consider to be essential, nice to have,  unimportant and/or detrimental.

On 23 January 2013 20:17, Rafael Schloming r...@alum.mit.edu wrote:

 On Wed, Jan 23, 2013 at 8:01 AM, Keith W keith.w...@gmail.com wrote:

  Essential
 
  3. To change proton-api, all that is required is to edit a Java file.
  - Developer productivity
 

 This seems to be kind of a leading requirement so to speak, or at least
 it's phrased a little bit oddly. That said I would never argue with it for
 most of the Java files, however in the case of the API files I don't see
 how you're ever going to be able to stop after just editing the API.
 Because we have two implementations, we're fundamentally stuck with
 manually syncing the implementations themselves whenever a change to the
 interface occurs. By comparison the highly automatable task of syncing the
 API files themselves seems quite small. I'm imagining most changes would go
 something like this, say we want to add a getter to the Message interface,
 we would need to:


I think it's worth considering two different cases

1) The API change is purely on the Java side... there is no corresponding
change to the C API.  This may be to add some sort of convenience method,
or simply a refactoring.

In this case the developer making the change needs only to work in Java,
there will be two implementations of the interface to change (in two
different source locations) but is all rather trivial.

2) The API change affects both C and Java.

In this case either a single developer has to commit to making the change
in both the C and the Java, or the API change has to have been discussed
before work commences and Java and C developers will need to work
together.  If there is a single developer or developers working very
closely together then I would suggest that the steps would in fact be:

  1. edit the Message interface /  edit the message.h file
  2. write and/or modify a test (and Python binding if necessary)
  3. edit the JNI binding to use the SWIG generated API
  4. edit the C / Pure Java
  5. run the tests against the C / Java
  (6. modify other bindings if necessary)

  repeat steps 4 and 5 until they pass.

In the case where the C and Java developers are separated by time/distance
then the build / tests on one side will be broken until the implementation
catches up.  For the sake of politeness it is probably better to ensure
that at all points the checked in code compiles even if the tests do not
pass.  For cases where the changes to the API are additions then it should
be relatively easy to make the changes in such a way as to simply have any
tests relating to the new API be skipped. For cases where the C leads the
Java, the Java implementation can simply throw
UnsupportedOperationException or some such.  Where the Java leads the C we
can throw said exception from the JNI binding code and leave the .h file
unchanged until the C developer is ready to do the work.
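The "Java leads the C" case described above can be sketched as follows. All names here are illustrative assumptions, not the real proton classes, and the real JNI binding would be generated via SWIG rather than hand-written like this:

```java
// Sketch of the stub approach described above: when the Java API gains a
// method before proton-c does, the JNI-backed implementation throws rather
// than pretending to work, so tests for the new API can be skipped or
// expected to fail cleanly. Names are hypothetical.
interface Message {
    String getSubject(); // new method, not yet present in message.h
}

class JniMessage implements Message {
    @Override
    public String getSubject() {
        // Fail loudly until the C side catches up.
        throw new UnsupportedOperationException(
                "getSubject() is not yet implemented in proton-c");
    }
}

public class StubSketch {
    public static void main(String[] args) {
        Message m = new JniMessage();
        try {
            m.getSubject();
            throw new AssertionError("expected the stub to throw");
        } catch (UnsupportedOperationException expected) {
            System.out.println("stub throws as intended");
        }
    }
}
```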

Only for cases where there is modification to existing APIs does it seem
that there may be occasions where we could not have a consistent build
across components, and I would strongly recommend that any change where the
Java and C are being worked on in such a fashion should take place on a
branch, with a merge to trunk only occurring when all tests are passing
against all implementations.


   1. edit the Message interface
   2. write and/or possibly modify a test
   3. edit the java Message implementation
   4. run the tests against java, if they don't pass go to step 2
   5. now that the java impl passes the tests, run the tests against the C
 impl
   6. if the sync check fails on the C build, run the sync script
   7. edit the message.h file
   8. edit the message.c implementation
   9. edit the adapter layer between the C API and the Java interfaces
   10. run the tests against the C, if they don't pass go to step 8
   11. run the tests against both, just to be sure
   12. check in

 Given the above workflow, it seems like even with a relatively small change
 like adding a getter, the scripted portion of the syncing effort is going
 to be vanishingly small compared to the manual process of syncing the
 implementations. Perhaps I'm just envisioning a different workflow than
 you, or maybe I'm missing some important scenarios. Could you describe what
 workflow(s) you envision and how the sync process would impact your
 productivity?


I differ strongly in my opinion here. Every time I need to drop out of my
development environment to run some ad-hoc script there is overhead...
Moreover, if we are using svn to do this, I presume we would have to
check in any change before the sync could be made. This means that every
edit to a file now has to be followed by a commit and sync (which would
obviously be an insane process).  Those of us behind corporate firewalls
and proxies experience very degraded response times when updating from the

Re: Changing the Proton build system to accommodate jni bindings

2013-01-22 Thread Rob Godfrey
On 21 January 2013 18:05, Rafael Schloming r...@alum.mit.edu wrote:

 On Mon, Jan 21, 2013 at 9:33 AM, Rob Godfrey rob.j.godf...@gmail.com
 wrote:

  Ummm... it's a dependency... you're familiar with those, yeah?
 
  The same way that the Qpid JMS clients depend on a JMS API jar, for which
  the source is readily available from another source. The JNI binding
 would
  build if the dependency was installed.  The same way I believe the SSL
 code
  in the core of proton-c builds if the dependency for it is installed.
 

 That's not really a proper analogy. Again the JMS interfaces are defined
 outside of qpid. We don't release them, and we depend only on a well
 defined version of them, we don't share a release cycle with them. If the
 JMS API was something that we developed/defined right alongside the impl
 and was part of the same release process, we would certainly not be allowed
 to release without the source.


This releasing without the source is a complete red herring and you know
it.  The source is released in whichever scheme we settle upon.

If you want an example of dependencies within the qpid project, how did the
AMQP 1.0 work on the C++ broker get released for 0.20?  Did all the proton
source get released with the C++ Broker / client?  In the future are you
expecting every part of the Qpid project which depends on proton to include
its full source?  If yes then how is the source tree going to work - is
everything to be a subdirectory of proton-c?

I agree that having the source for the version of the Java API included in
the source release bundle is advantageous. But if the collective decision
is that we have taken a religious position that the source tarballs can
only be svn exports of subdirectories of our source tree, then my
preference would be to use separated dependencies over duplication in the
repository.  Personally I would think that having a more flexible policy on
constructing the release source tarballs would make a lot more sense.

-- Rob


 --Rafael



Re: Changing the Proton build system to accommodate jni bindings

2013-01-22 Thread Rafael Schloming
On Tue, Jan 22, 2013 at 4:22 AM, Rob Godfrey rob.j.godf...@gmail.com wrote:

 On 21 January 2013 18:05, Rafael Schloming r...@alum.mit.edu wrote:

  On Mon, Jan 21, 2013 at 9:33 AM, Rob Godfrey rob.j.godf...@gmail.com
  wrote:
 
   Ummm... it's a dependency... you're familiar with those, yeah?
  
   The same way that the Qpid JMS clients depend on a JMS API jar, for
 which
   the source is readily available from another source. The JNI binding
  would
   build if the dependency was installed.  The same way I believe the SSL
  code
   in the core of proton-c builds if the dependency for it is installed.
  
 
  That's not really a proper analogy. Again the JMS interfaces are defined
  outside of qpid. We don't release them, and we depend only on a well
  defined version of them, we don't share a release cycle with them. If the
  JMS API was something that we developed/defined right alongside the impl
  and was part of the same release process, we would certainly not be
 allowed
  to release without the source.
 
 
 This releasing without the source is a complete red herring and you know
 it.  The source is released in whichever scheme we settle upon.

 If you want an example of dependencies within the qpid project, how did the
 AMQP 1.0 work on the C++ broker get released for 0.20?  Did all the proton
 source get released with the C++ Broker / client?  In the future are you
 expecting every part of the Qpid project which depends on proton to include
 its full source?  If yes then how is the source tree going to work - is
 everything to be a subdirectory of proton-c?


Again that's not really the same. If the Java API were on a separate
(staggered) release cycle and the dependency was on a specific version,
then that would be the same, but for what we're discussing, it really
isn't. Proton and the cpp broker live under different trunks and
branch/release separately, as far as I know this is not what you're
proposing for the Java API, it is to live under the same trunk and
branch/release together.



 I agree that having the source for the version of the Java API included in
 the source release bundle is advantageous. But if the collective decision
 is that we have taken a religious position that the source tarballs can
 only be svn exports of subdirectories of our source tree, then my
 preference would be to use separated dependencies over duplication in the
 repository.  Personally I would think that having a more flexible policy on
 constructing the release source tarballs would make a lot more sense.


You can call it religious if you like, but I don't think there is anything
invalid about wanting to keep a simple mapping between release artifacts
and proton developer environment. In the past we have had quite direct
experience of exactly this factor contributing to very poor out of the box
experience for users. Correct me if I'm wrong, but I believe you yourself
have actually advocated (or at least agreed with) this position in the past.

That said, I don't think I'm asking for us to be entirely inflexible in
that regard. There really are two opposing concerns here, one being the
user experience for our release artifacts, and the other being the
convenience of the development process for proton developers. All I'm
asking is that we recognize that there is a real tradeoff and be willing to
explore options that might preserve user experience albeit at a hopefully
minor cost to developer convenience. For any other aspect of software
engineering this would be a no-brainer, you start from the user
requirements/user experience and work your way backwards to the simplest
solution that achieves this, however this proposal and the related
requirements JIRA make zero mention of any *user* requirements merely
developer requirements. This may be ok for Java where all your users will
get stuff via jars and the source tarball is mostly a formality, but for C
the situation is different.

--Rafael


Re: Changing the Proton build system to accommodate jni bindings

2013-01-22 Thread Phil Harvey
It sounds like we're still a little way away from reaching a consensus.  As
a step towards this, I would like to clarify the relative priority of the
various requirements that have come up.  I've therefore created a page on
the wiki that lists them, with a child page briefly describing the various
proposals.

https://cwiki.apache.org/confluence/display/qpid/Proton+build+system+requirements

What are people's views on the relative priority of these requirements?
Are there any I've missed?  I think answering these questions is a
prerequisite for agreeing the technical solution.

Phil


On 22 January 2013 13:34, Rob Godfrey rob.j.godf...@gmail.com wrote:

 On 22 January 2013 13:47, Rafael Schloming r...@alum.mit.edu wrote:

  On Tue, Jan 22, 2013 at 4:22 AM, Rob Godfrey rob.j.godf...@gmail.com
  wrote:
 
   On 21 January 2013 18:05, Rafael Schloming r...@alum.mit.edu wrote:
  
On Mon, Jan 21, 2013 at 9:33 AM, Rob Godfrey 
 rob.j.godf...@gmail.com
wrote:
   
 Ummm... it's a dependency... you're familiar with those, yeah?

 The same way that the Qpid JMS clients depend on a JMS API jar, for
   which
 the source is readily available from another source. The JNI
 binding
would
 build if the dependency was installed.  The same way I believe the
  SSL
code
 in the core of proton-c builds if the dependency for it is
 installed.

   
That's not really a proper analogy. Again the JMS interfaces are
  defined
outside of qpid. We don't release them, and we depend only on a well
defined version of them, we don't share a release cycle with them. If
  the
JMS API was something that we developed/defined right alongside the
  impl
and was part of the same release process, we would certainly not be
   allowed
to release without the source.
   
   
   This releasing without the source is a complete red herring and you
  know
   it.  The source is released in whichever scheme we settle upon.
  
   If you want an example of dependencies within the qpid project, how did
  the
   AMQP 1.0 work on the C++ broker get released for 0.20?  Did all the
  proton
   source get released with the C++ Broker / client?  In the future are
 you
   expecting every part of the Qpid project which depends on proton to
  include
   its full source?  If yes then how is the source tree going to work - is
   everything to be a subdirectory of proton-c?
  
 
  Again that's not really the same. If the Java API were on a separate
  (staggered) release cycle and the dependency was on a specific version,
  then that would be the same, but for what we're discussing, it really
  isn't. Proton and the cpp broker live under different trunks and
  branch/release separately, as far as I know this is not what you're
  proposing for the Java API, it is to live under the same trunk and
  branch/release together.
 
 
 The point was that the source code doesn't need to be in the same tarball
 let alone the same subdirectory in source control. If one considers that
 the Java API is a dependency then whether it is released concurrently or
 not with the JNI binding is moot.

 I've already said that it is preferable to have the source within the same
 tarball for the source release, but if needs be then I can live with the
 strict dependency view of things.


 
  
   I agree that having the source for the version of the Java API included
  in
   the source release bundle is advantageous. But if the collective
 decision
   is that we have taken a religious position that the source tarballs can
   only be svn exports of subdirectories of our source tree, then my
   preference would be to use separated dependencies over duplication in
 the
   repository.  Personally I would think that having a more flexible
 policy
  on
   constructing the release source tarballs would make a lot more sense.
  
 
  You can call it religious if you like, but I don't think there is
 anything
  invalid about wanting to keep a simple mapping between release artifacts
  and proton developer environment. In the past we have had quite direct
  experience of exactly this factor contributing to very poor out of the
 box
  experience for users. Correct me if I'm wrong, but I believe you yourself
  have actually advocated (or at least agreed with) this position in the
  past.
 
  That said, I don't think I'm asking for us to be entirely inflexible in
  that regard. There really are two opposing concerns here, one being the
  user experience for our release artifacts, and the other being the
  convenience of the development process for proton developers.


 I actually think there are three perspectives here.  The user experience of
 our release artefacts, the committer experience of working on the
 checked-out codebase, and the release manager view of preparing the release
 artefacts from source control.


  All I'm
  asking is that we recognize that there is a real tradeoff and be willing
 to
  explore options that might preserve 

Re: Changing the Proton build system to accommodate jni bindings

2013-01-22 Thread Rafael Schloming
Thanks for posting this, I think it's a very useful step. I'd suggest
adding another Stakeholder -- someone testing a release artifact. Rob makes
a good point that the release manager is a distinct view, but I think the
desire to minimize deltas between the svn tree and the release artifacts is
most directly motivated by my experience *testing* release artifacts. I
remember going through qpid releases in the old days and having the very
unpleasant experience of trying to remember from 8 or 10 months ago how
exactly stuff worked in the release artifact as compared to the build tree.
I very much like the fact that with a simple export I can be highly
confident that my experience of stuff working in my checkout translates
well to the release artifacts and testing them is a very familiar, quick,
and easy process.

Strictly speaking I think the requirement from a release management
perspective is purely that we can produce releases at the rate we need, so
it has to be quick and easy and robust to different environments, but I
wouldn't say the export thing is a requirement of the release manager
per se. As many have pointed out, we already use a script for this and it
can remap things quite easily.

I have more thoughts on the release process, especially as it is somewhat
expanded now to produce java binaries and will need to expand more to
include windows stuff, however I need to run an errand at the moment. I'll
post and/or comment on the page later though.

--Rafael

On Tue, Jan 22, 2013 at 11:43 AM, Phil Harvey p...@philharveyonline.com wrote:

 It sounds like we're still a little way away from reaching a consensus.  As
 a step towards this, I would like to clarify the relative priority of the
 various requirements that have come up.  I've therefore created a page on
 the wiki that lists them, with a child page briefly describing the various
 proposals.


 https://cwiki.apache.org/confluence/display/qpid/Proton+build+system+requirements

 What are people's views on the relative priority of these requirements?
 Are there any I've missed?  I think answering these questions is a
 prerequisite for agreeing the technical solution.

 Phil


 On 22 January 2013 13:34, Rob Godfrey rob.j.godf...@gmail.com wrote:

  On 22 January 2013 13:47, Rafael Schloming r...@alum.mit.edu wrote:
 
   On Tue, Jan 22, 2013 at 4:22 AM, Rob Godfrey rob.j.godf...@gmail.com
   wrote:
  
On 21 January 2013 18:05, Rafael Schloming r...@alum.mit.edu wrote:
   
 On Mon, Jan 21, 2013 at 9:33 AM, Rob Godfrey rob.j.godf...@gmail.com wrote:

  Ummm... it's a dependency... you're familiar with those, yeah?

  The same way that the Qpid JMS clients depend on a JMS API jar, for which
  the source is readily available from another source. The JNI binding
  would build if the dependency was installed.  The same way I believe the
  SSL code in the core of proton-c builds if the dependency for it is
  installed.

 That's not really a proper analogy. Again the JMS interfaces are defined
 outside of qpid. We don't release them, and we depend only on a well
 defined version of them, we don't share a release cycle with them. If the
 JMS API was something that we developed/defined right alongside the impl
 and was part of the same release process, we would certainly not be
 allowed to release without the source.

This releasing without the source is a complete red herring and you know
it.  The source is released in whichever scheme we settle upon.

If you want an example of dependencies within the qpid project, how did
the AMQP 1.0 work on the C++ broker get released for 0.20?  Did all the
proton source get released with the C++ Broker / client?  In the future
are you expecting every part of the Qpid project which depends on proton
to include its full source?  If yes then how is the source tree going to
work - is everything to be a subdirectory of proton-c?

   Again that's not really the same. If the Java API were on a separate
   (staggered) release cycle and the dependency was on a specific version,
   then that would be the same, but for what we're discussing, it really
   isn't. Proton and the cpp broker live under different trunks and
   branch/release separately; as far as I know this is not what you're
   proposing for the Java API, it is to live under the same trunk and
   branch/release together.

  The point was that the source code doesn't need to be in the same tarball,
  let alone the same subdirectory in source control. If one considers that
  the Java API is a dependency then whether it is released concurrently or
  not with the JNI binding is moot.

  I've already said that it is preferable to have the source within the same
  tarball for the source release, but if needs be then I 

Re: Changing the Proton build system to accommodate jni bindings

2013-01-21 Thread Robbie Gemmell
I would echo some of Rob's points (since he beat me to saying them myself :)
) and add some of my own.

I also don't see a need to check out proton-c or proton-j in isolation; if
the tests for both of them sit a level up then that's what people should be
grabbing, in my mind.

Duplicating code sounds fishy to start with, and doing so given the
apparent real need to check out the common parent directory anyway seems
all the more questionable.

One possible adjustment I might suggest (but don't personally see the need
for) would be this: if the compile requirement for Maven to generate the
proton-api jar used by the C tree to build the JNI bindings is considered
unworkable by some, and if it's just a simple jar, it could also be built
with CMake for the C build, leaving Maven to do it for the Java build. I'm
not sure how such developers would be planning to run the common test suite
that still needed Maven though.

If we are releasing the C and Java components at the same time, and the
tests sit at the top level, why do there need to be two source tars? We
had this discussion with regard to the various Qpid clients and brokers
some time ago, and the agreed (but never fully implemented; we still have
subset source tars) outcome was that we should do away with the
component-specific source tars and have only the one main source tar, which
is actually 'the release' in terms of the project, with e.g. Java binaries
being separate complementary things.

If we did just have a single source artifact to constitute the full
release, and either we or some third party then wanted to build a C-only
source artifact for some reason, that could of course still be done by
simply processing the contents of the repository or the single 'the release'
tar appropriately. E.g. the 'individual component' source releases in Qpid
aren't simple svn exports; they contain different parts of the tree bundled
into a tar, which is I guess OK because they are not actually 'the release'.

Robbie

On 21 January 2013 12:10, Rob Godfrey rob.j.godf...@gmail.com wrote:

 

   This results in something that is quite awkward for the C build. For one
   thing I'm not sure an svn export of the proton-c directory would be
   considered releasable under this scheme as it would include the java
   binding, but not the source code necessary to build it, and apache policy
   requires releases to include full source code. Regardless it would no
   longer be a useful/sensible artifact to end-users since they couldn't
   actually build the java binding.
  
 
 
 This seems a slightly odd position to take. The artefact doesn't include
 the entire source to python, ruby, openssl, etc.  If the dependencies for
 these are not present then the relevant parts of the tree are not built.
  The same is true in this proposal with respect to the java binding...
 there is a dependency on the Java API being installed in order to build the
 JNI bindings within the C build.


 I must admit I remain bemused by the idea that trying to maintain two
 copies of the Java API in the source tree makes any kind of sense.

 I think we are contorting ourselves and adding potentially huge
 complication to our build/development process in order to try to satisfy a
 number of somewhat arbitrary requirements that are being imposed on the
 directory structure.

 Personally I don't perceive there to be an actual need to allow checking
 out of only part of the Proton tree.  Indeed I would wish to strongly
 discourage the sort of silo'd attitude that checking out only Java or only
 C would imply.

 Moreover, while I see that it is advantageous to be able to release
 source packages directly as svn exports from points in the tree... I
 don't find this so compelling that I would break fundamental tenets of
 how source control is expected to be used.

 Personally, given that our current plan is to release all of Proton at the
 same time, I'm not sure what would be wrong with simply shipping a single
 source tarball of the entire directory structure.  People who wish to build
 from source would thus be able to build whatever they so desired.

 -- Rob



Re: Changing the Proton build system to accommodate jni bindings

2013-01-21 Thread Rob Godfrey
On 21 January 2013 15:11, Rafael Schloming r...@alum.mit.edu wrote:

 On Mon, Jan 21, 2013 at 7:10 AM, Rob Godfrey rob.j.godf...@gmail.com
 wrote:

  
 
This results in something that is quite awkward for the C build. For one
thing I'm not sure an svn export of the proton-c directory would be
considered releasable under this scheme as it would include the java
binding, but not the source code necessary to build it, and apache policy
requires releases to include full source code. Regardless it would no
longer be a useful/sensible artifact to end-users since they couldn't
actually build the java binding.

  This seems a slightly odd position to take. The artefact doesn't include
  the entire source to python, ruby, openssl, etc.  If the dependencies for
  these are not present then the relevant parts of the tree are not built.
  The same is true in this proposal with respect to the java binding...
  there is a dependency on the Java API being installed in order to build
  the JNI bindings within the C build.
 

 The problem isn't with not including the source code to external
 dependencies (i.e. Java in your analogy), the problem is with the fact that
 all of the Java binding (the API and the JNI implementation of it) is
 developed within the qpid project, and the artifact would not include all
 of it. The apache release policy is quite clear on this front:

 The Apache Software Foundation produces open source software. All releases
 are in the form of the source materials needed to make changes to the
 software being released. In some cases, binary/bytecode packages are also
 produced as a convenience to users that might not have the appropriate
 tools to build a compiled version of the source. In all such cases, the
 binary/bytecode package must have the same version number as the source
 release and may only add binary/bytecode files that are the result of
 compiling that version of the source code release.

 Producing an artifact that has source code for impls, but not source for
 the interfaces would quite clearly constitute an artifact that didn't
 include all the source materials needed to make changes.



Ummm... it's a dependency... you're familiar with those, yeah?

The same way that the Qpid JMS clients depend on a JMS API jar, for which
the source is readily available from another source. The JNI binding would
build if the dependency was installed.  The same way I believe the SSL code
in the core of proton-c builds if the dependency for it is installed.


 I must admit I remain bemused by the idea that trying to maintain two
  copies of the Java API in the source tree makes any kind of sense.
 
 I think we are contorting ourselves and adding potentially huge
  complication to our build/development process in order to try to satisfy a
  number of somewhat arbitrary requirements that are being imposed on the
  directory structure.
 

 You're arguing against a straw man here. Nobody has proposed copying the
 API the way you keep describing it. The original solution implemented on
 the JNI branch was to have the API in two places at once via svn externals.


This isn't in two places... it's very clearly in one place in the
repository, with another place linking to it, in a rather inelegant
manner.

Having said that, the externals solution is not a particularly pleasant
solution and was only put in place because of the requirement to be able
to check out from a subdirectory of proton.  Having further considered the
matter, my feeling is that it is better to re-examine the need to be able
to check out just a single subdirectory of the proton tree.


 This however does violate one of the fundamental tenets of source
 control as you put it since it fundamentally loses track of what version
 of the API source goes with what version of the implementation source.



Umm... no it doesn't.  Again... I'm not pushing for svn:externals, but if
you insist that each subdirectory must be able to be checked out
independently then I think svn:externals is a better solution than the
copy.  The original svn:externals proposal makes it very clear that the
version of the Java API code that the JNI binding works with must be the
same as that which the Java impl works with.  The external points to a
sibling directory within the same project.  So long as you consider the
proton project as a whole then it is never unclear as to which version you
should be using.  Only in a world where the Java and C versions are not
progressed with a common API does this become a problem.  If you do not
believe the two should have a common API then I think we need to have a
wider discussion (since we've been working pretty hard until now to keep
the APIs in sync).


 Branching the API into two places and putting the necessary scripts in
 place to enforce that the C version of that branch is a read only copy of
 the Java version is simply another way to achieve exactly what is currently

Re: Changing the Proton build system to accommodate jni bindings

2013-01-21 Thread Rafael Schloming
On Mon, Jan 21, 2013 at 9:33 AM, Rob Godfrey rob.j.godf...@gmail.com wrote:

 Ummm... it's a dependency... you're familiar with those, yeah?

 The same way that the Qpid JMS clients depend on a JMS API jar, for which
 the source is readily available from another source. The JNI binding would
 build if the dependency was installed.  The same way I believe the SSL code
 in the core of proton-c builds if the dependency for it is installed.


That's not really a proper analogy. Again the JMS interfaces are defined
outside of qpid. We don't release them, and we depend only on a well
defined version of them, we don't share a release cycle with them. If the
JMS API was something that we developed/defined right alongside the impl
and was part of the same release process, we would certainly not be allowed
to release without the source.

--Rafael


Re: Changing the Proton build system to accommodate jni bindings

2013-01-21 Thread Rafael Schloming
On Mon, Jan 21, 2013 at 8:03 AM, Robbie Gemmell robbie.gemm...@gmail.com wrote:

 I would echo some of Rob's points (since he beat me to saying them myself
 :) ) and add some of my own.

 I also don't see a need to check out proton-c or proton-j in isolation; if
 the tests for both of them sit a level up then that's what people should
 be grabbing, in my mind.

 Duplicating code sounds fishy to start with, and doing so given the
 apparent real need to check out the common parent directory anyway seems
 all the more questionable.

 One possible adjustment I might suggest (but don't personally see the need
 for) would be this: if the compile requirement for Maven to generate the
 proton-api jar used by the C tree to build the JNI bindings is considered
 unworkable by some, and if it's just a simple jar, it could also be built
 with CMake for the C build, leaving Maven to do it for the Java build. I'm
 not sure how such developers would be planning to run the common test
 suite that still needed Maven though.

 If we are releasing the C and Java components at the same time, and the
 tests sit at the top level, why do there need to be two source tars? We
 had this discussion with regard to the various Qpid clients and brokers
 some time ago, and the agreed (but never fully implemented; we still have
 subset source tars) outcome was that we should do away with the
 component-specific source tars and have only the one main source tar,
 which is actually 'the release' in terms of the project, with e.g. Java
 binaries being separate complementary things.


I'm not sure I can answer this in a way that will be satisfying to you as
the answer is based a lot on C development standards where source tarballs
play a much more active role as a means to distribute software than in the
Java world where everything is distributed via binaries. But I'll try by
saying that having a C project where you can't simply untar it and do one
of src/configure && make or cmake src && make is a bit like having a
Java project that doesn't use maven or ant. I'm aware we could have cmake
at the top level alongside a pom.xml, and some third entry script that
invokes both for system tests and the like, and while I would encourage
that for proton developers, it is imposing a very complex set of entry
points onto our users. I can see that this might impact Java users less as
they may care less about src distros, but it is far from an ideal release
artifact for C users.

As for producing a C tarball by post-processing a large source tarball,
it's simply something I would prefer to avoid given that there are
alternatives, as having a complex mapping from source control to release
artifact is, in my experience, quite bad for the health of a project. It
means developers are more detached from what their users experience out of
the box.

--Rafael


Re: Changing the Proton build system to accommodate jni bindings

2013-01-21 Thread Rafael Schloming
On Sat, Jan 19, 2013 at 5:48 PM, Phil Harvey p...@philharveyonline.com wrote:

 I worked with Keith on this proposal so I should state up front that I'm
 not coming to this debate from a neutral standpoint.

 Hopefully we can find a solution that is acceptable to everyone.  To this
 end, we listed our understanding of the requirements on
 https://issues.apache.org/jira/browse/PROTON-194.  I'm hoping that this
 discussion will allow us to clarify our requirements, such that the best
 technical solution naturally follows.

 I've added some comments in-line below...

 On 18 January 2013 19:29, Rafael Schloming r...@alum.mit.edu wrote:

  On Fri, Jan 18, 2013 at 11:17 AM, Keith W keith.w...@gmail.com wrote:
 
   We are currently in the process of implementing the proton-jni binding
   for the proton-c library that implements the Java Proton-API, allowing
   Java users to choose the C-based proton stack if they wish. This work
   is being performed on the jni-branch under PROTON-192 (for the JNI
   work) and PROTON-194 (for the build system changes).
  
   Currently, Proton has two independent build systems: one for proton-c
   and its ruby/perl/python/php bindings (based on CMake/Make), and a
   second, separate build system for proton-j (based on Maven).  As
   proton-jni cuts across both technology areas, non-trivial changes
   are required to both build systems.
  
   The nub of the problem is the sharing of the Java Proton-API between
   both proton-c and proton-j trees. Solutions based on svn-external and
   a simple tree copy have been considered and discussed at length on
   conference calls.  We have identified drawbacks in both solutions.
  
 
  To be honest I don't think we've sufficiently explored the copy option.
  While it's true there were a lot of hypothetical issues thrown around on
  the calls, many of them have quite reasonable solutions that may well be
  less work than the alternatives.
 
  In my experience, maintaining two copies of any code is usually a bad
 thing.  However, I try to be open minded so I agree that it's worth
 exploring this option.  I'd be interested to hear your opinion on (a) the
 scenarios when it would be acceptable for these two copies to diverge and
 (b) the mechanism you're envisaging for achieving convergence.  I imagine
 there are both technical and process dimensions to making this work.


This is a good question, sorry I missed it with the flurry of other posts.
To answer (a), I think on trunk these two things should probably never (or
very rarely at least) diverge. On very specific feature development
branches I think we've seen it can be convenient to let them diverge a
little, but as the whole point of a feature branch is to be able to break
things I think that's neither here nor there; either way I would consider
non-matching APIs to be a broken state of things.

The mechanism I'd propose would be to add a check to the C build system
that would cause a build failure if the API as viewed from the JNI binding
was any different from the API as it exists in the Java source tree. I
believe for most developer scenarios this would achieve *almost* the same
thing that svn externals does without the inherent drawbacks. I'll detail
the scenarios I've thought of below:

1. Changing the Java API from the Java tree
  If the Java developer changes the API, the C build will break due to
check failure.
  If the Java developer changes both to avoid the check, the C build will
break due to compile failure.

2. Changing the Java API from the C tree
  If the C developer forgets to change the Java API, then the C build will
break due to check failure.
  If the C developer changes both to avoid the check, the Java build will
break due to compile failure.
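
Under the stated assumption that the check could be implemented as a simple tree comparison (all directory and file names below are hypothetical stand-ins, not the actual Proton layout), a minimal sketch of such a build-time check might look like:

```shell
#!/bin/sh
# Sketch of the proposed API-sync check: fail the C build if the copy of
# the Java API seen by the JNI binding differs from the canonical copy in
# the Java tree. The mktemp directories below are illustrative stand-ins
# for e.g. proton-j/proton-api and the C-side copy.
canonical=$(mktemp -d)
copy=$(mktemp -d)
echo "public interface Sender {}" > "$canonical/Sender.java"
cp "$canonical/Sender.java" "$copy/Sender.java"

# diff -r exits non-zero on any difference; a CMake custom target could
# invoke this script and turn the non-zero exit into a build failure.
if diff -r "$canonical" "$copy" > /dev/null; then
    echo "API copies in sync"
else
    echo "error: Java API copies have diverged; sync them before building" >&2
    exit 1
fi
```

Hooking a script like this into the C build (for instance via a CMake custom target) would give the enforcement behaviour described in the two scenarios.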

I believe the above scenarios enforce pretty much the same thing svn
externals does. The only added step is the need to copy changes to both
places when you are being a good citizen and bringing forward both at the
same time. I would hope that this would become less and less of an issue as
the API should really stabilize and not change, however if that is an issue
for the near term I would propose adding a sync script on the C side to
pull the changes over to a local checkout. This would result in the
following process for someone changing both simultaneously:
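
A sketch of what such a sync script might do (the directory names are hypothetical; a real script would point at the actual API locations in the tree): mirror the canonical Java API into the C-side copy, removing anything that no longer exists upstream.

```shell
#!/bin/sh
# Hypothetical sync script for the C side: refresh the local copy of the
# Java API from the canonical Java tree. Demo directories stand in for
# the real locations in the proton tree.
src=$(mktemp -d)
dst=$(mktemp -d)
echo "public interface Receiver {}" > "$src/Receiver.java"
echo "stale contents"               > "$dst/Receiver.java"
echo "no longer in the API"         > "$dst/Removed.java"

# Mirror src over dst: delete the stale copy wholesale, then re-copy, so
# that files removed upstream do not linger on the C side.
rm -rf "$dst"
cp -R "$src" "$dst"

diff -r "$src" "$dst" > /dev/null && echo "local API copy refreshed"
```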

  - Make your changes and test on the Java build. When this works,
transition over to the C build and see what the impact of the changes has
been. The first breakage will be the check failure and running the sync
script will fix this. You can then proceed to see what other build failures
there are and how to fix them.

I would hope overall this would minimize the impact of the syncing as I
would expect API changes to be primarily driven from the Java side. Either
way, I think the only development process difference between this setup and
the svn externals one is that for the C build you fix the check breakage by
running the sync script before proceeding to fix 

Re: Changing the Proton build system to accommodate jni bindings

2013-01-19 Thread Phil Harvey
I worked with Keith on this proposal so I should state up front that I'm
not coming to this debate from a neutral standpoint.

Hopefully we can find a solution that is acceptable to everyone.  To this
end, we listed our understanding of the requirements on
https://issues.apache.org/jira/browse/PROTON-194.  I'm hoping that this
discussion will allow us to clarify our requirements, such that the best
technical solution naturally follows.

I've added some comments in-line below...

On 18 January 2013 19:29, Rafael Schloming r...@alum.mit.edu wrote:

 On Fri, Jan 18, 2013 at 11:17 AM, Keith W keith.w...@gmail.com wrote:

  We are currently in the process of implementing the proton-jni binding
  for the proton-c library that implements the Java Proton-API, allowing
  Java users to choose the C-based proton stack if they wish. This work
  is being performed on the jni-branch under PROTON-192 (for the JNI
  work) and PROTON-194 (for the build system changes).
 
  Currently, Proton has two independent build systems: one for proton-c
  and its ruby/perl/python/php bindings (based on CMake/Make), and a
  second, separate build system for proton-j (based on Maven).  As
  proton-jni cuts across both technology areas, non-trivial changes
  are required to both build systems.
 
  The nub of the problem is the sharing of the Java Proton-API between
  both proton-c and proton-j trees. Solutions based on svn-external and
  a simple tree copy have been considered and discussed at length on
  conference calls.  We have identified drawbacks in both solutions.
 

 To be honest I don't think we've sufficiently explored the copy option.
 While it's true there were a lot of hypothetical issues thrown around on the
 calls, many of them have quite reasonable solutions that may well be less
 work than the alternatives.

 In my experience, maintaining two copies of any code is usually a bad
thing.  However, I try to be open minded so I agree that it's worth
exploring this option.  I'd be interested to hear your opinion on (a) the
scenarios when it would be acceptable for these two copies to diverge and
(b) the mechanism you're envisaging for achieving convergence.  I imagine
there are both technical and process dimensions to making this work.


  This email proposes another solution. The hope is that this proposal
  can be developed on list into a solution that is acceptable to all.
 
  Proposal:
 
  Move the Java Proton-API to the top level so that it can be shared
  simply and conveniently by both proton-j and proton-c.
 
  * Maven builds the proton-api JAR to a well known location
  * Cmake/make builds proton-c and all bindings including java. As the
  building of the java binding requires the Java Proton API, it is
  optional and only takes place if proton-api has been previously
  created by Maven (or found by other means).
  * Maven builds of proton-j
  * Maven runs the system tests against either proton-c or proton-j. The
  system tests are currently written in Python but are being augmented
  with new ones written in Java.
 
  Proposed Directory Structure:
 
  proton
  |-- release.sh/bat          # Builds, tests and packages proton-c and proton-j
  |-- pom.xml
  |
  |-- proton-api              # Java Proton-API
  |   |-- pom.xml             # Will create proton-api.jar at a well known
  |   |                       # location in tree
  |   `-- main
  |
  |-- proton-c                # Proton-C and Proton-C bindings
  |   |-- CMakeLists.txt
  |   `-- bindings
  |       |-- CMakeLists.txt
  |       `-- java
  |           |-- CMakeLists.txt
  |           `-- jni
  |               `-- CMakeLists.txt  # Creates proton-jni.jar using
  |                                   # proton-api.jar from a well known
  |                                   # location in tree, or skips if the
  |                                   # jar cannot be found
  |
  |-- proton-j                # Proton-J
  |   |-- pom.xml             # Creates proton-j.jar using proton-api.jar
  |   |                       # (found via Maven)
  |   `-- src
  |       `-- main
  |
  `-- tests                   # Python and Java based system tests that
      |                       # test Proton-C and Proton-J equally
      |-- pom.xml
      `-- src
          `-- test
 
  Use cases:
 
  usecase #1 - Proton-C Developer exclusively focused on Proton-C
 
  This developer may choose to check out the proton-c subtree.  The
  build tool set remains unchanged from today, i.e. cmake and make.  By
  default, all bindings will be built except for the java bindings (as
  Cmake would fail to find the proton-api.jar).  For flexibility, we
  would include an option to have cmake search another directory, allowing
  proton-api.jar to be found in non-standard locations.
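
The lookup described in usecase #1 could behave roughly as follows (sketched here as a shell script; the variable name PROTON_API_DIR and the file locations are assumptions for illustration, not part of the proposal): check an optional user-supplied directory first, fall back to the in-tree location, and skip the java binding rather than fail when the jar is absent.

```shell
#!/bin/sh
# Sketch of the optional java-binding logic: build it only when
# proton-api.jar can be found, otherwise skip it. PROTON_API_DIR mimics a
# hypothetical cmake cache option (e.g. cmake -DPROTON_API_DIR=/some/dir ..).
PROTON_API_DIR=${PROTON_API_DIR:-}
default_dir=$(mktemp -d)            # stands in for the well known in-tree location
touch "$default_dir/proton-api.jar" # pretend Maven has already built the jar

jar=""
for d in "$PROTON_API_DIR" "$default_dir"; do
    if [ -n "$d" ] && [ -f "$d/proton-api.jar" ]; then
        jar="$d/proton-api.jar"
        break
    fi
done

if [ -n "$jar" ]; then
    echo "building java binding against $jar"
else
    echo "proton-api.jar not found; skipping java binding"
fi
```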
 
  usecase #2 - Proton-C Developer who wishes to run all system tests
 
  This developer must check out the complete proton tree.  The build
  tool set now includes maven in order to build the proton-api and run
  the complete system test suite.
 
  Typical commands used by this developer would 

Re: Changing the Proton build system to accommodate jni bindings

2013-01-18 Thread Rajith Attapattu
 On Fri, Jan 18, 2013 at 2:29 PM, Rafael Schloming r...@alum.mit.edu wrote:
 The nub of the problem is the sharing of the Java Proton-API between
 both proton-c and proton-j trees. Solutions based on svn-external and
 a simple tree copy have been considered and discussed at length on
 conference calls.  We have identified drawbacks in both solutions.

It would be great if things were discussed on the mailing lists as
opposed to on conference calls.
However I applaud Keith for posting a detailed summary of the proposal
on the list.

Rajith