Hi,

Going through the above list is something a release manager can do before
cutting the release, but things might already be badly broken by that time,
and it will be a mess to start fixing them during the release itself.

I had a chance to look into the integration tests in Airavata and there were
only a few. If we could add more integration tests covering more
functionality, it would

1. ease the work of the release manager, and
2. improve our confidence when adding new patches (along with all the other
good things that come with integration tests)

I saw Amila enumerate some use cases on a different mail thread about what
needs to be tested. I strongly recommend spending some time implementing
those tests to make Airavata releases more stable.
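
To make this concrete, here is a rough sketch of what one such test could
look like. Treat it purely as an illustration: the client entry point and
method names (AiravataAPI.connect, runExperiment, getExperimentResult) are
placeholders rather than the exact Airavata client API, and it assumes the
test harness has already started a local Airavata server.

import static org.junit.Assert.assertEquals;
import static org.junit.Assert.assertNotNull;

import org.junit.Test;

public class EchoWorkflowIT {

    @Test
    public void echoWorkflowReturnsItsInput() throws Exception {
        // Hypothetical client handle; the harness is assumed to have started
        // a local Airavata server before this test runs.
        AiravataAPI api = AiravataAPI.connect("http://localhost:8080/airavata",
                                              "admin", "admin");

        // Submit the sample Echo workflow and wait for the result.
        String experimentId = api.runExperiment("EchoWorkflow", "hello");
        assertNotNull(experimentId);

        String output = api.getExperimentResult(experimentId);
        assertEquals("hello", output);
    }
}

Even a handful of end-to-end tests in this style (submit, monitor, check the
output) would catch the kind of breakage we currently only discover while
cutting a release.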

Also, considering the number of components we have in Airavata, it is like an
airplane with lots of complicated parts that can fail at any time. Failure of
one component can be disastrous for the whole system, and the only "warning"
we will get is from those integration tests.

In the WS project, we used to have a nightly build that ran all the
integration tests and sent "complaint" mails to everyone whose commits broke
the build. Those nightly builds are as good as RCs for starting any release
process.
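
One lightweight way to wire that up (a sketch only; the test class names
below are my guesses, not the actual layout) is to group the integration
tests into a JUnit suite that a nightly Jenkins job runs, with the
"complaint" mails coming from the CI server's commit-based notifications
rather than from the code itself:

import org.junit.runner.RunWith;
import org.junit.runners.Suite;

// Hypothetical test classes; substitute whatever ends up in the
// integration-tests module.
@RunWith(Suite.class)
@Suite.SuiteClasses({
        EchoWorkflowIT.class,        // workflow submission round trip
        RegistryPersistenceIT.class  // registry CRUD against Derby/MySQL
})
public class NightlyIntegrationTestSuite {
    // Intentionally empty: the annotations above tell JUnit which tests to
    // run. A nightly job could then invoke, for example,
    //   mvn verify -Dit.test=NightlyIntegrationTestSuite
    // (assuming the tests run under the Maven Failsafe plugin) and notify
    // everyone whose commits broke the run.
}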

Just sharing some experience I had in a previous life as a release manager
:)

Thanks,
Eran Chinthaka Withana


On Mon, Dec 16, 2013 at 10:24 AM, Chathuri Wimalasena
<[email protected]> wrote:

> There is a general checklist added by Raman [1], which covers basic
> functionalities.
>
> Thanks..
> Chathuri
>
> [1]
> https://cwiki.apache.org/confluence/display/AIRAVATA/Airavata+Release+Testing
>
>
> On Mon, Dec 16, 2013 at 12:56 PM, Saminda Wijeratne <[email protected]> wrote:
>
>>
>>
>>
>> On Mon, Dec 16, 2013 at 9:28 AM, Suresh Marru <[email protected]> wrote:
>>
>>> Thanks Amila for weighing in. Comments inline:
>>>
>>> On Dec 16, 2013, at 11:29 AM, Amila Jayasekara <[email protected]>
>>> wrote:
>>>
>>> > Hi Suresh,
>>> >
>>> > I have some comments inline.
>>> >
>>> >
>>> > On Mon, Dec 16, 2013 at 10:53 AM, Suresh Marru <[email protected]>
>>> wrote:
>>> > Hi All,
>>> >
>>> > This is a very good question. Lets discuss these options so we are
>>> consistent across releases.
>>> >
>>> > If we look at the way we are doing releases, we are calling a feature
>>> freeze and a code freeze and cutting a release. Most of the time, our build
>>> is broken. The Jenkins statistics for Airavata are not looking good at all [1].
>>> >
>>> > There is something wrong with the Jenkins configuration. I tried to
>>> figure it out some time back but was unable to do so. Even though builds are
>>> successful on our local machines, they fail intermittently in Jenkins.
>>> >
>>> > We are barely fixing the build a day before the release, putting out
>>> an RC, testing it, and releasing it in quick succession.
>>> >
>>> > This is not entirely true. For the past few months I have experienced only
>>> one or two build breaks (maybe fewer), and I build a couple of times per week. I
>>> believe the build is usually stable, and with the integration tests passing we
>>> always get a workable version. I know it is not a good practice not to rely
>>> on the build server, but committers have the personal discipline to keep the
>>> build stable. Nevertheless, we must fix the Jenkins configuration issue.
>>>
>>> Maybe we should put some focus on the Jenkins configuration? Any volunteers?
>>>
>>> >
>>> > As we are seeing on the user lists, we have users upgrading with every
>>> release. I think we should improve the release quality.
>>> >
>>> > +1 for this.
>>> >
>>> > I would vote for at least 3 RCs per release. If we are not finding
>>> issues in the first RC, I would say either the software has magically become
>>> too good or we are not doing thorough testing. I suspect the latter.
>>>
>> How about we keep a checklist of release tests? I know we already send a
>> mail on dev about what needs to be tested for each RC, but I think that is too
>> abstract. For the core developers of Airavata I think there should be predefined
>> test cases (a test document, if you will). Since we have several core
>> developers on the list, we can at least decide upon what must be tested and
>> make sure that each test case is covered by at least one developer for an RC.
>>
>>> >
>>> > I guess you mentioned this under the assumption that the build is not stable.
>>>
>>> Half of my assumption rests on Jenkins, so if the builds are OK and Jenkins is
>>> reporting them wrongly, then we can alleviate this by fixing it.
>>>
>>> > I will propose the following; please counter it and let's agree on a
>>> process:
>>> >
>>> > * Let's post an RC1 as is (which means it will have a snapshot version). We
>>> should all test this pack as much as possible, so it is more of a test
>>> candidate than a release candidate. If it helps, we can use the name TC1. I
>>> am not particular about the naming, but I am trying to emphasize the need for
>>> having more RCs per release.
>>> >
>>> > I am not sure whether we really need a TC. The release manager should
>>> be doing some verification on the RC before putting it out; therefore it
>>> should be an RC. Anyhow, I am fine with having the TC concept and trying it out.
>>>
>>> We should probably stick to RC, but I think the onus should not be on
>>> the RM to test it. They should coordinate and mobilize everyone to do the
>>> testing, including doing a bit more testing than others. But my point is, we
>>> should test, and the only way to do that is to put out a series of RCs and have
>>> focused testing.
>>>
>> A TC should be something internal IMO. But when we are going for a
>> release it should be alpha, beta, and then RC releases. I think it need not
>> be mandatory for the RMs to do a pre-evaluation of the builds other than
>> making sure all the unit tests and integration tests pass. Once an RC is
>> confirmed to be of release quality, I think we can follow the actual release cycle
>> from the trunk itself, since it is in a code freeze anyway.
>>
>>>
>>> Suresh
>>>
>>> >
>>> > What we really need is a set of verifiable test cases.
>>> >
>>> > Thank you
>>> > Regards
>>> > Amila
>>> >
>>> >
>>> > * If we do not expose significant issues in RC/TC1, then we proceed
>>> with an RC2 which will follow the proper release process. But if a reasonable
>>> number of issues are brought out, we need an RC2/TC2 as well, again without
>>> following the release process.
>>> >
>>> > * The key thing I am proposing is that we keep doing RC/TCs until we are
>>> all sure the quality is good enough, with documented known issues. When we
>>> are sure, then we proceed to an RC with the proper release process.
>>> >
>>> > So this will mean more testing, and twice (or more) the time everyone
>>> has to spend testing, but I think it is worth it. This might also run over the 6
>>> week release cycle, but I think we need to make that trade for quality releases
>>> as we march towards 1.0.
>>> >
>>> > Suresh
>>> > [1] - https://builds.apache.org/job/Apache%20Airavata/
>>> >
>>> >
>>> > On Dec 15, 2013, at 4:28 PM, Lahiru Gunathilake <[email protected]>
>>> wrote:
>>> >
>>> > >
>>> > > Hi Chathuri,
>>> > >
>>> > > I think having a snapshot as the version in an RC is wrong. Every RC has
>>> to be like a release, and if it passes we just call a vote/discussion thread
>>> and do the release. If we go with a snapshot and things go right, we then
>>> have to change the versions and test again. We could do the release just by
>>> changing the snapshot version without testing, but that is wrong AFAICT.
>>> > >
>>> > > I remember making this mistake in an earlier release with the RC1 build. I
>>> think we can stick to the release management instructions on
>>> airavata.org.
>>> > >
>>> > > Regards
>>> > > Lahiru
>>> > >
>>> > >
>>> > > On Fri, Dec 13, 2013 at 3:43 PM, Chathuri Wimalasena <
>>> [email protected]> wrote:
>>> > > Hi All,
>>> > >
>>> > > Airavata 0.11 RC1 [1] is ready for testing.
>>> > >
>>> > > Here are some pointers for testing:
>>> > >       • Verify the fixed issues for this release [2]
>>> > >       • Verify the basic workflow composition/execution/monitoring
>>> scenarios from the Airavata 5 & 10 minute tutorials [3],[4]
>>> > >       • Verify the Airavata client samples
>>> > >       • Verify the stability with Derby & MySQL backend databases
>>> > >       • Verify that the XBaya JNLP distribution works
>>> > >       • Verify deploying the Airavata server in a Tomcat distribution
>>> > > Please report any issues [5] you encounter while testing. Thank
>>> you for your time in validating the release.
>>> > >
>>> > > Regards,
>>> > > Chathuri (On behalf of Airavata PMC)
>>> > >
>>> > > [1] https://dist.apache.org/repos/dist/dev/airavata/0.11/RC1/
>>> > > [2]
>>> https://issues.apache.org/jira/browse/AIRAVATA-278?jql=project%20%3D%20AIRAVATA%20AND%20fixVersion%20%3D%20%220.11%22%20ORDER%20BY%20status%20DESC%2C%20priority%20DESC
>>> > > [3]
>>> http://airavata.apache.org/documentation/tutorials/airavata-in-5-minutes.html
>>> > > [4]
>>> http://airavata.apache.org/documentation/tutorials/airavata-in-10-minutes.html
>>> > > [5] https://issues.apache.org/jira/browse/AIRAVATA
>>> > >
>>> > >
>>> > >
>>> > >
>>> > >
>>> > >
>>> > > --
>>> > > System Analyst Programmer
>>> > > PTI Lab
>>> > > Indiana University
>>> >
>>> >
>>>
>>>
>>
>
