Here is a link to the test plan for the 2.1 release:
http://opensourcewebconferencing.blogspot.ru/2013/03/testplan-for-release-21-testing.html

On Sun, Mar 24, 2013 at 2:05 PM, Irina Arkhipets
<[email protected]> wrote:

> Hi Sebastian,
>
> Some time ago I created a test plan for Vasiliy, in which I tried to cover
> all the cases; it was reviewed by Alexei and Maxim.
>
> It's currently in Russian and is probably incomplete in some aspects.
> I'll try to translate it into English and share it with the others ASAP
> (hopefully later today or tomorrow). Vasiliy will be responsible for the
> reports on test execution.
>
> And yes, it is our fault that we did not share it with the community from
> the very beginning :(.
>
> You are right about the automated JUnit tests. I'll try to help Maxim with
> this :)
>
> Best regards,
> Irina.
>
>
>
> On Sun, Mar 24, 2013 at 11:15 AM, [email protected] <
> [email protected]> wrote:
>
>> I would already be happy if we did the following:
>>
>> 1) Enable the JUnit tests to run automatically (using the backup import
>> via JUnit as an example), so that every committer can add new JUnit tests
>> that run with every Nightly Build (see the sketch after this list).
>>
>> 2) Start a list of test/use cases that should be performed with every
>> release.
>> Maybe there is already such a list? What did Alexey, Artyom, Irina, Vasya,
>> and Yuliya actually test?
>> How did they manage the work of "testing"; did they agree on any tests
>> that need to be performed?
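>>
>> Regarding 1), a trivial sketch of what such a committer-added test could
>> look like (class and method names here are just placeholders, assuming
>> JUnit 4 as used elsewhere in the project):
>>
>> import static org.junit.Assert.assertEquals;
>>
>> import org.junit.Test;
>>
>> // Hypothetical example; any class matching the build's test pattern
>> // would be picked up the same way once the build runs the test suite.
>> public class TestExample {
>>
>>     // Any assertion failure here would show up in the next Nightly Build.
>>     @Test
>>     public void additionWorks() {
>>         assertEquals(4, 2 + 2);
>>     }
>> }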
>>
>> Sebastian
>>
>>
>>
>> 2013/3/24 Maxim Solodovnik <[email protected]>
>>
>> > I guess we need to improve and enlarge our automated tests and rely on
>> > them in the future.
>> > Right now it is necessary to run manual tests :(
>> > I'll try to write 1-2 tests per day/week (too much work right now :(( )
>> >
>> >
>> > On Sun, Mar 24, 2013 at 11:52 AM, [email protected] <
>> > [email protected]> wrote:
>> >
>> > > Wicket will help with testing. However, our client is 100% Flash now.
>> > >
>> > > Do we want to run UI tests against the Flash UI, or do we only want to
>> > > run JUnit tests automatically?
>> > >
>> > > How do we define which JUnit tests run automatically?
>> > >
>> > > From my perspective, the more we can test automatically, the less time
>> > > we spend on even more painful tasks.
>> > > Because every test that is _not_ automated means that:
>> > >  - it is likely that nobody will do the testing
>> > >  - an extremely painful process will start in which we maintain a wiki
>> > > document that lists all the tests (with all the problems that entails:
>> > > nobody takes care of those documents, nobody can really verify whether
>> > > those tests have been performed at all, et cetera)
>> > >
>> > > So from my perspective, putting some time into an automated test is
>> > > still much less pain than re-running all those manual tests, with the
>> > > mail ping-pong and discussion, for every release that we do.
>> > >
>> > > Sebastian
>> > >
>> > >
>> > >
>> > > 2013/3/24 Maxim Solodovnik <[email protected]>
>> > >
>> > > > It is hard for me to answer such long letters :)))
>> > > >
>> > > > >> Yeah, well how should any user do a test if there is no public
>> > > > >> demo?
>> > > > I'm not sure what the status of demo.dataved.ru is; it allows
>> > > > self-registration and is up 24/7. But you are right, there was no
>> > > > "call to test". I was sure my emails like "people, I'm going to
>> > > > release, please stop me if it is too early" were a sort of call to
>> > > > test it and let me know....
>> > > >
>> > > > I agree on "automated testing"; I promise I'll add our tests to the
>> > > > build (I forgot about it, I will create a JIRA issue).
>> > > >
>> > > > Since we will be on Wicket, we can finally start writing tests for
>> > > > our UI, similar to their tests (I have never tried that).
>> > > > I do like automated tests, it is just not my favorite task :)
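>> > > >
>> > > > Something like this minimal sketch using Wicket's WicketTester could
>> > > > be a starting point (untested; the component id and test class name
>> > > > are arbitrary):
>> > > >
>> > > > import org.apache.wicket.markup.html.basic.Label;
>> > > > import org.apache.wicket.util.tester.WicketTester;
>> > > > import org.junit.Test;
>> > > >
>> > > > public class TestLabelRenders {
>> > > >
>> > > >     @Test
>> > > >     public void labelRenders() {
>> > > >         // the no-arg WicketTester runs against a mock application
>> > > >         WicketTester tester = new WicketTester();
>> > > >         // render a single component without writing a page class
>> > > >         tester.startComponentInPage(new Label("msg", "Hello"));
>> > > >         tester.assertLabel("msg", "Hello");
>> > > >     }
>> > > > }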
>> > > >
>> > > > According to our release guide (which follows the Apache guide),
>> > > > http://openmeetings.apache.org/ReleaseGuide.html :
>> > > >
>> > > > *Before voting +1 PMC members are required to download the signed
>> > > > source code package, compile it as provided, and test the resulting
>> > > > executable on their own platform, along with also verifying that the
>> > > > package meets the requirements of the ASF policy on releases.*
>> > > >
>> > > > On Sun, Mar 24, 2013 at 11:17 AM, [email protected] <
>> > > > [email protected]> wrote:
>> > > >
>> > > > > We did extensive testing of 2.1 (Alexey, Artyom, Irina, Vasya,
>> > > > > Yuliya)
>> > > > > => Where did they perform the tests? I thought we would invite the
>> > > > > community to help us with the testing.
>> > > > >
>> > > > > *1) there were no issues reported by users*
>> > > > > Yeah, well how should any user do a test if there is no public
>> > > > > demo? I also did not hear any call on the user mailing list
>> > > > > inviting users to test.
>> > > > > *2) We better release 2.1.1 or 2.2 in a month than wait another 6
>> > > > > months*
>> > > > > I agree on that. But our past agreement was more like "dev complete
>> > > > > => release". That model will not work for our future,
>> > > > > and I want to make sure that everybody involved understands that.
>> > > > >
>> > > > > IMHO our lack of automated testing, and the resulting need for a
>> > > > > manual click-through test of all the features, is one of the
>> > > > > biggest issues in our current project.
>> > > > > For example, I do not understand why the JUnit test for the backup
>> > > > > import was never integrated into the Nightly Builds. All that work
>> > > > > you've put into it, and now simply nobody uses it.
>> > > > > It would be such a nice thing to wake up every morning and see
>> > > > > which tests fail and what to look at. I guess there are only a
>> > > > > couple of bits missing to get the backup import running
>> > > > > automatically, but I don't understand what keeps us from doing
>> > > > > that.
>> > > > >
>> > > > > The same goes for the rest of the JUnit tests. Of course a good
>> > > > > number of the tests are simply outdated.
>> > > > > But if there were at least a minimal subset of tests that run
>> > > > > automatically, that would be a 100% improvement, because at the
>> > > > > moment exactly zero tests run automatically.
>> > > > > This will become even more interesting with Wicket, where you can
>> > > > > test a lot of the UI with simple JUnit tests.
>> > > > > The manual work that Alexey, Artyom, Irina, Vasya, Yuliya, and
>> > > > > anybody else involved has done for 2.1
>> > > > > => will need to happen with every release: 2.1.1, 2.2, ...
>> > > > > An approach like "a feature that has been tested in release 2.1
>> > > > > needs no more testing in release 2.1.1 (or 2.2)" is something I
>> > > > > will not agree to in any sense. Every release needs a full test.
>> > > > > And IMHO this approach will not scale at all with the growing
>> > > > > number of committers.
>> > > > >
>> > > > > It would be great if we started thinking about what we will do to
>> > > > > improve this in the future.
>> > > > >
>> > > > > The tools are basically there, but it seems like nobody involved
>> > > > > in the project believes that automated tests make sense (except
>> > > > > me)?
>> > > > >
>> > > > > From @Alexey I know that he believes only feature additions add
>> > > > > value to the end product. And from that perspective, "testing" is
>> > > > > not a "feature" that adds any value to the end user.
>> > > > > So my question would be: do we really want to do the same amount
>> > > > > of manual click-through testing that we do now with every
>> > > > > release?!
>> > > > > I mean: am I the only person sick of downloading every release and
>> > > > > clicking through every feature for 30 minutes to give a "+1"?!
>> > > > >
>> > > > > Sebastian
>> > > > >
>> > > > >
>> > > > > 2013/3/24 Maxim Solodovnik <[email protected]>
>> > > > >
>> > > > > > We did extensive testing of 2.1 (Alexey, Artyom, Irina, Vasya,
>> > > > > > Yuliya); additional reasons are:
>> > > > > > 1) there were no issues reported by users
>> > > > > > 2) We better release 2.1.1 or 2.2 in a month than wait another 6
>> > > > > > months
>> > > > > >
>> > > > > > P.S. Apache Wicket has a 1-month release cycle .... I believe we
>> > > > > > should have a 2-3 month one.
>> > > > > >
>> > > > > > On Sun, Mar 24, 2013 at 10:20 AM, [email protected] <
>> > > > > > [email protected]> wrote:
>> > > > > >
>> > > > > > > Hi Maxim,
>> > > > > > >
>> > > > > > > I was wondering: did the testing phase that I thought we had
>> > > > > > > agreed on already happen?
>> > > > > > > Or is there another reason why you initiated this RC?
>> > > > > > >
>> > > > > > > Sebastian
>> > > > > > >
>> > > > > > >
>> > > > > > > 2013/3/23 Maxim Solodovnik <[email protected]>
>> > > > > > >
>> > > > > > > > Dear OpenMeetings Community,
>> > > > > > > >
>> > > > > > > > I would like to start a vote on releasing Apache OpenMeetings
>> > > > > > > > 2.1.0 RC3.
>> > > > > > > >
>> > > > > > > > RC2 was rejected due to the broken audio/video setup panel.
>> > > > > > > >
>> > > > > > > > Main changes are covered in the README:
>> > > > > > > > http://svn.apache.org/repos/asf/openmeetings/tags/2.1RC3/README
>> > > > > > > >
>> > > > > > > > Full changelog:
>> > > > > > > > http://svn.apache.org/repos/asf/openmeetings/tags/2.1RC3/CHANGELOG
>> > > > > > > >
>> > > > > > > > Release artefacts:
>> > > > > > > > https://dist.apache.org/repos/dist/dev/openmeetings/2.1/rc3/
>> > > > > > > >
>> > > > > > > > Tag: http://svn.apache.org/repos/asf/openmeetings/tags/2.1RC3/
>> > > > > > > >
>> > > > > > > > PGP release keys (signed using C467526E):
>> > > > > > > > https://dist.apache.org/repos/dist/dev/openmeetings/2.1/rc3/KEYS
>> > > > > > > >
>> > > > > > > > Vote will be open for 72 hours.
>> > > > > > > >
>> > > > > > > > [ ] +1  approve
>> > > > > > > > [ ] +0  no opinion
>> > > > > > > > [ ] -1  disapprove (and reason why)
>> > > > > > > >
>> > > > > > > > My vote is +1.
>> > > > > > > >
>> > > > > > > >
>> > > > > > > > --
>> > > > > > > > WBR
>> > > > > > > > Maxim aka solomax
>> > > > > > > >
>> > > > > > >
>> > > > > > >
>> > > > > > >
>> > > > > > > --
>> > > > > > > Sebastian Wagner
>> > > > > > > https://twitter.com/#!/dead_lock
>> > > > > > > http://www.webbase-design.de
>> > > > > > > http://www.wagner-sebastian.com
>> > > > > > > [email protected]
>> > > > > > >
>> > > > > >
>> > > > > >
>> > > > > >
>> > > > > > --
>> > > > > > WBR
>> > > > > > Maxim aka solomax
>> > > > > >
>> > > > >
>> > > > >
>> > > > >
>> > > > > --
>> > > > > Sebastian Wagner
>> > > > > https://twitter.com/#!/dead_lock
>> > > > > http://www.webbase-design.de
>> > > > > http://www.wagner-sebastian.com
>> > > > > [email protected]
>> > > > >
>> > > >
>> > > >
>> > > >
>> > > > --
>> > > > WBR
>> > > > Maxim aka solomax
>> > > >
>> > >
>> > >
>> > >
>> > > --
>> > > Sebastian Wagner
>> > > https://twitter.com/#!/dead_lock
>> > > http://www.webbase-design.de
>> > > http://www.wagner-sebastian.com
>> > > [email protected]
>> > >
>> >
>> >
>> >
>> > --
>> > WBR
>> > Maxim aka solomax
>> >
>>
>>
>>
>> --
>> Sebastian Wagner
>> https://twitter.com/#!/dead_lock
>> http://www.webbase-design.de
>> http://www.wagner-sebastian.com
>> [email protected]
>>
>
>
