Re: [opnfv-tech-discuss] [dovetail]L3VPN for dovetail area

2017-01-30 Thread morgan.richomme
Le 27/01/2017 à 18:49, Wenjing Chu a écrit :
>
> I agree more information is better, however it is not obvious where
> that information will come from.
>
>  
>
> -Test case run result history
>
> Do we keep records of old runs in Colorado for functest & yardsticks?
> If we do, let’s link them up.
>
> If not, we can always re-run these tests on the frozen Colorado
> release and produce these results. Are we regularly running them now?
>
> Also note, the Colorado release is frozen, test cases are frozen, so
> this spot info may not be as relevant as it appears. However I agree
> it’ll become more informational and valuable with longer history.
>
>  
>
http://testresults.opnfv.org/reporting/functest/release/colorado/index-status-apex.html
No runs on Colorado for more than 10 days; resources are now allocated to
Danube/master.

All previous test results are stored in the DB:
http://testresults.opnfv.org/test/api/v1/results?project=yardstick=10=colorado
http://testresults.opnfv.org/test/api/v1/results?case=tempest_smoke_serial=10=colorado
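The two result links above can be turned into programmatic queries against the testresults REST API. Below is a minimal sketch; note that the query-parameter names ("project", "case", "version") are assumptions, since the archived URLs above lost their '&' separators, and the payload shape handled by `summarize` is illustrative only, not the documented API schema.

```python
import json
from urllib.parse import urlencode

# Base endpoint taken from the links above; the parameter names are
# assumptions -- the archived URLs lost their '&' separators.
BASE = "http://testresults.opnfv.org/test/api/v1/results"

def build_query(**params):
    """Build a results-API query URL from keyword filters."""
    return BASE + "?" + urlencode(params)

def summarize(payload):
    """Count passes in a payload of the assumed (illustrative) shape:
    {"results": [{"case_name": ..., "criteria": "PASS"|"FAIL"}, ...]}"""
    results = payload.get("results", [])
    passed = sum(1 for r in results if r.get("criteria") == "PASS")
    return {"total": len(results), "passed": passed}

if __name__ == "__main__":
    print(build_query(project="yardstick", version="colorado"))
    # Hypothetical sample payload, for illustration only:
    sample = {"results": [
        {"case_name": "tempest_smoke_serial", "criteria": "PASS"},
        {"case_name": "bgpvpn", "criteria": "FAIL"},
    ]}
    print(json.dumps(summarize(sample)))
```

In practice you would fetch the URL with any HTTP client and feed the decoded JSON to `summarize`.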



> -Source code of test cases
>
> Do we have a link to source code repos in openstack, ODL/ONOS, etc
> upstreams?
>
> Can someone involved in CI/CD pitch in?
>
For Functest, you can find such links in the Dockerfile used to build the container:
https://git.opnfv.org/functest/tree/docker/Dockerfile?h=stable/colorado

There you will find references to the OPNFV repos and the upstream repos used
for the tests:

# OpenStack repositories
RUN git clone --depth 1 -b $OPENSTACK_TAG https://github.com/openstack/networking-bgpvpn ${repos_dir}/bgpvpn
RUN git clone --depth 1 -b $KINGBIRD_TAG https://github.com/openstack/kingbird.git ${repos_dir}/kingbird
RUN git clone --depth 1 -b $RALLY_TAG https://github.com/openstack/rally.git ${repos_dir}/rally
RUN git clone --depth 1 -b $TEMPEST_TAG https://github.com/openstack/tempest.git ${repos_dir}/tempest
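The Dockerfile lines above all follow one pattern: clone an upstream repo at a pinned tag into a shared repos directory. A small sketch of that pattern follows, useful for reproducing the pinned checkouts outside Docker; the repos directory path and the tag placeholders are assumptions, not the real Colorado pins.

```python
# Mirror of the Dockerfile pattern above: each upstream repo is cloned
# at a pinned tag into a shared repos directory. The directory path and
# the $..._TAG placeholders are assumptions, not the real Colorado pins.
REPOS_DIR = "/home/opnfv/repos"  # assumption; the Dockerfile uses ${repos_dir}

UPSTREAMS = {
    "bgpvpn": ("https://github.com/openstack/networking-bgpvpn", "$OPENSTACK_TAG"),
    "kingbird": ("https://github.com/openstack/kingbird.git", "$KINGBIRD_TAG"),
    "rally": ("https://github.com/openstack/rally.git", "$RALLY_TAG"),
    "tempest": ("https://github.com/openstack/tempest.git", "$TEMPEST_TAG"),
}

def clone_commands(repos=UPSTREAMS, repos_dir=REPOS_DIR):
    """Render the 'git clone' command line for each pinned upstream."""
    return [
        f"git clone --depth 1 -b {tag} {url} {repos_dir}/{name}"
        for name, (url, tag) in repos.items()
    ]

if __name__ == "__main__":
    # Dry run: print the commands instead of executing them.
    for cmd in clone_commands():
        print(cmd)
```

Substituting the tag placeholders with the actual values from the stable/colorado Dockerfile gives the exact checkouts the tests were built against.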

Note that a catalog web site is planned for Danube

/Morgan


Re: [opnfv-tech-discuss] [dovetail]L3VPN for dovetail area

2017-01-27 Thread Wenjing Chu
On source code link, take a look if these (as examples) will serve the purpose 
for now.

https://gerrit.opnfv.org/gerrit/#/c/27225/ ,
https://gerrit.opnfv.org/gerrit/#/c/27221/ ,
https://gerrit.opnfv.org/gerrit/#/c/27227/ ,
https://gerrit.opnfv.org/gerrit/#/c/27231/ ,
https://gerrit.opnfv.org/gerrit/#/c/27223/

Wenjing


Re: [opnfv-tech-discuss] [dovetail]L3VPN for dovetail area

2017-01-27 Thread Wenjing Chu
I agree more information is better; however, it is not obvious where that
information will come from.


-Test case run result history
Do we keep records of old runs in Colorado for Functest & Yardstick? If we do,
let's link them up.
If not, we can always re-run these tests on the frozen Colorado release and
produce these results. Are we regularly running them now?
Also note that the Colorado release is frozen and the test cases are frozen, so
this spot info may not be as relevant as it appears. However, I agree it will
become more informative and valuable with a longer history.


-Source code of test cases
Do we have a link to the source code repos in the OpenStack, ODL/ONOS, etc. upstreams?
Can someone involved in CI/CD pitch in?

Wenjing


I posted some comments in gerrit. Here are the main points I think we need 
alignment on:

1)  All proposed Dovetail-included tests will be added one by one, each in a
separate commit.

Please follow the gerrit tickets below and see if you can follow through. Test 
cases are organized into two levels for convenience: test areas and test cases. 
There will be one commit for each test case, and one commit for each test area 
(which includes a lot of test cases that are related to a function area). The 
test case commit says we are good on how that test case is implemented. The 
test area commit says we agree the test case ought to be included. Clear enough?
[bryan] A test reference should not be added to a test area until the test has 
been approved (e.g. verified by more than one committer/reviewer). I see one 
commit with about 40 tests referenced. Have all these been verified?

[wenjing] The act of “adding a test case to a test area” is reflected as adding
a single line in the compliance_set.yml file. Using your example, I can add 40
tests in a patch, meaning I “propose” (based on the past wiki discussion)
that these 40 cases be included. The submitter is sending this proposal
out for review. If you have an opinion about any of these, or simply want more
explanation/clarification, please comment on the gerrit patch. Based on those
comments, the submitter can modify the patch to reflect the updated view and
resubmit, until we are good and approve with +1 (or disapprove with -1). Hope
that is clear. These are patch reviews, still to be approved. Not approved yet.
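As a rough illustration of the flow described above (a patch "proposes" lines for compliance_set.yml, and reviewers check each proposed reference), here is a hypothetical sketch; the test-case identifiers and layout are invented for illustration and are not the actual Dovetail schema.

```python
# Hypothetical sketch of the review flow described above: the compliance
# set is just a list of test-case references, and a patch "proposes" new
# lines. Identifiers below are illustrative, not the real Dovetail schema.
defined_testcases = {
    "dovetail.ipv6.tc001",
    "dovetail.sdnvpn.tc001",
    "dovetail.sdnvpn.tc002",
}

# Lines a patch proposes to add to compliance_set.yml:
proposed = ["dovetail.sdnvpn.tc001", "dovetail.sdnvpn.tc999"]

def review_proposal(proposed, defined):
    """Split proposed entries into known test cases and dangling refs."""
    known = [t for t in proposed if t in defined]
    dangling = [t for t in proposed if t not in defined]
    return known, dangling

known, dangling = review_proposal(proposed, defined_testcases)
print("ok:", known)        # entries backed by a test-case definition
print("review:", dangling) # entries with no definition -> comment / -1
```

A reviewer would then comment on (or -1) the dangling entries in gerrit, and the submitter would resubmit an updated patch.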

Re: [opnfv-tech-discuss] [dovetail]L3VPN for dovetail area

2017-01-27 Thread SULLIVAN, BRYAN L
More inline.

Thanks,
Bryan Sullivan | AT&T


I’m not sure what you mean by “verified”. If you mean whether the software is
tested, I think the answer is yes. If you mean whether the test case has been
approved, no; a patch email is precisely asking you to review it.

[bryan] It will be much more complicated to approve a block of tests for
inclusion, rather than test-by-test, unless it is clarified that as a group
they have been passing (e.g. by reference to Jenkins jobs where the tests have
been running successfully as a group). A single large commit with a bunch of
text strings, with no explanation as to what they are, where they have been
tested, etc., is not the way we should manage the Dovetail test-case list.



Re: [opnfv-tech-discuss] [dovetail]L3VPN for dovetail area

2017-01-26 Thread Wenjing Chu
Hi Bryan

Hope my inline responses are still readable …
Thanks.

Wenjing



2)  The commit will include a link to the details of the test case (script 
or otherwise what I would use to run the test for myself)

You can trace down to the source step by step, e.g. from test area to test
case, then to Functest or Yardstick, and/or to OpenStack or another upstream,
eventually reaching the source code in that upstream project.

To run it yourself, you would need a test environment/pod. I would think that
running the Dovetail tool, specifying the individual test cases you want to run,
and examining the results is probably a good way to go. Maybe it would be good
to write down a “how-to” cheat-sheet for this?
[bryan] Not sure how that answered the comment. Rather than having to search
for something that relates to the test case reference, it would be good for the
commit message (for test cases) to contain a URL reference to the test source.
That’s what I was referring to. We need to simplify the effort of reviewers, to
encourage more active reviews and second opinions/testing on the tests. Re the
test environment, that’s no problem; the test should clarify what is needed and
how to run it. Having an environment to run in is a given, and should be
clarified by the test anyway.

[wenjing] I’ll be happy to try a simpler way for us to review upstream code, so
if anyone has a better idea, please suggest it here. But let me first make sure
I understand what you are asking. If a piece of code is written in the Dovetail
project, yes, your gerrit patch will show everything. If we are only
referencing a test case in Functest, you will only see the referencing line of
source in Dovetail. I believe we’ve pro

Re: [opnfv-tech-discuss] [dovetail]L3VPN for dovetail area

2017-01-26 Thread Wenjing Chu
Hi Tapio

Think of these patches as a place to review the proposed test cases. Submitting
one does not imply that the submitter believes it’s already approved. If, in
your opinion, a Yardstick test case falls into the performance area and
shouldn’t be included, provide that comment in gerrit. The submitter may explain
that this is not the case, and once it’s cleared up, you can -1 or +1 it. So
this is the normal review process using gerrit.

Wrt Yardstick, my understanding is that it’s not only performance benchmark
testing, but I’ll leave that to the Yardstick folks to comment.

Regards
Wenjing


Re: [opnfv-tech-discuss] [dovetail]L3VPN for dovetail area

2017-01-25 Thread Tapio Tallgren
I would like to clarify one thing for myself: Dovetail is (currently)
about interface tests that test certain functionality using an
interface, and not about how fast or good the implementation of that
functionality is. Yardstick, in my mind, is about running performance
tests to benchmark the implementation. Is that right or not?


The reason I am asking is that many of the proposed test cases are about
measuring something. I will "-1" all those in Gerrit with this explanation.


-Tapio




On 01/26/2017 01:00 AM, Wenjing Chu wrote:




3) All tests need to be working under at least one scenario, and the
more scenarios that have been validated (either explicitly or
implicitly), the higher priority the test should get. “Implicit” means
that a test validated on a basic scenario (e.g. nosdn) is implicitly
validated on other scenarios for that installer. But explicit
validation is of course best.


Thanks for highlighting the implicit cases: more “implicit” is
“better”, because it means something works more “universally” rather
than relying on special cases. I would caution against the “more
scenarios” metric again because it does not necessarily mean “larger
applicability”. Sometimes it does, sometimes it doesn’t. Also note
that we ought not count non-generic scenarios the same as generic
ones. So let’s not be too numerical about it; the criterion should be
the larger applicability scope. I made this point in one of my earlier
emails as well, going through the scenarios in Colorado.


4) The reviewers may require that they be able to duplicate the test
validation before commit merge.


Please refer to 2) and see if you need anything else.
///

Thanks,

Bryan Sullivan | AT

*From:*opnfv-tech-discuss-boun...@lists.opnfv.org 
<mailto:opnfv-tech-discuss-boun...@lists.opnfv.org> 
[mailto:opnfv-tech-discuss-boun...@lists.opnfv.org] *On Behalf Of 
*Wenjing Chu

*Sent:* Wednesday, January 25, 2017 11:26 AM
*To:* Pierre Lynch <ply...@ixiacom.com <mailto:ply...@ixiacom.com>>; 
Jose Lausuch <jose.laus...@ericsson.com 
<mailto:jose.laus...@ericsson.com>>
*Cc:* TECH-DISCUSS OPNFV <opnfv-tech-discuss@lists.opnfv.org 
<mailto:opnfv-tech-discuss@lists.opnfv.org>>

*Subject:* Re: [opnfv-tech-discuss] [dovetail]L3VPN for dovetail area

The process that we may have already being informally following is as 
follows. We work towards having consensus on majority of areas that 
arise within the dovetail project. If there are open questions that we 
can’t resolve, we could gather the relevant info and bring that to TSC 
for decision. In the TSC review, dovetail will present the proposed 
plan out of dovetail, plus potentially open issues, and ask for (a) 
approval of the proposal (b) determination of open questions, if any.


Does this sound like a good process to follow?

On the topic of scenario cleanup, the Dovetail team has been voicing 
that opinion for a long time, and so I applaud and strongly support 
the effort to separate general vs specific scenarios, and it will help 
Dovetail tremendously going forward. However, also keep in mind that 
that work is slated for D and E releases. It unfortunately can’t help 
in the immediate task for us for C release target.


To join in the detailed re

Re: [opnfv-tech-discuss] [dovetail]L3VPN for dovetail area

2017-01-25 Thread Wenjing Chu
Thanks Bryan. See my response inline below.

Wenjing

From: SULLIVAN, BRYAN L [mailto:bs3...@att.com]
Sent: Wednesday, January 25, 2017 11:32 AM
To: Wenjing Chu <wenjing@huawei.com>; Pierre Lynch <ply...@ixiacom.com>; 
Jose Lausuch <jose.laus...@ericsson.com>
Cc: TECH-DISCUSS OPNFV <opnfv-tech-discuss@lists.opnfv.org>
Subject: RE: [opnfv-tech-discuss] [dovetail]L3VPN for dovetail area

I posted some comments in gerrit. Here are the main points I think we need 
alignment on:

1)  All proposed Dovetail-included tests will be added one by one, each in a
separate commit.

Please follow the Gerrit tickets below and see if you can follow through. Test
cases are organized into two levels for convenience: test areas and test cases.
There will be one commit for each test case, and one commit for each test area
(which groups the test cases related to one functional area). The test-case
commit says we are good on how that test case is implemented. The test-area
commit says we agree the test cases ought to be included. Clear enough?


2)  The commit will include a link to the details of the test case (the script,
or otherwise whatever I would use to run the test for myself).

You can trace down to the source step by step: from test area to test case,
then to Functest or Yardstick, and from there to OpenStack or another upstream,
eventually reaching the source code in that upstream project.

To test-run it, you would need a test environment/pod. I would think that
running the Dovetail tool, specifying the individual test cases you’d want to
run, and examining the results is probably a good way to go. Maybe it would be
good to write down a “how-to” cheat sheet for this?


3)  All tests need to work under at least one scenario, and the more scenarios
that have been validated (either explicitly or implicitly), the higher priority
the test should get. “Implicit” means that a test validated on a basic scenario
(e.g. nosdn) is implicitly validated on the other scenarios for that installer.
But explicit validation is of course best.

Thanks for highlighting the implicit cases: more “implicit” is “better”,
because it means something works more “universally” rather than relying on
special cases. I would caution on the “more scenarios” metric again, because it
does not necessarily mean “larger applicability”. Sometimes it does, sometimes
it doesn’t. Also note that we ought not count non-generic scenarios the same as
generic ones. So let’s not be too numerical about it; the criterion should be
the larger applicability scope. I made this point in one of my earlier emails
as well, going through the scenarios in Colorado.
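The implicit-validation rule in point 3 can be sketched in a few lines. This is
only an illustration of the counting logic under discussion, not Dovetail code;
the installer and scenario names below are made up for the example.

```python
def validated_scenarios(explicit, all_scenarios):
    """Return the set of (installer, scenario) pairs a test counts as validated on.

    A pass on an installer's basic 'nosdn' scenario implicitly validates the
    test on that installer's other scenarios (the rule from point 3).
    """
    validated = set(explicit)
    for installer, scenarios in all_scenarios.items():
        basic = (installer, "os-nosdn-nofeature-ha")
        if basic in validated:
            # implicit rule: basic-scenario pass covers the installer's others
            validated.update((installer, s) for s in scenarios)
    return validated

# Illustrative inventory: two installers, two scenarios each.
all_scenarios = {
    "apex": ["os-nosdn-nofeature-ha", "os-odl_l2-nofeature-ha"],
    "fuel": ["os-nosdn-nofeature-ha", "os-odl_l2-bgpvpn-ha"],
}
# Explicit validations: apex on the basic scenario, fuel on a feature scenario.
explicit = {("apex", "os-nosdn-nofeature-ha"), ("fuel", "os-odl_l2-bgpvpn-ha")}

validated = validated_scenarios(explicit, all_scenarios)
print(len(validated))  # 3: both apex scenarios (implicit), one fuel (explicit only)
```

Note how the implicit rule only widens coverage within one installer; it does
not transfer a result across installers, which matches the caution above about
not treating raw scenario counts as “larger applicability”.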


4)  The reviewers may require that they be able to duplicate the test 
validation before commit merge.

Please refer to 2) and see if you need anything else.
///


Thanks,
Bryan Sullivan | AT&T


Re: [opnfv-tech-discuss] [dovetail]L3VPN for dovetail area

2017-01-25 Thread SULLIVAN, BRYAN L
I posted some comments in gerrit. Here are the main points I think we need 
alignment on:

1)  All proposed Dovetail-included tests will be added one by one, each in a
separate commit.

2)  The commit will include a link to the details of the test case (the script,
or otherwise whatever I would use to run the test for myself).

3)  All tests need to work under at least one scenario, and the more scenarios
that have been validated (either explicitly or implicitly), the higher priority
the test should get. “Implicit” means that a test validated on a basic scenario
(e.g. nosdn) is implicitly validated on the other scenarios for that installer.
But explicit validation is of course best.

4)  The reviewers may require that they be able to duplicate the test 
validation before commit merge.

Thanks,
Bryan Sullivan | AT&T

From: opnfv-tech-discuss-boun...@lists.opnfv.org 
[mailto:opnfv-tech-discuss-boun...@lists.opnfv.org] On Behalf Of Wenjing Chu
Sent: Wednesday, January 25, 2017 11:26 AM
To: Pierre Lynch <ply...@ixiacom.com>; Jose Lausuch <jose.laus...@ericsson.com>
Cc: TECH-DISCUSS OPNFV <opnfv-tech-discuss@lists.opnfv.org>
Subject: Re: [opnfv-tech-discuss] [dovetail]L3VPN for dovetail area


The process that we may have already been informally following is as follows.
We work towards consensus on the majority of areas that arise within the
dovetail project. If there are open questions that we can’t resolve, we gather
the relevant info and bring that to the TSC for decision. In the TSC review,
dovetail will present the proposed plan out of dovetail, plus potentially open
issues, and ask for (a) approval of the proposal and (b) determination of the
open questions, if any.
Does this sound like a good process to follow?

On the topic of scenario cleanup, the Dovetail team has been voicing that
opinion for a long time, so I applaud and strongly support the effort to
separate general vs. specific scenarios; it will help Dovetail tremendously
going forward. However, also keep in mind that that work is slated for the D
and E releases. It unfortunately can’t help with our immediate task for the C
release target.

To join in the detailed review effort, please note that review of test areas
and test cases is based on Jira and Gerrit. For example:

These are for test areas: (the file is compliance_set.yml)

https://gerrit.opnfv.org/gerrit/27493
https://gerrit.opnfv.org/gerrit/27219

And here is an example of a test case within a test area:

https://gerrit.opnfv.org/gerrit/27221

These Gerrit links are also posted on the wiki for convenience:
https://wiki.opnfv.org/display/dovetail/Dovetail+Test+Areas+and+Test+Cases
However, it’s a bit slow to refresh there since it is a manual process, so I
would still recommend you get on Gerrit. We are at the beginning of the review
process, so it’s not too late. General-level questions or specific topics can
of course still be discussed on the mailing list or in meetings, but try to
stay on Gerrit as much as you can. Let us know if you have any feedback. Thanks.

Regards
Wenjing
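For anyone wanting to pull one of the linked changes down for local review:
Gerrit publishes patchset N of change C under the ref
refs/changes/<last two digits of C>/<C>/<N>. A small sketch of that naming
convention follows; the patchset number and the fetch URL in the comment are
illustrative, so check the change page for the actual latest patchset.

```python
def gerrit_change_ref(change: int, patchset: int = 1) -> str:
    """Build the Gerrit ref for a given change number and patchset."""
    # The shard is the last two digits of the change number, zero-padded.
    return f"refs/changes/{change % 100:02d}/{change}/{patchset}"

print(gerrit_change_ref(27493))  # refs/changes/93/27493/1
# Fetch it for local review (illustrative; requires access to the repo):
#   git fetch https://gerrit.opnfv.org/gerrit/dovetail refs/changes/93/27493/1
#   git checkout FETCH_HEAD
```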

From: opnfv-tech-discuss-boun...@lists.opnfv.org
[mailto:opnfv-tech-discuss-boun...@lists.opnfv.org] On Behalf Of Pierre Lynch
Sent: Wednesday, January 25, 2017 8:44 AM
To: Jose Lausuch <jose.laus...@ericsson.com>
Cc: TECH-DISCUSS OPNFV <opnfv-tech-discuss@lists.opnfv.org>
Subject: Re: [opnfv-tech-discuss] [dovetail]L3VPN for dovetail area

IMHO, getting agreement on what the scope of testing will be (features, etc.)
should be pretty urgent. How should we go about it? Agree within the Dovetail
team, then run it by the TSC to get their blessing? Should we consolidate this
process with the current ongoing discussion on scenario consolidation, which
led to the idea of generic versus specific scenarios? Dovetail would include
generic scenarios, while specific scenarios would be excluded from Dovetail? It
would provide uniformity….

I would expect that determining what’s in and what’s out could be a delicate
process.

Thanks,

Pierre





Re: [opnfv-tech-discuss] [dovetail]L3VPN for dovetail area

2017-01-25 Thread Jose Lausuch
Thanks Chris, that makes things clearer. But still, it is a broad statement and
difficult to measure. I guess, as you say, the TSC has the final word when
approving features to be verified/certified in Dovetail with existing tests.
From a functional perspective, I can just provide an overview of how the tests
were behaving when releasing Colorado.

Regards,
Jose




Re: [opnfv-tech-discuss] [dovetail]L3VPN for dovetail area

2017-01-25 Thread Christopher Price
Hi Jose,

The intent of this statement is that we should not attempt to establish
compliance tests on features or capabilities that are unique to a very specific
configuration or composition of components. The statement is intended to mean
that we should focus our compliance efforts on “generally available” or
“community-relevant” use cases and features.

Again, I am not able to accurately articulate what that means or how to measure
it; as such, we have a somewhat obtuse statement in the documentation.

This should be seen as a guideline to be applied by the development, testing
and dovetail teams around expectations for compliance testing. It would
eventually be ratified or judged by the TSC, as they have the final say on the
tests that are approved for compliance validation for a given dovetail release.

Does that help?
I do believe we should formalize our governance, commit it into a repo, and
have the TSC cast an approving eye over it as well, for good form. Then, if
nothing else, we would have a more consistent view of our intentions and the
needed approach.

/ Chris



Re: [opnfv-tech-discuss] [dovetail]L3VPN for dovetail area

2017-01-25 Thread Jose Lausuch
Hi Hongbo,


Test cases must pass on OPNFV reference deployments
* Tests must not require a specific NFVi platform composition or 
installation tool

Can you please explain what exactly this statement means? By "installation
tool", are we talking about the installers we have, or a specific and different
tool to install a certain feature?

Adding Tim, who is the SDNVPN PTL.

Thanks,
Jose



[opnfv-tech-discuss] [dovetail]L3VPN for dovetail area

2017-01-24 Thread Tianhongbo
Hi Jose:

As you mentioned, there will be a discussion about the details of L3VPN with
the L3VPN team, to check whether L3VPN can be included in the Dovetail test
areas now.

There are some requirements on the dovetail wiki page:
https://wiki.opnfv.org/display/dovetail/Dovetail+Test+Case+Requirements

Look forward to your reply.

Best regards

hongbo
___
opnfv-tech-discuss mailing list
opnfv-tech-discuss@lists.opnfv.org
https://lists.opnfv.org/mailman/listinfo/opnfv-tech-discuss