See below...


----------------------------------------

From: "xudan (N)" <xuda...@huawei.com<mailto:xuda...@huawei.com>>
Date: Monday, January 21, 2019 at 1:19:22 AM
To: "LOVETT, TREVOR J" <tl2...@att.com<mailto:tl2...@att.com>>, "Kanagaraj 
Manickam" 
<kanagaraj.manic...@huawei.com<mailto:kanagaraj.manic...@huawei.com>>, 
"opnfv-tech-discuss@lists.opnfv.org<mailto:opnfv-tech-discuss@lists.opnfv.org>" 
<opnfv-tech-discuss@lists.opnfv.org<mailto:opnfv-tech-discuss@lists.opnfv.org>>
Cc: "STARK, STEVEN" <ss8...@att.com<mailto:ss8...@att.com>>, "WRIGHT, STEVEN A" 
<sw3...@att.com<mailto:sw3...@att.com>>, "HALLAHAN, RYAN" 
<rh1...@att.com<mailto:rh1...@att.com>>, "WEINSTOCK, ALAN M" 
<aw2...@att.com<mailto:aw2...@att.com>>, "Gaoweitao (Victor, Cloudify Network 
OSDT)" <victor....@huawei.com<mailto:victor....@huawei.com>>, 
"mok...@intracom-telecom.com<mailto:mok...@intracom-telecom.com>" 
<mok...@intracom-telecom.com<mailto:mok...@intracom-telecom.com>>, 
"pkara...@intracom-telecom.com<mailto:pkara...@intracom-telecom.com>" 
<pkara...@intracom-telecom.com<mailto:pkara...@intracom-telecom.com>>
Subject: RE: [opnfv-tech-discuss] [Dovetail] Example VVP Output per Action Item 
from Developer Event

>
> Hi Trevor,
>
>
> The results format looks much better and clearer. I have some further 
> questions about the results.
>
> 1. The ‘result’ of each test case could be PASS/FAIL/SKIP. In which case will 
> a test case be SKIP?
>
[Trevor] Details are in the docs I linked to, but essentially this means the test is 
not applicable given the template contents (e.g. the requirement is about 
neutron ports but the template doesn’t have any neutron ports).

> 2. For some test cases, ‘requirements’ has 3 keys: ‘id’, ‘text’ and ‘keyword’. 
> There are also some test cases with an empty ‘requirements’ (Line 915). Why 
> are they empty, without ‘id’, ‘text’ and ‘keyword’?

[Trevor] there are a few areas where the tests need to be synced up with the 
latest requirement changes. In these instances the requirement has been 
deleted, but the test hasn’t been updated yet. We will be making those updates 
this week.

> 3. From Line 10539 to the end, there is another, different kind of data 
> format. Why are there 2 totally different data formats?

Please refer to the documentation (see the Requirement Result section).

> 4. The ‘outcome’ should be the final result of all test cases. How is the 
> outcome determined? Is it PASS if there are no FAIL test cases, or if there 
> are no FAIL/SKIP test cases?

At the header level there is one field that summarizes the overall result as 
PASS, FAIL, or ERROR; see the Header/Top Level section of the docs.
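
For illustration, a minimal Python sketch of how a consumer such as Dovetail or
VTP could read report.json and derive both the overall outcome and the per-test
results. The key names used below ("outcome" for the header field, "tests" for
the list of per-test entries, "result" for each test's PASS/FAIL/SKIP value) are
assumptions based on this thread; confirm them against the wiki spec linked
above.

import json
from collections import Counter

def summarize_report(path="report.json"):
    # Key names are assumptions; check the file format specification.
    with open(path) as f:
        report = json.load(f)

    overall = report.get("outcome")  # header-level PASS / FAIL / ERROR
    per_test = Counter(t.get("result") for t in report.get("tests", []))

    print("Overall outcome:", overall)
    for result, count in per_test.items():
        print(f"  {result}: {count}")  # counts of PASS / FAIL / SKIP
    return overall, per_test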


> Anyway, the results format is sufficient for both the Dovetail tool and the 
> web portal.
>
>
> Regards,
>
> Dan
>
>
> From: LOVETT, TREVOR J [mailto:tl2...@att.com]
> Sent: Monday, January 21, 2019 12:41 PM
> To: Kanagaraj Manickam <kanagaraj.manic...@huawei.com>; xudan (N) 
> <xuda...@huawei.com>; opnfv-tech-discuss@lists.opnfv.org
> Cc: STARK, STEVEN <ss8...@att.com>; WRIGHT, STEVEN A <sw3...@att.com>; 
> HALLAHAN, RYAN <rh1...@att.com>; WEINSTOCK, ALAN M <aw2...@att.com>; 
> Gaoweitao (Victor, Cloudify Network OSDT) <victor....@huawei.com>; 
> mok...@intracom-telecom.com; pkara...@intracom-telecom.com
> Subject: RE: [opnfv-tech-discuss] [Dovetail] Example VVP Output per Action 
> Item from Developer Event
>
>
> It still needs to be merged with master, but I’ve submitted a change to the 
> validation-scripts that has a more comprehensive output.
>
>
> For more details, please see the file format specification here: 
> https://wiki.onap.org/display/DW/Updated+JSON+Report+Output+for+Validation+Scripts
>
>
> I’ve attached a sample file here as well.
>
>
> I’ve also added a new option to the command line to direct output to a new 
> directory (--output-directory).
>
>
> The failures file still exists and can continue to be used if needed, but I 
> would recommend the new file (report.json).
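>
> As an illustration only, a wrapper such as VTP might invoke the validation 
> scripts and collect report.json roughly as sketched below. Only 
> --output-directory is confirmed in this thread; the pytest entry point, the 
> --template-directory flag and the ice_validator working directory are 
> assumptions to be checked against the validation-scripts documentation.
>
> import subprocess
>
> def run_vvp(template_dir, output_dir):
>     # Hypothetical wrapper call; verify flag names against the scripts' docs.
>     subprocess.run(
>         ["pytest", "tests",
>          "--template-directory", template_dir,
>          "--output-directory", output_dir],
>         cwd="vvp-validation-scripts/ice_validator",
>         check=False,  # a non-zero exit just means violations were found
>     )
>     return f"{output_dir}/report.json"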
>
>
> Thanks,
>
> Trevor
>
>
> From: Kanagaraj Manickam [mailto:kanagaraj.manic...@huawei.com]
> Sent: Friday, January 18, 2019 2:43 AM
> To: LOVETT, TREVOR J <tl2...@att.com>; xudan (N) <xuda...@huawei.com>; 
> opnfv-tech-discuss@lists.opnfv.org
> Cc: STARK, STEVEN <ss8...@att.com>; WRIGHT, STEVEN A <sw3...@att.com>; 
> HALLAHAN, RYAN <rh1...@att.com>; WEINSTOCK, ALAN M <aw2...@att.com>; 
> Gaoweitao (Victor, Cloudify Network OSDT) <victor....@huawei.com>; 
> mok...@intracom-telecom.com; pkara...@intracom-telecom.com
> Subject: RE: [opnfv-tech-discuss] [Dovetail] Example VVP Output per Action 
> Item from Developer Event
>
>
> Hi Trevor,
>
>
> Thank you for the inputs.
>
>
> I have integrated the VVP scripts in VTP and tested them; the results need to 
> be refined further.
>
> Please find more detailed in-line responses below.
>
>
> Thanks.
>
>
> Regards,
>
> Kanagaraj Manickam
>
> Senior System Architect
>
> P&S ONAP
>
> Huawei Technologies India Pvt. Ltd.
>
> Survey No. 37, Next to EPIP Area, Kundalahalli, Whitefield
>
> Bengaluru-560066, Karnataka
>
> Tel: + 91-80-49160700 ext 72410 Mob: 9945602938
>
>
>
>
>
> From: LOVETT, TREVOR J [mailto:tl2...@att.com]
> Sent: 16 January 2019 21:49
> To: Kanagaraj Manickam <kanagaraj.manic...@huawei.com>; xudan (N) 
> <xuda...@huawei.com>; opnfv-tech-discuss@lists.opnfv.org
> Cc: STARK, STEVEN <ss8...@att.com>; WRIGHT, STEVEN A <sw3...@att.com>; 
> HALLAHAN, RYAN <rh1...@att.com>; WEINSTOCK, ALAN M <aw2...@att.com>; 
> Gaoweitao (Victor, Cloudify Network OSDT) <victor....@huawei.com>; 
> mok...@intracom-telecom.com; pkara...@intracom-telecom.com
> Subject: RE: [opnfv-tech-discuss] [Dovetail] Example VVP Output per Action 
> Item from Developer Event
>
>
> Here are the answers to your questions…
>
>
> 1. As we know, an ONAP Vendor Software Product (VSP) is a zip containing the 
> required HOT template along with MANIFEST.json. Does the VVP script validate 
> this VSP as a whole, including the MANIFEST.json, or does it only validate the 
> HOT templates inside the VSP?
>
>
> [Trevor] For a VNF created from Heat, the VSP is an artifact created in SDC 
> itself so we are not uploading a VSP nor verifying the VSP as a whole – only 
> the Heat. The Heat Orchestration Template package is a zip file with all 
> files (Heat templates, environment files, and any supporting files/scripts) 
> in the root directory of that zip file. This is what is uploaded into SDC. 
> There is no MANIFEST.json file included per the Heat requirements.
>
>
> [Kanag] OK. I observed from the vFW demo that the artifact used for creating 
> the VSP contains these files: 1. HOT template, 2. HOT env file, 3. 
> MANIFEST.json. So as part of CVC, when a vendor submits their VNF package, I 
> assumed MANIFEST would also be included. If this is not the case, I think we 
> can ignore this :)
>
>
> 2. As VNFREQS defines a list of HOT template requirements with MUST and MUST 
> NOT criteria,
>
> a. How do we validate a given VSP against only the MUST VNFREQS by using 
> https://github.com/onap/vvp-validation-scripts ?
>
>
> [Trevor] The validation scripts already only validate MUST and MUST NOT 
> requirements – not SHOULD and MAY. If you’re suggesting we need a way to 
> filter further on testing just MUST and not MUST NOTS, please elaborate on 
> why you think that is necessary and provide an example. All the tests in the 
> validation-scripts section must execute and pass for Heat to be considered 
> valid.
>
> [Kanag] OK.
>
>
> b. How do we validate a given VSP for a given set of VNFREQS by using these 
> validation scripts?
>
> NOTE: Here we are assuming the VVP scripts support the VSP as a whole.
>
>
> [Trevor] Please provide an example or use case of what you mean here. The 
> validation-scripts only validate the Heat requirements. The validations 
> already bypass checks that are not necessary based on conditions. Why would 
> you need further filtering of the execution?
>
> [Kanag] I was under the assumption that we need to validate the VNF package 
> against a given set of VNFREQS to satisfy the CVC. But in this week’s VNFSDK 
> meeting, it was decided to go with all the checks released as part of 
> Casablanca. So this use case may not be required now.
>
>
> 3. What versions of the OpenStack HOT template (heat_template_version) do the 
> VVP scripts support?
>
>
> a. [Trevor] There are no restrictions placed on heat_template_version in the 
> validation-scripts directly, as the version of OpenStack is somewhat 
> operator-specific. The validation-scripts support versions as early as 
> 2013-05-23, but we haven’t extensively tested it against all versions.
>
> [Kanag] OK.
>
>
> I don’t have any issue with you all writing a wrapper. You can either take 
> the output files we documented and convert the results to a compatible format 
> or we can add an additional output format for consumption by Dovetail/OVP. If 
> you would like us to produce an additional output format, then I would need 
> some additional detail on the format described in your email. It’s relatively 
> trivial to create a new output format.
>
>
> [Kanag] I tried to integrate the VVP scripts in VTP and the following are the 
> responses:
>
>
> Success case:
>
> {"results" : [{}],
>
> "build_tag" :“build-1”,
> "criteria" : "PASS"
> }
>
> Failure case:
>
> It captures results as below by pulling details from the output/failures file, 
> using the entries vnfreqs, file (template) and message.
>
> {"results" : [
>
> {
> "vnfreqs" : "[]",
>
> “file” :“xxx”,
>
> “message”:“xxx”
> },
>
> {
> "vnfreqs" : "[]",
>
> “file” :“xxx”,
>
> “message”:“xxx”
> }
>
> ],
>
> "build_tag" :“build-1”,
> "criteria" : "FAILED"
> }
>
>
> Please let me know if any changes are required. I have also attached a sample 
> for reference.
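>
> For illustration, a minimal Python sketch of the wrapper conversion described 
> above, assuming the output/failures file is a JSON list of entries with the 
> "file", "vnfrqts" and "message" keys documented later in this thread; the 
> function name and defaults are made up for the example.
>
> import json
>
> def failures_to_vtp(failures_path, build_tag="build-1"):
>     # The failures file is absent when there are no violations.
>     try:
>         with open(failures_path) as f:
>             failures = json.load(f)
>     except FileNotFoundError:
>         failures = []
>
>     results = [
>         {
>             "vnfreqs": entry.get("vnfrqts", []),  # requirement IDs
>             "file": entry.get("file", []),        # templates the error was found in
>             "message": entry.get("message", ""),
>         }
>         for entry in failures
>     ] or [{}]  # success case carries a single empty entry
>
>     return {
>         "results": results,
>         "build_tag": build_tag,
>         "criteria": "PASS" if not failures else "FAILED",
>     }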
>
>
> I have made the following observations while testing:
>
> 1. On consecutive executions of test cases, it appends to the existing 
> output/failures file. Is there a way to overwrite this file instead of 
> appending?
>
> 2. Is there a way to set the output path while running the test cases? 
> Currently the output folder is generated directly under the ice_validator 
> folder. This would help to run test cases simultaneously from VTP, or to run 
> test cases for different HOT templates.
>
>
>
>
> · In “results”, is there one entry per validation?
>
> [Kanag] yes. Similar to the output/failures file format.
>
> · In “results”, what does it look like if there’s a failure?
>
> [Kanag] As given in the attachment.
>
> · In “results”, are there any restrictions on length or formatting of the 
> error messages? Some of the VVP information is multi-line.
>
> [Kanag] I think as long as the failures JSON file is in proper JSON format, 
> there should not be any issue; VTP will handle it smoothly.
>
> · What does “build_tag” represent?
>
> [Kanag] Dovetail sets this when it triggers the test cases on VTP. It can be 
> ignored in the test case output, as VTP takes care of it.
>
> · What does “criteria” refer to? Is it the overall result of the validation? 
> If so, what are the valid values?
>
> [Kanag] Yes. PASS means the test case passed with no failures; otherwise it is 
> FAILED, in the VVP case.
>
>
>
> Thanks,
>
> Trevor
>
>
>
> From: Kanagaraj Manickam [mailto:kanagaraj.manic...@huawei.com]
> Sent: Wednesday, January 16, 2019 3:10 AM
> To: xudan (N) <xuda...@huawei.com>; LOVETT, TREVOR J <tl2...@att.com>; 
> opnfv-tech-discuss@lists.opnfv.org
> Cc: STARK, STEVEN <ss8...@att.com>; WRIGHT, STEVEN A <sw3...@att.com>; 
> HALLAHAN, RYAN <rh1...@att.com>; WEINSTOCK, ALAN M <aw2...@att.com>; 
> Gaoweitao (Victor, Cloudify Network OSDT) <victor....@huawei.com>; 
> mok...@intracom-telecom.com; pkara...@intracom-telecom.com
> Subject: RE: [opnfv-tech-discuss] [Dovetail] Example VVP Output per Action 
> Item from Developer Event
>
>
> Hi Trevor,
>
>
> During the Casablanca release, in VTP, we introduced test cases for validating 
> whether a given VNF CSAR file is compliant with ETSI SOL004, and they produce 
> results with the following details:
>
> 1. Passed/failed
>
> 2. Error details.
>
>
> The current result format is:
>
> {
>   "results" : [ {
>     "error" : "SUCCESS"
>   } ],
>   "build_tag" : "CVC",
>   "criteria" : "PASS"
> }
>
>
> So this helps to verify whether a given CSAR is compliant and, if not, to let 
> the user know the errors in the CSAR. We are also enhancing this result with 
> the following details, as they are required for CVC:
>
> 1. VNF is TOSCA or HOT based
>
> 2. VNF Template version
>
>
> I believe a similar approach could be followed for validating HOT-based VNFs 
> as well. In this regard, could you please help to answer the following 
> queries, which would help to integrate the VVP scripts in VTP as a validation 
> test case, as discussed in the Paris CVC meeting last week:
>
> 1. As we know, an ONAP Vendor Software Product (VSP) is a zip containing the 
> required HOT template along with MANIFEST.json. Does the VVP script validate 
> this VSP as a whole, including the MANIFEST.json, or does it only validate the 
> HOT templates inside the VSP?
>
> 2. As VNFREQS defines a list of HOT template requirements with MUST and MUST 
> NOT criteria,
>
> a. How do we validate a given VSP against only the MUST VNFREQS by using 
> https://github.com/onap/vvp-validation-scripts ?
>
> b. How do we validate a given VSP for a given set of VNFREQS by using these 
> validation scripts?
>
> NOTE: Here we are assuming the VVP scripts support the VSP as a whole.
>
> 3. What versions of the OpenStack HOT template (heat_template_version) do the 
> VVP scripts support?
>
>
> I assume that we need to introduce a wrapper test case over the VVP scripts, 
> which will run the VVP scripts and produce the result in the form required by 
> the OVP portal / Dovetail. Please let me know your inputs.
>
>
> Thank you.
>
>
> Regards
>
> Kanagaraj Manickam
>
> Senior System Architect
>
> P&S ONAP
>
> Huawei Technologies India Pvt. Ltd.
>
> Survey No. 37, Next to EPIP Area, Kundalahalli, Whitefield
>
> Bengaluru-560066, Karnataka
>
> Tel: + 91-80-49160700 ext 72410 Mob: 9945602938
>
>
>
>
>
> From: xudan (N)
> Sent: 16 January 2019 12:14
> To: LOVETT, TREVOR J <tl2...@att.com>; opnfv-tech-discuss@lists.opnfv.org
> Cc: STARK, STEVEN <ss8...@att.com>; WRIGHT, STEVEN A <sw3...@att.com>; 
> HALLAHAN, RYAN <rh1...@att.com>; WEINSTOCK, ALAN M <aw2...@att.com>; 
> Gaoweitao (Victor, Cloudify Network OSDT) <victor....@huawei.com>; 
> Kanagaraj Manickam <kanagaraj.manic...@huawei.com>; 
> mok...@intracom-telecom.com; pkara...@intracom-telecom.com
> Subject: RE: [opnfv-tech-discuss] [Dovetail] Example VVP Output per Action 
> Item from Developer Event
>
>
> Hi all,
>
>
> The results look very friendly for users to debug with. But it would be better 
> if there were an explicit value showing whether the result is PASS or FAIL.
>
> The results seem to come from running one or several test cases against 
> several VNFs, but it’s difficult to find out the test case details (how many 
> test cases there are and the result of each of them).
>
> When doing the compliance tests, we always run a set of chosen test cases 
> against one single VNF. It should report which test cases PASS and which FAIL 
> (ideally with the failure reason for debugging).
>
> Please take these into consideration when VTP integrates these test cases.
>
>
> BR,
>
> Dan Xu
>
>
> From: opnfv-tech-discuss@lists.opnfv.org 
> [mailto:opnfv-tech-discuss@lists.opnfv.org] On Behalf Of LOVETT, TREVOR J
> Sent: Friday, January 11, 2019 12:18 AM
> To: opnfv-tech-discuss@lists.opnfv.org
> Cc: STARK, STEVEN <ss8...@att.com>; WRIGHT, STEVEN A <sw3...@att.com>; 
> HALLAHAN, RYAN <rh1...@att.com>; WEINSTOCK, ALAN M <aw2...@att.com>; 
> Gaoweitao (Victor, Cloudify Network OSDT) <victor....@huawei.com>
> Subject: [opnfv-tech-discuss] [Dovetail] Example VVP Output per Action Item 
> from Developer Event
>
>
> Per an action item out of the developer event in France, the ONAP VVP team is 
> providing a sample of the default output that the vvp/validation-scripts 
> produce. This can be examined by the Dovetail team to determine if it can be 
> consumed for the purpose of the LFN certification of Heat templates.
>
>
> The scripts can produce several different output formats, but I’m using the 
> CSV output format as it’s the most machine-readable. There are also HTML and 
> Excel output formats. It’s fairly trivial to add additional output formats, so 
> if there’s an existing format that Dovetail can consume, please provide the 
> details and we can create that format if needed.
>
>
> The sample_template_with_errors.zip contains the Heat template files that 
> produced these reports.
>
>
> Upon completion of a validation run, several files will be written to the 
> outputs folder. The two key files are:
>
>
> · failures – This file contains information on the failures in JSON format. 
> This file will not be present if no requirement violations are detected.
>
> · report.csv – Similar content to failures, but in CSV format; it also 
> includes some additional data in a header section. This file will be present 
> even if violations are not detected, but it will not have any rows in the 
> error section of the report.
>
>
> I’ve attached samples of both files in this email. I’ve also attached a copy 
> of what the report.csv looks like when there are no errors (see report – 
> SUCCESS.csv)
>
>
> The format of the files is fairly straightforward, but here’s some additional 
> documentation of the fields in the files:
>
>
> report.csv file
>
>
> · Header – The first 9 rows of the csv file represent a header section that 
> provides the following information
>
> o Row 1 – Static report header: “Validation Failures”
>
> o Row 2 - Blank
>
> o Row 3 – Profile Selected: Always ONAP
>
> o Row 4 – Tool Version: Semantic version ID of the tool that produced the 
> report
>
> o Row 5 – Report Generated At: Date and Time Stamp of when the report was 
> generated (ex: 2018-12-18 12:34:19.064412 Eastern Standard Time)
>
> o Row 6 – Directory Validated: Shows the local, absolute directory that was 
> scanned
>
> o Row 7 – Checksum: MD5 hash of all files in the directory that was scanned
>
> o Row 8 – Total Errors: The count of all errors encountered
>
> o Row 9 – Blank
>
>
> · Collection Failures – In the unlikely event there is a problem setting up 
> the test suite to run, an optional section of the report will be written 
> describing these errors. There will be 2 header rows, and then 1 row for each 
> setup failure encountered. It is possible to have both collection failures and 
> validation failures, as the validation-scripts will execute any tests that did 
> not fail setup. However, if this section exists it represents an unexpected 
> error and an invalid run of the tool. These issues should be referred to the 
> VVP team for investigation.
>
> o Row 10 (if these errors are encountered): Start of Collection Failures will 
> be denoted by a row containing “WARNING: The following unexpected errors…” in 
> the first column of the row.
>
> o Row 11 (if these errors are encountered): Collection Failure heading 
> columns: “Validation File”, “Test”, “Fixtures”, “Error”
>
> o Row 12-N (if these errors are encountered): Rows for each error. Each 
> column is a string.
>
>
> · Validation Failures (Note the start row depends on whether or not 
> collection failures are encountered)
>
> o Start of report – Static Report Header denoting the start of the individual 
> errors. Always “Validation Failures”
>
> o Error Report Column Headings – “Input File”, “Test”, “Requirements”, 
> “Resolution Steps”, “Error Message”, “Raw Test Output”
>
> o Following the heading - 0 or more individual error rows for each violation 
> found
>
>
> The columns of each error row are:
>
> · Input File – List of files the error was detected in (semicolon-delimited 
> list; required)
>
> · Test – Name of the python module that detected the error (string; required)
>
> · Requirements – The full requirement text that the test validated; there can 
> be multiple, related requirements listed here (multi-line string; optional). 
> Almost all tests are mapped to 1 or more requirements, but there may be a few 
> that are not yet mapped.
>
> · Resolution Steps – Optional information to aid in resolution (multi-line 
> string; optional). Consider this deprecated.
>
> · Error Message – Detailed error message produced by the validation test; 
> this should provide specific information on why the test failed (multi-line 
> string; required)
>
> · Raw Test Output – Full output from pytest; this includes the test content 
> itself and the detailed error message (multi-line string; required)
>
>
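> As an illustration, a small Python sketch of pulling the individual error rows 
> out of report.csv based on the layout above; it skips the 9-row header and 
> looks for the "Validation Failures" marker row, so it does not parse the 
> header fields themselves. Treat the exact cell layout as something to confirm 
> against a real report.
>
> import csv
>
> def read_error_rows(path="report.csv"):
>     with open(path, newline="") as f:
>         rows = list(csv.reader(f))
>
>     # Skip the 9-row header; an optional collection-failures section may
>     # appear before the "Validation Failures" marker row.
>     body = rows[9:]
>     start = next(i for i, row in enumerate(body)
>                  if row and row[0].strip() == "Validation Failures")
>     headings = body[start + 1]  # Input File, Test, Requirements, ...
>     return [dict(zip(headings, row)) for row in body[start + 2:] if any(row)]
>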
>
>
> failures file
>
>
> This file is similar to the report file, but in JSON format. There are a few 
> notable differences beyond CSV vs. JSON:
>
> · The full requirement text is not available in this file – only the 
> requirement IDs themselves.
>
> · Resolution Steps is also not available in this file
>
> · Collection failures are not included in this report
>
>
> The following mapping shows how the JSON elements map to the columns above.
>
>
> - file -> Input File: Format is a JSON list
>
> - vnfrqts -> N/A: Format is a JSON list of the requirement IDs associated 
> with the test
>
> - test -> N/A: Name of the test method
>
> - test_file -> Test: Name of the python file containing the test
>
> - raw_output -> Raw Test Output
>
> - message -> Error Message
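>
> For illustration, a minimal sketch of applying this mapping to a single 
> failures entry to recover the corresponding report.csv column values. The 
> helper name is made up; Requirements is filled with the requirement IDs only, 
> since the full text and Resolution Steps are not present in the failures file.
>
> def failure_to_report_columns(entry):
>     # entry is one element of the failures JSON; key names per the mapping above.
>     return {
>         "Input File": "; ".join(entry.get("file", [])),
>         "Test": entry.get("test_file", ""),
>         "Requirements": ", ".join(entry.get("vnfrqts", [])),  # IDs only
>         "Error Message": entry.get("message", ""),
>         "Raw Test Output": entry.get("raw_output", ""),
>     }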
>
>
> Thanks,
>
> Trevor
>
>
>
>