[jira] [Work logged] (HDDS-1735) Create separate unit and integration test executor dev-support script

2019-07-07 Thread ASF GitHub Bot (JIRA)


 [ 
https://issues.apache.org/jira/browse/HDDS-1735?focusedWorklogId=273048&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-273048
 ]

ASF GitHub Bot logged work on HDDS-1735:


Author: ASF GitHub Bot
Created on: 08/Jul/19 05:25
Start Date: 08/Jul/19 05:25
Worklog Time Spent: 10m 
  Work Description: hadoop-yetus commented on issue #1035: HDDS-1735. 
Create separate unit and integration test executor dev-support script
URL: https://github.com/apache/hadoop/pull/1035#issuecomment-509083277
 
 
   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |:----:|----------:|--------:|:--------|
   | 0 | reexec | 32 | Docker mode activated. |
   ||| _ Prechecks _ |
   | +1 | dupname | 0 | No case conflicting files found. |
   | 0 | shelldocs | 0 | Shelldocs was not available. |
   | 0 | @author | 0 | Skipping @author checks as author.sh has been patched. |
   | +1 | test4tests | 0 | The patch appears to include 1 new or modified test 
files. |
   ||| _ trunk Compile Tests _ |
   | 0 | mvndep | 32 | Maven dependency ordering for branch |
   | +1 | mvninstall | 474 | trunk passed |
   | +1 | mvnsite | 0 | trunk passed |
   | -1 | pylint | 1 | Error running pylint. Please check pylint stderr files. |
   | +1 | shadedclient | 774 | branch has no errors when building and testing 
our client artifacts. |
   ||| _ Patch Compile Tests _ |
   | 0 | mvndep | 30 | Maven dependency ordering for patch |
   | +1 | mvninstall | 453 | the patch passed |
   | +1 | mvnsite | 0 | the patch passed |
   | -1 | pylint | 2 | Error running pylint. Please check pylint stderr files. |
   | +1 | pylint | 2 | There were no new pylint issues. |
   | +1 | shellcheck | 2 | The patch generated 0 new + 0 unchanged - 7 fixed = 
0 total (was 7) |
   | +1 | whitespace | 0 | The patch has no whitespace issues. |
   | +1 | shadedclient | 702 | patch has no errors when building and testing 
our client artifacts. |
   ||| _ Other Tests _ |
   | +1 | unit | 104 | hadoop-hdds in the patch passed. |
   | +1 | unit | 179 | hadoop-ozone in the patch passed. |
   | +1 | asflicense | 49 | The patch does not generate ASF License warnings. |
   | | | 3029 | |
   
   
   | Subsystem | Report/Notes |
   |----------:|:-------------|
   | Docker | Client=17.05.0-ce Server=17.05.0-ce base: 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1035/4/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/1035 |
   | Optional Tests | dupname asflicense mvnsite unit shellcheck shelldocs 
pylint |
   | uname | Linux cc2fdb9998fb 4.4.0-138-generic #164-Ubuntu SMP Tue Oct 2 
17:16:02 UTC 2018 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | personality/hadoop.sh |
   | git revision | trunk / 9c90729 |
   | pylint | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1035/4/artifact/out/branch-pylint-stderr.txt
 |
   | pylint | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1035/4/artifact/out/patch-pylint-stderr.txt
 |
   |  Test Results | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1035/4/testReport/ |
   | Max. process+thread count | 411 (vs. ulimit of 5500) |
   | modules | C: hadoop-ozone hadoop-ozone/fault-injection-test/network-tests 
U: hadoop-ozone |
   | Console output | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1035/4/console |
   | versions | git=2.7.4 maven=3.3.9 shellcheck=0.4.6 pylint=1.9.2 |
   | Powered by | Apache Yetus 0.10.0 http://yetus.apache.org |
   
   
   This message was automatically generated.
   
   
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 273048)
Time Spent: 1h 40m  (was: 1.5h)

> Create separate unit and integration test executor dev-support script
> ----------------------------------------------------------------------
>
> Key: HDDS-1735
> URL: https://issues.apache.org/jira/browse/HDDS-1735
> Project: Hadoop Distributed Data Store
>  Issue Type: Improvement
>Reporter: Elek, Marton
>Assignee: Elek, Marton
>Priority: Major
>  Labels: pull-request-available
> Attachments: Screen Shot 2019-07-02 at 3.25.33 PM.png
>
>  Time Spent: 1h 40m
>  Remaining Estimate: 0h
>
> The hadoop-ozone/dev-support/checks directory contains multiple helper scripts to
> execute different types of testing (findbugs, rat, unit, build).
> They define how tests should be executed, with the following contract:
>  * The problems should be printed out to the console

[jira] [Work logged] (HDDS-1735) Create separate unit and integration test executor dev-support script

2019-07-07 Thread ASF GitHub Bot (JIRA)


 [ 
https://issues.apache.org/jira/browse/HDDS-1735?focusedWorklogId=272986&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-272986
 ]

ASF GitHub Bot logged work on HDDS-1735:


Author: ASF GitHub Bot
Created on: 07/Jul/19 22:56
Start Date: 07/Jul/19 22:56
Worklog Time Spent: 10m 
  Work Description: hadoop-yetus commented on issue #1035: HDDS-1735. 
Create separate unit and integration test executor dev-support script
URL: https://github.com/apache/hadoop/pull/1035#issuecomment-509037232
 
 
   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |:----:|----------:|--------:|:--------|
   | 0 | reexec | 32 | Docker mode activated. |
   ||| _ Prechecks _ |
   | +1 | dupname | 1 | No case conflicting files found. |
   | 0 | shelldocs | 1 | Shelldocs was not available. |
   | 0 | @author | 0 | Skipping @author checks as author.sh has been patched. |
   | +1 | test4tests | 0 | The patch appears to include 1 new or modified test 
files. |
   ||| _ trunk Compile Tests _ |
   | 0 | mvndep | 44 | Maven dependency ordering for branch |
   | +1 | mvninstall | 504 | trunk passed |
   | +1 | mvnsite | 0 | trunk passed |
   | -1 | pylint | 1 | Error running pylint. Please check pylint stderr files. |
   | +1 | shadedclient | 772 | branch has no errors when building and testing 
our client artifacts. |
   ||| _ Patch Compile Tests _ |
   | 0 | mvndep | 32 | Maven dependency ordering for patch |
   | +1 | mvninstall | 454 | the patch passed |
   | +1 | mvnsite | 0 | the patch passed |
   | -1 | pylint | 2 | Error running pylint. Please check pylint stderr files. |
   | +1 | pylint | 2 | There were no new pylint issues. |
   | +1 | shellcheck | 1 | The patch generated 0 new + 0 unchanged - 7 fixed = 
0 total (was 7) |
   | +1 | whitespace | 0 | The patch has no whitespace issues. |
   | +1 | shadedclient | 708 | patch has no errors when building and testing 
our client artifacts. |
   ||| _ Other Tests _ |
   | +1 | unit | 103 | hadoop-hdds in the patch passed. |
   | +1 | unit | 179 | hadoop-ozone in the patch passed. |
   | +1 | asflicense | 48 | The patch does not generate ASF License warnings. |
   | | | 3076 | |
   
   
   | Subsystem | Report/Notes |
   |----------:|:-------------|
   | Docker | Client=17.05.0-ce Server=17.05.0-ce base: 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1035/3/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/1035 |
   | Optional Tests | dupname asflicense mvnsite unit shellcheck shelldocs 
pylint |
   | uname | Linux f16dcd073b52 4.4.0-138-generic #164-Ubuntu SMP Tue Oct 2 
17:16:02 UTC 2018 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | personality/hadoop.sh |
   | git revision | trunk / 9c90729 |
   | pylint | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1035/3/artifact/out/branch-pylint-stderr.txt
 |
   | pylint | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1035/3/artifact/out/patch-pylint-stderr.txt
 |
   |  Test Results | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1035/3/testReport/ |
   | Max. process+thread count | 446 (vs. ulimit of 5500) |
   | modules | C: hadoop-ozone hadoop-ozone/fault-injection-test/network-tests 
U: hadoop-ozone |
   | Console output | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1035/3/console |
   | versions | git=2.7.4 maven=3.3.9 shellcheck=0.4.6 pylint=1.9.2 |
   | Powered by | Apache Yetus 0.10.0 http://yetus.apache.org |
   
   
   This message was automatically generated.
   
   
 



Issue Time Tracking
---

Worklog Id: (was: 272986)
Time Spent: 1.5h  (was: 1h 20m)

> Create separate unit and integration test executor dev-support script
> ----------------------------------------------------------------------
>
> Key: HDDS-1735
> URL: https://issues.apache.org/jira/browse/HDDS-1735
> Project: Hadoop Distributed Data Store
>  Issue Type: Improvement
>Reporter: Elek, Marton
>Assignee: Elek, Marton
>Priority: Major
>  Labels: pull-request-available
> Attachments: Screen Shot 2019-07-02 at 3.25.33 PM.png
>
>  Time Spent: 1.5h
>  Remaining Estimate: 0h
>
> The hadoop-ozone/dev-support/checks directory contains multiple helper scripts to
> execute different types of testing (findbugs, rat, unit, build).
> They define how tests should be executed, with the following contract:
>  * The problems should be printed out to the console

[jira] [Work logged] (HDDS-1735) Create separate unit and integration test executor dev-support script

2019-07-04 Thread ASF GitHub Bot (JIRA)


 [ 
https://issues.apache.org/jira/browse/HDDS-1735?focusedWorklogId=272267&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-272267
 ]

ASF GitHub Bot logged work on HDDS-1735:


Author: ASF GitHub Bot
Created on: 04/Jul/19 15:47
Start Date: 04/Jul/19 15:47
Worklog Time Spent: 10m 
  Work Description: hadoop-yetus commented on issue #1035: HDDS-1735. 
Create separate unit and integration test executor dev-support script
URL: https://github.com/apache/hadoop/pull/1035#issuecomment-508524307
 
 
   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |:----:|----------:|--------:|:--------|
   | 0 | reexec | 32 | Docker mode activated. |
   ||| _ Prechecks _ |
   | +1 | dupname | 0 | No case conflicting files found. |
   | 0 | shelldocs | 0 | Shelldocs was not available. |
   | 0 | @author | 0 | Skipping @author checks as author.sh has been patched. |
   | +1 | test4tests | 0 | The patch appears to include 1 new or modified test 
files. |
   ||| _ trunk Compile Tests _ |
   | 0 | mvndep | 52 | Maven dependency ordering for branch |
   | +1 | mvninstall | 482 | trunk passed |
   | +1 | mvnsite | 0 | trunk passed |
   | -1 | pylint | 5 | Error running pylint. Please check pylint stderr files. |
   | +1 | shadedclient | 783 | branch has no errors when building and testing 
our client artifacts. |
   ||| _ Patch Compile Tests _ |
   | 0 | mvndep | 22 | Maven dependency ordering for patch |
   | +1 | mvninstall | 431 | the patch passed |
   | +1 | mvnsite | 0 | the patch passed |
   | -1 | pylint | 6 | Error running pylint. Please check pylint stderr files. |
   | +1 | pylint | 6 | There were no new pylint issues. |
   | +1 | shellcheck | 0 | The patch generated 0 new + 0 unchanged - 7 fixed = 
0 total (was 7) |
   | +1 | whitespace | 0 | The patch has no whitespace issues. |
   | +1 | shadedclient | 703 | patch has no errors when building and testing 
our client artifacts. |
   ||| _ Other Tests _ |
   | +1 | unit | 104 | hadoop-hdds in the patch passed. |
   | +1 | unit | 176 | hadoop-ozone in the patch passed. |
   | +1 | asflicense | 48 | The patch does not generate ASF License warnings. |
   | | | 3007 | |
   
   
   | Subsystem | Report/Notes |
   |----------:|:-------------|
   | Docker | Client=17.05.0-ce Server=17.05.0-ce base: 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1035/2/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/1035 |
   | Optional Tests | dupname asflicense mvnsite unit shellcheck shelldocs 
pylint |
   | uname | Linux 2c4cf55cdab1 4.4.0-138-generic #164-Ubuntu SMP Tue Oct 2 
17:16:02 UTC 2018 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | personality/hadoop.sh |
   | git revision | trunk / 1c254a8 |
   | pylint | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1035/2/artifact/out/branch-pylint-stderr.txt
 |
   | pylint | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1035/2/artifact/out/patch-pylint-stderr.txt
 |
   |  Test Results | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1035/2/testReport/ |
   | Max. process+thread count | 444 (vs. ulimit of 5500) |
   | modules | C: hadoop-ozone hadoop-ozone/fault-injection-test/network-tests 
U: hadoop-ozone |
   | Console output | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1035/2/console |
   | versions | git=2.7.4 maven=3.3.9 shellcheck=0.4.6 pylint=1.9.2 |
   | Powered by | Apache Yetus 0.10.0 http://yetus.apache.org |
   
   
   This message was automatically generated.
   
   
 



Issue Time Tracking
---

Worklog Id: (was: 272267)
Time Spent: 1h 20m  (was: 1h 10m)

> Create separate unit and integration test executor dev-support script
> ----------------------------------------------------------------------
>
> Key: HDDS-1735
> URL: https://issues.apache.org/jira/browse/HDDS-1735
> Project: Hadoop Distributed Data Store
>  Issue Type: Improvement
>Reporter: Elek, Marton
>Assignee: Elek, Marton
>Priority: Major
>  Labels: pull-request-available
> Attachments: Screen Shot 2019-07-02 at 3.25.33 PM.png
>
>  Time Spent: 1h 20m
>  Remaining Estimate: 0h
>
> The hadoop-ozone/dev-support/checks directory contains multiple helper scripts to
> execute different types of testing (findbugs, rat, unit, build).
> They define how tests should be executed, with the following contract:
>  * The problems should be printed out to the console

[jira] [Work logged] (HDDS-1735) Create separate unit and integration test executor dev-support script

2019-06-28 Thread ASF GitHub Bot (JIRA)


 [ 
https://issues.apache.org/jira/browse/HDDS-1735?focusedWorklogId=269645&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-269645
 ]

ASF GitHub Bot logged work on HDDS-1735:


Author: ASF GitHub Bot
Created on: 29/Jun/19 00:50
Start Date: 29/Jun/19 00:50
Worklog Time Spent: 10m 
  Work Description: hadoop-yetus commented on pull request #1035: 
HDDS-1735. Create separate unit and integration test executor dev-support script
URL: https://github.com/apache/hadoop/pull/1035#discussion_r298781323
 
 

 ##
 File path: hadoop-ozone/dev-support/checks/checkstyle.sh
 ##
 @@ -13,7 +13,10 @@
 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 # See the License for the specific language governing permissions and
 # limitations under the License.
-mvn -fn checkstyle:check -am -pl :hadoop-ozone-dist -Phdds
+mvn -B -fn checkstyle:check -f pom.ozone.xml
+
+#Print out the exact violations with parsing XML results with sed
+find -name checkstyle-errors.xml | xargs sed  '$!N; //d'
 
 Review comment:
   shellcheck:1: note: Some finds don't have a default path. Specify '.' 
explicitly. [SC2185]
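The SC2185 note above is about `find` being invoked without a starting path, which only works as a GNU extension. A minimal, hedged sketch of the fix (the temporary directory and file name are illustrative, not the real build tree):

```shell
#!/usr/bin/env bash
# Sketch of the SC2185 fix: give 'find' an explicit starting path
# instead of relying on the GNU-only implicit default.
tmp=$(mktemp -d)
touch "$tmp/checkstyle-errors.xml"

# Explicit root directory: portable across GNU and BSD find
found=$(find "$tmp" -name 'checkstyle-errors.xml')
echo "$found"

rm -rf "$tmp"
```

In the script under review, that means writing `find . -name checkstyle-errors.xml` rather than the bare `find -name ...`.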
   
 



Issue Time Tracking
---

Worklog Id: (was: 269645)
Time Spent: 0.5h  (was: 20m)

> Create separate unit and integration test executor dev-support script
> ----------------------------------------------------------------------
>
> Key: HDDS-1735
> URL: https://issues.apache.org/jira/browse/HDDS-1735
> Project: Hadoop Distributed Data Store
>  Issue Type: Improvement
>Reporter: Elek, Marton
>Assignee: Elek, Marton
>Priority: Major
>  Labels: pull-request-available
>  Time Spent: 0.5h
>  Remaining Estimate: 0h
>
> The hadoop-ozone/dev-support/checks directory contains multiple helper scripts to
> execute different types of testing (findbugs, rat, unit, build).
> They define how tests should be executed, with the following contract:
>  * The problems should be printed out to the console
>  * In case of test failure, a non-zero exit code should be used
>  
> The tests are working well (in fact I have some experiments with executing 
> these scripts on k8s and argo, where all the shell scripts are executed in 
> parallel), but we need some updates:
>  1. Most important: the unit tests and integration tests can be separated. 
> Integration tests are more flaky, and it's better to have a way to run only 
> the normal unit tests.
>  2. As HDDS-1115 introduced a pom.ozone.xml, it's better to use it instead 
> of the magical "-am -pl :hadoop-ozone-dist" trick.
>  3. To make it possible to run the blockade tests in containers, we should use 
> the -T flag with docker-compose.
>  4. Checkstyle violations should be printed out to the console.
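The two-point contract described above (print problems to the console, exit non-zero on failure) can be sketched as a minimal check script. The report file and the `FAILURE` marker below are hypothetical placeholders, not the real Ozone report format:

```shell
#!/usr/bin/env bash
# Minimal sketch of the check-script contract: print problems to the
# console and signal failure with a non-zero exit status.
check_report() {
  local report=$1
  if grep -q 'FAILURE' "$report"; then
    grep 'FAILURE' "$report"   # contract point 1: print the problems
    return 1                   # contract point 2: non-zero status on failure
  fi
  return 0
}

report=$(mktemp)
echo "FAILURE: TestOzoneExample timed out" > "$report"
check_report "$report" && status=0 || status=$?
echo "exit status: $status"
rm -f "$report"
```

Because every script follows the same contract, a CI orchestrator (Jenkins, or the k8s/argo experiments mentioned above) can run them in parallel and aggregate only the exit codes.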



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: hdfs-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: hdfs-issues-h...@hadoop.apache.org



[jira] [Work logged] (HDDS-1735) Create separate unit and integration test executor dev-support script

2019-06-28 Thread ASF GitHub Bot (JIRA)


 [ 
https://issues.apache.org/jira/browse/HDDS-1735?focusedWorklogId=269647&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-269647
 ]

ASF GitHub Bot logged work on HDDS-1735:


Author: ASF GitHub Bot
Created on: 29/Jun/19 00:50
Start Date: 29/Jun/19 00:50
Worklog Time Spent: 10m 
  Work Description: hadoop-yetus commented on pull request #1035: 
HDDS-1735. Create separate unit and integration test executor dev-support script
URL: https://github.com/apache/hadoop/pull/1035#discussion_r298781325
 
 

 ##
 File path: hadoop-ozone/dev-support/checks/rat.sh
 ##
 @@ -16,7 +16,10 @@
 
 mkdir -p target
 rm target/rat-aggregated.txt
-mvn -fn org.apache.rat:apache-rat-plugin:0.13:check -am -pl :hadoop-ozone-dist 
-Phdds
+cd hadoop-hdds
 
 Review comment:
   shellcheck:1: warning: Use 'cd ... || exit' or 'cd ... || return' in case cd 
fails. [SC2164]
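The SC2164 warning above concerns an unguarded `cd`: if the directory is missing, the script keeps running subsequent commands in the wrong place. One hedged way to structure the fix (the helper function and directory names are illustrative):

```shell
#!/usr/bin/env bash
# Sketch of the SC2164 fix: abort when 'cd' fails instead of silently
# running the next command from the wrong directory.
run_in() {
  local dir=$1; shift
  cd "$dir" || return 1      # fail fast if the directory does not exist
  "$@"
  local rc=$?
  cd - >/dev/null || return 1
  return "$rc"
}

# A missing directory is caught instead of being ignored:
run_in /definitely/missing true && status=ok || status=guarded
echo "$status"
```

In the script under review, the simplest form is `cd hadoop-hdds || exit 1` before the `mvn` invocation.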
   
 



Issue Time Tracking
---

Worklog Id: (was: 269647)
Time Spent: 50m  (was: 40m)

> Create separate unit and integration test executor dev-support script
> ----------------------------------------------------------------------
>
> Key: HDDS-1735
> URL: https://issues.apache.org/jira/browse/HDDS-1735
> Project: Hadoop Distributed Data Store
>  Issue Type: Improvement
>Reporter: Elek, Marton
>Assignee: Elek, Marton
>Priority: Major
>  Labels: pull-request-available
>  Time Spent: 50m
>  Remaining Estimate: 0h
>
> The hadoop-ozone/dev-support/checks directory contains multiple helper scripts to
> execute different types of testing (findbugs, rat, unit, build).
> They define how tests should be executed, with the following contract:
>  * The problems should be printed out to the console
>  * In case of test failure, a non-zero exit code should be used
>  
> The tests are working well (in fact I have some experiments with executing 
> these scripts on k8s and argo, where all the shell scripts are executed in 
> parallel), but we need some updates:
>  1. Most important: the unit tests and integration tests can be separated. 
> Integration tests are more flaky, and it's better to have a way to run only 
> the normal unit tests.
>  2. As HDDS-1115 introduced a pom.ozone.xml, it's better to use it instead 
> of the magical "-am -pl :hadoop-ozone-dist" trick.
>  3. To make it possible to run the blockade tests in containers, we should use 
> the -T flag with docker-compose.
>  4. Checkstyle violations should be printed out to the console.






[jira] [Work logged] (HDDS-1735) Create separate unit and integration test executor dev-support script

2019-06-28 Thread ASF GitHub Bot (JIRA)


 [ 
https://issues.apache.org/jira/browse/HDDS-1735?focusedWorklogId=269646&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-269646
 ]

ASF GitHub Bot logged work on HDDS-1735:


Author: ASF GitHub Bot
Created on: 29/Jun/19 00:50
Start Date: 29/Jun/19 00:50
Worklog Time Spent: 10m 
  Work Description: hadoop-yetus commented on pull request #1035: 
HDDS-1735. Create separate unit and integration test executor dev-support script
URL: https://github.com/apache/hadoop/pull/1035#discussion_r298781324
 
 

 ##
 File path: hadoop-ozone/dev-support/checks/integration.sh
 ##
 @@ -0,0 +1,25 @@
+#!/usr/bin/env bash
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+export MAVEN_OPTS="-Xmx4096m"
+mvn -B install -f pom.ozone.xml -DskipTests
+mvn -B -fn test -f pom.ozone.xml -pl 
:hadoop-ozone-integration-test,:hadoop-ozone-filesystem
+module_failed_tests=$(find "." -name 'TEST*.xml'\
 
 Review comment:
   shellcheck:23: warning: Use -print0/-0 or -exec + to allow for 
non-alphanumeric filenames. [SC2038]
   shellcheck:49: note: This word is outside of quotes. Did you intend to 'nest 
'"'single quotes'"' instead'?  [SC2026]
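The SC2038 warning above is about a plain `find | xargs` pipeline splitting on whitespace, so a test-report path containing a space would break apart. A hedged sketch of the NUL-delimited fix (the directory layout is illustrative):

```shell
#!/usr/bin/env bash
# Sketch of the SC2038 fix: NUL-delimit the find | xargs pipeline so
# filenames with spaces (or newlines) survive intact.
tmp=$(mktemp -d)
mkdir -p "$tmp/module a"
touch "$tmp/module a/TEST-sample.xml"

# -print0 / -0 keep each pathname as a single argument despite the space
count=$(find "$tmp" -name 'TEST*.xml' -print0 | xargs -0 -n1 printf '%s\n' | wc -l)
echo "$count"

rm -rf "$tmp"
```

The SC2026 note is separate: it flags quoting inside the `sed` expression, which is usually resolved by nesting `'"'...'"'` or switching the outer quotes.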
   
 



Issue Time Tracking
---

Worklog Id: (was: 269646)
Time Spent: 40m  (was: 0.5h)

> Create separate unit and integration test executor dev-support script
> ----------------------------------------------------------------------
>
> Key: HDDS-1735
> URL: https://issues.apache.org/jira/browse/HDDS-1735
> Project: Hadoop Distributed Data Store
>  Issue Type: Improvement
>Reporter: Elek, Marton
>Assignee: Elek, Marton
>Priority: Major
>  Labels: pull-request-available
>  Time Spent: 40m
>  Remaining Estimate: 0h
>
> The hadoop-ozone/dev-support/checks directory contains multiple helper scripts to
> execute different types of testing (findbugs, rat, unit, build).
> They define how tests should be executed, with the following contract:
>  * The problems should be printed out to the console
>  * In case of test failure, a non-zero exit code should be used
>  
> The tests are working well (in fact I have some experiments with executing 
> these scripts on k8s and argo, where all the shell scripts are executed in 
> parallel), but we need some updates:
>  1. Most important: the unit tests and integration tests can be separated. 
> Integration tests are more flaky, and it's better to have a way to run only 
> the normal unit tests.
>  2. As HDDS-1115 introduced a pom.ozone.xml, it's better to use it instead 
> of the magical "-am -pl :hadoop-ozone-dist" trick.
>  3. To make it possible to run the blockade tests in containers, we should use 
> the -T flag with docker-compose.
>  4. Checkstyle violations should be printed out to the console.






[jira] [Work logged] (HDDS-1735) Create separate unit and integration test executor dev-support script

2019-06-28 Thread ASF GitHub Bot (JIRA)


 [ 
https://issues.apache.org/jira/browse/HDDS-1735?focusedWorklogId=269648&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-269648
 ]

ASF GitHub Bot logged work on HDDS-1735:


Author: ASF GitHub Bot
Created on: 29/Jun/19 00:50
Start Date: 29/Jun/19 00:50
Worklog Time Spent: 10m 
  Work Description: hadoop-yetus commented on pull request #1035: 
HDDS-1735. Create separate unit and integration test executor dev-support script
URL: https://github.com/apache/hadoop/pull/1035#discussion_r298781327
 
 

 ##
 File path: hadoop-ozone/dev-support/checks/rat.sh
 ##
 @@ -16,7 +16,10 @@
 
 mkdir -p target
 rm target/rat-aggregated.txt
-mvn -fn org.apache.rat:apache-rat-plugin:0.13:check -am -pl :hadoop-ozone-dist 
-Phdds
+cd hadoop-hdds
+mvn -B -fn org.apache.rat:apache-rat-plugin:0.13:check
+cd ../hadoop-ozone
 
 Review comment:
   shellcheck:1: warning: Use 'cd ... || exit' or 'cd ... || return' in case cd 
fails. [SC2164]
   
 



Issue Time Tracking
---

Worklog Id: (was: 269648)
Time Spent: 1h  (was: 50m)

> Create separate unit and integration test executor dev-support script
> ----------------------------------------------------------------------
>
> Key: HDDS-1735
> URL: https://issues.apache.org/jira/browse/HDDS-1735
> Project: Hadoop Distributed Data Store
>  Issue Type: Improvement
>Reporter: Elek, Marton
>Assignee: Elek, Marton
>Priority: Major
>  Labels: pull-request-available
>  Time Spent: 1h
>  Remaining Estimate: 0h
>
> The hadoop-ozone/dev-support/checks directory contains multiple helper scripts to
> execute different types of testing (findbugs, rat, unit, build).
> They define how tests should be executed, with the following contract:
>  * The problems should be printed out to the console
>  * In case of test failure, a non-zero exit code should be used
>  
> The tests are working well (in fact I have some experiments with executing 
> these scripts on k8s and argo, where all the shell scripts are executed in 
> parallel), but we need some updates:
>  1. Most important: the unit tests and integration tests can be separated. 
> Integration tests are more flaky, and it's better to have a way to run only 
> the normal unit tests.
>  2. As HDDS-1115 introduced a pom.ozone.xml, it's better to use it instead 
> of the magical "-am -pl :hadoop-ozone-dist" trick.
>  3. To make it possible to run the blockade tests in containers, we should use 
> the -T flag with docker-compose.
>  4. Checkstyle violations should be printed out to the console.






[jira] [Work logged] (HDDS-1735) Create separate unit and integration test executor dev-support script

2019-06-28 Thread ASF GitHub Bot (JIRA)


 [ 
https://issues.apache.org/jira/browse/HDDS-1735?focusedWorklogId=269649&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-269649
 ]

ASF GitHub Bot logged work on HDDS-1735:


Author: ASF GitHub Bot
Created on: 29/Jun/19 00:50
Start Date: 29/Jun/19 00:50
Worklog Time Spent: 10m 
  Work Description: hadoop-yetus commented on issue #1035: HDDS-1735. 
Create separate unit and integration test executor dev-support script
URL: https://github.com/apache/hadoop/pull/1035#issuecomment-506913281
 
 
   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |:----:|----------:|--------:|:--------|
   | 0 | reexec | 31 | Docker mode activated. |
   ||| _ Prechecks _ |
   | +1 | dupname | 0 | No case conflicting files found. |
   | 0 | shelldocs | 1 | Shelldocs was not available. |
   | 0 | @author | 0 | Skipping @author checks as author.sh has been patched. |
   | +1 | test4tests | 0 | The patch appears to include 1 new or modified test 
files. |
   ||| _ trunk Compile Tests _ |
   | 0 | mvndep | 31 | Maven dependency ordering for branch |
   | +1 | mvninstall | 472 | trunk passed |
   | +1 | mvnsite | 0 | trunk passed |
   | -1 | pylint | 1 | Error running pylint. Please check pylint stderr files. |
   | +1 | shadedclient | 726 | branch has no errors when building and testing 
our client artifacts. |
   ||| _ Patch Compile Tests _ |
   | 0 | mvndep | 31 | Maven dependency ordering for patch |
   | +1 | mvninstall | 449 | the patch passed |
   | +1 | mvnsite | 0 | the patch passed |
   | -1 | pylint | 1 | Error running pylint. Please check pylint stderr files. |
   | +1 | pylint | 1 | There were no new pylint issues. |
   | -1 | shellcheck | 1 | The patch generated 8 new + 6 unchanged - 0 fixed = 
14 total (was 6) |
   | +1 | whitespace | 0 | The patch has no whitespace issues. |
   | +1 | shadedclient | 715 | patch has no errors when building and testing 
our client artifacts. |
   ||| _ Other Tests _ |
   | +1 | unit | 88 | hadoop-hdds in the patch passed. |
   | +1 | unit | 168 | hadoop-ozone in the patch passed. |
   | +1 | asflicense | 36 | The patch does not generate ASF License warnings. |
   | | | 2947 | |
   
   
   | Subsystem | Report/Notes |
   |----------:|:-------------|
   | Docker | Client=17.05.0-ce Server=17.05.0-ce base: 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1035/1/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/1035 |
   | Optional Tests | dupname asflicense mvnsite unit shellcheck shelldocs 
pylint |
   | uname | Linux c87588353a8c 4.4.0-138-generic #164-Ubuntu SMP Tue Oct 2 
17:16:02 UTC 2018 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | personality/hadoop.sh |
   | git revision | trunk / d203045 |
   | pylint | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1035/1/artifact/out/branch-pylint-stderr.txt
 |
   | pylint | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1035/1/artifact/out/patch-pylint-stderr.txt
 |
   | shellcheck | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1035/1/artifact/out/diff-patch-shellcheck.txt
 |
   |  Test Results | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1035/1/testReport/ |
   | Max. process+thread count | 413 (vs. ulimit of 5500) |
   | modules | C: hadoop-ozone hadoop-ozone/fault-injection-test/network-tests 
U: hadoop-ozone |
   | Console output | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1035/1/console |
   | versions | git=2.7.4 maven=3.3.9 shellcheck=0.4.6 pylint=1.9.2 |
   | Powered by | Apache Yetus 0.10.0 http://yetus.apache.org |
   
   
   This message was automatically generated.
   
   
 



Issue Time Tracking
---

Worklog Id: (was: 269649)
Time Spent: 1h 10m  (was: 1h)

> Create separate unit and integration test executor dev-support script
> ----------------------------------------------------------------------
>
> Key: HDDS-1735
> URL: https://issues.apache.org/jira/browse/HDDS-1735
> Project: Hadoop Distributed Data Store
>  Issue Type: Improvement
>Reporter: Elek, Marton
>Assignee: Elek, Marton
>Priority: Major
>  Labels: pull-request-available
>  Time Spent: 1h 10m
>  Remaining Estimate: 0h
>
> The hadoop-ozone/dev-support/checks directory contains multiple helper scripts to
> execute different types of testing (findbugs, rat, unit, build).
> They define how tests should be executed, with the following contract:
>  * The problems should be printed out to the console
>  * In case of test failure, a non-zero exit code should be used

[jira] [Work logged] (HDDS-1735) Create separate unit and integration test executor dev-support script

2019-06-28 Thread ASF GitHub Bot (JIRA)


 [ 
https://issues.apache.org/jira/browse/HDDS-1735?focusedWorklogId=269644&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-269644
 ]

ASF GitHub Bot logged work on HDDS-1735:


Author: ASF GitHub Bot
Created on: 29/Jun/19 00:50
Start Date: 29/Jun/19 00:50
Worklog Time Spent: 10m 
  Work Description: hadoop-yetus commented on pull request #1035: 
HDDS-1735. Create separate unit and integration test executor dev-support script
URL: https://github.com/apache/hadoop/pull/1035#discussion_r298781321
 
 

 ##
 File path: hadoop-ozone/dev-support/checks/acceptance.sh
 ##
 @@ -15,5 +15,6 @@
 # limitations under the License.
 DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null 2>&1 && pwd )"
 export HADOOP_VERSION=3
-"$DIR/../../../hadoop-ozone/dist/target/ozone-*-SNAPSHOT/compose/test-all.sh"
+OZONE_VERSION=$(cat $DIR/../../pom.xml | grep "<ozone.version>" | sed 's/<[^>]*>//g' | sed 's/^[ \t]*//')
 
 Review comment:
   shellcheck:21: note: Double quote to prevent globbing and word splitting. 
[SC2086]
   shellcheck:21: note: Useless cat. Consider 'cmd < file | ..' or 'cmd file | 
..' instead. [SC2002]
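A minimal sketch of how both shellcheck notes could be addressed, assuming the pom.xml carries a standard `<ozone.version>` element (the sample file and version value below are illustrative stand-ins, not taken from the PR): quote the path to prevent word splitting (SC2086), and let grep read the file directly instead of piping from cat (SC2002).

```shell
# Hypothetical stand-in for the pom.xml the real script reads from
# $DIR/../../pom.xml; created here only so the snippet is self-contained.
pom="./pom.xml"
printf '<project>\n  <ozone.version>0.5.0-SNAPSHOT</ozone.version>\n</project>\n' > "$pom"

# SC2086: quote "$pom"; SC2002: pass the file to grep instead of cat-piping.
OZONE_VERSION=$(grep "<ozone.version>" "$pom" | sed 's/<[^>]*>//g' | sed 's/^[ \t]*//')
echo "$OZONE_VERSION"
rm -f "$pom"
```

The two sed passes are unchanged from the diff: the first strips the XML tags, the second trims the leading indentation.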
   
 



Issue Time Tracking
---

Worklog Id: (was: 269644)
Time Spent: 20m  (was: 10m)

> Create separate unit and integration test executor dev-support script
> -
>
> Key: HDDS-1735
> URL: https://issues.apache.org/jira/browse/HDDS-1735
> Project: Hadoop Distributed Data Store
>  Issue Type: Improvement
>Reporter: Elek, Marton
>Assignee: Elek, Marton
>Priority: Major
>  Labels: pull-request-available
>  Time Spent: 20m
>  Remaining Estimate: 0h
>
> The hadoop-ozone/dev-support/checks directory contains multiple helper scripts to
> execute different types of tests (findbugs, rat, unit, build).
> They concisely define how tests should be executed, with the following contract:
>  * Problems should be printed out to the console
>  * In case of a test failure, a non-zero exit code should be used
>  
> The tests are working well (in fact I have some experiments with executing
> these scripts on k8s and argo, where all the shell scripts are executed in
> parallel), but we need some updates:
>  1. Most important: the unit tests and integration tests can be separated.
> Integration tests are more flaky, and it's better to have a way to run only
> the normal unit tests
>  2. As HDDS-1115 introduced a pom.ozone.xml, it's better to use it instead
> of the magical "am pl hadoop-ozone-dist" trick
>  3. To make it possible to run blockade tests in containers, we should use the
> -T flag with docker-compose
>  4. checkstyle violations are printed out to the console



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: hdfs-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: hdfs-issues-h...@hadoop.apache.org



[jira] [Work logged] (HDDS-1735) Create separate unit and integration test executor dev-support script

2019-06-28 Thread ASF GitHub Bot (JIRA)


 [ 
https://issues.apache.org/jira/browse/HDDS-1735?focusedWorklogId=269634&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-269634
 ]

ASF GitHub Bot logged work on HDDS-1735:


Author: ASF GitHub Bot
Created on: 29/Jun/19 00:00
Start Date: 29/Jun/19 00:00
Worklog Time Spent: 10m 
  Work Description: elek commented on pull request #1035: HDDS-1735. Create 
separate unit and integration test executor dev-support script
URL: https://github.com/apache/hadoop/pull/1035
 
 
   The hadoop-ozone/dev-support/checks directory contains multiple helper scripts to execute different types of tests (findbugs, rat, unit, build).
   
   They concisely define how tests should be executed, with the following contract:
   
    * Problems should be printed out to the console
   
    * In case of a test failure, a non-zero exit code should be used
   
   The tests are working well (in fact I have some experiments with executing these scripts on k8s and argo, where all the shell scripts are executed in parallel), but we need some updates:
   
    1. Most important: the unit tests and integration tests can be separated. Integration tests are more flaky, and it's better to have a way to run only the normal unit tests
   
    2. As HDDS-1115 introduced a pom.ozone.xml, it's better to use it instead of the magical "am pl hadoop-ozone-dist" trick
   
    3. To make it possible to run blockade tests in containers, we should use the -T flag with docker-compose
   
    4. checkstyle violations are printed out to the console
   
   See: https://issues.apache.org/jira/browse/HDDS-1735
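   The contract described above (findings go to the console; a failure is signalled with a non-zero exit code) can be sketched as a tiny check script. The `run_check` helper, the file names, and the TODO pattern are purely illustrative assumptions, not part of the actual dev-support scripts:

```shell
# Illustrative check honoring the contract: matches are printed to stdout,
# and the exit code is non-zero if any checked file has a finding.
run_check() {
  failures=0
  for f in "$@"; do
    # The pattern is just an example of a "problem" to report.
    if grep -n "TODO" "$f"; then
      failures=$((failures + 1))
    fi
  done
  # Non-zero exit code on any failure, as the contract requires.
  return "$failures"
}

printf 'echo ok\n' > clean.sh
printf '# TODO fix me\n' > dirty.sh
run_check clean.sh; echo "clean exit: $?"
run_check dirty.sh; echo "dirty exit: $?"
rm -f clean.sh dirty.sh
```

   A CI driver can then run each check script in sequence (or in parallel, as the k8s/argo experiment mentions) and fail the build on any non-zero status.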
 



Issue Time Tracking
---

Worklog Id: (was: 269634)
Time Spent: 10m
Remaining Estimate: 0h

> Create separate unit and integration test executor dev-support script
> -
>
> Key: HDDS-1735
> URL: https://issues.apache.org/jira/browse/HDDS-1735
> Project: Hadoop Distributed Data Store
>  Issue Type: Improvement
>Reporter: Elek, Marton
>Assignee: Elek, Marton
>Priority: Major
>  Labels: pull-request-available
>  Time Spent: 10m
>  Remaining Estimate: 0h
>
> The hadoop-ozone/dev-support/checks directory contains multiple helper scripts to
> execute different types of tests (findbugs, rat, unit, build).
> They concisely define how tests should be executed, with the following contract:
>  * Problems should be printed out to the console
>  * In case of a test failure, a non-zero exit code should be used
>  
> The tests are working well (in fact I have some experiments with executing
> these scripts on k8s and argo, where all the shell scripts are executed in
> parallel), but we need some updates:
>  1. Most important: the unit tests and integration tests can be separated.
> Integration tests are more flaky, and it's better to have a way to run only
> the normal unit tests
>  2. As HDDS-1115 introduced a pom.ozone.xml, it's better to use it instead
> of the magical "am pl hadoop-ozone-dist" trick
>  3. To make it possible to run blockade tests in containers, we should use the
> -T flag with docker-compose
>  4. checkstyle violations are printed out to the console


