[jira] [Commented] (HIVE-18690) Integrate with Spark OutputMetrics

2018-06-07 Thread Aihua Xu (JIRA)


[ 
https://issues.apache.org/jira/browse/HIVE-18690?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16505135#comment-16505135
 ] 

Aihua Xu commented on HIVE-18690:
-

+1.

> Integrate with Spark OutputMetrics
> --
>
> Key: HIVE-18690
> URL: https://issues.apache.org/jira/browse/HIVE-18690
> Project: Hive
>  Issue Type: Sub-task
>  Components: Spark
>Reporter: Sahil Takiar
>Assignee: Sahil Takiar
>Priority: Major
> Attachments: HIVE-18690.1.patch, HIVE-18690.2.patch, 
> HIVE-18690.3.patch, HIVE-18690.4.patch, HIVE-18690.5.patch, HIVE-18690.6.patch
>
>
> Spark has an {{OutputMetrics}} object that it uses to expose records / bytes written. We 
> currently don't integrate with it and the Spark UI shows a blank value for 
> output records / bytes. We have our own custom accumulators instead (like 
> {{HIVE_RECORDS_OUT}}).
> Spark exposes the {{OutputMetrics}} object inside individual tasks via the 
> {{TaskContext.get()}} method. We can use this method to access the 
> {{OutputMetrics}} object and update it.
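
A minimal sketch of the idea in the description, assuming the Spark 2.x API where 
{{TaskContext.get()}} returns the per-task context and the {{OutputMetrics}} setters 
({{setRecordsWritten}} / {{setBytesWritten}}) are reachable from Java code (they are 
Scala {{private[spark]}}, which compiles to public bytecode). The class and method 
names below are illustrative, not the actual patch:

{code:java}
import org.apache.spark.TaskContext;
import org.apache.spark.executor.OutputMetrics;

public final class SparkOutputMetricsSketch {

  private SparkOutputMetricsSketch() {
  }

  /**
   * Best-effort update of Spark's per-task OutputMetrics. TaskContext.get()
   * returns null when we are not inside a Spark task (e.g. local execution),
   * in which case the update is simply skipped.
   */
  public static void reportWritten(long recordsWritten, long bytesWritten) {
    TaskContext taskContext = TaskContext.get();
    if (taskContext == null) {
      return;
    }
    OutputMetrics outputMetrics = taskContext.taskMetrics().outputMetrics();
    outputMetrics.setRecordsWritten(recordsWritten);
    outputMetrics.setBytesWritten(bytesWritten);
  }
}
{code}

A helper along these lines would presumably be called from the write path, next to 
where the existing {{HIVE_RECORDS_OUT}} accumulator is updated, so the Spark UI and 
the Hive counters stay consistent.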



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (HIVE-18690) Integrate with Spark OutputMetrics

2018-06-07 Thread Sahil Takiar (JIRA)


[ 
https://issues.apache.org/jira/browse/HIVE-18690?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16505008#comment-16505008
 ] 

Sahil Takiar commented on HIVE-18690:
-

Thanks for taking a look [~aihuaxu]. Addressed your comments and fixed the 
checkstyle issues.

> Integrate with Spark OutputMetrics
> --
>
> Key: HIVE-18690
> URL: https://issues.apache.org/jira/browse/HIVE-18690
> Project: Hive
>  Issue Type: Sub-task
>  Components: Spark
>Reporter: Sahil Takiar
>Assignee: Sahil Takiar
>Priority: Major
> Attachments: HIVE-18690.1.patch, HIVE-18690.2.patch, 
> HIVE-18690.3.patch, HIVE-18690.4.patch, HIVE-18690.5.patch, HIVE-18690.6.patch
>
>
> Spark has an {{OutputMetrics}} object that it uses to expose records / bytes written. We 
> currently don't integrate with it and the Spark UI shows a blank value for 
> output records / bytes. We have our own custom accumulators instead (like 
> {{HIVE_RECORDS_OUT}}).
> Spark exposes the {{OutputMetrics}} object inside individual tasks via the 
> {{TaskContext.get()}} method. We can use this method to access the 
> {{OutputMetrics}} object and update it.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (HIVE-18690) Integrate with Spark OutputMetrics

2018-06-07 Thread Aihua Xu (JIRA)


[ 
https://issues.apache.org/jira/browse/HIVE-18690?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16504933#comment-16504933
 ] 

Aihua Xu commented on HIVE-18690:
-

[~stakiar] The patch looks great. Can you check the style errors above? Also, one 
more question: is "file not visible" the only possible cause of the exception in 
updateSparkBytesWrittenMetrics? If not, maybe we can change the message to 
{{log.debug("Unable to collect file stats for file: " + path + ". Output 
metrics may be inaccurate", e);}}
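
For reference, a hypothetical sketch of the suggested handling, assuming an SLF4J 
logger and a Hadoop {{FileSystem}} stat call; this is not the actual body of 
{{updateSparkBytesWrittenMetrics}}, just an illustration of treating any stat 
failure as non-fatal and logging it at DEBUG:

{code:java}
import java.io.IOException;

import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.spark.TaskContext;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

final class SparkBytesWrittenSketch {

  private static final Logger LOG = LoggerFactory.getLogger(SparkBytesWrittenSketch.class);

  private SparkBytesWrittenSketch() {
  }

  static void updateSparkBytesWrittenMetrics(FileSystem fs, Path path) {
    try {
      // Report the size of the file we just wrote to Spark's OutputMetrics.
      long bytesWritten = fs.getFileStatus(path).getLen();
      TaskContext taskContext = TaskContext.get();
      if (taskContext != null) {
        taskContext.taskMetrics().outputMetrics().setBytesWritten(bytesWritten);
      }
    } catch (IOException e) {
      // A missing or not-yet-visible file is only one possible cause, so keep
      // the message generic: metrics are best-effort and failures here must
      // never fail the query.
      LOG.debug("Unable to collect file stats for file: " + path
          + ". Output metrics may be inaccurate", e);
    }
  }
}
{code}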



> Integrate with Spark OutputMetrics
> --
>
> Key: HIVE-18690
> URL: https://issues.apache.org/jira/browse/HIVE-18690
> Project: Hive
>  Issue Type: Sub-task
>  Components: Spark
>Reporter: Sahil Takiar
>Assignee: Sahil Takiar
>Priority: Major
> Attachments: HIVE-18690.1.patch, HIVE-18690.2.patch, 
> HIVE-18690.3.patch, HIVE-18690.4.patch, HIVE-18690.5.patch
>
>
> Spark has an {{OutputMetrics}} object that it uses to expose records / bytes written. We 
> currently don't integrate with it and the Spark UI shows a blank value for 
> output records / bytes. We have our own custom accumulators instead (like 
> {{HIVE_RECORDS_OUT}}).
> Spark exposes the {{OutputMetrics}} object inside individual tasks via the 
> {{TaskContext.get()}} method. We can use this method to access the 
> {{OutputMetrics}} object and update it.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (HIVE-18690) Integrate with Spark OutputMetrics

2018-06-07 Thread Sahil Takiar (JIRA)


[ 
https://issues.apache.org/jira/browse/HIVE-18690?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16504739#comment-16504739
 ] 

Sahil Takiar commented on HIVE-18690:
-

[~aihuaxu] could you take a look?

> Integrate with Spark OutputMetrics
> --
>
> Key: HIVE-18690
> URL: https://issues.apache.org/jira/browse/HIVE-18690
> Project: Hive
>  Issue Type: Sub-task
>  Components: Spark
>Reporter: Sahil Takiar
>Assignee: Sahil Takiar
>Priority: Major
> Attachments: HIVE-18690.1.patch, HIVE-18690.2.patch, 
> HIVE-18690.3.patch, HIVE-18690.4.patch, HIVE-18690.5.patch
>
>
> Spark has an {{OutputMetrics}} object that it uses to expose records / bytes written. We 
> currently don't integrate with it and the Spark UI shows a blank value for 
> output records / bytes. We have our own custom accumulators instead (like 
> {{HIVE_RECORDS_OUT}}).
> Spark exposes the {{OutputMetrics}} object inside individual tasks via the 
> {{TaskContext.get()}} method. We can use this method to access the 
> {{OutputMetrics}} object and update it.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (HIVE-18690) Integrate with Spark OutputMetrics

2018-06-07 Thread Hive QA (JIRA)


[ 
https://issues.apache.org/jira/browse/HIVE-18690?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16504732#comment-16504732
 ] 

Hive QA commented on HIVE-18690:




Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12926572/HIVE-18690.5.patch

{color:green}SUCCESS:{color} +1 due to 2 test(s) being added or modified.

{color:green}SUCCESS:{color} +1 due to 14481 tests passed

Test results: 
https://builds.apache.org/job/PreCommit-HIVE-Build/11587/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/11587/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-11587/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.YetusPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
{noformat}

This message is automatically generated.

ATTACHMENT ID: 12926572 - PreCommit-HIVE-Build

> Integrate with Spark OutputMetrics
> --
>
> Key: HIVE-18690
> URL: https://issues.apache.org/jira/browse/HIVE-18690
> Project: Hive
>  Issue Type: Sub-task
>  Components: Spark
>Reporter: Sahil Takiar
>Assignee: Sahil Takiar
>Priority: Major
> Attachments: HIVE-18690.1.patch, HIVE-18690.2.patch, 
> HIVE-18690.3.patch, HIVE-18690.4.patch, HIVE-18690.5.patch
>
>
> Spark has an {{OutputMetrics}} object that it uses to expose records / bytes written. We 
> currently don't integrate with it and the Spark UI shows a blank value for 
> output records / bytes. We have our own custom accumulators instead (like 
> {{HIVE_RECORDS_OUT}}).
> Spark exposes the {{OutputMetrics}} object inside individual tasks via the 
> {{TaskContext.get()}} method. We can use this method to access the 
> {{OutputMetrics}} object and update it.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (HIVE-18690) Integrate with Spark OutputMetrics

2018-06-07 Thread Hive QA (JIRA)


[ 
https://issues.apache.org/jira/browse/HIVE-18690?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16504677#comment-16504677
 ] 

Hive QA commented on HIVE-18690:


| (x) *{color:red}-1 overall{color}* |
\\
\\
|| Vote || Subsystem || Runtime || Comment ||
|| || || || {color:brown} Prechecks {color} ||
| {color:green}+1{color} | {color:green} @author {color} | {color:green}  0m  
0s{color} | {color:green} The patch does not contain any @author tags. {color} |
|| || || || {color:brown} master Compile Tests {color} ||
| {color:blue}0{color} | {color:blue} mvndep {color} | {color:blue}  0m 
31s{color} | {color:blue} Maven dependency ordering for branch {color} |
| {color:green}+1{color} | {color:green} mvninstall {color} | {color:green}  6m 
15s{color} | {color:green} master passed {color} |
| {color:green}+1{color} | {color:green} compile {color} | {color:green}  1m 
41s{color} | {color:green} master passed {color} |
| {color:green}+1{color} | {color:green} checkstyle {color} | {color:green}  0m 
55s{color} | {color:green} master passed {color} |
| {color:blue}0{color} | {color:blue} findbugs {color} | {color:blue}  0m 
33s{color} | {color:blue} itests/hive-unit in master has 2 extant Findbugs 
warnings. {color} |
| {color:blue}0{color} | {color:blue} findbugs {color} | {color:blue}  3m 
27s{color} | {color:blue} ql in master has 2284 extant Findbugs warnings. 
{color} |
| {color:blue}0{color} | {color:blue} findbugs {color} | {color:blue}  0m 
20s{color} | {color:blue} spark-client in master has 10 extant Findbugs 
warnings. {color} |
| {color:green}+1{color} | {color:green} javadoc {color} | {color:green}  1m 
16s{color} | {color:green} master passed {color} |
|| || || || {color:brown} Patch Compile Tests {color} ||
| {color:blue}0{color} | {color:blue} mvndep {color} | {color:blue}  0m  
8s{color} | {color:blue} Maven dependency ordering for patch {color} |
| {color:red}-1{color} | {color:red} mvninstall {color} | {color:red}  0m 
48s{color} | {color:red} ql in the patch failed. {color} |
| {color:green}+1{color} | {color:green} compile {color} | {color:green}  1m 
37s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} javac {color} | {color:green}  1m 
37s{color} | {color:green} the patch passed {color} |
| {color:red}-1{color} | {color:red} checkstyle {color} | {color:red}  0m 
33s{color} | {color:red} ql: The patch generated 3 new + 55 unchanged - 1 fixed 
= 58 total (was 56) {color} |
| {color:red}-1{color} | {color:red} checkstyle {color} | {color:red}  0m  
8s{color} | {color:red} spark-client: The patch generated 4 new + 20 unchanged 
- 0 fixed = 24 total (was 20) {color} |
| {color:green}+1{color} | {color:green} whitespace {color} | {color:green}  0m 
 0s{color} | {color:green} The patch has no whitespace issues. {color} |
| {color:green}+1{color} | {color:green} findbugs {color} | {color:green}  4m 
34s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} javadoc {color} | {color:green}  1m 
15s{color} | {color:green} the patch passed {color} |
|| || || || {color:brown} Other Tests {color} ||
| {color:red}-1{color} | {color:red} asflicense {color} | {color:red}  0m 
11s{color} | {color:red} The patch generated 1 ASF License warnings. {color} |
| {color:black}{color} | {color:black} {color} | {color:black} 25m 58s{color} | 
{color:black} {color} |
\\
\\
|| Subsystem || Report/Notes ||
| Optional Tests |  asflicense  javac  javadoc  findbugs  checkstyle  compile  |
| uname | Linux hiveptest-server-upstream 3.16.0-4-amd64 #1 SMP Debian 
3.16.36-1+deb8u1 (2016-09-03) x86_64 GNU/Linux |
| Build tool | maven |
| Personality | 
/data/hiveptest/working/yetus_PreCommit-HIVE-Build-11587/dev-support/hive-personality.sh
 |
| git revision | master / cfd5734 |
| Default Java | 1.8.0_111 |
| findbugs | v3.0.0 |
| mvninstall | 
http://104.198.109.242/logs//PreCommit-HIVE-Build-11587/yetus/patch-mvninstall-ql.txt
 |
| checkstyle | 
http://104.198.109.242/logs//PreCommit-HIVE-Build-11587/yetus/diff-checkstyle-ql.txt
 |
| checkstyle | 
http://104.198.109.242/logs//PreCommit-HIVE-Build-11587/yetus/diff-checkstyle-spark-client.txt
 |
| asflicense | 
http://104.198.109.242/logs//PreCommit-HIVE-Build-11587/yetus/patch-asflicense-problems.txt
 |
| modules | C: itests/hive-unit ql spark-client U: . |
| Console output | 
http://104.198.109.242/logs//PreCommit-HIVE-Build-11587/yetus.txt |
| Powered by | Apache Yetus http://yetus.apache.org |


This message was automatically generated.



> Integrate with Spark OutputMetrics
> --
>
> Key: HIVE-18690
> URL: https://issues.apache.org/jira/browse/HIVE-18690
> Project: Hive
>  Issue Type: Sub-task
>  Components: Spark
>Reporter: Sahil Takiar
>Assignee: Sahil Takiar
>Priority: Major
> Attachments: 

[jira] [Commented] (HIVE-18690) Integrate with Spark OutputMetrics

2018-06-04 Thread Hive QA (JIRA)


[ 
https://issues.apache.org/jira/browse/HIVE-18690?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16501065#comment-16501065
 ] 

Hive QA commented on HIVE-18690:




Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12926097/HIVE-18690.4.patch

{color:red}ERROR:{color} -1 due to build exiting with an error

Test results: 
https://builds.apache.org/job/PreCommit-HIVE-Build/11509/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/11509/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-11509/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Tests exited with: NonZeroExitCodeException
Command 'bash /data/hiveptest/working/scratch/source-prep.sh' failed with exit 
status 1 and output '+ date '+%Y-%m-%d %T.%3N'
2018-06-05 00:09:27.506
+ [[ -n /usr/lib/jvm/java-8-openjdk-amd64 ]]
+ export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ export 
PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ 
PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m '
+ ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m '
+ export 'MAVEN_OPTS=-Xmx1g '
+ MAVEN_OPTS='-Xmx1g '
+ cd /data/hiveptest/working/
+ tee /data/hiveptest/logs/PreCommit-HIVE-Build-11509/source-prep.txt
+ [[ false == \t\r\u\e ]]
+ mkdir -p maven ivy
+ [[ git = \s\v\n ]]
+ [[ git = \g\i\t ]]
+ [[ -z master ]]
+ [[ -d apache-github-source-source ]]
+ [[ ! -d apache-github-source-source/.git ]]
+ [[ ! -d apache-github-source-source ]]
+ date '+%Y-%m-%d %T.%3N'
2018-06-05 00:09:27.508
+ cd apache-github-source-source
+ git fetch origin
From https://github.com/apache/hive
   5667af3..bf70bd2  master -> origin/master
   5ec8e35..ab1be56  branch-3   -> origin/branch-3
+ git reset --hard HEAD
HEAD is now at 5667af3 HIVE-19690 : multi-insert query with multiple GBY, and 
distinct in only some branches can produce incorrect results (Sergey Shelukhin, 
reviewed by Ashutosh Chauhan)
+ git clean -f -d
+ git checkout master
Already on 'master'
Your branch is behind 'origin/master' by 1 commit, and can be fast-forwarded.
  (use "git pull" to update your local branch)
+ git reset --hard origin/master
HEAD is now at bf70bd2 HIVE-19096: query result cache interferes with explain 
analyze (Jason Dere, reviewed by Zoltan Haindrich)
+ git merge --ff-only origin/master
Already up-to-date.
+ date '+%Y-%m-%d %T.%3N'
2018-06-05 00:09:29.623
+ rm -rf ../yetus_PreCommit-HIVE-Build-11509
+ mkdir ../yetus_PreCommit-HIVE-Build-11509
+ git gc
+ cp -R . ../yetus_PreCommit-HIVE-Build-11509
+ mkdir /data/hiveptest/logs/PreCommit-HIVE-Build-11509/yetus
+ patchCommandPath=/data/hiveptest/working/scratch/smart-apply-patch.sh
+ patchFilePath=/data/hiveptest/working/scratch/build.patch
+ [[ -f /data/hiveptest/working/scratch/build.patch ]]
+ chmod +x /data/hiveptest/working/scratch/smart-apply-patch.sh
+ /data/hiveptest/working/scratch/smart-apply-patch.sh 
/data/hiveptest/working/scratch/build.patch
error: 
a/itests/hive-unit/src/test/java/org/apache/hadoop/hive/ql/exec/spark/TestSparkStatistics.java:
 does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/FileSinkOperator.java: does 
not exist in index
error: 
a/ql/src/java/org/apache/hadoop/hive/ql/exec/spark/Statistic/SparkStatisticsNames.java:
 does not exist in index
error: 
a/ql/src/java/org/apache/hadoop/hive/ql/exec/spark/status/impl/SparkMetricsUtils.java:
 does not exist in index
error: 
a/spark-client/src/main/java/org/apache/hive/spark/client/MetricsCollection.java:
 does not exist in index
error: 
a/spark-client/src/main/java/org/apache/hive/spark/client/metrics/Metrics.java: 
does not exist in index
error: 
a/spark-client/src/test/java/org/apache/hive/spark/client/TestMetricsCollection.java:
 does not exist in index
error: patch failed: 
itests/hive-unit/src/test/java/org/apache/hadoop/hive/ql/exec/spark/TestSparkStatistics.java:81
Falling back to three-way merge...
Applied patch to 
'itests/hive-unit/src/test/java/org/apache/hadoop/hive/ql/exec/spark/TestSparkStatistics.java'
 with conflicts.
error: patch failed: 
ql/src/java/org/apache/hadoop/hive/ql/exec/spark/Statistic/SparkStatisticsNames.java:41
Falling back to three-way merge...
Applied patch to 
'ql/src/java/org/apache/hadoop/hive/ql/exec/spark/Statistic/SparkStatisticsNames.java'
 cleanly.
error: patch failed: 
ql/src/java/org/apache/hadoop/hive/ql/exec/spark/status/impl/SparkMetricsUtils.java:58
Falling back to three-way merge...
Applied patch to 
'ql/src/java/org/apache/hadoop/hive/ql/exec/spark/status/impl/SparkMetricsUtils.java'
 cleanly.
error: patch failed: 

[jira] [Commented] (HIVE-18690) Integrate with Spark OutputMetrics

2018-06-01 Thread Hive QA (JIRA)


[ 
https://issues.apache.org/jira/browse/HIVE-18690?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16497908#comment-16497908
 ] 

Hive QA commented on HIVE-18690:




Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12925632/HIVE-18690.3.patch

{color:red}ERROR:{color} -1 due to build exiting with an error

Test results: 
https://builds.apache.org/job/PreCommit-HIVE-Build/11411/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/11411/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-11411/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Tests exited with: NonZeroExitCodeException
Command 'bash /data/hiveptest/working/scratch/source-prep.sh' failed with exit 
status 1 and output '+ date '+%Y-%m-%d %T.%3N'
2018-06-01 12:08:03.431
+ [[ -n /usr/lib/jvm/java-8-openjdk-amd64 ]]
+ export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ export 
PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ 
PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m '
+ ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m '
+ export 'MAVEN_OPTS=-Xmx1g '
+ MAVEN_OPTS='-Xmx1g '
+ cd /data/hiveptest/working/
+ tee /data/hiveptest/logs/PreCommit-HIVE-Build-11411/source-prep.txt
+ [[ false == \t\r\u\e ]]
+ mkdir -p maven ivy
+ [[ git = \s\v\n ]]
+ [[ git = \g\i\t ]]
+ [[ -z master ]]
+ [[ -d apache-github-source-source ]]
+ [[ ! -d apache-github-source-source/.git ]]
+ [[ ! -d apache-github-source-source ]]
+ date '+%Y-%m-%d %T.%3N'
2018-06-01 12:08:03.434
+ cd apache-github-source-source
+ git fetch origin
+ git reset --hard HEAD
HEAD is now at 28779d2 HIVE-19370: Issue: ADD Months function on timestamp 
datatype fields in hive (Bharathkrishna Guruvayoor Murali, reviewed by Peter 
Vary)
+ git clean -f -d
+ git checkout master
Already on 'master'
Your branch is up-to-date with 'origin/master'.
+ git reset --hard origin/master
HEAD is now at 28779d2 HIVE-19370: Issue: ADD Months function on timestamp 
datatype fields in hive (Bharathkrishna Guruvayoor Murali, reviewed by Peter 
Vary)
+ git merge --ff-only origin/master
Already up-to-date.
+ date '+%Y-%m-%d %T.%3N'
2018-06-01 12:08:03.993
+ rm -rf ../yetus_PreCommit-HIVE-Build-11411
+ mkdir ../yetus_PreCommit-HIVE-Build-11411
+ git gc
+ cp -R . ../yetus_PreCommit-HIVE-Build-11411
+ mkdir /data/hiveptest/logs/PreCommit-HIVE-Build-11411/yetus
+ patchCommandPath=/data/hiveptest/working/scratch/smart-apply-patch.sh
+ patchFilePath=/data/hiveptest/working/scratch/build.patch
+ [[ -f /data/hiveptest/working/scratch/build.patch ]]
+ chmod +x /data/hiveptest/working/scratch/smart-apply-patch.sh
+ /data/hiveptest/working/scratch/smart-apply-patch.sh 
/data/hiveptest/working/scratch/build.patch
error: 
a/itests/hive-unit/src/test/java/org/apache/hadoop/hive/ql/exec/spark/TestSparkStatistics.java:
 does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/FileSinkOperator.java: does 
not exist in index
error: 
a/ql/src/java/org/apache/hadoop/hive/ql/exec/spark/Statistic/SparkStatisticsNames.java:
 does not exist in index
error: 
a/ql/src/java/org/apache/hadoop/hive/ql/exec/spark/status/impl/SparkMetricsUtils.java:
 does not exist in index
error: 
a/spark-client/src/main/java/org/apache/hive/spark/client/MetricsCollection.java:
 does not exist in index
error: 
a/spark-client/src/main/java/org/apache/hive/spark/client/metrics/Metrics.java: 
does not exist in index
error: 
a/spark-client/src/test/java/org/apache/hive/spark/client/TestMetricsCollection.java:
 does not exist in index
error: patch failed: 
ql/src/java/org/apache/hadoop/hive/ql/exec/FileSinkOperator.java:22
Falling back to three-way merge...
Applied patch to 
'ql/src/java/org/apache/hadoop/hive/ql/exec/FileSinkOperator.java' with 
conflicts.
Going to apply patch with: git apply -p1
error: patch failed: 
ql/src/java/org/apache/hadoop/hive/ql/exec/FileSinkOperator.java:22
Falling back to three-way merge...
Applied patch to 
'ql/src/java/org/apache/hadoop/hive/ql/exec/FileSinkOperator.java' with 
conflicts.
U ql/src/java/org/apache/hadoop/hive/ql/exec/FileSinkOperator.java
+ result=1
+ '[' 1 -ne 0 ']'
+ rm -rf yetus_PreCommit-HIVE-Build-11411
+ exit 1
'
{noformat}

This message is automatically generated.

ATTACHMENT ID: 12925632 - PreCommit-HIVE-Build

> Integrate with Spark OutputMetrics
> --
>
> Key: HIVE-18690
> URL: https://issues.apache.org/jira/browse/HIVE-18690
> Project: Hive
>  Issue Type: Sub-task
>  Components: Spark
>Reporter: Sahil Takiar
>Assignee: Sahil Takiar
>Priority: 

[jira] [Commented] (HIVE-18690) Integrate with Spark OutputMetrics

2018-04-18 Thread Hive QA (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-18690?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16443517#comment-16443517
 ] 

Hive QA commented on HIVE-18690:




Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12919478/HIVE-18690.2.patch

{color:green}SUCCESS:{color} +1 due to 2 test(s) being added or modified.

{color:red}ERROR:{color} -1 due to 54 failed/errored test(s), 14287 tests 
executed
*Failed tests:*
{noformat}
TestNonCatCallsWithCatalog - did not produce a TEST-*.xml file (likely timed 
out) (batchId=217)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[llap_smb] (batchId=92)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[parquet_vectorization_0] 
(batchId=17)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[results_cache_invalidation2]
 (batchId=39)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[tez_join_hash] 
(batchId=54)
org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[results_cache_invalidation2]
 (batchId=163)
org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[sysdb] 
(batchId=163)
org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[tez_smb_1] 
(batchId=171)
org.apache.hadoop.hive.cli.TestMiniTezCliDriver.testCliDriver[explainanalyze_5] 
(batchId=105)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[alter_notnull_constraint_violation]
 (batchId=96)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[avro_non_nullable_union]
 (batchId=96)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[cachingprintstream]
 (batchId=95)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[check_constraint_violation]
 (batchId=95)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[compute_stats_long]
 (batchId=96)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[default_constraint_invalid_default_value_type]
 (batchId=96)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[dyn_part3] 
(batchId=95)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[dyn_part_max_per_node]
 (batchId=95)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[dynamic_partitions_with_whitelist]
 (batchId=96)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[insert_into_acid_notnull]
 (batchId=95)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[insert_into_notnull_constraint]
 (batchId=95)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[insert_multi_into_notnull]
 (batchId=96)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[insert_overwrite_notnull_constraint]
 (batchId=96)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[insertsel_fail] 
(batchId=95)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[merge_constraint_notnull]
 (batchId=96)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[script_broken_pipe2]
 (batchId=95)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[script_broken_pipe3]
 (batchId=96)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[script_error] 
(batchId=95)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[serde_regex2] 
(batchId=96)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[stats_aggregator_error_2]
 (batchId=96)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[stats_publisher_error_1]
 (batchId=96)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[stats_publisher_error_2]
 (batchId=95)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[subquery_corr_in_agg]
 (batchId=96)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[subquery_in_implicit_gby]
 (batchId=95)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[subquery_notin_implicit_gby]
 (batchId=96)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[subquery_scalar_corr_multi_rows]
 (batchId=96)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[subquery_scalar_multi_rows]
 (batchId=96)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[udf_assert_true2]
 (batchId=96)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[udf_assert_true] 
(batchId=96)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[udf_reflect_neg] 
(batchId=96)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[udf_test_error] 
(batchId=96)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[udf_test_error_reduce]
 (batchId=95)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[update_notnull_constraint]
 (batchId=95)
org.apache.hadoop.hive.cli.TestNegativeMinimrCliDriver.testCliDriver[cluster_tasklog_retrieval]
 (batchId=98)
org.apache.hadoop.hive.cli.TestNegativeMinimrCliDriver.testCliDriver[local_mapred_error_cache]
 (batchId=98)

[jira] [Commented] (HIVE-18690) Integrate with Spark OutputMetrics

2018-04-18 Thread Hive QA (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-18690?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16443474#comment-16443474
 ] 

Hive QA commented on HIVE-18690:


| (x) *{color:red}-1 overall{color}* |
\\
\\
|| Vote || Subsystem || Runtime || Comment ||
|| || || || {color:brown} Prechecks {color} ||
| {color:blue}0{color} | {color:blue} findbugs {color} | {color:blue}  0m  
1s{color} | {color:blue} Findbugs executables are not available. {color} |
| {color:green}+1{color} | {color:green} @author {color} | {color:green}  0m  
0s{color} | {color:green} The patch does not contain any @author tags. {color} |
|| || || || {color:brown} master Compile Tests {color} ||
| {color:blue}0{color} | {color:blue} mvndep {color} | {color:blue}  0m 
42s{color} | {color:blue} Maven dependency ordering for branch {color} |
| {color:green}+1{color} | {color:green} mvninstall {color} | {color:green}  7m 
39s{color} | {color:green} master passed {color} |
| {color:green}+1{color} | {color:green} compile {color} | {color:green}  2m  
5s{color} | {color:green} master passed {color} |
| {color:green}+1{color} | {color:green} checkstyle {color} | {color:green}  1m 
 9s{color} | {color:green} master passed {color} |
| {color:green}+1{color} | {color:green} javadoc {color} | {color:green}  1m 
43s{color} | {color:green} master passed {color} |
|| || || || {color:brown} Patch Compile Tests {color} ||
| {color:blue}0{color} | {color:blue} mvndep {color} | {color:blue}  0m  
9s{color} | {color:blue} Maven dependency ordering for patch {color} |
| {color:red}-1{color} | {color:red} mvninstall {color} | {color:red}  0m 
57s{color} | {color:red} ql in the patch failed. {color} |
| {color:green}+1{color} | {color:green} compile {color} | {color:green}  2m 
16s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} javac {color} | {color:green}  2m 
16s{color} | {color:green} the patch passed {color} |
| {color:red}-1{color} | {color:red} checkstyle {color} | {color:red}  0m 
40s{color} | {color:red} ql: The patch generated 3 new + 68 unchanged - 13 
fixed = 71 total (was 81) {color} |
| {color:red}-1{color} | {color:red} checkstyle {color} | {color:red}  0m 
10s{color} | {color:red} spark-client: The patch generated 5 new + 20 unchanged 
- 0 fixed = 25 total (was 20) {color} |
| {color:green}+1{color} | {color:green} whitespace {color} | {color:green}  0m 
 0s{color} | {color:green} The patch has no whitespace issues. {color} |
| {color:green}+1{color} | {color:green} javadoc {color} | {color:green}  1m 
33s{color} | {color:green} the patch passed {color} |
|| || || || {color:brown} Other Tests {color} ||
| {color:red}-1{color} | {color:red} asflicense {color} | {color:red}  0m 
14s{color} | {color:red} The patch generated 3 ASF License warnings. {color} |
| {color:black}{color} | {color:black} {color} | {color:black} 21m  4s{color} | 
{color:black} {color} |
\\
\\
|| Subsystem || Report/Notes ||
| Optional Tests |  asflicense  javac  javadoc  findbugs  checkstyle  compile  |
| uname | Linux hiveptest-server-upstream 3.16.0-4-amd64 #1 SMP Debian 
3.16.36-1+deb8u1 (2016-09-03) x86_64 GNU/Linux |
| Build tool | maven |
| Personality | 
/data/hiveptest/working/yetus_PreCommit-HIVE-Build-10311/dev-support/hive-personality.sh
 |
| git revision | master / 760d472 |
| Default Java | 1.8.0_111 |
| mvninstall | 
http://104.198.109.242/logs//PreCommit-HIVE-Build-10311/yetus/patch-mvninstall-ql.txt
 |
| checkstyle | 
http://104.198.109.242/logs//PreCommit-HIVE-Build-10311/yetus/diff-checkstyle-ql.txt
 |
| checkstyle | 
http://104.198.109.242/logs//PreCommit-HIVE-Build-10311/yetus/diff-checkstyle-spark-client.txt
 |
| asflicense | 
http://104.198.109.242/logs//PreCommit-HIVE-Build-10311/yetus/patch-asflicense-problems.txt
 |
| modules | C: itests/hive-unit ql spark-client U: . |
| Console output | 
http://104.198.109.242/logs//PreCommit-HIVE-Build-10311/yetus.txt |
| Powered by | Apache Yetus http://yetus.apache.org |


This message was automatically generated.



> Integrate with Spark OutputMetrics
> --
>
> Key: HIVE-18690
> URL: https://issues.apache.org/jira/browse/HIVE-18690
> Project: Hive
>  Issue Type: Sub-task
>  Components: Spark
>Reporter: Sahil Takiar
>Assignee: Sahil Takiar
>Priority: Major
> Attachments: HIVE-18690.1.patch, HIVE-18690.2.patch
>
>
> Spark has an {{OutputMetrics}} object that it uses to expose records / bytes written. We 
> currently don't integrate with it and the Spark UI shows a blank value for 
> output records / bytes. We have our own custom accumulators instead (like 
> {{HIVE_RECORDS_OUT}}).
> Spark exposes the {{OutputMetrics}} object inside individual tasks via the 
> {{TaskContext.get()}} method. We can use this method to access the 
> {{OutputMetrics}} object and update it.



--
This 

[jira] [Commented] (HIVE-18690) Integrate with Spark OutputMetrics

2018-04-16 Thread Hive QA (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-18690?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16440072#comment-16440072
 ] 

Hive QA commented on HIVE-18690:




Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12919135/HIVE-18690.1.patch

{color:green}SUCCESS:{color} +1 due to 1 test(s) being added or modified.

{color:red}ERROR:{color} -1 due to 245 failed/errored test(s), 14230 tests 
executed
*Failed tests:*
{noformat}
TestDbNotificationListener - did not produce a TEST-*.xml file (likely timed 
out) (batchId=247)
TestHCatHiveCompatibility - did not produce a TEST-*.xml file (likely timed 
out) (batchId=247)
TestNonCatCallsWithCatalog - did not produce a TEST-*.xml file (likely timed 
out) (batchId=217)
TestSequenceFileReadWrite - did not produce a TEST-*.xml file (likely timed 
out) (batchId=247)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[llap_smb] (batchId=92)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[parquet_vectorization_0] 
(batchId=17)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[tez_join_hash] 
(batchId=54)
org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[default_constraint]
 (batchId=163)
org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[sysdb] 
(batchId=163)
org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[tez_smb_1] 
(batchId=171)
org.apache.hadoop.hive.cli.TestMiniSparkOnYarnCliDriver.testCliDriver[auto_sortmerge_join_16]
 (batchId=183)
org.apache.hadoop.hive.cli.TestMiniSparkOnYarnCliDriver.testCliDriver[infer_bucket_sort_num_buckets]
 (batchId=183)
org.apache.hadoop.hive.cli.TestMiniSparkOnYarnCliDriver.testCliDriver[list_bucket_dml_10]
 (batchId=182)
org.apache.hadoop.hive.cli.TestMiniSparkOnYarnCliDriver.testCliDriver[orc_merge1]
 (batchId=182)
org.apache.hadoop.hive.cli.TestMiniSparkOnYarnCliDriver.testCliDriver[orc_merge2]
 (batchId=185)
org.apache.hadoop.hive.cli.TestMiniSparkOnYarnCliDriver.testCliDriver[orc_merge7]
 (batchId=185)
org.apache.hadoop.hive.cli.TestMiniSparkOnYarnCliDriver.testCliDriver[orc_merge_diff_fs]
 (batchId=182)
org.apache.hadoop.hive.cli.TestMiniSparkOnYarnCliDriver.testCliDriver[orc_merge_incompat2]
 (batchId=185)
org.apache.hadoop.hive.cli.TestMiniSparkOnYarnCliDriver.testCliDriver[spark_dynamic_partition_pruning]
 (batchId=182)
org.apache.hadoop.hive.cli.TestMiniTezCliDriver.testCliDriver[explainanalyze_5] 
(batchId=105)
org.apache.hadoop.hive.cli.TestMinimrCliDriver.testCliDriver[infer_bucket_sort_dyn_part]
 (batchId=93)
org.apache.hadoop.hive.cli.TestMinimrCliDriver.testCliDriver[infer_bucket_sort_map_operators]
 (batchId=93)
org.apache.hadoop.hive.cli.TestMinimrCliDriver.testCliDriver[infer_bucket_sort_num_buckets]
 (batchId=93)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.org.apache.hadoop.hive.cli.TestNegativeCliDriver
 (batchId=95)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.org.apache.hadoop.hive.cli.TestNegativeCliDriver
 (batchId=96)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[alter_notnull_constraint_violation]
 (batchId=96)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[default_constraint_invalid_default_value_type]
 (batchId=96)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[insert_into3] 
(batchId=95)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[insert_into4] 
(batchId=95)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[insert_into5] 
(batchId=95)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[insert_into_acid_notnull]
 (batchId=95)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[insert_into_notnull_constraint]
 (batchId=95)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[insert_multi_into_notnull]
 (batchId=96)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[insert_overwrite_notnull_constraint]
 (batchId=96)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[insert_sorted] 
(batchId=95)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[insertsel_fail] 
(batchId=95)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[joinneg] 
(batchId=95)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[lockneg1] 
(batchId=95)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[lockneg3] 
(batchId=95)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[lockneg4] 
(batchId=95)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[materialized_view_authorization_create_no_select_perm]
 (batchId=95)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[materialized_view_authorization_rebuild_no_grant]
 (batchId=95)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[materialized_view_authorization_rebuild_other]
 (batchId=95)

[jira] [Commented] (HIVE-18690) Integrate with Spark OutputMetrics

2018-04-16 Thread Hive QA (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-18690?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16439995#comment-16439995
 ] 

Hive QA commented on HIVE-18690:


| (x) *{color:red}-1 overall{color}* |
\\
\\
|| Vote || Subsystem || Runtime || Comment ||
|| || || || {color:brown} Prechecks {color} ||
| {color:blue}0{color} | {color:blue} findbugs {color} | {color:blue}  0m  
0s{color} | {color:blue} Findbugs executables are not available. {color} |
| {color:green}+1{color} | {color:green} @author {color} | {color:green}  0m  
0s{color} | {color:green} The patch does not contain any @author tags. {color} |
|| || || || {color:brown} master Compile Tests {color} ||
| {color:blue}0{color} | {color:blue} mvndep {color} | {color:blue}  1m  
1s{color} | {color:blue} Maven dependency ordering for branch {color} |
| {color:green}+1{color} | {color:green} mvninstall {color} | {color:green}  8m 
16s{color} | {color:green} master passed {color} |
| {color:green}+1{color} | {color:green} compile {color} | {color:green}  1m 
33s{color} | {color:green} master passed {color} |
| {color:green}+1{color} | {color:green} checkstyle {color} | {color:green}  0m 
57s{color} | {color:green} master passed {color} |
| {color:green}+1{color} | {color:green} javadoc {color} | {color:green}  1m 
25s{color} | {color:green} master passed {color} |
|| || || || {color:brown} Patch Compile Tests {color} ||
| {color:blue}0{color} | {color:blue} mvndep {color} | {color:blue}  0m  
9s{color} | {color:blue} Maven dependency ordering for patch {color} |
| {color:red}-1{color} | {color:red} mvninstall {color} | {color:red}  1m  
3s{color} | {color:red} ql in the patch failed. {color} |
| {color:green}+1{color} | {color:green} compile {color} | {color:green}  1m 
19s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} javac {color} | {color:green}  1m 
19s{color} | {color:green} the patch passed {color} |
| {color:red}-1{color} | {color:red} checkstyle {color} | {color:red}  0m 
45s{color} | {color:red} ql: The patch generated 3 new + 68 unchanged - 13 
fixed = 71 total (was 81) {color} |
| {color:red}-1{color} | {color:red} checkstyle {color} | {color:red}  0m 
10s{color} | {color:red} spark-client: The patch generated 5 new + 20 unchanged 
- 0 fixed = 25 total (was 20) {color} |
| {color:green}+1{color} | {color:green} whitespace {color} | {color:green}  0m 
 0s{color} | {color:green} The patch has no whitespace issues. {color} |
| {color:green}+1{color} | {color:green} javadoc {color} | {color:green}  1m 
15s{color} | {color:green} the patch passed {color} |
|| || || || {color:brown} Other Tests {color} ||
| {color:red}-1{color} | {color:red} asflicense {color} | {color:red}  0m 
16s{color} | {color:red} The patch generated 3 ASF License warnings. {color} |
| {color:black}{color} | {color:black} {color} | {color:black} 19m  0s{color} | 
{color:black} {color} |
\\
\\
|| Subsystem || Report/Notes ||
| Optional Tests |  asflicense  javac  javadoc  findbugs  checkstyle  compile  |
| uname | Linux hiveptest-server-upstream 3.16.0-4-amd64 #1 SMP Debian 
3.16.36-1+deb8u1 (2016-09-03) x86_64 GNU/Linux |
| Build tool | maven |
| Personality | 
/data/hiveptest/working/yetus_PreCommit-HIVE-Build-10256/dev-support/hive-personality.sh
 |
| git revision | master / 6afa544 |
| Default Java | 1.8.0_111 |
| mvninstall | 
http://104.198.109.242/logs//PreCommit-HIVE-Build-10256/yetus/patch-mvninstall-ql.txt
 |
| checkstyle | 
http://104.198.109.242/logs//PreCommit-HIVE-Build-10256/yetus/diff-checkstyle-ql.txt
 |
| checkstyle | 
http://104.198.109.242/logs//PreCommit-HIVE-Build-10256/yetus/diff-checkstyle-spark-client.txt
 |
| asflicense | 
http://104.198.109.242/logs//PreCommit-HIVE-Build-10256/yetus/patch-asflicense-problems.txt
 |
| modules | C: ql spark-client U: . |
| Console output | 
http://104.198.109.242/logs//PreCommit-HIVE-Build-10256/yetus.txt |
| Powered by | Apache Yetus http://yetus.apache.org |


This message was automatically generated.



> Integrate with Spark OutputMetrics
> --
>
> Key: HIVE-18690
> URL: https://issues.apache.org/jira/browse/HIVE-18690
> Project: Hive
>  Issue Type: Sub-task
>  Components: Spark
>Reporter: Sahil Takiar
>Assignee: Sahil Takiar
>Priority: Major
> Attachments: HIVE-18690.1.patch
>
>
> Spark has an {{OutputMetrics}} object that it uses to expose records / bytes written. We 
> currently don't integrate with it and the Spark UI shows a blank value for 
> output records / bytes. We have our own custom accumulators instead (like 
> {{HIVE_RECORDS_OUT}}).
> Spark exposes the {{OutputMetrics}} object inside individual tasks via the 
> {{TaskContext.get()}} method. We can use this method to access the 
> {{OutputMetrics}} object and update it.



--
This message was sent by Atlassian JIRA

[jira] [Commented] (HIVE-18690) Integrate with Spark OutputMetrics

2018-02-12 Thread Sahil Takiar (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-18690?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16361760#comment-16361760
 ] 

Sahil Takiar commented on HIVE-18690:
-

I hacked some code together locally and this does work. The question is how to 
do it in the correct manner.

> Integrate with Spark OutputMetrics
> --
>
> Key: HIVE-18690
> URL: https://issues.apache.org/jira/browse/HIVE-18690
> Project: Hive
>  Issue Type: Sub-task
>  Components: Spark
>Reporter: Sahil Takiar
>Priority: Major
>
> Spark has an {{OutputMetrics}} object that it uses to expose records / bytes written. We 
> currently don't integrate with it and the Spark UI shows a blank value for 
> output records / bytes. We have our own custom accumulators instead (like 
> {{HIVE_RECORDS_OUT}}).
> Spark exposes the {{OutputMetrics}} object inside individual tasks via the 
> {{TaskContext.get()}} method. We can use this method to access the 
> {{OutputMetrics}} object and update it.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)