[jira] [Commented] (HIVE-18389) Print out Spark Web UI URL to the console log
[ https://issues.apache.org/jira/browse/HIVE-18389?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16355348#comment-16355348 ]

Peter Vary commented on HIVE-18389:
-----------------------------------

Yeah, and refactoring the {{LogHelper}} into the hive-common package is not an easy feat, since it uses the SessionState-owned streams, which are already leaked and even set from other classes. So that is for another Jira :) Then +1 for this one.

> Print out Spark Web UI URL to the console log
> ---------------------------------------------
>
>                 Key: HIVE-18389
>                 URL: https://issues.apache.org/jira/browse/HIVE-18389
>             Project: Hive
>          Issue Type: Sub-task
>          Components: Spark
>            Reporter: Sahil Takiar
>            Assignee: Sahil Takiar
>            Priority: Major
>         Attachments: HIVE-18389.1.patch
>
>
> Should be accessible via {{SparkContext#uiWebUrl}}. It just needs to be sent
> from the {{RemoteDriver}} to HS2.

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
[ https://issues.apache.org/jira/browse/HIVE-18389?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16354130#comment-16354130 ]

Sahil Takiar commented on HIVE-18389:
-------------------------------------

Yes, that would be the ideal approach, but unfortunately {{LogHelper}} is in the {{ql}} module while this code is in the {{spark-client}} module. There already exists a dependency from {{ql}} to {{spark-client}}. Using the {{LogHelper}} would require adding a dependency from {{spark-client}} to {{ql}}, which would create a circular dependency and break the build.
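[For readers following along: since {{spark-client}} cannot see {{ql}}, the patch instead sends the URL from the {{RemoteDriver}} to HS2 as a message, per the issue description. A rough, self-contained sketch of that shape is below; the {{SparkUIWebURL}} class and {{UIWebURL}} field names mirror the snippet discussed in this thread, but the rest (class name, {{handle}} signature, the literal URL) is illustrative, not the actual Hive code.]

```java
import java.io.Serializable;

public class UiUrlMessageSketch {

    // Serializable payload carrying the Web UI URL from the remote driver
    // process back to HS2 over the existing RPC channel.
    static class SparkUIWebURL implements Serializable {
        final String UIWebURL;

        SparkUIWebURL(String uiWebUrl) {
            this.UIWebURL = uiWebUrl;
        }
    }

    // Stand-in for the HS2-side handler: builds the line it would print
    // to the console and the logs.
    static String handle(SparkUIWebURL msg) {
        return "Hive on Spark Session Web UI URL: " + msg.UIWebURL;
    }

    public static void main(String[] args) {
        // Driver side: in the real code the URL would come from
        // SparkContext#uiWebUrl; here we just use a literal.
        SparkUIWebURL msg = new SparkUIWebURL("http://driver-host:4040");
        System.out.println(handle(msg));
    }
}
```

Because the message type lives in {{spark-client}} and HS2 already depends on that module, no new dependency edge is needed in either direction.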
[ https://issues.apache.org/jira/browse/HIVE-18389?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16353696#comment-16353696 ]

Peter Vary commented on HIVE-18389:
-----------------------------------

[~stakiar]: I have a question about this part of the code:
{code:java}
private void handle(ChannelHandlerContext ctx, SparkUIWebURL msg) {
  String printMsg = "Hive on Spark Session Web UI URL: " + msg.UIWebURL;
  consoleStream.println(printMsg);
  LOG.info(printMsg);
}
{code}
Is the goal of this change to print the Web UI URL both to the logs and to the console? If I remember correctly, with {{LogHelper.logInfo}} we can achieve the same in one line, with the additional benefit of honoring the {{silent}} setting if the client set the output to silent. So if we used a {{LogHelper console}} instead of a {{PrintStream consoleStream}} we might be better off in the end. Is there any drawback to this approach?

Thanks,
Peter
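[For readers following along: the behavior Peter is describing can be sketched with a simplified stand-in for Hive's {{SessionState.LogHelper}}. This is not the real class: only the console side and the silent flag are modeled here (the real helper also forwards to the SLF4J logger), and all names in the sketch are illustrative.]

```java
import java.io.ByteArrayOutputStream;
import java.io.PrintStream;

public class WebUiLogSketch {

    // Simplified stand-in for a LogHelper-style console: one call prints to
    // the console unless the client asked for silent output.
    static class LogHelper {
        private final PrintStream console;
        private final boolean silent;

        LogHelper(PrintStream console, boolean silent) {
            this.console = console;
            this.silent = silent;
        }

        void logInfo(String msg) {
            // The real helper would also log here regardless of the flag;
            // only the console write is suppressed in silent mode.
            if (!silent) {
                console.println(msg);
            }
        }
    }

    // Builds the line the handler in the patch prints.
    static String buildMessage(String uiWebUrl) {
        return "Hive on Spark Session Web UI URL: " + uiWebUrl;
    }

    public static void main(String[] args) {
        ByteArrayOutputStream captured = new ByteArrayOutputStream();

        // Normal mode: the URL reaches the console.
        new LogHelper(new PrintStream(captured), false)
            .logInfo(buildMessage("http://example-host:4040"));

        // Silent mode: nothing is printed to the console.
        new LogHelper(new PrintStream(captured), true)
            .logInfo(buildMessage("http://example-host:4040"));

        System.out.print(captured.toString());
    }
}
```

The upside of this shape is exactly the one raised in the comment: the caller makes a single {{logInfo}} call and the silent-mode decision lives in one place instead of at every print site.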
[ https://issues.apache.org/jira/browse/HIVE-18389?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16353454#comment-16353454 ]

Sahil Takiar commented on HIVE-18389:
-------------------------------------

[~pvary], [~aihuaxu] could you take a look? This just exposes the link to the UI described here: https://spark.apache.org/docs/latest/monitoring.html#web-interfaces
[ https://issues.apache.org/jira/browse/HIVE-18389?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16314905#comment-16314905 ]

Hive QA commented on HIVE-18389:
--------------------------------

Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12904897/HIVE-18389.1.patch

{color:green}SUCCESS:{color} +1 due to 1 test(s) being added or modified.

{color:red}ERROR:{color} -1 due to 19 failed/errored test(s), 11549 tests executed

*Failed tests:*
{noformat}
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[auto_join25] (batchId=72)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[ppd_join5] (batchId=35)
org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[bucketsortoptimize_insert_2] (batchId=151)
org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[hybridgrace_hashjoin_2] (batchId=156)
org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[insert_values_orig_table_use_metadata] (batchId=164)
org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[llap_acid] (batchId=168)
org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[llap_acid_fast] (batchId=159)
org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[sysdb] (batchId=159)
org.apache.hadoop.hive.cli.TestMiniSparkOnYarnCliDriver.testCliDriver[spark_opt_shuffle_serde] (batchId=177)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[authorization_part] (batchId=93)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[stats_aggregator_error_1] (batchId=93)
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver[ppd_join5] (batchId=120)
org.apache.hadoop.hive.cli.TestSparkPerfCliDriver.testCliDriver[query73] (batchId=247)
org.apache.hadoop.hive.metastore.TestEmbeddedHiveMetaStore.testTransactionalValidation (batchId=213)
org.apache.hadoop.hive.ql.io.TestDruidRecordWriter.testWrite (batchId=253)
org.apache.hadoop.hive.ql.parse.TestReplicationScenarios.testConstraints (batchId=225)
org.apache.hive.jdbc.TestSSL.testConnectionMismatch (batchId=231)
org.apache.hive.jdbc.TestSSL.testConnectionWrongCertCN (batchId=231)
org.apache.hive.jdbc.TestSSL.testMetastoreConnectionWrongCertCN (batchId=231)
{noformat}

Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/8487/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/8487/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-8487/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.YetusPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
Tests exited with: TestsFailedException: 19 tests failed
{noformat}

This message is automatically generated.

ATTACHMENT ID: 12904897 - PreCommit-HIVE-Build
[ https://issues.apache.org/jira/browse/HIVE-18389?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16314884#comment-16314884 ]

Hive QA commented on HIVE-18389:
--------------------------------

(x) *{color:red}-1 overall{color}*

|| Vote || Subsystem || Runtime || Comment ||
|| || *Prechecks* || || ||
| 0 | findbugs | 0m 0s | Findbugs executables are not available. |
| +1 | @author | 0m 0s | The patch does not contain any @author tags. |
|| || *master Compile Tests* || || ||
| 0 | mvndep | 1m 29s | Maven dependency ordering for branch |
| +1 | mvninstall | 5m 28s | master passed |
| +1 | compile | 1m 11s | master passed |
| +1 | checkstyle | 0m 40s | master passed |
| +1 | javadoc | 0m 58s | master passed |
|| || *Patch Compile Tests* || || ||
| 0 | mvndep | 0m 21s | Maven dependency ordering for patch |
| +1 | mvninstall | 1m 24s | the patch passed |
| +1 | compile | 1m 14s | the patch passed |
| +1 | javac | 1m 14s | the patch passed |
| -1 | checkstyle | 0m 11s | spark-client: The patch generated 4 new + 59 unchanged - 1 fixed = 63 total (was 60) |
| +1 | whitespace | 0m 0s | The patch has no whitespace issues. |
| +1 | javadoc | 1m 1s | the patch passed |
|| || *Other Tests* || || ||
| +1 | asflicense | 0m 12s | The patch does not generate ASF License warnings. |
| | | 15m 3s | |

|| Subsystem || Report/Notes ||
| Optional Tests | asflicense javac javadoc findbugs checkstyle compile |
| uname | Linux hiveptest-server-upstream 3.16.0-4-amd64 #1 SMP Debian 3.16.36-1+deb8u1 (2016-09-03) x86_64 GNU/Linux |
| Build tool | maven |
| Personality | /data/hiveptest/working/yetus/dev-support/hive-personality.sh |
| git revision | master / a6b88d9 |
| Default Java | 1.8.0_111 |
| checkstyle | http://104.198.109.242/logs//PreCommit-HIVE-Build-8487/yetus/diff-checkstyle-spark-client.txt |
| modules | C: spark-client ql U: . |
| Console output | http://104.198.109.242/logs//PreCommit-HIVE-Build-8487/yetus.txt |
| Powered by | Apache Yetus http://yetus.apache.org |

This message was automatically generated.