[
https://issues.apache.org/jira/browse/PHOENIX-5146?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17071848#comment-17071848
]
Josh Elser commented on PHOENIX-5146:
-------------------------------------
bq. code falls back on the 'fat' phoenix-5.0.0.3.1.0.0-78-client.jar to provide
all the required libraries, even for HDFS reads.
Did you build Phoenix against the version of Hadoop that you're using?
As written, there is no clear bug here. Unless you can provide steps to
reproduce the issue, I'm going to close this Jira.
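
In the meantime, a quick sanity check from the same spark-shell session would
show both which Hadoop release is actually on the classpath and whether the
shaded class from the stack trace resolves. This is a diagnostic sketch I'm
suggesting here, not something from the original report:
{noformat}
import org.apache.hadoop.util.VersionInfo

// Which Hadoop release is on the driver classpath at runtime?
println(VersionInfo.getVersion)

// Does the shaded httpcore class from the stack trace resolve? A
// ClassNotFoundException here would suggest the fat jar relocated the
// httpclient references without bundling httpcore itself.
Class.forName("org.apache.phoenix.shaded.org.apache.http.Consts")
{noformat}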
> Phoenix missing class definition: java.lang.NoClassDefFoundError:
> org/apache/phoenix/shaded/org/apache/http/Consts
> ------------------------------------------------------------------------------------------------------------------
>
> Key: PHOENIX-5146
> URL: https://issues.apache.org/jira/browse/PHOENIX-5146
> Project: Phoenix
> Issue Type: Bug
> Affects Versions: 5.0.0
> Environment: 3-node Kerberized cluster.
> HBase 2.0.2
> Reporter: Narendra Kumar
> Priority: Major
>
> While running a Spark compatibility check for Phoenix, we hit this issue:
> {noformat}
> 2019-02-15 09:03:38,470|INFO|MainThread|machine.py:169 -
> run()||GUID=1566a829-b1df-4757-8c3d-73a7fa302b84|RUNNING: echo "
> import org.apache.spark.graphx._;
> import org.apache.phoenix.spark._;
> val rdd = sc.phoenixTableAsRDD(\"EMAIL_ENRON\", Seq(\"MAIL_FROM\",
> \"MAIL_TO\"),
> zkUrl=Some(\"huaycloud012.l42scl.hortonworks.com:2181:/hbase-secure\"));
> val rawEdges = rdd.map
> { e => (e(\"MAIL_FROM\").asInstanceOf[VertexId],
> e(\"MAIL_TO\").asInstanceOf[VertexId])}
> ;
> val graph = Graph.fromEdgeTuples(rawEdges, 1.0);
> val pr = graph.pageRank(0.001);
> pr.vertices.saveToPhoenix(\"EMAIL_ENRON_PAGERANK\", Seq(\"ID\", \"RANK\"),
> zkUrl = Some(\"huaycloud012.l42scl.hortonworks.com:2181:/hbase-secure\"));
> " | spark-shell --master yarn --jars
> /usr/hdp/current/hadoop-client/lib/hadoop-lzo-0.6.0.3.1.0.0-75.jar
> --properties-file
> /grid/0/log/cluster/run_phoenix_secure_ha_all_1/artifacts/spark_defaults.conf
> 2>&1 | tee
> /grid/0/log/cluster/run_phoenix_secure_ha_all_1/artifacts/Spark_clientLogs/phoenix-spark.txt
> 2019-02-15 09:03:38,488|INFO|MainThread|machine.py:184 -
> run()||GUID=1566a829-b1df-4757-8c3d-73a7fa302b84|SPARK_MAJOR_VERSION is set
> to 2, using Spark2
> 2019-02-15 09:03:39,901|INFO|MainThread|machine.py:184 -
> run()||GUID=1566a829-b1df-4757-8c3d-73a7fa302b84|SLF4J: Class path contains
> multiple SLF4J bindings.
> 2019-02-15 09:03:39,902|INFO|MainThread|machine.py:184 -
> run()||GUID=1566a829-b1df-4757-8c3d-73a7fa302b84|SLF4J: Found binding in
> [jar:file:/usr/hdp/3.1.0.0-75/phoenix/phoenix-5.0.0.3.1.0.0-75-client.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> 2019-02-15 09:03:39,902|INFO|MainThread|machine.py:184 -
> run()||GUID=1566a829-b1df-4757-8c3d-73a7fa302b84|SLF4J: Found binding in
> [jar:file:/usr/hdp/3.1.0.0-75/spark2/jars/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> 2019-02-15 09:03:39,902|INFO|MainThread|machine.py:184 -
> run()||GUID=1566a829-b1df-4757-8c3d-73a7fa302b84|SLF4J: See
> [http://www.slf4j.org/codes.html#multiple_bindings] for an explanation.
> 2019-02-15 09:03:41,400|INFO|MainThread|machine.py:184 -
> run()||GUID=1566a829-b1df-4757-8c3d-73a7fa302b84|Setting default log level to
> "WARN".
> 2019-02-15 09:03:41,400|INFO|MainThread|machine.py:184 -
> run()||GUID=1566a829-b1df-4757-8c3d-73a7fa302b84|To adjust logging level use
> sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
> 2019-02-15 09:03:54,837|INFO|MainThread|machine.py:184 -
> run()||GUID=1566a829-b1df-4757-8c3d-73a7fa302b84|java.lang.NoClassDefFoundError:
> org/apache/phoenix/shaded/org/apache/http/Consts
> 2019-02-15 09:03:54,838|INFO|MainThread|machine.py:184 -
> run()||GUID=1566a829-b1df-4757-8c3d-73a7fa302b84|at
> org.apache.phoenix.shaded.org.apache.http.client.utils.URIBuilder.digestURI(URIBuilder.java:181)
> 2019-02-15 09:03:54,839|INFO|MainThread|machine.py:184 -
> run()||GUID=1566a829-b1df-4757-8c3d-73a7fa302b84|at
> org.apache.phoenix.shaded.org.apache.http.client.utils.URIBuilder.<init>(URIBuilder.java:82)
> 2019-02-15 09:03:54,839|INFO|MainThread|machine.py:184 -
> run()||GUID=1566a829-b1df-4757-8c3d-73a7fa302b84|at
> org.apache.hadoop.crypto.key.kms.KMSClientProvider.createURL(KMSClientProvider.java:468)
> 2019-02-15 09:03:54,839|INFO|MainThread|machine.py:184 -
> run()||GUID=1566a829-b1df-4757-8c3d-73a7fa302b84|at
> org.apache.hadoop.crypto.key.kms.KMSClientProvider.getDelegationToken(KMSClientProvider.java:1023)
> 2019-02-15 09:03:54,840|INFO|MainThread|machine.py:184 -
> run()||GUID=1566a829-b1df-4757-8c3d-73a7fa302b84|at
> org.apache.hadoop.crypto.key.kms.LoadBalancingKMSClientProvider$1.call(LoadBalancingKMSClientProvider.java:252)
> 2019-02-15 09:03:54,840|INFO|MainThread|machine.py:184 -
> run()||GUID=1566a829-b1df-4757-8c3d-73a7fa302b84|at
> org.apache.hadoop.crypto.key.kms.LoadBalancingKMSClientProvider$1.call(LoadBalancingKMSClientProvider.java:249)
> 2019-02-15 09:03:54,840|INFO|MainThread|machine.py:184 -
> run()||GUID=1566a829-b1df-4757-8c3d-73a7fa302b84|at
> org.apache.hadoop.crypto.key.kms.LoadBalancingKMSClientProvider.doOp(LoadBalancingKMSClientProvider.java:172)
> 2019-02-15 09:03:54,841|INFO|MainThread|machine.py:184 -
> run()||GUID=1566a829-b1df-4757-8c3d-73a7fa302b84|at
> org.apache.hadoop.crypto.key.kms.LoadBalancingKMSClientProvider.getDelegationToken(LoadBalancingKMSClientProvider.java:249)
> 2019-02-15 09:03:54,841|INFO|MainThread|machine.py:184 -
> run()||GUID=1566a829-b1df-4757-8c3d-73a7fa302b84|at
> org.apache.hadoop.security.token.DelegationTokenIssuer.collectDelegationTokens(DelegationTokenIssuer.java:95)
> 2019-02-15 09:03:54,841|INFO|MainThread|machine.py:184 -
> run()||GUID=1566a829-b1df-4757-8c3d-73a7fa302b84|at
> org.apache.hadoop.security.token.DelegationTokenIssuer.collectDelegationTokens(DelegationTokenIssuer.java:107)
> 2019-02-15 09:03:54,842|INFO|MainThread|machine.py:184 -
> run()||GUID=1566a829-b1df-4757-8c3d-73a7fa302b84|at
> org.apache.hadoop.security.token.DelegationTokenIssuer.addDelegationTokens(DelegationTokenIssuer.java:76)
> 2019-02-15 09:03:54,842|INFO|MainThread|machine.py:184 -
> run()||GUID=1566a829-b1df-4757-8c3d-73a7fa302b84|at
> org.apache.spark.deploy.security.HadoopFSDelegationTokenProvider$$anonfun$org$apache$spark$deploy$security$HadoopFSDelegationTokenProvider$$fetchDelegationTokens$1.apply(HadoopFSDelegationTokenProvider.scala:98)
> 2019-02-15 09:03:54,842|INFO|MainThread|machine.py:184 -
> run()||GUID=1566a829-b1df-4757-8c3d-73a7fa302b84|at
> org.apache.spark.deploy.security.HadoopFSDelegationTokenProvider$$anonfun$org$apache$spark$deploy$security$HadoopFSDelegationTokenProvider$$fetchDelegationTokens$1.apply(HadoopFSDelegationTokenProvider.scala:96)
> 2019-02-15 09:03:54,843|INFO|MainThread|machine.py:184 -
> run()||GUID=1566a829-b1df-4757-8c3d-73a7fa302b84|at
> scala.collection.immutable.Set$Set1.foreach(Set.scala:94)
> 2019-02-15 09:03:54,843|INFO|MainThread|machine.py:184 -
> run()||GUID=1566a829-b1df-4757-8c3d-73a7fa302b84|at
> {noformat}
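>
> The stack trace shows Spark's delegation-token fetch going through the KMS
> client into the relocated httpclient classes bundled in the Phoenix client
> jar. A direct way to check whether that jar actually bundles the missing
> class is to inspect it from the same session (a sketch using the jar path
> from the SLF4J binding above, not part of the original run):
> {noformat}
> import java.util.jar.JarFile
>
> // Open the fat client jar referenced in the log above and look for the
> // shaded httpcore class that the NoClassDefFoundError names.
> val jar = new JarFile("/usr/hdp/3.1.0.0-75/phoenix/phoenix-5.0.0.3.1.0.0-75-client.jar")
> val entry = jar.getEntry("org/apache/phoenix/shaded/org/apache/http/Consts.class")
> println(if (entry == null) "Consts.class is NOT bundled" else "Consts.class is bundled")
> jar.close()
> {noformat}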