Hi Max,
That was the output from "flink run -c
com.dataradiant.beam.examples.StreamWordCount ./target/beam-starter-0.2.jar"
From the Eclipse IDE, I get:
Optional:
file:/home/wayneco/.m2/repository/com/google/guava/guava/19.0/guava-19.0.jar
Absent:
file:/home/wayneco/.m2/repository/com/google/guava/guava/19.0/guava-19.0.jar
The demo code and pom.xml need some tweaking to launch the Flink job
itself (i.e. via java -cp ...), and I wanted to keep the code as close
to the original as possible.
Trying Alexey's suggested shading now...
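For reference, a shading setup along the lines suggested usually relocates Guava inside the job jar so the job's copy cannot collide with the one on the cluster classpath. A minimal sketch for the pom.xml; the plugin version and the relocated package prefix are assumptions, not from this thread:

```xml
<!-- Sketch of a maven-shade-plugin relocation (version and shaded
     package prefix are illustrative assumptions). -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>2.4.3</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals><goal>shade</goal></goals>
      <configuration>
        <relocations>
          <relocation>
            <pattern>com.google.common</pattern>
            <shadedPattern>org.example.shaded.com.google.common</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```

With this in place the job's own bytecode references are rewritten to the shaded prefix, so whatever Guava the cluster provides is never consulted for those calls.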
Thanks,
Wayne
On 2016-12-05 05:50 AM, Maximilian Michels wrote:
Hi Wayne,
It seems like you ran that from the IDE. I'm interested in how the
output looks on the cluster where you experienced your problems.
Thanks,
Max
On Fri, Dec 2, 2016 at 7:27 PM, Wayne Collins <[email protected]> wrote:
Hi Max,
Here's the output:
---------------------
Optional:
file:/home/wayneco/workspace/beam-starter/beam-starter/./target/beam-starter-0.2.jar
Absent:
file:/home/wayneco/workspace/beam-starter/beam-starter/./target/beam-starter-0.2.jar
---------------------
Thanks for your help!
Wayne
On 2016-12-02 08:42 AM, Maximilian Michels wrote:
Hi Wayne,
Thanks for getting back to me. Could you compile a new version of your
job with the following in your main method?
URL location1 = com.google.common.base.Optional.class
        .getProtectionDomain().getCodeSource().getLocation();
System.out.println("Optional: " + location1);
URL location2 = Class.forName("com.google.common.base.Optional")
        .getProtectionDomain().getCodeSource().getLocation();
System.out.println("Absent: " + location2);
Could you run this on your cluster node with the flink command? This
should give us a hint from where the Guava library is bootstrapped.
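A standalone version of the snippet above, for anyone following along (the class name is an assumption): it prints the jar or directory a class was loaded from, guarding against JDK bootstrap classes, whose CodeSource is null.

```java
import java.security.CodeSource;

public class WhereLoaded {
    // Returns the code-source location of a class, or a marker for
    // bootstrap-loaded classes, which have no CodeSource.
    static String locationOf(Class<?> cls) {
        CodeSource cs = cls.getProtectionDomain().getCodeSource();
        return cs == null ? "(bootstrap)" : cs.getLocation().toString();
    }

    public static void main(String[] args) throws Exception {
        // Our own class comes from the application classpath.
        System.out.println("WhereLoaded: " + locationOf(WhereLoaded.class));
        // java.lang.String lives on the boot classpath, so it prints the marker.
        System.out.println("String: " + locationOf(Class.forName("java.lang.String")));
    }
}
```

Running this with the same classpath the flink command uses shows exactly which jar supplies each class.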
Thanks,
Max
On Thu, Dec 1, 2016 at 7:54 PM, Wayne Collins <[email protected]> wrote:
Hi Max,
Here is the result from the "flink run" launcher node
(devbox):
-----------------------
root@devbox:~# echo
${HADOOP_CLASSPATH}:${HADOOP_CONF_DIR}:${YARN_CONF_DIR}:${HBASE_CONF_DIR}
:/etc/hadoop-conf:/etc/yarn-conf:
-----------------------
Here is the result from one of the Cloudera YARN nodes as root:
-----------------------
[root@hadoop0 ~]# echo
${HADOOP_CLASSPATH}:${HADOOP_CONF_DIR}:${YARN_CONF_DIR}:${HBASE_CONF_DIR}
:::
-----------------------
Here is the result from one of the Cloudera YARN nodes as yarn:
-----------------------
[yarn@hadoop0 ~]$ echo
${HADOOP_CLASSPATH}:${HADOOP_CONF_DIR}:${YARN_CONF_DIR}:${HBASE_CONF_DIR}
:::
-----------------------
Note that both the yarn-session.sh and the flink run
commands are run as
root on devbox.
Software version details:
devbox has these versions of the client software:
flink-1.1.2
hadoop-2.6.0
kafka_2.11-0.9.0.1
(also reproduced the problem with kafka_2.10-0.9.0.1)
The cluster (providing YARN) is:
CDH5 - 5.8.2-1.cdh5.8.2.p0.3 (Hadoop 2.6.0)
Kafka - 2.0.2-1.2.0.2.p0.5 (Kafka 0.9.0)
Thanks for your help!
Wayne
On 2016-12-01 12:54 PM, Maximilian Michels wrote:
What is the output of the following on the nodes? I have a suspicion
that something sneaks in from one of the classpath variables that
Flink picks up:
echo
${HADOOP_CLASSPATH}:${HADOOP_CONF_DIR}:${YARN_CONF_DIR}:${HBASE_CONF_DIR}
On Tue, Nov 29, 2016 at 9:17 PM, Wayne Collins <[email protected]> wrote:
Hi Max,
I rebuilt my sandbox with Beam 0.3.0-incubating and Flink
1.1.2 and I'm
still seeing the following error message with the
StreamWordCount demo code:
Caused by: java.lang.IllegalAccessError: tried to access method com.google.common.base.Optional.<init>()V from class com.google.common.base.Absent
    at com.google.common.base.Absent.<init>(Absent.java:35)
    at com.google.common.base.Absent.<clinit>(Absent.java:33)
    at sun.misc.Unsafe.ensureClassInitialized(Native Method)
    ...
(snip)
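For readers of the archive: this IllegalAccessError is the typical symptom of two copies of Guava being visible at runtime. Optional's constructor is package-private, and package-private access only succeeds when caller and callee belong to the same runtime package, meaning the same package name loaded by the same class loader. A minimal sketch of that mechanism (the class name is an assumption, unrelated to Guava): loading one class file through two independent loaders yields two distinct Class objects and two distinct runtime packages.

```java
import java.net.URL;
import java.net.URLClassLoader;

public class LoaderSplit {
    public static void main(String[] args) throws Exception {
        // The jar or directory this class was loaded from.
        URL here = LoaderSplit.class.getProtectionDomain().getCodeSource().getLocation();
        // Two loaders with a null (bootstrap-only) parent each define the class themselves.
        ClassLoader a = new URLClassLoader(new URL[]{here}, null);
        ClassLoader b = new URLClassLoader(new URL[]{here}, null);
        Class<?> ca = a.loadClass("LoaderSplit");
        Class<?> cb = b.loadClass("LoaderSplit");
        // Same class file, same name, but distinct types and runtime packages.
        System.out.println("same class object: " + (ca == cb)); // false
    }
}
```

This is why the diagnostic earlier in the thread prints the code-source location twice: if Optional and Absent resolve to different jars, a package-private call between them fails exactly as in the stack trace.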