Hi Jörn,

There are no more logs. Attaching the YARN aggregated logs for the first problem.
For the second one, the job is not even getting submitted.
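
For reference, the attached logs were pulled with the standard YARN CLI, along the lines of:

  # application id as it appears in the attached container logs
  yarn logs -applicationId application_1541304140537_100017 > aggregated.log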

- Sohi

On Sun, Dec 9, 2018 at 2:13 PM Jörn Franke <jornfra...@gmail.com> wrote:

> Can you check the Flink log files? You should find a better description of
> the error there.
>
> > Am 08.12.2018 um 18:15 schrieb sohimankotia <sohimanko...@gmail.com>:
> >
> > Hi,
> >
> > I have installed flink-1.7.0 (Hadoop 2.7, Scala 2.11). We are using the
> > Hortonworks Hadoop distribution (hdp/2.6.1.0-129/).
> >
> > *Flink lib folder looks like:*
> >
> >
> > -rw-r--r-- 1 hdfs hadoop 93184216 Nov 29 02:15 flink-dist_2.11-1.7.0.jar
> > -rw-r--r-- 1 hdfs hadoop    79219 Nov 29 03:33 flink-hadoop-compatibility_2.11-1.7.0.jar
> > -rw-r--r-- 1 hdfs hadoop   141881 Nov 29 02:13 flink-python_2.11-1.7.0.jar
> > -rw-r--r-- 1 hdfs hadoop   489884 Nov 28 23:01 log4j-1.2.17.jar
> > -rw-r--r-- 1 hdfs hadoop     9931 Nov 28 23:01 slf4j-log4j12-1.7.15.jar
> >
> > *My code:*
> >
> >       import org.apache.flink.api.java.ExecutionEnvironment;
> >       import org.apache.flink.api.java.hadoop.mapreduce.HadoopInputFormat;
> >       import org.apache.flink.api.java.io.PrintingOutputFormat;
> >       import org.apache.flink.hadoopcompatibility.HadoopInputs;
> >       import org.apache.hadoop.io.BytesWritable;
> >       import org.apache.hadoop.io.Text;
> >       import org.apache.hadoop.mapreduce.Job;
> >       import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
> >       import org.apache.hadoop.mapreduce.lib.input.SequenceFileInputFormat;
> >
> >       ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
> >
> >       String p = args[0];
> >
> >       Job job = Job.getInstance();
> >       SequenceFileInputFormat<Text, BytesWritable> inputFormat = new SequenceFileInputFormat<>();
> >       // read nested input directories recursively
> >       job.getConfiguration().setBoolean(FileInputFormat.INPUT_DIR_RECURSIVE, true);
> >       final HadoopInputFormat<Text, BytesWritable> hInputEvents =
> >               HadoopInputs.readHadoopFile(inputFormat, Text.class, BytesWritable.class, p, job);
> >       // note: this Configuration is created but never used below
> >       org.apache.flink.configuration.Configuration fileReadConfig =
> >               new org.apache.flink.configuration.Configuration();
> >
> >       env.createInput(hInputEvents)
> >               .output(new PrintingOutputFormat<>());
> >
> >       env.execute("Hadoop WordCount");
> >
> >
> > *pom.xml*
> >
> > flink.version = 1.7.0
> >
> >    <dependency>
> >      <groupId>org.apache.flink</groupId>
> >      <artifactId>flink-java</artifactId>
> >      <version>${flink.version}</version>
> >      <scope>provided</scope>
> >    </dependency>
> >    <dependency>
> >      <groupId>org.apache.flink</groupId>
> >      <artifactId>flink-clients_2.11</artifactId>
> >      <version>${flink.version}</version>
> >      <scope>provided</scope>
> >    </dependency>
> >    <dependency>
> >      <groupId>org.apache.flink</groupId>
> >      <artifactId>flink-streaming-java_2.11</artifactId>
> >      <version>${flink.version}</version>
> >      <scope>provided</scope>
> >    </dependency>
> >
> >    <dependency>
> >      <groupId>org.apache.flink</groupId>
> >      <artifactId>flink-hadoop-compatibility_2.11</artifactId>
> >      <version>${flink.version}</version>
> >      <scope>provided</scope>
> >    </dependency>
> >
> >    <dependency>
> >      <groupId>org.apache.flink</groupId>
> >      <artifactId>flink-shaded-hadoop2</artifactId>
> >      <version>${flink.version}</version>
> >      <scope>provided</scope>
> >    </dependency>
> >
> > *In the script:*
> >
> >
> >
> > export HADOOP_CONF_DIR=/etc/hadoop/conf
> > export HADOOP_CLASSPATH="/usr/hdp/2.6.1.0-129/hadoop/hadoop-*":`hadoop classpath`
> >
> > echo ${HADOOP_CLASSPATH}
> >
> > PARALLELISM=1
> > JAR_PATH="jar"
> > CLASS_NAME="CLASS_NAME"
> > NODES=1
> > SLOTS=1
> > MEMORY_PER_NODE=2048
> > QUEUE="default"
> > NAME="sample"
> >
> > IN="input-file-path"
> >
> >
> > /home/hdfs/flink-1.7.0/bin/flink run -m yarn-cluster \
> >   -yn ${NODES} -yqu ${QUEUE} -ys ${SLOTS} -ytm ${MEMORY_PER_NODE} \
> >   --parallelism ${PARALLELISM} -ynm ${NAME} -c ${CLASS_NAME} ${JAR_PATH} ${IN}
> >
> >
> > *where the classpath is printed as:*
> >
> >
> > /usr/hdp/2.6.1.0-129/hadoop/hadoop-*:/usr/hdp/2.6.1.0-129/hadoop/conf:/usr/hdp/2.6.1.0-129/hadoop/lib/*:/usr/hdp/2.6.1.0-129/hadoop/.//*:/usr/hdp/2.6.1.0-129/hadoop-hdfs/./:/usr/hdp/2.6.1.0-129/hadoop-hdfs/lib/*:/usr/hdp/2.6.1.0-129/hadoop-hdfs/.//*:/usr/hdp/2.6.1.0-129/hadoop-yarn/lib/*:/usr/hdp/2.6.1.0-129/hadoop-yarn/.//*:/usr/hdp/2.6.1.0-129/hadoop-mapreduce/lib/*:/usr/hdp/2.6.1.0-129/hadoop-mapreduce/.//*:/usr/hdp/2.6.1.0-129/hadoop/conf:/usr/hdp/2.6.1.0-129/hadoop/lib/*:/usr/hdp/2.6.1.0-129/hadoop/.//*:/usr/hdp/2.6.1.0-129/hadoop-hdfs/./:/usr/hdp/2.6.1.0-129/hadoop-hdfs/lib/*:/usr/hdp/2.6.1.0-129/hadoop-hdfs/.//*:/usr/hdp/2.6.1.0-129/hadoop-yarn/lib/*:/usr/hdp/2.6.1.0-129/hadoop-yarn/.//*:/usr/hdp/2.6.1.0-129/hadoop-mapreduce/lib/*:/usr/hdp/2.6.1.0-129/hadoop-mapreduce/.//*::mysql-connector-java-5.1.17.jar:mysql-connector-java.jar:/usr/hdp/2.6.1.0-129/tez/*:/usr/hdp/2.6.1.0-129/tez/lib/*:/usr/hdp/2.6.1.0-129/tez/conf:mysql-connector-java-5.1.17.jar:mysql-connector-java.jar:/usr/hdp/2.6.1.0-129/tez/*:/usr/hdp/2.6.1.0-129/tez/lib/*:/usr/hdp/2.6.1.0-129/tez/conf
> >
> >
> > But I am getting a class-not-found error for a Hadoop-related class. The
> > error is attached.
> >
> >
> > error.txt <http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/file/t894/error.txt>
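> >
> > My guess (not verified): the class the job fails on in the aggregated
> > jobmanager.log below is org.apache.hadoop.mapreduce.JobContext, which lives
> > in hadoop-mapreduce-client-core, and the container CLASSPATH assembled by
> > launch_container.sh only covers the hadoop-client, hadoop-hdfs-client and
> > hadoop-yarn-client directories. Note also that the JVM only expands a
> > classpath wildcard when the entry is a bare "*", so the
> > "/usr/hdp/2.6.1.0-129/hadoop/hadoop-*" entry is never expanded. A minimal
> > sketch of the exports the Flink docs recommend instead:
> >
> >     # let `hadoop classpath` supply all Hadoop jars, including the
> >     # hadoop-mapreduce directories visible in the echoed classpath above
> >     export HADOOP_CONF_DIR=/etc/hadoop/conf
> >     export HADOOP_CLASSPATH=$(hadoop classpath)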
>
> > *Another problem:*
> >
> > If I add the Hadoop shaded jar to the lib folder:
> >
> >
> > -rw-r--r-- 1 hdfs hadoop 93184216 Nov 29 02:15 flink-dist_2.11-1.7.0.jar
> > -rw-r--r-- 1 hdfs hadoop    79219 Nov 29 03:33 flink-hadoop-compatibility_2.11-1.7.0.jar
> > -rw-r--r-- 1 hdfs hadoop   141881 Nov 29 02:13 flink-python_2.11-1.7.0.jar
> > *-rw-r--r-- 1 hdfs hadoop 41130742 Dec  8 22:38 flink-shaded-hadoop2-uber-1.7.0.jar*
> > -rw-r--r-- 1 hdfs hadoop   489884 Nov 28 23:01 log4j-1.2.17.jar
> > -rw-r--r-- 1 hdfs hadoop     9931 Nov 28 23:01 slf4j-log4j12-1.7.15.jar
> >
> > I am getting the following error, and this is happening for all versions
> > greater than 1.4.2.
> >
> > java.lang.IllegalAccessError: tried to access method org.apache.hadoop.yarn.client.ConfiguredRMFailoverProxyProvider.getProxyInternal()Ljava/lang/Object; from class org.apache.hadoop.yarn.client.RequestHedgingRMFailoverProxyProvider
> >    at org.apache.hadoop.yarn.client.RequestHedgingRMFailoverProxyProvider.init(RequestHedgingRMFailoverProxyProvider.java:75)
> >    at org.apache.hadoop.yarn.client.RMProxy.createRMFailoverProxyProvider(RMProxy.java:163)
> >    at org.apache.hadoop.yarn.client.RMProxy.createRMProxy(RMProxy.java:94)
> >    at org.apache.hadoop.yarn.client.ClientRMProxy.createRMProxy(ClientRMProxy.java:72)
> >    at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.serviceStart(YarnClientImpl.java:187)
> >    at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
> >    at org.apache.flink.yarn.cli.FlinkYarnSessionCli.getClusterDescriptor(FlinkYarnSessionCli.java:985)
> >    at org.apache.flink.yarn.cli.FlinkYarnSessionCli.createDescriptor(FlinkYarnSessionCli.java:273)
> >    at org.apache.flink.yarn.cli.FlinkYarnSessionCli.createClusterDescriptor(FlinkYarnSessionCli.java:451)
> >    at org.apache.flink.yarn.cli.FlinkYarnSessionCli.createClusterDescriptor(FlinkYarnSessionCli.java:96)
> >    at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:224)
> >
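> > My reading (unverified): this IllegalAccessError usually means two copies of
> > ConfiguredRMFailoverProxyProvider are in play, one inside the vanilla
> > flink-shaded-hadoop2-uber jar and one expected by HDP's
> > RequestHedgingRMFailoverProxyProvider from the cluster jars, so the subclass
> > ends up extending a copy of its parent where getProxyInternal() is not
> > accessible to it. A hedged workaround sketch, assuming a writable location
> > for a modified conf copy (the other route would be building flink-shaded
> > against the vendor Hadoop version):
> >
> >     # give the Flink client a conf copy that turns off RM request hedging
> >     cp -r /etc/hadoop/conf /tmp/flink-hadoop-conf
> >     # in /tmp/flink-hadoop-conf/yarn-site.xml set
> >     #   yarn.client.failover-proxy-provider =
> >     #     org.apache.hadoop.yarn.client.ConfiguredRMFailoverProxyProvider
> >     export HADOOP_CONF_DIR=/tmp/flink-hadoop-conf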
> >
> > Thanks in advance.
> >
> >
> >
> >
> >
> > --
> > Sent from:
> http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/
>
Container: container_e186_1541304140537_100017_01_000001 on nn-hw-m1d23.dbp.name.in_45454_1544345374804
LogAggregationType: AGGREGATED
============================================================================================================
LogType:directory.info
Log Upload Time:Sun Dec 09 14:19:34 +0530 2018
LogLength:2223
Log Contents:
ls -l:
total 40
-rw------- 1 yarn hadoop    71 Dec  9 14:19 container_tokens
lrwxrwxrwx 1 yarn hadoop   168 Dec  9 14:19 flink-conf.yaml -> /grid/12/hadoop/yarn/local/usercache/hdfs/appcache/application_1541304140537_100017/filecache/11/application_1541304140537_100017-flink-conf.yaml1533676801661714355.tmp
lrwxrwxrwx 1 yarn hadoop   121 Dec  9 14:19 flink.jar -> /grid/6/hadoop/yarn/local/usercache/hdfs/appcache/application_1541304140537_100017/filecache/16/flink-dist_2.11-1.7.0.jar
-rwx------ 1 yarn hadoop 10659 Dec  9 14:19 launch_container.sh
drwxr-x--- 2 yarn hadoop  4096 Dec  9 14:19 lib
lrwxrwxrwx 1 yarn hadoop   112 Dec  9 14:19 log4j.properties -> /grid/5/hadoop/yarn/local/usercache/hdfs/appcache/application_1541304140537_100017/filecache/15/log4j.properties
lrwxrwxrwx 1 yarn hadoop   107 Dec  9 14:19 logback.xml -> /grid/2/hadoop/yarn/local/usercache/hdfs/appcache/application_1541304140537_100017/filecache/12/logback.xml
drwxr-x--- 2 yarn hadoop  4096 Dec  9 14:19 tmp
find -L . -maxdepth 5 -ls:
5111823    4 drwxr-x---   4 yarn     hadoop       4096 Dec  9 14:19 .
5111826    4 -rw-------   1 yarn     hadoop         71 Dec  9 14:19 ./container_tokens
139198601    4 -r-x------   1 yarn     hadoop        292 Dec  9 14:19 ./flink-conf.yaml
168427613    4 -r-x------   1 yarn     hadoop       2331 Dec  9 14:19 ./logback.xml
5111824    4 drwxr-x---   2 yarn     hadoop       4096 Dec  9 14:19 ./tmp
5111825   12 -rwx------   1 yarn     hadoop      10659 Dec  9 14:19 ./launch_container.sh
133300401    4 -r-x------   1 yarn     hadoop       1939 Dec  9 14:19 ./log4j.properties
54001673 91004 -r-x------   1 yarn     hadoop   93184216 Dec  9 14:19 ./flink.jar
5111828    4 drwxr-x---   2 yarn     hadoop       4096 Dec  9 14:19 ./lib
241041669  140 -r-x------   1 yarn     hadoop     141881 Dec  9 14:19 ./lib/flink-python_2.11-1.7.0.jar
50725055   80 -r-x------   1 yarn     hadoop      79219 Dec  9 14:19 ./lib/flink-hadoop-compatibility_2.11-1.7.0.jar
11010215   12 -r-x------   1 yarn     hadoop       9931 Dec  9 14:19 ./lib/slf4j-log4j12-1.7.15.jar
120586377  480 -r-x------   1 yarn     hadoop     489884 Dec  9 14:19 ./lib/log4j-1.2.17.jar
broken symlinks(find -L . -maxdepth 5 -type l -ls):

End of LogType:directory.info

LogType:launch_container.sh
Log Upload Time:Sun Dec 09 14:19:34 +0530 2018
LogLength:10659
Log Contents:
#!/bin/bash

export PATH="/usr/sbin:/sbin:/usr/lib/ambari-server/*:/mnt/vol1/jdk1.8.0_144/bin:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/var/lib/ambari-agent"
export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/hdp/current/hadoop-client/conf"}
export MAX_APP_ATTEMPTS="1"
export _SLOTS="1"
export JAVA_HOME=${JAVA_HOME:-"/mnt/vol1/jdk1.8.0_144/"}
export _CLIENT_HOME_DIR="hdfs://hwcluster01/user/hdfs"
export LANG="en_US.UTF-8"
export APP_SUBMIT_TIME_ENV="1544345364329"
export NM_HOST="nn-hw-m1d23.dbp.name.in"
export _APP_ID="application_1541304140537_100017"
export HADOOP_USER_NAME="hdfs"
export LOGNAME="hdfs"
export JVM_PID="$$"
export _DETACHED="false"
export PWD="/grid/8/hadoop/yarn/local/usercache/hdfs/appcache/application_1541304140537_100017/container_e186_1541304140537_100017_01_000001"
export LOCAL_DIRS="/grid/1/hadoop/yarn/local/usercache/hdfs/appcache/application_1541304140537_100017,/grid/10/hadoop/yarn/local/usercache/hdfs/appcache/application_1541304140537_100017,/grid/11/hadoop/yarn/local/usercache/hdfs/appcache/application_1541304140537_100017,/grid/12/hadoop/yarn/local/usercache/hdfs/appcache/application_1541304140537_100017,/grid/2/hadoop/yarn/local/usercache/hdfs/appcache/application_1541304140537_100017,/grid/3/hadoop/yarn/local/usercache/hdfs/appcache/application_1541304140537_100017,/grid/4/hadoop/yarn/local/usercache/hdfs/appcache/application_1541304140537_100017,/grid/5/hadoop/yarn/local/usercache/hdfs/appcache/application_1541304140537_100017,/grid/6/hadoop/yarn/local/usercache/hdfs/appcache/application_1541304140537_100017,/grid/7/hadoop/yarn/local/usercache/hdfs/appcache/application_1541304140537_100017,/grid/8/hadoop/yarn/local/usercache/hdfs/appcache/application_1541304140537_100017,/grid/9/hadoop/yarn/local/usercache/hdfs/appcache/application_1541304140537_100017"
export APPLICATION_WEB_PROXY_BASE="/proxy/application_1541304140537_100017"
export _DYNAMIC_PROPERTIES=""
export NM_HTTP_PORT="8042"
export LOG_DIRS="/grid/1/hadoop/yarn/log/application_1541304140537_100017/container_e186_1541304140537_100017_01_000001,/grid/10/hadoop/yarn/log/application_1541304140537_100017/container_e186_1541304140537_100017_01_000001,/grid/11/hadoop/yarn/log/application_1541304140537_100017/container_e186_1541304140537_100017_01_000001,/grid/12/hadoop/yarn/log/application_1541304140537_100017/container_e186_1541304140537_100017_01_000001,/grid/2/hadoop/yarn/log/application_1541304140537_100017/container_e186_1541304140537_100017_01_000001,/grid/3/hadoop/yarn/log/application_1541304140537_100017/container_e186_1541304140537_100017_01_000001,/grid/4/hadoop/yarn/log/application_1541304140537_100017/container_e186_1541304140537_100017_01_000001,/grid/5/hadoop/yarn/log/application_1541304140537_100017/container_e186_1541304140537_100017_01_000001,/grid/6/hadoop/yarn/log/application_1541304140537_100017/container_e186_1541304140537_100017_01_000001,/grid/7/hadoop/yarn/log/application_1541304140537_100017/container_e186_1541304140537_100017_01_000001,/grid/8/hadoop/yarn/log/application_1541304140537_100017/container_e186_1541304140537_100017_01_000001,/grid/9/hadoop/yarn/log/application_1541304140537_100017/container_e186_1541304140537_100017_01_000001"
export NM_AUX_SERVICE_mapreduce_shuffle="AAA0+gAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA=
"
export _CLIENT_SHIP_FILES="logback.xml=hdfs://hwcluster01/user/hdfs/.flink/application_1541304140537_100017/logback.xml,log4j.properties=hdfs://hwcluster01/user/hdfs/.flink/application_1541304140537_100017/log4j.properties,lib/flink-python_2.11-1.7.0.jar=hdfs://hwcluster01/user/hdfs/.flink/application_1541304140537_100017/lib/flink-python_2.11-1.7.0.jar,lib/log4j-1.2.17.jar=hdfs://hwcluster01/user/hdfs/.flink/application_1541304140537_100017/lib/log4j-1.2.17.jar,lib/flink-hadoop-compatibility_2.11-1.7.0.jar=hdfs://hwcluster01/user/hdfs/.flink/application_1541304140537_100017/lib/flink-hadoop-compatibility_2.11-1.7.0.jar,lib/slf4j-log4j12-1.7.15.jar=hdfs://hwcluster01/user/hdfs/.flink/application_1541304140537_100017/lib/slf4j-log4j12-1.7.15.jar,"
export NM_PORT="45454"
export USER="hdfs"
export HADOOP_YARN_HOME=${HADOOP_YARN_HOME:-"/usr/hdp/current/hadoop-yarn-nodemanager"}
export CLASSPATH="lib/flink-hadoop-compatibility_2.11-1.7.0.jar:lib/flink-python_2.11-1.7.0.jar:lib/log4j-1.2.17.jar:lib/slf4j-log4j12-1.7.15.jar:log4j.properties:logback.xml:flink.jar:flink-conf.yaml::/etc/hadoop/conf:/usr/hdp/current/hadoop-client/*:/usr/hdp/current/hadoop-client/lib/*:/usr/hdp/current/hadoop-hdfs-client/*:/usr/hdp/current/hadoop-hdfs-client/lib/*:/usr/hdp/current/hadoop-yarn-client/*:/usr/hdp/current/hadoop-yarn-client/lib/*"
export _FLINK_YARN_FILES="hdfs://hwcluster01/user/hdfs/.flink/application_1541304140537_100017"
export _ZOOKEEPER_NAMESPACE="application_1541304140537_100017"
export HADOOP_TOKEN_FILE_LOCATION="/grid/8/hadoop/yarn/local/usercache/hdfs/appcache/application_1541304140537_100017/container_e186_1541304140537_100017_01_000001/container_tokens"
export NM_AUX_SERVICE_spark_shuffle=""
export _FLINK_JAR_PATH="hdfs://hwcluster01/user/hdfs/.flink/application_1541304140537_100017/flink-dist_2.11-1.7.0.jar"
export LOCAL_USER_DIRS="/grid/1/hadoop/yarn/local/usercache/hdfs/,/grid/10/hadoop/yarn/local/usercache/hdfs/,/grid/11/hadoop/yarn/local/usercache/hdfs/,/grid/12/hadoop/yarn/local/usercache/hdfs/,/grid/2/hadoop/yarn/local/usercache/hdfs/,/grid/3/hadoop/yarn/local/usercache/hdfs/,/grid/4/hadoop/yarn/local/usercache/hdfs/,/grid/5/hadoop/yarn/local/usercache/hdfs/,/grid/6/hadoop/yarn/local/usercache/hdfs/,/grid/7/hadoop/yarn/local/usercache/hdfs/,/grid/8/hadoop/yarn/local/usercache/hdfs/,/grid/9/hadoop/yarn/local/usercache/hdfs/"
export HADOOP_HOME="/usr/hdp/2.6.1.0-129/hadoop"
export _FLINK_CLASSPATH="lib/flink-hadoop-compatibility_2.11-1.7.0.jar:lib/flink-python_2.11-1.7.0.jar:lib/log4j-1.2.17.jar:lib/slf4j-log4j12-1.7.15.jar:log4j.properties:logback.xml:flink.jar:flink-conf.yaml:"
export _CLIENT_TM_COUNT="1"
export _CLIENT_TM_MEMORY="2048"
export HOME="/home/"
export NM_AUX_SERVICE_spark2_shuffle=""
export CONTAINER_ID="container_e186_1541304140537_100017_01_000001"
export MALLOC_ARENA_MAX="4"
ln -sf "/grid/12/hadoop/yarn/local/usercache/hdfs/appcache/application_1541304140537_100017/filecache/11/application_1541304140537_100017-flink-conf.yaml1533676801661714355.tmp" "flink-conf.yaml"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
  exit $hadoop_shell_errorcode
fi
mkdir -p lib
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
  exit $hadoop_shell_errorcode
fi
ln -sf "/grid/7/hadoop/yarn/local/usercache/hdfs/appcache/application_1541304140537_100017/filecache/17/log4j-1.2.17.jar" "lib/log4j-1.2.17.jar"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
  exit $hadoop_shell_errorcode
fi
mkdir -p lib
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
  exit $hadoop_shell_errorcode
fi
ln -sf "/grid/4/hadoop/yarn/local/usercache/hdfs/appcache/application_1541304140537_100017/filecache/14/flink-python_2.11-1.7.0.jar" "lib/flink-python_2.11-1.7.0.jar"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
  exit $hadoop_shell_errorcode
fi
ln -sf "/grid/6/hadoop/yarn/local/usercache/hdfs/appcache/application_1541304140537_100017/filecache/16/flink-dist_2.11-1.7.0.jar" "flink.jar"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
  exit $hadoop_shell_errorcode
fi
mkdir -p lib
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
  exit $hadoop_shell_errorcode
fi
ln -sf "/grid/3/hadoop/yarn/local/usercache/hdfs/appcache/application_1541304140537_100017/filecache/13/flink-hadoop-compatibility_2.11-1.7.0.jar" "lib/flink-hadoop-compatibility_2.11-1.7.0.jar"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
  exit $hadoop_shell_errorcode
fi
ln -sf "/grid/2/hadoop/yarn/local/usercache/hdfs/appcache/application_1541304140537_100017/filecache/12/logback.xml" "logback.xml"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
  exit $hadoop_shell_errorcode
fi
mkdir -p lib
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
  exit $hadoop_shell_errorcode
fi
ln -sf "/grid/11/hadoop/yarn/local/usercache/hdfs/appcache/application_1541304140537_100017/filecache/10/slf4j-log4j12-1.7.15.jar" "lib/slf4j-log4j12-1.7.15.jar"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
  exit $hadoop_shell_errorcode
fi
ln -sf "/grid/5/hadoop/yarn/local/usercache/hdfs/appcache/application_1541304140537_100017/filecache/15/log4j.properties" "log4j.properties"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
  exit $hadoop_shell_errorcode
fi
# Creating copy of launch script
cp "launch_container.sh" "/grid/1/hadoop/yarn/log/application_1541304140537_100017/container_e186_1541304140537_100017_01_000001/launch_container.sh"
chmod 640 "/grid/1/hadoop/yarn/log/application_1541304140537_100017/container_e186_1541304140537_100017_01_000001/launch_container.sh"
# Determining directory contents
echo "ls -l:" 1>"/grid/1/hadoop/yarn/log/application_1541304140537_100017/container_e186_1541304140537_100017_01_000001/directory.info"
ls -l 1>>"/grid/1/hadoop/yarn/log/application_1541304140537_100017/container_e186_1541304140537_100017_01_000001/directory.info"
echo "find -L . -maxdepth 5 -ls:" 1>>"/grid/1/hadoop/yarn/log/application_1541304140537_100017/container_e186_1541304140537_100017_01_000001/directory.info"
find -L . -maxdepth 5 -ls 1>>"/grid/1/hadoop/yarn/log/application_1541304140537_100017/container_e186_1541304140537_100017_01_000001/directory.info"
echo "broken symlinks(find -L . -maxdepth 5 -type l -ls):" 1>>"/grid/1/hadoop/yarn/log/application_1541304140537_100017/container_e186_1541304140537_100017_01_000001/directory.info"
find -L . -maxdepth 5 -type l -ls 1>>"/grid/1/hadoop/yarn/log/application_1541304140537_100017/container_e186_1541304140537_100017_01_000001/directory.info"
exec /bin/bash -c "$JAVA_HOME/bin/java -Xmx424m  -Dlog.file="/grid/3/hadoop/yarn/log/application_1541304140537_100017/container_e186_1541304140537_100017_01_000001/jobmanager.log" -Dlogback.configurationFile=file:logback.xml -Dlog4j.configuration=file:log4j.properties org.apache.flink.yarn.entrypoint.YarnSessionClusterEntrypoint  1> /grid/3/hadoop/yarn/log/application_1541304140537_100017/container_e186_1541304140537_100017_01_000001/jobmanager.out 2> /grid/3/hadoop/yarn/log/application_1541304140537_100017/container_e186_1541304140537_100017_01_000001/jobmanager.err"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
  exit $hadoop_shell_errorcode
fi

End of LogType:launch_container.sh

LogType:jobmanager.err
Log Upload Time:Sun Dec 09 14:19:34 +0530 2018
LogLength:533
Log Contents:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/grid/11/hadoop/yarn/local/usercache/hdfs/appcache/application_1541304140537_100017/filecache/10/slf4j-log4j12-1.7.15.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.6.1.0-129/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]

End of LogType:jobmanager.err

LogType:jobmanager.log
Log Upload Time:Sun Dec 09 14:19:34 +0530 2018
LogLength:45929
Log Contents:
2018-12-09 14:19:28,621 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint         - --------------------------------------------------------------------------------
2018-12-09 14:19:28,622 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint         -  Starting YarnSessionClusterEntrypoint (Version: 1.7.0, Rev:49da9f9, Date:28.11.2018 @ 17:59:06 UTC)
2018-12-09 14:19:28,623 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint         -  OS current user: yarn
2018-12-09 14:19:29,031 WARN  org.apache.hadoop.util.NativeCodeLoader                       - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2018-12-09 14:19:29,067 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint         -  Current Hadoop/Kerberos user: hdfs
2018-12-09 14:19:29,067 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint         -  JVM: Java HotSpot(TM) 64-Bit Server VM - Oracle Corporation - 1.8/25.144-b01
2018-12-09 14:19:29,067 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint         -  Maximum heap size: 406 MiBytes
2018-12-09 14:19:29,067 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint         -  JAVA_HOME: /mnt/vol1/jdk1.8.0_144/
2018-12-09 14:19:29,068 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint         -  Hadoop version: 2.7.3.2.6.1.0-129
2018-12-09 14:19:29,069 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint         -  JVM Options:
2018-12-09 14:19:29,069 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint         -     -Xmx424m
2018-12-09 14:19:29,069 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint         -     -Dlog.file=/grid/3/hadoop/yarn/log/application_1541304140537_100017/container_e186_1541304140537_100017_01_000001/jobmanager.log
2018-12-09 14:19:29,069 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint         -     -Dlogback.configurationFile=file:logback.xml
2018-12-09 14:19:29,069 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint         -     -Dlog4j.configuration=file:log4j.properties
2018-12-09 14:19:29,069 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint         -  Program Arguments: (none)
2018-12-09 14:19:29,069 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint         -  Classpath: lib/flink-hadoop-compatibility_2.11-1.7.0.jar:lib/flink-python_2.11-1.7.0.jar:lib/log4j-1.2.17.jar:lib/slf4j-log4j12-1.7.15.jar:log4j.properties:logback.xml:flink.jar:flink-conf.yaml::/etc/hadoop/conf:/usr/hdp/current/hadoop-client/hadoop-azure.jar:/usr/hdp/current/hadoop-client/hadoop-annotations.jar:/usr/hdp/current/hadoop-client/hadoop-auth-2.7.3.2.6.1.0-129.jar:/usr/hdp/current/hadoop-client/hadoop-nfs-2.7.3.2.6.1.0-129.jar:/usr/hdp/current/hadoop-client/hadoop-nfs.jar:/usr/hdp/current/hadoop-client/hadoop-azure-datalake-2.7.3.2.6.1.0-129.jar:/usr/hdp/current/hadoop-client/hadoop-common-2.7.3.2.6.1.0-129-tests.jar:/usr/hdp/current/hadoop-client/hadoop-auth.jar:/usr/hdp/current/hadoop-client/hadoop-annotations-2.7.3.2.6.1.0-129.jar:/usr/hdp/current/hadoop-client/azure-data-lake-store-sdk-2.1.4.jar:/usr/hdp/current/hadoop-client/hadoop-common-2.7.3.2.6.1.0-129.jar:/usr/hdp/current/hadoop-client/hadoop-azure-2.7.3.2.6.1.0-129.jar:/usr/hdp/current/hadoop-client/hadoop-common-tests.jar:/usr/hdp/current/hadoop-client/hadoop-common.jar:/usr/hdp/current/hadoop-client/hadoop-azure-datalake.jar:/usr/hdp/current/hadoop-client/hadoop-aws-2.7.3.2.6.1.0-129.jar:/usr/hdp/current/hadoop-client/hadoop-aws.jar:/usr/hdp/current/hadoop-client/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/current/hadoop-client/lib/ranger-hdfs-plugin-shim-0.7.0.2.6.1.0-129.jar:/usr/hdp/current/hadoop-client/lib/slf4j-api-1.7.10.jar:/usr/hdp/current/hadoop-client/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/current/hadoop-client/lib/jackson-jaxrs-1.9.13.jar:/usr/hdp/current/hadoop-client/lib/jackson-annotations-2.2.3.jar:/usr/hdp/current/hadoop-client/lib/jsr305-3.0.0.jar:/usr/hdp/current/hadoop-client/lib/ojdbc6.jar:/usr/hdp/current/hadoop-client/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/current/hadoop-client/lib/curator-client-2.7.1.jar:/usr/hdp/current/hadoop-client/lib/commons-lang3-3.4.jar:/usr/hdp/current/hadoop-client/lib/snappy-java-1.0.4.1.jar:/usr/hdp/current/hadoop-client/lib/xmlenc-0.52.jar:/usr/hdp/current/hadoop-client/lib/aws-java-sdk-core-1.10.6.jar:/usr/hdp/current/hadoop-client/lib/htrace-core-3.1.0-incubating.jar:/usr/hdp/current/hadoop-client/lib/servlet-api-2.5.jar:/usr/hdp/current/hadoop-client/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/current/hadoop-client/lib/commons-codec-1.4.jar:/usr/hdp/current/hadoop-client/lib/jackson-xc-1.9.13.jar:/usr/hdp/current/hadoop-client/lib/jsch-0.1.54.jar:/usr/hdp/current/hadoop-client/lib/commons-beanutils-1.7.0.jar:/usr/hdp/current/hadoop-client/lib/mockito-all-1.8.5.jar:/usr/hdp/current/hadoop-client/lib/hamcrest-core-1.3.jar:/usr/hdp/current/hadoop-client/lib/jcip-annotations-1.0.jar:/usr/hdp/current/hadoop-client/lib/protobuf-java-2.5.0.jar:/usr/hdp/current/hadoop-client/lib/commons-cli-1.2.jar:/usr/hdp/current/hadoop-client/lib/activation-1.1.jar:/usr/hdp/current/hadoop-client/lib/aws-java-sdk-s3-1.10.6.jar:/usr/hdp/current/hadoop-client/lib/jettison-1.1.jar:/usr/hdp/current/hadoop-client/lib/httpclient-4.5.2.jar:/usr/hdp/current/hadoop-client/lib/nimbus-jose-jwt-3.9.jar:/usr/hdp/current/hadoop-client/lib/httpcore-4.4.4.jar:/usr/hdp/current/hadoop-client/lib/log4j-1.2.17.jar:/usr/hdp/current/hadoop-client/lib/commons-collections-3.2.2.jar:/usr/hdp/current/hadoop-client/lib/commons-net-3.1.jar:/usr/hdp/current/hadoop-client/lib/aws-java-sdk-kms-1.10.6.jar:/usr/hdp/current/hadoop-client/lib/zookeeper-3.4.6.2.6.1.0-129.jar:/usr/hdp/current/hadoop-client/lib/slf4j-log4j
12-1.7.10.jar:/usr/hdp/current/hadoop-client/lib/jsp-api-2.1.jar:/usr/hdp/current/hadoop-client/lib/stax-api-1.0-2.jar:/usr/hdp/current/hadoop-client/lib/gson-2.2.4.jar:/usr/hdp/current/hadoop-client/lib/azure-storage-4.2.0.jar:/usr/hdp/current/hadoop-client/lib/jersey-core-1.9.jar:/usr/hdp/current/hadoop-client/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/current/hadoop-client/lib/api-util-1.0.0-M20.jar:/usr/hdp/current/hadoop-client/lib/jaxb-api-2.2.2.jar:/usr/hdp/current/hadoop-client/lib/jersey-server-1.9.jar:/usr/hdp/current/hadoop-client/lib/paranamer-2.3.jar:/usr/hdp/current/hadoop-client/lib/avro-1.7.4.jar:/usr/hdp/current/hadoop-client/lib/commons-math3-3.1.1.jar:/usr/hdp/current/hadoop-client/lib/jetty-sslengine-6.1.26.hwx.jar:/usr/hdp/current/hadoop-client/lib/joda-time-2.9.4.jar:/usr/hdp/current/hadoop-client/lib/azure-keyvault-core-0.8.0.jar:/usr/hdp/current/hadoop-client/lib/commons-compress-1.4.1.jar:/usr/hdp/current/hadoop-client/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/current/hadoop-client/lib/jackson-core-2.2.3.jar:/usr/hdp/current/hadoop-client/lib/jackson-databind-2.2.3.jar:/usr/hdp/current/hadoop-client/lib/netty-3.6.2.Final.jar:/usr/hdp/current/hadoop-client/lib/commons-configuration-1.6.jar:/usr/hdp/current/hadoop-client/lib/jets3t-0.9.0.jar:/usr/hdp/current/hadoop-client/lib/commons-lang-2.6.jar:/usr/hdp/current/hadoop-client/lib/commons-io-2.4.jar:/usr/hdp/current/hadoop-client/lib/junit-4.11.jar:/usr/hdp/current/hadoop-client/lib/jetty-6.1.26.hwx.jar:/usr/hdp/current/hadoop-client/lib/guava-11.0.2.jar:/usr/hdp/current/hadoop-client/lib/curator-recipes-2.7.1.jar:/usr/hdp/current/hadoop-client/lib/jaxb-impl-2.2.3-1.jar:/usr/hdp/current/hadoop-client/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/current/hadoop-client/lib/curator-framework-2.7.1.jar:/usr/hdp/current/hadoop-client/lib/jersey-json-1.9.jar:/usr/hdp/current/hadoop-client/lib/asm-3.2.jar:/usr/hdp/current/hadoop-client/lib/ranger-plugin-classloader-0.7.0.2.6.1.0-129.jar:/usr/hdp/current/hadoop-client/lib/commons-logging-1.1.3.jar:/usr/hdp/current/hadoop-client/lib/json-smart-1.1.1.jar:/usr/hdp/current/hadoop-client/lib/ranger-yarn-plugin-shim-0.7.0.2.6.1.0-129.jar:/usr/hdp/current/hadoop-client/lib/commons-digester-1.8.jar:/usr/hdp/current/hadoop-client/lib/java-xmlbuilder-0.4.jar:/usr/hdp/current/hadoop-client/lib/xz-1.0.jar:/usr/hdp/current/hadoop-hdfs-client/hadoop-hdfs-nfs.jar:/usr/hdp/current/hadoop-hdfs-client/hadoop-hdfs.jar:/usr/hdp/current/hadoop-hdfs-client/hadoop-hdfs-2.7.3.2.6.1.0-129.jar:/usr/hdp/current/hadoop-hdfs-client/hadoop-hdfs-2.7.3.2.6.1.0-129-tests.jar:/usr/hdp/current/hadoop-hdfs-client/hadoop-hdfs-tests.jar:/usr/hdp/current/hadoop-hdfs-client/hadoop-hdfs-nfs-2.7.3.2.6.1.0-129.jar:/usr/hdp/current/hadoop-hdfs-client/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/current/hadoop-hdfs-client/lib/jackson-annotations-2.2.3.jar:/usr/hdp/current/hadoop-hdfs-client/lib/jsr305-3.0.0.jar:/usr/hdp/current/hadoop-hdfs-client/lib/xmlenc-0.52.jar:/usr/hdp/current/hadoop-hdfs-client/lib/leveldbjni-all-1.8.jar:/usr/hdp/current/hadoop-hdfs-client/lib/htrace-core-3.1.0-incubating.jar:/usr/hdp/current/hadoop-hdfs-client/lib/servlet-api-2.5.jar:/usr/hdp/current/hadoop-hdfs-client/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/current/hadoop-hdfs-client/lib/commons-codec-1.4.jar:/usr/hdp/current/hadoop-hdfs-client/lib/netty-all-4.0.23.Final.jar:/usr/hdp/current/hadoop-hdfs-client/lib/protobuf-java-2.5.0.jar:/usr/hdp/current/hadoop-hdfs-client/lib/commons-cli-1.2.jar:/usr/hdp/current/hadoop-hdfs-client/lib/log4j-1.2
.17.jar:/usr/hdp/current/hadoop-hdfs-client/lib/jersey-core-1.9.jar:/usr/hdp/current/hadoop-hdfs-client/lib/jersey-server-1.9.jar:/usr/hdp/current/hadoop-hdfs-client/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/current/hadoop-hdfs-client/lib/jackson-core-2.2.3.jar:/usr/hdp/current/hadoop-hdfs-client/lib/jackson-databind-2.2.3.jar:/usr/hdp/current/hadoop-hdfs-client/lib/netty-3.6.2.Final.jar:/usr/hdp/current/hadoop-hdfs-client/lib/commons-lang-2.6.jar:/usr/hdp/current/hadoop-hdfs-client/lib/commons-io-2.4.jar:/usr/hdp/current/hadoop-hdfs-client/lib/xml-apis-1.3.04.jar:/usr/hdp/current/hadoop-hdfs-client/lib/okhttp-2.4.0.jar:/usr/hdp/current/hadoop-hdfs-client/lib/xercesImpl-2.9.1.jar:/usr/hdp/current/hadoop-hdfs-client/lib/jetty-6.1.26.hwx.jar:/usr/hdp/current/hadoop-hdfs-client/lib/guava-11.0.2.jar:/usr/hdp/current/hadoop-hdfs-client/lib/okio-1.4.0.jar:/usr/hdp/current/hadoop-hdfs-client/lib/commons-daemon-1.0.13.jar:/usr/hdp/current/hadoop-hdfs-client/lib/asm-3.2.jar:/usr/hdp/current/hadoop-hdfs-client/lib/commons-logging-1.1.3.jar:/usr/hdp/current/hadoop-yarn-client/hadoop-yarn-server-sharedcachemanager-2.7.3.2.6.1.0-129.jar:/usr/hdp/current/hadoop-yarn-client/hadoop-yarn-server-tests.jar:/usr/hdp/current/hadoop-yarn-client/hadoop-yarn-server-applicationhistoryservice.jar:/usr/hdp/current/hadoop-yarn-client/hadoop-yarn-server-nodemanager-2.7.3.2.6.1.0-129.jar:/usr/hdp/current/hadoop-yarn-client/hadoop-yarn-api-2.7.3.2.6.1.0-129.jar:/usr/hdp/current/hadoop-yarn-client/hadoop-yarn-api.jar:/usr/hdp/current/hadoop-yarn-client/hadoop-yarn-common.jar:/usr/hdp/current/hadoop-yarn-client/hadoop-yarn-server-timeline-pluginstorage-2.7.3.2.6.1.0-129.jar:/usr/hdp/current/hadoop-yarn-client/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/hdp/current/hadoop-yarn-client/hadoop-yarn-server-nodemanager.jar:/usr/hdp/current/hadoop-yarn-client/hadoop-yarn-server-web-proxy.jar:/usr/hdp/current/hadoop-yarn-client/hadoop-yarn-server-sharedcachemanager.jar:/usr/hdp/current/hadoop-yarn-client/hadoop-yarn-registry.jar:/usr/hdp/current/hadoop-yarn-client/hadoop-yarn-registry-2.7.3.2.6.1.0-129.jar:/usr/hdp/current/hadoop-yarn-client/hadoop-yarn-server-applicationhistoryservice-2.7.3.2.6.1.0-129.jar:/usr/hdp/current/hadoop-yarn-client/hadoop-yarn-common-2.7.3.2.6.1.0-129.jar:/usr/hdp/current/hadoop-yarn-client/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.2.6.1.0-129.jar:/usr/hdp/current/hadoop-yarn-client/hadoop-yarn-server-common.jar:/usr/hdp/current/hadoop-yarn-client/hadoop-yarn-server-tests-2.7.3.2.6.1.0-129.jar:/usr/hdp/current/hadoop-yarn-client/hadoop-yarn-client.jar:/usr/hdp/current/hadoop-yarn-client/hadoop-yarn-server-timeline-pluginstorage.jar:/usr/hdp/current/hadoop-yarn-client/hadoop-yarn-server-common-2.7.3.2.6.1.0-129.jar:/usr/hdp/current/hadoop-yarn-client/hadoop-yarn-server-resourcemanager.jar:/usr/hdp/current/hadoop-yarn-client/hadoop-yarn-server-resourcemanager-2.7.3.2.6.1.0-129.jar:/usr/hdp/current/hadoop-yarn-client/hadoop-yarn-server-web-proxy-2.7.3.2.6.1.0-129.jar:/usr/hdp/current/hadoop-yarn-client/hadoop-yarn-client-2.7.3.2.6.1.0-129.jar:/usr/hdp/current/hadoop-yarn-client/hadoop-yarn-applications-distributedshell-2.7.3.2.6.1.0-129.jar:/usr/hdp/current/hadoop-yarn-client/hadoop-yarn-applications-distributedshell.jar:/usr/hdp/current/hadoop-yarn-client/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/current/hadoop-yarn-client/lib/zookeeper-3.4.6.2.6.1.0-129-tests.jar:/usr/hdp/current/hadoop-yarn-client/lib/javax.inject-1.jar:/usr/hdp/current/hadoop-yarn-client/lib/commons-beanutils-core-1.
8.0.jar:/usr/hdp/current/hadoop-yarn-client/lib/jackson-jaxrs-1.9.13.jar:/usr/hdp/current/hadoop-yarn-client/lib/jackson-annotations-2.2.3.jar:/usr/hdp/current/hadoop-yarn-client/lib/jsr305-3.0.0.jar:/usr/hdp/current/hadoop-yarn-client/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/current/hadoop-yarn-client/lib/curator-client-2.7.1.jar:/usr/hdp/current/hadoop-yarn-client/lib/commons-lang3-3.4.jar:/usr/hdp/current/hadoop-yarn-client/lib/snappy-java-1.0.4.1.jar:/usr/hdp/current/hadoop-yarn-client/lib/jersey-client-1.9.jar:/usr/hdp/current/hadoop-yarn-client/lib/xmlenc-0.52.jar:/usr/hdp/current/hadoop-yarn-client/lib/leveldbjni-all-1.8.jar:/usr/hdp/current/hadoop-yarn-client/lib/htrace-core-3.1.0-incubating.jar:/usr/hdp/current/hadoop-yarn-client/lib/servlet-api-2.5.jar:/usr/hdp/current/hadoop-yarn-client/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/current/hadoop-yarn-client/lib/commons-codec-1.4.jar:/usr/hdp/current/hadoop-yarn-client/lib/jackson-xc-1.9.13.jar:/usr/hdp/current/hadoop-yarn-client/lib/jsch-0.1.54.jar:/usr/hdp/current/hadoop-yarn-client/lib/commons-beanutils-1.7.0.jar:/usr/hdp/current/hadoop-yarn-client/lib/objenesis-2.1.jar:/usr/hdp/current/hadoop-yarn-client/lib/guice-3.0.jar:/usr/hdp/current/hadoop-yarn-client/lib/fst-2.24.jar:/usr/hdp/current/hadoop-yarn-client/lib/jcip-annotations-1.0.jar:/usr/hdp/current/hadoop-yarn-client/lib/protobuf-java-2.5.0.jar:/usr/hdp/current/hadoop-yarn-client/lib/commons-cli-1.2.jar:/usr/hdp/current/hadoop-yarn-client/lib/activation-1.1.jar:/usr/hdp/current/hadoop-yarn-client/lib/jettison-1.1.jar:/usr/hdp/current/hadoop-yarn-client/lib/httpclient-4.5.2.jar:/usr/hdp/current/hadoop-yarn-client/lib/nimbus-jose-jwt-3.9.jar:/usr/hdp/current/hadoop-yarn-client/lib/httpcore-4.4.4.jar:/usr/hdp/current/hadoop-yarn-client/lib/log4j-1.2.17.jar:/usr/hdp/current/hadoop-yarn-client/lib/commons-collections-3.2.2.jar:/usr/hdp/current/hadoop-yarn-client/lib/commons-net-3.1.jar:/usr/hdp/current/hadoop-yarn-client/lib/metrics-core-3.0.1.jar:/usr/hdp/current/hadoop-yarn-client/lib/zookeeper-3.4.6.2.6.1.0-129.jar:/usr/hdp/current/hadoop-yarn-client/lib/jsp-api-2.1.jar:/usr/hdp/current/hadoop-yarn-client/lib/javassist-3.18.1-GA.jar:/usr/hdp/current/hadoop-yarn-client/lib/stax-api-1.0-2.jar:/usr/hdp/current/hadoop-yarn-client/lib/gson-2.2.4.jar:/usr/hdp/current/hadoop-yarn-client/lib/azure-storage-4.2.0.jar:/usr/hdp/current/hadoop-yarn-client/lib/jersey-core-1.9.jar:/usr/hdp/current/hadoop-yarn-client/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/current/hadoop-yarn-client/lib/api-util-1.0.0-M20.jar:/usr/hdp/current/hadoop-yarn-client/lib/jersey-guice-1.9.jar:/usr/hdp/current/hadoop-yarn-client/lib/jaxb-api-2.2.2.jar:/usr/hdp/current/hadoop-yarn-client/lib/jersey-server-1.9.jar:/usr/hdp/current/hadoop-yarn-client/lib/paranamer-2.3.jar:/usr/hdp/current/hadoop-yarn-client/lib/aopalliance-1.0.jar:/usr/hdp/current/hadoop-yarn-client/lib/avro-1.7.4.jar:/usr/hdp/current/hadoop-yarn-client/lib/commons-math3-3.1.1.jar:/usr/hdp/current/hadoop-yarn-client/lib/jetty-sslengine-6.1.26.hwx.jar:/usr/hdp/current/hadoop-yarn-client/lib/azure-keyvault-core-0.8.0.jar:/usr/hdp/current/hadoop-yarn-client/lib/commons-compress-1.4.1.jar:/usr/hdp/current/hadoop-yarn-client/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/current/hadoop-yarn-client/lib/jackson-core-2.2.3.jar:/usr/hdp/current/hadoop-yarn-client/lib/jackson-databind-2.2.3.jar:/usr/hdp/current/hadoop-yarn-client/lib/netty-3.6.2.Final.jar:/usr/hdp/current/hadoop-yarn-client/lib/commons-configuration-1.6.jar:/usr/hdp/current/hadoop-yarn-client/lib/jets3
t-0.9.0.jar:/usr/hdp/current/hadoop-yarn-client/lib/commons-lang-2.6.jar:/usr/hdp/current/hadoop-yarn-client/lib/commons-io-2.4.jar:/usr/hdp/current/hadoop-yarn-client/lib/guice-servlet-3.0.jar:/usr/hdp/current/hadoop-yarn-client/lib/jetty-6.1.26.hwx.jar:/usr/hdp/current/hadoop-yarn-client/lib/guava-11.0.2.jar:/usr/hdp/current/hadoop-yarn-client/lib/curator-recipes-2.7.1.jar:/usr/hdp/current/hadoop-yarn-client/lib/jaxb-impl-2.2.3-1.jar:/usr/hdp/current/hadoop-yarn-client/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/current/hadoop-yarn-client/lib/curator-framework-2.7.1.jar:/usr/hdp/current/hadoop-yarn-client/lib/jersey-json-1.9.jar:/usr/hdp/current/hadoop-yarn-client/lib/asm-3.2.jar:/usr/hdp/current/hadoop-yarn-client/lib/commons-logging-1.1.3.jar:/usr/hdp/current/hadoop-yarn-client/lib/json-smart-1.1.1.jar:/usr/hdp/current/hadoop-yarn-client/lib/commons-digester-1.8.jar:/usr/hdp/current/hadoop-yarn-client/lib/java-xmlbuilder-0.4.jar:/usr/hdp/current/hadoop-yarn-client/lib/xz-1.0.jar
2018-12-09 14:19:29,070 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint         - --------------------------------------------------------------------------------
2018-12-09 14:19:29,070 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint         - Registered UNIX signal handlers for [TERM, HUP, INT]
2018-12-09 14:19:29,135 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint         - YARN daemon is running as: hdfs Yarn client user obtainer: hdfs
2018-12-09 14:19:29,140 INFO  org.apache.flink.configuration.GlobalConfiguration            - Loading configuration property: rest.port, 8081
2018-12-09 14:19:29,140 INFO  org.apache.flink.configuration.GlobalConfiguration            - Loading configuration property: internal.cluster.execution-mode, NORMAL
2018-12-09 14:19:29,141 INFO  org.apache.flink.configuration.GlobalConfiguration            - Loading configuration property: parallelism.default, 1
2018-12-09 14:19:29,141 INFO  org.apache.flink.configuration.GlobalConfiguration            - Loading configuration property: high-availability.cluster-id, application_1541304140537_100017
2018-12-09 14:19:29,141 INFO  org.apache.flink.configuration.GlobalConfiguration            - Loading configuration property: jobmanager.rpc.address, localhost
2018-12-09 14:19:29,141 INFO  org.apache.flink.configuration.GlobalConfiguration            - Loading configuration property: taskmanager.numberOfTaskSlots, 1
2018-12-09 14:19:29,141 INFO  org.apache.flink.configuration.GlobalConfiguration            - Loading configuration property: jobmanager.rpc.port, 6123
2018-12-09 14:19:29,141 INFO  org.apache.flink.configuration.GlobalConfiguration            - Loading configuration property: taskmanager.heap.size, 2048m
2018-12-09 14:19:29,141 INFO  org.apache.flink.configuration.GlobalConfiguration            - Loading configuration property: jobmanager.heap.size, 1024m
2018-12-09 14:19:29,160 INFO  org.apache.flink.runtime.clusterframework.BootstrapTools      - Setting directories for temporary files to: /grid/1/hadoop/yarn/local/usercache/hdfs/appcache/application_1541304140537_100017,/grid/10/hadoop/yarn/local/usercache/hdfs/appcache/application_1541304140537_100017,/grid/11/hadoop/yarn/local/usercache/hdfs/appcache/application_1541304140537_100017,/grid/12/hadoop/yarn/local/usercache/hdfs/appcache/application_1541304140537_100017,/grid/2/hadoop/yarn/local/usercache/hdfs/appcache/application_1541304140537_100017,/grid/3/hadoop/yarn/local/usercache/hdfs/appcache/application_1541304140537_100017,/grid/4/hadoop/yarn/local/usercache/hdfs/appcache/application_1541304140537_100017,/grid/5/hadoop/yarn/local/usercache/hdfs/appcache/application_1541304140537_100017,/grid/6/hadoop/yarn/local/usercache/hdfs/appcache/application_1541304140537_100017,/grid/7/hadoop/yarn/local/usercache/hdfs/appcache/application_1541304140537_100017,/grid/8/hadoop/yarn/local/usercache/hdfs/appcache/application_1541304140537_100017,/grid/9/hadoop/yarn/local/usercache/hdfs/appcache/application_1541304140537_100017
2018-12-09 14:19:29,187 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint         - Starting YarnSessionClusterEntrypoint.
2018-12-09 14:19:29,187 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint         - Install default filesystem.
2018-12-09 14:19:29,261 INFO  org.apache.flink.runtime.security.modules.HadoopModule        - Hadoop user set to hdfs (auth:SIMPLE)
2018-12-09 14:19:29,276 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint         - Initializing cluster services.
2018-12-09 14:19:29,429 INFO  org.apache.flink.runtime.rpc.akka.AkkaRpcServiceUtils         - Trying to start actor system at nn-hw-m1d23.dbp.name.in:0
2018-12-09 14:19:29,804 INFO  akka.event.slf4j.Slf4jLogger                                  - Slf4jLogger started
2018-12-09 14:19:29,895 INFO  akka.remote.Remoting                                          - Starting remoting
2018-12-09 14:19:30,072 INFO  akka.remote.Remoting                                          - Remoting started; listening on addresses :[akka.tcp://fl...@nn-hw-m1d23.dbp.name.in:5121]
2018-12-09 14:19:30,080 INFO  org.apache.flink.runtime.rpc.akka.AkkaRpcServiceUtils         - Actor system started at akka.tcp://fl...@nn-hw-m1d23.dbp.name.in:5121
2018-12-09 14:19:30,100 INFO  org.apache.flink.runtime.blob.BlobServer                      - Created BLOB server storage directory /grid/12/hadoop/yarn/local/usercache/hdfs/appcache/application_1541304140537_100017/blobStore-5308aebb-1216-439e-ba5e-aafc4822af44
2018-12-09 14:19:30,111 INFO  org.apache.flink.runtime.blob.BlobServer                      - Started BLOB server at 0.0.0.0:22767 - max concurrent requests: 50 - max backlog: 1000
2018-12-09 14:19:30,157 INFO  org.apache.flink.runtime.metrics.MetricRegistryImpl           - No metrics reporter configured, no metrics will be exposed/reported.
2018-12-09 14:19:30,158 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint         - Trying to start actor system at nn-hw-m1d23.dbp.name.in:0
2018-12-09 14:19:30,176 INFO  akka.event.slf4j.Slf4jLogger                                  - Slf4jLogger started
2018-12-09 14:19:30,185 INFO  akka.remote.Remoting                                          - Starting remoting
2018-12-09 14:19:30,195 INFO  akka.remote.Remoting                                          - Remoting started; listening on addresses :[akka.tcp://flink-metr...@nn-hw-m1d23.dbp.name.in:15452]
2018-12-09 14:19:30,196 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint         - Actor system started at akka.tcp://flink-metr...@nn-hw-m1d23.dbp.name.in:15452
2018-12-09 14:19:30,202 INFO  org.apache.flink.runtime.dispatcher.FileArchivedExecutionGraphStore  - Initializing FileArchivedExecutionGraphStore: Storage directory /grid/1/hadoop/yarn/local/usercache/hdfs/appcache/application_1541304140537_100017/executionGraphStore-12e37892-5523-4e94-94c2-005875fcce45, expiration time 3600000, maximum cache size 52428800 bytes.
2018-12-09 14:19:30,225 INFO  org.apache.flink.runtime.blob.TransientBlobCache              - Created BLOB cache storage directory /grid/11/hadoop/yarn/local/usercache/hdfs/appcache/application_1541304140537_100017/blobStore-bf84b253-7cac-41a4-9adf-8388f571ab9b
2018-12-09 14:19:30,248 WARN  org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint    - Upload directory /tmp/flink-web-6345fac2-a787-46a3-be09-4ab1cf646849/flink-web-upload does not exist, or has been deleted externally. Previously uploaded files are no longer available.
2018-12-09 14:19:30,260 INFO  org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint    - Created directory /tmp/flink-web-6345fac2-a787-46a3-be09-4ab1cf646849/flink-web-upload for file uploads.
2018-12-09 14:19:30,264 INFO  org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint    - Starting rest endpoint.
2018-12-09 14:19:30,510 INFO  org.apache.flink.runtime.webmonitor.WebMonitorUtils           - Determined location of main cluster component log file: /grid/3/hadoop/yarn/log/application_1541304140537_100017/container_e186_1541304140537_100017_01_000001/jobmanager.log
2018-12-09 14:19:30,510 INFO  org.apache.flink.runtime.webmonitor.WebMonitorUtils           - Determined location of main cluster component stdout file: /grid/3/hadoop/yarn/log/application_1541304140537_100017/container_e186_1541304140537_100017_01_000001/jobmanager.out
2018-12-09 14:19:30,679 INFO  org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint    - Rest endpoint listening at nn-hw-m1d23.dbp.name.in:26980
2018-12-09 14:19:30,679 INFO  org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint    - http://nn-hw-m1d23.dbp.name.in:26980 was granted leadership with leaderSessionID=00000000-0000-0000-0000-000000000000
2018-12-09 14:19:30,679 INFO  org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint    - Web frontend listening at http://nn-hw-m1d23.dbp.name.in:26980.
2018-12-09 14:19:30,745 INFO  org.apache.flink.runtime.rpc.akka.AkkaRpcService              - Starting RPC endpoint for org.apache.flink.yarn.YarnResourceManager at akka://flink/user/resourcemanager .
2018-12-09 14:19:30,797 INFO  org.apache.flink.runtime.rpc.akka.AkkaRpcService              - Starting RPC endpoint for org.apache.flink.runtime.dispatcher.StandaloneDispatcher at akka://flink/user/dispatcher .
2018-12-09 14:19:31,022 INFO  org.apache.hadoop.yarn.client.RequestHedgingRMFailoverProxyProvider  - Looking for the active RM in [rm1, rm2]...
2018-12-09 14:19:31,289 INFO  org.apache.hadoop.yarn.client.RequestHedgingRMFailoverProxyProvider  - Found active RM [rm2]
2018-12-09 14:19:31,291 INFO  org.apache.flink.yarn.YarnResourceManager                     - Recovered 0 containers from previous attempts ([]).
2018-12-09 14:19:31,303 INFO  org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy  - yarn.client.max-cached-nodemanagers-proxies : 0
2018-12-09 14:19:31,305 INFO  org.apache.flink.yarn.YarnResourceManager                     - ResourceManager akka.tcp://fl...@nn-hw-m1d23.dbp.name.in:5121/user/resourcemanager was granted leadership with fencing token 00000000000000000000000000000000
2018-12-09 14:19:31,306 INFO  org.apache.flink.runtime.resourcemanager.slotmanager.SlotManager  - Starting the SlotManager.
2018-12-09 14:19:31,335 INFO  org.apache.flink.runtime.dispatcher.StandaloneDispatcher      - Dispatcher akka.tcp://fl...@nn-hw-m1d23.dbp.name.in:5121/user/dispatcher was granted leadership with fencing token 00000000-0000-0000-0000-000000000000
2018-12-09 14:19:31,341 INFO  org.apache.flink.runtime.dispatcher.StandaloneDispatcher      - Recovering all persisted jobs.
2018-12-09 14:19:32,467 INFO  org.apache.flink.runtime.dispatcher.StandaloneDispatcher      - Submitting job 53e77d649a08e1274a2bcd15ce979515 (Hadoop WordCount).
2018-12-09 14:19:32,525 INFO  org.apache.flink.runtime.rpc.akka.AkkaRpcService              - Starting RPC endpoint for org.apache.flink.runtime.jobmaster.JobMaster at akka://flink/user/jobmanager_0 .
2018-12-09 14:19:32,533 INFO  org.apache.flink.runtime.jobmaster.JobMaster                  - Initializing job Hadoop WordCount (53e77d649a08e1274a2bcd15ce979515).
2018-12-09 14:19:32,545 INFO  org.apache.flink.runtime.jobmaster.JobMaster                  - Using restart strategy NoRestartStrategy for Hadoop WordCount (53e77d649a08e1274a2bcd15ce979515).
2018-12-09 14:19:32,549 INFO  org.apache.flink.runtime.rpc.akka.AkkaRpcService              - Starting RPC endpoint for org.apache.flink.runtime.jobmaster.slotpool.SlotPool at akka://flink/user/6a0b9952-5c9f-43e0-91c6-c260a9d73dbe .
2018-12-09 14:19:32,572 INFO  org.apache.flink.runtime.executiongraph.ExecutionGraph        - Job recovers via failover strategy: full graph restart
2018-12-09 14:19:32,598 INFO  org.apache.flink.runtime.jobmaster.JobMaster                  - Running initialization on master for job Hadoop WordCount (53e77d649a08e1274a2bcd15ce979515).
2018-12-09 14:19:32,609 ERROR org.apache.flink.runtime.dispatcher.StandaloneDispatcher      - Failed to submit job 53e77d649a08e1274a2bcd15ce979515.
java.lang.RuntimeException: org.apache.flink.runtime.client.JobExecutionException: Could not set up JobManager
	at org.apache.flink.util.function.CheckedSupplier.lambda$unchecked$0(CheckedSupplier.java:36)
	at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1590)
	at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:39)
	at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:415)
	at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
	at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
	at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
	at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Could not set up JobManager
	at org.apache.flink.runtime.jobmaster.JobManagerRunner.<init>(JobManagerRunner.java:176)
	at org.apache.flink.runtime.dispatcher.Dispatcher$DefaultJobManagerRunnerFactory.createJobManagerRunner(Dispatcher.java:1058)
	at org.apache.flink.runtime.dispatcher.Dispatcher.lambda$createJobManagerRunner$5(Dispatcher.java:308)
	at org.apache.flink.util.function.CheckedSupplier.lambda$unchecked$0(CheckedSupplier.java:34)
	... 7 more
Caused by: org.apache.flink.runtime.client.JobExecutionException: Cannot initialize task 'DataSource (at createInput(ExecutionEnvironment.java:548) (org.apache.flink.api.java.hadoop.mapreduce.HadoopInputFormat))': Deserializing the InputFormat (org.apache.flink.api.java.hadoop.mapreduce.HadoopInputFormat@4441d567) failed: org/apache/hadoop/mapreduce/JobContext
	at org.apache.flink.runtime.executiongraph.ExecutionGraphBuilder.buildGraph(ExecutionGraphBuilder.java:220)
	at org.apache.flink.runtime.executiongraph.ExecutionGraphBuilder.buildGraph(ExecutionGraphBuilder.java:100)
	at org.apache.flink.runtime.jobmaster.JobMaster.createExecutionGraph(JobMaster.java:1166)
	at org.apache.flink.runtime.jobmaster.JobMaster.createAndRestoreExecutionGraph(JobMaster.java:1146)
	at org.apache.flink.runtime.jobmaster.JobMaster.<init>(JobMaster.java:296)
	at org.apache.flink.runtime.jobmaster.JobManagerRunner.<init>(JobManagerRunner.java:157)
	... 10 more
Caused by: java.lang.Exception: Deserializing the InputFormat (org.apache.flink.api.java.hadoop.mapreduce.HadoopInputFormat@4441d567) failed: org/apache/hadoop/mapreduce/JobContext
	at org.apache.flink.runtime.jobgraph.InputFormatVertex.initializeOnMaster(InputFormatVertex.java:66)
	at org.apache.flink.runtime.executiongraph.ExecutionGraphBuilder.buildGraph(ExecutionGraphBuilder.java:216)
	... 15 more
Caused by: java.lang.NoClassDefFoundError: org/apache/hadoop/mapreduce/JobContext
	at java.lang.Class.getDeclaredFields0(Native Method)
	at java.lang.Class.privateGetDeclaredFields(Class.java:2583)
	at java.lang.Class.getDeclaredField(Class.java:2068)
	at java.io.ObjectStreamClass.getDeclaredSUID(ObjectStreamClass.java:1703)
	at java.io.ObjectStreamClass.access$700(ObjectStreamClass.java:72)
	at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:484)
	at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:472)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.io.ObjectStreamClass.<init>(ObjectStreamClass.java:472)
	at java.io.ObjectStreamClass.lookup(ObjectStreamClass.java:369)
	at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:598)
	at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1843)
	at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1713)
	at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1843)
	at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1713)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2000)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2245)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2169)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2027)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:422)
	at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:524)
	at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:510)
	at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:498)
	at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:459)
	at org.apache.flink.runtime.operators.util.TaskConfig.getStubWrapper(TaskConfig.java:288)
	at org.apache.flink.runtime.jobgraph.InputFormatVertex.initializeOnMaster(InputFormatVertex.java:63)
	... 16 more
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.mapreduce.JobContext
	at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	... 44 more
2018-12-09 14:19:32,612 ERROR org.apache.flink.runtime.rest.handler.job.JobSubmitHandler    - Implementation error: Unhandled exception.
org.apache.flink.runtime.client.JobSubmissionException: Failed to submit job.
	at org.apache.flink.runtime.dispatcher.Dispatcher.lambda$submitJob$2(Dispatcher.java:267)
	at java.util.concurrent.CompletableFuture.uniExceptionally(CompletableFuture.java:870)
	at java.util.concurrent.CompletableFuture$UniExceptionally.tryFire(CompletableFuture.java:852)
	at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:474)
	at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:561)
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:739)
	at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:442)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRunAsync(AkkaRpcActor.java:332)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:158)
	at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:70)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.onReceive(AkkaRpcActor.java:142)
	at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.onReceive(FencedAkkaRpcActor.java:40)
	at akka.actor.UntypedActor$$anonfun$receive$1.applyOrElse(UntypedActor.scala:165)
	at akka.actor.Actor$class.aroundReceive(Actor.scala:502)
	at akka.actor.UntypedActor.aroundReceive(UntypedActor.scala:95)
	at akka.actor.ActorCell.receiveMessage(ActorCell.scala:526)
	at akka.actor.ActorCell.invoke(ActorCell.scala:495)
	at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:257)
	at akka.dispatch.Mailbox.run(Mailbox.scala:224)
	at akka.dispatch.Mailbox.exec(Mailbox.scala:234)
	at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
	at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
	at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
	at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.RuntimeException: org.apache.flink.runtime.client.JobExecutionException: Could not set up JobManager
	at org.apache.flink.util.function.CheckedSupplier.lambda$unchecked$0(CheckedSupplier.java:36)
	at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1590)
	at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:39)
	at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:415)
	... 4 more
Caused by: org.apache.flink.runtime.client.JobExecutionException: Could not set up JobManager
	at org.apache.flink.runtime.jobmaster.JobManagerRunner.<init>(JobManagerRunner.java:176)
	at org.apache.flink.runtime.dispatcher.Dispatcher$DefaultJobManagerRunnerFactory.createJobManagerRunner(Dispatcher.java:1058)
	at org.apache.flink.runtime.dispatcher.Dispatcher.lambda$createJobManagerRunner$5(Dispatcher.java:308)
	at org.apache.flink.util.function.CheckedSupplier.lambda$unchecked$0(CheckedSupplier.java:34)
	... 7 more
Caused by: org.apache.flink.runtime.client.JobExecutionException: Cannot initialize task 'DataSource (at createInput(ExecutionEnvironment.java:548) (org.apache.flink.api.java.hadoop.mapreduce.HadoopInputFormat))': Deserializing the InputFormat (org.apache.flink.api.java.hadoop.mapreduce.HadoopInputFormat@4441d567) failed: org/apache/hadoop/mapreduce/JobContext
	at org.apache.flink.runtime.executiongraph.ExecutionGraphBuilder.buildGraph(ExecutionGraphBuilder.java:220)
	at org.apache.flink.runtime.executiongraph.ExecutionGraphBuilder.buildGraph(ExecutionGraphBuilder.java:100)
	at org.apache.flink.runtime.jobmaster.JobMaster.createExecutionGraph(JobMaster.java:1166)
	at org.apache.flink.runtime.jobmaster.JobMaster.createAndRestoreExecutionGraph(JobMaster.java:1146)
	at org.apache.flink.runtime.jobmaster.JobMaster.<init>(JobMaster.java:296)
	at org.apache.flink.runtime.jobmaster.JobManagerRunner.<init>(JobManagerRunner.java:157)
	... 10 more
Caused by: java.lang.Exception: Deserializing the InputFormat (org.apache.flink.api.java.hadoop.mapreduce.HadoopInputFormat@4441d567) failed: org/apache/hadoop/mapreduce/JobContext
	at org.apache.flink.runtime.jobgraph.InputFormatVertex.initializeOnMaster(InputFormatVertex.java:66)
	at org.apache.flink.runtime.executiongraph.ExecutionGraphBuilder.buildGraph(ExecutionGraphBuilder.java:216)
	... 15 more
Caused by: java.lang.NoClassDefFoundError: org/apache/hadoop/mapreduce/JobContext
	at java.lang.Class.getDeclaredFields0(Native Method)
	at java.lang.Class.privateGetDeclaredFields(Class.java:2583)
	at java.lang.Class.getDeclaredField(Class.java:2068)
	at java.io.ObjectStreamClass.getDeclaredSUID(ObjectStreamClass.java:1703)
	at java.io.ObjectStreamClass.access$700(ObjectStreamClass.java:72)
	at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:484)
	at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:472)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.io.ObjectStreamClass.<init>(ObjectStreamClass.java:472)
	at java.io.ObjectStreamClass.lookup(ObjectStreamClass.java:369)
	at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:598)
	at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1843)
	at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1713)
	at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1843)
	at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1713)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2000)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2245)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2169)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2027)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:422)
	at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:524)
	at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:510)
	at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:498)
	at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:459)
	at org.apache.flink.runtime.operators.util.TaskConfig.getStubWrapper(TaskConfig.java:288)
	at org.apache.flink.runtime.jobgraph.InputFormatVertex.initializeOnMaster(InputFormatVertex.java:63)
	... 16 more
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.mapreduce.JobContext
	at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	... 44 more
2018-12-09 14:19:32,713 INFO  org.apache.flink.runtime.dispatcher.StandaloneDispatcher      - Stopping dispatcher akka.tcp://fl...@nn-hw-m1d23.dbp.name.in:5121/user/dispatcher.
2018-12-09 14:19:32,713 INFO  org.apache.flink.runtime.dispatcher.StandaloneDispatcher      - Stopping all currently running jobs of dispatcher akka.tcp://fl...@nn-hw-m1d23.dbp.name.in:5121/user/dispatcher.
2018-12-09 14:19:32,719 INFO  org.apache.flink.runtime.rest.handler.legacy.backpressure.StackTraceSampleCoordinator  - Shutting down stack trace sample coordinator.
2018-12-09 14:19:32,722 INFO  org.apache.flink.runtime.dispatcher.StandaloneDispatcher      - Stopped dispatcher akka.tcp://fl...@nn-hw-m1d23.dbp.name.in:5121/user/dispatcher.
2018-12-09 14:19:32,723 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint         - Shutting YarnSessionClusterEntrypoint down with application status SUCCEEDED. Diagnostics null.
2018-12-09 14:19:32,723 INFO  org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint    - Shutting down rest endpoint.
2018-12-09 14:19:32,774 INFO  org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint    - Removing cache directory /tmp/flink-web-6345fac2-a787-46a3-be09-4ab1cf646849/flink-web-ui
2018-12-09 14:19:32,774 INFO  org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint    - http://nn-hw-m1d23.dbp.name.in:26980 lost leadership
2018-12-09 14:19:32,775 INFO  org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint    - Shut down complete.
2018-12-09 14:19:32,776 INFO  org.apache.flink.yarn.YarnResourceManager                     - Shut down cluster because application is in SUCCEEDED, diagnostics null.
2018-12-09 14:19:32,778 INFO  org.apache.flink.yarn.YarnResourceManager                     - Unregister application from the YARN Resource Manager with final status SUCCEEDED.
2018-12-09 14:19:32,809 INFO  org.apache.hadoop.yarn.client.api.impl.AMRMClientImpl         - Waiting for application to be successfully unregistered.
2018-12-09 14:19:33,471 WARN  org.apache.hadoop.hdfs.shortcircuit.DomainSocketFactory       - The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
2018-12-09 14:19:33,511 INFO  org.apache.hadoop.yarn.client.api.impl.NMClientImpl           - Clean up running containers on stop.
2018-12-09 14:19:33,511 INFO  org.apache.hadoop.yarn.client.api.async.impl.AMRMClientAsyncImpl  - Interrupted while waiting for queue
java.lang.InterruptedException
	at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2014)
	at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2048)
	at java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:442)
	at org.apache.hadoop.yarn.client.api.async.impl.AMRMClientAsyncImpl$CallbackHandlerThread.run(AMRMClientAsyncImpl.java:276)
2018-12-09 14:19:33,511 INFO  org.apache.hadoop.yarn.client.api.impl.NMClientImpl           - Running containers cleaned up. Stopping NM proxies.
2018-12-09 14:19:33,512 INFO  org.apache.hadoop.yarn.client.api.impl.NMClientImpl           - Stopped all proxies.
2018-12-09 14:19:33,512 INFO  org.apache.flink.runtime.resourcemanager.slotmanager.SlotManager  - Closing the SlotManager.
2018-12-09 14:19:33,512 INFO  org.apache.flink.runtime.resourcemanager.slotmanager.SlotManager  - Suspending the SlotManager.
2018-12-09 14:19:33,514 INFO  org.apache.flink.runtime.blob.BlobServer                      - Stopped BLOB server at 0.0.0.0:22767
2018-12-09 14:19:33,514 INFO  org.apache.flink.runtime.blob.TransientBlobCache              - Shutting down BLOB cache
2018-12-09 14:19:33,519 INFO  org.apache.flink.runtime.rpc.akka.AkkaRpcService              - Stopping Akka RPC service.
2018-12-09 14:19:33,524 INFO  org.apache.flink.runtime.jobmaster.slotpool.SlotPool          - Stopping SlotPool.
2018-12-09 14:19:33,525 INFO  org.apache.flink.runtime.jobmaster.JobMaster                  - Stopping the JobMaster for job Hadoop WordCount(53e77d649a08e1274a2bcd15ce979515).
2018-12-09 14:19:33,525 INFO  akka.remote.RemoteActorRefProvider$RemotingTerminator         - Shutting down remote daemon.
2018-12-09 14:19:33,527 INFO  akka.remote.RemoteActorRefProvider$RemotingTerminator         - Remote daemon shut down; proceeding with flushing remote transports.
2018-12-09 14:19:33,530 INFO  akka.remote.RemoteActorRefProvider$RemotingTerminator         - Shutting down remote daemon.
2018-12-09 14:19:33,533 INFO  akka.remote.RemoteActorRefProvider$RemotingTerminator         - Remote daemon shut down; proceeding with flushing remote transports.
2018-12-09 14:19:33,555 INFO  akka.remote.RemoteActorRefProvider$RemotingTerminator         - Remoting shut down.
2018-12-09 14:19:33,555 INFO  akka.remote.RemoteActorRefProvider$RemotingTerminator         - Remoting shut down.
2018-12-09 14:19:33,576 INFO  org.apache.flink.runtime.rpc.akka.AkkaRpcService              - Stopped Akka RPC service.
2018-12-09 14:19:33,577 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint         - Terminating cluster entrypoint process YarnSessionClusterEntrypoint with exit code 0.

End of LogType:jobmanager.log

LogType:jobmanager.out
Log Upload Time:Sun Dec 09 14:19:34 +0530 2018
LogLength:0
Log Contents:

End of LogType:jobmanager.out
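
A note on the trace above: the submission fails while the JobManager
deserializes the input format ("Caused by: java.lang.NoClassDefFoundError:
org/apache/hadoop/mapreduce/JobContext", surfacing from
InputFormatVertex.initializeOnMaster). That means the Hadoop MapReduce API
classes are missing from the cluster-side classpath at runtime, not from the
job jar at compile time, which is consistent with them being marked
"provided" in the build. A minimal sketch of one possible fix, assuming a
Hadoop-free Flink 1.7.0 binary and a typical HDP layout (both are
assumptions, adjust paths and jar names to your install):

# Sketch: make the MapReduce client classes loadable by the Flink
# JobManager/TaskManagers started on YARN by placing them in Flink's lib/
# directory, which is shipped with the YARN application.
# The HDP path below is an assumption for a standard install.
cp /usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-core*.jar \
   /home/hdfs/flink-1.7.0/lib/

# Alternative sketch: use a single shaded Hadoop uber jar in lib/ instead of
# individual client jars (the exact artifact name depends on the download):
# cp flink-shaded-hadoop2-uber-*.jar /home/hdfs/flink-1.7.0/lib/

With either variant in place, org.apache.hadoop.mapreduce.JobContext should
resolve when the JobManager deserializes the HadoopInputFormat; exporting
HADOOP_CLASSPATH on the client alone may not be enough here, since YARN's
default container classpath does not necessarily include the MapReduce jars.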
