-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/54033/
-----------------------------------------------------------

Review request for bigtop.


Bugs: BIGTOP-2604
    https://issues.apache.org/jira/browse/BIGTOP-2604


Repository: bigtop


Description
-------

The Flink sub-module flink-dist has some dependencies that are not properly 
shaded when built with Maven 3.3 or later (due to a Maven bug). The 
instructions on the Flink website say to build the parent project first and 
then build flink-dist again, regardless of Maven version. Bigtop should do 
this in its packaging, since it is not yet clear that the Maven community 
will fix the issue.
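
For reference, the workaround described on the Flink website amounts to the 
two-step build sketched below. This is an illustration of the approach, not 
the literal contents of the patched do-component-build; the Maven flags shown 
are examples:

```shell
# Sketch of the double-build workaround for the Maven >= 3.3 shading bug.
# Flags are illustrative; the actual do-component-build may differ.

# Step 1: build the whole Flink project once. With Maven >= 3.3 the shade
# plugin does not correctly relocate some flink-dist dependencies here.
mvn clean install -DskipTests

# Step 2: rebuild flink-dist on its own so shading runs against the
# artifacts installed in step 1, producing a correctly shaded dist jar.
cd flink-dist
mvn clean install -DskipTests
```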


Diffs
-----

  bigtop-packages/src/common/flink/do-component-build 4fa9102 

Diff: https://reviews.apache.org/r/54033/diff/


Testing
-------

Tested a program that writes to an Amazon Kinesis stream using the 
FlinkKinesisProducer. Tried reading from the stream using a 
FlinkKinesisConsumer and got this exception:
------------------------------------------------------------
 The program finished with the following exception:

org.apache.flink.client.program.ProgramInvocationException: The program execution failed: Job execution failed.
        at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:409)
        at org.apache.flink.yarn.YarnClusterClient.submitJob(YarnClusterClient.java:204)
        at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:382)
        at org.apache.flink.streaming.api.environment.StreamContextEnvironment.execute(StreamContextEnvironment.java:66)
        at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.execute(StreamExecutionEnvironment.java:1429)
        at wikiedits.WikipediaAnalysisConsumer.main(WikipediaAnalysisConsumer.java:98)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:509)
        at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:403)
        at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:320)
        at org.apache.flink.client.CliFrontend.executeProgram(CliFrontend.java:777)
        at org.apache.flink.client.CliFrontend.run(CliFrontend.java:253)
        at org.apache.flink.client.CliFrontend.parseParameters(CliFrontend.java:1005)
        at org.apache.flink.client.CliFrontend.main(CliFrontend.java:1048)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
        at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$8.apply$mcV$sp(JobManager.scala:822)
        at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$8.apply(JobManager.scala:768)
        at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$8.apply(JobManager.scala:768)
        at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
        at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
        at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:41)
        at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:401)
        at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
        at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
        at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
        at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.NoSuchMethodError: org.apache.http.params.HttpConnectionParams.setSoKeepalive(Lorg/apache/http/params/HttpParams;Z)V
        at com.amazonaws.http.HttpClientFactory.createHttpClient(HttpClientFactory.java:96)
        at com.amazonaws.http.AmazonHttpClient.<init>(AmazonHttpClient.java:187)
        at com.amazonaws.AmazonWebServiceClient.<init>(AmazonWebServiceClient.java:136)
        at com.amazonaws.services.kinesis.AmazonKinesisClient.<init>(AmazonKinesisClient.java:221)
        at com.amazonaws.services.kinesis.AmazonKinesisClient.<init>(AmazonKinesisClient.java:197)
        at org.apache.flink.streaming.connectors.kinesis.util.AWSUtil.createKinesisClient(AWSUtil.java:56)
        at org.apache.flink.streaming.connectors.kinesis.proxy.KinesisProxy.<init>(KinesisProxy.java:118)
        at org.apache.flink.streaming.connectors.kinesis.proxy.KinesisProxy.create(KinesisProxy.java:176)
        at org.apache.flink.streaming.connectors.kinesis.internals.KinesisDataFetcher.<init>(KinesisDataFetcher.java:188)
        at org.apache.flink.streaming.connectors.kinesis.FlinkKinesisConsumer.run(FlinkKinesisConsumer.java:198)
        at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:80)
        at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:53)
        at org.apache.flink.streaming.runtime.tasks.SourceStreamTask.run(SourceStreamTask.java:56)
        at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:266)
        at org.apache.flink.runtime.taskmanager.Task.run(Task.java:585)
        at java.lang.Thread.run(Thread.java:745)
Command exiting with ret '1'
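
The NoSuchMethodError above is the classic symptom of an old, unshaded 
httpclient/httpcore leaking onto the classpath. One way to check whether the 
built dist jar actually relocated those classes is sketched below; the jar 
path is taken from the log output further down, and the grep pattern is just 
an example:

```shell
# If shading worked, org.apache.http classes should appear only under a
# relocated package inside the dist jar, not at their original path.
unzip -l /usr/lib/flink/lib/flink-dist_2.10-1.1.3.jar \
  | grep 'HttpConnectionParams'
```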

With the patch applied, the same job runs with no exceptions:


2016-11-22 15:08:32,670 INFO  org.apache.flink.yarn.cli.FlinkYarnSessionCli - No path for the flink jar passed. Using the location of class org.apache.flink.yarn.YarnClusterDescriptor to locate the jar
2016-11-22 15:08:32,733 INFO  org.apache.flink.yarn.YarnClusterDescriptor - Using values:
2016-11-22 15:08:32,736 INFO  org.apache.flink.yarn.YarnClusterDescriptor -   TaskManager count = 1
2016-11-22 15:08:32,736 INFO  org.apache.flink.yarn.YarnClusterDescriptor -   JobManager memory = 1024
2016-11-22 15:08:32,736 INFO  org.apache.flink.yarn.YarnClusterDescriptor -   TaskManager memory = 1024
2016-11-22 15:08:32,756 INFO  org.apache.hadoop.yarn.client.RMProxy - Connecting to ResourceManager at ip-10-113-191-196.ec2.internal/10.113.191.196:8032
2016-11-22 15:08:33,560 INFO  org.apache.flink.yarn.Utils - Copying from file:/usr/lib/flink/lib to hdfs://ip-10-113-191-196.ec2.internal:8020/user/hadoop/.flink/application_1479758686609_0011/lib
2016-11-22 15:08:34,500 INFO  org.apache.flink.yarn.Utils - Copying from file:/etc/flink/conf/log4j.properties to hdfs://ip-10-113-191-196.ec2.internal:8020/user/hadoop/.flink/application_1479758686609_0011/log4j.properties
2016-11-22 15:08:34,528 INFO  org.apache.flink.yarn.Utils - Copying from file:/usr/lib/flink/lib/flink-dist_2.10-1.1.3.jar to hdfs://ip-10-113-191-196.ec2.internal:8020/user/hadoop/.flink/application_1479758686609_0011/flink-dist_2.10-1.1.3.jar
2016-11-22 15:08:35,184 INFO  org.apache.flink.yarn.Utils - Copying from /etc/flink/conf/flink-conf.yaml to hdfs://ip-10-113-191-196.ec2.internal:8020/user/hadoop/.flink/application_1479758686609_0011/flink-conf.yaml
2016-11-22 15:08:35,223 INFO  org.apache.flink.yarn.YarnClusterDescriptor - Submitting application master application_1479758686609_0011
2016-11-22 15:08:35,245 INFO  org.apache.hadoop.yarn.client.api.impl.YarnClientImpl - Submitted application application_1479758686609_0011
2016-11-22 15:08:35,245 INFO  org.apache.flink.yarn.YarnClusterDescriptor - Waiting for the cluster to be allocated
2016-11-22 15:08:35,246 INFO  org.apache.flink.yarn.YarnClusterDescriptor - Deploying cluster, current state ACCEPTED
2016-11-22 15:08:40,517 INFO  org.apache.flink.yarn.YarnClusterDescriptor - YARN application has been deployed successfully.
Cluster started: Yarn cluster with application id application_1479758686609_0011
Using address 10.186.182.74:35132 to connect to JobManager.
JobManager web interface address http://ip-10-113-191-196.ec2.internal:20888/proxy/application_1479758686609_0011/
Using the parallelism provided by the remote cluster (1). To use another parallelism, set it at the ./bin/flink client.
Starting execution of program
2016-11-22 15:08:40,557 INFO  org.apache.flink.yarn.YarnClusterClient - Starting program in interactive mode
2016-11-22 15:08:40,677 INFO  org.apache.flink.yarn.YarnClusterClient - Waiting until all TaskManagers have connected
Waiting until all TaskManagers have connected
2016-11-22 15:08:40,678 INFO  org.apache.flink.yarn.YarnClusterClient - Starting client actor system.
2016-11-22 15:08:41,532 INFO  org.apache.flink.yarn.YarnClusterClient - TaskManager status (0/1)
TaskManager status (0/1)
2016-11-22 15:08:44,406 INFO  org.apache.flink.yarn.YarnClusterClient - TaskManager status (1/1)
TaskManager status (1/1)
2016-11-22 15:08:44,406 INFO  org.apache.flink.yarn.YarnClusterClient - All TaskManagers are connected
All TaskManagers are connected
2016-11-22 15:08:44,407 INFO  org.apache.flink.yarn.YarnClusterClient - Submitting job with JobID: 2ca36526d88930561e204263f9bcb003. Waiting for job completion.
Submitting job with JobID: 2ca36526d88930561e204263f9bcb003. Waiting for job completion.
Connected to JobManager at Actor[akka.tcp://[email protected]:35132/user/jobmanager#-109909372]
11/22/2016 15:08:45     Job execution switched to status RUNNING.
11/22/2016 15:08:45     Source: Custom Source -> Sink: Unnamed(1/1) switched to SCHEDULED
11/22/2016 15:08:45     Source: Custom Source -> Sink: Unnamed(1/1) switched to DEPLOYING
11/22/2016 15:08:46     Source: Custom Source -> Sink: Unnamed(1/1) switched to RUNNING


Thanks,

Craig Foster
