[https://issues.apache.org/jira/browse/FLINK-15500?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17010812#comment-17010812]

Leonard Xu commented on FLINK-15500:
------------------------------------

[~jark] [~liyu] I spent some time understanding the travis and nightly scripts. The 
conclusion is that this should not be a bug.

Both the latest master branch and the release-1.10 branch can pass e2e tests such as 
test_sql_client_kafka.sh, test_sql_client_kafka010.sh and test_sql_client_kafka011.sh 
in a local environment. We just need to add the parameters 
`-Dinclude-hadoop -Dhadoop.version=2.8.3` when compiling flink, as sketched below. 
(ignore my above comment)
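
For reference, a minimal sketch of the local reproduction steps (the checkout path and the run-single-test.sh helper are assumptions on my part, adjust them to your environment):
{code:bash}
# Build Flink with the bundled hadoop uber jar (flags as mentioned above).
cd flink
mvn clean install -DskipTests -Dinclude-hadoop -Dhadoop.version=2.8.3

# Run one of the Kafka SQL Client e2e tests against the freshly built distribution.
# run-single-test.sh expects FLINK_DIR to point at the distribution directory.
export FLINK_DIR=$(pwd)/build-target
flink-end-to-end-tests/run-single-test.sh \
  flink-end-to-end-tests/test-scripts/test_sql_client_kafka.sh
{code}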
This works because some classes that the `avro` format depends on are contained in the 
hadoop uber jar, e.g. org/apache/avro/io/DatumReader.java:
{code:java}
/Users/bang/sourcecode/project/flink-master/flink/build-target/lib
bang@mac lib (master) $ll
drwxr-xr-x   8 bang  staff        256 Jan  9 00:09 ./
drwxr-xr-x  11 bang  staff        352 Jan  9 00:11 ../
-rw-r--r--   1 bang  staff  109966522 Jan  9 00:09 flink-dist_2.11-1.11-SNAPSHOT.jar
-rw-r--r--   1 bang  staff   43465537 Jan  9 00:09 flink-shaded-hadoop-2-uber-2.8.3-9.0.jar
-rw-r--r--   1 bang  staff   22511690 Jan  9 00:09 flink-table-blink_2.11-1.11-SNAPSHOT.jar
-rw-r--r--   1 bang  staff   19278808 Jan  9 00:09 flink-table_2.11-1.11-SNAPSHOT.jar
-rw-r--r--   1 bang  staff     489884 Jan  9 00:09 log4j-1.2.17.jar
-rw-r--r--   1 bang  staff       9931 Jan  9 00:09 slf4j-log4j12-1.7.15.jar
bang@mac lib (master) $jar -tvf flink-shaded-hadoop-2-uber-2.8.3-9.0.jar|grep DatumReader
   986 Tue Nov 19 21:49:32 CST 2019 org/apache/avro/generic/GenericDatumReader$1.class
  1886 Tue Nov 19 21:49:32 CST 2019 org/apache/avro/generic/GenericDatumReader$2.class
 18135 Tue Nov 19 21:49:32 CST 2019 org/apache/avro/generic/GenericDatumReader.class
   414 Tue Nov 19 21:49:32 CST 2019 org/apache/avro/io/DatumReader.class
 12610 Tue Nov 19 21:49:32 CST 2019 org/apache/avro/reflect/ReflectDatumReader.class
   829 Tue Nov 19 21:49:32 CST 2019 org/apache/avro/specific/SpecificDatumReader$1.class
  5481 Tue Nov 19 21:49:32 CST 2019 org/apache/avro/specific/SpecificDatumReader.class
bang@mac lib (master) $

{code}
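
As a quick cross-check (just a sketch, not part of the test scripts), the same lookup can be run over every jar in lib/ to see which one actually provides the missing class; the directory below is from my checkout and will differ on other machines:
{code:bash}
# Scan all jars in the distribution's lib/ directory for the missing Avro class.
cd build-target/lib
for jar in *.jar; do
  if jar -tf "$jar" 2>/dev/null | grep -q 'org/apache/avro/io/DatumReader.class'; then
    echo "org.apache.avro.io.DatumReader provided by: $jar"
  fi
done
{code}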
As for the travis tests on the flink master branch, I believe these tests can pass too 
once the "Streaming File Sink s3 end-to-end test" is fixed.

So, I think we can close this issue safely.

> "SQL Client end-to-end test for Kafka" failed in my local environment
> ---------------------------------------------------------------------
>
>                 Key: FLINK-15500
>                 URL: https://issues.apache.org/jira/browse/FLINK-15500
>             Project: Flink
>          Issue Type: Bug
>          Components: Table SQL / Client
>    Affects Versions: 1.11.0
>            Reporter: Jark Wu
>            Priority: Critical
>
> The "SQL Client end-to-end test for modern Kafka" (aka. 
> {{test_sql_client_kafka.sh}}) test is failed in my local environment with 
> following exception:
> {code:java}
> Exception in thread "main" org.apache.flink.table.client.SqlClientException: Unexpected exception. This is a bug. Please consider filing an issue.
>       at org.apache.flink.table.client.SqlClient.main(SqlClient.java:190)
> Caused by: org.apache.flink.table.client.gateway.SqlExecutionException: Could not create execution context.
>       at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:759)
>       at org.apache.flink.table.client.gateway.local.LocalExecutor.openSession(LocalExecutor.java:228)
>       at org.apache.flink.table.client.SqlClient.start(SqlClient.java:98)
>       at org.apache.flink.table.client.SqlClient.main(SqlClient.java:178)
> Caused by: java.lang.NoClassDefFoundError: org/apache/avro/io/DatumReader
>       at org.apache.flink.formats.avro.AvroRowFormatFactory.createDeserializationSchema(AvroRowFormatFactory.java:64)
>       at org.apache.flink.streaming.connectors.kafka.KafkaTableSourceSinkFactoryBase.getDeserializationSchema(KafkaTableSourceSinkFactoryBase.java:281)
>       at org.apache.flink.streaming.connectors.kafka.KafkaTableSourceSinkFactoryBase.createStreamTableSource(KafkaTableSourceSinkFactoryBase.java:161)
>       at org.apache.flink.table.factories.StreamTableSourceFactory.createTableSource(StreamTableSourceFactory.java:49)
>       at org.apache.flink.table.client.gateway.local.ExecutionContext.createTableSource(ExecutionContext.java:371)
>       at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$initializeCatalogs$6(ExecutionContext.java:552)
>       at java.util.LinkedHashMap.forEach(LinkedHashMap.java:684)
>       at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeCatalogs(ExecutionContext.java:550)
>       at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeTableEnvironment(ExecutionContext.java:487)
>       at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:159)
>       at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:118)
>       at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:748)
>       ... 3 more
> Caused by: java.lang.ClassNotFoundException: org.apache.avro.io.DatumReader
>       at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
>       at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>       at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>       ... 15 more
> [FAIL] Test script contains errors.
> Checking of logs skipped.
> [FAIL] 'flink-end-to-end-tests/test-scripts/test_sql_client_kafka.sh' failed after 0 minutes and 27 seconds! Test exited with exit code 1
> {code}
> I guess the reason why nightly travis didn't report this is that "e2e - misc - hadoop 2.8" failed on the "Streaming File Sink s3 end-to-end test", which resulted in all the following cases (including the SQL Client end-to-end tests) not being triggered. For example https://api.travis-ci.org/v3/job/633275285/log.txt



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
