jiangqingsong opened a new issue, #6506:
URL: https://github.com/apache/seatunnel/issues/6506

   ### Search before asking
   
   - [X] I had searched in the 
[issues](https://github.com/apache/seatunnel/issues?q=is%3Aissue+label%3A%22bug%22)
 and found no similar issues.
   
   
   ### What happened
   
   Failed to synchronize data from Elasticsearch to Hive.
   The full error log, including the SeaTunnel config that was used, is attached:
   [error.txt](https://github.com/apache/seatunnel/files/14599652/error.txt)
   
   In addition, I have verified in my environment that both Es->Console and Mysql->Hive jobs run normally.
   
   
   
   ### SeaTunnel Version
   
   2.3.4
   
   ### SeaTunnel Config
   
   ```conf
   env {
     parallelism = 1
     job.mode = "BATCH"
   }
   
   source {
           Elasticsearch {
        hosts = ["http://localhost:9200"]
           username = "elastic"
           password = ""
           tls_verify_hostname = false
           index = "person"
           source = ["name", "age"]
           query = {"match_all":{}}
   
       }
   }
   
   transform {}
   
   sink {
   
     Hive {
       table_name = "test.person"
       metastore_uri = "thrift://xxx:9083"
     }
   
   }
   ```
   
   
   ### Running Command
   
   ```shell
   ./bin/start-seatunnel-spark-2-connector-v2.sh --master local[4] --deploy-mode client --config ./config/my/erdp_es_2_hive.config --plugin_name hive
   ```
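
   One detail that might matter (hedged as a guess, not a confirmed diagnosis): the command passes `--plugin_name hive` in lowercase, and the parsed config in the error log below likewise shows `"plugin_name" : "hive"`, shortly before sink creation falls into the ServiceLoader fallback and fails. In case plugin lookup is case-sensitive, a variant of the sink block worth trying — same values as in the parsed config, only the plugin name capitalized to match the connector's registered name `Hive`:

   ```conf
   sink {
     Hive {
       table_name = "ods.ods_erdp_web_measurement"
       metastore_uri = "thrift://10.36.191.30:9083"
     }
   }
   ```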
   
   
   ### Error Exception
   
   ```log
   24/03/14 16:39:31 INFO utils.ConfigBuilder: Loading config file from path: 
./config/my/erdp_es_2_hive.config
   24/03/14 16:39:31 INFO utils.ConfigShadeUtils: Load config shade spi: 
[base64]
   24/03/14 16:39:31 INFO utils.ConfigBuilder: Parsed config file: 
   {
       "env" : {
           "parallelism" : 1,
           "job.mode" : "BATCH"
       },
       "source" : [
           {
               "tls_verify_hostname" : false,
               "password" : "elastic@2024",
               "hosts" : [
                "http://10.0.25.150:9200"
               ],
               "query" : {
                   "match_all" : {}
               },
               "index" : "web_measurement",
               "result_table_name" : "t1",
               "source" : [
                   "timestamp",
                   "projectName",
                   "department",
                   "departmentId",
                   "user",
                   "module",
                   "page",
                   "object",
                   "behaviorType",
                   "token"
               ],
               "plugin_name" : "Elasticsearch",
               "username" : "elastic"
           }
       ],
       "transform" : [
           {
               "field_mapper" : {
                   "timestamp" : "timestamp",
                   "projectName" : "project_name",
                   "department" : "department",
                   "departmentId" : "department_id",
                   "user" : "user",
                   "page" : "page",
                   "object" : "object",
                   "behaviorType" : "behavior_type",
                   "token" : "token"
               },
               "source_table_name" : "t1",
               "result_table_name" : "t2",
               "plugin_name" : "FieldMapper"
           }
       ],
       "sink" : [
           {
               "source_table_name" : "t2",
               "table_name" : "ods.ods_erdp_web_measurement",
               "metastore_uri" : "thrift://10.36.191.30:9083",
               "plugin_name" : "hive"
           }
       ]
   }
   
   24/03/14 16:39:31 INFO conf.HiveConf: Found configuration file 
file:/etc/hive/conf.cloudera.hive/hive-site.xml
   24/03/14 16:39:31 INFO spark.SparkContext: Running Spark version 
2.4.0-cdh6.3.2
   24/03/14 16:39:31 INFO logging.DriverLogger: Added a local log appender at: 
/tmp/spark-426fbceb-0f4f-4d34-b704-e0a0dd383a5f/__driver_logs__/driver.log
   24/03/14 16:39:31 INFO spark.SparkContext: Submitted application: SeaTunnel
   24/03/14 16:39:32 INFO spark.SecurityManager: Changing view acls to: bigdata
   24/03/14 16:39:32 INFO spark.SecurityManager: Changing modify acls to: 
bigdata
   24/03/14 16:39:32 INFO spark.SecurityManager: Changing view acls groups to: 
   24/03/14 16:39:32 INFO spark.SecurityManager: Changing modify acls groups 
to: 
   24/03/14 16:39:32 INFO spark.SecurityManager: SecurityManager: 
authentication disabled; ui acls disabled; users  with view permissions: 
Set(bigdata); groups with view permissions: Set(); users  with modify 
permissions: Set(bigdata); groups with modify permissions: Set()
   24/03/14 16:39:32 INFO util.Utils: Successfully started service 
'sparkDriver' on port 40655.
   24/03/14 16:39:32 INFO spark.SparkEnv: Registering MapOutputTracker
   24/03/14 16:39:32 INFO spark.SparkEnv: Registering BlockManagerMaster
   24/03/14 16:39:32 INFO storage.BlockManagerMasterEndpoint: Using 
org.apache.spark.storage.DefaultTopologyMapper for getting topology information
   24/03/14 16:39:32 INFO storage.BlockManagerMasterEndpoint: 
BlockManagerMasterEndpoint up
   24/03/14 16:39:32 INFO storage.DiskBlockManager: Created local directory at 
/tmp/blockmgr-a32b1b1d-8595-4a68-a583-2c07a5dcd6e8
   24/03/14 16:39:32 INFO memory.MemoryStore: MemoryStore started with capacity 
366.3 MB
   24/03/14 16:39:32 INFO spark.SparkEnv: Registering OutputCommitCoordinator
   24/03/14 16:39:32 INFO util.log: Logging initialized @2311ms
   24/03/14 16:39:32 INFO server.Server: jetty-9.3.z-SNAPSHOT, build timestamp: 
2018-09-05T05:11:46+08:00, git hash: 3ce520221d0240229c862b122d2b06c12a625732
   24/03/14 16:39:32 INFO server.Server: Started @2378ms
   24/03/14 16:39:32 WARN util.Utils: Service 'SparkUI' could not bind on port 
4040. Attempting port 4041.
   24/03/14 16:39:32 INFO server.AbstractConnector: Started 
ServerConnector@5f18f9d2{HTTP/1.1,[http/1.1]}{0.0.0.0:4041}
   24/03/14 16:39:32 INFO util.Utils: Successfully started service 'SparkUI' on 
port 4041.
   24/03/14 16:39:32 INFO handler.ContextHandler: Started 
o.s.j.s.ServletContextHandler@4d178d55{/jobs,null,AVAILABLE,@Spark}
   24/03/14 16:39:32 INFO handler.ContextHandler: Started 
o.s.j.s.ServletContextHandler@33364212{/jobs/json,null,AVAILABLE,@Spark}
   24/03/14 16:39:32 INFO handler.ContextHandler: Started 
o.s.j.s.ServletContextHandler@2216effc{/jobs/job,null,AVAILABLE,@Spark}
   24/03/14 16:39:32 INFO handler.ContextHandler: Started 
o.s.j.s.ServletContextHandler@6da9dc6{/jobs/job/json,null,AVAILABLE,@Spark}
   24/03/14 16:39:32 INFO handler.ContextHandler: Started 
o.s.j.s.ServletContextHandler@7fd69dd{/stages,null,AVAILABLE,@Spark}
   24/03/14 16:39:32 INFO handler.ContextHandler: Started 
o.s.j.s.ServletContextHandler@12010fd1{/stages/json,null,AVAILABLE,@Spark}
   24/03/14 16:39:32 INFO handler.ContextHandler: Started 
o.s.j.s.ServletContextHandler@7c84195{/stages/stage,null,AVAILABLE,@Spark}
   24/03/14 16:39:32 INFO handler.ContextHandler: Started 
o.s.j.s.ServletContextHandler@23940f86{/stages/stage/json,null,AVAILABLE,@Spark}
   24/03/14 16:39:32 INFO handler.ContextHandler: Started 
o.s.j.s.ServletContextHandler@66153688{/stages/pool,null,AVAILABLE,@Spark}
   24/03/14 16:39:32 INFO handler.ContextHandler: Started 
o.s.j.s.ServletContextHandler@455824ad{/stages/pool/json,null,AVAILABLE,@Spark}
   24/03/14 16:39:32 INFO handler.ContextHandler: Started 
o.s.j.s.ServletContextHandler@7318daf8{/storage,null,AVAILABLE,@Spark}
   24/03/14 16:39:32 INFO handler.ContextHandler: Started 
o.s.j.s.ServletContextHandler@70f31322{/storage/json,null,AVAILABLE,@Spark}
   24/03/14 16:39:32 INFO handler.ContextHandler: Started 
o.s.j.s.ServletContextHandler@3f1ddac2{/storage/rdd,null,AVAILABLE,@Spark}
   24/03/14 16:39:32 INFO handler.ContextHandler: Started 
o.s.j.s.ServletContextHandler@3be4fcc0{/storage/rdd/json,null,AVAILABLE,@Spark}
   24/03/14 16:39:32 INFO handler.ContextHandler: Started 
o.s.j.s.ServletContextHandler@e1e2e5e{/environment,null,AVAILABLE,@Spark}
   24/03/14 16:39:32 INFO handler.ContextHandler: Started 
o.s.j.s.ServletContextHandler@661c46bc{/environment/json,null,AVAILABLE,@Spark}
   24/03/14 16:39:32 INFO handler.ContextHandler: Started 
o.s.j.s.ServletContextHandler@37864b77{/executors,null,AVAILABLE,@Spark}
   24/03/14 16:39:32 INFO handler.ContextHandler: Started 
o.s.j.s.ServletContextHandler@2b98b3bb{/executors/json,null,AVAILABLE,@Spark}
   24/03/14 16:39:32 INFO handler.ContextHandler: Started 
o.s.j.s.ServletContextHandler@540b0448{/executors/threadDump,null,AVAILABLE,@Spark}
   24/03/14 16:39:32 INFO handler.ContextHandler: Started 
o.s.j.s.ServletContextHandler@50a691d3{/executors/threadDump/json,null,AVAILABLE,@Spark}
   24/03/14 16:39:32 INFO handler.ContextHandler: Started 
o.s.j.s.ServletContextHandler@557eb543{/static,null,AVAILABLE,@Spark}
   24/03/14 16:39:32 INFO handler.ContextHandler: Started 
o.s.j.s.ServletContextHandler@99407c2{/,null,AVAILABLE,@Spark}
   24/03/14 16:39:32 INFO handler.ContextHandler: Started 
o.s.j.s.ServletContextHandler@6c796cc1{/api,null,AVAILABLE,@Spark}
   24/03/14 16:39:32 INFO handler.ContextHandler: Started 
o.s.j.s.ServletContextHandler@128c502c{/jobs/job/kill,null,AVAILABLE,@Spark}
   24/03/14 16:39:32 INFO handler.ContextHandler: Started 
o.s.j.s.ServletContextHandler@45667d98{/stages/stage/kill,null,AVAILABLE,@Spark}
   24/03/14 16:39:32 INFO ui.SparkUI: Bound SparkUI to 0.0.0.0, and started at 
http://cdh1:4041
   24/03/14 16:39:32 INFO spark.SparkContext: Added JAR 
file:///opt/module/seatunnel/apache-seatunnel-2.3.4/lib/seatunnel-transforms-v2.jar
 at spark://cdh1:40655/jars/seatunnel-transforms-v2.jar with timestamp 
1710405572511
   24/03/14 16:39:32 INFO spark.SparkContext: Added JAR 
file:///opt/module/seatunnel/apache-seatunnel-2.3.4/lib/seatunnel-hadoop3-3.1.4-uber.jar
 at spark://cdh1:40655/jars/seatunnel-hadoop3-3.1.4-uber.jar with timestamp 
1710405572512
   24/03/14 16:39:32 INFO spark.SparkContext: Added JAR 
file:///opt/module/seatunnel/apache-seatunnel-2.3.4/connectors/connector-elasticsearch-2.3.4.jar
 at spark://cdh1:40655/jars/connector-elasticsearch-2.3.4.jar with timestamp 
1710405572512
   24/03/14 16:39:32 INFO spark.SparkContext: Added JAR 
file:///opt/module/seatunnel/apache-seatunnel-2.3.4/connectors/connector-hive-2.3.4.jar
 at spark://cdh1:40655/jars/connector-hive-2.3.4.jar with timestamp 
1710405572512
   24/03/14 16:39:32 INFO spark.SparkContext: Added JAR 
file:/opt/module/seatunnel/apache-seatunnel-2.3.4/starter/seatunnel-spark-2-starter.jar
 at spark://cdh1:40655/jars/seatunnel-spark-2-starter.jar with timestamp 
1710405572512
   24/03/14 16:39:32 INFO executor.Executor: Starting executor ID driver on 
host localhost
   24/03/14 16:39:32 INFO util.Utils: Successfully started service 
'org.apache.spark.network.netty.NettyBlockTransferService' on port 43782.
   24/03/14 16:39:32 INFO netty.NettyBlockTransferService: Server created on 
cdh1:43782
   24/03/14 16:39:32 INFO storage.BlockManager: Using 
org.apache.spark.storage.RandomBlockReplicationPolicy for block replication 
policy
   24/03/14 16:39:32 INFO storage.BlockManagerMaster: Registering BlockManager 
BlockManagerId(driver, cdh1, 43782, None)
   24/03/14 16:39:32 INFO storage.BlockManagerMasterEndpoint: Registering block 
manager cdh1:43782 with 366.3 MB RAM, BlockManagerId(driver, cdh1, 43782, None)
   24/03/14 16:39:32 INFO storage.BlockManagerMaster: Registered BlockManager 
BlockManagerId(driver, cdh1, 43782, None)
   24/03/14 16:39:32 INFO storage.BlockManager: external shuffle service port = 
7337
   24/03/14 16:39:32 INFO storage.BlockManager: Initialized BlockManager: 
BlockManagerId(driver, cdh1, 43782, None)
   24/03/14 16:39:32 INFO handler.ContextHandler: Started 
o.s.j.s.ServletContextHandler@4110765e{/metrics/json,null,AVAILABLE,@Spark}
   24/03/14 16:39:33 INFO scheduler.EventLoggingListener: Logging events to 
hdfs://cdh1:8020/user/spark/applicationHistory/local-1710405572561
   24/03/14 16:39:33 WARN lineage.LineageWriter: Lineage directory 
/data/var/log/spark/lineage doesn't exist or is not writable. Lineage for this 
application will be disabled.
   24/03/14 16:39:33 INFO util.Utils: Extension 
com.cloudera.spark.lineage.NavigatorAppListener not being initialized.
   24/03/14 16:39:33 INFO logging.DriverLogger$DfsAsyncWriter: Started driver 
log file sync to: /user/spark/driverLogs/local-1710405572561_driver.log
   24/03/14 16:39:33 INFO discovery.AbstractPluginDiscovery: Load 
SeaTunnelSource Plugin from 
/opt/module/seatunnel/apache-seatunnel-2.3.4/connectors
   24/03/14 16:39:33 INFO discovery.AbstractPluginDiscovery: Load Factory 
Plugin from /opt/module/seatunnel/apache-seatunnel-2.3.4/connectors
   24/03/14 16:39:33 INFO discovery.AbstractPluginDiscovery: Discovery plugin 
jar for: PluginIdentifier{engineType='seatunnel', pluginType='source', 
pluginName='Elasticsearch'} at: 
file:/opt/module/seatunnel/apache-seatunnel-2.3.4/connectors/connector-elasticsearch-2.3.4.jar
   24/03/14 16:39:33 INFO discovery.AbstractPluginDiscovery: Load plugin: 
PluginIdentifier{engineType='seatunnel', pluginType='source', 
pluginName='Elasticsearch'} from classpath
   24/03/14 16:39:33 INFO client.EsRestClient: GET /web_measurement/_mappings respnse={"web_measurement":{"mappings":{"properties":{"_class":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"behaviorType":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"department":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"departmentId":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"event":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"extension":{"properties":{"accurate":{"type":"long"},"appID":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"eventType":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"height":{"type":"long"},"innerHTML":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"left":{"type":"long"},"outerHTML":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"pageHeight":{"type":"long"},"pageURL":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"paths":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"scrollTop":{"type":"long"},"startTime":{"type":"float"},"subType":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"target":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"token":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"top":{"type":"long"},"type":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"useId":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"uuid":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"viewport":{"properties":{"height":{"type":"long"},"width":{"type":"long"}}},"width":{"type":"long"}}},"id":{"type":"long"},"module":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"object":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"page":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"pageURL":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"projectName":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"referrer":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"timestamp":{"type":"long"},"token":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"user":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}}}}}}
   24/03/14 16:39:33 INFO execution.SparkRuntimeEnvironment: register plugins 
:[file:/opt/module/seatunnel/apache-seatunnel-2.3.4/connectors/connector-elasticsearch-2.3.4.jar]
   24/03/14 16:39:33 INFO discovery.AbstractPluginDiscovery: Load 
SeaTunnelTransform Plugin from /opt/module/seatunnel/apache-seatunnel-2.3.4/lib
   24/03/14 16:39:33 INFO execution.SparkRuntimeEnvironment: register plugins 
:[]
   24/03/14 16:39:33 INFO discovery.AbstractPluginDiscovery: Load Factory 
Plugin from /opt/module/seatunnel/apache-seatunnel-2.3.4/connectors
   24/03/14 16:39:33 INFO discovery.AbstractPluginDiscovery: Load SeaTunnelSink 
Plugin from /opt/module/seatunnel/apache-seatunnel-2.3.4/connectors
   24/03/14 16:39:33 INFO discovery.AbstractPluginDiscovery: Discovery plugin 
jar for: PluginIdentifier{engineType='seatunnel', pluginType='sink', 
pluginName='hive'} at: 
file:/opt/module/seatunnel/apache-seatunnel-2.3.4/connectors/connector-hive-2.3.4.jar
   24/03/14 16:39:33 INFO discovery.AbstractPluginDiscovery: Load plugin: 
PluginIdentifier{engineType='seatunnel', pluginType='sink', pluginName='hive'} 
from classpath
   24/03/14 16:39:33 INFO execution.SparkRuntimeEnvironment: register plugins 
:[file:/opt/module/seatunnel/apache-seatunnel-2.3.4/connectors/connector-hive-2.3.4.jar]
   24/03/14 16:39:33 INFO internal.SharedState: loading hive config file: 
file:/etc/hive/conf.cloudera.hive/hive-site.xml
   24/03/14 16:39:33 INFO internal.SharedState: spark.sql.warehouse.dir is not 
set, but hive.metastore.warehouse.dir is set. Setting spark.sql.warehouse.dir 
to the value of hive.metastore.warehouse.dir ('/user/hive/warehouse').
   24/03/14 16:39:33 INFO internal.SharedState: Warehouse path is 
'/user/hive/warehouse'.
   24/03/14 16:39:33 INFO handler.ContextHandler: Started 
o.s.j.s.ServletContextHandler@4776e209{/SQL,null,AVAILABLE,@Spark}
   24/03/14 16:39:33 INFO handler.ContextHandler: Started 
o.s.j.s.ServletContextHandler@265a094b{/SQL/json,null,AVAILABLE,@Spark}
   24/03/14 16:39:33 INFO handler.ContextHandler: Started 
o.s.j.s.ServletContextHandler@54c425b1{/SQL/execution,null,AVAILABLE,@Spark}
   24/03/14 16:39:33 INFO handler.ContextHandler: Started 
o.s.j.s.ServletContextHandler@50b734c4{/SQL/execution/json,null,AVAILABLE,@Spark}
   24/03/14 16:39:33 INFO handler.ContextHandler: Started 
o.s.j.s.ServletContextHandler@b14b60a{/static/sql,null,AVAILABLE,@Spark}
   24/03/14 16:39:34 INFO state.StateStoreCoordinatorRef: Registered 
StateStoreCoordinator endpoint
   24/03/14 16:39:34 WARN lineage.LineageWriter: Lineage directory 
/data/var/log/spark/lineage doesn't exist or is not writable. Lineage for this 
application will be disabled.
   24/03/14 16:39:34 INFO util.Utils: Extension 
com.cloudera.spark.lineage.NavigatorQueryListener not being initialized.
   24/03/14 16:39:35 INFO hive.HiveUtils: Initializing HiveMetastoreConnection 
version 2.1 using Spark classes.
   24/03/14 16:39:35 INFO conf.HiveConf: Found configuration file 
file:/etc/hive/conf.cloudera.hive/hive-site.xml
   24/03/14 16:39:35 INFO session.SessionState: Created HDFS directory: 
/tmp/hive/bigdata/0c77058d-a63d-4634-b205-89e6db8702bd
   24/03/14 16:39:35 INFO session.SessionState: Created local directory: 
/tmp/bigdata/0c77058d-a63d-4634-b205-89e6db8702bd
   24/03/14 16:39:35 INFO session.SessionState: Created HDFS directory: 
/tmp/hive/bigdata/0c77058d-a63d-4634-b205-89e6db8702bd/_tmp_space.db
   24/03/14 16:39:35 INFO client.HiveClientImpl: Warehouse location for Hive 
client (version 2.1.1) is /user/hive/warehouse
   24/03/14 16:39:36 INFO hive.metastore: HMS client filtering is enabled.
   24/03/14 16:39:36 INFO hive.metastore: Trying to connect to metastore with 
URI thrift://cdh1:9083
   24/03/14 16:39:36 INFO hive.metastore: Opened a connection to metastore, 
current connections: 1
   24/03/14 16:39:36 INFO hive.metastore: Connected to metastore.
   24/03/14 16:39:36 INFO discovery.AbstractPluginDiscovery: Load SeaTunnelSink 
Plugin from /opt/module/seatunnel/apache-seatunnel-2.3.4/connectors
   Exception in thread "main" java.util.ServiceConfigurationError: 
org.apache.seatunnel.api.sink.SeaTunnelSink: Provider 
org.apache.seatunnel.connectors.seatunnel.elasticsearch.sink.ElasticsearchSink 
could not be instantiated
           at java.util.ServiceLoader.fail(ServiceLoader.java:232)
           at java.util.ServiceLoader.access$100(ServiceLoader.java:185)
           at 
java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:384)
           at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
           at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
           at 
org.apache.seatunnel.plugin.discovery.AbstractPluginDiscovery.loadPluginInstance(AbstractPluginDiscovery.java:302)
           at 
org.apache.seatunnel.plugin.discovery.AbstractPluginDiscovery.createOptionalPluginInstance(AbstractPluginDiscovery.java:183)
           at 
org.apache.seatunnel.plugin.discovery.AbstractPluginDiscovery.createPluginInstance(AbstractPluginDiscovery.java:227)
           at 
org.apache.seatunnel.plugin.discovery.AbstractPluginDiscovery.createPluginInstance(AbstractPluginDiscovery.java:171)
           at 
org.apache.seatunnel.core.starter.spark.execution.SinkExecuteProcessor.fallbackCreateSink(SinkExecuteProcessor.java:176)
           at 
org.apache.seatunnel.core.starter.spark.execution.SinkExecuteProcessor.execute(SinkExecuteProcessor.java:120)
           at 
org.apache.seatunnel.core.starter.spark.execution.SparkExecution.execute(SparkExecution.java:71)
           at 
org.apache.seatunnel.core.starter.spark.command.SparkTaskExecuteCommand.execute(SparkTaskExecuteCommand.java:60)
           at org.apache.seatunnel.core.starter.SeaTunnel.run(SeaTunnel.java:40)
           at 
org.apache.seatunnel.core.starter.spark.SeaTunnelSpark.main(SeaTunnelSpark.java:35)
           at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
           at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
           at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
           at java.lang.reflect.Method.invoke(Method.java:498)
           at 
org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
           at 
org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:851)
           at 
org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
           at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
           at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
           at 
org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:926)
           at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:935)
           at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
   Caused by: java.lang.InstantiationException: 
org.apache.seatunnel.connectors.seatunnel.elasticsearch.sink.ElasticsearchSink
           at java.lang.Class.newInstance(Class.java:427)
           at 
java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:380)
           ... 24 more
   Caused by: java.lang.NoSuchMethodException: 
org.apache.seatunnel.connectors.seatunnel.elasticsearch.sink.ElasticsearchSink.<init>()
           at java.lang.Class.getConstructor0(Class.java:3082)
           at java.lang.Class.newInstance(Class.java:412)
           ... 25 more
   24/03/14 16:39:36 INFO spark.SparkContext: Invoking stop() from shutdown hook
   24/03/14 16:39:36 INFO server.AbstractConnector: Stopped 
Spark@5f18f9d2{HTTP/1.1,[http/1.1]}{0.0.0.0:4041}
   24/03/14 16:39:36 INFO ui.SparkUI: Stopped Spark web UI at http://cdh1:4041
   24/03/14 16:39:36 INFO spark.MapOutputTrackerMasterEndpoint: 
MapOutputTrackerMasterEndpoint stopped!
   24/03/14 16:39:36 INFO memory.MemoryStore: MemoryStore cleared
   24/03/14 16:39:36 INFO storage.BlockManager: BlockManager stopped
   24/03/14 16:39:36 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
   24/03/14 16:39:36 INFO 
scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: 
OutputCommitCoordinator stopped!
   24/03/14 16:39:36 INFO spark.SparkContext: Successfully stopped SparkContext
   24/03/14 16:39:36 INFO util.ShutdownHookManager: Shutdown hook called
   24/03/14 16:39:36 INFO util.ShutdownHookManager: Deleting directory 
/tmp/spark-eba84842-a375-450e-8aa2-604c616f818f
   24/03/14 16:39:36 INFO util.ShutdownHookManager: Deleting directory 
/tmp/spark-426fbceb-0f4f-4d34-b704-e0a0dd383a5f
   ```
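
   For context on the tail of the stack trace: `Caused by: InstantiationException ... Caused by: NoSuchMethodException: ...ElasticsearchSink.<init>()` is the Java 8 `ServiceLoader` path, which instantiates providers via `Class.newInstance()` and therefore requires an accessible no-arg constructor. A minimal, self-contained sketch (hypothetical class names, not SeaTunnel code) that reproduces this exception shape:

   ```java
   // Minimal sketch (hypothetical class, not SeaTunnel code) of the failure
   // mode in the stack trace: on Java 8, ServiceLoader$LazyIterator builds
   // providers with Class.newInstance(), which requires an accessible no-arg
   // constructor and otherwise throws InstantiationException with
   // NoSuchMethodException as its cause.
   public class NoArgCtorDemo {

       // Stand-in for a sink class whose only constructors take arguments.
       public static class SinkWithArgsOnly {
           public SinkWithArgsOnly(String config) { }
       }

       public static void main(String[] args) throws Exception {
           try {
               SinkWithArgsOnly.class.newInstance();
               System.out.println("instantiated");
           } catch (InstantiationException e) {
               // Prints: InstantiationException caused by NoSuchMethodException
               System.out.println(e.getClass().getSimpleName()
                       + " caused by " + e.getCause().getClass().getSimpleName());
           }
       }
   }
   ```

   If that is what happens here, the real question may be why the job reached `fallbackCreateSink` and started iterating every `SeaTunnelSink` provider at all, instead of resolving the Hive sink factory directly.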
   
   
   ### Zeta or Flink or Spark Version
   
   spark version: 2.4.0-cdh6.3.2
   
   
   ### Java or Scala Version
   
   java version: 1.8.0_181
   
   ### Screenshots
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [X] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://www.apache.org/foundation/policies/conduct)
   

