812406210 opened a new issue, #1691:
URL: https://github.com/apache/incubator-seatunnel/issues/1691

   ### Search before asking
   
   - [X] I had searched in the [issues](https://github.com/apache/incubator-seatunnel/issues?q=is%3Aissue+label%3A%22bug%22) and found no similar issues.
   
   
   ### What happened
   
   Running the following command produced the error shown below:
   
   ./bin/start-seatunnel-spark.sh \
   --master local[4] \
   --deploy-mode client \
   --config ./config/spark.streaming.conf.template
   
   ### SeaTunnel Version
   
    apache-seatunnel-incubating-2.1.0
   
   ### SeaTunnel Config
   
   ```conf
   SPARK_HOME=${SPARK_HOME:-/opt/spark}
   ```
   
   
   ### Running Command
   
   ```shell
   ./bin/start-seatunnel-spark.sh \
   --master local[4] \
   --deploy-mode client \
   --config ./config/spark.streaming.conf.template
   ```
   
   
   ### Error Exception
   
   ```log
   22/04/13 15:38:35 INFO scheduler.DAGScheduler: Job 2 finished: take at SparkStreamingExecution.scala:54, took 0.034069 s
   Exception in thread "streaming-job-executor-1" java.lang.NoSuchMethodError: org.apache.spark.sql.UDFRegistration.register(Ljava/lang/String;Lorg/apache/spark/sql/expressions/UserDefinedFunction;)Lorg/apache/spark/sql/expressions/UserDefinedFunction;
           at org.apache.seatunnel.spark.transform.Split.process(Split.scala:57)
           at org.apache.seatunnel.spark.batch.SparkBatchExecution.transformProcess(SparkBatchExecution.java:70)
           at org.apache.seatunnel.spark.stream.SparkStreamingExecution$$anonfun$start$2$$anonfun$apply$1.apply(SparkStreamingExecution.scala:55)
           at org.apache.seatunnel.spark.stream.SparkStreamingExecution$$anonfun$start$2$$anonfun$apply$1.apply(SparkStreamingExecution.scala:53)
           at scala.collection.Iterator$class.foreach(Iterator.scala:893)
           at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
           at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
           at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
           at org.apache.seatunnel.spark.stream.SparkStreamingExecution$$anonfun$start$2.apply(SparkStreamingExecution.scala:53)
           at org.apache.seatunnel.spark.stream.SparkStreamingExecution$$anonfun$start$2.apply(SparkStreamingExecution.scala:45)
           at org.apache.seatunnel.spark.stream.SparkStreamingSource$$anonfun$start$1.apply(SparkStreamingSource.scala:39)
           at org.apache.seatunnel.spark.stream.SparkStreamingSource$$anonfun$start$1.apply(SparkStreamingSource.scala:37)
           at org.apache.spark.streaming.dstream.DStream$$anonfun$foreachRDD$1$$anonfun$apply$mcV$sp$3.apply(DStream.scala:627)
           at org.apache.spark.streaming.dstream.DStream$$anonfun$foreachRDD$1$$anonfun$apply$mcV$sp$3.apply(DStream.scala:627)
           at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ForEachDStream.scala:51)
           at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:51)
           at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:51)
           at org.apache.spark.streaming.dstream.DStream.createRDDWithLocalProperties(DStream.scala:415)
           at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply$mcV$sp(ForEachDStream.scala:50)
           at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:50)
           at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:50)
           at scala.util.Try$.apply(Try.scala:192)
           at org.apache.spark.streaming.scheduler.Job.run(Job.scala:39)
           at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply$mcV$sp(JobScheduler.scala:256)
           at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:256)
           at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:256)
           at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
           at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler.run(JobScheduler.scala:255)
           at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
           at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
           at java.lang.Thread.run(Thread.java:748)
   ^C22/04/13 15:38:35 INFO streaming.StreamingContext: Invoking stop(stopGracefully=false) from shutdown hook
   22/04/13 15:38:35 INFO scheduler.ReceiverTracker: Sent stop signal to all 1 receivers
   ```
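   
   For context, here is a minimal sketch of the kind of call the stack trace points at (`Split.process` registering a Spark SQL UDF). This is not the actual SeaTunnel code; the object name, UDF name, and logic below are assumptions for illustration only. The `UDFRegistration.register(String, UserDefinedFunction)` overload named in the `NoSuchMethodError` does not exist in older Spark 2.x runtimes such as the reported Spark 2.1.2, so a jar compiled against a newer Spark fails at link time when the method is invoked:
   
   ```scala
   import org.apache.spark.sql.SparkSession
   import org.apache.spark.sql.functions.udf
   
   // Hypothetical standalone reproduction of the failing pattern; not SeaTunnel's Split transform.
   object SplitUdfSketch {
     def main(args: Array[String]): Unit = {
       val spark = SparkSession.builder()
         .master("local[4]")
         .appName("split-udf-sketch")
         .getOrCreate()
   
       // A UDF that splits a delimited string into fields, similar in spirit to the Split transform.
       val splitFields = udf((raw: String, delimiter: String) => raw.split(delimiter))
   
       // This resolves to UDFRegistration.register(String, UserDefinedFunction), the method
       // reported missing in the NoSuchMethodError when the job runs on an older Spark runtime.
       spark.udf.register("split_fields", splitFields)
   
       spark.sql("SELECT split_fields('a,b,c', ',') AS fields").show(false)
       spark.stop()
     }
   }
   ```
   
   If this reasoning holds, running the job against the Spark version the SeaTunnel 2.1.0 binaries were built for (a newer 2.x release than 2.1.2) should avoid the linkage error.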
   
   
   ### Flink or Spark Version
   
   spark 2.1.2
   
   ### Java or Scala Version
   
   jdk 1.8
   scala 2.12.10
   
   ### Screenshots
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of Conduct](https://www.apache.org/foundation/policies/conduct)
   

