githubuserhk commented on issue #1846:
URL: 
https://github.com/apache/incubator-seatunnel/issues/1846#issuecomment-1321086363

   > ### Search before asking
   > * [x]  I had searched in the 
[issues](https://github.com/apache/incubator-seatunnel/issues?q=is%3Aissue+label%3A%22bug%22)
 and found no similar issues.
   > 
   > ### What happened
   > Version 2.1.1: when starting in Spark on YARN mode, it throws org.apache.seatunnel.shade.com.typesafe.config.ConfigException$Missing: No configuration setting found for key 'env'. Have you ever encountered this?
   > 
   > ### SeaTunnel Version
   > 2.1.1
   > 
   > ### SeaTunnel Config
   > ```
   > env {
   >     spark.app.name = "seatunnel_mongo_to_ck_core_device_demo"
   >     # You can set spark configuration here
   >     # see available properties defined by spark: 
https://spark.apache.org/docs/latest/configuration.html#available-properties
   >     spark.executor.instances = 2
   >     spark.executor.cores = 1
   >     spark.executor.memory = "2g"
   > }
   > 
   > source {
   >     mongodb {
   >         readconfig.uri = 
"mongodb://core_user:[email protected]:27018/gizwits_core"
   >         readconfig.database = "gizwits_core"
   >         readconfig.collection = "device"
   >         readconfig.password = "xxx"
   >         readconfig.spark.mongodb.input.partitioner = 
"MongoPaginateBySizePartitioner"
   >         
schema="{\"created_at\":\"date\",\"updated_at\":\"date\",\"product_key\":\"string\",\"did\":\"string\",
 \"passcode\":\"string\", \"mac\":\"string\"}"
   >         result_table_name = "core_device"
   >     }
   > }
   > 
   > transform {
   >    sql {
   >       sql = "SELECT created_at, updated_at, product_key,did,passcode,mac, 
'2022-05-10' as dt FROM core_device"
   >    }
   >    json {
   >       source_field = "created_at"
   >       new_type = "datetime"
   >    }
   >    json {
   >       source_field = "updated_at"
   >       new_type = "datetime"
   >    }
   >    json {
   >       source_field = "dt"
   >       new_type = "date"
   >    }
   > }
   > sink {
   >     clickhouse {
   >         host = "xxx:8123"
   >         database = "test"
   >         table = "ods_core_device"
   >         fields = ["created_at", "updated_at", "product_key", "did", 
"passcode",  "mac", "dt"]
   >         username = "default"
   >     }
   > }
   > ```
   > 
   > ### Running Command
   > ```shell
   > ./bin/start-seatunnel-spark.sh --master yarn --deploy-mode cluster 
--config ./config/to_clickhouse_conf/mongo_2_ck_core_device_demo.conf.template
   > ```
   > 
   > ### Error Exception
   > ```
   > 
   > 22/05/11 11:37:03 ERROR Seatunnel: Exception 
StackTrace:org.apache.seatunnel.shade.com.typesafe.config.ConfigException$Missing:
 No configuration setting found for key 'env'
   >    at 
org.apache.seatunnel.shade.com.typesafe.config.impl.SimpleConfig.findKeyOrNull(SimpleConfig.java:156)
   >    at 
org.apache.seatunnel.shade.com.typesafe.config.impl.SimpleConfig.findOrNull(SimpleConfig.java:174)
   >    at 
org.apache.seatunnel.shade.com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:188)
   >    at 
org.apache.seatunnel.shade.com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:193)
   >    at 
org.apache.seatunnel.shade.com.typesafe.config.impl.SimpleConfig.getObject(SimpleConfig.java:268)
   >    at 
org.apache.seatunnel.shade.com.typesafe.config.impl.SimpleConfig.getConfig(SimpleConfig.java:274)
   >    at 
org.apache.seatunnel.shade.com.typesafe.config.impl.SimpleConfig.getConfig(SimpleConfig.java:41)
   >    at 
org.apache.seatunnel.config.EnvironmentFactory.getEnvironment(EnvironmentFactory.java:47)
   >    at 
org.apache.seatunnel.config.ExecutionContext.<init>(ExecutionContext.java:49)
   >    at 
org.apache.seatunnel.command.spark.SparkTaskExecuteCommand.execute(SparkTaskExecuteCommand.java:44)
   >    at 
org.apache.seatunnel.command.spark.SparkTaskExecuteCommand.execute(SparkTaskExecuteCommand.java:36)
   >    at org.apache.seatunnel.Seatunnel.run(Seatunnel.java:48)
   >    at org.apache.seatunnel.SeatunnelSpark.main(SeatunnelSpark.java:27)
   >    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
   >    at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
   >    at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
   >    at java.lang.reflect.Method.invoke(Method.java:498)
   >    at 
org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:684)
   >  
   > 22/05/11 11:37:03 ERROR Seatunnel: 
   > 
===============================================================================
   > ```
   > 
   > ### Flink or Spark Version
   > spark 2.4.3
   > 
   > ### Java or Scala Version
   > _No response_
   > 
   > ### Screenshots
   > _No response_
   > 
   > ### Are you willing to submit PR?
   > * [ ]  Yes I am willing to submit a PR!
   > 
   > ### Code of Conduct
   > * [x]  I agree to follow this project's [Code of 
Conduct](https://www.apache.org/foundation/policies/conduct)
   
   Hi, has this problem been fixed on the latest dev branch? I just hit it again.
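
   The quoted trace shows the shaded Typesafe Config failing on `getConfig("env")`, which is what happens when the parsed config file is empty or unreadable. In yarn-cluster mode the driver runs on a NodeManager host, so a relative `--config` path like `./config/...` is resolved there, not on the gateway machine where the job was submitted. A hedged first check (the path below is a hypothetical placeholder, not taken from the report):

   ```shell
   # In yarn-cluster mode, relative paths are resolved on the cluster node that
   # runs the driver, not on the submitting machine. Verify the config exists
   # where the driver will look for it. "/abs/path/..." is a placeholder.
   CONF=/abs/path/mongo_2_ck_core_device_demo.conf
   [ -f "$CONF" ] && echo "config found" || echo "config missing on this node"
   ```

   If the file only exists on the gateway, shipping it via spark-submit's `--files` option and referencing it by basename is the usual Spark-level fix; whether `start-seatunnel-spark.sh` in 2.1.1 forwards that option is an assumption to verify against your version.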


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
