Hi, why are you using `addFile` for a JSON file? Can't you just read it as a DataFrame?
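A minimal sketch of that suggestion, assuming a running `SparkSession` named `spark` and the HDFS path from the mail below (the `multiLine` option is an assumption, set because config files are often pretty-printed JSON):

```scala
// Sketch only: read the JSON config straight into a DataFrame
// instead of shipping it with addFile. Assumes a SparkSession
// named `spark` and the HDFS path from the original mail.
val df = spark.read
  .option("multiLine", "true") // pretty-printed JSON spans multiple lines
  .json("hdfs://nameservice1/user/jztwk/config.json")

df.printSchema()
```

This reads the file through Hadoop's filesystem API on the executors, so it works identically in client and cluster deploy modes.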
Regards,
Gourav Sengupta

On Fri, Aug 20, 2021 at 4:50 PM igyu <i...@21cn.com> wrote:
> in spark-shell I can run
>
>     val url = "hdfs://nameservice1/user/jztwk/config.json"
>     Spark.sparkContext.addFile(url)
>     val json_str = readLocalFile(SparkFiles.get(url.split("/").last))
>
> but when I make a jar package
>
>     spark-submit --master yarn --deploy-mode cluster --principal jztwk/had...@join.com --keytab /hadoop/app/jztwk.keytab --class com.join.Synctool --jars hdfs://nameservice1/sparklib/* jztsynctools-1.0-SNAPSHOT.jar
>
> I get an error:
>
>     ERROR yarn.Client: Application diagnostics message: User class threw exception: java.io.FileNotFoundException: /hadoop/yarn/nm1/usercache/jztwk/appcache/application_1627287887991_0571/spark-020a769c-6d9c-42ff-9bb2-1407cf6ed0bc/userFiles-1f57a3ed-22fa-4464-84e4-e549685b0d2d/hadoop/yarn/nm1/usercache/jztwk/appcache/application_1627287887991_0571/spark-020a769c-6d9c-42ff-9bb2-1407cf6ed0bc/userFiles-1f57a3ed-22fa-4464-84e4-e549685b0d2d/config.json (No such file or directory)
>
> but
>
> ------------------------------
> igyu
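If `addFile` must be kept, note that the `userFiles-...` directory appears twice in the stack trace above, which usually means `SparkFiles.get` was handed an already-resolved absolute path instead of a bare file name. A hedged sketch of the pattern that behaves the same in `yarn-cluster` mode (`spark` is an assumed `SparkSession`, and `readLocalFile` from the original mail is replaced here with `scala.io.Source` for self-containment):

```scala
// Sketch: distribute a file with addFile, then resolve it by bare
// file name. SparkFiles.get expects only the file *name*; passing
// a full path makes it prepend the staging directory again, which
// produces a doubled path like the one in the error above.
import org.apache.spark.SparkFiles
import scala.io.Source

val url = "hdfs://nameservice1/user/jztwk/config.json"
spark.sparkContext.addFile(url)

// Resolve by bare name on whichever node the driver runs on.
val localPath = SparkFiles.get(url.split("/").last) // "config.json"
val jsonStr = Source.fromFile(localPath).getLines().mkString("\n")
```

In cluster mode the driver runs on an arbitrary YARN node, so any code that builds the local path by hand (rather than through `SparkFiles.get` with the bare name) will point at a directory that does not exist there.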