Is there a way to get these set by default in the spark-sql shell?

Thanks,
Chirag
From: Akhil Das <ak...@sigmoidanalytics.com>
Date: Monday, 29 December 2014 5:53 PM
To: Chirag Aggarwal <chirag.aggar...@guavus.com>
Cc: "user@spark.apache.org" <user@spark.apache.org>
Subject: Re: Spark Configurations

I believe that if you use spark-shell or spark-submit, it will pick up the conf from spark-defaults.conf. If you are running an independent application, then you can set all those confs while creating the SparkContext.

Thanks
Best Regards

On Mon, Dec 29, 2014 at 5:40 PM, Chirag Aggarwal <chirag.aggar...@guavus.com> wrote:

Hi,

It seems that spark-defaults.conf is not read by spark-sql. Is it used only by spark-shell?

Thanks,
Chirag
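A minimal sketch of the two approaches discussed in this thread. The property names (spark.executor.memory, spark.serializer) are ordinary Spark settings chosen for illustration, and the file is written to the current directory rather than $SPARK_HOME/conf; the --conf workaround assumes spark-sql forwards options through spark-submit, which may vary by Spark version.

```shell
# 1) Defaults picked up by spark-shell / spark-submit: whitespace-separated
#    key/value pairs in spark-defaults.conf.
cat > spark-defaults.conf <<'EOF'
spark.executor.memory   2g
spark.serializer        org.apache.spark.serializer.KryoSerializer
EOF

# 2) If spark-sql ignores spark-defaults.conf, a possible workaround is to
#    pass each setting explicitly on the command line (illustrative; not run here):
#    spark-sql --conf spark.executor.memory=2g \
#              --conf spark.serializer=org.apache.spark.serializer.KryoSerializer

# Show the entries that were written.
grep '^spark\.' spark-defaults.conf
```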