[ https://issues.apache.org/jira/browse/SPARK-15600?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15303876#comment-15303876 ]

Apache Spark commented on SPARK-15600:
--------------------------------------

User 'zjffdu' has created a pull request for this issue:
https://github.com/apache/spark/pull/13357

> Make local mode as default mode
> -------------------------------
>
>                 Key: SPARK-15600
>                 URL: https://issues.apache.org/jira/browse/SPARK-15600
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>            Reporter: Jeff Zhang
>            Priority: Minor
>
> Usually I write a Spark application in an IDE and run it locally on a 
> small dataset. But I have to specify the master as local, otherwise I get 
> the error below. And if I then want to run it on a large dataset in a 
> cluster, I have to remove the {{SparkConf.setMaster("local[4]")}} call, 
> which is inconvenient. So I created this ticket to propose that when no 
> master is specified, local mode is used by default. 
> {code}
> org.apache.spark.SparkException: A master URL must be set in your configuration
>       at org.apache.spark.SparkContext.<init>(SparkContext.scala:401)
>       at com.zjffdu.tutorial.spark.kaggle.crime.SFCrime$.main(SFCrime.scala:14)
>       at com.zjffdu.tutorial.spark.kaggle.crime.SFCrime.main(SFCrime.scala)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>       at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:497)
>       at com.intellij.rt.execution.application.AppMain.main(AppMain.java:140)
> 16/05/27 14:31:33 INFO spark.SparkContext: Successfully stopped SparkContext
> {code}
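
For anyone hitting this before such a change lands, a minimal sketch of the
usual workaround: {{SparkConf.setIfMissing}} (part of the public SparkConf
API) sets the master only when none was supplied, so a {{--master}} flag
passed to spark-submit still wins on a cluster. The object and app names
below are illustrative, not taken from the reporter's code:

{code}
import org.apache.spark.{SparkConf, SparkContext}

object CrimeApp {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("CrimeApp")
      // Applies only when no master was set via spark-submit or system
      // properties, so the same jar runs in the IDE and on a cluster.
      .setIfMissing("spark.master", "local[4]")
    val sc = new SparkContext(conf)
    // ... application logic ...
    sc.stop()
  }
}
{code}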


