Re: Running spark Java on yarn cluster

2016-08-10 Thread atulp
Thanks Mandar.

Our requirement is to receive SQL queries from clients and run them on a Spark
cluster. We don't want a new application to be submitted for each query; we
want the executors to be shared across queries, because we would cache RDDs
that are reused between queries.

If I understand correctly, a SparkContext corresponds to an application, and
an application can also be used interactively. So I was thinking of creating a
server that holds a SparkContext pointing at the YARN cluster and uses that
context to run multiple queries over time.
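
A minimal sketch of that idea, assuming Spark 2.x and that
HADOOP_CONF_DIR/YARN_CONF_DIR point at the cluster configuration; the class
name, file path, and table name here are placeholders, not part of any
existing setup:

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

// One long-lived SparkSession (and hence one YARN application) is created at
// startup and reused for every incoming query, so executors and cached data
// are shared across queries instead of a new application being submitted
// per query.
public class SharedSessionServer {

    private final SparkSession spark = SparkSession.builder()
            .appName("shared-sql-server")
            .master("yarn")   // assumes HADOOP_CONF_DIR/YARN_CONF_DIR are set
            .getOrCreate();

    public SharedSessionServer() {
        // Load and cache the data once; later queries hit the cached table.
        Dataset<Row> employees = spark.read().json("/spark/sparkSql/employee.json");
        employees.createOrReplaceTempView("employee");
        spark.catalog().cacheTable("employee");
    }

    // Every client query runs through the same session: no new submission,
    // and the cached "employee" table stays resident in the executors.
    public Dataset<Row> runQuery(String sql) {
        return spark.sql(sql);
    }

    public static void main(String[] args) {
        SharedSessionServer server = new SharedSessionServer();
        server.runQuery("SELECT * FROM employee").show();
        server.runQuery("SELECT count(*) FROM employee").show();
    }
}

Because every query goes through the same SparkSession, nothing new is
submitted to YARN per query and the cached table survives between queries.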

Can you suggest a way to achieve interactive queries with executors reused
across queries?

Thanks,
Atul






Running spark Java on yarn cluster

2016-08-10 Thread atulp
Hi Team,

I am new to Spark and writing my first program. I have written a sample
program with the Spark master set to local. To run Spark on a local YARN
cluster, what should the value of the spark.master property be? Can I point it
at a remote YARN cluster? I would like to run this as a plain Java application
rather than submitting it with spark-submit.

The main objective is to create a service that can execute Spark SQL queries
on a YARN cluster.


Thanks in advance.

Regards,
Atul
 

Code snippet below:

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class TestSpark {
    public static void main(String[] args) {
        // Local-mode session; spark.master has to change to run on YARN.
        SparkSession spark = SparkSession.builder()
                .appName("Java Spark Sql Example")
                .config("spark.master", "local")
                .getOrCreate();

        // Read a JSON file into a DataFrame, cache it, and print it.
        Dataset<Row> df = spark.read().json("/spark/sparkSql/employee.json");
        System.out.println("Data");
        df.cache();
        df.show();

        // Earlier attempt via the RDD API, kept for reference:
        // JavaSparkContext sc = new JavaSparkContext(new
        //     SparkConf().setAppName("SparkJoins").setMaster("yarn-client"));
    }
}
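
A possible variant that points the same snippet at a YARN cluster instead of
local mode (a sketch only; it assumes Spark 2.x, where the "yarn" master plus
spark.submit.deployMode replaces the old "yarn-client"/"yarn-cluster" master
values, and that HADOOP_CONF_DIR points at the cluster configuration so the
application can locate the ResourceManager):

import org.apache.spark.sql.SparkSession;

public class TestSparkOnYarn {
    public static void main(String[] args) {
        // "yarn" is the master; the deploy mode (client vs. cluster) is a
        // separate setting in Spark 2.x.
        SparkSession spark = SparkSession.builder()
                .appName("Java Spark Sql Example")
                .master("yarn")
                .config("spark.submit.deployMode", "client")
                .getOrCreate();

        spark.read().json("/spark/sparkSql/employee.json").show();
        spark.stop();
    }
}

The application jars also have to be visible to YARN (for example via
spark.yarn.jars), which spark-submit would normally handle for you.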



