Refer: https://spark.apache.org/docs/latest/quick-start.html

1. Create a singleton SparkSession when your application initializes; the SparkContext or Spark SQL interface will then be accessible through a static method anywhere in your application. I recommend enabling FAIR scheduling on the context, so resources are shared among all incoming requests.

    SparkSession spark = SparkSession.builder().appName("Simple Application").getOrCreate();
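
A minimal sketch of such a singleton holder with FAIR scheduling enabled (the class name SparkHolder and the double-checked locking are my own illustration, not something from the Spark docs):

    import org.apache.spark.sql.SparkSession;

    // Hypothetical holder class: builds the session once and hands the
    // same instance to every request thread.
    public final class SparkHolder {
        private static volatile SparkSession spark;

        private SparkHolder() {}

        public static SparkSession get() {
            if (spark == null) {
                synchronized (SparkHolder.class) {
                    if (spark == null) {
                        spark = SparkSession.builder()
                                .appName("Simple Application")
                                // share executors fairly among concurrent jobs
                                .config("spark.scheduler.mode", "FAIR")
                                .getOrCreate();
                    }
                }
            }
            return spark;
        }
    }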
2. From now on, with the shared SparkSession (or SparkContext) object, a call such as sparkSql.sql("select * from test").collectAsList() runs as a Spark job and returns the result to your application.
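
To illustrate, a hedged sketch of a request handler built on the holder above (QueryService is a hypothetical name; only sql(...).collectAsList() comes from the example itself):

    import java.util.List;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SparkSession;

    // Hypothetical request-handling service for the web tier.
    public class QueryService {
        public List<Row> runQuery(String sql) {
            // reuse the shared session from step 1
            SparkSession spark = SparkHolder.get();
            // collectAsList() triggers a Spark job on the cluster and
            // brings the resulting rows back to the driver JVM
            return spark.sql(sql).collectAsList();
        }
    }

For example, new QueryService().runQuery("select * from test") would return the rows of the test table to the caller.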
============ Forwarded message ============
From: 崔苗 (Data and AI Product Development Department) <0049003...@znv.com>
To: "user" <user@spark.apache.org>
Date: Thu, 01 Nov 2018 10:52:15 +0330
Subject: use spark cluster in java web service
============ Forwarded message ============
Hi, we want to use Spark in our Java web service to compute data in a Spark cluster according to each request. Now we have two problems:
1. How to get a SparkSession for a remote Spark cluster (Spark on YARN mode)? We want to keep one SparkSession to execute all data computation.
2. How to submit to a remote Spark cluster from Java code instead of spark-submit, as we want to execute the Spark code in the response server?
Thanks for any replies.

0049003208
0049003...@znv.com
