[ https://issues.apache.org/jira/browse/HIVE-8833?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14212720#comment-14212720 ]
Marcelo Vanzin commented on HIVE-8833:
--------------------------------------

bq. SparkClientImpl ignores spark driver parameters while submitting jobs through the SparkSubmit class to a spark standalone cluster, I'm not sure why.

Can you clarify what you mean here? What exactly is the launch path (in-process, the spark client directly executing the SparkSubmit class, or the spark client executing the out-of-process spark-submit script)? In the first two cases, some driver options may not take effect, since the driver will be executing in the same process as the caller. (See the sketch of the out-of-process path after the issue details below.)

> Unify spark client API and implement remote spark client. [Spark Branch]
> -------------------------------------------------------------------------
>
>                 Key: HIVE-8833
>                 URL: https://issues.apache.org/jira/browse/HIVE-8833
>             Project: Hive
>          Issue Type: Sub-task
>          Components: Spark
>            Reporter: Chengxiang Li
>            Assignee: Chengxiang Li
>              Labels: Spark-M3
>         Attachments: HIVE-8833.1-spark.patch, HIVE-8833.2-spark.patch
>
>
> Hive should support submitting Spark jobs through both a local spark client and a remote spark client. We should unify the spark client API and implement the remote spark client through the Remote Spark Context.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
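For reference, a minimal sketch of the out-of-process launch path discussed above: forking the external spark-submit script so that driver-side options (memory, JVM flags) are applied to a freshly started driver JVM rather than the caller's already-running process. This is not the actual SparkClientImpl code; the Spark install path, master URL, driver class, jar path, and option values are illustrative assumptions.

{code:java}
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

public class SparkSubmitLauncherSketch {
  public static void main(String[] args) throws IOException, InterruptedException {
    List<String> cmd = new ArrayList<>();
    cmd.add("/path/to/spark/bin/spark-submit");   // assumed Spark install location
    cmd.add("--master");
    cmd.add("spark://master-host:7077");          // assumed standalone cluster master
    cmd.add("--class");
    cmd.add("org.example.RemoteDriver");          // hypothetical driver main class
    // Driver options take effect here because spark-submit forks a new JVM for
    // the driver; if SparkSubmit were invoked in-process, the JVM would already
    // be running and these settings could not be applied to it.
    cmd.add("--driver-memory");
    cmd.add("4g");
    cmd.add("--driver-java-options");
    cmd.add("-XX:+UseG1GC");
    cmd.add("/path/to/driver.jar");               // assumed application jar

    Process p = new ProcessBuilder(cmd)
        .redirectErrorStream(true)                // merge stderr into stdout for logging
        .start();
    int exit = p.waitFor();
    System.out.println("spark-submit exited with code " + exit);
  }
}
{code}

By contrast, an in-process launch (calling the SparkSubmit class or the spark client directly inside the caller's JVM) can only honor configuration that does not require restarting or resizing the JVM, which is why driver memory and driver JVM options may appear to be ignored on those paths.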