Hi Karan,

Griffin submits jobs to Livy through the "batches" endpoint; you can try a plain Spark sample submission against it first as a test, which should work. Griffin doesn't support the "--negotiate" parameter yet, we only support some basic ones. Griffin reads the parameters from sparkJob.properties and turns them into the HTTP request here:
https://github.com/apache/incubator-griffin/blob/master/service/src/main/java/org/apache/griffin/core/job/SparkSubmitJob.java#L157
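For reference, a batch submission against a Kerberized Livy usually looks something like the command below. This is only a sketch: the host, jar path and args are placeholders taken from your properties file, and it assumes you already hold a valid Kerberos ticket (kinit) on the machine running curl.

  curl --negotiate -u : -X POST \
    -H "Content-Type: application/json" \
    -d '{
          "file": "hdfs://OURCLUSTER/griffin/griffin-measure.jar",
          "className": "org.apache.griffin.measure.Application",
          "args": ["hdfs://OURCLUSTER/griffin/json/env.json", "hdfs,raw"],
          "name": "griffin",
          "queue": "default",
          "numExecutors": 2,
          "executorCores": 1,
          "driverMemory": "1g",
          "executorMemory": "1g"
        }' \
    http://LIVY_SERVER_HOST:8998/batches

If that works from the shell, then the 401 you see from the service is most likely just because Griffin's own HTTP client doesn't do the Negotiate handshake when it posts the same JSON.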
You can just set the "negotiate" option directly in the code, or read it from a parameter like the others (a rough sketch of the code-side approach is at the bottom of this mail). We'll try to support all the Livy parameters in Griffin later on.

Thanks,
Lionel

On Mon, Apr 30, 2018 at 2:46 PM, Karan Gupta <[email protected]> wrote:
> Hi Lionel,
>
> I am able to post to Livy directly through curl, to the "sessions"
> endpoint. It succeeds and a session gets created.
>
> I have not tested the "batches" endpoint.
> Can you share a sample curl query? We use Kerberos, so we pass
> "--negotiate" on our curl command line.
>
> In sparkJob.properties, we have set the Livy URL.
> Is there anything else that we need to set?
> We have also uploaded /etc/spark/conf/hive-site.xml to /griffin in HDFS
> and configured sparkJob.properties with this information. Please find
> our sparkJob.properties below.
>
> sparkJob.file=hdfs://OURCLUSTER/griffin/griffin-measure.jar
> sparkJob.className=org.apache.griffin.measure.Application
> sparkJob.args_1=hdfs://OURCLUSTER/griffin/json/env.json
> sparkJob.args_3=hdfs,raw
>
> sparkJob.name=griffin
> sparkJob.queue=default
>
> # options
> sparkJob.numExecutors=2
> sparkJob.executorCores=1
> sparkJob.driverMemory=1g
> sparkJob.executorMemory=1g
>
> # shouldn't config in server, but in
> sparkJob.jars = hdfs://OURCLUSTER/griffin/datanucleus-api-jdo-3.2.6.jar;\
>   hdfs://OURCLUSTER/griffin/datanucleus-core-3.2.10.jar;\
>   hdfs://OURCLUSTER/griffin/datanucleus-rdbms-3.2.9.jar
>
> spark.yarn.dist.files = hdfs://OURCLUSTER/griffin/hive-site.xml
>
> # livy
> # livy.uri=http://10.9.246.187:8998/batches
> livy.uri=http://LIVY_SERVER_HOST:8998/batches
>
> # spark-admin
> # spark.uri=http://10.149.247.156:28088
> # spark.uri=http://10.9.246.187:8088
> spark.uri=http://YARN_RESOURCE_MANAGER_HOST:8088
>
> From: [email protected] <[email protected]> On Behalf Of Lionel Liu
> Sent: Friday, April 27, 2018 8:40 PM
> To: Karan Gupta <[email protected]>
> Cc: [email protected]
> Subject: Re: Livy Error
>
> Hi Karan,
>
> I need some more information: how did you configure sparkJob.properties?
> Can you post to Livy directly with curl?
> Maybe this will help you:
> https://stackoverflow.com/questions/46909048/livy-rest-api-get-requests-work-but-post-requests-fail-with-401-authentication
>
> --
> Regards,
> Lionel, Liu
>
> At 2018-04-27 17:37:02, "Karan Gupta" <[email protected]> wrote:
>
> Hi Lionel,
>
> I am trying to submit a Spark job, but the job does not get submitted and
> I see the following error message on the console at the scheduled time:
>
> 2018-04-27 05:35:00.278 ERROR 106533 --- [ryBean_Worker-5]
> o.a.griffin.core.job.SparkSubmitJob : Post to livy error. 401
> Authentication required
>
> Could you help me out with this?
>
> Thank you,
> Karan Gupta
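For the code-side option mentioned above: this is not in Griffin today, just a rough sketch of one way to build a Negotiate-capable client with Apache HttpClient 4.x and wrap it in a Spring RestTemplate, which could then replace whatever client SparkSubmitJob currently uses for the POST. The class name is made up for illustration, and it assumes the service JVM already has a Kerberos login (kinit or a keytab-based JAAS login) that the SPNEGO handshake can draw credentials from.

import java.security.Principal;

import org.apache.http.auth.AuthSchemeProvider;
import org.apache.http.auth.AuthScope;
import org.apache.http.auth.Credentials;
import org.apache.http.client.config.AuthSchemes;
import org.apache.http.config.Lookup;
import org.apache.http.config.RegistryBuilder;
import org.apache.http.impl.auth.SPNegoSchemeFactory;
import org.apache.http.impl.client.BasicCredentialsProvider;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClients;
import org.springframework.http.client.HttpComponentsClientHttpRequestFactory;
import org.springframework.web.client.RestTemplate;

public class KerberosRestTemplateFactory {

    public static RestTemplate create() {
        // Register the SPNEGO ("Negotiate") auth scheme so the client answers
        // the 401 challenge from Livy the same way curl --negotiate does.
        Lookup<AuthSchemeProvider> authRegistry = RegistryBuilder.<AuthSchemeProvider>create()
                .register(AuthSchemes.SPNEGO, new SPNegoSchemeFactory(true))
                .build();

        // SPNEGO takes the real credentials from the Kerberos/JAAS login of the
        // process, so this credentials object is only a no-op placeholder.
        BasicCredentialsProvider credentials = new BasicCredentialsProvider();
        credentials.setCredentials(new AuthScope(null, -1), new Credentials() {
            @Override
            public Principal getUserPrincipal() {
                return null;
            }

            @Override
            public String getPassword() {
                return null;
            }
        });

        CloseableHttpClient httpClient = HttpClients.custom()
                .setDefaultAuthSchemeRegistry(authRegistry)
                .setDefaultCredentialsProvider(credentials)
                .build();

        // Posting the batch JSON to livy.uri with this RestTemplate would then
        // include the Negotiate header automatically.
        return new RestTemplate(new HttpComponentsClientHttpRequestFactory(httpClient));
    }
}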
