Setting the yarn.resourcemanager.webapp.address.rm1 and yarn.resourcemanager.webapp.address.rm2 properties in yarn-site.xml seems to have resolved the issue.
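As a sketch of what that configuration might look like (the hostnames `rm1.example.com`/`rm2.example.com` and the default webapp port 8088 here are assumptions; substitute your actual ResourceManager hosts, and note that the `.rm1`/`.rm2` suffixes must match the IDs declared in yarn.resourcemanager.ha.rm-ids):

```xml
<!-- Hypothetical HA ResourceManager webapp addresses in yarn-site.xml -->
<property>
  <name>yarn.resourcemanager.webapp.address.rm1</name>
  <value>rm1.example.com:8088</value>
</property>
<property>
  <name>yarn.resourcemanager.webapp.address.rm2</name>
  <value>rm2.example.com:8088</value>
</property>
```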
Would appreciate any comments about the regression from 1.3.1. Thanks.

Regards,
Nachiketa

On Fri, Jun 26, 2015 at 1:28 AM, Nachiketa <nachiketa.shu...@gmail.com> wrote:
> A few other observations.
>
> 1. Spark 1.3.1 (custom built against HDP 2.2) was running fine against the
> same cluster and the same Hadoop configuration (hence this looks like a regression).
>
> 2. HA is enabled for the YARN RM and HDFS (not sure if this impacts
> anything, but wanted to share anyway).
>
> 3. Found this issue: https://issues.apache.org/jira/browse/SPARK-5837
> and multiple references to other YARN issues about the same. Continuing to
> understand and explore the possibilities documented there.
>
> Regards,
> Nachiketa
>
> On Fri, Jun 26, 2015 at 12:52 AM, Nachiketa <nachiketa.shu...@gmail.com> wrote:
>> Spark 1.4.0 - custom built from source against Hortonworks HDP 2.2
>> (Hadoop 2.6.0+)
>> HDP 2.2 cluster (secure, Kerberos)
>>
>> spark-shell (--master yarn-client) launches fine and the prompt shows up.
>> Clicking on the Application Master URL in the YARN RM UI throws a 500
>> connect error.
>>
>> The same build works well against a non-secure cluster (same HDP
>> distribution).
>>
>> No debug logs or stack trace is easily visible. Where do I look for what
>> is going wrong? And has anything changed in Spark security that could be
>> contributing to this?
>>
>> Thank you for your help with this.
>>
>> Regards,
>> Nachiketa

--
Regards,
-- Nachiketa
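On the "where do I look" question in the quoted message: one option (a sketch, not a definitive recipe) is to raise the log level for the Spark YARN and Hadoop security packages in `$SPARK_HOME/conf/log4j.properties` on the driver, so that the yarn-client launch and the Kerberos handshake log more detail:

```properties
# Hypothetical debug settings in $SPARK_HOME/conf/log4j.properties;
# loggers are named after real Spark/Hadoop packages, levels chosen for this investigation
log4j.logger.org.apache.spark.deploy.yarn=DEBUG
log4j.logger.org.apache.hadoop.security=DEBUG
log4j.logger.org.apache.spark.ui=DEBUG
```

The YARN side of the failure (the 500 from the RM proxy) would show up in the ApplicationMaster container logs rather than the driver log, so `yarn logs -applicationId <appId>` is worth checking as well.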