Embedded Derby, which Hive/Spark SQL uses as the default metastore, only supports a single connection at a time. Until this issue is fixed, you can work around it by configuring a metastore that supports multiple concurrent users (e.g. networked Derby or MySQL).
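For example, a minimal hive-site.xml sketch (dropped into Spark's conf/ directory) that points the metastore at a shared MySQL database instead of embedded Derby could look roughly like the following. The host, database name, and credentials below are placeholders, and the MySQL JDBC driver jar would need to be on the driver classpath:

    <!-- Hypothetical hive-site.xml: use a shared MySQL metastore instead of
         the embedded, single-user Derby instance. -->
    <configuration>
      <property>
        <name>javax.jdo.option.ConnectionURL</name>
        <!-- Placeholder host and database; adjust for your environment. -->
        <value>jdbc:mysql://metastore-host:3306/hive_metastore?createDatabaseIfNotExist=true</value>
      </property>
      <property>
        <name>javax.jdo.option.ConnectionDriverName</name>
        <value>com.mysql.jdbc.Driver</value>
      </property>
      <property>
        <name>javax.jdo.option.ConnectionUserName</name>
        <value>hive</value>
      </property>
      <property>
        <name>javax.jdo.option.ConnectionPassword</name>
        <value>hive_password</value>
      </property>
    </configuration>

With a shared metastore like this, multiple spark-shell sessions can each create a sqlContext, since no process needs an exclusive lock on a local metastore_db directory.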
On 25 October 2015 at 16:15, Ge, Yao (Y.) <y...@ford.com> wrote:
> Thanks. I wonder why this is not widely reported in the user forum. The
> REPL shell is basically broken in 1.5.0 and 1.5.1
>
> -Yao
>
> *From:* Ted Yu [mailto:yuzhih...@gmail.com]
> *Sent:* Sunday, October 25, 2015 12:01 PM
> *To:* Ge, Yao (Y.)
> *Cc:* user
> *Subject:* Re: Spark scala REPL - Unable to create sqlContext
>
> Have you taken a look at the fix for SPARK-11000, which is in the upcoming
> 1.6.0 release?
>
> Cheers
>
> On Sun, Oct 25, 2015 at 8:42 AM, Yao <y...@ford.com> wrote:
>
> I have not been able to start the Spark scala shell since 1.5, as it was
> not able to create the sqlContext during startup. It complains that the
> metastore_db is already locked: "Another instance of Derby may have
> already booted the database". The Derby log is attached.
>
> I only have this problem when starting the shell in yarn-client mode. I am
> working with HDP 2.2.6, which runs Hadoop 2.6.
>
> -Yao
> derby.log
> <http://apache-spark-user-list.1001560.n3.nabble.com/file/n25195/derby.log>