To use Spark SQL you need at least spark-sql_2.10 and spark-catalyst_2.10. If you want Hive support, spark-hive_2.10 must also be included.
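For example, with Maven you would declare something like the following (a sketch only — the version number is illustrative for a 1.x-era Spark; match it to the Spark version you actually run, and note the `_2.10` suffix is the Scala binary version):

```xml
<!-- Illustrative coordinates; adjust <version> to your installed Spark release -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-sql_2.10</artifactId>
  <version>1.0.2</version>
</dependency>
```

spark-sql pulls in spark-catalyst as a dependency, so adding the one artifact is usually enough to make the org.apache.spark.sql classes resolve.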
On Aug 7, 2014, at 2:33 PM, vdiwakar.malladi <vdiwakar.mall...@gmail.com> wrote:

> Hi,
>
> I'm new to Spark. I configured Spark in standalone mode on my laptop. Right now
> I'm in the process of executing the samples to understand more of the details of
> Spark's components and how they work.
>
> When I try to run the example program given for Spark SQL using the Java API,
> I'm unable to resolve the classes of the package org.apache.spark.sql.api.java.
> I used the Maven artifact 'spark-core_2.10', and I can see that that package is
> not available in the corresponding jar file.
>
> Could you please let me know if I need to do any additional configuration or
> add dependencies to resolve the classes and move forward.
>
> Thanks in advance.
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Spark-SQL-tp11618.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
> ---------------------------------------------------------------------