Oops, thanks Yan, you are right. I got:

scala> sqlContext.sql("select * from a join b").take(10)
java.lang.RuntimeException: Table Not Found: b
        at scala.sys.package$.error(package.scala:27)
        at org.apache.spark.sql.catalyst.analysis.SimpleCatalog$$anonfun$1.apply(Catalog.scala:90)
        at org.apache.spark.sql.catalyst.analysis.SimpleCatalog$$anonfun$1.apply(Catalog.scala:90)
        at scala.Option.getOrElse(Option.scala:120)
        at org.apache.spark.sql.catalyst.analysis.SimpleCatalog.lookupRelation(Catalog.scala:90)

and with hql:

scala> hiveContext.hql("select * from a join b").take(10)
warning: there were 1 deprecation warning(s); re-run with -deprecation for details
14/08/22 14:48:45 INFO parse.ParseDriver: Parsing command: select * from a join b
14/08/22 14:48:45 INFO parse.ParseDriver: Parse Completed
14/08/22 14:48:45 ERROR metadata.Hive: NoSuchObjectException(message:default.a table not found)
        at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result$get_table_resultStandardScheme.read(ThriftHiveMetastore.java:27129)
        at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result$get_table_resultStandardScheme.read(ThriftHiveMetastore.java:27097)
        at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result.read(ThriftHiveMetastore.java:27028)
        at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:78)
        at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_table(ThriftHiveMetastore.java:936)
        at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_table(ThriftHiveMetastore.java:922)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTable(HiveMetaStoreClient.java:854)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:601)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
        at com.sun.proxy.$Proxy17.getTable(Unknown Source)
        at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:950)
        at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:924)
        at org.apache.spark.sql.hive.HiveMetastoreCatalog.lookupRelation(HiveMetastoreCatalog.scala:59)


So sqlContext looks tables up via
org.apache.spark.sql.catalyst.analysis.SimpleCatalog (Catalog.scala), while
hiveContext looks them up via org.apache.spark.sql.hive.HiveMetastoreCatalog
(HiveMetastoreCatalog.scala).
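To make the failure mode concrete, here is a minimal, self-contained sketch (not the real Spark classes; all names below are simplified stand-ins) of the lookup pattern the first stack trace shows: the in-memory catalog keeps a map of registered tables, and lookupRelation falls through Option.getOrElse into sys.error, which is exactly the "Table Not Found: b" RuntimeException above.

```scala
object CatalogSketch {
  // hypothetical stand-in for a logical plan / relation
  case class Relation(name: String)

  // simplified analogue of SimpleCatalog: tables only exist here if
  // something explicitly registered them with this context
  class SimpleCatalogSketch {
    private val tables = scala.collection.mutable.Map.empty[String, Relation]

    def registerTable(name: String, rel: Relation): Unit =
      tables(name) = rel

    // mirrors the Option.getOrElse -> sys.error chain in the stack trace
    def lookupRelation(name: String): Relation =
      tables.get(name).getOrElse(sys.error(s"Table Not Found: $name"))
  }
}
```

Tables that live only in the Hive metastore are invisible to this map, which is why the plain sqlContext cannot see them.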

Maybe we can do something in sqlContext to register a Hive table as a
Spark-SQL table. We would need to read the column info, partition info,
location, SerDe, Input/OutputFormat, and maybe the StorageHandler too, from
the Hive metastore...
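As a rough sketch of what that registration could carry over, here is a hypothetical shape for the metadata listed above. None of these types exist in Spark; they are made up purely to illustrate what would have to be copied out of the metastore into the sqlContext-side catalog.

```scala
object HiveImportSketch {
  // hypothetical types; field names follow the metadata listed above
  case class ColumnInfo(name: String, dataType: String)

  case class HiveTableInfo(
      name: String,
      columns: Seq[ColumnInfo],
      partitionColumns: Seq[ColumnInfo],
      location: String,
      serde: String,
      inputFormat: String,
      outputFormat: String,
      storageHandler: Option[String])

  // stand-in for the sqlContext-side catalog the import would populate
  val registered = scala.collection.mutable.Map.empty[String, HiveTableInfo]

  def registerHiveTable(info: HiveTableInfo): Unit =
    registered(info.name) = info
}
```

The real work would of course be fetching each of these fields through the metastore client and translating Hive types into Catalyst types; the sketch only fixes what the payload would need to contain.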




--
View this message in context: 
http://apache-spark-developers-list.1001551.n3.nabble.com/Spark-SQL-Query-and-join-different-data-sources-tp7914p7955.html
Sent from the Apache Spark Developers List mailing list archive at Nabble.com.
