Hi Doug,

For now, I think you can use sqlContext.sql("USE databaseName") to change
the current database, and then look tables up by name alone.
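
For example (a minimal sketch; "myDb" and "myTable" are hypothetical names,
and it assumes sqlContext is a HiveContext backed by a metastore):

  sqlContext.sql("USE myDb")
  val df = sqlContext.table("myTable") // resolved against the current database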

Thanks,

Yin

On Thu, Jun 4, 2015 at 12:04 PM, Yin Huai <yh...@databricks.com> wrote:

> Hi Doug,
>
> sqlContext.table does not officially support a database name; it only
> accepts a table name as its parameter. We will add a method that supports
> a database name in a future release.
>
> Thanks,
>
> Yin
>
> On Thu, Jun 4, 2015 at 8:10 AM, Doug Balog <doug.sparku...@dugos.com>
> wrote:
>
>> Hi Yin,
>>  I'm very surprised to hear that it's not supported in 1.3, because I've
>> been using it since 1.3.0.
>> It worked great until SPARK-6908 was merged into master.
>>
>> What is the supported way to get a DataFrame for a table that is not in
>> the default database?
>>
>> IMHO, if you are not going to support "databaseName.tableName",
>> sqlContext.table() should have an overload that takes a database name and
>> a table name, i.e.
>>
>> def table(databaseName: String, tableName: String): DataFrame =
>>   DataFrame(this, catalog.lookupRelation(Seq(databaseName, tableName)))
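>>
>> A call would then look like this (hypothetical, since the overload above
>> does not exist yet):
>>
>> val df = sqlContext.table("myDb", "myTable")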
>>
>> The handling of databases in Spark (sqlContext, hiveContext, Catalog)
>> could be more consistent.
>>
>> Thanks,
>>
>> Doug
>>
>> > On Jun 3, 2015, at 8:21 PM, Yin Huai <yh...@databricks.com> wrote:
>> >
>> > Hi Doug,
>> >
>> > Actually, sqlContext.table does not support a database name in either
>> Spark 1.3 or Spark 1.4. We will support it in a future version.
>> >
>> > Thanks,
>> >
>> > Yin
>> >
>> >
>> >
>> > On Wed, Jun 3, 2015 at 10:45 AM, Doug Balog <doug.sparku...@dugos.com>
>> wrote:
>> > Hi,
>> >
>> > sqlContext.table("db.tbl") isn't working for me; I get a
>> NoSuchTableException.
>> >
>> > But I can access the table via
>> >
>> > sqlContext.sql("select * from db.tbl")
>> >
>> > So I know it has the table info from the metastore.
>> >
>> > Anyone else see this?
>> >
>> > I’ll keep digging.
>> > I compiled via make-distribution.sh -Pyarn -Phadoop-2.4 -Phive
>> -Phive-thriftserver
>> > It worked for me in 1.3.1
>> >
>> > Cheers,
>> >
>> > Doug
>> >
>>
>>
>
