[
https://issues.apache.org/jira/browse/SPARK-11186?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Santiago M. Mola updated SPARK-11186:
-------------------------------------
Description:
The default catalog behaviour with respect to caseness (case sensitivity of table names) differs between {{SQLContext}} and {{HiveContext}}. The two tests below are identical except for the context class, yet they do not behave the same:
{code}
test("Catalog caseness (SQL)") {
  val sqlc = new SQLContext(sc)
  val relationName = "MyTable"
  sqlc.catalog.registerTable(relationName :: Nil, LogicalRelation(new BaseRelation {
    override def sqlContext: SQLContext = sqlc
    override def schema: StructType = StructType(Nil)
  }))
  val tables = sqlc.tableNames()
  assert(tables.contains(relationName))
}

test("Catalog caseness (Hive)") {
  val sqlc = new HiveContext(sc)
  val relationName = "MyTable"
  sqlc.catalog.registerTable(relationName :: Nil, LogicalRelation(new BaseRelation {
    override def sqlContext: SQLContext = sqlc
    override def schema: StructType = StructType(Nil)
  }))
  val tables = sqlc.tableNames()
  assert(tables.contains(relationName))
}
{code}
Looking at {{HiveContext#SQLSession}}, I see this is the intended behaviour. However, the reason this difference is needed appears to be undocumented, both in the manual and in the source code comments.
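To make the difference concrete, here is a minimal, self-contained sketch (plain Scala, no Spark dependency) of the two catalog behaviours: one stores table names exactly as given, the other normalizes them to lower case on registration, which is consistent with Hive's convention of lower-casing identifiers in the metastore. The class names here are illustrative stand-ins, not Spark's actual catalog classes:

{code}
import scala.collection.mutable

// Hypothetical stand-ins for the two catalog behaviours.
trait NameCatalog {
  def register(name: String): Unit
  def tableNames: Seq[String]
}

// Case-sensitive: stores names exactly as given (SQLContext-like).
class CaseSensitiveCatalog extends NameCatalog {
  private val tables = mutable.LinkedHashSet[String]()
  def register(name: String): Unit = { tables += name }
  def tableNames: Seq[String] = tables.toSeq
}

// Case-insensitive: lower-cases names on registration (HiveContext-like).
class CaseInsensitiveCatalog extends NameCatalog {
  private val tables = mutable.LinkedHashSet[String]()
  def register(name: String): Unit = { tables += name.toLowerCase }
  def tableNames: Seq[String] = tables.toSeq
}

object CasenessDemo extends App {
  val sensitive = new CaseSensitiveCatalog
  val insensitive = new CaseInsensitiveCatalog
  sensitive.register("MyTable")
  insensitive.register("MyTable")
  println(sensitive.tableNames.contains("MyTable"))   // true
  println(insensitive.tableNames.contains("MyTable")) // false: stored as "mytable"
}
{code}

Under this model, the second test above would fail because {{tableNames()}} returns the lower-cased name while the assertion checks for the original mixed-case string.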
> Caseness inconsistency between SQLContext and HiveContext
> ---------------------------------------------------------
>
> Key: SPARK-11186
> URL: https://issues.apache.org/jira/browse/SPARK-11186
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 1.5.1
> Reporter: Santiago M. Mola
> Priority: Minor
>
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]