Is this 1.6, Ted?

Dr Mich Talebzadeh



LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com



On 25 March 2016 at 22:40, Ted Yu <yuzhih...@gmail.com> wrote:

> Looks like database support was fixed by:
>
> [SPARK-7943] [SPARK-8105] [SPARK-8435] [SPARK-8714] [SPARK-8561] Fixes
> multi-database support
>
> On Fri, Mar 25, 2016 at 3:35 PM, Ashok Kumar <ashok34...@yahoo.com> wrote:
>
>> 1.5.2 Ted.
>>
>> I don't know where those two lines come from. It finds and gets the table
>> info OK.
>>
>> HTH
>>
>>
>> On Friday, 25 March 2016, 22:32, Ted Yu <yuzhih...@gmail.com> wrote:
>>
>>
>> Which release of Spark do you use, Mich?
>>
>> In master branch, the message is more accurate
>> (sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/NoSuchItemException.scala):
>>
>>   override def getMessage: String = s"Table $table not found in database $db"
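>>
>> As a quick check from within the shell itself, sc is predefined in
>> spark-shell and reports the running release:
>>
>> scala> sc.version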
>>
>>
>> On Fri, Mar 25, 2016 at 3:21 PM, Mich Talebzadeh <
>> mich.talebza...@gmail.com> wrote:
>>
>> You can use DESCRIBE FORMATTED <DATABASE>.<TABLE_NAME> to get that info.
>>
>> This is based on the same command in Hive; however, it throws two
>> erroneous error lines, as shown below (you don't see them in Hive's
>> DESCRIBE ...)
>>
>> Example
>>
>> scala> sql("describe formatted test.t14").collect.foreach(println)
>> 16/03/25 22:32:38 ERROR Hive: Table test not found: test.test table not found
>> 16/03/25 22:32:38 ERROR Hive: Table test not found: test.test table not found
>> [# col_name             data_type               comment             ]
>> [                ]
>> [invoicenumber          int                                         ]
>> [paymentdate            date                                        ]
>> [net                    decimal(20,2)                               ]
>> [vat                    decimal(20,2)                               ]
>> [total                  decimal(20,2)                               ]
>> [                ]
>> [# Detailed Table Information            ]
>> [Database:              test                     ]
>> [Owner:                 hduser                   ]
>> [CreateTime:            Fri Mar 25 22:13:44 GMT 2016     ]
>> [LastAccessTime:        UNKNOWN                  ]
>> [Protect Mode:          None                     ]
>> [Retention:             0                        ]
>> [Location:              hdfs://rhes564:9000/user/hive/warehouse/test.db/t14      ]
>> [Table Type:            MANAGED_TABLE            ]
>> [Table Parameters:               ]
>> [       COLUMN_STATS_ACCURATE   {\"BASIC_STATS\":\"true\"}]
>> [       comment                 from csv file from excel sheet]
>> [       numFiles                2                   ]
>> [       orc.compress            ZLIB                ]
>> [       totalSize               1090                ]
>> [       transient_lastDdlTime   1458944025          ]
>> [                ]
>> [# Storage Information           ]
>> [SerDe Library:         org.apache.hadoop.hive.ql.io.orc.OrcSerde        ]
>> [InputFormat:           org.apache.hadoop.hive.ql.io.orc.OrcInputFormat  ]
>> [OutputFormat:          org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat         ]
>> [Compressed:            No                       ]
>> [Num Buckets:           -1                       ]
>> [Bucket Columns:        []                       ]
>> [Sort Columns:          []                       ]
>> [Storage Desc Params:            ]
>> [       serialization.format    1                   ]
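>>
>> The CreateTime line above is the creation timestamp you are after. As a
>> minimal sketch (assuming each row of the DESCRIBE output comes back as a
>> single string column, as the bracketed rows above suggest), you can pick
>> it out directly:
>>
>> scala> sql("describe formatted test.t14").collect.map(_.getString(0)).filter(_.trim.startsWith("CreateTime")).foreach(println)
>>
>> The transient_lastDdlTime parameter is a Unix epoch in seconds, so it can
>> be converted the same way:
>>
>> scala> new java.util.Date(1458944025L * 1000)  // seconds to milliseconds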
>>
>> HTH
>>
>> Dr Mich Talebzadeh
>>
>> LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
>>
>> http://talebzadehmich.wordpress.com
>>
>>
>> On 25 March 2016 at 22:12, Ashok Kumar <ashok34...@yahoo.com.invalid>
>> wrote:
>>
>> Experts,
>>
>> I would like to know how to find out when a table was created in a Hive
>> database, using the Spark shell.
>>
>> Thanks
>>
