Hi,

I am using Spark on Hive. Some tables have CHAR type columns. It is my understanding that Spark converts VARCHAR columns to String internally; however, the Spark version I have (1.5.2) throws an error when the underlying Hive table has CHAR fields.
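As an interim workaround I have been experimenting with reading such tables through a Hive view that casts the CHAR columns to STRING before Spark sees them. A rough, untested sketch against 1.5.2 (the table, view, and column names here are made up for illustration; assumes a HiveContext bound to sqlContext in spark-shell):

    // Untested sketch against Spark 1.5.2; assumes a HiveContext named sqlContext,
    // and a hypothetical view created beforehand in the Hive shell:
    //   CREATE VIEW sales_str AS
    //   SELECT CAST(channel_desc AS STRING) AS channel_desc, amount_sold
    //   FROM sales_char;   -- sales_char has a CHAR(20) column
    val df = sqlContext.sql("SELECT channel_desc, amount_sold FROM sales_str")
    df.printSchema()  // channel_desc should now surface as string, not char

Does that sound like a sensible way to bridge the gap until CHAR is supported?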
I also wanted to know when VARCHAR will be available in Spark. In addition, Spark does not seem to understand temporary tables. For example, the following throws an error:

    spark-sql> CREATE TEMPORARY TABLE tmp AS
             > SELECT t.calendar_month_desc, c.channel_desc, SUM(s.amount_sold) AS TotalSales
             > FROM sales s, times t, channels c
             > WHERE s.time_id = t.time_id
             > AND s.channel_id = c.channel_id
             > GROUP BY t.calendar_month_desc, c.channel_desc
             > ;
    Error in query: Unhandled clauses: TEMPORARY 1, 2,2, 7 .
    You are likely trying to use an unsupported Hive feature.

Thanks,

Dr Mich Talebzadeh

LinkedIn https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw

http://talebzadehmich.wordpress.com
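P.S. For the temporary table, I have been looking at registering the query result as a temp table through the DataFrame API instead of CTAS, along these lines (untested sketch against 1.5.2; again assumes a HiveContext named sqlContext in spark-shell):

    // Run the aggregation as a DataFrame instead of CREATE TEMPORARY TABLE ... AS
    val df = sqlContext.sql(
      """SELECT t.calendar_month_desc, c.channel_desc, SUM(s.amount_sold) AS TotalSales
        |FROM sales s, times t, channels c
        |WHERE s.time_id = t.time_id
        |AND s.channel_id = c.channel_id
        |GROUP BY t.calendar_month_desc, c.channel_desc""".stripMargin)
    // registerTempTable keeps the result queryable as "tmp" for this session only
    df.registerTempTable("tmp")

Is that the recommended route in 1.5.2, or is there a SQL-only equivalent?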