Hi Santlal,

The problem is that BIGDECIMAL is not a Hive data type keyword.
DECIMAL is the one you should use, as it is backed by the Java
BigDecimal type internally.
You can find more information on this wiki page:
https://cwiki.apache.org/confluence/display/Hive/LanguageManual+Types#LanguageManualTypes-Decimalsdecimal
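
For example, your CREATE TABLE could look like this (a rough sketch; the
DECIMAL(10,2) precision and scale are only placeholders, adjust them to
your data, or leave them off if your Hive version does not accept an
explicit precision and scale):

create table big_date_test( dob DATE, salary DECIMAL(10,2) ) stored as parquet;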

- Sergio

On Tue, Aug 4, 2015 at 12:26 AM, Santlal J Gupta <
santlal.gu...@bitwiseglobal.com> wrote:

> Hi,
>
> I want to use the Date and BigDecimal datatypes with Parquet in Hive.
>
> Currently I am using Hive version 0.12.0-cdh5.1.0.
>
> When I run the following query, I get the error shown below.
>
> Query :
>
> hive (primitive_db)> create table big_date_test( dob  DATE , salary BIGDECIMAL ) stored as parquet;
> NoViableAltException(26@[])
>         at org.apache.hadoop.hive.ql.parse.HiveParser.type(HiveParser.java:31711)
>         at org.apache.hadoop.hive.ql.parse.HiveParser.colType(HiveParser.java:31476)
>         at org.apache.hadoop.hive.ql.parse.HiveParser.columnNameType(HiveParser.java:31176)
>         at org.apache.hadoop.hive.ql.parse.HiveParser.columnNameTypeList(HiveParser.java:29401)
>         at org.apache.hadoop.hive.ql.parse.HiveParser.createTableStatement(HiveParser.java:4439)
>         at org.apache.hadoop.hive.ql.parse.HiveParser.ddlStatement(HiveParser.java:2084)
>         at org.apache.hadoop.hive.ql.parse.HiveParser.execStatement(HiveParser.java:1344)
>         at org.apache.hadoop.hive.ql.parse.HiveParser.statement(HiveParser.java:983)
>         at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:190)
>         at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:434)
>         at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:352)
>         at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:995)
>         at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1038)
>         at org.apache.hadoop.hive.ql.Driver.run(Driver.java:931)
>         at org.apache.hadoop.hive.ql.Driver.run(Driver.java:921)
>         at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:268)
>         at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:220)
>         at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:422)
>         at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:790)
>         at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:684)
>         at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:623)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
> FAILED: ParseException line 1:46 cannot recognize input near 'bigdecimal' ')' 'stored' in column type
>
> Please guide me on which version I should use so that I am able to use
> these datatypes.
>
> Thanks
> Santlal
>
