[ 
https://issues.apache.org/jira/browse/PHOENIX-2380?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15003076#comment-15003076
 ] 

James Taylor commented on PHOENIX-2380:
---------------------------------------

In Phoenix, values in e notation are interpreted as float or double constants. 
If you use decimal notation, the value is treated as a BigDecimal instead (which 
isn't ideal when you have e+38-scale values). Another option is to use bind 
variables and bind BigDecimal values, but that won't work well for CSV bulk 
load.
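
For example, binding the value as a BigDecimal through JDBC keeps it out of the 
float/double literal-parsing path entirely (a minimal sketch only; the thin 
driver URL and class name below are illustrative, reusing the TEST.DECIMAL_TABLE 
from this issue):

{noformat}
import java.math.BigDecimal;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class BigDecimalBindExample {
    public static void main(String[] args) throws Exception {
        // Assumes a local Phoenix Query Server; adjust the URL for your setup.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:phoenix:thin:url=http://localhost:8765")) {
            try (PreparedStatement stmt = conn.prepareStatement(
                    "UPSERT INTO test.decimal_table VALUES (?, ?)")) {
                stmt.setString(1, "MaxFloat");
                // The bind variable carries the exact decimal value, so no
                // float/double literal parsing is involved.
                stmt.setBigDecimal(2, new BigDecimal("3.402823466e+38"));
                stmt.executeUpdate();
            }
            conn.commit();
        }
    }
}
{noformat}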

Perhaps best would be for Phoenix to detect an e notation value that doesn't fit 
into a float and use a double instead, and if it doesn't fit into a double, use 
a BigDecimal. How about if we morph this JIRA into that, [~kliew]?
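
For illustration, that fallback could look roughly like this (just a sketch of 
the idea, not Phoenix's actual literal parser; the class and method names are 
made up):

{noformat}
import java.math.BigDecimal;

public class ENotationLiteralSketch {

    // Pick the narrowest type whose range holds the literal:
    // FLOAT first, then DOUBLE, then DECIMAL (BigDecimal).
    static Object typeLiteral(String literal) {
        BigDecimal exact = new BigDecimal(literal);
        if (exact.signum() == 0) {
            return 0.0f; // zero fits everywhere
        }
        float asFloat = exact.floatValue();
        if (!Float.isInfinite(asFloat) && asFloat != 0.0f) {
            // Magnitude is within float range (precision may still be rounded).
            return asFloat;
        }
        double asDouble = exact.doubleValue();
        if (!Double.isInfinite(asDouble) && asDouble != 0.0d) {
            return asDouble;
        }
        // Overflows or underflows even a double: keep the exact DECIMAL value.
        return exact;
    }

    public static void main(String[] args) {
        System.out.println(typeLiteral("3.402823466e+38").getClass()); // Float
        System.out.println(typeLiteral("1.7e+308").getClass());        // Double
        System.out.println(typeLiteral("1e+400").getClass());          // BigDecimal
    }
}
{noformat}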

> 'decimal' columns can only hold 'float' values
> ----------------------------------------------
>
>                 Key: PHOENIX-2380
>                 URL: https://issues.apache.org/jira/browse/PHOENIX-2380
>             Project: Phoenix
>          Issue Type: Bug
>    Affects Versions: 4.5.0
>            Reporter: Kevin Liew
>
> {noformat}
> CREATE TABLE test.DECIMAL_TABLE(
>       KeyColumn VARCHAR(255) PRIMARY KEY,
>       Column1 DECIMAL(38, 0));
> {noformat}
> Phoenix will not upsert values larger than or equal to the max float value.
> {noformat}
> upsert into test.decimal_table values ('test', 3.402823466e+38);
> java.sql.SQLException: error while executing SQL "upsert into 
> test.decimal_table values ('test', 3.402823466e+38)": response code 500
>         at org.apache.calcite.avatica.Helper.createException(Helper.java:41)
>         at 
> org.apache.calcite.avatica.AvaticaStatement.executeInternal(AvaticaStatement.java:112)
>         at 
> org.apache.calcite.avatica.AvaticaStatement.execute(AvaticaStatement.java:121)
>         at sqlline.Commands.execute(Commands.java:822)
>         at sqlline.Commands.sql(Commands.java:732)
>         at sqlline.SqlLine.dispatch(SqlLine.java:808)
>         at sqlline.SqlLine.begin(SqlLine.java:681)
>         at sqlline.SqlLine.start(SqlLine.java:398)
>         at sqlline.SqlLine.main(SqlLine.java:292)
> Caused by: java.lang.RuntimeException: response code 500
>         at 
> org.apache.calcite.avatica.remote.RemoteService.apply(RemoteService.java:45)
>         at 
> org.apache.calcite.avatica.remote.JsonService.apply(JsonService.java:207)
>         at 
> org.apache.calcite.avatica.remote.RemoteMeta.prepareAndExecute(RemoteMeta.java:169)
>         at 
> org.apache.calcite.avatica.AvaticaConnection.prepareAndExecuteInternal(AvaticaConnection.java:477)
>         at 
> org.apache.calcite.avatica.AvaticaStatement.executeInternal(AvaticaStatement.java:109)
>         ... 7 more
> 0: jdbc:phoenix:thin:url=http://localhost:876> upsert into test.decimal_table 
> values ('MaxFloat', 3.402823466e+37);
> 1 row affected (0.113 seconds)
> {noformat}
> or using the bulk loader tool
> {noformat}
> 15/11/05 23:17:15 ERROR util.CSVCommonsLoader: Error upserting record 
> [MaxFloat, 3.402823466e+38]: ERROR 206 (22003): The data exceeds the max 
> capacity for the data type. value=340282346600000000000000000000000000000 
> columnName=COLUMN1
> {noformat}
> This also applies to values smaller than the minimum float value.



