Re: [DISCUSS] Support decimals with negative scale in decimal operation

2019-01-09 Thread Marco Gaido
On Mon, Jan 7, 2019 at 3:03 PM Wenchen Fan wrote: >> AFAIK the parquet spec says decimal scale can't be negative. If we want to officially support negative-scale decimals, we should clearly define the behavior when writing negative-scale decimals…

Re: [DISCUSS] Support decimals with negative scale in decimal operation

2019-01-09 Thread Jörn Franke
…negative-scale decimals, we should clearly define the behavior when writing negative-scale decimals to parquet and other data sources. The most straightforward way is to fail for this case, but maybe we can do something better, like casting decimal(1, -20) to decimal(20, 0)…

Re: [DISCUSS] Support decimals with negative scale in decimal operation

2019-01-09 Thread Marco Gaido
I'm OK with it, i.e. fail the write if there are negative-scale decimals (we need to document it though). We can improve it later in data source v2. On Mon, Jan 7, 2019 at 10:09 PM Marco Gaido wrote: >>> In general we can say that some datasources…

Re: [DISCUSS] Support decimals with negative scale in decimal operation

2019-01-08 Thread Wenchen Fan
…PM Wenchen Fan wrote: > I'm OK with it, i.e. fail the write if there are negative-scale decimals (we need to document it though). We can improve it later in data source v2. On Mon, Jan 7, 2019 at 10:09 PM Marco Gaido wrote: >> In general we can say that some…

Re: [DISCUSS] Support decimals with negative scale in decimal operation

2019-01-07 Thread Wenchen Fan
I'm OK with it, i.e. fail the write if there are negative-scale decimals (we need to document it though). We can improve it later in data source v2. On Mon, Jan 7, 2019 at 10:09 PM Marco Gaido wrote: > In general we can say that some datasources allow them, others fail. At the moment…

Re: [DISCUSS] Support decimals with negative scale in decimal operation

2019-01-07 Thread Marco Gaido
…when writing negative-scale decimals to parquet and other data sources. The most straightforward way is to fail for this case, but maybe we can do something better, like casting decimal(1, -20) to decimal(20, 0) before writing. On Mon, Jan 7, 2019 at 9:32 PM Marco Gaido wrote:…

Re: [DISCUSS] Support decimals with negative scale in decimal operation

2019-01-07 Thread Wenchen Fan
AFAIK the parquet spec says decimal scale can't be negative. If we want to officially support negative-scale decimals, we should clearly define the behavior when writing negative-scale decimals to parquet and other data sources. The most straightforward way is to fail for this case, but maybe we can do…

Re: [DISCUSS] Support decimals with negative scale in decimal operation

2019-01-07 Thread Marco Gaido
…the result type of decimal operations, and the behavior when writing out decimals (e.g. we can cast decimal(1, -20) to decimal(20, 0) before writing). Another question is, shall we set a min scale? e.g. shall we allow decimal(1, -1000)? On Thu, Oct 25, 2018 at 9:49 PM…

Re: [DISCUSS] Support decimals with negative scale in decimal operation

2019-01-06 Thread Wenchen Fan
…com/questions/35435691/bigdecimal-precision-and-scale looks pretty good), and the result type of decimal operations, and the behavior when writing out decimals (e.g. we can cast decimal(1, -20) to decimal(20, 0) before writing). Another question is, shall we set a min scale? e.g. shall we allow decimal(1, -1000)?…
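The thread's central facts are easy to check at the JVM level: java.math.BigDecimal (the type Spark's Decimal wraps on the JVM) does permit negative scales, and rescaling to scale 0 materializes the trailing zeros into the precision. A minimal sketch of those mechanics (class name is illustrative, not from the thread):

```java
import java.math.BigDecimal;
import java.math.BigInteger;

public class NegativeScaleDemo {
    public static void main(String[] args) {
        // Unscaled value 1 with scale -20 represents 1 * 10^20.
        BigDecimal d = new BigDecimal(BigInteger.ONE, -20);
        System.out.println(d.precision() + ", " + d.scale()); // 1, -20
        System.out.println(d.toPlainString());                // 100000000000000000000

        // Rescaling to scale 0 is exact (no rounding needed), but the
        // unscaled value becomes 10^20, i.e. 21 digits of precision.
        BigDecimal rescaled = d.setScale(0);
        System.out.println(rescaled.precision() + ", " + rescaled.scale()); // 21, 0
    }
}
```

This also hints at why the cast target needs care: the scale-0 form of decimal(1, -20) carries 21 significant digits, right at the edge of the common 38-digit decimal limit for larger exponents.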

Re: Decimals with negative scale

2018-12-19 Thread Marco Gaido
That is feasible; the main point is that negative scales were not really meant to be there in the first place, so they are something we simply forgot to forbid, and something which the DBs we are drawing our inspiration from for decimals (mainly SQL Server) do not support. Honestly, my…

Re: Decimals with negative scale

2018-12-18 Thread Reynold Xin
…Marco Gaido <marcogaid...@gmail.com> wrote: > Hi all, as you may remember, there was a design doc to support operations involving decimals with negative scales. After the discussion in…

Re: Decimals with negative scale

2018-12-18 Thread Marco Gaido
This is at analysis time. On Tue, 18 Dec 2018, 17:32 Reynold Xin wrote: > Is this an analysis time thing or a runtime thing? On Tue, Dec 18, 2018 at 7:45 AM Marco Gaido wrote: >> Hi all, as you may remember, there was a design doc to support operations…

Decimals with negative scale

2018-12-18 Thread Marco Gaido
Hi all, as you may remember, there was a design doc to support operations involving decimals with negative scales. After the discussion in the design doc, the related PR is now blocked because for 3.0 we have another option which we can explore, i.e. forbidding negative scales. This is probably…

[DISCUSS] Support decimals with negative scale in decimal operation

2018-10-25 Thread Marco Gaido
Hi all, a bit more than one month ago I sent a proposal for properly handling decimals with negative scales in our operations. This is a long-standing problem in our codebase, as we derived our rules from Hive and SQL Server, where negative scales are forbidden, while in Spark…

Re: SPIP: support decimals with negative scale in decimal operation

2018-09-23 Thread Felix Cheung
A DISCUSS thread is good to have... From: Marco Gaido Sent: Friday, September 21, 2018 3:31 AM To: Wenchen Fan Cc: dev Subject: Re: SPIP: support decimals with negative scale in decimal operation. Hi Wenchen, thank you for the clarification. I agree…

Re: SPIP: support decimals with negative scale in decimal operation

2018-09-21 Thread Marco Gaido
…Marco Gaido wrote: >> Hi all, I am writing this e-mail in order to discuss the issue which is reported in SPARK-25454; following Wenchen's suggestion I prepared a design doc for it. The problem we are facing here is…

Re: SPIP: support decimals with negative scale in decimal operation

2018-09-21 Thread Wenchen Fan
…Gaido wrote: > Hi all, I am writing this e-mail in order to discuss the issue which is reported in SPARK-25454; following Wenchen's suggestion I prepared a design doc for it. The problem we are facing here is that our rules for decimal operations are taken…

SPIP: support decimals with negative scale in decimal operation

2018-09-21 Thread Marco Gaido
Hi all, I am writing this e-mail in order to discuss the issue which is reported in SPARK-25454; following Wenchen's suggestion I prepared a design doc for it. The problem we are facing here is that our rules for decimal operations are taken from Hive and MS SQL Server, and they explicitly…

Re: Decimals

2017-12-25 Thread Ofir Manor
Hi Marco, great work, I personally hope it gets included soon! I just wanted to clarify one thing - Oracle and PostgreSQL do not have infinite precision. The scale and precision of decimals are just user-defined (explicitly or implicitly). So, both of them follow the exact same rules you mentioned

Re: Decimals

2017-12-21 Thread Marco Gaido
To: <marcogaid...@gmail.com> Cc: "Reynold Xin" <r...@databricks.com>; "dev@spark.apache.org" <dev@spark.apache.org> Subject: Re: Decimals. Losing precision is not acceptable to financial customers. Thus, instead of returning NULL, I saw DB2 issues the following error…

Re: Decimals

2017-12-21 Thread Xiao Li
…some feedback when you have time to read it. > Spark's current implementation of arithmetic operations on decimals was "copied" from Hive. Thus, the initial goal of the implementation was to be compliant with Hive, which itself aims to reproduce SQL Server behavior. Therefore I compared…

Re: Decimals

2017-12-19 Thread Marco Gaido
Hello everybody, I did some further research and now I am sharing my findings. I am sorry, it is going to be quite a long e-mail, but I'd really appreciate some feedback when you have time to read it. Spark's current implementation of arithmetic operations on decimals was "copied"…

Re: Decimals

2017-12-13 Thread Reynold Xin
Responses inline. On Tue, Dec 12, 2017 at 2:54 AM, Marco Gaido wrote: > Hi all, I saw in these weeks that there are a lot of problems related to decimal values (SPARK-22036, SPARK-22755, for instance). Some are related to historical choices, which I don't know,…

Decimals

2017-12-12 Thread Marco Gaido
Hi all, I have seen in recent weeks that there are a lot of problems related to decimal values (SPARK-22036, SPARK-22755, for instance). Some are related to historical choices, which I don't know, thus please excuse me if I am saying dumb things: - why are we interpreting literal constants in queries…
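The literal-constant question above ultimately comes down to how the literal's text is parsed. A small sketch with java.math.BigDecimal (class name is illustrative) shows why parsing the decimal text exactly gives a different result than routing it through a binary double first:

```java
import java.math.BigDecimal;

public class LiteralDemo {
    public static void main(String[] args) {
        // Parsing the literal text keeps its exact precision and scale.
        BigDecimal lit = new BigDecimal("6.4");
        System.out.println(lit.precision() + ", " + lit.scale()); // 2, 1

        // Going through a double first bakes in binary rounding error:
        // 6.4 has no exact binary representation.
        BigDecimal viaDouble = new BigDecimal(6.4);
        System.out.println(viaDouble.equals(lit)); // false
    }
}
```

This is one reason a SQL engine may prefer to type a literal like 6.4 as an exact decimal rather than a double.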

Re: Multiplication on decimals in a dataframe query

2015-12-02 Thread Akhil Das
> I hit a weird issue when I tried to multiply two decimals in a select (either in Scala or as SQL), and I'm assuming I must be missing the point. The issue is fairly easy to recreate with something like the following: val sqlContext = new org.apache.spark.sql.SQLContext…
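The repro above is cut off, but the behavior that typically surprises people here follows from the underlying java.math.BigDecimal arithmetic: on multiplication, the result scale is the sum of the operands' scales, so trailing fractional digits accumulate. A hedged sketch of that rule (not the original repro; names are illustrative):

```java
import java.math.BigDecimal;

public class MultiplyDemo {
    public static void main(String[] args) {
        BigDecimal a = new BigDecimal("1.50"); // precision 3, scale 2
        BigDecimal b = new BigDecimal("0.10"); // precision 2, scale 2
        BigDecimal product = a.multiply(b);    // scales add: 2 + 2 = 4
        System.out.println(product.toPlainString()); // 0.1500
        System.out.println(product.scale());         // 4
    }
}
```

A SQL engine layering a fixed decimal(precision, scale) type on top of this must then decide how to cap the growing precision and scale, which is exactly where rounding-vs-NULL-vs-error choices appear.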