Wait, what’s wrong with doing it this way?

0: jdbc:drill:zk=sen11:5181,sen12:5181> select cast(`float` as float),
cast (`int` as int) from `floatint.json`;
+---------+---------+
| EXPR$0  | EXPR$1  |
+---------+---------+
| 1.0     | 1       |
| 0.1     | 1       |
+---------+---------+
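
For comparison, the all_text_mode route Hanifi suggests below would look roughly
like this against the same file (just a sketch: with all_text_mode on, every JSON
scalar is read as VARCHAR, so the casts bring the numbers back):

alter session set `store.json.all_text_mode` = true;

-- both columns arrive as VARCHAR under all_text_mode, so cast them
-- back to the numeric types we actually want
select cast(`float` as double) as `float`,
       cast(`int` as int)      as `int`
from `floatint.json`;

-- restore the default when done
alter session set `store.json.all_text_mode` = false;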


Chris Matta
[email protected]
215-701-3146

On Wed, Jun 17, 2015 at 7:39 PM, Hanifi Gunes <[email protected]> wrote:

> *I was hoping that there was a way to do it on a field-by-field basis.
> But it seems as if that may not be the case.*
> - Unfortunately not. You could consider type casting if you need some
> fields back as numbers, though.
>
> -Hanifi
>
> On Wed, Jun 17, 2015 at 4:36 PM, Tim Harper <[email protected]> wrote:
>
> > Okay; I was hoping that there was a way to do it on a field-by-field
> > basis. But it seems as if that may not be the case. Thank you!
> >
> > > On Jun 17, 2015, at 17:33, Hanifi Gunes <[email protected]> wrote:
> > >
> > > *Is this the best solution for this problem? Any way to provide a type
> > > hint in the query?*
> > > - I guess it depends on how you want to use this field. If you just
> > > want to report the field, you could also consider treating this field
> > > as a string by issuing
> > >
> > > alter session set `store.json.all_text_mode` = true;
> > >
> > > Otherwise, Drill currently does not support type promotion from int to
> > > double.
> > >
> > >
> > > -Hanifi
> > >
> > > On Wed, Jun 17, 2015 at 4:28 PM, Tim Harper <[email protected]> wrote:
> > >
> > >> The data:
> > >>
> > >>> timcharper@timcharper:~/data $ cat test.json
> > >>> [
> > >>>  {"leFloat": 1, "leInt": 1},
> > >>>  {"leFloat": 0.1, "leInt": 1}
> > >>> ]
> > >>
> > >>
> > >> The exception:
> > >>
> > >>> 0: jdbc:drill:zk=local> select leFloat from
> > >>> file.`/Users/timcharper/data/test.json`;
> > >>> Error: DATA_READ ERROR: Error parsing JSON - You tried to write a
> > >>> Float8 type when you are using a ValueWriter of type
> > >>> NullableBigIntWriterImpl.
> > >>>
> > >>> File  /Users/timcharper/data/test.json
> > >>> Record  2
> > >>> Fragment 0:0
> > >>>
> > >>> [Error Id: e1ba7368-cbe2-401b-aa96-f01b7a8f97ae on 10.0.7.68:31010]
> > >>> (state=,code=0)
> > >>
> > >> If I switch the first record and the second, then I get this:
> > >>
> > >>> 0: jdbc:drill:zk=local> select leFloat from
> > >>> file.`/Users/timcharper/data/test.json`;
> > >>> Error: DATA_READ ERROR: Error parsing JSON - You tried to write a
> > >>> BigInt type when you are using a ValueWriter of type
> > >>> NullableFloat8WriterImpl.
> > >>>
> > >>> File  /Users/timcharper/data/test.json
> > >>> Record  2
> > >>> Fragment 0:0
> > >>>
> > >>> [Error Id: 02db1336-cbe2-42e8-aaa3-9a926598295a on 10.0.7.68:31010]
> > >>> (state=,code=0)
> > >>
> > >> If I set store.json.read_numbers_as_double to true, then things work,
> > >> but this is less than desirable. Is this the best solution for this
> > >> problem? Any way to provide a type hint in the query?
> > >>
> > >> Thanks!
> > >>
> > >> Tim
> >
> >
>
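
For reference, the store.json.read_numbers_as_double option Tim mentions is the
other session-level workaround. A sketch against his test.json (every JSON number
is read as DOUBLE, so no casts are needed, but integer fields come back as
doubles too):

alter session set `store.json.read_numbers_as_double` = true;

-- leFloat and leInt are both read as DOUBLE now, so the mixed
-- int/float values no longer trip the ValueWriter type error
select leFloat, leInt from file.`/Users/timcharper/data/test.json`;

alter session set `store.json.read_numbers_as_double` = false;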
