I think what you are looking for is convert_to(column_name, 'JSON').
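
For example (an untested sketch, just wrapping the query from your mail), something
along these lines should hand back the geometry map serialized as JSON:

  select convert_to(json.features.geometry, 'JSON') as geoJson
  from (select flatten(features) as features
        from dfs.`/home/k255/CA-cities.json`) json limit 2;

If I remember right, convert_to gives you VARBINARY rather than VARCHAR, so you may
still need a cast (or convert_from(..., 'UTF8')) before handing it to a UDF that
expects a VarCharHolder.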

That being said, are you going to be parsing this JSON in the function? I
think it would make more sense to just have the function take the complex
input.

The only case where I would suggest taking in JSON and parsing it in the
function is if you have a bunch of source data where a standard read leaves
JSON embedded in a varchar (JSON embedded in a database column, a parquet
file, a JSON string (yes, amazingly, people do this)) and you want to save
users the hassle of calling convert_from on the varchar themselves, as in
the sketch below.
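
That hassle looks roughly like this (my_geo_udf, raw_table and json_col are
made-up names, purely for illustration):

  select my_geo_udf(convert_from(t.json_col, 'JSON'))
  from dfs.`/path/to/raw_table` t;

If the UDF accepted the varchar and parsed it internally, users with that kind of
source data could skip the convert_from call.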

Adding complexity to the query just so that your function can take a varchar
seems counterintuitive to me.

On Thu, Feb 11, 2016 at 11:13 AM, Karol Potocki <[email protected]> wrote:

> Hi, I am writing a UDF to convert parts of geoJSON to another representation.
> Having a query like this:
>
>  select json.features.geometry as geoJson from (select FLATTEN(features)
> as features from dfs.`/home/k255/CA-cities.json`) json limit 2;
>
> and result:
>
>  {"type":"Point","coordinates":[-121.2785588,38.1746398]}
>  {"type":"Point","coordinates":[-121.9591252,37.3946626]}
>
> I need to pass the output (which is a map) to a UDF which expects a string
> (VarCharHolder), so I can pass it to another function inside.
> Is there a way in Drill (other than handling a FieldReader) to convert this
> part of the JSON object to a string like this:
>
>  '{"type":"Point","coordinates":[-121.2785588,38.1746398]}'
>  '{"type":"Point","coordinates":[-121.9591252,37.3946626]}'
>
> ?
>
> Thanks,
> Karol Potocki
>
