[ https://issues.apache.org/jira/browse/DRILL-1161?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Rahul Challapalli updated DRILL-1161:
-------------------------------------

    Attachment: error.log

> Drill Parquet writer fails with an Out of Memory issue when the data is large enough
> -------------------------------------------------------------------------------------
>
>                 Key: DRILL-1161
>                 URL: https://issues.apache.org/jira/browse/DRILL-1161
>             Project: Apache Drill
>          Issue Type: Bug
>          Components: Storage - Parquet, Storage - Writer
>            Reporter: Rahul Challapalli
>         Attachments: error.log
>
>
> git.commit.id.abbrev=e5c2da0
> The following query fails with an out-of-memory error:
> create table `wide-columns-100000` as select columns[0] col0, cast(columns[1] as int) col1 from `wide-columns-100000.tbl`;
> The source file contains 100,000 records, each with two columns: the first is a string of 100,000 characters and the second is an integer.
> Adding a LIMIT clause to the query makes it succeed. I have attached the error messages from drillbit.log and drillbit.out.
> Let me know if you need anything more.
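> For anyone trying to reproduce this, the sketch below (not part of the original report) generates a comparable input file. It assumes the default pipe-delimited text format that Drill maps to the .tbl extension; the output file name and the character used to pad the wide column are placeholders.
>
>     # Generate 100,000 records, each with a 100,000-character string column
>     # and an integer column, pipe-delimited to match a .tbl text file.
>     with open("wide-columns-100000.tbl", "w") as out:
>         wide_value = "a" * 100000            # 100,000-character string (column 0)
>         for i in range(100000):              # 100,000 records
>             out.write(wide_value + "|" + str(i) + "\n")  # integer (column 1)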



--
This message was sent by Atlassian JIRA
(v6.2#6252)
