If you can escape those characters in a pre-processing step, then
you may find this useful:
https://cwiki.apache.org/confluence/display/Hive/CSV+Serde
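
A minimal sketch of what that DDL can look like (table name, column names
and the HDFS path are placeholders; the SerDe class is the OpenCSVSerde
described on that wiki page):

  -- Illustrative only: with this SerDe, quoted fields may contain commas,
  -- colons, quotes etc. without the row splitting into the wrong columns.
  CREATE EXTERNAL TABLE my_csv_table (
    id        STRING,
    free_text STRING
    -- ... remaining columns, all declared as STRING
  )
  ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.OpenCSVSerde'
  WITH SERDEPROPERTIES (
    "separatorChar" = ",",
    "quoteChar"     = "\"",
    "escapeChar"    = "\\"
  )
  STORED AS TEXTFILE
  LOCATION '/path/to/your/csv/directory';

Note that this SerDe returns every column as STRING, so numeric columns
would need a CAST (or a second, properly typed table) downstream.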

On Fri, Oct 30, 2015 at 11:36 AM, Martin Menzel <martin.men...@gmail.com>
wrote:

> Hi
> Do you have access to the data source?
> If not, you first have to find out whether the data can be mapped to the
> columns in a unique way for all rows. If yes, maybe Bindy can be an option
> to convert the data to TSV as a first step.
> I hope this helps.
> Regards
> Martin
> On 30.10.2015 at 19:16, "Vijaya Narayana Reddy Bhoomi Reddy" <
> vijaya.bhoomire...@whishworks.com> wrote:
>
>> Hi,
>>
>> I have a CSV file which contains a hundred thousand rows and more than 200
>> columns. Some of the columns hold free-text information, which means the
>> column content itself may contain characters like commas, colons, quotes,
>> etc.
>>
>> What is the best way to load such CSV file into Hive?
>>
>> Another serious issue: I stored the file in a location in HDFS and then
>> created an external Hive table on it. However, after running CREATE
>> EXTERNAL TABLE through the HDP Hive View, the original CSV is no longer
>> present in the folder where it is meant to be. I am not sure how HDP
>> processes it or where it is stored. My understanding was that an EXTERNAL
>> table's data wouldn't be moved from its original HDFS location?
>>
>> Request someone to help out!
>>
>>
>> Thanks & Regards
>> Vijay
>>
>>
>>
>> The contents of this e-mail are confidential and for the exclusive use of
>> the intended recipient. If you receive this e-mail in error please delete
>> it from your system immediately and notify us either by e-mail or
>> telephone. You should not copy, forward or otherwise disclose the content
>> of the e-mail. The views expressed in this communication may not
>> necessarily be the view held by WHISHWORKS.
>
>
