Thanks a lot for sharing your views.

On Tue, Aug 26, 2014 at 7:06 AM, Peyman Mohajerian <mohaj...@gmail.com>
wrote:

> The only option I know of in that case is using 'string' in Hive. You also
> have to see how something like Sqoop will bring the data over; you may need
> to cast the data type in Teradata first, using views. Those are my thoughts,
> there could be other tricks.
>
>
> On Mon, Aug 25, 2014 at 9:26 PM, reena upadhyay <reena2...@gmail.com>
> wrote:
>
>> Hi,
>>
>> As long as the data type is ANSI compliant, its equivalent type is
>> available in Hive. But there are a few data types that are
>> database-specific. For example, there is a PERIOD data type in Teradata
>> that is specific to Teradata only. So how do I map such columns in Hive?
>>
>> Thanks.
>>
>>
>> On Tue, Aug 26, 2014 at 6:44 AM, Peyman Mohajerian <mohaj...@gmail.com>
>> wrote:
>>
>>> As far as I know you cannot do that, and most likely you don't need it.
>>> Here are sample mappings between the two systems:
>>> Teradata                    Hive
>>> DECIMAL(x,y)                double
>>> DATE, TIMESTAMP             timestamp
>>> INTEGER, SMALLINT, BYTEINT  int
>>> VARCHAR, CHAR               string
>>> DECIMAL(x,0)                bigint
>>>
>>>
>>> I would typically stage data in Hadoop as all strings and then move it to
>>> a Hive managed/ORC table with the above mapping.
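>>>
>>> A rough sketch of that staging pattern (the column names, target types,
>>> and staging path below are hypothetical):
>>>
>>> -- Staging table: everything lands as string first.
>>> CREATE EXTERNAL TABLE orders_stg (
>>>   order_id STRING,
>>>   amount   STRING,
>>>   order_ts STRING
>>> )
>>> ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
>>> LOCATION '/staging/orders';
>>>
>>> -- Managed ORC table using the mapped Hive types.
>>> CREATE TABLE orders (
>>>   order_id BIGINT,
>>>   amount   DOUBLE,
>>>   order_ts TIMESTAMP
>>> )
>>> STORED AS ORC;
>>>
>>> -- Move the data over, casting from string to the target types.
>>> INSERT INTO TABLE orders
>>> SELECT CAST(order_id AS BIGINT),
>>>        CAST(amount   AS DOUBLE),
>>>        CAST(order_ts AS TIMESTAMP)
>>> FROM   orders_stg;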
>>>
>>>
>>>
>>>
>>> On Mon, Aug 25, 2014 at 8:42 PM, reena upadhyay <reena2...@gmail.com>
>>> wrote:
>>>
>>>> Hi,
>>>>
>>>> Is there any way to create a custom user-defined data type in Hive? I
>>>> want to move some table data from a Teradata database to Hive. But in the
>>>> Teradata tables, there are a few column data types that are not
>>>> supported in Hive. So to map the source table columns to my destination
>>>> table columns in Hive, I want to create my own data type in Hive.
>>>>
>>>> I know about writing UDFs in Hive but have no idea about creating a
>>>> user-defined data type in Hive. Any ideas or examples would be of
>>>> great help.
>>>>
>>>> Thanks.
>>>>
>>>
>>>
>>
>
