Hi Ajit,
thank you for your nice summary. You seem to be missing the Sqoop arguments 
--null-string '\\N' and --null-non-string '\\N' on the import job, and 
--input-null-string '\\N' and --input-null-non-string '\\N' on the export. 
Would you mind adding them and rerunning your workflow?
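For reference, a rough sketch of what the adjusted commands might look like (illustrative only, based on the connection string and paths from your earlier messages; please verify against your environment before running):

```shell
# Import: have Sqoop write Hive's null token \N for NULL columns,
# so Hive recognizes them as NULL rather than literal "null" strings.
sqoop import --connect jdbc:oracle:thin:@10.99.42.11:1521/clouddb \
  --username HDFSUSER -P --table BTTN \
  --hive-import --hive-table bttn_bkp_test_new --create-hive-table \
  --hive-drop-import-delims \
  --null-string '\\N' --null-non-string '\\N'

# Export: have Sqoop interpret \N in the input files as NULL,
# so timestamp and numeric columns are not parsed from a "null" string.
sqoop export --connect jdbc:oracle:thin:@10.99.42.11:1521/clouddb \
  --username HDFSUSER -P --table BTTN_BKP_TEST \
  --export-dir /home/hadoop/user/hive/warehouse/bttn_bkp_test_new \
  -m 1 --input-fields-terminated-by '\0001' \
  --input-null-string '\\N' --input-null-non-string '\\N'
```

The key point is that both sides must agree on the same null token.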

Jarcec

On Wed, Mar 20, 2013 at 07:30:29AM +0000, Ajit Kumar Shreevastava wrote:
> Hi Jarcec,
> 
> 
> 
> Thank you for your valuable input; your suggestion seems valid. But I have 
> some doubts about the Sqoop behavior:
> 
> 1.       If null causes the confusion, then why are values similar to the 
> ones below inserted into the Oracle table with null treated as a 
> string?
> 
>      hive> select * from bttn_bkp_testing
> 
>               > where bttn_id=39126;
> 
> 
> 
> 39126.0 32436.0 3276.0  3.0     28.0    1.0     1.0     1.0     1.0     
> #FFFFFF #0000FF #0000FF #FFFFFF 0.0     0.0     NULL    NULL    1.0     1.0   
>   1.0     NULL   null     20.0    2010-05-04 14:31:17.0   dbmigration     
> 2013-01-18 09:11:18.37  DP_CQ4540       2010-11-29 15:45:03.976 ei009724      
>   1.0     null    NULL    null   NULL     0.0     61253.0 61124.0 61124.0 
> 61253.0
> 
> 39126.0 50805.0 3276.0  3.0     28.0    1.0     1.0     1.0     1.0     
> #FFFFFF #0000FF #0000FF #FFFFFF 0.0     0.0     NULL    NULL    1.0     1.0   
>   1.0     NULL   null     20.0    2010-05-23 23:18:54.604 ei103215        
> 2013-01-18 09:11:18.37  DP_CQ4540       2010-11-29 15:45:03.976 ei009724      
>   1.0     null    NULL    null   NULL     0.0     61253.0 61124.0 61124.0 
> 61253.0
> 
> 39126.0 63196.0 3276.0  3.0     28.0    1.0     1.0     1.0     1.0     
> #FFFFFF #0000FF #0000FF #FFFFFF 0.0     0.0     NULL    NULL    1.0     1.0   
>   1.0     NULL   null     20.0    2010-11-04 18:25:23.956 ei103215        
> 2013-01-18 09:11:18.37  DP_CQ4540       2010-11-29 15:45:03.976 ei009724      
>   1.0     null    NULL    null   NULL     0.0     61253.0 61124.0 61124.0 
> 61253.0
> 
> 
> 
> These values are inserted into the Oracle table BTTN_BKP_TEST as follows:
> 
> SQL> Select * from BTTN_BKP_TEST where bttn_id=39126;
> 
> 
> 
> 39126    32436    3276       3              28           1              1     
>          1              1              #FFFFFF               #0000FF          
>      #0000FF                #FFFFFF               0              0            
>                                   1              1              1             
>                  null         20           05/04/2010 2:31:17.000000 PM       
>    dbmigration       01/18/2013 9:11:18.370000 AM  DP_CQ4540        
> 11/29/2010 3:45:03.976000 PM                ei009724              1           
>    null                         null                         0              
> 61253    61124    61124    61253
> 
> 39126    50805    3276       3              28           1              1     
>          1              1              #FFFFFF               #0000FF          
>      #0000FF                #FFFFFF               0              0            
>                                   1              1              1             
>                  null         20           05/23/2010 11:18:54.604000 PM      
>   ei103215              01/18/2013 9:11:18.370000 AM  DP_CQ4540        
> 11/29/2010 3:45:03.976000 PM                ei009724              1           
>    null                         null                         0              
> 61253    61124    61124    61253
> 
> 39126    63196    3276       3              28           1              1     
>          1              1              #FFFFFF               #0000FF          
>      #0000FF                #FFFFFF               0              0            
>                                   1              1              1             
>                  null         20           11/04/2010 6:25:23.956000 PM       
>    ei103215              01/18/2013 9:11:18.370000 AM  DP_CQ4540        
> 11/29/2010 3:45:03.976000 PM                ei009724              1           
>    null                         null                         0              
> 61253    61124    61124    61253
> 
> 
> 
> But it raised an exception for the values below:
> 
> hive> select * from bttn_bkp_testing
> 
>         > where bttn_id= 194628.0;
> 
> 
> 
> 194628.0        577019.0        8910.0  19.0    1.0     1.0     1.0     0.0   
>   0.0     #FFFFFF #FF0000 #FF0000 #FFFFFF 0.0     0.0     1646.0  NULL    
> NULL    NULL   1.0      NULL    null    20.0    2012-04-19 23:25:48.78  
> ei009724        2013-01-18 09:11:30.245 DP_CQ4540       null    null    0.0   
>   BLUEBERRY MUFFIN        7836.0 null     NULL    0.0     61259.0 61230.0 
> 61230.0 61259.0
> 
> 194628.0        706360.0        8910.0  19.0    1.0     1.0     1.0     0.0   
>   0.0     #FFFFFF #FF0000 #FF0000 #FFFFFF 0.0     0.0     1646.0  NULL    
> NULL    NULL   1.0      NULL    null    20.0    2012-05-21 01:01:53.629 
> ei103215        2013-01-18 09:11:30.245 DP_CQ4540       null    null    0.0   
>   BLUEBERRY MUFFIN        7836.0 null     NULL    0.0     61259.0 61230.0 
> 61230.0 61259.0
> 
> 194628.0        1620395.0       8910.0  19.0    1.0     1.0     1.0     0.0   
>   0.0     #FFFFFF #FF0000 #FF0000 #FFFFFF 0.0     0.0     1646.0  NULL    
> NULL    NULL   1.0      NULL    null    20.0    2012-08-10 04:34:00.203 
> ei103215        2013-01-18 09:11:30.245 DP_CQ4540       null    null    0.0   
>   BLUEBERRY MUFFIN        7836.0 null     NULL    0.0     61259.0 61230.0 
> 61230.0 61259.0
> 
> 194628.0        1694103.0       8910.0  19.0    1.0     1.0     1.0     0.0   
>   0.0     #FFFFFF #FF0000 #FF0000 #FFFFFF 0.0     0.0     1646.0  NULL    
> NULL    NULL   1.0      NULL    null    20.0    2012-11-08 01:09:15.136 
> ei103215        2013-01-18 09:11:30.245 DP_CQ4540       null    null    0.0   
>   BLUEBERRY MUFFIN        7836.0 null     NULL    0.0     61259.0 61230.0 
> 61230.0 61259.0
> 
> 194628.0        1831767.0       8910.0  19.0    1.0     1.0     1.0     0.0   
>   0.0     #FFFFFF #FF0000 #FF0000 #FFFFFF 0.0     0.0     1646.0  NULL    
> NULL    NULL   1.0      NULL    null    20.0    2012-12-19 23:44:44.241 
> e0025129        2013-01-18 09:11:30.245 DP_CQ4540       null    null    0.0   
>   BLUEBERRY MUFFIN        7836.0 null     NULL    0.0     61259.0 61230.0 
> 61230.0 61259.0
> 
> 
> 
> 2.       For your information, I have noted two interesting facts here 
> regarding Sqoop behavior. First, I imported the BTTN table from Oracle 
> into the Hive table bttn_bkp_test_new using the following command:
> 
> [hadoop@NHCLT-PC44-2 ~]$ sqoop import --connect 
> jdbc:oracle:thin:@10.99.42.11:1521/clouddb --username HDFSUSER  --table BTTN 
> --verbose -P --hive-table bttn_bkp_test_new --create-hive-table --hive-import 
> --hive-drop-import-delims --hive-home /home/hadoop/user/hive/warehouse
> 
> The above command imported all the rows into the Hive table bttn_bkp_test_new, 
> and Sqoop created some values as "null" and some as NULL.
> 
> 
> 
> Then I created a new table BTTN_BKP_TEST in the Oracle database and tried to 
> export the Hive table bttn_bkp_test_new created above into the Oracle table 
> BTTN_BKP_TEST:
> 
> [hadoop@NHCLT-PC44-2 sqoop-oper]$ sqoop export --connect 
> jdbc:oracle:thin:@10.99.42.11:1521/clouddb --username HDFSUSER  --table 
> BTTN_BKP_TEST --export-dir  
> /home/hadoop/user/hive/warehouse/bttn_bkp_test_new -P -m 1  
> --input-fields-terminated-by '\0001' -verbose
> 
> 
> 
> All the data was inserted properly into BTTN_BKP_TEST, and the null values in 
> the Hive table were inserted as actual NULLs, not as the string "null" (all 
> data matched the old BTTN table data in Oracle).
> 
> 
> 
> Next, I created a new table in Hive with the following commands:
> hive> create table bttn_bkp_testing like bttn_bkp_test_new;
> hive> insert OVERWRITE table bttn_bkp_testing
> 
>     > select * from bttn_bkp_test_new;
> 
> 
> 
> Now I will describe two scenarios:
> 
> 
> 
> a.       Now I have truncated the bttn_bkp_test table in Oracle and tried to 
> repopulate it from the new Hive table bttn_bkp_testing (just created from 
> bttn_bkp_test_new) with the following command:
> [hadoop@NHCLT-PC44-2 sqoop-oper]$ sqoop export --connect 
> jdbc:oracle:thin:@10.99.42.11:1521/clouddb --username HDFSUSER  --table 
> BTTN_BKP_TEST --export-dir  /home/hadoop/user/hive/warehouse/bttn_bkp_testing 
> -P -m 1  --input-fields-terminated-by '\0001' -verbose --update-key 
> BTTN_ID,DATA_INST_ID,SCR_ID --update-mode allowinsert
> 
> 
> 
> And I got the error below:
> 
> 13/03/20 12:13:39 DEBUG mapreduce.ExportInputFormat:   
> Paths:/home/hadoop/user/hive/warehouse/bttn_bkp_testing/000000_0:0+67108864,/home/hadoop/user/hive/warehouse/bttn_bkp_testing/000000_0:67108864+67108864,/home/hadoop/user/hive/warehouse/bttn_bkp_testing/000000_0:134217728+65312499
>  Locations:NHCLT-PC44-2.hclt.corp.hcl.in:;
> 
> 13/03/20 12:13:39 INFO mapred.JobClient: Running job: job_201303191912_0005
> 
> 13/03/20 12:13:40 INFO mapred.JobClient:  map 0% reduce 0%
> 
> 13/03/20 12:13:52 INFO mapred.JobClient: Task Id : 
> attempt_201303191912_0005_m_000000_0, Status : FAILED
> 
> java.io.IOException: Can't export data, please check task tracker logs
> 
>         at 
> org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
> 
>         at 
> org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
> 
>         at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
> 
>         at 
> org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
> 
>         at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
> 
>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> 
>         at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> 
>         at java.security.AccessController.doPrivileged(Native Method)
> 
>         at javax.security.auth.Subject.doAs(Subject.java:396)
> 
>         at 
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
> 
>         at org.apache.hadoop.mapred.Child.main(Child.java:249)
> 
> Caused by: java.lang.NumberFormatException
> 
>         at java.math.BigDecimal.<init>(BigDecimal.java:459)
> 
>         at java.math.BigDecimal.<init>(BigDecimal.java:728)
> 
>         at BTTN_BKP_TEST.__loadFromFields(BTTN_BKP_TEST.java:1314)
> 
>         at BTTN_BKP_TEST.parse(BTTN_BKP_TEST.java:1191)
> 
>         at 
> org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
> 
>         ... 10 more
> 
> 
> 
> I am attaching the mapper log here (syslog_for_first_export).
> 
> In this mapper log I can see that the input file value is null.
> 
> 
> 
> Any idea why it behaves like this?
> 
> 
> 
> b.      Second scenario:
> 
> Now I have truncated the bttn_bkp_test table in Oracle and tried to 
> repopulate it from the new Hive table bttn_bkp_testing (just created from 
> bttn_bkp_test_new) with the following command:
> 
> [hadoop@NHCLT-PC44-2 sqoop-oper]$ sqoop export --connect 
> jdbc:oracle:thin:@10.99.42.11:1521/clouddb --username HDFSUSER  --table 
> BTTN_BKP_TEST --export-dir  /home/hadoop/user/hive/warehouse/bttn_bkp_testing 
> -P -m 1  --input-fields-terminated-by '\0001' -verbose --update-key 
> BTTN_ID,DATA_INST_ID,SCR_ID --update-mode allowinsert --input-null-string 
> '\\N' --input-null-non-string '\\N'
> 
> 
> 
> And I got the error below:
> 
> 13/03/20 12:41:58 DEBUG mapreduce.ExportInputFormat:   
> Paths:/home/hadoop/user/hive/warehouse/bttn_bkp_testing/000000_0:0+67108864,/home/hadoop/user/hive/warehouse/bttn_bkp_testing/000000_0:67108864+67108864,/home/hadoop/user/hive/warehouse/bttn_bkp_testing/000000_0:134217728+65312499
>  Locations:NHCLT-PC44-2.hclt.corp.hcl.in:;
> 
> 13/03/20 12:41:58 INFO mapred.JobClient: Running job: job_201303191912_0007
> 
> 13/03/20 12:41:59 INFO mapred.JobClient:  map 0% reduce 0%
> 
> 13/03/20 12:42:15 INFO mapred.JobClient:  map 6% reduce 0%
> 
> 13/03/20 12:42:18 INFO mapred.JobClient:  map 11% reduce 0%
> 
> 13/03/20 12:42:21 INFO mapred.JobClient:  map 17% reduce 0%
> 
> 13/03/20 12:42:24 INFO mapred.JobClient:  map 22% reduce 0%
> 
> 13/03/20 12:42:27 INFO mapred.JobClient:  map 27% reduce 0%
> 
> 13/03/20 12:42:30 INFO mapred.JobClient:  map 33% reduce 0%
> 
> 13/03/20 12:42:33 INFO mapred.JobClient:  map 35% reduce 0%
> 
> 13/03/20 12:42:36 INFO mapred.JobClient:  map 39% reduce 0%
> 
> 13/03/20 12:42:39 INFO mapred.JobClient:  map 44% reduce 0%
> 
> 13/03/20 12:42:42 INFO mapred.JobClient:  map 46% reduce 0%
> 
> 13/03/20 12:42:45 INFO mapred.JobClient:  map 51% reduce 0%
> 
> 13/03/20 12:42:48 INFO mapred.JobClient:  map 56% reduce 0%
> 
> 13/03/20 12:42:51 INFO mapred.JobClient:  map 62% reduce 0%
> 
> 13/03/20 12:42:54 INFO mapred.JobClient:  map 65% reduce 0%
> 
> 13/03/20 12:42:59 INFO mapred.JobClient: Task Id : 
> attempt_201303191912_0007_m_000000_0, Status : FAILED
> 
> java.io.IOException: Can't export data, please check task tracker logs
> 
>         at 
> org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
> 
>         at 
> org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
> 
>         at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
> 
>         at 
> org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
> 
>         at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
> 
>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> 
>         at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> 
>         at java.security.AccessController.doPrivileged(Native Method)
> 
>         at javax.security.auth.Subject.doAs(Subject.java:396)
> 
>         at 
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
> 
>         at org.apache.hadoop.mapred.Child.main(Child.java:249)
> 
> Caused by: java.lang.IllegalArgumentException: Timestamp format must be 
> yyyy-mm-dd hh:mm:ss[.fffffffff]
> 
>         at java.sql.Timestamp.valueOf(Timestamp.java:185)
> 
>         at BTTN_BKP_TEST.__loadFromFields(BTTN_BKP_TEST.java:1374)
> 
>         at BTTN_BKP_TEST.parse(BTTN_BKP_TEST.java:1191)
> 
>         at 
> org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
> 
>         ... 10 more
> 
> 
> 
> I am attaching the mapper log here (syslog_for_2nd_export).
> 
> In this mapper log I can see the message "On input file: 
> /home/hadoop/user/hive/warehouse/bttn_bkp_testing/000000".
> 
> 
> 
> Here I can see that the null values in the Hive table bttn_bkp_testing were 
> inserted as the string "null" in the Oracle table BTTN_BKP_TEST.
> 
> 
> 
> hive> select * from bttn_bkp_testing
> 
>               > where bttn_id=39126;
> 
> 39126.0 32436.0 3276.0  3.0     28.0    1.0     1.0     1.0     1.0     
> #FFFFFF #0000FF #0000FF #FFFFFF 0.0     0.0     NULL    NULL    1.0     1.0   
>   1.0     NULL   null     20.0    2010-05-04 14:31:17.0   dbmigration     
> 2013-01-18 09:11:18.37  DP_CQ4540       2010-11-29 15:45:03.976 ei009724      
>   1.0     null    NULL    null   NULL     0.0     61253.0 61124.0 61124.0 
> 61253.0
> 
> 39126.0 50805.0 3276.0  3.0     28.0    1.0     1.0     1.0     1.0     
> #FFFFFF #0000FF #0000FF #FFFFFF 0.0     0.0     NULL    NULL    1.0     1.0   
>   1.0     NULL   null     20.0    2010-05-23 23:18:54.604 ei103215        
> 2013-01-18 09:11:18.37  DP_CQ4540       2010-11-29 15:45:03.976 ei009724      
>   1.0     null    NULL    null   NULL     0.0     61253.0 61124.0 61124.0 
> 61253.0
> 
> 39126.0 63196.0 3276.0  3.0     28.0    1.0     1.0     1.0     1.0     
> #FFFFFF #0000FF #0000FF #FFFFFF 0.0     0.0     NULL    NULL    1.0     1.0   
>   1.0     NULL   null     20.0    2010-11-04 18:25:23.956 ei103215        
> 2013-01-18 09:11:18.37  DP_CQ4540       2010-11-29 15:45:03.976 ei009724      
>   1.0     null    NULL    null   NULL     0.0     61253.0 61124.0 61124.0 
> 61253.0
> 
> 
> 
> These values are inserted into the Oracle table BTTN_BKP_TEST as follows:
> 
> SQL> Select * from BTTN_BKP_TEST where bttn_id=39126;
> 
> 
> 
> 39126            32436    3276       3              28           1            
>   1              1              1              #FFFFFF               #0000FF  
>       #0000FF               #FFFFFF               0              0            
>                                   1              1              1             
>                  null         20        05/04/2010 2:31:17.000000 PM   
> dbmigration       01/18/2013 9:11:18.370000 AM  DP_CQ4540        11/29/2010 
> 3:45:03.976000 PM  ei009724              1              null                  
>        null                         0              61253    61124    61124    
>     61253
> 
> 39126            50805    3276       3              28           1            
>   1              1              1              #FFFFFF               #0000FF  
>       #0000FF               #FFFFFF               0              0            
>                                   1              1              1             
>                  null         20        05/23/2010 11:18:54.604000 PM         
>        ei103215              01/18/2013 9:11:18.370000 AM  DP_CQ4540        
> 11/29/2010 3:45:03.976000 PM   ei009724              1              null      
>                    null                         0              61253        
> 61124    61124    61253
> 
> 39126            63196    3276       3              28           1            
>   1              1              1              #FFFFFF               #0000FF  
>       #0000FF               #FFFFFF               0              0            
>                                   1              1              1             
>                  null         20        11/04/2010 6:25:23.956000 PM   
> ei103215              01/18/2013 9:11:18.370000 AM  DP_CQ4540        
> 11/29/2010 3:45:03.976000 PM  ei009724              1              null       
>                   null                         0              61253    61124  
>   61124        61253
> 
> 
> 
> 
> 
> Looking forward to your valuable suggestions on the above facts.
> 
> Is this a bug in Sqoop?
> 
> 
> 
> Regards,
> 
> Ajit
> 
> 
> 
> 
> 
> -----Original Message-----
> From: Jarek Jarcec Cecho [mailto:[email protected]]
> Sent: Wednesday, March 20, 2013 2:56 AM
> To: [email protected]
> Subject: Re: Exporting hive table data into oracle give date format error
> 
> 
> 
> Hi Ajit,
> 
> thank you for sharing the additional data. I've noticed in your data that 
> some of the columns are using \N to denote the NULL value, while some other 
> columns are using the string constant "null" (which does not denote NULL in 
> Hive). This also seems to be the case for the column DEL_TS. My guess is that 
> Sqoop is trying to decode the "null" string as a timestamp and failing with 
> the "Timestamp format must be..." exception. I would recommend unifying the 
> null representation tokens and running the Sqoop export with the appropriate one.
> 
> 
> 
> Jarcec
> 
> 
> 
> On Tue, Mar 19, 2013 at 08:13:01AM +0000, Ajit Kumar Shreevastava wrote:
> 
> > Hi Jarcec,
> 
> >
> 
> >
> 
> >
> 
> > Thank you for your valuable suggestions.
> 
> >
> 
> >
> 
> >
> 
> > I have applied the suggestion below and redone the whole process with 
> > Sqoop 1.4.3 (sqoop-1.4.3.bin__hadoop-1.0.0.tar.gz), but I faced the same 
> > error again. Please advise.
> 
> >
> 
> >
> 
> >
> 
> > Here I have created the table in Hive as suggested by you.
> 
> >
> 
> >
> 
> >
> 
> > hive> create table bttn_bkp_testing like bttn_bkp_test;
> 
> >
> 
> > hive> insert OVERWRITE table bttn_bkp_testing
> 
> >
> 
> >         > select * from bttn_bkp_test;
> 
> >
> 
> >
> 
> >
> 
> > I am also attaching the error file generated by the task tracker for your 
> > analysis.
> 
> >
> 
> > It fails for bttn_id = 194628
> 
> >
> 
> >
> 
> >
> 
> > I have queried both tables and the records are as follows:
> 
> >
> 
> >
> 
> >
> 
> > hive> select * from bttn_bkp_testing
> 
> >
> 
> >     > where bttn_id=194628;
> 
> >
> 
> >
> 
> >
> 
> > 194628.0        577019.0        8910.0  19.0    1.0     1.0     1.0     0.0 
> >     0.0     #FFFFFF #FF0000 #FF0000 #FFFFFF 0.0     0.0     1646.0  NULL    
> > NULL    NULL   1.0      NULL    null    20.0    2012-04-19 23:25:48.78  
> > ei009724        2013-01-18 09:11:30.245 DP_CQ4540       null    null    0.0 
> >     BLUEBERRY MUFFIN        7836.0 null     NULL    0.0     61259.0 61230.0 
> > 61230.0 61259.0
> 
> >
> 
> > 194628.0        706360.0        8910.0  19.0    1.0     1.0     1.0     0.0 
> >     0.0     #FFFFFF #FF0000 #FF0000 #FFFFFF 0.0     0.0     1646.0  NULL    
> > NULL    NULL   1.0      NULL    null    20.0    2012-05-21 01:01:53.629 
> > ei103215        2013-01-18 09:11:30.245 DP_CQ4540       null    null    0.0 
> >     BLUEBERRY MUFFIN        7836.0 null     NULL    0.0     61259.0 61230.0 
> > 61230.0 61259.0
> 
> >
> 
> > 194628.0        1620395.0       8910.0  19.0    1.0     1.0     1.0     0.0 
> >     0.0     #FFFFFF #FF0000 #FF0000 #FFFFFF 0.0     0.0     1646.0  NULL    
> > NULL    NULL   1.0      NULL    null    20.0    2012-08-10 04:34:00.203 
> > ei103215        2013-01-18 09:11:30.245 DP_CQ4540       null    null    0.0 
> >     BLUEBERRY MUFFIN        7836.0 null     NULL    0.0     61259.0 61230.0 
> > 61230.0 61259.0
> 
> >
> 
> > 194628.0        1694103.0       8910.0  19.0    1.0     1.0     1.0     0.0 
> >     0.0     #FFFFFF #FF0000 #FF0000 #FFFFFF 0.0     0.0     1646.0  NULL    
> > NULL    NULL   1.0      NULL    null    20.0    2012-11-08 01:09:15.136 
> > ei103215        2013-01-18 09:11:30.245 DP_CQ4540       null    null    0.0 
> >     BLUEBERRY MUFFIN        7836.0 null     NULL    0.0     61259.0 61230.0 
> > 61230.0 61259.0
> 
> >
> 
> > 194628.0        1831767.0       8910.0  19.0    1.0     1.0     1.0     0.0 
> >     0.0     #FFFFFF #FF0000 #FF0000 #FFFFFF 0.0     0.0     1646.0  NULL    
> > NULL    NULL   1.0      NULL    null    20.0    2012-12-19 23:44:44.241 
> > e0025129        2013-01-18 09:11:30.245 DP_CQ4540       null    null    0.0 
> >     BLUEBERRY MUFFIN        7836.0 null     NULL    0.0     61259.0 61230.0 
> > 61230.0 61259.0
> 
> >
> 
> >
> 
> >
> 
> > And
> 
> >
> 
> > hive> select * from bttn_bkp_test_new
> 
> >
> 
> >     > where bttn_id=194628;
> 
> >
> 
> >
> 
> >
> 
> > 194628.0        577019.0        8910.0  19.0    1.0     1.0     1.0     0.0 
> >     0.0     #FFFFFF #FF0000 #FF0000 #FFFFFF 0.0     0.0     1646.0  NULL    
> > NULL    NULL   1.0      NULL    null    20.0    2012-04-19 23:25:48.78  
> > ei009724        2013-01-18 09:11:30.245 DP_CQ4540       null    null    0.0 
> >     BLUEBERRY MUFFIN        7836.0 null     NULL    0.0     61259.0 61230.0 
> > 61230.0 61259.0
> 
> >
> 
> > 194628.0        706360.0        8910.0  19.0    1.0     1.0     1.0     0.0 
> >     0.0     #FFFFFF #FF0000 #FF0000 #FFFFFF 0.0     0.0     1646.0  NULL    
> > NULL    NULL   1.0      NULL    null    20.0    2012-05-21 01:01:53.629 
> > ei103215        2013-01-18 09:11:30.245 DP_CQ4540       null    null    0.0 
> >     BLUEBERRY MUFFIN        7836.0 null     NULL    0.0     61259.0 61230.0 
> > 61230.0 61259.0
> 
> >
> 
> > 194628.0        1620395.0       8910.0  19.0    1.0     1.0     1.0     0.0 
> >     0.0     #FFFFFF #FF0000 #FF0000 #FFFFFF 0.0     0.0     1646.0  NULL    
> > NULL    NULL   1.0      NULL    null    20.0    2012-08-10 04:34:00.203 
> > ei103215        2013-01-18 09:11:30.245 DP_CQ4540       null    null    0.0 
> >     BLUEBERRY MUFFIN        7836.0 null     NULL    0.0     61259.0 61230.0 
> > 61230.0 61259.0
> 
> >
> 
> > 194628.0        1694103.0       8910.0  19.0    1.0     1.0     1.0     0.0 
> >     0.0     #FFFFFF #FF0000 #FF0000 #FFFFFF 0.0     0.0     1646.0  NULL    
> > NULL    NULL   1.0      NULL    null    20.0    2012-11-08 01:09:15.136 
> > ei103215        2013-01-18 09:11:30.245 DP_CQ4540       null    null    0.0 
> >     BLUEBERRY MUFFIN        7836.0 null     NULL    0.0     61259.0 61230.0 
> > 61230.0 61259.0
> 
> >
> 
> > 194628.0        1831767.0       8910.0  19.0    1.0     1.0     1.0     0.0 
> >     0.0     #FFFFFF #FF0000 #FF0000 #FFFFFF 0.0     0.0     1646.0  NULL    
> > NULL    NULL   1.0      NULL    null    20.0    2012-12-19 23:44:44.241 
> > e0025129        2013-01-18 09:11:30.245 DP_CQ4540       null    null    0.0 
> >     BLUEBERRY MUFFIN        7836.0 null     NULL    0.0     61259.0 61230.0 
> > 61230.0 61259.0
> 
> >
> 
> >
> 
> >
> 
> > Regards,
> 
> >
> 
> > Ajit Kumar Shreevastava
> 
> >
> 
> >
> 
> >
> 
> > -----Original Message-----
> 
> > From: Jarek Jarcec Cecho [mailto:[email protected]]
> 
> > Sent: Sunday, March 17, 2013 4:29 AM
> 
> > To: [email protected]
> 
> > Subject: Re: Exporting hive table data into oracle give date format error
> 
> >
> 
> >
> 
> >
> 
> > [-CC [email protected]]
> 
> >
> 
> >
> 
> >
> 
> > Hi Ajit,
> 
> >
> 
> > would you mind upgrading to Sqoop 1.4.3? We've improved the logging for 
> > this particular exception, so it should significantly help in triangulating 
> > your issue.
> 
> >
> 
> >
> 
> >
> 
> > Jarcec
> 
> >
> 
> >
> 
> >
> 
> > On Wed, Mar 13, 2013 at 01:43:11PM +0000, Ajit Kumar Shreevastava wrote:
> 
> >
> 
> > > Hi All,
> 
> >
> 
> > >
> 
> >
> 
> > > Can you please let me know how I can bypass this error? I am currently 
> > > using Apache Sqoop version 1.4.2.
> 
> >
> 
> > >
> 
> >
> 
> > >
> 
> >
> 
> > > [hadoop@NHCLT-PC44-2 sqoop-oper]$ sqoop export --connect 
> > > jdbc:oracle:thin:@10.99.42.11:1521/clouddb --username HDFSUSER  --table 
> > > BTTN_BKP_TEST --export-dir  /home/hadoop/user/hive/warehouse/bttn_bkp -P 
> > > -m 1  --input-fields-terminated-by '\0001' --verbose --input-null-string 
> > > '\\N' --input-null-non-string '\\N'
> 
> >
> 
> > >
> 
> >
> 
> > > Please set $HBASE_HOME to the root of your HBase installation.
> 
> >
> 
> > > 13/03/13 18:20:42 DEBUG tool.BaseSqoopTool: Enabled debug logging.
> 
> >
> 
> > > Enter password:
> 
> >
> 
> > > 13/03/13 18:20:47 DEBUG sqoop.ConnFactory: Loaded manager factory:
> 
> >
> 
> > > com.cloudera.sqoop.manager.DefaultManagerFactory
> 
> >
> 
> > > 13/03/13 18:20:47 DEBUG sqoop.ConnFactory: Trying ManagerFactory:
> 
> >
> 
> > > com.cloudera.sqoop.manager.DefaultManagerFactory
> 
> >
> 
> > > 13/03/13 18:20:47 DEBUG manager.DefaultManagerFactory: Trying with
> 
> >
> 
> > > scheme: jdbc:oracle:thin:@10.99.42.11
> 
> >
> 
> > > 13/03/13 18:20:47 DEBUG manager.OracleManager$ConnCache: Instantiated new 
> > > connection cache.
> 
> >
> 
> > > 13/03/13 18:20:47 INFO manager.SqlManager: Using default fetchSize of
> 
> >
> 
> > > 1000
> 
> >
> 
> > > 13/03/13 18:20:47 DEBUG sqoop.ConnFactory: Instantiated ConnManager
> 
> >
> 
> > > org.apache.sqoop.manager.OracleManager@74b23210
> 
> >
> 
> > > 13/03/13 18:20:47 INFO tool.CodeGenTool: Beginning code generation
> 
> >
> 
> > > 13/03/13 18:20:47 DEBUG manager.OracleManager: Using column names
> 
> >
> 
> > > query: SELECT t.* FROM BTTN_BKP_TEST t WHERE 1=0
> 
> >
> 
> > > 13/03/13 18:20:47 DEBUG manager.OracleManager: Creating a new
> 
> >
> 
> > > connection for jdbc:oracle:thin:@10.99.42.11:1521/clouddb, using
> 
> >
> 
> > > username: HDFSUSER
> 
> >
> 
> > > 13/03/13 18:20:47 DEBUG manager.OracleManager: No connection paramenters 
> > > specified. Using regular API for making connection.
> 
> >
> 
> > > 13/03/13 18:20:47 INFO manager.OracleManager: Time zone has been set
> 
> >
> 
> > > to GMT
> 
> >
> 
> > > 13/03/13 18:20:47 DEBUG manager.SqlManager: Using fetchSize for next
> 
> >
> 
> > > query: 1000
> 
> >
> 
> > > 13/03/13 18:20:47 INFO manager.SqlManager: Executing SQL statement:
> 
> >
> 
> > > SELECT t.* FROM BTTN_BKP_TEST t WHERE 1=0
> 
> >
> 
> > > 13/03/13 18:20:47 DEBUG manager.OracleManager$ConnCache: Caching
> 
> >
> 
> > > released connection for
> 
> >
> 
> > > jdbc:oracle:thin:@10.99.42.11:1521/clouddb/HDFSUSER
> 
> >
> 
> > > 13/03/13 18:20:47 DEBUG orm.ClassWriter: selected columns:
> 
> >
> 
> > > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   BTTN_ID
> 
> >
> 
> > > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   DATA_INST_ID
> 
> >
> 
> > > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   SCR_ID
> 
> >
> 
> > > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   BTTN_NU
> 
> >
> 
> > > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   CAT
> 
> >
> 
> > > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   WDTH
> 
> >
> 
> > > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   HGHT
> 
> >
> 
> > > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   KEY_SCAN
> 
> >
> 
> > > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   KEY_SHFT
> 
> >
> 
> > > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   FRGND_CPTN_COLR
> 
> >
> 
> > > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   FRGND_CPTN_COLR_PRSD
> 
> >
> 
> > > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   BKGD_CPTN_COLR
> 
> >
> 
> > > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   BKGD_CPTN_COLR_PRSD
> 
> >
> 
> > > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   BLM_FL
> 
> >
> 
> > > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   LCLZ_FL
> 
> >
> 
> > > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   MENU_ITEM_NU
> 
> >
> 
> > > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   BTTN_ASGN_LVL_ID
> 
> >
> 
> > > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   ON_ATVT
> 
> >
> 
> > > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   ON_CLIK
> 
> >
> 
> > > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   ENBL_FL
> 
> >
> 
> > > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   BLM_SET_ID
> 
> >
> 
> > > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   BTTN_ASGN_LVL_NAME
> 
> >
> 
> > > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   MKT_ID
> 
> >
> 
> > > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   CRTE_TS
> > > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   CRTE_USER_ID
> > > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   UPDT_TS
> > > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   UPDT_USER_ID
> > > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   DEL_TS
> > > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   DEL_USER_ID
> > > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   DLTD_FL
> > > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   MENU_ITEM_NA
> > > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   PRD_CD
> > > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   BLM_SET_NA
> > > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   SOUND_FILE_ID
> > > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   IS_DYNMC_BTTN
> > > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   FRGND_CPTN_COLR_ID
> > > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   FRGND_CPTN_COLR_PRSD_ID
> > > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   BKGD_CPTN_COLR_ID
> > > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   BKGD_CPTN_COLR_PRSD_ID
> > > 13/03/13 18:20:47 DEBUG orm.ClassWriter: Writing source file: /tmp/sqoop-hadoop/compile/69b6a9d2ebb99cebced808e559528531/BTTN_BKP_TEST.java
> > > 13/03/13 18:20:47 DEBUG orm.ClassWriter: Table name: BTTN_BKP_TEST
> > > 13/03/13 18:20:47 DEBUG orm.ClassWriter: Columns: BTTN_ID:2, DATA_INST_ID:2, SCR_ID:2, BTTN_NU:2, CAT:2, WDTH:2, HGHT:2, KEY_SCAN:2, KEY_SHFT:2, FRGND_CPTN_COLR:12, FRGND_CPTN_COLR_PRSD:12, BKGD_CPTN_COLR:12, BKGD_CPTN_COLR_PRSD:12, BLM_FL:2, LCLZ_FL:2, MENU_ITEM_NU:2, BTTN_ASGN_LVL_ID:2, ON_ATVT:2, ON_CLIK:2, ENBL_FL:2, BLM_SET_ID:2, BTTN_ASGN_LVL_NAME:12, MKT_ID:2, CRTE_TS:93, CRTE_USER_ID:12, UPDT_TS:93, UPDT_USER_ID:12, DEL_TS:93, DEL_USER_ID:12, DLTD_FL:2, MENU_ITEM_NA:12, PRD_CD:2, BLM_SET_NA:12, SOUND_FILE_ID:2, IS_DYNMC_BTTN:2, FRGND_CPTN_COLR_ID:2, FRGND_CPTN_COLR_PRSD_ID:2, BKGD_CPTN_COLR_ID:2, BKGD_CPTN_COLR_PRSD_ID:2,
> > > 13/03/13 18:20:47 DEBUG orm.ClassWriter: sourceFilename is BTTN_BKP_TEST.java
> > > 13/03/13 18:20:47 DEBUG orm.CompilationManager: Found existing /tmp/sqoop-hadoop/compile/69b6a9d2ebb99cebced808e559528531/
> > > 13/03/13 18:20:47 INFO orm.CompilationManager: HADOOP_HOME is /home/hadoop/hadoop-1.0.3/libexec/..
> > > 13/03/13 18:20:47 DEBUG orm.CompilationManager: Adding source file: /tmp/sqoop-hadoop/compile/69b6a9d2ebb99cebced808e559528531/BTTN_BKP_TEST.java
> > > 13/03/13 18:20:47 DEBUG orm.CompilationManager: Invoking javac with args:
> > > 13/03/13 18:20:47 DEBUG orm.CompilationManager:   -sourcepath
> > > 13/03/13 18:20:47 DEBUG orm.CompilationManager:   /tmp/sqoop-hadoop/compile/69b6a9d2ebb99cebced808e559528531/
> > > 13/03/13 18:20:47 DEBUG orm.CompilationManager:   -d
> > > 13/03/13 18:20:47 DEBUG orm.CompilationManager:   /tmp/sqoop-hadoop/compile/69b6a9d2ebb99cebced808e559528531/
> > > 13/03/13 18:20:47 DEBUG orm.CompilationManager:   -classpath
> > > 13/03/13 18:20:47 DEBUG orm.CompilationManager:   /home/hadoop/hadoop-1.0.3/libexec/../conf:/usr/java/jdk1.6.0_32/lib/tools.jar:/home/hadoop/hadoop-1.0.3/libexec/..:/home/hadoop/hadoop-1.0.3/libexec/../hadoop-core-1.0.3.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/asm-3.2.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/aspectjrt-1.6.5.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/aspectjtools-1.6.5.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-beanutils-1.7.0.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-beanutils-core-1.8.0.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-cli-1.2.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-codec-1.4.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-collections-3.2.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-configuration-1.6.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-daemon-1.0.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-digester-1.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-el-1.0.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-httpclient-3.0.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-io-2.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-lang-2.4.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-logging-1.1.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-logging-api-1.0.4.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-math-2.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-net-1.4.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/core-3.1.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/hadoop-capacity-scheduler-1.0.3.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/hadoop-fairscheduler-1.0.3.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/hadoop-thriftfs-1.0.3.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/hsqldb-1.8.0.10.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jackson-core-asl-1.8.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jasper-compiler-5.5.12.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jasper-runtime-5.5.12.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jdeb-0.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jersey-core-1.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jersey-json-1.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jersey-server-1.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jets3t-0.6.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jetty-6.1.26.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jetty-util-6.1.26.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jsch-0.1.42.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/junit-4.5.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/kfs-0.2.2.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/log4j-1.2.15.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/mockito-all-1.8.5.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/oro-2.0.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/servlet-api-2.5-20081211.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/slf4j-api-1.4.3.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/slf4j-log4j12-1.4.3.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/xmlenc-0.52.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jsp-2.1/jsp-2.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jsp-2.1/jsp-api-2.1.jar:/home/hadoop/sqoop/conf::/home/hadoop/sqoop/lib/ant-contrib-1.0b3.jar:/home/hadoop/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar:/home/hadoop/sqoop/lib/avro-1.5.3.jar:/home/hadoop/sqoop/lib/avro-ipc-1.5.3.jar:/home/hadoop/sqoop/lib/avro-mapred-1.5.3.jar:/home/hadoop/sqoop/lib/commons-io-1.4.jar:/home/hadoop/sqoop/lib/hsqldb-1.8.0.10.jar:/home/hadoop/sqoop/lib/jackson-core-asl-1.7.3.jar:/home/hadoop/sqoop/lib/jackson-mapper-asl-1.7.3.jar:/home/hadoop/sqoop/lib/jopt-simple-3.2.jar:/home/hadoop/sqoop/lib/ojdbc6.jar:/home/hadoop/sqoop/lib/paranamer-2.3.jar:/home/hadoop/sqoop/lib/snappy-java-1.0.3.2.jar:/home/hadoop/sqoop/sqoop-1.4.2.jar:/home/hadoop/sqoop/sqoop-test-1.4.2.jar::/home/hadoop/hadoop-1.0.3/hadoop-core-1.0.3.jar:/home/hadoop/sqoop/sqoop-1.4.2.jar
> > > Note: /tmp/sqoop-hadoop/compile/69b6a9d2ebb99cebced808e559528531/BTTN_BKP_TEST.java uses or overrides a deprecated API.
> > > Note: Recompile with -Xlint:deprecation for details.
> > > 13/03/13 18:20:48 DEBUG orm.CompilationManager: Could not rename /tmp/sqoop-hadoop/compile/69b6a9d2ebb99cebced808e559528531/BTTN_BKP_TEST.java to /home/hadoop/sqoop-oper/./BTTN_BKP_TEST.java
> > > org.apache.commons.io.FileExistsException: Destination '/home/hadoop/sqoop-oper/./BTTN_BKP_TEST.java' already exists
> > >         at org.apache.commons.io.FileUtils.moveFile(FileUtils.java:2378)
> > >         at org.apache.sqoop.orm.CompilationManager.compile(CompilationManager.java:227)
> > >         at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:83)
> > >         at org.apache.sqoop.tool.ExportTool.exportTable(ExportTool.java:64)
> > >         at org.apache.sqoop.tool.ExportTool.run(ExportTool.java:97)
> > >         at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
> > >         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
> > >         at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
> > >         at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
> > >         at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
> > >         at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
> > >         at com.cloudera.sqoop.Sqoop.main(Sqoop.java:57)
> > > 13/03/13 18:20:48 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/69b6a9d2ebb99cebced808e559528531/BTTN_BKP_TEST.jar
> > > 13/03/13 18:20:48 DEBUG orm.CompilationManager: Scanning for .class files in directory: /tmp/sqoop-hadoop/compile/69b6a9d2ebb99cebced808e559528531
> > > 13/03/13 18:20:48 DEBUG orm.CompilationManager: Got classfile: /tmp/sqoop-hadoop/compile/69b6a9d2ebb99cebced808e559528531/BTTN_BKP_TEST.class -> BTTN_BKP_TEST.class
> > > 13/03/13 18:20:48 DEBUG orm.CompilationManager: Finished writing jar file /tmp/sqoop-hadoop/compile/69b6a9d2ebb99cebced808e559528531/BTTN_BKP_TEST.jar
> > > 13/03/13 18:20:48 INFO mapreduce.ExportJobBase: Beginning export of BTTN_BKP_TEST
> > > 13/03/13 18:20:48 DEBUG mapreduce.JobBase: Using InputFormat: class org.apache.sqoop.mapreduce.ExportInputFormat
> > > 13/03/13 18:20:49 DEBUG manager.OracleManager$ConnCache: Got cached connection for jdbc:oracle:thin:@10.99.42.11:1521/clouddb/HDFSUSER
> > > 13/03/13 18:20:49 INFO manager.OracleManager: Time zone has been set to GMT
> > > 13/03/13 18:20:49 DEBUG manager.OracleManager$ConnCache: Caching released connection for jdbc:oracle:thin:@10.99.42.11:1521/clouddb/HDFSUSER
> > > 13/03/13 18:20:49 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/sqoop-1.4.2.jar
> > > 13/03/13 18:20:49 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/ojdbc6.jar
> > > 13/03/13 18:20:49 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/sqoop-1.4.2.jar
> > > 13/03/13 18:20:49 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/sqoop-1.4.2.jar
> > > 13/03/13 18:20:49 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/jackson-mapper-asl-1.7.3.jar
> > > 13/03/13 18:20:49 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/hsqldb-1.8.0.10.jar
> > > 13/03/13 18:20:49 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/avro-ipc-1.5.3.jar
> > > 13/03/13 18:20:49 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/jopt-simple-3.2.jar
> > > 13/03/13 18:20:49 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/ojdbc6.jar
> > > 13/03/13 18:20:49 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/jackson-core-asl-1.7.3.jar
> > > 13/03/13 18:20:49 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/ant-contrib-1.0b3.jar
> > > 13/03/13 18:20:49 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar
> > > 13/03/13 18:20:49 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/snappy-java-1.0.3.2.jar
> > > 13/03/13 18:20:49 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/paranamer-2.3.jar
> > > 13/03/13 18:20:49 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/avro-1.5.3.jar
> > > 13/03/13 18:20:49 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/commons-io-1.4.jar
> > > 13/03/13 18:20:49 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/avro-mapred-1.5.3.jar
> > > 13/03/13 18:20:49 INFO input.FileInputFormat: Total input paths to process : 1
> > > 13/03/13 18:20:49 DEBUG mapreduce.ExportInputFormat: Target numMapTasks=1
> > > 13/03/13 18:20:49 DEBUG mapreduce.ExportInputFormat: Total input bytes=172704981
> > > 13/03/13 18:20:49 DEBUG mapreduce.ExportInputFormat: maxSplitSize=172704981
> > > 13/03/13 18:20:49 INFO input.FileInputFormat: Total input paths to process : 1
> > > 13/03/13 18:20:49 DEBUG mapreduce.ExportInputFormat: Generated splits:
> > > 13/03/13 18:20:49 DEBUG mapreduce.ExportInputFormat:   Paths:/home/hadoop/user/hive/warehouse/bttn_bkp/000000_0:0+67108864,/home/hadoop/user/hive/warehouse/bttn_bkp/000000_0:67108864+67108864,/home/hadoop/user/hive/warehouse/bttn_bkp/000000_0:134217728+38487253 Locations:NHCLT-PC44-2:;
> > > 13/03/13 18:20:49 INFO mapred.JobClient: Running job: job_201303121648_0018
> > > 13/03/13 18:20:50 INFO mapred.JobClient:  map 0% reduce 0%
> > > 13/03/13 18:21:06 INFO mapred.JobClient:  map 8% reduce 0%
> > > 13/03/13 18:21:09 INFO mapred.JobClient:  map 13% reduce 0%
> > > 13/03/13 18:21:12 INFO mapred.JobClient:  map 17% reduce 0%
> > > 13/03/13 18:21:15 INFO mapred.JobClient:  map 21% reduce 0%
> > > 13/03/13 18:21:18 INFO mapred.JobClient:  map 26% reduce 0%
> > > 13/03/13 18:21:21 INFO mapred.JobClient:  map 30% reduce 0%
> > > 13/03/13 18:21:24 INFO mapred.JobClient:  map 35% reduce 0%
> > > 13/03/13 18:21:27 INFO mapred.JobClient:  map 40% reduce 0%
> > > 13/03/13 18:21:30 INFO mapred.JobClient:  map 45% reduce 0%
> > > 13/03/13 18:21:33 INFO mapred.JobClient:  map 50% reduce 0%
> > > 13/03/13 18:21:36 INFO mapred.JobClient:  map 53% reduce 0%
> > > 13/03/13 18:21:39 INFO mapred.JobClient:  map 58% reduce 0%
> > > 13/03/13 18:21:42 INFO mapred.JobClient:  map 62% reduce 0%
> > > 13/03/13 18:21:45 INFO mapred.JobClient:  map 65% reduce 0%
> > > 13/03/13 18:21:47 INFO mapred.JobClient: Task Id : attempt_201303121648_0018_m_000000_0, Status : FAILED
> > > java.lang.IllegalArgumentException: Timestamp format must be yyyy-mm-dd hh:mm:ss[.fffffffff]
> > >         at java.sql.Timestamp.valueOf(Timestamp.java:185)
> > >         at BTTN_BKP_TEST.__loadFromFields(BTTN_BKP_TEST.java:1331)
> > >         at BTTN_BKP_TEST.parse(BTTN_BKP_TEST.java:1148)
> > >         at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:77)
> > >         at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:36)
> > >         at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
> > >         at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
> > >         at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
> > >         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> > >         at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> > >         at java.security.AccessController.doPrivileged(Native Method)
> > >         at javax.security.auth.Subject.doAs(Subject.java:396)
> > >         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
> > >         at org.apache.hadoop.mapred.Child.main(Child.java:249)
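The failed task comes down to the last stack trace: Sqoop's generated __loadFromFields() hands the literal string "null" (as seen in the exported rows above) straight to java.sql.Timestamp.valueOf(), which accepts only yyyy-mm-dd hh:mm:ss[.fffffffff]. A minimal standalone sketch of the failing call (the class name here is illustrative, not Sqoop code):

```java
import java.sql.Timestamp;

public class TimestampNullDemo {
    public static void main(String[] args) {
        // A well-formed value, like the ones in the exported rows, parses fine.
        System.out.println(Timestamp.valueOf("2010-05-04 14:31:17.0"));

        // The literal "null" does not match yyyy-mm-dd hh:mm:ss[.fffffffff],
        // so valueOf() throws -- the same IllegalArgumentException that
        // killed the export task above.
        try {
            Timestamp.valueOf("null");
        } catch (IllegalArgumentException e) {
            System.out.println("IllegalArgumentException: " + e.getMessage());
        }
    }
}
```

This is why the --input-null-string / --input-null-non-string arguments matter on export: they tell Sqoop which token in the files stands for SQL NULL, so that token is turned into a database NULL instead of being parsed as a value.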



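For reference, the arguments Jarcec suggests slot into the export invocation roughly as follows. This is a sketch, not the exact original command; the connection string, table, and export directory are taken from the log above, and any other options the original job used would stay as they were:

```sh
sqoop export \
  --connect jdbc:oracle:thin:@10.99.42.11:1521/clouddb \
  --username HDFSUSER \
  --table BTTN_BKP_TEST \
  --export-dir /home/hadoop/user/hive/warehouse/bttn_bkp \
  --input-null-string '\\N' \
  --input-null-non-string '\\N'
```

The matching import-side flags are --null-string '\\N' and --null-non-string '\\N', so that both directions agree on the same NULL token.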