You are welcome, Sir.

By the way, for this new exception, could you provide more detail about the
column type? It was 'text', not a large 'varchar', right? Thanks.
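For what it's worth, the "-1" at the end of the exception message looks like a raw JDBC type code, and in `java.sql.Types` the constant -1 is LONGVARCHAR, which is how the MySQL driver typically reports a 'text' column. A minimal sketch (not from the Sqoop sources) to confirm the mapping:

```java
// Sketch: print the java.sql.Types codes relevant to this thread.
// java.sql.Types.LONGVARCHAR is -1, matching the code in the
// "The type is not supported - -1" message; INTEGER (4) is the
// kind of type that did work as a partition column.
public class JdbcTypeCheck {
    public static void main(String[] args) {
        System.out.println(java.sql.Types.LONGVARCHAR); // -1
        System.out.println(java.sql.Types.INTEGER);     // 4
    }
}
```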

Best,
Mengwei


On Thu, Jul 25, 2013 at 10:21 AM, Κωνσταντίνος Αρετάκης <[email protected]
> wrote:

> Thanks a lot.
>
>
> I did provide a partition column at first, but I got the following error.
>
> The column was text type.
>
> Now I provided another column of int type and it worked fine.
> So my guess is that I had specified a partition column with an unsupported
> type. Thanks again!!!
>
>
>
> Exception: org.apache.sqoop.common.SqoopException:
> GENERIC_JDBC_CONNECTOR_0011:The type is not supported - -1
> Stack trace: org.apache.sqoop.common.SqoopException:
> GENERIC_JDBC_CONNECTOR_0011:The type is not supported - -1
> at org.apache.sqoop.connector.jdbc.GenericJdbcImportPartitioner.getPartitions(GenericJdbcImportPartitioner.java:87)
> at org.apache.sqoop.connector.jdbc.GenericJdbcImportPartitioner.getPartitions(GenericJdbcImportPartitioner.java:32)
> at org.apache.sqoop.job.mr.SqoopInputFormat.getSplits(SqoopInputFormat.java:71)
> at org.apache.hadoop.mapred.JobClient.writeNewSplits(JobClient.java:1024)
> at org.apache.hadoop.mapred.JobClient.writeSplits(JobClient.java:1041)
> at org.apache.hadoop.mapred.JobClient.access$700(JobClient.java:179)
> at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:959)
> at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:912)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:415)
> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
> at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:912)
> at org.apache.hadoop.mapreduce.Job.submit(Job.java:500)
> at org.apache.sqoop.submission.mapreduce.MapreduceSubmissionEngine.submit(MapreduceSubmissionEngine.java:265)
> at org.apache.sqoop.framework.FrameworkManager.submit(FrameworkManager.java:467)
> at org.apache.sqoop.handler.SubmissionRequestHandler.submissionSubmit(SubmissionRequestHandler.java:112)
> at org.apache.sqoop.handler.SubmissionRequestHandler.handleActionEvent(SubmissionRequestHandler.java:98)
> at org.apache.sqoop.handler.SubmissionRequestHandler.handleEvent(SubmissionRequestHandler.java:68)
> at org.apache.sqoop.server.v1.SubmissionServlet.handlePostRequest(SubmissionServlet.java:44)
> at org.apache.sqoop.server.SqoopProtocolServlet.doPost(SqoopProtocolServlet.java:63)
> at javax.servlet.http.HttpServlet.service(HttpServlet.java:637)
> at javax.servlet.http.HttpServlet.service(HttpServlet.java:717)
> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290)
> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
> at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233)
> at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
> at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
> at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
> at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
> at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:293)
> at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:861)
> at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:606)
> at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:489)
> at java.lang.Thread.run(Thread.java:724)
>
>
>
> On Thu, Jul 25, 2013 at 8:09 PM, Mengwei Ding <[email protected]> wrote:
>
>> Hi Konstantinos,
>>
>> Basically, from the exception itself, I would guess that you did not
>> specify a partition column when creating the job. But providing more
>> information would still be better. :)
>>
>> Best,
>> Mengwei
>>
>>
>> On Thu, Jul 25, 2013 at 10:06 AM, Mengwei Ding <[email protected]> wrote:
>>
>>> Hi Konstantinos,
>>>
>>> Could you kindly provide more information for a better diagnosis?
>>>
>>> First, please enter the following command in the sqoop2 command line:
>>>
>>>  *set option --name verbose --value true*
>>>
>>> then restart your job and paste the full stack trace.
>>>
>>> Second, could you please show us your job configuration? Please execute:
>>>
>>> show job --jid <job_id> --all
>>>
>>>
>>>
>>> On Thu, Jul 25, 2013 at 9:58 AM, Κωνσταντίνος Αρετάκης <
>>> [email protected]> wrote:
>>>
>>>> Hi all,
>>>>
>>>> I am trying to import a table from MySQL to HDFS and I get this error:
>>>>
>>>> org.apache.sqoop.common.SqoopException: GENERIC_JDBC_CONNECTOR_0005:No
>>>> column is found to partition data
>>>>
>>>> Please help.
>>>>
>>>> Thanks
>>>>
>>>
>>>
>>
>
