Hi,
For Sqoop, I would like to implement the following operation:
#!/usr/bin/env bash
# sqoop.sh
# Need to use Sqoop to import multiple tables
for table in test1 test2 test3 test4 test5 test6 ...
do
    sqoop import ...
done
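For illustration, a fleshed-out version of that loop might look like the sketch below; the connect string, credentials, and target directory are placeholders I have assumed, not values from the original script:

#!/usr/bin/env bash
# sqoop.sh -- run one Sqoop import job per table in the list.
# All connection values below are placeholders; substitute your own.
set -e

for table in test1 test2 test3 test4 test5 test6
do
    sqoop import \
        --connect jdbc:mysql://dbhost:3306/mydb \
        --username myuser \
        --password mypassword \
        --table "$table" \
        --target-dir "/user/bbd/import/$table" \
        -m 1
done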
However, an Oozie sqoop-action seems to schedule only a single Sqoop command, so I would like to run the imports in bulk through a shell action. The shell action in my workflow.xml is:
<shell xmlns="uri:oozie:shell-action:0.2">
    <job-tracker>${jobTracker}</job-tracker>
    <name-node>${nameNode}</name-node>
    <configuration>
        <property>
            <name>mapred.job.queue.name</name>
            <value>${queueName}</value>
        </property>
    </configuration>
    <exec>sqoop.sh</exec>
    <file>sqoop.sh#sqoop.sh</file>
</shell>
<kill name="fail">
    <message>Map/Reduce failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
</kill>
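For context, a minimal sketch of how these pieces are typically wired together in a workflow.xml; the workflow and node names here are assumed, not taken from the mail:

<workflow-app name="sqoop-shell-wf" xmlns="uri:oozie:workflow:0.4">
    <start to="sqoop-shell"/>
    <action name="sqoop-shell">
        <!-- the <shell> element shown above goes here -->
        <ok to="end"/>
        <error to="fail"/>
    </action>
    <kill name="fail">
        <message>Map/Reduce failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <end name="end"/>
</workflow-app>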
However, the YARN logs show the exception below:
2016-09-06 17:41:06,682 INFO [Thread-56] org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler: Stopped JobHistoryEventHandler. super.stop()
2016-09-06 17:41:06,683 INFO [Thread-56] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Setting job diagnostics to Job init failed : org.apache.hadoop.yarn.exceptions.YarnRuntimeException: java.io.FileNotFoundException: File does not exist: hdfs://kunlundev02:8020/user/bbd/.staging/job_1470312512846_0152/job.splitmetainfo
    at org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl$InitTransition.createSplits(JobImpl.java:1580)
    at org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl$InitTransition.transition(JobImpl.java:1444)
    at org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl$InitTransition.transition(JobImpl.java:1402)
    at org.apache.hadoop.yarn.state.StateMachineFactory$MultipleInternalArc.doTransition(StateMachineFactory.java:385)
    at org.apache.hadoop.yarn.state.StateMachineFactory.doTransition(StateMachineFactory.java:302)
    at org.apache.hadoop.yarn.state.StateMachineFactory.access$300(StateMachineFactory.java:46)
    at org.apache.hadoop.yarn.state.StateMachineFactory$InternalStateMachine.doTransition(StateMachineFactory.java:448)
    at org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl.handle(JobImpl.java:996)
    at org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl.handle(JobImpl.java:138)
    at org.apache.hadoop.mapreduce.v2.app.MRAppMaster$JobEventDispatcher.handle(MRAppMaster.java:1333)
    at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.serviceStart(MRAppMaster.java:1101)
    at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
    at org.apache.hadoop.mapreduce.v2.app.MRAppMaster$4.run(MRAppMaster.java:1540)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1693)
    at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.initAndStartAppMaster(MRAppMaster.java:1536)
    at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.main(MRAppMaster.java:1469)
Caused by: java.io.FileNotFoundException: File does not exist: hdfs://kunlundev02:8020/user/bbd/.staging/job_1470312512846_0152/job.splitmetainfo
    at org.apache.hadoop.hdfs.DistributedFileSystem$19.doCall(DistributedFileSystem.java:1219)
    at org.apache.hadoop.hdfs.DistributedFileSystem$19.doCall(DistributedFileSystem.java:1211)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1211)
    at org.apache.hadoop.mapreduce.split.SplitMetaInfoReader.readSplitMetaInfo(SplitMetaInfoReader.java:51)
    at org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl$InitTransition.createSplits(JobImpl.java:1575)
    ... 17 more
------------------ Original Message ------------------
From: "Peter Cseh" <gezap...@cloudera.com>
Date: Tue, Sep 6, 2016, 8:11 PM
To: "user" <user@oozie.apache.org>
Subject: Re: oozie execute shell(content hive or sqoop command)
Hi,
you may use the Sqoop action to do the import:
https://oozie.apache.org/docs/4.2.0/DG_SqoopActionExtension.html
gp
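For reference, the linked page defines a sqoop action of roughly the shape below; the connect string, credentials, table name, and target directory are placeholders, and looping over many tables would mean one such action per table (or generating the workflow):

<action name="sqoop-import-test1">
    <sqoop xmlns="uri:oozie:sqoop-action:0.2">
        <job-tracker>${jobTracker}</job-tracker>
        <name-node>${nameNode}</name-node>
        <command>import --connect jdbc:mysql://dbhost:3306/mydb --username myuser --password mypassword --table test1 --target-dir /user/bbd/import/test1 -m 1</command>
    </sqoop>
    <ok to="end"/>
    <error to="fail"/>
</action>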
On Tue, Sep 6, 2016 at 1:51 PM, wangwei <963906...@qq.com> wrote:
> Hi,
> I have a scenario with many tables that need to be imported from MySQL
> with Sqoop, so I need to put the Sqoop commands in a shell script and
> loop through all the tables.
> The same error still appears:
>
> ------------------ Original Message ------------------
> From: "satish saley" <satishsale...@gmail.com>
> Date: Tue, Sep 6, 2016, 7:21 PM
> To: "user" <user@oozie.apache.org>
> Subject: Re: oozie execute shell(content hive or sqoop command)
>
> Hi,
> For Hive scripts, use a hive-action. It would be easy to follow the pipeline
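For completeness, a minimal hive-action of the kind satish suggests might look like the sketch below; the script name and the transition targets are assumed:

<action name="hive-node">
    <hive xmlns="uri:oozie:hive-action:0.2">
        <job-tracker>${jobTracker}</job-tracker>
        <name-node>${nameNode}</name-node>
        <script>myscript.q</script>
    </hive>
    <ok to="end"/>
    <error to="fail"/>
</action>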