Sorry, I call the Sqoop job from the function above. Can you help me resolve this?
Thanks

On Fri, Aug 30, 2019 at 1:31 AM Chetan Khatri <chetan.opensou...@gmail.com> wrote:
> Hi Users,
> I am launching a Sqoop job from a Spark job and would like to FAIL the Spark job
> if the Sqoop job fails.
>
> def executeSqoopOriginal(serverName: String, schemaName: String, username: String,
>                          password: String, query: String, splitBy: String,
>                          fetchSize: Int, numMappers: Int, targetDir: String,
>                          jobName: String, dateColumns: String) = {
>
>   val connectionString = "jdbc:sqlserver://" + serverName + ";" + "databaseName=" + schemaName
>   var parameters = Array("import")
>   parameters = parameters :+ "-Dmapreduce.job.user.classpath.first=true"
>   parameters = parameters :+ "--connect"
>   parameters = parameters :+ connectionString
>   parameters = parameters :+ "--mapreduce-job-name"
>   parameters = parameters :+ jobName
>   parameters = parameters :+ "--username"
>   parameters = parameters :+ username
>   parameters = parameters :+ "--password"
>   parameters = parameters :+ password
>   parameters = parameters :+ "--hadoop-mapred-home"
>   parameters = parameters :+ "/usr/hdp/2.6.5.0-292/hadoop-mapreduce/"
>   parameters = parameters :+ "--hadoop-home"
>   parameters = parameters :+ "/usr/hdp/2.6.5.0-292/hadoop/"
>   parameters = parameters :+ "--query"
>   parameters = parameters :+ query
>   parameters = parameters :+ "--split-by"
>   parameters = parameters :+ splitBy
>   parameters = parameters :+ "--fetch-size"
>   parameters = parameters :+ fetchSize.toString
>   parameters = parameters :+ "--num-mappers"
>   parameters = parameters :+ numMappers.toString
>   if (dateColumns.length() > 0) {
>     parameters = parameters :+ "--map-column-java"
>     parameters = parameters :+ dateColumns
>   }
>   parameters = parameters :+ "--target-dir"
>   parameters = parameters :+ targetDir
>   parameters = parameters :+ "--delete-target-dir"
>   parameters = parameters :+ "--as-avrodatafile"
> }
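As posted, executeSqoopOriginal only builds the parameters array; nothing in the
snippet actually launches Sqoop or looks at its result, so the Spark job has no way
to know the import failed. If you invoke Sqoop programmatically,
org.apache.sqoop.Sqoop.runTool returns the tool's exit code (0 on success), so you
can check it and throw to fail the Spark job. Here is a minimal sketch, assuming the
Sqoop client jars are on the driver classpath and that a plain new Configuration()
picks up your Hadoop settings; the helper name runSqoopImport is just for
illustration:

import org.apache.hadoop.conf.Configuration
import org.apache.sqoop.Sqoop

// Takes the parameters array built by executeSqoopOriginal above.
def runSqoopImport(parameters: Array[String]): Unit = {
  // Sqoop.runTool returns the exit code of the Sqoop tool:
  // 0 means success, anything non-zero means the import failed.
  val exitCode = Sqoop.runTool(parameters, new Configuration())
  if (exitCode != 0) {
    // Throwing from the driver fails the Spark job/application,
    // which is the behavior you are asking for.
    throw new RuntimeException(s"Sqoop job failed with exit code $exitCode")
  }
}

If you launch Sqoop as an external process instead (e.g. via scala.sys.process),
the same idea applies: capture the process's exit value and throw when it is
non-zero, rather than ignoring it.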