Hello folks,

I have a Spark Streaming application built with Maven (as a jar) and deployed
with the spark-submit script. The application project has the following
(main) structure:

myApp
  src
    main
      scala
        com.mycompany.package
          MyApp.scala
          DoSomething.scala
          ...
      resources
        aPerlScript.pl
        ...
    test
      scala
        com.mycompany.package
          MyAppTest.scala
          ...
  target
    ...
  pom.xml


In the DoSomething.scala object I have a method (let's call it
doSomething()) that tries to run a Perl script, taken from the resources
folder, as an external scala.sys.process.Process. I then call
DoSomething.doSomething(). Here's the *issue*: I have not been able to
access that script, not with absolute paths, not with relative paths, not
with getClass.getClassLoader.getResource or getClass.getResource, and not
after explicitly specifying the resources folder in my pom.xml. None of my
attempts succeeded: I don't know how to reach the files I put in
src/main/resources.
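
To make the intent concrete, here is a minimal sketch of what doSomething()
is supposed to do (the resource name and the temp-file copy are
illustrative, not my actual code): read the packaged script from the
classpath, copy it out to a temp file, and run it with scala.sys.process:

    import java.io.File
    import java.nio.file.{Files, StandardCopyOption}
    import scala.sys.process._

    object DoSomething {
      def doSomething(): Unit = {
        // Read the script packaged under src/main/resources
        // (returns null when the resource is not on the classpath).
        val in = getClass.getResourceAsStream("/aPerlScript.pl")

        // A script inside a jar cannot be executed in place, so copy it
        // to a temporary file first.
        val tmp = File.createTempFile("aPerlScript", ".pl")
        tmp.deleteOnExit()
        Files.copy(in, tmp.toPath, StandardCopyOption.REPLACE_EXISTING)
        tmp.setExecutable(true)

        // Run it as an external process and report the exit code.
        val exitCode = Seq("perl", tmp.getAbsolutePath).!
        println("perl exited with code " + exitCode)
      }
    }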

I would appreciate any help.

SIDE NOTES:

   - I use an external Process instead of a Spark pipe because, at this
   step of my workflow, I must handle binary files as input and output.
   - I'm using spark-streaming 1.1.0, Scala 2.10.4, and Java 7. I build the
   jar with "Maven install" from within Eclipse (Kepler).
   - When I use the "standard" getClass.getClassLoader.getResource method
   to access resources, I find that the effective classpath is the
   spark-submit script's one (see the snippet after this list).
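
For the last point, this is roughly the check I ran (just a sketch) to see
where the resource lookup ends up:

    // Print where (if anywhere) the resource resolves, plus the JVM classpath.
    val url = getClass.getClassLoader.getResource("aPerlScript.pl")
    println(if (url == null) "aPerlScript.pl not found on the classpath" else url.toString)
    println(System.getProperty("java.class.path"))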


Thank you and best regards,
Roberto
