Hi,
I want to use a library (JAI) together with Spark to parse some spatial raster files.
Unfortunately, there are some strange issues: JAI only works when the job is run via
the build tool, i.e. `sbt run`. When the same job is executed via spark-submit, the
error is:
java.lang.IllegalArgumentException: The input argument(s) may not be null.
    at javax.media.jai.ParameterBlockJAI.getDefaultMode(ParameterBlockJAI.java:136)
    at javax.media.jai.ParameterBlockJAI.<init>(ParameterBlockJAI.java:157)
    at javax.media.jai.ParameterBlockJAI.<init>(ParameterBlockJAI.java:178)
    at org.geotools.process.raster.PolygonExtractionProcess.execute(PolygonExtractionProcess.java:171)
This looks as if some native dependency is not available correctly. Assuming something
was wrong with the classpath, I tried to run a plain Java/Scala function, but that one
works just fine. Is Spark messing with the classpath? In fact, when trying to run the
jar in NiFi, I see the same problem.
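To narrow this down, a minimal check like the following should show whether the JAI
registration resources even make it onto the runtime classpath (a sketch only, assuming
the Vectorize operation is registered through a META-INF/registryFile.jai resource,
which is how JAI extensions usually register their operation descriptors):

import scala.collection.JavaConverters._

// List every registryFile.jai visible to the context class loader.
// If the assembled/fat jar drops or mangles these files, JAI never learns
// about extension operations such as "Vectorize".
val registryFiles = Thread.currentThread().getContextClassLoader
  .getResources("META-INF/registryFile.jai")
  .asScala
  .toList

registryFiles.foreach(url => println(s"found: $url"))
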
I created a minimal example here:
https://github.com/geoHeil/jai-packaging-problem
Are you aware of any other library which can read ESRI ASCII Grid files and
polygonize them?
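For context, the call path is roughly the following (a simplified sketch, not verbatim
from the minimal example; file name and parameters are illustrative only):

import java.io.File
import org.geotools.gce.arcgrid.ArcGridReader
import org.geotools.process.raster.PolygonExtractionProcess

// Read an ESRI ASCII Grid into a GridCoverage2D and polygonize band 0.
val coverage = new ArcGridReader(new File("input.asc")).read(null)
val polygons = new PolygonExtractionProcess().execute(
  coverage, // coverage to vectorize
  0,        // band
  true,     // insideEdges
  null,     // roi
  null,     // noDataValues
  null,     // classificationRanges
  null      // progress listener
)
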
So far I think the problem can be tracked down to the instantiation of
new ParameterBlockJAI("Vectorize"): the lookup in the JAI operation registry does not
find the desired operation and returns null, which then surfaces as a different error
than the one you usually get.
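A small check along these lines should confirm whether the operation is registered at
all in the running JVM (sketch; "rendered" is the registry mode the Vectorize operation
is expected to live under):

import javax.media.jai.JAI

// If this prints null, the "Vectorize" operation was never registered in this
// class loader, which would explain the failure above.
val registry = JAI.getDefaultInstance.getOperationRegistry
val descriptor = registry.getDescriptor("rendered", "Vectorize")
println(s"Vectorize descriptor: $descriptor")
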
Regards,
Georg