My Spark job fails while trying to initialize the Tachyon filesystem, even though it is not configured to use Tachyon. With a *Standalone* setup on my laptop the job *succeeds*, but with a *Mesos+Docker* setup it *fails* (once every few runs). Has anyone seen this, or does anyone know why it is looking for Tachyon at all? Below is the exception it fails with:
Spark version: 1.6.0

```
Framework registered with 25a4e045-ffbd-479a-b24c-2d2cfcbf84c9-56355
2016/07/20 15:31:47 ERROR SparkContext: Error initializing SparkContext.
java.util.ServiceConfigurationError: org.apache.hadoop.fs.FileSystem: Provider tachyon.hadoop.TFS could not be instantiated
    at java.util.ServiceLoader.fail(ServiceLoader.java:232)
    at java.util.ServiceLoader.access$100(ServiceLoader.java:185)
    at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:384)
    at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
    at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
    at org.apache.hadoop.fs.FileSystem.loadFileSystems(FileSystem.java:2563)
    at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2574)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2591)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:91)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2630)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2612)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:370)
    at org.apache.spark.util.Utils$.getHadoopFileSystem(Utils.scala:1650)
    at org.apache.spark.scheduler.EventLoggingListener.<init>(EventLoggingListener.scala:66)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:547)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:123)
    at c
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ExceptionInInitializerError
    at tachyon.hadoop.AbstractTFS.<init>(AbstractTFS.java:72)
    at tachyon.hadoop.TFS.<init>(TFS.java:28)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
    at java.lang.Class.newInstance(Class.java:442)
    at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:380)
    ... 24 more
Caused by: java.util.ConcurrentModificationException
    at java.util.Hashtable$Enumerator.next(Hashtable.java:1367)
    at java.util.Hashtable.putAll(Hashtable.java:522)
    at tachyon.conf.TachyonConf.<init>(TachyonConf.java:158)
    at tachyon.conf.TachyonConf.<init>(TachyonConf.java:111)
    at tachyon.client.ClientContext.reset(ClientContext.java:57)
    at tachyon.client.ClientContext.<clinit>(ClientContext.java:47)
```
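From what I can tell from the trace, the Tachyon lookup does not come from any explicit configuration: `EventLoggingListener` asks Hadoop for a `FileSystem`, and Hadoop's `FileSystem.loadFileSystems` discovers implementations through `java.util.ServiceLoader`, which instantiates *every* `FileSystem` provider registered in `META-INF/services` on the classpath, `tachyon.hadoop.TFS` included. A minimal sketch of that discovery mechanism, using the JDK's own `FileSystemProvider` service as a stand-in (since it ships with registered providers, while the Hadoop/Tachyon classes need their jars):

```java
import java.nio.file.spi.FileSystemProvider;
import java.util.ServiceLoader;

public class ServiceLoaderSketch {
    public static void main(String[] args) {
        // ServiceLoader reads META-INF/services entries (or module
        // descriptors) and lazily instantiates each registered provider
        // as the iterator advances.
        ServiceLoader<FileSystemProvider> loader =
                ServiceLoader.load(FileSystemProvider.class);
        for (FileSystemProvider p : loader) {
            // Each next() constructs a provider; if a provider's static or
            // instance initializer throws (as Tachyon's ClientContext did
            // here), ServiceLoader wraps it in a ServiceConfigurationError.
            System.out.println("Instantiated provider: " + p.getClass().getName());
        }
    }
}
```

So merely having the Tachyon client jar on the driver's classpath is enough for it to be instantiated the first time any Hadoop `FileSystem` is requested, whether or not Tachyon is configured.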
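The intermittency seems to match the final `Caused by`: `TachyonConf`'s constructor copies system properties into a `Hashtable` via `putAll`, which iterates the source table with a fail-fast iterator; if another thread mutates a property at the same moment, that iterator throws `ConcurrentModificationException`. A race like that would only fire when the timing lines up, which fits "once every few runs". A deterministic single-threaded sketch of the same fail-fast behavior (property names below are just illustrative, not taken from my job):

```java
import java.util.ConcurrentModificationException;
import java.util.Hashtable;
import java.util.Iterator;
import java.util.Map;

public class FailFastSketch {
    public static void main(String[] args) {
        Hashtable<String, String> props = new Hashtable<>();
        props.put("spark.app.name", "demo");       // illustrative entries
        props.put("spark.executor.memory", "2g");

        Iterator<Map.Entry<String, String>> it = props.entrySet().iterator();
        it.next();
        // Structural modification while an iterator is open -- the same
        // shape as putAll iterating a Hashtable that another thread mutates.
        props.put("spark.eventLog.enabled", "true");
        try {
            it.next(); // fail-fast check sees the modCount change and throws
        } catch (ConcurrentModificationException e) {
            System.out.println("ConcurrentModificationException, as in the trace");
        }
    }
}
```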