[
https://issues.apache.org/jira/browse/SPARK-16745?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15395194#comment-15395194
]
Sean Owen commented on SPARK-16745:
-----------------------------------
I think the answer is reasonably clear here -- the shell is delayed in
initializing because it can't talk to the services it started. That looks like
a network or environment issue, not a bug.
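For what it's worth, the usual workaround for this symptom (the executor timing out against the REPL class server on an address the host can't actually reach, as the utun0 loopback warning further down in the log suggests) is to pin Spark to a reachable address before launching the shell. A minimal sketch; the loopback address below is an illustrative assumption, not a value taken from this report:

```shell
# Bind Spark to the loopback interface so the driver and its in-process
# executor agree on a reachable address. 127.0.0.1 is illustrative; pick a
# routable interface instead if you run a distributed cluster.
export SPARK_LOCAL_IP=127.0.0.1

# Then launch the shell exactly as in the report:
# spark-shell --driver-memory 5g --master "local[*]"
```

Spark 1.6 also accepts the equivalent `spark.driver.host` configuration property if setting an environment variable is inconvenient.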
> Spark job completed however have to wait for 13 mins (data size is small)
> -------------------------------------------------------------------------
>
> Key: SPARK-16745
> URL: https://issues.apache.org/jira/browse/SPARK-16745
> Project: Spark
> Issue Type: Bug
> Components: Spark Shell
> Affects Versions: 1.6.1
> Environment: Max OS X Yosemite, Terminal, MacBook Air Late 2014
> Reporter: Joe Chong
> Priority: Minor
>
> I submitted a job in the Scala Spark shell to show a DataFrame. The data size
> is about 43 KB. The job was successful in the end, but took more than 13
> minutes to complete. Upon checking the log, there were multiple exceptions
> raised on "Failed to check existence of class...", each with a
> java.net.ConnectException message indicating a timeout trying to connect to
> port 52067, the REPL class server port that Spark set up. Please assist in
> troubleshooting. Thanks.
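Before re-running the job, it may help to confirm the timeout independently of Spark by probing the REPL class server address with a short timeout instead of the JVM's long default. A hedged sketch; the address and port are the ones reported in the log below, and `probe` is a hypothetical helper name:

```shell
# Report whether a TCP port answers within 3 seconds; prints "reachable"
# or "unreachable" either way so it can be used in scripts.
probe() { nc -z -w 3 "$1" "$2" >/dev/null 2>&1 && echo reachable || echo unreachable; }

# Address/port from the "Started SocketConnector" line in this report;
# substitute the values from your own run.
probe 10.199.29.218 52067
```

If this prints `unreachable` while the shell is up, the problem is host networking (VPN interfaces, firewall, hostname resolution), not Spark itself.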
> Started spark-shell in local mode
> $ spark-shell --driver-memory 5g --master local[*]
> 16/07/26 21:05:29 WARN util.NativeCodeLoader: Unable to load native-hadoop
> library for your platform... using builtin-java classes where applicable
> 16/07/26 21:05:30 INFO spark.SecurityManager: Changing view acls to: joechong
> 16/07/26 21:05:30 INFO spark.SecurityManager: Changing modify acls to:
> joechong
> 16/07/26 21:05:30 INFO spark.SecurityManager: SecurityManager: authentication
> disabled; ui acls disabled; users with view permissions: Set(joechong); users
> with modify permissions: Set(joechong)
> 16/07/26 21:05:30 INFO spark.HttpServer: Starting HTTP Server
> 16/07/26 21:05:30 INFO server.Server: jetty-8.y.z-SNAPSHOT
> 16/07/26 21:05:30 INFO server.AbstractConnector: Started
> [email protected]:52067
> 16/07/26 21:05:30 INFO util.Utils: Successfully started service 'HTTP class
> server' on port 52067.
> Welcome to
>       ____              __
>      / __/__  ___ _____/ /__
>     _\ \/ _ \/ _ `/ __/ '_/
>    /___/ .__/\_,_/_/ /_/\_\   version 1.6.1
>       /_/
> Using Scala version 2.10.5 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_66)
> Type in expressions to have them evaluated.
> Type :help for more information.
> 16/07/26 21:05:34 INFO spark.SparkContext: Running Spark version 1.6.1
> 16/07/26 21:05:34 INFO spark.SecurityManager: Changing view acls to: joechong
> 16/07/26 21:05:34 INFO spark.SecurityManager: Changing modify acls to:
> joechong
> 16/07/26 21:05:34 INFO spark.SecurityManager: SecurityManager: authentication
> disabled; ui acls disabled; users with view permissions: Set(joechong); users
> with modify permissions: Set(joechong)
> 16/07/26 21:05:35 INFO util.Utils: Successfully started service 'sparkDriver'
> on port 52072.
> 16/07/26 21:05:35 INFO slf4j.Slf4jLogger: Slf4jLogger started
> 16/07/26 21:05:35 INFO Remoting: Starting remoting
> 16/07/26 21:05:35 INFO Remoting: Remoting started; listening on addresses
> :[akka.tcp://[email protected]:52074]
> 16/07/26 21:05:35 INFO util.Utils: Successfully started service
> 'sparkDriverActorSystem' on port 52074.
> 16/07/26 21:05:35 INFO spark.SparkEnv: Registering MapOutputTracker
> 16/07/26 21:05:35 INFO spark.SparkEnv: Registering BlockManagerMaster
> 16/07/26 21:05:35 INFO storage.DiskBlockManager: Created local directory at
> /private/var/folders/r7/bs2f87nj6lnd5vm51lvxcw680000gn/T/blockmgr-cd542a27-6ff1-4f51-a72b-78654142fdb6
> 16/07/26 21:05:35 INFO storage.MemoryStore: MemoryStore started with capacity
> 3.4 GB
> 16/07/26 21:05:35 INFO spark.SparkEnv: Registering OutputCommitCoordinator
> 16/07/26 21:05:36 INFO server.Server: jetty-8.y.z-SNAPSHOT
> 16/07/26 21:05:36 INFO server.AbstractConnector: Started
> [email protected]:4040
> 16/07/26 21:05:36 INFO util.Utils: Successfully started service 'SparkUI' on
> port 4040.
> 16/07/26 21:05:36 INFO ui.SparkUI: Started SparkUI at
> http://10.199.29.218:4040
> 16/07/26 21:05:36 INFO executor.Executor: Starting executor ID driver on host
> localhost
> 16/07/26 21:05:36 INFO executor.Executor: Using REPL class URI:
> http://10.199.29.218:52067
> 16/07/26 21:05:36 INFO util.Utils: Successfully started service
> 'org.apache.spark.network.netty.NettyBlockTransferService' on port 52075.
> 16/07/26 21:05:36 INFO netty.NettyBlockTransferService: Server created on
> 52075
> 16/07/26 21:05:36 INFO storage.BlockManagerMaster: Trying to register
> BlockManager
> 16/07/26 21:05:36 INFO storage.BlockManagerMasterEndpoint: Registering block
> manager localhost:52075 with 3.4 GB RAM, BlockManagerId(driver, localhost,
> 52075)
> 16/07/26 21:05:36 INFO storage.BlockManagerMaster: Registered BlockManager
> 16/07/26 21:05:36 INFO repl.SparkILoop: Created spark context..
> Spark context available as sc.
> 16/07/26 21:05:37 INFO hive.HiveContext: Initializing execution hive, version
> 1.2.1
> 16/07/26 21:05:37 INFO client.ClientWrapper: Inspected Hadoop version: 2.6.0
> 16/07/26 21:05:37 INFO client.ClientWrapper: Loaded
> org.apache.hadoop.hive.shims.Hadoop23Shims for Hadoop version 2.6.0
> 16/07/26 21:05:38 INFO metastore.HiveMetaStore: 0: Opening raw store with
> implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
> 16/07/26 21:05:38 INFO metastore.ObjectStore: ObjectStore, initialize called
> 16/07/26 21:05:38 INFO DataNucleus.Persistence: Property
> hive.metastore.integral.jdo.pushdown unknown - will be ignored
> 16/07/26 21:05:38 INFO DataNucleus.Persistence: Property
> datanucleus.cache.level2 unknown - will be ignored
> 16/07/26 21:05:38 WARN DataNucleus.Connection: BoneCP specified but not
> present in CLASSPATH (or one of dependencies)
> 16/07/26 21:05:39 WARN DataNucleus.Connection: BoneCP specified but not
> present in CLASSPATH (or one of dependencies)
> 16/07/26 21:05:40 INFO metastore.ObjectStore: Setting MetaStore object pin
> classes with
> hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
> 16/07/26 21:05:41 INFO DataNucleus.Datastore: The class
> "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as
> "embedded-only" so does not have its own datastore table.
> 16/07/26 21:05:41 INFO DataNucleus.Datastore: The class
> "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only"
> so does not have its own datastore table.
> 16/07/26 21:05:42 INFO DataNucleus.Datastore: The class
> "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as
> "embedded-only" so does not have its own datastore table.
> 16/07/26 21:05:42 INFO DataNucleus.Datastore: The class
> "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only"
> so does not have its own datastore table.
> 16/07/26 21:05:43 INFO metastore.MetaStoreDirectSql: Using direct SQL,
> underlying DB is DERBY
> 16/07/26 21:05:43 INFO metastore.ObjectStore: Initialized ObjectStore
> 16/07/26 21:05:43 WARN metastore.ObjectStore: Version information not found
> in metastore. hive.metastore.schema.verification is not enabled so recording
> the schema version 1.2.0
> 16/07/26 21:05:43 WARN metastore.ObjectStore: Failed to get database default,
> returning NoSuchObjectException
> 16/07/26 21:05:44 WARN : Your hostname, Joes-MBA.local resolves to a
> loopback/non-reachable address: fe80:0:0:0:2876:f450:845f:4323%utun0, but we
> couldn't find any external IP address!
> 16/07/26 21:05:49 INFO metastore.HiveMetaStore: Added admin role in metastore
> 16/07/26 21:05:49 INFO metastore.HiveMetaStore: Added public role in metastore
> 16/07/26 21:05:49 INFO metastore.HiveMetaStore: No user is added in admin
> role, since config is empty
> 16/07/26 21:05:49 INFO metastore.HiveMetaStore: 0: get_all_databases
> 16/07/26 21:05:49 INFO HiveMetaStore.audit: ugi=joechong
> ip=unknown-ip-addr cmd=get_all_databases
> 16/07/26 21:05:49 INFO metastore.HiveMetaStore: 0: get_functions: db=default
> pat=*
> 16/07/26 21:05:49 INFO HiveMetaStore.audit: ugi=joechong
> ip=unknown-ip-addr cmd=get_functions: db=default pat=*
> 16/07/26 21:05:49 INFO DataNucleus.Datastore: The class
> "org.apache.hadoop.hive.metastore.model.MResourceUri" is tagged as
> "embedded-only" so does not have its own datastore table.
> 16/07/26 21:05:51 INFO session.SessionState: Created local directory:
> /var/folders/r7/bs2f87nj6lnd5vm51lvxcw680000gn/T/52cde745-d8e5-4b01-bfe6-ecf311f0968b_resources
> 16/07/26 21:05:51 INFO session.SessionState: Created HDFS directory:
> /tmp/hive/joechong/52cde745-d8e5-4b01-bfe6-ecf311f0968b
> 16/07/26 21:05:51 INFO session.SessionState: Created local directory:
> /var/folders/r7/bs2f87nj6lnd5vm51lvxcw680000gn/T/joechong/52cde745-d8e5-4b01-bfe6-ecf311f0968b
> 16/07/26 21:05:51 INFO session.SessionState: Created HDFS directory:
> /tmp/hive/joechong/52cde745-d8e5-4b01-bfe6-ecf311f0968b/_tmp_space.db
> 16/07/26 21:05:51 INFO hive.HiveContext: default warehouse location is
> /user/hive/warehouse
> 16/07/26 21:05:51 INFO hive.HiveContext: Initializing HiveMetastoreConnection
> version 1.2.1 using Spark classes.
> 16/07/26 21:05:51 INFO client.ClientWrapper: Inspected Hadoop version: 2.6.0
> 16/07/26 21:05:51 INFO client.ClientWrapper: Loaded
> org.apache.hadoop.hive.shims.Hadoop23Shims for Hadoop version 2.6.0
> 16/07/26 21:05:52 INFO metastore.HiveMetaStore: 0: Opening raw store with
> implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
> 16/07/26 21:05:52 INFO metastore.ObjectStore: ObjectStore, initialize called
> 16/07/26 21:05:52 INFO DataNucleus.Persistence: Property
> hive.metastore.integral.jdo.pushdown unknown - will be ignored
> 16/07/26 21:05:52 INFO DataNucleus.Persistence: Property
> datanucleus.cache.level2 unknown - will be ignored
> 16/07/26 21:05:52 WARN DataNucleus.Connection: BoneCP specified but not
> present in CLASSPATH (or one of dependencies)
> 16/07/26 21:05:53 WARN DataNucleus.Connection: BoneCP specified but not
> present in CLASSPATH (or one of dependencies)
> 16/07/26 21:05:53 INFO metastore.ObjectStore: Setting MetaStore object pin
> classes with
> hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
> 16/07/26 21:05:54 INFO DataNucleus.Datastore: The class
> "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as
> "embedded-only" so does not have its own datastore table.
> 16/07/26 21:05:54 INFO DataNucleus.Datastore: The class
> "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only"
> so does not have its own datastore table.
> 16/07/26 21:05:54 INFO DataNucleus.Datastore: The class
> "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as
> "embedded-only" so does not have its own datastore table.
> 16/07/26 21:05:54 INFO DataNucleus.Datastore: The class
> "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only"
> so does not have its own datastore table.
> 16/07/26 21:05:54 INFO DataNucleus.Query: Reading in results for query
> "org.datanucleus.store.rdbms.query.SQLQuery@0" since the connection used is
> closing
> 16/07/26 21:05:54 INFO metastore.MetaStoreDirectSql: Using direct SQL,
> underlying DB is DERBY
> 16/07/26 21:05:54 INFO metastore.ObjectStore: Initialized ObjectStore
> 16/07/26 21:05:55 INFO metastore.HiveMetaStore: Added admin role in metastore
> 16/07/26 21:05:55 INFO metastore.HiveMetaStore: Added public role in metastore
> 16/07/26 21:05:55 INFO metastore.HiveMetaStore: No user is added in admin
> role, since config is empty
> 16/07/26 21:05:55 INFO metastore.HiveMetaStore: 0: get_all_databases
> 16/07/26 21:05:55 INFO HiveMetaStore.audit: ugi=joechong
> ip=unknown-ip-addr cmd=get_all_databases
> 16/07/26 21:05:55 INFO metastore.HiveMetaStore: 0: get_functions: db=default
> pat=*
> 16/07/26 21:05:55 INFO HiveMetaStore.audit: ugi=joechong
> ip=unknown-ip-addr cmd=get_functions: db=default pat=*
> 16/07/26 21:05:55 INFO DataNucleus.Datastore: The class
> "org.apache.hadoop.hive.metastore.model.MResourceUri" is tagged as
> "embedded-only" so does not have its own datastore table.
> 16/07/26 21:05:55 INFO session.SessionState: Created local directory:
> /var/folders/r7/bs2f87nj6lnd5vm51lvxcw680000gn/T/1c660583-266a-41a2-b01c-682fe360fdf9_resources
> 16/07/26 21:05:55 INFO session.SessionState: Created HDFS directory:
> /tmp/hive/joechong/1c660583-266a-41a2-b01c-682fe360fdf9
> 16/07/26 21:05:55 INFO session.SessionState: Created local directory:
> /var/folders/r7/bs2f87nj6lnd5vm51lvxcw680000gn/T/joechong/1c660583-266a-41a2-b01c-682fe360fdf9
> 16/07/26 21:05:55 INFO session.SessionState: Created HDFS directory:
> /tmp/hive/joechong/1c660583-266a-41a2-b01c-682fe360fdf9/_tmp_space.db
> 16/07/26 21:05:55 INFO repl.SparkILoop: Created sql context (with Hive
> support)..
> SQL context available as sqlContext.
> Submitted a DataFrame show, which took 13 minutes to complete. In between,
> the same exception was raised multiple times: "Failed to check existence of
> class org.apache.spark.sql.catalyst.expressions.GeneratedClass on REPL class
> server".
> scala> creditDF.show
> 16/07/26 21:06:58 INFO mapred.FileInputFormat: Total input paths to process :
> 1
> 16/07/26 21:06:58 INFO spark.SparkContext: Starting job: show at <console>:63
> 16/07/26 21:06:58 INFO scheduler.DAGScheduler: Got job 0 (show at
> <console>:63) with 1 output partitions
> 16/07/26 21:06:58 INFO scheduler.DAGScheduler: Final stage: ResultStage 0
> (show at <console>:63)
> 16/07/26 21:06:58 INFO scheduler.DAGScheduler: Parents of final stage: List()
> 16/07/26 21:06:58 INFO scheduler.DAGScheduler: Missing parents: List()
> 16/07/26 21:06:58 INFO scheduler.DAGScheduler: Submitting ResultStage 0
> (MapPartitionsRDD[10] at show at <console>:63), which has no missing parents
> 16/07/26 21:06:58 INFO storage.MemoryStore: Block broadcast_1 stored as
> values in memory (estimated size 23.1 KB, free 104.4 KB)
> 16/07/26 21:06:58 INFO storage.MemoryStore: Block broadcast_1_piece0 stored
> as bytes in memory (estimated size 8.1 KB, free 112.4 KB)
> 16/07/26 21:06:58 INFO storage.BlockManagerInfo: Added broadcast_1_piece0 in
> memory on localhost:52075 (size: 8.1 KB, free: 3.4 GB)
> 16/07/26 21:06:58 INFO spark.SparkContext: Created broadcast 1 from broadcast
> at DAGScheduler.scala:1006
> 16/07/26 21:06:58 INFO scheduler.DAGScheduler: Submitting 1 missing tasks
> from ResultStage 0 (MapPartitionsRDD[10] at show at <console>:63)
> 16/07/26 21:06:58 INFO scheduler.TaskSchedulerImpl: Adding task set 0.0 with
> 1 tasks
> 16/07/26 21:06:58 INFO scheduler.TaskSetManager: Starting task 0.0 in stage
> 0.0 (TID 0, localhost, partition 0,ANY, 2165 bytes)
> 16/07/26 21:06:58 INFO executor.Executor: Running task 0.0 in stage 0.0 (TID
> 0)
> 16/07/26 21:06:58 INFO spark.CacheManager: Partition rdd_7_0 not found,
> computing it
> 16/07/26 21:06:58 INFO rdd.HadoopRDD: Input split:
> hdfs://localhost:9000/user/joechong/creditrisk/germancredit.csv:0+23896
> 16/07/26 21:06:58 INFO Configuration.deprecation: mapred.tip.id is
> deprecated. Instead, use mapreduce.task.id
> 16/07/26 21:06:58 INFO Configuration.deprecation: mapred.task.id is
> deprecated. Instead, use mapreduce.task.attempt.id
> 16/07/26 21:06:58 INFO Configuration.deprecation: mapred.task.is.map is
> deprecated. Instead, use mapreduce.task.ismap
> 16/07/26 21:06:58 INFO Configuration.deprecation: mapred.task.partition is
> deprecated. Instead, use mapreduce.task.partition
> 16/07/26 21:06:58 INFO Configuration.deprecation: mapred.job.id is
> deprecated. Instead, use mapreduce.job.id
> 16/07/26 21:08:14 ERROR repl.ExecutorClassLoader: Failed to check existence
> of class org.apache.spark.sql.catalyst.expressions.GeneratedClass on REPL
> class server at http://10.199.29.218:52067
> java.net.ConnectException: Operation timed out
> at java.net.PlainSocketImpl.socketConnect(Native Method)
> at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
> at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
> at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
> at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
> at java.net.Socket.connect(Socket.java:589)
> at java.net.Socket.connect(Socket.java:538)
> at sun.net.NetworkClient.doConnect(NetworkClient.java:180)
> at sun.net.www.http.HttpClient.openServer(HttpClient.java:432)
> at sun.net.www.http.HttpClient.openServer(HttpClient.java:527)
> at sun.net.www.http.HttpClient.<init>(HttpClient.java:211)
> at sun.net.www.http.HttpClient.New(HttpClient.java:308)
> at sun.net.www.http.HttpClient.New(HttpClient.java:326)
> at sun.net.www.protocol.http.HttpURLConnection.getNewHttpClient(HttpURLConnection.java:1169)
> at sun.net.www.protocol.http.HttpURLConnection.plainConnect0(HttpURLConnection.java:1105)
> at sun.net.www.protocol.http.HttpURLConnection.plainConnect(HttpURLConnection.java:999)
> at sun.net.www.protocol.http.HttpURLConnection.connect(HttpURLConnection.java:933)
> at org.apache.spark.repl.ExecutorClassLoader.getClassFileInputStreamFromHttpServer(ExecutorClassLoader.scala:108)
> at org.apache.spark.repl.ExecutorClassLoader.findClassLocally(ExecutorClassLoader.scala:146)
> at org.apache.spark.repl.ExecutorClassLoader.findClass(ExecutorClassLoader.scala:76)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:411)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
> at org.codehaus.janino.ClassBodyEvaluator.compileToClass(ClassBodyEvaluator.java:319)
> at org.codehaus.janino.ClassBodyEvaluator.cook(ClassBodyEvaluator.java:233)
> at org.codehaus.janino.SimpleCompiler.cook(SimpleCompiler.java:192)
> at org.codehaus.commons.compiler.Cookable.cook(Cookable.java:84)
> at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator.org$apache$spark$sql$catalyst$expressions$codegen$CodeGenerator$$doCompile(CodeGenerator.scala:550)
> at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$$anon$1.load(CodeGenerator.scala:575)
> at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$$anon$1.load(CodeGenerator.scala:572)
> at org.spark-project.guava.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3599)
> at org.spark-project.guava.cache.LocalCache$Segment.loadSync(LocalCache.java:2379)
> at org.spark-project.guava.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2342)
> at org.spark-project.guava.cache.LocalCache$Segment.get(LocalCache.java:2257)
> at org.spark-project.guava.cache.LocalCache.get(LocalCache.java:4000)
> at org.spark-project.guava.cache.LocalCache.getOrLoad(LocalCache.java:4004)
> at org.spark-project.guava.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4874)
> at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator.compile(CodeGenerator.scala:515)
> at org.apache.spark.sql.catalyst.expressions.codegen.GenerateUnsafeProjection$.create(GenerateUnsafeProjection.scala:358)
> at org.apache.spark.sql.catalyst.expressions.codegen.GenerateUnsafeProjection$.create(GenerateUnsafeProjection.scala:317)
> at org.apache.spark.sql.catalyst.expressions.codegen.GenerateUnsafeProjection$.create(GenerateUnsafeProjection.scala:32)
> at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator.generate(CodeGenerator.scala:588)
> at org.apache.spark.sql.catalyst.expressions.UnsafeProjection$.create(Projection.scala:125)
> at org.apache.spark.sql.catalyst.expressions.UnsafeProjection$.create(Projection.scala:114)
> at org.apache.spark.sql.catalyst.expressions.UnsafeProjection$.create(Projection.scala:108)
> at org.apache.spark.sql.execution.ConvertToUnsafe$$anonfun$1.apply(rowFormatConverters.scala:39)
> at org.apache.spark.sql.execution.ConvertToUnsafe$$anonfun$1.apply(rowFormatConverters.scala:38)
> at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$20.apply(RDD.scala:710)
> at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$20.apply(RDD.scala:710)
> at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
> at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
> at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
> at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
> at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
> at org.apache.spark.CacheManager.getOrCompute(CacheManager.scala:69)
> at org.apache.spark.rdd.RDD.iterator(RDD.scala:268)
> at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
> at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
> at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
> at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
> at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
> at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
> at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
> at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
> at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
> at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
> at org.apache.spark.scheduler.Task.run(Task.scala:89)
> at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> at java.lang.Thread.run(Thread.java:745)
> 16/07/26 21:08:14 INFO codegen.GenerateUnsafeProjection: Code generated in
> 75659.23049 ms
> 16/07/26 21:09:30 ERROR repl.ExecutorClassLoader: Failed to check existence
> of class
> org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificUnsafeProjection
> on REPL class server at http://10.199.29.218:52067
> java.net.ConnectException: Operation timed out
> at java.net.PlainSocketImpl.socketConnect(Native Method)
> at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
> at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
> at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
> at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
> at java.net.Socket.connect(Socket.java:589)
> at java.net.Socket.connect(Socket.java:538)
> at sun.net.NetworkClient.doConnect(NetworkClient.java:180)
> at sun.net.www.http.HttpClient.openServer(HttpClient.java:432)
> at sun.net.www.http.HttpClient.openServer(HttpClient.java:527)
> at sun.net.www.http.HttpClient.<init>(HttpClient.java:211)
> at sun.net.www.http.HttpClient.New(HttpClient.java:308)
> at sun.net.www.http.HttpClient.New(HttpClient.java:326)
> at sun.net.www.protocol.http.HttpURLConnection.getNewHttpClient(HttpURLConnection.java:1169)
> at sun.net.www.protocol.http.HttpURLConnection.plainConnect0(HttpURLConnection.java:1105)
> at sun.net.www.protocol.http.HttpURLConnection.plainConnect(HttpURLConnection.java:999)
> at sun.net.www.protocol.http.HttpURLConnection.connect(HttpURLConnection.java:933)
> at org.apache.spark.repl.ExecutorClassLoader.getClassFileInputStreamFromHttpServer(ExecutorClassLoader.scala:108)
> at org.apache.spark.repl.ExecutorClassLoader.findClassLocally(ExecutorClassLoader.scala:146)
> at org.apache.spark.repl.ExecutorClassLoader.findClass(ExecutorClassLoader.scala:76)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:411)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
> at org.apache.spark.sql.catalyst.expressions.GeneratedClass.generate(Unknown Source)
> at org.apache.spark.sql.catalyst.expressions.codegen.GenerateUnsafeProjection$.create(GenerateUnsafeProjection.scala:359)
> at org.apache.spark.sql.catalyst.expressions.codegen.GenerateUnsafeProjection$.create(GenerateUnsafeProjection.scala:317)
> at org.apache.spark.sql.catalyst.expressions.codegen.GenerateUnsafeProjection$.create(GenerateUnsafeProjection.scala:32)
> at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator.generate(CodeGenerator.scala:588)
> at org.apache.spark.sql.catalyst.expressions.UnsafeProjection$.create(Projection.scala:125)
> at org.apache.spark.sql.catalyst.expressions.UnsafeProjection$.create(Projection.scala:114)
> at org.apache.spark.sql.catalyst.expressions.UnsafeProjection$.create(Projection.scala:108)
> at org.apache.spark.sql.execution.ConvertToUnsafe$$anonfun$1.apply(rowFormatConverters.scala:39)
> at org.apache.spark.sql.execution.ConvertToUnsafe$$anonfun$1.apply(rowFormatConverters.scala:38)
> at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$20.apply(RDD.scala:710)
> at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$20.apply(RDD.scala:710)
> at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
> at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
> at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
> at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
> at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
> at org.apache.spark.CacheManager.getOrCompute(CacheManager.scala:69)
> at org.apache.spark.rdd.RDD.iterator(RDD.scala:268)
> at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
> at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
> at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
> at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
> at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
> at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
> at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
> at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
> at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
> at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
> at org.apache.spark.scheduler.Task.run(Task.scala:89)
> at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> at java.lang.Thread.run(Thread.java:745)
> 16/07/26 21:09:30 INFO storage.MemoryStore: Block rdd_7_0 stored as values in
> memory (estimated size 85.1 KB, free 197.5 KB)
> 16/07/26 21:09:30 INFO storage.BlockManagerInfo: Added rdd_7_0 in memory on
> localhost:52075 (size: 85.1 KB, free: 3.4 GB)
> 16/07/26 21:10:47 ERROR repl.ExecutorClassLoader: Failed to check existence
> of class org.apache.spark.sql.catalyst.expressions.GeneratedClass on REPL
> class server at http://10.199.29.218:52067
> java.net.ConnectException: Operation timed out
> at java.net.PlainSocketImpl.socketConnect(Native Method)
> at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
> at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
> at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
> at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
> at java.net.Socket.connect(Socket.java:589)
> at java.net.Socket.connect(Socket.java:538)
> at sun.net.NetworkClient.doConnect(NetworkClient.java:180)
> at sun.net.www.http.HttpClient.openServer(HttpClient.java:432)
> at sun.net.www.http.HttpClient.openServer(HttpClient.java:527)
> at sun.net.www.http.HttpClient.<init>(HttpClient.java:211)
> at sun.net.www.http.HttpClient.New(HttpClient.java:308)
> at sun.net.www.http.HttpClient.New(HttpClient.java:326)
> at sun.net.www.protocol.http.HttpURLConnection.getNewHttpClient(HttpURLConnection.java:1169)
> at sun.net.www.protocol.http.HttpURLConnection.plainConnect0(HttpURLConnection.java:1105)
> at sun.net.www.protocol.http.HttpURLConnection.plainConnect(HttpURLConnection.java:999)
> at sun.net.www.protocol.http.HttpURLConnection.connect(HttpURLConnection.java:933)
> at org.apache.spark.repl.ExecutorClassLoader.getClassFileInputStreamFromHttpServer(ExecutorClassLoader.scala:108)
> at org.apache.spark.repl.ExecutorClassLoader.findClassLocally(ExecutorClassLoader.scala:146)
> at org.apache.spark.repl.ExecutorClassLoader.findClass(ExecutorClassLoader.scala:76)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:411)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
> at org.codehaus.janino.ClassBodyEvaluator.compileToClass(ClassBodyEvaluator.java:319)
> at org.codehaus.janino.ClassBodyEvaluator.cook(ClassBodyEvaluator.java:233)
> at org.codehaus.janino.SimpleCompiler.cook(SimpleCompiler.java:192)
> at org.codehaus.commons.compiler.Cookable.cook(Cookable.java:84)
> at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator.org$apache$spark$sql$catalyst$expressions$codegen$CodeGenerator$$doCompile(CodeGenerator.scala:550)
> at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$$anon$1.load(CodeGenerator.scala:575)
> at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$$anon$1.load(CodeGenerator.scala:572)
> at org.spark-project.guava.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3599)
> at org.spark-project.guava.cache.LocalCache$Segment.loadSync(LocalCache.java:2379)
> at org.spark-project.guava.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2342)
> at org.spark-project.guava.cache.LocalCache$Segment.get(LocalCache.java:2257)
> at org.spark-project.guava.cache.LocalCache.get(LocalCache.java:4000)
> at org.spark-project.guava.cache.LocalCache.getOrLoad(LocalCache.java:4004)
> at org.spark-project.guava.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4874)
> at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator.compile(CodeGenerator.scala:515)
> at org.apache.spark.sql.catalyst.expressions.codegen.GeneratePredicate$.create(GeneratePredicate.scala:66)
> at org.apache.spark.sql.catalyst.expressions.codegen.GeneratePredicate$.create(GeneratePredicate.scala:33)
> at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator.generate(CodeGenerator.scala:588)
> at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator.generate(CodeGenerator.scala:585)
> at org.apache.spark.sql.execution.SparkPlan.newPredicate(SparkPlan.scala:242)
> at org.apache.spark.sql.execution.columnar.InMemoryColumnarTableScan$$anonfun$doExecute$1.apply(InMemoryColumnarTableScan.scala:301)
> at org.apache.spark.sql.execution.columnar.InMemoryColumnarTableScan$$anonfun$doExecute$1.apply(InMemoryColumnarTableScan.scala:300)
> at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$21.apply(RDD.scala:728)
> at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$21.apply(RDD.scala:728)
> at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
> at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
> at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
> at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
> at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
> at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
> at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
> at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
> at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
> at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
> at org.apache.spark.scheduler.Task.run(Task.scala:89)
> at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> at java.lang.Thread.run(Thread.java:745)
> 16/07/26 21:10:47 INFO codegen.GeneratePredicate: Code generated in 76101.227709 ms
> 16/07/26 21:12:02 ERROR repl.ExecutorClassLoader: Failed to check existence of class org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificPredicate on REPL class server at http://10.199.29.218:52067
> java.net.ConnectException: Operation timed out
> at java.net.PlainSocketImpl.socketConnect(Native Method)
> at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
> at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
> at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
> at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
> at java.net.Socket.connect(Socket.java:589)
> at java.net.Socket.connect(Socket.java:538)
> at sun.net.NetworkClient.doConnect(NetworkClient.java:180)
> at sun.net.www.http.HttpClient.openServer(HttpClient.java:432)
> at sun.net.www.http.HttpClient.openServer(HttpClient.java:527)
> at sun.net.www.http.HttpClient.<init>(HttpClient.java:211)
> at sun.net.www.http.HttpClient.New(HttpClient.java:308)
> at sun.net.www.http.HttpClient.New(HttpClient.java:326)
> at sun.net.www.protocol.http.HttpURLConnection.getNewHttpClient(HttpURLConnection.java:1169)
> at sun.net.www.protocol.http.HttpURLConnection.plainConnect0(HttpURLConnection.java:1105)
> at sun.net.www.protocol.http.HttpURLConnection.plainConnect(HttpURLConnection.java:999)
> at sun.net.www.protocol.http.HttpURLConnection.connect(HttpURLConnection.java:933)
> at org.apache.spark.repl.ExecutorClassLoader.getClassFileInputStreamFromHttpServer(ExecutorClassLoader.scala:108)
> at org.apache.spark.repl.ExecutorClassLoader.findClassLocally(ExecutorClassLoader.scala:146)
> at org.apache.spark.repl.ExecutorClassLoader.findClass(ExecutorClassLoader.scala:76)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:411)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
> at org.apache.spark.sql.catalyst.expressions.GeneratedClass.generate(Unknown Source)
> at org.apache.spark.sql.catalyst.expressions.GeneratedClass.generate(Unknown Source)
> at org.apache.spark.sql.catalyst.expressions.codegen.GeneratePredicate$.create(GeneratePredicate.scala:66)
> at org.apache.spark.sql.catalyst.expressions.codegen.GeneratePredicate$.create(GeneratePredicate.scala:33)
> at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator.generate(CodeGenerator.scala:588)
> at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator.generate(CodeGenerator.scala:585)
> at org.apache.spark.sql.execution.SparkPlan.newPredicate(SparkPlan.scala:242)
> at org.apache.spark.sql.execution.columnar.InMemoryColumnarTableScan$$anonfun$doExecute$1.apply(InMemoryColumnarTableScan.scala:301)
> at org.apache.spark.sql.execution.columnar.InMemoryColumnarTableScan$$anonfun$doExecute$1.apply(InMemoryColumnarTableScan.scala:300)
> at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$21.apply(RDD.scala:728)
> at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$21.apply(RDD.scala:728)
> at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
> at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
> at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
> at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
> at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
> at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
> at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
> at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
> at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
> at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
> at org.apache.spark.scheduler.Task.run(Task.scala:89)
> at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> at java.lang.Thread.run(Thread.java:745)
> 16/07/26 21:13:18 ERROR repl.ExecutorClassLoader: Failed to check existence of class ByteBuffer on REPL class server at http://10.199.29.218:52067
> java.net.ConnectException: Operation timed out
> at java.net.PlainSocketImpl.socketConnect(Native Method)
> at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
> at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
> at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
> at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
> at java.net.Socket.connect(Socket.java:589)
> at java.net.Socket.connect(Socket.java:538)
> at sun.net.NetworkClient.doConnect(NetworkClient.java:180)
> at sun.net.www.http.HttpClient.openServer(HttpClient.java:432)
> at sun.net.www.http.HttpClient.openServer(HttpClient.java:527)
> at sun.net.www.http.HttpClient.<init>(HttpClient.java:211)
> at sun.net.www.http.HttpClient.New(HttpClient.java:308)
> at sun.net.www.http.HttpClient.New(HttpClient.java:326)
> at sun.net.www.protocol.http.HttpURLConnection.getNewHttpClient(HttpURLConnection.java:1169)
> at sun.net.www.protocol.http.HttpURLConnection.plainConnect0(HttpURLConnection.java:1105)
> at sun.net.www.protocol.http.HttpURLConnection.plainConnect(HttpURLConnection.java:999)
> at sun.net.www.protocol.http.HttpURLConnection.connect(HttpURLConnection.java:933)
> at org.apache.spark.repl.ExecutorClassLoader.getClassFileInputStreamFromHttpServer(ExecutorClassLoader.scala:108)
> at org.apache.spark.repl.ExecutorClassLoader.findClassLocally(ExecutorClassLoader.scala:146)
> at org.apache.spark.repl.ExecutorClassLoader.findClass(ExecutorClassLoader.scala:76)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
> at java.lang.Class.forName0(Native Method)
> at java.lang.Class.forName(Class.java:348)
> at org.codehaus.janino.ClassLoaderIClassLoader.findIClass(ClassLoaderIClassLoader.java:78)
> at org.codehaus.janino.IClassLoader.loadIClass(IClassLoader.java:254)
> at org.codehaus.janino.UnitCompiler.findTypeByName(UnitCompiler.java:6893)
> at org.codehaus.janino.UnitCompiler.reclassifyName(UnitCompiler.java:7129)
> at org.codehaus.janino.UnitCompiler.reclassifyName(UnitCompiler.java:6801)
> at org.codehaus.janino.UnitCompiler.reclassify(UnitCompiler.java:6788)
> at org.codehaus.janino.UnitCompiler.getType2(UnitCompiler.java:5419)
> at org.codehaus.janino.UnitCompiler.access$15400(UnitCompiler.java:185)
> at org.codehaus.janino.UnitCompiler$16.visitAmbiguousName(UnitCompiler.java:5149)
> at org.codehaus.janino.Java$AmbiguousName.accept(Java.java:3135)
> at org.codehaus.janino.UnitCompiler.getType(UnitCompiler.java:5159)
> at org.codehaus.janino.UnitCompiler.findIMethod(UnitCompiler.java:7333)
> at org.codehaus.janino.UnitCompiler.getType2(UnitCompiler.java:5663)
> at org.codehaus.janino.UnitCompiler.access$13800(UnitCompiler.java:185)
> at org.codehaus.janino.UnitCompiler$16.visitMethodInvocation(UnitCompiler.java:5132)
> at org.codehaus.janino.Java$MethodInvocation.accept(Java.java:3971)
> at org.codehaus.janino.UnitCompiler.getType(UnitCompiler.java:5159)
> at org.codehaus.janino.UnitCompiler.findIMethod(UnitCompiler.java:7333)
> at org.codehaus.janino.UnitCompiler.getType2(UnitCompiler.java:5663)
> at org.codehaus.janino.UnitCompiler.access$13800(UnitCompiler.java:185)
> at org.codehaus.janino.UnitCompiler$16.visitMethodInvocation(UnitCompiler.java:5132)
> at org.codehaus.janino.Java$MethodInvocation.accept(Java.java:3971)
> at org.codehaus.janino.UnitCompiler.getType(UnitCompiler.java:5159)
> at org.codehaus.janino.UnitCompiler.findMostSpecificIInvocable(UnitCompiler.java:7533)
> at org.codehaus.janino.UnitCompiler.invokeConstructor(UnitCompiler.java:6505)
> at org.codehaus.janino.UnitCompiler.compileGet2(UnitCompiler.java:4126)
> at org.codehaus.janino.UnitCompiler.access$7600(UnitCompiler.java:185)
> at org.codehaus.janino.UnitCompiler$10.visitNewClassInstance(UnitCompiler.java:3275)
> at org.codehaus.janino.Java$NewClassInstance.accept(Java.java:4085)
> at org.codehaus.janino.UnitCompiler.compileGet(UnitCompiler.java:3290)
> at org.codehaus.janino.UnitCompiler.compileGetValue(UnitCompiler.java:4368)
> at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:2669)
> at org.codehaus.janino.UnitCompiler.access$4500(UnitCompiler.java:185)
> at org.codehaus.janino.UnitCompiler$7.visitAssignment(UnitCompiler.java:2619)
> at org.codehaus.janino.Java$Assignment.accept(Java.java:3405)
> at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:2654)
> at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:1643)
> at org.codehaus.janino.UnitCompiler.access$1100(UnitCompiler.java:185)
> at org.codehaus.janino.UnitCompiler$4.visitExpressionStatement(UnitCompiler.java:936)
> at org.codehaus.janino.Java$ExpressionStatement.accept(Java.java:2097)
> at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:958)
> at org.codehaus.janino.UnitCompiler.compileStatements(UnitCompiler.java:1007)
> at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:2293)
> at org.codehaus.janino.UnitCompiler.compileDeclaredMethods(UnitCompiler.java:822)
> at org.codehaus.janino.UnitCompiler.compileDeclaredMethods(UnitCompiler.java:794)
> at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:507)
> at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:658)
> at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:662)
> at org.codehaus.janino.UnitCompiler.access$600(UnitCompiler.java:185)
> at org.codehaus.janino.UnitCompiler$2.visitMemberClassDeclaration(UnitCompiler.java:350)
> at org.codehaus.janino.Java$MemberClassDeclaration.accept(Java.java:1035)
> at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:354)
> at org.codehaus.janino.UnitCompiler.compileDeclaredMemberTypes(UnitCompiler.java:769)
> at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:532)
> at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:393)
> at org.codehaus.janino.UnitCompiler.access$400(UnitCompiler.java:185)
> at org.codehaus.janino.UnitCompiler$2.visitPackageMemberClassDeclaration(UnitCompiler.java:347)
> at org.codehaus.janino.Java$PackageMemberClassDeclaration.accept(Java.java:1139)
> at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:354)
> at org.codehaus.janino.UnitCompiler.compileUnit(UnitCompiler.java:322)
> at org.codehaus.janino.SimpleCompiler.compileToClassLoader(SimpleCompiler.java:383)
> at org.codehaus.janino.ClassBodyEvaluator.compileToClass(ClassBodyEvaluator.java:315)
> at org.codehaus.janino.ClassBodyEvaluator.cook(ClassBodyEvaluator.java:233)
> at org.codehaus.janino.SimpleCompiler.cook(SimpleCompiler.java:192)
> at org.codehaus.commons.compiler.Cookable.cook(Cookable.java:84)
> at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator.org$apache$spark$sql$catalyst$expressions$codegen$CodeGenerator$$doCompile(CodeGenerator.scala:550)
> at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$$anon$1.load(CodeGenerator.scala:575)
> at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$$anon$1.load(CodeGenerator.scala:572)
> at org.spark-project.guava.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3599)
> at org.spark-project.guava.cache.LocalCache$Segment.loadSync(LocalCache.java:2379)
> at org.spark-project.guava.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2342)
> at org.spark-project.guava.cache.LocalCache$Segment.get(LocalCache.java:2257)
> at org.spark-project.guava.cache.LocalCache.get(LocalCache.java:4000)
> at org.spark-project.guava.cache.LocalCache.getOrLoad(LocalCache.java:4004)
> at org.spark-project.guava.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4874)
> at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator.compile(CodeGenerator.scala:515)
> at org.apache.spark.sql.execution.columnar.GenerateColumnAccessor$.create(GenerateColumnAccessor.scala:193)
> at org.apache.spark.sql.execution.columnar.GenerateColumnAccessor$.create(GenerateColumnAccessor.scala:63)
> at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator.generate(CodeGenerator.scala:588)
> at org.apache.spark.sql.execution.columnar.InMemoryColumnarTableScan$$anonfun$doExecute$1.apply(InMemoryColumnarTableScan.scala:338)
> at org.apache.spark.sql.execution.columnar.InMemoryColumnarTableScan$$anonfun$doExecute$1.apply(InMemoryColumnarTableScan.scala:300)
> at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$21.apply(RDD.scala:728)
> at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$21.apply(RDD.scala:728)
> at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
> at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
> at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
> at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
> at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
> at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
> at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
> at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
> at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
> at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
> at org.apache.spark.scheduler.Task.run(Task.scala:89)
> at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> at java.lang.Thread.run(Thread.java:745)
> 16/07/26 21:14:34 ERROR repl.ExecutorClassLoader: Failed to check existence of class ByteOrder on REPL class server at http://10.199.29.218:52067
> java.net.ConnectException: Operation timed out
> at java.net.PlainSocketImpl.socketConnect(Native Method)
> at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
> at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
> at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
> at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
> at java.net.Socket.connect(Socket.java:589)
> at java.net.Socket.connect(Socket.java:538)
> at sun.net.NetworkClient.doConnect(NetworkClient.java:180)
> at sun.net.www.http.HttpClient.openServer(HttpClient.java:432)
> at sun.net.www.http.HttpClient.openServer(HttpClient.java:527)
> at sun.net.www.http.HttpClient.<init>(HttpClient.java:211)
> at sun.net.www.http.HttpClient.New(HttpClient.java:308)
> at sun.net.www.http.HttpClient.New(HttpClient.java:326)
> at sun.net.www.protocol.http.HttpURLConnection.getNewHttpClient(HttpURLConnection.java:1169)
> at sun.net.www.protocol.http.HttpURLConnection.plainConnect0(HttpURLConnection.java:1105)
> at sun.net.www.protocol.http.HttpURLConnection.plainConnect(HttpURLConnection.java:999)
> at sun.net.www.protocol.http.HttpURLConnection.connect(HttpURLConnection.java:933)
> at org.apache.spark.repl.ExecutorClassLoader.getClassFileInputStreamFromHttpServer(ExecutorClassLoader.scala:108)
> at org.apache.spark.repl.ExecutorClassLoader.findClassLocally(ExecutorClassLoader.scala:146)
> at org.apache.spark.repl.ExecutorClassLoader.findClass(ExecutorClassLoader.scala:76)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
> at java.lang.Class.forName0(Native Method)
> at java.lang.Class.forName(Class.java:348)
> at org.codehaus.janino.ClassLoaderIClassLoader.findIClass(ClassLoaderIClassLoader.java:78)
> at org.codehaus.janino.IClassLoader.loadIClass(IClassLoader.java:254)
> at org.codehaus.janino.UnitCompiler.findTypeByName(UnitCompiler.java:6893)
> at org.codehaus.janino.UnitCompiler.reclassifyName(UnitCompiler.java:7129)
> at org.codehaus.janino.UnitCompiler.reclassifyName(UnitCompiler.java:6801)
> at org.codehaus.janino.UnitCompiler.reclassify(UnitCompiler.java:6788)
> at org.codehaus.janino.UnitCompiler.getType2(UnitCompiler.java:5419)
> at org.codehaus.janino.UnitCompiler.access$15400(UnitCompiler.java:185)
> at org.codehaus.janino.UnitCompiler$16.visitAmbiguousName(UnitCompiler.java:5149)
> at org.codehaus.janino.Java$AmbiguousName.accept(Java.java:3135)
> at org.codehaus.janino.UnitCompiler.getType(UnitCompiler.java:5159)
> at org.codehaus.janino.UnitCompiler.findIMethod(UnitCompiler.java:7333)
> at org.codehaus.janino.UnitCompiler.compileGet2(UnitCompiler.java:3873)
> at org.codehaus.janino.UnitCompiler.access$6900(UnitCompiler.java:185)
> at org.codehaus.janino.UnitCompiler$10.visitMethodInvocation(UnitCompiler.java:3263)
> at org.codehaus.janino.Java$MethodInvocation.accept(Java.java:3974)
> at org.codehaus.janino.UnitCompiler.compileGet(UnitCompiler.java:3290)
> at org.codehaus.janino.UnitCompiler.compileGetValue(UnitCompiler.java:4368)
> at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:2669)
> at org.codehaus.janino.UnitCompiler.access$4500(UnitCompiler.java:185)
> at org.codehaus.janino.UnitCompiler$7.visitAssignment(UnitCompiler.java:2619)
> at org.codehaus.janino.Java$Assignment.accept(Java.java:3405)
> at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:2654)
> at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:1643)
> at org.codehaus.janino.UnitCompiler.access$1100(UnitCompiler.java:185)
> at org.codehaus.janino.UnitCompiler$4.visitExpressionStatement(UnitCompiler.java:936)
> at org.codehaus.janino.Java$ExpressionStatement.accept(Java.java:2097)
> at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:958)
> at org.codehaus.janino.UnitCompiler.compileStatements(UnitCompiler.java:1007)
> at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:2293)
> at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:518)
> at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:658)
> at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:662)
> at org.codehaus.janino.UnitCompiler.access$600(UnitCompiler.java:185)
> at org.codehaus.janino.UnitCompiler$2.visitMemberClassDeclaration(UnitCompiler.java:350)
> at org.codehaus.janino.Java$MemberClassDeclaration.accept(Java.java:1035)
> at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:354)
> at org.codehaus.janino.UnitCompiler.compileDeclaredMemberTypes(UnitCompiler.java:769)
> at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:532)
> at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:393)
> at org.codehaus.janino.UnitCompiler.access$400(UnitCompiler.java:185)
> at org.codehaus.janino.UnitCompiler$2.visitPackageMemberClassDeclaration(UnitCompiler.java:347)
> at org.codehaus.janino.Java$PackageMemberClassDeclaration.accept(Java.java:1139)
> at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:354)
> at org.codehaus.janino.UnitCompiler.compileUnit(UnitCompiler.java:322)
> at org.codehaus.janino.SimpleCompiler.compileToClassLoader(SimpleCompiler.java:383)
> at org.codehaus.janino.ClassBodyEvaluator.compileToClass(ClassBodyEvaluator.java:315)
> at org.codehaus.janino.ClassBodyEvaluator.cook(ClassBodyEvaluator.java:233)
> at org.codehaus.janino.SimpleCompiler.cook(SimpleCompiler.java:192)
> at org.codehaus.commons.compiler.Cookable.cook(Cookable.java:84)
> at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator.org$apache$spark$sql$catalyst$expressions$codegen$CodeGenerator$$doCompile(CodeGenerator.scala:550)
> at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$$anon$1.load(CodeGenerator.scala:575)
> at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$$anon$1.load(CodeGenerator.scala:572)
> at org.spark-project.guava.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3599)
> at org.spark-project.guava.cache.LocalCache$Segment.loadSync(LocalCache.java:2379)
> at org.spark-project.guava.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2342)
> at org.spark-project.guava.cache.LocalCache$Segment.get(LocalCache.java:2257)
> at org.spark-project.guava.cache.LocalCache.get(LocalCache.java:4000)
> at org.spark-project.guava.cache.LocalCache.getOrLoad(LocalCache.java:4004)
> at org.spark-project.guava.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4874)
> at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator.compile(CodeGenerator.scala:515)
> at org.apache.spark.sql.execution.columnar.GenerateColumnAccessor$.create(GenerateColumnAccessor.scala:193)
> at org.apache.spark.sql.execution.columnar.GenerateColumnAccessor$.create(GenerateColumnAccessor.scala:63)
> at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator.generate(CodeGenerator.scala:588)
> at org.apache.spark.sql.execution.columnar.InMemoryColumnarTableScan$$anonfun$doExecute$1.apply(InMemoryColumnarTableScan.scala:338)
> at org.apache.spark.sql.execution.columnar.InMemoryColumnarTableScan$$anonfun$doExecute$1.apply(InMemoryColumnarTableScan.scala:300)
> at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$21.apply(RDD.scala:728)
> at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$21.apply(RDD.scala:728)
> at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
> at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
> at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
> at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
> at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
> at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
> at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
> at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
> at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
> at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
> at org.apache.spark.scheduler.Task.run(Task.scala:89)
> at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> at java.lang.Thread.run(Thread.java:745)
> 16/07/26 21:15:49 ERROR repl.ExecutorClassLoader: Failed to check existence of class org.apache.spark.sql.catalyst.expressions.GeneratedClass on REPL class server at http://10.199.29.218:52067
> java.net.ConnectException: Operation timed out
> at java.net.PlainSocketImpl.socketConnect(Native Method)
> at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
> at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
> at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
> at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
> at java.net.Socket.connect(Socket.java:589)
> at java.net.Socket.connect(Socket.java:538)
> at sun.net.NetworkClient.doConnect(NetworkClient.java:180)
> at sun.net.www.http.HttpClient.openServer(HttpClient.java:432)
> at sun.net.www.http.HttpClient.openServer(HttpClient.java:527)
> at sun.net.www.http.HttpClient.<init>(HttpClient.java:211)
> at sun.net.www.http.HttpClient.New(HttpClient.java:308)
> at sun.net.www.http.HttpClient.New(HttpClient.java:326)
> at sun.net.www.protocol.http.HttpURLConnection.getNewHttpClient(HttpURLConnection.java:1169)
> at sun.net.www.protocol.http.HttpURLConnection.plainConnect0(HttpURLConnection.java:1105)
> at sun.net.www.protocol.http.HttpURLConnection.plainConnect(HttpURLConnection.java:999)
> at sun.net.www.protocol.http.HttpURLConnection.connect(HttpURLConnection.java:933)
> at org.apache.spark.repl.ExecutorClassLoader.getClassFileInputStreamFromHttpServer(ExecutorClassLoader.scala:108)
> at org.apache.spark.repl.ExecutorClassLoader.findClassLocally(ExecutorClassLoader.scala:146)
> at org.apache.spark.repl.ExecutorClassLoader.findClass(ExecutorClassLoader.scala:76)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:411)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
> at org.codehaus.janino.ClassBodyEvaluator.compileToClass(ClassBodyEvaluator.java:319)
> at org.codehaus.janino.ClassBodyEvaluator.cook(ClassBodyEvaluator.java:233)
> at org.codehaus.janino.SimpleCompiler.cook(SimpleCompiler.java:192)
> at org.codehaus.commons.compiler.Cookable.cook(Cookable.java:84)
> at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator.org$apache$spark$sql$catalyst$expressions$codegen$CodeGenerator$$doCompile(CodeGenerator.scala:550)
> at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$$anon$1.load(CodeGenerator.scala:575)
> at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$$anon$1.load(CodeGenerator.scala:572)
> at org.spark-project.guava.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3599)
> at org.spark-project.guava.cache.LocalCache$Segment.loadSync(LocalCache.java:2379)
> at org.spark-project.guava.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2342)
> at org.spark-project.guava.cache.LocalCache$Segment.get(LocalCache.java:2257)
> at org.spark-project.guava.cache.LocalCache.get(LocalCache.java:4000)
> at org.spark-project.guava.cache.LocalCache.getOrLoad(LocalCache.java:4004)
> at org.spark-project.guava.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4874)
> at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator.compile(CodeGenerator.scala:515)
> at org.apache.spark.sql.execution.columnar.GenerateColumnAccessor$.create(GenerateColumnAccessor.scala:193)
> at org.apache.spark.sql.execution.columnar.GenerateColumnAccessor$.create(GenerateColumnAccessor.scala:63)
> at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator.generate(CodeGenerator.scala:588)
> at org.apache.spark.sql.execution.columnar.InMemoryColumnarTableScan$$anonfun$doExecute$1.apply(InMemoryColumnarTableScan.scala:338)
> at org.apache.spark.sql.execution.columnar.InMemoryColumnarTableScan$$anonfun$doExecute$1.apply(InMemoryColumnarTableScan.scala:300)
> at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$21.apply(RDD.scala:728)
> at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$21.apply(RDD.scala:728)
> at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
> at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
> at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
> at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
> at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
> at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
> at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
> at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
> at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
> at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
> at org.apache.spark.scheduler.Task.run(Task.scala:89)
> at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> at java.lang.Thread.run(Thread.java:745)
> 16/07/26 21:15:49 INFO columnar.GenerateColumnAccessor: Code generated in 226752.311632 ms
> 16/07/26 21:17:04 ERROR repl.ExecutorClassLoader: Failed to check existence of class org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificColumnarIterator on REPL class server at http://10.199.29.218:52067
> java.net.ConnectException: Operation timed out
> at java.net.PlainSocketImpl.socketConnect(Native Method)
> at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
> at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
> at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
> at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
> at java.net.Socket.connect(Socket.java:589)
> at java.net.Socket.connect(Socket.java:538)
> at sun.net.NetworkClient.doConnect(NetworkClient.java:180)
> at sun.net.www.http.HttpClient.openServer(HttpClient.java:432)
> at sun.net.www.http.HttpClient.openServer(HttpClient.java:527)
> at sun.net.www.http.HttpClient.<init>(HttpClient.java:211)
> at sun.net.www.http.HttpClient.New(HttpClient.java:308)
> at sun.net.www.http.HttpClient.New(HttpClient.java:326)
> at sun.net.www.protocol.http.HttpURLConnection.getNewHttpClient(HttpURLConnection.java:1169)
> at sun.net.www.protocol.http.HttpURLConnection.plainConnect0(HttpURLConnection.java:1105)
> at sun.net.www.protocol.http.HttpURLConnection.plainConnect(HttpURLConnection.java:999)
> at sun.net.www.protocol.http.HttpURLConnection.connect(HttpURLConnection.java:933)
> at org.apache.spark.repl.ExecutorClassLoader.getClassFileInputStreamFromHttpServer(ExecutorClassLoader.scala:108)
> at org.apache.spark.repl.ExecutorClassLoader.findClassLocally(ExecutorClassLoader.scala:146)
> at org.apache.spark.repl.ExecutorClassLoader.findClass(ExecutorClassLoader.scala:76)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:411)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
> at org.apache.spark.sql.catalyst.expressions.GeneratedClass.generate(Unknown Source)
> at org.apache.spark.sql.catalyst.expressions.GeneratedClass.generate(Unknown Source)
> at org.apache.spark.sql.execution.columnar.GenerateColumnAccessor$.create(GenerateColumnAccessor.scala:193)
> at org.apache.spark.sql.execution.columnar.GenerateColumnAccessor$.create(GenerateColumnAccessor.scala:63)
> at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator.generate(CodeGenerator.scala:588)
> at org.apache.spark.sql.execution.columnar.InMemoryColumnarTableScan$$anonfun$doExecute$1.apply(InMemoryColumnarTableScan.scala:338)
> at org.apache.spark.sql.execution.columnar.InMemoryColumnarTableScan$$anonfun$doExecute$1.apply(InMemoryColumnarTableScan.scala:300)
> at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$21.apply(RDD.scala:728)
> at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$21.apply(RDD.scala:728)
> at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
> at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
> at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
> at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
> at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
> at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
> at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
> at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
> at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
> at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
> at org.apache.spark.scheduler.Task.run(Task.scala:89)
> at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> at java.lang.Thread.run(Thread.java:745)
> 16/07/26 21:18:20 ERROR repl.ExecutorClassLoader: Failed to check existence of class org.apache.spark.sql.catalyst.expressions.GeneratedClass on REPL class server at http://10.199.29.218:52067
> java.net.ConnectException: Operation timed out
> at java.net.PlainSocketImpl.socketConnect(Native Method)
> at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
> at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
> at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
> at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
> at java.net.Socket.connect(Socket.java:589)
> at java.net.Socket.connect(Socket.java:538)
> at sun.net.NetworkClient.doConnect(NetworkClient.java:180)
> at sun.net.www.http.HttpClient.openServer(HttpClient.java:432)
> at sun.net.www.http.HttpClient.openServer(HttpClient.java:527)
> at sun.net.www.http.HttpClient.<init>(HttpClient.java:211)
> at sun.net.www.http.HttpClient.New(HttpClient.java:308)
> at sun.net.www.http.HttpClient.New(HttpClient.java:326)
> at sun.net.www.protocol.http.HttpURLConnection.getNewHttpClient(HttpURLConnection.java:1169)
> at sun.net.www.protocol.http.HttpURLConnection.plainConnect0(HttpURLConnection.java:1105)
> at sun.net.www.protocol.http.HttpURLConnection.plainConnect(HttpURLConnection.java:999)
> at sun.net.www.protocol.http.HttpURLConnection.connect(HttpURLConnection.java:933)
> at org.apache.spark.repl.ExecutorClassLoader.getClassFileInputStreamFromHttpServer(ExecutorClassLoader.scala:108)
> at org.apache.spark.repl.ExecutorClassLoader.findClassLocally(ExecutorClassLoader.scala:146)
> at org.apache.spark.repl.ExecutorClassLoader.findClass(ExecutorClassLoader.scala:76)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:411)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
> at org.codehaus.janino.ClassBodyEvaluator.compileToClass(ClassBodyEvaluator.java:319)
> at org.codehaus.janino.ClassBodyEvaluator.cook(ClassBodyEvaluator.java:233)
> at org.codehaus.janino.SimpleCompiler.cook(SimpleCompiler.java:192)
> at org.codehaus.commons.compiler.Cookable.cook(Cookable.java:84)
> at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator.org$apache$spark$sql$catalyst$expressions$codegen$CodeGenerator$$doCompile(CodeGenerator.scala:550)
> at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$$anon$1.load(CodeGenerator.scala:575)
> at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$$anon$1.load(CodeGenerator.scala:572)
> at org.spark-project.guava.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3599)
> at org.spark-project.guava.cache.LocalCache$Segment.loadSync(LocalCache.java:2379)
> at org.spark-project.guava.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2342)
> at org.spark-project.guava.cache.LocalCache$Segment.get(LocalCache.java:2257)
> at org.spark-project.guava.cache.LocalCache.get(LocalCache.java:4000)
> at org.spark-project.guava.cache.LocalCache.getOrLoad(LocalCache.java:4004)
> at org.spark-project.guava.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4874)
> at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator.compile(CodeGenerator.scala:515)
> at org.apache.spark.sql.catalyst.expressions.codegen.GenerateSafeProjection$.create(GenerateSafeProjection.scala:178)
> at org.apache.spark.sql.catalyst.expressions.codegen.GenerateSafeProjection$.create(GenerateSafeProjection.scala:30)
> at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator.generate(CodeGenerator.scala:588)
> at org.apache.spark.sql.catalyst.expressions.FromUnsafeProjection$.create(Projection.scala:180)
> at org.apache.spark.sql.catalyst.expressions.FromUnsafeProjection$.apply(Projection.scala:171)
> at org.apache.spark.sql.execution.ConvertToSafe$$anonfun$2.apply(rowFormatConverters.scala:57)
> at org.apache.spark.sql.execution.ConvertToSafe$$anonfun$2.apply(rowFormatConverters.scala:56)
> at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$20.apply(RDD.scala:710)
> at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$20.apply(RDD.scala:710)
> at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
> at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
> at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
> at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
> at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
> at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
> at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
> at org.apache.spark.scheduler.Task.run(Task.scala:89)
> at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> at java.lang.Thread.run(Thread.java:745)
> 16/07/26 21:18:20 INFO codegen.GenerateSafeProjection: Code generated in 75492.782908 ms
> 16/07/26 21:19:35 ERROR repl.ExecutorClassLoader: Failed to check existence of class org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificSafeProjection on REPL class server at http://10.199.29.218:52067
> java.net.ConnectException: Operation timed out
> at java.net.PlainSocketImpl.socketConnect(Native Method)
> at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
> at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
> at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
> at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
> at java.net.Socket.connect(Socket.java:589)
> at java.net.Socket.connect(Socket.java:538)
> at sun.net.NetworkClient.doConnect(NetworkClient.java:180)
> at sun.net.www.http.HttpClient.openServer(HttpClient.java:432)
> at sun.net.www.http.HttpClient.openServer(HttpClient.java:527)
> at sun.net.www.http.HttpClient.<init>(HttpClient.java:211)
> at sun.net.www.http.HttpClient.New(HttpClient.java:308)
> at sun.net.www.http.HttpClient.New(HttpClient.java:326)
> at sun.net.www.protocol.http.HttpURLConnection.getNewHttpClient(HttpURLConnection.java:1169)
> at sun.net.www.protocol.http.HttpURLConnection.plainConnect0(HttpURLConnection.java:1105)
> at sun.net.www.protocol.http.HttpURLConnection.plainConnect(HttpURLConnection.java:999)
> at sun.net.www.protocol.http.HttpURLConnection.connect(HttpURLConnection.java:933)
> at org.apache.spark.repl.ExecutorClassLoader.getClassFileInputStreamFromHttpServer(ExecutorClassLoader.scala:108)
> at org.apache.spark.repl.ExecutorClassLoader.findClassLocally(ExecutorClassLoader.scala:146)
> at org.apache.spark.repl.ExecutorClassLoader.findClass(ExecutorClassLoader.scala:76)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:411)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
> at org.apache.spark.sql.catalyst.expressions.GeneratedClass.generate(Unknown Source)
> at org.apache.spark.sql.catalyst.expressions.codegen.GenerateSafeProjection$.create(GenerateSafeProjection.scala:179)
> at org.apache.spark.sql.catalyst.expressions.codegen.GenerateSafeProjection$.create(GenerateSafeProjection.scala:30)
> at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator.generate(CodeGenerator.scala:588)
> at org.apache.spark.sql.catalyst.expressions.FromUnsafeProjection$.create(Projection.scala:180)
> at org.apache.spark.sql.catalyst.expressions.FromUnsafeProjection$.apply(Projection.scala:171)
> at org.apache.spark.sql.execution.ConvertToSafe$$anonfun$2.apply(rowFormatConverters.scala:57)
> at org.apache.spark.sql.execution.ConvertToSafe$$anonfun$2.apply(rowFormatConverters.scala:56)
> at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$20.apply(RDD.scala:710)
> at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$20.apply(RDD.scala:710)
> at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
> at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
> at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
> at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
> at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
> at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
> at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
> at org.apache.spark.scheduler.Task.run(Task.scala:89)
> at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> at java.lang.Thread.run(Thread.java:745)
> 16/07/26 21:19:35 INFO executor.Executor: Finished task 0.0 in stage 0.0 (TID 0). 11072 bytes result sent to driver
> 16/07/26 21:19:35 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 757296 ms on localhost (1/1)
> 16/07/26 21:19:35 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
> 16/07/26 21:19:35 INFO scheduler.DAGScheduler: ResultStage 0 (show at <console>:63) finished in 757.319 s
> 16/07/26 21:19:35 INFO scheduler.DAGScheduler: Job 0 finished: show at <console>:63, took 757.396457 s
> +-------------+-------+--------+-------+-------+------+-------+----------+-----------+----------+----------+-----------------+------+----+----------+---------+-------+----------+----------+--------+-------+
> |creditability|balance|duration|history|purpose|amount|savings|employment|instPercent|sexMarried|guarantors|residenceDuration|assets| age|concCredit|apartment|credits|occupation|dependents|hasPhone|foreign|
> +-------------+-------+--------+-------+-------+------+-------+----------+-----------+----------+----------+-----------------+------+----+----------+---------+-------+----------+----------+--------+-------+
> | 1.0| 0.0| 18.0| 4.0| 2.0|1049.0| 0.0| 1.0| 4.0| 1.0| 0.0| 3.0| 1.0|21.0| 2.0| 0.0| 0.0| 2.0| 0.0| 0.0| 0.0|
> | 1.0| 0.0| 9.0| 4.0| 0.0|2799.0| 0.0| 2.0| 2.0| 2.0| 0.0| 1.0| 0.0|36.0| 2.0| 0.0| 1.0| 2.0| 1.0| 0.0| 0.0|
> | 1.0| 1.0| 12.0| 2.0| 9.0| 841.0| 1.0| 3.0| 2.0| 1.0| 0.0| 3.0| 0.0|23.0| 2.0| 0.0| 0.0| 1.0| 0.0| 0.0| 0.0|
> | 1.0| 0.0| 12.0| 4.0| 0.0|2122.0| 0.0| 2.0| 3.0| 2.0| 0.0| 1.0| 0.0|39.0| 2.0| 0.0| 1.0| 1.0| 1.0| 0.0| 1.0|
> | 1.0| 0.0| 12.0| 4.0| 0.0|2171.0| 0.0| 2.0| 4.0| 2.0| 0.0| 3.0| 1.0|38.0| 0.0| 1.0| 1.0| 1.0| 0.0| 0.0| 1.0|
> | 1.0| 0.0| 10.0| 4.0| 0.0|2241.0| 0.0| 1.0| 1.0| 2.0| 0.0| 2.0| 0.0|48.0| 2.0| 0.0| 1.0| 1.0| 1.0| 0.0| 1.0|
> | 1.0| 0.0| 8.0| 4.0| 0.0|3398.0| 0.0| 3.0| 1.0| 2.0| 0.0| 3.0| 0.0|39.0| 2.0| 1.0| 1.0| 1.0| 0.0| 0.0| 1.0|
> | 1.0| 0.0| 6.0| 4.0| 0.0|1361.0| 0.0| 1.0| 2.0| 2.0| 0.0| 3.0| 0.0|40.0| 2.0| 1.0| 0.0| 1.0| 1.0| 0.0| 1.0|
> | 1.0| 3.0| 18.0| 4.0| 3.0|1098.0| 0.0| 0.0| 4.0| 1.0| 0.0| 3.0| 2.0|65.0| 2.0| 1.0| 1.0| 0.0| 0.0| 0.0| 0.0|
> | 1.0| 1.0| 24.0| 2.0| 3.0|3758.0| 2.0| 0.0| 1.0| 1.0| 0.0| 3.0| 3.0|23.0| 2.0| 0.0| 0.0| 0.0| 0.0| 0.0| 0.0|
> | 1.0| 0.0| 11.0| 4.0| 0.0|3905.0| 0.0| 2.0| 2.0| 2.0| 0.0| 1.0| 0.0|36.0| 2.0| 0.0| 1.0| 2.0| 1.0| 0.0| 0.0|
> | 1.0| 0.0| 30.0| 4.0| 1.0|6187.0| 1.0| 3.0| 1.0| 3.0| 0.0| 3.0| 2.0|24.0| 2.0| 0.0| 1.0| 2.0| 0.0| 0.0| 0.0|
> | 1.0| 0.0| 6.0| 4.0| 3.0|1957.0| 0.0| 3.0| 1.0| 1.0| 0.0| 3.0| 2.0|31.0| 2.0| 1.0| 0.0| 2.0| 0.0| 0.0| 0.0|
> | 1.0| 1.0| 48.0| 3.0| 10.0|7582.0| 1.0| 0.0| 2.0| 2.0| 0.0| 3.0| 3.0|31.0| 2.0| 1.0| 0.0| 3.0| 0.0| 1.0| 0.0|
> | 1.0| 0.0| 18.0| 2.0| 3.0|1936.0| 4.0| 3.0| 2.0| 3.0| 0.0| 3.0| 2.0|23.0| 2.0| 0.0| 1.0| 1.0| 0.0| 0.0| 0.0|
> | 1.0| 0.0| 6.0| 2.0| 3.0|2647.0| 2.0| 2.0| 2.0| 2.0| 0.0| 2.0| 0.0|44.0| 2.0| 0.0| 0.0| 2.0| 1.0| 0.0| 0.0|
> | 1.0| 0.0| 11.0| 4.0| 0.0|3939.0| 0.0| 2.0| 1.0| 2.0| 0.0| 1.0| 0.0|40.0| 2.0| 1.0| 1.0| 1.0| 1.0| 0.0| 0.0|
> | 1.0| 1.0| 18.0| 2.0| 3.0|3213.0| 2.0| 1.0| 1.0| 3.0| 0.0| 2.0| 0.0|25.0| 2.0| 0.0| 0.0| 2.0| 0.0| 0.0| 0.0|
> | 1.0| 1.0| 36.0| 4.0| 3.0|2337.0| 0.0| 4.0| 4.0| 2.0| 0.0| 3.0| 0.0|36.0| 2.0| 1.0| 0.0| 2.0| 0.0| 0.0| 0.0|
> | 1.0| 3.0| 11.0| 4.0| 0.0|7228.0| 0.0| 2.0| 1.0| 2.0| 0.0| 3.0| 1.0|39.0| 2.0| 1.0| 1.0| 1.0| 0.0| 0.0| 0.0|
> +-------------+-------+--------+-------+-------+------+-------+----------+-----------+----------+----------+-----------------+------+----+----------+---------+-------+----------+----------+--------+-------+
> only showing top 20 rows
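Every ConnectException in the log above targets the REPL class server at http://10.199.29.218:52067, a non-loopback address of the driver's own machine, which fits the environment diagnosis (the executors cannot reach back to a service the driver itself started) rather than a Spark bug. Not part of the original report, but one hedged workaround sketch for local mode is to pin Spark to loopback; SPARK_LOCAL_IP and spark.driver.host are standard Spark settings, though whether they resolve this particular environment is untested:

```shell
# Untested sketch: in local[*] mode everything runs on one machine, so binding
# to 127.0.0.1 avoids depending on the routable interface address (here the
# unreachable 10.199.29.218) that Spark otherwise picks up.
export SPARK_LOCAL_IP=127.0.0.1
spark-shell --driver-memory 5g --master "local[*]" \
  --conf spark.driver.host=127.0.0.1
```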
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)