Have a look at SciSpark, where we’ve integrated Apache Spark and Zeppelin in a
Docker container:

http://github.com/SciSpark/
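
If you want to kick the tires, the general shape is just pulling and running
the image (a hedged sketch: the image name, tag, and ports below are
assumptions, so check the repo’s README for the real ones):

$ docker pull scispark/scispark                    # hypothetical image name
$ docker run -it -p 8080:8080 scispark/scispark    # Zeppelin's default UI port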

Cheers,
Chris

++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Chris Mattmann, Ph.D.
Chief Architect
Instrument Software and Science Data Systems Section (398)
NASA Jet Propulsion Laboratory Pasadena, CA 91109 USA
Office: 168-519, Mailstop: 168-527
Email: [email protected]
WWW:  http://sunset.usc.edu/~mattmann/
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Director, Information Retrieval and Data Science Group (IRDS)
Adjunct Associate Professor, Computer Science Department
University of Southern California, Los Angeles, CA 90089 USA
WWW: http://irds.usc.edu/
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

On 8/2/16, 1:25 PM, "Coakley, Kevin" <[email protected]> wrote:

>Hi Wail,
>
>I was able to get the asterixdb-spark-connector to work as long as AsterixDB,
>Zeppelin, and Spark are all running on the same server.
>
>When I try to access AsterixDB on a remote server, I receive the
>org.apache.hyracks.api.exceptions.HyracksDataException: Connection failure
>error shown at the bottom of this email.
>
>I don’t believe there are any firewalls between the two systems, so I am unsure
>why I am receiving a connection failure. I looked at the Hyracks documentation
>at
>https://github.com/apache/asterixdb/tree/master/hyracks-fullstack/hyracks/hyracks-documentation/src/books/user-guide
>but it didn’t mention anything about how to access Hyracks remotely, and I
>couldn’t find any additional documentation by searching Google.
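>
>As a sanity check, something like this should rule out basic reachability
>problems from the Spark host (a hedged sketch: the second port is a
>placeholder, since the result channel connects to whatever data address and
>port the NC advertises, which is not necessarily the 19002 API port):
>
>$ nc -vz 10.128.5.192 19002     # AsterixDB API port from the --conf below
>$ nc -vz 10.128.5.192 NC_PORT   # hypothetical: the NC result/data port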
>
>
>$ /opt/spark/bin/spark-shell --packages 
>org.apache.asterix:asterixdb-spark-connector_2.10:1.6.0 --conf 
>spark.asterix.connection.host=10.128.5.192 --conf 
>spark.asterix.connection.port=19002 --conf spark.asterix.frame.size=131072
>
>…
>
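>For context, a minimal sketch of how rddAql is typically built with this
>connector (the import path and the sc.aql method are assumptions, based on
>the SparkContextFunctions log lines further down in this thread):
>
>scala> import org.apache.asterix.connector._
>scala> val aqlQuery = """for $x in dataset Metadata.Dataset return $x"""
>scala> val rddAql = sc.aql(aqlQuery)   // hypothetical: yields the AsterixRDD below
>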
>scala>       rddAql.collect().foreach(println)
>16/08/02 20:18:49 DEBUG ClosureCleaner: +++ Cleaning closure <function1> 
>(org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$12) +++
>16/08/02 20:18:49 DEBUG ClosureCleaner:  + declared fields: 2
>16/08/02 20:18:49 DEBUG ClosureCleaner:      public static final long 
>org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$12.serialVersionUID
>16/08/02 20:18:49 DEBUG ClosureCleaner:      private final 
>org.apache.spark.rdd.RDD$$anonfun$collect$1 
>org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$12.$outer
>16/08/02 20:18:49 DEBUG ClosureCleaner:  + declared methods: 2
>16/08/02 20:18:49 DEBUG ClosureCleaner:      public final java.lang.Object 
>org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$12.apply(java.lang.Object)
>16/08/02 20:18:49 DEBUG ClosureCleaner:      public final java.lang.Object 
>org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$12.apply(scala.collection.Iterator)
>16/08/02 20:18:49 DEBUG ClosureCleaner:  + inner classes: 0
>16/08/02 20:18:49 DEBUG ClosureCleaner:  + outer classes: 2
>16/08/02 20:18:49 DEBUG ClosureCleaner:      
>org.apache.spark.rdd.RDD$$anonfun$collect$1
>16/08/02 20:18:49 DEBUG ClosureCleaner:      org.apache.spark.rdd.RDD
>16/08/02 20:18:49 DEBUG ClosureCleaner:  + outer objects: 2
>16/08/02 20:18:49 DEBUG ClosureCleaner:      <function0>
>16/08/02 20:18:49 DEBUG ClosureCleaner:      AsterixRDD[0] at RDD at 
>AsterixRDD.scala:38
>16/08/02 20:18:49 DEBUG ClosureCleaner:  + populating accessed fields because 
>this is the starting closure
>16/08/02 20:18:49 DEBUG ClosureCleaner:  + fields accessed by starting 
>closure: 2
>16/08/02 20:18:49 DEBUG ClosureCleaner:      (class 
>org.apache.spark.rdd.RDD$$anonfun$collect$1,Set($outer))
>16/08/02 20:18:49 DEBUG ClosureCleaner:      (class 
>org.apache.spark.rdd.RDD,Set(org$apache$spark$rdd$RDD$$evidence$1))
>16/08/02 20:18:49 DEBUG ClosureCleaner:  + outermost object is not a closure, 
>so do not clone it: (class org.apache.spark.rdd.RDD,AsterixRDD[0] at RDD at AsterixRDD.scala:38)
>16/08/02 20:18:49 DEBUG ClosureCleaner:  + cloning the object <function0> of 
>class org.apache.spark.rdd.RDD$$anonfun$collect$1
>16/08/02 20:18:49 DEBUG ClosureCleaner:  + cleaning cloned closure <function0> 
>recursively (org.apache.spark.rdd.RDD$$anonfun$collect$1)
>16/08/02 20:18:49 DEBUG ClosureCleaner: +++ Cleaning closure <function0> 
>(org.apache.spark.rdd.RDD$$anonfun$collect$1) +++
>16/08/02 20:18:49 DEBUG ClosureCleaner:  + declared fields: 2
>16/08/02 20:18:49 DEBUG ClosureCleaner:      public static final long 
>org.apache.spark.rdd.RDD$$anonfun$collect$1.serialVersionUID
>16/08/02 20:18:49 DEBUG ClosureCleaner:      private final 
>org.apache.spark.rdd.RDD org.apache.spark.rdd.RDD$$anonfun$collect$1.$outer
>16/08/02 20:18:49 DEBUG ClosureCleaner:  + declared methods: 2
>16/08/02 20:18:49 DEBUG ClosureCleaner:      public org.apache.spark.rdd.RDD 
>org.apache.spark.rdd.RDD$$anonfun$collect$1.org$apache$spark$rdd$RDD$$anonfun$$$outer()
>16/08/02 20:18:49 DEBUG ClosureCleaner:      public final java.lang.Object 
>org.apache.spark.rdd.RDD$$anonfun$collect$1.apply()
>16/08/02 20:18:49 DEBUG ClosureCleaner:  + inner classes: 1
>16/08/02 20:18:49 DEBUG ClosureCleaner:      
>org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$12
>16/08/02 20:18:49 DEBUG ClosureCleaner:  + outer classes: 1
>16/08/02 20:18:49 DEBUG ClosureCleaner:      org.apache.spark.rdd.RDD
>16/08/02 20:18:49 DEBUG ClosureCleaner:  + outer objects: 1
>16/08/02 20:18:49 DEBUG ClosureCleaner:      AsterixRDD[0] at RDD at 
>AsterixRDD.scala:38
>16/08/02 20:18:49 DEBUG ClosureCleaner:  + fields accessed by starting 
>closure: 2
>16/08/02 20:18:49 DEBUG ClosureCleaner:      (class 
>org.apache.spark.rdd.RDD$$anonfun$collect$1,Set($outer))
>16/08/02 20:18:49 DEBUG ClosureCleaner:      (class 
>org.apache.spark.rdd.RDD,Set(org$apache$spark$rdd$RDD$$evidence$1))
>16/08/02 20:18:49 DEBUG ClosureCleaner:  + outermost object is not a closure, 
>so do not clone it: (class org.apache.spark.rdd.RDD,AsterixRDD[0] at RDD at AsterixRDD.scala:38)
>16/08/02 20:18:49 DEBUG ClosureCleaner:  +++ closure <function0> 
>(org.apache.spark.rdd.RDD$$anonfun$collect$1) is now cleaned +++
>16/08/02 20:18:49 DEBUG ClosureCleaner:  +++ closure <function1> 
>(org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$12) is now cleaned +++
>16/08/02 20:18:49 DEBUG ClosureCleaner: +++ Cleaning closure <function2> 
>(org.apache.spark.SparkContext$$anonfun$runJob$5) +++
>16/08/02 20:18:49 DEBUG ClosureCleaner:  + declared fields: 2
>16/08/02 20:18:49 DEBUG ClosureCleaner:      public static final long 
>org.apache.spark.SparkContext$$anonfun$runJob$5.serialVersionUID
>16/08/02 20:18:49 DEBUG ClosureCleaner:      private final scala.Function1 
>org.apache.spark.SparkContext$$anonfun$runJob$5.cleanedFunc$1
>16/08/02 20:18:49 DEBUG ClosureCleaner:  + declared methods: 2
>16/08/02 20:18:49 DEBUG ClosureCleaner:      public final java.lang.Object 
>org.apache.spark.SparkContext$$anonfun$runJob$5.apply(java.lang.Object,java.lang.Object)
>16/08/02 20:18:49 DEBUG ClosureCleaner:      public final java.lang.Object 
>org.apache.spark.SparkContext$$anonfun$runJob$5.apply(org.apache.spark.TaskContext,scala.collection.Iterator)
>16/08/02 20:18:49 DEBUG ClosureCleaner:  + inner classes: 0
>16/08/02 20:18:49 DEBUG ClosureCleaner:  + outer classes: 0
>16/08/02 20:18:49 DEBUG ClosureCleaner:  + outer objects: 0
>16/08/02 20:18:49 DEBUG ClosureCleaner:  + populating accessed fields because 
>this is the starting closure
>16/08/02 20:18:49 DEBUG ClosureCleaner:  + fields accessed by starting 
>closure: 0
>16/08/02 20:18:49 DEBUG ClosureCleaner:  + there are no enclosing objects!
>16/08/02 20:18:49 DEBUG ClosureCleaner:  +++ closure <function2> 
>(org.apache.spark.SparkContext$$anonfun$runJob$5) is now cleaned +++
>16/08/02 20:18:49 INFO SparkContext: Starting job: collect at <console>:42
>16/08/02 20:18:49 INFO DAGScheduler: Got job 0 (collect at <console>:42) with 
>1 output partitions
>16/08/02 20:18:49 INFO DAGScheduler: Final stage: ResultStage 0 (collect at 
><console>:42)
>16/08/02 20:18:49 INFO DAGScheduler: Parents of final stage: List()
>16/08/02 20:18:49 INFO DAGScheduler: Missing parents: List()
>16/08/02 20:18:49 DEBUG DAGScheduler: submitStage(ResultStage 0)
>16/08/02 20:18:49 DEBUG DAGScheduler: missing: List()
>16/08/02 20:18:49 INFO DAGScheduler: Submitting ResultStage 0 (AsterixRDD[0] 
>at RDD at AsterixRDD.scala:38), which has no missing parents
>16/08/02 20:18:49 DEBUG DAGScheduler: submitMissingTasks(ResultStage 0)
>16/08/02 20:18:49 INFO MemoryStore: Block broadcast_0 stored as values in 
>memory (estimated size 1312.0 B, free 1312.0 B)
>16/08/02 20:18:49 DEBUG BlockManager: Put block broadcast_0 locally took  102 
>ms
>16/08/02 20:18:49 DEBUG BlockManager: Putting block broadcast_0 without 
>replication took  103 ms
>16/08/02 20:18:49 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes 
>in memory (estimated size 912.0 B, free 2.2 KB)
>16/08/02 20:18:49 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on 
>localhost:42758 (size: 912.0 B, free: 517.4 MB)
>16/08/02 20:18:49 DEBUG BlockManagerMaster: Updated info of block 
>broadcast_0_piece0
>16/08/02 20:18:49 DEBUG BlockManager: Told master about block 
>broadcast_0_piece0
>16/08/02 20:18:49 DEBUG BlockManager: Put block broadcast_0_piece0 locally 
>took  5 ms
>16/08/02 20:18:49 DEBUG BlockManager: Putting block broadcast_0_piece0 without 
>replication took  5 ms
>16/08/02 20:18:49 INFO SparkContext: Created broadcast 0 from broadcast at 
>DAGScheduler.scala:1006
>16/08/02 20:18:49 INFO DAGScheduler: Submitting 1 missing tasks from 
>ResultStage 0 (AsterixRDD[0] at RDD at AsterixRDD.scala:38)
>16/08/02 20:18:49 DEBUG DAGScheduler: New pending partitions: Set(0)
>16/08/02 20:18:49 INFO TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
>16/08/02 20:18:49 DEBUG TaskSetManager: Epoch for TaskSet 0.0: 0
>16/08/02 20:18:49 DEBUG TaskSetManager: Valid locality levels for TaskSet 0.0: 
>ANY
>16/08/02 20:18:49 DEBUG TaskSchedulerImpl: parentName: , name: TaskSet_0, 
>runningTasks: 0
>16/08/02 20:18:49 DEBUG TaskSetManager: Valid locality levels for TaskSet 0.0: 
>ANY
>16/08/02 20:18:49 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, 
>localhost, partition 0,ANY, 5872 bytes)
>16/08/02 20:18:49 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
>16/08/02 20:18:49 INFO Executor: Fetching 
>http://10.128.5.183:38874/jars/org.apache.httpcomponents_httpclient-4.5.jar 
>with timestamp 1470169042777
>16/08/02 20:18:49 DEBUG Utils: fetchFile not using security
>16/08/02 20:18:49 INFO Utils: Fetching 
>http://10.128.5.183:38874/jars/org.apache.httpcomponents_httpclient-4.5.jar to 
>/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/fetchFileTemp3860074775671300011.tmp
>16/08/02 20:18:49 INFO Executor: Adding 
>file:/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/org.apache.httpcomponents_httpclient-4.5.jar to class loader
>16/08/02 20:18:49 INFO Executor: Fetching 
>http://10.128.5.183:38874/jars/org.scala-lang_scala-compiler-2.10.4.jar with 
>timestamp 1470169042864
>16/08/02 20:18:49 DEBUG Utils: fetchFile not using security
>16/08/02 20:18:49 INFO Utils: Fetching 
>http://10.128.5.183:38874/jars/org.scala-lang_scala-compiler-2.10.4.jar to 
>/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/fetchFileTemp3262286823800108479.tmp
>16/08/02 20:18:49 INFO Executor: Adding 
>file:/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/org.scala-lang_scala-compiler-2.10.4.jar to class loader
>16/08/02 20:18:49 INFO Executor: Fetching 
>http://10.128.5.183:38874/jars/org.slf4j_slf4j-api-1.6.1.jar with timestamp 
>1470169042806
>16/08/02 20:18:49 DEBUG Utils: fetchFile not using security
>16/08/02 20:18:49 INFO Utils: Fetching 
>http://10.128.5.183:38874/jars/org.slf4j_slf4j-api-1.6.1.jar to 
>/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/fetchFileTemp4983533479624179412.tmp
>16/08/02 20:18:50 INFO Executor: Adding 
>file:/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/org.slf4j_slf4j-api-1.6.1.jar to class loader
>16/08/02 20:18:50 INFO Executor: Fetching 
>http://10.128.5.183:38874/jars/net.liftweb_lift-json_2.10-2.6.2.jar with 
>timestamp 1470169042778
>16/08/02 20:18:50 DEBUG Utils: fetchFile not using security
>16/08/02 20:18:50 INFO Utils: Fetching 
>http://10.128.5.183:38874/jars/net.liftweb_lift-json_2.10-2.6.2.jar to 
>/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/fetchFileTemp7676771515275148134.tmp
>16/08/02 20:18:50 INFO Executor: Adding 
>file:/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/net.liftweb_lift-json_2.10-2.6.2.jar to class loader
>16/08/02 20:18:50 INFO Executor: Fetching 
>http://10.128.5.183:38874/jars/commons-logging_commons-logging-1.2.jar with 
>timestamp 1470169042807
>16/08/02 20:18:50 DEBUG Utils: fetchFile not using security
>16/08/02 20:18:50 INFO Utils: Fetching 
>http://10.128.5.183:38874/jars/commons-logging_commons-logging-1.2.jar to 
>/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/fetchFileTemp8603065512312611872.tmp
>16/08/02 20:18:50 INFO Executor: Adding 
>file:/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/commons-logging_commons-logging-1.2.jar to class loader
>16/08/02 20:18:50 INFO Executor: Fetching 
>http://10.128.5.183:38874/jars/org.scala-lang_scalap-2.10.4.jar with timestamp 
>1470169042810
>16/08/02 20:18:50 DEBUG Utils: fetchFile not using security
>16/08/02 20:18:50 INFO Utils: Fetching 
>http://10.128.5.183:38874/jars/org.scala-lang_scalap-2.10.4.jar to 
>/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/fetchFileTemp1356133302927190413.tmp
>16/08/02 20:18:50 INFO Executor: Adding 
>file:/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/org.scala-lang_scalap-2.10.4.jar to class loader
>16/08/02 20:18:50 INFO Executor: Fetching 
>http://10.128.5.183:38874/jars/org.apache.asterix_asterixdb-spark-connector_2.10-1.6.0.jar
> with timestamp 1470169042761
>16/08/02 20:18:50 DEBUG Utils: fetchFile not using security
>16/08/02 20:18:50 INFO Utils: Fetching 
>http://10.128.5.183:38874/jars/org.apache.asterix_asterixdb-spark-connector_2.10-1.6.0.jar
> to /tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/fetchFileTemp5674715681903620103.tmp
>16/08/02 20:18:50 INFO Executor: Adding 
>file:/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/org.apache.asterix_asterixdb-spark-connector_2.10-1.6.0.jar to class loader
>16/08/02 20:18:50 INFO Executor: Fetching 
>http://10.128.5.183:38874/jars/org.apache.hyracks_hyracks-util-0.2.18-SNAPSHOT.jar
> with timestamp 1470169042780
>16/08/02 20:18:50 DEBUG Utils: fetchFile not using security
>16/08/02 20:18:50 INFO Utils: Fetching 
>http://10.128.5.183:38874/jars/org.apache.hyracks_hyracks-util-0.2.18-SNAPSHOT.jar
> to /tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/fetchFileTemp960508663840606601.tmp
>16/08/02 20:18:50 INFO Executor: Adding 
>file:/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/org.apache.hyracks_hyracks-util-0.2.18-SNAPSHOT.jar to class loader
>16/08/02 20:18:50 INFO Executor: Fetching 
>http://10.128.5.183:38874/jars/org.eclipse.jetty_jetty-servlet-8.0.0.RC0.jar 
>with timestamp 1470169042804
>16/08/02 20:18:50 DEBUG Utils: fetchFile not using security
>16/08/02 20:18:50 INFO Utils: Fetching 
>http://10.128.5.183:38874/jars/org.eclipse.jetty_jetty-servlet-8.0.0.RC0.jar 
>to /tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/fetchFileTemp6629368718502114303.tmp
>16/08/02 20:18:50 INFO Executor: Adding 
>file:/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/org.eclipse.jetty_jetty-servlet-8.0.0.RC0.jar to class loader
>16/08/02 20:18:50 INFO Executor: Fetching 
>http://10.128.5.183:38874/jars/org.apache.wicket_wicket-request-1.5.2.jar with 
>timestamp 1470169042806
>16/08/02 20:18:50 DEBUG Utils: fetchFile not using security
>16/08/02 20:18:50 INFO Utils: Fetching 
>http://10.128.5.183:38874/jars/org.apache.wicket_wicket-request-1.5.2.jar to 
>/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/fetchFileTemp1723007311578117804.tmp
>16/08/02 20:18:50 INFO Executor: Adding 
>file:/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/org.apache.wicket_wicket-request-1.5.2.jar to class loader
>16/08/02 20:18:50 INFO Executor: Fetching 
>http://10.128.5.183:38874/jars/org.scala-lang_scala-reflect-2.10.4.jar with 
>timestamp 1470169042881
>16/08/02 20:18:50 DEBUG Utils: fetchFile not using security
>16/08/02 20:18:50 INFO Utils: Fetching 
>http://10.128.5.183:38874/jars/org.scala-lang_scala-reflect-2.10.4.jar to 
>/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/fetchFileTemp7325758342859460972.tmp
>16/08/02 20:18:50 INFO Executor: Adding 
>file:/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/org.scala-lang_scala-reflect-2.10.4.jar to class loader
>16/08/02 20:18:50 INFO Executor: Fetching 
>http://10.128.5.183:38874/jars/commons-io_commons-io-2.4.jar with timestamp 
>1470169042783
>16/08/02 20:18:50 DEBUG Utils: fetchFile not using security
>16/08/02 20:18:50 INFO Utils: Fetching 
>http://10.128.5.183:38874/jars/commons-io_commons-io-2.4.jar to 
>/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/fetchFileTemp5519211317591426960.tmp
>16/08/02 20:18:50 INFO Executor: Adding 
>file:/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/commons-io_commons-io-2.4.jar to class loader
>16/08/02 20:18:50 INFO Executor: Fetching 
>http://10.128.5.183:38874/jars/org.apache.hyracks_hyracks-control-nc-0.2.18-SNAPSHOT.jar
> with timestamp 1470169042764
>16/08/02 20:18:50 DEBUG Utils: fetchFile not using security
>16/08/02 20:18:50 INFO Utils: Fetching 
>http://10.128.5.183:38874/jars/org.apache.hyracks_hyracks-control-nc-0.2.18-SNAPSHOT.jar
> to /tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/fetchFileTemp3919109807459653847.tmp
>16/08/02 20:18:50 INFO Executor: Adding 
>file:/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/org.apache.hyracks_hyracks-control-nc-0.2.18-SNAPSHOT.jar to class loader
>16/08/02 20:18:50 INFO Executor: Fetching 
>http://10.128.5.183:38874/jars/org.eclipse.jetty_jetty-io-8.0.0.RC0.jar with 
>timestamp 1470169042803
>16/08/02 20:18:50 DEBUG Utils: fetchFile not using security
>16/08/02 20:18:50 INFO Utils: Fetching 
>http://10.128.5.183:38874/jars/org.eclipse.jetty_jetty-io-8.0.0.RC0.jar to 
>/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/fetchFileTemp4524176199382352705.tmp
>16/08/02 20:18:50 INFO Executor: Adding 
>file:/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/org.eclipse.jetty_jetty-io-8.0.0.RC0.jar to class loader
>16/08/02 20:18:50 INFO Executor: Fetching 
>http://10.128.5.183:38874/jars/org.apache.hyracks_hyracks-control-cc-0.2.18-SNAPSHOT.jar
> with timestamp 1470169042782
>16/08/02 20:18:50 DEBUG Utils: fetchFile not using security
>16/08/02 20:18:50 INFO Utils: Fetching 
>http://10.128.5.183:38874/jars/org.apache.hyracks_hyracks-control-cc-0.2.18-SNAPSHOT.jar
> to /tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/fetchFileTemp4072935671878616979.tmp
>16/08/02 20:18:50 INFO Executor: Adding 
>file:/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/org.apache.hyracks_hyracks-control-cc-0.2.18-SNAPSHOT.jar to class loader
>16/08/02 20:18:50 INFO Executor: Fetching 
>http://10.128.5.183:38874/jars/org.eclipse.jetty_jetty-security-8.0.0.RC0.jar 
>with timestamp 1470169042805
>16/08/02 20:18:50 DEBUG Utils: fetchFile not using security
>16/08/02 20:18:50 INFO Utils: Fetching 
>http://10.128.5.183:38874/jars/org.eclipse.jetty_jetty-security-8.0.0.RC0.jar 
>to /tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/fetchFileTemp1641593757599706804.tmp
>16/08/02 20:18:50 INFO Executor: Adding 
>file:/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/org.eclipse.jetty_jetty-security-8.0.0.RC0.jar to class loader
>16/08/02 20:18:50 INFO Executor: Fetching 
>http://10.128.5.183:38874/jars/org.eclipse.jetty_jetty-continuation-8.0.0.RC0.jar
> with timestamp 1470169042802
>16/08/02 20:18:50 DEBUG Utils: fetchFile not using security
>16/08/02 20:18:50 INFO Utils: Fetching 
>http://10.128.5.183:38874/jars/org.eclipse.jetty_jetty-continuation-8.0.0.RC0.jar
> to /tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/fetchFileTemp6705043369408149452.tmp
>16/08/02 20:18:50 INFO Executor: Adding 
>file:/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/org.eclipse.jetty_jetty-continuation-8.0.0.RC0.jar to class loader
>16/08/02 20:18:50 INFO Executor: Fetching 
>http://10.128.5.183:38874/jars/org.apache.hyracks_hyracks-data-std-0.2.18-SNAPSHOT.jar
> with timestamp 1470169042782
>16/08/02 20:18:50 DEBUG Utils: fetchFile not using security
>16/08/02 20:18:50 INFO Utils: Fetching 
>http://10.128.5.183:38874/jars/org.apache.hyracks_hyracks-data-std-0.2.18-SNAPSHOT.jar
> to /tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/fetchFileTemp7285196531917051222.tmp
>16/08/02 20:18:50 INFO Executor: Adding 
>file:/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/org.apache.hyracks_hyracks-data-std-0.2.18-SNAPSHOT.jar to class loader
>16/08/02 20:18:50 INFO Executor: Fetching 
>http://10.128.5.183:38874/jars/org.apache.wicket_wicket-util-1.5.2.jar with 
>timestamp 1470169042806
>16/08/02 20:18:50 DEBUG Utils: fetchFile not using security
>16/08/02 20:18:50 INFO Utils: Fetching 
>http://10.128.5.183:38874/jars/org.apache.wicket_wicket-util-1.5.2.jar to 
>/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/fetchFileTemp2425095420520230824.tmp
>16/08/02 20:18:50 INFO Executor: Adding 
>file:/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/org.apache.wicket_wicket-util-1.5.2.jar to class loader
>16/08/02 20:18:50 INFO Executor: Fetching 
>http://10.128.5.183:38874/jars/org.json_json-20090211.jar with timestamp 
>1470169042779
>16/08/02 20:18:50 DEBUG Utils: fetchFile not using security
>16/08/02 20:18:50 INFO Utils: Fetching 
>http://10.128.5.183:38874/jars/org.json_json-20090211.jar to 
>/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/fetchFileTemp7246507731018592850.tmp
>16/08/02 20:18:50 INFO Executor: Adding 
>file:/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/org.json_json-20090211.jar to class loader
>16/08/02 20:18:50 INFO Executor: Fetching 
>http://10.128.5.183:38874/jars/org.apache.hyracks_hyracks-net-0.2.18-SNAPSHOT.jar
> with timestamp 1470169042780
>16/08/02 20:18:50 DEBUG Utils: fetchFile not using security
>16/08/02 20:18:50 INFO Utils: Fetching 
>http://10.128.5.183:38874/jars/org.apache.hyracks_hyracks-net-0.2.18-SNAPSHOT.jar
> to /tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/fetchFileTemp4617288623442337252.tmp
>16/08/02 20:18:50 INFO Executor: Adding 
>file:/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/org.apache.hyracks_hyracks-net-0.2.18-SNAPSHOT.jar to class loader
>16/08/02 20:18:50 INFO Executor: Fetching 
>http://10.128.5.183:38874/jars/com.thoughtworks.paranamer_paranamer-2.4.1.jar 
>with timestamp 1470169042810
>16/08/02 20:18:50 DEBUG Utils: fetchFile not using security
>16/08/02 20:18:50 INFO Utils: Fetching 
>http://10.128.5.183:38874/jars/com.thoughtworks.paranamer_paranamer-2.4.1.jar 
>to /tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/fetchFileTemp4990079550464975335.tmp
>16/08/02 20:18:50 INFO Executor: Adding 
>file:/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/com.thoughtworks.paranamer_paranamer-2.4.1.jar to class loader
>16/08/02 20:18:50 INFO Executor: Fetching 
>http://10.128.5.183:38874/jars/org.apache.hyracks_hyracks-ipc-0.2.18-SNAPSHOT.jar
> with timestamp 1470169042779
>16/08/02 20:18:50 DEBUG Utils: fetchFile not using security
>16/08/02 20:18:50 INFO Utils: Fetching 
>http://10.128.5.183:38874/jars/org.apache.hyracks_hyracks-ipc-0.2.18-SNAPSHOT.jar
> to /tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/fetchFileTemp7836301704668276963.tmp
>16/08/02 20:18:50 INFO Executor: Adding 
>file:/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/org.apache.hyracks_hyracks-ipc-0.2.18-SNAPSHOT.jar to class loader
>16/08/02 20:18:50 INFO Executor: Fetching 
>http://10.128.5.183:38874/jars/org.apache.hyracks_hyracks-control-common-0.2.18-SNAPSHOT.jar
> with timestamp 1470169042783
>16/08/02 20:18:50 DEBUG Utils: fetchFile not using security
>16/08/02 20:18:50 INFO Utils: Fetching 
>http://10.128.5.183:38874/jars/org.apache.hyracks_hyracks-control-common-0.2.18-SNAPSHOT.jar
> to /tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/fetchFileTemp611546527112857923.tmp
>16/08/02 20:18:50 INFO Executor: Adding 
>file:/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/org.apache.hyracks_hyracks-control-common-0.2.18-SNAPSHOT.jar to class loader
>16/08/02 20:18:50 INFO Executor: Fetching 
>http://10.128.5.183:38874/jars/org.eclipse.jetty_jetty-http-8.0.0.RC0.jar with 
>timestamp 1470169042803
>16/08/02 20:18:50 DEBUG Utils: fetchFile not using security
>16/08/02 20:18:50 INFO Utils: Fetching 
>http://10.128.5.183:38874/jars/org.eclipse.jetty_jetty-http-8.0.0.RC0.jar to 
>/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/fetchFileTemp1312243926013444902.tmp
>16/08/02 20:18:50 INFO Executor: Adding 
>file:/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/org.eclipse.jetty_jetty-http-8.0.0.RC0.jar to class loader
>16/08/02 20:18:50 INFO Executor: Fetching 
>http://10.128.5.183:38874/jars/org.ini4j_ini4j-0.5.4.jar with timestamp 
>1470169042801
>16/08/02 20:18:50 DEBUG Utils: fetchFile not using security
>16/08/02 20:18:50 INFO Utils: Fetching 
>http://10.128.5.183:38874/jars/org.ini4j_ini4j-0.5.4.jar to 
>/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/fetchFileTemp6295471697891446278.tmp
>16/08/02 20:18:50 INFO Executor: Adding 
>file:/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/org.ini4j_ini4j-0.5.4.jar to class loader
>16/08/02 20:18:50 INFO Executor: Fetching 
>http://10.128.5.183:38874/jars/org.apache.hyracks_hyracks-comm-0.2.18-SNAPSHOT.jar
> with timestamp 1470169042781
>16/08/02 20:18:50 DEBUG Utils: fetchFile not using security
>16/08/02 20:18:50 INFO Utils: Fetching 
>http://10.128.5.183:38874/jars/org.apache.hyracks_hyracks-comm-0.2.18-SNAPSHOT.jar
> to /tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/fetchFileTemp7808003918156444019.tmp
>16/08/02 20:18:50 INFO Executor: Adding 
>file:/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/org.apache.hyracks_hyracks-comm-0.2.18-SNAPSHOT.jar to class loader
>16/08/02 20:18:50 INFO Executor: Fetching 
>http://10.128.5.183:38874/jars/org.apache.hyracks_hyracks-api-0.2.18-SNAPSHOT.jar
> with timestamp 1470169042763
>16/08/02 20:18:50 DEBUG Utils: fetchFile not using security
>16/08/02 20:18:50 INFO Utils: Fetching 
>http://10.128.5.183:38874/jars/org.apache.hyracks_hyracks-api-0.2.18-SNAPSHOT.jar
> to /tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/fetchFileTemp1674943860545785458.tmp
>16/08/02 20:18:50 INFO Executor: Adding 
>file:/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/org.apache.hyracks_hyracks-api-0.2.18-SNAPSHOT.jar to class loader
>16/08/02 20:18:50 INFO Executor: Fetching 
>http://10.128.5.183:38874/jars/org.apache.hyracks_hyracks-dataflow-common-0.2.18-SNAPSHOT.jar
> with timestamp 1470169042764
>16/08/02 20:18:50 DEBUG Utils: fetchFile not using security
>16/08/02 20:18:50 INFO Utils: Fetching 
>http://10.128.5.183:38874/jars/org.apache.hyracks_hyracks-dataflow-common-0.2.18-SNAPSHOT.jar
> to /tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/fetchFileTemp8322251742856356113.tmp
>16/08/02 20:18:50 INFO Executor: Adding 
>file:/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/org.apache.hyracks_hyracks-dataflow-common-0.2.18-SNAPSHOT.jar to class loader
>16/08/02 20:18:50 INFO Executor: Fetching 
>http://10.128.5.183:38874/jars/org.apache.commons_commons-lang3-3.1.jar with 
>timestamp 1470169042780
>16/08/02 20:18:50 DEBUG Utils: fetchFile not using security
>16/08/02 20:18:50 INFO Utils: Fetching 
>http://10.128.5.183:38874/jars/org.apache.commons_commons-lang3-3.1.jar to 
>/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/fetchFileTemp1348552964800068025.tmp
>16/08/02 20:18:50 INFO Executor: Adding 
>file:/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/org.apache.commons_commons-lang3-3.1.jar to class loader
>16/08/02 20:18:50 INFO Executor: Fetching 
>http://10.128.5.183:38874/jars/org.apache.hyracks_hyracks-client-0.2.18-SNAPSHOT.jar
> with timestamp 1470169042763
>16/08/02 20:18:50 DEBUG Utils: fetchFile not using security
>16/08/02 20:18:50 INFO Utils: Fetching 
>http://10.128.5.183:38874/jars/org.apache.hyracks_hyracks-client-0.2.18-SNAPSHOT.jar
> to /tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/fetchFileTemp2029035581834026042.tmp
>16/08/02 20:18:50 INFO Executor: Adding 
>file:/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/org.apache.hyracks_hyracks-client-0.2.18-SNAPSHOT.jar to class loader
>16/08/02 20:18:50 INFO Executor: Fetching 
>http://10.128.5.183:38874/jars/org.apache.httpcomponents_httpcore-4.4.1.jar 
>with timestamp 1470169042807
>16/08/02 20:18:50 DEBUG Utils: fetchFile not using security
>16/08/02 20:18:50 INFO Utils: Fetching 
>http://10.128.5.183:38874/jars/org.apache.httpcomponents_httpcore-4.4.1.jar to 
>/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/fetchFileTemp1299168202947951583.tmp
>16/08/02 20:18:50 INFO Executor: Adding 
>file:/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/org.apache.httpcomponents_httpcore-4.4.1.jar to class loader
>16/08/02 20:18:50 INFO Executor: Fetching 
>http://10.128.5.183:38874/jars/org.eclipse.jetty_jetty-webapp-8.0.0.RC0.jar 
>with timestamp 1470169042784
>16/08/02 20:18:50 DEBUG Utils: fetchFile not using security
>16/08/02 20:18:50 INFO Utils: Fetching 
>http://10.128.5.183:38874/jars/org.eclipse.jetty_jetty-webapp-8.0.0.RC0.jar to 
>/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/fetchFileTemp6872334740679608070.tmp
>16/08/02 20:18:50 INFO Executor: Adding 
>file:/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/org.eclipse.jetty_jetty-webapp-8.0.0.RC0.jar to class loader
>16/08/02 20:18:50 INFO Executor: Fetching 
>http://10.128.5.183:38874/jars/org.eclipse.jetty_jetty-util-8.0.0.RC0.jar with 
>timestamp 1470169042804
>16/08/02 20:18:50 DEBUG Utils: fetchFile not using security
>16/08/02 20:18:50 INFO Utils: Fetching 
>http://10.128.5.183:38874/jars/org.eclipse.jetty_jetty-util-8.0.0.RC0.jar to 
>/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/fetchFileTemp1130775847969234836.tmp
>16/08/02 20:18:50 INFO Executor: Adding 
>file:/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/org.eclipse.jetty_jetty-util-8.0.0.RC0.jar to class loader
>16/08/02 20:18:50 INFO Executor: Fetching 
>http://10.128.5.183:38874/jars/commons-codec_commons-codec-1.9.jar with 
>timestamp 1470169042808
>16/08/02 20:18:50 DEBUG Utils: fetchFile not using security
>16/08/02 20:18:50 INFO Utils: Fetching 
>http://10.128.5.183:38874/jars/commons-codec_commons-codec-1.9.jar to 
>/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/fetchFileTemp6734358121338695352.tmp
>16/08/02 20:18:50 INFO Executor: Adding 
>file:/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/commons-codec_commons-codec-1.9.jar to class loader
>16/08/02 20:18:50 INFO Executor: Fetching 
>http://10.128.5.183:38874/jars/com.googlecode.json-simple_json-simple-1.1.jar 
>with timestamp 1470169042782
>16/08/02 20:18:50 DEBUG Utils: fetchFile not using security
>16/08/02 20:18:50 INFO Utils: Fetching 
>http://10.128.5.183:38874/jars/com.googlecode.json-simple_json-simple-1.1.jar 
>to /tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/fetchFileTemp1101986196715574441.tmp
>16/08/02 20:18:50 INFO Executor: Adding 
>file:/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/com.googlecode.json-simple_json-simple-1.1.jar to class loader
>16/08/02 20:18:50 INFO Executor: Fetching 
>http://10.128.5.183:38874/jars/org.mortbay.jetty_servlet-api-3.0.20100224.jar 
>with timestamp 1470169042802
>16/08/02 20:18:50 DEBUG Utils: fetchFile not using security
>16/08/02 20:18:50 INFO Utils: Fetching 
>http://10.128.5.183:38874/jars/org.mortbay.jetty_servlet-api-3.0.20100224.jar 
>to /tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/fetchFileTemp8117640740880704102.tmp
>16/08/02 20:18:50 INFO Executor: Adding 
>file:/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/org.mortbay.jetty_servlet-api-3.0.20100224.jar to class loader
>16/08/02 20:18:50 INFO Executor: Fetching 
>http://10.128.5.183:38874/jars/org.eclipse.jetty_jetty-xml-8.0.0.RC0.jar with 
>timestamp 1470169042804
>16/08/02 20:18:50 DEBUG Utils: fetchFile not using security
>16/08/02 20:18:50 INFO Utils: Fetching 
>http://10.128.5.183:38874/jars/org.eclipse.jetty_jetty-xml-8.0.0.RC0.jar to 
>/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/fetchFileTemp6442266218584598106.tmp
>16/08/02 20:18:50 INFO Executor: Adding 
>file:/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/org.eclipse.jetty_jetty-xml-8.0.0.RC0.jar to class loader
>16/08/02 20:18:50 INFO Executor: Fetching 
>http://10.128.5.183:38874/jars/org.apache.wicket_wicket-core-1.5.2.jar with 
>timestamp 1470169042800
>16/08/02 20:18:50 DEBUG Utils: fetchFile not using security
>16/08/02 20:18:50 INFO Utils: Fetching 
>http://10.128.5.183:38874/jars/org.apache.wicket_wicket-core-1.5.2.jar to 
>/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/fetchFileTemp1041242714193765966.tmp
>16/08/02 20:18:50 INFO Executor: Adding 
>file:/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/org.apache.wicket_wicket-core-1.5.2.jar to class loader
>16/08/02 20:18:50 INFO Executor: Fetching 
>http://10.128.5.183:38874/jars/args4j_args4j-2.0.12.jar with timestamp 
>1470169042779
>16/08/02 20:18:50 DEBUG Utils: fetchFile not using security
>16/08/02 20:18:50 INFO Utils: Fetching 
>http://10.128.5.183:38874/jars/args4j_args4j-2.0.12.jar to 
>/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/fetchFileTemp5666028419739000174.tmp
>16/08/02 20:18:50 INFO Executor: Adding 
>file:/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/args4j_args4j-2.0.12.jar to class loader
>16/08/02 20:18:50 INFO Executor: Fetching 
>http://10.128.5.183:38874/jars/org.eclipse.jetty_jetty-server-8.0.0.RC0.jar 
>with timestamp 1470169042784
>16/08/02 20:18:50 DEBUG Utils: fetchFile not using security
>16/08/02 20:18:50 INFO Utils: Fetching 
>http://10.128.5.183:38874/jars/org.eclipse.jetty_jetty-server-8.0.0.RC0.jar to 
>/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/fetchFileTemp1033020651873699094.tmp
>16/08/02 20:18:50 INFO Executor: Adding 
>file:/tmp/spark-67977d02-e7fd-4237-b911-25fb128810f2/userFiles-601aa2fd-61b6-4f3d-a326-fb0e0fb72af0/org.eclipse.jetty_jetty-server-8.0.0.RC0.jar to class loader
>16/08/02 20:18:50 DEBUG Executor: Task 0's epoch is 0
>16/08/02 20:18:50 DEBUG BlockManager: Getting local block broadcast_0
>16/08/02 20:18:50 DEBUG BlockManager: Level for block broadcast_0 is 
>StorageLevel(true, true, false, true, 1)
>16/08/02 20:18:50 DEBUG BlockManager: Getting block broadcast_0 from memory
>java.net.ConnectException: Connection refused
>        at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
>        at 
> sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
>        at 
> org.apache.hyracks.net.protocols.tcp.TCPEndpoint$IOThread.run(TCPEndpoint.java:190)
>java.net.ConnectException: Connection refused
>        at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
>        at 
> sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
>        at 
> org.apache.hyracks.net.protocols.tcp.TCPEndpoint$IOThread.run(TCPEndpoint.java:190)
>java.net.ConnectException: Connection refused
>        at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
>        at 
> sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
>        at 
> org.apache.hyracks.net.protocols.tcp.TCPEndpoint$IOThread.run(TCPEndpoint.java:190)
>java.net.ConnectException: Connection refused
>        at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
>        at 
> sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
>        at 
> org.apache.hyracks.net.protocols.tcp.TCPEndpoint$IOThread.run(TCPEndpoint.java:190)
>java.net.ConnectException: Connection refused
>        at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
>        at 
> sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
>        at 
> org.apache.hyracks.net.protocols.tcp.TCPEndpoint$IOThread.run(TCPEndpoint.java:190)
>java.net.ConnectException: Connection refused
>        at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
>        at 
> sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
>        at 
> org.apache.hyracks.net.protocols.tcp.TCPEndpoint$IOThread.run(TCPEndpoint.java:190)
>java.net.ConnectException: Connection refused
>        at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
>        at 
> sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
>        at 
> org.apache.hyracks.net.protocols.tcp.TCPEndpoint$IOThread.run(TCPEndpoint.java:190)
>16/08/02 20:18:50 ERROR Executor: Exception in task 0.0 in stage 0.0 (TID 0)
>org.apache.hyracks.api.exceptions.HyracksDataException: Connection failure
>        at 
> org.apache.hyracks.comm.channels.DatasetNetworkInputChannel.open(DatasetNetworkInputChannel.java:105)
>        at 
> org.apache.asterix.connector.result.AsterixResultReader.<init>(AsterixResultReader.scala:59)
>        at 
> org.apache.asterix.connector.rdd.AsterixRDD.compute(AsterixRDD.scala:77)
>        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
>        at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
>        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
>        at org.apache.spark.scheduler.Task.run(Task.scala:89)
>        at 
> org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:213)
>        at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>        at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>        at java.lang.Thread.run(Thread.java:745)
>Caused by: org.apache.hyracks.net.exceptions.NetException: Connection failure
>        at 
> org.apache.hyracks.net.protocols.muxdemux.MultiplexedConnection.waitUntilConnected(MultiplexedConnection.java:119)
>        at 
> org.apache.hyracks.net.protocols.muxdemux.MuxDemux.connect(MuxDemux.java:141)
>        at 
> org.apache.hyracks.client.net.ClientNetworkManager.connect(ClientNetworkManager.java:53)
>        at 
> org.apache.hyracks.comm.channels.DatasetNetworkInputChannel.open(DatasetNetworkInputChannel.java:103)
>        ... 10 more
>16/08/02 20:18:50 DEBUG TaskSchedulerImpl: parentName: , name: TaskSet_0, 
>runningTasks: 0
>16/08/02 20:18:50 WARN TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0, 
>localhost): org.apache.hyracks.api.exceptions.HyracksDataException: Connection 
>failure
>        at 
> org.apache.hyracks.comm.channels.DatasetNetworkInputChannel.open(DatasetNetworkInputChannel.java:105)
>        at 
> org.apache.asterix.connector.result.AsterixResultReader.<init>(AsterixResultReader.scala:59)
>        at 
> org.apache.asterix.connector.rdd.AsterixRDD.compute(AsterixRDD.scala:77)
>        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
>        at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
>        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
>        at org.apache.spark.scheduler.Task.run(Task.scala:89)
>        at 
> org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:213)
>        at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>        at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>        at java.lang.Thread.run(Thread.java:745)
>Caused by: org.apache.hyracks.net.exceptions.NetException: Connection failure
>        at 
> org.apache.hyracks.net.protocols.muxdemux.MultiplexedConnection.waitUntilConnected(MultiplexedConnection.java:119)
>        at 
> org.apache.hyracks.net.protocols.muxdemux.MuxDemux.connect(MuxDemux.java:141)
>        at 
> org.apache.hyracks.client.net.ClientNetworkManager.connect(ClientNetworkManager.java:53)
>        at 
> org.apache.hyracks.comm.channels.DatasetNetworkInputChannel.open(DatasetNetworkInputChannel.java:103)
>        ... 10 more
>
>16/08/02 20:18:50 ERROR TaskSetManager: Task 0 in stage 0.0 failed 1 times; 
>aborting job
>16/08/02 20:18:50 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks 
>have all completed, from pool 
>16/08/02 20:18:50 INFO TaskSchedulerImpl: Cancelling stage 0
>16/08/02 20:18:50 INFO DAGScheduler: ResultStage 0 (collect at <console>:42) 
>failed in 1.099 s
>16/08/02 20:18:50 DEBUG DAGScheduler: After removal of stage 0, remaining 
>stages = 0
>16/08/02 20:18:50 INFO DAGScheduler: Job 0 failed: collect at <console>:42, 
>took 1.412302 s
>org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in 
>stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0,
>localhost): org.apache.hyracks.api.exceptions.HyracksDataException:
>Connection failure
>        at 
> org.apache.hyracks.comm.channels.DatasetNetworkInputChannel.open(DatasetNetworkInputChannel.java:105)
>        at 
> org.apache.asterix.connector.result.AsterixResultReader.<init>(AsterixResultReader.scala:59)
>        at 
> org.apache.asterix.connector.rdd.AsterixRDD.compute(AsterixRDD.scala:77)
>        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
>        at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
>        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
>        at org.apache.spark.scheduler.Task.run(Task.scala:89)
>        at 
> org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:213)
>        at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>        at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>        at java.lang.Thread.run(Thread.java:745)
>Caused by: org.apache.hyracks.net.exceptions.NetException: Connection failure
>        at 
> org.apache.hyracks.net.protocols.muxdemux.MultiplexedConnection.waitUntilConnected(MultiplexedConnection.java:119)
>        at 
> org.apache.hyracks.net.protocols.muxdemux.MuxDemux.connect(MuxDemux.java:141)
>        at 
> org.apache.hyracks.client.net.ClientNetworkManager.connect(ClientNetworkManager.java:53)
>        at 
> org.apache.hyracks.comm.channels.DatasetNetworkInputChannel.open(DatasetNetworkInputChannel.java:103)
>        ... 10 more
>
>Driver stacktrace:
>        at 
> org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1431)
>        at 
> org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1419)
>        at 
> org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1418)
>        at 
> scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
>        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
>        at 
> org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1418)
>        at 
> org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:799)
>        at 
> org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:799)
>        at scala.Option.foreach(Option.scala:236)
>        at 
> org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:799)
>        at 
> org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1640)
>        at 
> org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1599)
>        at 
> org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1588)
>        at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
>        at 
> org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:620)
>        at org.apache.spark.SparkContext.runJob(SparkContext.scala:1832)
>        at org.apache.spark.SparkContext.runJob(SparkContext.scala:1845)
>        at org.apache.spark.SparkContext.runJob(SparkContext.scala:1858)
>        at org.apache.spark.SparkContext.runJob(SparkContext.scala:1929)
>        at org.apache.spark.rdd.RDD$$anonfun$collect$1.apply(RDD.scala:927)
>        at 
> org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
>        at 
> org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
>        at org.apache.spark.rdd.RDD.withScope(RDD.scala:316)
>        at org.apache.spark.rdd.RDD.collect(RDD.scala:926)
>        at 
> $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:42)
>        at 
> $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:47)
>        at 
> $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:49)
>        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:51)
>        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:53)
>        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:55)
>        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:57)
>        at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:59)
>        at $iwC$$iwC$$iwC$$iwC.<init>(<console>:61)
>        at $iwC$$iwC$$iwC.<init>(<console>:63)
>        at $iwC$$iwC.<init>(<console>:65)
>        at $iwC.<init>(<console>:67)
>        at <init>(<console>:69)
>        at .<init>(<console>:73)
>        at .<clinit>(<console>)
>        at .<init>(<console>:7)
>        at .<clinit>(<console>)
>        at $print(<console>)
>        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>        at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>        at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>        at java.lang.reflect.Method.invoke(Method.java:498)
>        at 
> org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
>        at 
> org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
>        at 
> org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
>        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
>        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
>        at 
> org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
>        at 
> org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
>        at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
>        at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:657)
>        at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:665)
>        at 
> org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:670)
>        at 
> org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:997)
>        at 
> org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
>        at 
> org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
>        at 
> scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
>        at 
> org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
>        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
>        at org.apache.spark.repl.Main$.main(Main.scala:31)
>        at org.apache.spark.repl.Main.main(Main.scala)
>        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>        at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>        at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>        at java.lang.reflect.Method.invoke(Method.java:498)
>        at 
> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
>        at 
> org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
>        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
>        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
>        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>Caused by: org.apache.hyracks.api.exceptions.HyracksDataException: Connection 
>failure
>        at 
> org.apache.hyracks.comm.channels.DatasetNetworkInputChannel.open(DatasetNetworkInputChannel.java:105)
>        at 
> org.apache.asterix.connector.result.AsterixResultReader.<init>(AsterixResultReader.scala:59)
>        at 
> org.apache.asterix.connector.rdd.AsterixRDD.compute(AsterixRDD.scala:77)
>        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
>        at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
>        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
>        at org.apache.spark.scheduler.Task.run(Task.scala:89)
>        at 
> org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:213)
>        at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>        at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>        at java.lang.Thread.run(Thread.java:745)
>Caused by: org.apache.hyracks.net.exceptions.NetException: Connection failure
>        at 
> org.apache.hyracks.net.protocols.muxdemux.MultiplexedConnection.waitUntilConnected(MultiplexedConnection.java:119)
>        at 
> org.apache.hyracks.net.protocols.muxdemux.MuxDemux.connect(MuxDemux.java:141)
>        at 
> org.apache.hyracks.client.net.ClientNetworkManager.connect(ClientNetworkManager.java:53)
>        at 
> org.apache.hyracks.comm.channels.DatasetNetworkInputChannel.open(DatasetNetworkInputChannel.java:103)
>        ... 10 more
>
>
>
>
>On 7/29/16, 12:31 PM, "Wail Alkowaileet" <[email protected]> wrote:
>
>    Hi Ildar and Kevin,
>    
>    Sorry for the late reply. Yes, the servlet that provides the result
>    locations is still not in the codebase.
>    If you can apply the changes <https://asterix-gerrit.ics.uci.edu/#/c/1003/>
>    to your AsterixDB and try it again, I would be thankful.
>    
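>    For reference, pulling a Gerrit change into a local checkout generally
>    looks like this (a sketch: the fetch URL and the patchset ref are
>    assumptions; the change page's Download box shows the exact command):
>    
>    $ cd asterixdb
>    $ git fetch https://asterix-gerrit.ics.uci.edu/asterixdb refs/changes/03/1003/1
>    $ git cherry-pick FETCH_HEAD
>    $ mvn clean package -DskipTests
>    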
>    I'm still working on the Sonar comments, but it should still work fine.
>    
>    Thanks.
>    
>    On Fri, Jul 29, 2016 at 1:21 AM, Ildar Absalyamov <
>    [email protected]> wrote:
>    
>    > I also got the same error when using the connector inside Zeppelin.
>    >
>    > > On Jul 26, 2016, at 16:40, Coakley, Kevin <[email protected]> wrote:
>    > >
>    > > Hi Wail,
>    > >
>    > > I am running the contents of
>    > https://github.com/Nullification/asterixdb-spark-connector/blob/master/zeppelin-notebook/asterixdb-spark-example/note.json
>    > using spark-shell and I get a 404 error when trying to access
>    > /query/result/location?handle=%7B%22handle%22%3A%5B13%2C0%5D%7D HTTP/1.1. I
>    > don’t know if this is an error with what I am doing, an AsterixDB error or
>    > an error with asterixdb-spark-connector. Thank you for any help that you
>    > can provide.
>    > >
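>    > > The failing request can also be issued outside Spark to isolate the 404
>    > > (a hedged one-liner, reusing the handle from the wire log below):
>    > >
>    > > $ curl -v 'http://10.128.5.170:19002/query/result/location?handle=%7B%22handle%22%3A%5B13%2C0%5D%7D'
>    > >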
>    > > Below is the command that I used and the error (I removed everything
>    > else):
>    > >
>    > > /opt/spark/bin/spark-shell --packages
>    > org.apache.asterix:asterixdb-spark-connector_2.10:1.6.0 --conf
>    > spark.asterix.connection.host=10.128.5.170 --conf
>    > spark.asterix.connection.port=19002 --conf spark.asterix.frame.size=131072
>    > >
>    > > …..
>    > > scala> val df = sqlContext.aql(aqlQuery,infer = true, printCaseClasses = true)
>    > > 16/07/26 23:27:11 INFO SparkContextFunctions:
>    > spark.asterix.connection.host 10.128.5.170
>    > > 16/07/26 23:27:11 INFO SparkContextFunctions:
>    > spark.asterix.connection.port 19002
>    > > 16/07/26 23:27:11 INFO SparkContextFunctions: spark.asterix.frame.size
>    > 131072
>    > > 16/07/26 23:27:11 INFO SparkContextFunctions: spark.asterix.frame.number 1
>    > > 16/07/26 23:27:11 INFO SparkContextFunctions:
>    > spark.asterix.reader.number 2
>    > > 16/07/26 23:27:11 INFO SparkContextFunctions:
>    > spark.asterix.prefetch.threshold 2
>    > > 16/07/26 23:27:11 DEBUG RequestAddCookies: CookieSpec selected:
>    > best-match
>    > > 16/07/26 23:27:11 DEBUG RequestAuthCache: Auth cache not set in the
>    > context
>    > > 16/07/26 23:27:11 DEBUG PoolingHttpClientConnectionManager: Connection
>    > request: [route: {}->http://10.128.5.170:19002][total kept alive: 0;
>    > route allocated:
>    > > 0 of 2; total allocated: 0 of 20]
>    > > 16/07/26 23:27:11 DEBUG PoolingHttpClientConnectionManager: Connection
>    > leased: [id: 0][route: {}->http://10.128.5.170:19002][total kept alive:
>    > 0; route allocated: 1 of 2; total allocated: 1 of 20]
>    > > 16/07/26 23:27:11 DEBUG MainClientExec: Opening connection {}->
>    > http://10.128.5.170:19002
>    > > 16/07/26 23:27:11 DEBUG HttpClientConnectionManager: Connecting to /10.128.5.170:19002
>    > > 16/07/26 23:27:11 DEBUG MainClientExec: Executing request POST
>    > /aql?mode=asynchronous&schema-inferencer=Spark HTTP/1.1
>    > > 16/07/26 23:27:11 DEBUG MainClientExec: Target auth state: UNCHALLENGED
>    > > 16/07/26 23:27:11 DEBUG MainClientExec: Proxy auth state: UNCHALLENGED
>    > > 16/07/26 23:27:11 DEBUG headers: http-outgoing-0 >> POST
>    > /aql?mode=asynchronous&schema-inferencer=Spark HTTP/1.1
>    > > 16/07/26 23:27:11 DEBUG headers: http-outgoing-0 >> Content-Length: 386
>    > > 16/07/26 23:27:11 DEBUG headers: http-outgoing-0 >> Content-Type:
>    > text/plain; charset=UTF-8
>    > > 16/07/26 23:27:11 DEBUG headers: http-outgoing-0 >> Host:
>    > 10.128.5.170:19002
>    > > 16/07/26 23:27:11 DEBUG headers: http-outgoing-0 >> Connection:
>    > Keep-Alive
>    > > 16/07/26 23:27:11 DEBUG headers: http-outgoing-0 >> User-Agent:
>    > Apache-HttpClient/4.3.2 (java 1.5)
>    > > 16/07/26 23:27:11 DEBUG headers: http-outgoing-0 >> Accept-Encoding:
>    > gzip,deflate
>    > > 16/07/26 23:27:11 DEBUG wire: http-outgoing-0 >> "POST
>    > /aql?mode=asynchronous&schema-inferencer=Spark HTTP/1.1[\r][\n]"
>    > > 16/07/26 23:27:11 DEBUG wire: http-outgoing-0 >> "Content-Length:
>    > 386[\r][\n]"
>    > > 16/07/26 23:27:11 DEBUG wire: http-outgoing-0 >> "Content-Type:
>    > text/plain; charset=UTF-8[\r][\n]"
>    > > 16/07/26 23:27:11 DEBUG wire: http-outgoing-0 >> "Host:
>    > 10.128.5.170:19002[\r][\n]"
>    > > 16/07/26 23:27:11 DEBUG wire: http-outgoing-0 >> "Connection:
>    > Keep-Alive[\r][\n]"
>    > > 16/07/26 23:27:11 DEBUG wire: http-outgoing-0 >> "User-Agent:
>    > Apache-HttpClient/4.3.2 (java 1.5)[\r][\n]"
>    > > 16/07/26 23:27:11 DEBUG wire: http-outgoing-0 >> "Accept-Encoding:
>    > gzip,deflate[\r][\n]"
>    > > 16/07/26 23:27:11 DEBUG wire: http-outgoing-0 >> "[\r][\n]"
>    > > 16/07/26 23:27:11 DEBUG wire: http-outgoing-0 >> "[\n]"
>    > > 16/07/26 23:27:11 DEBUG wire: http-outgoing-0 >> "            let
>    > $exampleSet := [[\n]"
>    > > 16/07/26 23:27:11 DEBUG wire: http-outgoing-0 >> "             {"name" 
> :
>    > "Ann", "age" : 20, "salary" : 100000},[\n]"
>    > > 16/07/26 23:27:11 DEBUG wire: http-outgoing-0 >> "             {"name" 
> :
>    > "Bob", "age" : 30, "salary" : 200000},[\n]"
>    > > 16/07/26 23:27:11 DEBUG wire: http-outgoing-0 >> "             {"name" 
> :
>    > "Cat", "age" : 40, "salary" : 300000, "dependents" : [1, 2, 3]},[\n]"
>    > > 16/07/26 23:27:11 DEBUG wire: http-outgoing-0 >> "             {"name" 
> :
>    > "Cat", "age" : 50, "salary" : 400000}[\n]"
>    > > 16/07/26 23:27:11 DEBUG wire: http-outgoing-0 >> "            ][\n]"
>    > > 16/07/26 23:27:11 DEBUG wire: http-outgoing-0 >> "            for $x in
>    > $exampleSet[\n]"
>    > > 16/07/26 23:27:11 DEBUG wire: http-outgoing-0 >> "            return
>    > $x[\n]"
>    > > 16/07/26 23:27:11 DEBUG wire: http-outgoing-0 >> "         "
>    > > 16/07/26 23:27:11 DEBUG wire: http-outgoing-0 << "HTTP/1.1 200
>    > OK[\r][\n]"
>    > > 16/07/26 23:27:11 DEBUG wire: http-outgoing-0 << "Content-Type:
>    > application/json;charset=utf-8[\r][\n]"
>    > > 16/07/26 23:27:11 DEBUG wire: http-outgoing-0 << "Transfer-Encoding:
>    > chunked[\r][\n]"
>    > > 16/07/26 23:27:11 DEBUG wire: http-outgoing-0 << "Server:
>    > Jetty(8.0.0.RC0)[\r][\n]"
>    > > 16/07/26 23:27:11 DEBUG wire: http-outgoing-0 << "[\r][\n]"
>    > > 16/07/26 23:27:11 DEBUG wire: http-outgoing-0 << "11[\r][\n]"
>    > > 16/07/26 23:27:11 DEBUG wire: http-outgoing-0 <<
>    > "{"handle":[13,0]}[\r][\n]"
>    > > 16/07/26 23:27:11 DEBUG wire: http-outgoing-0 << "0[\r][\n]"
>    > > 16/07/26 23:27:11 DEBUG wire: http-outgoing-0 << "[\r][\n]"
>    > > 16/07/26 23:27:11 DEBUG headers: http-outgoing-0 << HTTP/1.1 200 OK
>    > > 16/07/26 23:27:11 DEBUG headers: http-outgoing-0 << Content-Type:
>    > application/json;charset=utf-8
>    > > 16/07/26 23:27:11 DEBUG headers: http-outgoing-0 << Transfer-Encoding:
>    > chunked
>    > > 16/07/26 23:27:11 DEBUG headers: http-outgoing-0 << Server:
>    > Jetty(8.0.0.RC0)
>    > > 16/07/26 23:27:11 DEBUG MainClientExec: Connection can be kept alive
>    > indefinitely
>    > > 16/07/26 23:27:11 DEBUG PoolingHttpClientConnectionManager: Connection
>    > [id: 0][route: {}->http://10.128.5.170:19002] can be kept alive
>    > indefinitely
>    > > 16/07/26 23:27:11 DEBUG PoolingHttpClientConnectionManager: Connection
>    > released: [id: 0][route: {}->http://10.128.5.170:19002][total kept alive:
>    > 1; route al
>    > > located: 1 of 2; total allocated: 1 of 20]
>    > > 16/07/26 23:27:11 DEBUG PoolingHttpClientConnectionManager: Connection
>    > manager is shutting down
>    > > 16/07/26 23:27:11 DEBUG DefaultManagedHttpClientConnection:
>    > http-outgoing-0: Close connection
>    > > 16/07/26 23:27:11 DEBUG DefaultManagedHttpClientConnection:
>    > http-outgoing-0: Close connection
>    > > 16/07/26 23:27:11 DEBUG PoolingHttpClientConnectionManager: Connection
>    > manager shut down
>    > > 16/07/26 23:27:11 INFO AsterixHttpAPI: Response Handle: 
> {"handle":[13,0]}
>    > > 16/07/26 23:27:11 INFO AsterixHttpAPI: Handle(JID:13,RSID:0)
>    > > 16/07/26 23:27:11 DEBUG AsterixHttpAPI: Get status of:
>    > %7B%22handle%22%3A%5B13%2C0%5D%7D
>    > > 16/07/26 23:27:11 DEBUG RequestAddCookies: CookieSpec selected:
>    > best-match
>    > > 16/07/26 23:27:11 DEBUG RequestAuthCache: Auth cache not set in the
>    > context
>    > > 16/07/26 23:27:11 DEBUG PoolingHttpClientConnectionManager: Connection
>    > request: [route: {}->http://10.128.5.170:19002][total kept alive: 0;
>    > route allocated:
>    > > 0 of 2; total allocated: 0 of 20]
>    > > 16/07/26 23:27:11 DEBUG PoolingHttpClientConnectionManager: Connection
>    > leased: [id: 1][route: {}->http://10.128.5.170:19002][total kept alive:
>    > 0; route allo
>    > > cated: 1 of 2; total allocated: 1 of 20]
>    > > 16/07/26 23:27:11 DEBUG MainClientExec: Opening connection {}->
>    > http://10.128.5.170:19002
>    > > 16/07/26 23:27:11 DEBUG HttpClientConnectionManager: Connecting to /
>    > 10.128.5.170:19002
>    > > 16/07/26 23:27:11 DEBUG MainClientExec: Executing request GET
>    > /query/status?handle=%7B%22handle%22%3A%5B13%2C0%5D%7D HTTP/1.1
>    > > 16/07/26 23:27:11 DEBUG MainClientExec: Target auth state: UNCHALLENGED
>    > > 16/07/26 23:27:11 DEBUG MainClientExec: Proxy auth state: UNCHALLENGED
>    > > 16/07/26 23:27:11 DEBUG headers: http-outgoing-1 >> GET
>    > /query/status?handle=%7B%22handle%22%3A%5B13%2C0%5D%7D HTTP/1.1
>    > > 16/07/26 23:27:11 DEBUG headers: http-outgoing-1 >> Host:
>    > 10.128.5.170:19002
>    > > 16/07/26 23:27:11 DEBUG headers: http-outgoing-1 >> Connection:
>    > Keep-Alive
>    > > 16/07/26 23:27:11 DEBUG headers: http-outgoing-1 >> User-Agent:
>    > Apache-HttpClient/4.3.2 (java 1.5)
>    > > 16/07/26 23:27:11 DEBUG headers: http-outgoing-1 >> Accept-Encoding:
>    > gzip,deflate
>    > > 16/07/26 23:27:11 DEBUG wire: http-outgoing-1 >> "GET
>    > /query/status?handle=%7B%22handle%22%3A%5B13%2C0%5D%7D HTTP/1.1[\r][\n]"
>    > > 16/07/26 23:27:11 DEBUG wire: http-outgoing-1 >> "Host:
>    > 10.128.5.170:19002[\r][\n]"
>    > > 16/07/26 23:27:11 DEBUG wire: http-outgoing-1 >> "Connection:
>    > Keep-Alive[\r][\n]"
>    > > 16/07/26 23:27:11 DEBUG wire: http-outgoing-1 >> "User-Agent:
>    > Apache-HttpClient/4.3.2 (java 1.5)[\r][\n]"
>    > > 16/07/26 23:27:11 DEBUG wire: http-outgoing-1 >> "Accept-Encoding:
>    > gzip,deflate[\r][\n]"
>    > > 16/07/26 23:27:11 DEBUG wire: http-outgoing-1 >> "[\r][\n]"
>    > > 16/07/26 23:27:11 DEBUG wire: http-outgoing-1 << "HTTP/1.1 200
>    > OK[\r][\n]"
>    > > 16/07/26 23:27:11 DEBUG wire: http-outgoing-1 << "Content-Type:
>    > text/html;charset=UTF-8[\r][\n]"
>    > > 16/07/26 23:27:11 DEBUG wire: http-outgoing-1 << "Content-Length:
>    > 20[\r][\n]"
>    > > 16/07/26 23:27:11 DEBUG wire: http-outgoing-1 << "Server:
>    > Jetty(8.0.0.RC0)[\r][\n]"
>    > > 16/07/26 23:27:11 DEBUG wire: http-outgoing-1 << "[\r][\n]"
>    > > 16/07/26 23:27:11 DEBUG wire: http-outgoing-1 << "{"status":"SUCCESS"}"
>    > > 16/07/26 23:27:11 DEBUG headers: http-outgoing-1 << HTTP/1.1 200 OK
>    > > 16/07/26 23:27:11 DEBUG headers: http-outgoing-1 << Content-Type:
>    > text/html;charset=UTF-8
>    > > 16/07/26 23:27:11 DEBUG headers: http-outgoing-1 << Content-Length: 20
>    > > 16/07/26 23:27:11 DEBUG headers: http-outgoing-1 << Server:
>    > Jetty(8.0.0.RC0)
>    > > 16/07/26 23:27:11 DEBUG MainClientExec: Connection can be kept alive
>    > indefinitely
>    > > 16/07/26 23:27:11 DEBUG PoolingHttpClientConnectionManager: Connection
>    > [id: 1][route: {}->http://10.128.5.170:19002] can be kept alive
>    > indefinitely
>    > > 16/07/26 23:27:11 DEBUG PoolingHttpClientConnectionManager: Connection
>    > released: [id: 1][route: {}->http://10.128.5.170:19002][total kept alive:
>    > 1; route al
>    > > located: 1 of 2; total allocated: 1 of 20]
>    > > 16/07/26 23:27:11 DEBUG PoolingHttpClientConnectionManager: Connection
>    > manager is shutting down
>    > > 16/07/26 23:27:11 DEBUG DefaultManagedHttpClientConnection:
>    > http-outgoing-1: Close connection
>    > > 16/07/26 23:27:11 DEBUG DefaultManagedHttpClientConnection:
>    > http-outgoing-1: Close connection
>    > > 16/07/26 23:27:11 DEBUG PoolingHttpClientConnectionManager: Connection
>    > manager shut down
>    > > 16/07/26 23:27:11 DEBUG AsterixHttpAPI: Get locations of:
>    > %7B%22handle%22%3A%5B13%2C0%5D%7D
>    > > 16/07/26 23:27:11 DEBUG RequestAddCookies: CookieSpec selected:
>    > best-match
>    > > 16/07/26 23:27:11 DEBUG RequestAuthCache: Auth cache not set in the
>    > context
>    > > 16/07/26 23:27:11 DEBUG PoolingHttpClientConnectionManager: Connection
>    > request: [route: {}->http://10.128.5.170:19002][total kept alive: 0;
>    > route allocated:
>    > > 0 of 2; total allocated: 0 of 20]
>    > > 16/07/26 23:27:11 DEBUG PoolingHttpClientConnectionManager: Connection
>    > leased: [id: 2][route: {}->http://10.128.5.170:19002][total kept alive:
>    > 0; route allo
>    > > cated: 1 of 2; total allocated: 1 of 20]
>    > > 16/07/26 23:27:11 DEBUG MainClientExec: Opening connection {}->
>    > http://10.128.5.170:19002
>    > > 16/07/26 23:27:11 DEBUG HttpClientConnectionManager: Connecting to /
>    > 10.128.5.170:19002
>    > > 16/07/26 23:27:11 DEBUG MainClientExec: Executing request GET
>    > /query/result/location?handle=%7B%22handle%22%3A%5B13%2C0%5D%7D HTTP/1.1
>    > > 16/07/26 23:27:11 DEBUG MainClientExec: Target auth state: UNCHALLENGED
>    > > 16/07/26 23:27:11 DEBUG MainClientExec: Proxy auth state: UNCHALLENGED
>    > > 16/07/26 23:27:11 DEBUG headers: http-outgoing-2 >> GET
>    > /query/result/location?handle=%7B%22handle%22%3A%5B13%2C0%5D%7D HTTP/1.1
>    > > 16/07/26 23:27:11 DEBUG headers: http-outgoing-2 >> Host:
>    > 10.128.5.170:19002
>    > > 16/07/26 23:27:11 DEBUG headers: http-outgoing-2 >> Connection:
>    > Keep-Alive
>    > > 16/07/26 23:27:11 DEBUG headers: http-outgoing-2 >> User-Agent:
>    > Apache-HttpClient/4.3.2 (java 1.5)
>    > > 16/07/26 23:27:11 DEBUG headers: http-outgoing-2 >> Accept-Encoding:
>    > gzip,deflate
>    > > 16/07/26 23:27:11 DEBUG wire: http-outgoing-2 >> "GET
>    > /query/result/location?handle=%7B%22handle%22%3A%5B13%2C0%5D%7D
>    > HTTP/1.1[\r][\n]"
>    > > 16/07/26 23:27:11 DEBUG wire: http-outgoing-2 >> "Host:
>    > 10.128.5.170:19002[\r][\n]"
>    > > 16/07/26 23:27:11 DEBUG wire: http-outgoing-2 >> "Connection:
>    > Keep-Alive[\r][\n]"
>    > > 16/07/26 23:27:11 DEBUG wire: http-outgoing-2 >> "User-Agent:
>    > Apache-HttpClient/4.3.2 (java 1.5)[\r][\n]"
>    > > 16/07/26 23:27:11 DEBUG wire: http-outgoing-2 >> "Accept-Encoding:
>    > gzip,deflate[\r][\n]"
>    > > 16/07/26 23:27:11 DEBUG wire: http-outgoing-2 >> "[\r][\n]"
>    > > 16/07/26 23:27:12 DEBUG wire: http-outgoing-2 << "HTTP/1.1 404 Not
>    > Found[\r][\n]"
>    > > 16/07/26 23:27:12 DEBUG wire: http-outgoing-2 << "Cache-Control:
>    > must-revalidate,no-cache,no-store[\r][\n]"
>    > > 16/07/26 23:27:12 DEBUG wire: http-outgoing-2 << "Content-Type:
>    > text/html;charset=ISO-8859-1[\r][\n]"
>    > > 16/07/26 23:27:12 DEBUG wire: http-outgoing-2 << "Content-Length:
>    > 1288[\r][\n]"
>    > > 16/07/26 23:27:12 DEBUG wire: http-outgoing-2 << "Server:
>    > Jetty(8.0.0.RC0)[\r][\n]"
>    > > 16/07/26 23:27:12 DEBUG wire: http-outgoing-2 << "[\r][\n]"
>    > > 16/07/26 23:27:12 DEBUG wire: http-outgoing-2 << "<html>[\n]"
>    > > 16/07/26 23:27:12 DEBUG wire: http-outgoing-2 << "<head>[\n]"
>    > > 16/07/26 23:27:12 DEBUG wire: http-outgoing-2 << "<meta
>    > http-equiv="Content-Type" content="text/html;charset=ISO-8859-1"/>[\n]"
>    > > 16/07/26 23:27:12 DEBUG wire: http-outgoing-2 << "<title>Error 404 Not
>    > Found</title>[\n]"
>    > > 16/07/26 23:27:12 DEBUG wire: http-outgoing-2 << "</head>[\n]"
>    > > 16/07/26 23:27:12 DEBUG wire: http-outgoing-2 << "<body>[\n]"
>    > > 16/07/26 23:27:12 DEBUG wire: http-outgoing-2 << "<h2>HTTP ERROR:
>    > 404</h2>[\n]"
>    > > 16/07/26 23:27:12 DEBUG wire: http-outgoing-2 << "<p>Problem accessing
>    > /query/result/location. Reason:[\n]"
>    > > 16/07/26 23:27:12 DEBUG wire: http-outgoing-2 << "<pre>    Not
>    > Found</pre></p>[\n]"
>    > > 16/07/26 23:27:12 DEBUG wire: http-outgoing-2 << "<hr
>    > /><i><small>Powered by Jetty://</small></i>[\n]"
>    > > 16/07/26 23:27:12 DEBUG wire: http-outgoing-2 << "
>    >                           [\n]"
>    > > 16/07/26 23:27:12 DEBUG wire: http-outgoing-2 << "</body>[\n]"
>    > > 16/07/26 23:27:12 DEBUG wire: http-outgoing-2 << "</html>[\n]"
>    > > 16/07/26 23:27:12 DEBUG headers: http-outgoing-2 << HTTP/1.1 404 Not
>    > Found
>    > > 16/07/26 23:27:12 DEBUG headers: http-outgoing-2 << Cache-Control:
>    > must-revalidate,no-cache,no-store
>    > > 16/07/26 23:27:12 DEBUG headers: http-outgoing-2 << Content-Type:
>    > text/html;charset=ISO-8859-1
>    > > 16/07/26 23:27:12 DEBUG headers: http-outgoing-2 << Content-Length: 
> 1288
>    > > 16/07/26 23:27:12 DEBUG headers: http-outgoing-2 << Server:
>    > Jetty(8.0.0.RC0)
>    > > 16/07/26 23:27:12 DEBUG MainClientExec: Connection can be kept alive
>    > indefinitely
>    > > 16/07/26 23:27:12 DEBUG PoolingHttpClientConnectionManager: Connection
>    > [id: 2][route: {}->http://10.128.5.170:19002] can be kept alive
>    > indefinitely
>    > > 16/07/26 23:27:12 DEBUG PoolingHttpClientConnectionManager: Connection
>    > released: [id: 2][route: {}->http://10.128.5.170:19002][total kept alive:
>    > 1; route al
>    > > located: 1 of 2; total allocated: 1 of 20]
>    > > 16/07/26 23:27:12 DEBUG PoolingHttpClientConnectionManager: Connection
>    > manager is shutting down
>    > > 16/07/26 23:27:12 DEBUG DefaultManagedHttpClientConnection:
>    > http-outgoing-2: Close connection
>    > > 16/07/26 23:27:12 DEBUG DefaultManagedHttpClientConnection:
>    > http-outgoing-2: Close connection
>    > > 16/07/26 23:27:12 DEBUG PoolingHttpClientConnectionManager: Connection
>    > manager shut down
>    > > 16/07/26 23:27:12 INFO AsterixHttpAPI: Result Locations: <html>
>    > > <head>
>    > > <meta http-equiv="Content-Type" 
> content="text/html;charset=ISO-8859-1"/>
>    > > <title>Error 404 Not Found</title>
>    > > </head>
>    > > <body>
>    > > <h2>HTTP ERROR: 404</h2>
>    > > <p>Problem accessing /query/result/location. Reason:
>    > > <pre>    Not Found</pre></p>
>    > > <hr /><i><small>Powered by Jetty://</small></i>
>    > >
>    > > </body>
>    > > </html>
>    > >
>    > > net.liftweb.json.JsonParser$ParseException: unknown token <
>    > > Near: <h
>    > >        at net.liftweb.json.JsonParser$Parser.fail(JsonParser.scala:234)
>    > >        at
>    > net.liftweb.json.JsonParser$Parser.nextToken(JsonParser.scala:321)
>    > >        at
>    > net.liftweb.json.JsonParser$$anonfun$2.apply(JsonParser.scala:188)
>    > >        at
>    > net.liftweb.json.JsonParser$$anonfun$2.apply(JsonParser.scala:141)
>    > >        at net.liftweb.json.JsonParser$.parse(JsonParser.scala:80)
>    > >        at net.liftweb.json.JsonParser$.parse(JsonParser.scala:45)
>    > >        at net.liftweb.json.package$.parse(package.scala:41)
>    > >        at net.liftweb.json.Serialization$.read(Serialization.scala:58)
>    > >        at
>    > 
> org.apache.asterix.connector.AsterixHttpAPI.getResultLocations(AsterixHttpAPI.scala:129)
>    > >        at
>    > 
> org.apache.asterix.connector.SparkContextFunctions.executeQuery(SparkContextFunctions.scala:103)
>    > >        at
>    > 
> org.apache.asterix.connector.SparkContextFunctions.aql(SparkContextFunctions.scala:80)
>    > >        at
>    > 
> org.apache.spark.sql.asterix.SQLContextFunctions.executeQuery(SQLContextFunctions.scala:103)
>    > >        at
>    > 
> org.apache.spark.sql.asterix.SQLContextFunctions.aql(SQLContextFunctions.scala:84)
>    > >        at
>    > 
> $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:35)
>    > >        at
>    > 
> $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:40)
>    > >        at
>    > $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:42)
>    > >        at
>    > $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:44)
>    > >        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:46)
>    > >        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:48)
>    > >        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:50)
>    > >        at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:52)
>    > >        at $iwC$$iwC$$iwC$$iwC.<init>(<console>:54)
>    > >        at $iwC$$iwC$$iwC.<init>(<console>:56)
>    > >        at $iwC$$iwC.<init>(<console>:58)
>    > >        at $iwC.<init>(<console>:60)
>    > >        at <init>(<console>:62)
>    > >        at .<init>(<console>:66)
>    > >        at .<clinit>(<console>)
>    > >        at .<init>(<console>:7)
>    > >        at .<clinit>(<console>)
>    > >        at $print(<console>)
>    > >        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>    > >        at
>    > 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>    > >        at
>    > 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>    > >        at java.lang.reflect.Method.invoke(Method.java:498)
>    > >        at
>    > 
> org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
>    > >        at
>    > 
> org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
>    > >        at
>    > org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
>    > >        at
>    > org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
>    > >        at
>    > org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
>    > >        at
>    > org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
>    > >        at
>    > 
> org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
>    > >        at 
> org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
>    > >        at
>    > org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:657)
>    > >        at
>    > org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:665)
>    > >        at org.apache.spark.repl.SparkILoop.org
>    > $apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:670)
>    > >        at
>    > 
> org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:997)
>    > >        at
>    > 
> org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
>    > >        at
>    > 
> org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
>    > >        at
>    > 
> scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
>    > >        at org.apache.spark.repl.SparkILoop.org
>    > $apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
>    > >        at 
> org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
>    > >        at org.apache.spark.repl.Main$.main(Main.scala:31)
>    > >        at org.apache.spark.repl.Main.main(Main.scala)
>    > >        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>    > >        at
>    > 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>    > >        at
>    > 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>    > >        at java.lang.reflect.Method.invoke(Method.java:498)
>    > >        at
>    > 
> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
>    > >        at
>    > org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
>    > >        at
>    > org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
>    > >        at
>    > org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
>    > >        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>    > >
>    > >
>    > >
>    > >
>    > >
>    > >
>    > > From: Wail Alkowaileet <[email protected]>
>    > > Reply-To: "[email protected]" <[email protected]>
>    > > Date: Monday, July 18, 2016 at 5:26 AM
>    > > To: "[email protected]" <[email protected]>, "
>    > [email protected]" <[email protected]>
>    > > Subject: Re: Trio: AsterixDB, Spark and Zeppelin.
>    > >
>    > > Sorry. Here's the link for the connector:
>    > > https://github.com/Nullification/asterixdb-spark-connector
>    > >
>    > > On Mon, Jul 18, 2016 at 2:34 PM, Wail Alkowaileet <[email protected]>
>    > wrote:
>    > > Dear all,
>    > >
>    > > I have finally finished cleaning and documenting the AsterixDB-Spark
>    > connector and finalized the Zeppelin interpreter for AQL and SQL++.
>    > >
>    > > AsterixDB-Spark Connector:
>    > > • Supports both AQL and SQL++ queries.
>    > > • Much cleaner code now.
>    > > • If you have ANY problem with it, please create an issue in the project repo.
>    > > • I'm working on a tutorial video covering everything from the build to using it in Zeppelin.
>    > > • I recommend using Zeppelin (you can import the connector example notebook); a minimal spark-shell session is sketched below.
>    > > Source Code: https://github.com/Nullification/asterixdb-spark-connector
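>    > >
>    > > A minimal spark-shell session looks roughly like this, assuming the
>    > > connector's implicits live in org.apache.asterix.connector (check the
>    > > repo README for the exact import):
>    > >
>    > > scala> import org.apache.asterix.connector._   // assumed package for the aql implicits
>    > > scala> val aqlQuery = """
>    > >      |   let $exampleSet := [
>    > >      |     {"name" : "Ann", "age" : 20, "salary" : 100000},
>    > >      |     {"name" : "Bob", "age" : 30, "salary" : 200000}
>    > >      |   ]
>    > >      |   for $x in $exampleSet
>    > >      |   return $x
>    > >      | """
>    > > scala> // infer the Spark schema and print the generated case classes
>    > > scala> val df = sqlContext.aql(aqlQuery, infer = true, printCaseClasses = true)
>    > > scala> df.show()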
>    > >
>    > > Apache Zeppelin with AsterixDB interpreter:
>    > > • Supports JSON flattening (which allows Zeppelin to visualize
>    > results).
>    > > • See attached screenshots.
>    > > • I will try to initiate a pull request to merge it into Zeppelin master.
>    > > Source Code: https://github.com/Nullification/zeppelin
>    > >
>    > > Finally, I just submitted the schema inferencer. I still have to work on some
>    > Sonar comments, and it should be ready soon.
>    > >
>    > > Thanks!
>    > >
>    > > --
>    > >
>    > > Regards,
>    > > Wail Alkowaileet
>    > >
>    > >
>    > >
>    > >
>    > > --
>    > >
>    > > Regards,
>    > > Wail Alkowaileet
>    > >
>    > >
>    >
>    > Best regards,
>    > Ildar
>    >
>    >
>    
>    
>    -- 
>    
>    Regards,
>    Wail Alkowaileet
>    
>
