chenpengchuan created SEDONA-53:
-----------------------------------

             Summary: SpatialKnnQuery NullPointerException
                 Key: SEDONA-53
                 URL: https://issues.apache.org/jira/browse/SEDONA-53
             Project: Apache Sedona
          Issue Type: Bug
         Environment: spark-2.4.5-bin-hadoop2.7
java version "1.8.0_60"
            Reporter: chenpengchuan
             Fix For: 1.0.0


Calling SpatialKnnQuery with useIndex set to false and k larger than the SpatialRDD's approximateTotalCount throws the following exception:

Caused by: java.lang.NullPointerException
 at org.datasyslab.geospark.knnJudgement.GeometryDistanceComparator.compare(GeometryDistanceComparator.java:61)
 at org.datasyslab.geospark.knnJudgement.GeometryDistanceComparator.compare(GeometryDistanceComparator.java:29)
 at scala.math.LowPriorityOrderingImplicits$$anon$7.compare(Ordering.scala:153)
 at org.apache.spark.util.collection.Utils$$anon$1.compare(Utils.scala:35)
 at org.spark_project.guava.collect.Ordering.max(Ordering.java:551)
 at org.spark_project.guava.collect.Ordering.leastOf(Ordering.java:667)
 at org.apache.spark.util.collection.Utils$.takeOrdered(Utils.scala:37)
 at org.apache.spark.rdd.RDD$$anonfun$takeOrdered$1$$anonfun$32.apply(RDD.scala:1478)
 at org.apache.spark.rdd.RDD$$anonfun$takeOrdered$1$$anonfun$32.apply(RDD.scala:1475)
 at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:823)
 at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:823)
 at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
 at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
 at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
 at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
 at org.apache.spark.scheduler.Task.run(Task.scala:123)
 at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
 at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
 at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
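Per the trace, the NPE surfaces when GeometryDistanceComparator.compare is invoked from guava's Ordering.leastOf on buffer slots that are null, which can happen when k exceeds the number of available elements. As an illustration only (NullSafeCompareDemo and sortNullSafe are hypothetical names, not Sedona code, and this is not the actual fix), a null-tolerant comparator wrapper sidesteps the crash:

```java
import java.util.Arrays;
import java.util.Comparator;

public class NullSafeCompareDemo {
    // Sort a buffer that may contain null slots (as when k is larger than
    // the number of available elements) without a NullPointerException.
    static Double[] sortNullSafe(Double[] buffer) {
        // nullsLast handles the null check itself, so the wrapped
        // comparator never dereferences a null element
        Arrays.sort(buffer, Comparator.nullsLast(Comparator.<Double>naturalOrder()));
        return buffer;
    }

    public static void main(String[] args) {
        Double[] buffer = {3.0, null, 1.0, null}; // two real distances, two empty slots
        System.out.println(Arrays.toString(sortNullSafe(buffer))); // → [1.0, 3.0, null, null]
    }
}
```

A plain Comparator, like the one at GeometryDistanceComparator.java:61, would instead throw as soon as it touched a null slot.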



--
This message was sent by Atlassian Jira
(v8.3.4#803005)