Hi Alexey,

Thanks for the advice. With queryEntity.setValueType("DT1") I can now save pairs to
the cache, but I get another exception when I try to read the data back:

scala> cache.count
[Stage 3:>                                                       (0 + 0) /
1024]16/03/08 01:36:24 ERROR Executor: Exception in task 1.0 in stage 3.0
(TID 5)
javax.cache.CacheException: class org.apache.ignite.IgniteCheckedException:
Failed to read class name from file [id=99745,
file=/usr/ignite/work/marshaller/99745.classname]
        at
org.apache.ignite.internal.processors.cache.GridCacheUtils.convertToCacheException(GridCacheUtils.java:1597)
        at
org.apache.ignite.internal.processors.cache.query.GridCacheQueryAdapter$CacheQueryFallbackFuture.retryIfPossible(GridCacheQueryAdapter.java:700)
        at
org.apache.ignite.internal.processors.cache.query.GridCacheQueryAdapter$CacheQueryFallbackFuture.next(GridCacheQueryAdapter.java:670)
        at
org.apache.ignite.internal.processors.cache.IgniteCacheProxy$5.onHasNext(IgniteCacheProxy.java:529)
        at
org.apache.ignite.internal.util.GridCloseableIteratorAdapter.hasNextX(GridCloseableIteratorAdapter.java:53)
        at
org.apache.ignite.internal.util.lang.GridIteratorAdapter.hasNext(GridIteratorAdapter.java:45)
        at
org.apache.ignite.spark.impl.IgniteQueryIterator.hasNext(IgniteQueryIterator.scala:24)
        at org.apache.spark.util.Utils$.getIteratorSize(Utils.scala:1595)
        at org.apache.spark.rdd.RDD$$anonfun$count$1.apply(RDD.scala:1143)
        at org.apache.spark.rdd.RDD$$anonfun$count$1.apply(RDD.scala:1143)
        at
org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1858)
        at
org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1858)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
        at org.apache.spark.scheduler.Task.run(Task.scala:89)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:213)
        at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
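
For reference, the cache is configured roughly along these lines (a sketch only;
the key type and field names below are placeholders, not my actual schema):

```scala
import java.util.{Collections, LinkedHashMap => JLinkedHashMap}
import org.apache.ignite.cache.QueryEntity
import org.apache.ignite.configuration.CacheConfiguration

// Query entity describing the value type stored in the cache.
val queryEntity = new QueryEntity()
queryEntity.setKeyType(classOf[java.lang.Integer].getName) // placeholder key type
// "DT1" is the binary type name used when saving pairs via savePairs:
queryEntity.setValueType("DT1")

// Illustrative queryable field (not my real schema):
val fields = new JLinkedHashMap[String, String]()
fields.put("name", classOf[String].getName)
queryEntity.setFields(fields)

// Cache configuration carrying the query entity.
val cacheCfg = new CacheConfiguration[java.lang.Integer, AnyRef]("testCache")
cacheCfg.setQueryEntities(Collections.singletonList(queryEntity))
```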


I have created a Docker image (Spark 1.6 + Ignite 1.6); perhaps you could have
a quick look:
docker run -it dmitryb/ignite-spark:0.0.1

I built Ignite from https://git-wip-us.apache.org/repos/asf/ignite with
Scala 2.10:
mvn clean package -DskipTests -Dscala-2.10


Regards,
Dmitry



--
View this message in context: 
http://apache-ignite-users.70518.x6.nabble.com/index-and-query-org-apache-ignite-spark-IgniteRDD-String-org-apache-spark-sql-Row-tp3343p3388.html
Sent from the Apache Ignite Users mailing list archive at Nabble.com.