Re: Multiple servers in an Ignite Cluster

2016-11-10 Thread Tracyl
Hi Vlad, The ideal workflow for my use case is: I host two clusters; one is a computation cluster that runs Spark jobs, the other is a data cluster that hosts Ignite nodes and caches hot data. At run time, multiple Spark jobs share this data cluster and query it. The problem I have is, I am
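A minimal sketch of that topology, assuming the ignite-spark integration module; the constructor details vary across Ignite versions, and the client-mode flag here is my assumption, not something stated in the post:

```scala
import org.apache.spark.SparkContext
import org.apache.ignite.spark.IgniteContext
import org.apache.ignite.configuration.IgniteConfiguration

// Each Spark job attaches to the externally hosted Ignite data cluster
// as a client node, so executors query hot data without owning it.
def attachToDataCluster(sc: SparkContext): IgniteContext = {
  val cfgFactory = () => {
    val cfg = new IgniteConfiguration()
    cfg.setClientMode(true) // join as a client, not as a data-holding server
    cfg
  }
  // In standalone mode the context connects to already-running Ignite
  // servers rather than starting embedded ones inside the executors.
  new IgniteContext(sc, cfgFactory)
}
```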

Re: Multiple servers in an Ignite Cluster

2016-11-09 Thread Tracyl
Thanks. Works fine now. -- View this message in context: http://apache-ignite-users.70518.x6.nabble.com/Multiple-servers-in-a-Ignite-Cluster-tp8840p8851.html Sent from the Apache Ignite Users mailing list archive at Nabble.com.

Re: Multiple servers in an Ignite Cluster

2016-11-09 Thread Tracyl
Thanks. In that case, my question is: how do I define the scope of a cluster (or specify which cluster a server belongs to)? If someone else starts an Ignite node, would my Ignite server auto-discover it as well?
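One way to scope a cluster, sketched below: with TcpDiscoveryVmIpFinder a node only probes the addresses you list, so nodes that share neither the address list nor the port range never discover each other. The address and port values are placeholders:

```scala
import java.util.Collections
import org.apache.ignite.configuration.IgniteConfiguration
import org.apache.ignite.spi.discovery.tcp.TcpDiscoverySpi
import org.apache.ignite.spi.discovery.tcp.ipfinder.vm.TcpDiscoveryVmIpFinder

// Only nodes configured with this exact address list and port range
// find each other; a stranger's node on the default 47500..47509
// range stays invisible to this cluster.
val ipFinder = new TcpDiscoveryVmIpFinder()
ipFinder.setAddresses(Collections.singletonList("xx.xx.xx.xx:48500..48509"))

val spi = new TcpDiscoverySpi()
spi.setIpFinder(ipFinder)
spi.setLocalPort(48500)  // non-default port range isolates the cluster
spi.setLocalPortRange(9)

val cfg = new IgniteConfiguration()
cfg.setDiscoverySpi(spi)
```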

Multiple servers in an Ignite Cluster

2016-11-09 Thread Tracyl
I have the following Ignite config:

    def initializeIgniteConfig() = {
      val ipFinder = new TcpDiscoveryVmIpFinder()
      val HOST = "xx.xx.xx.xx:47500..47509"
      ipFinder.setAddresses(Collections.singletonList(HOST))
      val discoverySpi = new TcpDiscoverySpi()
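The snippet is cut off by the archive; a plausible completion, under the assumption that the method just wires the IP finder into the discovery SPI and returns the configuration:

```scala
import java.util.Collections
import org.apache.ignite.configuration.IgniteConfiguration
import org.apache.ignite.spi.discovery.tcp.TcpDiscoverySpi
import org.apache.ignite.spi.discovery.tcp.ipfinder.vm.TcpDiscoveryVmIpFinder

def initializeIgniteConfig(): IgniteConfiguration = {
  val ipFinder = new TcpDiscoveryVmIpFinder()
  val HOST = "xx.xx.xx.xx:47500..47509" // placeholder kept from the post
  ipFinder.setAddresses(Collections.singletonList(HOST))

  val discoverySpi = new TcpDiscoverySpi()
  discoverySpi.setIpFinder(ipFinder)

  val cfg = new IgniteConfiguration()
  cfg.setDiscoverySpi(discoverySpi)
  cfg
}
```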

Re: What's the difference between EntryProcessor and distributed closure?

2016-11-08 Thread Tracyl
Thanks Alexey. By predicate/projection pushdown, I mean: currently I am storing a native Spark Row object as the value format of an IgniteCache. If I retrieve it as an IgniteRDD, I only want certain columns of that Row object rather than returning the entire Row and doing filter/projection at the Spark level. Do
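For column projection at the cache layer, one option (not from the thread) is Ignite SQL: a SqlFieldsQuery returns only the selected fields, so whole values never leave the data nodes. The cache name and the queryable Person fields below are hypothetical:

```scala
import scala.collection.JavaConverters._
import org.apache.ignite.Ignite
import org.apache.ignite.cache.query.SqlFieldsQuery

// Project two columns server-side instead of shipping entire values.
def selectColumns(ignite: Ignite): Unit = {
  val cache = ignite.cache[String, AnyRef]("personCache")
  val qry = new SqlFieldsQuery(
    "select name, salary from Person where salary > ?").setArgs(Long.box(1000L))
  for (row <- cache.query(qry).getAll.asScala)
    println(row) // each row is a java.util.List of the selected fields
}
```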

What's the difference between EntryProcessor and distributed closure?

2016-11-07 Thread Tracyl
What I would like to do is achieve predicate/column-projection push-down to the Ignite cache layer. I guess these two options could do it, couldn't they? If so, what's the difference? Are there any other options to achieve predicate push-down? Thanks in advance!
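The short version, with a sketch: an EntryProcessor is shipped to the node that owns a key and runs against that single entry in place, returning only the (small) result; a distributed closure runs arbitrary code on cluster nodes but is not tied to a particular cache entry. The Person model here is hypothetical:

```scala
import javax.cache.processor.{EntryProcessor, MutableEntry}
import org.apache.ignite.IgniteCache

// Hypothetical value type for illustration.
case class Person(name: String, salary: Long)

// Runs on the node owning the key; only the extracted field travels
// back over the network, which is the push-down effect in miniature.
class NameExtractor extends EntryProcessor[String, Person, String] {
  override def process(entry: MutableEntry[String, Person],
                       args: AnyRef*): String = entry.getValue.name
}

def nameOf(cache: IgniteCache[String, Person], key: String): String =
  cache.invoke(key, new NameExtractor)
```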

Is it possible to enable both REPLICATED and PARTITIONED?

2016-10-09 Thread Tracyl
As the subject shows.
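Cache mode is a per-cache setting, so yes in the sense that one cluster can host caches of both modes side by side; a single cache, however, has exactly one mode. A sketch with hypothetical cache names:

```scala
import org.apache.ignite.Ignition
import org.apache.ignite.cache.CacheMode
import org.apache.ignite.configuration.CacheConfiguration

val ignite = Ignition.start()

// Small, read-mostly data fully copied to every node.
val replicatedCfg = new CacheConfiguration[String, String]("dimensions")
replicatedCfg.setCacheMode(CacheMode.REPLICATED)

// Large data split across the nodes.
val partitionedCfg = new CacheConfiguration[String, String]("facts")
partitionedCfg.setCacheMode(CacheMode.PARTITIONED)

val dims  = ignite.getOrCreateCache(replicatedCfg)
val facts = ignite.getOrCreateCache(partitionedCfg)
```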

Re: Fail to cache rdd: java.lang.NoSuchMethodError: scala.Predef$.$conforms()Lscala/Predef$$less$colon$less;

2016-10-06 Thread Tracyl
Hi Denis, This is really helpful. Yes, I need the original DataFrame for other APIs. Now I am using RDD[(String, Row)] as the type and caching the DataFrame using: val rdd = df.map(row => (row.getAs[String]("KEY"), row)); igniteRDD.savePairs(rdd). It works perfectly fine. Also I was able to reconstruct the
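The reconstruction half that the preview cuts off might look like this sketch: read the pairs back and reapply the schema captured from the original df. The IgniteContext, the cache name, and the exact fromCache signature depend on the Ignite version:

```scala
import org.apache.spark.sql.{DataFrame, Row, SQLContext}
import org.apache.spark.sql.types.StructType
import org.apache.ignite.spark.IgniteContext

// Rebuild a DataFrame from cached (KEY, Row) pairs using the schema
// saved from the original DataFrame (df.schema).
def rebuild(ic: IgniteContext, sqlContext: SQLContext,
            schema: StructType): DataFrame = {
  val pairs = ic.fromCache[String, Row]("dfCache")
  sqlContext.createDataFrame(pairs.map(_._2), schema)
}
```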

Re: Fail to cache rdd: java.lang.NoSuchMethodError: scala.Predef$.$conforms()Lscala/Predef$$less$colon$less;

2016-10-06 Thread Tracyl
Thanks for the prompt reply. So if I want to cache a DataFrame in IgniteCache, I have to define a custom data model class (e.g. https://github.com/apache/ignite/blob/master/examples/src/main/java/org/apache/ignite/examples/model/Person.java ) as the schema of the DataFrame, then construct objects and

Fail to cache rdd: java.lang.NoSuchMethodError: scala.Predef$.$conforms()Lscala/Predef$$less$colon$less;

2016-10-06 Thread Tracyl
Hi team, I was trying to cache a DataFrame in Ignite Cache. I was able to cache generic-type data elements (RDD). However, each time I use igniteRDDF.saveValues() to cache a non-generic data type (e.g. RDD), it triggers a NoSuchMethodError for saveValues, as shown below. I am using
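A NoSuchMethodError on scala.Predef$.$conforms almost always signals a Scala binary-version mismatch: $conforms exists only from Scala 2.11 on, so a jar built for 2.11 fails this way on a 2.10 runtime (as shipped with some Spark builds), and vice versa. A build sketch; the artifact names and version numbers are illustrative, not prescriptive:

```scala
// build.sbt (sketch): keep Spark, the Ignite Spark module, and
// scalaVersion on the same Scala binary version.
scalaVersion := "2.11.8"

libraryDependencies ++= Seq(
  "org.apache.spark"  %% "spark-core"   % "1.6.2" % "provided",
  "org.apache.spark"  %% "spark-sql"    % "1.6.2" % "provided",
  "org.apache.ignite" %  "ignite-spark" % "1.7.0"
)
```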

Could IgniteCache be accessed through the Spark JDBC data source API?

2016-10-06 Thread Tracyl
Hey team, I was able to use the JDBC driver tool to access IgniteCache. Is it possible to connect to Ignite through the Spark JDBC data source API? Below are the code and exceptions I got. It seems the connection is successful, but there are datatype mapping issues. Do I need to define some schema from
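One hedged way to try this with Spark's generic JDBC source, pointed at Ignite's legacy JDBC driver class. The URL form, port, cache and table names below are placeholders and vary by Ignite version, and explicit type handling may still be needed for the mapping errors mentioned:

```scala
import org.apache.spark.sql.{DataFrame, SQLContext}

// Point Spark's JDBC data source at Ignite's JDBC driver; "Person"
// stands in for whatever queryable type the cache is configured with.
def readViaJdbc(sqlContext: SQLContext): DataFrame =
  sqlContext.read
    .format("jdbc")
    .option("url", "jdbc:ignite://xx.xx.xx.xx:11211/myCache")
    .option("driver", "org.apache.ignite.IgniteJdbcDriver")
    .option("dbtable", "Person")
    .load()
```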
