Re: using JavaRDD in spark-redis connector

2015-10-27 Thread Rohith P
Got it, thank you!




--
View this message in context: 
http://apache-spark-developers-list.1001551.n3.nabble.com/using-JavaRDD-in-spark-redis-connector-tp14391p14812.html
Sent from the Apache Spark Developers List mailing list archive at Nabble.com.

-
To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
For additional commands, e-mail: dev-h...@spark.apache.org



Re: using JavaRDD in spark-redis connector

2015-09-30 Thread Akhil Das
You can create a JavaRDD as normal and then call .rdd() on it to get the underlying RDD.
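
A minimal sketch of that approach (the RedisContext/toRedisKV call is commented out and shown only as an assumption about the spark-redis connector API; the local master setting is for illustration):

```java
import java.util.Arrays;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.rdd.RDD;
import scala.Tuple2;

public class RedisWriteSketch {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
                .setAppName("redis-jedis")
                .setMaster("local[*]");
        // Create ONE JavaSparkContext and reuse it everywhere; creating a
        // second context is what triggers the "multiple contexts" error.
        JavaSparkContext jsc = new JavaSparkContext(conf);

        JavaRDD<Tuple2<String, String>> javardd = jsc.parallelize(
                Arrays.asList(new Tuple2<>("user1-tactic1", "value1")));

        // .rdd() exposes the underlying Scala RDD that the connector expects.
        RDD<Tuple2<String, String>> rdd = javardd.rdd();

        // Hypothetical connector call, per the spark-redis docs:
        // RedisContext rc = new RedisContext(jsc.sc());
        // rc.toRedisKV(rdd, hostTup);

        jsc.close();
    }
}
```

The key point is that a JavaRDD is just a thin wrapper around a Scala RDD, so no separate SparkContext is needed to obtain one.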

Thanks
Best Regards

On Mon, Sep 28, 2015 at 9:01 PM, Rohith P <rparameshw...@couponsinc.com>
wrote:

> Hi all,
>   I am trying to work with the spark-redis connector (Redis Labs), which
> requires all data exchanged between Redis and Spark to be in RDDs. The
> language I am using is Java, but the connector does not accept JavaRDDs,
> so I tried using SparkContext in my code instead of JavaSparkContext.
> However, when I wanted to create an RDD using sc.parallelize, it asked for
> Scala-related parameters rather than Java lists. When I tried to have both
> a JavaSparkContext and a SparkContext (for the connector), I got an error
> saying that multiple contexts cannot be opened.
>  The code that I have been trying:
>
>
> // initialize spark context
> private static RedisContext config() {
>     conf = new SparkConf().setAppName("redis-jedis");
>     sc2 = new SparkContext(conf);
>     RedisContext rc = new RedisContext(sc2);
>     return rc;
> }
>
> // write to redis, which requires the data to be in an RDD
> private static void WriteUserTacticData(RedisContext rc, String userid,
>         String tacticsId, String value) {
>     hostTup = calling(redisHost, redisPort);
>     String key = userid + "-" + tacticsId;
>     RDD<Tuple2<String, String>> newTup = createTuple(key, value);
>     rc.toRedisKV(newTup, hostTup);
> }
>
> // createTuple, where the RDD that will be inserted into redis is created
> private static RDD<Tuple2<String, String>> createTuple(String key,
>         String value) {
>     sc = new JavaSparkContext(conf);
>     ArrayList<Tuple2<String, String>> list =
>             new ArrayList<Tuple2<String, String>>();
>     Tuple2<String, String> e = new Tuple2<String, String>(key, value);
>     list.add(e);
>     JavaRDD<Tuple2<String, String>> javardd = sc.parallelize(list);
>     RDD<Tuple2<String, String>> newTupRdd = JavaRDD.toRDD(javardd);
>     sc.close();
>     return newTupRdd;
> }
>
>
>
> How would I create an RDD (not a JavaRDD) in Java that will be accepted by
> the redis connector? Any kind of help related to the topic would be
> appreciated.
>
>
>
>
>


using JavaRDD in spark-redis connector

2015-09-28 Thread Rohith P
Hi all,
  I am trying to work with the spark-redis connector (Redis Labs), which
requires all data exchanged between Redis and Spark to be in RDDs. The
language I am using is Java, but the connector does not accept JavaRDDs,
so I tried using SparkContext in my code instead of JavaSparkContext.
However, when I wanted to create an RDD using sc.parallelize, it asked for
Scala-related parameters rather than Java lists. When I tried to have both
a JavaSparkContext and a SparkContext (for the connector), I got an error
saying that multiple contexts cannot be opened.
 The code that I have been trying:


// initialize spark context
private static RedisContext config() {
    conf = new SparkConf().setAppName("redis-jedis");
    sc2 = new SparkContext(conf);
    RedisContext rc = new RedisContext(sc2);
    return rc;
}

// write to redis, which requires the data to be in an RDD
private static void WriteUserTacticData(RedisContext rc, String userid,
        String tacticsId, String value) {
    hostTup = calling(redisHost, redisPort);
    String key = userid + "-" + tacticsId;
    RDD<Tuple2<String, String>> newTup = createTuple(key, value);
    rc.toRedisKV(newTup, hostTup);
}

// createTuple, where the RDD that will be inserted into redis is created
private static RDD<Tuple2<String, String>> createTuple(String key,
        String value) {
    sc = new JavaSparkContext(conf);
    ArrayList<Tuple2<String, String>> list =
            new ArrayList<Tuple2<String, String>>();
    Tuple2<String, String> e = new Tuple2<String, String>(key, value);
    list.add(e);
    JavaRDD<Tuple2<String, String>> javardd = sc.parallelize(list);
    RDD<Tuple2<String, String>> newTupRdd = JavaRDD.toRDD(javardd);
    sc.close();
    return newTupRdd;
}



How would I create an RDD (not a JavaRDD) in Java that will be accepted by
the redis connector? Any kind of help related to the topic would be
appreciated.





--
View this message in context: 
http://apache-spark-developers-list.1001551.n3.nabble.com/using-JavaRDD-in-spark-redis-connector-tp14391.html
Sent from the Apache Spark Developers List mailing list archive at Nabble.com.
