Hi All,

I was able to solve both these issues. Thanks!
Just FYI:
For 1:

import org.apache.spark.rdd
import org.apache.spark.rdd.RDD
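Since question 1 below also asked which external jars to add: the RDD class
ships in the spark-core artifact, so that jar (or the Spark assembly jar)
needs to be on the Eclipse build path. For an sbt build the equivalent would
be the line below; the version number is only a guess matching the era of
this thread:

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.0.0"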
For 2:

        rdd.map(x => jc_.score(str1, new StringWrapper(x)))
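Putting the two fixes together, a minimal sketch of the corrected class
(textAnalytics, StringWrapper, and score come from the third-party jars
mentioned in the original post, so their exact signatures and score's
return type are assumptions here):

import org.apache.spark.rdd.RDD

class DistanceClass {
  // textAnalytics and StringWrapper are the assumed third-party APIs
  val ta = new textAnalytics()

  def printScore(sourceStr: String, rdd: RDD[String]) = {
    val str1 = new StringWrapper(sourceStr)
    // Copy the field into a local val so the closure Spark ships to the
    // executors captures only ta_, not the whole enclosing class
    val ta_ = this.ta
    // map takes a single function argument; wrap each element, then score it
    rdd.map(x => ta_.score(str1, new StringWrapper(x)))
  }
}

Returning the mapped RDD (rather than calling map and discarding the result)
lets the caller attach an action such as collect() or saveAsTextFile() later;
map alone is lazy and does nothing by itself.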
From: ssti...@live.com
To: u...@spark.incubator.apache.org
Subject: Basic Scala and Spark questions
Date: Mon, 23 Jun 2014 10:38:04 -0700

Hi All,

I am new to Scala and Spark. I have a basic question. I have the following
import statements in my Scala program. I want to pass my function
(printScore) to Spark. It will compare a string.

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
/* import third-party jars */

I have the following method in my Scala class:

class DistanceClass {
  val ta = new textAnalytics()

  def printScore(sourceStr: String, rdd: RDD[String]) {
    // Third-party jars have StringWrapper
    val str1 = new StringWrapper(sourceStr)
    val ta_ = this.ta
    rdd.map(str1, x => ta_.score(str1, StringWrapper(x)))
  }
}

I am using Eclipse for development. I have the following questions:

1. I get a "Not found: type RDD" error. Can someone please tell me which
jars I need to add as external jars, and what I should add under the import
statements, so that this error will go away?

2. Also, will including StringWrapper(x) inside map be OK?

rdd.map(str1, x => ta_.score(str1, StringWrapper(x)))
