Hello Tilak,
1. I get the error "Not found: type RDD". Can someone please tell me which jars
I need to add as external jars and what I should add under the import
statements so that this error will go away.
Do you not see any issues with the import statements?
Add the spark-assembly-1.0.0-hadoop2.2.0.jar file as a dependency.
You can download Spark from here (http://spark.apache.org/downloads.html). 
You'll find the above-mentioned jar in the lib folder.
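Alternatively, if you build with sbt rather than adding the jar by hand in Eclipse, a minimal build.sbt sketch would look something like the following (the Scala version is an assumption; Spark 1.0.0 was published for Scala 2.10):

    // build.sbt -- minimal sketch; adjust versions to match your setup
    scalaVersion := "2.10.4"
    libraryDependencies += "org.apache.spark" %% "spark-core" % "1.0.0"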
Import Statement: import org.apache.spark.rdd.RDD
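For your second question: RDD.map takes a single function argument, so rdd.map(str1, x => ...) won't compile as written. Here is a sketch of how the method could look (assuming textAnalytics and StringWrapper come from your third-party jars and that both are serializable, since they get shipped to the workers inside the closure):

    import org.apache.spark.rdd.RDD

    class DistanceClass {
      val ta = new textAnalytics()

      def printScore(sourceStr: String, rdd: RDD[String]) = {
        val str1 = new StringWrapper(sourceStr)
        // capture a local reference so the closure doesn't pull in the whole class
        val ta_ = this.ta
        // map takes one function: wrap each element and score it against str1
        rdd.map(x => ta_.score(str1, new StringWrapper(x)))
      }
    }

Constructing new StringWrapper(x) inside map is fine in itself; it just allocates one wrapper per element on the workers.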
From: Sameer Tilak [mailto:ssti...@live.com]
Sent: Monday, June 23, 2014 10:38 AM
To: u...@spark.incubator.apache.org
Subject: Basic Scala and Spark questions

Hi All,
I am new to Scala and Spark, and I have a basic question. I have the following
import statements in my Scala program. I want to pass my function (printScore)
to Spark; it compares a source string against each element of an RDD of strings.

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
/* import thirdparty jars */

I have the following method in my Scala class:

class DistanceClass {
  val ta = new textAnalytics()

  def printScore(sourceStr: String, rdd: RDD[String]) = {
    // Third-party jars provide StringWrapper
    val str1 = new StringWrapper(sourceStr)
    val ta_ = this.ta
    rdd.map(str1, x => ta_.score(str1, StringWrapper(x))
  }
}

I am using Eclipse for development. I have the following questions:
1. I get the error "Not found: type RDD". Can someone please tell me which jars
I need to add as external jars and what I should add under the import
statements so that this error will go away.
2. Also, is it OK to call StringWrapper(x) inside map, as in rdd.map(str1,
x => ta_.score(str1, StringWrapper(x))?
