The chapter 6 listing from Advanced Analytics with Spark generates an error. The
listing is:
def plainTextToLemmas(text: String, stopWords: Set[String],
    pipeline: StanfordCoreNLP): Seq[String] = {
  val doc = new Annotation(text)
  pipeline.annotate(doc)
  val lemmas = new ArrayBuffer[String]()
  val sentences = doc.get(classOf[SentencesAnnotation])
  for (sentence <- sentences; token <- sentence.get(classOf[TokensAnnotation])) {
    val lemma = token.get(classOf[LemmaAnnotation])
    if (lemma.length > 2 && !stopWords.contains(lemma) && isOnlyLetters(lemma)) {
      lemmas += lemma.toLowerCase
    }
  }
  lemmas
}
The error is:
<console>:37: error: value foreach is not a member of java.util.List[edu.stanford.nlp.util.CoreMap]
           for (sentence <- sentences; token <- sentence.get(classOf[TokensAnnotation])) {
                            ^
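
For what it's worth, this error usually means the java.util.List returned by
doc.get(classOf[SentencesAnnotation]) is never converted to a Scala collection, so
the for-comprehension has no foreach to call. A minimal sketch of the likely fix,
assuming the session is simply missing the collection-conversion import that
accompanies the listing in the book:

import scala.collection.JavaConversions._  // implicit java.util.List -> Scala collection conversions

// With the implicit conversions in scope, the original loop compiles as written.
// On newer Scala versions (2.12+), the explicit form is preferred instead:
// import scala.collection.JavaConverters._
// val sentences = doc.get(classOf[SentencesAnnotation]).asScala
// for (sentence <- sentences; token <- sentence.get(classOf[TokensAnnotation]).asScala) { ... }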
