[ https://issues.apache.org/jira/browse/SPARK-10460?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14733918#comment-14733918 ]

FELIPE Q B ALMEIDA edited comment on SPARK-10460 at 9/7/15 5:00 PM:
--------------------------------------------------------------------

I am. These are the imports I'm using:

{code:title=imports.scala|borderStyle=solid}
import org.apache.spark._
import org.apache.spark.SparkContext._
import org.apache.spark.sql.{SQLContext,Row}
import org.apache.spark.sql.functions.{udf,col,max,min}

import scala.util.Random
{code}


was (Author: queirozfcom):
I am. These are the imports I'm using:

{code:title=foo.scala|borderStyle=solid}
import org.apache.spark._
import org.apache.spark.SparkContext._
import org.apache.spark.sql.{SQLContext,Row}
import org.apache.spark.sql.functions.{udf,col,max,min}

import scala.util.Random
{code}

> fieldIndex method missing on spark.sql.Row
> ------------------------------------------
>
>                 Key: SPARK-10460
>                 URL: https://issues.apache.org/jira/browse/SPARK-10460
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.4.1
>         Environment: I'm running on an Ubuntu 14.04 32-bit machine, Java 7, 
> Spark 1.4.1. The jar was built with sbt-assembly. I've tested both via 
> spark-submit and in the spark-shell. Both times I got the error in the exact same spot.
>            Reporter: FELIPE Q B ALMEIDA
>   Original Estimate: 48h
>  Remaining Estimate: 48h
>
> {code:title=foo.scala|borderStyle=solid}
> import scala.util.Try
>
> val sc = new SparkContext(cnf)
> val sqlContext = new SQLContext(sc)
> import sqlContext.implicits._
>
> // initializing the dataframe from the json file
> val reviewsDF = sqlContext.jsonFile(inputDir)
> val schema = reviewsDF.schema
> val cleanRDD = reviewsDF.rdd.filter { row: Row =>
>   // *************************************************************************
>   // error: value fieldIndex is not a member of org.apache.spark.sql.Row
>   val unixTimestampIndex = row.fieldIndex("unixReviewTime")
>   // *************************************************************************
>   val tryLong = Try(row.getLong(unixTimestampIndex))
>   !row.anyNull && tryLong.isSuccess
> }
> {code}
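A possible workaround on Spark 1.4.x, where {{Row.fieldIndex}} is not available: resolve the column index once on the driver from the DataFrame's schema ({{StructType.fieldNames}} exists in 1.4) and close over the resulting Int, instead of asking each Row for it. This is a minimal sketch; the {{fieldNames}} array here is a stand-in for {{reviewsDF.schema.fieldNames}}, since the real lookup needs a running SparkContext.

```scala
// Sketch of the driver-side lookup. In the real job, fieldNames would come
// from reviewsDF.schema.fieldNames; here it is a hypothetical stand-in.
val fieldNames: Array[String] = Array("reviewerID", "unixReviewTime", "overall")

// Resolve the index once, outside the filter closure.
val unixTimestampIndex: Int = fieldNames.indexOf("unixReviewTime")

// The closure then only captures an Int and can call
// row.getLong(unixTimestampIndex) without needing Row.fieldIndex.
```

In the reporter's code this would mean computing {{val unixTimestampIndex = reviewsDF.schema.fieldNames.indexOf("unixReviewTime")}} before the {{filter}}, then using {{row.getLong(unixTimestampIndex)}} inside it.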



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
