it seems it has nothing to do with the settings. i tried take, and it works
fine, but an error occurs when i try count and collect:

val a = sc.textFile("any file")
a.take(10).foreach(println) // ok

a.count()   // failed
a.collect() // failed


val b = sc.parallelize(Array(1, 2, 3, 4))

b.take(10).foreach(println) // ok

b.count()   // ok
b.collect() // ok

it's so weird
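(a minimal sketch of why the split might happen; the path and the local master below are my assumptions, not from the thread. take(n) only ships the first partition(s) back to the driver, while count() and collect() schedule tasks on every executor, so a file that exists only on the driver machine can fail for them but still work for take. with a local master every task runs on this machine, so all three succeed:)

```scala
import org.apache.spark.SparkContext

// Hypothetical repro: with a local master every task runs on this machine,
// so textFile on a local path works for take, count and collect alike.
object LocalFileCheck {
  def main(args: Array[String]) {
    val sc = new SparkContext("local[2]", "LocalFileCheck")
    val a = sc.textFile("file:///var/log/auth.log") // placeholder path
    a.take(10).foreach(println) // only reads the first partition(s)
    println(a.count())          // touches every partition / executor
    sc.stop()
  }
}
```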


Arpit Tak-2 wrote
> Ok, fine,
> 
> try it like this; i tried it and it works.
> specify the spark home path in the constructor too,
> and also
> export SPARK_JAVA_OPTS="-Xms300m -Xmx512m -XX:MaxPermSize=1g"
> 
> import org.apache.spark.SparkContext
> import org.apache.spark.SparkContext._
> 
> object SimpleApp {
>   def main(args: Array[String]) {
>     val logFile = "/var/log/auth.log" // read any file
>     val sc = new SparkContext("spark://localhost:7077", "Simple App",
>       "/home/ubuntu/spark-0.9.1-incubating/",
>       List("target/scala-2.10/simple-project_2.10-2.0.jar"))
>     val tr = sc.textFile(logFile).cache()
>     tr.take(100).foreach(println)
>   }
> }
> 
> This will work....
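(side note on the above: for sbt run the build dependencies have to match the cluster too. a minimal build.sbt along the lines of the versions mentioned in this thread might look like the sketch below; the exact artifact and version strings are my guess, check them against your install:)

```scala
// build.sbt — hypothetical, matching spark-0.9.1 / scala 2.10 from the thread
name := "simple-project"

version := "2.0"

scalaVersion := "2.10.3"

libraryDependencies += "org.apache.spark" %% "spark-core" % "0.9.1"
```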
> 
> 
> On Thu, Apr 24, 2014 at 3:00 PM, wxhsdp <wxhsdp@> wrote:
> 
>> hi arpit,
>> in the spark shell i can read the local file properly,
>> but when i use sbt run, the error occurs.
>> the sbt error message is at the beginning of the thread
>>
>>
>> Arpit Tak-2 wrote
>> > Hi,
>> >
>> > You should be able to read it; file:// or file:/// is not even required
>> > for reading locally, just the path is enough.
>> > what error message are you getting in spark-shell while reading?
>> > for local:
>> >
>> >
>> > Also read the same file from hdfs:
>> > put your README file there and read it; it works both ways:
>> > val a = sc.textFile("hdfs://localhost:54310/t/README.md")
>> >
>> > also, post the stack trace from your spark-shell.
>> >
>> >
>> > On Thu, Apr 24, 2014 at 2:25 PM, wxhsdp <wxhsdp@> wrote:
>> >
>> >> thanks for your reply, adnan. i tried
>> >> val logFile = "file:///home/wxhsdp/spark/example/standalone/README.md"
>> >> i think there need to be three slashes after file:
>> >>
>> >> it behaves just the same as val logFile =
>> >> "home/wxhsdp/spark/example/standalone/README.md"
>> >> the error remains :(
>> >>
>> >>
>> >>
>> >> --
>> >> View this message in context:
>> >> http://apache-spark-user-list.1001560.n3.nabble.com/how-to-set-spark-executor-memory-and-heap-size-tp4719p4743.html
>> >> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>





