With Regards
Sree
On Thursday, April 16, 2015 9:07 PM, Arun Lists lists.a...@gmail.com
wrote:
Here is what I got from the engineer who worked on building Spark and
using it on Windows:
1) Hadoop winutils.exe is needed on Windows, even for local files – and
you have to set
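The truncated point above usually refers to the HADOOP_HOME environment variable — that is an assumption on my part, though it is the standard requirement for winutils.exe; the install path C:\hadoop below is just an example:

```
rem winutils.exe must live in %HADOOP_HOME%\bin
set HADOOP_HOME=C:\hadoop
set PATH=%HADOOP_HOME%\bin;%PATH%
```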
We run Spark on Mac and Linux but also need to run it on Windows 8.1 and
Windows Server. We ran into problems with the Scala 2.10 binary bundle for
Spark 1.3.0 but managed to get it working. However, on Mac/Linux, we are on
Scala 2.11.6 (we built Spark from the sources). On Windows, however
What errors are you seeing?
Matei
On Apr 16, 2015, at 9:23 AM, Arun Lists lists.a...@gmail.com wrote:
We run Spark on Mac and Linux but also need to run it on Windows 8.1
and Windows Server. We ran into problems with the Scala 2.10 binary bundle
for Spark 1.3.0 but managed to get it working
perhaps the class is package private or something, and the repl somehow
subverts it ...
On Tue, Apr 14, 2015 at 5:44 PM, Arun Lists lists.a...@gmail.com wrote:
Hi Imran,
Thanks for the response! However, I am still not there yet.
In the Scala interpreter, I can do:
scala> classOf[scala.reflect.ClassTag$$anon$1]
res0: Class[scala.reflect.ClassTag[T]{def unapply(x$1:
scala.runtime.BoxedUnit): Option[_]; def arrayClass(x$1: Class[_]):
Class[_]}] = class scala.reflect.ClassTag$$anon$1
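For what it's worth, a small sketch of why the $$anon$... names are awkward: classOf[...] must name the type at compile time, while Class.forName resolves the runtime name directly, which is handy when Kryo's error message reports an anonymous or specialized class. The registerKryoClasses line is a hypothetical Spark usage, not runnable here:

```scala
// An anonymous class gets a compiler-generated ...$anon$... name.
trait Box { def value: Int }

val anon: Box = new Box { def value = 1 }
val runtimeName = anon.getClass.getName   // contains "anon"
val resolved = Class.forName(runtimeName) // resolves the runtime name

// Hypothetical Spark usage (assumes a SparkConf in scope):
//   sparkConf.registerKryoClasses(Array(resolved))
```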
On Mon, Apr 13, 2015 at 6:09 PM, Arun Lists lists.a...@gmail.com
Hi,
I am trying to register classes with KryoSerializer. This has worked with
other programs. Usually the error messages are helpful in indicating which
classes need to be registered. But with my current program, I get the
following cryptic error message:
Caused by:
Hi,
Does SparkContext's textFile() method handle files with Unicode characters?
How about files in UTF-8 format?
Going further, is it possible to specify an encoding to the method? If not,
what should one do if the files to be read are in some other encoding?
Thanks,
arun
On Wed, Apr 8, 2015 at 10:51 AM, java8964 java8...@hotmail.com wrote:
Spark uses the Hadoop TextInputFormat to read the file. Since Hadoop
effectively supports only Linux, UTF-8 is the only encoding supported, as
it is the default on Linux.
If you have data in another encoding,
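To flesh out the "other encoding" case — a sketch under assumptions: the Spark calls in the comment are an untested outline, and ISO-8859-1 is just an example charset — one can take the raw bytes from Hadoop's Text and decode them explicitly instead of letting toString assume UTF-8:

```scala
import java.nio.charset.Charset

// Decode a record's raw bytes with an explicit charset instead of UTF-8.
def decodeRecord(bytes: Array[Byte], length: Int, charsetName: String): String =
  new String(bytes, 0, length, Charset.forName(charsetName))

// Rough shape of the Spark side (untested sketch):
//   import org.apache.hadoop.io.{LongWritable, Text}
//   import org.apache.hadoop.mapred.TextInputFormat
//   sc.hadoopFile[LongWritable, Text, TextInputFormat](path)
//     .map { case (_, t) => decodeRecord(t.getBytes, t.getLength, "ISO-8859-1") }

val latin1 = "caf\u00e9".getBytes("ISO-8859-1") // 0xE9 alone is invalid UTF-8
val decoded = decodeRecord(latin1, latin1.length, "ISO-8859-1")
```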
Hi,
Is it possible to specify a Spark property like spark.local.dir from the
command line when running an application using spark-submit?
Thanks,
arun
Hi,
We are trying to run a Spark application using spark-submit on Windows 8.1.
The application runs successfully to completion on MacOS 10.10 and on
Ubuntu Linux. On Windows, we get the following error messages (see below).
It appears that Spark is trying to delete some temporary directory that
I just figured this out from the documentation:
--conf spark.local.dir=C:\Temp
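Put together, a full invocation would look roughly like this (the class and JAR names are placeholders, not from the thread):

```
spark-submit --class com.example.MyApp --master local[*] --conf spark.local.dir=C:\Temp myapp.jar
```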
On Tue, Apr 7, 2015 at 5:00 PM, Arun Lists lists.a...@gmail.com wrote:
Hi,
Is it possible to specify a Spark property like spark.local.dir from the
command line when running an application using spark-submit
I am trying to register classes with KryoSerializer. I get the following
error message:
com.esotericsoftware.kryo.KryoException:
java.lang.IllegalArgumentException: Class is not registered:
How do I find out what class is being referred to by OpenHashMap$mcI$sp?
Here is the relevant snippet of code in my main program:
===
sparkConf.set("spark.serializer",
"org.apache.spark.serializer.KryoSerializer")
sparkConf.set("spark.kryo.registrationRequired", "true")
val summaryDataClass = classOf[SummaryData]
val summaryViewClass
been fixed in
https://github.com/apache/spark/pull/4258 but has not yet been merged.
Best Regards,
Shixiong Zhu
2015-02-02 10:08 GMT+08:00 Arun Lists lists.a...@gmail.com:
Here is the relevant snippet of code in my main program:
===
sparkConf.set
in a JAR file will
necessarily be specific to where the JAR is on the local filesystem, and
that is neither portable nor the right way to read a resource. But you
didn't specify the problem here.
On Jan 14, 2015 5:15 AM, Arun Lists lists.a...@gmail.com wrote:
I experimented with using getResourceAsStream
I experimented with using getResourceAsStream(cls, fileName) instead of
cls.getResource(fileName).toURI. That works!
I have no idea why the latter method does not work in Spark. Any
explanations would be welcome.
Thanks,
arun
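A likely explanation, for what it's worth: when the application runs from the assembled JAR, getResource returns a jar:file:... URL, and new File(url.toURI) fails because that URI is not hierarchical, whereas a stream never touches the filesystem. A minimal sketch — readResource is my own helper name, not from the thread:

```scala
import scala.io.Source

// Read a classpath resource via a stream — works whether the resource
// sits in a directory or inside a JAR, unlike new File(url.toURI).
def readResource(cls: Class[_], name: String, charset: String = "UTF-8"): String = {
  val in = cls.getResourceAsStream(name)
  require(in != null, s"resource not found: $name")
  try Source.fromInputStream(in, charset).mkString
  finally in.close()
}
```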
On Tue, Jan 13, 2015 at 6:35 PM, Arun Lists lists.a...@gmail.com wrote:
you're
running with Scala 2.11 too?
On Tue, Jan 13, 2015 at 6:58 AM, Arun Lists lists.a...@gmail.com wrote:
I have a Spark application that was assembled using sbt 0.13.7, Scala
2.11, and Spark 1.2.0. I am running on Mac OS X Yosemite. In build.sbt,
I use "provided" for the Spark
In some classes, I initialize some values from resource files using the
following snippet:
new File(cls.getResource(fileName).toURI)
This works fine in SBT. When I run it using spark-submit, I get a
bunch of errors because the classes cannot be initialized. What can I
do to make such
I have a Spark application that was assembled using sbt 0.13.7, Scala 2.11,
and Spark 1.2.0. I am running on Mac OS X Yosemite. In build.sbt, I use
"provided" for the Spark dependencies. I can run the application fine
within sbt.
I run into problems when I try to run it from the command line. Here