I am stuck on the same issue too, but with Shark 0.9 (with Spark 0.9) on
hadoop-2.2.0.
On all other Hadoop versions it works perfectly.
Regards,
Arpit Tak
On Wed, Apr 16, 2014 at 11:18 PM, Aureliano Buendia buendia...@gmail.com wrote:
Is this resolved in Spark 0.9.1?
On Tue, Apr 15, 2014 at 6:55
I've received the same error with Spark built using Maven. It turns out that
mesos-0.13.0 depends on protobuf-2.4.1 which is causing the clash at
runtime. Protobuf included by Akka is shaded and doesn't cause any problems.
The solution is to update the mesos dependency to 0.18.0 in spark's
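A minimal sketch of that change, assuming the mesos artifact is declared directly in Spark's pom.xml (the 0.18.0 version comes from the message above; the exact location of the dependency in Spark's build is an assumption):

```xml
<!-- Bump the Mesos dependency so the build no longer pulls in the
     protobuf 2.4.1 that mesos-0.13.0 depends on. -->
<dependency>
  <groupId>org.apache.mesos</groupId>
  <artifactId>mesos</artifactId>
  <version>0.18.0</version>
</dependency>
```

After changing the version, `mvn dependency:tree -Dincludes=com.google.protobuf` is a quick way to confirm that only one protobuf version remains in the resolved tree.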
Hi Prasad
Sorry for missing your reply.
https://gist.github.com/thegiive/10791823
Here it is.
Wisely Chen
On Fri, Apr 4, 2014 at 11:57 PM, Prasad ramachandran.pra...@gmail.com wrote:
Hi Wisely,
Could you please post your pom.xml here.
Thanks
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Error-reading-HDFS-file-using-spark-0-9-0-hadoop-2-2-0-incompatible-protobuf-2-5-and-2-4-1-tp2158p3770.html
Sent from the Apache Spark User List
Egor, I encountered the same problem that you asked about in this thread:
http://mail-archives.apache.org/mod_mbox/spark-user/201402.mbox/%3CCAMrx5DwJVJS0g_FE7_2qwMu4Xf0y5VfV=tlyauv2kh5v4k6...@mail.gmail.com%3E
Have you fixed this problem?
I am using Shark to read a table which I created on
Starting with Spark 0.9, the protobuf dependency we use is shaded and
cannot interfere with other protobuf libraries, including those in
Hadoop. I'm not sure what's going on in this case. Would someone who is
having this problem post exactly how they are building Spark?
- Patrick
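For reference, the Spark 0.9 build documentation describes building against Hadoop 2.2 roughly like this (a sketch of typical invocations, run from the Spark source root; adjust versions to your cluster):

```shell
# sbt build against Hadoop 2.2
SPARK_HADOOP_VERSION=2.2.0 sbt/sbt assembly

# same, with YARN support enabled
SPARK_HADOOP_VERSION=2.2.0 SPARK_YARN=true sbt/sbt assembly
```

Posting the exact command used, as Patrick asks, makes it possible to tell whether the hadoop.version and protobuf.version actually in effect match the cluster.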
On Fri, Mar 21, 2014
On Tue, Mar 18, 2014 at 12:56 PM, Ognen Duzlevski
og...@plainvanillagames.com wrote:
On 3/18/14, 4:49 AM, dmpou...@gmail.com wrote:
On Sunday, 2 March 2014 19:19:49 UTC+2, Aureliano Buendia wrote:
Is there a reason for spark using the older akka?
On Sun, Mar 2, 2014 at 1:53 PM, 1esha alexey.r...@gmail.com wrote:
The problem is in akka remote. It contains files compiled with 2.4.*. When
you run it with 2.5.* in
Spark 0.9 uses protobuf 2.5.0
Hadoop 2.2 uses protobuf 2.5.0
protobuf 2.5.0 can read messages serialized with protobuf 2.4.1.
So there is no reason why you can't read messages from hadoop 2.2
with protobuf 2.5.0; probably you somehow have 2.4.1 on your classpath. Of
course it's very bad,
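One way to check which jar actually wins on the classpath is to ask the class's protection domain where it was loaded from. A small self-contained sketch (the class name `WhichJar` is made up for illustration; on a Spark or Shark driver you would look up the protobuf class as shown in the comment):

```java
import java.security.CodeSource;

public class WhichJar {
    // Returns where a class was loaded from; core JDK classes loaded by the
    // bootstrap loader have no CodeSource, reported here as "bootstrap classpath".
    static String locationOf(Class<?> c) {
        CodeSource cs = c.getProtectionDomain().getCodeSource();
        return cs == null ? "bootstrap classpath" : cs.getLocation().toString();
    }

    public static void main(String[] args) throws Exception {
        // Core JDK class, just to demonstrate the mechanism:
        System.out.println("java.lang.String -> " + locationOf(String.class));
        // On a Spark/Shark driver, look up the protobuf class instead, to see
        // whether the 2.4.1 or 2.5.0 jar is actually being picked up:
        //   locationOf(Class.forName("com.google.protobuf.Message"))
    }
}
```

If the printed path points at a jar bundling protobuf 2.4.1 (e.g. an old mesos jar), that is the classpath pollution described above.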
In that same pom:
<profile>
  <id>yarn</id>
  <properties>
    <hadoop.major.version>2</hadoop.major.version>
    <hadoop.version>2.2.0</hadoop.version>
    <protobuf.version>2.5.0</protobuf.version>
  </properties>
  <modules>
    <module>yarn</module>
  </modules>
</profile>