Any word on this one?
On Apr 2, 2014, at 12:26 AM, Vipul Pandey <vipan...@gmail.com> wrote:

> I downloaded 0.9.0 fresh and ran the mvn command - the assembly jar thus 
> generated also has both the shaded and the real versions of the protobuf classes:
> 
> Vipuls-MacBook-Pro-3:spark-0.9.0-incubating vipul$ jar -ftv 
> ./assembly/target/scala-2.10/spark-assembly_2.10-0.9.0-incubating-hadoop2.0.0-cdh4.2.1.jar
>  | grep proto | grep /Message
>   1190 Wed Apr 02 00:19:56 PDT 2014 
> com/google/protobuf_spark/MessageOrBuilder.class
>   2913 Wed Apr 02 00:19:56 PDT 2014 
> com/google/protobuf_spark/Message$Builder.class
>    704 Wed Apr 02 00:19:56 PDT 2014 
> com/google/protobuf_spark/MessageLite.class
>   1904 Wed Apr 02 00:19:56 PDT 2014 
> com/google/protobuf_spark/MessageLite$Builder.class
>    257 Wed Apr 02 00:19:56 PDT 2014 
> com/google/protobuf_spark/MessageLiteOrBuilder.class
>    508 Wed Apr 02 00:19:56 PDT 2014 com/google/protobuf_spark/Message.class
>   2661 Wed Apr 02 00:20:00 PDT 2014 com/google/protobuf/Message$Builder.class
>    478 Wed Apr 02 00:20:00 PDT 2014 com/google/protobuf/Message.class
>   1748 Wed Apr 02 00:20:00 PDT 2014 
> com/google/protobuf/MessageLite$Builder.class
>    668 Wed Apr 02 00:20:00 PDT 2014 com/google/protobuf/MessageLite.class
>    245 Wed Apr 02 00:20:00 PDT 2014 
> com/google/protobuf/MessageLiteOrBuilder.class
>   1112 Wed Apr 02 00:20:00 PDT 2014 com/google/protobuf/MessageOrBuilder.class
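> 
> (For what it's worth, one way to see which module is pulling the unshaded 
> copy into the assembly - just a sketch, assuming the standard 
> maven-dependency-plugin - is:
> 
> mvn -Dhadoop.version=2.0.0-cdh4.2.1 dependency:tree -Dincludes=com.google.protobuf
> 
> Whatever shows up there is where the plain com/google/protobuf classes 
> above are coming from.)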
> 
> 
> 
> 
> 
> On Apr 1, 2014, at 11:44 PM, Patrick Wendell <pwend...@gmail.com> wrote:
> 
>> It's this: mvn -Dhadoop.version=2.0.0-cdh4.2.1 -DskipTests clean package
>> 
>> 
>> On Tue, Apr 1, 2014 at 5:15 PM, Vipul Pandey <vipan...@gmail.com> wrote:
>> How do you recommend building that? It says 
>> [ERROR] Failed to execute goal 
>> org.apache.maven.plugins:maven-assembly-plugin:2.2-beta-5:assembly 
>> (default-cli) on project spark-0.9.0-incubating: Error reading assemblies: 
>> No assembly descriptors found. -> [Help 1]
>> upon running 
>> mvn -Dhadoop.version=2.0.0-cdh4.2.1 -DskipTests clean assembly:assembly
>> 
>> 
>> On Apr 1, 2014, at 4:13 PM, Patrick Wendell <pwend...@gmail.com> wrote:
>> 
>>> Do you get the same problem if you build with maven?
>>> 
>>> 
>>> On Tue, Apr 1, 2014 at 12:23 PM, Vipul Pandey <vipan...@gmail.com> wrote:
>>> SPARK_HADOOP_VERSION=2.0.0-cdh4.2.1 sbt/sbt assembly 
>>> 
>>> That's all I do. 
>>> 
>>> On Apr 1, 2014, at 11:41 AM, Patrick Wendell <pwend...@gmail.com> wrote:
>>> 
>>>> Vipul - could you show exactly what flags/commands you are using when you 
>>>> build spark to produce this assembly?
>>>> 
>>>> 
>>>> On Tue, Apr 1, 2014 at 12:53 AM, Vipul Pandey <vipan...@gmail.com> wrote:
>>>>> Spark now shades its own protobuf dependency so protobuf 2.4.1 shouldn't 
>>>>> be getting pulled in unless you are directly using akka yourself. Are you?
>>>> 
>>>> No, I'm not. I do see, though, that the unshaded protobuf libraries are 
>>>> pulled directly into the 0.9.0 assembly jar alongside the shaded version - 
>>>> e.g. below for Message.class:
>>>> 
>>>> -bash-4.1$ jar -ftv 
>>>> ./assembly/target/scala-2.10/spark-assembly-0.9.0-incubating-hadoop2.0.0-cdh4.2.1.jar
>>>>  | grep protobuf | grep /Message.class
>>>>    478 Thu Jun 30 15:26:12 PDT 2011 com/google/protobuf/Message.class
>>>>    508 Sat Dec 14 14:20:38 PST 2013 com/google/protobuf_spark/Message.class
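>>>> 
>>>> (If the assembly step keeps the maven metadata, something like this should 
>>>> show which exact unshaded protobuf version got bundled - just a sketch, 
>>>> using the same jar path as above:
>>>> 
>>>> unzip -p ./assembly/target/scala-2.10/spark-assembly-0.9.0-incubating-hadoop2.0.0-cdh4.2.1.jar META-INF/maven/com.google.protobuf/protobuf-java/pom.properties
>>>> 
>>>> The mid-2011 timestamp on the unshaded Message.class above suggests it is a 
>>>> 2.4.x copy, most likely coming in via the Hadoop dependencies.)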
>>>> 
>>>> 
>>>>> Does your project have other dependencies that might be indirectly 
>>>>> pulling in protobuf 2.4.1? It would be helpful if you could list all of 
>>>>> your dependencies including the exact Spark version and other libraries.
>>>> 
>>>> I did have another one, which I moved to the end of the classpath - I even ran 
>>>> partial code without that dependency, but it still failed whenever I used 
>>>> the jar with the ScalaBuff dependency. 
>>>> The Spark version is 0.9.0.
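>>>> 
>>>> One workaround that might be worth trying is to relocate protobuf inside the 
>>>> application's own uber jar, so the ScalaBuff-generated 2.5 classes never meet 
>>>> the 2.4.1 copy in the Spark assembly. A minimal sketch, assuming an 
>>>> sbt-assembly version that supports shade rules ("shaded.protobuf" is just a 
>>>> placeholder prefix):
>>>> 
>>>> // build.sbt - relocate all protobuf references bundled into the uber jar
>>>> assemblyShadeRules in assembly := Seq(
>>>>   ShadeRule.rename("com.google.protobuf.**" -> "shaded.protobuf.@1").inAll
>>>> )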
>>>> 
>>>> 
>>>> ~Vipul
>>>> 
>>>> On Mar 31, 2014, at 4:51 PM, Patrick Wendell <pwend...@gmail.com> wrote:
>>>> 
>>>>> Spark now shades its own protobuf dependency so protobuf 2.4.1 shouldn't 
>>>>> be getting pulled in unless you are directly using akka yourself. Are you?
>>>>> 
>>>>> Does your project have other dependencies that might be indirectly 
>>>>> pulling in protobuf 2.4.1? It would be helpful if you could list all of 
>>>>> your dependencies including the exact Spark version and other libraries.
>>>>> 
>>>>> - Patrick
>>>>> 
>>>>> 
>>>>> On Sun, Mar 30, 2014 at 10:03 PM, Vipul Pandey <vipan...@gmail.com> wrote:
>>>>> I'm using ScalaBuff (which depends on protobuf 2.5) and facing the same 
>>>>> issue. Any word on this one?
>>>>> On Mar 27, 2014, at 6:41 PM, Kanwaldeep <kanwal...@gmail.com> wrote:
>>>>> 
>>>>> > We are using Protocol Buffer 2.5 to send messages to Spark Streaming 0.9
>>>>> > with a Kafka stream setup. I have Protocol Buffer 2.5 as part of the uber
>>>>> > jar deployed on each of the Spark worker nodes.
>>>>> > The messages are compiled using 2.5, but at runtime they are being
>>>>> > de-serialized by 2.4.1, as I'm getting the following exception:
>>>>> >
>>>>> > java.lang.VerifyError (java.lang.VerifyError: class
>>>>> > com.snc.sinet.messages.XServerMessage$XServer overrides final method
>>>>> > getUnknownFields.()Lcom/google/protobuf/UnknownFieldSet;)
>>>>> > java.lang.ClassLoader.defineClass1(Native Method)
>>>>> > java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>>>>> > java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>>>>> > java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
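>>>>> >
>>>>> > That VerifyError is the usual symptom of classes generated with protobuf 2.5
>>>>> > being loaded against the 2.4.1 runtime, where getUnknownFields() is still
>>>>> > final. As a quick sanity check - just a sketch - something like this, run on
>>>>> > the driver or inside a task, should print which jar the protobuf runtime is
>>>>> > actually being loaded from:
>>>>> >
>>>>> > println(classOf[com.google.protobuf.Message]
>>>>> >   .getProtectionDomain.getCodeSource.getLocation)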
>>>>> >
>>>>> > Any suggestions on how I could still use ProtoBuf 2.5? Based on
>>>>> > https://spark-project.atlassian.net/browse/SPARK-995, we should be able to
>>>>> > use a different version of protobuf in the application.
>>>>> >
>>>>> >
>>>>> >
>>>>> >
>>>>> >
>>>>> > --
>>>>> > View this message in context: 
>>>>> > http://apache-spark-user-list.1001560.n3.nabble.com/Using-ProtoBuf-2-5-for-messages-with-Spark-Streaming-tp3396.html
>>>>> > Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>>>> 
>>>>> 
>>>> 
>>>> 
>>> 
>>> 
>> 
>> 
> 
