Oh, I just noticed the big warning in Spark 1.x Logging:

 * NOTE: DO NOT USE this class outside of Spark. It is intended as an internal utility.
 *       This will likely be changed or removed in future releases.
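
For context, the pattern that warning discourages is exactly what our
1.x-built code does: just mix the trait in for its log helpers. A rough
sketch (MyJob is a made-up name; logInfo and log are real members of the
1.x trait):

    import org.apache.spark.Logging  // moved to an internal package in 2.0

    // sketch: user code mixing in Spark's Logging trait, 1.x style
    class MyJob extends Logging {
      def run(): Unit = {
        logInfo("starting job")  // concrete helper inherited from the trait
      }
    }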

On Tue, Mar 15, 2016 at 3:29 PM, Koert Kuipers <ko...@tresata.com> wrote:

> Makes sense.
>
> Note that Logging was not private[spark] in 1.x, which is why I used it.
>
> On Tue, Mar 15, 2016 at 12:55 PM, Marcelo Vanzin <van...@cloudera.com>
> wrote:
>
>> Logging is a "private[spark]" class, so binary compatibility is not
>> important at all: code outside of Spark isn't supposed to use it.
>> Mixing Spark library versions is also not recommended, and not just
>> for this reason.
>>
>> There have been other binary-incompatible changes in the Logging class in the past too.
>>
>> On Tue, Mar 15, 2016 at 7:49 AM, Koert Kuipers <ko...@tresata.com> wrote:
>> > i have been using spark 2.0 snapshots with some libraries build for
>> spark
>> > 1.0 so far (simply because it worked). in last few days i noticed this
>> new
>> > error:
>> >
>> > [error] Uncaught exception when running
>> >   com.tresata.spark.sql.fieldsapi.FieldsApiSpec: java.lang.AbstractMethodError
>> > sbt.ForkMain$ForkError: java.lang.AbstractMethodError: null
>> >     at org.apache.spark.Logging$class.log(Logging.scala:46)
>> >     at com.tresata.spark.sorted.PairRDDFunctions.log(PairRDDFunctions.scala:13)
>> >
>> > So it seems Spark made binary-incompatible changes in Logging.
>> > I do not think Spark 2.0 is trying to maintain binary compatibility
>> > with 1.0, so I assume this is a non-issue, but in case the assumptions
>> > are different (or incompatibilities are actively minimized) I wanted to
>> > point it out.
>> >
>>
>>
>>
>> --
>> Marcelo
>>
>
>
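
Re: the private[spark] modifier, for anyone who has not run into it: it is
Scala's qualified access modifier, so the trait is visible anywhere under
the org.apache.spark package and nowhere else at compile time. A rough
sketch with made-up names:

    // compiles: the qualifier private[spark] grants access to everything
    // under the enclosing org.apache.spark package
    package org.apache.spark.internal {
      private[spark] trait Logging
    }
    package org.apache.spark.rdd {
      class RDDHelper extends org.apache.spark.internal.Logging
    }

    // from any other package, e.g. com.example, the same extends clause is
    // a compile error. note the restriction is compile-time only: qualified
    // private has no JVM equivalent, so the trait is public in the bytecode
    // and pre-2.0 binaries that reference it still link -- and then fail at
    // runtime, as above.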

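And for the archives, the AbstractMethodError itself is the classic symptom
of trait evolution under separate compilation (pre-Scala-2.12 encoding). A
simplified sketch of the mechanism; the two packages stand in for two
classpaths, since the error only appears when the pieces are compiled
separately:

    // v1: roughly what the library was compiled against. the compiler
    // gives PairRDDFunctions forwarders only for trait members that
    // exist at compile time.
    package v1 {
      trait Logging {
        def log: String = "log"  // self-contained in v1
      }
      class PairRDDFunctions extends Logging
    }

    // v2: roughly what a later Spark puts on the runtime classpath
    package v2 {
      trait Logging {
        protected def logName: String = getClass.getName  // added later
        def log: String = logName  // log now calls logName
      }
    }

    // running the v1-compiled PairRDDFunctions against v2's Logging makes
    // the static Logging$class.log call this.logName(), for which the old
    // class never got a forwarder -> java.lang.AbstractMethodError, i.e.
    // the stack trace above.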