Yeah, sadly this dependency was introduced when someone consolidated the
logging infrastructure.  However, the dependency should be very small and
thus easy to remove, and I would like Catalyst to be usable outside of
Spark.  A pull request to make this possible would be welcome.
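Roughly speaking (a hypothetical sketch only, not the actual patch; the real
change would just mirror whichever Utils methods catalyst.util calls today),
it should mostly be a matter of inlining the one or two helpers catalyst
pulls from org.apache.spark.util.Utils into a catalyst-local object:

// Hypothetical sketch only: copy the small helpers Catalyst actually needs
// out of org.apache.spark.util.Utils, so sql/catalyst no longer needs
// spark-core on its classpath.  Names here are illustrative.
package org.apache.spark.sql.catalyst.util

private[catalyst] object CatalystUtils {
  // Use the thread's context class loader if one is set, otherwise fall
  // back to the loader that loaded this class.
  def getContextOrDefaultClassLoader: ClassLoader =
    Option(Thread.currentThread().getContextClassLoader)
      .getOrElse(getClass.getClassLoader)
}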

Ideally, we'd create some sort of Spark common package that has things like
logging.  That way Catalyst could depend on it without pulling in all of
Hadoop, etc.  Maybe others have opinions though, so I'm cc'ing the dev list.
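To make that concrete, here's a rough sketch of what the shared piece could
look like (the package and trait names are purely illustrative, and this
assumes slf4j, which isn't necessarily what we'd actually pick):

// Illustrative sketch of a logging trait that a hypothetical "spark-common"
// module could expose; not actual Spark code.
package org.apache.spark.common

import org.slf4j.{Logger, LoggerFactory}

trait Logging {
  // Lazily create a logger named after the concrete class.
  @transient private lazy val log: Logger =
    LoggerFactory.getLogger(this.getClass.getName.stripSuffix("$"))

  protected def logInfo(msg: => String): Unit =
    if (log.isInfoEnabled) log.info(msg)

  protected def logWarning(msg: => String): Unit =
    if (log.isWarnEnabled) log.warn(msg)

  protected def logError(msg: => String, e: Throwable = null): Unit =
    if (e == null) log.error(msg) else log.error(msg, e)
}

Both core and catalyst could then mix that in, instead of catalyst reaching
into spark-core for it.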


On Mon, Jul 14, 2014 at 12:21 AM, Yanbo Liang <yanboha...@gmail.com> wrote:

> Making Catalyst independent of Spark is a goal for Catalyst, but it may
> need time and evolution.
> I noticed that the package org.apache.spark.sql.catalyst.util
> imports org.apache.spark.util.{Utils => SparkUtils},
> so Catalyst currently has a dependency on Spark core.
> I'm not sure whether it will be replaced by another component independent of
> Spark in a later release.
>
>
> 2014-07-14 11:51 GMT+08:00 Aniket Bhatnagar <aniket.bhatna...@gmail.com>:
>
>> As per the recent presentation given at Scala Days (
>> http://people.apache.org/~marmbrus/talks/SparkSQLScalaDays2014.pdf), it
>> was mentioned that Catalyst is independent of Spark. But on inspecting the
>> pom.xml of the sql/catalyst module, it seems it has a dependency on Spark
>> Core. Any particular reason for the dependency? I would love to use
>> Catalyst outside Spark.
>>
>> (Reposted, as the previous email bounced. Sorry if this is a duplicate.)
>>
>
>
