+1

On 2015-10-22 09:27, [email protected] wrote:
Hi Luke,

Yes, we are interested in this progress. We would love to see the
performance test results comparing these two engines.

Best,
Sun.



[email protected]
From: Luke Han
Date: 2015-10-21 22:33
To: [email protected]
Subject: Re: why not Spark?
Is anyone still interested in this question?
We have some rough benchmark results comparing Spark cubing and MR
cubing, and we would like to hear from the community about
requirements and use cases. Thanks.

Best Regards!
---------------------
Luke Han

On Mon, Apr 13, 2015 at 9:29 PM, Luke Han <[email protected]> wrote:
Using this JIRA to track: https://issues.apache.org/jira/browse/KYLIN-679

Thanks.


Best Regards!
---------------------

Luke Han

2015-04-11 9:15 GMT+08:00 Li Yang <[email protected]>:

Spark could improve cube builds greatly when the data fits in memory.
We already have the extension point in the design. If anyone would
like to contribute effort here, let us know.
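
For illustration only, such an extension point could look roughly like
the sketch below; every name in it is hypothetical, not Kylin's actual
API.

    // Hypothetical sketch of a pluggable cubing-engine extension point.
    interface CubingJob {
        void run();
    }

    interface BatchCubingEngine {
        // Produce a job that computes all cuboids for one cube segment.
        CubingJob createBatchCubingJob(String cubeName, String segmentId);
    }

    // A MapReduce engine and a Spark engine implement the same contract,
    // so the rest of the build pipeline does not need to change.
    class MRBatchCubingEngine implements BatchCubingEngine {
        public CubingJob createBatchCubingJob(String cubeName, String segmentId) {
            return () -> System.out.println("MR cubing " + cubeName + "/" + segmentId);
        }
    }

    class SparkBatchCubingEngine implements BatchCubingEngine {
        public CubingJob createBatchCubingJob(String cubeName, String segmentId) {
            return () -> System.out.println("Spark cubing " + cubeName + "/" + segmentId);
        }
    }

With a contract like this, swapping MR for Spark becomes a per-cube
configuration choice rather than a rewrite of the build pipeline.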

On Thu, Apr 9, 2015 at 10:35 PM, Luke Han <[email protected]> wrote:

Subscribe to this mailing list and follow @ApacheKylin on Twitter.
The Kylin website is also being refactored to add more such content;
coming soon.

Thanks.


Best Regards!
---------------------

Luke Han

2015-04-09 21:14 GMT+08:00 林澍荣 <[email protected]>:

Thanks, Luke! But how do I get such materials in the future?

2015-04-09 16:59 GMT+08:00 Luke Han <[email protected]>:

Hi Rong,
     Spark is actually a hot topic around Kylin; please refer to my
presentation last month at the Bay Area Spark Meetup:

http://www.slideshare.net/lukehan/adding-spark-support-to-kylin-spark-meetupv11

     We would also like more comments, input, and ideas from the
community on adding Spark support to Kylin.

     Thanks.

Luke


Best Regards!
---------------------

Luke Han

2015-04-09 16:34 GMT+08:00 林澍荣 <[email protected]>:

I have a question: why does Kylin not use Spark for the cube building
job? Spark features DAG execution and in-memory computing, and these
features could improve cube building speed compared to MapReduce.
Thanks for any response,
Shon
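
For a concrete feel of the in-memory argument, here is a minimal,
hypothetical Spark sketch (toy data and names; not Kylin's actual
implementation): the parent cuboid RDD is persisted once, and each
child cuboid reuses it from executor memory, avoiding the HDFS round
trip that MapReduce pays between cubing layers.

    // Toy sketch of cuboid aggregation on Spark; NOT Kylin's code.
    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaPairRDD;
    import org.apache.spark.api.java.JavaSparkContext;
    import org.apache.spark.storage.StorageLevel;
    import scala.Tuple2;

    import java.util.Arrays;

    public class CubingSketch {
        public static void main(String[] args) {
            SparkConf conf = new SparkConf()
                    .setAppName("cubing-sketch").setMaster("local[*]");
            try (JavaSparkContext sc = new JavaSparkContext(conf)) {
                // Parent cuboid: key "dimA,dimB" -> aggregated measure.
                JavaPairRDD<String, Long> parent = sc.parallelizePairs(Arrays.asList(
                        new Tuple2<>("a1,b1", 1L), new Tuple2<>("a1,b2", 2L),
                        new Tuple2<>("a2,b1", 3L)));
                // Keep the parent layer in executor memory for reuse.
                parent.persist(StorageLevel.MEMORY_AND_DISK());

                // Child cuboid (A): drop dimension B and re-aggregate.
                JavaPairRDD<String, Long> cuboidA = parent
                        .mapToPair(kv -> new Tuple2<>(kv._1().split(",")[0], kv._2()))
                        .reduceByKey(Long::sum);

                // Child cuboid (B): drop dimension A, reusing the cached parent
                // instead of re-reading it from HDFS as MapReduce would.
                JavaPairRDD<String, Long> cuboidB = parent
                        .mapToPair(kv -> new Tuple2<>(kv._1().split(",")[1], kv._2()))
                        .reduceByKey(Long::sum);

                System.out.println("cuboid A: " + cuboidA.collect());
                System.out.println("cuboid B: " + cuboidB.collect());
            }
        }
    }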

