Hey Parviz,

There was a similar thread a while ago... I think that many companies like
to be discreet about the size of large clusters. But of course it would be
great if people wanted to share openly :)

For my part, I can say that Spark has been benchmarked on
hundreds-of-nodes clusters before, on jobs that crunch hundreds of
terabytes (uncompressed) of data.

- Patrick


On Fri, Apr 4, 2014 at 12:05 PM, Parviz Deyhim <pdey...@gmail.com> wrote:

> Spark community,
>
>
> What's the size of the largest Spark cluster ever deployed? I've heard
> Yahoo is running Spark on several hundred nodes but don't know the actual
> number.
>
> can someone share?
>
> Thanks
>
