Hi,

thanks for your interest,

as far as I know the only way to make data available to every node on the
cluster in the same fashion is through the SparkContext: call
sc.addFile("/path/to/file") and then resolve its local path via
SparkFiles.get(fileName)
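
In case a concrete snippet helps, here is a minimal sketch for a Spark
(Scala) paragraph in Zeppelin; the path /path/to/data.csv and the file name
data.csv are just placeholders:

    import org.apache.spark.SparkFiles
    import scala.io.Source

    // Ship the file to every node in the cluster.
    sc.addFile("/path/to/data.csv")

    // Inside a task, resolve the node-local copy by file name and read it.
    val firstLines = sc.parallelize(1 to 2).map { _ =>
      val localPath = SparkFiles.get("data.csv")  // path on that executor
      Source.fromFile(localPath).getLines().next()
    }.collect()

Note that SparkFiles.get should be called where the file is read (driver or
executor), since the downloaded copy lives in a node-local directory.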

Please let me know if that helps!

--
Alexander


On Thu, May 14, 2015 at 6:13 AM, Tang, Jie <jta...@ebay.com> wrote:

>  Hello,
>
> I know Zeppelin can upload spark dependencies using %dep. Is there a way
> to upload a data file which can be accessed by spark cluster? If so, where
> will the data file be uploaded?
>
> Thanks,
>
> Jie
>
