aimed at handling substantial volumes of data. As part of
our deployment strategy, we are endeavouring to implement a Spark-based
application on our Azure Kubernetes Service.
Regrettably, we have encountered challenges from a security perspective with
the latest Apache Spark Docker image.
You can find several presentations on this on the Spark Summit web page.
Generally, you also have to decide whether to run one cluster for all
applications or one cluster per application in the container context.
Not sure though why you want to run on just one node. If you have only one
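For the single-node standalone case people are asking about, a minimal sketch of one master plus one worker under Compose might look like this (the image name, ports, and hostnames are assumptions, not from this thread):

```yaml
# docker-compose.yml sketch: a standalone master and one worker on a single node.
# "my-spark:latest" and the service hostnames are assumed, not standard.
services:
  spark-master:
    image: my-spark:latest
    command: /opt/spark/sbin/start-master.sh
    environment:
      - SPARK_NO_DAEMONIZE=true   # keep the process in the foreground for Docker
    ports:
      - "7077:7077"   # master RPC port
      - "8080:8080"   # master web UI
  spark-worker:
    image: my-spark:latest
    command: /opt/spark/sbin/start-worker.sh spark://spark-master:7077
    environment:
      - SPARK_NO_DAEMONIZE=true
    depends_on:
      - spark-master
```

With everything on one Compose network, the worker reaches the master by service name, which sidesteps most of the hostname-resolution issues people hit with standalone mode in containers.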
Folks,
Can you share your experience of running Spark under Docker on a single
local / standalone node?
Is anybody using it in production environments? We have an existing
Docker Swarm deployment, and I want to run Spark in a separate fat VM
hooked into / controlled by Docker Swarm.
I know there is
Try building a fat (uber) jar which includes all dependencies.
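For an sbt build, the usual route to such a jar is the sbt-assembly plugin; a sketch follows (the version numbers are assumptions, and Spark itself is typically marked "provided" so the cluster's own copy is used rather than bundling it):

```scala
// project/plugins.sbt — version is an assumption, pick a current release
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "2.1.5")

// build.sbt — Spark marked "provided" so it is not packed into the uber jar
libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.5.0" % "provided"

// drop META-INF signature files that commonly collide when merging jars
assembly / assemblyMergeStrategy := {
  case PathList("META-INF", _*) => MergeStrategy.discard
  case _                        => MergeStrategy.first
}
```

Running `sbt assembly` then emits a single jar under `target/scala-*/` that you can hand to `spark-submit`.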
On Oct 11, 2016 at 22:11, "doruchiulan" <doru.chiu...@gmail.com>
wrote:
> Hi,
>
> I have a problem that's been bothering me for a few days, and I'm pretty
> out of ideas.
>
> I built a Spark
Hi,
I have a problem that's been bothering me for a few days, and I'm pretty out
of ideas.
I built a Spark Docker container where Spark runs in standalone mode. Both
master and worker are started there.
Now I tried to deploy my Spark Scala app in a separate container (same
machine) where I pass
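When the app runs in a separate container from the standalone cluster, it generally needs both the master URL and an address the executors can reach the driver back on; a sketch in spark-defaults.conf terms (the hostnames are assumptions for two containers on a shared Docker network):

```
# spark-defaults.conf sketch — "spark-master" and "app-container" are assumed
# container hostnames, not part of the original question.
spark.master              spark://spark-master:7077
spark.driver.host         app-container
spark.driver.port         40000
spark.driver.bindAddress  0.0.0.0
```

The usual pitfall with standalone mode across containers is exactly this back-channel: the worker must be able to connect to the driver's host and port, so those need to resolve and be reachable from the worker container.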
Hi,
What's the difference between the amplab docker scripts
https://github.com/amplab/docker-scripts and the Spark docker scripts
https://github.com/apache/spark/tree/master/docker?
Thanks,
Josh
Hi,
Is the spark docker script now mature enough to replace the spark-ec2
script? Is anyone here using the docker script in production?
AM, Aureliano Buendia <buendia...@gmail.com> wrote:
> Hi,
> Is the spark docker script now mature enough to replace the spark-ec2
> script? Is anyone here using the docker script in production?