from pyspark import SparkContext, SparkConf
import random

conf = SparkConf() \
    .setMaster("spark://192.168.0.39:7077") \
    .setAppName("Job Name: Calculate Pi")
sc = SparkContext(conf=conf)
NUM_SAMPLES = 1_000_000

def inside(_):
    # does a random point in the unit square land in the quarter circle?
    x, y = random.random(), random.random()
    return x * x + y * y < 1.0

# calculate the value of pi by Monte Carlo sampling:
# pi is roughly 4 * (points inside the quarter circle / total points)
count = sc.parallelize(range(NUM_SAMPLES)).filter(inside).count()
print("Pi is roughly", 4.0 * count / NUM_SAMPLES)
sc.stop()
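The distributed job above is plain Monte Carlo sampling of the unit square; a minimal sequential sketch of the same estimator, with no Spark required (the sample count and seed are illustrative):

```python
import random

def estimate_pi(num_samples: int, seed: int = 42) -> float:
    """Estimate pi as 4 * (points inside the quarter circle / total points)."""
    rng = random.Random(seed)  # seeded for reproducibility (illustrative choice)
    inside = 0
    for _ in range(num_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y < 1.0:
            inside += 1
    return 4.0 * inside / num_samples

print(estimate_pi(1_000_000))  # close to 3.14159
```

With a million samples the estimate typically lands within a few thousandths of pi; Spark's `parallelize(...).filter(...).count()` simply distributes the same per-point test across executors.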
The Spark UI is misleading in Spark 2.4.4; I moved to Spark 2.4.5 and that
fixed it. So your problem is probably somewhere else, most likely related to
memory consumption, but not the usage you see in the UI.
Best regards,
Ali Gouta.
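[Editor's note: memory-related executor failures of the kind discussed in this thread are commonly tuned through executor memory settings. A hedged `spark-submit` sketch; the master URL, script name, and all values are illustrative and must be adapted to the actual cluster:]

```shell
# Illustrative only: raise executor heap and overhead so shuffle fetches
# are less likely to hit out-of-memory; tune values for your workload.
spark-submit \
  --master spark://192.168.0.39:7077 \
  --executor-memory 4g \
  --conf spark.executor.memoryOverhead=1g \
  --conf spark.memory.fraction=0.6 \
  your_streaming_job.py
```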
On Sun, May 17, 2020 at 7:36 PM András Kolbert wrote:
Hi,
I have a streaming job (Spark 2.4.4) whose memory usage keeps increasing
over time.
Every 20-25 minutes the executors fall over
(org.apache.spark.shuffle.MetadataFetchFailedException: Missing an output
location for shuffle 6987) due to running out of memory. In the UI, I can
see that the