Hi Szymon,
There is a common misconception that setting JAVA_HOME alone determines which
version of Java is used.
This is not true: in most environments the java binary is resolved through the
PATH environment variable, so PATH must also point at the version of Java that
you want to use.
You can set both JAVA_HOME and PATH to point at the same JDK so the two stay
consistent.
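For example, on Linux it would look something like the following (the JDK path
here is just an illustration; substitute the install location on your system):

```shell
# Point JAVA_HOME at the JDK you want to use.
# /usr/lib/jvm/java-11-openjdk is an example path, not a requirement.
export JAVA_HOME=/usr/lib/jvm/java-11-openjdk

# Prepend that JDK's bin directory so the `java` command resolves to it.
export PATH="$JAVA_HOME/bin:$PATH"

# You can then verify which binary will actually run with:
#   which java && java -version
echo "${PATH%%:*}"   # first PATH entry is now the chosen JDK's bin directory
```

Putting the JDK's bin directory first on PATH is what actually changes which
`java` runs; JAVA_HOME is only a convention that tools like Spark and Maven
read.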
Severity: important
Description:
The Apache Spark UI allows enabling ACLs via the configuration option
spark.acls.enable. Combined with an authentication filter, this checks
whether a user has permission to view or modify the application. If ACLs are
enabled, a code path in
Hey
Could you provide some pseudocode?
Also, what kind of machine are you using per executor? How many cores per
executor?
What's the size of the input data, and what's the size of the output?
What kind of errors are you getting?
Best
Tufan
On Sun, 17 Jul 2022 at 00:31, Orkhan Dadashov
wrote:
We are happy to announce the availability of Apache Spark 3.2.2!
Spark 3.2.2 is a maintenance release containing stability fixes. This
release is based on the branch-3.2 maintenance branch of Spark. We strongly
recommend that all 3.2 users upgrade to this stable release.
To download Spark 3.2.2,