Re: [ANNOUNCE] Apache Spark 3.2.3 released

2022-11-30 Thread L. C. Hsieh
Thanks, Chao!

On Wed, Nov 30, 2022 at 9:58 AM huaxin gao  wrote:
>
> Thanks Chao for driving the release!
>
> On Wed, Nov 30, 2022 at 9:24 AM Dongjoon Hyun  wrote:
>>
>> Thank you, Chao!
>>
>> On Wed, Nov 30, 2022 at 8:16 AM Yang,Jie(INF)  wrote:
>>>
>>> Thanks, Chao!
>>>
>>>
>>>
>>> From: Maxim Gekk 
>>> Date: Wednesday, November 30, 2022, 19:40
>>> To: Jungtaek Lim 
>>> Cc: Wenchen Fan , Chao Sun , dev 
>>> , user 
>>> Subject: Re: [ANNOUNCE] Apache Spark 3.2.3 released
>>>
>>>
>>>
>>> Thank you, Chao!
>>>
>>>
>>>
>>> On Wed, Nov 30, 2022 at 12:42 PM Jungtaek Lim 
>>>  wrote:
>>>
>>> Thanks Chao for driving the release!
>>>
>>>
>>>
>>> On Wed, Nov 30, 2022 at 6:03 PM Wenchen Fan  wrote:
>>>
>>> Thanks, Chao!
>>>
>>>
>>>
>>> On Wed, Nov 30, 2022 at 1:33 AM Chao Sun  wrote:
>>>
>>> We are happy to announce the availability of Apache Spark 3.2.3!
>>>
>>> Spark 3.2.3 is a maintenance release containing stability fixes. This
>>> release is based on the branch-3.2 maintenance branch of Spark. We strongly
>>> recommend that all 3.2 users upgrade to this stable release.
>>>
>>> To download Spark 3.2.3, head over to the download page:
>>> https://spark.apache.org/downloads.html
>>>
>>> To view the release notes:
>>> https://spark.apache.org/releases/spark-release-3-2-3.html
>>>
>>> We would like to acknowledge all community members for contributing to this
>>> release. This release would not have been possible without you.
>>>
>>> Chao
>>>
>>> -
>>> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org

-
To unsubscribe e-mail: user-unsubscr...@spark.apache.org
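
For pip-based installs, here is a minimal post-upgrade sanity check (a sketch;
it assumes the pyspark==3.2.3 wheel that accompanies the release):

    # Minimal sanity check after `pip install pyspark==3.2.3` (assumed wheel).
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("version-check").getOrCreate()
    print(spark.version)  # expected to print: 3.2.3
    assert spark.version == "3.2.3"
    spark.stop()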



Re: [ANNOUNCE] Apache Spark 3.2.3 released

2022-11-30 Thread huaxin gao
Thanks Chao for driving the release!

On Wed, Nov 30, 2022 at 9:24 AM Dongjoon Hyun 
wrote:

> Thank you, Chao!
>
> On Wed, Nov 30, 2022 at 8:16 AM Yang,Jie(INF)  wrote:
>
>> Thanks, Chao!
>>
>>
>>
>> From: Maxim Gekk 
>> Date: Wednesday, November 30, 2022, 19:40
>> To: Jungtaek Lim 
>> Cc: Wenchen Fan , Chao Sun ,
>> dev , user 
>> Subject: Re: [ANNOUNCE] Apache Spark 3.2.3 released
>>
>>
>>
>> Thank you, Chao!
>>
>>
>>
>> On Wed, Nov 30, 2022 at 12:42 PM Jungtaek Lim <
>> kabhwan.opensou...@gmail.com> wrote:
>>
>> Thanks Chao for driving the release!
>>
>>
>>
>> On Wed, Nov 30, 2022 at 6:03 PM Wenchen Fan  wrote:
>>
>> Thanks, Chao!
>>
>>
>>
>> On Wed, Nov 30, 2022 at 1:33 AM Chao Sun  wrote:
>>
>> We are happy to announce the availability of Apache Spark 3.2.3!
>>
>> Spark 3.2.3 is a maintenance release containing stability fixes. This
>> release is based on the branch-3.2 maintenance branch of Spark. We strongly
>> recommend that all 3.2 users upgrade to this stable release.
>>
>> To download Spark 3.2.3, head over to the download page:
>> https://spark.apache.org/downloads.html
>>
>> To view the release notes:
>> https://spark.apache.org/releases/spark-release-3-2-3.html
>>
>> We would like to acknowledge all community members for contributing to this
>> release. This release would not have been possible without you.
>>
>> Chao
>>
>> -
>> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>>
>>


Re: [ANNOUNCE] Apache Spark 3.2.3 released

2022-11-30 Thread Dongjoon Hyun
Thank you, Chao!

On Wed, Nov 30, 2022 at 8:16 AM Yang,Jie(INF)  wrote:

> Thanks, Chao!
>
>
>
> From: Maxim Gekk 
> Date: Wednesday, November 30, 2022, 19:40
> To: Jungtaek Lim 
> Cc: Wenchen Fan , Chao Sun ,
> dev , user 
> Subject: Re: [ANNOUNCE] Apache Spark 3.2.3 released
>
>
>
> Thank you, Chao!
>
>
>
> On Wed, Nov 30, 2022 at 12:42 PM Jungtaek Lim <
> kabhwan.opensou...@gmail.com> wrote:
>
> Thanks Chao for driving the release!
>
>
>
> On Wed, Nov 30, 2022 at 6:03 PM Wenchen Fan  wrote:
>
> Thanks, Chao!
>
>
>
> On Wed, Nov 30, 2022 at 1:33 AM Chao Sun  wrote:
>
> We are happy to announce the availability of Apache Spark 3.2.3!
>
> Spark 3.2.3 is a maintenance release containing stability fixes. This
> release is based on the branch-3.2 maintenance branch of Spark. We strongly
> recommend that all 3.2 users upgrade to this stable release.
>
> To download Spark 3.2.3, head over to the download page:
> https://spark.apache.org/downloads.html
>
> To view the release notes:
> https://spark.apache.org/releases/spark-release-3-2-3.html
>
> We would like to acknowledge all community members for contributing to this
> release. This release would not have been possible without you.
>
> Chao
>
> -
> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>
>


Re: [SPARK STRUCTURED STREAMING] : Rocks DB uses off-heap usage

2022-11-30 Thread Adam Binford
We started hitting this as well, seeing 90+ GB of resident memory on an
executor with a 25 GB heap. After manually testing a lot of fixes, I finally
figured out the root problem: https://issues.apache.org/jira/browse/SPARK-41339

Starting to work on a PR now to fix it.

On Mon, Sep 12, 2022 at 10:46 AM Artemis User 
wrote:

> The off-heap memory isn't subject to GC.  So the obvious reason is that
> you have too many states to maintain in your streaming app, the GC
> couldn't keep up, and the executor ran out of resources and died.  Are you
> using continuous processing or microbatches in structured streaming?  You
> may want to lower your incoming data rate and/or increase your microbatch
> size to lower the number of states to be persisted/maintained...
>
> On 9/11/22 10:59 AM, akshit marwah wrote:
>
> Hi Team,
>
> We are trying to shift from the HDFS state manager to the RocksDB state
> manager, but while doing a POC we realised it is using much more off-heap
> space than expected. Because of this, the executors get killed with an
> out-of-physical-memory exception.
>
> Could you please help us understand why there is a massive increase in
> off-heap space, and what we can do about it?
>
> We are using Spark 3.2.1 with 1 executor and 1 executor core, to understand
> the memory requirements:
> 1. RocksDB run: took 3.5 GB heap and 11.5 GB resident memory
> 2. HDFS state manager: took 5 GB heap and 10 GB resident memory.
>
> Thanks,
> Akshit
>
>
> Thanks and regards
> - Akshit Marwah
>
>
>

-- 
Adam Binford
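
A minimal PySpark sketch of the mitigations Artemis describes above: switching
the state store provider to RocksDB and limiting how much state each microbatch
touches. The Kafka source, broker address, topic, and rate cap are illustrative
assumptions, not the original poster's job:

    from pyspark.sql import SparkSession

    # Hedged sketch, not the poster's actual application. The RocksDB state
    # store provider ships with Spark since 3.2.
    spark = (
        SparkSession.builder
        .appName("rocksdb-state-poc")
        .config(
            "spark.sql.streaming.stateStore.providerClass",
            "org.apache.spark.sql.execution.streaming.state."
            "RocksDBStateStoreProvider",
        )
        .getOrCreate()
    )

    # Assumed Kafka source; maxOffsetsPerTrigger caps the incoming rate so
    # each microbatch creates/updates fewer state rows.
    events = (
        spark.readStream.format("kafka")
        .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical host
        .option("subscribe", "events")                     # hypothetical topic
        .option("maxOffsetsPerTrigger", 100000)
        .load()
    )

    # A less frequent trigger yields larger microbatches, i.e. fewer state
    # persist/maintain cycles per unit of time.
    query = (
        events.groupBy("key").count()
        .writeStream
        .outputMode("update")
        .trigger(processingTime="1 minute")
        .format("console")
        .start()
    )
    query.awaitTermination()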


Re: [ANNOUNCE] Apache Spark 3.2.3 released

2022-11-30 Thread Yang,Jie(INF)
Thanks, Chao!

From: Maxim Gekk 
Date: Wednesday, November 30, 2022, 19:40
To: Jungtaek Lim 
Cc: Wenchen Fan , Chao Sun , dev 
, user 
Subject: Re: [ANNOUNCE] Apache Spark 3.2.3 released

Thank you, Chao!

On Wed, Nov 30, 2022 at 12:42 PM Jungtaek Lim
<kabhwan.opensou...@gmail.com> wrote:
Thanks Chao for driving the release!

On Wed, Nov 30, 2022 at 6:03 PM Wenchen Fan <cloud0...@gmail.com> wrote:
Thanks, Chao!

On Wed, Nov 30, 2022 at 1:33 AM Chao Sun <sunc...@apache.org> wrote:
We are happy to announce the availability of Apache Spark 3.2.3!

Spark 3.2.3 is a maintenance release containing stability fixes. This
release is based on the branch-3.2 maintenance branch of Spark. We strongly
recommend that all 3.2 users upgrade to this stable release.

To download Spark 3.2.3, head over to the download page:
https://spark.apache.org/downloads.html

To view the release notes:
https://spark.apache.org/releases/spark-release-3-2-3.html

We would like to acknowledge all community members for contributing to this
release. This release would not have been possible without you.

Chao

-
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org


unsubscribe

2022-11-30 Thread Suryanarayana Garlapati (Nokia)


Regards
Surya



Re: Error using SPARK with Rapid GPU

2022-11-30 Thread Alessandro Bellina
Vajiha filed a spark-rapids discussion here:
https://github.com/NVIDIA/spark-rapids/discussions/7205. If you are
interested, please follow the discussion there.

On Wed, Nov 30, 2022 at 7:17 AM Vajiha Begum S A <
vajihabegu...@maestrowiz.com> wrote:

> Hi,
> I'm using an Ubuntu system with an NVIDIA Quadro K1200 with 20 GB of GPU
> memory.
> Installed: the CUDF 22.10.0 jar file, the rapids-4-spark_2.12-22.10.0 jar
> file, the CUDA Toolkit 11.8.0 Linux version, and Java 8.
> I'm running only a single server; the master is localhost.
>
> I'm trying to run PySpark code through spark-submit & Python IDLE. I'm
> getting errors. Kindly help me to resolve this error.
> Kindly give suggestions where I have made mistakes.
>
> Error when running code through spark-submit:
>    spark-submit /home/mwadmin/Documents/test.py
> 22/11/30 14:59:32 WARN Utils: Your hostname, mwadmin-HP-Z440-Workstation
> resolves to a loopback address: 127.0.1.1; using ***.***.**.** instead (on
> interface eno1)
> 22/11/30 14:59:32 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to
> another address
> Using Spark's default log4j profile:
> org/apache/spark/log4j-defaults.properties
> 22/11/30 14:59:32 INFO SparkContext: Running Spark version 3.2.2
> 22/11/30 14:59:32 WARN NativeCodeLoader: Unable to load native-hadoop
> library for your platform... using builtin-java classes where applicable
> 22/11/30 14:59:33 INFO ResourceUtils:
> ==
> 22/11/30 14:59:33 INFO ResourceUtils: No custom resources configured for
> spark.driver.
> 22/11/30 14:59:33 INFO ResourceUtils:
> ==
> 22/11/30 14:59:33 INFO SparkContext: Submitted application: Spark.com
> 22/11/30 14:59:33 INFO ResourceProfile: Default ResourceProfile created,
> executor resources: Map(cores -> name: cores, amount: 1, script: , vendor:
> , memory -> name: memory, amount: 1024, script: , vendor: , offHeap ->
> name: offHeap, amount: 0, script: , vendor: , gpu -> name: gpu, amount: 1,
> script: , vendor: ), task resources: Map(cpus -> name: cpus, amount: 1.0,
> gpu -> name: gpu, amount: 0.5)
> 22/11/30 14:59:33 INFO ResourceProfile: Limiting resource is cpus at 1
> tasks per executor
> 22/11/30 14:59:33 WARN ResourceUtils: The configuration of resource: gpu
> (exec = 1, task = 0.5/2, runnable tasks = 2) will result in wasted
> resources due to resource cpus limiting the number of runnable tasks per
> executor to: 1. Please adjust your configuration.
> 22/11/30 14:59:33 INFO ResourceProfileManager: Added ResourceProfile id: 0
> 22/11/30 14:59:33 INFO SecurityManager: Changing view acls to: mwadmin
> 22/11/30 14:59:33 INFO SecurityManager: Changing modify acls to: mwadmin
> 22/11/30 14:59:33 INFO SecurityManager: Changing view acls groups to:
> 22/11/30 14:59:33 INFO SecurityManager: Changing modify acls groups to:
> 22/11/30 14:59:33 INFO SecurityManager: SecurityManager: authentication
> disabled; ui acls disabled; users  with view permissions: Set(mwadmin);
> groups with view permissions: Set(); users  with modify permissions:
> Set(mwadmin); groups with modify permissions: Set()
> 22/11/30 14:59:33 INFO Utils: Successfully started service 'sparkDriver'
> on port 45883.
> 22/11/30 14:59:33 INFO SparkEnv: Registering MapOutputTracker
> 22/11/30 14:59:33 INFO SparkEnv: Registering BlockManagerMaster
> 22/11/30 14:59:33 INFO BlockManagerMasterEndpoint: Using
> org.apache.spark.storage.DefaultTopologyMapper for getting topology
> information
> 22/11/30 14:59:33 INFO BlockManagerMasterEndpoint:
> BlockManagerMasterEndpoint up
> 22/11/30 14:59:33 INFO SparkEnv: Registering BlockManagerMasterHeartbeat
> 22/11/30 14:59:33 INFO DiskBlockManager: Created local directory at
> /tmp/blockmgr-647d2c2a-72e4-402d-aeff-d7460726eb6d
> 22/11/30 14:59:33 INFO MemoryStore: MemoryStore started with capacity
> 366.3 MiB
> 22/11/30 14:59:33 INFO SparkEnv: Registering OutputCommitCoordinator
> 22/11/30 14:59:33 INFO Utils: Successfully started service 'SparkUI' on
> port 4040.
> 22/11/30 14:59:33 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at
> http://localhost:4040
> 22/11/30 14:59:33 INFO ShimLoader: Loading shim for Spark version: 3.2.2
> 22/11/30 14:59:33 INFO ShimLoader: Complete Spark build info: 3.2.2,
> https://github.com/apache/spark, HEAD,
> 78a5825fe266c0884d2dd18cbca9625fa258d7f7, 2022-07-11T15:44:21Z
> 22/11/30 14:59:33 INFO ShimLoader: findURLClassLoader found a
> URLClassLoader org.apache.spark.util.MutableURLClassLoader@1530c739
> 22/11/30 14:59:33 INFO ShimLoader: Updating spark classloader
> org.apache.spark.util.MutableURLClassLoader@1530c739 with the URLs:
> jar:file:/home/mwadmin/spark-3.2.2-bin-hadoop3.2/jars/rapids-4-spark_2.12-22.10.0.jar!/spark3xx-common/,
> jar:file:/home/mwadmin/spark-3.2.2-bin-hadoop3.2/jars/rapids-4-spark_2.12-22.10.0.jar!/spark322/
> 22/11/30 14:59:33 INFO ShimLoader: Spark classLoader
> org.apache.spark.util.MutableURLClassLoader@1530c739 updated successfully
> 22/11/30 

Error using SPARK with Rapid GPU

2022-11-30 Thread Vajiha Begum S A
Hi,
I'm using an Ubuntu system with an NVIDIA Quadro K1200 with 20 GB of GPU memory.
Installed: the CUDF 22.10.0 jar file, the rapids-4-spark_2.12-22.10.0 jar file,
the CUDA Toolkit 11.8.0 Linux version, and Java 8.
I'm running only a single server; the master is localhost.

I'm trying to run PySpark code through spark-submit & Python IDLE. I'm
getting errors. Kindly help me to resolve this error.
Kindly give suggestions where I have made mistakes.

Error when running code through spark-submit:
   spark-submit /home/mwadmin/Documents/test.py
22/11/30 14:59:32 WARN Utils: Your hostname, mwadmin-HP-Z440-Workstation
resolves to a loopback address: 127.0.1.1; using ***.***.**.** instead (on
interface eno1)
22/11/30 14:59:32 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to
another address
Using Spark's default log4j profile:
org/apache/spark/log4j-defaults.properties
22/11/30 14:59:32 INFO SparkContext: Running Spark version 3.2.2
22/11/30 14:59:32 WARN NativeCodeLoader: Unable to load native-hadoop
library for your platform... using builtin-java classes where applicable
22/11/30 14:59:33 INFO ResourceUtils:
==
22/11/30 14:59:33 INFO ResourceUtils: No custom resources configured for
spark.driver.
22/11/30 14:59:33 INFO ResourceUtils:
==
22/11/30 14:59:33 INFO SparkContext: Submitted application: Spark.com
22/11/30 14:59:33 INFO ResourceProfile: Default ResourceProfile created,
executor resources: Map(cores -> name: cores, amount: 1, script: , vendor:
, memory -> name: memory, amount: 1024, script: , vendor: , offHeap ->
name: offHeap, amount: 0, script: , vendor: , gpu -> name: gpu, amount: 1,
script: , vendor: ), task resources: Map(cpus -> name: cpus, amount: 1.0,
gpu -> name: gpu, amount: 0.5)
22/11/30 14:59:33 INFO ResourceProfile: Limiting resource is cpus at 1
tasks per executor
22/11/30 14:59:33 WARN ResourceUtils: The configuration of resource: gpu
(exec = 1, task = 0.5/2, runnable tasks = 2) will result in wasted
resources due to resource cpus limiting the number of runnable tasks per
executor to: 1. Please adjust your configuration.
22/11/30 14:59:33 INFO ResourceProfileManager: Added ResourceProfile id: 0
22/11/30 14:59:33 INFO SecurityManager: Changing view acls to: mwadmin
22/11/30 14:59:33 INFO SecurityManager: Changing modify acls to: mwadmin
22/11/30 14:59:33 INFO SecurityManager: Changing view acls groups to:
22/11/30 14:59:33 INFO SecurityManager: Changing modify acls groups to:
22/11/30 14:59:33 INFO SecurityManager: SecurityManager: authentication
disabled; ui acls disabled; users  with view permissions: Set(mwadmin);
groups with view permissions: Set(); users  with modify permissions:
Set(mwadmin); groups with modify permissions: Set()
22/11/30 14:59:33 INFO Utils: Successfully started service 'sparkDriver' on
port 45883.
22/11/30 14:59:33 INFO SparkEnv: Registering MapOutputTracker
22/11/30 14:59:33 INFO SparkEnv: Registering BlockManagerMaster
22/11/30 14:59:33 INFO BlockManagerMasterEndpoint: Using
org.apache.spark.storage.DefaultTopologyMapper for getting topology
information
22/11/30 14:59:33 INFO BlockManagerMasterEndpoint:
BlockManagerMasterEndpoint up
22/11/30 14:59:33 INFO SparkEnv: Registering BlockManagerMasterHeartbeat
22/11/30 14:59:33 INFO DiskBlockManager: Created local directory at
/tmp/blockmgr-647d2c2a-72e4-402d-aeff-d7460726eb6d
22/11/30 14:59:33 INFO MemoryStore: MemoryStore started with capacity 366.3
MiB
22/11/30 14:59:33 INFO SparkEnv: Registering OutputCommitCoordinator
22/11/30 14:59:33 INFO Utils: Successfully started service 'SparkUI' on
port 4040.
22/11/30 14:59:33 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at
http://localhost:4040
22/11/30 14:59:33 INFO ShimLoader: Loading shim for Spark version: 3.2.2
22/11/30 14:59:33 INFO ShimLoader: Complete Spark build info: 3.2.2,
https://github.com/apache/spark, HEAD,
78a5825fe266c0884d2dd18cbca9625fa258d7f7, 2022-07-11T15:44:21Z
22/11/30 14:59:33 INFO ShimLoader: findURLClassLoader found a
URLClassLoader org.apache.spark.util.MutableURLClassLoader@1530c739
22/11/30 14:59:33 INFO ShimLoader: Updating spark classloader
org.apache.spark.util.MutableURLClassLoader@1530c739 with the URLs:
jar:file:/home/mwadmin/spark-3.2.2-bin-hadoop3.2/jars/rapids-4-spark_2.12-22.10.0.jar!/spark3xx-common/,
jar:file:/home/mwadmin/spark-3.2.2-bin-hadoop3.2/jars/rapids-4-spark_2.12-22.10.0.jar!/spark322/
22/11/30 14:59:33 INFO ShimLoader: Spark classLoader
org.apache.spark.util.MutableURLClassLoader@1530c739 updated successfully
22/11/30 14:59:33 INFO ShimLoader: Updating spark classloader
org.apache.spark.util.MutableURLClassLoader@1530c739 with the URLs:
jar:file:/home/mwadmin/spark-3.2.2-bin-hadoop3.2/jars/rapids-4-spark_2.12-22.10.0.jar!/spark3xx-common/,
jar:file:/home/mwadmin/spark-3.2.2-bin-hadoop3.2/jars/rapids-4-spark_2.12-22.10.0.jar!/spark322/
22/11/30 14:59:33 INFO ShimLoader: Spark classLoader

Error - using Spark with GPU

2022-11-30 Thread Vajiha Begum S A
 spark-submit /home/mwadmin/Documents/test.py
22/11/30 14:59:32 WARN Utils: Your hostname, mwadmin-HP-Z440-Workstation
resolves to a loopback address: 127.0.1.1; using ***.***.**.** instead (on
interface eno1)
22/11/30 14:59:32 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to
another address
Using Spark's default log4j profile:
org/apache/spark/log4j-defaults.properties
22/11/30 14:59:32 INFO SparkContext: Running Spark version 3.2.2
22/11/30 14:59:32 WARN NativeCodeLoader: Unable to load native-hadoop
library for your platform... using builtin-java classes where applicable
22/11/30 14:59:33 INFO ResourceUtils:
==
22/11/30 14:59:33 INFO ResourceUtils: No custom resources configured for
spark.driver.
22/11/30 14:59:33 INFO ResourceUtils:
==
22/11/30 14:59:33 INFO SparkContext: Submitted application: Spark.com
22/11/30 14:59:33 INFO ResourceProfile: Default ResourceProfile created,
executor resources: Map(cores -> name: cores, amount: 1, script: , vendor:
, memory -> name: memory, amount: 1024, script: , vendor: , offHeap ->
name: offHeap, amount: 0, script: , vendor: , gpu -> name: gpu, amount: 1,
script: , vendor: ), task resources: Map(cpus -> name: cpus, amount: 1.0,
gpu -> name: gpu, amount: 0.5)
22/11/30 14:59:33 INFO ResourceProfile: Limiting resource is cpus at 1
tasks per executor
22/11/30 14:59:33 WARN ResourceUtils: The configuration of resource: gpu
(exec = 1, task = 0.5/2, runnable tasks = 2) will result in wasted
resources due to resource cpus limiting the number of runnable tasks per
executor to: 1. Please adjust your configuration.
22/11/30 14:59:33 INFO ResourceProfileManager: Added ResourceProfile id: 0
22/11/30 14:59:33 INFO SecurityManager: Changing view acls to: mwadmin
22/11/30 14:59:33 INFO SecurityManager: Changing modify acls to: mwadmin
22/11/30 14:59:33 INFO SecurityManager: Changing view acls groups to:
22/11/30 14:59:33 INFO SecurityManager: Changing modify acls groups to:
22/11/30 14:59:33 INFO SecurityManager: SecurityManager: authentication
disabled; ui acls disabled; users  with view permissions: Set(mwadmin);
groups with view permissions: Set(); users  with modify permissions:
Set(mwadmin); groups with modify permissions: Set()
22/11/30 14:59:33 INFO Utils: Successfully started service 'sparkDriver' on
port 45883.
22/11/30 14:59:33 INFO SparkEnv: Registering MapOutputTracker
22/11/30 14:59:33 INFO SparkEnv: Registering BlockManagerMaster
22/11/30 14:59:33 INFO BlockManagerMasterEndpoint: Using
org.apache.spark.storage.DefaultTopologyMapper for getting topology
information
22/11/30 14:59:33 INFO BlockManagerMasterEndpoint:
BlockManagerMasterEndpoint up
22/11/30 14:59:33 INFO SparkEnv: Registering BlockManagerMasterHeartbeat
22/11/30 14:59:33 INFO DiskBlockManager: Created local directory at
/tmp/blockmgr-647d2c2a-72e4-402d-aeff-d7460726eb6d
22/11/30 14:59:33 INFO MemoryStore: MemoryStore started with capacity 366.3
MiB
22/11/30 14:59:33 INFO SparkEnv: Registering OutputCommitCoordinator
22/11/30 14:59:33 INFO Utils: Successfully started service 'SparkUI' on
port 4040.
22/11/30 14:59:33 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at
http://localhost:4040
22/11/30 14:59:33 INFO ShimLoader: Loading shim for Spark version: 3.2.2
22/11/30 14:59:33 INFO ShimLoader: Complete Spark build info: 3.2.2,
https://github.com/apache/spark, HEAD,
78a5825fe266c0884d2dd18cbca9625fa258d7f7, 2022-07-11T15:44:21Z
22/11/30 14:59:33 INFO ShimLoader: findURLClassLoader found a
URLClassLoader org.apache.spark.util.MutableURLClassLoader@1530c739
22/11/30 14:59:33 INFO ShimLoader: Updating spark classloader
org.apache.spark.util.MutableURLClassLoader@1530c739 with the URLs:
jar:file:/home/mwadmin/spark-3.2.2-bin-hadoop3.2/jars/rapids-4-spark_2.12-22.10.0.jar!/spark3xx-common/,
jar:file:/home/mwadmin/spark-3.2.2-bin-hadoop3.2/jars/rapids-4-spark_2.12-22.10.0.jar!/spark322/
22/11/30 14:59:33 INFO ShimLoader: Spark classLoader
org.apache.spark.util.MutableURLClassLoader@1530c739 updated successfully
22/11/30 14:59:33 INFO ShimLoader: Updating spark classloader
org.apache.spark.util.MutableURLClassLoader@1530c739 with the URLs:
jar:file:/home/mwadmin/spark-3.2.2-bin-hadoop3.2/jars/rapids-4-spark_2.12-22.10.0.jar!/spark3xx-common/,
jar:file:/home/mwadmin/spark-3.2.2-bin-hadoop3.2/jars/rapids-4-spark_2.12-22.10.0.jar!/spark322/
22/11/30 14:59:33 INFO ShimLoader: Spark classLoader
org.apache.spark.util.MutableURLClassLoader@1530c739 updated successfully
22/11/30 14:59:33 INFO RapidsPluginUtils: RAPIDS Accelerator build:
{version=22.10.0, user=, url=https://github.com/NVIDIA/spark-rapids.git,
date=2022-10-17T11:25:41Z,
revision=c75a2eafc9ce9fb3e6ab75c6677d97bf681bff50, cudf_version=22.10.0,
branch=HEAD}
22/11/30 14:59:33 INFO RapidsPluginUtils: RAPIDS Accelerator JNI build:
{version=22.10.0, user=, url=https://github.com/NVIDIA/spark-rapids-jni.git,
date=2022-10-14T05:19:41Z,

Re: [ANNOUNCE] Apache Spark 3.2.3 released

2022-11-30 Thread Maxim Gekk
Thank you, Chao!

On Wed, Nov 30, 2022 at 12:42 PM Jungtaek Lim 
wrote:

> Thanks Chao for driving the release!
>
> On Wed, Nov 30, 2022 at 6:03 PM Wenchen Fan  wrote:
>
>> Thanks, Chao!
>>
>> On Wed, Nov 30, 2022 at 1:33 AM Chao Sun  wrote:
>>
>>> We are happy to announce the availability of Apache Spark 3.2.3!
>>>
>>> Spark 3.2.3 is a maintenance release containing stability fixes. This
>>> release is based on the branch-3.2 maintenance branch of Spark. We strongly
>>> recommend that all 3.2 users upgrade to this stable release.
>>>
>>> To download Spark 3.2.3, head over to the download page:
>>> https://spark.apache.org/downloads.html
>>>
>>> To view the release notes:
>>> https://spark.apache.org/releases/spark-release-3-2-3.html
>>>
>>> We would like to acknowledge all community members for contributing to this
>>> release. This release would not have been possible without you.
>>>
>>> Chao
>>>
>>> -
>>> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>>>
>>>


Re: [ANNOUNCE] Apache Spark 3.2.3 released

2022-11-30 Thread Jungtaek Lim
Thanks Chao for driving the release!

On Wed, Nov 30, 2022 at 6:03 PM Wenchen Fan  wrote:

> Thanks, Chao!
>
> On Wed, Nov 30, 2022 at 1:33 AM Chao Sun  wrote:
>
>> We are happy to announce the availability of Apache Spark 3.2.3!
>>
>> Spark 3.2.3 is a maintenance release containing stability fixes. This
>> release is based on the branch-3.2 maintenance branch of Spark. We strongly
>> recommend that all 3.2 users upgrade to this stable release.
>>
>> To download Spark 3.2.3, head over to the download page:
>> https://spark.apache.org/downloads.html
>>
>> To view the release notes:
>> https://spark.apache.org/releases/spark-release-3-2-3.html
>>
>> We would like to acknowledge all community members for contributing to this
>> release. This release would not have been possible without you.
>>
>> Chao
>>
>> -
>> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>>
>>


Re: [ANNOUNCE] Apache Spark 3.2.3 released

2022-11-30 Thread Wenchen Fan
Thanks, Chao!

On Wed, Nov 30, 2022 at 1:33 AM Chao Sun  wrote:

> We are happy to announce the availability of Apache Spark 3.2.3!
>
> Spark 3.2.3 is a maintenance release containing stability fixes. This
> release is based on the branch-3.2 maintenance branch of Spark. We strongly
> recommend that all 3.2 users upgrade to this stable release.
>
> To download Spark 3.2.3, head over to the download page:
> https://spark.apache.org/downloads.html
>
> To view the release notes:
> https://spark.apache.org/releases/spark-release-3-2-3.html
>
> We would like to acknowledge all community members for contributing to this
> release. This release would not have been possible without you.
>
> Chao
>
> -
> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>
>