Re: [ANNOUNCE] Apache Spark 3.1.3 released + Docker images

2022-02-25 Thread Holden Karau
On Fri, Feb 25, 2022 at 8:31 AM Ismaël Mejía  wrote:

> The ready to use docker images are great news. I have been waiting for
> this for so long! Extra kudos for including ARM64 versions too!
>
> I am curious: what are the non-ASF artifacts included in them (or do you
> refer to the OS-specific elements with other licenses?), and what
> might the consequences be for end users?
>
OS elements, JDK, Python, etc. For any licensing concerns you should
probably consult a lawyer.

>
> Thanks and kudos to everyone who helped to make this happen!
> Ismaël
>
> ps. Any plans to make these images official Docker images at some point
> (for the extra security/validation)? [1]
> [1] https://docs.docker.com/docker-hub/official_images/
>
> On Mon, Feb 21, 2022 at 10:09 PM Holden Karau 
> wrote:
> >
> > We are happy to announce the availability of Spark 3.1.3!
> >
> > Spark 3.1.3 is a maintenance release containing stability fixes. This
> > release is based on the branch-3.1 maintenance branch of Spark. We
> strongly
> > recommend all 3.1 users to upgrade to this stable release.
> >
> > To download Spark 3.1.3, head over to the download page:
> > https://spark.apache.org/downloads.html
> >
> > To view the release notes:
> > https://spark.apache.org/releases/spark-release-3-1-3.html
> >
> > We would like to acknowledge all community members for contributing to
> this
> > release. This release would not have been possible without you.
> >
> > New Dockerhub magic in this release:
> >
> > We've also started publishing docker containers to the Apache Dockerhub;
> > these contain non-ASF artifacts that are subject to different license
> terms than the
> > Spark release. The docker containers are built for Linux x86 and ARM64
> since that's
> > what I have access to (thanks to NV for the ARM64 machines).
> >
> > You can get them from https://hub.docker.com/apache/spark (and spark-r
> and spark-py) :)
> > (And version 3.2.1 is also now published on Dockerhub).
> >
> > Holden
> >
> > --
> > Twitter: https://twitter.com/holdenkarau
> > Books (Learning Spark, High Performance Spark, etc.):
> https://amzn.to/2MaRAG9
> > YouTube Live Streams: https://www.youtube.com/user/holdenkarau
>
> -
> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>
>

-- 
Twitter: https://twitter.com/holdenkarau
Books (Learning Spark, High Performance Spark, etc.):
https://amzn.to/2MaRAG9  
YouTube Live Streams: https://www.youtube.com/user/holdenkarau


Re: [ANNOUNCE] Apache Spark 3.1.3 released + Docker images

2022-02-25 Thread Ismaël Mejía
The ready to use docker images are great news. I have been waiting for
this for so long! Extra kudos for including ARM64 versions too!

I am curious: what are the non-ASF artifacts included in them (or do you
refer to the OS-specific elements with other licenses?), and what
might the consequences be for end users?

Thanks and kudos to everyone who helped to make this happen!
Ismaël

ps. Any plans to make these images official Docker images at some point
(for the extra security/validation)? [1]
[1] https://docs.docker.com/docker-hub/official_images/

On Mon, Feb 21, 2022 at 10:09 PM Holden Karau  wrote:
>
> We are happy to announce the availability of Spark 3.1.3!
>
> Spark 3.1.3 is a maintenance release containing stability fixes. This
> release is based on the branch-3.1 maintenance branch of Spark. We strongly
> recommend all 3.1 users to upgrade to this stable release.
>
> To download Spark 3.1.3, head over to the download page:
> https://spark.apache.org/downloads.html
>
> To view the release notes:
> https://spark.apache.org/releases/spark-release-3-1-3.html
>
> We would like to acknowledge all community members for contributing to this
> release. This release would not have been possible without you.
>
> New Dockerhub magic in this release:
>
> We've also started publishing docker containers to the Apache Dockerhub;
> these contain non-ASF artifacts that are subject to different license terms 
> than the
> Spark release. The docker containers are built for Linux x86 and ARM64 since 
> that's
> what I have access to (thanks to NV for the ARM64 machines).
>
> You can get them from https://hub.docker.com/apache/spark (and spark-r and 
> spark-py) :)
> (And version 3.2.1 is also now published on Dockerhub).
>
> Holden
>
> --
> Twitter: https://twitter.com/holdenkarau
> Books (Learning Spark, High Performance Spark, etc.): https://amzn.to/2MaRAG9
> YouTube Live Streams: https://www.youtube.com/user/holdenkarau

-
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org



Re: [ANNOUNCE] Apache Spark 3.1.3 released + Docker images

2022-02-22 Thread Denis Bolshakov
I understand that, and we do so, but the news about official images was
breaking news to me.

Ok, I will follow you on those activities.


Thanks for the quick response.

On Tue, 22 Feb 2022 at 22:03, Holden Karau  wrote:

> So you're more than welcome to still build your own Spark docker containers
> with the docker image tool; these are provided to make it easier for folks
> without specific needs. In the future we'll hopefully have published Spark
> containers tagged for different JDKs, but that work has not yet been done.
>
> On Tue, Feb 22, 2022 at 10:51 AM Denis Bolshakov <
> bolshakov.de...@gmail.com> wrote:
>
>> Hello Holden,
>>
>> Could you please provide more details and a plan for docker image support?
>>
>> So far I see only two tags; from them I get the Spark version,
>> but there is no information about the Java, Hadoop, or Scala versions.
>>
>> Also, there is no description on Docker Hub; it would probably be nice to
>> put a link to the Dockerfiles in the GitHub repository.
>>
>> What directories are expected to be mounted and ports forwarded? How can
>> I mount the krb5.conf file and the directory where my Kerberos ticket is
>> located?
>>
>> I've pulled the docker image with tag spark 3.2.1 and I see that it has
>> Java 11 and Hadoop 3.3, but our environment requires us to have other
>> versions.
>>
>> On Tue, 22 Feb 2022 at 16:29, Mich Talebzadeh 
>> wrote:
>>
>>> Well that is just a recommendation.
>>>
>>> The onus is on me, the user, to download it and go through dev and test,
>>> running a suite of batch jobs to ensure that all works OK, especially on
>>> the edge, sign the release off, and roll it out into production. It won't
>>> be prudent otherwise.
>>>
>>> HTH
>>>
>>> On Tue, 22 Feb 2022 at 12:12, Bjørn Jørgensen 
>>> wrote:
>>>
 "Spark 3.1.3 is a maintenance release containing stability fixes. This
 release is based on the branch-3.1 maintenance branch of Spark. We strongly
 recommend all 3.1.3 users to upgrade to this stable release."
 https://spark.apache.org/releases/spark-release-3-1-3.html

 Do we have another 3.1.3, or do we strongly recommend all 3.1.2 users to
 upgrade to this stable release ?

 tir. 22. feb. 2022 kl. 09:50 skrev angers zhu :

> Hi, it seems
>
>- [SPARK-35391] :
>Memory leak in ExecutorAllocationListener breaks dynamic allocation 
> under
>high load
>
> Does this link to the wrong JIRA ticket?
>
> Mich Talebzadeh  于2022年2月22日周二 15:49写道:
>
>> Well, that is pretty easy to do.
>>
>> However, a quick fix for now could be to retag the image created. It
>> is a small volume which can be done manually for now. For example, I just
>> downloaded v3.1.3
>>
>>
>> docker image ls
>>
>> REPOSITORY TAG
>> IMAGE ID   CREATEDSIZE
>>
>> apache/spark   v3.1.3
>>  31ed15daa2bf   12 hours ago   531MB
>>
>> Retag it with
>>
>>
>> docker tag 31ed15daa2bf
>> apache/spark/tags/spark-3.1.3-scala_2.12-8-jre-slim-buster
>>
>> docker image ls
>>
>> REPOSITORY   TAG
>>   IMAGE ID   CREATED
>> SIZE
>>
>> apache/spark/tags/spark-3.1.3-scala_2.12-8-jre-slim-buster   latest
>>31ed15daa2bf   12 hours 
>> ago
>>  531MB
>>
>> Then push it with (example)
>>
>> docker push apache/spark/tags/spark-3.1.3-scala_2.12-8-jre-slim-buster
>>
>>
>> HTH
>>
>>
>>view my Linkedin profile
>> 
>>
>>
>>  https://en.everybodywiki.com/Mich_Talebzadeh
>>
>>
>>
>> *Disclaimer:* Use it at your own risk. Any and all responsibility
>> for any loss, damage or destruction of data or any other property which 
>> may
>> arise from relying on this email's technical content is explicitly
>> disclaimed. The author will in no case be liable for any monetary damages
>> arising from such loss, damage or destruction.
>>
>>
>>
>>
>> On Mon, 21 Feb 2022 at 23:51, Holden Karau 
>> wrote:
>>
>>> Yeah I think we should still adopt that naming convention, however
>>> no one has taken the time to write a script to do it yet, so until we
>>> get that script merged I think we'll just have one build. I can try and 
>>> do
>>> that for the next release but it would be a great 2nd issue for someone
>>> getting more familiar with the release tooling.
>>>
>>> On Mon, Feb 21, 2022 at 2:18 PM Mich Talebzadeh <
>>> mich.talebza...@gmail.com> wrote:
>>>
 Ok thanks for the correction.

 The docker pull line shows as 

Re: [ANNOUNCE] Apache Spark 3.1.3 released + Docker images

2022-02-22 Thread Holden Karau
So you're more than welcome to still build your own Spark docker containers
with the docker image tool; these are provided to make it easier for folks
without specific needs. In the future we'll hopefully have published Spark
containers tagged for different JDKs, but that work has not yet been done.
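The two routes here (roll your own with the docker-image-tool shipped in the Spark distribution, or take the published image) might look roughly like the sketch below; the repository and tag names are made up for illustration, and the tool invocation assumes an unpacked Spark distribution.

```shell
# Hypothetical sketch: build a custom image with the docker-image-tool
# from the Spark distribution, or just pull the published one.
SPARK_HOME=${SPARK_HOME:-/opt/spark}   # assumed location of an unpacked distribution
REPO=myrepo/spark                      # made-up repository name
TAG=3.1.3-custom                       # made-up tag
if command -v docker >/dev/null; then
  if [ -x "$SPARK_HOME/bin/docker-image-tool.sh" ]; then
    # build your own image from the distribution's Dockerfiles
    "$SPARK_HOME/bin/docker-image-tool.sh" -r "$REPO" -t "$TAG" build
  else
    # or, with no specific needs, use the published image as-is
    docker pull apache/spark:v3.1.3
  fi
fi
```

The `-r`/`-t` flags and `build` subcommand are from the tool's usage text; `-p` can additionally select a PySpark Dockerfile.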

On Tue, Feb 22, 2022 at 10:51 AM Denis Bolshakov 
wrote:

> Hello Holden,
>
> Could you please provide more details and a plan for docker image support?
>
> So far I see only two tags; from them I get the Spark version,
> but there is no information about the Java, Hadoop, or Scala versions.
>
> Also, there is no description on Docker Hub; it would probably be nice to
> put a link to the Dockerfiles in the GitHub repository.
>
> What directories are expected to be mounted and ports forwarded? How can I
> mount the krb5.conf file and the directory where my Kerberos ticket is located?
>
> I've pulled the docker image with tag spark 3.2.1 and I see that it has
> Java 11 and Hadoop 3.3, but our environment requires us to have other
> versions.
>
> On Tue, 22 Feb 2022 at 16:29, Mich Talebzadeh 
> wrote:
>
>> Well that is just a recommendation.
>>
>> The onus is on me, the user, to download it and go through dev and test,
>> running a suite of batch jobs to ensure that all works OK, especially on the
>> edge, sign the release off, and roll it out into production. It won't be
>> prudent otherwise.
>>
>> HTH
>>
>> On Tue, 22 Feb 2022 at 12:12, Bjørn Jørgensen 
>> wrote:
>>
>>> "Spark 3.1.3 is a maintenance release containing stability fixes. This
>>> release is based on the branch-3.1 maintenance branch of Spark. We strongly
>>> recommend all 3.1.3 users to upgrade to this stable release."
>>> https://spark.apache.org/releases/spark-release-3-1-3.html
>>>
>>> Do we have another 3.1.3, or do we strongly recommend all 3.1.2 users to
>>> upgrade to this stable release ?
>>>
>>> tir. 22. feb. 2022 kl. 09:50 skrev angers zhu :
>>>
 Hi, it seems

- [SPARK-35391] :
Memory leak in ExecutorAllocationListener breaks dynamic allocation 
 under
high load

 Does this link to the wrong JIRA ticket?

 Mich Talebzadeh  于2022年2月22日周二 15:49写道:

> Well, that is pretty easy to do.
>
> However, a quick fix for now could be to retag the image created. It
> is a small volume which can be done manually for now. For example, I just
> downloaded v3.1.3
>
>
> docker image ls
>
> REPOSITORY TAG
> IMAGE ID   CREATEDSIZE
>
> apache/spark   v3.1.3
>31ed15daa2bf   12 hours ago   531MB
>
> Retag it with
>
>
> docker tag 31ed15daa2bf
> apache/spark/tags/spark-3.1.3-scala_2.12-8-jre-slim-buster
>
> docker image ls
>
> REPOSITORY   TAG
>   IMAGE ID   CREATED
> SIZE
>
> apache/spark/tags/spark-3.1.3-scala_2.12-8-jre-slim-buster   latest
>  31ed15daa2bf   12 hours ago
>  531MB
>
> Then push it with (example)
>
> docker push apache/spark/tags/spark-3.1.3-scala_2.12-8-jre-slim-buster
>
>
> HTH
>
>
>view my Linkedin profile
> 
>
>
>  https://en.everybodywiki.com/Mich_Talebzadeh
>
>
>
> *Disclaimer:* Use it at your own risk. Any and all responsibility for
> any loss, damage or destruction of data or any other property which may
> arise from relying on this email's technical content is explicitly
> disclaimed. The author will in no case be liable for any monetary damages
> arising from such loss, damage or destruction.
>
>
>
>
> On Mon, 21 Feb 2022 at 23:51, Holden Karau 
> wrote:
>
>> Yeah I think we should still adopt that naming convention, however no
>> one has taken the time to write a script to do it yet, so until we get
>> that script merged I think we'll just have one build. I can try and do 
>> that
>> for the next release but it would be a great 2nd issue for someone 
>> getting
>> more familiar with the release tooling.
>>
>> On Mon, Feb 21, 2022 at 2:18 PM Mich Talebzadeh <
>> mich.talebza...@gmail.com> wrote:
>>
>>> Ok thanks for the correction.
>>>
>>> The docker pull line shows as follows:
>>>
>>> docker pull apache/spark:v3.2.1
>>>
>>>
>>> So this only tells me the version of Spark 3.2.1
>>>
>>>
>>> I thought we discussed deciding on the docker naming conventions in
>>> detail, and broadly agreed on what needs to be in the naming convention.
>>> For example, in this thread:
>>>
>>>
>>> Time to start publishing Spark Docker 

Re: [ANNOUNCE] Apache Spark 3.1.3 released + Docker images

2022-02-22 Thread Denis Bolshakov
Hello Holden,

Could you please provide more details and a plan for docker image support?

So far I see only two tags; from them I get the Spark version,
but there is no information about the Java, Hadoop, or Scala versions.

Also, there is no description on Docker Hub; it would probably be nice to
put a link to the Dockerfiles in the GitHub repository.

What directories are expected to be mounted and ports forwarded? How can I
mount the krb5.conf file and the directory where my Kerberos ticket is located?

I've pulled the docker image with tag spark 3.2.1 and I see that it has
Java 11 and Hadoop 3.3, but our environment requires us to have other
versions.
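The Kerberos question doesn't have a documented answer in the image description; the following is purely an illustrative guess, assuming the ticket cache is a file under /tmp and that the container user can read the mounted paths. None of this is documented behaviour of the apache/spark image.

```shell
# Illustrative sketch (assumptions, not documented image behaviour):
# bind-mount krb5.conf and the ticket cache read-only, and point
# KRB5CCNAME at the mounted cache inside the container.
KRB5_CONF=/etc/krb5.conf
TICKET_CACHE="/tmp/krb5cc_$(id -u)"   # assumed file-based ticket cache location
if command -v docker >/dev/null; then
  docker run --rm \
    -v "$KRB5_CONF":/etc/krb5.conf:ro \
    -v "$TICKET_CACHE":"$TICKET_CACHE":ro \
    -e KRB5CCNAME="FILE:$TICKET_CACHE" \
    apache/spark:v3.2.1 /opt/spark/bin/spark-submit --version
fi
```

Whether the container user can read the mounted cache (file ownership, UID mapping) would still need checking against the image's actual user.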

On Tue, 22 Feb 2022 at 16:29, Mich Talebzadeh 
wrote:

> Well that is just a recommendation.
>
> The onus is on me, the user, to download it and go through dev and test,
> running a suite of batch jobs to ensure that all works OK, especially on the
> edge, sign the release off, and roll it out into production. It won't be
> prudent otherwise.
>
> HTH
>
> On Tue, 22 Feb 2022 at 12:12, Bjørn Jørgensen 
> wrote:
>
>> "Spark 3.1.3 is a maintenance release containing stability fixes. This
>> release is based on the branch-3.1 maintenance branch of Spark. We strongly
>> recommend all 3.1.3 users to upgrade to this stable release."
>> https://spark.apache.org/releases/spark-release-3-1-3.html
>>
>> Do we have another 3.1.3, or do we strongly recommend all 3.1.2 users to
>> upgrade to this stable release ?
>>
>> tir. 22. feb. 2022 kl. 09:50 skrev angers zhu :
>>
>>> Hi, it seems
>>>
>>>- [SPARK-35391] :
>>>Memory leak in ExecutorAllocationListener breaks dynamic allocation under
>>>high load
>>>
>>> Does this link to the wrong JIRA ticket?
>>>
>>> Mich Talebzadeh  于2022年2月22日周二 15:49写道:
>>>
 Well, that is pretty easy to do.

 However, a quick fix for now could be to retag the image created. It is
 a small volume which can be done manually for now. For example, I just
 downloaded v3.1.3


 docker image ls

 REPOSITORY TAG
   IMAGE ID   CREATEDSIZE

 apache/spark   v3.1.3
31ed15daa2bf   12 hours ago   531MB

 Retag it with


 docker tag 31ed15daa2bf
 apache/spark/tags/spark-3.1.3-scala_2.12-8-jre-slim-buster

 docker image ls

 REPOSITORY   TAG
 IMAGE ID   CREATED
 SIZE

 apache/spark/tags/spark-3.1.3-scala_2.12-8-jre-slim-buster   latest
  31ed15daa2bf   12 hours ago
  531MB

 Then push it with (example)

 docker push apache/spark/tags/spark-3.1.3-scala_2.12-8-jre-slim-buster


 HTH


view my Linkedin profile
 


  https://en.everybodywiki.com/Mich_Talebzadeh



 *Disclaimer:* Use it at your own risk. Any and all responsibility for
 any loss, damage or destruction of data or any other property which may
 arise from relying on this email's technical content is explicitly
 disclaimed. The author will in no case be liable for any monetary damages
 arising from such loss, damage or destruction.




 On Mon, 21 Feb 2022 at 23:51, Holden Karau 
 wrote:

> Yeah I think we should still adopt that naming convention, however no
> one has taken the time to write a script to do it yet, so until we get
> that script merged I think we'll just have one build. I can try and do 
> that
> for the next release but it would be a great 2nd issue for someone getting
> more familiar with the release tooling.
>
> On Mon, Feb 21, 2022 at 2:18 PM Mich Talebzadeh <
> mich.talebza...@gmail.com> wrote:
>
>> Ok thanks for the correction.
>>
>> The docker pull line shows as follows:
>>
>> docker pull apache/spark:v3.2.1
>>
>>
>> So this only tells me the version of Spark 3.2.1
>>
>>
>> I thought we discussed deciding on the docker naming conventions in
>> detail, and broadly agreed on what needs to be in the naming convention.
>> For example, in this thread:
>>
>>
>> Time to start publishing Spark Docker Images? -
>> mich.talebza...@gmail.com - Gmail (google.com)
>> 
>>  dated
>> 22nd July 2021
>>
>>
>> Referring to that, I think the broad agreement was that the docker
>> image name should be of the form:
>>
>>
>> The name of the file provides:
>>
>>- Built for spark or spark-py (PySpark) spark-r
>>- Spark version: 3.1.1, 3.1.2, 3.2.1 etc.
>>- Scala 

Re: [ANNOUNCE] Apache Spark 3.1.3 released + Docker images

2022-02-22 Thread Mich Talebzadeh
Well that is just a recommendation.

The onus is on me, the user, to download it and go through dev and test,
running a suite of batch jobs to ensure that all works OK, especially on the
edge, sign the release off, and roll it out into production. It won't be
prudent otherwise.

HTH

On Tue, 22 Feb 2022 at 12:12, Bjørn Jørgensen 
wrote:

> "Spark 3.1.3 is a maintenance release containing stability fixes. This
> release is based on the branch-3.1 maintenance branch of Spark. We strongly
> recommend all 3.1.3 users to upgrade to this stable release."
> https://spark.apache.org/releases/spark-release-3-1-3.html
>
> Do we have another 3.1.3, or do we strongly recommend all 3.1.2 users to
> upgrade to this stable release ?
>
> tir. 22. feb. 2022 kl. 09:50 skrev angers zhu :
>
>> Hi, it seems
>>
>>- [SPARK-35391] :
>>Memory leak in ExecutorAllocationListener breaks dynamic allocation under
>>high load
>>
>> Does this link to the wrong JIRA ticket?
>>
>> Mich Talebzadeh  于2022年2月22日周二 15:49写道:
>>
>>> Well, that is pretty easy to do.
>>>
>>> However, a quick fix for now could be to retag the image created. It is
>>> a small volume which can be done manually for now. For example, I just
>>> downloaded v3.1.3
>>>
>>>
>>> docker image ls
>>>
>>> REPOSITORY TAG
>>>   IMAGE ID   CREATEDSIZE
>>>
>>> apache/spark   v3.1.3
>>>  31ed15daa2bf   12 hours ago   531MB
>>>
>>> Retag it with
>>>
>>>
>>> docker tag 31ed15daa2bf
>>> apache/spark/tags/spark-3.1.3-scala_2.12-8-jre-slim-buster
>>>
>>> docker image ls
>>>
>>> REPOSITORY   TAG
>>> IMAGE ID   CREATED
>>> SIZE
>>>
>>> apache/spark/tags/spark-3.1.3-scala_2.12-8-jre-slim-buster   latest
>>>31ed15daa2bf   12 hours ago
>>>  531MB
>>>
>>> Then push it with (example)
>>>
>>> docker push apache/spark/tags/spark-3.1.3-scala_2.12-8-jre-slim-buster
>>>
>>>
>>> HTH
>>>
>>>
>>>view my Linkedin profile
>>> 
>>>
>>>
>>>  https://en.everybodywiki.com/Mich_Talebzadeh
>>>
>>>
>>>
>>> *Disclaimer:* Use it at your own risk. Any and all responsibility for
>>> any loss, damage or destruction of data or any other property which may
>>> arise from relying on this email's technical content is explicitly
>>> disclaimed. The author will in no case be liable for any monetary damages
>>> arising from such loss, damage or destruction.
>>>
>>>
>>>
>>>
>>> On Mon, 21 Feb 2022 at 23:51, Holden Karau  wrote:
>>>
 Yeah I think we should still adopt that naming convention, however no
 one has taken the time to write a script to do it yet, so until we get
 that script merged I think we'll just have one build. I can try and do that
 for the next release but it would be a great 2nd issue for someone getting
 more familiar with the release tooling.

 On Mon, Feb 21, 2022 at 2:18 PM Mich Talebzadeh <
 mich.talebza...@gmail.com> wrote:

> Ok thanks for the correction.
>
> The docker pull line shows as follows:
>
> docker pull apache/spark:v3.2.1
>
>
> So this only tells me the version of Spark 3.2.1
>
>
> I thought we discussed deciding on the docker naming conventions in
> detail, and broadly agreed on what needs to be in the naming convention.
> For example, in this thread:
>
>
> Time to start publishing Spark Docker Images? -
> mich.talebza...@gmail.com - Gmail (google.com)
> 
>  dated
> 22nd July 2021
>
>
> Referring to that, I think the broad agreement was that the docker
> image name should be of the form:
>
>
> The name of the file provides:
>
>- Built for spark or spark-py (PySpark) spark-r
>- Spark version: 3.1.1, 3.1.2, 3.2.1 etc.
>    - Scala version: 2.12
>    - The OS version based on Java: 8-jre-slim-buster,
>    11-jre-slim-buster, meaning Java 8 and Java 11 respectively
>
> I believe it is a good thing and we ought to adopt that convention.
> For example:
>
>
> spark-py-3.2.1-scala_2.12-11-jre-slim-buster
>
>
> HTH
>
>
>
>view my Linkedin profile
> 
>
>
>  https://en.everybodywiki.com/Mich_Talebzadeh
>
>
>
> *Disclaimer:* Use it at your own risk. Any and all responsibility for
> any loss, damage or destruction of data or any other property which may
> arise from relying on this email's technical content is explicitly
> disclaimed. The author will in no case be liable for any monetary damages
> arising from such loss, damage or 

Re: [ANNOUNCE] Apache Spark 3.1.3 released + Docker images

2022-02-22 Thread Bjørn Jørgensen
"Spark 3.1.3 is a maintenance release containing stability fixes. This
release is based on the branch-3.1 maintenance branch of Spark. We strongly
recommend all 3.1.3 users to upgrade to this stable release."
https://spark.apache.org/releases/spark-release-3-1-3.html

Do we have another 3.13 or do we strongly recommend all 3.1.2 users to
upgrade to this stable release ?

tir. 22. feb. 2022 kl. 09:50 skrev angers zhu :

> Hi, it seems
>
>- [SPARK-35391] :
>Memory leak in ExecutorAllocationListener breaks dynamic allocation under
>high load
>
> Does this link to the wrong JIRA ticket?
>
> Mich Talebzadeh  于2022年2月22日周二 15:49写道:
>
>> Well, that is pretty easy to do.
>>
>> However, a quick fix for now could be to retag the image created. It is a
>> small volume which can be done manually for now. For example, I just
>> downloaded v3.1.3
>>
>>
>> docker image ls
>>
>> REPOSITORY TAG
>> IMAGE ID   CREATEDSIZE
>>
>> apache/spark   v3.1.3
>>  31ed15daa2bf   12 hours ago   531MB
>>
>> Retag it with
>>
>>
>> docker tag 31ed15daa2bf
>> apache/spark/tags/spark-3.1.3-scala_2.12-8-jre-slim-buster
>>
>> docker image ls
>>
>> REPOSITORY   TAG
>>   IMAGE ID   CREATEDSIZE
>>
>> apache/spark/tags/spark-3.1.3-scala_2.12-8-jre-slim-buster   latest
>>31ed15daa2bf   12 hours ago
>>  531MB
>>
>> Then push it with (example)
>>
>> docker push apache/spark/tags/spark-3.1.3-scala_2.12-8-jre-slim-buster
>>
>>
>> HTH
>>
>>
>>view my Linkedin profile
>> 
>>
>>
>>  https://en.everybodywiki.com/Mich_Talebzadeh
>>
>>
>>
>> *Disclaimer:* Use it at your own risk. Any and all responsibility for
>> any loss, damage or destruction of data or any other property which may
>> arise from relying on this email's technical content is explicitly
>> disclaimed. The author will in no case be liable for any monetary damages
>> arising from such loss, damage or destruction.
>>
>>
>>
>>
>> On Mon, 21 Feb 2022 at 23:51, Holden Karau  wrote:
>>
>>> Yeah I think we should still adopt that naming convention, however no
>>> one has taken the time to write a script to do it yet, so until we get
>>> that script merged I think we'll just have one build. I can try and do that
>>> for the next release but it would be a great 2nd issue for someone getting
>>> more familiar with the release tooling.
>>>
>>> On Mon, Feb 21, 2022 at 2:18 PM Mich Talebzadeh <
>>> mich.talebza...@gmail.com> wrote:
>>>
 Ok thanks for the correction.

 The docker pull line shows as follows:

 docker pull apache/spark:v3.2.1


 So this only tells me the version of Spark 3.2.1


 I thought we discussed deciding on the docker naming conventions in
 detail, and broadly agreed on what needs to be in the naming convention.
 For example, in this thread:


 Time to start publishing Spark Docker Images? -
 mich.talebza...@gmail.com - Gmail (google.com)
 
  dated
 22nd July 2021


 Referring to that, I think the broad agreement was that the docker
 image name should be of the form:


 The name of the file provides:

- Built for spark or spark-py (PySpark) spark-r
- Spark version: 3.1.1, 3.1.2, 3.2.1 etc.
- Scala version: 2.12
- The OS version based on Java: 8-jre-slim-buster,
11-jre-slim-buster, meaning Java 8 and Java 11 respectively

 I believe it is a good thing and we ought to adopt that convention. For
 example:


 spark-py-3.2.1-scala_2.12-11-jre-slim-buster


 HTH



view my Linkedin profile
 


  https://en.everybodywiki.com/Mich_Talebzadeh



 *Disclaimer:* Use it at your own risk. Any and all responsibility for
 any loss, damage or destruction of data or any other property which may
 arise from relying on this email's technical content is explicitly
 disclaimed. The author will in no case be liable for any monetary damages
 arising from such loss, damage or destruction.




 On Mon, 21 Feb 2022 at 21:58, Holden Karau 
 wrote:

> My bad, the correct link is:
>
> https://hub.docker.com/r/apache/spark/tags
>
> On Mon, Feb 21, 2022 at 1:17 PM Mich Talebzadeh <
> mich.talebza...@gmail.com> wrote:
>
>> Well, that docker link is not found! Maybe a permission issue.
>>
>> [image: image.png]
>>
>>
>>
>>
>>view my Linkedin profile
>> 

Re: [ANNOUNCE] Apache Spark 3.1.3 released + Docker images

2022-02-22 Thread angers zhu
Hi, it seems

   - [SPARK-35391] :
   Memory leak in ExecutorAllocationListener breaks dynamic allocation under
   high load

Does this link to the wrong JIRA ticket?

Mich Talebzadeh  于2022年2月22日周二 15:49写道:

> Well, that is pretty easy to do.
>
> However, a quick fix for now could be to retag the image created. It is a
> small volume which can be done manually for now. For example, I just
> downloaded v3.1.3
>
>
> docker image ls
>
> REPOSITORY TAG
> IMAGE ID   CREATEDSIZE
>
> apache/spark   v3.1.3
>31ed15daa2bf   12 hours ago   531MB
>
> Retag it with
>
>
> docker tag 31ed15daa2bf
> apache/spark/tags/spark-3.1.3-scala_2.12-8-jre-slim-buster
>
> docker image ls
>
> REPOSITORY   TAG
>   IMAGE ID   CREATEDSIZE
>
> apache/spark/tags/spark-3.1.3-scala_2.12-8-jre-slim-buster   latest
>  31ed15daa2bf   12 hours ago   531MB
>
> Then push it with (example)
>
> docker push apache/spark/tags/spark-3.1.3-scala_2.12-8-jre-slim-buster
>
>
> HTH
>
>
>view my Linkedin profile
> 
>
>
>  https://en.everybodywiki.com/Mich_Talebzadeh
>
>
>
> *Disclaimer:* Use it at your own risk. Any and all responsibility for any
> loss, damage or destruction of data or any other property which may arise
> from relying on this email's technical content is explicitly disclaimed.
> The author will in no case be liable for any monetary damages arising from
> such loss, damage or destruction.
>
>
>
>
> On Mon, 21 Feb 2022 at 23:51, Holden Karau  wrote:
>
>> Yeah I think we should still adopt that naming convention, however no one
>> has taken the time to write a script to do it yet, so until we get that
>> script merged I think we'll just have one build. I can try and do that for
>> the next release but it would be a great 2nd issue for someone getting more
>> familiar with the release tooling.
>>
>> On Mon, Feb 21, 2022 at 2:18 PM Mich Talebzadeh <
>> mich.talebza...@gmail.com> wrote:
>>
>>> Ok thanks for the correction.
>>>
>>> The docker pull line shows as follows:
>>>
>>> docker pull apache/spark:v3.2.1
>>>
>>>
>>> So this only tells me the version of Spark 3.2.1
>>>
>>>
>>> I thought we discussed deciding on the docker naming conventions in
>>> detail, and broadly agreed on what needs to be in the naming convention.
>>> For example, in this thread:
>>>
>>>
>>> Time to start publishing Spark Docker Images? -
>>> mich.talebza...@gmail.com - Gmail (google.com)
>>> 
>>>  dated
>>> 22nd July 2021
>>>
>>>
>>> Referring to that, I think the broad agreement was that the docker image
>>> name should be of the form:
>>>
>>>
>>> The name of the file provides:
>>>
>>>- Built for spark or spark-py (PySpark) spark-r
>>>- Spark version: 3.1.1, 3.1.2, 3.2.1 etc.
>>>    - Scala version: 2.12
>>>    - The OS version based on Java: 8-jre-slim-buster,
>>>    11-jre-slim-buster, meaning Java 8 and Java 11 respectively
>>>
>>> I believe it is a good thing and we ought to adopt that convention. For
>>> example:
>>>
>>>
>>> spark-py-3.2.1-scala_2.12-11-jre-slim-buster
>>>
>>>
>>> HTH
>>>
>>>
>>>
>>>view my Linkedin profile
>>> 
>>>
>>>
>>>  https://en.everybodywiki.com/Mich_Talebzadeh
>>>
>>>
>>>
>>> *Disclaimer:* Use it at your own risk. Any and all responsibility for
>>> any loss, damage or destruction of data or any other property which may
>>> arise from relying on this email's technical content is explicitly
>>> disclaimed. The author will in no case be liable for any monetary damages
>>> arising from such loss, damage or destruction.
>>>
>>>
>>>
>>>
>>> On Mon, 21 Feb 2022 at 21:58, Holden Karau  wrote:
>>>
 My bad, the correct link is:

 https://hub.docker.com/r/apache/spark/tags

 On Mon, Feb 21, 2022 at 1:17 PM Mich Talebzadeh <
 mich.talebza...@gmail.com> wrote:

> Well, that docker link is not found! Maybe a permission issue.
>
> [image: image.png]
>
>
>
>
>view my Linkedin profile
> 
>
>
>  https://en.everybodywiki.com/Mich_Talebzadeh
>
>
>
> *Disclaimer:* Use it at your own risk. Any and all responsibility for
> any loss, damage or destruction of data or any other property which may
> arise from relying on this email's technical content is explicitly
> disclaimed. The author will in no case be liable for any monetary damages
> arising from such loss, damage or destruction.
>
>
>
>
> On Mon, 21 Feb 2022 at 21:09, Holden Karau 
> wrote:
>
>> We are happy 

Re: [ANNOUNCE] Apache Spark 3.1.3 released + Docker images

2022-02-21 Thread Mich Talebzadeh
Well, that is pretty easy to do.

However, a quick fix for now could be to retag the image created. It is a
small amount of work which can be done manually for now. For example, I just
downloaded v3.1.3:


docker image ls

REPOSITORY     TAG      IMAGE ID       CREATED        SIZE
apache/spark   v3.1.3   31ed15daa2bf   12 hours ago   531MB

Retag it with

docker tag 31ed15daa2bf apache/spark/tags/spark-3.1.3-scala_2.12-8-jre-slim-buster

docker image ls

REPOSITORY                                                   TAG      IMAGE ID       CREATED        SIZE
apache/spark/tags/spark-3.1.3-scala_2.12-8-jre-slim-buster   latest   31ed15daa2bf   12 hours ago   531MB

Then push it with (example)

docker push apache/spark/tags/spark-3.1.3-scala_2.12-8-jre-slim-buster
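The descriptive tag discussed in this thread could also be composed by a small shell helper rather than typed by hand; the sketch below is hypothetical. Note that `docker tag` expects a `repository:tag` form, so a target like `apache/spark:spark-3.1.3-scala_2.12-8-jre-slim-buster` would keep the image under the same repository, whereas the `apache/spark/tags/...` form above creates a new repository of that name tagged `latest`.

```shell
# Hypothetical helper: compose the descriptive tag from its components
# (image flavour, Spark version, Scala version, JRE version).
make_tag() {
  # $1=flavour $2=spark_version $3=scala_version $4=jre_version
  printf '%s-%s-scala_%s-%s-jre-slim-buster' "$1" "$2" "$3" "$4"
}

TAG=$(make_tag spark 3.1.3 2.12 8)
echo "$TAG"   # spark-3.1.3-scala_2.12-8-jre-slim-buster
if command -v docker >/dev/null; then
  # image id taken from the ls output above; ignore failure if it isn't local
  docker tag 31ed15daa2bf "apache/spark:$TAG" || true
fi
```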


HTH


   view my Linkedin profile



 https://en.everybodywiki.com/Mich_Talebzadeh



*Disclaimer:* Use it at your own risk. Any and all responsibility for any
loss, damage or destruction of data or any other property which may arise
from relying on this email's technical content is explicitly disclaimed.
The author will in no case be liable for any monetary damages arising from
such loss, damage or destruction.




On Mon, 21 Feb 2022 at 23:51, Holden Karau  wrote:

> Yeah I think we should still adopt that naming convention, however no one
> has taken the time to write a script to do it yet, so until we get that
> script merged I think we'll just have one build. I can try and do that for
> the next release but it would be a great 2nd issue for someone getting more
> familiar with the release tooling.

Re: [ANNOUNCE] Apache Spark 3.1.3 released + Docker images

2022-02-21 Thread Prasad Paravatha
Apologies, please ignore my previous message

On Mon, Feb 21, 2022 at 5:56 PM Prasad Paravatha 
wrote:

> FYI, I am getting 404 for https://hub.docker.com/apache/spark
>


Re: [ANNOUNCE] Apache Spark 3.1.3 released + Docker images

2022-02-21 Thread Prasad Paravatha
FYI, I am getting 404 for https://hub.docker.com/apache/spark


Re: [ANNOUNCE] Apache Spark 3.1.3 released + Docker images

2022-02-21 Thread Holden Karau
Yeah, I think we should still adopt that naming convention; however, no one
has taken the time to write a script to do it yet, so until we get that
script merged I think we'll just have one build. I can try to do that for
the next release, but it would be a great second issue for someone getting
more familiar with the release tooling.
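Since no such script exists yet, here is a minimal sketch of the shape it
might take. It only prints the docker tag/push commands rather than running
them, and the flavours, versions, and image ID are illustrative assumptions,
not the actual release tooling:

```shell
#!/usr/bin/env bash
# Hypothetical sketch of the retag script discussed above: expand one
# published image ID into the full set of convention-named tags.
# It only echoes the commands a real script would execute; all the
# flavours, versions, and the image ID below are illustrative.
set -euo pipefail

emit_retag_commands() {
  local image_id="$1" spark_ver="$2" scala_ver="2.12"
  local flavor jre tag
  for flavor in spark spark-py spark-r; do
    for jre in 8-jre-slim-buster 11-jre-slim-buster; do
      tag="apache/${flavor}:${spark_ver}-scala_${scala_ver}-${jre}"
      echo "docker tag ${image_id} ${tag}"
      echo "docker push ${tag}"
    done
  done
}

emit_retag_commands 31ed15daa2bf 3.1.3
```

A real version would execute the commands instead of echoing them, and pull
the version list from the release metadata rather than hard-coding it.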


-- 
Twitter: https://twitter.com/holdenkarau
Books (Learning Spark, High Performance Spark, etc.):
https://amzn.to/2MaRAG9  
YouTube Live Streams: https://www.youtube.com/user/holdenkarau


Re: [ANNOUNCE] Apache Spark 3.1.3 released + Docker images

2022-02-21 Thread Mich Talebzadeh
Ok thanks for the correction.

The docker pull line shows as follows:

docker pull apache/spark:v3.2.1


So the tag only tells me the Spark version (3.2.1).


I thought we discussed the docker naming conventions in detail and broadly
agreed on what needs to be in them, for example in the thread "Time to
start publishing Spark Docker Images?" dated 22nd July 2021.


Referring to that, I think the broad agreement was that the image name
should encode:


   - The flavour: spark, spark-py (PySpark), or spark-r
   - The Spark version: 3.1.1, 3.1.2, 3.2.1, etc.
   - The Scala version: 2.12
   - The JRE/OS base: 8-jre-slim-buster or 11-jre-slim-buster,
   meaning Java 8 and Java 11 respectively

I believe that is a good convention and we ought to adopt it. For example:


spark-py-3.2.1-scala_2.12-11-jre-slim-buster
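To make the convention concrete, a tiny sketch that assembles a tag from
those components (the function name and arguments are purely illustrative):

```shell
#!/usr/bin/env bash
set -euo pipefail

# Assemble an image tag from the proposed naming components:
# flavour, Spark version, Scala version, JRE/OS base.
make_tag() {
  local flavor="$1" spark_ver="$2" scala_ver="$3" jre_base="$4"
  echo "${flavor}-${spark_ver}-scala_${scala_ver}-${jre_base}"
}

make_tag spark-py 3.2.1 2.12 11-jre-slim-buster
# prints: spark-py-3.2.1-scala_2.12-11-jre-slim-buster
```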


HTH









Re: [ANNOUNCE] Apache Spark 3.1.3 released + Docker images

2022-02-21 Thread Holden Karau
My bad, the correct link is:

https://hub.docker.com/r/apache/spark/tags


-- 
Twitter: https://twitter.com/holdenkarau
Books (Learning Spark, High Performance Spark, etc.):
https://amzn.to/2MaRAG9  
YouTube Live Streams: https://www.youtube.com/user/holdenkarau


Re: [ANNOUNCE] Apache Spark 3.1.3 released + Docker images

2022-02-21 Thread Mich Talebzadeh
Well, that docker link is not found! Maybe a permission issue.

[image: image.png]










[ANNOUNCE] Apache Spark 3.1.3 released + Docker images

2022-02-21 Thread Holden Karau
We are happy to announce the availability of Spark 3.1.3!

Spark 3.1.3 is a maintenance release containing stability fixes. This
release is based on the branch-3.1 maintenance branch of Spark. We strongly
recommend that all 3.1 users upgrade to this stable release.

To download Spark 3.1.3, head over to the download page:
https://spark.apache.org/downloads.html

To view the release notes:
https://spark.apache.org/releases/spark-release-3-1-3.html

We would like to acknowledge all community members for contributing to this
release. This release would not have been possible without you.

*New Dockerhub magic in this release:*

We've also started publishing docker containers to the Apache Dockerhub;
these contain non-ASF artifacts that are subject to different license terms
than the Spark release. The docker containers are built for Linux x86 and
ARM64, since that's what I have access to (thanks to NV for the ARM64
machines).

You can get them from https://hub.docker.com/apache/spark (and spark-r and
spark-py) :)
(And version 3.2.1 is also now published on Dockerhub).

Holden

-- 
Twitter: https://twitter.com/holdenkarau
Books (Learning Spark, High Performance Spark, etc.):
https://amzn.to/2MaRAG9  
YouTube Live Streams: https://www.youtube.com/user/holdenkarau