Hi Bo,

I am interested in contributing, but I don't have free access to any cloud 
provider, and I am not sure how to get it. As far as I know, Google, AWS, and 
Azure only provide temporary free access, which may not be sufficient.

Guidance is appreciated.

Sarath 

Sent from my iPhone

> On Feb 23, 2022, at 2:01 AM, bo yang <bobyan...@gmail.com> wrote:
> 
> 
> Right, normally people start with a simple script, then add more to it, like 
> permissions and more components. After some time, they want to run the 
> script consistently in different environments, and things become complex.
> 
> That is why we want to see whether people are interested in such a "one 
> click" tool to make things easy.
> 
> 
>> On Tue, Feb 22, 2022 at 11:31 PM Mich Talebzadeh <mich.talebza...@gmail.com> 
>> wrote:
>> Hi,
>> 
>> There are two distinct actions here, namely Deploy and Run.
>> 
>> Deployment can be done by a command line script with autoscaling. In newer 
>> versions of Kubernetes you don't even need to specify the node types; you 
>> can leave it to the Kubernetes cluster to scale up and down and decide on 
>> the node type.
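>> 
>> As a rough sketch (the cluster name, region, and node counts below are just 
>> assumptions, not from this thread), an EKS cluster with an autoscaling-ready 
>> managed node group could be created with something like:
>> 
>>   # min/max only set the range; a cluster autoscaler still has to be
>>   # installed to actually scale the group up and down
>>   eksctl create cluster --name spark-demo --region us-west-2 \
>>     --nodegroup-name spark-workers --managed \
>>     --nodes 2 --nodes-min 1 --nodes-max 10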
>> 
>> The second point is running the Spark job that you will need to submit. That 
>> depends on setting up access permissions, using service accounts, and pulling 
>> the correct Docker images for the driver and the executors. Those details add 
>> to the complexity.
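>> 
>> A minimal sketch of such a submission (the API server address, namespace, 
>> service account, and image are placeholders, not a recommended setup):
>> 
>>   spark-submit \
>>     --master k8s://https://<k8s-api-server>:6443 \
>>     --deploy-mode cluster \
>>     --name spark-pi \
>>     --class org.apache.spark.examples.SparkPi \
>>     --conf spark.kubernetes.namespace=spark \
>>     --conf spark.kubernetes.authenticate.driver.serviceAccountName=spark \
>>     --conf spark.kubernetes.container.image=<registry>/spark:3.2.1 \
>>     local:///opt/spark/examples/jars/spark-examples_2.12-3.2.1.jar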
>> 
>> Thanks
>> 
>> 
>>  https://en.everybodywiki.com/Mich_Talebzadeh
>> 
>>> On Wed, 23 Feb 2022 at 04:06, bo yang <bobyan...@gmail.com> wrote:
>>> Hi Spark Community,
>>> 
>>> We built an open source tool to deploy and run Spark on Kubernetes with a 
>>> one-click command. For example, on AWS, it can automatically create an 
>>> EKS cluster, node group, NGINX ingress, and the Spark Operator. Then you 
>>> will be able to use curl or a CLI tool to submit a Spark application. After 
>>> the deployment, you can also install the Uber Remote Shuffle Service to 
>>> enable Dynamic Allocation on Kubernetes.
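>>> 
>>> For instance, once the Spark Operator is running, a job could be submitted 
>>> by applying a SparkApplication manifest such as the sketch below (the names, 
>>> namespace, and image are assumptions; the tool may instead expose this 
>>> through curl or its own CLI):
>>> 
>>>   # spark-pi.yaml -- apply with: kubectl apply -f spark-pi.yaml
>>>   apiVersion: sparkoperator.k8s.io/v1beta2
>>>   kind: SparkApplication
>>>   metadata:
>>>     name: spark-pi
>>>     namespace: spark
>>>   spec:
>>>     type: Scala
>>>     mode: cluster
>>>     image: <registry>/spark:3.2.1
>>>     mainClass: org.apache.spark.examples.SparkPi
>>>     mainApplicationFile: local:///opt/spark/examples/jars/spark-examples_2.12-3.2.1.jar
>>>     sparkVersion: "3.2.1"
>>>     driver:
>>>       cores: 1
>>>       memory: 1g
>>>       serviceAccount: spark
>>>     executor:
>>>       cores: 1
>>>       memory: 1g
>>>       instances: 2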
>>> 
>>> Anyone interested in using or working together on such a tool?
>>> 
>>> Thanks,
>>> Bo
>>> 
