This is an automated email from the ASF dual-hosted git repository.
yao pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/spark-website.git
The following commit(s) were added to refs/heads/asf-site by this push:
new d3fcd0f907 docs: update kubeflow spark operator URL (#553)
d3fcd0f907 is described below
commit d3fcd0f907f8b9b8ab52d3f409b3e984052179dd
Author: Zev Isert <[email protected]>
AuthorDate: Tue Sep 10 19:25:09 2024 -0700
docs: update kubeflow spark operator URL (#553)
* docs: update kubeflow spark operator URL
The GoogleCloudPlatform spark-on-k8s-operator repository was rehomed to the
Kubeflow community.
* build: bundle exec jekyll build
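The rebuild step named above can be reproduced locally; a minimal sketch, assuming Ruby and Bundler are installed and the commands are run from the root of a spark-website checkout:

```shell
# Install the gems pinned by the repository's Gemfile (Jekyll among them),
# then regenerate the static HTML under site/.
bundle install
bundle exec jekyll build
```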
---
site/third-party-projects.html | 2 +-
third-party-projects.md | 2 +-
2 files changed, 2 insertions(+), 2 deletions(-)
diff --git a/site/third-party-projects.html b/site/third-party-projects.html
index 626227d444..cbb07d2506 100644
--- a/site/third-party-projects.html
+++ b/site/third-party-projects.html
@@ -205,7 +205,7 @@ storage system that supports running Spark</li>
 <li><a href="https://github.com/filodb/FiloDB">FiloDB</a> - a Spark integrated analytical/columnar
 database, with in-memory option capable of sub-second concurrent queries</li>
 <li><a href="http://zeppelin-project.org/">Zeppelin</a> - Multi-purpose notebook which supports 20+ language backends, including Apache Spark</li>
-<li><a href="https://github.com/GoogleCloudPlatform/spark-on-k8s-operator">K8S Operator for Apache Spark</a> - Kubernetes operator for specifying and managing the lifecycle of Apache Spark applications on Kubernetes.</li>
+<li><a href="https://github.com/kubeflow/spark-operator">Kubeflow Spark Operator</a> - Kubernetes operator for managing the lifecycle of Apache Spark applications on Kubernetes.</li>
 <li><a href="https://developer.ibm.com/storage/products/ibm-spectrum-conductor-spark/">IBM Spectrum Conductor</a> - Cluster management software that integrates with Spark and modern computing frameworks.</li>
 <li><a href="https://mlflow.org">MLflow</a> - Open source platform to manage the machine learning lifecycle, including deploying models from diverse machine learning libraries on Apache Spark.</li>
 <li><a href="https://datafu.apache.org/docs/spark/getting-started.html">Apache DataFu</a> - A collection of utils and user-defined-functions for working with large scale data in Apache Spark, as well as making Scala-Python interoperability easier.</li>
diff --git a/third-party-projects.md b/third-party-projects.md
index ed7e7b3353..e83ff1eadf 100644
--- a/third-party-projects.md
+++ b/third-party-projects.md
@@ -52,7 +52,7 @@ storage system that supports running Spark
 - <a href="https://github.com/filodb/FiloDB">FiloDB</a> - a Spark integrated analytical/columnar
 database, with in-memory option capable of sub-second concurrent queries
 - <a href="http://zeppelin-project.org/">Zeppelin</a> - Multi-purpose notebook which supports 20+ language backends, including Apache Spark
-- <a href="https://github.com/GoogleCloudPlatform/spark-on-k8s-operator">K8S Operator for Apache Spark</a> - Kubernetes operator for specifying and managing the lifecycle of Apache Spark applications on Kubernetes.
+- <a href="https://github.com/kubeflow/spark-operator">Kubeflow Spark Operator</a> - Kubernetes operator for managing the lifecycle of Apache Spark applications on Kubernetes.
 - <a href="https://developer.ibm.com/storage/products/ibm-spectrum-conductor-spark/">IBM Spectrum Conductor</a> - Cluster management software that integrates with Spark and modern computing frameworks.
 - <a href="https://mlflow.org">MLflow</a> - Open source platform to manage the machine learning lifecycle, including deploying models from diverse machine learning libraries on Apache Spark.
 - <a href="https://datafu.apache.org/docs/spark/getting-started.html">Apache DataFu</a> - A collection of utils and user-defined-functions for working with large scale data in Apache Spark, as well as making Scala-Python interoperability easier.
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]