andreaspeters commented on a change in pull request #10977:
URL: https://github.com/apache/airflow/pull/10977#discussion_r491816264



##########
File path: docs/howto/use-mesos.rst
##########
@@ -0,0 +1,84 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+
+
+Scaling Out with Mesos (community contributed)
+==============================================
+
+There are two ways you can run airflow as a mesos framework:
+
+1. Running airflow tasks directly on mesos slaves, requiring each mesos slave to have airflow installed and configured.
+2. Running airflow tasks inside a docker container that has airflow installed, which is run on a mesos slave.
+
+Tasks executed directly on mesos slaves
+---------------------------------------
+
+:class:`airflow.contrib.executors.mesos_executor.MesosExecutor` allows you to schedule airflow tasks on a Mesos cluster.
+For this to work, you need a running mesos cluster and you must perform the following
+steps -
+
+1. Install airflow on a mesos slave where web server and scheduler will run,
+   let's refer to this as the "Airflow server".
+2. On the Airflow server, install mesos python eggs from `mesos downloads <http://open.mesosphere.com/downloads/mesos/>`_.
+3. On the Airflow server, use a database (such as mysql) which can be accessed from all mesos
+   slaves and add configuration in ``airflow.cfg``.
+4. Change your ``airflow.cfg`` to point executor parameter to
+   ``MesosExecutor`` and provide related Mesos settings.
+5. On all mesos slaves, install airflow. Copy the ``airflow.cfg`` from
+   Airflow server (so that it uses same sql alchemy connection).
+6. On all mesos slaves, run the following for serving logs:
+
+.. code-block:: bash
+
+    airflow serve_logs

Review comment:
       I want to update the howto docs a little later; for now I have just re-included the older versions. If that's not OK, we can remove them and I can add them back in a few weeks. What do you prefer?

##########
File path: docs/howto/use-mesos.rst
##########
@@ -0,0 +1,84 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+
+
+Scaling Out with Mesos (community contributed)
+==============================================
+
+There are two ways you can run airflow as a mesos framework:
+
+1. Running airflow tasks directly on mesos slaves, requiring each mesos slave to have airflow installed and configured.
+2. Running airflow tasks inside a docker container that has airflow installed, which is run on a mesos slave.
+
+Tasks executed directly on mesos slaves
+---------------------------------------
+
+:class:`airflow.contrib.executors.mesos_executor.MesosExecutor` allows you to schedule airflow tasks on a Mesos cluster.
+For this to work, you need a running mesos cluster and you must perform the following
+steps -
+
+1. Install airflow on a mesos slave where web server and scheduler will run,
+   let's refer to this as the "Airflow server".
+2. On the Airflow server, install mesos python eggs from `mesos downloads <http://open.mesosphere.com/downloads/mesos/>`_.
+3. On the Airflow server, use a database (such as mysql) which can be accessed from all mesos
+   slaves and add configuration in ``airflow.cfg``.
+4. Change your ``airflow.cfg`` to point executor parameter to
+   ``MesosExecutor`` and provide related Mesos settings.
+5. On all mesos slaves, install airflow. Copy the ``airflow.cfg`` from
+   Airflow server (so that it uses same sql alchemy connection).
+6. On all mesos slaves, run the following for serving logs:
+
+.. code-block:: bash
+
+    airflow serve_logs
+
+7. On Airflow server, to start processing/scheduling DAGs on mesos, run:
+
+.. code-block:: bash
+
+    airflow scheduler -p
+
+Note: We need -p parameter to pickle the DAGs.

Review comment:
       I don't need pickle. :-) 
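
       For context, steps 3-5 in the quoted doc hinge on ``airflow.cfg`` edits. A rough sketch of what that configuration might look like is below; the section and option names follow the old contrib ``MesosExecutor`` defaults, and all hostnames and credentials are placeholders, not values from this PR:

       .. code-block:: ini

           ; Sketch only -- option names follow the old contrib MesosExecutor
           ; defaults; hostnames and credentials are placeholders.
           [core]
           executor = MesosExecutor
           ; database reachable from every mesos slave (step 3)
           sql_alchemy_conn = mysql://airflow:[email protected]/airflow

           [mesos]
           ; mesos master the executor framework registers with
           master = mesos-master.example.com:5050
           framework_name = Airflow
           task_cpu = 1
           task_memory = 256
           checkpoint = False
           authenticate = False

       The same file would then be copied to every slave (step 5) so all workers share one sql alchemy connection.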




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]

