[jira] [Comment Edited] (AIRFLOW-952) Cannot save an empty extra field via the connections UI

2018-04-12 Thread sam sen (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-952?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16436795#comment-16436795
 ] 

sam sen edited comment on AIRFLOW-952 at 4/13/18 4:38 AM:
--

We ran into this and I believe this PR should fix it.

 

[GitHub Pull Request|https://github.com/apache/incubator-airflow/pull/3222] 

 


was (Author: ssen-ichain):
We ran into this and I believe this PR should fix it.

 

[PR|https://github.com/apache/incubator-airflow/pull/3222] 

 

> Cannot save an empty extra field via the connections UI 
> 
>
> Key: AIRFLOW-952
> URL: https://issues.apache.org/jira/browse/AIRFLOW-952
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: models, ui
>Reporter: Vijay Bhat
>Assignee: Vijay Bhat
>Priority: Minor
>
> Once you fill out the extra field parameter in the connections web UI, you 
> cannot clear it out. 
> Steps to reproduce:
> - open the default mysql connection via the web UI
> - enter a JSON string for the extra field and save
> - go back to editing the mysql connection and clear out the extra field.
> - hit save, and go back to the mysql connection edit UI, and the JSON string 
> will still be there.
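For context, this symptom typically comes from a truthiness check that treats an empty string as "nothing to save". The sketch below is a hypothetical illustration of that failure mode, not the actual patch in the pull request above:

{code:python}
# Hypothetical sketch: '' is falsy in Python, so a guard like this
# silently keeps the old value when the user clears the field.
def save_extra(conn, form_extra):
    if form_extra:                # buggy: '' never passes this check
        conn.extra = form_extra

def save_extra_fixed(conn, form_extra):
    if form_extra is not None:    # treat '' as a deliberate "clear"
        conn.extra = form_extra
{code}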



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-952) Cannot save an empty extra field via the connections UI

2018-04-12 Thread sam sen (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-952?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16436795#comment-16436795
 ] 

sam sen commented on AIRFLOW-952:
-

We ran into this and I believe this PR should fix it.

 

[PR|https://github.com/apache/incubator-airflow/pull/3222] 

 

> Cannot save an empty extra field via the connections UI 
> 
>
> Key: AIRFLOW-952
> URL: https://issues.apache.org/jira/browse/AIRFLOW-952
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: models, ui
>Reporter: Vijay Bhat
>Assignee: Vijay Bhat
>Priority: Minor
>
> Once you fill out the extra field parameter in the connections web UI, you 
> cannot clear it out. 
> Steps to reproduce:
> - open the default mysql connection via the web UI
> - enter a JSON string for the extra field and save
> - go back to editing the mysql connection and clear out the extra field.
> - hit save, and go back to the mysql connection edit UI, and the JSON string 
> will still be there.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Comment Edited] (AIRFLOW-952) Cannot save an empty extra field via the connections UI

2018-04-12 Thread sam sen (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-952?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16436795#comment-16436795
 ] 

sam sen edited comment on AIRFLOW-952 at 4/13/18 4:38 AM:
--

We ran into this and I believe this PR should fix it.

[GitHub Pull Request|https://github.com/apache/incubator-airflow/pull/3222] 

 


was (Author: ssen-ichain):
We ran into this and I believe this PR should fix it.

 

[GitHub Pull Request|https://github.com/apache/incubator-airflow/pull/3222] 

 

> Cannot save an empty extra field via the connections UI 
> 
>
> Key: AIRFLOW-952
> URL: https://issues.apache.org/jira/browse/AIRFLOW-952
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: models, ui
>Reporter: Vijay Bhat
>Assignee: Vijay Bhat
>Priority: Minor
>
> Once you fill out the extra field parameter in the connections web UI, you 
> cannot clear it out. 
> Steps to reproduce:
> - open the default mysql connection via the web UI
> - enter a JSON string for the extra field and save
> - go back to editing the mysql connection and clear out the extra field.
> - hit save, and go back to the mysql connection edit UI, and the JSON string 
> will still be there.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Reopened] (AIRFLOW-2122) SSHOperator throws an error

2018-02-20 Thread sam sen (JIRA)

 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2122?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

sam sen reopened AIRFLOW-2122:
--

> SSHOperator throws an error
> ---
>
> Key: AIRFLOW-2122
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2122
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: sam sen
>Priority: Major
>
> Here's my code: 
> {code:java}
> dag = DAG('transfer_ftp_s3', 
> default_args=default_args,schedule_interval=None)
> task = SSHOperator(ssh_conn_id='ssh_node', 
>                    task_id="check_ftp_for_new_files", 
>                    command="echo 'hello world'", 
>                    do_xcom_push=True, dag=dag,)
> {code}
>  
> Here's the error
> {code:java}
> [2018-02-19 06:48:02,691] {{base_task_runner.py:98}} INFO - Subtask: 
> Traceback (most recent call last):
> [2018-02-19 06:48:02,691] {{base_task_runner.py:98}} INFO - Subtask:   File 
> "/usr/bin/airflow", line 27, in 
> [2018-02-19 06:48:02,692] {{base_task_runner.py:98}} INFO - Subtask: 
> args.func(args)
> [2018-02-19 06:48:02,693] {{base_task_runner.py:98}} INFO - Subtask:   File 
> "/usr/lib/python2.7/site-packages/airflow/bin/cli.py", line 392, in run
> [2018-02-19 06:48:02,695] {{base_task_runner.py:98}} INFO - Subtask: 
> pool=args.pool,
> [2018-02-19 06:48:02,695] {{base_task_runner.py:98}} INFO - Subtask:   File 
> "/usr/lib/python2.7/site-packages/airflow/utils/db.py", line 50, in wrapper
> [2018-02-19 06:48:02,696] {{base_task_runner.py:98}} INFO - Subtask: 
> result = func(*args, **kwargs)
> [2018-02-19 06:48:02,696] {{base_task_runner.py:98}} INFO - Subtask:   File 
> "/usr/lib/python2.7/site-packages/airflow/models.py", line 1496, in 
> _run_raw_task
> [2018-02-19 06:48:02,696] {{base_task_runner.py:98}} INFO - Subtask: 
> result = task_copy.execute(context=context)
> [2018-02-19 06:48:02,697] {{base_task_runner.py:98}} INFO - Subtask:   File 
> "/usr/lib/python2.7/site-packages/airflow/contrib/operators/ssh_operator.py", 
> line 146, in execute
> [2018-02-19 06:48:02,697] {{base_task_runner.py:98}} INFO - Subtask: 
> raise AirflowException("SSH operator error: {0}".format(str(e)))
> [2018-02-19 06:48:02,698] {{base_task_runner.py:98}} INFO - Subtask: 
> airflow.exceptions.AirflowException: SSH operator error: 'bool' object has no 
> attribute 'lower'
> {code}
>  
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-2122) SSHOperator throws an error

2018-02-19 Thread sam sen (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-2122?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16369146#comment-16369146
 ] 

sam sen commented on AIRFLOW-2122:
--

That did it! Thanks.
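(The suggestion being acknowledged here is not quoted in this excerpt. The commonly reported resolution for this traceback is to declare `enable_xcom_pickling` explicitly in airflow.cfg, so the operator's `do_xcom_push` branch reads a config string it can call `.lower()` on rather than a bare boolean default. Treat the snippet below as that hedged workaround, not a verified quote from the thread:)

{code:none}
# airflow.cfg (workaround sketch)
[core]
enable_xcom_pickling = True
{code}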

> SSHOperator throws an error
> ---
>
> Key: AIRFLOW-2122
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2122
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: sam sen
>Priority: Major
>
> Here's my code: 
> {code:java}
> dag = DAG('transfer_ftp_s3', 
> default_args=default_args,schedule_interval=None)
> task = SSHOperator(ssh_conn_id='ssh_node', 
>                    task_id="check_ftp_for_new_files", 
>                    command="echo 'hello world'", 
>                    do_xcom_push=True, dag=dag,)
> {code}
>  
> Here's the error
> {code:java}
> [2018-02-19 06:48:02,691] {{base_task_runner.py:98}} INFO - Subtask: 
> Traceback (most recent call last):
> [2018-02-19 06:48:02,691] {{base_task_runner.py:98}} INFO - Subtask:   File 
> "/usr/bin/airflow", line 27, in 
> [2018-02-19 06:48:02,692] {{base_task_runner.py:98}} INFO - Subtask: 
> args.func(args)
> [2018-02-19 06:48:02,693] {{base_task_runner.py:98}} INFO - Subtask:   File 
> "/usr/lib/python2.7/site-packages/airflow/bin/cli.py", line 392, in run
> [2018-02-19 06:48:02,695] {{base_task_runner.py:98}} INFO - Subtask: 
> pool=args.pool,
> [2018-02-19 06:48:02,695] {{base_task_runner.py:98}} INFO - Subtask:   File 
> "/usr/lib/python2.7/site-packages/airflow/utils/db.py", line 50, in wrapper
> [2018-02-19 06:48:02,696] {{base_task_runner.py:98}} INFO - Subtask: 
> result = func(*args, **kwargs)
> [2018-02-19 06:48:02,696] {{base_task_runner.py:98}} INFO - Subtask:   File 
> "/usr/lib/python2.7/site-packages/airflow/models.py", line 1496, in 
> _run_raw_task
> [2018-02-19 06:48:02,696] {{base_task_runner.py:98}} INFO - Subtask: 
> result = task_copy.execute(context=context)
> [2018-02-19 06:48:02,697] {{base_task_runner.py:98}} INFO - Subtask:   File 
> "/usr/lib/python2.7/site-packages/airflow/contrib/operators/ssh_operator.py", 
> line 146, in execute
> [2018-02-19 06:48:02,697] {{base_task_runner.py:98}} INFO - Subtask: 
> raise AirflowException("SSH operator error: {0}".format(str(e)))
> [2018-02-19 06:48:02,698] {{base_task_runner.py:98}} INFO - Subtask: 
> airflow.exceptions.AirflowException: SSH operator error: 'bool' object has no 
> attribute 'lower'
> {code}
>  
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Closed] (AIRFLOW-2122) SSHOperator throws an error

2018-02-19 Thread sam sen (JIRA)

 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2122?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

sam sen closed AIRFLOW-2122.

Resolution: Not A Bug

> SSHOperator throws an error
> ---
>
> Key: AIRFLOW-2122
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2122
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: sam sen
>Priority: Major
>
> Here's my code: 
> {code:java}
> dag = DAG('transfer_ftp_s3', 
> default_args=default_args,schedule_interval=None)
> task = SSHOperator(ssh_conn_id='ssh_node', 
>                    task_id="check_ftp_for_new_files", 
>                    command="echo 'hello world'", 
>                    do_xcom_push=True, dag=dag,)
> {code}
>  
> Here's the error
> {code:java}
> [2018-02-19 06:48:02,691] {{base_task_runner.py:98}} INFO - Subtask: 
> Traceback (most recent call last):
> [2018-02-19 06:48:02,691] {{base_task_runner.py:98}} INFO - Subtask:   File 
> "/usr/bin/airflow", line 27, in 
> [2018-02-19 06:48:02,692] {{base_task_runner.py:98}} INFO - Subtask: 
> args.func(args)
> [2018-02-19 06:48:02,693] {{base_task_runner.py:98}} INFO - Subtask:   File 
> "/usr/lib/python2.7/site-packages/airflow/bin/cli.py", line 392, in run
> [2018-02-19 06:48:02,695] {{base_task_runner.py:98}} INFO - Subtask: 
> pool=args.pool,
> [2018-02-19 06:48:02,695] {{base_task_runner.py:98}} INFO - Subtask:   File 
> "/usr/lib/python2.7/site-packages/airflow/utils/db.py", line 50, in wrapper
> [2018-02-19 06:48:02,696] {{base_task_runner.py:98}} INFO - Subtask: 
> result = func(*args, **kwargs)
> [2018-02-19 06:48:02,696] {{base_task_runner.py:98}} INFO - Subtask:   File 
> "/usr/lib/python2.7/site-packages/airflow/models.py", line 1496, in 
> _run_raw_task
> [2018-02-19 06:48:02,696] {{base_task_runner.py:98}} INFO - Subtask: 
> result = task_copy.execute(context=context)
> [2018-02-19 06:48:02,697] {{base_task_runner.py:98}} INFO - Subtask:   File 
> "/usr/lib/python2.7/site-packages/airflow/contrib/operators/ssh_operator.py", 
> line 146, in execute
> [2018-02-19 06:48:02,697] {{base_task_runner.py:98}} INFO - Subtask: 
> raise AirflowException("SSH operator error: {0}".format(str(e)))
> [2018-02-19 06:48:02,698] {{base_task_runner.py:98}} INFO - Subtask: 
> airflow.exceptions.AirflowException: SSH operator error: 'bool' object has no 
> attribute 'lower'
> {code}
>  
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (AIRFLOW-2122) SSHOperator throws an error

2018-02-18 Thread sam sen (JIRA)

 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2122?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

sam sen updated AIRFLOW-2122:
-
Description: 
Here's my code: 
{code:java}
dag = DAG('transfer_ftp_s3', default_args=default_args,schedule_interval=None) 
task = SSHOperator(ssh_conn_id='ssh_node', 
                   task_id="check_ftp_for_new_files", 
                   command="echo 'hello world'", 
                   do_xcom_push=True, dag=dag,)
{code}
 

Here's the error
{code:java}
[2018-02-19 06:48:02,691] {{base_task_runner.py:98}} INFO - Subtask: Traceback 
(most recent call last):
[2018-02-19 06:48:02,691] {{base_task_runner.py:98}} INFO - Subtask:   File 
"/usr/bin/airflow", line 27, in 
[2018-02-19 06:48:02,692] {{base_task_runner.py:98}} INFO - Subtask: 
args.func(args)
[2018-02-19 06:48:02,693] {{base_task_runner.py:98}} INFO - Subtask:   File 
"/usr/lib/python2.7/site-packages/airflow/bin/cli.py", line 392, in run
[2018-02-19 06:48:02,695] {{base_task_runner.py:98}} INFO - Subtask: 
pool=args.pool,
[2018-02-19 06:48:02,695] {{base_task_runner.py:98}} INFO - Subtask:   File 
"/usr/lib/python2.7/site-packages/airflow/utils/db.py", line 50, in wrapper
[2018-02-19 06:48:02,696] {{base_task_runner.py:98}} INFO - Subtask: result 
= func(*args, **kwargs)
[2018-02-19 06:48:02,696] {{base_task_runner.py:98}} INFO - Subtask:   File 
"/usr/lib/python2.7/site-packages/airflow/models.py", line 1496, in 
_run_raw_task
[2018-02-19 06:48:02,696] {{base_task_runner.py:98}} INFO - Subtask: result 
= task_copy.execute(context=context)
[2018-02-19 06:48:02,697] {{base_task_runner.py:98}} INFO - Subtask:   File 
"/usr/lib/python2.7/site-packages/airflow/contrib/operators/ssh_operator.py", 
line 146, in execute
[2018-02-19 06:48:02,697] {{base_task_runner.py:98}} INFO - Subtask: raise 
AirflowException("SSH operator error: {0}".format(str(e)))
[2018-02-19 06:48:02,698] {{base_task_runner.py:98}} INFO - Subtask: 
airflow.exceptions.AirflowException: SSH operator error: 'bool' object has no 
attribute 'lower'
{code}
 

 

  was:
Here's my code:

 

 

 
{code:java}
dag = DAG('transfer_ftp_s3', default_args=default_args,schedule_interval=None) 
task = SSHOperator(ssh_conn_id='ssh_node', 
                   task_id="check_ftp_for_new_files", 
                   command="echo 'hello world'", 
                   do_xcom_push=True, dag=dag,)
{code}
 

 

Here's the error
{code:java}
[2018-02-19 06:48:02,691] {{base_task_runner.py:98}} INFO - Subtask: Traceback 
(most recent call last):
[2018-02-19 06:48:02,691] {{base_task_runner.py:98}} INFO - Subtask:   File 
"/usr/bin/airflow", line 27, in 
[2018-02-19 06:48:02,692] {{base_task_runner.py:98}} INFO - Subtask: 
args.func(args)
[2018-02-19 06:48:02,693] {{base_task_runner.py:98}} INFO - Subtask:   File 
"/usr/lib/python2.7/site-packages/airflow/bin/cli.py", line 392, in run
[2018-02-19 06:48:02,695] {{base_task_runner.py:98}} INFO - Subtask: 
pool=args.pool,
[2018-02-19 06:48:02,695] {{base_task_runner.py:98}} INFO - Subtask:   File 
"/usr/lib/python2.7/site-packages/airflow/utils/db.py", line 50, in wrapper
[2018-02-19 06:48:02,696] {{base_task_runner.py:98}} INFO - Subtask: result 
= func(*args, **kwargs)
[2018-02-19 06:48:02,696] {{base_task_runner.py:98}} INFO - Subtask:   File 
"/usr/lib/python2.7/site-packages/airflow/models.py", line 1496, in 
_run_raw_task
[2018-02-19 06:48:02,696] {{base_task_runner.py:98}} INFO - Subtask: result 
= task_copy.execute(context=context)
[2018-02-19 06:48:02,697] {{base_task_runner.py:98}} INFO - Subtask:   File 
"/usr/lib/python2.7/site-packages/airflow/contrib/operators/ssh_operator.py", 
line 146, in execute
[2018-02-19 06:48:02,697] {{base_task_runner.py:98}} INFO - Subtask: raise 
AirflowException("SSH operator error: {0}".format(str(e)))
[2018-02-19 06:48:02,698] {{base_task_runner.py:98}} INFO - Subtask: 
airflow.exceptions.AirflowException: SSH operator error: 'bool' object has no 
attribute 'lower'
{code}
 

 


> SSHOperator throws an error
> ---
>
> Key: AIRFLOW-2122
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2122
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: sam sen
>Priority: Major
>
> Here's my code: 
> {code:java}
> dag = DAG('transfer_ftp_s3', 
> default_args=default_args,schedule_interval=None)
> task = SSHOperator(ssh_conn_id='ssh_node', 
>                    task_id="check_ftp_for_new_files", 
>                    command="echo 'hello world'", 
>                    do_xcom_push=True, dag=dag,)
> {code}
>  
> Here's the error
> {code:java}
> [2018-02-19 06:48:02,691] {{base_task_runner.py:98}} INFO - Subtask: 
> Traceback (most recent call last):
> [2018-02-19 06:48:02,691] {{base_task_runner.py:98}} INFO - Subtask:   File 
> "/usr/bin/airflow", line 27, in 
> [2018-02-19 06:48:02,692] 

[jira] [Updated] (AIRFLOW-2122) SSHOperator throws an error

2018-02-18 Thread sam sen (JIRA)

 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2122?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

sam sen updated AIRFLOW-2122:
-
Description: 
Here's my code:

 

 

 
{code:java}
dag = DAG('transfer_ftp_s3', default_args=default_args,schedule_interval=None) 
task = SSHOperator(ssh_conn_id='ssh_node', 
                   task_id="check_ftp_for_new_files", 
                   command="echo 'hello world'", 
                   do_xcom_push=True, dag=dag,)
{code}
 

 

Here's the error
{code:java}
[2018-02-19 06:48:02,691] {{base_task_runner.py:98}} INFO - Subtask: Traceback 
(most recent call last):
[2018-02-19 06:48:02,691] {{base_task_runner.py:98}} INFO - Subtask:   File 
"/usr/bin/airflow", line 27, in 
[2018-02-19 06:48:02,692] {{base_task_runner.py:98}} INFO - Subtask: 
args.func(args)
[2018-02-19 06:48:02,693] {{base_task_runner.py:98}} INFO - Subtask:   File 
"/usr/lib/python2.7/site-packages/airflow/bin/cli.py", line 392, in run
[2018-02-19 06:48:02,695] {{base_task_runner.py:98}} INFO - Subtask: 
pool=args.pool,
[2018-02-19 06:48:02,695] {{base_task_runner.py:98}} INFO - Subtask:   File 
"/usr/lib/python2.7/site-packages/airflow/utils/db.py", line 50, in wrapper
[2018-02-19 06:48:02,696] {{base_task_runner.py:98}} INFO - Subtask: result 
= func(*args, **kwargs)
[2018-02-19 06:48:02,696] {{base_task_runner.py:98}} INFO - Subtask:   File 
"/usr/lib/python2.7/site-packages/airflow/models.py", line 1496, in 
_run_raw_task
[2018-02-19 06:48:02,696] {{base_task_runner.py:98}} INFO - Subtask: result 
= task_copy.execute(context=context)
[2018-02-19 06:48:02,697] {{base_task_runner.py:98}} INFO - Subtask:   File 
"/usr/lib/python2.7/site-packages/airflow/contrib/operators/ssh_operator.py", 
line 146, in execute
[2018-02-19 06:48:02,697] {{base_task_runner.py:98}} INFO - Subtask: raise 
AirflowException("SSH operator error: {0}".format(str(e)))
[2018-02-19 06:48:02,698] {{base_task_runner.py:98}} INFO - Subtask: 
airflow.exceptions.AirflowException: SSH operator error: 'bool' object has no 
attribute 'lower'
{code}
 

 

  was:
Here's my code:

 

 

 
{code:java}
dag = DAG('transfer_ftp_s3', default_args=default_args,schedule_interval=None) 
task = SSHOperator(ssh_conn_id='ssh_node', 
                                  task_id="check_ftp_for_new_files", 
                                  command="echo 'hello world'", 
                                  do_xcom_push=True, dag=dag,)
{code}
 

 

Here's the error
{code:java}
[2018-02-19 06:48:02,691] {{base_task_runner.py:98}} INFO - Subtask: Traceback 
(most recent call last):
[2018-02-19 06:48:02,691] {{base_task_runner.py:98}} INFO - Subtask:   File 
"/usr/bin/airflow", line 27, in 
[2018-02-19 06:48:02,692] {{base_task_runner.py:98}} INFO - Subtask: 
args.func(args)
[2018-02-19 06:48:02,693] {{base_task_runner.py:98}} INFO - Subtask:   File 
"/usr/lib/python2.7/site-packages/airflow/bin/cli.py", line 392, in run
[2018-02-19 06:48:02,695] {{base_task_runner.py:98}} INFO - Subtask: 
pool=args.pool,
[2018-02-19 06:48:02,695] {{base_task_runner.py:98}} INFO - Subtask:   File 
"/usr/lib/python2.7/site-packages/airflow/utils/db.py", line 50, in wrapper
[2018-02-19 06:48:02,696] {{base_task_runner.py:98}} INFO - Subtask: result 
= func(*args, **kwargs)
[2018-02-19 06:48:02,696] {{base_task_runner.py:98}} INFO - Subtask:   File 
"/usr/lib/python2.7/site-packages/airflow/models.py", line 1496, in 
_run_raw_task
[2018-02-19 06:48:02,696] {{base_task_runner.py:98}} INFO - Subtask: result 
= task_copy.execute(context=context)
[2018-02-19 06:48:02,697] {{base_task_runner.py:98}} INFO - Subtask:   File 
"/usr/lib/python2.7/site-packages/airflow/contrib/operators/ssh_operator.py", 
line 146, in execute
[2018-02-19 06:48:02,697] {{base_task_runner.py:98}} INFO - Subtask: raise 
AirflowException("SSH operator error: {0}".format(str(e)))
[2018-02-19 06:48:02,698] {{base_task_runner.py:98}} INFO - Subtask: 
airflow.exceptions.AirflowException: SSH operator error: 'bool' object has no 
attribute 'lower'
{code}
 

 


> SSHOperator throws an error
> ---
>
> Key: AIRFLOW-2122
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2122
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: sam sen
>Priority: Major
>
> Here's my code:
>  
>  
>  
> {code:java}
> dag = DAG('transfer_ftp_s3', 
> default_args=default_args,schedule_interval=None)
> task = SSHOperator(ssh_conn_id='ssh_node', 
>                    task_id="check_ftp_for_new_files", 
>                    command="echo 'hello world'", 
>                    do_xcom_push=True, dag=dag,)
> {code}
>  
>  
> Here's the error
> {code:java}
> [2018-02-19 06:48:02,691] {{base_task_runner.py:98}} INFO - Subtask: 
> Traceback (most recent call last):
> [2018-02-19 06:48:02,691] {{base_task_runner.py:98}} INFO - Subtask:   

[jira] [Updated] (AIRFLOW-2122) SSHOperator throws an error

2018-02-18 Thread sam sen (JIRA)

 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2122?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

sam sen updated AIRFLOW-2122:
-
Description: 
Here's my code:

 

 

 
{code:java}
dag = DAG('transfer_ftp_s3', default_args=default_args,schedule_interval=None) 
task = SSHOperator(ssh_conn_id='ssh_node', 
                                  task_id="check_ftp_for_new_files", 
                                  command="echo 'hello world'", 
                                  do_xcom_push=True, dag=dag,)
{code}
 

 

Here's the error
{code:java}
[2018-02-19 06:48:02,691] {{base_task_runner.py:98}} INFO - Subtask: Traceback 
(most recent call last):
[2018-02-19 06:48:02,691] {{base_task_runner.py:98}} INFO - Subtask:   File 
"/usr/bin/airflow", line 27, in 
[2018-02-19 06:48:02,692] {{base_task_runner.py:98}} INFO - Subtask: 
args.func(args)
[2018-02-19 06:48:02,693] {{base_task_runner.py:98}} INFO - Subtask:   File 
"/usr/lib/python2.7/site-packages/airflow/bin/cli.py", line 392, in run
[2018-02-19 06:48:02,695] {{base_task_runner.py:98}} INFO - Subtask: 
pool=args.pool,
[2018-02-19 06:48:02,695] {{base_task_runner.py:98}} INFO - Subtask:   File 
"/usr/lib/python2.7/site-packages/airflow/utils/db.py", line 50, in wrapper
[2018-02-19 06:48:02,696] {{base_task_runner.py:98}} INFO - Subtask: result 
= func(*args, **kwargs)
[2018-02-19 06:48:02,696] {{base_task_runner.py:98}} INFO - Subtask:   File 
"/usr/lib/python2.7/site-packages/airflow/models.py", line 1496, in 
_run_raw_task
[2018-02-19 06:48:02,696] {{base_task_runner.py:98}} INFO - Subtask: result 
= task_copy.execute(context=context)
[2018-02-19 06:48:02,697] {{base_task_runner.py:98}} INFO - Subtask:   File 
"/usr/lib/python2.7/site-packages/airflow/contrib/operators/ssh_operator.py", 
line 146, in execute
[2018-02-19 06:48:02,697] {{base_task_runner.py:98}} INFO - Subtask: raise 
AirflowException("SSH operator error: {0}".format(str(e)))
[2018-02-19 06:48:02,698] {{base_task_runner.py:98}} INFO - Subtask: 
airflow.exceptions.AirflowException: SSH operator error: 'bool' object has no 
attribute 'lower'
{code}
 

 

  was:
Here's my code:

 

 

{{dag = DAG('transfer_ftp_s3', 
default_args=default_args,schedule_interval=None) }}

{{task = SSHOperator(ssh_conn_id='ssh_node',}}
{{   task_id="check_ftp_for_new_files", }}
{{   command="echo 'hello world'", }}
{{   do_xcom_push=True, dag=dag,)}}

 

Here's the error
{code:java}
[2018-02-19 06:48:02,691] {{base_task_runner.py:98}} INFO - Subtask: Traceback 
(most recent call last):
[2018-02-19 06:48:02,691] {{base_task_runner.py:98}} INFO - Subtask:   File 
"/usr/bin/airflow", line 27, in 
[2018-02-19 06:48:02,692] {{base_task_runner.py:98}} INFO - Subtask: 
args.func(args)
[2018-02-19 06:48:02,693] {{base_task_runner.py:98}} INFO - Subtask:   File 
"/usr/lib/python2.7/site-packages/airflow/bin/cli.py", line 392, in run
[2018-02-19 06:48:02,695] {{base_task_runner.py:98}} INFO - Subtask: 
pool=args.pool,
[2018-02-19 06:48:02,695] {{base_task_runner.py:98}} INFO - Subtask:   File 
"/usr/lib/python2.7/site-packages/airflow/utils/db.py", line 50, in wrapper
[2018-02-19 06:48:02,696] {{base_task_runner.py:98}} INFO - Subtask: result 
= func(*args, **kwargs)
[2018-02-19 06:48:02,696] {{base_task_runner.py:98}} INFO - Subtask:   File 
"/usr/lib/python2.7/site-packages/airflow/models.py", line 1496, in 
_run_raw_task
[2018-02-19 06:48:02,696] {{base_task_runner.py:98}} INFO - Subtask: result 
= task_copy.execute(context=context)
[2018-02-19 06:48:02,697] {{base_task_runner.py:98}} INFO - Subtask:   File 
"/usr/lib/python2.7/site-packages/airflow/contrib/operators/ssh_operator.py", 
line 146, in execute
[2018-02-19 06:48:02,697] {{base_task_runner.py:98}} INFO - Subtask: raise 
AirflowException("SSH operator error: {0}".format(str(e)))
[2018-02-19 06:48:02,698] {{base_task_runner.py:98}} INFO - Subtask: 
airflow.exceptions.AirflowException: SSH operator error: 'bool' object has no 
attribute 'lower'
{code}
 

 


> SSHOperator throws an error
> ---
>
> Key: AIRFLOW-2122
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2122
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: sam sen
>Priority: Major
>
> Here's my code:
>  
>  
>  
> {code:java}
> dag = DAG('transfer_ftp_s3', 
> default_args=default_args,schedule_interval=None)
> task = SSHOperator(ssh_conn_id='ssh_node', 
>                                   task_id="check_ftp_for_new_files", 
>                                   command="echo 'hello world'", 
>                                   do_xcom_push=True, dag=dag,)
> {code}
>  
>  
> Here's the error
> {code:java}
> [2018-02-19 06:48:02,691] {{base_task_runner.py:98}} INFO - Subtask: 
> Traceback (most recent call last):
> [2018-02-19 06:48:02,691] 

[jira] [Created] (AIRFLOW-2122) SSHOperator throws an error

2018-02-18 Thread sam sen (JIRA)
sam sen created AIRFLOW-2122:


 Summary: SSHOperator throws an error
 Key: AIRFLOW-2122
 URL: https://issues.apache.org/jira/browse/AIRFLOW-2122
 Project: Apache Airflow
  Issue Type: Bug
Reporter: sam sen


Here's my code:

 

 

{{dag = DAG('transfer_ftp_s3', 
default_args=default_args,schedule_interval=None) }}

{{task = SSHOperator(ssh_conn_id='ssh_node',}}
{{   task_id="check_ftp_for_new_files", }}
{{   command="echo 'hello world'", }}
{{   do_xcom_push=True, dag=dag,)}}

 

Here's the error
{code:java}
[2018-02-19 06:48:02,691] {{base_task_runner.py:98}} INFO - Subtask: Traceback 
(most recent call last):
[2018-02-19 06:48:02,691] {{base_task_runner.py:98}} INFO - Subtask:   File 
"/usr/bin/airflow", line 27, in 
[2018-02-19 06:48:02,692] {{base_task_runner.py:98}} INFO - Subtask: 
args.func(args)
[2018-02-19 06:48:02,693] {{base_task_runner.py:98}} INFO - Subtask:   File 
"/usr/lib/python2.7/site-packages/airflow/bin/cli.py", line 392, in run
[2018-02-19 06:48:02,695] {{base_task_runner.py:98}} INFO - Subtask: 
pool=args.pool,
[2018-02-19 06:48:02,695] {{base_task_runner.py:98}} INFO - Subtask:   File 
"/usr/lib/python2.7/site-packages/airflow/utils/db.py", line 50, in wrapper
[2018-02-19 06:48:02,696] {{base_task_runner.py:98}} INFO - Subtask: result 
= func(*args, **kwargs)
[2018-02-19 06:48:02,696] {{base_task_runner.py:98}} INFO - Subtask:   File 
"/usr/lib/python2.7/site-packages/airflow/models.py", line 1496, in 
_run_raw_task
[2018-02-19 06:48:02,696] {{base_task_runner.py:98}} INFO - Subtask: result 
= task_copy.execute(context=context)
[2018-02-19 06:48:02,697] {{base_task_runner.py:98}} INFO - Subtask:   File 
"/usr/lib/python2.7/site-packages/airflow/contrib/operators/ssh_operator.py", 
line 146, in execute
[2018-02-19 06:48:02,697] {{base_task_runner.py:98}} INFO - Subtask: raise 
AirflowException("SSH operator error: {0}".format(str(e)))
[2018-02-19 06:48:02,698] {{base_task_runner.py:98}} INFO - Subtask: 
airflow.exceptions.AirflowException: SSH operator error: 'bool' object has no 
attribute 'lower'
{code}
 

 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-342) exception in 'airflow scheduler' : Connection reset by peer

2017-11-06 Thread sam sen (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-342?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16240450#comment-16240450
 ] 

sam sen commented on AIRFLOW-342:
-

FYI, I solved my issue by modifying the `celery_result_backend` config option. I 
was following various online tutorials, and they differed in terms of the 
backend datastore. I initially had it set to the same value as `broker_url`. 
Once I pointed it at my MySQL instance, I no longer received the error message.

RabbitMQ - 3.6.1
Airflow - 1.8.2
AMQ - 2.2.2
Celery - 4.1
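
For anyone hitting the same error, the working layout looked roughly like the 
snippet below. Host names and credentials are placeholders; the point is that 
`celery_result_backend` should be a SQLAlchemy-style `db+` URL pointing at a 
real database rather than a copy of `broker_url`.

{code:none}
# airflow.cfg (sketch with placeholder hosts/credentials)
[celery]
broker_url = amqp://guest:guest@rabbitmq-host:5672//
# before: celery_result_backend mirrored broker_url -> connection resets
celery_result_backend = db+mysql://airflow:airflow@mysql-host:3306/airflow
{code}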

>  exception in 'airflow scheduler' : Connection reset by peer
> 
>
> Key: AIRFLOW-342
> URL: https://issues.apache.org/jira/browse/AIRFLOW-342
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: celery, scheduler
>Affects Versions: Airflow 1.7.1.3
> Environment: OS: Red Hat Enterprise Linux Server 7.2 (Maipo)
> Python: 2.7.5
> Airflow: 1.7.1.3
>Reporter: Hila Visan
>Assignee: Hila Visan
>
> 'airflow scheduler' command throws an exception when running it. 
> Despite the exception, the workers run the tasks from the queues as expected.
> Error details:
>  
> [2016-06-30 19:00:10,130] {jobs.py:758} ERROR - [Errno 104] Connection reset 
> by peer
> Traceback (most recent call last):
>   File "/usr/lib/python2.7/site-packages/airflow/jobs.py", line 755, in 
> _execute
> executor.heartbeat()
>   File "/usr/lib/python2.7/site-packages/airflow/executors/base_executor.py", 
> line 107, in heartbeat
> self.sync()
>   File 
> "/usr/lib/python2.7/site-packages/airflow/executors/celery_executor.py", line 
> 74, in sync
> state = async.state
>   File "/usr/lib/python2.7/site-packages/celery/result.py", line 394, in state
> return self._get_task_meta()['status']
>   File "/usr/lib/python2.7/site-packages/celery/result.py", line 339, in 
> _get_task_meta
> return self._maybe_set_cache(self.backend.get_task_meta(self.id))
>   File "/usr/lib/python2.7/site-packages/celery/backends/amqp.py", line 163, 
> in get_task_meta
> binding.declare()
>   File "/usr/lib/python2.7/site-packages/kombu/entity.py", line 521, in 
> declare
> self.exchange.declare(nowait)
>   File "/usr/lib/python2.7/site-packages/kombu/entity.py", line 174, in 
> declare
> nowait=nowait, passive=passive,
>   File "/usr/lib/python2.7/site-packages/amqp/channel.py", line 615, in 
> exchange_declare
> self._send_method((40, 10), args)
>   File "/usr/lib/python2.7/site-packages/amqp/abstract_channel.py", line 56, 
> in _send_method
> self.channel_id, method_sig, args, content,
>   File "/usr/lib/python2.7/site-packages/amqp/method_framing.py", line 221, 
> in write_method
> write_frame(1, channel, payload)
>   File "/usr/lib/python2.7/site-packages/amqp/transport.py", line 182, in 
> write_frame
> frame_type, channel, size, payload, 0xce,
>   File "/usr/lib64/python2.7/socket.py", line 224, in meth
> return getattr(self._sock,name)(*args)
> error: [Errno 104] Connection reset by peer
> [2016-06-30 19:00:10,131] {jobs.py:759} ERROR - Tachycardia!



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[jira] [Commented] (AIRFLOW-342) exception in 'airflow scheduler' : Connection reset by peer

2017-11-06 Thread sam sen (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-342?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16240339#comment-16240339
 ] 

sam sen commented on AIRFLOW-342:
-

Same here. Is there a workaround? We tried Celery 4.x and then downgraded to 
see if that would fix the issue, but it did not.

>  exception in 'airflow scheduler' : Connection reset by peer
> 
>
> Key: AIRFLOW-342
> URL: https://issues.apache.org/jira/browse/AIRFLOW-342
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: celery, scheduler
>Affects Versions: Airflow 1.7.1.3
> Environment: OS: Red Hat Enterprise Linux Server 7.2 (Maipo)
> Python: 2.7.5
> Airflow: 1.7.1.3
>Reporter: Hila Visan
>Assignee: Hila Visan
>
> 'airflow scheduler' command throws an exception when running it. 
> Despite the exception, the workers run the tasks from the queues as expected.
> Error details:
>  
> [2016-06-30 19:00:10,130] {jobs.py:758} ERROR - [Errno 104] Connection reset 
> by peer
> Traceback (most recent call last):
>   File "/usr/lib/python2.7/site-packages/airflow/jobs.py", line 755, in 
> _execute
> executor.heartbeat()
>   File "/usr/lib/python2.7/site-packages/airflow/executors/base_executor.py", 
> line 107, in heartbeat
> self.sync()
>   File 
> "/usr/lib/python2.7/site-packages/airflow/executors/celery_executor.py", line 
> 74, in sync
> state = async.state
>   File "/usr/lib/python2.7/site-packages/celery/result.py", line 394, in state
> return self._get_task_meta()['status']
>   File "/usr/lib/python2.7/site-packages/celery/result.py", line 339, in 
> _get_task_meta
> return self._maybe_set_cache(self.backend.get_task_meta(self.id))
>   File "/usr/lib/python2.7/site-packages/celery/backends/amqp.py", line 163, 
> in get_task_meta
> binding.declare()
>   File "/usr/lib/python2.7/site-packages/kombu/entity.py", line 521, in 
> declare
> self.exchange.declare(nowait)
>   File "/usr/lib/python2.7/site-packages/kombu/entity.py", line 174, in 
> declare
> nowait=nowait, passive=passive,
>   File "/usr/lib/python2.7/site-packages/amqp/channel.py", line 615, in 
> exchange_declare
> self._send_method((40, 10), args)
>   File "/usr/lib/python2.7/site-packages/amqp/abstract_channel.py", line 56, 
> in _send_method
> self.channel_id, method_sig, args, content,
>   File "/usr/lib/python2.7/site-packages/amqp/method_framing.py", line 221, 
> in write_method
> write_frame(1, channel, payload)
>   File "/usr/lib/python2.7/site-packages/amqp/transport.py", line 182, in 
> write_frame
> frame_type, channel, size, payload, 0xce,
>   File "/usr/lib64/python2.7/socket.py", line 224, in meth
> return getattr(self._sock,name)(*args)
> error: [Errno 104] Connection reset by peer
> [2016-06-30 19:00:10,131] {jobs.py:759} ERROR - Tachycardia!



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[jira] [Created] (AIRFLOW-1285) Missing icon new install

2017-06-06 Thread sam sen (JIRA)
sam sen created AIRFLOW-1285:


 Summary: Missing icon new install
 Key: AIRFLOW-1285
 URL: https://issues.apache.org/jira/browse/AIRFLOW-1285
 Project: Apache Airflow
  Issue Type: Bug
Affects Versions: 1.8.1
Reporter: sam sen
 Attachments: Screen Shot 2017-06-06 at 3.30.04 PM.png

A new install from pip is missing the icon to the left of the DAGs, so there is 
no way to disable a DAG.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Updated] (AIRFLOW-1220) Can't clear airflow jobs when using schedule_interval of None

2017-05-17 Thread sam sen (JIRA)

 [ 
https://issues.apache.org/jira/browse/AIRFLOW-1220?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

sam sen updated AIRFLOW-1220:
-
Description: 
My dag is set to use `schedule_interval=None`. I'm attempting to clear jobs so 
I can re-run failed jobs but I'm encountering the following errors. 

{code:none}
[2017-05-17 14:02:14,718] {models.py:3834} WARNING - Could not update dag stats 
for s3_dag_test
[2017-05-17 14:02:14,718] {models.py:3835} ERROR - can't compare 
datetime.datetime to NoneType
Traceback (most recent call last):
  File "/usr/lib/python2.7/site-packages/airflow/models.py", line 3831, in 
set_dirty
session.commit()
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/session.py", line 
874, in commit
self.transaction.commit()
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/session.py", line 
461, in commit
self._prepare_impl()
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/session.py", line 
441, in _prepare_impl
self.session.flush()
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/session.py", line 
2139, in flush
self._flush(objects)
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/session.py", line 
2259, in _flush
transaction.rollback(_capture_exception=True)
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/util/langhelpers.py", 
line 66, in __exit__
compat.reraise(exc_type, exc_value, exc_tb)
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/session.py", line 
2223, in _flush
flush_context.execute()
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/unitofwork.py", line 
389, in execute
rec.execute(self)
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/unitofwork.py", line 
577, in execute
uow
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/persistence.py", line 
243, in delete_obj
uowtransaction))
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/persistence.py", line 
357, in _organize_states_for_delete
states):
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/persistence.py", line 
1108, in _connections_for_states
for state in _sort_states(states):
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/persistence.py", line 
1130, in _sort_states
sorted(persistent, key=lambda q: q.key[1])
TypeError: can't compare datetime.datetime to NoneType
{code}
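
The last frame tells the story: under Python 2, ordering a `datetime` against 
`None` raises exactly this `TypeError`, and `_sort_states` sorts persistent 
objects by a primary-key component that is `None` when a run has no execution 
date. A two-line illustration of the failing comparison:

{code:python}
import datetime

# Python 2 refuses to order a datetime against another type, so this
# raises: TypeError: can't compare datetime.datetime to NoneType --
# the same comparison _sort_states performs when a key contains None.
sorted([datetime.datetime(2017, 5, 17), None])
{code}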

  was:
My dag is set to use `schedule_interval=None`. I'm attempting to clear jobs so 
I can re-run failed jobs but I'm encountering the following errors. 

{code:txt}
[2017-05-17 14:02:14,718] {models.py:3834} WARNING - Could not update dag stats 
for s3_dag_test
[2017-05-17 14:02:14,718] {models.py:3835} ERROR - can't compare 
datetime.datetime to NoneType
Traceback (most recent call last):
  File "/usr/lib/python2.7/site-packages/airflow/models.py", line 3831, in 
set_dirty
session.commit()
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/session.py", line 
874, in commit
self.transaction.commit()
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/session.py", line 
461, in commit
self._prepare_impl()
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/session.py", line 
441, in _prepare_impl
self.session.flush()
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/session.py", line 
2139, in flush
self._flush(objects)
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/session.py", line 
2259, in _flush
transaction.rollback(_capture_exception=True)
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/util/langhelpers.py", 
line 66, in __exit__
compat.reraise(exc_type, exc_value, exc_tb)
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/session.py", line 
2223, in _flush
flush_context.execute()
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/unitofwork.py", line 
389, in execute
rec.execute(self)
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/unitofwork.py", line 
577, in execute
uow
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/persistence.py", line 
243, in delete_obj
uowtransaction))
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/persistence.py", line 
357, in _organize_states_for_delete
states):
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/persistence.py", line 
1108, in _connections_for_states
for state in _sort_states(states):
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/persistence.py", line 
1130, in _sort_states
sorted(persistent, key=lambda q: q.key[1])
TypeError: can't compare datetime.datetime to NoneType
{code}


> Can't clear airflow jobs when using schedule_interval of None
> -
>
> Key: AIRFLOW-1220
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1220
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: sam 

[jira] [Updated] (AIRFLOW-1220) Can't clear airflow jobs when using schedule_interval of None

2017-05-17 Thread sam sen (JIRA)

 [ 
https://issues.apache.org/jira/browse/AIRFLOW-1220?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

sam sen updated AIRFLOW-1220:
-
Description: 
My dag is set to use `schedule_interval=None`. I'm attempting to clear jobs so 
I can re-run failed jobs but I'm encountering the following errors. 

{code:txt}
[2017-05-17 14:02:14,718] {models.py:3834} WARNING - Could not update dag stats 
for s3_dag_test
[2017-05-17 14:02:14,718] {models.py:3835} ERROR - can't compare 
datetime.datetime to NoneType
Traceback (most recent call last):
  File "/usr/lib/python2.7/site-packages/airflow/models.py", line 3831, in 
set_dirty
session.commit()
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/session.py", line 
874, in commit
self.transaction.commit()
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/session.py", line 
461, in commit
self._prepare_impl()
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/session.py", line 
441, in _prepare_impl
self.session.flush()
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/session.py", line 
2139, in flush
self._flush(objects)
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/session.py", line 
2259, in _flush
transaction.rollback(_capture_exception=True)
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/util/langhelpers.py", 
line 66, in __exit__
compat.reraise(exc_type, exc_value, exc_tb)
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/session.py", line 
2223, in _flush
flush_context.execute()
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/unitofwork.py", line 
389, in execute
rec.execute(self)
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/unitofwork.py", line 
577, in execute
uow
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/persistence.py", line 
243, in delete_obj
uowtransaction))
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/persistence.py", line 
357, in _organize_states_for_delete
states):
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/persistence.py", line 
1108, in _connections_for_states
for state in _sort_states(states):
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/persistence.py", line 
1130, in _sort_states
sorted(persistent, key=lambda q: q.key[1])
TypeError: can't compare datetime.datetime to NoneType
{code}

  was:
My dag is set to use `schedule_interval=None`. I'm attempting to clear jobs so 
I can re-run failed jobs but I'm encountering the following errors. 


[2017-05-17 14:02:14,718] {models.py:3834} WARNING - Could not update dag stats 
for s3_dag_test
[2017-05-17 14:02:14,718] {models.py:3835} ERROR - can't compare 
datetime.datetime to NoneType
Traceback (most recent call last):
  File "/usr/lib/python2.7/site-packages/airflow/models.py", line 3831, in 
set_dirty
session.commit()
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/session.py", line 
874, in commit
self.transaction.commit()
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/session.py", line 
461, in commit
self._prepare_impl()
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/session.py", line 
441, in _prepare_impl
self.session.flush()
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/session.py", line 
2139, in flush
self._flush(objects)
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/session.py", line 
2259, in _flush
transaction.rollback(_capture_exception=True)
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/util/langhelpers.py", 
line 66, in __exit__
compat.reraise(exc_type, exc_value, exc_tb)
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/session.py", line 
2223, in _flush
flush_context.execute()
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/unitofwork.py", line 
389, in execute
rec.execute(self)
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/unitofwork.py", line 
577, in execute
uow
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/persistence.py", line 
243, in delete_obj
uowtransaction))
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/persistence.py", line 
357, in _organize_states_for_delete
states):
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/persistence.py", line 
1108, in _connections_for_states
for state in _sort_states(states):
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/persistence.py", line 
1130, in _sort_states
sorted(persistent, key=lambda q: q.key[1])
TypeError: can't compare datetime.datetime to NoneType



> Can't clear airflow jobs when using schedule_interval of None
> -
>
> Key: AIRFLOW-1220
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1220
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: sam sen
>
> My dag 

[jira] [Updated] (AIRFLOW-1220) Can't clear airflow jobs when using schedule_interval of None

2017-05-17 Thread sam sen (JIRA)

 [ 
https://issues.apache.org/jira/browse/AIRFLOW-1220?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

sam sen updated AIRFLOW-1220:
-
Description: 
My dag is set to use `schedule_interval=None`. I'm attempting to clear jobs so 
I can re-run failed jobs but I'm encountering the following errors. 


[2017-05-17 14:02:14,718] {models.py:3834} WARNING - Could not update dag stats 
for s3_dag_test
[2017-05-17 14:02:14,718] {models.py:3835} ERROR - can't compare 
datetime.datetime to NoneType
Traceback (most recent call last):
  File "/usr/lib/python2.7/site-packages/airflow/models.py", line 3831, in 
set_dirty
session.commit()
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/session.py", line 
874, in commit
self.transaction.commit()
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/session.py", line 
461, in commit
self._prepare_impl()
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/session.py", line 
441, in _prepare_impl
self.session.flush()
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/session.py", line 
2139, in flush
self._flush(objects)
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/session.py", line 
2259, in _flush
transaction.rollback(_capture_exception=True)
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/util/langhelpers.py", 
line 66, in __exit__
compat.reraise(exc_type, exc_value, exc_tb)
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/session.py", line 
2223, in _flush
flush_context.execute()
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/unitofwork.py", line 
389, in execute
rec.execute(self)
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/unitofwork.py", line 
577, in execute
uow
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/persistence.py", line 
243, in delete_obj
uowtransaction))
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/persistence.py", line 
357, in _organize_states_for_delete
states):
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/persistence.py", line 
1108, in _connections_for_states
for state in _sort_states(states):
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/persistence.py", line 
1130, in _sort_states
sorted(persistent, key=lambda q: q.key[1])
TypeError: can't compare datetime.datetime to NoneType


  was:
My dag is set to use `schedule_interval=None`. I'm attempting to clear jobs so 
I can re-run failed jobs but I'm encountering the following errors. 

```[2017-05-17 14:02:14,718] {models.py:3834} WARNING - Could not update dag 
stats for s3_dag_test
[2017-05-17 14:02:14,718] {models.py:3835} ERROR - can't compare 
datetime.datetime to NoneType
Traceback (most recent call last):
  File "/usr/lib/python2.7/site-packages/airflow/models.py", line 3831, in 
set_dirty
session.commit()
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/session.py", line 
874, in commit
self.transaction.commit()
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/session.py", line 
461, in commit
self._prepare_impl()
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/session.py", line 
441, in _prepare_impl
self.session.flush()
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/session.py", line 
2139, in flush
self._flush(objects)
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/session.py", line 
2259, in _flush
transaction.rollback(_capture_exception=True)
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/util/langhelpers.py", 
line 66, in __exit__
compat.reraise(exc_type, exc_value, exc_tb)
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/session.py", line 
2223, in _flush
flush_context.execute()
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/unitofwork.py", line 
389, in execute
rec.execute(self)
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/unitofwork.py", line 
577, in execute
uow
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/persistence.py", line 
243, in delete_obj
uowtransaction))
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/persistence.py", line 
357, in _organize_states_for_delete
states):
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/persistence.py", line 
1108, in _connections_for_states
for state in _sort_states(states):
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/persistence.py", line 
1130, in _sort_states
sorted(persistent, key=lambda q: q.key[1])
TypeError: can't compare datetime.datetime to NoneType
```


> Can't clear airflow jobs when using schedule_interval of None
> -
>
> Key: AIRFLOW-1220
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1220
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: sam sen
>
> My dag is set to 

[jira] [Created] (AIRFLOW-1220) Can't clear airflow jobs when using schedule_interval of None

2017-05-17 Thread sam sen (JIRA)
sam sen created AIRFLOW-1220:


 Summary: Can't clear airflow jobs when using schedule_interval of 
None
 Key: AIRFLOW-1220
 URL: https://issues.apache.org/jira/browse/AIRFLOW-1220
 Project: Apache Airflow
  Issue Type: Bug
Reporter: sam sen


My dag is set to use `schedule_interval=None`. I'm attempting to clear jobs so 
I can re-run failed jobs but I'm encountering the following errors. 

```[2017-05-17 14:02:14,718] {models.py:3834} WARNING - Could not update dag 
stats for s3_dag_test
[2017-05-17 14:02:14,718] {models.py:3835} ERROR - can't compare 
datetime.datetime to NoneType
Traceback (most recent call last):
  File "/usr/lib/python2.7/site-packages/airflow/models.py", line 3831, in 
set_dirty
session.commit()
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/session.py", line 
874, in commit
self.transaction.commit()
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/session.py", line 
461, in commit
self._prepare_impl()
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/session.py", line 
441, in _prepare_impl
self.session.flush()
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/session.py", line 
2139, in flush
self._flush(objects)
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/session.py", line 
2259, in _flush
transaction.rollback(_capture_exception=True)
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/util/langhelpers.py", 
line 66, in __exit__
compat.reraise(exc_type, exc_value, exc_tb)
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/session.py", line 
2223, in _flush
flush_context.execute()
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/unitofwork.py", line 
389, in execute
rec.execute(self)
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/unitofwork.py", line 
577, in execute
uow
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/persistence.py", line 
243, in delete_obj
uowtransaction))
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/persistence.py", line 
357, in _organize_states_for_delete
states):
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/persistence.py", line 
1108, in _connections_for_states
for state in _sort_states(states):
  File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/persistence.py", line 
1130, in _sort_states
sorted(persistent, key=lambda q: q.key[1])
TypeError: can't compare datetime.datetime to NoneType
```



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Closed] (AIRFLOW-1183) How to pass Spark URL for standalone cluster?

2017-05-09 Thread sam sen (JIRA)

 [ 
https://issues.apache.org/jira/browse/AIRFLOW-1183?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

sam sen closed AIRFLOW-1183.

Resolution: Fixed

> How to pass Spark URL for standalone cluster?
> -
>
> Key: AIRFLOW-1183
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1183
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: operators
>Affects Versions: Airflow 1.8
>Reporter: sam sen
>Priority: Critical
>
> How can I pass my Spark URL? When I look in the logs I see `--master` is 
> pointed to "yarn." Also, the same thing for `cluster-mode`. I tried passing 
> it within the function but I'm getting an error.
> *Error*
> {code}
> PendingDeprecationWarning: Invalid arguments were passed to 
> SparkSubmitOperator. Support for passing such arguments will be dropped in 
> Airflow 2.0. Invalid arguments were:
> *args: ()
> **kwargs: {'deploy_mode': 'cluster', 'java_class': 'SimpleApp'}
> {code}
> *Code*
> {code}
> testSpark = SparkSubmitOperator(
>task_id='test-spark',
> deploy_mode='cluster',
> 
> application='src/main/scala/target/scala-2.11/simple-project_2.11-1.0.jar',
> java_class='SimpleApp',
> deploy_mode='cluster',
> dag=dag)
> {code}
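
For reference, the contrib `SparkSubmitOperator` of this era takes its master 
URL and deploy mode from an Airflow connection rather than from constructor 
kwargs, which is why `deploy_mode` is rejected above. A hedged sketch of that 
wiring follows; the connection id `spark_standalone` and its contents are 
illustrative, not taken from this thread:

{code:python}
from datetime import datetime

from airflow import DAG
from airflow.contrib.operators.spark_submit_operator import SparkSubmitOperator

# Hypothetical DAG, mirroring the shape of the reporter's code.
dag = DAG('spark_standalone_demo',
          start_date=datetime(2017, 5, 1),
          schedule_interval=None)

# The operator resolves --master and --deploy-mode from this connection:
#   conn id: spark_standalone   host: spark://spark-master:7077
#   extra:   {"deploy-mode": "cluster"}
testSpark = SparkSubmitOperator(
    task_id='test-spark',
    conn_id='spark_standalone',
    application='src/main/scala/target/scala-2.11/simple-project_2.11-1.0.jar',
    java_class='SimpleApp',
    dag=dag)
{code}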



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Updated] (AIRFLOW-1183) How to pass Spark URL for standalone cluster?

2017-05-09 Thread sam sen (JIRA)

 [ 
https://issues.apache.org/jira/browse/AIRFLOW-1183?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

sam sen updated AIRFLOW-1183:
-
Description: 
How can I pass my Spark URL? When I look in the logs I see `--master` is 
pointed to "yarn." Also, the same thing for `cluster-mode`. I tried passing it 
within the function but I'm getting an error.

*Error*
{code}
PendingDeprecationWarning: Invalid arguments were passed to 
SparkSubmitOperator. Support for passing such arguments will be dropped in 
Airflow 2.0. Invalid arguments were:
*args: ()
**kwargs: {'deploy_mode': 'cluster', 'java_class': 'SimpleApp'}
{code}

*Code*
{code}
testSpark = SparkSubmitOperator(
   task_id='test-spark',
deploy_mode='cluster',
application='src/main/scala/target/scala-2.11/simple-project_2.11-1.0.jar',
java_class='SimpleApp'
   dag=dag)
{code}

  was:
How can I pass my Spark URL? When I look in the logs I see `--master` is 
pointed to "yarn." Also, the same thing for `cluster-mode`. I tried passing it 
within the function but I'm getting an error.

{quote}
PendingDeprecationWarning: Invalid arguments were passed to 
SparkSubmitOperator. Support for passing such arguments will be dropped in 
Airflow 2.0. Invalid arguments were:
*args: ()
**kwargs: {'deploy_mode': 'cluster', 'java_class': 'SimpleApp'}
{quote}

*Code*
{code}
testSpark = SparkSubmitOperator(
   task_id='test-spark',
deploy_mode='cluster',
application='src/main/scala/target/scala-2.11/simple-project_2.11-1.0.jar',
java_class='SimpleApp'
   dag=dag)
{code}


> How to pass Spark URL for standalone cluster?
> -
>
> Key: AIRFLOW-1183
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1183
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: operators
>Affects Versions: Airflow 1.8
>Reporter: sam sen
>Priority: Critical
>
> How can I pass my Spark URL? When I look in the logs I see `--master` is 
> pointed to "yarn." Also, the same thing for `cluster-mode`. I tried passing 
> it within the function but I'm getting an error.
> *Error*
> {code}
> PendingDeprecationWarning: Invalid arguments were passed to 
> SparkSubmitOperator. Support for passing such arguments will be dropped in 
> Airflow 2.0. Invalid arguments were:
> *args: ()
> **kwargs: {'deploy_mode': 'cluster', 'java_class': 'SimpleApp'}
> {code}
> *Code*
> {code}
> testSpark = SparkSubmitOperator(
>task_id='test-spark',
> deploy_mode='cluster',
> 
> application='src/main/scala/target/scala-2.11/simple-project_2.11-1.0.jar',
> java_class='SimpleApp'
>dag=dag)
> {code}



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Created] (AIRFLOW-1183) How to pass Spark URL for standalone cluster?

2017-05-09 Thread sam sen (JIRA)
sam sen created AIRFLOW-1183:


 Summary: How to pass Spark URL for standalone cluster?
 Key: AIRFLOW-1183
 URL: https://issues.apache.org/jira/browse/AIRFLOW-1183
 Project: Apache Airflow
  Issue Type: Bug
  Components: operators
Affects Versions: Airflow 1.8
Reporter: sam sen
Priority: Critical


How can I pass my Spark master URL? When I look in the logs, `--master` is 
set to "yarn", and the same is true for the deploy mode. I tried passing 
these settings in the operator call, but I get an error.

*Error*
{code}
PendingDeprecationWarning: Invalid arguments were passed to 
SparkSubmitOperator. Support for passing such arguments will be dropped in 
Airflow 2.0. Invalid arguments were:
*args: ()
**kwargs: {'deploy_mode': 'cluster', 'java_class': 'SimpleApp'}
{code}

*Code*
{code}
testSpark = SparkSubmitOperator(
    task_id='test-spark',
    deploy_mode='cluster',
    application='src/main/scala/target/scala-2.11/simple-project_2.11-1.0.jar',
    java_class='SimpleApp',
    dag=dag)
{code}
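
One way to register such a standalone master, sketched under the same 
assumptions as above (the `spark_standalone` id and `spark://my-master` 
host are placeholders): create a Spark connection whose host and port carry 
the master URL and whose extra JSON carries the deploy mode, for example 
through Airflow's ORM.

{code}
from airflow import settings
from airflow.models import Connection

# Illustrative connection: host + port are meant to become
# --master spark://my-master:7077, and the extra JSON's "deploy-mode"
# entry is meant to become --deploy-mode cluster.
spark_conn = Connection(
    conn_id='spark_standalone',
    conn_type='spark',
    host='spark://my-master',
    port=7077,
    extra='{"deploy-mode": "cluster"}')

session = settings.Session()
session.add(spark_conn)
session.commit()
{code}

The same connection can also be created through the Admin -> Connections UI; 
the SparkSubmitOperator then only needs conn_id='spark_standalone'.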



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)