[jira] [Commented] (AIRFLOW-4761) Airflow Task Clear function throws error
[ https://issues.apache.org/jira/browse/AIRFLOW-4761?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17098649#comment-17098649 ]

Kamil Choudhury commented on AIRFLOW-4761:
------------------------------------------

I'm still running into this on 1.10.10 / Python 2.7.10, and I can reproduce it programmatically. I'm not too familiar with Airflow internals, but I'm happy to walk through it with people who are more experienced. Thanks!

> Airflow Task Clear function throws error
> ----------------------------------------
>
>                 Key: AIRFLOW-4761
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-4761
>             Project: Apache Airflow
>          Issue Type: Bug
>          Components: DAG, DagRun
>    Affects Versions: 1.10.3
>        Environment: CentOS 7, Python 2.7.10
>           Reporter: Ben Storrie
>           Priority: Major
>
> When using the Airflow webserver to clear a task inside a DAG run, an error is thrown on certain types of tasks:
>
> {code:java}
> Traceback (most recent call last):
>   File "/opt/my-miniconda/miniconda/envs/my-hadoop-airflow/lib/python2.7/site-packages/flask/app.py", line 2311, in wsgi_app
>     response = self.full_dispatch_request()
>   File "/opt/my-miniconda/miniconda/envs/my-hadoop-airflow/lib/python2.7/site-packages/flask/app.py", line 1834, in full_dispatch_request
>     rv = self.handle_user_exception(e)
>   File "/opt/my-miniconda/miniconda/envs/my-hadoop-airflow/lib/python2.7/site-packages/flask/app.py", line 1737, in handle_user_exception
>     reraise(exc_type, exc_value, tb)
>   File "/opt/my-miniconda/miniconda/envs/my-hadoop-airflow/lib/python2.7/site-packages/flask/app.py", line 1832, in full_dispatch_request
>     rv = self.dispatch_request()
>   File "/opt/my-miniconda/miniconda/envs/my-hadoop-airflow/lib/python2.7/site-packages/flask/app.py", line 1818, in dispatch_request
>     return self.view_functions[rule.endpoint](**req.view_args)
>   File "/opt/my-miniconda/miniconda/envs/my-hadoop-airflow/lib/python2.7/site-packages/flask_admin/base.py", line 69, in inner
>     return self._run_view(f, *args, **kwargs)
>   File "/opt/my-miniconda/miniconda/envs/my-hadoop-airflow/lib/python2.7/site-packages/flask_admin/base.py", line 368, in _run_view
>     return fn(self, *args, **kwargs)
>   File "/opt/my-miniconda/miniconda/envs/my-hadoop-airflow/lib/python2.7/site-packages/flask_login/utils.py", line 261, in decorated_view
>     return func(*args, **kwargs)
>   File "/opt/my-miniconda/miniconda/envs/my-hadoop-airflow/lib/python2.7/site-packages/airflow/www/utils.py", line 275, in wrapper
>     return f(*args, **kwargs)
>   File "/opt/my-miniconda/miniconda/envs/my-hadoop-airflow/lib/python2.7/site-packages/airflow/www/utils.py", line 322, in wrapper
>     return f(*args, **kwargs)
>   File "/opt/my-miniconda/miniconda/envs/my-hadoop-airflow/lib/python2.7/site-packages/airflow/www/views.py", line 1202, in clear
>     include_upstream=upstream)
>   File "/opt/my-miniconda/miniconda/envs/my-hadoop-airflow/lib/python2.7/site-packages/airflow/models/__init__.py", line 3830, in sub_dag
>     dag = copy.deepcopy(self)
>   File "/opt/my-miniconda/miniconda/envs/my-hadoop-airflow/lib/python2.7/copy.py", line 174, in deepcopy
>     y = copier(memo)
>   File "/opt/my-miniconda/miniconda/envs/my-hadoop-airflow/lib/python2.7/site-packages/airflow/models/__init__.py", line 3815, in __deepcopy__
>     setattr(result, k, copy.deepcopy(v, memo))
>   File "/opt/my-miniconda/miniconda/envs/my-hadoop-airflow/lib/python2.7/copy.py", line 163, in deepcopy
>     y = copier(x, memo)
>   File "/opt/my-miniconda/miniconda/envs/my-hadoop-airflow/lib/python2.7/copy.py", line 257, in _deepcopy_dict
>     y[deepcopy(key, memo)] = deepcopy(value, memo)
>   File "/opt/my-miniconda/miniconda/envs/my-hadoop-airflow/lib/python2.7/copy.py", line 174, in deepcopy
>     y = copier(memo)
>   File "/opt/my-miniconda/miniconda/envs/my-hadoop-airflow/lib/python2.7/site-packages/airflow/models/__init__.py", line 2492, in __deepcopy__
>     setattr(result, k, copy.copy(v))
>   File "/opt/my-miniconda/miniconda/envs/my-hadoop-airflow/lib/python2.7/copy.py", line 96, in copy
>     return _reconstruct(x, rv, 0)
>   File "/opt/my-miniconda/miniconda/envs/my-hadoop-airflow/lib/python2.7/copy.py", line 329, in _reconstruct
>     y = callable(*args)
>   File "/opt/my-miniconda/miniconda/envs/my-hadoop-airflow/lib/python2.7/copy_reg.py", line 93, in __newobj__
>     return cls.__new__(cls, *args)
> TypeError: instancemethod expected at least 2 arguments, got 0
> {code}
>
> I had expected the resolution of AIRFLOW-2060 to fix this once I upgraded to 1.10.3:
>
> {code:java}
> (my-hadoop-airflow) [user@hostname ~]$ pip freeze | grep pendulum
> pendulum==1.4.4
> {code}
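A minimal sketch of the programmatic reproduction mentioned above (the DAG id is a placeholder, not from the ticket): the traceback shows the webserver's clear view calling DAG.sub_dag(), whose first step is copy.deepcopy(self), so deepcopying a parsed DAG directly should hit the same path without the UI.

{code:python}
# Hypothetical reproduction outside the webserver; 'my_dag_id' is a
# placeholder. Per the traceback, views.clear -> DAG.sub_dag ->
# copy.deepcopy(self), so deepcopying the parsed DAG takes the same path.
import copy

from airflow.models import DagBag

dag = DagBag().get_dag('my_dag_id')

# On Python 2.7 this raises
#   TypeError: instancemethod expected at least 2 arguments, got 0
# whenever some operator on the DAG carries a bound-method attribute.
copy.deepcopy(dag)
{code}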
[jira] [Commented] (AIRFLOW-4761) Airflow Task Clear function throws error
[ https://issues.apache.org/jira/browse/AIRFLOW-4761?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16952351#comment-16952351 ]

Connie Chen commented on AIRFLOW-4761:
--------------------------------------

[~brstorrie] The weird part is that it works from the CLI, and that some tasks within the same DAG work while others don't. I also have one task that is used across two DAGs; it works in one and fails in the other.
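For context, the TypeError itself reduces to a Python 2 limitation that is independent of Airflow: BaseOperator.__deepcopy__ (line 2492 in the traceback) shallow-copies each attribute with copy.copy(), and Python 2 cannot reconstruct a bound method ("instancemethod"). A sketch with illustrative names:

{code:python}
# Python 2.7 reduction; Client and plain_callback are illustrative.
# copy.copy() treats plain functions as atomic and returns them as-is,
# but a bound method has no copy support in Python 2 and falls through
# to copy_reg.__newobj__, producing the exact error in the traceback.
import copy


class Client(object):
    def close(self):
        pass


def plain_callback(context):
    pass


copy.copy(plain_callback)   # OK: functions are returned unchanged
copy.copy(Client().close)   # TypeError: instancemethod expected at
                            # least 2 arguments, got 0
{code}

That would explain the asymmetry: whether a task is affected depends on whether any attribute attached to it in that particular DAG file happens to be a bound method, so the same operator can copy cleanly in one DAG and fail in another.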
[jira] [Commented] (AIRFLOW-4761) Airflow Task Clear function throws error
[ https://issues.apache.org/jira/browse/AIRFLOW-4761?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16952347#comment-16952347 ]

Ben Storrie commented on AIRFLOW-4761:
--------------------------------------

I was not. It generally happens when clearing any task within DAGs that use certain kinds of hooks or operators. The S3Hook is a good example: if I have multiple S3 operations, I have to clear an entire DAG run rather than a single task within it, because the single-task clear fails on the deepcopy.
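A hedged illustration of the pattern described above (both operator classes are made up, not from the ticket): storing a live hook on the operator at parse time drags the hook, and anything it caches such as a boto client full of bound methods, into the DAG object that the webserver deepcopies on Clear; building the hook inside execute() keeps the parsed DAG copyable.

{code:python}
# Illustrative Airflow 1.10-style operators; both class names are
# hypothetical. Only where the hook is created differs.
from airflow.hooks.S3_hook import S3Hook
from airflow.models import BaseOperator
from airflow.utils.decorators import apply_defaults


class EagerHookOperator(BaseOperator):
    """Creates the hook at DAG-parse time, so the hook (and any client
    it caches) becomes part of the DAG object the webserver deepcopies."""

    @apply_defaults
    def __init__(self, *args, **kwargs):
        super(EagerHookOperator, self).__init__(*args, **kwargs)
        self.hook = S3Hook(aws_conn_id='aws_default')

    def execute(self, context):
        self.hook.load_string('payload', 'some/key', bucket_name='a-bucket')


class LazyHookOperator(BaseOperator):
    """Creates the hook inside execute(), i.e. only on the worker; the
    parsed DAG object stays plain data and deepcopy succeeds."""

    def execute(self, context):
        hook = S3Hook(aws_conn_id='aws_default')
        hook.load_string('payload', 'some/key', bucket_name='a-bucket')
{code}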
[jira] [Commented] (AIRFLOW-4761) Airflow Task Clear function throws error
[ https://issues.apache.org/jira/browse/AIRFLOW-4761?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16952339#comment-16952339 ]

Connie Chen commented on AIRFLOW-4761:
--------------------------------------

[~brstorrie] Were you able to fix this? I am hitting this error too after upgrading to 1.10.5, but I can't figure out why it happens for some tasks and not others.
[jira] [Commented] (AIRFLOW-4761) Airflow Task Clear function throws error
[ https://issues.apache.org/jira/browse/AIRFLOW-4761?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16877845#comment-16877845 ]

jack commented on AIRFLOW-4761:
-------------------------------

Can you provide more info on what sorts of tasks this happens with?