wolfier opened a new issue #10876:
URL: https://github.com/apache/airflow/issues/10876
**Apache Airflow version**: 1.10.7+
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`):
```
Client Version: version.Info{Major:"1", Minor:"15", GitVersion:"v1.15.5", GitCommit:"20c265fef0741dd71a66480e35bd69f18351daea", GitTreeState:"clean", BuildDate:"2019-10-15T19:16:51Z", GoVersion:"go1.12.10", Compiler:"gc", Platform:"darwin/amd64"}
Server Version: version.Info{Major:"1", Minor:"14+", GitVersion:"v1.14.10-gke.42", GitCommit:"42bef28c2031a74fc68840fce56834ff7ea08518", GitTreeState:"clean", BuildDate:"2020-06-02T16:07:00Z", GoVersion:"go1.12.12b4", Compiler:"gc", Platform:"linux/amd64"}
```
**What happened**:
Airflow cannot load a serialized DAG that contains a BigQueryOperator: deserializing the operator's extra links raises a `TypeError`.
```
-------------------------------------------------------------------------------
Node: 8c5770a964d6
-------------------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/flask/app.py", line 2447, in wsgi_app
    response = self.full_dispatch_request()
  File "/usr/local/lib/python3.7/site-packages/flask/app.py", line 1952, in full_dispatch_request
    rv = self.handle_user_exception(e)
  File "/usr/local/lib/python3.7/site-packages/flask/app.py", line 1821, in handle_user_exception
    reraise(exc_type, exc_value, tb)
  File "/usr/local/lib/python3.7/site-packages/flask/_compat.py", line 39, in reraise
    raise value
  File "/usr/local/lib/python3.7/site-packages/flask/app.py", line 1950, in full_dispatch_request
    rv = self.dispatch_request()
  File "/usr/local/lib/python3.7/site-packages/flask/app.py", line 1936, in dispatch_request
    return self.view_functions[rule.endpoint](**req.view_args)
  File "/usr/local/lib/python3.7/site-packages/airflow/www_rbac/decorators.py", line 121, in wrapper
    return f(self, *args, **kwargs)
  File "/usr/local/lib/python3.7/site-packages/flask_appbuilder/security/decorators.py", line 109, in wraps
    return f(self, *args, **kwargs)
  File "/usr/local/lib/python3.7/site-packages/airflow/www_rbac/decorators.py", line 92, in view_func
    return f(*args, **kwargs)
  File "/usr/local/lib/python3.7/site-packages/airflow/www_rbac/decorators.py", line 56, in wrapper
    return f(*args, **kwargs)
  File "/usr/local/lib/python3.7/site-packages/airflow/utils/db.py", line 74, in wrapper
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.7/site-packages/airflow/www_rbac/views.py", line 1386, in tree
    dag = dagbag.get_dag(dag_id)
  File "/usr/local/lib/python3.7/site-packages/airflow/models/dagbag.py", line 140, in get_dag
    dag = row.dag
  File "/usr/local/lib/python3.7/site-packages/airflow/models/serialized_dag.py", line 135, in dag
    dag = SerializedDAG.from_dict(self.data)  # type: Any
  File "/usr/local/lib/python3.7/site-packages/airflow/serialization/serialized_objects.py", line 619, in from_dict
    return cls.deserialize_dag(serialized_obj['dag'])
  File "/usr/local/lib/python3.7/site-packages/airflow/serialization/serialized_objects.py", line 558, in deserialize_dag
    task["task_id"]: SerializedBaseOperator.deserialize_operator(task) for task in v
  File "/usr/local/lib/python3.7/site-packages/airflow/serialization/serialized_objects.py", line 558, in <dictcomp>
    task["task_id"]: SerializedBaseOperator.deserialize_operator(task) for task in v
  File "/usr/local/lib/python3.7/site-packages/airflow/serialization/serialized_objects.py", line 394, in deserialize_operator
    op_predefined_extra_links = cls._deserialize_operator_extra_links(v)
  File "/usr/local/lib/python3.7/site-packages/airflow/serialization/serialized_objects.py", line 468, in _deserialize_operator_extra_links
    data, single_op_link_class_name)  # type: BaseOperatorLink
  File "/usr/local/lib/python3.7/site-packages/cattr/converters.py", line 192, in structure
    return self._structure_func.dispatch(cl)(obj, cl)
  File "/usr/local/lib/python3.7/site-packages/cattr/converters.py", line 317, in structure_attrs_fromdict
    return cl(**conv_obj)  # type: ignore
TypeError: __init__() missing 1 required positional argument: 'index'
```
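The failure mode in the last two frames can be reproduced in isolation. A minimal sketch, using a plain dataclass as a stand-in for the real operator-link class (the actual Airflow class and its serialized payload may differ): when the stored extra-link payload does not carry a required constructor field, rebuilding the object raises exactly this `TypeError`.

```python
from dataclasses import dataclass


# Hypothetical stand-in for the BigQueryConsoleIndexableLink class named
# in the traceback; `index` is a required field with no default.
@dataclass
class BigQueryConsoleIndexableLink:
    index: int


# Assumed shape of the serialized extra-link payload: it omits `index`,
# so re-structuring it into the class fails as in the traceback above.
serialized_payload = {}
try:
    BigQueryConsoleIndexableLink(**serialized_payload)
except TypeError as exc:
    print(exc)  # complains about the missing 'index' argument
```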
**What you expected to happen**:
The DAG views to load normally.
**How to reproduce it**:
1. Enable DAG serialization as described in the [Airflow documentation](https://airflow.apache.org/docs/stable/dag-serialization.html).
2. Create a DAG with a BigQueryOperator task whose `sql` is a list, as below.
3. Open the DAG in the Airflow UI; the traceback above appears.
```python
from datetime import datetime

from airflow.models import DAG
from airflow.contrib.operators.bigquery_operator import BigQueryOperator

dag = DAG(
    dag_id='my_dag',
    schedule_interval='@once',
    start_date=datetime(2020, 1, 1),
)

# Passing `sql` as a list (rather than a single string) is what attaches
# the indexable console links that later fail to deserialize.
test = BigQueryOperator(
    sql=["asdsa"],
    task_id='test2',
    dag=dag,
)
```
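Why the list form matters: a hedged sketch (not the actual Airflow source; function name and return shapes are invented for illustration) of the link-selection behavior suggested by the traceback. A string `sql` gets one plain console link, while a list gets one index-bearing link per statement, and it is that `index` argument that the serialized payload fails to preserve.

```python
def extra_links_for(sql):
    """Illustrative only: which console links a BigQueryOperator-like
    operator would expose for a given `sql` value."""
    if isinstance(sql, str):
        # single statement: one plain link, no index needed
        return ["BigQueryConsoleLink"]
    # list of statements: one indexable link per statement
    return [("BigQueryConsoleIndexableLink", i) for i, _ in enumerate(sql)]


print(extra_links_for("SELECT 1"))
print(extra_links_for(["SELECT 1", "SELECT 2"]))
```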
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]