zachliu opened a new issue #14421:
URL: https://github.com/apache/airflow/issues/14421


   **Apache Airflow version**: 2.0.1
   
   **Environment**: Docker on Linux Mint 20.1, image based on apache/airflow:2.0.1-python3.8
   
   **What happened**:
   
   I'm using the Airflow REST API and the following exception occurred:
   
   ```python
   >>> import json
   >>> import requests
   >>> from requests.auth import HTTPBasicAuth
   >>> payload = {"dag_ids": ["{my_dag_id}"]}
   >>> r = requests.post("https://localhost:8080/api/v1/dags/~/dagRuns/~/taskInstances/list", auth=HTTPBasicAuth('username', 'password'), data=json.dumps(payload), headers={'Content-Type': 'application/json'})
   >>> r.status_code
   500
   >>> print(r.text)
   {
     "detail": "None is not of type 'string'\n\nFailed validating 'type' in schema['allOf'][0]['properties']['task_instances']['items']['properties']['operator']:\n    {'type': 'string'}\n\nOn instance['task_instances'][5]['operator']:\n    None",
     "status": 500,
     "title": "Response body does not conform to specification",
     "type": "https://airflow.apache.org/docs/2.0.1/stable-rest-api-ref.html#section/Errors/Unknown"
   }
   None is not of type 'string'
   
   Failed validating 'type' in schema['allOf'][0]['properties']['task_instances']['items']['properties']['operator']:
       {'type': 'string'}
   
   On instance['task_instances'][5]['operator']:
       None
   ```
   This happens for all the "old" task instances created before upgrading to 2.0.0; there is no issue with new task instances created after the upgrade.
   
   
   **What do you think went wrong?**:
   The `operator` column was introduced in 2.0.0, but the migration leaves all pre-existing rows with `NULL` in that column. The REST API response schema declares `operator` as a (non-nullable) string, so serializing those rows fails validation. As a workaround I had to execute this manually in my database:
   ```sql
   UPDATE task_instance SET operator = 'NoOperator' WHERE operator IS NULL;
   ```
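
   If you prefer to do the same backfill through Airflow's ORM instead of raw SQL, the following untested sketch should be equivalent (run it where Airflow 2.0.x is installed so it picks up the configured metadata database; `"NoOperator"` is just the placeholder value used above):
   ```python
   # Untested sketch: backfill NULL operator values via the Airflow ORM.
   from airflow.models import TaskInstance
   from airflow.utils.session import create_session

   with create_session() as session:  # commits on exit
       updated = (
           session.query(TaskInstance)
           .filter(TaskInstance.operator.is_(None))
           .update({TaskInstance.operator: "NoOperator"}, synchronize_session=False)
       )
       print(f"Backfilled {updated} task instance rows")
   ```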
   
   **How to reproduce it**:
   
   * Run airflow 1.10.14
   * Create a DAG with multiple tasks and run them
   * Upgrade airflow to 2.0.0 or 2.0.1
   * Make the API call as above (see the sketch below for a quick way to confirm the affected rows)
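
   A minimal sketch for confirming that the pre-upgrade rows are the ones with `operator = NULL` (the connection string is an assumption; replace it with your own `sql_alchemy_conn` from `airflow.cfg`):
   ```python
   # Untested sketch: count task instances whose operator column is NULL.
   import sqlalchemy as sa

   # Assumption: adjust to your own Airflow metadata database URI.
   engine = sa.create_engine("postgresql+psycopg2://airflow:airflow@localhost:5432/airflow")

   with engine.connect() as conn:
       null_count = conn.execute(
           sa.text("SELECT COUNT(*) FROM task_instance WHERE operator IS NULL")
       ).scalar()

   print(f"{null_count} task instance rows still have operator = NULL")
   ```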
   
   **Anything else we need to know**:
   Similar to https://github.com/apache/airflow/issues/13799, but not exactly the same.
   

