jedcunningham commented on a change in pull request #18575:
URL: https://github.com/apache/airflow/pull/18575#discussion_r719622348



##########
File path: airflow/models/dagbag.py
##########
@@ -599,7 +599,12 @@ def _serialize_dag_capturing_errors(dag, session):
             except OperationalError:
                 raise
             except Exception:
-                self.log.exception("Failed to write serialized DAG: %s", dag.full_filepath)
+                self.log.exception("Failed to write serialized DAG '%s' in: %s", dag.dag_id, dag.fileloc)
+                # Update the dag_hash to force resync as something (likely _sync_perm_for_dag) failed
+                session.flush()
+                session.query(SerializedDagModel).filter(SerializedDagModel.dag_id == dag.dag_id).update(
+                    {"dag_hash": "(force resync)"}

Review comment:
       It gets updated in `SerializedDagModel.write_dag`. This is a 'soft' import_error - the DAG was parsed and serialized successfully, but syncing DAG-specific permissions failed. An admin (or any other user with non-DAG-specific access) can still use/run/interact with it normally. We just need to resync the next time the import_error goes away, i.e. once whatever `access_control` issue has been fixed (e.g. the role was created, with no DAG file change), so that syncing DAG-specific permissions can run again.
   
   We could make it "fatal". Alternatively, looking at this again, I could refactor this section to handle serialization failures and DAG-specific perms separately. That might make it a little clearer.



