djerraballi opened a new issue #10954:
URL: https://github.com/apache/airflow/issues/10954


   When enabling serialized DAGs in Airflow 1.10.12, the relevant airflow.cfg settings are:
   ```
   store_serialized_dags = True 
   store_dag_code = True        
   
   # You can also update the following default configurations based on your needs
   min_serialized_dag_update_interval = 30      
   min_serialized_dag_fetch_interval = 10
   ```
   with a MySQL server running 5.7.12 and the Airflow database encoded in latin-1 (ugh, bad MySQL default),
   
   I get core dumps that are filling up my container.
   Looking at the scheduler logs:
   ```
   ==> scheduler-stderr.log <==
   Fatal Python error: Cannot recover from stack overflow.
   Current thread 0x00007f6565221400 (most recent call first):
     File "/usr/local/lib/python3.6/site-packages/pendulum/pendulum.py", line 129 in __init__
     File "/usr/local/lib/python3.6/site-packages/pendulum/pendulum.py", line 219 in instance
     File "/usr/local/src/apache-airflow/airflow/serialization/serialized_objects.py", line 205 in _serialize
     File "/usr/local/src/apache-airflow/airflow/serialization/serialized_objects.py", line 196 in <dictcomp>
     File "/usr/local/src/apache-airflow/airflow/serialization/serialized_objects.py", line 196 in _serialize
     File "/usr/local/src/apache-airflow/airflow/serialization/serialized_objects.py", line 163 in serialize_to_json
     File "/usr/local/src/apache-airflow/airflow/serialization/serialized_objects.py", line 555 in serialize_dag
     File "/usr/local/src/apache-airflow/airflow/serialization/serialized_objects.py", line 201 in _serialize
     File "/usr/local/src/apache-airflow/airflow/serialization/serialized_objects.py", line 165 in serialize_to_json
     File "/usr/local/src/apache-airflow/airflow/serialization/serialized_objects.py", line 352 in serialize_operator
     File "/usr/local/src/apache-airflow/airflow/serialization/serialized_objects.py", line 203 in _serialize
     File "/usr/local/src/apache-airflow/airflow/serialization/serialized_objects.py", line 557 in <listcomp>
     File "/usr/local/src/apache-airflow/airflow/serialization/serialized_objects.py", line 557 in serialize_dag
     File "/usr/local/src/apache-airflow/airflow/serialization/serialized_objects.py", line 201 in _serialize
     File "/usr/local/src/apache-airflow/airflow/serialization/serialized_objects.py", line 165 in serialize_to_json
     File "/usr/local/src/apache-....
   ```
   
   ```
   File "/usr/local/lib/python3.6/site-packages/MySQLdb/cursors.py", line 387, in _fetch_row
       return self._result.fetch_row(size, self._fetch_type)
   UnicodeDecodeError: 'utf-8' codec can't decode byte 0xf1 in position 2686: invalid continuation byte
   called from: 
   Process DagFileProcessor2121-Process:
   Traceback (most recent call last):
     File "/usr/local/lib/python3.6/multiprocessing/process.py", line 258, in _bootstrap
       self.run()
     File "/usr/local/lib/python3.6/multiprocessing/process.py", line 93, in run
       self._target(*self._args, **self._kwargs)
     File "/usr/local/src/apache-airflow/airflow/jobs/scheduler_job.py", line 159, in _run_file_processor
       pickle_dags)
     File "/usr/local/src/apache-airflow/airflow/utils/db.py", line 74, in wrapper
       return func(*args, **kwargs)
     File "/usr/local/src/apache-airflow/airflow/jobs/scheduler_job.py", line 1609, in process_file
       dag.sync_to_db()
     File "/usr/local/src/apache-airflow/airflow/utils/db.py", line 74, in wrapper
       return func(*args, **kwargs)
     File "/usr/local/src/apache-airflow/airflow/models/dag.py", line 1552, in sync_to_db
       session=session
     File "/usr/local/src/apache-airflow/airflow/utils/db.py", line 70, in wrapper
       return func(*args, **kwargs)
     File "/usr/local/src/apache-airflow/airflow/models/serialized_dag.py", line 120, in write_dag
       session.merge(new_serialized_dag)
   ```
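   For context, here is a minimal standalone illustration of the decode failure (my own sketch, not Airflow or MySQLdb code): the byte 0xf1 is "ñ" in latin-1, but in UTF-8 it announces a multi-byte sequence that the following byte fails to continue, which is exactly the error MySQLdb raises when a latin-1 encoded column is read back by a utf-8 client:
   ```python
   # Hypothetical repro of the charset mismatch, not from the Airflow codebase.
   # A latin-1 database stores "año" as the raw bytes b'a\xf1o'.
   raw = "a\u00f1o".encode("latin-1")

   # Decoding with the charset the data was written in round-trips fine.
   print(raw.decode("latin-1"))

   # Decoding the same bytes as utf-8 (what a utf-8 client connection does)
   # fails: 0xf1 starts a multi-byte UTF-8 sequence, and the next byte is
   # not a valid continuation byte.
   try:
       raw.decode("utf-8")
   except UnicodeDecodeError as exc:
       print(exc)
   ```
   So any serialized DAG containing a non-ASCII character seems enough to trigger this once the blob passes through the latin-1 database.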

