GitHub user saumyasuhagiya created a discussion: Invalid JSON text: "Invalid value." at position x in value for column \'serialized_dag.data\'.

### Apache Airflow version

Other Airflow 2 version (please specify below)

### If "Other Airflow 2 version" selected, which one?

2.7.3

### What happened?

I recently deployed new code changes. The environment has only one dynamic DAG, and everything was working fine earlier. However, I'm getting the error below after the deployment.

I saw this error a few days back when the disk was full, but it has come back even though the disk is no longer full.

When I checked the logs, the code seems to process all DAGs according to the configuration, but it fails at the end.
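
Not part of the original report — MySQL error 3140 means the server could not parse the bytes bound to the JSON `data` column. A minimal, stdlib-only sketch of the same validity check that could be applied locally to whatever payload the dynamic DAG produces (the `bad` payload below is a hypothetical example, not the actual DAG data):

```python
import json

def check_json(payload: str):
    """Return None if payload parses as JSON, otherwise a short
    description of where parsing failed (MySQL error 3140 is the
    server-side equivalent of this check)."""
    try:
        json.loads(payload)
        return None
    except json.JSONDecodeError as exc:
        return f"invalid JSON at position {exc.pos}: {exc.msg}"

# An unescaped control character is one kind of value a JSON column rejects:
bad = '{"dag_id": "example", "doc_md": "line1\x00line2"}'
print(check_json(bad))          # reports a parse failure with its position
print(check_json('{"a": 1}'))   # None: valid JSON
```

The reported error position (1262 here) is an offset into the inserted value, which can help narrow down which part of the serialized DAG is malformed.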



### What you think should happen instead?

DAG serialization and the database sync should succeed. Instead, the deployment fails with the following error:

```
[2025-03-18T20:31:34.425+0000] {processor.py:182} ERROR - Got an exception! Propagating...
Traceback (most recent call last):
  File "/opt/myenv/venv/lib64/python3.8/site-packages/sqlalchemy/engine/base.py", line 1885, in _execute_context
    self.dialect.do_executemany(
  File "/opt/myenv/venv/lib64/python3.8/site-packages/sqlalchemy/dialects/mysql/mysqldb.py", line 180, in do_executemany
    rowcount = cursor.executemany(statement, parameters)
  File "/opt/myenv/venv/lib64/python3.8/site-packages/MySQLdb/cursors.py", line 230, in executemany
    return self._do_execute_many(
  File "/opt/myenv/venv/lib64/python3.8/site-packages/MySQLdb/cursors.py", line 261, in _do_execute_many
    rows += self.execute(sql + postfix)
  File "/opt/myenv/venv/lib64/python3.8/site-packages/MySQLdb/cursors.py", line 206, in execute
    res = self._query(query)
  File "/opt/myenv/venv/lib64/python3.8/site-packages/MySQLdb/cursors.py", line 319, in _query
    db.query(q)
  File "/opt/myenv/venv/lib64/python3.8/site-packages/MySQLdb/connections.py", line 254, in query
    _mysql.connection.query(self, query)
MySQLdb.OperationalError: (3140, 'Invalid JSON text: "Invalid value." at position 1262 in value for column \'serialized_dag.data\'.')

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/opt/myenv/venv/lib64/python3.8/site-packages/airflow/dag_processing/processor.py", line 178, in _run_file_processor
    _handle_dag_file_processing()
  File "/opt/myenv/venv/lib64/python3.8/site-packages/airflow/dag_processing/processor.py", line 159, in _handle_dag_file_processing
    result: tuple[int, int] = dag_file_processor.process_file(
  File "/opt/myenv/venv/lib64/python3.8/site-packages/airflow/utils/session.py", line 79, in wrapper
    return func(*args, session=session, **kwargs)
  File "/opt/myenv/venv/lib64/python3.8/site-packages/airflow/dag_processing/processor.py", line 857, in process_file
    serialize_errors = DagFileProcessor.save_dag_to_db(
  File "/opt/myenv/venv/lib64/python3.8/site-packages/airflow/api_internal/internal_api_call.py", line 114, in wrapper
    return func(*args, **kwargs)
  File "/opt/myenv/venv/lib64/python3.8/site-packages/airflow/utils/session.py", line 79, in wrapper
    return func(*args, session=session, **kwargs)
  File "/opt/myenv/venv/lib64/python3.8/site-packages/airflow/dag_processing/processor.py", line 893, in save_dag_to_db
    import_errors = DagBag._sync_to_db(dags=dags, processor_subdir=dag_directory, session=session)
  File "/opt/myenv/venv/lib64/python3.8/site-packages/airflow/utils/session.py", line 76, in wrapper
    return func(*args, **kwargs)
  File "/opt/myenv/venv/lib64/python3.8/site-packages/airflow/models/dagbag.py", line 654, in _sync_to_db
    for attempt in run_with_db_retries(logger=log):
  File "/opt/myenv/venv/lib64/python3.8/site-packages/tenacity/__init__.py", line 347, in __iter__
    do = self.iter(retry_state=retry_state)
  File "/opt/myenv/venv/lib64/python3.8/site-packages/tenacity/__init__.py", line 325, in iter
    raise retry_exc.reraise()
  File "/opt/myenv/venv/lib64/python3.8/site-packages/tenacity/__init__.py", line 158, in reraise
    raise self.last_attempt.result()
  File "/opt/rh/rh-python38/root/usr/lib64/python3.8/concurrent/futures/_base.py", line 437, in result
    return self.__get_result()
  File "/opt/rh/rh-python38/root/usr/lib64/python3.8/concurrent/futures/_base.py", line 389, in __get_result
    raise self._exception
  File "/opt/myenv/venv/lib64/python3.8/site-packages/airflow/models/dagbag.py", line 668, in _sync_to_db
    DAG.bulk_write_to_db(dags.values(), processor_subdir=processor_subdir, session=session)
  File "/opt/myenv/venv/lib64/python3.8/site-packages/airflow/utils/session.py", line 76, in wrapper
    return func(*args, **kwargs)
  File "/opt/myenv/venv/lib64/python3.8/site-packages/airflow/models/dag.py", line 3113, in bulk_write_to_db
    session.flush()  # this is required to ensure each dataset has its PK loaded
  File "/opt/myenv/venv/lib64/python3.8/site-packages/sqlalchemy/orm/session.py", line 3449, in flush
    self._flush(objects)
  File "/opt/myenv/venv/lib64/python3.8/site-packages/sqlalchemy/orm/session.py", line 3589, in _flush
    transaction.rollback(_capture_exception=True)
  File "/opt/myenv/venv/lib64/python3.8/site-packages/sqlalchemy/util/langhelpers.py", line 70, in __exit__
    compat.raise_(
  File "/opt/myenv/venv/lib64/python3.8/site-packages/sqlalchemy/util/compat.py", line 211, in raise_
    raise exception
  File "/opt/myenv/venv/lib64/python3.8/site-packages/sqlalchemy/orm/session.py", line 3549, in _flush
    flush_context.execute()
  File "/opt/myenv/venv/lib64/python3.8/site-packages/sqlalchemy/orm/unitofwork.py", line 456, in execute
    rec.execute(self)
  File "/opt/myenv/venv/lib64/python3.8/site-packages/sqlalchemy/orm/unitofwork.py", line 630, in execute
    util.preloaded.orm_persistence.save_obj(
  File "/opt/myenv/venv/lib64/python3.8/site-packages/sqlalchemy/orm/persistence.py", line 245, in save_obj
    _emit_insert_statements(
  File "/opt/myenv/venv/lib64/python3.8/site-packages/sqlalchemy/orm/persistence.py", line 1097, in _emit_insert_statements
    c = connection._execute_20(
  File "/opt/myenv/venv/lib64/python3.8/site-packages/sqlalchemy/engine/base.py", line 1710, in _execute_20
    return meth(self, args_10style, kwargs_10style, execution_options)
  File "/opt/myenv/venv/lib64/python3.8/site-packages/sqlalchemy/sql/elements.py", line 334, in _execute_on_connection
    return connection._execute_clauseelement(
  File "/opt/myenv/venv/lib64/python3.8/site-packages/sqlalchemy/engine/base.py", line 1577, in _execute_clauseelement
    ret = self._execute_context(
  File "/opt/myenv/venv/lib64/python3.8/site-packages/sqlalchemy/engine/base.py", line 1948, in _execute_context
    self._handle_dbapi_exception(
  File "/opt/myenv/venv/lib64/python3.8/site-packages/sqlalchemy/engine/base.py", line 2129, in _handle_dbapi_exception
    util.raise_(
  File "/opt/myenv/venv/lib64/python3.8/site-packages/sqlalchemy/util/compat.py", line 211, in raise_
    raise exception
  File "/opt/myenv/venv/lib64/python3.8/site-packages/sqlalchemy/engine/base.py", line 1885, in _execute_context
    self.dialect.do_executemany(
  File "/opt/myenv/venv/lib64/python3.8/site-packages/sqlalchemy/dialects/mysql/mysqldb.py", line 180, in do_executemany
    rowcount = cursor.executemany(statement, parameters)
  File "/opt/myenv/venv/lib64/python3.8/site-packages/MySQLdb/cursors.py", line 230, in executemany
    return self._do_execute_many(
  File "/opt/myenv/venv/lib64/python3.8/site-packages/MySQLdb/cursors.py", line 261, in _do_execute_many
    rows += self.execute(sql + postfix)
  File "/opt/myenv/venv/lib64/python3.8/site-packages/MySQLdb/cursors.py", line 206, in execute
    res = self._query(query)
  File "/opt/myenv/venv/lib64/python3.8/site-packages/MySQLdb/cursors.py", line 319, in _query
    db.query(q)
  File "/opt/myenv/venv/lib64/python3.8/site-packages/MySQLdb/connections.py", line 254, in query
    _mysql.connection.query(self, query)
sqlalchemy.exc.OperationalError: (MySQLdb.OperationalError) (3140, 'Invalid JSON text: "Invalid value." at position 1262 in value for column \'serialized_dag.data\'.')
[SQL: INSERT INTO serialized_dag (dag_id, fileloc, fileloc_hash, data, data_compressed, last_updated, dag_hash, processor_subdir) VALUES (%s, %s, %s, %s, %s, %s, %s, %s)]
```

### How to reproduce

Unfortunately, I do not have any way to reproduce the issue.

### Operating System

CentOS Linux

### Versions of Apache Airflow Providers

apache-airflow-providers-apache-beam==5.1.1
apache-airflow-providers-celery==3.6.0
apache-airflow-providers-common-sql==1.5.2
apache-airflow-providers-databricks==4.3.0
apache-airflow-providers-ftp==3.4.2
apache-airflow-providers-google==10.2.0
apache-airflow-providers-http==4.4.2
apache-airflow-providers-imap==3.2.2
apache-airflow-providers-jdbc==4.0.0
apache-airflow-providers-microsoft-azure==6.1.2
apache-airflow-providers-mysql==5.1.1
apache-airflow-providers-samba==4.2.1
apache-airflow-providers-sftp==4.3.1
apache-airflow-providers-sqlite==3.4.2
apache-airflow-providers-ssh==3.7.1

### Deployment

Official Apache Airflow Helm Chart

### Deployment details

Google Compute Engine is used to deploy the DAG code. Airflow reads from the DAG folder as configured.

### Anything else?

This recently started happening only in this environment, so I believe it is related to the code changes. However, if the DAGs are processed successfully, this error should ideally not occur.
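
Not part of the original report — if the culprit is a stray control character (for example, one pasted into a doc string or template in the dynamic DAG), a stdlib-only sketch like this could locate it inside the serialized DAG dictionary before it ever reaches MySQL (the `doc` value below is hypothetical):

```python
def find_bad_strings(obj, path="$"):
    """Recursively yield JSON-path-style locations of string values that
    contain characters a strict JSON parser (or MySQL's JSON type) may
    reject, i.e. unescaped control characters other than tab/newline/CR."""
    if isinstance(obj, dict):
        for key, value in obj.items():
            yield from find_bad_strings(value, f"{path}.{key}")
    elif isinstance(obj, list):
        for index, value in enumerate(obj):
            yield from find_bad_strings(value, f"{path}[{index}]")
    elif isinstance(obj, str):
        if any(ord(ch) < 0x20 and ch not in "\t\n\r" for ch in obj):
            yield path

# Hypothetical serialized-DAG fragment with one offending value:
doc = {"dag_id": "example", "tasks": [{"doc_md": "ok"}, {"doc_md": "bad\x00"}]}
print(list(find_bad_strings(doc)))  # ['$.tasks[1].doc_md']
```

Running this over the dict returned by the DAG serializer would point at the offending field, which seems more actionable than the byte offset MySQL reports.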

### Are you willing to submit PR?

- [x] Yes I am willing to submit a PR!

### Code of Conduct

- [x] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)


GitHub link: https://github.com/apache/airflow/discussions/63564
