jedcunningham commented on code in PR #44533:
URL: https://github.com/apache/airflow/pull/44533#discussion_r1900186588


##########
newsfragments/44533.significant.rst:
##########
@@ -0,0 +1,21 @@
+Update conf column in dag_run table type from byte ( that store a python pickle ) to JSON. It is important to note that existing dagrun records will lose their conf data if an offline migration is performed
+
+.. Provide additional contextual information
+
+Column conf of the table dag_run is using the type byte ( and storing a python pickle ) on the database , since airflow only support postgres 12+ and mysql 8+ , we updated it to  json type .

Review Comment:
   ```suggestion
   The ``conf`` column is changing from pickle to json, thus, the values in that column cannot be migrated during offline migrations. If you want to retain ``conf`` values for existing DagRuns, you must do a normal, non-offline, migration.
   ```
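For context, the per-row transformation the online upgrade performs is simply "unpickle, then re-serialize as JSON". A minimal standalone sketch of that step (the sample conf dict is hypothetical, not taken from the PR):

```python
import json
import pickle

# Hypothetical DagRun conf value, as a plain Python dict.
conf = {"key": "value", "retries": 3}

# What the old column stored: a pickled blob of that dict.
pickled_blob = pickle.dumps(conf, protocol=pickle.HIGHEST_PROTOCOL)

# What the upgrade loop does for each non-NULL row: unpickle the blob
# and re-serialize the object as a JSON string for the new column.
restored = pickle.loads(pickled_blob)
as_json = json.dumps(restored)
print(as_json)  # {"key": "value", "retries": 3}
```

Offline migrations only emit SQL, so no Python runs to perform this conversion, which is why the ``conf`` values are lost in that mode.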



##########
newsfragments/44533.significant.rst:
##########
@@ -0,0 +1,21 @@
+Update conf column in dag_run table type from byte ( that store a python pickle ) to JSON. It is important to note that existing dagrun records will lose their conf data if an offline migration is performed
+
+.. Provide additional contextual information
+
+Column conf of the table dag_run is using the type byte ( and storing a python pickle ) on the database , since airflow only support postgres 12+ and mysql 8+ , we updated it to  json type .
+
+.. Check the type of change that applies to this change
+
+* Types of change
+
+  * [ ] DAG changes
+  * [ ] Config changes
+  * [ ] API changes
+  * [ ] CLI changes
+  * [x] Behaviour changes
+  * [ ] Plugin changes
+  * [ ] Dependency change
+
+.. List the migration rules needed for this change (see https://github.com/apache/airflow/issues/41641)
+
+* Migrations rules needed

Review Comment:
   I'd probably remove all of this - this isn't really a breaking change ultimately, but a "normal" significant newsfragment.



##########
airflow/migrations/versions/0055_3_0_0_remove_pickled_data_from_dagrun_table.py:
##########
@@ -0,0 +1,142 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+"""
+remove pickled data from dagrun table.
+
+Revision ID: e39a26ac59f6
+Revises: 38770795785f
+Create Date: 2024-12-01 08:33:15.425141
+
+"""
+
+from __future__ import annotations
+
+import json
+import pickle
+from textwrap import dedent
+
+import sqlalchemy as sa
+from alembic import context, op
+from sqlalchemy import text
+from sqlalchemy.dialects import postgresql
+
+# revision identifiers, used by Alembic.
+revision = "e39a26ac59f6"
+down_revision = "38770795785f"
+branch_labels = None
+depends_on = None
+airflow_version = "3.0.0"
+
+
+def upgrade():
+    """Apply remove pickled data from dagrun table."""
+    conn = op.get_bind()
+    conf_type = sa.JSON().with_variant(postgresql.JSONB, "postgresql")
+    op.add_column("dag_run", sa.Column("conf_json", conf_type, nullable=True))
+
+    if context.is_offline_mode():
+        # Update the dag_run.conf column value to NULL
+        print(
+            dedent("""
+            ------------
+            --  WARNING: Unable to migrate the data in the 'conf' column while in offline mode!
+            --  The 'conf' column will be set to NULL in offline mode.
+            --  Avoid using offline mode if you need to retain 'conf' values.
+            ------------
+            """)
+        )
+    else:
+        BATCH_SIZE = 2
+        offset = 0
+        while True:
+            rows = conn.execute(
+                text(
+                    f"SELECT id,conf FROM dag_run WHERE conf IS not NULL order by id LIMIT {BATCH_SIZE} OFFSET {offset}"
+                )
+            ).fetchall()
+            if not rows:
+                break
+            for row in rows:
+                row_id, pickle_data = row
+
+                try:
+                    original_data = pickle.loads(pickle_data)
+                    json_data = json.dumps(original_data)
+                    conn.execute(text(f"UPDATE dag_run SET conf_json ='{json_data}' WHERE id = {row_id}"))
+                except Exception as e:
+                    print(f"Error converting dagrun conf to json for dagrun ID {row_id}: {e}")
+                    continue
+            offset += BATCH_SIZE
+
+    op.drop_column("dag_run", "conf")
+
+    op.alter_column("dag_run", "conf_json", existing_type=conf_type, new_column_name="conf")
+
+
+def downgrade():
+    """Unapply Remove pickled data from dagrun table."""
+    conn = op.get_bind()
+    conf_type = sa.LargeBinary().with_variant(postgresql.BYTEA, "postgresql")
+    op.add_column("dag_run", sa.Column("conf_pickle", conf_type, nullable=True))
+
+    if context.is_offline_mode():
+        # Update the dag_run.conf column value to NULL
+        print(
+            dedent("""
+            ------------
+            --  WARNING: Unable to migrate the data in the 'conf' column while in offline mode!
+            --  The 'conf' column will be set to NULL in offline mode.
+            --  Avoid using offline mode if you need to retain 'conf' values.
+            ------------
+            """)
+        )
+
+        conn.execute(text("UPDATE dag_run set conf=null WHERE conf IS NOT NULL"))
+    else:
+        BATCH_SIZE = 2
+        offset = 0
+        while True:
+            rows = conn.execute(
+                text(
+                    f"SELECT id,conf FROM dag_run WHERE conf IS not NULL order by id LIMIT {BATCH_SIZE} OFFSET {offset}"
+                )
+            ).fetchall()
+            if not rows:
+                break
+            for row in rows:
+                row_id, json_data = row
+
+                try:
+                    pickled_data = pickle.dumps(json_data, protocol=pickle.HIGHEST_PROTOCOL)
+                    conn.execute(
+                        text("""
+                            UPDATE dag_run
+                            SET conf_pickle = :pickle_data
+                            WHERE id = :id
+                        """),
+                        {"pickle_data": pickled_data, "id": row_id},
+                    )
+                except Exception as e:
+                    print(f"Error processing row ID {row_id}: {e}")

Review Comment:
   ```suggestion
                       print(f"Error pickling dagrun conf for dagrun ID {row_id}: {e}")
   ```
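On a related note, the upgrade loop interpolates ``json_data`` straight into the UPDATE statement with an f-string, which breaks as soon as a conf value contains a single quote; the downgrade's bound-parameter form avoids that. A minimal illustration using stdlib sqlite3 with a throwaway table (illustrative only, not the migration's code):

```python
import json
import sqlite3

# Throwaway in-memory table standing in for dag_run.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dag_run (id INTEGER PRIMARY KEY, conf_json TEXT)")
conn.execute("INSERT INTO dag_run (id) VALUES (1)")

# A conf value containing a single quote would break naive f-string
# interpolation into a '...' SQL literal; bound parameters handle it.
json_data = json.dumps({"note": "it's quoted"})
conn.execute("UPDATE dag_run SET conf_json = ? WHERE id = ?", (json_data, 1))

row = conn.execute("SELECT conf_json FROM dag_run WHERE id = 1").fetchone()
assert json.loads(row[0]) == {"note": "it's quoted"}
```

With SQLAlchemy the equivalent is passing a parameter dict alongside ``text()``, as the downgrade already does.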



##########
airflow/migrations/versions/0055_3_0_0_remove_pickled_data_from_dagrun_table.py:
##########
@@ -0,0 +1,142 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+"""
+remove pickled data from dagrun table.
+
+Revision ID: e39a26ac59f6
+Revises: 38770795785f
+Create Date: 2024-12-01 08:33:15.425141
+
+"""
+
+from __future__ import annotations
+
+import json
+import pickle
+from textwrap import dedent
+
+import sqlalchemy as sa
+from alembic import context, op
+from sqlalchemy import text
+from sqlalchemy.dialects import postgresql
+
+# revision identifiers, used by Alembic.
+revision = "e39a26ac59f6"
+down_revision = "38770795785f"
+branch_labels = None
+depends_on = None
+airflow_version = "3.0.0"
+
+
+def upgrade():
+    """Apply remove pickled data from dagrun table."""
+    conn = op.get_bind()
+    conf_type = sa.JSON().with_variant(postgresql.JSONB, "postgresql")
+    op.add_column("dag_run", sa.Column("conf_json", conf_type, nullable=True))
+
+    if context.is_offline_mode():
+        # Update the dag_run.conf column value to NULL
+        print(
+            dedent("""
+            ------------
+            --  WARNING: Unable to migrate the data in the 'conf' column while in offline mode!
+            --  The 'conf' column will be set to NULL in offline mode.
+            --  Avoid using offline mode if you need to retain 'conf' values.
+            ------------
+            """)
+        )
+    else:
+        BATCH_SIZE = 2
+        offset = 0
+        while True:
+            rows = conn.execute(
+                text(
+                    f"SELECT id,conf FROM dag_run WHERE conf IS not NULL order by id LIMIT {BATCH_SIZE} OFFSET {offset}"
+                )
+            ).fetchall()
+            if not rows:
+                break
+            for row in rows:
+                row_id, pickle_data = row
+
+                try:
+                    original_data = pickle.loads(pickle_data)
+                    json_data = json.dumps(original_data)
+                    conn.execute(text(f"UPDATE dag_run SET conf_json ='{json_data}' WHERE id = {row_id}"))
+                except Exception as e:
+                    print(f"Error converting dagrun conf to json for dagrun ID {row_id}: {e}")
+                    continue
+            offset += BATCH_SIZE
+
+    op.drop_column("dag_run", "conf")
+
+    op.alter_column("dag_run", "conf_json", existing_type=conf_type, new_column_name="conf")
+
+
+def downgrade():
+    """Unapply Remove pickled data from dagrun table."""
+    conn = op.get_bind()
+    conf_type = sa.LargeBinary().with_variant(postgresql.BYTEA, "postgresql")
+    op.add_column("dag_run", sa.Column("conf_pickle", conf_type, nullable=True))
+
+    if context.is_offline_mode():
+        # Update the dag_run.conf column value to NULL
+        print(
+            dedent("""
+            ------------
+            --  WARNING: Unable to migrate the data in the 'conf' column while in offline mode!
+            --  The 'conf' column will be set to NULL in offline mode.
+            --  Avoid using offline mode if you need to retain 'conf' values.
+            ------------
+            """)
+        )
+
+        conn.execute(text("UPDATE dag_run set conf=null WHERE conf IS NOT NULL"))

Review Comment:
   ```suggestion
   ```
   
   (fyi: I won't normally duplicate suggestions if they occur more than once)



##########
newsfragments/44533.significant.rst:
##########
@@ -0,0 +1,21 @@
+Update conf column in dag_run table type from byte ( that store a python pickle ) to JSON. It is important to note that existing dagrun records will lose their conf data if an offline migration is performed

Review Comment:
   ```suggestion
   During offline migration, ``DagRun.conf`` is cleared
   ```



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
