GitHub user oggers created a discussion: airflow-db-not-allowed error when importing PostgresHook in a Jupyter notebook executed with PapermillOperator

### Apache Airflow Provider(s)

papermill

### Versions of Apache Airflow Providers

apache-airflow-providers-papermill==3.11.3

### Apache Airflow version

3.0.6

### Operating System

Debian GNU/Linux 12 (bookworm)

### Deployment

Other Docker-based deployment

### Deployment details

_No response_

### What happened

I have a Jupyter notebook that contains:

```
from airflow.providers.postgres.hooks.postgres import PostgresHook
```

When executing this notebook with a PapermillOperator, I get the following error:

```
---------------------------------------------------------------------------
ArgumentError                             Traceback (most recent call last)
Cell In[1], line 1
----> 1 from airflow.providers.postgres.hooks.postgres import PostgresHook

File ~/.local/lib/python3.12/site-packages/airflow/__init__.py:79
     73 # Perform side-effects unless someone has explicitly opted out before import
     74 # WARNING: DO NOT USE THIS UNLESS YOU REALLY KNOW WHAT YOU'RE DOING.
     75 # This environment variable prevents proper initialization, and things like
     76 # configs, logging, the ORM, etc. will be broken. It is only useful if you only
     77 # access certain trivial constants and free functions (e.g. `__version__`).
     78 if not os.environ.get("_AIRFLOW__AS_LIBRARY", None):
---> 79     settings.initialize()
     81 # Things to lazy import in form {local_name: ('target_module', 'target_name', 'deprecated')}
     82 __lazy_imports: dict[str, tuple[str, str, bool]] = {
     83     "DAG": (".models.dag", "DAG", False),
     84     "Asset": (".assets", "Asset", False),
   (...)     89     "Dataset": (".sdk.definitions.asset", "Dataset", True),
     90 }

File ~/.local/lib/python3.12/site-packages/airflow/settings.py:618, in initialize()
    616     is_worker = os.environ.get("_AIRFLOW__REEXECUTED_PROCESS") == "1"
    617     if not is_worker:
--> 618         configure_orm()
    619 configure_action_logging()
    621 # mask the sensitive_config_values

File ~/.local/lib/python3.12/site-packages/airflow/settings.py:359, in configure_orm(disable_connection_pool, pool_class)
    353 if SQL_ALCHEMY_CONN.startswith("sqlite"):
    354     # FastAPI runs sync endpoints in a separate thread. SQLite does not allow
    355     # to use objects created in another threads by default. Allowing that in test
    356     # to so the `test` thread and the tested endpoints can use common objects.
    357     connect_args["check_same_thread"] = False
--> 359 engine = create_engine(SQL_ALCHEMY_CONN, connect_args=connect_args, **engine_args, future=True)
    360 async_engine = create_async_engine(SQL_ALCHEMY_CONN_ASYNC, future=True)
    361 AsyncSession = sessionmaker(
    362     bind=async_engine,
    363     autocommit=False,
   (...)    366     expire_on_commit=False,
    367 )

File <string>:2, in create_engine(url, **kwargs)

File ~/.local/lib/python3.12/site-packages/sqlalchemy/util/deprecations.py:375, in deprecated_params.<locals>.decorate.<locals>.warned(fn, *args, **kwargs)
    368     if m in kwargs:
    369         _warn_with_version(
    370             messages[m],
    371             versions[m],
    372             version_warnings[m],
    373             stacklevel=3,
    374         )
--> 375 return fn(*args, **kwargs)

File ~/.local/lib/python3.12/site-packages/sqlalchemy/engine/create.py:514, in create_engine(url, **kwargs)
    511 kwargs.pop("empty_in_strategy", None)
    513 # create url.URL object
--> 514 u = _url.make_url(url)
    516 u, plugins, kwargs = u._instantiate_plugins(kwargs)
    518 entrypoint = u._get_entrypoint()

File ~/.local/lib/python3.12/site-packages/sqlalchemy/engine/url.py:738, in make_url(name_or_url)
    725 """Given a string or unicode instance, produce a new URL instance.
    726 
    727 
   (...)    734 
    735 """
    737 if isinstance(name_or_url, util.string_types):
--> 738     return _parse_url(name_or_url)
    739 else:
    740     return name_or_url

File ~/.local/lib/python3.12/site-packages/sqlalchemy/engine/url.py:799, in _parse_url(name)
    796     return URL.create(name, **components)
    798 else:
--> 799     raise exc.ArgumentError(
    800         "Could not parse SQLAlchemy URL from string '%s'" % name
    801     )

ArgumentError: Could not parse SQLAlchemy URL from string 'airflow-db-not-allowed:///'
```


### What you think should happen instead

It should be possible to use DB hooks and Airflow connections inside a Jupyter notebook executed with the PapermillOperator.
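
For context, the import is only needed so the notebook can later resolve an Airflow connection. A minimal sketch of the intended usage, assuming a placeholder connection id `postgres_default`, would be:

```
# Hypothetical notebook cell illustrating the intended use of the hook.
from airflow.providers.postgres.hooks.postgres import PostgresHook

# Resolve an Airflow connection and run a trivial query.
hook = PostgresHook(postgres_conn_id="postgres_default")
rows = hook.get_records("SELECT 1")
print(rows)
```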

### How to reproduce

Create a notebook with only:

```
from airflow.providers.postgres.hooks.postgres import PostgresHook
```

and execute it with the **PapermillOperator**.

Other hooks, such as **MsSqlHook**, also fail.
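
For reference, a minimal DAG along the lines of the sketch below reproduces it; the dag id, notebook paths and start date are placeholders, not my actual setup:

```
# Minimal reproduction sketch (ids and paths are placeholders).
from datetime import datetime

from airflow import DAG
from airflow.providers.papermill.operators.papermill import PapermillOperator

with DAG(
    dag_id="papermill_postgres_hook_repro",
    start_date=datetime(2025, 1, 1),
    schedule=None,
) as dag:
    PapermillOperator(
        task_id="run_notebook",
        # The notebook contains only the PostgresHook import shown above.
        input_nb="/opt/airflow/dags/import_postgres_hook.ipynb",
        output_nb="/tmp/import_postgres_hook_out.ipynb",
    )
```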

### Anything else

_No response_

### Are you willing to submit PR?

- [ ] Yes I am willing to submit a PR!

### Code of Conduct

- [x] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)


GitHub link: https://github.com/apache/airflow/discussions/56595
