Task logs can be seen in the Airflow UI (http://localhost:28080,
user/password => admin/admin) or in the /opt/airflow/logs folder. Note also
that you will likely have to create the right connection first (you can also
do it via the UI under Admin / Connections).
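One way to pre-create such a connection without clicking through the UI is Airflow's environment-variable convention: a variable named `AIRFLOW_CONN_<CONN_ID>` holding a connection URI is picked up as a connection. A minimal sketch in Python - the `gremlin` scheme, the `gremlin_default` id, and port 8182 are assumptions for illustration, not values taken from the TinkerPop provider docs:

```python
# Sketch: define an Airflow connection via the AIRFLOW_CONN_<CONN_ID>
# environment-variable convention. Scheme "gremlin", id "gremlin_default",
# and port 8182 are illustrative assumptions, not provider-documented values.
import os
from urllib.parse import quote


def gremlin_conn_uri(host, port, login=None, password=None, scheme="gremlin"):
    """Build an Airflow-style connection URI, percent-encoding credentials."""
    auth = ""
    if login:
        auth = quote(login, safe="")
        if password:
            auth += ":" + quote(password, safe="")
        auth += "@"
    return f"{scheme}://{auth}{host}:{port}"


# Exported before the Airflow components start, this would be visible to
# tasks as the connection id "gremlin_default".
os.environ["AIRFLOW_CONN_GREMLIN_DEFAULT"] = gremlin_conn_uri("localhost", 8182)
print(os.environ["AIRFLOW_CONN_GREMLIN_DEFAULT"])  # gremlin://localhost:8182
```

The same connection can of course be created with the `airflow connections` CLI or through Admin / Connections in the UI; the env-var route is just convenient when restarting the tmux components repeatedly.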

On Mon, May 5, 2025 at 10:26 PM Stephen Mallette <spmalle...@gmail.com>
wrote:

> Thanks, I'm not sure if setting --use-airflow-version to 2.10.5 helped,
> but this time when I installed/restarted, the DAG loaded properly.
>
> I'm now trying to get the actual connectivity to Gremlin Server to work -
> still failing when the DAG is executed. Given the simplicity of what I'm
> doing, I'm guessing that the Gremlin driver is simply not connecting to
> Gremlin Server for some reason. That said, I don't see much in the logs to
> help me. I know pretty much every single driver connection problem there is
> for TinkerPop, but the Airflow tmux screens aren't revealing any TinkerPop
> errors - maybe I'm not looking in the right place? Sorry for so many basic
> questions...still extremely new to Airflow/Breeze.
>
>
>
> On Mon, May 5, 2025 at 12:36 PM Jarek Potiuk <ja...@potiuk.com> wrote:
>
> > * Use --use-airflow-version 2.10.5
> > * Actually install the tinkerpop provider RC from PyPI - you need to
> > install it (and you need to restart all the components in tmux) -
> > alternatively, add a startup script to install it:
> >
> > https://github.com/apache/airflow/blob/main/dev/breeze/doc/02_customizing.rst#customizing-breeze-startup
> >
> > On Mon, May 5, 2025 at 6:02 PM Stephen Mallette <spmalle...@gmail.com>
> > wrote:
> >
> > > Changed the subject to not muddy the VOTE thread.
> > >
> > > Removing the --use-airflow-version seemed to help get things going, but
> > > it's still not quite working. The DAG isn't loading properly - I guess
> > > it can't find the TinkerPop provider for some reason:
> > >
> > > Traceback (most recent call last):
> > >   File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
> > >   File "/files/dags/tinkerpop.py", line 7, in <module>
> > >     from airflow.providers.apache.tinkerpop.operators.gremlin import GremlinOperator
> > > ModuleNotFoundError: No module named 'airflow.providers.apache.tinkerpop.operators.gremlin'
> > >
> > > I imagine I just don't quite know the Airflow/Breeze environment well
> > > enough yet. I'll keep trying as time allows, but if anyone has any
> > > helpful hints to move my testing along more quickly, I'm happy to hear
> > > them. Thanks...
> > >
> > > On Mon, May 5, 2025 at 8:19 AM Amogh Desai <amoghdesai....@gmail.com>
> > > wrote:
> > >
> > > > Hello Stephen,
> > > >
> > > > Did you copy-paste the command below?
> > > > breeze start-airflow --use-airflow-version 2.2.4 --python 3.9 --backend postgres \
> > > >     --load-example-dags --load-default-connections
> > > >
> > > > This is likely because the command is probably super outdated now, and
> > > > it internally runs some "removed" airflow commands. Generally speaking,
> > > > `breeze down` and a reinstall should fix it, but I guess the command
> > > > needs to be updated too. Please feel free to open a PR for updating
> > > > that command.
> > > >
> > > > Thanks & Regards,
> > > > Amogh Desai
> > > >
> > > >
> > > > On Mon, May 5, 2025 at 5:06 PM Stephen Mallette <spmalle...@gmail.com>
> > > > wrote:
> > > >
> > > > > I was trying out the contributor instructions for testing the
> > > > > release. I gravitated toward the instructions using Breeze:
> > > > >
> > > > > https://github.com/apache/airflow/blob/main/dev/README_RELEASE_PROVIDERS.md#installing-with-breeze
> > > > >
> > > > > and am getting an error. I assume there's just something wrong with
> > > > > my environment, though I wondered if it were possible that some
> > > > > documentation had gotten out of date? When I run the breeze command
> > > > > noted there I get:
> > > > >
> > > > > airflow command error: argument GROUP_OR_COMMAND: invalid choice:
> > > > > 'users' (choose from 'api-server', 'assets', 'backfill',
> > > > > 'cheat-sheet', 'config', 'connections', 'dag-processor', 'dags',
> > > > > 'db', 'info', 'jobs', 'kerberos', 'plugins', 'pools', 'providers',
> > > > > 'rotate-fernet-key', 'scheduler', 'standalone', 'tasks',
> > > > > 'triggerer', 'variables', 'version'), see help above.
> > > > >
> > > > > Error: check_environment returned 2. Exiting.
> > > > >
> > > > > I tried a few basic things to fix it on my own, like breeze cleanup
> > > > > and self-upgrade for breeze, and tinkered with some of the
> > > > > arguments, but all ended in this fashion. Just curious if anyone
> > > > > knows what might be wrong there?
> > > > >
> > > > >
> > > > >
> > > > > On Mon, May 5, 2025 at 3:38 AM Elad Kalif <elad...@apache.org>
> > > > > wrote:
> > > > >
> > > > > > Hey all,
> > > > > >
> > > > > > I have just cut the ad-hoc wave of Airflow Providers packages.
> > > > > > This email is calling a vote on the release, which will last for
> > > > > > 72 hours - which means that it will end on May 08, 2025 07:35 AM
> > > > > > UTC, provided at least 3 binding +1 votes have been received.
> > > > > >
> > > > > > Consider this my (binding) +1.
> > > > > >
> > > > > > Airflow Providers are available at:
> > > > > > https://dist.apache.org/repos/dist/dev/airflow/providers/
> > > > > >
> > > > > > *apache-airflow-providers-<PROVIDER>-*.tar.gz* are the Python
> > > > > > "sdist" release - they are also the official "sources" for the
> > > > > > Provider distributions.
> > > > > >
> > > > > > *apache_airflow_providers_<PROVIDER>-*.whl are the binary
> > > > > > Python "wheel" release.
> > > > > >
> > > > > > The test procedure for PMC members is described in:
> > > > > >
> > > > > > https://github.com/apache/airflow/blob/main/dev/README_RELEASE_PROVIDERS.md#verify-the-release-candidate-by-pmc-members
> > > > > >
> > > > > > The test procedure for Contributors who would like to test this
> > > > > > RC is described in:
> > > > > >
> > > > > > https://github.com/apache/airflow/blob/main/dev/README_RELEASE_PROVIDERS.md#verify-the-release-candidate-by-contributors
> > > > > >
> > > > > >
> > > > > > Public keys are available at:
> > > > > > https://dist.apache.org/repos/dist/release/airflow/KEYS
> > > > > >
> > > > > > Please vote accordingly:
> > > > > >
> > > > > > [ ] +1 approve
> > > > > > [ ] +0 no opinion
> > > > > > [ ] -1 disapprove with the reason
> > > > > >
> > > > > > Only votes from PMC members are binding, but members of the
> > > > > > community are encouraged to test the release and vote with
> > > > > > "(non-binding)".
> > > > > >
> > > > > > Please note that the version number excludes the 'rcX' string.
> > > > > > This will allow us to rename the artifact without modifying
> > > > > > the artifact checksums when we actually release.
> > > > > >
> > > > > > The status of testing the providers by the community is kept here:
> > > > > > https://github.com/apache/airflow/issues/50189
> > > > > >
> > > > > > The issue is also the easiest way to see important PRs included in
> > > > > > the RC candidates. Detailed changelog for the providers will be
> > > > > > published in the documentation after the RC candidates are
> > > > > > released.
> > > > > >
> > > > > > You can find the RC packages in PyPI following these links:
> > > > > >
> > > > > > https://pypi.org/project/apache-airflow-providers-amazon/9.7.0rc1/
> > > > > > https://pypi.org/project/apache-airflow-providers-apache-tinkerpop/1.0.0rc1/
> > > > > > https://pypi.org/project/apache-airflow-providers-common-messaging/1.0.1rc1/
> > > > > > https://pypi.org/project/apache-airflow-providers-common-sql/1.27.0rc1/
> > > > > > https://pypi.org/project/apache-airflow-providers-snowflake/6.3.0rc1/
> > > > > >
> > > > > > Cheers,
> > > > > > Elad Kalif
> > > > > >
> > > > >
> > > >
> > >
> >
>
