Add a few examples showing --airflow-extras use with breeze (apache#33341)

Using start-airflow with a released version of Airflow from PyPI only
installs Airflow with the preinstalled providers by default. This, for
example, causes a failure when you want to install 2.7.0rc1 and
use the Celery executor when running "start-airflow". It's easy to
make it work by adding `--airflow-extras celery`, but that was not
obvious.
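
For example (assuming the 2.7.0rc1 candidate is available on PyPI), the
combination works once the extra is added:

    breeze start-airflow --use-airflow-version 2.7.0rc1 --executor CeleryExecutor --airflow-extras celery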

This PR adds a few examples in relevant places of the BREEZE, release
and TESTING documentation to surface that you can add extras and
that the "celery" extra should be used with the Celery Executor when you
run `--use-airflow-version` with 2.7.0+.

Also removes some old, wrong references to the non-existing
`--use-airflow-pypi-version` flag that was used for Airflow 1.10.
potiuk authored Aug 12, 2023
1 parent d2c0bbe commit d69ffaf
Showing 3 changed files with 28 additions and 9 deletions.
19 changes: 17 additions & 2 deletions BREEZE.rst
@@ -572,13 +572,28 @@ When you are starting airflow from local sources, www asset compilation is autom
breeze --python 3.8 --backend mysql start-airflow
You can also use it to start a different executor.

.. code-block:: bash

    breeze start-airflow --executor CeleryExecutor

You can also use it to start any released version of Airflow from ``PyPI`` with the
``--use-airflow-version`` flag - useful for testing and looking at issues raised for a specific version.

.. code-block:: bash

    breeze start-airflow --python 3.8 --backend mysql --use-airflow-version 2.7.0

When you are installing a version from PyPI, it's also possible to specify extras that should be used
when installing Airflow - you can provide several extras separated by commas - for example to install
providers together with the Airflow version you are installing. For example, when you are using the Celery
executor in Airflow 2.7.0+ you need to add the ``celery`` extra.

.. code-block:: bash

    breeze start-airflow --use-airflow-version 2.7.0 --executor CeleryExecutor --airflow-extras celery
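
For reference, the value of ``--airflow-extras`` uses the same comma-separated syntax as
regular pip extras. Conceptually (a simplified sketch, not the exact command that breeze runs),
the example above corresponds to an installation like:

.. code-block:: bash

    pip install "apache-airflow[celery]==2.7.0"
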
These are all the available flags of the ``start-airflow`` command:

6 changes: 0 additions & 6 deletions TESTING.rst
@@ -1908,12 +1908,6 @@ to test them.
The DAGs can be run in the main version of Airflow but they also work
with older versions.

To run the tests for Airflow 1.10.* series, you need to run Breeze with
``--use-airflow-pypi-version=<VERSION>`` to re-install a different version of Airflow.

You should also consider running it with ``restart`` command when you change the installed version.
This will clean-up the database so that you start with a clean DB and not DB installed in a previous version.
So typically you'd run it like ``breeze --use-airflow-pypi-version=1.10.9 restart``.

Tracking SQL statements
=======================
12 changes: 11 additions & 1 deletion dev/README_RELEASE_AIRFLOW.md
@@ -683,9 +683,19 @@ There is also an easy way of installation with Breeze if you have the latest sou
Running the following command will use tmux inside breeze, create an `admin` user and run the Webserver & Scheduler:

```shell script
breeze start-airflow --use-airflow-version 2.7.0rc1 --python 3.8 --backend postgres
```

You can also choose different executors and extras to install when you are installing Airflow this way. For
example, in order to run Airflow with the CeleryExecutor and install the celery, google and amazon providers (as of
Airflow 2.7.0, you need to have the celery provider installed to run Airflow with the CeleryExecutor) you can run:

```shell script
breeze start-airflow --use-airflow-version 2.7.0rc1 --python 3.8 --backend postgres \
--executor CeleryExecutor --airflow-extras "celery,google,amazon"
```


Once you install and run Airflow, you should perform any verification you see as necessary to check
that Airflow works as you expect.
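
For example (a minimal sketch of such a check - it assumes the commands are run inside the tmux
session started by breeze), you can confirm the installed version and providers:

```shell script
airflow version
airflow providers list
```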

