Are the artifacts supposed to include `rc4` in their names? The rc3 artifacts
had it, but the rc4 file names don't appear to contain any RC marker.

On Mon, Aug 7, 2017 at 10:23 PM, Maxime Beauchemin <
maximebeauche...@gmail.com> wrote:

> 1.8.2 RC4 is baked and available at:
> https://dist.apache.org/repos/dist/dev/incubator/airflow, public keys
> are available at https://dist.apache.org/repos/dist/release/incubator/airflow.
>
> apache-airflow-1.8.2+incubating-source.tar.gz
> <https://dist.apache.org/repos/dist/dev/incubator/airflow/1.8.2rc4/apache-airflow-1.8.2+incubating-source.tar.gz>
> is a source release that comes with INSTALL instructions.
>
> Along with it, for convenience, find the binary Python "sdist" as
> apache-airflow-1.8.2+incubating-bin.tar.gz
> <https://dist.apache.org/repos/dist/dev/incubator/airflow/1.8.2rc4/apache-airflow-1.8.2+incubating-bin.tar.gz>
>
> 1.8.2 RC4 is built upon 1.8.1 with the commits listed below on top of it.
> I added the JIRAs that were identified as blockers and targeted 1.8.2. I
> attempted to bring in all of the JIRAs that targeted 1.8.2 but bailed on
> the ones that were generating merge conflicts. I also added all of the
> JIRAs that we've been running in production at Airbnb.
>
> Issues fixed:
> 9a53e66 [AIRFLOW-809][AIRFLOW-1] Use __eq__ ColumnOperator When Testing
> Booleans
> 333e0b3 [AIRFLOW-1296] Propagate SKIPPED to all downstream tasks
> 93825d5 [AIRFLOW-XXX] Re-enable caching for hadoop components
> 33a9dcb [AIRFLOW-XXX] Pin Hive and Hadoop to a specific version and create
> writable warehouse dir
> 7cff6cd [AIRFLOW-1308] Disable nanny usage for Dask
> 570b2ed [AIRFLOW-1294] Backfills can loose tasks to execute
> 3f48d48 [AIRFLOW-1291] Update NOTICE and LICENSE files to match ASF
> requirements
> 69bd269 [AIRFLOW-1160] Update Spark parameters for Mesos
> 9692510 [AIRFLOW 1149][AIRFLOW-1149] Allow for custom filters in Jinja2
> templates
> 6de5330 [AIRFLOW-1119] Fix unload query so headers are on first row
> b4e9eb8 [AIRFLOW-1089] Add Spark application arguments
> a4083f3 [AIRFLOW-1078] Fix latest_runs endpoint for old flask versions
> 7a02841 [AIRFLOW-1074] Don't count queued tasks for concurrency limits
> a2c18a5 [AIRFLOW-1064] Change default sort to job_id for
> TaskInstanceModelView
> d1c64ab [AIRFLOW-1038] Specify celery serialization options explicitly
> b4ee88a [AIRFLOW-1036] Randomize exponential backoff
> 9fca409 [AIRFLOW-993] Update date inference logic
> 272c2f5 [AIRFLOW-1167] Support microseconds in FTPHook modification time
> c7c0b72 [AIRFLOW-1179] Fix Pandas 0.2x breaking Google BigQuery change
> acd0166 [AIRFLOW-1263] Dynamic height for charts
> 7f33f6e [AIRFLOW-1266] Increase width of gantt y axis
> fc33c04 [AIRFLOW-1290] set docs author to 'Apache Airflow'
> 2e9eee3 [AIRFLOW-1282] Fix known event column sorting
> 2389a8a [AIRFLOW-1166] Speed up _change_state_for_tis_without_dagrun
> bf966e6 [AIRFLOW-1192] Some enhancements to qubole_operator
> 57d5bcd [AIRFLOW-1281] Sort variables by key field by default
> 802fc15 [AIRFLOW-1244] Forbid creation of a pool with empty name
> 1232b6a [AIRFLOW-1243] DAGs table has no default entries to show
> b0ba3c9 [AIRFLOW-1227] Remove empty column on the Logs view
> c406652 [AIRFLOW-1226] Remove empty column on the Jobs view
> 51a83cc [AIRFLOW-1199] Fix create modal
> cac7d4c [AIRFLOW-1200] Forbid creation of a variable with an empty key
> 5f3ee52 [AIRFLOW-1186] Sort dag.get_task_instances by execution_date
> f446c08 [AIRFLOW-1145] Fix closest_date_partition function with before set
> to True (if we're looking for the closest date before, we should take the
> latest date in the list of dates before)
> 93b8e96 [AIRFLOW-1180] Fix flask-wtf version for test_csrf_rejection
> bb56805 [AIRFLOW-1170] DbApiHook insert_rows inserts parameters separately
> 093b2f0 [AIRFLOW-1150] Fix scripts execution in sparksql hook
> 777f181 [AIRFLOW-1168] Add closing() to all connections and cursors
>
> As part of the process I updated the release instructions here:
> https://cwiki.apache.org/confluence/display/AIRFLOW/Releasing+Airflow
>
> Max
>
