A dedicated task at the front of the DAG could handle this pipeline-level environment setup.
# About the "pipeline environment setup"
In my case, I am trying to expose some pipeline variables via XCom so that
tasks can read them at runtime, and I want this "expose some pipeline
variables" step to be done up front.
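One way to sketch the idea above: a setup task that pushes the pipeline-level variables via XCom so downstream tasks can pull them. This is an illustrative sketch only, not from the thread; the names (build_pipeline_env, setup_env) and the sample values are hypothetical.

```python
# Hypothetical sketch: a front-of-pipeline task callable that exposes
# pipeline variables via XCom. Intended for use with Airflow's
# PythonOperator(provide_context=True); kept framework-free here so the
# pushing logic is visible.

def build_pipeline_env():
    """Collect the pipeline-level variables to expose (illustrative values)."""
    return {
        "run_env": "staging",
        "data_root": "/data/in",
    }

def setup_env(**context):
    # Airflow passes the task instance as context["ti"]; each variable is
    # pushed as its own XCom key so downstream tasks can pull them
    # individually with ti.xcom_pull(task_ids="setup_env", key=...).
    ti = context["ti"]
    for key, value in build_pipeline_env().items():
        ti.xcom_push(key=key, value=value)
```

A downstream task would then read, e.g., `ti.xcom_pull(task_ids="setup_env", key="run_env")`.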
As Scott pointed out, we at Astronomer.io have taken the lightweight image
philosophy to heart with our docker images. Our base image inherits from
Alpine
(https://github.com/astronomerio/astronomer) and our Airflow image (
You may consider this base image we put together at Blue Apron. My fork
fixes a build issue by pinning to pip < 10.
https://github.com/joenap/airflow-base
Joe Nap
On Mon, May 14, 2018 at 4:37 PM, Daniel Imberman
wrote:
> @Fokko
>
> I definitely agree with that. I
@Fokko
I definitely agree with that. I think that having a "super lightweight"
image for just running a basic airflow instance makes sense. We could even
name the image something like airflow-k8s so people know it's ONLY meant
to work in a k8s cluster. I'm trying to figure out what methods
Hi Daniel,
My dear colleague from GoDataDriven, Bas Harenslak, has started building an
official Docker image on Docker Hub. I've put him in the CC. In the end I
strongly believe the image should end up in the official Docker
repository: https://github.com/docker-library/official-images
AstronomerIO has done quite a bit of work on this:
https://github.com/astronomerio/astronomer
https://open.astronomer.io/airflow/index.html
On May 14, 2018, 1:08 PM -0700, Daniel Imberman wrote:
> Hi everyone,
>
> I've started looking into creating an official
@Erik Erlandson has had conversations about publishing
docker images with the ASF Legal team.
Adding him to the thread.
On Mon, May 14, 2018 at 1:07 PM Daniel Imberman
wrote:
> Hi everyone,
>
> I've started looking into creating an official airflow
Hi everyone,
I've started looking into creating an official airflow docker container
s.t. users of the KubernetesExecutor could auto-pull from helm
charts/deployment yamls/etc. I was wondering what everyone thinks the best
way to do this would be? Is there an official apache docker repo? Is there
Hi Jørn,
Great that you are test-driving it. It has no effect if inlets and outlets
are not used, so although experimental it will end up in 1.10. It does need
more iterations, as some very valuable points have been made, and I am
working to see how those can be integrated properly.
On Sat, 5 May 2018 at 23.49, Bolke de Bruin wrote:
> Hi All,
> I have made a first implementation that allows tracking of lineage in
> Airflow and integration with Apache Atlas.
Snip
>
>
> I’m looking forward to your comments!
>
>
Hi all,
I see that the spark submit operator [0] doesn't support the
properties-file [1] parameter.
I need to resolve properties from a file for my job, and right now I'm
reading the properties line by line from the file and building the conf
property for spark-submit. This is not user-friendly.
Does the community have objections
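The line-by-line workaround described above can be sketched as a small helper that parses a Java-style properties file into the `conf` dict that SparkSubmitOperator already accepts. `parse_properties` is an illustrative name, not part of Airflow.

```python
# Minimal sketch of the workaround: turn `key=value` lines from a
# properties file into the conf dict for SparkSubmitOperator.

def parse_properties(text):
    """Parse key=value lines, skipping blank lines and # comments."""
    conf = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        conf[key.strip()] = value.strip()
    return conf

# Usage (illustrative):
#   with open("job.properties") as f:
#       conf = parse_properties(f.read())
#   SparkSubmitOperator(task_id="submit", conf=conf, ...)
```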