Thanks Andrew and Max. I explored the docs and various code snippets shared
by other developers, and I have managed to use a custom hook in my DAG.

Regarding contributing back to the project, I would love to. I have only
explored the basic parts of the Adobe Analytics API so far; I will add more
functionality and will probably have a PR in the near future.

My next step is to complete my custom hook for boto3 (for AWS).
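A minimal sketch of what such a boto3-backed hook might look like (the class
name, parameters, and structure here are my own illustration, not actual
code from this thread):

```python
class AwsHookSketch(object):
    """Illustrative boto3-backed hook; names and shape are assumptions.

    The boto3 import is deferred to get_conn() so the DAG file can still
    be parsed on machines where boto3 is not installed.
    """

    def __init__(self, aws_access_key_id, aws_secret_access_key,
                 region_name="us-east-1"):
        self.aws_access_key_id = aws_access_key_id
        self.aws_secret_access_key = aws_secret_access_key
        self.region_name = region_name

    def get_conn(self, service="s3"):
        # Deferred import: boto3 is only needed when the task actually runs.
        import boto3
        return boto3.client(
            service,
            region_name=self.region_name,
            aws_access_key_id=self.aws_access_key_id,
            aws_secret_access_key=self.aws_secret_access_key,
        )
```

A task would then call `get_conn()` inside its callable, keeping the
credentials and client setup out of the DAG definition itself.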

For other folks, here are the steps that I followed:
1. Created a new folder within the DAGs folder. I am not sure whether I
could have put the hook in the plugins folder instead; I will explore that
option at a later stage.
2. Put my Python file containing the custom hook code in the new folder.
3. Put my DAG file in the DAGs folder.
4. Restarted the webserver (although I am not sure whether this step played
a role). I did this because I had imported a new Python library.
5. Started a worker with the command: airflow worker
6. Started the scheduler with the command: airflow scheduler
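For step 2, the hook itself can be quite small. A sketch of the shape mine
takes (the class name, endpoint URL, and method names below are
illustrative, not my actual code):

```python
class AdobeAnalyticsHookSketch(object):
    """Illustrative custom hook; not the actual code from this thread.

    Assumes the Adobe Analytics 1.4-style REST API, where calls are routed
    through a single endpoint with a ?method= query parameter.
    """

    BASE_URL = "https://api.omniture.com/admin/1.4/rest/"  # assumed endpoint

    def __init__(self, api_username, api_secret):
        self.api_username = api_username
        self.api_secret = api_secret

    def method_url(self, method):
        """Build the URL for one API method, e.g. Report.Queue."""
        return "{0}?method={1}".format(self.BASE_URL, method)
```

The DAG file then just imports this class from the new folder and uses it
inside a task callable.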

End result: I could see my DAG in the UI, and I could see the data being
fetched from the API.
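For reference, the on-disk layout these steps produce looks roughly like
this (the file and folder names are illustrative):

```
dags/
├── my_adobe_dag.py              # the DAG file (step 3)
└── adobe_common/                # the new folder (step 1)
    ├── __init__.py              # makes the folder an importable package
    └── adobe_analytics_hook.py  # the custom hook code (step 2)
```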

Thanks & Regards,
Vikas


On Fri, Sep 23, 2016 at 2:30 PM, Maxime Beauchemin <
[email protected]> wrote:

> Any reason why you want to package it as a plugin?
>
> It's pretty easy to just have your custom hooks and operators live
> alongside your pipelines, maybe in a `common` folder. In your pipeline file
> you just import what you need, relative to your file, from the same repo.
> Just don't forget the `__init__.py` files that make a folder a package in
> python so that the import statements can work.
>
> If you want to contribute it back to the project, send a PR that adds it to
> the `contrib/hooks` folder.
>
> Max
>
> On Fri, Sep 23, 2016 at 8:24 AM, Andrew Phillips <[email protected]>
> wrote:
>
>> Are the above steps correct or am I missing something?
>>>
>>
>> Is something not working as expected, e.g. are you unable to *use* the
>> plugin from your DAGs? If so, could you provide a few more details on the
>> error?
>>
>> Regards
>>
>> ap
>>
>
>