One option I would suggest is to look into Apache Apex
<http://apex.apache.org/>. It has good ingestion support.

Apex supports many ingestion use cases, such as copying data from HDFS to
different sinks like other file systems, databases, messaging systems, etc.
As far as I know, it does not support Neo4j directly.
Still, the HDFS -> Neo4j use case looks simple to achieve; you would only
have to do some coding to write to Neo4j. Here
<https://github.com/DataTorrent/app-templates/tree/master/hdfs-to-kafka-sync>
is an example for ingesting data from HDFS into Kafka. You can achieve your
use case by replacing the Kafka writer with a Neo4j writer, along the lines
of the sketch below.
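To make that concrete, here is a minimal sketch of what such a custom Neo4j
output operator could look like. This is not code from the app template; the
Bolt URI, the credentials, the Record label, and the one-node-per-line Cypher
statement are all assumptions you would adapt to your own data model. It uses
the official Neo4j Java driver (org.neo4j.driver.v1, i.e. Neo4j 3.x).

import com.datatorrent.api.Context.OperatorContext;
import com.datatorrent.api.DefaultInputPort;
import com.datatorrent.common.util.BaseOperator;

import org.neo4j.driver.v1.AuthTokens;
import org.neo4j.driver.v1.Driver;
import org.neo4j.driver.v1.GraphDatabase;
import org.neo4j.driver.v1.Session;
import org.neo4j.driver.v1.Values;

public class Neo4jOutputOperator extends BaseOperator
{
  // Connection settings; assumed defaults, can be set from the app properties.
  private String boltUri = "bolt://localhost:7687";
  private String user = "neo4j";
  private String password = "neo4j";

  private transient Driver driver;
  private transient Session session;

  // Receives one line of the HDFS file per tuple from the upstream reader.
  public final transient DefaultInputPort<String> input = new DefaultInputPort<String>()
  {
    @Override
    public void process(String line)
    {
      // Hypothetical mapping: each input line becomes a Record node.
      session.run("MERGE (r:Record {value: $value})",
                  Values.parameters("value", line));
    }
  };

  @Override
  public void setup(OperatorContext context)
  {
    // Open the Neo4j connection once when the operator is deployed.
    driver = GraphDatabase.driver(boltUri, AuthTokens.basic(user, password));
    session = driver.session();
  }

  @Override
  public void teardown()
  {
    if (session != null) {
      session.close();
    }
    if (driver != null) {
      driver.close();
    }
  }

  // Setters so the values can be injected from the application configuration.
  public void setBoltUri(String boltUri) { this.boltUri = boltUri; }
  public void setUser(String user) { this.user = user; }
  public void setPassword(String password) { this.password = password; }
}

In the application DAG you would then connect the HDFS line reader's output
port to this operator's input port, in place of the Kafka output operator
used in the template.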

Let me know if you are interested in more details.

-Priyanka

On Tue, Jan 10, 2017 at 4:10 PM, unmesha sreeveni <[email protected]>
wrote:

> Hi,
>
>  I have my input file in HDFS. How can I store that data in a Neo4j DB? Is
> there any way to do the same?
>
> --
> Thanks & Regards
>
>
> Unmesha Sreeveni U.B
> Hadoop, Bigdata Developer
> Centre for Cyber Security | Amrita Vishwa Vidyapeetham
> http://www.unmeshasreeveni.blogspot.in/
>
>
>
