Hi Jia, thank you very much for your questions. Sqoop is designed as a batch tool to transfer data from database and warehouse systems into the Hadoop ecosystem and vice versa. Right now, Sqoop supports only JDBC-compliant databases; this restriction will, however, go away with Sqoop 2.
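For reference, a typical batch import from a JDBC-compliant database looks roughly like the sketch below. The connection string, credentials, table, and paths are hypothetical placeholders, not values from any real environment:

```shell
# Import the "orders" table from a MySQL database into HDFS.
# All connection details below are placeholders.
sqoop import \
  --connect jdbc:mysql://db.example.com:3306/sales \
  --username sqoop_user \
  --password-file /user/sqoop/.db_password \
  --table orders \
  --target-dir /data/sales/orders \
  --num-mappers 4
```

Each run is a one-shot MapReduce job; Sqoop does not watch the source for new rows, which is why an online ingest tool is a better fit for continuously arriving data.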
Based on your questions, it seems to me that you're looking for an online ingest system rather than a batch one. In that case, I would recommend checking out the Apache Flume project [1], which aims to address online data ingestion.

Links:
1: http://flume.apache.org/

On Wed, Nov 28, 2012 at 02:19:57PM +0800, jia jimin wrote:
> Hi there,
>
> I am investigating Sqoop on Windows for importing data to HDFS and have
> some questions:
>
> 1. Does Sqoop support importing non-relational data such as event logs or
> text files to HDFS?
>
> 2. If our client machines change frequently (we recycle old machines and add
> new machines), can Sqoop automatically import data by changing some
> configurations dynamically?
>
> Thanks for looking at these questions!
>
> Regards
> Benjamin
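To give a feel for the Flume side, here is a minimal single-agent configuration sketch that tails an application log and writes events to HDFS. The agent name, file paths, and HDFS URL are hypothetical placeholders:

```
# flume.conf -- hypothetical single-agent setup
agent1.sources = tail-src
agent1.channels = mem-ch
agent1.sinks = hdfs-sink

# Source: follow an application log file as it grows
agent1.sources.tail-src.type = exec
agent1.sources.tail-src.command = tail -F /var/log/app/events.log
agent1.sources.tail-src.channels = mem-ch

# Channel: buffer events in memory between source and sink
agent1.channels.mem-ch.type = memory
agent1.channels.mem-ch.capacity = 10000

# Sink: deliver buffered events into HDFS
agent1.sinks.hdfs-sink.type = hdfs
agent1.sinks.hdfs-sink.hdfs.path = hdfs://namenode/data/events/
agent1.sinks.hdfs-sink.channel = mem-ch
```

Unlike a Sqoop job, such an agent runs continuously, so newly written log lines flow into HDFS without re-running anything.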
