There is absolutely no need to be sorry. Once you have the data
inside your HDFS you can use importtsv to import the data like this:
$ bin/hbase org.apache.hadoop.hbase.mapreduce.ImportTsv \
-Dimporttsv.columns=HBASE_ROW_KEY,cf:b,cf:c <tablename> <hdfs-inputdir>
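A note for your CSV specifically: importtsv expects tab-separated input by default, so for a CSV you have to override the separator, and the columns list must include HBASE_ROW_KEY plus family:qualifier names. Here is a minimal sketch assuming the 'IPINFO' table with column family 'cf' suggested earlier in the thread; the HDFS path is only a hypothetical example, so adjust it to wherever your IPData file actually lives:

```shell
# one-time table creation, from the HBase shell:
#   create 'IPINFO', 'cf'

# import the CSV; the first column (startip) becomes the row key,
# and -Dimporttsv.separator overrides the default tab delimiter
$ bin/hbase org.apache.hadoop.hbase.mapreduce.ImportTsv \
    '-Dimporttsv.separator=,' \
    -Dimporttsv.columns=HBASE_ROW_KEY,cf:endip,cf:countryname \
    IPINFO /user/<your-user>/IPData.csv
```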


Regards,
    Mohammad Tariq


On Thu, Jul 19, 2012 at 5:12 PM, iwannaplay games
<funnlearnfork...@gmail.com> wrote:
> Hi,
>
> I am sorry, I am troubling you a lot.
>
> Thanks for helping me :)
>
> This file is there in my HDFS as IPData (in both csv and tsv format).
> Can I transfer the records from this file and populate this data using
> importtsv?
> If yes, then what source and destination should I write?
>
> Thanks
> Prabhjot
>
>
> On 7/19/12, Mohammad Tariq <donta...@gmail.com> wrote:
>> Looking at the csv, I would suggest creating an HBase table, say
>> 'IPINFO', with one column family, say 'cf', having 3 columns for
>> 'startip', 'endip' and 'countryname' respectively.
>>
>> Regards,
>>     Mohammad Tariq
>>
>>
>> On Thu, Jul 19, 2012 at 4:44 PM, Mohammad Tariq <donta...@gmail.com> wrote:
>>> You have a few options. You can write a Java program using the HBase API
>>> to do that, but you won't be able to exploit the parallelism to its
>>> fullest. Another option is to write a MapReduce program to do the
>>> same. You can also use Hive or Pig to serve the purpose. But if you
>>> are just starting with HBase, then I would suggest first getting
>>> yourself familiar with the HBase API and then going for other things.
>>>
>>> Regards,
>>>     Mohammad Tariq
>>>
>>>
>>> On Thu, Jul 19, 2012 at 4:12 PM, iwannaplay games
>>> <funnlearnfork...@gmail.com> wrote:
>>>> PFA the csv file
>>>>
>>>> Thanks
>>>> Prabhjot
>>>>
>>>>
>>>> On 7/19/12, Mohammad Tariq <donta...@gmail.com> wrote:
>>>>> Could you show me the structure of your csv, if possible?
>>>>>
>>>>> Regards,
>>>>>     Mohammad Tariq
>>>>>
>>>>>
>>>>> On Thu, Jul 19, 2012 at 4:03 PM, iwannaplay games
>>>>> <funnlearnfork...@gmail.com> wrote:
>>>>>> Thanks Tariq
>>>>>>
>>>>>> Now I want to convert this csv file to a table (an HBase table with
>>>>>> column families).
>>>>>> How can I do that?
>>>>>>
>>>>>> Regards
>>>>>> Prabhjot
>>>>>>
>>>>>>
>>>>>> On 7/19/12, Mohammad Tariq <donta...@gmail.com> wrote:
>>>>>>> Hi Prabhjot
>>>>>>>
>>>>>>>       You can also use:
>>>>>>> hadoop fs -put <src fs path> <destn hdfs path>
>>>>>>>
>>>>>>> Regards,
>>>>>>>     Mohammad Tariq
>>>>>>>
>>>>>>>
>>>>>>> On Thu, Jul 19, 2012 at 3:52 PM, Bejoy Ks <bejoy.had...@gmail.com>
>>>>>>> wrote:
>>>>>>>> Hi Prabhjot
>>>>>>>>
>>>>>>>> Yes, just use the filesystem commands:
>>>>>>>> hadoop fs -copyFromLocal <src fs path> <destn hdfs path>
>>>>>>>>
>>>>>>>> Regards
>>>>>>>> Bejoy KS
>>>>>>>>
>>>>>>>> On Thu, Jul 19, 2012 at 3:49 PM, iwannaplay games
>>>>>>>> <funnlearnfork...@gmail.com> wrote:
>>>>>>>>> Hi,
>>>>>>>>>
>>>>>>>>> I am unable to use Sqoop and want to load data into HDFS for testing.
>>>>>>>>> Is there any way by which I can load my csv or text file into the
>>>>>>>>> Hadoop file system directly, without writing code in Java?
>>>>>>>>>
>>>>>>>>> Regards
>>>>>>>>> Prabhjot
>>>>>>>
>>>>>
>>
