Hi All,
My input data looks like the line below, |-delimited, and I want to extract appid,
appname, bundleid, etc. Please help me create a Hive table for it.
|0|{"appid":"8","appname":"CONVX-0008","bundleid":"com.zeptolab.timetravel.free.google"}|14|
--
Thanks,
Kishore.
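One way to approach this is to load each |-delimited line as plain string columns and pull the JSON fields out with Hive's built-in get_json_object. A minimal sketch, assuming the data sits at a hypothetical HDFS path and that the quotes in the file are real `"` characters (if they are literal `\x22` sequences they would need a regexp_replace first); table and path names are illustrative, not from the original post:

```sql
-- Leading and trailing | produce empty edge fields, so the line
-- |0|{json}|14| splits into: "", "0", "{json}", "14"
CREATE EXTERNAL TABLE raw_events (
  lead_empty STRING,   -- empty field before the first |
  seq        STRING,   -- the "0"
  payload    STRING,   -- the JSON object
  tail       STRING    -- the "14"
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '|'
LOCATION '/user/hive/raw_events';  -- hypothetical path

SELECT get_json_object(payload, '$.appid')    AS appid,
       get_json_object(payload, '$.appname')  AS appname,
       get_json_object(payload, '$.bundleid') AS bundleid
FROM raw_events;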
Try
*hadoop fs -rmr /tmp*
--
*Thanks Regards *
*Unmesha Sreeveni U.B*
*Hadoop, Bigdata Developer*
*Center for Cyber Security | Amrita Vishwa Vidyapeetham*
http://www.unmeshasreeveni.blogspot.in/
config.set("fs.defaultFS", "hdfs://port/");
config.set("hadoop.job.ugi", "hdfs");
On Fri, Apr 25, 2014 at 10:46 PM, Oleg Zhurakousky
oleg.zhurakou...@gmail.com wrote:
Yes, it will be copied, since it goes into each job's namespace
On Fri, Apr 25, 2014 at 1:14 PM, Steve Lewis
check this link:
http://www.unmeshasreeveni.blogspot.in/2014/04/hadoop-installation-for-beginners.html
and
http://www.unmeshasreeveni.blogspot.in/2014/04/hadoop-wordcount-example-in-detail.html
On Fri, Apr 25, 2014 at 5:29 PM, Shahab Yunus shahab.yu...@gmail.com wrote:
Assuming you are talking
In order to get started with Hadoop you need to install Cygwin (which provides
a Linux-like look and feel on Windows),
or else you can run Ubuntu in VMware Player.
Once you have done this,
you can download Hadoop directly from Apache or from other vendors,
and follow these steps:
Are you asking about standalone mode where we run hadoop using local fs?
Oracle VirtualBox, and the Hortonworks Sandbox for VirtualBox, also work.
Watch YouTube videos on the steps.
From: unmesha sreeveni unmeshab...@gmail.com
To: User Hadoop user@hadoop.apache.org
Sent: Wednesday, April 30, 2014 6:51 AM
Subject: Re: How do I get
Hi Sam,
Bryan meant the last config bit:
<property>
  <name>dfs.client.failover.proxy.provider.gilbert-prod</name>
  <value>org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider</value>
</property>
This is the class the client will use to perform the failover (i.e., to
determine the active NN
Has anyone experienced something like this before?
2014-04-26 12:03 GMT-03:00 Murillo Flores muri...@chaordic.com.br:
Hello everyone,
I'm trying to run an MR job on Hadoop 2.2.0 with the input files coming
from S3. One of these input files is an empty file (zero byte length) and
because of
Any suggestions?
-
Hello,
I am having an issue with partitioning data between mappers and reducers when the
key is numeric. When I switch it to a one-character string it works fine, but I
have more than 26 keys, so I'm looking for an alternative way.
My data look like:
10 \t comment10 \t data
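For numeric keys, a custom Partitioner that parses the key usually spreads records more evenly than single-character workarounds. This is a minimal sketch of the getPartition logic in plain Java (class and method names are illustrative; in a real job this logic would live inside a class extending org.apache.hadoop.mapreduce.Partitioner):

```java
// Sketch: map a numeric string key such as "10" onto a partition index.
public class NumericKeyPartitioner {
    // Returns a partition in [0, numPartitions) for a numeric key.
    public static int partitionFor(String key, int numPartitions) {
        int k = Integer.parseInt(key.trim());
        // floorMod keeps the result non-negative even for negative keys
        return Math.floorMod(k, numPartitions);
    }

    public static void main(String[] args) {
        System.out.println(partitionFor("10", 4)); // 10 mod 4 = 2
    }
}
```

The same idea works with the default HashPartitioner if the key is emitted as an IntWritable rather than Text, since the numeric value then drives the hash directly.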
Hi,
I wasn't able to find any information regarding the security of using distcp to
transfer data to S3 using something like hadoop distcp hdfs://...
s3n://access-key:secret-key@bucket/
I know for example in the Python library boto you can specify is_secure=True to
use SSL, but it is not
Can I do this while the job is still running? I know I can't delete the
directory, but I just want confirmation that the data Hadoop writes into
/tmp/hadoop-df/nm-local-dir (df being my user name) can be discarded while
the job is being executed.
On Wed, Apr 30, 2014 at 6:40 AM, unmesha
Hi,
I see these two knobs in the Fair Scheduler: 'assignMultiple'
and 'continuous scheduling'.
1. Are there performance benefits to using them? What are the cons?
2. Also, is there any problem with 'continuous scheduling'? I'm asking
because it is not mentioned in the FS doc.
--
Thanks,
Ashwin
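For reference, the two knobs map to yarn-site.xml properties. A sketch with the Hadoop 2.x property names as I remember them from the FairScheduler docs; verify against your release before relying on them:

```xml
<!-- yarn-site.xml: FairScheduler knobs mentioned above (names are an
     assumption from the 2.x docs; check your version's documentation) -->
<property>
  <name>yarn.scheduler.fair.assignmultiple</name>
  <value>true</value> <!-- allow multiple container assignments per heartbeat -->
</property>
<property>
  <name>yarn.scheduler.fair.continuous-scheduling-enabled</name>
  <value>true</value> <!-- schedule decoupled from node heartbeats -->
</property>
```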
Sharing a little experience getting Hadoop working on Windows...
I needed to build the native binaries (winutils.exe, hadoop.dll) from the Hadoop
source code and copy them to the Hadoop bin directory. The Maven pom project
cannot build these binaries well on Windows; I had to manually open the
sln
Hi,
I was using Hadoop 2.2.0 to build my application, and I was using the
HttpConfig.getSchemePrefix() API call. When I updated Hadoop to 2.4.0, the
compilation fails for my application, and I see that the HttpConfig
(org.apache.hadoop.http.HttpConfig) APIs have changed.
How do I get the scheme prefix in
Hi,
Can you describe your use cases, that is, how the prefix is used? Usually
you can get around it by generating scheme-relative URLs, which start with
//.
~Haohui
On Wed, Apr 30, 2014 at 2:31 PM, Gaurav Gupta gaurav.gopi...@gmail.com wrote:
Hi,
I was using hadoop 2.2.0 to build my
I am trying to get the container logs URL, and here is the code snippet:
containerLogsUrl = HttpConfig.getSchemePrefix() +
this.container.nodeHttpAddress + "/node/containerlogs/" + id + "/" +
System.getenv(ApplicationConstants.Environment.USER.toString());
Thanks
Gaurav
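Following Haohui's suggestion of scheme-relative URLs, one way to drop the HttpConfig.getSchemePrefix() dependency is to hard-code the "//" prefix, letting the browser pick http or https. A sketch, with the parameters standing in for the values used in the snippet above:

```java
// Sketch: build a scheme-relative container-logs URL ("//host:port/...")
// without HttpConfig.getSchemePrefix(), which changed between 2.2 and 2.4.
public class ContainerLogsUrl {
    // nodeHttpAddress, containerId, and user mirror the snippet above.
    public static String logsUrl(String nodeHttpAddress, String containerId, String user) {
        return "//" + nodeHttpAddress + "/node/containerlogs/" + containerId + "/" + user;
    }

    public static void main(String[] args) {
        System.out.println(logsUrl("node1:8042", "container_1_0001_01_000001", "gaurav"));
    }
}
```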
On Wed, Apr 30, 2014 at
Can you just try this and see:
Configuration conf = new Configuration();
FileSystem fs = FileSystem.get(conf);
fs.deleteOnExit(new Path("path/to/tmp"));
On Thu, May 1, 2014 at 12:10 AM, S.L simpleliving...@gmail.com wrote:
Can I do this while the job is still running? I know I can't delete the
directory