Please run:
ls -arlt /usr/bin/java
On Wed, Jun 3, 2015 at 2:40 PM, Kadir Beyazlı kadirbeya...@gmail.com
wrote:
Hi,
Also
kadir@kbeyaz:~/hadoop-2.7.0$ which java
/usr/bin/java
On Wed, Jun 3, 2015 at 10:39 PM, Kadir Beyazlı kadirbeya...@gmail.com
wrote:
Hi Ilker,
Fair enough; however, we use Hadoop every day, and I cannot explain
the advantages over a basic email...
Java home is an environment variable; let's take this offline, as the entire
user group does not need to be bothered with basics. Send me an email at
iozkay...@gmail.com
Let's
On Jun 3, 2015 4:03 PM,
What do you have for your LD_LIBRARY_PATH??
On Wed, Jun 3, 2015 at 2:02 PM, Kadir Beyazlı kadirbeya...@gmail.com
wrote:
…command I got the following
Hi Ilker,
It is empty. Should I set it to a value?
On Wed, Jun 3, 2015 at 10:09 PM, Ilker Ozkaymak iozkay...@gmail.com wrote:
What do you have for your LD_LIBRARY_PATH??
On Wed, Jun 3, 2015 at 2:02 PM, Kadir Beyazlı kadirbeya...@gmail.com
wrote:
…command I got the following
--
Kadir
One more question: would you please run
/usr/bin/java/bin/java -version
and
ls -arlt /usr/bin/java/bin/java
On Wed, Jun 3, 2015 at 2:31 PM, Kadir Beyazlı kadirbeya...@gmail.com
wrote:
…checked all variables at line 171, all of t…
Hi All,
I am new to Hadoop.
I started working from the following link:
http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/SingleCluster.html
There is an example there that I am applying.
The information you pasted about java is absolutely correct; that is why I
was asking you to find the actual java home.
Again, you have some sort of a pathing issue with your variables, most
likely in the hadoop env file...
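A minimal way to find the actual java home on a GNU/Linux box (the printed path below is illustrative, not taken from this thread):

    readlink -f "$(which java)"
    # e.g. /usr/lib/jvm/java-8-openjdk-amd64/jre/bin/java
    # the java home is that path minus the trailing /bin/java
    # (or minus /jre/bin/java if you want the JDK root)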
On Wed, Jun 3, 2015 at 3:20 PM, Kadir Beyazlı kadirbeya...@gmail.com
wrote:
Hi Ilker,
I will try to find the error.
I started Hadoop today, beginning from the first lesson of the Hadoop official page.
The only command about setting the hadoop env so far is for JAVA_HOME, so if
it is because of env variables I must say that the Hadoop official page is not
user friendly.
Regarding
Hi Ilker,
kadir@kbeyaz:~/hadoop-2.7.0$ /usr/bin/java/bin/java -version
bash: /usr/bin/java/bin/java: Not a directory
kadir@kbeyaz:~/hadoop-2.7.0$ ls -arlt /usr/bin/java/bin/java
ls: cannot access /usr/bin/java/bin/java: Not a directory
On Wed, Jun 3, 2015 at 10:35 PM, Ilker Ozkaymak
Hi,
kadir@kbeyaz:~/hadoop-2.7.0$ ls -arlt /usr/bin/java
lrwxrwxrwx 1 root root 22 Jan 27 11:42 /usr/bin/java -> /etc/alternatives/java
Also;
kadir@kbeyaz:~/hadoop-2.7.0$ java -version
openjdk version "1.8.0_40-internal"
OpenJDK Runtime Environment (build 1.8.0_40-internal-b09)
OpenJDK 64-Bit
But I first tried to find where my java is with the following command:
kadir@kbeyaz:~/hadoop-2.7.0$ which java
/usr/bin/java
Then I set JAVA_HOME to /usr/bin/java and edited etc/hadoop/hadoop-env.sh.
I did not write /usr/bin/java/bin/java anywhere, and I am trying to understand
how it found that path while
So your actual java_home is under /etc/alternatives; it is not
/usr/bin/java/bin/java.
The second path doesn't exist, which is why it is erroring: /usr/bin/java is
itself a file (a symlink), so appending /bin/java to it gives "Not a directory".
Please verify your java home and try again with the correct path...
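A sketch of what the fix could look like (the JVM directory below is just an example; use whatever readlink reports on your machine):

    # resolve the symlink chain to the real installation
    readlink -f /usr/bin/java
    # suppose it prints /usr/lib/jvm/java-8-openjdk-amd64/jre/bin/java

    # then set, in etc/hadoop/hadoop-env.sh:
    export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64

    # and sanity-check:
    $JAVA_HOME/bin/java -version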
On Wed, Jun 3, 2015 at 2:46 PM, Kadir Beyazlı kadirbeya...@gmail.com
wrote:
Hi,
Hi Ilker,
I set JAVA_HOME to /usr
I googled and found following info :
"JAVA_HOME is not the name of the java executable, but of the directory
java was installed in. The executable should be $JAVA_HOME/bin/java.
JAVA_HOME must always point to the home directory of
the java installation,"
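A quick way to test any candidate value (nothing Hadoop-specific here; /usr only works because /usr/bin/java exists, so $JAVA_HOME/bin/java resolves):

    export JAVA_HOME=/usr
    "$JAVA_HOME/bin/java" -version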
Hello -
I am trying to work through the documentation at
http://wiki.apache.org/hadoop/Hadoop2OnWindows to get a basic single node
instance of Hadoop running on Windows. I am on step 3.5, where I am executing
the line %HADOOP_PREFIX%\bin\hdfs dfs -put myfile.txt /, and I get the
following
/usr/bin/java is an executable, which is basically a link to
/etc/alternatives/java. /etc/alternatives/java should be another link to another
location; you can verify with ls -alrt /etc/alternatives/java. It is the JRE that
came with your installation.
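On Debian/Ubuntu-style systems (assumption: the alternatives system manages java, as the symlink above suggests) you can also display the whole chain in one go:

    update-alternatives --display java
    # or follow the links by hand:
    ls -l /usr/bin/java /etc/alternatives/java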
On Wed, Jun 3, 2015 at 3:02 PM, Kadir Beyazlı kadirbeya...@gmail.com
Hi, do you have the properties below in the core-site.xml file used by your HDFS?
<property>
  <name>hadoop.proxyuser.HTTP.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.HTTP.groups</name>
  <value>*</value>
</property>
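Note, in case you change them: as far as I know, the proxyuser settings are enforced by the NameNode, which can reload them without a restart:

    hdfs dfsadmin -refreshSuperUserGroupsConfiguration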
Hello all,
We need to run several HTTPFS instances on our
Hi,
Thanks for your answer.
With this setup, only the HTTP user will be able to impersonate other users, so
HTTPFS has to run with the HTTP user.
Instead, I need users to run HTTPFS with their own user, not with the HTTP user.
Thanks
From: Wellington Chevreuil
Out of curiosity, what added benefit does having HttpFs run as separate
team users give you?
If the APIs are invoked with SPNEGO or a user.name of the appropriate user,
don’t you get the same permissions-based protections?
Generally speaking, gateways such as HttpFs provide access on behalf
If that doesn't work, you may need to define one entry for these properties
for each user running an httpfs instance.
See below:
http://hadoop.apache.org/docs/current/hadoop-hdfs-httpfs/ServerSetup.html
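For instance (a sketch only; team_user is a placeholder for whichever user runs that HTTPFS instance), the per-user entries would look like:

<property>
  <name>hadoop.proxyuser.team_user.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.team_user.groups</name>
  <value>*</value>
</property>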
On 03/06/2015 12:40, Wellington Chevreuil wellington.chevre...@gmail.com
wrote:
Hi, do you
Hi
I have a map reduce job for an hbase bulk load. The job converts data into
HFiles and loads them into hbase, but after a certain map % the job fails. Below
is the exception that I am getting.
Error: java.io.FileNotFoundException:
Hi Pratik,
I've responded to your question in the u...@ambari.apache.org mailing list.
Regards,
Yusaku
From: Pratik Gadiya
pratik_gad...@persistent.com
Reply-To: user@hadoop.apache.org
Hello all,
We need to run several HTTPFS instances on our Hadoop cluster, with different
users (basically, one HTTPFS per team).
In our setup, each HTTPFS instance runs as a team user and is allowed write
access to that user's directory only (so, HTTPFS does not run as the httpfs
user).
Sorry.
No, I don’t think that this is possible and I don’t think that you should try
and manipulate the proxy settings in such a way that team users are configured
as trusted proxies.
That would introduce risk of exactly the sort of things that you are trying to
avoid.
On Jun 3, 2015, at 9:06
Hi,
We want to let user teams run their own HTTPFS in order to isolate
instances. One team thus cannot crash another team's HTTPFS instance.
Now, I make the following request:
curl "localhost:14000/webhdfs/v1/user/team_user?op=LISTSTATUS&user.name=team_user"
And I get the following
Hi,
Thanks for your answer. I'm not sure I understand it all, though.
Of course, you could send a request to another team's HTTPFS instance. But you
won't necessarily be granted access to every operation (based on Kerberos
authentication, for instance).
Anyway, my objective was to use
inline...
On Jun 3, 2015, at 8:03 AM, Nathaniel Braun
n.br...@criteo.com wrote:
Hi,
We want to let user teams run their own HTTPFS in order to isolate
instances. One team thus cannot crash another team’s HTTPFS instance.
For my own clarity...
How does this
Hi all,
We had a blast of a BOF session on Hadoop YARN at last year's Hadoop
Summit. We had lots of fruitful discussions led by many developers about
various features and their contributions; it was a great session overall.
I am coordinating this year's BOF as well and garnering topics of
Sounds like 192.168.1.12 is not listening on port 50010. If you have
access to the box, run netstat -lvnp and see if port 50010 is in the list.
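Something along these lines should tell you (assuming the default DataNode transfer port 50010):

    netstat -ltnp | grep 50010
    # or, on newer distros:
    ss -ltnp | grep 50010
    # nothing listed => the DataNode isn't listening; check its logs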
Regards,
io
On Thu, Jun 4, 2015 at 12:44 AM, Arpit Agarwal aagar...@hortonworks.com
wrote:
I recall seeing this error due to a network
I've just built my distributed cluster but am getting the following error
when I try to use HDFS.
I've traced it by telnetting to 192.168.1.12 50010, and it just sits there
waiting for a connection that never happens.
If I telnet on that host using localhost (127.0.0.1), the telnet connection
I recall seeing this error due to a network misconfiguration. You may want to
verify that IP addresses and host names are correctly set up.
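If it helps, a quick sanity check to run on each node (the IP is the one from this thread; the commands are standard Linux, nothing Hadoop-specific):

    hostname -f                  # the fully qualified name this node reports
    getent hosts 192.168.1.12    # what the resolver maps that IP to
    # a common culprit is an /etc/hosts line mapping the hostname to
    # 127.0.0.1/127.0.1.1, which makes the DataNode bind to loopback only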
From: Caesar Samsi
Reply-To: user@hadoop.apache.org
Date: Wednesday, June 3, 2015 at 8:07 PM
To:
I have seen this issue, and it was due to data nodes not being able to process
that many requests at a time.
On Thu, Jun 4, 2015 at 11:14 AM, Arpit Agarwal aagar...@hortonworks.com
wrote:
I recall seeing this error due to a network misconfiguration. You may
want to verify that IP addresses and
Hi, Jay
You can split this SQL into 3 scripts, like this:
--script1.sql
FROM EMPLOYER_STAGE
INSERT OVERWRITE TABLE EMPLOYER PARTITION (FISCAL_YEAR = 2015,
FISCAL_PERIOD = 01) SELECT * WHERE FISCAL_YEAR = 2014 AND
FISCAL_PERIOD = 08;
--script2.sql
FROM EMPLOYER_STAGE
INSERT OVERWRITE TABLE
Hello,
Just finished upgrading CDH from 5.3.1 to 5.4.1 using
http://www.cloudera.com/content/cloudera/en/documentation/core/latest/topics/install_upgrade_to_cdh54_parcels.html
however I am now getting event health issue warnings from the eventserver (cloudera1)
Also, are Hadoop summit registrations required to attend the BoF?
On Wed, Jun 3, 2015 at 10:52 AM, Karthik Kambatla ka...@cloudera.com
wrote:
Going through all Yarn umbrella JIRAs
Going through all Yarn umbrella JIRAs
https://issues.apache.org/jira/issues/?jql=project%20in%20(Yarn)%20AND%20summary%20~%20umbrella%20AND%20resolution%20%3D%20Unresolved%20ORDER%20BY%20created%20ASC
could
be useful. Maybe this is an opportunity to clean up that list. I looked
at all New