Yeah, I suffer from the same problem. I would be very happy to see it fixed.
Usually I run all of the components (namenode, datanode, ozone components,
etc.) from my IDE. I have two problems:
1. The provided scopes that were mentioned.
2. The components can't be started because the HTTP server tries to
locate its files on the classpath (see the sketch below).
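To illustrate the second problem, here is a minimal Java sketch of the
kind of classpath lookup that fails in an IDE run. It mirrors what
HttpServer2 does when resolving its web resources, but it is
illustrative, not the actual Hadoop code:

    // Minimal, illustrative sketch (not the actual HttpServer2 code):
    public class WebAppLookup {
      public static void main(String[] args) throws Exception {
        // The server resolves its static web resources from the classpath,
        // assuming the "webapps/hdfs" layout of the built distribution.
        java.net.URL url = Thread.currentThread().getContextClassLoader()
            .getResource("webapps/hdfs");
        if (url == null) {
          // In an IDE run the webapps directory is usually not on the
          // classpath, so startup fails at this point.
          throw new java.io.FileNotFoundException(
              "webapps/hdfs not found in CLASSPATH");
        }
        System.out.println("webapps/hdfs found at " + url);
      }
    }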
Usually I have a branch with one commit which fixes all these problems
(together with some default configuration files). I cherry-pick this fix
to my feature branches and remove it before generating the patch (as I am
the only user of my private branches, I can do interactive rebases and
modify the history). My current dirty commit is here:
https://github.com/elek/hadoop/commit/5c14cb0a1697f50cfd359a3d57421376c92c7ce8
I am just interested in how others handle the web development. Do
you compile the project every time? Do you run the components from the IDE?
If you think it's useful, I would be happy to post the first
modification (HttpServer2, from
https://github.com/elek/hadoop/commit/5c14cb0a1697f50cfd359a3d57421376c92c7ce8#diff-1d06d3e6ca2c0754572b021eb81a084d)
as a patch. But my impression was that there should be some existing
solution to this problem (improving the css/html parts of the servers
from the IDE without restarting the component).
Marton
On 07/26/2017 01:51 AM, Andrew Wang wrote:
I looked into this more and filed HDFS-12197 with a possible solution.
Thanks for the report, Lars!
On Fri, Jul 21, 2017 at 12:51 AM, Lars Francke <lars.fran...@gmail.com>
wrote:
Thanks John, that was helpful. I see that you're using the hadoop-dist
directory while the wiki points directly to the project folders (e.g.
hadoop-hdfs-project etc.).
The former works; the latter doesn't. So I guess it's a matter of
updating the wiki.
On Thu, Jul 20, 2017 at 9:09 AM, John Zhuge <john.zh...@gmail.com> wrote:
Hi Lars,
I am able to run pseudo-distributed mode from a dev tree. Here is the
wiki:
https://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/SingleCluster.html#Pseudo-Distributed_Operation
Check out my script pseudo_dist
<https://github.com/jzhuge/hadoop-sanity-tests/blob/master/bin/pseudo_dist>
to start/stop a pseudo-distributed cluster.
Here are the steps:
1. mvn install -DskipTests -DskipShade -Dmaven.javadoc.skip -Pdist -Dtar
2. pseudo_dist start ~/hadoop-sanity-tests/config/insecure/
3. test_env hdfs dfs -ls /tmp
Thanks,
On Wed, Jul 19, 2017 at 11:49 PM, Lars Francke <lars.fran...@gmail.com>
wrote:
I've already asked in <https://issues.apache.org/jira/browse/HDFS-11596>
but haven't gotten a reply so far, so I thought I'd bump it here.
The issue replaces the compile-time dependency of the various HDFS
projects on hdfs-client with a "provided" dependency.
Unfortunately, that means HDFS can no longer be run from source as
documented in the Wiki
(<https://wiki.apache.org/hadoop/HowToSetupYourDevelopmentEnvironment>)
and as used to be possible before the patch. This is because the
hdfs-client classes (e.g. ClientProtocol is the first one that HDFS
complains about during startup) are no longer on the classpath.
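For illustration, a minimal Java sketch of the failure mode (the class
name is the one from the startup error mentioned above; this is not the
actual NameNode code):

    // Minimal sketch, assuming hdfs-client is "provided" and therefore
    // absent from the runtime classpath of a source-tree run:
    public class ClasspathCheck {
      public static void main(String[] args) throws Exception {
        // Throws ClassNotFoundException when the hdfs-client jar is missing:
        Class.forName("org.apache.hadoop.hdfs.protocol.ClientProtocol");
        System.out.println("hdfs-client is on the classpath");
      }
    }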
I wonder how all of you are running Hadoop these days from source? I've
always followed the Wiki instructions but maybe they are out of date and
there's a better way?
Thanks,
Lars
--
John