Thanks, John, that was helpful. I see that you're using the hadoop-dist
directory while the wiki points directly to the project folders (e.g.
hadoop-hdfs-project).

The former works; the latter doesn't. So I guess it's a matter of updating
the wiki.
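
For anyone else hitting this, here's a minimal sketch of running from the
hadoop-dist output (the version in the path is whatever your build
produces; 3.0.0-alpha4-SNAPSHOT is just an example):

    # Point HADOOP_HOME at the assembled distribution under hadoop-dist,
    # not at the individual project folders:
    export HADOOP_HOME="$PWD/hadoop-dist/target/hadoop-3.0.0-alpha4-SNAPSHOT"
    # Format a fresh NameNode and bring up HDFS:
    "$HADOOP_HOME/bin/hdfs" namenode -format
    "$HADOOP_HOME/sbin/start-dfs.sh"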


On Thu, Jul 20, 2017 at 9:09 AM, John Zhuge <john.zh...@gmail.com> wrote:

> Hi Lars,
>
> I am able to run pseudo-distributed mode from a dev tree. Here is the
> wiki: https://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/SingleCluster.html#Pseudo-Distributed_Operation
>
> Check out my script pseudo_dist
> <https://github.com/jzhuge/hadoop-sanity-tests/blob/master/bin/pseudo_dist>
> to start/stop a pseudo-distributed cluster.
>
> Here are the steps:
>
>    1. mvn install -DskipTests -DskipShade -Dmaven.javadoc.skip -Pdist -Dtar
>    2. pseudo_dist start ~/hadoop-sanity-tests/config/insecure/
>    3. test_env hdfs dfs -ls /tmp
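>
> Spelled out as an annotated shell session (this assumes the sanity-tests
> repo is checked out at ~/hadoop-sanity-tests and its bin directory is on
> your PATH):
>
>    # Build and install to the local Maven repo and assemble the dist
>    # layout; tests, shading, and javadoc are skipped to speed this up:
>    mvn install -DskipTests -DskipShade -Dmaven.javadoc.skip -Pdist -Dtar
>    # Start a pseudo-distributed cluster with the insecure sample config:
>    pseudo_dist start ~/hadoop-sanity-tests/config/insecure/
>    # Smoke-test HDFS against the running cluster:
>    test_env hdfs dfs -ls /tmp
>    # Stop the cluster again when done:
>    pseudo_dist stop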
>
> Thanks,
>
> On Wed, Jul 19, 2017 at 11:49 PM, Lars Francke <lars.fran...@gmail.com>
> wrote:
>
>> I've already asked in <https://issues.apache.org/jira/browse/HDFS-11596>
>> but haven't gotten a reply so far, so I thought I'd bump it here.
>>
>> The issue replaces the compile-time dependency of the various HDFS
>> projects on hdfs-client with a "provided" dependency.
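>>
>> Roughly, that amounts to something like the following in the affected
>> POMs (an illustrative fragment, not the exact diff from the JIRA):
>>
>>    <dependency>
>>      <groupId>org.apache.hadoop</groupId>
>>      <artifactId>hadoop-hdfs-client</artifactId>
>>      <!-- "provided": on the compile classpath, but no longer pulled
>>           onto the runtime classpath -->
>>      <scope>provided</scope>
>>    </dependency>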
>>
>> Unfortunately that means HDFS can no longer be run from source as
>> documented in the wiki
>> (<https://wiki.apache.org/hadoop/HowToSetupYourDevelopmentEnvironment>),
>> the way it was possible before the patch. This is because the hdfs-client
>> classes (e.g. ClientProtocol, the first one HDFS complains about during
>> startup) are no longer on the classpath.
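>>
>> A quick way to confirm this (a sketch; it assumes HADOOP_HOME points at
>> whatever tree you are launching from):
>>
>>    # Print the runtime classpath the launcher scripts assemble; if no
>>    # hdfs-client jar shows up here, the NameNode fails at startup trying
>>    # to resolve org.apache.hadoop.hdfs.protocol.ClientProtocol:
>>    "$HADOOP_HOME/bin/hadoop" classpath | tr ':' '\n' | grep hdfs-client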
>>
>> I wonder how all of you are running Hadoop from source these days. I've
>> always followed the wiki instructions, but maybe they're out of date and
>> there's a better way?
>>
>> Thanks,
>> Lars
>>
>
>
>
> --
> John
>
