It is called dfs.datanode.data.dir. It is the second option on the HDFS
config page in Ambari, labeled "DataNode directories".
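
For reference, that Ambari field maps to the dfs.datanode.data.dir property
in hdfs-site.xml, which takes a comma-separated list of local directories
(usually one per mount point). If you want to double-check what a node
actually picked up after the install, here is a minimal sketch using the
stock Hadoop Configuration API. The /etc/hadoop/conf path and the example
mount points are only assumptions for illustration; adjust them to your
layout. It only needs hadoop-common on the classpath.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;

public class ShowDataDirs {
    public static void main(String[] args) {
        // Load the cluster config; adjust the path to wherever Ambari
        // deployed hdfs-site.xml on your nodes (assumed location here).
        Configuration conf = new Configuration();
        conf.addResource(new Path("/etc/hadoop/conf/hdfs-site.xml"));

        // dfs.datanode.data.dir is a comma-separated list of local dirs,
        // e.g. /data01/hadoop/hdfs/data,/data02/hadoop/hdfs/data
        String dataDirs = conf.get("dfs.datanode.data.dir", "(not set)");
        System.out.println("DataNode directories: " + dataDirs);
    }
}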

On Wed, Jul 19, 2017 at 6:34 PM, Adaryl Wakefield <
[email protected]> wrote:

> I’ll look for it in the install process. I’m about to do my 3rd install
> and I don’t remember seeing this option. I know you can change it after the
> fact.
>
>
>
> Adaryl "Bob" Wakefield, MBA
> Principal
> Mass Street Analytics, LLC
> 913.938.6685
>
> www.massstreet.net
>
> www.linkedin.com/in/bobwakefieldmba
> Twitter: @BobLovesData <http://twitter.com/BobLovesData>
>
>
>
>
>
> *From:* Loïc Chanel [mailto:[email protected]]
> *Sent:* Wednesday, July 19, 2017 2:29 AM
> *To:* [email protected]
> *Subject:* Re: making sure you have proper disk space in install
>
>
>
> Hi Bob,
>
>
>
> Yes, in Ambari you can specify which folders HDFS should use to write its
> data. This is a very basic configuration in the HDFS service.
>
> Regards,
>
>
>
>
>
> Loïc
>
>
> Loïc CHANEL
> System Big Data engineer
> MS&T - Worldline Analytics Platform - Worldline (Villeurbanne, France)
>
>
>
> 2017-07-19 4:05 GMT+02:00 Adaryl Wakefield <[email protected]>:
>
> I’ve seen several posts about the fact that HDFS isn’t taking up the whole
> disk. Is there a way in the install process to specify which mount points
> HDFS should use so you don’t have to go change it later? I have a total of
> three nodes with 1 TB each. I’d prefer that HDFS take up the lion’s share of
> that.
>
>
>
> Adaryl "Bob" Wakefield, MBA
> Principal
> Mass Street Analytics, LLC
> 913.938.6685
>
> www.massstreet.net
>
> www.linkedin.com/in/bobwakefieldmba
> Twitter: @BobLovesData
>
>
>
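
To Bob's original question about HDFS not taking up the whole disk: once the
DataNode directories point at the big mount points, the cluster should report
close to the full 3 x 1 TB. Here is a small sketch that prints roughly the
same capacity numbers that hdfs dfsadmin -report shows, assuming it runs on a
cluster node where the client configs are on the classpath.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.FsStatus;

public class ShowHdfsCapacity {
    public static void main(String[] args) throws Exception {
        // Picks up core-site.xml/hdfs-site.xml from the classpath, so run it
        // on a cluster node (or add the config dir as a resource explicitly).
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);

        // Capacity, used, and remaining bytes as reported by the DataNodes.
        // This should approach the 3 x 1 TB total if the data dirs are right.
        FsStatus status = fs.getStatus();
        long gb = 1024L * 1024 * 1024;
        System.out.printf("Capacity:  %d GB%n", status.getCapacity() / gb);
        System.out.printf("Used:      %d GB%n", status.getUsed() / gb);
        System.out.printf("Remaining: %d GB%n", status.getRemaining() / gb);
    }
}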



-- 
Best Regards,
Ayan Guha
