Hello all,

I'm on Apache Hadoop 2.6.0.

It looks like the Hadoop cluster won't start if disk occupancy is greater
than 90%.
My hard disk capacity is 1 TB, so even at 90% occupancy I still have about
100 GB free. I know for sure that my job won't consume more than 2 GB.
How can I disable this check?

Following is from the logs:

WARN org.apache.hadoop.yarn.server.nodemanager.DirectoryCollection:
Directory /tmp/hadoop/logs/userlogs error, used space above threshold of
90.0%, removing from list of valid directories
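
From what I can find, this warning comes from the NodeManager disk health
checker, and the 90% threshold seems to be controlled by
yarn.nodemanager.disk-health-checker.max-disk-utilization-per-disk-percentage
in yarn-site.xml. Would raising it along these lines (just a sketch of what
I'm assuming, not something I've verified) be the right approach?

  <property>
    <!-- Raise the per-disk utilization threshold the NodeManager
         disk health checker uses before marking a dir as bad
         (default is 90.0; assuming 99.0 is acceptable here). -->
    <name>yarn.nodemanager.disk-health-checker.max-disk-utilization-per-disk-percentage</name>
    <value>99.0</value>
  </property>

I'd then restart the NodeManager for the change to take effect.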

Thanks,
--Manoj Kumar M
