Hi Steve,

A normally-written client program will work the same way on both
permissions-enabled and permissions-disabled clusters. There is no
concept of a "password" for users in Apache Hadoop as of yet, unless
you're dealing with a specific cluster that has custom-implemented
one.

Setting a specific user is not the right way to go. In both secure
and non-secure environments, the user is automatically inferred from
the user actually running the JVM process - it's better to simply
rely on this.
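
For example, you can print the identity your client will act as
without setting anything yourself. A minimal sketch; the
fs.default.name value is a placeholder for your own NameNode address:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.security.UserGroupInformation;

    public class WhoAmI {
      public static void main(String[] args) throws Exception {
        Configuration config = new Configuration();
        config.set("fs.default.name", "hdfs://host:port/");
        // No user or password is set anywhere; the identity is picked
        // up from the OS user running this JVM.
        System.out.println("Acting as: "
            + UserGroupInformation.getCurrentUser().getShortUserName());
        FileSystem fs = FileSystem.get(config);
        System.out.println("Home directory: " + fs.getHomeDirectory());
      }
    }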

An AccessControlException occurs when a program tries to write to or
alter a path where it lacks permission. To get past this, the HDFS
administrator needs to grant your user access to those paths, rather
than you having to work around the problem in code.
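
If you want your client to fail with a clearer message when a path
isn't writable, you can catch the exception explicitly. A rough
sketch; the path below is purely illustrative:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.security.AccessControlException;

    public class WriteCheck {
      public static void main(String[] args) throws Exception {
        Configuration config = new Configuration();
        config.set("fs.default.name", "hdfs://host:port/");
        FileSystem fs = FileSystem.get(config);
        try {
          // Attempt to create a file under a path the user may not own.
          fs.create(new Path("/some/protected/path/file.txt")).close();
        } catch (AccessControlException ace) {
          // Permission denied by HDFS: the fix is for the administrator
          // to open up the parent directory for your user, not to
          // switch users in code.
          System.err.println("No permission: " + ace.getMessage());
        }
      }
    }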

On Mon, May 13, 2013 at 3:25 PM, Steve Lewis <lordjoe2...@gmail.com> wrote:
> -- I have been running Hadoop on a cluster set to not check permissions. I
> would run a Java client on my local machine and it would run as the local
> user on the cluster.
>
> I say
>
>       String connectString = "hdfs://" + host + ":" + port + "/";
>       Configuration config = new Configuration();
>       config.set("fs.default.name", connectString);
>       FileSystem fs = FileSystem.get(config);
>
> The above code works.
>
> I am trying to port to a cluster where permissions are checked - I have an
> account but need to set a user and password to avoid AccessControlExceptions.
>
> How do I do this? And if I can only access certain directories, how do I
> work within that restriction?
>
> Also are there some directories my code MUST be able to access outside
> those for my user only?
>
> Steven M. Lewis PhD
> 4221 105th Ave NE
> Kirkland, WA 98033
> 206-384-1340 (cell)
> Skype lordjoe_com



-- 
Harsh J
