Hi,

You can set the following property in hbase-site.xml
and export HBASE_CONF_PATH or HBASE_CONF_DIR to point at that
configuration directory before running the job.



    <property>
      <name>hbase.coprocessor.region.classes</name>
      <value>org.apache.hadoop.hbase.security.access.SecureBulkLoadEndpoint</value>
    </property>
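
For example, the invocation would look roughly like this (the jar name,
table name and paths below are only placeholders, adjust them for your
environment):

    # point the job at the directory containing the hbase-site.xml above
    export HBASE_CONF_DIR=/etc/hbase/conf
    # run the Phoenix CSV bulk load tool
    hadoop jar phoenix-<version>-client.jar \
        org.apache.phoenix.mapreduce.CsvBulkLoadTool \
        --table MY_TABLE \
        --input /data/mytable.csv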


Thanks,
Rajeshbabu.

On Wed, Oct 28, 2015 at 11:51 AM, Bulvik, Noam <noam.bul...@teoco.com>
wrote:

> Thanks Matt ,
>
> Is this a known issue in the CSV Bulk Load Tool? Do we need to open a JIRA
> so that it gets fixed?
>
>
>
>
>
>
>
> *From:* Matt Kowalczyk [mailto:ma...@cloudability.com]
> *Sent:* Wednesday, October 28, 2015 1:01 AM
> *To:* user@phoenix.apache.org
> *Subject:* Re: mapreduce.LoadIncrementalHFiles: Trying to load hfile...
> hang till we set permission on the tmp file
>
>
>
> There might be a better way, but my fix for this same problem was to modify
> CsvBulkLoadTool.java to perform the following,
>
>     // Requires imports from org.apache.hadoop.fs and org.apache.hadoop.fs.permission.
>     // Walk every file written by the MapReduce job and open up permissions
>     // so the region servers can read and move the HFiles.
>     FileSystem fs = FileSystem.get(conf);
>     RemoteIterator<LocatedFileStatus> ri = fs.listFiles(outputPath, true);
>     while (ri.hasNext()) {
>         LocatedFileStatus fileStatus = ri.next();
>         // chmod a+rwx on the directory containing the HFile
>         LOG.info("chmod a+rwx on {}", fileStatus.getPath().getParent().toString());
>         fs.setPermission(fileStatus.getPath().getParent(),
>                          new FsPermission(FsAction.ALL, FsAction.ALL, FsAction.ALL));
>         // chmod a+rwx on the HFile itself
>         LOG.info("chmod a+rwx on {}", fileStatus.getPath().toString());
>         fs.setPermission(fileStatus.getPath(),
>                          new FsPermission(FsAction.ALL, FsAction.ALL, FsAction.ALL));
>     }
>
> right before the call to loader.doBulkLoad(outputPath, htable).
>
> This unfortunately requires modifying the source. I'd be interested
> in a solution that doesn't require patching Phoenix.
>
> -Matt
>
>
>
> On Tue, Oct 27, 2015 at 1:06 PM, Bulvik, Noam <noam.bul...@teoco.com>
> wrote:
>
> Hi,
>
> We are running the CSV bulk loader on Phoenix 4.5 with CDH 5.4 and it works
> fine except for one problem: the loading task hangs on
> mapreduce.LoadIncrementalHFiles: Trying to load hfile ... until we give the
> directory holding the HFile (under /tmp on HDFS) write permissions.
>
>
>
> We set the umask to 000, but that does not help.
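>
> For reference, the workaround we apply by hand is roughly the following
> (the exact staging path under /tmp differs per job, so the path below is
> only a placeholder):
>
>     # grant write permissions on the HFile staging directory
>     hdfs dfs -chmod -R 777 /tmp/<hfile-output-dir>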
>
>
>
> Any idea how this should be fixed?
>
>
>
> thanks
>
>
>
> *Noam*
>
>
>
>
