-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/46653/
-----------------------------------------------------------

Review request for Ambari and Dmytro Sen.


Bugs: AMBARI-16104
    https://issues.apache.org/jira/browse/AMBARI-16104


Repository: ambari


Description
-------

On an HA cluster deployed via blueprints, the HBase service check fails with
the error below:

    Traceback (most recent call last):
      File "/var/lib/ambari-agent/cache/common-services/HBASE/0.96.0.2.0/package/scripts/service_check.py", line 88, in <module>
        HbaseServiceCheck().execute()
      File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 219, in execute
        method(env)
      File "/var/lib/ambari-agent/cache/common-services/HBASE/0.96.0.2.0/package/scripts/service_check.py", line 84, in service_check
        logoutput = True
      File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in __init__
        self.env.run()
      File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
        self.run_action(resource, action)
      File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
        provider_action()
      File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 238, in action_run
        tries=self.resource.tries, try_sleep=self.resource.try_sleep)
      File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 70, in inner
        result = function(command, **kwargs)
      File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 92, in checked_call
        tries=tries, try_sleep=try_sleep)
      File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 140, in _call_wrapper
        result = _call(command, **kwargs_copy)
      File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 291, in _call
        raise Fail(err_msg)
    resource_management.core.exceptions.Fail: Execution of ' /usr/hdp/current/hbase-client/bin/hbase --config /usr/hdp/current/hbase-client/conf shell /var/lib/ambari-agent/tmp/hbase-smoke.sh && /var/lib/ambari-agent/tmp/hbaseSmokeVerify.sh /usr/hdp/current/hbase-client/conf id16ac4252_date492216 /usr/hdp/current/hbase-client/bin/hbase' returned 1. ######## Hortonworks #############
    This is MOTD message, added for testing in qe infra
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:file:/grid/0/hdp/2.4.2.0-244/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found binding in [jar:file:/grid/0/hdp/2.4.2.0-244/zookeeper/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
    
    ERROR: Table ambarismoketest does not exist.
    
    Here is some help for this command:
    Start disable of named table:
      hbase> disable 't1'
      hbase> disable 'ns1:t1'
    
    
    
    ERROR: Table ambarismoketest does not exist.
    
    Here is some help for this command:
    Drop the named table. Table must first be disabled:
      hbase> drop 't1'
      hbase> drop 'ns1:t1'
    
    
    
    ERROR: org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient permissions for user 'ambari-qa' (action=create)
    
    Here is some help for this command:
    Creates a table. Pass a table name, and a set of column family
    specifications (at least one), and, optionally, table configuration.
    Column specification can be a simple string (name), or a dictionary
    (dictionaries are described below in main help output), necessarily 
    including NAME attribute. 
    Examples:
    
    Create a table with namespace=ns1 and table qualifier=t1
      hbase> create 'ns1:t1', {NAME => 'f1', VERSIONS => 5}
    
    Create a table with namespace=default and table qualifier=t1
      hbase> create 't1', {NAME => 'f1'}, {NAME => 'f2'}, {NAME => 'f3'}
      hbase> # The above in shorthand would be the following:
      hbase> create 't1', 'f1', 'f2', 'f3'
      hbase> create 't1', {NAME => 'f1', VERSIONS => 1, TTL => 2592000, BLOCKCACHE => true}
      hbase> create 't1', {NAME => 'f1', CONFIGURATION => {'hbase.hstore.blockingStoreFiles' => '10'}}
      
    Table configuration options can be put at the end.
    Examples:
    
      hbase> create 'ns1:t1', 'f1', SPLITS => ['10', '20', '30', '40']
      hbase> create 't1', 'f1', SPLITS => ['10', '20', '30', '40']
      hbase> create 't1', 'f1', SPLITS_FILE => 'splits.txt', OWNER => 'johndoe'
      hbase> create 't1', {NAME => 'f1', VERSIONS => 5}, METADATA => { 'mykey' => 'myvalue' }
      hbase> # Optionally pre-split the table into NUMREGIONS, using
      hbase> # SPLITALGO ("HexStringSplit", "UniformSplit" or classname)
      hbase> create 't1', 'f1', {NUMREGIONS => 15, SPLITALGO => 'HexStringSplit'}
      hbase> create 't1', 'f1', {NUMREGIONS => 15, SPLITALGO => 'HexStringSplit', REGION_REPLICATION => 2, CONFIGURATION => {'hbase.hregion.scan.loadColumnFamiliesOnDemand' => 'true'}}
    
    You can also keep around a reference to the created table:
    
      hbase> t1 = create 't1', 'f1'
    
    Which gives you a reference to the table named 't1', on which you can then
    call methods.
    
    
    2016-04-22 18:53:11,340 ERROR [main] client.AsyncProcess: Failed to get region location
    org.apache.hadoop.hbase.TableNotFoundException: ambarismoketest
        at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegionInMeta(ConnectionManager.java:1264)
        at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegion(ConnectionManager.java:1162)
        at org.apache.hadoop.hbase.client.AsyncProcess.submit(AsyncProcess.java:370)
        at org.apache.hadoop.hbase.client.AsyncProcess.submit(AsyncProcess.java:321)
        at org.apache.hadoop.hbase.client.BufferedMutatorImpl.backgroundFlushCommits(BufferedMutatorImpl.java:206)
        at org.apache.hadoop.hbase.client.BufferedMutatorImpl.flush(BufferedMutatorImpl.java:183)
        at org.apache.hadoop.hbase.client.HTable.flushCommits(HTable.java:1449)
        at org.apache.hadoop.hbase.client.HTable.put(HTable.java:1040)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.jruby.javasupport.JavaMethod.invokeDirectWithExceptionHandling(JavaMethod.java:450)
        at org.jruby.javasupport.JavaMethod.invokeDirect(JavaMethod.java:311)
        at org.jruby.java.invokers.InstanceMethodInvoker.call(InstanceMethodInvoker.java:59)
        at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:312)
        at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:169)
        at org.jruby.ast.CallOneArgNode.interpret(CallOneArgNode.java:57)
        at org.jruby.ast.NewlineNode.interpret(NewlineNode.java:104)
        at org.jruby.ast.BlockNode.interpret(BlockNode.java:71)
        at org.jruby.evaluator.ASTInterpreter.INTERPRET_METHOD(ASTInterpreter.java:74)
        at org.jruby.internal.runtime.methods.InterpretedMethod.call(InterpretedMethod.java:120)
        at org.jruby.internal.runtime.methods.InterpretedMethod.call(InterpretedMethod.java:134)
        at org.jruby.internal.runtime.methods.DefaultMethod.call(DefaultMethod.java:174)
        at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:282)
        at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:71)
        at org.jruby.ast.CallManyArgsNode.interpret(CallManyArgsNode.java:59)
        at org.jruby.ast.NewlineNode.interpret(NewlineNode.java:104)
        at org.jruby.evaluator.ASTInterpreter.INTERPRET_BLOCK(ASTInterpreter.java:111)
        at org.jruby.runtime.InterpretedBlock.evalBlockBody(InterpretedBlock.java:374)
        at org.jruby.runtime.InterpretedBlock.yield(InterpretedBlock.java:295)
        at org.jruby.runtime.InterpretedBlock.yieldSpecific(InterpretedBlock.java:229)
        at org.jruby.runtime.Block.yieldSpecific(Block.java:99)
        at org.jruby.ast.ZYieldNode.interpret(ZYieldNode.java:25)
        at org.jruby.ast.NewlineNode.interpret(NewlineNode.java:104)
        at org.jruby.ast.BlockNode.interpret(BlockNode.java:71)
        at org.jruby.evaluator.ASTInterpreter.INTERPRET_METHOD(ASTInterpreter.java:74)
        at org.jruby.internal.runtime.methods.InterpretedMethod.call(InterpretedMethod.java:169)
        at org.jruby.internal.runtime.methods.DefaultMethod.call(DefaultMethod.java:191)
        at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:302)
        at org.jruby.runtime.callsite.CachingCallSite.callBlock(CachingCallSite.java:144)
        at org.jruby.runtime.callsite.CachingCallSite.callIter(CachingCallSite.java:153)
        at org.jruby.ast.FCallNoArgBlockNode.interpret(FCallNoArgBlockNode.java:32)
        at org.jruby.ast.NewlineNode.interpret(NewlineNode.java:104)
        at org.jruby.evaluator.ASTInterpreter.INTERPRET_METHOD(ASTInterpreter.java:74)
        at org.jruby.internal.runtime.methods.InterpretedMethod.call(InterpretedMethod.java:120)
        at org.jruby.internal.runtime.methods.InterpretedMethod.call(InterpretedMethod.java:134)
        at org.jruby.internal.runtime.methods.DefaultMethod.call(DefaultMethod.java:174)
        at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:282)
        at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:71)
        at org.jruby.ast.FCallManyArgsNode.interpret(FCallManyArgsNode.java:60)
        at org.jruby.ast.NewlineNode.interpret(NewlineNode.java:104)
        at org.jruby.evaluator.ASTInterpreter.INTERPRET_METHOD(ASTInterpreter.java:74)
        at org.jruby.internal.runtime.methods.InterpretedMethod.call(InterpretedMethod.java:120)
        at org.jruby.internal.runtime.methods.DefaultMethod.call(DefaultMethod.java:165)
        at org.jruby.RubyClass.finvoke(RubyClass.java:573)
        at org.jruby.RubyBasicObject.send(RubyBasicObject.java:2801)
        at org.jruby.RubyKernel.send(RubyKernel.java:2117)
        at org.jruby.RubyKernel$s$send.call(RubyKernel$s$send.gen:65535)
        at org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:181)
        at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:282)
        at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:71)
        at org.jruby.ast.FCallSpecialArgNode.interpret(FCallSpecialArgNode.java:45)
        at org.jruby.ast.NewlineNode.interpret(NewlineNode.java:104)
        at org.jruby.evaluator.ASTInterpreter.INTERPRET_BLOCK(ASTInterpreter.java:111)
        at org.jruby.runtime.InterpretedBlock.evalBlockBody(InterpretedBlock.java:374)
        at org.jruby.runtime.InterpretedBlock.yield(InterpretedBlock.java:295)
        at org.jruby.runtime.InterpretedBlock.yieldSpecific(InterpretedBlock.java:229)
        at org.jruby.runtime.Block.yieldSpecific(Block.java:99)
        at org.jruby.ast.ZYieldNode.interpret(ZYieldNode.java:25)
        at org.jruby.ast.NewlineNode.interpret(NewlineNode.java:104)
        at org.jruby.ast.RescueNode.executeBody(RescueNode.java:216)
        at org.jruby.ast.RescueNode.interpretWithJavaExceptions(RescueNode.java:120)
        at org.jruby.ast.RescueNode.interpret(RescueNode.java:110)
        at org.jruby.evaluator.ASTInterpreter.INTERPRET_METHOD(ASTInterpreter.java:74)
        at org.jruby.internal.runtime.methods.InterpretedMethod.call(InterpretedMethod.java:120)
        at org.jruby.internal.runtime.methods.DefaultMethod.call(DefaultMethod.java:165)
        at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:272)
        at org.jruby.runtime.callsite.CachingCallSite.callBlock(CachingCallSite.java:80)
        at org.jruby.runtime.callsite.CachingCallSite.callIter(CachingCallSite.java:89)
        at org.jruby.ast.FCallSpecialArgBlockNode.interpret(FCallSpecialArgBlockNode.java:42)
        at org.jruby.ast.NewlineNode.interpret(NewlineNode.java:104)
        at org.jruby.ast.RescueNode.executeBody(RescueNode.java:216)
        at org.jruby.ast.RescueNode.interpretWithJavaExceptions(RescueNode.java:120)
        at org.jruby.ast.RescueNode.interpret(RescueNode.java:110)
        at org.jruby.evaluator.ASTInterpreter.INTERPRET_METHOD(ASTInterpreter.java:74)
        at org.jruby.internal.runtime.methods.InterpretedMethod.call(InterpretedMethod.java:120)
        at org.jruby.internal.runtime.methods.InterpretedMethod.call(InterpretedMethod.java:134)
        at org.jruby.internal.runtime.methods.DefaultMethod.call(DefaultMethod.java:174)
        at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:282)
        at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:71)
        at org.jruby.ast.CallSpecialArgNode.interpret(CallSpecialArgNode.java:73)
        at org.jruby.ast.NewlineNode.interpret(NewlineNode.java:104)
        at org.jruby.evaluator.ASTInterpreter.INTERPRET_METHOD(ASTInterpreter.java:74)
        at org.jruby.internal.runtime.methods.InterpretedMethod.call(InterpretedMethod.java:120)
        at org.jruby.internal.runtime.methods.InterpretedMethod.call(InterpretedMethod.java:134)
        at org.jruby.internal.runtime.methods.DefaultMethod.call(DefaultMethod.java:174)
        at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:69)
        at org.jruby.ast.FCallSpecialArgNode.interpret(FCallSpecialArgNode.java:45)
        at org.jruby.ast.NewlineNode.interpret(NewlineNode.java:104)
        at org.jruby.evaluator.ASTInterpreter.INTERPRET_METHOD(ASTInterpreter.java:74)
        at org.jruby.internal.runtime.methods.InterpretedMethod.call(InterpretedMethod.java:120)
        at org.jruby.internal.runtime.methods.InterpretedMethod.call(InterpretedMethod.java:134)
        at org.jruby.internal.runtime.methods.DefaultMethod.call(DefaultMethod.java:174)
        at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:282)
        at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:71)
        at org.jruby.ast.CallSpecialArgNode.interpret(CallSpecialArgNode.java:73)
        at org.jruby.ast.LocalAsgnNode.interpret(LocalAsgnNode.java:123)
        at org.jruby.ast.NewlineNode.interpret(NewlineNode.java:104)
        at org.jruby.ast.BlockNode.interpret(BlockNode.java:71)
        at org.jruby.evaluator.ASTInterpreter.INTERPRET_METHOD(ASTInterpreter.java:74)
        at org.jruby.internal.runtime.methods.InterpretedMethod.call(InterpretedMethod.java:120)
        at org.jruby.internal.runtime.methods.InterpretedMethod.call(InterpretedMethod.java:134)
        at org.jruby.internal.runtime.methods.DefaultMethod.call(DefaultMethod.java:174)
        at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:282)
        at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:71)
        at org.jruby.ast.FCallManyArgsNode.interpret(FCallManyArgsNode.java:60)
        at org.jruby.ast.NewlineNode.interpret(NewlineNode.java:104)
        at org.jruby.ast.BlockNode.interpret(BlockNode.java:71)
        at org.jruby.ast.RootNode.interpret(RootNode.java:129)
        at org.jruby.evaluator.ASTInterpreter.INTERPRET_ROOT(ASTInterpreter.java:119)
        at org.jruby.Ruby.runInterpreter(Ruby.java:724)
        at org.jruby.Ruby.loadFile(Ruby.java:2489)
        at org.jruby.runtime.load.ExternalScript.load(ExternalScript.java:66)
        at org.jruby.runtime.load.LoadService.load(LoadService.java:270)
        at org.jruby.RubyKernel.loadCommon(RubyKernel.java:1105)
        at org.jruby.RubyKernel.load(RubyKernel.java:1087)
        at org.jruby.RubyKernel$s$0$1$load.call(RubyKernel$s$0$1$load.gen:65535)
        at org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:211)
        at org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:207)
        at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:312)
        at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:169)
        at usr.hdp.$2_dot_4_dot_2_dot_0_minus_244.hbase.bin.hirb.__file__(/usr/hdp/2.4.2.0-244/hbase/bin/hirb.rb:177)
        at usr.hdp.$2_dot_4_dot_2_dot_0_minus_244.hbase.bin.hirb.load(/usr/hdp/2.4.2.0-244/hbase/bin/hirb.rb)
        at org.jruby.Ruby.runScript(Ruby.java:697)
        at org.jruby.Ruby.runScript(Ruby.java:690)
        at org.jruby.Ruby.runNormally(Ruby.java:597)
        at org.jruby.Ruby.runFromMain(Ruby.java:446)
        at org.jruby.Main.doRunFromMain(Main.java:369)
        at org.jruby.Main.internalRun(Main.java:258)
        at org.jruby.Main.run(Main.java:224)
        at org.jruby.Main.run(Main.java:208)
        at org.jruby.Main.main(Main.java:188)
    
    ERROR: Failed 1 action: ambarismoketest: 1 time, 
    
    Here is some help for this command:
    Put a cell 'value' at specified table/row/column and optionally
    timestamp coordinates.  To put a cell value into table 'ns1:t1' or 't1'
    at row 'r1' under column 'c1' marked with the time 'ts1', do:
    
      hbase> put 'ns1:t1', 'r1', 'c1', 'value'
      hbase> put 't1', 'r1', 'c1', 'value'
      hbase> put 't1', 'r1', 'c1', 'value', ts1
      hbase> put 't1', 'r1', 'c1', 'value', {ATTRIBUTES=>{'mykey'=>'myvalue'}}
      hbase> put 't1', 'r1', 'c1', 'value', ts1, {ATTRIBUTES=>{'mykey'=>'myvalue'}}
      hbase> put 't1', 'r1', 'c1', 'value', ts1, {VISIBILITY=>'PRIVATE|SECRET'}
    
    The same commands also can be run on a table reference. Suppose you had a reference
    t to table 't1', the corresponding command would be:
    
      hbase> t.put 'r1', 'c1', 'value', ts1, {ATTRIBUTES=>{'mykey'=>'myvalue'}}
    
    
    ROW  COLUMN+CELL
    
    ERROR: Unknown table ambarismoketest!
    
    Here is some help for this command:
    Scan a table; pass table name and optionally a dictionary of scanner
    specifications.  Scanner specifications may include one or more of:
    TIMERANGE, FILTER, LIMIT, STARTROW, STOPROW, ROWPREFIXFILTER, TIMESTAMP,
    MAXLENGTH or COLUMNS, CACHE or RAW, VERSIONS
    
    If no columns are specified, all columns will be scanned.
    To scan all members of a column family, leave the qualifier empty as in
    'col_family:'.
    
    The filter can be specified in two ways:
    1. Using a filterString - more information on this is available in the
    Filter Language document attached to the HBASE-4176 JIRA
    2. Using the entire package name of the filter.
    
    Some examples:
    
      hbase> scan 'hbase:meta'
      hbase> scan 'hbase:meta', {COLUMNS => 'info:regioninfo'}
      hbase> scan 'ns1:t1', {COLUMNS => ['c1', 'c2'], LIMIT => 10, STARTROW => 'xyz'}
      hbase> scan 't1', {COLUMNS => ['c1', 'c2'], LIMIT => 10, STARTROW => 'xyz'}
      hbase> scan 't1', {COLUMNS => 'c1', TIMERANGE => [1303668804, 1303668904]}
      hbase> scan 't1', {REVERSED => true}
      hbase> scan 't1', {ROWPREFIXFILTER => 'row2', FILTER => "
        (QualifierFilter (>=, 'binary:xyz')) AND (TimestampsFilter ( 123, 456))"}
      hbase> scan 't1', {FILTER =>
        org.apache.hadoop.hbase.filter.ColumnPaginationFilter.new(1, 0)}
      hbase> scan 't1', {CONSISTENCY => 'TIMELINE'}
    For setting the Operation Attributes 
      hbase> scan 't1', { COLUMNS => ['c1', 'c2'], ATTRIBUTES => {'mykey' => 'myvalue'}}
      hbase> scan 't1', { COLUMNS => ['c1', 'c2'], AUTHORIZATIONS => ['PRIVATE','SECRET']}
    For experts, there is an additional option -- CACHE_BLOCKS -- which
    switches block caching for the scanner on (true) or off (false).  By
    default it is enabled.  Examples:
    
      hbase> scan 't1', {COLUMNS => ['c1', 'c2'], CACHE_BLOCKS => false}
    
    Also for experts, there is an advanced option -- RAW -- which instructs the
    scanner to return all cells (including delete markers and uncollected deleted
    cells). This option cannot be combined with requesting specific COLUMNS.
    Disabled by default.  Example:
    
      hbase> scan 't1', {RAW => true, VERSIONS => 10}
    
    Besides the default 'toStringBinary' format, 'scan' supports custom formatting
    by column.  A user can define a FORMATTER by adding it to the column name in
    the scan specification.  The FORMATTER can be stipulated:
    
     1. either as a org.apache.hadoop.hbase.util.Bytes method name (e.g, toInt, toString)
     2. or as a custom class followed by method name: e.g. 'c(MyFormatterClass).format'.
    
    Example formatting cf:qualifier1 and cf:qualifier2 both as Integers: 
      hbase> scan 't1', {COLUMNS => ['cf:qualifier1:toInt',
        'cf:qualifier2:c(org.apache.hadoop.hbase.util.Bytes).toInt'] } 
    
    Note that you can specify a FORMATTER by column only (cf:qualifier).  You cannot
    specify a FORMATTER for all columns of a column family.
    
    Scan can also be used directly from a table, by first getting a reference to a
    table, like such:
    
      hbase> t = get_table 't'
      hbase> t.scan
    
    Note in the above situation, you can still provide all the filtering, columns,
    options, etc as described above.
    
    
    
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:file:/grid/0/hdp/2.4.2.0-244/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found binding in [jar:file:/grid/0/hdp/2.4.2.0-244/zookeeper/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
    HBase Shell; enter 'help<RETURN>' for list of supported commands.
    Type "exit<RETURN>" to leave the HBase Shell
    Version 1.1.2.2.4.2.0-244, rUnknown, Thu Apr 21 12:51:04 UTC 2016
    
    scan 'ambarismoketest'
    ROW  COLUMN+CELL
    
    ERROR: Unknown table ambarismoketest!
    
    Here is some help for this command:
    Scan a table; pass table name and optionally a dictionary of scanner
    specifications.  Scanner specifications may include one or more of:
    TIMERANGE, FILTER, LIMIT, STARTROW, STOPROW, ROWPREFIXFILTER, TIMESTAMP,
    MAXLENGTH or COLUMNS, CACHE or RAW, VERSIONS
    
    If no columns are specified, all columns will be scanned.
    To scan all members of a column family, leave the qualifier empty as in
    'col_family:'.
    
    The filter can be specified in two ways:
    1. Using a filterString - more information on this is available in the
    Filter Language document attached to the HBASE-4176 JIRA
    2. Using the entire package name of the filter.
    
    Some examples:
    
      hbase> scan 'hbase:meta'
      hbase> scan 'hbase:meta', {COLUMNS => 'info:regioninfo'}
      hbase> scan 'ns1:t1', {COLUMNS => ['c1', 'c2'], LIMIT => 10, STARTROW => 'xyz'}
      hbase> scan 't1', {COLUMNS => ['c1', 'c2'], LIMIT => 10, STARTROW => 'xyz'}
      hbase> scan 't1', {COLUMNS => 'c1', TIMERANGE => [1303668804, 1303668904]}
      hbase> scan 't1', {REVERSED => true}
      hbase> scan 't1', {ROWPREFIXFILTER => 'row2', FILTER => "
        (QualifierFilter (>=, 'binary:xyz')) AND (TimestampsFilter ( 123, 456))"}
      hbase> scan 't1', {FILTER =>
        org.apache.hadoop.hbase.filter.ColumnPaginationFilter.new(1, 0)}
      hbase> scan 't1', {CONSISTENCY => 'TIMELINE'}
    For setting the Operation Attributes 
      hbase> scan 't1', { COLUMNS => ['c1', 'c2'], ATTRIBUTES => {'mykey' => 'myvalue'}}
      hbase> scan 't1', { COLUMNS => ['c1', 'c2'], AUTHORIZATIONS => ['PRIVATE','SECRET']}
    For experts, there is an additional option -- CACHE_BLOCKS -- which
    switches block caching for the scanner on (true) or off (false).  By
    default it is enabled.  Examples:
    
      hbase> scan 't1', {COLUMNS => ['c1', 'c2'], CACHE_BLOCKS => false}
    
    Also for experts, there is an advanced option -- RAW -- which instructs the
    scanner to return all cells (including delete markers and uncollected deleted
    cells). This option cannot be combined with requesting specific COLUMNS.
    Disabled by default.  Example:
    
      hbase> scan 't1', {RAW => true, VERSIONS => 10}
    
    Besides the default 'toStringBinary' format, 'scan' supports custom formatting
    by column.  A user can define a FORMATTER by adding it to the column name in
    the scan specification.  The FORMATTER can be stipulated:
    
     1. either as a org.apache.hadoop.hbase.util.Bytes method name (e.g, toInt, toString)
     2. or as a custom class followed by method name: e.g. 'c(MyFormatterClass).format'.
    
    Example formatting cf:qualifier1 and cf:qualifier2 both as Integers: 
      hbase> scan 't1', {COLUMNS => ['cf:qualifier1:toInt',
        'cf:qualifier2:c(org.apache.hadoop.hbase.util.Bytes).toInt'] } 
    
    Note that you can specify a FORMATTER by column only (cf:qualifier).  You cannot
    specify a FORMATTER for all columns of a column family.
    
    Scan can also be used directly from a table, by first getting a reference to a
    table, like such:
    
      hbase> t = get_table 't'
      hbase> t.scan
    
    Note in the above situation, you can still provide all the filtering, columns,
    options, etc as described above.
    
    
    
    Looking for id16ac4252_date492216
    

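For context on the error sequence above: the service check drives a fixed list of HBase shell commands. A rough sketch of what /var/lib/ambari-agent/tmp/hbase-smoke.sh contains (paraphrased from Ambari's hbase-smoke.sh.j2 template; the table name is real, the column family and row names here are illustrative, the id value comes from this run's log, and exact contents may vary by stack version):

    # hypothetical rendering of hbase-smoke.sh for this run
    disable 'ambarismoketest'
    drop 'ambarismoketest'
    create 'ambarismoketest', 'family'
    put 'ambarismoketest', 'row01', 'family:col01', 'id16ac4252_date492216'
    scan 'ambarismoketest'
    exit

Read against that sequence, the first two ERRORs (disable/drop of a table that does not exist yet) are expected noise on a fresh cluster. The real failure is the AccessDeniedException on create for user 'ambari-qa'; put and scan then fail in turn, and hbaseSmokeVerify.sh, which re-runs the scan and greps its output for the written id (hence the trailing "Looking for id16ac4252_date492216" line), finds nothing. That points at HBase authorization rather than HBase itself, which is consistent with this patch touching ranger_functions.py.
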
Please take a look. This test passed yesterday on the following build:

    ambari-server version: ambari-server-2.2.2.0-450
    ambari-server --hash: a564058a4acce03403aa645721eaed420288ec24
    HDP Stack: 2.4
    HDP Version: 2.4.2.0-236
    

<http://dashboard.qe.hortonworks.com:5000/#/testHistory?tc_id=508757>


Diffs
-----

  ambari-common/src/main/python/resource_management/libraries/functions/ranger_functions.py 84a03a3 

Diff: https://reviews.apache.org/r/46653/diff/


Testing
-------

mvn clean test


Thanks,

Andrew Onischuk
