Maybe we should have a Kerberos FAQ with stuff like this in it.

On May 17, 2017 at 09:14:57, Mohan Venkateshaiah (
[email protected]) wrote:

Hi Justin,



Yes, the problem was that the users were created before the KDC was configured
to issue renewable tickets. I was under the impression that setting max_life
and max_renewable_life in /var/kerberos/krb5kdc/kdc.conf and restarting the
kadmin and krb5kdc services would be enough, but because those values were
already stored in the KDC database it didn't work. As a quick fix I set the
renew lifetime for the existing users and for the krbtgt principal of the
realm. I think I need to recreate the KDB using "kdb5_util create -s", since
even for new users I see max_renewable_life set to 0.
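
If it helps for the FAQ: the existing principals can also be updated in
place with kadmin instead of recreating the database. A sketch, with
EXAMPLE.COM and the principal names standing in for the real ones:

    # Raise the renewable lifetime on the ticket-granting ticket first,
    # then on each affected principal (both limits must allow renewal):
    kadmin.local -q "modprinc -maxrenewlife 7d krbtgt/EXAMPLE.COM@EXAMPLE.COM"
    kadmin.local -q "modprinc -maxrenewlife 7d [email protected]"

    # Verify the change took effect:
    kadmin.local -q "getprinc [email protected]" | grep -i renew

These commands need a live KDC, so treat them as a template rather than
something to run as-is.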



Thanks

Mohan DV



*From: *Justin Leet <[email protected]>
*Reply-To: *"[email protected]" <[email protected]>
*Date: *Tuesday, May 16, 2017 at 7:39 PM
*To: *"[email protected]" <[email protected]>
*Subject: *Re: kerberizing Metron's Deployment on real cluster



Not sure if this is the case for you, but if the KDC was never set up to
issue renewable tickets and the principals were already created, you'll
have to edit them to issue renewable tickets (the KDC change doesn't affect
existing principals).



See point 3:

https://github.com/apache/metron/blob/master/metron-deployment/Kerberos-manual-setup.md#storm-authorization



On Tue, May 16, 2017 at 9:50 AM, Nick Allen <[email protected]> wrote:

Hi Mohan - In my experience, you need to set up your KDC so that it can issue
renewable tickets.  I would not move beyond that step until you resolve that
issue.
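
A quick way to confirm the KDC is actually issuing renewable tickets
(the principal name below is a placeholder; the keytab path is from your
trace):

    # Request an explicitly renewable TGT and inspect its flags:
    kinit -r 7d -kt /etc/security/keytabs/metron.headless.keytab [email protected]
    klist -f    # the flags column should include 'R' (renewable)
    kinit -R    # renewal should succeed only if the TGT is renewable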



On Mon, May 15, 2017 at 12:32 PM, Mohan Venkateshaiah <
[email protected]> wrote:

Hi All,



I am enabling Kerberos on a 12-node cluster. I successfully installed the
KDC, set all the required properties in the conf files, and added the
required principals. While enabling Kerberos through the wizard in Ambari,
it fails at 'Start and Test Services'; the failing task is 'Metron
Enrichment Start'. Below is the trace; the exception is



Caused by: java.lang.RuntimeException: The TGT found is not renewable



I have set 'max_renewable_life = 7d' in /var/kerberos/krb5kdc/kdc.conf in
the realm section. If the KDC cannot issue renewable tickets, should I
remove this property and proceed?
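
For reference, this is where I placed the property in kdc.conf
(EXAMPLE.COM stands in for the real realm name):

    [realms]
        EXAMPLE.COM = {
            max_life = 1d
            max_renewable_life = 7d
        }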





stderr:

Traceback (most recent call last):

  File "/var/lib/ambari-agent/cache/common-services/METRON/0.4.0.
1.1.0.0/package/scripts/enrichment_master.py", line 113, in <module>

    Enrichment().execute()

  File
"/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py",
line 280, in execute

    method(env)

  File "/var/lib/ambari-agent/cache/common-services/METRON/0.4.0.
1.1.0.0/package/scripts/enrichment_master.py", line 74, in start

    commands.start_enrichment_topology()

  File "/var/lib/ambari-agent/cache/common-services/METRON/0.4.0.
1.1.0.0/package/scripts/enrichment_commands.py", line 146, in
start_enrichment_topology

    user=self.__params.metron_user)

  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py",
line 155, in __init__

    self.env.run()

  File
"/usr/lib/python2.6/site-packages/resource_management/core/environment.py",
line 160, in run

    self.run_action(resource, action)

  File
"/usr/lib/python2.6/site-packages/resource_management/core/environment.py",
line 124, in run_action

    provider_action()

  File
"/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py",
line 273, in action_run

    tries=self.resource.tries, try_sleep=self.resource.try_sleep)

  File
"/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line
70, in inner

    result = function(command, **kwargs)

  File
"/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line
92, in checked_call

    tries=tries, try_sleep=try_sleep)

  File
"/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line
140, in _call_wrapper

    result = _call(command, **kwargs_copy)

  File
"/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line
293, in _call

    raise ExecutionFailed(err_msg, code, out, err)

resource_management.core.exceptions.ExecutionFailed: Execution of
'/usr/hcp/1.1.0.0-71/metron/bin/start_enrichment_topology.sh
-s enrichment                                     -z
hcpa-11.openstacklocal:2181,hcpa-12.openstacklocal:2181,hcpa-10.openstacklocal:2181'
returned 1. Running: /usr/jdk64/jdk1.8.0_77/bin/java -server -Ddaemon.name=
-Dstorm.options= -Dstorm.home=/grid/0/hdp/2.5.3.0-37/storm
-Dstorm.log.dir=/var/log/storm
-Djava.library.path=/usr/local/lib:/opt/local/lib:/usr/lib
-Dstorm.conf.file= -cp
/grid/0/hdp/2.5.3.0-37/storm/lib/zookeeper.jar:/grid/0/hdp/2.5.3.0-37/storm/lib/storm-core-1.0.1.2.5.3.0-37.jar:/grid/0/hdp/2.5.3.0-37/storm/lib/kryo-3.0.3.jar:/grid/0/hdp/2.5.3.0-37/storm/lib/log4j-slf4j-impl-2.1.jar:/grid/0/hdp/2.5.3.0-37/storm/lib/log4j-core-2.1.jar:/grid/0/hdp/2.5.3.0-37/storm/lib/ring-cors-0.1.5.jar:/grid/0/hdp/2.5.3.0-37/storm/lib/log4j-api-2.1.jar:/grid/0/hdp/2.5.3.0-37/storm/lib/servlet-api-2.5.jar:/grid/0/hdp/
2.5.3.0-37/storm/lib/minlog-1.3.0.jar:/grid/0/hdp/2.5.3.0-37/storm/lib/log4j-over-slf4j-1.6.6.jar:/grid/0/hdp/2.5.3.0-37/storm/lib/objenesis-2.1.jar:/grid/0/hdp/2.5.3.0-37/storm/lib/asm-5.0.3.jar:/grid/0/hdp/2.5.3.0-37/storm/lib/clojure-1.7.0.jar:/grid/0/hdp/2.5.3.0-37/storm/lib/storm-rename-hack-1.0.1.2.5.3.0-37.jar:/grid/0/hdp/2.5.3.0-37/storm/lib/disruptor-3.3.2.jar:/grid/0/hdp/2.5.3.0-37/storm/lib/slf4j-api-1.7.7.jar:/grid/0/hdp/2.5.3.0-37/storm/lib/reflectasm-1.10.1.jar
org.apache.storm.daemon.ClientJarTransformerRunner
org.apache.storm.hack.StormShadeTransformer
/usr/hcp/1.1.0.0-71/metron/lib/metron-enrichment-0.4.0.1.1.0.0-71-uber.jar
/tmp/e11b7abc396311e7b0c2fa163e0f2645.jar

Running: /usr/jdk64/jdk1.8.0_77/bin/java -client -Ddaemon.name=
-Dstorm.options= -Dstorm.home=/grid/0/hdp/2.5.3.0-37/storm
-Dstorm.log.dir=/var/log/storm
-Djava.library.path=/usr/local/lib:/opt/local/lib:/usr/lib
-Dstorm.conf.file= -cp
/grid/0/hdp/2.5.3.0-37/storm/lib/zookeeper.jar:/grid/0/hdp/2.5.3.0-37/storm/lib/storm-core-1.0.1.2.5.3.0-37.jar:/grid/0/hdp/2.5.3.0-37/storm/lib/kryo-3.0.3.jar:/grid/0/hdp/2.5.3.0-37/storm/lib/log4j-slf4j-impl-2.1.jar:/grid/0/hdp/2.5.3.0-37/storm/lib/log4j-core-2.1.jar:/grid/0/hdp/2.5.3.0-37/storm/lib/ring-cors-0.1.5.jar:/grid/0/hdp/2.5.3.0-37/storm/lib/log4j-api-2.1.jar:/grid/0/hdp/2.5.3.0-37/storm/lib/servlet-api-2.5.jar:/grid/0/hdp/
2.5.3.0-37/storm/lib/minlog-1.3.0.jar:/grid/0/hdp/2.5.3.0-37/storm/lib/log4j-over-slf4j-1.6.6.jar:/grid/0/hdp/2.5.3.0-37/storm/lib/objenesis-2.1.jar:/grid/0/hdp/2.5.3.0-37/storm/lib/asm-5.0.3.jar:/grid/0/hdp/2.5.3.0-37/storm/lib/clojure-1.7.0.jar:/grid/0/hdp/2.5.3.0-37/storm/lib/storm-rename-hack-1.0.1.2.5.3.0-37.jar:/grid/0/hdp/2.5.3.0-37/storm/lib/disruptor-3.3.2.jar:/grid/0/hdp/2.5.3.0-37/storm/lib/slf4j-api-1.7.7.jar:/grid/0/hdp/2.5.3.0-37/storm/lib/reflectasm-1.10.1.jar:/tmp/e11b7abc396311e7b0c2fa163e0f2645.jar:/home/metron/.storm:/grid/0/hdp/2.5.3.0-37/storm/bin
-Dstorm.jar=/tmp/e11b7abc396311e7b0c2fa163e0f2645.jar
org.apache.storm.flux.Flux --remote
/usr/hcp/1.1.0.0-71/metron/flux/enrichment/remote.yaml --filter
/usr/hcp/1.1.0.0-71/metron/config/enrichment.properties

███████╗██╗     ██╗   ██╗██╗  ██╗

██╔════╝██║     ██║   ██║╚██╗██╔╝

█████╗  ██║     ██║   ██║ ╚███╔╝

██╔══╝  ██║     ██║   ██║ ██╔██╗

██║     ███████╗╚██████╔╝██╔╝ ██╗

╚═╝     ╚══════╝ ╚═════╝ ╚═╝  ╚═╝

+-         Apache Storm        -+

+-  data FLow User eXperience  -+

Version: 1.0.1

Parsing file: /usr/hcp/1.1.0.0-71/metron/flux/enrichment/remote.yaml

629  [main] INFO  o.a.s.f.p.FluxParser - loading YAML from input stream...

640  [main] INFO  o.a.s.f.p.FluxParser - Performing property substitution.

658  [main] INFO  o.a.s.f.p.FluxParser - Not performing environment
variable substitution.

982  [main] INFO  o.a.c.f.i.CuratorFrameworkImpl - Starting

1097 [main-EventThread] INFO  o.a.c.f.s.ConnectionStateManager - State
change: CONNECTED

1439 [main] INFO  o.a.s.f.FluxBuilder - Detected DSL topology...

1728 [main] INFO  o.a.s.k.s.KafkaSpoutStream - Declared [streamId =
default], [outputFields = [value]] for [topic = enrichments]

---------- TOPOLOGY DETAILS ----------

Topology Name: enrichment

--------------- SPOUTS ---------------

kafkaSpout [1] (org.apache.metron.storm.kafka.flux.StormKafkaSpout)

---------------- BOLTS ---------------

enrichmentSplitBolt [1]
(org.apache.metron.enrichment.bolt.EnrichmentSplitterBolt)

geoEnrichmentBolt [1]
(org.apache.metron.enrichment.bolt.GenericEnrichmentBolt)

stellarEnrichmentBolt [1]
(org.apache.metron.enrichment.bolt.GenericEnrichmentBolt)

hostEnrichmentBolt [1]
(org.apache.metron.enrichment.bolt.GenericEnrichmentBolt)

simpleHBaseEnrichmentBolt [1]
(org.apache.metron.enrichment.bolt.GenericEnrichmentBolt)

enrichmentJoinBolt [1]
(org.apache.metron.enrichment.bolt.EnrichmentJoinBolt)

enrichmentErrorOutputBolt [1]
(org.apache.metron.writer.bolt.BulkMessageWriterBolt)

threatIntelSplitBolt [1]
(org.apache.metron.enrichment.bolt.ThreatIntelSplitterBolt)

simpleHBaseThreatIntelBolt [1]
(org.apache.metron.enrichment.bolt.GenericEnrichmentBolt)

stellarThreatIntelBolt [1]
(org.apache.metron.enrichment.bolt.GenericEnrichmentBolt)

threatIntelJoinBolt [1]
(org.apache.metron.enrichment.bolt.ThreatIntelJoinBolt)

threatIntelErrorOutputBolt [1]
(org.apache.metron.writer.bolt.BulkMessageWriterBolt)

outputBolt [1] (org.apache.metron.writer.bolt.BulkMessageWriterBolt)

--------------- STREAMS ---------------

kafkaSpout --SHUFFLE--> enrichmentSplitBolt

enrichmentSplitBolt --FIELDS--> hostEnrichmentBolt

enrichmentSplitBolt --FIELDS--> geoEnrichmentBolt

enrichmentSplitBolt --FIELDS--> stellarEnrichmentBolt

enrichmentSplitBolt --FIELDS--> simpleHBaseEnrichmentBolt

enrichmentSplitBolt --FIELDS--> enrichmentJoinBolt

geoEnrichmentBolt --FIELDS--> enrichmentJoinBolt

stellarEnrichmentBolt --FIELDS--> enrichmentJoinBolt

simpleHBaseEnrichmentBolt --FIELDS--> enrichmentJoinBolt

hostEnrichmentBolt --FIELDS--> enrichmentJoinBolt

geoEnrichmentBolt --FIELDS--> enrichmentErrorOutputBolt

stellarEnrichmentBolt --FIELDS--> enrichmentErrorOutputBolt

hostEnrichmentBolt --FIELDS--> enrichmentErrorOutputBolt

simpleHBaseEnrichmentBolt --FIELDS--> enrichmentErrorOutputBolt

enrichmentJoinBolt --FIELDS--> threatIntelSplitBolt

threatIntelSplitBolt --FIELDS--> simpleHBaseThreatIntelBolt

threatIntelSplitBolt --FIELDS--> stellarThreatIntelBolt

simpleHBaseThreatIntelBolt --FIELDS--> threatIntelJoinBolt

stellarThreatIntelBolt --FIELDS--> threatIntelJoinBolt

threatIntelSplitBolt --FIELDS--> threatIntelJoinBolt

threatIntelJoinBolt --FIELDS--> outputBolt

simpleHBaseThreatIntelBolt --FIELDS--> threatIntelErrorOutputBolt

stellarThreatIntelBolt --FIELDS--> threatIntelErrorOutputBolt

--------------------------------------

1752 [main] INFO  o.a.s.f.Flux - Running remotely...

1752 [main] INFO  o.a.s.f.Flux - Deploying topology in an ACTIVE state...

1768 [main] INFO  o.a.s.StormSubmitter - Generated ZooKeeper secret payload
for MD5-digest: -7795983889088274963:-7645329918562802951

1851 [main] INFO  o.a.s.s.a.AuthUtils - Got AutoCreds
[org.apache.storm.security.auth.kerberos.AutoTGT@798256c5]

1851 [main] INFO  o.a.s.StormSubmitter - Running
org.apache.storm.security.auth.kerberos.AutoTGT@798256c5

Exception in thread "main" java.lang.RuntimeException:
java.lang.RuntimeException: The TGT found is not renewable

                at
org.apache.storm.security.auth.kerberos.AutoTGT.populateCredentials(AutoTGT.java:103)

                at
org.apache.storm.StormSubmitter.populateCredentials(StormSubmitter.java:94)

                at
org.apache.storm.StormSubmitter.submitTopologyAs(StormSubmitter.java:214)

                at
org.apache.storm.StormSubmitter.submitTopology(StormSubmitter.java:310)

                at org.apache.storm.flux.Flux.runCli(Flux.java:171)

                at org.apache.storm.flux.Flux.main(Flux.java:98)

Caused by: java.lang.RuntimeException: The TGT found is not renewable

                at
org.apache.storm.security.auth.kerberos.AutoTGT.populateCredentials(AutoTGT.java:94)

                ... 5 more

stdout:

2017-05-15 11:44:35,171 - Using hadoop conf dir:
/usr/hdp/current/hadoop-client/conf

2017-05-15 11:44:35,314 - Using hadoop conf dir:
/usr/hdp/current/hadoop-client/conf

2017-05-15 11:44:35,316 - Group['metron'] {}

2017-05-15 11:44:35,317 - Group['livy'] {}

2017-05-15 11:44:35,317 - Group['elasticsearch'] {}

2017-05-15 11:44:35,317 - Group['spark'] {}

2017-05-15 11:44:35,317 - Group['zeppelin'] {}

2017-05-15 11:44:35,317 - Group['hadoop'] {}

2017-05-15 11:44:35,318 - Group['kibana'] {}

2017-05-15 11:44:35,318 - Group['users'] {}

2017-05-15 11:44:35,318 - User['hive'] {'gid': 'hadoop',
'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}

2017-05-15 11:44:35,319 - User['storm'] {'gid': 'hadoop',
'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}

2017-05-15 11:44:35,319 - User['zookeeper'] {'gid': 'hadoop',
'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}

2017-05-15 11:44:35,320 - User['tez'] {'gid': 'hadoop',
'fetch_nonlocal_groups': True, 'groups': [u'users']}

2017-05-15 11:44:35,320 - User['zeppelin'] {'gid': 'hadoop',
'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}

2017-05-15 11:44:35,321 - User['metron'] {'gid': 'hadoop',
'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}

2017-05-15 11:44:35,321 - User['livy'] {'gid': 'hadoop',
'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}

2017-05-15 11:44:35,322 - User['elasticsearch'] {'gid': 'hadoop',
'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}

2017-05-15 11:44:35,323 - User['spark'] {'gid': 'hadoop',
'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}

2017-05-15 11:44:35,323 - User['ambari-qa'] {'gid': 'hadoop',
'fetch_nonlocal_groups': True, 'groups': [u'users']}

2017-05-15 11:44:35,324 - User['kafka'] {'gid': 'hadoop',
'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}

2017-05-15 11:44:35,324 - User['hdfs'] {'gid': 'hadoop',
'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}

2017-05-15 11:44:35,325 - User['yarn'] {'gid': 'hadoop',
'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}

2017-05-15 11:44:35,325 - User['kibana'] {'gid': 'hadoop',
'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}

2017-05-15 11:44:35,326 - User['mapred'] {'gid': 'hadoop',
'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}

2017-05-15 11:44:35,326 - User['hbase'] {'gid': 'hadoop',
'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}

2017-05-15 11:44:35,327 - User['hcat'] {'gid': 'hadoop',
'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}

2017-05-15 11:44:35,328 - File['/var/lib/ambari-agent/tmp/changeUid.sh']
{'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}

2017-05-15 11:44:35,329 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh
ambari-qa
/tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa']
{'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}

2017-05-15 11:44:35,344 - Skipping
Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa
/tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa']
due to not_if

2017-05-15 11:44:35,344 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase',
'create_parents': True, 'mode': 0775, 'cd_access': 'a'}

2017-05-15 11:44:35,345 - File['/var/lib/ambari-agent/tmp/changeUid.sh']
{'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}

2017-05-15 11:44:35,346 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh
hbase
/home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase']
{'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}

2017-05-15 11:44:35,363 - Skipping
Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase
/home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] due
to not_if

2017-05-15 11:44:35,364 - Group['hdfs'] {}

2017-05-15 11:44:35,364 - User['hdfs'] {'fetch_nonlocal_groups': True,
'groups': [u'hadoop', u'hdfs']}

2017-05-15 11:44:35,364 - FS Type:

2017-05-15 11:44:35,365 - Directory['/etc/hadoop'] {'mode': 0755}

2017-05-15 11:44:35,379 -
File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content':
InlineTemplate(...), 'owner': 'root', 'group': 'hadoop'}

2017-05-15 11:44:35,379 -
Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner':
'hdfs', 'group': 'hadoop', 'mode': 01777}

2017-05-15 11:44:35,396 - Execute[('setenforce', '0')] {'not_if': '(! which
getenforce ) || (which getenforce && getenforce | grep -q Disabled)',
'sudo': True, 'only_if': 'test -f /selinux/enforce'}

2017-05-15 11:44:35,429 - Skipping Execute[('setenforce', '0')] due to
only_if

2017-05-15 11:44:35,430 - Directory['/var/log/hadoop'] {'owner': 'root',
'create_parents': True, 'group': 'hadoop', 'mode': 0775, 'cd_access': 'a'}

2017-05-15 11:44:35,433 - Directory['/var/run/hadoop'] {'owner': 'root',
'create_parents': True, 'group': 'root', 'cd_access': 'a'}

2017-05-15 11:44:35,434 - Directory['/tmp/hadoop-hdfs'] {'owner': 'hdfs',
'create_parents': True, 'cd_access': 'a'}

2017-05-15 11:44:35,441 - File['/usr/hdp/current/hadoop-client/conf/
commons-logging.properties'] {'content':
Template('commons-logging.properties.j2'), 'owner': 'root'}

2017-05-15 11:44:35,442 -
File['/usr/hdp/current/hadoop-client/conf/health_check'] {'content':
Template('health_check.j2'), 'owner': 'root'}

2017-05-15 11:44:35,443 -
File['/usr/hdp/current/hadoop-client/conf/log4j.properties'] {'content':
..., 'owner': 'hdfs', 'group': 'hadoop', 'mode': 0644}

2017-05-15 11:44:35,454 - File['/usr/hdp/current/hadoop-client/conf/
hadoop-metrics2.properties'] {'content':
Template('hadoop-metrics2.properties.j2'), 'owner': 'hdfs', 'group':
'hadoop'}

2017-05-15 11:44:35,455 -
File['/usr/hdp/current/hadoop-client/conf/task-log4j.properties']
{'content': StaticFile('task-log4j.properties'), 'mode': 0755}

2017-05-15 11:44:35,456 -
File['/usr/hdp/current/hadoop-client/conf/configuration.xsl'] {'owner':
'hdfs', 'group': 'hadoop'}

2017-05-15 11:44:35,460 - File['/etc/hadoop/conf/topology_mappings.data']
{'owner': 'hdfs', 'content': Template('topology_mappings.data.j2'),
'only_if': 'test -d /etc/hadoop/conf', 'group': 'hadoop'}

2017-05-15 11:44:35,477 - File['/etc/hadoop/conf/topology_script.py']
{'content': StaticFile('topology_script.py'), 'only_if': 'test -d
/etc/hadoop/conf', 'mode': 0755}

2017-05-15 11:44:35,709 - Using hadoop conf dir:
/usr/hdp/current/hadoop-client/conf

2017-05-15 11:44:35,711 - Running enrichment configure

2017-05-15 11:44:35,717 -
File['/usr/hcp/1.1.0.0-71/metron/config/enrichment.properties'] {'owner':
'metron', 'content': Template('enrichment.properties.j2'), 'group':
'metron'}

2017-05-15 11:44:35,719 - Calling security setup

2017-05-15 11:44:35,720 - Directory['/usr/hcp/1.1.0.0-71/metron'] {'owner':
'metron', 'group': 'metron', 'create_parents': True, 'mode': 0755}

2017-05-15 11:44:35,720 - Directory['/home/metron/.storm'] {'owner':
'metron', 'group': 'metron', 'mode': 0755}

2017-05-15 11:44:35,722 -
File['/usr/hcp/1.1.0.0-71/metron/client_jaas.conf'] {'owner': 'metron',
'content': Template('client_jaas.conf.j2'), 'group': 'metron', 'mode': 0755}

2017-05-15 11:44:35,724 - File['/home/metron/.storm/storm.yaml'] {'owner':
'metron', 'content': Template('storm.yaml.j2'), 'group': 'metron', 'mode':
0755}

2017-05-15 11:44:35,725 - File['/home/metron/.storm/storm.config']
{'owner': 'metron', 'content': Template('storm.config.j2'), 'group':
'metron', 'mode': 0755}

2017-05-15 11:44:35,726 - kinit command: /usr/bin/kinit -kt
/etc/security/keytabs/metron.headless.keytab [email protected];  as user:
metron

2017-05-15 11:44:35,726 - Execute['/usr/bin/kinit -kt
/etc/security/keytabs/metron.headless.keytab [email protected]; ']
{'user': 'metron'}

2017-05-15 11:44:35,807 - Create Metron Local Config Directory

2017-05-15 11:44:35,807 - Configure Metron global.json

2017-05-15 11:44:35,807 -
Directory['/usr/hcp/1.1.0.0-71/metron/config/zookeeper'] {'owner':
'metron', 'group': 'metron', 'mode': 0755}

2017-05-15 11:44:35,810 -
File['/usr/hcp/1.1.0.0-71/metron/config/zookeeper/global.json'] {'content':
InlineTemplate(...), 'owner': 'metron'}

2017-05-15 11:44:35,813 -
File['/usr/hcp/1.1.0.0-71/metron/config/zookeeper/../elasticsearch.properties']
{'content': InlineTemplate(...), 'owner': 'metron'}

2017-05-15 11:44:35,814 - Loading config into ZooKeeper

2017-05-15 11:44:35,814 -
Execute['/usr/hcp/1.1.0.0-71/metron/bin/zk_load_configs.sh --mode PUSH -i
/usr/hcp/1.1.0.0-71/metron/config/zookeeper -z
hcpa-11.openstacklocal:2181,hcpa-12.openstacklocal:2181,hcpa-10.openstacklocal:2181']
{'path': [u'/usr/jdk64/jdk1.8.0_77/bin']}

2017-05-15 11:44:37,486 - Starting Metron enrichment topology: enrichment

2017-05-15 11:44:37,486 - Starting enrichment

2017-05-15 11:44:37,486 -
Execute['/usr/hcp/1.1.0.0-71/metron/bin/start_enrichment_topology.sh
-s enrichment                                     -z
hcpa-11.openstacklocal:2181,hcpa-12.openstacklocal:2181,hcpa-10.openstacklocal:2181']
{'user': 'metron'}



Command failed after 1 tries





Thanks

Mohan DV
