[
https://issues.apache.org/jira/browse/METRON-895?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15985585#comment-15985585
]
Nick Allen commented on METRON-895:
-----------------------------------
To fix this I need to modify the principals. I may need to do this for both
the 'metron' principal and the ticket-granting principal, but I am not sure
yet. I will try to narrow down the specific fix.
Metron principal
{code}
kadmin: modprinc -maxlife 1days -maxrenewlife 7days +allow_renewable [email protected]
{code}
Ticket granting principal
{code}
kadmin: modprinc -maxlife 1days -maxrenewlife 7days +allow_renewable krbtgt/[email protected]
{code}
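Once the principals are modified, the change can be verified with kadmin's 'getprinc' and by re-acquiring the TGT as the 'metron' user. This is only a sketch; the keytab path and principal are taken from the Ambari log in the description below.
{code}
kadmin: getprinc [email protected]
# expect "Maximum ticket life: 1 day" and "Maximum renewable life: 7 days"

# then, as the 'metron' user, re-acquire the TGT and confirm it is renewable
kinit -kt /etc/security/keytabs/metron.headless.keytab [email protected]
klist -f     # the 'R' flag marks a renewable ticket
kinit -R     # renewal should now succeed
{code}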
> Ambari "Metron Enrichment Start" Fails - The TGT found is not renewable
> -----------------------------------------------------------------------
>
> Key: METRON-895
> URL: https://issues.apache.org/jira/browse/METRON-895
> Project: Metron
> Issue Type: Bug
> Reporter: Nick Allen
>
> After Kerberizing a cluster in Ambari with Metron already installed, I am
> unable to launch the enrichment topology. It fails with "The TGT found is
> not renewable".
> I am running on CentOS 7, and this is likely specific to this version of
> CentOS.
> {code}
> [root@y113 ~]# cat /etc/centos-release
> CentOS Linux release 7.2.1511 (Core)
> {code}
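> A quick way to confirm the symptom (a diagnostic sketch, not part of the
> original output) is to inspect the ticket flags and the client-side renew
> settings as the 'metron' user:
> {code}
> klist -f                             # no 'R' flag on the krbtgt ticket means the TGT is not renewable
> grep renew_lifetime /etc/krb5.conf   # [libdefaults] renew_lifetime must also permit renewal
> {code}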
> {code}
> Traceback (most recent call last):
>   File "/var/lib/ambari-agent/cache/common-services/METRON/0.4.0/package/scripts/enrichment_master.py", line 113, in <module>
>     Enrichment().execute()
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 280, in execute
>     method(env)
>   File "/var/lib/ambari-agent/cache/common-services/METRON/0.4.0/package/scripts/enrichment_master.py", line 74, in start
>     commands.start_enrichment_topology()
>   File "/var/lib/ambari-agent/cache/common-services/METRON/0.4.0/package/scripts/enrichment_commands.py", line 146, in start_enrichment_topology
>     user=self.__params.metron_user)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 155, in __init__
>     self.env.run()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
>     self.run_action(resource, action)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
>     provider_action()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 273, in action_run
>     tries=self.resource.tries, try_sleep=self.resource.try_sleep)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 70, in inner
>     result = function(command, **kwargs)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 92, in checked_call
>     tries=tries, try_sleep=try_sleep)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 140, in _call_wrapper
>     result = _call(command, **kwargs_copy)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 293, in _call
>     raise ExecutionFailed(err_msg, code, out, err)
> resource_management.core.exceptions.ExecutionFailed: Execution of '/usr/metron/0.4.0/bin/start_enrichment_topology.sh -s enrichment -z y113.l42scl.hortonworks.com:2181,y114.l42scl.hortonworks.com:2181,y115.l42scl.hortonworks.com:2181'
> returned 1. Running: /usr/jdk64/jdk1.8.0_77/bin/java -server -Ddaemon.name=
> -Dstorm.options= -Dstorm.home=/usr/hdp/2.5.3.0-37/storm
> -Dstorm.log.dir=/var/log/storm
> -Djava.library.path=/usr/local/lib:/opt/local/lib:/usr/lib -Dstorm.conf.file=
> -cp
> /usr/hdp/2.5.3.0-37/storm/lib/clojure-1.7.0.jar:/usr/hdp/2.5.3.0-37/storm/lib/disruptor-3.3.2.jar:/usr/hdp/2.5.3.0-37/storm/lib/log4j-slf4j-impl-2.1.jar:/usr/hdp/2.5.3.0-37/storm/lib/storm-rename-hack-1.0.1.2.5.3.0-37.jar:/usr/hdp/2.5.3.0-37/storm/lib/log4j-api-2.1.jar:/usr/hdp/2.5.3.0-37/storm/lib/ring-cors-0.1.5.jar:/usr/hdp/2.5.3.0-37/storm/lib/log4j-core-2.1.jar:/usr/hdp/2.5.3.0-37/storm/lib/asm-5.0.3.jar:/usr/hdp/2.5.3.0-37/storm/lib/log4j-over-slf4j-1.6.6.jar:/usr/hdp/2.5.3.0-37/storm/lib/slf4j-api-1.7.7.jar:/usr/hdp/2.5.3.0-37/storm/lib/servlet-api-2.5.jar:/usr/hdp/2.5.3.0-37/storm/lib/zookeeper.jar:/usr/hdp/2.5.3.0-37/storm/lib/minlog-1.3.0.jar:/usr/hdp/2.5.3.0-37/storm/lib/kryo-3.0.3.jar:/usr/hdp/2.5.3.0-37/storm/lib/storm-core-1.0.1.2.5.3.0-37.jar:/usr/hdp/2.5.3.0-37/storm/lib/reflectasm-1.10.1.jar:/usr/hdp/2.5.3.0-37/storm/lib/objenesis-2.1.jar:/usr/hdp/2.5.3.0-37/storm/lib/ambari-metrics-storm-sink.jar
> org.apache.storm.daemon.ClientJarTransformerRunner
> org.apache.storm.hack.StormShadeTransformer
> /usr/metron/0.4.0/lib/metron-enrichment-0.4.0-uber.jar
> /tmp/0bea323c2aba11e781e900351a9eb24a.jar
> Running: /usr/jdk64/jdk1.8.0_77/bin/java -client -Ddaemon.name=
> -Dstorm.options= -Dstorm.home=/usr/hdp/2.5.3.0-37/storm
> -Dstorm.log.dir=/var/log/storm
> -Djava.library.path=/usr/local/lib:/opt/local/lib:/usr/lib -Dstorm.conf.file=
> -cp
> /usr/hdp/2.5.3.0-37/storm/lib/clojure-1.7.0.jar:/usr/hdp/2.5.3.0-37/storm/lib/disruptor-3.3.2.jar:/usr/hdp/2.5.3.0-37/storm/lib/log4j-slf4j-impl-2.1.jar:/usr/hdp/2.5.3.0-37/storm/lib/storm-rename-hack-1.0.1.2.5.3.0-37.jar:/usr/hdp/2.5.3.0-37/storm/lib/log4j-api-2.1.jar:/usr/hdp/2.5.3.0-37/storm/lib/ring-cors-0.1.5.jar:/usr/hdp/2.5.3.0-37/storm/lib/log4j-core-2.1.jar:/usr/hdp/2.5.3.0-37/storm/lib/asm-5.0.3.jar:/usr/hdp/2.5.3.0-37/storm/lib/log4j-over-slf4j-1.6.6.jar:/usr/hdp/2.5.3.0-37/storm/lib/slf4j-api-1.7.7.jar:/usr/hdp/2.5.3.0-37/storm/lib/servlet-api-2.5.jar:/usr/hdp/2.5.3.0-37/storm/lib/zookeeper.jar:/usr/hdp/2.5.3.0-37/storm/lib/minlog-1.3.0.jar:/usr/hdp/2.5.3.0-37/storm/lib/kryo-3.0.3.jar:/usr/hdp/2.5.3.0-37/storm/lib/storm-core-1.0.1.2.5.3.0-37.jar:/usr/hdp/2.5.3.0-37/storm/lib/reflectasm-1.10.1.jar:/usr/hdp/2.5.3.0-37/storm/lib/objenesis-2.1.jar:/usr/hdp/2.5.3.0-37/storm/lib/ambari-metrics-storm-sink.jar:/tmp/0bea323c2aba11e781e900351a9eb24a.jar:/home/metron/.storm:/usr/hdp/2.5.3.0-37/storm/bin
> -Dstorm.jar=/tmp/0bea323c2aba11e781e900351a9eb24a.jar
> org.apache.storm.flux.Flux --remote
> /usr/metron/0.4.0/flux/enrichment/remote.yaml --filter
> /usr/metron/0.4.0/config/enrichment.properties
> ███████╗██╗ ██╗ ██╗██╗ ██╗
> ██╔════╝██║ ██║ ██║╚██╗██╔╝
> █████╗ ██║ ██║ ██║ ╚███╔╝
> ██╔══╝ ██║ ██║ ██║ ██╔██╗
> ██║ ███████╗╚██████╔╝██╔╝ ██╗
> ╚═╝ ╚══════╝ ╚═════╝ ╚═╝ ╚═╝
> +- Apache Storm -+
> +- data FLow User eXperience -+
> Version: 1.0.1
> Parsing file: /usr/metron/0.4.0/flux/enrichment/remote.yaml
> 624 [main] INFO o.a.s.f.p.FluxParser - loading YAML from input stream...
> 635 [main] INFO o.a.s.f.p.FluxParser - Performing property substitution.
> 647 [main] INFO o.a.s.f.p.FluxParser - Not performing environment variable
> substitution.
> 900 [main] INFO o.a.c.f.i.CuratorFrameworkImpl - Starting
> 1041 [main-EventThread] INFO o.a.c.f.s.ConnectionStateManager - State
> change: CONNECTED
> 2397 [main] INFO o.a.s.f.FluxBuilder - Detected DSL topology...
> 2676 [main] INFO o.a.s.k.s.KafkaSpoutStream - Declared [streamId = default],
> [outputFields = [value]] for [topic = enrichments]
> ---------- TOPOLOGY DETAILS ----------
> Topology Name: enrichment
> --------------- SPOUTS ---------------
> kafkaSpout [1] (org.apache.metron.storm.kafka.flux.StormKafkaSpout)
> ---------------- BOLTS ---------------
> enrichmentSplitBolt [1]
> (org.apache.metron.enrichment.bolt.EnrichmentSplitterBolt)
> geoEnrichmentBolt [1]
> (org.apache.metron.enrichment.bolt.GenericEnrichmentBolt)
> stellarEnrichmentBolt [1]
> (org.apache.metron.enrichment.bolt.GenericEnrichmentBolt)
> hostEnrichmentBolt [1]
> (org.apache.metron.enrichment.bolt.GenericEnrichmentBolt)
> simpleHBaseEnrichmentBolt [1]
> (org.apache.metron.enrichment.bolt.GenericEnrichmentBolt)
> enrichmentJoinBolt [1] (org.apache.metron.enrichment.bolt.EnrichmentJoinBolt)
> enrichmentErrorOutputBolt [1]
> (org.apache.metron.writer.bolt.BulkMessageWriterBolt)
> threatIntelSplitBolt [1]
> (org.apache.metron.enrichment.bolt.ThreatIntelSplitterBolt)
> simpleHBaseThreatIntelBolt [1]
> (org.apache.metron.enrichment.bolt.GenericEnrichmentBolt)
> stellarThreatIntelBolt [1]
> (org.apache.metron.enrichment.bolt.GenericEnrichmentBolt)
> threatIntelJoinBolt [1]
> (org.apache.metron.enrichment.bolt.ThreatIntelJoinBolt)
> threatIntelErrorOutputBolt [1]
> (org.apache.metron.writer.bolt.BulkMessageWriterBolt)
> outputBolt [1] (org.apache.metron.writer.bolt.BulkMessageWriterBolt)
> --------------- STREAMS ---------------
> kafkaSpout --SHUFFLE--> enrichmentSplitBolt
> enrichmentSplitBolt --FIELDS--> hostEnrichmentBolt
> enrichmentSplitBolt --FIELDS--> geoEnrichmentBolt
> enrichmentSplitBolt --FIELDS--> stellarEnrichmentBolt
> enrichmentSplitBolt --FIELDS--> simpleHBaseEnrichmentBolt
> enrichmentSplitBolt --FIELDS--> enrichmentJoinBolt
> geoEnrichmentBolt --FIELDS--> enrichmentJoinBolt
> stellarEnrichmentBolt --FIELDS--> enrichmentJoinBolt
> simpleHBaseEnrichmentBolt --FIELDS--> enrichmentJoinBolt
> hostEnrichmentBolt --FIELDS--> enrichmentJoinBolt
> geoEnrichmentBolt --FIELDS--> enrichmentErrorOutputBolt
> stellarEnrichmentBolt --FIELDS--> enrichmentErrorOutputBolt
> hostEnrichmentBolt --FIELDS--> enrichmentErrorOutputBolt
> simpleHBaseEnrichmentBolt --FIELDS--> enrichmentErrorOutputBolt
> enrichmentJoinBolt --FIELDS--> threatIntelSplitBolt
> threatIntelSplitBolt --FIELDS--> simpleHBaseThreatIntelBolt
> threatIntelSplitBolt --FIELDS--> stellarThreatIntelBolt
> simpleHBaseThreatIntelBolt --FIELDS--> threatIntelJoinBolt
> stellarThreatIntelBolt --FIELDS--> threatIntelJoinBolt
> threatIntelSplitBolt --FIELDS--> threatIntelJoinBolt
> threatIntelJoinBolt --FIELDS--> outputBolt
> simpleHBaseThreatIntelBolt --FIELDS--> threatIntelErrorOutputBolt
> stellarThreatIntelBolt --FIELDS--> threatIntelErrorOutputBolt
> --------------------------------------
> 2701 [main] INFO o.a.s.f.Flux - Running remotely...
> 2701 [main] INFO o.a.s.f.Flux - Deploying topology in an ACTIVE state...
> 2716 [main] INFO o.a.s.StormSubmitter - Generated ZooKeeper secret payload
> for MD5-digest: -5842172149707655819:-6224174930371947516
> 2772 [main] INFO o.a.s.s.a.AuthUtils - Got AutoCreds
> [org.apache.storm.security.auth.kerberos.AutoTGT@748a654a]
> 2773 [main] INFO o.a.s.StormSubmitter - Running
> org.apache.storm.security.auth.kerberos.AutoTGT@748a654a
> Exception in thread "main" java.lang.RuntimeException: java.lang.RuntimeException: The TGT found is not renewable
>     at org.apache.storm.security.auth.kerberos.AutoTGT.populateCredentials(AutoTGT.java:103)
>     at org.apache.storm.StormSubmitter.populateCredentials(StormSubmitter.java:94)
>     at org.apache.storm.StormSubmitter.submitTopologyAs(StormSubmitter.java:214)
>     at org.apache.storm.StormSubmitter.submitTopology(StormSubmitter.java:310)
>     at org.apache.storm.flux.Flux.runCli(Flux.java:171)
>     at org.apache.storm.flux.Flux.main(Flux.java:98)
> Caused by: java.lang.RuntimeException: The TGT found is not renewable
>     at org.apache.storm.security.auth.kerberos.AutoTGT.populateCredentials(AutoTGT.java:94)
>     ... 5 more
> stdout: /var/lib/ambari-agent/data/output-1451.txt
> 2017-04-26 19:53:36,377 - The hadoop conf dir
> /usr/hdp/current/hadoop-client/conf exists, will call conf-select on it for
> version 2.5.3.0-37
> 2017-04-26 19:53:36,380 - Checking if need to create versioned conf dir
> /etc/hadoop/2.5.3.0-37/0
> 2017-04-26 19:53:36,382 - call[('ambari-python-wrap',
> u'/usr/bin/conf-select', 'create-conf-dir', '--package', 'hadoop',
> '--stack-version', '2.5.3.0-37', '--conf-version', '0')] {'logoutput': False,
> 'sudo': True, 'quiet': False, 'stderr': -1}
> 2017-04-26 19:53:36,420 - call returned (1, '/etc/hadoop/2.5.3.0-37/0 exist
> already', '')
> 2017-04-26 19:53:36,420 - checked_call[('ambari-python-wrap',
> u'/usr/bin/conf-select', 'set-conf-dir', '--package', 'hadoop',
> '--stack-version', '2.5.3.0-37', '--conf-version', '0')] {'logoutput': False,
> 'sudo': True, 'quiet': False}
> 2017-04-26 19:53:36,454 - checked_call returned (0, '')
> 2017-04-26 19:53:36,455 - Ensuring that hadoop has the correct symlink
> structure
> 2017-04-26 19:53:36,455 - Using hadoop conf dir:
> /usr/hdp/current/hadoop-client/conf
> 2017-04-26 19:53:36,638 - The hadoop conf dir
> /usr/hdp/current/hadoop-client/conf exists, will call conf-select on it for
> version 2.5.3.0-37
> 2017-04-26 19:53:36,640 - Checking if need to create versioned conf dir
> /etc/hadoop/2.5.3.0-37/0
> 2017-04-26 19:53:36,642 - call[('ambari-python-wrap',
> u'/usr/bin/conf-select', 'create-conf-dir', '--package', 'hadoop',
> '--stack-version', '2.5.3.0-37', '--conf-version', '0')] {'logoutput': False,
> 'sudo': True, 'quiet': False, 'stderr': -1}
> 2017-04-26 19:53:36,681 - call returned (1, '/etc/hadoop/2.5.3.0-37/0 exist
> already', '')
> 2017-04-26 19:53:36,682 - checked_call[('ambari-python-wrap',
> u'/usr/bin/conf-select', 'set-conf-dir', '--package', 'hadoop',
> '--stack-version', '2.5.3.0-37', '--conf-version', '0')] {'logoutput': False,
> 'sudo': True, 'quiet': False}
> 2017-04-26 19:53:36,722 - checked_call returned (0, '')
> 2017-04-26 19:53:36,723 - Ensuring that hadoop has the correct symlink
> structure
> 2017-04-26 19:53:36,723 - Using hadoop conf dir:
> /usr/hdp/current/hadoop-client/conf
> 2017-04-26 19:53:36,726 - Group['metron'] {}
> 2017-04-26 19:53:36,728 - Group['livy'] {}
> 2017-04-26 19:53:36,728 - Group['elasticsearch'] {}
> 2017-04-26 19:53:36,729 - Group['spark'] {}
> 2017-04-26 19:53:36,729 - Group['zeppelin'] {}
> 2017-04-26 19:53:36,730 - Group['hadoop'] {}
> 2017-04-26 19:53:36,730 - Group['kibana'] {}
> 2017-04-26 19:53:36,730 - Group['users'] {}
> 2017-04-26 19:53:36,731 - User['hive'] {'gid': 'hadoop',
> 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2017-04-26 19:53:36,732 - User['storm'] {'gid': 'hadoop',
> 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2017-04-26 19:53:36,733 - User['zookeeper'] {'gid': 'hadoop',
> 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2017-04-26 19:53:36,734 - User['ams'] {'gid': 'hadoop',
> 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2017-04-26 19:53:36,735 - User['tez'] {'gid': 'hadoop',
> 'fetch_nonlocal_groups': True, 'groups': [u'users']}
> 2017-04-26 19:53:36,736 - User['zeppelin'] {'gid': 'hadoop',
> 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2017-04-26 19:53:36,737 - User['metron'] {'gid': 'hadoop',
> 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2017-04-26 19:53:36,739 - User['livy'] {'gid': 'hadoop',
> 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2017-04-26 19:53:36,740 - User['elasticsearch'] {'gid': 'hadoop',
> 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2017-04-26 19:53:36,741 - User['spark'] {'gid': 'hadoop',
> 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2017-04-26 19:53:36,742 - User['ambari-qa'] {'gid': 'hadoop',
> 'fetch_nonlocal_groups': True, 'groups': [u'users']}
> 2017-04-26 19:53:36,743 - User['kafka'] {'gid': 'hadoop',
> 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2017-04-26 19:53:36,744 - User['hdfs'] {'gid': 'hadoop',
> 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2017-04-26 19:53:36,745 - User['yarn'] {'gid': 'hadoop',
> 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2017-04-26 19:53:36,746 - User['kibana'] {'gid': 'hadoop',
> 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2017-04-26 19:53:36,747 - User['mapred'] {'gid': 'hadoop',
> 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2017-04-26 19:53:36,748 - User['hbase'] {'gid': 'hadoop',
> 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2017-04-26 19:53:36,749 - User['hcat'] {'gid': 'hadoop',
> 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2017-04-26 19:53:36,750 - File['/var/lib/ambari-agent/tmp/changeUid.sh']
> {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
> 2017-04-26 19:53:36,753 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh
> ambari-qa
> /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa']
> {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
> 2017-04-26 19:53:36,761 - Skipping
> Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa
> /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa']
> due to not_if
> 2017-04-26 19:53:36,761 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase',
> 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
> 2017-04-26 19:53:36,763 - File['/var/lib/ambari-agent/tmp/changeUid.sh']
> {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
> 2017-04-26 19:53:36,764 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh
> hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase']
> {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
> 2017-04-26 19:53:36,772 - Skipping
> Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase
> /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] due
> to not_if
> 2017-04-26 19:53:36,773 - Group['hdfs'] {}
> 2017-04-26 19:53:36,773 - User['hdfs'] {'fetch_nonlocal_groups': True,
> 'groups': [u'hadoop', u'hdfs']}
> 2017-04-26 19:53:36,774 - FS Type:
> 2017-04-26 19:53:36,774 - Directory['/etc/hadoop'] {'mode': 0755}
> 2017-04-26 19:53:36,796 -
> File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content':
> InlineTemplate(...), 'owner': 'root', 'group': 'hadoop'}
> 2017-04-26 19:53:36,796 -
> Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner':
> 'hdfs', 'group': 'hadoop', 'mode': 01777}
> 2017-04-26 19:53:36,816 - Execute[('setenforce', '0')] {'not_if': '(! which
> getenforce ) || (which getenforce && getenforce | grep -q Disabled)', 'sudo':
> True, 'only_if': 'test -f /selinux/enforce'}
> 2017-04-26 19:53:36,826 - Skipping Execute[('setenforce', '0')] due to not_if
> 2017-04-26 19:53:36,827 - Directory['/var/log/hadoop'] {'owner': 'root',
> 'create_parents': True, 'group': 'hadoop', 'mode': 0775, 'cd_access': 'a'}
> 2017-04-26 19:53:36,831 - Directory['/var/run/hadoop'] {'owner': 'root',
> 'create_parents': True, 'group': 'root', 'cd_access': 'a'}
> 2017-04-26 19:53:36,832 - Directory['/tmp/hadoop-hdfs'] {'owner': 'hdfs',
> 'create_parents': True, 'cd_access': 'a'}
> 2017-04-26 19:53:36,839 -
> File['/usr/hdp/current/hadoop-client/conf/commons-logging.properties']
> {'content': Template('commons-logging.properties.j2'), 'owner': 'root'}
> 2017-04-26 19:53:36,842 -
> File['/usr/hdp/current/hadoop-client/conf/health_check'] {'content':
> Template('health_check.j2'), 'owner': 'root'}
> 2017-04-26 19:53:36,843 -
> File['/usr/hdp/current/hadoop-client/conf/log4j.properties'] {'content': ...,
> 'owner': 'hdfs', 'group': 'hadoop', 'mode': 0644}
> 2017-04-26 19:53:36,861 -
> File['/usr/hdp/current/hadoop-client/conf/hadoop-metrics2.properties']
> {'content': Template('hadoop-metrics2.properties.j2'), 'owner': 'hdfs',
> 'group': 'hadoop'}
> 2017-04-26 19:53:36,862 -
> File['/usr/hdp/current/hadoop-client/conf/task-log4j.properties'] {'content':
> StaticFile('task-log4j.properties'), 'mode': 0755}
> 2017-04-26 19:53:36,864 -
> File['/usr/hdp/current/hadoop-client/conf/configuration.xsl'] {'owner':
> 'hdfs', 'group': 'hadoop'}
> 2017-04-26 19:53:36,870 - File['/etc/hadoop/conf/topology_mappings.data']
> {'owner': 'hdfs', 'content': Template('topology_mappings.data.j2'),
> 'only_if': 'test -d /etc/hadoop/conf', 'group': 'hadoop'}
> 2017-04-26 19:53:36,876 - File['/etc/hadoop/conf/topology_script.py']
> {'content': StaticFile('topology_script.py'), 'only_if': 'test -d
> /etc/hadoop/conf', 'mode': 0755}
> 2017-04-26 19:53:37,112 - The hadoop conf dir
> /usr/hdp/current/hadoop-client/conf exists, will call conf-select on it for
> version 2.5.3.0-37
> 2017-04-26 19:53:37,114 - Checking if need to create versioned conf dir
> /etc/hadoop/2.5.3.0-37/0
> 2017-04-26 19:53:37,116 - call[('ambari-python-wrap',
> u'/usr/bin/conf-select', 'create-conf-dir', '--package', 'hadoop',
> '--stack-version', '2.5.3.0-37', '--conf-version', '0')] {'logoutput': False,
> 'sudo': True, 'quiet': False, 'stderr': -1}
> 2017-04-26 19:53:37,153 - call returned (1, '/etc/hadoop/2.5.3.0-37/0 exist
> already', '')
> 2017-04-26 19:53:37,154 - checked_call[('ambari-python-wrap',
> u'/usr/bin/conf-select', 'set-conf-dir', '--package', 'hadoop',
> '--stack-version', '2.5.3.0-37', '--conf-version', '0')] {'logoutput': False,
> 'sudo': True, 'quiet': False}
> 2017-04-26 19:53:37,192 - checked_call returned (0, '')
> 2017-04-26 19:53:37,194 - Ensuring that hadoop has the correct symlink
> structure
> 2017-04-26 19:53:37,194 - Using hadoop conf dir:
> /usr/hdp/current/hadoop-client/conf
> 2017-04-26 19:53:37,197 - Running enrichment configure
> 2017-04-26 19:53:37,208 -
> File['/usr/metron/0.4.0/config/enrichment.properties'] {'owner': 'metron',
> 'content': Template('enrichment.properties.j2'), 'group': 'metron'}
> 2017-04-26 19:53:37,211 - Calling security setup
> 2017-04-26 19:53:37,212 - Directory['/usr/metron/0.4.0'] {'owner': 'metron',
> 'group': 'metron', 'create_parents': True, 'mode': 0755}
> 2017-04-26 19:53:37,212 - Directory['/home/metron/.storm'] {'owner':
> 'metron', 'group': 'metron', 'mode': 0755}
> 2017-04-26 19:53:37,216 - File['/usr/metron/0.4.0/client_jaas.conf']
> {'owner': 'metron', 'content': Template('client_jaas.conf.j2'), 'group':
> 'metron', 'mode': 0755}
> 2017-04-26 19:53:37,219 - File['/home/metron/.storm/storm.yaml'] {'owner':
> 'metron', 'content': Template('storm.yaml.j2'), 'group': 'metron', 'mode':
> 0755}
> 2017-04-26 19:53:37,222 - File['/home/metron/.storm/storm.config'] {'owner':
> 'metron', 'content': Template('storm.config.j2'), 'group': 'metron', 'mode':
> 0755}
> 2017-04-26 19:53:37,223 - kinit command: /usr/bin/kinit -kt
> /etc/security/keytabs/metron.headless.keytab [email protected]; as user:
> metron
> 2017-04-26 19:53:37,223 - Execute['/usr/bin/kinit -kt
> /etc/security/keytabs/metron.headless.keytab [email protected]; '] {'user':
> 'metron'}
> 2017-04-26 19:53:37,308 - Create Metron Local Config Directory
> 2017-04-26 19:53:37,308 - Configure Metron global.json
> 2017-04-26 19:53:37,308 - Directory['/usr/metron/0.4.0/config/zookeeper']
> {'owner': 'metron', 'group': 'metron', 'mode': 0755}
> 2017-04-26 19:53:37,313 -
> File['/usr/metron/0.4.0/config/zookeeper/global.json'] {'content':
> InlineTemplate(...), 'owner': 'metron'}
> 2017-04-26 19:53:37,319 -
> File['/usr/metron/0.4.0/config/zookeeper/../elasticsearch.properties']
> {'content': InlineTemplate(...), 'owner': 'metron'}
> 2017-04-26 19:53:37,319 - Loading config into ZooKeeper
> 2017-04-26 19:53:37,320 - Execute['/usr/metron/0.4.0/bin/zk_load_configs.sh
> --mode PUSH -i /usr/metron/0.4.0/config/zookeeper -z
> y113.l42scl.hortonworks.com:2181,y114.l42scl.hortonworks.com:2181,y115.l42scl.hortonworks.com:2181']
> {'path': [u'/usr/jdk64/jdk1.8.0_77/bin']}
> 2017-04-26 19:53:38,975 - Starting Metron enrichment topology: enrichment
> 2017-04-26 19:53:38,976 - Starting enrichment
> 2017-04-26 19:53:38,976 -
> Execute['/usr/metron/0.4.0/bin/start_enrichment_topology.sh
> -s enrichment -z
> y113.l42scl.hortonworks.com:2181,y114.l42scl.hortonworks.com:2181,y115.l42scl.hortonworks.com:2181']
> {'user': 'metron'}
> {code}