Thanks for your explanation!
> On Mar 14, 2017, at 1:33 PM, Sumit Mohanty <[email protected]> wrote:
>
> You are better off using Ambari from branch-2.5.
>
> Druid probably does not need Ambari Infra, so you can choose not to deploy
> Ambari Infra.
>
> In any case, you should be able to build infra too.
>
> Here is a command that I have seen used for building with HDP being the
> default stack -
>
> mvn -B -nsu install rpm:rpm -DskipTests -Dfindbugs.skip=true
> -DdefaultStackVersion=HDP-2.6 -Dviews -Dstack.distribution=HDP --projects
> ambari-web,ambari-project,ambari-views,ambari-admin,contrib/views/hueambarimigration,contrib/views/hive-next,contrib/views/slider,contrib/views/tez,contrib/views/utils,contrib/views/commons,contrib/views/files,contrib/views/hive20,contrib/views/pig,contrib/views/capacity-scheduler,contrib/views/storm,contrib/views/wfmanager,ambari-metrics/ambari-metrics-common,ambari-server,ambari-agent,ambari-client,ambari-shell
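
(A note on the command above: the --projects list does not include the Ambari
Infra modules, so that particular invocation will not produce the
ambari-infra-solr-client RPM. A minimal sketch of an alternative, assuming the
trunk layout keeps the Infra code under an ambari-infra module; simply dropping
--projects builds every module, Infra included:

  # full build, no --projects filter, so the Infra RPMs get built as well
  mvn -B -nsu install rpm:rpm -DskipTests -Dfindbugs.skip=true \
      -DdefaultStackVersion=HDP-2.6 -Dviews -Dstack.distribution=HDP

If the filtered build is preferred for speed, the Infra sub-modules could
instead be appended to the existing --projects list; their exact module paths
under ambari-infra are an assumption on my part.)
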
> ________________________________________
> From: Dongkyu Hwangbo <[email protected]>
> Sent: Monday, March 13, 2017 9:21 PM
> To: [email protected]
> Subject: Using Ambari trunk to use druid
>
> Hi, I'm an Ambari newbie.
> To test the Druid-integrated Ambari, I am trying to install HDP 2.6 using the
> trunk Ambari source. I built the trunk source, created the RPMs, and
> successfully installed the ambari-server and ambari-agent RPMs.
> After running ambari-server setup, I started ambari-server and ambari-agent.
> In the web UI, I logged in and tried to install a specific stack.
> I checked “Ambari Infra” and “Ambari Metrics” to collect service and host
> metrics, but the installation failed with the error message below.
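
(For reference, the install sequence described above corresponds roughly to the
commands below; the RPM file names are placeholders for whatever the build
actually produced:

  # on the server host
  yum install -y ./ambari-server-*.rpm
  ambari-server setup -s      # -s runs setup non-interactively with defaults
  ambari-server start

  # on each agent host
  yum install -y ./ambari-agent-*.rpm
  ambari-agent start
)
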
>
>
> stderr:
> Traceback (most recent call last):
> File
> "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py",
> line 98, in _call_with_retries
> code, out = func(cmd, **kwargs)
> File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py",
> line 72, in inner
> result = function(command, **kwargs)
> File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py",
> line 102, in checked_call
> tries=tries, try_sleep=try_sleep,
> timeout_kill_strategy=timeout_kill_strategy)
> File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py",
> line 150, in _call_wrapper
> result = _call(command, **kwargs_copy)
> File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py",
> line 303, in _call
> raise ExecutionFailed(err_msg, code, out, err)
> ExecutionFailed: Execution of '/usr/bin/yum -d 0 -e 0 -y install
> ambari-infra-solr-client' returned 1. Error: Nothing to do
>
> The above exception was the cause of the following exception:
>
> Traceback (most recent call last):
> File
> "/var/lib/ambari-agent/cache/common-services/AMBARI_INFRA/0.1.0/package/scripts/infra_solr.py",
> line 123, in <module>
> InfraSolr().execute()
> File
> "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py",
> line 314, in execute
> method(env)
> File
> "/var/lib/ambari-agent/cache/common-services/AMBARI_INFRA/0.1.0/package/scripts/infra_solr.py",
> line 35, in install
> self.install_packages(env)
> File
> "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py",
> line 629, in install_packages
> retry_count=agent_stack_retry_count)
> File "/usr/lib/python2.6/site-packages/resource_management/core/base.py",
> line 155, in __init__
> self.env.run()
> File
> "/usr/lib/python2.6/site-packages/resource_management/core/environment.py",
> line 160, in run
> self.run_action(resource, action)
> File
> "/usr/lib/python2.6/site-packages/resource_management/core/environment.py",
> line 124, in run_action
> provider_action()
> File
> "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py",
> line 54, in action_install
> self.install_package(package_name, self.resource.use_repos,
> self.resource.skip_repos)
> File
> "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/yumrpm.py",
> line 51, in install_package
> self.checked_call_with_retries(cmd, sudo=True,
> logoutput=self.get_logoutput())
> File
> "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py",
> line 86, in checked_call_with_retries
> return self._call_with_retries(cmd, is_checked=True, **kwargs)
> File
> "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py",
> line 98, in _call_with_retries
> code, out = func(cmd, **kwargs)
> File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py",
> line 72, in inner
> result = function(command, **kwargs)
> File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py",
> line 102, in checked_call
> tries=tries, try_sleep=try_sleep,
> timeout_kill_strategy=timeout_kill_strategy)
> File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py",
> line 150, in _call_wrapper
> result = _call(command, **kwargs_copy)
> File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py",
> line 303, in _call
> raise ExecutionFailed(err_msg, code, out, err)
> resource_management.core.exceptions.ExecutionFailed: Execution of
> '/usr/bin/yum -d 0 -e 0 -y install ambari-infra-solr-client' returned 1.
> Error: Nothing to do
> stdout:
> 2017-03-14 12:20:40,983 - Stack Feature Version Info: stack_version=2.6,
> version=None, current_cluster_version=None -> 2.6
> User Group mapping (user_group) is missing in the hostLevelParams
> 2017-03-14 12:20:40,989 - Group['hadoop'] {}
> 2017-03-14 12:20:40,990 - User['logsearch'] {'gid': 'hadoop',
> 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2017-03-14 12:20:40,991 - User['infra-solr'] {'gid': 'hadoop',
> 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2017-03-14 12:20:40,992 - User['zookeeper'] {'gid': 'hadoop',
> 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2017-03-14 12:20:40,993 - User['ams'] {'gid': 'hadoop',
> 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2017-03-14 12:20:40,993 - User['ambari-qa'] {'gid': 'hadoop',
> 'fetch_nonlocal_groups': True, 'groups': ['users']}
> 2017-03-14 12:20:40,994 - File['/var/lib/ambari-agent/tmp/changeUid.sh']
> {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
> 2017-03-14 12:20:40,996 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh
> ambari-qa
> /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa']
> {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
> 2017-03-14 12:20:41,019 - Skipping
> Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa
> /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa']
> due to not_if
> 2017-03-14 12:20:41,038 - Initializing 2 repositories
> 2017-03-14 12:20:41,039 - Repository['HDP-2.6'] {'base_url':
> 'http://s3.amazonaws.com/dev.hortonworks.com/HDP/centos7/2.x/BUILDS/2.6.0.2-21',
> 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template':
> '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list
> %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif
> %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP', 'mirror_list':
> None}
> 2017-03-14 12:20:41,051 - File['/etc/yum.repos.d/HDP.repo'] {'content':
> InlineTemplate(...)}
> 2017-03-14 12:20:41,053 - Writing File['/etc/yum.repos.d/HDP.repo'] because
> it doesn't exist
> 2017-03-14 12:20:41,053 - Repository['HDP-UTILS-1.1.0.21'] {'base_url':
> 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/centos7',
> 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template':
> '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list
> %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif
> %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP-UTILS',
> 'mirror_list': None}
> 2017-03-14 12:20:41,059 - File['/etc/yum.repos.d/HDP-UTILS.repo'] {'content':
> '[HDP-UTILS-1.1.0.21]\nname=HDP-UTILS-1.1.0.21\nbaseurl=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/centos7\n\npath=/\nenabled=1\ngpgcheck=0'}
> 2017-03-14 12:20:41,059 - Package['unzip'] {'retry_on_repo_unavailability':
> False, 'retry_count': 5}
> 2017-03-14 12:20:41,169 - Skipping installation of existing package unzip
> 2017-03-14 12:20:41,170 - Package['curl'] {'retry_on_repo_unavailability':
> False, 'retry_count': 5}
> 2017-03-14 12:20:41,179 - Skipping installation of existing package curl
> 2017-03-14 12:20:41,179 - Package['hdp-select']
> {'retry_on_repo_unavailability': False, 'retry_count': 5}
> 2017-03-14 12:20:41,189 - Installing package hdp-select ('/usr/bin/yum -d 0
> -e 0 -y install hdp-select')
> 2017-03-14 12:20:45,111 - Package['ambari-infra-solr-client']
> {'retry_on_repo_unavailability': False, 'retry_count': 5}
> 2017-03-14 12:20:45,184 - Installing package ambari-infra-solr-client
> ('/usr/bin/yum -d 0 -e 0 -y install ambari-infra-solr-client')
> 2017-03-14 12:20:45,735 - Execution of '/usr/bin/yum -d 0 -e 0 -y install
> ambari-infra-solr-client' returned 1. Error: Nothing to do
> 2017-03-14 12:20:45,735 - Failed to install package ambari-infra-solr-client.
> Executing '/usr/bin/yum clean metadata'
> 2017-03-14 12:20:45,913 - Retrying to install package
> ambari-infra-solr-client after 30 seconds
>
> Command failed after 1 tries
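
The "Error: Nothing to do" message from yum usually just means that no enabled
repository offers a package with that name. A quick sanity check on the agent
host (a sketch; HDP-2.6 and HDP-UTILS-1.1.0.21 are the repo ids shown in the
log above):

  # does any enabled repo provide the package?
  yum list available ambari-infra-solr-client
  # which repos are actually enabled?
  yum repolist enabled
  # is some copy already installed (which also yields "Nothing to do")?
  rpm -q ambari-infra-solr-client
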
>
>
> It seems that Ambari failed to install ambari-infra-solr-client, but no
> ambari-infra-solr-client RPM was produced when I built the Ambari trunk
> source. I tried the same process after installing the ambari-metrics-assembly
> RPM, but the same error message was printed. How do I solve this problem?
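
If the RPM is in fact produced somewhere in the build tree, one workaround is
to install it locally instead of relying on the HDP repository (a sketch; the
path and version below are placeholders, not actual build output):

  # locate the RPM anywhere under the source tree after the build
  find . -name 'ambari-infra-solr-client*.rpm'
  # if found, copy it to the agent host and install it directly
  yum localinstall -y /path/to/ambari-infra-solr-client-<version>.rpm
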
>
> (Another question: it seems that Druid is only integrated with HDP 2.6. Is
> building the trunk Ambari source the only way to use Druid with Ambari?)