[
https://issues.apache.org/jira/browse/AMBARI-14964?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15450571#comment-15450571
]
Hoover commented on AMBARI-14964:
---------------------------------
Hi Aravindan Vijayan,
I am building and installing Ambari 2.4.0 from source, following this page:
https://cwiki.apache.org/confluence/display/AMBARI/Build+and+install+Ambari+2.4.0+from+Source
When I run:
pushd ambari-metrics
mvn versions:set -DnewVersion=2.4.0
the console logs:
[INFO] Local aggregation root: /root/apache-ambari-2.4.0-src
[INFO] Processing change of org.apache.ambari:ambari-metrics:2.4.0.0.0 -> 2.4.0
[ERROR]
java.io.FileNotFoundException: /root/apache-ambari-2.4.0-src/ambari-metrics/ambari-metrics (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.codehaus.plexus.util.xml.XmlReader.<init>(XmlReader.java:124)
	at org.codehaus.plexus.util.xml.XmlStreamReader.<init>(XmlStreamReader.java:67)
	at org.codehaus.plexus.util.ReaderFactory.newXmlReader(ReaderFactory.java:118)
	at org.codehaus.mojo.versions.api.PomHelper.readXmlFile(PomHelper.java:1606)
	at org.codehaus.mojo.versions.AbstractVersionsUpdaterMojo.process(AbstractVersionsUpdaterMojo.java:321)
	at org.codehaus.mojo.versions.SetMojo.execute(SetMojo.java:298)
	at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:134)
	at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:207)
	at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
	at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
	at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:116)
	at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:80)
	at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build(SingleThreadedBuilder.java:51)
	at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:128)
	at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:307)
	at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:193)
	at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:106)
	at org.apache.maven.cli.MavenCli.execute(MavenCli.java:863)
	at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:288)
	at org.apache.maven.cli.MavenCli.main(MavenCli.java:199)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:497)
	at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:289)
	at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:229)
	at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:415)
	at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:356)
[INFO] Processing org.apache.ambari:ambari-metrics
[INFO] Updating project org.apache.ambari:ambari-metrics
[INFO] from version 2.4.0.0.0 to 2.4.0
How can I work around this?
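One workaround that may be worth trying (untested here; `-N` and `-DgenerateBackupPoms=false` are standard Maven and versions-maven-plugin options, not something the build guide prescribes) is to apply the version change non-recursively, so the plugin only rewrites the current pom and never tries to open the non-existent `ambari-metrics/ambari-metrics` module path:

```shell
# Sketch of a possible workaround -- not verified against Ambari 2.4.0.
# -N (--non-recursive) limits versions:set to the pom in the current
# directory; -DgenerateBackupPoms=false skips writing pom.xml.versionsBackup.
pushd ambari-metrics
mvn -N versions:set -DnewVersion=2.4.0 -DgenerateBackupPoms=false
popd
```

If the child modules also need the new version, they would then have to be updated separately; this sketch only sidesteps the failing path lookup in the aggregator.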
> AMS cannot be installed on trunk
> --------------------------------
>
> Key: AMBARI-14964
> URL: https://issues.apache.org/jira/browse/AMBARI-14964
> Project: Ambari
> Issue Type: Bug
> Affects Versions: 2.4.0
> Reporter: Aravindan Vijayan
> Assignee: Aravindan Vijayan
> Priority: Blocker
> Fix For: 2.4.0
>
> Attachments: AMBARI-14964.patch
>
>
> Installation of AMS fails due to the following:
> {code}
> stderr:
> Traceback (most recent call last):
>   File "/var/lib/ambari-agent/cache/common-services/AMBARI_METRICS/0.1.0/package/scripts/metrics_grafana.py", line 65, in <module>
>     AmsGrafana().execute()
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 238, in execute
>     method(env)
>   File "/var/lib/ambari-agent/cache/common-services/AMBARI_METRICS/0.1.0/package/scripts/metrics_grafana.py", line 28, in install
>     self.install_packages(env, exclude_packages = ['ambari-metrics-collector'])
> TypeError: install_packages() got an unexpected keyword argument 'exclude_packages'
> stdout:
> 2016-02-08 23:55:57,518 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
> 2016-02-08 23:55:57,519 - Group['hadoop'] {}
> 2016-02-08 23:55:57,520 - Group['users'] {}
> 2016-02-08 23:55:57,520 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2016-02-08 23:55:57,520 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2016-02-08 23:55:57,521 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2016-02-08 23:55:57,521 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
> 2016-02-08 23:55:57,522 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
> 2016-02-08 23:55:57,522 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2016-02-08 23:55:57,523 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2016-02-08 23:55:57,523 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2016-02-08 23:55:57,524 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2016-02-08 23:55:57,524 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2016-02-08 23:55:57,525 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
> 2016-02-08 23:55:57,526 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
> 2016-02-08 23:55:57,529 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] due to not_if
> 2016-02-08 23:55:57,530 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
> 2016-02-08 23:55:57,530 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
> 2016-02-08 23:55:57,531 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
> 2016-02-08 23:55:57,534 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] due to not_if
> 2016-02-08 23:55:57,534 - Group['hdfs'] {}
> 2016-02-08 23:55:57,534 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': [u'hadoop', u'hdfs']}
> 2016-02-08 23:55:57,535 - FS Type:
> 2016-02-08 23:55:57,535 - Directory['/etc/hadoop'] {'mode': 0755}
> 2016-02-08 23:55:57,548 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
> 2016-02-08 23:55:57,548 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 0777}
> 2016-02-08 23:55:57,557 - Repository['HDP-2.3'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP/ubuntu12/2.x/updates/2.3.4.0', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': '{{package_type}} {{base_url}} {{components}}', 'repo_file_name': 'HDP', 'mirror_list': None}
> 2016-02-08 23:55:57,562 - File['/tmp/tmpniL5ny'] {'content': 'deb http://public-repo-1.hortonworks.com/HDP/ubuntu12/2.x/updates/2.3.4.0 HDP main'}
> 2016-02-08 23:55:57,562 - Writing File['/tmp/tmpniL5ny'] because contents don't match
> 2016-02-08 23:55:57,563 - File['/tmp/tmpswEIS6'] {'content': StaticFile('/etc/apt/sources.list.d/HDP.list')}
> 2016-02-08 23:55:57,563 - Writing File['/tmp/tmpswEIS6'] because contents don't match
> 2016-02-08 23:55:57,564 - Repository['HDP-UTILS-1.1.0.20'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.20/repos/ubuntu12', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '{{package_type}} {{base_url}} {{components}}', 'repo_file_name': 'HDP-UTILS', 'mirror_list': None}
> 2016-02-08 23:55:57,565 - File['/tmp/tmpEsfP2D'] {'content': 'deb http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.20/repos/ubuntu12 HDP-UTILS main'}
> 2016-02-08 23:55:57,565 - Writing File['/tmp/tmpEsfP2D'] because contents don't match
> 2016-02-08 23:55:57,566 - File['/tmp/tmptJGgBL'] {'content': StaticFile('/etc/apt/sources.list.d/HDP-UTILS.list')}
> 2016-02-08 23:55:57,566 - Writing File['/tmp/tmptJGgBL'] because contents don't match
> 2016-02-08 23:55:57,567 - Package['unzip'] {}
> 2016-02-08 23:55:57,581 - Skipping installation of existing package unzip
> 2016-02-08 23:55:57,581 - Package['curl'] {}
> 2016-02-08 23:55:57,595 - Skipping installation of existing package curl
> 2016-02-08 23:55:57,595 - Package['hdp-select'] {}
> 2016-02-08 23:55:57,608 - Skipping installation of existing package hdp-select
> {code}
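As an aside for anyone reading the traceback above: the TypeError is the classic symptom of a caller passing a keyword argument that the callee's signature does not accept. A minimal, self-contained sketch (hypothetical stand-in classes, not Ambari's actual code) reproducing the same failure mode:

```python
# Hypothetical sketch (not Ambari's real code) of why the traceback ends in
# "TypeError: install_packages() got an unexpected keyword argument
# 'exclude_packages'": the stack script passes a keyword that the base
# class's method signature does not accept, so Python raises TypeError
# before any install logic runs.

class Script(object):
    # Stand-in for resource_management's Script; here the signature has no
    # 'exclude_packages' parameter, mirroring the trunk-side mismatch.
    def install_packages(self, env):
        return "installed"

class AmsGrafana(Script):
    def install(self, env):
        # Stand-in for metrics_grafana.py line 28: still passes the old keyword.
        return self.install_packages(env, exclude_packages=['ambari-metrics-collector'])

raised = False
message = ""
try:
    AmsGrafana().install(env={})
except TypeError as exc:
    raised = True
    message = str(exc)

print(raised, message)
```

The attached AMBARI-14964.patch addresses the real mismatch on the Ambari side; the sketch only shows the mechanism, not the actual fix.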
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)