Hi Pratik,

I've responded to your question in the u...@ambari.apache.org mailing list.

Regards,
Yusaku

From: Pratik Gadiya <pratik_gad...@persistent.com>
Reply-To: "user@hadoop.apache.org" <user@hadoop.apache.org>
Date: Monday, June 1, 2015 4:31 AM
To: "user@hadoop.apache.org" <user@hadoop.apache.org>
Subject: Hive Metastore Service Startup Fails

Hello All,

When I try to deploy a Hortonworks cluster using the Ambari blueprint APIs, the
deployment fails while starting the Hive Metastore service.

The same blueprint usually works correctly in the same environment.

The only parameter that changes across the blueprint, with respect to Hive, is
the following:

Host Mapping File Content:
{'blueprint': 'onemasterblueprint',
 'configurations': [{u'hive-env': {u'hive_metastore_user_passwd': 'tkdw1rN&'}},
                    {u'gateway-site': {u'gateway.port': u'8445'}},
                    {u'nagios-env': {u'nagios_contact': u'a...@us.ibm.com'}},
                    {u'hive-site': {u'javax.jdo.option.ConnectionPassword': 'tkdw1rN&'}},
                    {'hdfs-site': {'dfs.datanode.data.dir': '/disk1/hadoop/hdfs/data,/disk2/hadoop/hdfs/data',
                                   'dfs.namenode.checkpoint.dir': '/disk1/hadoop/hdfs/namesecondary',
                                   'dfs.namenode.name.dir': '/disk1/hadoop/hdfs/namenode'}},
                    {'core-site': {'fs.swift.impl': 'org.apache.hadoop.fs.swift.snative.SwiftNativeFileSystem',
                                   'fs.swift.service.softlayer.auth.url': 'https://dal05.objectstorage.service.networklayer.com/auth/v1.0',
                                   'fs.swift.service.softlayer.connect.timeout': '120000',
                                   'fs.swift.service.softlayer.public': 'false',
                                   'fs.swift.service.softlayer.use.encryption': 'true',
                                   'fs.swift.service.softlayer.use.get.auth': 'true'}}],
 'default_password': 'tkdw1rN&',
 'host_groups': [{'hosts': [{'fqdn': 'vmktest0003.test.analytics.com'}],
                  'name': 'master'},
                 {'hosts': [{'fqdn': 'vmktest0004.test.analytics.com'}],
                  'name': 'compute'}]}
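
An editor's note, not from the original thread: the default_password above,
tkdw1rN&, contains a shell metacharacter (&). If any agent-side script
interpolates such a password into a shell command without quoting it, the
command line breaks. A minimal Python 3 sketch for screening candidate
passwords before putting them in a blueprint; the helper names are
hypothetical:

```python
import shlex

# Characters a POSIX shell treats specially. If a deploy script fails to
# quote a password containing one of these, commands like schematool can
# break (e.g. an unquoted '&' backgrounds the command).
SHELL_SPECIALS = set("&|;<>()$`\\\"' \t\n*?[#~")

def password_is_shell_safe(pw):
    """True when the password contains no shell metacharacters."""
    return not (set(pw) & SHELL_SPECIALS)

def quote_for_shell(pw):
    """How the password must appear on a POSIX command line."""
    return shlex.quote(pw)

print(password_is_shell_safe("tkdw1rN&"))  # the blueprint password is not shell-safe
print(quote_for_shell("tkdw1rN&"))
```

Screening like this does not fix Ambari's quoting, but it tells you in advance
whether a given password depends on it.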

Error.txt:
2015-06-01 05:59:22,178 - Error while executing command 'start':
Traceback (most recent call last):
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 123, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/services/HIVE/package/scripts/hive_metastore.py", line 43, in start
    self.configure(env) # FOR SECURITY
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/services/HIVE/package/scripts/hive_metastore.py", line 38, in configure
    hive(name='metastore')
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/services/HIVE/package/scripts/hive.py", line 97, in hive
    not_if = check_schema_created_cmd
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 148, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 149, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 115, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 241, in action_run
    raise ex
Fail: Execution of 'export HIVE_CONF_DIR=/etc/hive/conf.server ; /usr/hdp/current/hive-client/bin/schematool -initSchema -dbType mysql -userName hive -passWord [PROTECTED]' returned 1.
15/06/01 05:59:21 WARN conf.HiveConf: HiveConf of name hive.optimize.mapjoin.mapreduce does not exist
15/06/01 05:59:21 WARN conf.HiveConf: HiveConf of name hive.heapsize does not exist
15/06/01 05:59:21 WARN conf.HiveConf: HiveConf of name hive.server2.enable.impersonation does not exist
15/06/01 05:59:21 WARN conf.HiveConf: HiveConf of name hive.auto.convert.sortmerge.join.noconditionaltask does not exist
Metastore connection URL:          jdbc:mysql://vmktest0009.test.analytics.ibmcloud.com/hive?createDatabaseIfNotExist=true
Metastore Connection Driver :      com.mysql.jdbc.Driver
Metastore connection User:         hive
org.apache.hadoop.hive.metastore.HiveMetaException: Failed to get schema version.
*** schemaTool failed ***

Output.txt:

2015-06-01 05:59:07,907 - Changing permission for /var/lib/ambari-agent/data/tmp/start_metastore_script from 644 to 755
2015-06-01 05:59:07,909 - Execute['export HIVE_CONF_DIR=/etc/hive/conf.server ; /usr/hdp/current/hive-client/bin/schematool -initSchema -dbType mysql -userName hive -passWord [PROTECTED]'] {'not_if': 'export HIVE_CONF_DIR=/etc/hive/conf.server ; /usr/hdp/current/hive-client/bin/schematool -info -dbType mysql -userName hive -passWord \'Hb2\'"\'"\'aasz\''}
2015-06-01 05:59:22,178 - Error while executing command 'start':
Traceback (most recent call last):
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 123, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/services/HIVE/package/scripts/hive_metastore.py", line 43, in start
    self.configure(env) # FOR SECURITY
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/services/HIVE/package/scripts/hive_metastore.py", line 38, in configure
    hive(name='metastore')
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/services/HIVE/package/scripts/hive.py", line 97, in hive
    not_if = check_schema_created_cmd
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 148, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 149, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 115, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 241, in action_run
    raise ex
Fail: Execution of 'export HIVE_CONF_DIR=/etc/hive/conf.server ; /usr/hdp/current/hive-client/bin/schematool -initSchema -dbType mysql -userName hive -passWord [PROTECTED]' returned 1.
15/06/01 05:59:21 WARN conf.HiveConf: HiveConf of name hive.optimize.mapjoin.mapreduce does not exist
15/06/01 05:59:21 WARN conf.HiveConf: HiveConf of name hive.heapsize does not exist
15/06/01 05:59:21 WARN conf.HiveConf: HiveConf of name hive.server2.enable.impersonation does not exist
15/06/01 05:59:21 WARN conf.HiveConf: HiveConf of name hive.auto.convert.sortmerge.join.noconditionaltask does not exist
Metastore connection URL:          jdbc:mysql://vmktest0009.test.analytics.ibmcloud.com/hive?createDatabaseIfNotExist=true
Metastore Connection Driver :      com.mysql.jdbc.Driver
Metastore connection User:         hive
org.apache.hadoop.hive.metastore.HiveMetaException: Failed to get schema version.
*** schemaTool failed ***
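
A side observation (editor's note): the not_if command in Output.txt above
escapes a password containing a single quote using the '"'"' pattern. Python 3's
shlex.quote produces exactly that form, which makes it a quick way to check how
any candidate password must be escaped on a POSIX command line. A small sketch:

```python
import shlex

# The not_if command above renders a password containing a single quote
# as 'Hb2'"'"'aasz' -- the standard POSIX trick: close the single-quoted
# string, emit the quote inside double quotes, and reopen.
password = "Hb2'aasz"

quoted = shlex.quote(password)
print(quoted)  # same escaping pattern as in the log

# Splitting the quoted form recovers the original password intact,
# confirming the escaping round-trips correctly.
print(shlex.split(quoted))
```

If the round-trip holds but schematool still fails, the problem is more likely
in how Ambari builds the -initSchema command than in the password itself.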


Is there any constraint on setting passwords in Ambari?

Please let me know how I can resolve this error so that I can automate the
deployment.


With Regards,
Pratik Gadiya


