[ambari] branch trunk updated: AMBARI-24010. Fix kafka_extended_sasl_support version check (echekanskiy)

2018-06-01 Thread echekanskiy
This is an automated email from the ASF dual-hosted git repository.

echekanskiy pushed a commit to branch trunk
in repository https://gitbox.apache.org/repos/asf/ambari.git


The following commit(s) were added to refs/heads/trunk by this push:
 new b79e4be  AMBARI-24010. Fix kafka_extended_sasl_support version check 
(echekanskiy)
b79e4be is described below

commit b79e4be51bdf8067958e17998f4e61c92ed98556
Author: Eugene Chekanskiy 
AuthorDate: Fri Jun 1 14:22:21 2018 +0300

AMBARI-24010. Fix kafka_extended_sasl_support version check (echekanskiy)
---
 .../resources/common-services/KAFKA/0.8.1/package/scripts/params.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/ambari-server/src/main/resources/common-services/KAFKA/0.8.1/package/scripts/params.py b/ambari-server/src/main/resources/common-services/KAFKA/0.8.1/package/scripts/params.py
index f07cba8..722fe7c 100644
--- a/ambari-server/src/main/resources/common-services/KAFKA/0.8.1/package/scripts/params.py
+++ b/ambari-server/src/main/resources/common-services/KAFKA/0.8.1/package/scripts/params.py
@@ -172,7 +172,7 @@ kafka_kerberos_enabled = (('security.inter.broker.protocol' in config['configura
 
 
 kafka_other_sasl_enabled = not kerberos_security_enabled and check_stack_feature(StackFeature.KAFKA_LISTENERS, stack_version_formatted) and \
-  check_stack_feature(StackFeature.KAFKA_EXTENDED_SASL_SUPPORT, stack_version_formatted) and \
+  check_stack_feature(StackFeature.KAFKA_EXTENDED_SASL_SUPPORT, format_stack_version(version_for_stack_feature_checks)) and \
   (("SASL_PLAINTEXT" in config['configurations']['kafka-broker']['listeners']) or
   ("PLAINTEXTSASL" in config['configurations']['kafka-broker']['listeners']) or
   ("SASL_SSL" in config['configurations']['kafka-broker']['listeners']))
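For context, a minimal sketch of how such a stack-feature flag is typically computed in an Ambari params.py. The import paths and the get_stack_feature_version() helper are shown as commonly used in Ambari scripts and are assumptions here, not part of this commit:

    from resource_management.libraries.script.script import Script
    from resource_management.libraries.functions import StackFeature
    from resource_management.libraries.functions.stack_features import check_stack_feature, get_stack_feature_version
    from resource_management.libraries.functions.version import format_stack_version

    config = Script.get_config()

    # version_for_stack_feature_checks also covers the case where the stack bits are
    # not installed yet (stack_version_formatted can be empty on a fresh install),
    # which is why the commit above switches the check to it
    version_for_stack_feature_checks = get_stack_feature_version(config)
    kafka_extended_sasl_supported = check_stack_feature(
      StackFeature.KAFKA_EXTENDED_SASL_SUPPORT,
      format_stack_version(version_for_stack_feature_checks))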

-- 
To stop receiving notification emails like this one, please contact
echekans...@apache.org.


[ambari] branch trunk updated: AMBARI-23539. Unable to install stacks without hdfs-site config (echekanskiy)

2018-04-13 Thread echekanskiy
This is an automated email from the ASF dual-hosted git repository.

echekanskiy pushed a commit to branch trunk
in repository https://gitbox.apache.org/repos/asf/ambari.git


The following commit(s) were added to refs/heads/trunk by this push:
 new 73e7362  AMBARI-23539. Unable to install stacks without hdfs-site 
config (echekanskiy)
73e7362 is described below

commit 73e7362ce6bf0eb58b27a06a998e625b1e85ac55
Author: Eugene Chekanskiy <echekans...@gmail.com>
AuthorDate: Wed Apr 11 13:57:51 2018 +0300

AMBARI-23539. Unable to install stacks without hdfs-site config 
(echekanskiy)
---
 .../src/main/resources/stack-hooks/before-ANY/scripts/params.py| 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/ambari-server/src/main/resources/stack-hooks/before-ANY/scripts/params.py b/ambari-server/src/main/resources/stack-hooks/before-ANY/scripts/params.py
index 3fcac39..6f6cd39 100644
--- a/ambari-server/src/main/resources/stack-hooks/before-ANY/scripts/params.py
+++ b/ambari-server/src/main/resources/stack-hooks/before-ANY/scripts/params.py
@@ -207,7 +207,8 @@ dfs_ha_nameservices = default('/configurations/hdfs-site/dfs.internal.nameservic
 if dfs_ha_nameservices is None:
   dfs_ha_nameservices = default('/configurations/hdfs-site/dfs.nameservices', None)
 
-dfs_ha_namenode_ids_all_ns = get_properties_for_all_nameservices(hdfs_site, 'dfs.ha.namenodes')
+# on stacks without any filesystem there is no hdfs-site
+dfs_ha_namenode_ids_all_ns = get_properties_for_all_nameservices(hdfs_site, 'dfs.ha.namenodes') if 'hdfs-site' in config['configurations'] else {}
 dfs_ha_automatic_failover_enabled = default("/configurations/hdfs-site/dfs.ha.automatic-failover.enabled", False)
 
 # Values for the current Host
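The guard above is needed because default() only protects single-property lookups; pulling a whole config type out of config['configurations'] has no built-in fallback. A small sketch of the distinction, with import paths as used elsewhere in Ambari and the surrounding names purely illustrative:

    from resource_management.libraries.script.script import Script
    from resource_management.libraries.functions.default import default
    from resource_management.libraries.functions.namenode_ha_utils import get_properties_for_all_nameservices

    config = Script.get_config()
    configurations = config['configurations']
    hdfs_site = configurations.get('hdfs-site', {})

    # whole-dictionary access must be guarded explicitly on stacks without HDFS
    dfs_ha_namenode_ids_all_ns = get_properties_for_all_nameservices(hdfs_site, 'dfs.ha.namenodes') \
      if 'hdfs-site' in configurations else {}

    # single-property access already degrades gracefully via default()
    dfs_nameservices = default('/configurations/hdfs-site/dfs.nameservices', None)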

-- 
To stop receiving notification emails like this one, please contact
echekans...@apache.org.


[ambari] branch trunk updated: AMBARI-23260. Add reliable way of getting base service/stack_advisor.py file path (echekanskiy)

2018-03-16 Thread echekanskiy
This is an automated email from the ASF dual-hosted git repository.

echekanskiy pushed a commit to branch trunk
in repository https://gitbox.apache.org/repos/asf/ambari.git


The following commit(s) were added to refs/heads/trunk by this push:
 new 8440247  AMBARI-23260. Add reliable way of getting base 
service/stack_advisor.py file path (echekanskiy)
8440247 is described below

commit 844024779ccc2c4f31d4bf76bbc7fc113a212490
Author: Eugene Chekanskiy <echekans...@hortonworks.com>
AuthorDate: Fri Mar 16 07:54:04 2018 -0400

AMBARI-23260. Add reliable way of getting base service/stack_advisor.py 
file path (echekanskiy)
---
 .../services/stackadvisor/StackAdvisorRunner.java  |  9 +++-
 .../stackadvisor/StackAdvisorRunnerTest.java   | 24 --
 2 files changed, 26 insertions(+), 7 deletions(-)

diff --git a/ambari-server/src/main/java/org/apache/ambari/server/api/services/stackadvisor/StackAdvisorRunner.java b/ambari-server/src/main/java/org/apache/ambari/server/api/services/stackadvisor/StackAdvisorRunner.java
index 3ba9f6b..556c2c9 100644
--- a/ambari-server/src/main/java/org/apache/ambari/server/api/services/stackadvisor/StackAdvisorRunner.java
+++ b/ambari-server/src/main/java/org/apache/ambari/server/api/services/stackadvisor/StackAdvisorRunner.java
@@ -20,6 +20,7 @@ package org.apache.ambari.server.api.services.stackadvisor;
 
 import java.io.File;
 import java.io.IOException;
+import java.nio.file.Paths;
 import java.util.ArrayList;
 import java.util.List;
 
@@ -34,7 +35,6 @@ import org.slf4j.LoggerFactory;
 import com.google.inject.Inject;
 import com.google.inject.Singleton;
 
-
 @Singleton
 public class StackAdvisorRunner {
 
@@ -78,6 +78,9 @@ public class StackAdvisorRunner {
 ProcessBuilder builder = prepareShellCommand(ServiceInfo.ServiceAdvisorType.PYTHON, StackAdvisorHelper.pythonStackAdvisorScript, saCommandType, actionDirectory, outputFile, errorFile);
+builder.environment().put("METADATA_DIR_PATH", configs.getProperty(Configuration.METADATA_DIR_PATH));
+builder.environment().put("BASE_SERVICE_ADVISOR", Paths.get(configs.getProperty(Configuration.METADATA_DIR_PATH), "service_advisor.py").toString());
+builder.environment().put("BASE_STACK_ADVISOR", Paths.get(configs.getProperty(Configuration.METADATA_DIR_PATH), "stack_advisor.py").toString());
 stackAdvisorReturnCode = launchProcess(builder);
 break;
 }
@@ -220,4 +223,8 @@ public class StackAdvisorRunner {
 
 return new ProcessBuilder(builderParameters);
   }
+
+  public void setConfigs(Configuration configs) {
+this.configs = configs;
+  }
 }
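On the Python side, the launched advisor script can then locate the base implementations from the exported environment instead of guessing relative paths; a rough, illustrative sketch (not part of this commit):

    import imp
    import os

    # exported by StackAdvisorRunner before the process is launched (see the hunk above)
    metadata_dir = os.environ.get("METADATA_DIR_PATH")
    base_stack_advisor_path = os.environ.get("BASE_STACK_ADVISOR")

    if base_stack_advisor_path and os.path.exists(base_stack_advisor_path):
      # load the shared base class from the known, stable location
      base_stack_advisor = imp.load_source("stack_advisor", base_stack_advisor_path)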
diff --git a/ambari-server/src/test/java/org/apache/ambari/server/api/services/stackadvisor/StackAdvisorRunnerTest.java b/ambari-server/src/test/java/org/apache/ambari/server/api/services/stackadvisor/StackAdvisorRunnerTest.java
index 7eb9fb4..a4fea7c 100644
--- a/ambari-server/src/test/java/org/apache/ambari/server/api/services/stackadvisor/StackAdvisorRunnerTest.java
+++ b/ambari-server/src/test/java/org/apache/ambari/server/api/services/stackadvisor/StackAdvisorRunnerTest.java
@@ -26,8 +26,10 @@ import static org.powermock.api.support.membermodification.MemberModifier.stub;
 
 import java.io.File;
 import java.io.IOException;
+import java.util.HashMap;
 
 import org.apache.ambari.server.api.services.stackadvisor.commands.StackAdvisorCommandType;
+import org.apache.ambari.server.configuration.Configuration;
 import org.apache.ambari.server.state.ServiceInfo;
 import org.junit.After;
 import org.junit.Before;
@@ -64,11 +66,13 @@ public class StackAdvisorRunnerTest {
 File actionDirectory = temp.newFolder("actionDir");
 ProcessBuilder processBuilder = createNiceMock(ProcessBuilder.class);
 StackAdvisorRunner saRunner = new StackAdvisorRunner();
-
+Configuration configMock = createNiceMock(Configuration.class);
+saRunner.setConfigs(configMock);
 stub(PowerMock.method(StackAdvisorRunner.class, "prepareShellCommand"))
 .toReturn(processBuilder);
+expect(processBuilder.environment()).andReturn(new HashMap<>()).times(3);
 expect(processBuilder.start()).andThrow(new IOException());
-replay(processBuilder);
+replay(processBuilder, configMock);
 saRunner.runScript(ServiceInfo.ServiceAdvisorType.PYTHON, saCommandType, actionDirectory);
   }
 
@@ -80,12 +84,14 @@ public class StackAdvisorRunnerTest {
 ProcessBuilder processBuilder = createNiceMock(ProcessBuilder.class);
 Process process = createNiceMock(Process.class);
 StackAdvisorRunner saRunner = new StackAdvisorRunner();
-
+Configuration configMock = createNiceMock(Configuration.class);
+saRunner.setConfigs(configMock);
 stub(PowerMock.method(StackAdvisorRunner.class, "prepareShellCommand"))
 .toReturn(processBuilder);
+expect(process

[ambari] branch trunk updated: AMBARI-23221. Before-start hook is failed when hdfs is not available in cluster (echekanskiy)

2018-03-14 Thread echekanskiy
This is an automated email from the ASF dual-hosted git repository.

echekanskiy pushed a commit to branch trunk
in repository https://gitbox.apache.org/repos/asf/ambari.git


The following commit(s) were added to refs/heads/trunk by this push:
 new e189c48  AMBARI-23221. Before-start hook is failed when hdfs is not 
available in cluster (echekanskiy)
e189c48 is described below

commit e189c487e17d795e3f1265eab86ec5765b16e9a6
Author: Eugene Chekanskiy <echekans...@gmail.com>
AuthorDate: Tue Mar 13 19:53:52 2018 +0200

AMBARI-23221. Before-start hook is failed when hdfs is not available in 
cluster (echekanskiy)
---
 .../src/main/resources/stack-hooks/before-START/scripts/params.py   | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/ambari-server/src/main/resources/stack-hooks/before-START/scripts/params.py b/ambari-server/src/main/resources/stack-hooks/before-START/scripts/params.py
index db6caf2..da8ef5e 100644
--- a/ambari-server/src/main/resources/stack-hooks/before-START/scripts/params.py
+++ b/ambari-server/src/main/resources/stack-hooks/before-START/scripts/params.py
@@ -325,7 +325,7 @@ else:
   namenode_rpc = default('/configurations/hdfs-site/dfs.namenode.rpc-address', default_fs)
 
 # if HDFS is not installed in the cluster, then don't try to access namenode_rpc
-if "core-site" in config['configurations'] and namenode_rpc:
+if has_namenode and namenode_rpc and "core-site" in config['configurations']:
   port_str = namenode_rpc.split(':')[-1].strip()
   try:
 nn_rpc_client_port = int(port_str)
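For illustration, a self-contained sketch of what the extra has_namenode guard prevents: on a cluster without HDFS, fs.defaultFS can be a non-HDFS URI, so its tail is not a NameNode RPC port and the block should simply be skipped. The sample values are assumptions, not taken from this commit:

    config = {'configurations': {'core-site': {'fs.defaultFS': 'file:///'}}}  # core-site may exist even without HDFS
    default_fs = config['configurations']['core-site']['fs.defaultFS']
    namenode_rpc = default_fs     # the fallback used above when dfs.namenode.rpc-address is absent
    has_namenode = False          # no NAMENODE component in the cluster

    nn_rpc_client_port = None
    if has_namenode and namenode_rpc and "core-site" in config['configurations']:
      port_str = namenode_rpc.split(':')[-1].strip()   # would be '///' for 'file:///'
      try:
        nn_rpc_client_port = int(port_str)
      except ValueError:
        pass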

-- 
To stop receiving notification emails like this one, please contact
echekans...@apache.org.


[ambari] branch trunk updated: AMBARI-23082. Install-mpack --purge does not works (echekanskiy)

2018-03-06 Thread echekanskiy
This is an automated email from the ASF dual-hosted git repository.

echekanskiy pushed a commit to branch trunk
in repository https://gitbox.apache.org/repos/asf/ambari.git


The following commit(s) were added to refs/heads/trunk by this push:
 new 8cd0a62  AMBARI-23082. Install-mpack --purge does not works 
(echekanskiy)
8cd0a62 is described below

commit 8cd0a62612bcdc5f5b6ec239817f799f07c7ae09
Author: Eugene Chekanskiy <echekans...@gmail.com>
AuthorDate: Mon Feb 26 14:09:20 2018 +0200

AMBARI-23082. Install-mpack --purge does not works (echekanskiy)
---
 .../main/java/org/apache/ambari/server/checks/MpackInstallChecker.java | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/ambari-server/src/main/java/org/apache/ambari/server/checks/MpackInstallChecker.java b/ambari-server/src/main/java/org/apache/ambari/server/checks/MpackInstallChecker.java
index d471c1f..da83eb4 100644
--- a/ambari-server/src/main/java/org/apache/ambari/server/checks/MpackInstallChecker.java
+++ b/ambari-server/src/main/java/org/apache/ambari/server/checks/MpackInstallChecker.java
@@ -28,6 +28,7 @@ import java.util.Map;
 
 import org.apache.ambari.server.audit.AuditLoggerModule;
 import org.apache.ambari.server.controller.ControllerModule;
+import org.apache.ambari.server.ldap.LdapModule;
 import org.apache.ambari.server.orm.DBAccessor;
 import org.apache.commons.cli.CommandLine;
 import org.apache.commons.cli.CommandLineParser;
@@ -203,7 +204,7 @@ public class MpackInstallChecker {
*/
   public static void main(String[] args) throws Exception {
 
-Injector injector = Guice.createInjector(new ControllerModule(), new MpackCheckerAuditModule());
+Injector injector = Guice.createInjector(new ControllerModule(), new MpackCheckerAuditModule(), new LdapModule());
 MpackInstallChecker mpackInstallChecker = injector.getInstance(MpackInstallChecker.class);
 MpackContext mpackContext = processArguments(args);
 

-- 
To stop receiving notification emails like this one, please contact
echekans...@apache.org.


[ambari] branch branch-2.6 updated: AMBARI-23135. Fix unicode_tolerant_fs.py (echekanskiy)

2018-03-02 Thread echekanskiy
This is an automated email from the ASF dual-hosted git repository.

echekanskiy pushed a commit to branch branch-2.6
in repository https://gitbox.apache.org/repos/asf/ambari.git


The following commit(s) were added to refs/heads/branch-2.6 by this push:
 new 68aa59f  AMBARI-23135. Fix unicode_tolerant_fs.py (echekanskiy)
68aa59f is described below

commit 68aa59f0fa734f3c0012990f11da48a814961021
Author: Eugene Chekanskiy <echekans...@hortonworks.com>
AuthorDate: Sat Mar 3 01:13:37 2018 +0200

AMBARI-23135. Fix unicode_tolerant_fs.py (echekanskiy)
---
 .../src/main/python/ambari_commons/unicode_tolerant_fs.py | 11 ---
 1 file changed, 8 insertions(+), 3 deletions(-)

diff --git a/ambari-common/src/main/python/ambari_commons/unicode_tolerant_fs.py b/ambari-common/src/main/python/ambari_commons/unicode_tolerant_fs.py
index a2b8e6b..ab95c60 100644
--- a/ambari-common/src/main/python/ambari_commons/unicode_tolerant_fs.py
+++ b/ambari-common/src/main/python/ambari_commons/unicode_tolerant_fs.py
@@ -19,6 +19,11 @@ Ambari Agent
 
 """
 
+def get_encoded_string(data):
+  try:
+return data.encode("utf8")
+  except UnicodeDecodeError:
+return data
 
 def unicode_walk(top, topdown=True, onerror=None, followlinks=False):
   """
@@ -31,7 +36,7 @@ def unicode_walk(top, topdown=True, onerror=None, 
followlinks=False):
 
   islink, join, isdir = os.path.islink, os.path.join, os.path.isdir
 
-  top = top.encode("utf8")
+  top = get_encoded_string(top)
 
   try:
 # Note that listdir and error are globals in this module due
@@ -44,7 +49,7 @@ def unicode_walk(top, topdown=True, onerror=None, 
followlinks=False):
 
   dirs, nondirs = [], []
   for name in names:
-name = name.encode("utf8")
+name = get_encoded_string(name)
 if isdir(join(top, name)):
   dirs.append(name)
 else:
@@ -53,7 +58,7 @@ def unicode_walk(top, topdown=True, onerror=None, 
followlinks=False):
   if topdown:
 yield top, dirs, nondirs
   for name in dirs:
-name = name.encode("utf8")
+name = get_encoded_string(name)
 new_path = join(top, name)
 if followlinks or not islink(new_path):
   for x in unicode_walk(new_path, topdown, onerror, followlinks):
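A short Python 2 sketch of the failure this helper works around: calling .encode('utf8') on a str that already holds non-ASCII bytes makes Python implicitly decode it as ASCII first, which raises UnicodeDecodeError, so the helper keeps the original value in that case (the sample byte string is only an illustration):

    def get_encoded_string(data):
      try:
        return data.encode("utf8")
      except UnicodeDecodeError:
        return data

    already_bytes = '\xc3\xa9toile'              # utf-8 bytes for u'etoile' with an accent, held in a Python 2 str
    print(get_encoded_string(u'\u00e9toile'))    # unicode input: encode succeeds
    print(get_encoded_string(already_bytes))     # encode would raise, the original bytes are returned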

-- 
To stop receiving notification emails like this one, please contact
echekans...@apache.org.


[ambari] branch trunk updated: AMBARI-23135. Fix unicode_tolerant_fs.py (echekanskiy)

2018-03-02 Thread echekanskiy
This is an automated email from the ASF dual-hosted git repository.

echekanskiy pushed a commit to branch trunk
in repository https://gitbox.apache.org/repos/asf/ambari.git


The following commit(s) were added to refs/heads/trunk by this push:
 new 56be3a7  AMBARI-23135. Fix unicode_tolerant_fs.py (echekanskiy)
56be3a7 is described below

commit 56be3a747dee7ce1bf2169e654e75ceaef4a5b7c
Author: Eugene Chekanskiy <echekans...@hortonworks.com>
AuthorDate: Sat Mar 3 01:29:47 2018 +0200

AMBARI-23135. Fix unicode_tolerant_fs.py (echekanskiy)
---
 .../main/python/ambari_commons/unicode_tolerant_fs.py| 16 
 1 file changed, 8 insertions(+), 8 deletions(-)

diff --git a/ambari-common/src/main/python/ambari_commons/unicode_tolerant_fs.py b/ambari-common/src/main/python/ambari_commons/unicode_tolerant_fs.py
index a2b8e6b..fe0c0bb 100644
--- a/ambari-common/src/main/python/ambari_commons/unicode_tolerant_fs.py
+++ b/ambari-common/src/main/python/ambari_commons/unicode_tolerant_fs.py
@@ -6,19 +6,20 @@ regarding copyright ownership.  The ASF licenses this file
 to you under the Apache License, Version 2.0 (the
 "License"); you may not use this file except in compliance
 with the License.  You may obtain a copy of the License at
-
 http://www.apache.org/licenses/LICENSE-2.0
-
 Unless required by applicable law or agreed to in writing, software
 distributed under the License is distributed on an "AS IS" BASIS,
 WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 See the License for the specific language governing permissions and
 limitations under the License.
-
 Ambari Agent
-
 """
 
+def get_encoded_string(data):
+  try:
+return data.encode("utf8")
+  except UnicodeDecodeError:
+return data
 
 def unicode_walk(top, topdown=True, onerror=None, followlinks=False):
   """
@@ -31,7 +32,7 @@ def unicode_walk(top, topdown=True, onerror=None, 
followlinks=False):
 
   islink, join, isdir = os.path.islink, os.path.join, os.path.isdir
 
-  top = top.encode("utf8")
+  top = get_encoded_string(top)
 
   try:
 # Note that listdir and error are globals in this module due
@@ -44,7 +45,7 @@ def unicode_walk(top, topdown=True, onerror=None, 
followlinks=False):
 
   dirs, nondirs = [], []
   for name in names:
-name = name.encode("utf8")
+name = get_encoded_string(name)
 if isdir(join(top, name)):
   dirs.append(name)
 else:
@@ -53,11 +54,10 @@ def unicode_walk(top, topdown=True, onerror=None, 
followlinks=False):
   if topdown:
 yield top, dirs, nondirs
   for name in dirs:
-name = name.encode("utf8")
+name = get_encoded_string(name)
 new_path = join(top, name)
 if followlinks or not islink(new_path):
   for x in unicode_walk(new_path, topdown, onerror, followlinks):
 yield x
   if not topdown:
 yield top, dirs, nondirs
-

-- 
To stop receiving notification emails like this one, please contact
echekans...@apache.org.


[ambari] branch branch-2.6 updated: AMBARI-22819. Issues with storm jaas files and recursive ch_(mod/own) calls (echekanskiy)

2018-02-19 Thread echekanskiy
This is an automated email from the ASF dual-hosted git repository.

echekanskiy pushed a commit to branch branch-2.6
in repository https://gitbox.apache.org/repos/asf/ambari.git


The following commit(s) were added to refs/heads/branch-2.6 by this push:
 new f4a4dca  AMBARI-22819. Issues with storm jaas files and recursive 
ch_(mod/own) calls (echekanskiy)
f4a4dca is described below

commit f4a4dca591d18076c0afb26807d2961bc4ac1d46
Author: Eugene Chekanskiy <echekans...@gmail.com>
AuthorDate: Fri Feb 16 13:57:05 2018 +0200

AMBARI-22819. Issues with storm jaas files and recursive ch_(mod/own) calls 
(echekanskiy)
---
 .../python/ambari_commons/unicode_tolerant_fs.py   | 63 ++
 .../main/python/resource_management/core/sudo.py   |  5 +-
 .../STORM/0.9.1/package/scripts/storm.py   | 21 ++--
 .../python/stacks/2.1/STORM/test_storm_base.py | 32 ++-
 .../2.1/STORM/test_storm_jaas_configuration.py |  1 +
 .../python/stacks/2.1/STORM/test_storm_nimbus.py   | 34 +---
 .../stacks/2.1/STORM/test_storm_supervisor_prod.py |  1 -
 .../python/stacks/2.3/STORM/test_storm_base.py |  1 +
 8 files changed, 117 insertions(+), 41 deletions(-)

diff --git a/ambari-common/src/main/python/ambari_commons/unicode_tolerant_fs.py b/ambari-common/src/main/python/ambari_commons/unicode_tolerant_fs.py
new file mode 100644
index 000..a2b8e6b
--- /dev/null
+++ b/ambari-common/src/main/python/ambari_commons/unicode_tolerant_fs.py
@@ -0,0 +1,63 @@
+"""
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
+
+Ambari Agent
+
+"""
+
+
+def unicode_walk(top, topdown=True, onerror=None, followlinks=False):
+  """
+  Unicode tolerant version of os.walk. Can (and must) be used in environments with messed-up locales (and other
+  encoding-related problems) to traverse directory trees with unicode names in files and directories. All other
+  functions seem to accept utf-8 encoded strings, so the result of `unicode_walk` can be used without problems.
+  """
+  import os.path
+  import os
+
+  islink, join, isdir = os.path.islink, os.path.join, os.path.isdir
+
+  top = top.encode("utf8")
+
+  try:
+# Note that listdir and error are globals in this module due
+# to earlier import-*.
+names = os.listdir(top)
+  except os.error, err:
+if onerror is not None:
+  onerror(err)
+return
+
+  dirs, nondirs = [], []
+  for name in names:
+name = name.encode("utf8")
+if isdir(join(top, name)):
+  dirs.append(name)
+else:
+  nondirs.append(name)
+
+  if topdown:
+yield top, dirs, nondirs
+  for name in dirs:
+name = name.encode("utf8")
+new_path = join(top, name)
+if followlinks or not islink(new_path):
+  for x in unicode_walk(new_path, topdown, onerror, followlinks):
+yield x
+  if not topdown:
+yield top, dirs, nondirs
+
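For reference, a brief usage sketch of the helper added above, mirroring how sudo.py wires it in later in this commit; the path is an arbitrary example:

    # -*- coding: utf-8 -*-
    import os
    from ambari_commons.unicode_tolerant_fs import unicode_walk

    # behaves like os.walk but coerces every yielded name to utf-8 bytes,
    # so joins and chown/chmod calls do not trip over a misconfigured locale
    for root, dirs, files in unicode_walk('/var/log/storm', followlinks=True):
      for name in files + dirs:
        print(os.path.join(root, name))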
diff --git a/ambari-common/src/main/python/resource_management/core/sudo.py b/ambari-common/src/main/python/resource_management/core/sudo.py
index 2989367..a0a8e45 100644
--- a/ambari-common/src/main/python/resource_management/core/sudo.py
+++ b/ambari-common/src/main/python/resource_management/core/sudo.py
@@ -28,6 +28,7 @@ import errno
 import random
 from resource_management.core import shell
 from resource_management.core.exceptions import Fail
+from ambari_commons.unicode_tolerant_fs import unicode_walk
 from ambari_commons import subprocess32
 
 from resource_management.core.utils import attr_to_bitmask
@@ -46,7 +47,7 @@ if os.geteuid() == 0:
 if uid == -1 and gid == -1:
   return
   
-for root, dirs, files in os.walk(path, followlinks=follow_links):
+for root, dirs, files in unicode_walk(path, followlinks=True):
   for name in files + dirs:
 if follow_links:
   os.chown(os.path.join(root, name), uid, gid)
@@ -83,7 +84,7 @@ if os.geteuid() == 0:
 dir_attrib = recursive_mode_flags["d"] if "d" in recursive_mode_flags else None
 files_attrib = recursive_mode_flags["f"] if "d" in recursive_mode_flags else None
 
-for root, dirs, files in os.walk(path, followlinks=recursion_follow_li

[ambari] branch trunk updated: AMBARI-23016. Fix trunk build after merge of 3.0-perf (echekanskiy)

2018-02-16 Thread echekanskiy
This is an automated email from the ASF dual-hosted git repository.

echekanskiy pushed a commit to branch trunk
in repository https://gitbox.apache.org/repos/asf/ambari.git


The following commit(s) were added to refs/heads/trunk by this push:
 new c1e87ac  AMBARI-23016. Fix trunk build after merge of 3.0-perf 
(echekanskiy)
c1e87ac is described below

commit c1e87ac1e8db91fd332693d829f4536acef54393
Author: Eugene Chekanskiy <echekans...@gmail.com>
AuthorDate: Fri Feb 16 17:17:08 2018 +0200

AMBARI-23016. Fix trunk build after merge of 3.0-perf (echekanskiy)
---
 .../server/state/stack/upgrade/RepositoryVersionHelper.java   | 8 ++--
 1 file changed, 2 insertions(+), 6 deletions(-)

diff --git a/ambari-server/src/main/java/org/apache/ambari/server/state/stack/upgrade/RepositoryVersionHelper.java b/ambari-server/src/main/java/org/apache/ambari/server/state/stack/upgrade/RepositoryVersionHelper.java
index 02d770e..cec3629 100644
--- a/ambari-server/src/main/java/org/apache/ambari/server/state/stack/upgrade/RepositoryVersionHelper.java
+++ b/ambari-server/src/main/java/org/apache/ambari/server/state/stack/upgrade/RepositoryVersionHelper.java
@@ -656,7 +656,7 @@ public class RepositoryVersionHelper {
 return true;
   }
   
-   * Get repository info given a cluster and host.
+  /** Get repository info given a cluster and host.
*
* @param cluster  the cluster
* @param host the host
@@ -664,11 +664,7 @@ public class RepositoryVersionHelper {
* @return the repo info
*
* @throws AmbariException if the repository information can not be obtained
-  public String getRepoInfoString(Cluster cluster, Host host) throws AmbariException {
-
-  return getRepoInfoString(cluster, host.getOsType(), host.getOsFamily(), host.getHostName());
-  }*/
-
+   */
   public String getRepoInfoString(Cluster cluster, ServiceComponent component, Host host) throws AmbariException, SystemException {
 return gson.toJson(getCommandRepository(cluster, component, host));
   }

-- 
To stop receiving notification emails like this one, please contact
echekans...@apache.org.


[ambari] branch trunk updated: AMBARI-22819. Issues with storm jaas files and recursive ch_(mod/own) calls (echekanskiy)

2018-02-16 Thread echekanskiy
This is an automated email from the ASF dual-hosted git repository.

echekanskiy pushed a commit to branch trunk
in repository https://gitbox.apache.org/repos/asf/ambari.git


The following commit(s) were added to refs/heads/trunk by this push:
 new 837b42e  AMBARI-22819. Issues with storm jaas files and recursive 
ch_(mod/own) calls (echekanskiy)
837b42e is described below

commit 837b42ea477949af3104165ec1f0e0051e9da8c3
Author: Eugene Chekanskiy <echekans...@gmail.com>
AuthorDate: Fri Feb 16 13:57:05 2018 +0200

AMBARI-22819. Issues with storm jaas files and recursive ch_(mod/own) calls 
(echekanskiy)
---
 .../python/ambari_commons/unicode_tolerant_fs.py   | 63 ++
 .../main/python/resource_management/core/sudo.py   |  5 +-
 .../STORM/0.9.1/package/scripts/storm.py   | 21 ++--
 .../python/stacks/2.1/STORM/test_storm_base.py | 32 ++-
 .../2.1/STORM/test_storm_jaas_configuration.py |  1 +
 .../python/stacks/2.1/STORM/test_storm_nimbus.py   | 34 +---
 .../stacks/2.1/STORM/test_storm_supervisor_prod.py |  1 -
 .../python/stacks/2.3/STORM/test_storm_base.py |  1 +
 8 files changed, 117 insertions(+), 41 deletions(-)

diff --git a/ambari-common/src/main/python/ambari_commons/unicode_tolerant_fs.py b/ambari-common/src/main/python/ambari_commons/unicode_tolerant_fs.py
new file mode 100644
index 000..a2b8e6b
--- /dev/null
+++ b/ambari-common/src/main/python/ambari_commons/unicode_tolerant_fs.py
@@ -0,0 +1,63 @@
+"""
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
+
+Ambari Agent
+
+"""
+
+
+def unicode_walk(top, topdown=True, onerror=None, followlinks=False):
+  """
+  Unicode tolerant version of os.walk. Can (and must) be used in environments with messed-up locales (and other
+  encoding-related problems) to traverse directory trees with unicode names in files and directories. All other
+  functions seem to accept utf-8 encoded strings, so the result of `unicode_walk` can be used without problems.
+  """
+  import os.path
+  import os
+
+  islink, join, isdir = os.path.islink, os.path.join, os.path.isdir
+
+  top = top.encode("utf8")
+
+  try:
+# Note that listdir and error are globals in this module due
+# to earlier import-*.
+names = os.listdir(top)
+  except os.error, err:
+if onerror is not None:
+  onerror(err)
+return
+
+  dirs, nondirs = [], []
+  for name in names:
+name = name.encode("utf8")
+if isdir(join(top, name)):
+  dirs.append(name)
+else:
+  nondirs.append(name)
+
+  if topdown:
+yield top, dirs, nondirs
+  for name in dirs:
+name = name.encode("utf8")
+new_path = join(top, name)
+if followlinks or not islink(new_path):
+  for x in unicode_walk(new_path, topdown, onerror, followlinks):
+yield x
+  if not topdown:
+yield top, dirs, nondirs
+
diff --git a/ambari-common/src/main/python/resource_management/core/sudo.py b/ambari-common/src/main/python/resource_management/core/sudo.py
index 2989367..a0a8e45 100644
--- a/ambari-common/src/main/python/resource_management/core/sudo.py
+++ b/ambari-common/src/main/python/resource_management/core/sudo.py
@@ -28,6 +28,7 @@ import errno
 import random
 from resource_management.core import shell
 from resource_management.core.exceptions import Fail
+from ambari_commons.unicode_tolerant_fs import unicode_walk
 from ambari_commons import subprocess32
 
 from resource_management.core.utils import attr_to_bitmask
@@ -46,7 +47,7 @@ if os.geteuid() == 0:
 if uid == -1 and gid == -1:
   return
   
-for root, dirs, files in os.walk(path, followlinks=follow_links):
+for root, dirs, files in unicode_walk(path, followlinks=True):
   for name in files + dirs:
 if follow_links:
   os.chown(os.path.join(root, name), uid, gid)
@@ -83,7 +84,7 @@ if os.geteuid() == 0:
 dir_attrib = recursive_mode_flags["d"] if "d" in recursive_mode_flags else None
 files_attrib = recursive_mode_flags["f"] if "d" in recursive_mode_flags else None
 
-for root, dirs, files in os.walk(path, followlinks=recursion_follow_li

[ambari] branch branch-2.6 updated: AMBARI-22984. Old mpacks broken (echekanskiy)

2018-02-14 Thread echekanskiy
This is an automated email from the ASF dual-hosted git repository.

echekanskiy pushed a commit to branch branch-2.6
in repository https://gitbox.apache.org/repos/asf/ambari.git


The following commit(s) were added to refs/heads/branch-2.6 by this push:
 new 96cee1d  AMBARI-22984. Old mpacks broken (echekanskiy)
96cee1d is described below

commit 96cee1d8ff785c88422cf0c9a1ca0bab70910346
Author: Eugene Chekanskiy <echekans...@gmail.com>
AuthorDate: Wed Feb 14 13:25:21 2018 +0200

AMBARI-22984. Old mpacks broken (echekanskiy)
---
 .../src/main/resources/custom_actions/scripts/install_packages.py | 4 
 1 file changed, 4 insertions(+)

diff --git a/ambari-server/src/main/resources/custom_actions/scripts/install_packages.py b/ambari-server/src/main/resources/custom_actions/scripts/install_packages.py
index f925a0b..0965698 100644
--- a/ambari-server/src/main/resources/custom_actions/scripts/install_packages.py
+++ b/ambari-server/src/main/resources/custom_actions/scripts/install_packages.py
@@ -193,8 +193,12 @@ class InstallPackages(Script):
   for directory_struct in directories:
 if "component" in directory_struct:
   component_name = directory_struct["component"]
+
   if component_name:
 stack_version = stack_select.get_stack_version_before_install(component_name)
+  else:
+    Logger.warning("Unable to fix {0} since stack using outdated stack_packages.json".format(package_name))
+    return
 
   if 0 == len(restricted_packages) or package_name in restricted_packages:
 if stack_version:
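For orientation, a rough sketch of the data this loop walks; the keys other than "component" are assumptions about how newer stack_packages.json-derived entries look, not something this diff shows:

    # one entry of the 'directories' list derived from stack_packages.json
    new_style_entry = {
      "conf-dir": "/etc/kafka/conf",                         # illustrative only
      "current-dir": "/usr/hdp/current/kafka-broker/conf",   # illustrative only
      "component": "KAFKA_BROKER"                            # present in new-style files
    }
    old_style_entry = {
      "conf-dir": "/etc/kafka/conf",
      "current-dir": "/usr/hdp/current/kafka-broker/conf"
      # no "component" key in stacks shipped with older mpacks, so the code above
      # now logs a warning and returns instead of failing later on a missing name
    }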

-- 
To stop receiving notification emails like this one, please contact
echekans...@apache.org.


[ambari] branch trunk updated: AMBARI-22984. Old mpacks broken (echekanskiy)

2018-02-14 Thread echekanskiy
This is an automated email from the ASF dual-hosted git repository.

echekanskiy pushed a commit to branch trunk
in repository https://gitbox.apache.org/repos/asf/ambari.git


The following commit(s) were added to refs/heads/trunk by this push:
 new dc69741  AMBARI-22984. Old mpacks broken (echekanskiy)
dc69741 is described below

commit dc697410dd7e48e4e3705b81e8342a4b0f812769
Author: Eugene Chekanskiy <echekans...@gmail.com>
AuthorDate: Wed Feb 14 13:25:21 2018 +0200

AMBARI-22984. Old mpacks broken (echekanskiy)
---
 .../src/main/resources/custom_actions/scripts/install_packages.py | 4 
 1 file changed, 4 insertions(+)

diff --git a/ambari-server/src/main/resources/custom_actions/scripts/install_packages.py b/ambari-server/src/main/resources/custom_actions/scripts/install_packages.py
index cd172af..dabfdb3 100644
--- a/ambari-server/src/main/resources/custom_actions/scripts/install_packages.py
+++ b/ambari-server/src/main/resources/custom_actions/scripts/install_packages.py
@@ -191,8 +191,12 @@ class InstallPackages(Script):
   for directory_struct in directories:
 if "component" in directory_struct:
   component_name = directory_struct["component"]
+
   if component_name:
 stack_version = stack_select.get_stack_version_before_install(component_name)
+  else:
+    Logger.warning("Unable to fix {0} since stack using outdated stack_packages.json".format(package_name))
+    return
 
   if 0 == len(restricted_packages) or package_name in restricted_packages:
 if stack_version:

-- 
To stop receiving notification emails like this one, please contact
echekans...@apache.org.


[ambari] branch trunk updated: AMBARI-22841. Fix upgrade-catalog after accordingly to kerberos changes(echekanskiy)

2018-01-31 Thread echekanskiy
This is an automated email from the ASF dual-hosted git repository.

echekanskiy pushed a commit to branch trunk
in repository https://gitbox.apache.org/repos/asf/ambari.git


The following commit(s) were added to refs/heads/trunk by this push:
 new a69872e  AMBARI-22841. Fix upgrade-catalog after accordingly to 
kerberos changes(echekanskiy)
a69872e is described below

commit a69872e24d08000da1bd53b7b7316a33f8118b03
Author: Eugene Chekanskiy <echekans...@gmail.com>
AuthorDate: Mon Jan 22 17:51:52 2018 +0200

AMBARI-22841. Fix upgrade-catalog after accordingly to kerberos 
changes(echekanskiy)
---
 .../ambari/server/controller/AmbariServer.java |   4 +
 .../ambari/server/controller/KerberosHelper.java   |  11 +
 .../server/controller/KerberosHelperImpl.java  |   2 +-
 .../AbstractPrepareKerberosServerAction.java   |   2 +-
 .../ambari/server/upgrade/UpgradeCatalog300.java   | 389 ++---
 .../server/upgrade/UpgradeCatalog300Test.java  |  90 -
 6 files changed, 363 insertions(+), 135 deletions(-)

diff --git a/ambari-server/src/main/java/org/apache/ambari/server/controller/AmbariServer.java b/ambari-server/src/main/java/org/apache/ambari/server/controller/AmbariServer.java
index c0e9692..09badb6 100644
--- a/ambari-server/src/main/java/org/apache/ambari/server/controller/AmbariServer.java
+++ b/ambari-server/src/main/java/org/apache/ambari/server/controller/AmbariServer.java
@@ -306,6 +306,10 @@ public class AmbariServer {
 return clusterController;
   }
 
+  public static void setController(AmbariManagementController controller) {
+clusterController = controller;
+  }
+
   @SuppressWarnings("deprecation")
   public void run() throws Exception {
 setupJulLogging();
diff --git a/ambari-server/src/main/java/org/apache/ambari/server/controller/KerberosHelper.java b/ambari-server/src/main/java/org/apache/ambari/server/controller/KerberosHelper.java
index 0aef548..297fc3c 100644
--- a/ambari-server/src/main/java/org/apache/ambari/server/controller/KerberosHelper.java
+++ b/ambari-server/src/main/java/org/apache/ambari/server/controller/KerberosHelper.java
@@ -18,6 +18,7 @@
 
 package org.apache.ambari.server.controller;
 
+import java.io.File;
 import java.io.IOException;
 import java.util.Collection;
 import java.util.List;
@@ -750,6 +751,16 @@ public interface KerberosHelper {
   void removeStaleKeytabs(Collection expectedKeytabs);
 
   /**
+   * Creates a temporary directory within the system temporary directory
+   * 
+   * The resulting directory is to be removed by the caller when desired.
+   *
+   * @return a File pointing to the new temporary directory, or null if one 
was not created
+   * @throws AmbariException if a new temporary directory cannot be created
+   */
+  File createTemporaryDirectory() throws AmbariException;
+
+  /**
* Translates a collection of configuration specifications 
(config-type/property-name)
* to a map of configuration types to a set of property names.
* 
diff --git a/ambari-server/src/main/java/org/apache/ambari/server/controller/KerberosHelperImpl.java b/ambari-server/src/main/java/org/apache/ambari/server/controller/KerberosHelperImpl.java
index c7b69f0..2a30da4 100644
--- a/ambari-server/src/main/java/org/apache/ambari/server/controller/KerberosHelperImpl.java
+++ b/ambari-server/src/main/java/org/apache/ambari/server/controller/KerberosHelperImpl.java
@@ -2518,7 +2518,7 @@ public class KerberosHelperImpl implements KerberosHelper 
{
* @return a File pointing to the new temporary directory, or null if one 
was not created
* @throws AmbariException if a new temporary directory cannot be created
*/
-  protected File createTemporaryDirectory() throws AmbariException {
+  public File createTemporaryDirectory() throws AmbariException {
 try {
   File temporaryDirectory = getConfiguredTemporaryDirectory();
 
diff --git a/ambari-server/src/main/java/org/apache/ambari/server/serveraction/kerberos/AbstractPrepareKerberosServerAction.java b/ambari-server/src/main/java/org/apache/ambari/server/serveraction/kerberos/AbstractPrepareKerberosServerAction.java
index cffd8e1..bcace83 100644
--- a/ambari-server/src/main/java/org/apache/ambari/server/serveraction/kerberos/AbstractPrepareKerberosServerAction.java
+++ b/ambari-server/src/main/java/org/apache/ambari/server/serveraction/kerberos/AbstractPrepareKerberosServerAction.java
@@ -73,7 +73,7 @@ public abstract class AbstractPrepareKerberosServerAction 
extends KerberosServer
 return kerberosHelper;
   }
 
-  void processServiceComponentHosts(Cluster cluster, KerberosDescriptor kerberosDescriptor,
+  public void processServiceComponentHosts(Cluster cluster, KerberosDescriptor kerberosDescriptor,
 List schToProcess,
 Collection identityFilter, String dataDirectory,
 Map<String,

[ambari] branch trunk updated: AMBARI-22847. Let HBase use ZK principal name set by users when enabling Kerberos (until now it's been hardcoded to 'zookeeper')

2018-01-29 Thread echekanskiy
This is an automated email from the ASF dual-hosted git repository.

echekanskiy pushed a commit to branch trunk
in repository https://gitbox.apache.org/repos/asf/ambari.git


The following commit(s) were added to refs/heads/trunk by this push:
 new c37068a  AMBARI-22847. Let HBase use ZK principal name set by users 
when enabling Kerberos (until now it's been hardcoded to 'zookeeper')
c37068a is described below

commit c37068ae4f700dc96900adf5b0e8bc36b8cd2fd3
Author: Sandor Molnar <smol...@hortonworks.com>
AuthorDate: Fri Jan 26 13:36:33 2018 +0100

AMBARI-22847. Let HBase use ZK principal name set by users when enabling 
Kerberos (until now it's been hardcoded to 'zookeeper')
---
 .../common-services/HBASE/2.0.0.3.0/configuration/hbase-env.xml| 2 +-
 .../common-services/HBASE/2.0.0.3.0/package/scripts/params_linux.py| 3 +++
 2 files changed, 4 insertions(+), 1 deletion(-)

diff --git a/ambari-server/src/main/resources/common-services/HBASE/2.0.0.3.0/configuration/hbase-env.xml b/ambari-server/src/main/resources/common-services/HBASE/2.0.0.3.0/configuration/hbase-env.xml
index b36ac00..ff9d6fa 100644
--- a/ambari-server/src/main/resources/common-services/HBASE/2.0.0.3.0/configuration/hbase-env.xml
+++ b/ambari-server/src/main/resources/common-services/HBASE/2.0.0.3.0/configuration/hbase-env.xml
@@ -230,7 +230,7 @@ JDK_DEPENDED_OPTS="-XX:PermSize=128m -XX:MaxPermSize=128m"
 {% endif %}
 
 {% if security_enabled %}
-export HBASE_OPTS="$HBASE_OPTS -XX:+UseConcMarkSweepGC -XX:ErrorFile={{log_dir}}/hs_err_pid%p.log -Djava.security.auth.login.config={{client_jaas_config_file}} -Djava.io.tmpdir={{java_io_tmpdir}}"
+export HBASE_OPTS="$HBASE_OPTS -XX:+UseConcMarkSweepGC -XX:ErrorFile={{log_dir}}/hs_err_pid%p.log -Djava.security.auth.login.config={{client_jaas_config_file}} -Djava.io.tmpdir={{java_io_tmpdir}} {{zk_security_opts}}"
 export HBASE_MASTER_OPTS="$HBASE_MASTER_OPTS -Xmx{{master_heapsize}} -Djava.security.auth.login.config={{master_jaas_config_file}} -Djavax.security.auth.useSubjectCredsOnly=false $JDK_DEPENDED_OPTS"
 export HBASE_REGIONSERVER_OPTS="$HBASE_REGIONSERVER_OPTS -Xmn{{regionserver_xmn_size}} -XX:CMSInitiatingOccupancyFraction=70 -XX:ReservedCodeCacheSize=256m -Xms{{regionserver_heapsize}} -Xmx{{regionserver_heapsize}} -Djava.security.auth.login.config={{regionserver_jaas_config_file}} -Djavax.security.auth.useSubjectCredsOnly=false $JDK_DEPENDED_OPTS"
 export PHOENIX_QUERYSERVER_OPTS="$PHOENIX_QUERYSERVER_OPTS -Djava.security.auth.login.config={{queryserver_jaas_config_file}}"
diff --git a/ambari-server/src/main/resources/common-services/HBASE/2.0.0.3.0/package/scripts/params_linux.py b/ambari-server/src/main/resources/common-services/HBASE/2.0.0.3.0/package/scripts/params_linux.py
index b7e2b89..d8b26fc 100644
--- a/ambari-server/src/main/resources/common-services/HBASE/2.0.0.3.0/package/scripts/params_linux.py
+++ b/ambari-server/src/main/resources/common-services/HBASE/2.0.0.3.0/package/scripts/params_linux.py
@@ -184,6 +184,9 @@ service_check_data = get_unique_id_and_date()
 user_group = config['configurations']['cluster-env']["user_group"]
 
 if security_enabled:
+  zk_principal_name = default("/configurations/zookeeper-env/zookeeper_principal_name", "zookeeper/_h...@example.com")
+  zk_principal_user = zk_principal_name.split('/')[0]
+  zk_security_opts = format('-Dzookeeper.sasl.client=true -Dzookeeper.sasl.client.username={zk_principal_user} -Dzookeeper.sasl.clientconfig=Client')
   _hostname_lowercase = config['hostname'].lower()
   master_jaas_princ = config['configurations']['hbase-site']['hbase.master.kerberos.principal'].replace('_HOST',_hostname_lowercase)
   master_keytab_path = config['configurations']['hbase-site']['hbase.master.keytab.file']
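A tiny illustrative sketch of what the split above yields; the first value mirrors the stock default (shown address-obfuscated by the list archive in the diff), the second is a hypothetical custom principal:

    zk_principal_name = 'zookeeper/_HOST@EXAMPLE.COM'      # zookeeper-env/zookeeper_principal_name
    zk_principal_user = zk_principal_name.split('/')[0]    # -> 'zookeeper', used as the SASL client username

    custom_user = 'zk-prod/_HOST@CORP.EXAMPLE.COM'.split('/')[0]  # -> 'zk-prod' instead of the previously hardcoded 'zookeeper'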

-- 
To stop receiving notification emails like this one, please contact
echekans...@apache.org.


[ambari] branch trunk updated: AMBARI-22824. Let YARN/MR2 use ZK principal name set by users when enabling Kerberos (until now it's been hardcoded to 'zookeeper')

2018-01-29 Thread echekanskiy
This is an automated email from the ASF dual-hosted git repository.

echekanskiy pushed a commit to branch trunk
in repository https://gitbox.apache.org/repos/asf/ambari.git


The following commit(s) were added to refs/heads/trunk by this push:
 new 4a9e13c  AMBARI-22824. Let YARN/MR2 use ZK principal name set by users 
when enabling Kerberos (until now it's been hardcoded to 'zookeeper')
4a9e13c is described below

commit 4a9e13c7040761785c7d09e312f37cae590f2221
Author: smolnar82 <34065904+smolna...@users.noreply.github.com>
AuthorDate: Mon Jan 29 12:46:05 2018 +0100

AMBARI-22824. Let YARN/MR2 use ZK principal name set by users when enabling 
Kerberos (until now it's been hardcoded to 'zookeeper')
---
 .../common-services/YARN/2.1.0.2.0/configuration/yarn-env.xml | 4 
 .../common-services/YARN/2.1.0.2.0/package/scripts/params_linux.py| 4 +++-
 .../common-services/YARN/3.0.0.3.0/package/scripts/params_linux.py| 4 +++-
 3 files changed, 10 insertions(+), 2 deletions(-)

diff --git a/ambari-server/src/main/resources/common-services/YARN/2.1.0.2.0/configuration/yarn-env.xml b/ambari-server/src/main/resources/common-services/YARN/2.1.0.2.0/configuration/yarn-env.xml
index d663c49..52560ac 100644
--- a/ambari-server/src/main/resources/common-services/YARN/2.1.0.2.0/configuration/yarn-env.xml
+++ b/ambari-server/src/main/resources/common-services/YARN/2.1.0.2.0/configuration/yarn-env.xml
@@ -244,6 +244,10 @@ if [ "x$JAVA_LIBRARY_PATH" != "x" ]; then
 fi
 YARN_OPTS="$YARN_OPTS -Dyarn.policy.file=$YARN_POLICYFILE"
 YARN_OPTS="$YARN_OPTS -Djava.io.tmpdir={{hadoop_java_io_tmpdir}}"
+
+{% if rm_security_opts is defined %}
+YARN_OPTS="{{rm_security_opts}} $YARN_OPTS"
+{% endif %}
 
 
   content
diff --git a/ambari-server/src/main/resources/common-services/YARN/2.1.0.2.0/package/scripts/params_linux.py b/ambari-server/src/main/resources/common-services/YARN/2.1.0.2.0/package/scripts/params_linux.py
index 4a49822..eab6870 100644
--- a/ambari-server/src/main/resources/common-services/YARN/2.1.0.2.0/package/scripts/params_linux.py
+++ b/ambari-server/src/main/resources/common-services/YARN/2.1.0.2.0/package/scripts/params_linux.py
@@ -347,7 +347,9 @@ if security_enabled:
   rm_kinit_cmd = format("{kinit_path_local} -kt {rm_keytab} {rm_principal_name};")
   yarn_jaas_file = os.path.join(config_dir, 'yarn_jaas.conf')
   if stack_supports_zk_security:
-rm_security_opts = format('-Dzookeeper.sasl.client=true -Dzookeeper.sasl.client.username=zookeeper -Djava.security.auth.login.config={yarn_jaas_file} -Dzookeeper.sasl.clientconfig=Client')
+zk_principal_name = default("/configurations/zookeeper-env/zookeeper_principal_name", "zookeeper/_h...@example.com")
+zk_principal_user = zk_principal_name.split('/')[0]
+rm_security_opts = format('-Dzookeeper.sasl.client=true -Dzookeeper.sasl.client.username={zk_principal_user} -Djava.security.auth.login.config={yarn_jaas_file} -Dzookeeper.sasl.clientconfig=Client')
 
   # YARN timeline security options
   if has_ats:
diff --git a/ambari-server/src/main/resources/common-services/YARN/3.0.0.3.0/package/scripts/params_linux.py b/ambari-server/src/main/resources/common-services/YARN/3.0.0.3.0/package/scripts/params_linux.py
index 9afd112..7593708 100644
--- a/ambari-server/src/main/resources/common-services/YARN/3.0.0.3.0/package/scripts/params_linux.py
+++ b/ambari-server/src/main/resources/common-services/YARN/3.0.0.3.0/package/scripts/params_linux.py
@@ -345,7 +345,9 @@ if security_enabled:
   rm_keytab = config['configurations']['yarn-site']['yarn.resourcemanager.keytab']
   rm_kinit_cmd = format("{kinit_path_local} -kt {rm_keytab} {rm_principal_name};")
   yarn_jaas_file = os.path.join(config_dir, 'yarn_jaas.conf')
-  rm_security_opts = format('-Dzookeeper.sasl.client=true -Dzookeeper.sasl.client.username=zookeeper -Djava.security.auth.login.config={yarn_jaas_file} -Dzookeeper.sasl.clientconfig=Client')
+  zk_principal_name = default("/configurations/zookeeper-env/zookeeper_principal_name", "zookeeper/_h...@example.com")
+  zk_principal_user = zk_principal_name.split('/')[0]
+  rm_security_opts = format('-Dzookeeper.sasl.client=true -Dzookeeper.sasl.client.username={zk_principal_user} -Djava.security.auth.login.config={yarn_jaas_file} -Dzookeeper.sasl.clientconfig=Client')
 
   # YARN timeline security options
   if has_ats:
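The yarn-env.xml part of this change guards the new variable with a Jinja "is defined" test, so unsecured clusters, where params never sets rm_security_opts, still render the template cleanly. A stand-alone sketch of that behavior using plain jinja2 (Ambari's own template wrapper differs, so this is only an approximation):

    from jinja2 import Template

    tmpl = Template(
      'YARN_OPTS="$YARN_OPTS -Djava.io.tmpdir={{ hadoop_java_io_tmpdir }}"\n'
      '{% if rm_security_opts is defined %}YARN_OPTS="{{ rm_security_opts }} $YARN_OPTS"\n{% endif %}')

    print(tmpl.render(hadoop_java_io_tmpdir='/tmp'))   # unsecured case: the guarded line is omitted
    print(tmpl.render(hadoop_java_io_tmpdir='/tmp',
                      rm_security_opts='-Dzookeeper.sasl.client=true'))  # secured case: the line is emitted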

-- 
To stop receiving notification emails like this one, please contact
echekans...@apache.org.


[ambari] branch trunk updated: AMBARI-22792. Refactor agent-side kerberos code (echekanskiy)

2018-01-18 Thread echekanskiy
This is an automated email from the ASF dual-hosted git repository.

echekanskiy pushed a commit to branch trunk
in repository https://gitbox.apache.org/repos/asf/ambari.git


The following commit(s) were added to refs/heads/trunk by this push:
 new 1a6548a  AMBARI-22792. Refactor agent-side kerberos code (echekanskiy)
1a6548a is described below

commit 1a6548a69970446bda994e89b76cc9cee92860c6
Author: Eugene Chekanskiy <echekans...@hortonworks.com>
AuthorDate: Mon Jan 15 16:29:49 2018 +0200

AMBARI-22792. Refactor agent-side kerberos code (echekanskiy)
---
 .../python/ambari_commons/kerberos/__init__.py |  19 +
 .../ambari_commons/kerberos/kerberos_common.py | 168 +++
 .../main/python/ambari_commons/kerberos}/utils.py  |   4 +
 .../ambari/server/agent/HeartbeatProcessor.java|  13 +-
 .../ConfigureAmbariIdentitiesServerAction.java |   2 +
 .../1.10.3-10/package/scripts/kerberos_client.py   |  57 ++-
 .../1.10.3-10/package/scripts/kerberos_common.py   | 493 -
 .../KERBEROS/1.10.3-10/package/scripts/params.py   |   4 +-
 .../1.10.3-10/package/scripts/service_check.py |  32 +-
 .../1.10.3-30/package/scripts/kerberos_client.py   |  57 ++-
 .../1.10.3-30/package/scripts/kerberos_common.py   | 493 -
 .../KERBEROS/1.10.3-30/package/scripts/params.py   |   5 +-
 .../1.10.3-30/package/scripts/service_check.py |  32 +-
 .../KERBEROS/1.10.3-30/package/scripts/utils.py| 105 -
 .../stacks/2.2/KERBEROS/test_kerberos_client.py|   2 +-
 15 files changed, 334 insertions(+), 1152 deletions(-)

diff --git a/ambari-common/src/main/python/ambari_commons/kerberos/__init__.py b/ambari-common/src/main/python/ambari_commons/kerberos/__init__.py
new file mode 100644
index 000..3cb6ecf
--- /dev/null
+++ b/ambari-common/src/main/python/ambari_commons/kerberos/__init__.py
@@ -0,0 +1,19 @@
+#!/usr/bin/env python2.6
+
+'''
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
+'''
diff --git a/ambari-common/src/main/python/ambari_commons/kerberos/kerberos_common.py b/ambari-common/src/main/python/ambari_commons/kerberos/kerberos_common.py
new file mode 100644
index 000..c0ac580
--- /dev/null
+++ b/ambari-common/src/main/python/ambari_commons/kerberos/kerberos_common.py
@@ -0,0 +1,168 @@
+"""
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
+
+"""
+
+import base64
+import getpass
+import os
+import stat
+from ambari_agent import Constants
+from collections import namedtuple
+from resource_management.core import sudo
+from resource_management.core.logger import Logger
+from resource_management.core.resources.klist import Klist
+from resource_management.core.resources.system import Directory, File
+from resource_management.core.source import InlineTemplate
+from tempfile import gettempdir
+from .utils import get_property_value
+
+KRB5_REALM_PROPERTIES = [
+  'kdc',
+  'admin_server',
+  'default_domain',
+  'master_kdc'
+]
+
+
+class MissingKeytabs(object):
+  class Identity(namedtuple('Identity', ['principal', 'keytab_file_path'])):
+@staticmethod
+def from_kerberos_record(item, hostname):
+  return MissingKeytabs.Identity(
+get_property_value(item, 'principal').replace("_HOST", hostname),
+get_property_value(item, 'keytab_file_path'))
+
+def __str__(self):
+  return "Keytab: %s Principal: %s" % (self.keytab_file_path, 
self.principal)
+
+  @classmethod
+  def from_kerberos_record

ambari git commit: AMBARI-22629 Disabling Kerberos after enabled during Blueprint install fails with missing data directory error (echekanskiy)

2018-01-04 Thread echekanskiy
Repository: ambari
Updated Branches:
  refs/heads/branch-2.6 b5da60b5a -> a02098672


AMBARI-22629 Disabling Kerberos after enabled during Blueprint install fails 
with missing data directory error (echekanskiy)


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/a0209867
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/a0209867
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/a0209867

Branch: refs/heads/branch-2.6
Commit: a020986728875896df7acd6c2811760ba739d38e
Parents: b5da60b
Author: Eugene Chekanskiy <echekans...@hortonworks.com>
Authored: Thu Dec 14 17:29:20 2017 +0200
Committer: Eugene Chekanskiy <echekans...@hortonworks.com>
Committed: Thu Jan 4 19:00:02 2018 +0200

--
 .../server/controller/KerberosHelperImpl.java   |  6 +-
 .../server/controller/KerberosHelperTest.java   | 64 +---
 2 files changed, 60 insertions(+), 10 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/ambari/blob/a0209867/ambari-server/src/main/java/org/apache/ambari/server/controller/KerberosHelperImpl.java
--
diff --git a/ambari-server/src/main/java/org/apache/ambari/server/controller/KerberosHelperImpl.java b/ambari-server/src/main/java/org/apache/ambari/server/controller/KerberosHelperImpl.java
index 86d3ee0..4402d4e 100644
--- a/ambari-server/src/main/java/org/apache/ambari/server/controller/KerberosHelperImpl.java
+++ b/ambari-server/src/main/java/org/apache/ambari/server/controller/KerberosHelperImpl.java
@@ -3976,17 +3976,17 @@ public class KerberosHelperImpl implements 
KerberosHelper {
 
 @Override
 public SecurityState getNewDesiredSCHSecurityState() {
-  return null;
+  return SecurityState.SECURED_KERBEROS;
 }
 
 @Override
 public SecurityState getNewSCHSecurityState() {
-  return null;
+  return SecurityState.SECURING;
 }
 
 @Override
 public SecurityState getNewServiceSecurityState() {
-  return null;
+  return SecurityState.SECURED_KERBEROS;
 }
 
 @Override

http://git-wip-us.apache.org/repos/asf/ambari/blob/a0209867/ambari-server/src/test/java/org/apache/ambari/server/controller/KerberosHelperTest.java
--
diff --git a/ambari-server/src/test/java/org/apache/ambari/server/controller/KerberosHelperTest.java b/ambari-server/src/test/java/org/apache/ambari/server/controller/KerberosHelperTest.java
index 198fdf5..cc8bb53 100644
--- a/ambari-server/src/test/java/org/apache/ambari/server/controller/KerberosHelperTest.java
+++ b/ambari-server/src/test/java/org/apache/ambari/server/controller/KerberosHelperTest.java
@@ -1380,6 +1380,12 @@ public class KerberosHelperTest extends EasyMockSupport {
 
expect(schKerberosClient.getServiceComponentName()).andReturn(Role.KERBEROS_CLIENT.name()).anyTimes();
 expect(schKerberosClient.getHostName()).andReturn("host1").anyTimes();
 expect(schKerberosClient.getState()).andReturn(State.INSTALLED).anyTimes();
+schKerberosClient.setDesiredSecurityState(SecurityState.SECURED_KERBEROS);
+expectLastCall().anyTimes();
+schKerberosClient.setSecurityState(SecurityState.SECURING);
+expectLastCall().anyTimes();
+schKerberosClient.setSecurityState(SecurityState.SECURED_KERBEROS);
+expectLastCall().anyTimes();
 
 final ServiceComponentHost sch1 = createMock(ServiceComponentHost.class);
 expect(sch1.getServiceName()).andReturn("SERVICE1").anyTimes();
@@ -1387,6 +1393,12 @@ public class KerberosHelperTest extends EasyMockSupport {
 
expect(sch1.getSecurityState()).andReturn(SecurityState.UNSECURED).anyTimes();
 
expect(sch1.getDesiredSecurityState()).andReturn(SecurityState.UNSECURED).anyTimes();
 expect(sch1.getHostName()).andReturn("host1").anyTimes();
+sch1.setDesiredSecurityState(SecurityState.SECURED_KERBEROS);
+expectLastCall().anyTimes();
+sch1.setSecurityState(SecurityState.SECURING);
+expectLastCall().anyTimes();
+sch1.setSecurityState(SecurityState.SECURED_KERBEROS);
+expectLastCall().anyTimes();
 
 final ServiceComponentHost sch2 = createMock(ServiceComponentHost.class);
 expect(sch2.getServiceName()).andReturn("SERVICE2").anyTimes();
@@ -1394,6 +1406,12 @@ public class KerberosHelperTest extends EasyMockSupport {
 
expect(sch2.getSecurityState()).andReturn(SecurityState.UNSECURED).anyTimes();
 
expect(sch2.getDesiredSecurityState()).andReturn(SecurityState.UNSECURED).anyTimes();
 expect(sch2.getHostName()).andReturn("host1").anyTimes();
+sch2.setDesiredSecurityState(SecurityState.SECURED_KERBEROS);
+expectLastCall().anyTimes();
+sch2.setSecurityState(Sec

[2/5] ambari git commit: AMBARI-22530. Refactor internal code of handling info between kerberos wizard actions (echekanskiy)

2017-12-21 Thread echekanskiy
http://git-wip-us.apache.org/repos/asf/ambari/blob/67fc4a37/ambari-server/src/main/java/org/apache/ambari/server/serveraction/kerberos/CreateKeytabFilesServerAction.java
--
diff --git a/ambari-server/src/main/java/org/apache/ambari/server/serveraction/kerberos/CreateKeytabFilesServerAction.java b/ambari-server/src/main/java/org/apache/ambari/server/serveraction/kerberos/CreateKeytabFilesServerAction.java
index 5ec4c10..a803dcf 100644
--- a/ambari-server/src/main/java/org/apache/ambari/server/serveraction/kerberos/CreateKeytabFilesServerAction.java
+++ b/ambari-server/src/main/java/org/apache/ambari/server/serveraction/kerberos/CreateKeytabFilesServerAction.java
@@ -34,10 +34,11 @@ import org.apache.ambari.server.configuration.Configuration;
 import org.apache.ambari.server.controller.KerberosHelper;
 import org.apache.ambari.server.orm.dao.HostDAO;
 import org.apache.ambari.server.orm.dao.KerberosPrincipalDAO;
-import org.apache.ambari.server.orm.dao.KerberosPrincipalHostDAO;
-import org.apache.ambari.server.orm.entities.HostEntity;
 import org.apache.ambari.server.orm.entities.KerberosPrincipalEntity;
 import org.apache.ambari.server.serveraction.ActionLog;
+import org.apache.ambari.server.serveraction.kerberos.stageutils.KerberosKeytabController;
+import org.apache.ambari.server.serveraction.kerberos.stageutils.ResolvedKerberosKeytab;
+import org.apache.ambari.server.serveraction.kerberos.stageutils.ResolvedKerberosPrincipal;
 import org.apache.commons.codec.digest.DigestUtils;
 import org.apache.directory.server.kerberos.shared.keytab.Keytab;
 import org.slf4j.Logger;
@@ -52,7 +53,7 @@ import com.google.inject.Inject;
  * This class mainly relies on the KerberosServerAction to iterate through metadata identifying
  * the Kerberos keytab files that need to be created. For each identity in the metadata, this
  * implementation's
- * {@link KerberosServerAction#processIdentity(Map, String, KerberosOperationHandler, Map, Map)}
+ * {@link KerberosServerAction#processIdentity(ResolvedKerberosPrincipal, KerberosOperationHandler, Map, Map)}
  * is invoked attempting the creation of the relevant keytab file.
  */
 public class CreateKeytabFilesServerAction extends KerberosServerAction {
@@ -65,12 +66,6 @@ public class CreateKeytabFilesServerAction extends 
KerberosServerAction {
   private KerberosPrincipalDAO kerberosPrincipalDAO;
 
   /**
-   * KerberosPrincipalHostDAO used to get Kerberos principal details
-   */
-  @Inject
-  private KerberosPrincipalHostDAO kerberosPrincipalHostDAO;
-
-  /**
* Configuration used to get the configured properties such as the keytab 
file cache directory
*/
   @Inject
@@ -82,6 +77,9 @@ public class CreateKeytabFilesServerAction extends 
KerberosServerAction {
   @Inject
   private HostDAO hostDAO;
 
+  @Inject
+  private KerberosKeytabController kerberosKeytabController;
+
   /**
* A map of data used to track what has been processed in order to optimize 
the creation of keytabs
* such as knowing when to create a cached keytab file or use a cached 
keytab file.
@@ -118,10 +116,7 @@ public class CreateKeytabFilesServerAction extends 
KerberosServerAction {
   * If a password exists for the current evaluatedPrincipal, use a
   * {@link org.apache.ambari.server.serveraction.kerberos.KerberosOperationHandler} to generate
   * the keytab file. To help avoid filename collisions and to build a structure that is easy to
-   * discover, each keytab file is stored in host-specific
-   * ({@link org.apache.ambari.server.serveraction.kerberos.KerberosIdentityDataFileReader#HOSTNAME})
-   * directory using the SHA1 hash of its destination file path
-   * ({@link org.apache.ambari.server.serveraction.kerberos.KerberosIdentityDataFileReader#KEYTAB_FILE_PATH})
+   * discover, each keytab file is stored in host-specific directory using the SHA1 hash of its destination file path.
* 
* 
*   data_directory
@@ -133,8 +128,7 @@ public class CreateKeytabFilesServerAction extends 
KerberosServerAction {
*   |  |- ...
* 
*
-   * @param identityRecord       a Map containing the data for the current identity record
-   * @param evaluatedPrincipal   a String indicating the relevant principal
+   * @param resolvedPrincipal    a ResolvedKerberosPrincipal object to process
   * @param operationHandler     a KerberosOperationHandler used to perform Kerberos-related
   *                             tasks for specific Kerberos implementations
* (MIT, Active Directory, etc...)
@@ -145,7 +139,7 @@ public class CreateKeytabFilesServerAction extends 
KerberosServerAction {
* @throws AmbariException if an error occurs while processing the identity 
record
*/
   @Override
-  protected CommandReport processIdentity(Map<String, String> identityRecord, String evaluatedPrincipal,
+  protected CommandReport 
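
The Javadoc above describes the on-disk layout for generated keytabs: a per-host directory under the data directory, then a subdirectory named after the SHA1 hash of the keytab's destination path. A rough sketch of that path construction using commons-codec's DigestUtils (already imported by this class); the method and variable names here are illustrative, not the class's actual members:

import java.io.File;
import org.apache.commons.codec.digest.DigestUtils;

public class KeytabPathSketch {

  /** Builds data_directory/<host>/<sha1(destination path)>/<file name>, mirroring the Javadoc above. */
  static File stagedKeytabFile(File dataDirectory, String hostName, String keytabFilePath) {
    String sha1 = DigestUtils.sha1Hex(keytabFilePath);     // hash of the destination path
    String fileName = new File(keytabFilePath).getName();  // e.g. smokeuser.headless.keytab
    return new File(new File(new File(dataDirectory, hostName), sha1), fileName);
  }

  public static void main(String[] args) {
    System.out.println(stagedKeytabFile(
        new File("/var/lib/ambari-server/data/tmp"),
        "host1.example.com",
        "/etc/security/keytabs/smokeuser.headless.keytab"));
  }
}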

[5/5] ambari git commit: AMBARI-22530. Refactor internal code of handling info between kerberos wizard actions (echekanskiy)

2017-12-21 Thread echekanskiy
AMBARI-22530. Refactor internal code of handling info between kerberos wizard 
actions (echekanskiy)


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/67fc4a37
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/67fc4a37
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/67fc4a37

Branch: refs/heads/trunk
Commit: 67fc4a3785da0a7c39dcb27f220c8573a59ab63d
Parents: 81c0454
Author: root <r...@build.home.lan>
Authored: Thu Dec 21 10:58:23 2017 -0500
Committer: Eugene Chekanskiy <echekans...@hortonworks.com>
Committed: Thu Dec 21 11:00:37 2017 -0500

--
 .../ambari/server/agent/HeartBeatHandler.java   |  122 +-
 .../ambari/server/agent/HeartbeatProcessor.java |   33 +-
 .../controller/DeleteIdentityHandler.java   |5 +-
 .../server/controller/KerberosHelper.java   |2 +-
 .../server/controller/KerberosHelperImpl.java   | 1129 +-
 .../HostKerberosIdentityResourceProvider.java   |   15 +-
 .../server/orm/dao/KerberosKeytabDAO.java   |  154 ++-
 .../orm/dao/KerberosKeytabPrincipalDAO.java |  309 +
 .../server/orm/dao/KerberosPrincipalDAO.java|9 -
 .../orm/dao/KerberosPrincipalHostDAO.java   |  252 
 .../entities/HostGroupComponentEntityPK.java|4 +-
 .../orm/entities/KerberosKeytabEntity.java  |  152 ++-
 .../entities/KerberosKeytabPrincipalEntity.java |  236 
 .../KerberosKeytabServiceMappingEntity.java |   88 ++
 .../orm/entities/KerberosPrincipalEntity.java   |   25 -
 .../entities/KerberosPrincipalHostEntity.java   |  213 
 .../entities/KerberosPrincipalHostEntityPK.java |  115 --
 .../AbstractPrepareKerberosServerAction.java|   31 +-
 .../kerberos/CleanupServerAction.java   |6 +-
 .../ConfigureAmbariIdentitiesServerAction.java  |  141 ++-
 .../kerberos/CreateKeytabFilesServerAction.java |  112 +-
 .../kerberos/CreatePrincipalsServerAction.java  |   47 +-
 .../kerberos/DestroyPrincipalsServerAction.java |   62 +-
 .../kerberos/FinalizeKerberosServerAction.java  |   24 +-
 .../kerberos/KerberosServerAction.java  |  291 ++---
 .../PrepareEnableKerberosServerAction.java  |   16 +-
 .../PrepareKerberosIdentitiesServerAction.java  |9 -
 .../stageutils/KerberosKeytabController.java|  213 
 .../stageutils/ResolvedKerberosKeytab.java  |  117 +-
 .../stageutils/ResolvedKerberosPrincipal.java   |  169 +++
 .../upgrades/PreconfigureKerberosAction.java|   12 +-
 .../server/state/cluster/ClustersImpl.java  |8 +-
 .../main/resources/Ambari-DDL-Derby-CREATE.sql  |   34 +-
 .../main/resources/Ambari-DDL-MySQL-CREATE.sql  |   33 +-
 .../main/resources/Ambari-DDL-Oracle-CREATE.sql |   35 +-
 .../resources/Ambari-DDL-Postgres-CREATE.sql|   35 +-
 .../resources/Ambari-DDL-SQLAnywhere-CREATE.sql |   33 +-
 .../resources/Ambari-DDL-SQLServer-CREATE.sql   |   33 +-
 .../src/main/resources/META-INF/persistence.xml |3 +-
 .../server/agent/TestHeartbeatHandler.java  |   79 +-
 .../server/controller/KerberosHelperTest.java   |   47 +-
 ...ostKerberosIdentityResourceProviderTest.java |   15 +-
 .../apache/ambari/server/orm/db/DDLTests.java   |2 +-
 ...nfigureAmbariIdentitiesServerActionTest.java |   36 +-
 .../FinalizeKerberosServerActionTest.java   |5 +-
 .../kerberos/KerberosServerActionTest.java  |   26 +-
 .../PreconfigureKerberosActionTest.java |   16 +-
 47 files changed, 2618 insertions(+), 1935 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/ambari/blob/67fc4a37/ambari-server/src/main/java/org/apache/ambari/server/agent/HeartBeatHandler.java
--
diff --git 
a/ambari-server/src/main/java/org/apache/ambari/server/agent/HeartBeatHandler.java
 
b/ambari-server/src/main/java/org/apache/ambari/server/agent/HeartBeatHandler.java
index 53cceb0..2b82fe3 100644
--- 
a/ambari-server/src/main/java/org/apache/ambari/server/agent/HeartBeatHandler.java
+++ 
b/ambari-server/src/main/java/org/apache/ambari/server/agent/HeartBeatHandler.java
@@ -26,6 +26,7 @@ import java.io.File;
 import java.io.FileInputStream;
 import java.io.IOException;
 import java.util.ArrayList;
+import java.util.Collection;
 import java.util.HashMap;
 import java.util.List;
 import java.util.Map;
@@ -39,8 +40,10 @@ import org.apache.ambari.server.actionmanager.ActionManager;
 import org.apache.ambari.server.api.services.AmbariMetaInfo;
 import org.apache.ambari.server.configuration.Configuration;
 import org.apache.ambari.server.serveraction.kerberos.KerberosIdentityDataFileReader;
-import org.apache.ambari.server.serveraction.kerberos.KerberosIdentityDataFileReaderFactory;
 import org.apache.ambari.server.serveraction.kerberos.KerberosServerAct

[3/5] ambari git commit: AMBARI-22530. Refactor internal code of handling info between kerberos wizard actions (echekanskiy)

2017-12-21 Thread echekanskiy
http://git-wip-us.apache.org/repos/asf/ambari/blob/67fc4a37/ambari-server/src/main/java/org/apache/ambari/server/controller/internal/HostKerberosIdentityResourceProvider.java
--
diff --git 
a/ambari-server/src/main/java/org/apache/ambari/server/controller/internal/HostKerberosIdentityResourceProvider.java
 
b/ambari-server/src/main/java/org/apache/ambari/server/controller/internal/HostKerberosIdentityResourceProvider.java
index 52ab9b5..d90d5bf 100644
--- 
a/ambari-server/src/main/java/org/apache/ambari/server/controller/internal/HostKerberosIdentityResourceProvider.java
+++ 
b/ambari-server/src/main/java/org/apache/ambari/server/controller/internal/HostKerberosIdentityResourceProvider.java
@@ -36,9 +36,10 @@ import org.apache.ambari.server.controller.spi.Resource;
 import org.apache.ambari.server.controller.spi.SystemException;
 import org.apache.ambari.server.controller.spi.UnsupportedPropertyException;
 import org.apache.ambari.server.orm.dao.HostDAO;
+import org.apache.ambari.server.orm.dao.KerberosKeytabPrincipalDAO;
 import org.apache.ambari.server.orm.dao.KerberosPrincipalDAO;
-import org.apache.ambari.server.orm.dao.KerberosPrincipalHostDAO;
 import org.apache.ambari.server.orm.entities.HostEntity;
+import org.apache.ambari.server.orm.entities.KerberosKeytabPrincipalEntity;
 import org.apache.ambari.server.state.kerberos.KerberosIdentityDescriptor;
 import org.apache.ambari.server.state.kerberos.KerberosKeytabDescriptor;
 import org.apache.ambari.server.state.kerberos.KerberosPrincipalDescriptor;
@@ -101,12 +102,6 @@ public class HostKerberosIdentityResourceProvider extends 
ReadOnlyResourceProvid
   private KerberosHelper kerberosHelper;
 
   /**
-   * KerberosPrincipalHostDAO used to get Kerberos principal details
-   */
-  @Inject
-  private KerberosPrincipalHostDAO kerberosPrincipalHostDAO;
-
-  /**
* KerberosPrincipalDAO used to get Kerberos principal details
*/
   @Inject
@@ -118,6 +113,9 @@ public class HostKerberosIdentityResourceProvider extends 
ReadOnlyResourceProvid
   @Inject
   private HostDAO hostDAO;
 
+  @Inject
+  private KerberosKeytabPrincipalDAO kerberosKeytabPrincipalDAO;
+
   /**
* Create a  new resource provider for the given management controller.
*
@@ -200,7 +198,8 @@ public class HostKerberosIdentityResourceProvider extends 
ReadOnlyResourceProvid
 
         if ((hostId != null) && kerberosPrincipalDAO.exists(principal)) {
           if (keytabDescriptor != null) {
-            if (kerberosPrincipalHostDAO.exists(principal, hostId, keytabDescriptor.getFile())) {
+            KerberosKeytabPrincipalEntity entity = kerberosKeytabPrincipalDAO.findByNaturalKey(hostId, keytabDescriptor.getFile(), principal);
+if (entity != null && entity.isDistributed()) {
   installedStatus = "true";
 } else {
   installedStatus = "false";
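
The change above stops consulting the old principal-host table and instead looks up the keytab/principal mapping by its natural key (host, keytab file, principal), reporting the identity as installed only when that mapping exists and has been distributed. A small null-safe sketch of that check, with a hypothetical mapping class standing in for KerberosKeytabPrincipalEntity:

public class DistributedKeytabCheckSketch {

  /** Hypothetical stand-in for the keytab/principal mapping entity. */
  static class KeytabPrincipalMapping {
    private final boolean distributed;

    KeytabPrincipalMapping(boolean distributed) {
      this.distributed = distributed;
    }

    boolean isDistributed() {
      return distributed;
    }
  }

  /** "true" only when a mapping exists and its keytab was actually pushed to the host. */
  static String installedStatus(KeytabPrincipalMapping mapping) {
    return (mapping != null && mapping.isDistributed()) ? "true" : "false";
  }

  public static void main(String[] args) {
    System.out.println(installedStatus(null));                              // false
    System.out.println(installedStatus(new KeytabPrincipalMapping(true)));  // true
  }
}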

http://git-wip-us.apache.org/repos/asf/ambari/blob/67fc4a37/ambari-server/src/main/java/org/apache/ambari/server/orm/dao/KerberosKeytabDAO.java
--
diff --git 
a/ambari-server/src/main/java/org/apache/ambari/server/orm/dao/KerberosKeytabDAO.java
 
b/ambari-server/src/main/java/org/apache/ambari/server/orm/dao/KerberosKeytabDAO.java
index a8723b7..ca7d23c 100644
--- 
a/ambari-server/src/main/java/org/apache/ambari/server/orm/dao/KerberosKeytabDAO.java
+++ 
b/ambari-server/src/main/java/org/apache/ambari/server/orm/dao/KerberosKeytabDAO.java
@@ -18,14 +18,13 @@
 
 package org.apache.ambari.server.orm.dao;
 
-import java.util.Collection;
+import java.util.Collections;
 import java.util.List;
 
 import javax.persistence.EntityManager;
 import javax.persistence.TypedQuery;
 
 import org.apache.ambari.server.orm.RequiresSession;
-import org.apache.ambari.server.orm.entities.HostEntity;
 import org.apache.ambari.server.orm.entities.KerberosKeytabEntity;
 
 import com.google.inject.Inject;
@@ -35,76 +34,103 @@ import com.google.inject.persist.Transactional;
 
 @Singleton
 public class KerberosKeytabDAO {
-@Inject
-Provider<EntityManager> entityManagerProvider;
-
-@Transactional
-public void create(KerberosKeytabEntity kerberosKeytabEntity) {
-entityManagerProvider.get().persist(kerberosKeytabEntity);
-}
-
-public void create(String keytabPath) {
-create(new KerberosKeytabEntity(keytabPath));
-}
-
-@Transactional
-public KerberosKeytabEntity merge(KerberosKeytabEntity kerberosKeytabEntity) {
-return entityManagerProvider.get().merge(kerberosKeytabEntity);
-}
-
-@Transactional
-public void remove(KerberosKeytabEntity kerberosKeytabEntity) {
-entityManagerProvider.get().remove(merge(kerberosKeytabEntity));
-}
-
-public void remove(String keytabPath) {
-KerberosKeytabEntity kke = 
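
The DAO shown above (before the rework) uses the usual Guice-persist shape: an injected Provider<EntityManager> plus @Transactional wrappers around persist, merge and remove. A stripped-down sketch of that shape with a hypothetical entity, not the actual KerberosKeytabDAO:

import javax.persistence.Entity;
import javax.persistence.EntityManager;
import javax.persistence.Id;

import com.google.inject.Inject;
import com.google.inject.Provider;
import com.google.inject.Singleton;
import com.google.inject.persist.Transactional;

/** Hypothetical entity, present only to keep the sketch self-contained. */
@Entity
class ExampleEntity {
  @Id
  private Long id;
}

@Singleton
public class ExampleEntityDAO {

  @Inject
  private Provider<EntityManager> entityManagerProvider;

  @Transactional
  public void create(ExampleEntity entity) {
    // persist a brand-new row
    entityManagerProvider.get().persist(entity);
  }

  @Transactional
  public ExampleEntity merge(ExampleEntity entity) {
    // reattach a detached instance and return the managed copy
    return entityManagerProvider.get().merge(entity);
  }

  @Transactional
  public void remove(ExampleEntity entity) {
    // merge first so a detached instance can be removed safely
    entityManagerProvider.get().remove(merge(entity));
  }
}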

[4/5] ambari git commit: AMBARI-22530. Refactor internal code of handling info between kerberos wizard actions (echekanskiy)

2017-12-21 Thread echekanskiy
http://git-wip-us.apache.org/repos/asf/ambari/blob/67fc4a37/ambari-server/src/main/java/org/apache/ambari/server/controller/KerberosHelperImpl.java
--
diff --git 
a/ambari-server/src/main/java/org/apache/ambari/server/controller/KerberosHelperImpl.java
 
b/ambari-server/src/main/java/org/apache/ambari/server/controller/KerberosHelperImpl.java
index ab85aa1..c7b69f0 100644
--- 
a/ambari-server/src/main/java/org/apache/ambari/server/controller/KerberosHelperImpl.java
+++ 
b/ambari-server/src/main/java/org/apache/ambari/server/controller/KerberosHelperImpl.java
@@ -60,11 +60,14 @@ import 
org.apache.ambari.server.controller.internal.RequestStageContainer;
 import org.apache.ambari.server.controller.utilities.KerberosChecker;
 import org.apache.ambari.server.metadata.RoleCommandOrder;
 import org.apache.ambari.server.orm.dao.ArtifactDAO;
+import org.apache.ambari.server.orm.dao.HostDAO;
 import org.apache.ambari.server.orm.dao.KerberosKeytabDAO;
+import org.apache.ambari.server.orm.dao.KerberosKeytabPrincipalDAO;
 import org.apache.ambari.server.orm.dao.KerberosPrincipalDAO;
-import org.apache.ambari.server.orm.dao.KerberosPrincipalHostDAO;
 import org.apache.ambari.server.orm.entities.ArtifactEntity;
+import org.apache.ambari.server.orm.entities.HostEntity;
 import org.apache.ambari.server.orm.entities.KerberosKeytabEntity;
+import org.apache.ambari.server.orm.entities.KerberosKeytabPrincipalEntity;
 import org.apache.ambari.server.security.credential.Credential;
 import org.apache.ambari.server.security.credential.PrincipalKeyCredential;
 import org.apache.ambari.server.security.encryption.CredentialStoreService;
@@ -79,7 +82,6 @@ import org.apache.ambari.server.serveraction.kerberos.FinalizeKerberosServerActi
 import org.apache.ambari.server.serveraction.kerberos.KDCType;
 import org.apache.ambari.server.serveraction.kerberos.KerberosAdminAuthenticationException;
 import org.apache.ambari.server.serveraction.kerberos.KerberosIdentityDataFileWriter;
-import org.apache.ambari.server.serveraction.kerberos.KerberosIdentityDataFileWriterFactory;
 import org.apache.ambari.server.serveraction.kerberos.KerberosInvalidConfigurationException;
 import org.apache.ambari.server.serveraction.kerberos.KerberosKDCConnectionException;
 import org.apache.ambari.server.serveraction.kerberos.KerberosKDCSSLConnectionException;
@@ -95,6 +97,7 @@ import org.apache.ambari.server.serveraction.kerberos.PrepareEnableKerberosServe
 import org.apache.ambari.server.serveraction.kerberos.PrepareKerberosIdentitiesServerAction;
 import org.apache.ambari.server.serveraction.kerberos.UpdateKerberosConfigsServerAction;
 import org.apache.ambari.server.serveraction.kerberos.stageutils.ResolvedKerberosKeytab;
+import org.apache.ambari.server.serveraction.kerberos.stageutils.ResolvedKerberosPrincipal;
 import org.apache.ambari.server.stageplanner.RoleGraph;
 import org.apache.ambari.server.stageplanner.RoleGraphFactory;
 import org.apache.ambari.server.state.Cluster;
@@ -129,14 +132,17 @@ import org.apache.ambari.server.utils.StageUtils;
 import org.apache.commons.collections.CollectionUtils;
 import org.apache.commons.io.FileUtils;
 import org.apache.commons.lang.StringUtils;
-import org.apache.commons.lang3.tuple.Pair;
 import org.apache.directory.server.kerberos.shared.keytab.Keytab;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
 
 import com.google.common.collect.ImmutableMap;
 import com.google.common.collect.ImmutableSet;
+import com.google.common.collect.Lists;
 import com.google.common.collect.Sets;
+import com.google.gson.JsonArray;
+import com.google.gson.JsonObject;
+import com.google.gson.JsonPrimitive;
 import com.google.inject.Inject;
 import com.google.inject.Injector;
 import com.google.inject.Singleton;
@@ -200,19 +206,19 @@ public class KerberosHelperImpl implements KerberosHelper 
{
   private KerberosDescriptorFactory kerberosDescriptorFactory;
 
   @Inject
-  private KerberosIdentityDataFileWriterFactory 
kerberosIdentityDataFileWriterFactory;
+  private ArtifactDAO artifactDAO;
 
   @Inject
   private KerberosPrincipalDAO kerberosPrincipalDAO;
 
   @Inject
-  private ArtifactDAO artifactDAO;
+  private KerberosKeytabDAO kerberosKeytabDAO;
 
   @Inject
-  private KerberosKeytabDAO kerberosKeytabDAO;
+  private KerberosKeytabPrincipalDAO kerberosKeytabPrincipalDAO;
 
   @Inject
-  KerberosPrincipalHostDAO kerberosPrincipalHostDAO;
+  private HostDAO hostDAO;
 
   /**
   * The injector used to create new instances of helper classes like CreatePrincipalsServerAction
@@ -234,7 +240,7 @@ public class KerberosHelperImpl implements KerberosHelper {
   public RequestStageContainer toggleKerberos(Cluster cluster, SecurityType securityType,
                                               RequestStageContainer requestStageContainer,
                                               Boolean manageIdentities)
-  throws AmbariException, 

[1/5] ambari git commit: AMBARI-22530. Refactor internal code of handling info between kerberos wizard actions (echekanskiy)

2017-12-21 Thread echekanskiy
Repository: ambari
Updated Branches:
  refs/heads/trunk 81c045452 -> 67fc4a378


http://git-wip-us.apache.org/repos/asf/ambari/blob/67fc4a37/ambari-server/src/main/java/org/apache/ambari/server/serveraction/upgrades/PreconfigureKerberosAction.java
--
diff --git 
a/ambari-server/src/main/java/org/apache/ambari/server/serveraction/upgrades/PreconfigureKerberosAction.java
 
b/ambari-server/src/main/java/org/apache/ambari/server/serveraction/upgrades/PreconfigureKerberosAction.java
index ca78dbb..94a6a49 100644
--- 
a/ambari-server/src/main/java/org/apache/ambari/server/serveraction/upgrades/PreconfigureKerberosAction.java
+++ 
b/ambari-server/src/main/java/org/apache/ambari/server/serveraction/upgrades/PreconfigureKerberosAction.java
@@ -38,10 +38,11 @@ import 
org.apache.ambari.server.actionmanager.HostRoleStatus;
 import org.apache.ambari.server.agent.CommandReport;
 import org.apache.ambari.server.controller.AmbariManagementController;
 import org.apache.ambari.server.controller.KerberosHelper;
+import org.apache.ambari.server.controller.RootComponent;
+import org.apache.ambari.server.controller.RootService;
 import org.apache.ambari.server.orm.dao.HostDAO;
 import org.apache.ambari.server.orm.dao.KerberosKeytabDAO;
 import org.apache.ambari.server.orm.dao.KerberosPrincipalDAO;
-import org.apache.ambari.server.orm.dao.KerberosPrincipalHostDAO;
 import org.apache.ambari.server.orm.entities.HostEntity;
 import org.apache.ambari.server.orm.entities.RepositoryVersionEntity;
 import org.apache.ambari.server.serveraction.kerberos.PreconfigureServiceType;
@@ -96,9 +97,6 @@ public class PreconfigureKerberosAction extends 
AbstractUpgradeServerAction {
   private KerberosKeytabDAO kerberosKeytabDAO;
 
   @Inject
-  KerberosPrincipalHostDAO kerberosPrincipalHostDAO;
-
-  @Inject
   KerberosPrincipalDAO kerberosPrincipalDAO;
 
   @Override
@@ -376,11 +374,11 @@ public class PreconfigureKerberosAction extends 
AbstractUpgradeServerAction {
 // component.
     String componentName = KerberosHelper.AMBARI_SERVER_KERBEROS_IDENTITY_NAME.equals(identity.getName())
         ? "AMBARI_SERVER_SELF"
-        : "AMBARI_SERVER";
+        : RootComponent.AMBARI_SERVER.name();

     List<KerberosIdentityDescriptor> componentIdentities = Collections.singletonList(identity);
     kerberosHelper.addIdentities(null, componentIdentities,
-        null, KerberosHelper.AMBARI_SERVER_HOST_NAME, ambariServerHostID(), "AMBARI", componentName, kerberosConfigurations, currentConfigurations,
+        null, KerberosHelper.AMBARI_SERVER_HOST_NAME, ambariServerHostID(), RootService.AMBARI.name(), componentName, kerberosConfigurations, currentConfigurations,
 resolvedKeytabs, realm);
 propertiesToIgnore = gatherPropertiesToIgnore(componentIdentities, 
propertiesToIgnore);
   }
@@ -392,7 +390,7 @@ public class PreconfigureKerberosAction extends 
AbstractUpgradeServerAction {
 
 // create database records for keytabs that must be presented on 
cluster
 for (ResolvedKerberosKeytab keytab : resolvedKeytabs.values()) {
-  kerberosHelper.processResolvedKeytab(keytab);
+  kerberosHelper.createResolvedKeytab(keytab);
 }
   } catch (IOException e) {
 throw new AmbariException(e.getMessage(), e);

http://git-wip-us.apache.org/repos/asf/ambari/blob/67fc4a37/ambari-server/src/main/java/org/apache/ambari/server/state/cluster/ClustersImpl.java
--
diff --git 
a/ambari-server/src/main/java/org/apache/ambari/server/state/cluster/ClustersImpl.java
 
b/ambari-server/src/main/java/org/apache/ambari/server/state/cluster/ClustersImpl.java
index 5ac1ac3..385a276 100644
--- 
a/ambari-server/src/main/java/org/apache/ambari/server/state/cluster/ClustersImpl.java
+++ 
b/ambari-server/src/main/java/org/apache/ambari/server/state/cluster/ClustersImpl.java
@@ -46,7 +46,7 @@ import org.apache.ambari.server.orm.dao.HostConfigMappingDAO;
 import org.apache.ambari.server.orm.dao.HostDAO;
 import org.apache.ambari.server.orm.dao.HostStateDAO;
 import org.apache.ambari.server.orm.dao.HostVersionDAO;
-import org.apache.ambari.server.orm.dao.KerberosPrincipalHostDAO;
+import org.apache.ambari.server.orm.dao.KerberosKeytabPrincipalDAO;
 import org.apache.ambari.server.orm.dao.RequestOperationLevelDAO;
 import org.apache.ambari.server.orm.dao.ResourceTypeDAO;
 import org.apache.ambari.server.orm.dao.ServiceConfigDAO;
@@ -112,8 +112,6 @@ public class ClustersImpl implements Clusters {
   @Inject
   private RequestOperationLevelDAO requestOperationLevelDAO;
   @Inject
-  private KerberosPrincipalHostDAO kerberosPrincipalHostDAO;
-  @Inject
   private HostConfigMappingDAO hostConfigMappingDAO;
   @Inject
   private ServiceConfigDAO serviceConfigDAO;
@@ -129,6 +127,8 @@ public class ClustersImpl implements Clusters 

ambari git commit: AMBARI-22390. Implement many-to-many relation between keytabs and principals (echekanskiy)

2017-11-13 Thread echekanskiy
Repository: ambari
Updated Branches:
  refs/heads/trunk 6e706d427 -> 18e549034


AMBARI-22390. Implement many-to-many relation between keytabs and principals 
(echekanskiy)


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/18e54903
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/18e54903
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/18e54903

Branch: refs/heads/trunk
Commit: 18e549034e893aac4ade80e49bda70604d5147f3
Parents: 6e706d4
Author: Eugene Chekanskiy <echekans...@hortonworks.com>
Authored: Mon Nov 13 17:01:16 2017 +0200
Committer: Eugene Chekanskiy <echekans...@hortonworks.com>
Committed: Mon Nov 13 17:01:29 2017 +0200

--
 .../server/controller/KerberosHelperImpl.java   | 59 ++--
 .../AbstractPrepareKerberosServerAction.java| 23 +++-
 .../kerberos/CreatePrincipalsServerAction.java  |  6 +-
 .../kerberos/KerberosServerAction.java  | 21 +++
 .../stageutils/ResolvedKerberosKeytab.java  | 16 +++---
 5 files changed, 82 insertions(+), 43 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/ambari/blob/18e54903/ambari-server/src/main/java/org/apache/ambari/server/controller/KerberosHelperImpl.java
--
diff --git 
a/ambari-server/src/main/java/org/apache/ambari/server/controller/KerberosHelperImpl.java
 
b/ambari-server/src/main/java/org/apache/ambari/server/controller/KerberosHelperImpl.java
index f913831..474c335 100644
--- 
a/ambari-server/src/main/java/org/apache/ambari/server/controller/KerberosHelperImpl.java
+++ 
b/ambari-server/src/main/java/org/apache/ambari/server/controller/KerberosHelperImpl.java
@@ -1546,19 +1546,21 @@ public class KerberosHelperImpl implements 
KerberosHelper {
 keytabFileOwnerAccess,
 keytabFileGroupName,
 keytabFileGroupAccess,
-Sets.newHashSet(Pair.of(hostId, evaluatedPrincipal)),
+Sets.newHashSet(Pair.of(hostId, Pair.of(evaluatedPrincipal, 
principalType))),
 serviceName.equalsIgnoreCase("AMBARI"),
 componentName.equalsIgnoreCase("AMBARI_SERVER_SELF")
 );
 if (resolvedKeytabs.containsKey(keytabFilePath)) {
   ResolvedKerberosKeytab sameKeytab = 
resolvedKeytabs.get(keytabFilePath);
   // validating owner and group
-          String warnTemplate = "Keytab '{}' on host '{}' have different {}, originally set to '{}' and '{}:{}' has '{}', using '{}'";
+          boolean differentOwners = false;
+          String warnTemplate = "Keytab '{}' on host '{}' has different {}, originally set to '{}' and '{}:{}' has '{}', using '{}'";
   if 
(!resolvedKeytab.getOwnerName().equals(sameKeytab.getOwnerName())) {
 LOG.warn(warnTemplate,
 keytabFilePath, hostname, "owners", 
sameKeytab.getOwnerName(),
 serviceName, componentName, resolvedKeytab.getOwnerName(),
 sameKeytab.getOwnerName());
+differentOwners = true;
   }
   if 
(!resolvedKeytab.getOwnerAccess().equals(sameKeytab.getOwnerAccess())) {
 LOG.warn(warnTemplate,
@@ -1570,16 +1572,39 @@ public class KerberosHelperImpl implements 
KerberosHelper {
   // TODO with different owners, so make sure that keytabs are 
accessible through group acls
   // TODO this includes same group name and group 'r' mode
   if 
(!resolvedKeytab.getGroupName().equals(sameKeytab.getGroupName())) {
-LOG.warn(warnTemplate,
-keytabFilePath, hostname, "groups", 
sameKeytab.getGroupName(),
-serviceName, componentName, resolvedKeytab.getGroupName(),
-sameKeytab.getGroupName());
+if(differentOwners) {
+  LOG.error(warnTemplate,
+  keytabFilePath, hostname, "groups", 
sameKeytab.getGroupName(),
+  serviceName, componentName, 
resolvedKeytab.getGroupName(),
+  sameKeytab.getGroupName());
+} else {
+  LOG.warn(warnTemplate,
+  keytabFilePath, hostname, "groups", 
sameKeytab.getGroupName(),
+  serviceName, componentName, 
resolvedKeytab.getGroupName(),
+  sameKeytab.getGroupName());
+}
   }
   if 
(!resolvedKeytab.getGroupAccess().equals(sameKeytab.getGroupAccess())) {
-LOG.warn(warnTemplate,
-keytabFilePath, hostname, "group access", 
sam
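
The hunk above tracks whether the two definitions of the same keytab disagree on the owner, and then escalates a group mismatch from a warning to an error, since differing owners plus differing groups can leave one service unable to read the shared file. A compact sketch of that escalation with SLF4J; the class and parameter names are illustrative:

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class KeytabConflictLogSketch {

  private static final Logger LOG = LoggerFactory.getLogger(KeytabConflictLogSketch.class);

  /** Group mismatch is an ERROR when the owners already differ, otherwise a WARN. */
  static void logGroupMismatch(boolean differentOwners, String keytabPath, String host,
                               String existingGroup, String requestedGroup) {
    String template = "Keytab '{}' on host '{}' has different groups: '{}' vs '{}', using '{}'";
    if (differentOwners) {
      LOG.error(template, keytabPath, host, existingGroup, requestedGroup, existingGroup);
    } else {
      LOG.warn(template, keytabPath, host, existingGroup, requestedGroup, existingGroup);
    }
  }

  public static void main(String[] args) {
    logGroupMismatch(true, "/etc/security/keytabs/service.keytab", "host1", "hadoop", "hive");
  }
}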

ambari git commit: AMBARI-22415. Blueprint deploys failing with missing smoke user keytab file (echekanskiy)

2017-11-10 Thread echekanskiy
Repository: ambari
Updated Branches:
  refs/heads/trunk 22b2d55f6 -> ec02a14c0


AMBARI-22415. Blueprint deploys failing with missing smoke user keytab file 
(echekanskiy)


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/ec02a14c
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/ec02a14c
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/ec02a14c

Branch: refs/heads/trunk
Commit: ec02a14c02529ec7fec647e6fed7c8c401f10e6d
Parents: 22b2d55
Author: Eugene Chekanskiy <echekans...@gmail.com>
Authored: Fri Nov 10 17:17:34 2017 +0200
Committer: Eugene Chekanskiy <echekans...@gmail.com>
Committed: Fri Nov 10 18:29:43 2017 +0200

--
 .../kerberos/CreateKeytabFilesServerAction.java | 9 +
 1 file changed, 5 insertions(+), 4 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/ambari/blob/ec02a14c/ambari-server/src/main/java/org/apache/ambari/server/serveraction/kerberos/CreateKeytabFilesServerAction.java
--
diff --git 
a/ambari-server/src/main/java/org/apache/ambari/server/serveraction/kerberos/CreateKeytabFilesServerAction.java
 
b/ambari-server/src/main/java/org/apache/ambari/server/serveraction/kerberos/CreateKeytabFilesServerAction.java
index aa65e61..5ec4c10 100644
--- 
a/ambari-server/src/main/java/org/apache/ambari/server/serveraction/kerberos/CreateKeytabFilesServerAction.java
+++ 
b/ambari-server/src/main/java/org/apache/ambari/server/serveraction/kerberos/CreateKeytabFilesServerAction.java
@@ -218,17 +218,18 @@ public class CreateKeytabFilesServerAction extends 
KerberosServerAction {
 }
 
     boolean regenerateKeytabs = getOperationType(getCommandParameters()) == OperationType.RECREATE_ALL;
+
+    KerberosPrincipalEntity principalEntity = kerberosPrincipalDAO.find(evaluatedPrincipal);
+    String cachedKeytabPath = (principalEntity == null) ? null : principalEntity.getCachedKeytabPath();
+
     if (password == null) {
       if (!regenerateKeytabs && (hostName.equalsIgnoreCase(KerberosHelper.AMBARI_SERVER_HOST_NAME) || kerberosPrincipalHostDAO
-          .exists(evaluatedPrincipal, hostEntity.getHostId(), keytabFilePath))) {
+          .exists(evaluatedPrincipal, hostEntity.getHostId(), keytabFilePath)) && cachedKeytabPath == null) {
         // There is nothing to do for this since it must already exist and we don't want to
         // regenerate the keytab
         message = String.format("Skipping keytab file for %s, missing password indicates nothing to do", evaluatedPrincipal);
         LOG.debug(message);
       } else {
-        KerberosPrincipalEntity principalEntity = kerberosPrincipalDAO.find(evaluatedPrincipal);
-        String cachedKeytabPath = (principalEntity == null) ? null : principalEntity.getCachedKeytabPath();
-
         if (cachedKeytabPath == null) {
           message = String.format("Failed to create keytab for %s, missing cached file", evaluatedPrincipal);
   actionLog.writeStdErr(message);
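
The fix above narrows the "nothing to do" shortcut: with no password available, an identity is skipped only when its keytab is already in place and there is no cached keytab; if a cached copy exists it is reused, which is what the failing blueprint deploys needed for the smoke-user keytab. A hedged sketch of that decision, with the enum and parameter names simplified from the diff:

public class CachedKeytabDecisionSketch {

  enum Action { SKIP, USE_CACHED_KEYTAB, FAIL_MISSING_CACHE, CREATE_KEYTAB }

  /**
   * Mirrors the condition in the hunk above: with no password, skip only when the
   * keytab is already distributed (or this is the Ambari server host) AND no cached
   * keytab exists; otherwise fall back to the cached copy, failing if there is none.
   */
  static Action decide(boolean passwordAvailable, boolean regenerateKeytabs,
                       boolean keytabAlreadyDistributed, String cachedKeytabPath) {
    if (passwordAvailable) {
      return Action.CREATE_KEYTAB;
    }
    if (!regenerateKeytabs && keytabAlreadyDistributed && cachedKeytabPath == null) {
      return Action.SKIP;
    }
    return (cachedKeytabPath == null) ? Action.FAIL_MISSING_CACHE : Action.USE_CACHED_KEYTAB;
  }

  public static void main(String[] args) {
    // A distributed keytab with a cached copy is no longer skipped; the cache is used.
    System.out.println(decide(false, false, true, "/var/lib/ambari-server/data/cache/smoke.keytab"));
  }
}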



[1/2] ambari git commit: AMBARI-22278. Improve Kerberos principal and keytab accounting (echekanskiy)

2017-11-01 Thread echekanskiy
Repository: ambari
Updated Branches:
  refs/heads/trunk 8a8d48fde -> d03c24b9f


http://git-wip-us.apache.org/repos/asf/ambari/blob/d03c24b9/ambari-server/src/main/java/org/apache/ambari/server/serveraction/kerberos/KerberosServerAction.java
--
diff --git 
a/ambari-server/src/main/java/org/apache/ambari/server/serveraction/kerberos/KerberosServerAction.java
 
b/ambari-server/src/main/java/org/apache/ambari/server/serveraction/kerberos/KerberosServerAction.java
index 1b0f4fb..3491f18 100644
--- 
a/ambari-server/src/main/java/org/apache/ambari/server/serveraction/kerberos/KerberosServerAction.java
+++ 
b/ambari-server/src/main/java/org/apache/ambari/server/serveraction/kerberos/KerberosServerAction.java
@@ -30,6 +30,8 @@ import org.apache.ambari.server.actionmanager.HostRoleStatus;
 import org.apache.ambari.server.agent.CommandReport;
 import org.apache.ambari.server.agent.ExecutionCommand;
 import org.apache.ambari.server.controller.KerberosHelper;
+import org.apache.ambari.server.orm.dao.HostDAO;
+import org.apache.ambari.server.orm.entities.HostEntity;
 import org.apache.ambari.server.security.credential.PrincipalKeyCredential;
 import org.apache.ambari.server.serveraction.AbstractServerAction;
 import org.apache.ambari.server.state.Cluster;
@@ -171,6 +173,8 @@ public abstract class KerberosServerAction extends 
AbstractServerAction {
   @Inject
   private KerberosHelper kerberosHelper;
 
+  @Inject
+  HostDAO hostDAO;
   /**
* Given a (command parameter) Map and a property name, attempts to safely 
retrieve the requested
* data.
@@ -543,21 +547,8 @@ public abstract class KerberosServerAction extends 
AbstractServerAction {
 
 if (record != null) {
   String principal = record.get(KerberosIdentityDataFileReader.PRINCIPAL);
-
   if (principal != null) {
-        String hostname = record.get(KerberosIdentityDataFileReader.HOSTNAME);
-
-        if(KerberosHelper.AMBARI_SERVER_HOST_NAME.equals(hostname)) {
-          // Replace KerberosHelper.AMBARI_SERVER_HOST_NAME with the actual hostname where the Ambari
-          // server is... this host
-          hostname = StageUtils.getHostName();
-        }
-
-        // Evaluate the principal "pattern" found in the record to generate the "evaluated principal"
-        // by replacing the _HOST and _REALM variables.
-        String evaluatedPrincipal = principal.replace("_HOST", hostname).replace("_REALM", defaultRealm);
-
-        commandReport = processIdentity(record, evaluatedPrincipal, operationHandler, kerberosConfiguration, requestSharedDataContext);
+        commandReport = processIdentity(record, principal, operationHandler, kerberosConfiguration, requestSharedDataContext);
   }
 }
 
@@ -588,6 +579,14 @@ public abstract class KerberosServerAction extends 
AbstractServerAction {
 }
   }
 
+  protected Long ambariServerHostID(){
+    String ambariServerHostName = StageUtils.getHostName();
+    HostEntity ambariServerHostEntity = hostDAO.findByName(ambariServerHostName);
+    return (ambariServerHostEntity == null)
+        ? null
+        : ambariServerHostEntity.getHostId();
+  }
+
   /**
* A Kerberos operation type
* 
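
The lines removed in the hunk above evaluated the principal "pattern" from the identity record, substituting _HOST and _REALM before calling processIdentity; after this change the raw principal is passed through and the substitution happens later in the pipeline. A small sketch of the substitution itself; the placeholder value and method names are illustrative:

public class PrincipalEvaluationSketch {

  /** Illustrative placeholder meaning "the Ambari server host" in identity records. */
  static final String AMBARI_SERVER_HOST_NAME = "ambari_server";

  /** Replaces the _HOST and _REALM variables in a principal pattern. */
  static String evaluatePrincipal(String principalPattern, String recordHostName,
                                  String ambariServerHostName, String defaultRealm) {
    String hostname = AMBARI_SERVER_HOST_NAME.equals(recordHostName)
        ? ambariServerHostName
        : recordHostName;
    return principalPattern.replace("_HOST", hostname).replace("_REALM", defaultRealm);
  }

  public static void main(String[] args) {
    System.out.println(evaluatePrincipal("nn/_HOST@_REALM", "nn-host.example.com",
        "ambari.example.com", "EXAMPLE.COM"));  // nn/nn-host.example.com@EXAMPLE.COM
  }
}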

http://git-wip-us.apache.org/repos/asf/ambari/blob/d03c24b9/ambari-server/src/main/java/org/apache/ambari/server/serveraction/kerberos/PrepareDisableKerberosServerAction.java
--
diff --git 
a/ambari-server/src/main/java/org/apache/ambari/server/serveraction/kerberos/PrepareDisableKerberosServerAction.java
 
b/ambari-server/src/main/java/org/apache/ambari/server/serveraction/kerberos/PrepareDisableKerberosServerAction.java
index e1f8419..b9381b4 100644
--- 
a/ambari-server/src/main/java/org/apache/ambari/server/serveraction/kerberos/PrepareDisableKerberosServerAction.java
+++ 
b/ambari-server/src/main/java/org/apache/ambari/server/serveraction/kerberos/PrepareDisableKerberosServerAction.java
@@ -107,7 +107,7 @@ public class PrepareDisableKerberosServerAction extends 
AbstractPrepareKerberosS
     Map<String, Map<String, String>> configurations = kerberosHelper.calculateConfigurations(cluster, null, kerberosDescriptor, false, false);
 
 processServiceComponentHosts(cluster, kerberosDescriptor, schToProcess, 
identityFilter, dataDirectory,
-configurations, kerberosConfigurations, includeAmbariIdentity, 
propertiesToIgnore, false);
+configurations, kerberosConfigurations, includeAmbariIdentity, 
propertiesToIgnore);
 
 // Add auth-to-local configurations to the set of changes
 Map authToLocalProperties = 
kerberosHelper.translateConfigurationSpecifications(kerberosDescriptor.getAllAuthToLocalProperties());

http://git-wip-us.apache.org/repos/asf/ambari/blob/d03c24b9/ambari-server/src/main/java/org/apache/ambari/server/serveraction/kerberos/PrepareEnableKerberosServerAction.java

[2/2] ambari git commit: AMBARI-22278. Improve Kerberos principal and keytab accounting (echekanskiy)

2017-11-01 Thread echekanskiy
AMBARI-22278. Improve Kerberos principal and keytab accounting (echekanskiy)


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/d03c24b9
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/d03c24b9
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/d03c24b9

Branch: refs/heads/trunk
Commit: d03c24b9f21cd91af78e0bfca2a1ce1e0de51bdf
Parents: 8a8d48f
Author: Eugene Chekanskiy <echekans...@hortonworks.com>
Authored: Thu Oct 12 13:22:15 2017 +0300
Committer: Eugene Chekanskiy <echekans...@hortonworks.com>
Committed: Wed Nov 1 17:52:04 2017 +0200

--
 .../ambari/server/agent/HeartbeatProcessor.java |  72 --
 .../controller/DeleteIdentityHandler.java   |   3 +-
 .../server/controller/KerberosHelper.java   |  21 +-
 .../server/controller/KerberosHelperImpl.java   | 138 +-
 .../HostKerberosIdentityResourceProvider.java   |  16 +-
 .../ServiceComponentUninstalledEvent.java   |  11 +-
 .../server/orm/dao/KerberosKeytabDAO.java   | 110 
 .../server/orm/dao/KerberosPrincipalDAO.java|   7 +
 .../orm/dao/KerberosPrincipalHostDAO.java   |  40 ++-
 .../orm/entities/KerberosKeytabEntity.java  |  86 +++
 .../entities/KerberosPrincipalHostEntity.java   |  57 +++-
 .../entities/KerberosPrincipalHostEntityPK.java |  19 +-
 .../AbstractPrepareKerberosServerAction.java|  29 ++-
 .../kerberos/CleanupServerAction.java   |  14 +-
 .../server/serveraction/kerberos/Component.java |  13 +-
 .../ConfigureAmbariIdentitiesServerAction.java  |  31 ++-
 .../kerberos/CreateKeytabFilesServerAction.java |  64 +++--
 .../kerberos/CreatePrincipalsServerAction.java  |  48 ++--
 .../kerberos/KerberosIdentityDataFile.java  |   2 -
 .../KerberosIdentityDataFileWriter.java |   9 +-
 .../kerberos/KerberosServerAction.java  |  27 +-
 .../PrepareDisableKerberosServerAction.java |   2 +-
 .../PrepareEnableKerberosServerAction.java  |   2 +-
 .../PrepareKerberosIdentitiesServerAction.java  |   3 +-
 .../stageutils/ResolvedKerberosKeytab.java  | 257 +++
 .../upgrades/PreconfigureKerberosAction.java|  48 +++-
 .../apache/ambari/server/state/ServiceImpl.java |   2 +-
 .../svccomphost/ServiceComponentHostImpl.java   |   2 +-
 .../main/resources/Ambari-DDL-Derby-CREATE.sql  |  15 +-
 .../main/resources/Ambari-DDL-MySQL-CREATE.sql  |  13 +-
 .../main/resources/Ambari-DDL-Oracle-CREATE.sql |  13 +-
 .../resources/Ambari-DDL-Postgres-CREATE.sql|  11 +-
 .../resources/Ambari-DDL-SQLAnywhere-CREATE.sql |  13 +-
 .../resources/Ambari-DDL-SQLServer-CREATE.sql   |  13 +-
 .../src/main/resources/META-INF/persistence.xml |   1 +
 .../package/scripts/kerberos_common.py  |   7 +-
 .../package/scripts/kerberos_common.py  |   7 +-
 .../server/agent/TestHeartbeatHandler.java  |   2 +-
 .../server/controller/KerberosHelperTest.java   |   6 +
 ...ostKerberosIdentityResourceProviderTest.java |  12 +-
 .../utilities/KerberosIdentityCleanerTest.java  |  10 +-
 .../HostVersionOutOfSyncListenerTest.java   |   2 +-
 ...AbstractPrepareKerberosServerActionTest.java |  11 +-
 ...nfigureAmbariIdentitiesServerActionTest.java |  11 +-
 .../FinalizeKerberosServerActionTest.java   |   5 +
 .../kerberos/KerberosIdentityDataFileTest.java  |   8 +-
 .../kerberos/KerberosServerActionTest.java  |  10 +-
 .../PreconfigureKerberosActionTest.java |  10 +
 48 files changed, 1097 insertions(+), 216 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/ambari/blob/d03c24b9/ambari-server/src/main/java/org/apache/ambari/server/agent/HeartbeatProcessor.java
--
diff --git 
a/ambari-server/src/main/java/org/apache/ambari/server/agent/HeartbeatProcessor.java
 
b/ambari-server/src/main/java/org/apache/ambari/server/agent/HeartbeatProcessor.java
index 2690008..83d2c98 100644
--- 
a/ambari-server/src/main/java/org/apache/ambari/server/agent/HeartbeatProcessor.java
+++ 
b/ambari-server/src/main/java/org/apache/ambari/server/agent/HeartbeatProcessor.java
@@ -52,7 +52,9 @@ import 
org.apache.ambari.server.events.publishers.AlertEventPublisher;
 import org.apache.ambari.server.events.publishers.AmbariEventPublisher;
 import org.apache.ambari.server.events.publishers.VersionEventPublisher;
 import org.apache.ambari.server.metadata.ActionMetadata;
+import org.apache.ambari.server.orm.dao.KerberosKeytabDAO;
 import org.apache.ambari.server.orm.dao.KerberosPrincipalHostDAO;
+import org.apache.ambari.server.orm.entities.KerberosPrincipalHostEntity;
 import org.apache.ambari.server.state.Alert;
 import org.apache.ambari.server.state.Cluster;
 import org.apache.ambari.server.state.Clusters;
@@ -88,7 +90,6 @@ import com.google.inject.Injector;
 
 /**
  * HeartbeatPro

ambari git commit: AMBARI-22188. Make hive server create directories related to replication (echekanskiy)

2017-10-11 Thread echekanskiy
Repository: ambari
Updated Branches:
  refs/heads/trunk 734156756 -> 4a47e792a


AMBARI-22188. Make hive server create directories related to replication 
(echekanskiy)


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/4a47e792
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/4a47e792
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/4a47e792

Branch: refs/heads/trunk
Commit: 4a47e792ae6e4fe4fb67ca80c96c5e2054f7a9cd
Parents: 7341567
Author: Eugene Chekanskiy <echekans...@hortonworks.com>
Authored: Wed Oct 11 19:38:41 2017 +0300
Committer: Eugene Chekanskiy <echekans...@hortonworks.com>
Committed: Wed Oct 11 19:38:41 2017 +0300

--
 .../HIVE/0.12.0.2.0/configuration/hive-site.xml | 47 +++
 .../HIVE/0.12.0.2.0/package/scripts/hive.py | 34 +-
 .../0.12.0.2.0/package/scripts/params_linux.py  |  4 ++
 .../HIVE/2.1.0.3.0/configuration/hive-site.xml  | 48 
 .../HIVE/2.1.0.3.0/package/scripts/hive.py  | 32 +
 .../2.1.0.3.0/package/scripts/params_linux.py   |  4 ++
 6 files changed, 168 insertions(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/ambari/blob/4a47e792/ambari-server/src/main/resources/common-services/HIVE/0.12.0.2.0/configuration/hive-site.xml
--
diff --git 
a/ambari-server/src/main/resources/common-services/HIVE/0.12.0.2.0/configuration/hive-site.xml
 
b/ambari-server/src/main/resources/common-services/HIVE/0.12.0.2.0/configuration/hive-site.xml
index 69d1c69..762530b 100644
--- 
a/ambari-server/src/main/resources/common-services/HIVE/0.12.0.2.0/configuration/hive-site.xml
+++ 
b/ambari-server/src/main/resources/common-services/HIVE/0.12.0.2.0/configuration/hive-site.xml
@@ -471,4 +471,51 @@ limitations under the License.
 
 
   
+
+  
+  
+hive.metastore.dml.events
+
+If true, the metastore will be asked to fire events for DML 
operations
+
+  true
+
+
+  
+  
+hive.repl.cm.enabled
+
+Turn on ChangeManager, so delete files will go to 
cmrootdir.
+
+  true
+
+
+  
+  
+hive.metastore.transactional.event.listeners
+
+A comma separated list of Java classes that implement the 
org.apache.hadoop.hive.metastore.MetaStoreEventListener interface. Both the 
metastore event and corresponding listener method will be invoked in the same 
JDO transaction.
+
+  true
+
+
+  
+  
+hive.repl.cmrootdir
+
+Root dir for ChangeManager, used for deleted 
files.
+
+  true
+
+
+  
+  
+hive.repl.rootdir
+
+HDFS root dir for all replication dumps.
+
+  true
+
+
+  
 

http://git-wip-us.apache.org/repos/asf/ambari/blob/4a47e792/ambari-server/src/main/resources/common-services/HIVE/0.12.0.2.0/package/scripts/hive.py
--
diff --git 
a/ambari-server/src/main/resources/common-services/HIVE/0.12.0.2.0/package/scripts/hive.py
 
b/ambari-server/src/main/resources/common-services/HIVE/0.12.0.2.0/package/scripts/hive.py
index 8e176b6..c4b34a5 100644
--- 
a/ambari-server/src/main/resources/common-services/HIVE/0.12.0.2.0/package/scripts/hive.py
+++ 
b/ambari-server/src/main/resources/common-services/HIVE/0.12.0.2.0/package/scripts/hive.py
@@ -245,7 +245,22 @@ def setup_hiveserver2():
  owner=params.hive_user,
  group=params.hdfs_user,
  mode=0777) # Hive expects this dir to be writeable by 
everyone as it is used as a temp dir
-
+
+  if params.hive_repl_cmrootdir is not None:
+params.HdfsResource(params.hive_repl_cmrootdir,
+type = "directory",
+action = "create_on_execute",
+owner = params.hive_user,
+group=params.user_group,
+mode = 01777)
+  if params.hive_repl_rootdir is not None:
+params.HdfsResource(params.hive_repl_rootdir,
+type = "directory",
+action = "create_on_execute",
+owner = params.hive_user,
+group=params.user_group,
+mode = 0700)
+
   params.HdfsResource(None, action="execute")
   
 def setup_non_client():
@@ -310,6 +325,23 @@ def setup_metastore():
  create_parents = True,
  mode=0777)
 
+  if params.hive_repl_cmrootdir is not None:
+params.HdfsResource(params.hive_repl_cmrootdir,
+type = "directory",
+action = "create_on_execute",
+ 

ambari git commit: AMBARI-22188. Make hive server create directories related to replication (echekanskiy)

2017-10-11 Thread echekanskiy
Repository: ambari
Updated Branches:
  refs/heads/branch-2.6 589b7499e -> 8c766070a


AMBARI-22188. Make hive server create directories related to replication 
(echekanskiy)


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/8c766070
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/8c766070
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/8c766070

Branch: refs/heads/branch-2.6
Commit: 8c766070aa7db5f0f5e8da3209716786554b714a
Parents: 589b749
Author: Eugene Chekanskiy <echekans...@hortonworks.com>
Authored: Wed Oct 11 19:35:33 2017 +0300
Committer: Eugene Chekanskiy <echekans...@hortonworks.com>
Committed: Wed Oct 11 19:35:33 2017 +0300

--
 .../HIVE/0.12.0.2.0/configuration/hive-site.xml | 47 
 .../HIVE/0.12.0.2.0/package/scripts/hive.py | 33 +-
 .../0.12.0.2.0/package/scripts/params_linux.py  |  4 ++
 3 files changed, 83 insertions(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/ambari/blob/8c766070/ambari-server/src/main/resources/common-services/HIVE/0.12.0.2.0/configuration/hive-site.xml
--
diff --git 
a/ambari-server/src/main/resources/common-services/HIVE/0.12.0.2.0/configuration/hive-site.xml
 
b/ambari-server/src/main/resources/common-services/HIVE/0.12.0.2.0/configuration/hive-site.xml
index 2510fda..161dfb2 100644
--- 
a/ambari-server/src/main/resources/common-services/HIVE/0.12.0.2.0/configuration/hive-site.xml
+++ 
b/ambari-server/src/main/resources/common-services/HIVE/0.12.0.2.0/configuration/hive-site.xml
@@ -467,4 +467,51 @@ limitations under the License.
 
 
   
+
+
+
+  hive.metastore.dml.events
+  
+  If true, the metastore will be asked to fire events for DML 
operations
+  
+true
+  
+  
+
+
+  hive.repl.cm.enabled
+  
+  Turn on ChangeManager, so delete files will go to 
cmrootdir.
+  
+true
+  
+  
+
+
+  hive.metastore.transactional.event.listeners
+  
+  A comma separated list of Java classes that implement the 
org.apache.hadoop.hive.metastore.MetaStoreEventListener interface. Both the 
metastore event and corresponding listener method will be invoked in the same 
JDO transaction.
+  
+true
+  
+  
+
+
+  hive.repl.cmrootdir
+  
+  Root dir for ChangeManager, used for deleted 
files.
+  
+true
+  
+  
+
+
+  hive.repl.rootdir
+  
+  HDFS root dir for all replication dumps.
+  
+true
+  
+  
+
 

http://git-wip-us.apache.org/repos/asf/ambari/blob/8c766070/ambari-server/src/main/resources/common-services/HIVE/0.12.0.2.0/package/scripts/hive.py
--
diff --git 
a/ambari-server/src/main/resources/common-services/HIVE/0.12.0.2.0/package/scripts/hive.py
 
b/ambari-server/src/main/resources/common-services/HIVE/0.12.0.2.0/package/scripts/hive.py
index 0d6e6dc..abbe59e 100644
--- 
a/ambari-server/src/main/resources/common-services/HIVE/0.12.0.2.0/package/scripts/hive.py
+++ 
b/ambari-server/src/main/resources/common-services/HIVE/0.12.0.2.0/package/scripts/hive.py
@@ -209,7 +209,21 @@ def hive(name=None):
owner=params.hive_user,
group=params.hdfs_user,
mode=0777) # Hive expects this dir to be writeable 
by everyone as it is used as a temp dir
-  
+if params.hive_repl_cmrootdir is not None:
+  params.HdfsResource(params.hive_repl_cmrootdir,
+  type = "directory",
+  action = "create_on_execute",
+  owner = params.hive_user,
+  group=params.user_group,
+  mode = 01777)
+if params.hive_repl_rootdir is not None:
+  params.HdfsResource(params.hive_repl_rootdir,
+  type = "directory",
+  action = "create_on_execute",
+  owner = params.hive_user,
+  group=params.user_group,
+  mode = 0700)
+
 params.HdfsResource(None, action="execute")
 
   Directory(params.hive_etc_dir_prefix,
@@ -320,6 +334,23 @@ def hive(name=None):
create_parents = True,
mode=0777)
 
+if params.hive_repl_cmrootdir is not None:
+  params.HdfsResource(params.hive_repl_cmrootdir,
+type = "directory",
+action = "create_on_execute",
+owner = params.hive_user,
+ 

ambari git commit: BUG-89063. Make dfs.permissions.superusergroup as group property (echekanskiy)

2017-09-28 Thread echekanskiy
Repository: ambari
Updated Branches:
  refs/heads/branch-2.6 268c9642c -> 903829603


BUG-89063. Make dfs.permissions.superusergroup as group property (echekanskiy)


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/90382960
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/90382960
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/90382960

Branch: refs/heads/branch-2.6
Commit: 903829603c512223e2da2cb951f9bb8543c0b53f
Parents: 268c964
Author: Eugene Chekanskiy <echekans...@apache.org>
Authored: Thu Sep 28 19:48:47 2017 +0300
Committer: Eugene Chekanskiy <echekans...@apache.org>
Committed: Thu Sep 28 19:48:47 2017 +0300

--
 .../HDFS/2.1.0.2.0/configuration/hadoop-env.xml| 6 ++
 .../common-services/HDFS/2.1.0.2.0/configuration/hdfs-site.xml | 1 +
 .../stacks/HDP/2.0.6/hooks/before-ANY/scripts/params.py| 2 +-
 3 files changed, 8 insertions(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/ambari/blob/90382960/ambari-server/src/main/resources/common-services/HDFS/2.1.0.2.0/configuration/hadoop-env.xml
--
diff --git 
a/ambari-server/src/main/resources/common-services/HDFS/2.1.0.2.0/configuration/hadoop-env.xml
 
b/ambari-server/src/main/resources/common-services/HDFS/2.1.0.2.0/configuration/hadoop-env.xml
index bb671cc..ca2f5ff 100644
--- 
a/ambari-server/src/main/resources/common-services/HDFS/2.1.0.2.0/configuration/hadoop-env.xml
+++ 
b/ambari-server/src/main/resources/common-services/HDFS/2.1.0.2.0/configuration/hadoop-env.xml
@@ -193,6 +193,12 @@
 
   user
   false
+  
+
+  hdfs-site
+  dfs.permissions.superusergroup
+
+  
 
 
   

http://git-wip-us.apache.org/repos/asf/ambari/blob/90382960/ambari-server/src/main/resources/common-services/HDFS/2.1.0.2.0/configuration/hdfs-site.xml
--
diff --git 
a/ambari-server/src/main/resources/common-services/HDFS/2.1.0.2.0/configuration/hdfs-site.xml
 
b/ambari-server/src/main/resources/common-services/HDFS/2.1.0.2.0/configuration/hdfs-site.xml
index 4eab367..7fdc227 100644
--- 
a/ambari-server/src/main/resources/common-services/HDFS/2.1.0.2.0/configuration/hdfs-site.xml
+++ 
b/ambari-server/src/main/resources/common-services/HDFS/2.1.0.2.0/configuration/hdfs-site.xml
@@ -335,6 +335,7 @@
   
 dfs.permissions.superusergroup
 hdfs
+GROUP
 The name of the group of super-users.
 
   

http://git-wip-us.apache.org/repos/asf/ambari/blob/90382960/ambari-server/src/main/resources/stacks/HDP/2.0.6/hooks/before-ANY/scripts/params.py
--
diff --git 
a/ambari-server/src/main/resources/stacks/HDP/2.0.6/hooks/before-ANY/scripts/params.py
 
b/ambari-server/src/main/resources/stacks/HDP/2.0.6/hooks/before-ANY/scripts/params.py
index 1fd7f3e..e085225 100644
--- 
a/ambari-server/src/main/resources/stacks/HDP/2.0.6/hooks/before-ANY/scripts/params.py
+++ 
b/ambari-server/src/main/resources/stacks/HDP/2.0.6/hooks/before-ANY/scripts/params.py
@@ -256,7 +256,7 @@ if has_zeppelin_master:
   user_to_groups_dict[zeppelin_user] = [zeppelin_group, user_group]
 #Append new user-group mapping to the dict
 try:
-  user_group_map = ast.literal_eval(config['hostLevelParams']['user_group'])
+  user_group_map = ast.literal_eval(config['hostLevelParams']['user_groups'])
   for key in user_group_map.iterkeys():
 user_to_groups_dict[key] = user_group_map[key]
 except ValueError:



ambari git commit: BUG-89063. Make dfs.permissions.superusergroup as group property (echekanskiy)

2017-09-28 Thread echekanskiy
Repository: ambari
Updated Branches:
  refs/heads/trunk fb4115e27 -> 8e0f782ef


BUG-89063. Make dfs.permissions.superusergroup as group property (echekanskiy)


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/8e0f782e
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/8e0f782e
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/8e0f782e

Branch: refs/heads/trunk
Commit: 8e0f782efd4694028b598106e68ebe2a1c7c0a2e
Parents: fb4115e
Author: Eugene Chekanskiy <echekans...@apache.org>
Authored: Thu Sep 28 19:58:58 2017 +0300
Committer: Eugene Chekanskiy <echekans...@apache.org>
Committed: Thu Sep 28 19:58:58 2017 +0300

--
 .../common-services/HDFS/2.1.0.2.0/configuration/hadoop-env.xml  | 4 
 .../common-services/HDFS/2.1.0.2.0/configuration/hdfs-site.xml   | 1 +
 .../common-services/HDFS/3.0.0.3.0/configuration/hadoop-env.xml  | 4 
 .../common-services/HDFS/3.0.0.3.0/configuration/hdfs-site.xml   | 1 +
 4 files changed, 10 insertions(+)
--


http://git-wip-us.apache.org/repos/asf/ambari/blob/8e0f782e/ambari-server/src/main/resources/common-services/HDFS/2.1.0.2.0/configuration/hadoop-env.xml
--
diff --git 
a/ambari-server/src/main/resources/common-services/HDFS/2.1.0.2.0/configuration/hadoop-env.xml
 
b/ambari-server/src/main/resources/common-services/HDFS/2.1.0.2.0/configuration/hadoop-env.xml
index 0f36e0b..660ab63 100644
--- 
a/ambari-server/src/main/resources/common-services/HDFS/2.1.0.2.0/configuration/hadoop-env.xml
+++ 
b/ambari-server/src/main/resources/common-services/HDFS/2.1.0.2.0/configuration/hadoop-env.xml
@@ -198,6 +198,10 @@
   cluster-env
   user_group
 
+
+  hdfs-site
+  dfs.permissions.superusergroup
+
   
 
 

http://git-wip-us.apache.org/repos/asf/ambari/blob/8e0f782e/ambari-server/src/main/resources/common-services/HDFS/2.1.0.2.0/configuration/hdfs-site.xml
--
diff --git 
a/ambari-server/src/main/resources/common-services/HDFS/2.1.0.2.0/configuration/hdfs-site.xml
 
b/ambari-server/src/main/resources/common-services/HDFS/2.1.0.2.0/configuration/hdfs-site.xml
index 4eab367..7fdc227 100644
--- 
a/ambari-server/src/main/resources/common-services/HDFS/2.1.0.2.0/configuration/hdfs-site.xml
+++ 
b/ambari-server/src/main/resources/common-services/HDFS/2.1.0.2.0/configuration/hdfs-site.xml
@@ -335,6 +335,7 @@
   
 dfs.permissions.superusergroup
 hdfs
+GROUP
 The name of the group of super-users.
 
   

http://git-wip-us.apache.org/repos/asf/ambari/blob/8e0f782e/ambari-server/src/main/resources/common-services/HDFS/3.0.0.3.0/configuration/hadoop-env.xml
--
diff --git 
a/ambari-server/src/main/resources/common-services/HDFS/3.0.0.3.0/configuration/hadoop-env.xml
 
b/ambari-server/src/main/resources/common-services/HDFS/3.0.0.3.0/configuration/hadoop-env.xml
index 4154007..2ce3f84 100644
--- 
a/ambari-server/src/main/resources/common-services/HDFS/3.0.0.3.0/configuration/hadoop-env.xml
+++ 
b/ambari-server/src/main/resources/common-services/HDFS/3.0.0.3.0/configuration/hadoop-env.xml
@@ -198,6 +198,10 @@
   cluster-env
   user_group
 
+
+  hdfs-site
+  dfs.permissions.superusergroup
+
   
 
 

http://git-wip-us.apache.org/repos/asf/ambari/blob/8e0f782e/ambari-server/src/main/resources/common-services/HDFS/3.0.0.3.0/configuration/hdfs-site.xml
--
diff --git 
a/ambari-server/src/main/resources/common-services/HDFS/3.0.0.3.0/configuration/hdfs-site.xml
 
b/ambari-server/src/main/resources/common-services/HDFS/3.0.0.3.0/configuration/hdfs-site.xml
index a4fed0f..5c28527 100644
--- 
a/ambari-server/src/main/resources/common-services/HDFS/3.0.0.3.0/configuration/hdfs-site.xml
+++ 
b/ambari-server/src/main/resources/common-services/HDFS/3.0.0.3.0/configuration/hdfs-site.xml
@@ -332,6 +332,7 @@
   
 dfs.permissions.superusergroup
 hdfs
+GROUP
 The name of the group of super-users.
 
   



ambari git commit: AMBARI-22027. Add UID/GID related issue with external users not listed in /etc/passwd (echekanskiy)

2017-09-21 Thread echekanskiy
Repository: ambari
Updated Branches:
  refs/heads/branch-2.6 30a046adb -> a04774865


AMBARI-22027. Add UID/GID related issue with external users not listed in 
/etc/passwd (echekanskiy)


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/a0477486
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/a0477486
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/a0477486

Branch: refs/heads/branch-2.6
Commit: a047748655370a4936d593acf26723d5de2680c6
Parents: 30a046a
Author: Eugene Chekanskiy <echekans...@apache.org>
Authored: Thu Sep 21 20:33:16 2017 +0300
Committer: Eugene Chekanskiy <echekans...@apache.org>
Committed: Thu Sep 21 20:33:16 2017 +0300

--
 .../before-ANY/scripts/shared_initialization.py | 29 +--
 .../2.0.6/hooks/before-ANY/test_before_any.py   | 85 
 2 files changed, 36 insertions(+), 78 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/ambari/blob/a0477486/ambari-server/src/main/resources/stacks/HDP/2.0.6/hooks/before-ANY/scripts/shared_initialization.py
--
diff --git a/ambari-server/src/main/resources/stacks/HDP/2.0.6/hooks/before-ANY/scripts/shared_initialization.py b/ambari-server/src/main/resources/stacks/HDP/2.0.6/hooks/before-ANY/scripts/shared_initialization.py
index bcd4b17..3997117 100644
--- a/ambari-server/src/main/resources/stacks/HDP/2.0.6/hooks/before-ANY/scripts/shared_initialization.py
+++ b/ambari-server/src/main/resources/stacks/HDP/2.0.6/hooks/before-ANY/scripts/shared_initialization.py
@@ -139,11 +139,19 @@ def set_uid(user, user_dirs):
content=StaticFile("changeToSecureUid.sh"),
mode=0555)
   ignore_groupsusers_create_str = str(params.ignore_groupsusers_create).lower()
-  uid = get_uid(user)
+  uid = get_uid(user, return_existing=True)
   Execute(format("{tmp_dir}/changeUid.sh {user} {user_dirs} {new_uid}", 
new_uid=0 if uid is None else uid),
   not_if = format("(test $(id -u {user}) -gt 1000) || 
({ignore_groupsusers_create_str})"))
 
-def get_uid(user):
+def get_uid(user, return_existing=False):
+  """
+  Tries to get UID for username. It will try to find UID in custom properties 
in *cluster_env* and, if *return_existing=True*,
+  it will try to return UID of existing *user*.
+
+  :param user: username to get UID for
+  :param return_existing: return UID for existing user
+  :return:
+  """
   import params
   user_str = str(user) + "_uid"
   service_env = [ serviceEnv for serviceEnv in params.config['configurations'] 
if user_str in params.config['configurations'][serviceEnv]]
@@ -155,13 +163,18 @@ def get_uid(user):
   Logger.warning("Multiple values found for %s, using %s"  % (user_str, 
uid))
 return uid
   else:
-if user == params.smoke_user:
+if return_existing:
+  # pick up existing UID or try to find available UID in /etc/passwd, see 
changeToSecureUid.sh for more info
+  if user == params.smoke_user:
+return None
+  File(format("{tmp_dir}/changeUid.sh"),
+   content=StaticFile("changeToSecureUid.sh"),
+   mode=0555)
+  code, newUid = shell.call(format("{tmp_dir}/changeUid.sh {user}"))
+  return int(newUid)
+else:
+  # do not return UID for existing user, used in User resource call to let 
OS to choose UID for us
   return None
-File(format("{tmp_dir}/changeUid.sh"),
- content=StaticFile("changeToSecureUid.sh"),
- mode=0555)
-code, newUid = shell.call(format("{tmp_dir}/changeUid.sh {user}"))
-return int(newUid)
 
 def setup_hadoop_env():
   import params
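
For illustration only, a simplified standalone sketch of the lookup order that get_uid() follows after this change: a "<user>_uid" override found in any configuration type wins, and only when return_existing is requested does the hook fall back to asking changeUid.sh for the UID of an already existing account. The helper name lookup_uid and the sample dictionaries are hypothetical, and the real function also special-cases the smoke user; the script path mirrors the hunk above.

import subprocess

def lookup_uid(user, configurations, tmp_dir, return_existing=False):
    """Sketch of get_uid(): prefer a '<user>_uid' override from any config
    type, otherwise optionally ask changeUid.sh for the existing UID."""
    key = user + "_uid"
    overrides = [cfg[key] for cfg in configurations.values() if key in cfg]
    if overrides:
        return int(str(overrides[0]).strip())
    if not return_existing:
        return None  # let the OS pick a UID when the User resource is created
    # called with only a username, changeUid.sh prints the current or first free UID
    out = subprocess.check_output([tmp_dir + "/changeUid.sh", user])
    return int(out.strip())

# hypothetical usage: an explicit hdfs_uid override wins over /etc/passwd
configs = {"hadoop-env": {"hdfs_uid": "1013"}, "cluster-env": {}}
print(lookup_uid("hdfs", configs, "/tmp"))  # -> 1013
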

http://git-wip-us.apache.org/repos/asf/ambari/blob/a0477486/ambari-server/src/test/python/stacks/2.0.6/hooks/before-ANY/test_before_any.py
--
diff --git a/ambari-server/src/test/python/stacks/2.0.6/hooks/before-ANY/test_before_any.py b/ambari-server/src/test/python/stacks/2.0.6/hooks/before-ANY/test_before_any.py
index a13ac24..9dceb69 100644
--- a/ambari-server/src/test/python/stacks/2.0.6/hooks/before-ANY/test_before_any.py
+++ b/ambari-server/src/test/python/stacks/2.0.6/hooks/before-ANY/test_before_any.py
@@ -52,33 +52,22 @@ class TestHookBeforeInstall(RMFTestCase):
 self.assertResourceCalled('Group', 'hadoop',)
 self.assertResourceCalled('Group', 'nobody',)
 self.assertResourceCalled('Group', 'users',)
-self.assertResourceCalled('File', '/tmp/changeUid.sh',
-  content = StaticFile('changeToSecureUid.sh'),
-  mode = 0555,
-  )
+
 self.assert

ambari git commit: AMBARI-22027. Add UID/GID related issue with external users not listed in /etc/passwd (echekanskiy)

2017-09-21 Thread echekanskiy
Repository: ambari
Updated Branches:
  refs/heads/trunk 2a0602104 -> f1b53000c


AMBARI-22027. Add UID/GID related issue with external users not listed in 
/etc/passwd (echekanskiy)


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/f1b53000
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/f1b53000
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/f1b53000

Branch: refs/heads/trunk
Commit: f1b53000c65a97ac7784d51c9a648e7e135acaab
Parents: 2a06021
Author: Eugene Chekanskiy <echekans...@apache.org>
Authored: Thu Sep 21 21:07:03 2017 +0300
Committer: Eugene Chekanskiy <echekans...@apache.org>
Committed: Thu Sep 21 21:07:03 2017 +0300

--
 .../before-ANY/scripts/shared_initialization.py | 29 +--
 .../2.0.6/hooks/before-ANY/test_before_any.py   | 85 
 2 files changed, 36 insertions(+), 78 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/ambari/blob/f1b53000/ambari-server/src/main/resources/stacks/HDP/2.0.6/hooks/before-ANY/scripts/shared_initialization.py
--
diff --git a/ambari-server/src/main/resources/stacks/HDP/2.0.6/hooks/before-ANY/scripts/shared_initialization.py b/ambari-server/src/main/resources/stacks/HDP/2.0.6/hooks/before-ANY/scripts/shared_initialization.py
index ee950e8..11593fe 100644
--- a/ambari-server/src/main/resources/stacks/HDP/2.0.6/hooks/before-ANY/scripts/shared_initialization.py
+++ b/ambari-server/src/main/resources/stacks/HDP/2.0.6/hooks/before-ANY/scripts/shared_initialization.py
@@ -139,11 +139,19 @@ def set_uid(user, user_dirs):
content=StaticFile("changeToSecureUid.sh"),
mode=0555)
   ignore_groupsusers_create_str = str(params.ignore_groupsusers_create).lower()
-  uid = get_uid(user)
+  uid = get_uid(user, return_existing=True)
   Execute(format("{tmp_dir}/changeUid.sh {user} {user_dirs} {new_uid}", 
new_uid=0 if uid is None else uid),
   not_if = format("(test $(id -u {user}) -gt 1000) || 
({ignore_groupsusers_create_str})"))
 
-def get_uid(user):
+def get_uid(user, return_existing=False):
+  """
+  Tries to get UID for username. It will try to find UID in custom properties 
in *cluster_env* and, if *return_existing=True*,
+  it will try to return UID of existing *user*.
+
+  :param user: username to get UID for
+  :param return_existing: return UID for existing user
+  :return:
+  """
   import params
   user_str = str(user) + "_uid"
   service_env = [ serviceEnv for serviceEnv in params.config['configurations'] 
if user_str in params.config['configurations'][serviceEnv]]
@@ -155,13 +163,18 @@ def get_uid(user):
   Logger.warning("Multiple values found for %s, using %s"  % (user_str, 
uid))
 return uid
   else:
-if user == params.smoke_user:
+if return_existing:
+  # pick up existing UID or try to find available UID in /etc/passwd, see 
changeToSecureUid.sh for more info
+  if user == params.smoke_user:
+return None
+  File(format("{tmp_dir}/changeUid.sh"),
+   content=StaticFile("changeToSecureUid.sh"),
+   mode=0555)
+  code, newUid = shell.call(format("{tmp_dir}/changeUid.sh {user}"))
+  return int(newUid)
+else:
+  # do not return UID for existing user, used in User resource call to let 
OS to choose UID for us
   return None
-File(format("{tmp_dir}/changeUid.sh"),
- content=StaticFile("changeToSecureUid.sh"),
- mode=0555)
-code, newUid = shell.call(format("{tmp_dir}/changeUid.sh {user}"))
-return int(newUid)
 
 def setup_hadoop_env():
   import params

http://git-wip-us.apache.org/repos/asf/ambari/blob/f1b53000/ambari-server/src/test/python/stacks/2.0.6/hooks/before-ANY/test_before_any.py
--
diff --git a/ambari-server/src/test/python/stacks/2.0.6/hooks/before-ANY/test_before_any.py b/ambari-server/src/test/python/stacks/2.0.6/hooks/before-ANY/test_before_any.py
index a13ac24..9dceb69 100644
--- a/ambari-server/src/test/python/stacks/2.0.6/hooks/before-ANY/test_before_any.py
+++ b/ambari-server/src/test/python/stacks/2.0.6/hooks/before-ANY/test_before_any.py
@@ -52,33 +52,22 @@ class TestHookBeforeInstall(RMFTestCase):
 self.assertResourceCalled('Group', 'hadoop',)
 self.assertResourceCalled('Group', 'nobody',)
 self.assertResourceCalled('Group', 'users',)
-self.assertResourceCalled('File', '/tmp/changeUid.sh',
-  content = StaticFile('changeToSecureUid.sh'),
-  mode = 0555,
-  )
+
 self.assertResourc

ambari git commit: AMBARI-21970. Enable sticky bit for curl_krb_cache (echekanskiy)

2017-09-20 Thread echekanskiy
Repository: ambari
Updated Branches:
  refs/heads/branch-2.6 412c6cac4 -> 629f3ee6d


AMBARI-21970. Enable sticky bit for curl_krb_cache (echekanskiy)


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/629f3ee6
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/629f3ee6
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/629f3ee6

Branch: refs/heads/branch-2.6
Commit: 629f3ee6d0433fe01523af7c20a133229926e9dc
Parents: 412c6ca
Author: Eugene Chekanskiy <echekans...@apache.org>
Authored: Wed Sep 20 16:26:36 2017 +0300
Committer: Eugene Chekanskiy <echekans...@apache.org>
Committed: Wed Sep 20 16:26:36 2017 +0300

--
 .../resource_management/libraries/functions/curl_krb_request.py| 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/ambari/blob/629f3ee6/ambari-common/src/main/python/resource_management/libraries/functions/curl_krb_request.py
--
diff --git a/ambari-common/src/main/python/resource_management/libraries/functions/curl_krb_request.py b/ambari-common/src/main/python/resource_management/libraries/functions/curl_krb_request.py
index 72bc5c6..95e8625 100644
--- a/ambari-common/src/main/python/resource_management/libraries/functions/curl_krb_request.py
+++ b/ambari-common/src/main/python/resource_management/libraries/functions/curl_krb_request.py
@@ -111,7 +111,7 @@ def curl_krb_request(tmp_dir, keytab, principal, url, 
cache_file_prefix,
   curl_krb_cache_path = os.path.join(tmp_dir, "curl_krb_cache")
   if not os.path.exists(curl_krb_cache_path):
 os.makedirs(curl_krb_cache_path)
-  os.chmod(curl_krb_cache_path, 0777)
+  os.chmod(curl_krb_cache_path, 01777)
 
   ccache_file_path = "{0}{1}{2}_{3}_cc_{4}".format(curl_krb_cache_path, 
os.sep, cache_file_prefix, user, ccache_file_name)
   kerberos_env = {'KRB5CCNAME': ccache_file_path}
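
The functional change above is only the extra sticky bit, 01777 instead of 0777, on the shared credential-cache directory, so the different users that run kerberized curl calls can each create ccache files there but cannot remove anyone else's. A small standalone sketch of the same mode, assuming nothing beyond the standard library:

import os
import stat
import tempfile

cache_dir = os.path.join(tempfile.gettempdir(), "curl_krb_cache")
if not os.path.exists(cache_dir):
    os.makedirs(cache_dir)

# world-writable plus the sticky bit, i.e. mode 1777 like /tmp:
# anyone may create files, only the owner (or root) may delete them
os.chmod(cache_dir, stat.S_IRWXU | stat.S_IRWXG | stat.S_IRWXO | stat.S_ISVTX)

print(oct(stat.S_IMODE(os.stat(cache_dir).st_mode)))  # 01777 / 0o1777
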



ambari git commit: AMBARI-21970. Enable sticky bit for curl_krb_cache (echekanskiy)

2017-09-20 Thread echekanskiy
Repository: ambari
Updated Branches:
  refs/heads/trunk 260c75fc8 -> 8b9370a55


AMBARI-21970. Enable sticky bit for curl_krb_cache (echekanskiy)


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/8b9370a5
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/8b9370a5
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/8b9370a5

Branch: refs/heads/trunk
Commit: 8b9370a55207ecb6baccfd02337afe7c75532f7b
Parents: 260c75f
Author: Eugene Chekanskiy <echekans...@apache.org>
Authored: Wed Sep 20 16:28:11 2017 +0300
Committer: Eugene Chekanskiy <echekans...@apache.org>
Committed: Wed Sep 20 16:28:11 2017 +0300

--
 .../resource_management/libraries/functions/curl_krb_request.py| 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/ambari/blob/8b9370a5/ambari-common/src/main/python/resource_management/libraries/functions/curl_krb_request.py
--
diff --git a/ambari-common/src/main/python/resource_management/libraries/functions/curl_krb_request.py b/ambari-common/src/main/python/resource_management/libraries/functions/curl_krb_request.py
index 72bc5c6..95e8625 100644
--- a/ambari-common/src/main/python/resource_management/libraries/functions/curl_krb_request.py
+++ b/ambari-common/src/main/python/resource_management/libraries/functions/curl_krb_request.py
@@ -111,7 +111,7 @@ def curl_krb_request(tmp_dir, keytab, principal, url, 
cache_file_prefix,
   curl_krb_cache_path = os.path.join(tmp_dir, "curl_krb_cache")
   if not os.path.exists(curl_krb_cache_path):
 os.makedirs(curl_krb_cache_path)
-  os.chmod(curl_krb_cache_path, 0777)
+  os.chmod(curl_krb_cache_path, 01777)
 
   ccache_file_path = "{0}{1}{2}_{3}_cc_{4}".format(curl_krb_cache_path, 
os.sep, cache_file_prefix, user, ccache_file_name)
   kerberos_env = {'KRB5CCNAME': ccache_file_path}



ambari git commit: AMBARI-21926. Failed to call stack advisor on oozie config change (echekanskiy)

2017-09-11 Thread echekanskiy
Repository: ambari
Updated Branches:
  refs/heads/branch-2.6 d2ecd98b7 -> 0df94c883


AMBARI-21926. Failed to call stack advisor on oozie config change (echekanskiy)


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/0df94c88
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/0df94c88
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/0df94c88

Branch: refs/heads/branch-2.6
Commit: 0df94c883c6ed9660609795eb4e37d687e1bf9c6
Parents: d2ecd98
Author: Eugene Chekanskiy <echekans...@apache.org>
Authored: Mon Sep 11 17:53:11 2017 +0300
Committer: Eugene Chekanskiy <echekans...@apache.org>
Committed: Mon Sep 11 17:53:11 2017 +0300

--
 .../src/main/resources/stacks/HDP/2.0.6/services/stack_advisor.py  | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/ambari/blob/0df94c88/ambari-server/src/main/resources/stacks/HDP/2.0.6/services/stack_advisor.py
--
diff --git a/ambari-server/src/main/resources/stacks/HDP/2.0.6/services/stack_advisor.py b/ambari-server/src/main/resources/stacks/HDP/2.0.6/services/stack_advisor.py
index 5e04b53..7992c95 100644
--- a/ambari-server/src/main/resources/stacks/HDP/2.0.6/services/stack_advisor.py
+++ b/ambari-server/src/main/resources/stacks/HDP/2.0.6/services/stack_advisor.py
@@ -477,7 +477,7 @@ class HDP206StackAdvisor(DefaultStackAdvisor):
 ]
 
 self.updateMountProperties("hdfs-site", hdfs_mount_properties, 
configurations, services, hosts)
-
+dataDirs = []
 if configurations and "hdfs-site" in configurations and \
 "dfs.datanode.data.dir" in 
configurations["hdfs-site"]["properties"] and \
 
configurations["hdfs-site"]["properties"]["dfs.datanode.data.dir"] is not None:



ambari git commit: AMBARI-21926. Failed to call stack advisor on oozie config change (echekanskiy)

2017-09-11 Thread echekanskiy
Repository: ambari
Updated Branches:
  refs/heads/trunk 4f23c1ec5 -> 26b9f4f13


AMBARI-21926. Failed to call stack advisor on oozie config change (echekanskiy)


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/26b9f4f1
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/26b9f4f1
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/26b9f4f1

Branch: refs/heads/trunk
Commit: 26b9f4f13c141434ab51a945ebc50b56b82e41b9
Parents: 4f23c1e
Author: Eugene Chekanskiy <echekans...@apache.org>
Authored: Mon Sep 11 17:50:19 2017 +0300
Committer: Eugene Chekanskiy <echekans...@apache.org>
Committed: Mon Sep 11 17:50:19 2017 +0300

--
 .../src/main/resources/stacks/HDP/2.0.6/services/stack_advisor.py  | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/ambari/blob/26b9f4f1/ambari-server/src/main/resources/stacks/HDP/2.0.6/services/stack_advisor.py
--
diff --git a/ambari-server/src/main/resources/stacks/HDP/2.0.6/services/stack_advisor.py b/ambari-server/src/main/resources/stacks/HDP/2.0.6/services/stack_advisor.py
index bd60bed..70cb7a2 100644
--- a/ambari-server/src/main/resources/stacks/HDP/2.0.6/services/stack_advisor.py
+++ b/ambari-server/src/main/resources/stacks/HDP/2.0.6/services/stack_advisor.py
@@ -313,7 +313,7 @@ class HDP206StackAdvisor(DefaultStackAdvisor):
 ]
 
 self.updateMountProperties("hdfs-site", hdfs_mount_properties, 
configurations, services, hosts)
-
+dataDirs = []
 if configurations and "hdfs-site" in configurations and \
 "dfs.datanode.data.dir" in 
configurations["hdfs-site"]["properties"] and \
 
configurations["hdfs-site"]["properties"]["dfs.datanode.data.dir"] is not None:



ambari git commit: AMBARI-21687. User can't add node via Ambari UI when being part of both "cluster user" and "cluster admin" roles (echekanskiy)

2017-09-04 Thread echekanskiy
Repository: ambari
Updated Branches:
  refs/heads/branch-2.6 46dd062a1 -> 6174afff1


AMBARI-21687. User can't add node via Ambari UI when being part of both 
"cluster user" and "cluster admin" roles (echekanskiy)


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/6174afff
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/6174afff
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/6174afff

Branch: refs/heads/branch-2.6
Commit: 6174afff1ebd6f179c7828aebd62e3912cd8539b
Parents: 46dd062
Author: Eugene Chekanskiy <echekans...@apache.org>
Authored: Mon Sep 4 14:59:24 2017 +0300
Committer: Eugene Chekanskiy <echekans...@apache.org>
Committed: Mon Sep 4 14:59:24 2017 +0300

--
 .../server/controller/internal/RequestResourceProvider.java  | 8 +++-
 .../controller/internal/RequestResourceProviderTest.java | 2 --
 2 files changed, 7 insertions(+), 3 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/ambari/blob/6174afff/ambari-server/src/main/java/org/apache/ambari/server/controller/internal/RequestResourceProvider.java
--
diff --git a/ambari-server/src/main/java/org/apache/ambari/server/controller/internal/RequestResourceProvider.java b/ambari-server/src/main/java/org/apache/ambari/server/controller/internal/RequestResourceProvider.java
index 5a0549d..46b1d0c 100644
--- a/ambari-server/src/main/java/org/apache/ambari/server/controller/internal/RequestResourceProvider.java
+++ b/ambari-server/src/main/java/org/apache/ambari/server/controller/internal/RequestResourceProvider.java
@@ -249,7 +249,13 @@ public class RequestResourceProvider extends 
AbstractControllerResourceProvider
 ? null
 : actionDefinition.getPermissions();
 
-if (!AuthorizationHelper.isAuthorized(resourceType, resourceId, 
permissions)) {
+// here goes ResourceType handling for some specific custom actions
+ResourceType customActionResourceType = resourceType;
+if (actionName.contains("check_host")) { // check_host custom 
action
+  customActionResourceType = ResourceType.CLUSTER;
+}
+
+if (!AuthorizationHelper.isAuthorized(customActionResourceType, 
resourceId, permissions)) {
   throw new AuthorizationException(String.format("The 
authenticated user is not authorized to execute the action %s.", actionName));
 }
   }

http://git-wip-us.apache.org/repos/asf/ambari/blob/6174afff/ambari-server/src/test/java/org/apache/ambari/server/controller/internal/RequestResourceProviderTest.java
--
diff --git a/ambari-server/src/test/java/org/apache/ambari/server/controller/internal/RequestResourceProviderTest.java b/ambari-server/src/test/java/org/apache/ambari/server/controller/internal/RequestResourceProviderTest.java
index c519323..fb9c4fd 100644
--- a/ambari-server/src/test/java/org/apache/ambari/server/controller/internal/RequestResourceProviderTest.java
+++ b/ambari-server/src/test/java/org/apache/ambari/server/controller/internal/RequestResourceProviderTest.java
@@ -1358,13 +1358,11 @@ public class RequestResourceProviderTest {
 EnumSet.of(RoleAuthorization.HOST_ADD_DELETE_HOSTS));
   }
 
-  @Test(expected = AuthorizationException.class)
   public void 
testCreateResourcesCheckHostForNonClusterAsClusterAdministrator() throws 
Exception {
 
testCreateResources(TestAuthenticationFactory.createClusterAdministrator(), 
null, null, "check_host",
 EnumSet.of(RoleAuthorization.HOST_ADD_DELETE_HOSTS));
   }
 
-  @Test(expected = AuthorizationException.class)
   public void testCreateResourcesCheckHostForNonClusterAsClusterOperator() 
throws Exception {
 testCreateResources(TestAuthenticationFactory.createClusterOperator(), 
null, null, "check_host",
 EnumSet.of(RoleAuthorization.HOST_ADD_DELETE_HOSTS));



ambari git commit: AMBARI-21687. User can't add node via Ambari UI when being part of both "cluster user" and "cluster admin" roles (echekanskiy)

2017-09-04 Thread echekanskiy
Repository: ambari
Updated Branches:
  refs/heads/trunk 18a16cbe4 -> c51540dee


AMBARI-21687. User can't add node via Ambari UI when being part of both 
"cluster user" and "cluster admin" roles (echekanskiy)


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/c51540de
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/c51540de
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/c51540de

Branch: refs/heads/trunk
Commit: c51540dee89d90bb488c2b1a1269ae7d40d5d509
Parents: 18a16cb
Author: Eugene Chekanskiy <echekans...@apache.org>
Authored: Mon Sep 4 14:53:51 2017 +0300
Committer: Eugene Chekanskiy <echekans...@apache.org>
Committed: Mon Sep 4 14:53:51 2017 +0300

--
 .../server/controller/internal/RequestResourceProvider.java  | 8 +++-
 .../controller/internal/RequestResourceProviderTest.java | 2 --
 2 files changed, 7 insertions(+), 3 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/ambari/blob/c51540de/ambari-server/src/main/java/org/apache/ambari/server/controller/internal/RequestResourceProvider.java
--
diff --git a/ambari-server/src/main/java/org/apache/ambari/server/controller/internal/RequestResourceProvider.java b/ambari-server/src/main/java/org/apache/ambari/server/controller/internal/RequestResourceProvider.java
index 355e572..81f283c 100644
--- a/ambari-server/src/main/java/org/apache/ambari/server/controller/internal/RequestResourceProvider.java
+++ b/ambari-server/src/main/java/org/apache/ambari/server/controller/internal/RequestResourceProvider.java
@@ -251,7 +251,13 @@ public class RequestResourceProvider extends 
AbstractControllerResourceProvider
 ? null
 : actionDefinition.getPermissions();
 
-if (!AuthorizationHelper.isAuthorized(resourceType, resourceId, 
permissions)) {
+// here goes ResourceType handling for some specific custom actions
+ResourceType customActionResourceType = resourceType;
+if (actionName.contains("check_host")) { // check_host custom 
action
+  customActionResourceType = ResourceType.CLUSTER;
+}
+
+if (!AuthorizationHelper.isAuthorized(customActionResourceType, 
resourceId, permissions)) {
   throw new AuthorizationException(String.format("The 
authenticated user is not authorized to execute the action %s.", actionName));
 }
   }

http://git-wip-us.apache.org/repos/asf/ambari/blob/c51540de/ambari-server/src/test/java/org/apache/ambari/server/controller/internal/RequestResourceProviderTest.java
--
diff --git a/ambari-server/src/test/java/org/apache/ambari/server/controller/internal/RequestResourceProviderTest.java b/ambari-server/src/test/java/org/apache/ambari/server/controller/internal/RequestResourceProviderTest.java
index b2e9472..c0695b1 100644
--- a/ambari-server/src/test/java/org/apache/ambari/server/controller/internal/RequestResourceProviderTest.java
+++ b/ambari-server/src/test/java/org/apache/ambari/server/controller/internal/RequestResourceProviderTest.java
@@ -1358,13 +1358,11 @@ public class RequestResourceProviderTest {
 EnumSet.of(RoleAuthorization.HOST_ADD_DELETE_HOSTS));
   }
 
-  @Test(expected = AuthorizationException.class)
   public void 
testCreateResourcesCheckHostForNonClusterAsClusterAdministrator() throws 
Exception {
 
testCreateResources(TestAuthenticationFactory.createClusterAdministrator(), 
null, null, "check_host",
 EnumSet.of(RoleAuthorization.HOST_ADD_DELETE_HOSTS));
   }
 
-  @Test(expected = AuthorizationException.class)
   public void testCreateResourcesCheckHostForNonClusterAsClusterOperator() 
throws Exception {
 testCreateResources(TestAuthenticationFactory.createClusterOperator(), 
null, null, "check_host",
 EnumSet.of(RoleAuthorization.HOST_ADD_DELETE_HOSTS));



ambari git commit: AMBARI-21833. Add kerberos service for 3.0.0 stack (echekanskiy)

2017-08-29 Thread echekanskiy
Repository: ambari
Updated Branches:
  refs/heads/trunk 5388ae6d1 -> 41656481b


AMBARI-21833. Add kerberos service for 3.0.0 stack (echekanskiy)


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/41656481
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/41656481
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/41656481

Branch: refs/heads/trunk
Commit: 41656481b231244b9bfe86ba24a6a3f258b9d17d
Parents: 5388ae6
Author: Eugene Chekanskiy <echekans...@apache.org>
Authored: Tue Aug 29 15:46:35 2017 +0300
Committer: Eugene Chekanskiy <echekans...@apache.org>
Committed: Tue Aug 29 15:46:35 2017 +0300

--
 .../1.10.3-30/configuration/kerberos-env.xml| 423 
 .../1.10.3-30/configuration/krb5-conf.xml   |  74 +++
 .../KERBEROS/1.10.3-30/kerberos.json|  17 +
 .../KERBEROS/1.10.3-30/metainfo.xml | 131 +
 .../package/scripts/kerberos_client.py  |  56 +++
 .../package/scripts/kerberos_common.py  | 494 +++
 .../1.10.3-30/package/scripts/params.py | 205 
 .../1.10.3-30/package/scripts/service_check.py  |  85 
 .../1.10.3-30/package/scripts/status_params.py  |  34 ++
 .../KERBEROS/1.10.3-30/package/scripts/utils.py | 105 
 .../KERBEROS/1.10.3-30/properties/krb5_conf.j2  |  60 +++
 .../HDP/3.0/services/KERBEROS/metainfo.xml  |  26 +
 12 files changed, 1710 insertions(+)
--


http://git-wip-us.apache.org/repos/asf/ambari/blob/41656481/ambari-server/src/main/resources/common-services/KERBEROS/1.10.3-30/configuration/kerberos-env.xml
--
diff --git a/ambari-server/src/main/resources/common-services/KERBEROS/1.10.3-30/configuration/kerberos-env.xml b/ambari-server/src/main/resources/common-services/KERBEROS/1.10.3-30/configuration/kerberos-env.xml
new file mode 100644
index 000..0a08121
--- /dev/null
+++ b/ambari-server/src/main/resources/common-services/KERBEROS/1.10.3-30/configuration/kerberos-env.xml
@@ -0,0 +1,423 @@
+
+
+
+
+  
+kdc_type
+
+  The type of KDC being used. Either mit-kdc, ipa, or active-directory
+
+mit-kdc
+KDC type
+
+  componentHost
+  false
+
+
+  
+  
+manage_identities
+
+  Indicates whether the Ambari user and service Kerberos identities 
(principals and keytab files)
+  should be managed (created, deleted, updated, etc...) by Ambari or 
managed manually.
+
+true
+Manage Kerberos Identities
+
+  false
+  false
+  boolean
+
+
+  
+  
+manage_auth_to_local
+
+  Indicates whether the hadoop auth_to_local rules should be managed by 
Ambari or managed manually.
+
+true
+Manage Hadoop auth_to_local rules
+
+  true
+  false
+  boolean
+
+
+  
+  
+install_packages
+Install OS-specific Kerberos client package(s)
+
+  Indicates whether Ambari should install the Kerberos client package(s) 
or not. If not, it is
+  expected that Kerberos utility programs (such as kadmin, kinit, klist, 
and kdestroy) are
+  compatible with MIT Kerberos 5 version 1.10.3 in command line options 
and behaviors.
+
+true
+
+  boolean
+  false
+
+
+  
+  
+ldap_url
+LDAP url
+
+  The URL to the Active Directory LDAP Interface
+  Example: ldaps://ad.example.com:636
+
+
+
+  false
+  false
+  ldap_url
+
+
+  
+  
+container_dn
+Container DN
+
+  The distinguished name (DN) of the container used to store service principals
+
+
+  false
+  false
+
+
+
+  
+  
+encryption_types
+Encryption Types
+
+  The supported list of session key encryption types that should be 
returned by the KDC.
+
+aes des3-cbc-sha1 rc4 des-cbc-md5
+
+  multiLine
+  false
+
+
+  
+  
+realm
+
+  The default realm to use when creating service principals
+
+Realm name
+
+
+  host
+  true
+  false
+
+
+  
+  
+kdc_hosts
+
+  A comma-delimited list of IP addresses or FQDNs declaring the KDC hosts.
+  Optionally a port number may be included in each entry by separating 
each host and port by a
+  colon (:). Example:  kdc1.example.com:88, kdc2.example.com:88
+
+KDC hosts
+
+
+  false
+
+
+  
+  
+master_kdc
+
+  The IP address or FQDN of the master KDC host in a master-slave KDC 
deployment.
+  Optionally a port number may be included.
+  Example:  kdc1.example.com:88
+
+Master KDC host
+
+
+  true
+  false
+
+
+  
+  
+admin_server_host
+Kadmin host
+
+  The IP 

ambari git commit: AMBARI-21757. Allow for keytab regeneration to be filtered for hosts (echekanskiy)

2017-08-25 Thread echekanskiy
Repository: ambari
Updated Branches:
  refs/heads/branch-2.6 e16eb8c08 -> 30ad79100


AMBARI-21757. Allow for keytab regeneration to be filtered for hosts 
(echekanskiy)


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/30ad7910
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/30ad7910
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/30ad7910

Branch: refs/heads/branch-2.6
Commit: 30ad7910090540a13c6ce2abfbddcd01f3ba5ee2
Parents: e16eb8c
Author: Eugene Chekanskiy <echekans...@apache.org>
Authored: Fri Aug 25 16:48:33 2017 +0300
Committer: Eugene Chekanskiy <echekans...@apache.org>
Committed: Fri Aug 25 16:48:33 2017 +0300

--
 .../resources/ClusterResourceDefinition.java|  2 +
 .../server/controller/KerberosHelper.java   |  8 
 .../server/controller/KerberosHelperImpl.java   | 47 +++-
 .../server/controller/KerberosHelperTest.java   | 30 +
 4 files changed, 86 insertions(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/ambari/blob/30ad7910/ambari-server/src/main/java/org/apache/ambari/server/api/resources/ClusterResourceDefinition.java
--
diff --git a/ambari-server/src/main/java/org/apache/ambari/server/api/resources/ClusterResourceDefinition.java b/ambari-server/src/main/java/org/apache/ambari/server/api/resources/ClusterResourceDefinition.java
index 1a3aa01..7022b2f 100644
--- a/ambari-server/src/main/java/org/apache/ambari/server/api/resources/ClusterResourceDefinition.java
+++ b/ambari-server/src/main/java/org/apache/ambari/server/api/resources/ClusterResourceDefinition.java
@@ -85,6 +85,8 @@ public class ClusterResourceDefinition extends 
BaseResourceDefinition {
 directives.add(KerberosHelper.DIRECTIVE_REGENERATE_KEYTABS);
 directives.add(KerberosHelper.DIRECTIVE_MANAGE_KERBEROS_IDENTITIES);
 directives.add(KerberosHelper.DIRECTIVE_FORCE_TOGGLE_KERBEROS);
+directives.add(KerberosHelper.DIRECTIVE_HOSTS);
+directives.add(KerberosHelper.DIRECTIVE_COMPONENTS);
 return directives;
   }
 

http://git-wip-us.apache.org/repos/asf/ambari/blob/30ad7910/ambari-server/src/main/java/org/apache/ambari/server/controller/KerberosHelper.java
--
diff --git a/ambari-server/src/main/java/org/apache/ambari/server/controller/KerberosHelper.java b/ambari-server/src/main/java/org/apache/ambari/server/controller/KerberosHelper.java
index 4066418..8d3431b 100644
--- a/ambari-server/src/main/java/org/apache/ambari/server/controller/KerberosHelper.java
+++ b/ambari-server/src/main/java/org/apache/ambari/server/controller/KerberosHelper.java
@@ -49,6 +49,14 @@ public interface KerberosHelper {
*/
   String DIRECTIVE_REGENERATE_KEYTABS = "regenerate_keytabs";
   /**
+   * directive used to pass host list to regenerate keytabs on
+   */
+  String DIRECTIVE_HOSTS = "regenerate_hosts";
+  /**
+   * directive used to pass list of services and their components to 
regenerate keytabs for
+   */
+  String DIRECTIVE_COMPONENTS = "regenerate_components";
+  /**
* directive used to indicate that the enable Kerberos operation should 
proceed even if the
* cluster's security type is not changing
*/

http://git-wip-us.apache.org/repos/asf/ambari/blob/30ad7910/ambari-server/src/main/java/org/apache/ambari/server/controller/KerberosHelperImpl.java
--
diff --git a/ambari-server/src/main/java/org/apache/ambari/server/controller/KerberosHelperImpl.java b/ambari-server/src/main/java/org/apache/ambari/server/controller/KerberosHelperImpl.java
index 73e9ec2..c15af73 100644
--- a/ambari-server/src/main/java/org/apache/ambari/server/controller/KerberosHelperImpl.java
+++ b/ambari-server/src/main/java/org/apache/ambari/server/controller/KerberosHelperImpl.java
@@ -126,6 +126,8 @@ import 
org.apache.directory.server.kerberos.shared.keytab.Keytab;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
 
+import com.google.common.collect.ImmutableMap;
+import com.google.common.collect.ImmutableSet;
 import com.google.inject.Inject;
 import com.google.inject.Injector;
 import com.google.inject.Singleton;
@@ -255,6 +257,9 @@ public class KerberosHelperImpl implements KerberosHelper {
 
   CreatePrincipalsAndKeytabsHandler handler = null;
 
+  Set<String> hostFilter = parseHostFilter(requestProperties);
+  Map<String, Set<String>> serviceComponentFilter = parseComponentFilter(requestProperties);
+
   if ("true".equalsIgnoreCase(value) || 
"all".equalsIgnoreCase(value)) {
 handler = new CreatePrincipalsA
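
For context (not part of this diff), keytab regeneration is driven by directives on a cluster update request, so the two new constants make it possible to narrow regeneration instead of touching every host. The sketch below is only an assumption of how the new directive could be passed from a client: the regenerate_keytabs call follows the usual Ambari REST pattern, but the server URL, credentials, and the comma-separated value format for regenerate_hosts are illustrative guesses.

import json
import requests  # third-party client library, assumed for the sketch

AMBARI = "http://ambari.example.com:8080"   # hypothetical server
CLUSTER = "c1"                              # hypothetical cluster name

params = {
    "regenerate_keytabs": "all",
    # new directive: limit regeneration to these hosts (value format is an assumption)
    "regenerate_hosts": "worker1.example.com,worker2.example.com",
}

resp = requests.put(
    "%s/api/v1/clusters/%s" % (AMBARI, CLUSTER),
    params=params,
    headers={"X-Requested-By": "ambari"},
    auth=("admin", "admin"),
    data=json.dumps({"Clusters": {"security_type": "KERBEROS"}}),
)
print(resp.status_code)
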

ambari git commit: AMBARI-21757. Allow for keytab regeneration to be filtered for hosts (echekanskiy)

2017-08-23 Thread echekanskiy
Repository: ambari
Updated Branches:
  refs/heads/trunk 0a2eccd09 -> b155c1a77


AMBARI-21757. Allow for keytab regeneration to be filtered for hosts 
(echekanskiy)


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/b155c1a7
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/b155c1a7
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/b155c1a7

Branch: refs/heads/trunk
Commit: b155c1a770fad1c268166be0bd5f34f589319d83
Parents: 0a2eccd
Author: Eugene Chekanskiy <echekans...@apache.org>
Authored: Wed Aug 23 15:48:29 2017 +0300
Committer: Eugene Chekanskiy <echekans...@apache.org>
Committed: Wed Aug 23 15:48:29 2017 +0300

--
 .../resources/ClusterResourceDefinition.java|  2 +
 .../server/controller/KerberosHelper.java   |  8 
 .../server/controller/KerberosHelperImpl.java   | 47 +++-
 .../server/controller/KerberosHelperTest.java   | 30 +
 4 files changed, 86 insertions(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/ambari/blob/b155c1a7/ambari-server/src/main/java/org/apache/ambari/server/api/resources/ClusterResourceDefinition.java
--
diff --git a/ambari-server/src/main/java/org/apache/ambari/server/api/resources/ClusterResourceDefinition.java b/ambari-server/src/main/java/org/apache/ambari/server/api/resources/ClusterResourceDefinition.java
index f689841..8933dd3 100644
--- a/ambari-server/src/main/java/org/apache/ambari/server/api/resources/ClusterResourceDefinition.java
+++ b/ambari-server/src/main/java/org/apache/ambari/server/api/resources/ClusterResourceDefinition.java
@@ -85,6 +85,8 @@ public class ClusterResourceDefinition extends 
BaseResourceDefinition {
 directives.add(KerberosHelper.DIRECTIVE_REGENERATE_KEYTABS);
 directives.add(KerberosHelper.DIRECTIVE_MANAGE_KERBEROS_IDENTITIES);
 directives.add(KerberosHelper.DIRECTIVE_FORCE_TOGGLE_KERBEROS);
+directives.add(KerberosHelper.DIRECTIVE_HOSTS);
+directives.add(KerberosHelper.DIRECTIVE_COMPONENTS);
 return directives;
   }
 

http://git-wip-us.apache.org/repos/asf/ambari/blob/b155c1a7/ambari-server/src/main/java/org/apache/ambari/server/controller/KerberosHelper.java
--
diff --git a/ambari-server/src/main/java/org/apache/ambari/server/controller/KerberosHelper.java b/ambari-server/src/main/java/org/apache/ambari/server/controller/KerberosHelper.java
index 3819863..a334542 100644
--- a/ambari-server/src/main/java/org/apache/ambari/server/controller/KerberosHelper.java
+++ b/ambari-server/src/main/java/org/apache/ambari/server/controller/KerberosHelper.java
@@ -50,6 +50,14 @@ public interface KerberosHelper {
*/
   String DIRECTIVE_REGENERATE_KEYTABS = "regenerate_keytabs";
   /**
+   * directive used to pass host list to regenerate keytabs on
+   */
+  String DIRECTIVE_HOSTS = "regenerate_hosts";
+  /**
+   * directive used to pass list of services and their components to 
regenerate keytabs for
+   */
+  String DIRECTIVE_COMPONENTS = "regenerate_components";
+  /**
* directive used to indicate that the enable Kerberos operation should 
proceed even if the
* cluster's security type is not changing
*/

http://git-wip-us.apache.org/repos/asf/ambari/blob/b155c1a7/ambari-server/src/main/java/org/apache/ambari/server/controller/KerberosHelperImpl.java
--
diff --git a/ambari-server/src/main/java/org/apache/ambari/server/controller/KerberosHelperImpl.java b/ambari-server/src/main/java/org/apache/ambari/server/controller/KerberosHelperImpl.java
index 6c6c439..8d0fd0f 100644
--- a/ambari-server/src/main/java/org/apache/ambari/server/controller/KerberosHelperImpl.java
+++ b/ambari-server/src/main/java/org/apache/ambari/server/controller/KerberosHelperImpl.java
@@ -122,6 +122,8 @@ import 
org.apache.directory.server.kerberos.shared.keytab.Keytab;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
 
+import com.google.common.collect.ImmutableMap;
+import com.google.common.collect.ImmutableSet;
 import com.google.inject.Inject;
 import com.google.inject.Injector;
 import com.google.inject.Singleton;
@@ -254,6 +256,9 @@ public class KerberosHelperImpl implements KerberosHelper {
 
   CreatePrincipalsAndKeytabsHandler handler = null;
 
+  Set<String> hostFilter = parseHostFilter(requestProperties);
+  Map<String, Set<String>> serviceComponentFilter = parseComponentFilter(requestProperties);
+
   if ("true".equalsIgnoreCase(value) || 
"all".equalsIgnoreCase(value)) {
 handler = new CreatePrincipalsA

ambari git commit: AMBARI-21712. Allow WEB alert accept custom HTTP codes (echekanskiy)

2017-08-16 Thread echekanskiy
Repository: ambari
Updated Branches:
  refs/heads/branch-2.6 6618d7d53 -> d6d2a3b63


AMBARI-21712. Allow WEB alert accept custom HTTP codes (echekanskiy)


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/d6d2a3b6
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/d6d2a3b6
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/d6d2a3b6

Branch: refs/heads/branch-2.6
Commit: d6d2a3b6332234cbfe17bc6068fd4cef7127acd2
Parents: 6618d7d
Author: Eugene Chekanskiy <echekans...@apache.org>
Authored: Wed Aug 16 19:18:11 2017 +0300
Committer: Eugene Chekanskiy <echekans...@apache.org>
Committed: Wed Aug 16 19:18:11 2017 +0300

--
 .../python/ambari_agent/alerts/base_alert.py | 15 +++
 .../main/python/ambari_agent/alerts/web_alert.py |  4 
 .../src/test/python/ambari_agent/TestAlerts.py   | 19 ++-
 .../ambari/server/state/alert/AlertUri.java  | 16 
 4 files changed, 49 insertions(+), 5 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/ambari/blob/d6d2a3b6/ambari-agent/src/main/python/ambari_agent/alerts/base_alert.py
--
diff --git a/ambari-agent/src/main/python/ambari_agent/alerts/base_alert.py b/ambari-agent/src/main/python/ambari_agent/alerts/base_alert.py
index add29fc..05f8023 100644
--- a/ambari-agent/src/main/python/ambari_agent/alerts/base_alert.py
+++ b/ambari-agent/src/main/python/ambari_agent/alerts/base_alert.py
@@ -255,7 +255,8 @@ class BaseAlert(object):
 
 if uri_structure is None:
   return None
-
+
+acceptable_codes_key = None
 http_key = None
 https_key = None
 https_property_key = None
@@ -267,7 +268,10 @@ class BaseAlert(object):
 ha_alias_key = None
 ha_http_pattern = None
 ha_https_pattern = None
-
+
+if 'acceptable_codes' in uri_structure:
+  acceptable_codes_key = uri_structure['acceptable_codes']
+
 if 'http' in uri_structure:
   http_key = uri_structure['http']
 
@@ -306,11 +310,14 @@ class BaseAlert(object):
 
 
 AlertUriLookupKeys = namedtuple('AlertUriLookupKeys', 
-  'http https https_property https_property_value default_port '
+  'acceptable_codes http https https_property https_property_value 
default_port '
   'kerberos_keytab kerberos_principal '
   'ha_nameservice ha_alias_key ha_http_pattern ha_https_pattern')
 
-alert_uri_lookup_keys = AlertUriLookupKeys(http=http_key, https=https_key, 
+alert_uri_lookup_keys = AlertUriLookupKeys(
+  acceptable_codes=acceptable_codes_key,
+  http=http_key,
+  https=https_key,
   https_property=https_property_key,
   https_property_value=https_property_value_key, default_port=default_port,
   kerberos_keytab=kerberos_keytab, kerberos_principal=kerberos_principal,

http://git-wip-us.apache.org/repos/asf/ambari/blob/d6d2a3b6/ambari-agent/src/main/python/ambari_agent/alerts/web_alert.py
--
diff --git a/ambari-agent/src/main/python/ambari_agent/alerts/web_alert.py b/ambari-agent/src/main/python/ambari_agent/alerts/web_alert.py
index 8ce4405..0e400f7 100644
--- a/ambari-agent/src/main/python/ambari_agent/alerts/web_alert.py
+++ b/ambari-agent/src/main/python/ambari_agent/alerts/web_alert.py
@@ -107,6 +107,10 @@ class WebAlert(BaseAlert):
 if status_code == 0:
   return (self.RESULT_CRITICAL, [status_code, url, time_seconds, 
error_message])
 
+# check explicit listed codes
+if self.uri_property_keys.acceptable_codes and status_code in 
self.uri_property_keys.acceptable_codes:
+  return (self.RESULT_OK, [status_code, url, time_seconds])
+
 # anything that's less than 400 is OK
 if status_code < 400:
   return (self.RESULT_OK, [status_code, url, time_seconds])
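
Put differently, the status-code decision now checks an explicit whitelist before the generic "anything under 400 is OK" rule, which lets an alert definition treat, for example, an endpoint that answers 401 as healthy. A standalone sketch of that decision follows; the function name and the WARNING fallback are illustrative, and the real alert has further handling not shown in this hunk.

RESULT_OK, RESULT_WARNING, RESULT_CRITICAL = "OK", "WARNING", "CRITICAL"

def classify(status_code, acceptable_codes=None):
    """Rough mirror of the web_alert status handling after this change."""
    if status_code == 0:
        return RESULT_CRITICAL  # connection-level failure
    if acceptable_codes and status_code in acceptable_codes:
        return RESULT_OK        # explicitly whitelisted, e.g. 401 or 403
    if status_code < 400:
        return RESULT_OK
    return RESULT_WARNING       # placeholder for the remaining cases

print(classify(401, acceptable_codes=[401, 403]))  # -> OK
print(classify(503))                               # -> WARNING
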

http://git-wip-us.apache.org/repos/asf/ambari/blob/d6d2a3b6/ambari-agent/src/test/python/ambari_agent/TestAlerts.py
--
diff --git a/ambari-agent/src/test/python/ambari_agent/TestAlerts.py b/ambari-agent/src/test/python/ambari_agent/TestAlerts.py
index 64479a2..1226900 100644
--- a/ambari-agent/src/test/python/ambari_agent/TestAlerts.py
+++ b/ambari-agent/src/test/python/ambari_agent/TestAlerts.py
@@ -23,6 +23,7 @@ import socket
 import sys
 import urllib2
 import tempfile
+import random
 from alerts.ams_alert import AmsAlert
 
 from ambari_agent.AlertSchedulerHandler import AlertSchedulerHandler
@@ -616,6 +617,21 @@ class TestAlerts(TestCase):
 self.assertEquals('CRITICAL', alerts[0]['state'])
 self.assertEquals('(Unit Tests) critical: 
https://c6401.ambari.apache.org:443/test/path. error message', 
alerts[0]['text'])
 
+# test cust

ambari git commit: AMBARI-21712. Allow WEB alert accept custom HTTP codes (echekanskiy)

2017-08-16 Thread echekanskiy
Repository: ambari
Updated Branches:
  refs/heads/trunk f2379a446 -> 12c05880d


AMBARI-21712. Allow WEB alert accept custom HTTP codes (echekanskiy)


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/12c05880
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/12c05880
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/12c05880

Branch: refs/heads/trunk
Commit: 12c05880d4a5fa13f37b58f537bebb292a064d20
Parents: f2379a4
Author: Eugene Chekanskiy <echekans...@apache.org>
Authored: Wed Aug 16 19:15:26 2017 +0300
Committer: Eugene Chekanskiy <echekans...@apache.org>
Committed: Wed Aug 16 19:15:26 2017 +0300

--
 .../python/ambari_agent/alerts/base_alert.py | 15 +++
 .../main/python/ambari_agent/alerts/web_alert.py |  4 
 .../src/test/python/ambari_agent/TestAlerts.py   | 19 ++-
 .../ambari/server/state/alert/AlertUri.java  | 15 +++
 4 files changed, 48 insertions(+), 5 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/ambari/blob/12c05880/ambari-agent/src/main/python/ambari_agent/alerts/base_alert.py
--
diff --git a/ambari-agent/src/main/python/ambari_agent/alerts/base_alert.py b/ambari-agent/src/main/python/ambari_agent/alerts/base_alert.py
index add29fc..05f8023 100644
--- a/ambari-agent/src/main/python/ambari_agent/alerts/base_alert.py
+++ b/ambari-agent/src/main/python/ambari_agent/alerts/base_alert.py
@@ -255,7 +255,8 @@ class BaseAlert(object):
 
 if uri_structure is None:
   return None
-
+
+acceptable_codes_key = None
 http_key = None
 https_key = None
 https_property_key = None
@@ -267,7 +268,10 @@ class BaseAlert(object):
 ha_alias_key = None
 ha_http_pattern = None
 ha_https_pattern = None
-
+
+if 'acceptable_codes' in uri_structure:
+  acceptable_codes_key = uri_structure['acceptable_codes']
+
 if 'http' in uri_structure:
   http_key = uri_structure['http']
 
@@ -306,11 +310,14 @@ class BaseAlert(object):
 
 
 AlertUriLookupKeys = namedtuple('AlertUriLookupKeys', 
-  'http https https_property https_property_value default_port '
+  'acceptable_codes http https https_property https_property_value 
default_port '
   'kerberos_keytab kerberos_principal '
   'ha_nameservice ha_alias_key ha_http_pattern ha_https_pattern')
 
-alert_uri_lookup_keys = AlertUriLookupKeys(http=http_key, https=https_key, 
+alert_uri_lookup_keys = AlertUriLookupKeys(
+  acceptable_codes=acceptable_codes_key,
+  http=http_key,
+  https=https_key,
   https_property=https_property_key,
   https_property_value=https_property_value_key, default_port=default_port,
   kerberos_keytab=kerberos_keytab, kerberos_principal=kerberos_principal,

http://git-wip-us.apache.org/repos/asf/ambari/blob/12c05880/ambari-agent/src/main/python/ambari_agent/alerts/web_alert.py
--
diff --git a/ambari-agent/src/main/python/ambari_agent/alerts/web_alert.py b/ambari-agent/src/main/python/ambari_agent/alerts/web_alert.py
index 8ce4405..0e400f7 100644
--- a/ambari-agent/src/main/python/ambari_agent/alerts/web_alert.py
+++ b/ambari-agent/src/main/python/ambari_agent/alerts/web_alert.py
@@ -107,6 +107,10 @@ class WebAlert(BaseAlert):
 if status_code == 0:
   return (self.RESULT_CRITICAL, [status_code, url, time_seconds, 
error_message])
 
+# check explicit listed codes
+if self.uri_property_keys.acceptable_codes and status_code in 
self.uri_property_keys.acceptable_codes:
+  return (self.RESULT_OK, [status_code, url, time_seconds])
+
 # anything that's less than 400 is OK
 if status_code < 400:
   return (self.RESULT_OK, [status_code, url, time_seconds])

http://git-wip-us.apache.org/repos/asf/ambari/blob/12c05880/ambari-agent/src/test/python/ambari_agent/TestAlerts.py
--
diff --git a/ambari-agent/src/test/python/ambari_agent/TestAlerts.py b/ambari-agent/src/test/python/ambari_agent/TestAlerts.py
index 64479a2..1226900 100644
--- a/ambari-agent/src/test/python/ambari_agent/TestAlerts.py
+++ b/ambari-agent/src/test/python/ambari_agent/TestAlerts.py
@@ -23,6 +23,7 @@ import socket
 import sys
 import urllib2
 import tempfile
+import random
 from alerts.ams_alert import AmsAlert
 
 from ambari_agent.AlertSchedulerHandler import AlertSchedulerHandler
@@ -616,6 +617,21 @@ class TestAlerts(TestCase):
 self.assertEquals('CRITICAL', alerts[0]['state'])
 self.assertEquals('(Unit Tests) critical: 
https://c6401.ambari.apache.org:443/test/path. error message', 
alerts[0]['text'])
 
+# test custom codes
+co

ambari git commit: AMBARI-21589. Service repos are not updated with "latest" url in repoinfo.xml (echekanskiy)

2017-07-27 Thread echekanskiy
Repository: ambari
Updated Branches:
  refs/heads/branch-2.5 096d75d7f -> d9e8f4f8c


AMBARI-21589. Service repos are not updated with "latest" url in repoinfo.xml 
(echekanskiy)


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/d9e8f4f8
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/d9e8f4f8
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/d9e8f4f8

Branch: refs/heads/branch-2.5
Commit: d9e8f4f8ce18802b014b5803627e06c4d03a469e
Parents: 096d75d
Author: Eugene Chekanskiy <echekans...@apache.org>
Authored: Thu Jul 27 20:15:44 2017 +0300
Committer: Eugene Chekanskiy <echekans...@apache.org>
Committed: Thu Jul 27 20:15:44 2017 +0300

--
 .../src/main/java/org/apache/ambari/server/stack/StackModule.java | 3 +++
 1 file changed, 3 insertions(+)
--


http://git-wip-us.apache.org/repos/asf/ambari/blob/d9e8f4f8/ambari-server/src/main/java/org/apache/ambari/server/stack/StackModule.java
--
diff --git a/ambari-server/src/main/java/org/apache/ambari/server/stack/StackModule.java b/ambari-server/src/main/java/org/apache/ambari/server/stack/StackModule.java
index dd23e76..f19464f 100644
--- a/ambari-server/src/main/java/org/apache/ambari/server/stack/StackModule.java
+++ b/ambari-server/src/main/java/org/apache/ambari/server/stack/StackModule.java
@@ -1224,6 +1224,9 @@ public class StackModule extends BaseModule<StackModule, 
StackInfo> implements V
 RepositoryXml serviceRepoXml = ssd.getRepoFile();
 if (null != serviceRepoXml) {
   repos.addAll(serviceRepoXml.getRepositories());
+  if (null != serviceRepoXml.getLatestURI()) {
+stackContext.registerRepoUpdateTask(serviceRepoXml.getLatestURI(), 
this);
+  }
 }
   }
 }



ambari git commit: AMBARI-21589. Service repos are not updated with "latest" url in repoinfo.xml (echekanskiy)

2017-07-27 Thread echekanskiy
Repository: ambari
Updated Branches:
  refs/heads/trunk 8ce722c82 -> 218829f93


AMBARI-21589. Service repos are not updated with "latest" url in repoinfo.xml 
(echekanskiy)


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/218829f9
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/218829f9
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/218829f9

Branch: refs/heads/trunk
Commit: 218829f93b87408f0f33637fce8caca5edaa9ac6
Parents: 8ce722c
Author: Eugene Chekanskiy <echekans...@apache.org>
Authored: Thu Jul 27 20:13:53 2017 +0300
Committer: Eugene Chekanskiy <echekans...@apache.org>
Committed: Thu Jul 27 20:13:53 2017 +0300

--
 .../src/main/java/org/apache/ambari/server/stack/StackModule.java | 3 +++
 1 file changed, 3 insertions(+)
--


http://git-wip-us.apache.org/repos/asf/ambari/blob/218829f9/ambari-server/src/main/java/org/apache/ambari/server/stack/StackModule.java
--
diff --git a/ambari-server/src/main/java/org/apache/ambari/server/stack/StackModule.java b/ambari-server/src/main/java/org/apache/ambari/server/stack/StackModule.java
index 7b2f87721..d73387d 100644
--- a/ambari-server/src/main/java/org/apache/ambari/server/stack/StackModule.java
+++ b/ambari-server/src/main/java/org/apache/ambari/server/stack/StackModule.java
@@ -1227,6 +1227,9 @@ public class StackModule extends BaseModule<StackModule, 
StackInfo> implements V
 RepositoryXml serviceRepoXml = ssd.getRepoFile();
 if (null != serviceRepoXml) {
   repos.addAll(serviceRepoXml.getRepositories());
+  if (null != serviceRepoXml.getLatestURI()) {
+stackContext.registerRepoUpdateTask(serviceRepoXml.getLatestURI(), 
this);
+  }
 }
   }
 }



ambari git commit: AMBARI-21272. LDAP sync requires user to be root - re-apply due to accident revert (echekanskiy)

2017-07-24 Thread echekanskiy
Repository: ambari
Updated Branches:
  refs/heads/branch-2.5 110c8cd6e -> fe761ef7e


AMBARI-21272. LDAP sync requires user to be root - re-apply due to accident 
revert (echekanskiy)


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/fe761ef7
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/fe761ef7
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/fe761ef7

Branch: refs/heads/branch-2.5
Commit: fe761ef7ee796de450dd71708900184228cbe6e3
Parents: 110c8cd
Author: Eugene Chekanskiy <echekans...@apache.org>
Authored: Mon Jul 24 16:42:23 2017 +0300
Committer: Eugene Chekanskiy <echekans...@apache.org>
Committed: Mon Jul 24 16:42:23 2017 +0300

--
 .../src/main/python/ambari_server/setupSecurity.py |  4 
 ambari-server/src/test/python/TestAmbariServer.py  | 13 +
 2 files changed, 1 insertion(+), 16 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/ambari/blob/fe761ef7/ambari-server/src/main/python/ambari_server/setupSecurity.py
--
diff --git a/ambari-server/src/main/python/ambari_server/setupSecurity.py b/ambari-server/src/main/python/ambari_server/setupSecurity.py
index ea3b9e5..f175d7c 100644
--- a/ambari-server/src/main/python/ambari_server/setupSecurity.py
+++ b/ambari-server/src/main/python/ambari_server/setupSecurity.py
@@ -275,10 +275,6 @@ class LdapSyncOptions:
 #
 def sync_ldap(options):
   logger.info("Sync users and groups with configured LDAP.")
-  if not is_root():
-err = 'Ambari-server sync-ldap should be run with ' \
-  'root-level privileges'
-raise FatalException(4, err)
 
   properties = get_ambari_properties()
 

http://git-wip-us.apache.org/repos/asf/ambari/blob/fe761ef7/ambari-server/src/test/python/TestAmbariServer.py
--
diff --git a/ambari-server/src/test/python/TestAmbariServer.py b/ambari-server/src/test/python/TestAmbariServer.py
index 1ac77ab2..fb0bb70 100644
--- a/ambari-server/src/test/python/TestAmbariServer.py
+++ b/ambari-server/src/test/python/TestAmbariServer.py
@@ -7747,13 +7747,12 @@ class TestAmbariServer(TestCase):
   @patch("urllib2.urlopen")
   @patch("urllib2.Request")
   @patch("base64.encodestring")
-  @patch("ambari_server.setupSecurity.is_root")
   @patch("ambari_server.setupSecurity.is_server_runing")
   @patch("ambari_server.setupSecurity.get_ambari_properties")
   @patch("ambari_server.setupSecurity.get_validated_string_input")
   @patch("ambari_server.setupSecurity.logger")
   def test_sync_ldap_forbidden(self, logger_mock, 
get_validated_string_input_method, get_ambari_properties_method,
-is_server_runing_method, is_root_method,
+is_server_runing_method,
 encodestring_method, request_constructor, 
urlopen_method):
 
 options = self._create_empty_options_mock()
@@ -7762,16 +7761,6 @@ class TestAmbariServer(TestCase):
 options.ldap_sync_users = None
 options.ldap_sync_groups = None
 
-is_root_method.return_value = False
-try:
-  sync_ldap(options)
-  self.fail("Should throw exception if not root")
-except FatalException as fe:
-  # Expected
-  self.assertTrue("root-level" in fe.reason)
-  pass
-is_root_method.return_value = True
-
 is_server_runing_method.return_value = (None, None)
 try:
   sync_ldap(options)



ambari git commit: Revert: BUG-78694. LDAP sync requires user to be root

2017-07-16 Thread echekanskiy
Repository: ambari
Updated Branches:
  refs/heads/branch-2.5 aa729a5bb -> 805dbe42a


Revert: BUG-78694. LDAP sync requires user to be root


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/805dbe42
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/805dbe42
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/805dbe42

Branch: refs/heads/branch-2.5
Commit: 805dbe42a0c809faacf8b86c769199c300eac1b9
Parents: aa729a5
Author: Eugene Chekanskiy 
Authored: Sun Jul 16 20:37:52 2017 +0300
Committer: Eugene Chekanskiy 
Committed: Sun Jul 16 20:37:52 2017 +0300

--
 .../src/main/python/ambari_server/setupSecurity.py |  4 
 ambari-server/src/test/python/TestAmbariServer.py  | 13 -
 2 files changed, 16 insertions(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/ambari/blob/805dbe42/ambari-server/src/main/python/ambari_server/setupSecurity.py
--
diff --git a/ambari-server/src/main/python/ambari_server/setupSecurity.py b/ambari-server/src/main/python/ambari_server/setupSecurity.py
index f175d7c..ea3b9e5 100644
--- a/ambari-server/src/main/python/ambari_server/setupSecurity.py
+++ b/ambari-server/src/main/python/ambari_server/setupSecurity.py
@@ -275,6 +275,10 @@ class LdapSyncOptions:
 #
 def sync_ldap(options):
   logger.info("Sync users and groups with configured LDAP.")
+  if not is_root():
+err = 'Ambari-server sync-ldap should be run with ' \
+  'root-level privileges'
+raise FatalException(4, err)
 
   properties = get_ambari_properties()
 

http://git-wip-us.apache.org/repos/asf/ambari/blob/805dbe42/ambari-server/src/test/python/TestAmbariServer.py
--
diff --git a/ambari-server/src/test/python/TestAmbariServer.py b/ambari-server/src/test/python/TestAmbariServer.py
index fb0bb70..1ac77ab2 100644
--- a/ambari-server/src/test/python/TestAmbariServer.py
+++ b/ambari-server/src/test/python/TestAmbariServer.py
@@ -7747,12 +7747,13 @@ class TestAmbariServer(TestCase):
   @patch("urllib2.urlopen")
   @patch("urllib2.Request")
   @patch("base64.encodestring")
+  @patch("ambari_server.setupSecurity.is_root")
   @patch("ambari_server.setupSecurity.is_server_runing")
   @patch("ambari_server.setupSecurity.get_ambari_properties")
   @patch("ambari_server.setupSecurity.get_validated_string_input")
   @patch("ambari_server.setupSecurity.logger")
   def test_sync_ldap_forbidden(self, logger_mock, 
get_validated_string_input_method, get_ambari_properties_method,
-is_server_runing_method,
+is_server_runing_method, is_root_method,
 encodestring_method, request_constructor, 
urlopen_method):
 
 options = self._create_empty_options_mock()
@@ -7761,6 +7762,16 @@ class TestAmbariServer(TestCase):
 options.ldap_sync_users = None
 options.ldap_sync_groups = None
 
+is_root_method.return_value = False
+try:
+  sync_ldap(options)
+  self.fail("Should throw exception if not root")
+except FatalException as fe:
+  # Expected
+  self.assertTrue("root-level" in fe.reason)
+  pass
+is_root_method.return_value = True
+
 is_server_runing_method.return_value = (None, None)
 try:
   sync_ldap(options)



ambari git commit: AMBARI-21483. Add UID/GID related enhancements (echekanskiy)

2017-07-16 Thread echekanskiy
Repository: ambari
Updated Branches:
  refs/heads/trunk 56462b222 -> f92d12193


AMBARI-21483. Add UID/GID related enhancements (echekanskiy)


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/f92d1219
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/f92d1219
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/f92d1219

Branch: refs/heads/trunk
Commit: f92d12193b30d53dc06ab9642ef4b9d61b5bac1c
Parents: 56462b2
Author: Eugene Chekanskiy <echekans...@hortonworks.com>
Authored: Sun Jul 16 20:22:34 2017 +0300
Committer: Eugene Chekanskiy <echekans...@hortonworks.com>
Committed: Sun Jul 16 20:22:34 2017 +0300

--
 .../ambari/server/state/PropertyInfo.java   |   2 +
 .../hooks/before-ANY/files/changeToSecureUid.sh |  13 +-
 .../before-ANY/scripts/shared_initialization.py |  45 ++-
 .../2.0.6/hooks/before-ANY/test_before_any.py   | 294 +++
 .../app/controllers/wizard/step7_controller.js  |  67 +
 .../configs/stack_config_properties_mapper.js   |  14 +-
 ambari-web/app/styles/application.less  |  15 +
 ...ontrols_service_config_usergroup_with_id.hbs |  27 ++
 ambari-web/app/utils/config.js  |   3 +
 .../configs/service_configs_by_category_view.js |   6 +
 ambari-web/app/views/common/controls_view.js|  39 +++
 11 files changed, 392 insertions(+), 133 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/ambari/blob/f92d1219/ambari-server/src/main/java/org/apache/ambari/server/state/PropertyInfo.java
--
diff --git 
a/ambari-server/src/main/java/org/apache/ambari/server/state/PropertyInfo.java 
b/ambari-server/src/main/java/org/apache/ambari/server/state/PropertyInfo.java
index 62396e3..63c850e 100644
--- 
a/ambari-server/src/main/java/org/apache/ambari/server/state/PropertyInfo.java
+++ 
b/ambari-server/src/main/java/org/apache/ambari/server/state/PropertyInfo.java
@@ -281,7 +281,9 @@ public class PropertyInfo {
   public enum PropertyType {
 PASSWORD,
 USER,
+UID,
 GROUP,
+GID,
 TEXT,
 ADDITIONAL_USER_PROPERTY,
 NOT_MANAGED_HDFS_PATH,

http://git-wip-us.apache.org/repos/asf/ambari/blob/f92d1219/ambari-server/src/main/resources/stacks/HDP/2.0.6/hooks/before-ANY/files/changeToSecureUid.sh
--
diff --git 
a/ambari-server/src/main/resources/stacks/HDP/2.0.6/hooks/before-ANY/files/changeToSecureUid.sh
 
b/ambari-server/src/main/resources/stacks/HDP/2.0.6/hooks/before-ANY/files/changeToSecureUid.sh
index 08542c4..4663f10 100644
--- 
a/ambari-server/src/main/resources/stacks/HDP/2.0.6/hooks/before-ANY/files/changeToSecureUid.sh
+++ 
b/ambari-server/src/main/resources/stacks/HDP/2.0.6/hooks/before-ANY/files/changeToSecureUid.sh
@@ -21,6 +21,7 @@
 
 username=$1
 directories=$2
+newUid=$3
 
 function find_available_uid() {
  for ((i=1001; i<=2000; i++))
@@ -34,7 +35,16 @@ function find_available_uid() {
  done
 }
 
-find_available_uid
+if [ -z $2 ]; then
+  test $(id -u ${username} 2>/dev/null)
+  if [ $? -ne 1 ]; then
+   newUid=`id -u ${username}`
+  else
+   find_available_uid
+  fi
+  echo $newUid
+  exit 0
+fi
 
 if [ $newUid -eq 0 ]
 then
@@ -43,7 +53,6 @@ then
 fi
 
 set -e
-
 dir_array=($(echo $directories | sed 's/,/\n/g'))
 old_uid=$(id -u $username)
 sudo_prefix="/var/lib/ambari-agent/ambari-sudo.sh -H -E"

http://git-wip-us.apache.org/repos/asf/ambari/blob/f92d1219/ambari-server/src/main/resources/stacks/HDP/2.0.6/hooks/before-ANY/scripts/shared_initialization.py
--
diff --git 
a/ambari-server/src/main/resources/stacks/HDP/2.0.6/hooks/before-ANY/scripts/shared_initialization.py
 
b/ambari-server/src/main/resources/stacks/HDP/2.0.6/hooks/before-ANY/scripts/shared_initialization.py
index 39f5a47..bcc1a3a 100644
--- 
a/ambari-server/src/main/resources/stacks/HDP/2.0.6/hooks/before-ANY/scripts/shared_initialization.py
+++ 
b/ambari-server/src/main/resources/stacks/HDP/2.0.6/hooks/before-ANY/scripts/shared_initialization.py
@@ -24,6 +24,7 @@ import tempfile
 from copy import copy
 from resource_management.libraries.functions.version import compare_versions
 from resource_management import *
+from resource_management.core import shell
 
 def setup_users():
   """
@@ -43,11 +44,17 @@ def setup_users():
   )
 
 for user in params.user_list:
-  User(user,
-  gid = params.user_to_gid_dict[user],
-  groups = params.user_to_groups_dict[user],
-  fetch_nonlocal_groups = params.fetch_nonlocal_groups
-  )
+  if params.override_uid == "true":
+User(user,
+ uid = get_uid(user),
+ g
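
A minimal sketch of how a get_uid(user) helper, as referenced in the hunk above, might wrap the extended changeToSecureUid.sh query mode; the script path and the shell.checked_call wiring are assumptions for illustration, not necessarily the actual Ambari implementation:

from resource_management.core import shell

def get_uid(user):
  # Query-mode sketch: when changeToSecureUid.sh is invoked without a
  # directories argument, it prints either the user's current UID or a
  # free UID in the 1001-2000 range and exits 0.
  script = "/var/lib/ambari-agent/tmp/changeUid.sh"  # assumed staged copy of changeToSecureUid.sh
  code, out = shell.checked_call("bash " + script + " " + user)
  return int(out.strip())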

[1/3] ambari git commit: BUG-78694. LDAP sync requires user to be root

2017-07-16 Thread echekanskiy
Repository: ambari
Updated Branches:
  refs/heads/branch-2.5 207680147 -> aa729a5bb


BUG-78694. LDAP sync requires user to be root


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/bc57667d
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/bc57667d
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/bc57667d

Branch: refs/heads/branch-2.5
Commit: bc57667d374ec3cb476b4c988a8fd03a57bd2a01
Parents: 8cfdf91
Author: Eugene Chekanskiy 
Authored: Fri Jun 16 14:51:48 2017 +0300
Committer: Eugene Chekanskiy 
Committed: Fri Jun 16 15:02:24 2017 +0300

--
 .../src/main/python/ambari_server/setupSecurity.py |  4 
 ambari-server/src/test/python/TestAmbariServer.py  | 13 +
 2 files changed, 1 insertion(+), 16 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/ambari/blob/bc57667d/ambari-server/src/main/python/ambari_server/setupSecurity.py
--
diff --git a/ambari-server/src/main/python/ambari_server/setupSecurity.py 
b/ambari-server/src/main/python/ambari_server/setupSecurity.py
index 1508d27..17d1025 100644
--- a/ambari-server/src/main/python/ambari_server/setupSecurity.py
+++ b/ambari-server/src/main/python/ambari_server/setupSecurity.py
@@ -270,10 +270,6 @@ class LdapSyncOptions:
 #
 def sync_ldap(options):
   logger.info("Sync users and groups with configured LDAP.")
-  if not is_root():
-err = 'Ambari-server sync-ldap should be run with ' \
-  'root-level privileges'
-raise FatalException(4, err)
 
   properties = get_ambari_properties()
 

http://git-wip-us.apache.org/repos/asf/ambari/blob/bc57667d/ambari-server/src/test/python/TestAmbariServer.py
--
diff --git a/ambari-server/src/test/python/TestAmbariServer.py 
b/ambari-server/src/test/python/TestAmbariServer.py
index 6bf5e43..dea7801 100644
--- a/ambari-server/src/test/python/TestAmbariServer.py
+++ b/ambari-server/src/test/python/TestAmbariServer.py
@@ -7714,13 +7714,12 @@ class TestAmbariServer(TestCase):
   @patch("urllib2.urlopen")
   @patch("urllib2.Request")
   @patch("base64.encodestring")
-  @patch("ambari_server.setupSecurity.is_root")
   @patch("ambari_server.setupSecurity.is_server_runing")
   @patch("ambari_server.setupSecurity.get_ambari_properties")
   @patch("ambari_server.setupSecurity.get_validated_string_input")
   @patch("ambari_server.setupSecurity.logger")
   def test_sync_ldap_forbidden(self, logger_mock, 
get_validated_string_input_method, get_ambari_properties_method,
-is_server_runing_method, is_root_method,
+is_server_runing_method,
 encodestring_method, request_constructor, 
urlopen_method):
 
 options = self._create_empty_options_mock()
@@ -7729,16 +7728,6 @@ class TestAmbariServer(TestCase):
 options.ldap_sync_users = None
 options.ldap_sync_groups = None
 
-is_root_method.return_value = False
-try:
-  sync_ldap(options)
-  self.fail("Should throw exception if not root")
-except FatalException as fe:
-  # Expected
-  self.assertTrue("root-level" in fe.reason)
-  pass
-is_root_method.return_value = True
-
 is_server_runing_method.return_value = (None, None)
 try:
   sync_ldap(options)



[2/3] ambari git commit: Merge remote-tracking branch 'origin/branch-2.5' into branch-2.5

2017-07-16 Thread echekanskiy
Merge remote-tracking branch 'origin/branch-2.5' into branch-2.5


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/d5392fd0
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/d5392fd0
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/d5392fd0

Branch: refs/heads/branch-2.5
Commit: d5392fd053e87f0e2a30934f1fdae375e5097da0
Parents: bc57667 2076801
Author: Eugene Chekanskiy 
Authored: Sun Jul 16 19:45:57 2017 +0300
Committer: Eugene Chekanskiy 
Committed: Sun Jul 16 19:45:57 2017 +0300

--
 .../controllers/groups/GroupsEditCtrl.js|3 +
 ambari-agent/conf/unix/install-helper.sh|8 +
 ambari-agent/etc/init.d/ambari-agent|   22 +-
 ambari-agent/pom.xml|   13 +-
 .../src/main/package/rpm/posttrans_agent.sh |7 +
 .../ambari_agent/AlertSchedulerHandler.py   |   10 +-
 .../python/ambari_agent/alerts/base_alert.py|8 +-
 .../python/ambari_agent/alerts/port_alert.py|  107 +-
 ambari-agent/src/packages/tarball/all.xml   |2 +-
 .../ambari_agent/TestAlertSchedulerHandler.py   |   17 +-
 .../test/python/ambari_agent/TestHeartbeat.py   |   40 +
 .../resource_management/TestPackageResource.py  |   41 +
 .../ambari_commons/resources/os_family.json |3 +-
 .../core/providers/package/__init__.py  |2 +-
 .../core/providers/package/apt.py   |   10 +-
 .../core/providers/package/choco.py |4 +-
 .../core/providers/package/yumrpm.py|9 +-
 .../core/providers/package/zypper.py|9 +-
 .../core/resources/packaging.py |6 +
 .../libraries/functions/conf_select.py  |   57 +-
 .../libraries/functions/mounted_dirs_helper.py  |1 +
 .../libraries/functions/stack_features.py   |   41 +-
 .../libraries/functions/stack_tools.py  |   54 +-
 .../libraries/providers/hdfs_resource.py|9 +-
 .../libraries/script/script.py  |   19 +-
 ambari-metrics/ambari-metrics-common/pom.xml|4 +
 .../timeline/AbstractTimelineMetricsSink.java   |   76 +-
 .../metrics2/sink/timeline/Precision.java   |2 +-
 .../cache/HandleConnectExceptionTest.java   |   76 +-
 .../ambari-metrics/datasource.js|5 +-
 .../kafka/KafkaTimelineMetricsReporter.java |6 +-
 .../timeline/HBaseTimelineMetricStore.java  |   15 +-
 .../metrics/timeline/PhoenixHBaseAccessor.java  |   69 +-
 .../timeline/TimelineMetricConfiguration.java   |   24 +
 .../timeline/TimelineMetricsAggregatorSink.java |   60 +
 .../timeline/PhoenixHBaseAccessorTest.java  |   73 +
 .../timeline/TestTimelineMetricStore.java   |1 +
 .../TimelineMetricsAggregatorMemorySink.java|  141 +
 ambari-server/pom.xml   |   10 +-
 ambari-server/sbin/ambari-server|6 +-
 ambari-server/src/main/assemblies/server.xml|   10 +
 .../apache/ambari/annotations/Experimental.java |6 +
 .../ambari/annotations/ExperimentalFeature.java |7 +-
 .../actionmanager/ExecutionCommandWrapper.java  |   46 +
 .../server/agent/AlertDefinitionCommand.java|7 +-
 .../ambari/server/agent/ExecutionCommand.java   |   15 +-
 .../ambari/server/agent/HeartBeatHandler.java   |4 +-
 .../alerts/ComponentVersionAlertRunnable.java   |4 +-
 .../eventcreator/UpgradeEventCreator.java   |2 +-
 .../ambari/server/checks/CheckDescription.java  |   42 +-
 .../checks/ComponentsExistInRepoCheck.java  |  142 +
 .../checks/DatabaseConsistencyCheckHelper.java  |   14 +
 .../ambari/server/checks/JavaVersionCheck.java  |  102 +
 .../server/checks/PreviousUpgradeCompleted.java |   11 +-
 .../controller/ActionExecutionContext.java  |   28 +
 .../controller/AmbariActionExecutionHelper.java |   23 +-
 .../AmbariCustomCommandExecutionHelper.java |   29 +-
 .../AmbariManagementControllerImpl.java |   14 +-
 .../server/controller/ExecuteCommandJson.java   |4 +
 .../server/controller/KerberosHelperImpl.java   |8 +-
 .../internal/AbstractProviderModule.java|   49 +-
 .../BlueprintConfigurationProcessor.java|  232 +-
 .../internal/ClientConfigResourceProvider.java  |2 +
 .../ClusterStackVersionResourceProvider.java|  187 +-
 .../PreUpgradeCheckResourceProvider.java|9 +-
 .../internal/RequestResourceProvider.java   |   12 +-
 .../internal/UpgradeResourceProvider.java   |  416 +-
 .../server/controller/jmx/JMXHostProvider.java  |   15 +
 .../controller/jmx/JMXPropertyProvider.java |   25 +
 .../listeners/upgrade/StackVersionListener.java |  225 +-
 .../system/impl/AmbariMetricSinkImpl.java   |3 +
 .../system/impl/DatabaseMetricsSource.java  |4 +-
 .../metrics/system/impl/JvmMetricsSource.java   |   12 +-
 

[3/3] ambari git commit: AMBARI-21483. Add UID/GID related enhancements (echekanskiy)

2017-07-16 Thread echekanskiy
AMBARI-21483. Add UID/GID related enhancements (echekanskiy)


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/aa729a5b
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/aa729a5b
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/aa729a5b

Branch: refs/heads/branch-2.5
Commit: aa729a5bb1f48df2eeddb7234373b0b90b850bc3
Parents: d5392fd
Author: Eugene Chekanskiy <echekans...@hortonworks.com>
Authored: Sun Jul 16 20:14:22 2017 +0300
Committer: Eugene Chekanskiy <echekans...@hortonworks.com>
Committed: Sun Jul 16 20:14:22 2017 +0300

--
 .../ambari/server/state/PropertyInfo.java   |   2 +
 .../hooks/before-ANY/files/changeToSecureUid.sh |  13 +-
 .../before-ANY/scripts/shared_initialization.py |  45 ++-
 .../2.0.6/hooks/before-ANY/test_before_any.py   | 294 +++
 .../app/controllers/wizard/step7_controller.js  |  67 +
 .../configs/stack_config_properties_mapper.js   |  14 +-
 ambari-web/app/styles/application.less  |  15 +
 ...ontrols_service_config_usergroup_with_id.hbs |  27 ++
 ambari-web/app/utils/config.js  |   3 +
 .../configs/service_configs_by_category_view.js |   6 +
 ambari-web/app/views/common/controls_view.js|  39 +++
 11 files changed, 392 insertions(+), 133 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/ambari/blob/aa729a5b/ambari-server/src/main/java/org/apache/ambari/server/state/PropertyInfo.java
--
diff --git 
a/ambari-server/src/main/java/org/apache/ambari/server/state/PropertyInfo.java 
b/ambari-server/src/main/java/org/apache/ambari/server/state/PropertyInfo.java
index 2ad92fd..bc89fc4 100644
--- 
a/ambari-server/src/main/java/org/apache/ambari/server/state/PropertyInfo.java
+++ 
b/ambari-server/src/main/java/org/apache/ambari/server/state/PropertyInfo.java
@@ -270,7 +270,9 @@ public class PropertyInfo {
   public enum PropertyType {
 PASSWORD,
 USER,
+UID,
 GROUP,
+GID,
 TEXT,
 ADDITIONAL_USER_PROPERTY,
 NOT_MANAGED_HDFS_PATH,

http://git-wip-us.apache.org/repos/asf/ambari/blob/aa729a5b/ambari-server/src/main/resources/stacks/HDP/2.0.6/hooks/before-ANY/files/changeToSecureUid.sh
--
diff --git 
a/ambari-server/src/main/resources/stacks/HDP/2.0.6/hooks/before-ANY/files/changeToSecureUid.sh
 
b/ambari-server/src/main/resources/stacks/HDP/2.0.6/hooks/before-ANY/files/changeToSecureUid.sh
index 08542c4..4663f10 100644
--- 
a/ambari-server/src/main/resources/stacks/HDP/2.0.6/hooks/before-ANY/files/changeToSecureUid.sh
+++ 
b/ambari-server/src/main/resources/stacks/HDP/2.0.6/hooks/before-ANY/files/changeToSecureUid.sh
@@ -21,6 +21,7 @@
 
 username=$1
 directories=$2
+newUid=$3
 
 function find_available_uid() {
  for ((i=1001; i<=2000; i++))
@@ -34,7 +35,16 @@ function find_available_uid() {
  done
 }
 
-find_available_uid
+if [ -z $2 ]; then
+  test $(id -u ${username} 2>/dev/null)
+  if [ $? -ne 1 ]; then
+   newUid=`id -u ${username}`
+  else
+   find_available_uid
+  fi
+  echo $newUid
+  exit 0
+fi
 
 if [ $newUid -eq 0 ]
 then
@@ -43,7 +53,6 @@ then
 fi
 
 set -e
-
 dir_array=($(echo $directories | sed 's/,/\n/g'))
 old_uid=$(id -u $username)
 sudo_prefix="/var/lib/ambari-agent/ambari-sudo.sh -H -E"

http://git-wip-us.apache.org/repos/asf/ambari/blob/aa729a5b/ambari-server/src/main/resources/stacks/HDP/2.0.6/hooks/before-ANY/scripts/shared_initialization.py
--
diff --git 
a/ambari-server/src/main/resources/stacks/HDP/2.0.6/hooks/before-ANY/scripts/shared_initialization.py
 
b/ambari-server/src/main/resources/stacks/HDP/2.0.6/hooks/before-ANY/scripts/shared_initialization.py
index 4d0de7f..886bc45 100644
--- 
a/ambari-server/src/main/resources/stacks/HDP/2.0.6/hooks/before-ANY/scripts/shared_initialization.py
+++ 
b/ambari-server/src/main/resources/stacks/HDP/2.0.6/hooks/before-ANY/scripts/shared_initialization.py
@@ -24,6 +24,7 @@ import tempfile
 from copy import copy
 from resource_management.libraries.functions.version import compare_versions
 from resource_management import *
+from resource_management.core import shell
 
 def setup_users():
   """
@@ -43,11 +44,17 @@ def setup_users():
   )
 
 for user in params.user_list:
-  User(user,
-  gid = params.user_to_gid_dict[user],
-  groups = params.user_to_groups_dict[user],
-  fetch_nonlocal_groups = params.fetch_nonlocal_groups
-  )
+  if params.override_uid == "true":
+User(user,
+ uid = get_uid(user),
+ gid = params.user_to_gid_dict[user],
+ groups = params.user_to_groups_dict[u

ambari git commit: AMBARI-21272. LDAP sync requires user to be root (echekanskiy)

2017-07-03 Thread echekanskiy
Repository: ambari
Updated Branches:
  refs/heads/trunk b5e40f9b2 -> 8e9df940f


AMBARI-21272. LDAP sync requires user to be root (echekanskiy)


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/8e9df940
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/8e9df940
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/8e9df940

Branch: refs/heads/trunk
Commit: 8e9df940fcc045c7b22cc571cd58d0a1e82dd530
Parents: b5e40f9
Author: Eugene Chekanskiy <echekans...@apache.org>
Authored: Mon Jul 3 13:15:56 2017 +0300
Committer: Eugene Chekanskiy <echekans...@apache.org>
Committed: Mon Jul 3 13:15:56 2017 +0300

--
 .../src/main/python/ambari_server/setupSecurity.py |  4 
 ambari-server/src/test/python/TestAmbariServer.py  | 13 +
 2 files changed, 1 insertion(+), 16 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/ambari/blob/8e9df940/ambari-server/src/main/python/ambari_server/setupSecurity.py
--
diff --git a/ambari-server/src/main/python/ambari_server/setupSecurity.py 
b/ambari-server/src/main/python/ambari_server/setupSecurity.py
index 1508d27..17d1025 100644
--- a/ambari-server/src/main/python/ambari_server/setupSecurity.py
+++ b/ambari-server/src/main/python/ambari_server/setupSecurity.py
@@ -270,10 +270,6 @@ class LdapSyncOptions:
 #
 def sync_ldap(options):
   logger.info("Sync users and groups with configured LDAP.")
-  if not is_root():
-err = 'Ambari-server sync-ldap should be run with ' \
-  'root-level privileges'
-raise FatalException(4, err)
 
   properties = get_ambari_properties()
 

http://git-wip-us.apache.org/repos/asf/ambari/blob/8e9df940/ambari-server/src/test/python/TestAmbariServer.py
--
diff --git a/ambari-server/src/test/python/TestAmbariServer.py 
b/ambari-server/src/test/python/TestAmbariServer.py
index c511237..1c4ebaf 100644
--- a/ambari-server/src/test/python/TestAmbariServer.py
+++ b/ambari-server/src/test/python/TestAmbariServer.py
@@ -7675,13 +7675,12 @@ class TestAmbariServer(TestCase):
   @patch("urllib2.urlopen")
   @patch("urllib2.Request")
   @patch("base64.encodestring")
-  @patch("ambari_server.setupSecurity.is_root")
   @patch("ambari_server.setupSecurity.is_server_runing")
   @patch("ambari_server.setupSecurity.get_ambari_properties")
   @patch("ambari_server.setupSecurity.get_validated_string_input")
   @patch("ambari_server.setupSecurity.logger")
   def test_sync_ldap_forbidden(self, logger_mock, 
get_validated_string_input_method, get_ambari_properties_method,
-is_server_runing_method, is_root_method,
+is_server_runing_method,
 encodestring_method, request_constructor, 
urlopen_method):
 
 options = self._create_empty_options_mock()
@@ -7690,16 +7689,6 @@ class TestAmbariServer(TestCase):
 options.ldap_sync_users = None
 options.ldap_sync_groups = None
 
-is_root_method.return_value = False
-try:
-  sync_ldap(options)
-  self.fail("Should throw exception if not root")
-except FatalException as fe:
-  # Expected
-  self.assertTrue("root-level" in fe.reason)
-  pass
-is_root_method.return_value = True
-
 is_server_runing_method.return_value = (None, None)
 try:
   sync_ldap(options)



ambari git commit: AMBARI-21272. LDAP sync requires user to be root (echekanskiy)

2017-07-03 Thread echekanskiy
Repository: ambari
Updated Branches:
  refs/heads/branch-2.5 b51be77aa -> d6d6d7404


AMBARI-21272. LDAP sync requires user to be root (echekanskiy)


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/d6d6d740
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/d6d6d740
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/d6d6d740

Branch: refs/heads/branch-2.5
Commit: d6d6d74047c2229e5976bee6b67c7301b8fe0321
Parents: b51be77
Author: Eugene Chekanskiy <echekans...@apache.org>
Authored: Mon Jul 3 13:13:36 2017 +0300
Committer: Eugene Chekanskiy <echekans...@apache.org>
Committed: Mon Jul 3 13:13:36 2017 +0300

--
 .../src/main/python/ambari_server/setupSecurity.py |  4 
 ambari-server/src/test/python/TestAmbariServer.py  | 13 +
 2 files changed, 1 insertion(+), 16 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/ambari/blob/d6d6d740/ambari-server/src/main/python/ambari_server/setupSecurity.py
--
diff --git a/ambari-server/src/main/python/ambari_server/setupSecurity.py 
b/ambari-server/src/main/python/ambari_server/setupSecurity.py
index 1508d27..17d1025 100644
--- a/ambari-server/src/main/python/ambari_server/setupSecurity.py
+++ b/ambari-server/src/main/python/ambari_server/setupSecurity.py
@@ -270,10 +270,6 @@ class LdapSyncOptions:
 #
 def sync_ldap(options):
   logger.info("Sync users and groups with configured LDAP.")
-  if not is_root():
-err = 'Ambari-server sync-ldap should be run with ' \
-  'root-level privileges'
-raise FatalException(4, err)
 
   properties = get_ambari_properties()
 

http://git-wip-us.apache.org/repos/asf/ambari/blob/d6d6d740/ambari-server/src/test/python/TestAmbariServer.py
--
diff --git a/ambari-server/src/test/python/TestAmbariServer.py 
b/ambari-server/src/test/python/TestAmbariServer.py
index 1ac77ab2..fb0bb70 100644
--- a/ambari-server/src/test/python/TestAmbariServer.py
+++ b/ambari-server/src/test/python/TestAmbariServer.py
@@ -7747,13 +7747,12 @@ class TestAmbariServer(TestCase):
   @patch("urllib2.urlopen")
   @patch("urllib2.Request")
   @patch("base64.encodestring")
-  @patch("ambari_server.setupSecurity.is_root")
   @patch("ambari_server.setupSecurity.is_server_runing")
   @patch("ambari_server.setupSecurity.get_ambari_properties")
   @patch("ambari_server.setupSecurity.get_validated_string_input")
   @patch("ambari_server.setupSecurity.logger")
   def test_sync_ldap_forbidden(self, logger_mock, 
get_validated_string_input_method, get_ambari_properties_method,
-is_server_runing_method, is_root_method,
+is_server_runing_method,
 encodestring_method, request_constructor, 
urlopen_method):
 
 options = self._create_empty_options_mock()
@@ -7762,16 +7761,6 @@ class TestAmbariServer(TestCase):
 options.ldap_sync_users = None
 options.ldap_sync_groups = None
 
-is_root_method.return_value = False
-try:
-  sync_ldap(options)
-  self.fail("Should throw exception if not root")
-except FatalException as fe:
-  # Expected
-  self.assertTrue("root-level" in fe.reason)
-  pass
-is_root_method.return_value = True
-
 is_server_runing_method.return_value = (None, None)
 try:
   sync_ldap(options)



ambari git commit: AMBARI-21238. Kafka userprincipal to shortname is not using AUTH_TO_LOCAL rules for authorization (echekanskiy)

2017-06-14 Thread echekanskiy
Repository: ambari
Updated Branches:
  refs/heads/branch-2.5 2461ddbcd -> 80d804849


AMBARI-21238. Kafka userprincipal to shortname is not using AUTH_TO_LOCAL rules for authorization (echekanskiy)


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/80d80484
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/80d80484
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/80d80484

Branch: refs/heads/branch-2.5
Commit: 80d8048496080b84a90dac822948d1ae6d39bf80
Parents: 2461ddb
Author: Eugene Chekanskiy <echekans...@hortonworks.com>
Authored: Wed Jun 14 16:34:48 2017 +0300
Committer: Eugene Chekanskiy <echekans...@hortonworks.com>
Committed: Wed Jun 14 16:34:48 2017 +0300

--
 .../ambari/server/controller/AuthToLocalBuilder.java  | 10 --
 .../resources/common-services/KAFKA/0.9.0/kerberos.json   |  3 +++
 .../resources/stacks/HDP/2.5/services/KAFKA/kerberos.json |  3 +++
 3 files changed, 14 insertions(+), 2 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/ambari/blob/80d80484/ambari-server/src/main/java/org/apache/ambari/server/controller/AuthToLocalBuilder.java
--
diff --git 
a/ambari-server/src/main/java/org/apache/ambari/server/controller/AuthToLocalBuilder.java
 
b/ambari-server/src/main/java/org/apache/ambari/server/controller/AuthToLocalBuilder.java
index 1fb912e..7e706ff 100644
--- 
a/ambari-server/src/main/java/org/apache/ambari/server/controller/AuthToLocalBuilder.java
+++ 
b/ambari-server/src/main/java/org/apache/ambari/server/controller/AuthToLocalBuilder.java
@@ -242,6 +242,9 @@ public class AuthToLocalBuilder implements Cloneable {
 case SPACES:
   stringBuilder.append(" ");
   break;
+case COMMA:
+  stringBuilder.append(",");
+  break;
 default:
   throw new UnsupportedOperationException(String.format("The 
auth-to-local rule concatenation type is not supported: %s",
   concatenationType.name()));
@@ -661,8 +664,11 @@ public class AuthToLocalBuilder implements Cloneable {
 /**
  * Each rule is appended to the set of rules using a space - the ruleset 
exists on a single line
  */
-SPACES;
-
+SPACES,
+/**
+ * Each rule is appended to the set of rules using comma - the ruleset 
exists on a single line.
+ */
+COMMA;
 /**
  * Translate a string declaring a concatenation type to the enumerated 
value.
  * 

http://git-wip-us.apache.org/repos/asf/ambari/blob/80d80484/ambari-server/src/main/resources/common-services/KAFKA/0.9.0/kerberos.json
--
diff --git 
a/ambari-server/src/main/resources/common-services/KAFKA/0.9.0/kerberos.json 
b/ambari-server/src/main/resources/common-services/KAFKA/0.9.0/kerberos.json
index 60fa959..3512656 100644
--- a/ambari-server/src/main/resources/common-services/KAFKA/0.9.0/kerberos.json
+++ b/ambari-server/src/main/resources/common-services/KAFKA/0.9.0/kerberos.json
@@ -18,6 +18,9 @@
   }
 }
   ],
+  "auth_to_local_properties" : [
+"kafka-broker/sasl.kerberos.principal.to.local.rules|comma"
+  ],
   "components": [
 {
   "name": "KAFKA_BROKER",

http://git-wip-us.apache.org/repos/asf/ambari/blob/80d80484/ambari-server/src/main/resources/stacks/HDP/2.5/services/KAFKA/kerberos.json
--
diff --git 
a/ambari-server/src/main/resources/stacks/HDP/2.5/services/KAFKA/kerberos.json 
b/ambari-server/src/main/resources/stacks/HDP/2.5/services/KAFKA/kerberos.json
index 501f969..be64e70 100644
--- 
a/ambari-server/src/main/resources/stacks/HDP/2.5/services/KAFKA/kerberos.json
+++ 
b/ambari-server/src/main/resources/stacks/HDP/2.5/services/KAFKA/kerberos.json
@@ -28,6 +28,9 @@
   }
 }
   ],
+  "auth_to_local_properties" : [
+"kafka-broker/sasl.kerberos.principal.to.local.rules|comma"
+  ],
   "components": [
 {
   "name": "KAFKA_BROKER",



ambari git commit: AMBARI-21238. Kafka userprincipal to shortname is not using AUTH_TO_LOCAL rules for authorization (echekanskiy)

2017-06-14 Thread echekanskiy
Repository: ambari
Updated Branches:
  refs/heads/trunk 03812cba4 -> f1e89e4d2


AMBARI-21238. Kafka userprincipal to shortname is not using AUTH_TO_LOCAL rules for authorization (echekanskiy)


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/f1e89e4d
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/f1e89e4d
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/f1e89e4d

Branch: refs/heads/trunk
Commit: f1e89e4d2cc81d8ba14d465fd5badfe77f329a69
Parents: 03812cb
Author: Eugene Chekanskiy <echekans...@hortonworks.com>
Authored: Wed Jun 14 16:31:28 2017 +0300
Committer: Eugene Chekanskiy <echekans...@hortonworks.com>
Committed: Wed Jun 14 16:31:28 2017 +0300

--
 .../ambari/server/controller/AuthToLocalBuilder.java  | 10 --
 .../common-services/KAFKA/0.10.0.3.0/kerberos.json|  3 +++
 .../resources/common-services/KAFKA/0.10.0/kerberos.json  |  3 +++
 .../resources/common-services/KAFKA/0.9.0/kerberos.json   |  3 +++
 .../resources/stacks/HDF/2.0/services/KAFKA/kerberos.json |  3 +++
 5 files changed, 20 insertions(+), 2 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/ambari/blob/f1e89e4d/ambari-server/src/main/java/org/apache/ambari/server/controller/AuthToLocalBuilder.java
--
diff --git 
a/ambari-server/src/main/java/org/apache/ambari/server/controller/AuthToLocalBuilder.java
 
b/ambari-server/src/main/java/org/apache/ambari/server/controller/AuthToLocalBuilder.java
index 33c8f3b..1d4abdd 100644
--- 
a/ambari-server/src/main/java/org/apache/ambari/server/controller/AuthToLocalBuilder.java
+++ 
b/ambari-server/src/main/java/org/apache/ambari/server/controller/AuthToLocalBuilder.java
@@ -242,6 +242,9 @@ public class AuthToLocalBuilder implements Cloneable {
 case SPACES:
   stringBuilder.append(" ");
   break;
+case COMMA:
+  stringBuilder.append(",");
+  break;
 default:
   throw new UnsupportedOperationException(String.format("The 
auth-to-local rule concatenation type is not supported: %s",
   concatenationType.name()));
@@ -661,8 +664,11 @@ public class AuthToLocalBuilder implements Cloneable {
 /**
  * Each rule is appended to the set of rules using a space - the ruleset 
exists on a single line
  */
-SPACES;
-
+SPACES,
+/**
+ * Each rule is appended to the set of rules using comma - the ruleset 
exists on a single line.
+ */
+COMMA;
 /**
  * Translate a string declaring a concatenation type to the enumerated 
value.
  * 

http://git-wip-us.apache.org/repos/asf/ambari/blob/f1e89e4d/ambari-server/src/main/resources/common-services/KAFKA/0.10.0.3.0/kerberos.json
--
diff --git 
a/ambari-server/src/main/resources/common-services/KAFKA/0.10.0.3.0/kerberos.json
 
b/ambari-server/src/main/resources/common-services/KAFKA/0.10.0.3.0/kerberos.json
index eb31ad6..b4d0018 100644
--- 
a/ambari-server/src/main/resources/common-services/KAFKA/0.10.0.3.0/kerberos.json
+++ 
b/ambari-server/src/main/resources/common-services/KAFKA/0.10.0.3.0/kerberos.json
@@ -29,6 +29,9 @@
   }
 }
   ],
+  "auth_to_local_properties" : [
+"kafka-broker/sasl.kerberos.principal.to.local.rules|comma"
+  ],
   "components": [
 {
   "name": "KAFKA_BROKER",

http://git-wip-us.apache.org/repos/asf/ambari/blob/f1e89e4d/ambari-server/src/main/resources/common-services/KAFKA/0.10.0/kerberos.json
--
diff --git 
a/ambari-server/src/main/resources/common-services/KAFKA/0.10.0/kerberos.json 
b/ambari-server/src/main/resources/common-services/KAFKA/0.10.0/kerberos.json
index eb31ad6..b4d0018 100644
--- 
a/ambari-server/src/main/resources/common-services/KAFKA/0.10.0/kerberos.json
+++ 
b/ambari-server/src/main/resources/common-services/KAFKA/0.10.0/kerberos.json
@@ -29,6 +29,9 @@
   }
 }
   ],
+  "auth_to_local_properties" : [
+"kafka-broker/sasl.kerberos.principal.to.local.rules|comma"
+  ],
   "components": [
 {
   "name": "KAFKA_BROKER",

http://git-wip-us.apache.org/repos/asf/ambari/blob/f1e89e4d/ambari-server/src/main/resources/common-services/KAFKA/0.9.0/kerberos.json
--
diff --git 
a/ambari-server/src/main/resources/common-services/KAFKA/0.9.0/kerberos.json 
b/ambari-server/src/main/resources/common-services/KAFKA/0.9.0/kerberos.json
index 7500891..247a602 100

ambari git commit: AMBARI-21101. HDP 3.0 install fails due to 'Script' has no attribute 'get_force_https_protocol' because function was renamed (echekanskiy)

2017-05-24 Thread echekanskiy
Repository: ambari
Updated Branches:
  refs/heads/trunk 90abffd70 -> 7bb2aba5c


AMBARI-21101. HDP 3.0 install fails due to 'Script' has no attribute 'get_force_https_protocol' because function was renamed (echekanskiy)


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/7bb2aba5
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/7bb2aba5
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/7bb2aba5

Branch: refs/heads/trunk
Commit: 7bb2aba5cc67e65e586352efdf307cf67dbfd416
Parents: 90abffd
Author: Eugene Chekanskiy <echekans...@hortonworks.com>
Authored: Wed May 24 18:25:32 2017 +0300
Committer: Eugene Chekanskiy <echekans...@hortonworks.com>
Committed: Wed May 24 18:25:32 2017 +0300

--
 .../common-services/HDFS/3.0.0.3.0/package/scripts/utils.py| 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/ambari/blob/7bb2aba5/ambari-server/src/main/resources/common-services/HDFS/3.0.0.3.0/package/scripts/utils.py
--
diff --git 
a/ambari-server/src/main/resources/common-services/HDFS/3.0.0.3.0/package/scripts/utils.py
 
b/ambari-server/src/main/resources/common-services/HDFS/3.0.0.3.0/package/scripts/utils.py
index ab4308c..53774c6 100644
--- 
a/ambari-server/src/main/resources/common-services/HDFS/3.0.0.3.0/package/scripts/utils.py
+++ 
b/ambari-server/src/main/resources/common-services/HDFS/3.0.0.3.0/package/scripts/utils.py
@@ -38,7 +38,7 @@ from resource_management.libraries.script.script import Script
 from ambari_commons.inet_utils import ensure_ssl_using_protocol
 from zkfc_slave import ZkfcSlaveDefault
 
-ensure_ssl_using_protocol(Script.get_force_https_protocol())
+ensure_ssl_using_protocol(Script.get_force_https_protocol_name())
 
 def safe_zkfc_op(action, env):
   """



ambari git commit: AMBARI-20953. Insecure SSL: Server Identity Verification Disabled (echekanskiy)

2017-05-08 Thread echekanskiy
Repository: ambari
Updated Branches:
  refs/heads/trunk e20305bc4 -> 84586cc07


AMBARI-20953. Insecure SSL: Server Identity Verification Disabled (echekanskiy)


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/84586cc0
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/84586cc0
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/84586cc0

Branch: refs/heads/trunk
Commit: 84586cc07860b9bf3434c07b427e3c8a067a6351
Parents: e20305b
Author: Eugene Chekanskiy <echekans...@hortonworks.com>
Authored: Mon May 8 22:33:17 2017 +0300
Committer: Eugene Chekanskiy <echekans...@hortonworks.com>
Committed: Mon May 8 22:33:17 2017 +0300

--
 .../main/python/ambari_agent/AmbariConfig.py| 67 +++-
 .../ambari_agent/CustomServiceOrchestrator.py   |  3 +-
 .../src/main/python/ambari_agent/NetUtil.py |  5 +-
 .../python/ambari_agent/alerts/web_alert.py |  5 +-
 .../main/python/ambari_commons/inet_utils.py| 43 +
 .../libraries/functions/curl_krb_request.py | 17 -
 .../libraries/script/script.py  | 18 +-
 .../HDFS/2.1.0.2.0/package/scripts/utils.py |  5 +-
 8 files changed, 114 insertions(+), 49 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/ambari/blob/84586cc0/ambari-agent/src/main/python/ambari_agent/AmbariConfig.py
--
diff --git a/ambari-agent/src/main/python/ambari_agent/AmbariConfig.py 
b/ambari-agent/src/main/python/ambari_agent/AmbariConfig.py
index 4118ece..95e4712 100644
--- a/ambari-agent/src/main/python/ambari_agent/AmbariConfig.py
+++ b/ambari-agent/src/main/python/ambari_agent/AmbariConfig.py
@@ -27,6 +27,7 @@ import os
 
 from ambari_commons import OSConst
 from ambari_commons.os_family_impl import OsFamilyFuncImpl, OsFamilyImpl
+
 logger = logging.getLogger(__name__)
 
 content = """
@@ -74,9 +75,8 @@ log_command_executes = 0
 
 """.format(ps=os.sep)
 
-
 servicesToPidNames = {
-  'GLUSTERFS' : 'glusterd.pid$',
+  'GLUSTERFS': 'glusterd.pid$',
   'NAMENODE': 'hadoop-{USER}-namenode.pid$',
   'SECONDARY_NAMENODE': 'hadoop-{USER}-secondarynamenode.pid$',
   'DATANODE': 'hadoop-{USER}-datanode.pid$',
@@ -97,13 +97,13 @@ servicesToPidNames = {
   'KERBEROS_SERVER': 'kadmind.pid',
   'HIVE_SERVER': 'hive-server.pid',
   'HIVE_METASTORE': 'hive.pid',
-  'HIVE_SERVER_INTERACTIVE' : 'hive-interactive.pid',
+  'HIVE_SERVER_INTERACTIVE': 'hive-interactive.pid',
   'MYSQL_SERVER': 'mysqld.pid',
   'HUE_SERVER': '/var/run/hue/supervisor.pid',
   'WEBHCAT_SERVER': 'webhcat.pid',
 }
 
-#Each service, which's pid depends on user should provide user mapping
+# Each service, which's pid depends on user should provide user mapping
 servicesToLinuxUser = {
   'NAMENODE': 'hdfs_user',
   'SECONDARY_NAMENODE': 'hdfs_user',
@@ -120,30 +120,30 @@ servicesToLinuxUser = {
 }
 
 pidPathVars = [
-  {'var' : 'glusterfs_pid_dir_prefix',
-   'defaultValue' : '/var/run'},
-  {'var' : 'hadoop_pid_dir_prefix',
-   'defaultValue' : '/var/run/hadoop'},
-  {'var' : 'hadoop_pid_dir_prefix',
-   'defaultValue' : '/var/run/hadoop'},
-  {'var' : 'hbase_pid_dir',
-   'defaultValue' : '/var/run/hbase'},
-  {'var' : 'zk_pid_dir',
-   'defaultValue' : '/var/run/zookeeper'},
-  {'var' : 'oozie_pid_dir',
-   'defaultValue' : '/var/run/oozie'},
-  {'var' : 'hcat_pid_dir',
-   'defaultValue' : '/var/run/webhcat'},
-  {'var' : 'hive_pid_dir',
-   'defaultValue' : '/var/run/hive'},
-  {'var' : 'mysqld_pid_dir',
-   'defaultValue' : '/var/run/mysqld'},
-  {'var' : 'hcat_pid_dir',
-   'defaultValue' : '/var/run/webhcat'},
-  {'var' : 'yarn_pid_dir_prefix',
-   'defaultValue' : '/var/run/hadoop-yarn'},
-  {'var' : 'mapred_pid_dir_prefix',
-   'defaultValue' : '/var/run/hadoop-mapreduce'},
+  {'var': 'glusterfs_pid_dir_prefix',
+   'defaultValue': '/var/run'},
+  {'var': 'hadoop_pid_dir_prefix',
+   'defaultValue': '/var/run/hadoop'},
+  {'var': 'hadoop_pid_dir_prefix',
+   'defaultValue': '/var/run/hadoop'},
+  {'var': 'hbase_pid_dir',
+   'defaultValue': '/var/run/hbase'},
+  {'var': 'zk_pid_dir',
+   'defaultValue': '/var/run/zookeeper'},
+  {'var': 'oozie_pid_dir',
+   'defaultValue': '/var/run/oozie'},
+  {'var': 'hcat_pid_dir',
+   'defaultValue': '/var/run/webhcat'},
+  {'var': 'hive_pid_dir',
+   'defaultValue': '/var/run/hive'},
+  {'var': 'mysqld_pid_dir',
+   'defaultValue': '/var/run/mysqld'},
+  {'var': 'hcat_pid_dir',
+   'defaultValue': '/var/run/webhcat'},
+  {'var': 'yarn_pid_dir_prefix',
+   'defaultValue': '/var/run/hadoop-yarn'},
+  {'var': 'mapred_pid_dir_prefix',
+   'defaultValue': '/var/run/hadoop-mapreduce'},
 ]
 
 
@@ -323,7 +323,7 @@ class AmbariConfig:
 if reg_resp and AmbariConfig.AMBARI_PROPERTIES_CATEGOR

ambari git commit: AMBARI-20953. Insecure SSL: Server Identity Verification Disabled (echekanskiy)

2017-05-08 Thread echekanskiy
Repository: ambari
Updated Branches:
  refs/heads/branch-2.5 ed42f68f0 -> d1a4cea57


AMBARI-20953. Insecure SSL: Server Identity Verification Disabled (echekanskiy)


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/d1a4cea5
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/d1a4cea5
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/d1a4cea5

Branch: refs/heads/branch-2.5
Commit: d1a4cea57df44cb54344b5b2704f35c2cb5f4714
Parents: ed42f68
Author: Eugene Chekanskiy <echekans...@hortonworks.com>
Authored: Mon May 8 22:31:38 2017 +0300
Committer: Eugene Chekanskiy <echekans...@hortonworks.com>
Committed: Mon May 8 22:31:38 2017 +0300

--
 .../main/python/ambari_agent/AmbariConfig.py| 67 +++-
 .../ambari_agent/CustomServiceOrchestrator.py   |  3 +-
 .../src/main/python/ambari_agent/NetUtil.py |  5 +-
 .../python/ambari_agent/alerts/web_alert.py |  5 +-
 .../main/python/ambari_commons/inet_utils.py| 43 +
 .../libraries/functions/curl_krb_request.py | 17 -
 .../libraries/script/script.py  | 18 +-
 .../HDFS/2.1.0.2.0/package/scripts/utils.py |  5 +-
 8 files changed, 114 insertions(+), 49 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/ambari/blob/d1a4cea5/ambari-agent/src/main/python/ambari_agent/AmbariConfig.py
--
diff --git a/ambari-agent/src/main/python/ambari_agent/AmbariConfig.py 
b/ambari-agent/src/main/python/ambari_agent/AmbariConfig.py
index 16fb97d..f6773d5 100644
--- a/ambari-agent/src/main/python/ambari_agent/AmbariConfig.py
+++ b/ambari-agent/src/main/python/ambari_agent/AmbariConfig.py
@@ -27,6 +27,7 @@ import os
 
 from ambari_commons import OSConst
 from ambari_commons.os_family_impl import OsFamilyFuncImpl, OsFamilyImpl
+
 logger = logging.getLogger(__name__)
 
 content = """
@@ -74,9 +75,8 @@ log_command_executes = 0
 
 """.format(ps=os.sep)
 
-
 servicesToPidNames = {
-  'GLUSTERFS' : 'glusterd.pid$',
+  'GLUSTERFS': 'glusterd.pid$',
   'NAMENODE': 'hadoop-{USER}-namenode.pid$',
   'SECONDARY_NAMENODE': 'hadoop-{USER}-secondarynamenode.pid$',
   'DATANODE': 'hadoop-{USER}-datanode.pid$',
@@ -97,13 +97,13 @@ servicesToPidNames = {
   'KERBEROS_SERVER': 'kadmind.pid',
   'HIVE_SERVER': 'hive-server.pid',
   'HIVE_METASTORE': 'hive.pid',
-  'HIVE_SERVER_INTERACTIVE' : 'hive-interactive.pid',
+  'HIVE_SERVER_INTERACTIVE': 'hive-interactive.pid',
   'MYSQL_SERVER': 'mysqld.pid',
   'HUE_SERVER': '/var/run/hue/supervisor.pid',
   'WEBHCAT_SERVER': 'webhcat.pid',
 }
 
-#Each service, which's pid depends on user should provide user mapping
+# Each service, which's pid depends on user should provide user mapping
 servicesToLinuxUser = {
   'NAMENODE': 'hdfs_user',
   'SECONDARY_NAMENODE': 'hdfs_user',
@@ -120,30 +120,30 @@ servicesToLinuxUser = {
 }
 
 pidPathVars = [
-  {'var' : 'glusterfs_pid_dir_prefix',
-   'defaultValue' : '/var/run'},
-  {'var' : 'hadoop_pid_dir_prefix',
-   'defaultValue' : '/var/run/hadoop'},
-  {'var' : 'hadoop_pid_dir_prefix',
-   'defaultValue' : '/var/run/hadoop'},
-  {'var' : 'hbase_pid_dir',
-   'defaultValue' : '/var/run/hbase'},
-  {'var' : 'zk_pid_dir',
-   'defaultValue' : '/var/run/zookeeper'},
-  {'var' : 'oozie_pid_dir',
-   'defaultValue' : '/var/run/oozie'},
-  {'var' : 'hcat_pid_dir',
-   'defaultValue' : '/var/run/webhcat'},
-  {'var' : 'hive_pid_dir',
-   'defaultValue' : '/var/run/hive'},
-  {'var' : 'mysqld_pid_dir',
-   'defaultValue' : '/var/run/mysqld'},
-  {'var' : 'hcat_pid_dir',
-   'defaultValue' : '/var/run/webhcat'},
-  {'var' : 'yarn_pid_dir_prefix',
-   'defaultValue' : '/var/run/hadoop-yarn'},
-  {'var' : 'mapred_pid_dir_prefix',
-   'defaultValue' : '/var/run/hadoop-mapreduce'},
+  {'var': 'glusterfs_pid_dir_prefix',
+   'defaultValue': '/var/run'},
+  {'var': 'hadoop_pid_dir_prefix',
+   'defaultValue': '/var/run/hadoop'},
+  {'var': 'hadoop_pid_dir_prefix',
+   'defaultValue': '/var/run/hadoop'},
+  {'var': 'hbase_pid_dir',
+   'defaultValue': '/var/run/hbase'},
+  {'var': 'zk_pid_dir',
+   'defaultValue': '/var/run/zookeeper'},
+  {'var': 'oozie_pid_dir',
+   'defaultValue': '/var/run/oozie'},
+  {'var': 'hcat_pid_dir',
+   'defaultValue': '/var/run/webhcat'},
+  {'var': 'hive_pid_dir',
+   'defaultValue': '/var/run/hive'},
+  {'var': 'mysqld_pid_dir',
+   'defaultValue': '/var/run/mysqld'},
+  {'var': 'hcat_pid_dir',
+   'defaultValue': '/var/run/webhcat'},
+  {'var': 'yarn_pid_dir_prefix',
+   'defaultValue': '/var/run/hadoop-yarn'},
+  {'var': 'mapred_pid_dir_prefix',
+   'defaultValue': '/var/run/hadoop-mapreduce'},
 ]
 
 
@@ -311,7 +311,7 @@ class AmbariConfig:
 if reg_resp and AmbariConfig.AMBARI_PROPERTIES_CATEGOR

ambari git commit: AMBARI-20857. After WE is enabled, graphana fails to start with SSL23_GET_SERVER_HELLO:tlsv1 alert protocol version (echekanskiy)

2017-05-03 Thread echekanskiy
Repository: ambari
Updated Branches:
  refs/heads/trunk 4134bbfb0 -> b733e70fd


AMBARI-20857. After WE is enabled, graphana fails to start with SSL23_GET_SERVER_HELLO:tlsv1 alert protocol version (echekanskiy)


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/b733e70f
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/b733e70f
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/b733e70f

Branch: refs/heads/trunk
Commit: b733e70fd3aecb05e7817dcbd5e118a37381ace6
Parents: 4134bbf
Author: Eugene Chekanskiy <echekans...@hortonworks.com>
Authored: Wed May 3 13:19:35 2017 +0300
Committer: Eugene Chekanskiy <echekans...@hortonworks.com>
Committed: Wed May 3 13:19:35 2017 +0300

--
 .../main/python/ambari_agent/AmbariConfig.py| 16 +-
 .../ambari_agent/CustomServiceOrchestrator.py   |  2 +-
 .../src/main/python/ambari_agent/NetUtil.py |  2 +-
 .../python/ambari_agent/alerts/web_alert.py |  2 +-
 .../src/main/python/ambari_commons/network.py   | 20 ++-
 .../libraries/script/script.py  | 57 
 .../package/scripts/metrics_grafana_util.py | 37 +
 .../0.1.0/package/scripts/service_check.py  | 22 +---
 .../package/alerts/alert_metrics_deviation.py   | 10 +++-
 .../HDFS/2.1.0.2.0/package/scripts/params.py|  2 +-
 .../HDFS/2.1.0.2.0/package/scripts/utils.py |  2 +-
 .../0.8/services/HDFS/package/scripts/params.py |  2 +-
 .../stacks/2.5/RANGER_KMS/test_kms_server.py| 13 -
 13 files changed, 122 insertions(+), 65 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/ambari/blob/b733e70f/ambari-agent/src/main/python/ambari_agent/AmbariConfig.py
--
diff --git a/ambari-agent/src/main/python/ambari_agent/AmbariConfig.py 
b/ambari-agent/src/main/python/ambari_agent/AmbariConfig.py
index cf48189..4118ece 100644
--- a/ambari-agent/src/main/python/ambari_agent/AmbariConfig.py
+++ b/ambari-agent/src/main/python/ambari_agent/AmbariConfig.py
@@ -328,9 +328,23 @@ class AmbariConfig:
 logger.info("Updating config property (%s) with value (%s)", k, v)
 pass
 
-  def get_force_https_protocol(self):
+  def get_force_https_protocol_name(self):
+"""
+Get forced https protocol name.
+
+:return: protocol name, PROTOCOL_TLSv1 by default
+"""
 return self.get('security', 'force_https_protocol', 
default="PROTOCOL_TLSv1")
 
+  def get_force_https_protocol_value(self):
+"""
+Get forced https protocol value that correspondents to ssl module variable.
+
+:return: protocol value
+"""
+import ssl
+return getattr(ssl, self.get_force_https_protocol_name())
+
 def isSameHostList(hostlist1, hostlist2):
   is_same = True
 

http://git-wip-us.apache.org/repos/asf/ambari/blob/b733e70f/ambari-agent/src/main/python/ambari_agent/CustomServiceOrchestrator.py
--
diff --git 
a/ambari-agent/src/main/python/ambari_agent/CustomServiceOrchestrator.py 
b/ambari-agent/src/main/python/ambari_agent/CustomServiceOrchestrator.py
index 8b8a8f9..4c13eb3 100644
--- a/ambari-agent/src/main/python/ambari_agent/CustomServiceOrchestrator.py
+++ b/ambari-agent/src/main/python/ambari_agent/CustomServiceOrchestrator.py
@@ -81,7 +81,7 @@ class CustomServiceOrchestrator():
   def __init__(self, config, controller):
 self.config = config
 self.tmp_dir = config.get('agent', 'prefix')
-self.force_https_protocol = config.get_force_https_protocol()
+self.force_https_protocol = config.get_force_https_protocol_name()
 self.exec_tmp_dir = Constants.AGENT_TMP_DIR
 self.file_cache = FileCache(config)
 self.status_commands_stdout = os.path.join(self.tmp_dir,

http://git-wip-us.apache.org/repos/asf/ambari/blob/b733e70f/ambari-agent/src/main/python/ambari_agent/NetUtil.py
--
diff --git a/ambari-agent/src/main/python/ambari_agent/NetUtil.py 
b/ambari-agent/src/main/python/ambari_agent/NetUtil.py
index 9b29633..fc19605 100644
--- a/ambari-agent/src/main/python/ambari_agent/NetUtil.py
+++ b/ambari-agent/src/main/python/ambari_agent/NetUtil.py
@@ -29,7 +29,7 @@ LOG_REQUEST_MESSAGE = "GET %s -> %s, body: %s"
 
 logger = logging.getLogger(__name__)
 
-ensure_ssl_using_protocol(AmbariConfig.get_resolved_config().get_force_https_protocol())
+ensure_ssl_using_protocol(AmbariConfig.get_resolved_config().get_force_https_protocol_name())
 
 class NetUtil:
 

http://git-wip-us.apache.org/repos/asf/ambari/blob/b733e70f/ambari-agent/src/main/python/ambari_agent/alerts/web_alert.py
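
The split between the two accessors keeps the configured protocol as a plain string and resolves it to an ssl constant only where needed: get_force_https_protocol_name() returns the name (PROTOCOL_TLSv1 by default) for helpers such as ensure_ssl_using_protocol, while get_force_https_protocol_value() maps it onto the ssl module. A minimal standalone sketch of that mapping, with a hypothetical configured value:

import ssl

protocol_name = "PROTOCOL_TLSv1_2"             # hypothetical force_https_protocol setting
protocol_value = getattr(ssl, protocol_name)   # resolves to ssl.PROTOCOL_TLSv1_2
assert protocol_value == ssl.PROTOCOL_TLSv1_2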

ambari git commit: AMBARI-20857. After WE is enabled, graphana fails to start with SSL23_GET_SERVER_HELLO:tlsv1 alert protocol version (echekanskiy)

2017-05-03 Thread echekanskiy
Repository: ambari
Updated Branches:
  refs/heads/branch-2.5 aa34023e5 -> 5e4b94bbb


AMBARI-20857. After WE is enabled, graphana fails to start with SSL23_GET_SERVER_HELLO:tlsv1 alert protocol version (echekanskiy)


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/5e4b94bb
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/5e4b94bb
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/5e4b94bb

Branch: refs/heads/branch-2.5
Commit: 5e4b94bbb826dcbae40d2c8b6a599d6c5f09ac75
Parents: aa34023
Author: Eugene Chekanskiy <echekans...@hortonworks.com>
Authored: Wed May 3 13:16:03 2017 +0300
Committer: Eugene Chekanskiy <echekans...@hortonworks.com>
Committed: Wed May 3 13:16:03 2017 +0300

--
 .../main/python/ambari_agent/AmbariConfig.py| 16 +-
 .../ambari_agent/CustomServiceOrchestrator.py   |  2 +-
 .../src/main/python/ambari_agent/NetUtil.py |  2 +-
 .../python/ambari_agent/alerts/web_alert.py |  2 +-
 .../src/main/python/ambari_commons/network.py   | 20 ++-
 .../libraries/script/script.py  | 57 
 .../package/scripts/metrics_grafana_util.py | 37 +
 .../0.1.0/package/scripts/service_check.py  | 22 +---
 .../package/alerts/alert_metrics_deviation.py   | 10 +++-
 .../HDFS/2.1.0.2.0/package/scripts/params.py|  2 +-
 .../HDFS/2.1.0.2.0/package/scripts/utils.py |  2 +-
 .../0.8/services/HDFS/package/scripts/params.py |  2 +-
 .../stacks/2.5/RANGER_KMS/test_kms_server.py| 13 -
 13 files changed, 122 insertions(+), 65 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/ambari/blob/5e4b94bb/ambari-agent/src/main/python/ambari_agent/AmbariConfig.py
--
diff --git a/ambari-agent/src/main/python/ambari_agent/AmbariConfig.py 
b/ambari-agent/src/main/python/ambari_agent/AmbariConfig.py
index 2a6023f..16fb97d 100644
--- a/ambari-agent/src/main/python/ambari_agent/AmbariConfig.py
+++ b/ambari-agent/src/main/python/ambari_agent/AmbariConfig.py
@@ -316,9 +316,23 @@ class AmbariConfig:
 logger.info("Updating config property (%s) with value (%s)", k, v)
 pass
 
-  def get_force_https_protocol(self):
+  def get_force_https_protocol_name(self):
+"""
+Get forced https protocol name.
+
+:return: protocol name, PROTOCOL_TLSv1 by default
+"""
 return self.get('security', 'force_https_protocol', 
default="PROTOCOL_TLSv1")
 
+  def get_force_https_protocol_value(self):
+"""
+Get forced https protocol value that correspondents to ssl module variable.
+
+:return: protocol value
+"""
+import ssl
+return getattr(ssl, self.get_force_https_protocol_name())
+
 def isSameHostList(hostlist1, hostlist2):
   is_same = True
 

http://git-wip-us.apache.org/repos/asf/ambari/blob/5e4b94bb/ambari-agent/src/main/python/ambari_agent/CustomServiceOrchestrator.py
--
diff --git 
a/ambari-agent/src/main/python/ambari_agent/CustomServiceOrchestrator.py 
b/ambari-agent/src/main/python/ambari_agent/CustomServiceOrchestrator.py
index e6523e5..28574cc 100644
--- a/ambari-agent/src/main/python/ambari_agent/CustomServiceOrchestrator.py
+++ b/ambari-agent/src/main/python/ambari_agent/CustomServiceOrchestrator.py
@@ -81,7 +81,7 @@ class CustomServiceOrchestrator():
   def __init__(self, config, controller):
 self.config = config
 self.tmp_dir = config.get('agent', 'prefix')
-self.force_https_protocol = config.get_force_https_protocol()
+self.force_https_protocol = config.get_force_https_protocol_name()
 self.exec_tmp_dir = Constants.AGENT_TMP_DIR
 self.file_cache = FileCache(config)
 self.status_commands_stdout = os.path.join(self.tmp_dir,

http://git-wip-us.apache.org/repos/asf/ambari/blob/5e4b94bb/ambari-agent/src/main/python/ambari_agent/NetUtil.py
--
diff --git a/ambari-agent/src/main/python/ambari_agent/NetUtil.py 
b/ambari-agent/src/main/python/ambari_agent/NetUtil.py
index 9b29633..fc19605 100644
--- a/ambari-agent/src/main/python/ambari_agent/NetUtil.py
+++ b/ambari-agent/src/main/python/ambari_agent/NetUtil.py
@@ -29,7 +29,7 @@ LOG_REQUEST_MESSAGE = "GET %s -> %s, body: %s"
 
 logger = logging.getLogger(__name__)
 
-ensure_ssl_using_protocol(AmbariConfig.get_resolved_config().get_force_https_protocol())
+ensure_ssl_using_protocol(AmbariConfig.get_resolved_config().get_force_https_protocol_name())
 
 class NetUtil:
 

http://git-wip-us.apache.org/repos/asf/ambari/blob/5e4b94bb/ambari-agent/src/main/python/ambari_agent/alerts/web_alert.py
-

ambari git commit: AMBARI-20831. Ambari agents can only connect to the server using TLSv1 (echekanskiy)

2017-04-24 Thread echekanskiy
Repository: ambari
Updated Branches:
  refs/heads/branch-2.4 79ba64c2b -> b9de1383c


AMBARI-20831. Ambari agents can only connect to the server using TLSv1 (echekanskiy)


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/b9de1383
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/b9de1383
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/b9de1383

Branch: refs/heads/branch-2.4
Commit: b9de1383cd714ccc132e84abb80e8760d75a573e
Parents: 79ba64c
Author: Eugene Chekanskiy <echekans...@hortonworks.com>
Authored: Mon Apr 24 17:47:04 2017 +0300
Committer: Eugene Chekanskiy <echekans...@hortonworks.com>
Committed: Mon Apr 24 17:47:04 2017 +0300

--
 .../main/python/ambari_agent/AmbariConfig.py| 25 +---
 .../ambari_agent/CustomServiceOrchestrator.py   |  4 +++-
 .../src/main/python/ambari_agent/NetUtil.py |  3 +++
 .../python/ambari_agent/alerts/web_alert.py | 16 -
 .../TestCustomServiceOrchestrator.py|  8 +++
 .../main/python/ambari_commons/inet_utils.py| 21 
 .../libraries/script/script.py  | 15 +---
 .../HDFS/2.1.0.2.0/package/files/checkWebUI.py  | 17 +++--
 .../HDFS/2.1.0.2.0/package/scripts/params.py|  1 +
 .../2.1.0.2.0/package/scripts/service_check.py  |  3 ++-
 .../HDFS/2.1.0.2.0/package/scripts/utils.py |  2 ++
 .../0.8/services/HDFS/package/scripts/params.py |  1 +
 .../HDFS/package/scripts/service_check.py   |  2 +-
 13 files changed, 86 insertions(+), 32 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/ambari/blob/b9de1383/ambari-agent/src/main/python/ambari_agent/AmbariConfig.py
--
diff --git a/ambari-agent/src/main/python/ambari_agent/AmbariConfig.py 
b/ambari-agent/src/main/python/ambari_agent/AmbariConfig.py
index ae938dc..7db856c 100644
--- a/ambari-agent/src/main/python/ambari_agent/AmbariConfig.py
+++ b/ambari-agent/src/main/python/ambari_agent/AmbariConfig.py
@@ -23,7 +23,6 @@ import ConfigParser
 import StringIO
 import hostname
 import ambari_simplejson as json
-from NetUtil import NetUtil
 import os
 
 from ambari_commons import OSConst
@@ -160,7 +159,6 @@ class AmbariConfig:
   def __init__(self):
 global content
 self.config = ConfigParser.RawConfigParser()
-self.net = NetUtil(self)
 self.config.readfp(StringIO.StringIO(content))
 
   def get(self, section, value, default=None):
@@ -186,6 +184,23 @@ class AmbariConfig:
   def getConfig(self):
 return self.config
 
+  @classmethod
+  def get_resolved_config(cls):
+if hasattr(cls, "_conf_cache"):
+  return getattr(cls, "_conf_cache")
+config = cls()
+configPath = os.path.abspath(cls.getConfigFile())
+try:
+  if os.path.exists(configPath):
+config.read(configPath)
+  else:
+raise Exception("No config found at {0}, use 
default".format(configPath))
+
+except Exception, err:
+  logger.warn(err)
+setattr(cls, "_conf_cache", config)
+return config
+
   @staticmethod
   @OsFamilyFuncImpl(OSConst.WINSRV_FAMILY)
   def getConfigFile():
@@ -236,7 +251,8 @@ class AmbariConfig:
 self.config.read(filename)
 
   def getServerOption(self, url, name, default=None):
-status, response = self.net.checkURL(url)
+from ambari_agent.NetUtil import NetUtil
+status, response = NetUtil(self).checkURL(url)
 if status is True:
   try:
 data = json.loads(response)
@@ -273,6 +289,9 @@ class AmbariConfig:
 logger.info("Updating config property (%s) with value (%s)", k, v)
 pass
 
+  def get_force_https_protocol(self):
+return self.get('security', 'force_https_protocol', 
default="PROTOCOL_TLSv1")
+
 def isSameHostList(hostlist1, hostlist2):
   is_same = True
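
get_resolved_config() memoizes the parsed agent configuration on the class, so module-level callers (as NetUtil.py does in the related commits above) share one instance instead of re-reading the file. A minimal usage sketch, assuming the ambari_agent package is importable and the agent config file is present at its default location:

from ambari_agent.AmbariConfig import AmbariConfig

config = AmbariConfig.get_resolved_config()           # parses the config file once, then caches it
protocol = config.get_force_https_protocol()          # "PROTOCOL_TLSv1" unless overridden in [security]
assert config is AmbariConfig.get_resolved_config()   # later calls return the cached instance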
 

http://git-wip-us.apache.org/repos/asf/ambari/blob/b9de1383/ambari-agent/src/main/python/ambari_agent/CustomServiceOrchestrator.py
--
diff --git 
a/ambari-agent/src/main/python/ambari_agent/CustomServiceOrchestrator.py 
b/ambari-agent/src/main/python/ambari_agent/CustomServiceOrchestrator.py
index 57416a4..b4e2076 100644
--- a/ambari-agent/src/main/python/ambari_agent/CustomServiceOrchestrator.py
+++ b/ambari-agent/src/main/python/ambari_agent/CustomServiceOrchestrator.py
@@ -67,6 +67,7 @@ class CustomServiceOrchestrator():
   def __init__(self, config, controller):
 self.config = config
 self.tmp_dir = config.get('agent', 'prefix')
+self.force_https_protocol = config.get_force_https_protocol()
 self.exec_tmp_dir = Constants.AGENT_TMP_DIR
 self.file_cache = FileCache(config)
 self.status_commands_stdout = os.

[2/5] ambari git commit: AMBARI-20733. /var/log/krb5kdc.log is growing rapidly on the KDC server (echekanskiy)

2017-04-21 Thread echekanskiy
http://git-wip-us.apache.org/repos/asf/ambari/blob/b299641a/ambari-server/src/test/python/stacks/2.0.6/HDFS/test_zkfc.py
--
diff --git a/ambari-server/src/test/python/stacks/2.0.6/HDFS/test_zkfc.py 
b/ambari-server/src/test/python/stacks/2.0.6/HDFS/test_zkfc.py
index e952108..127a045 100644
--- a/ambari-server/src/test/python/stacks/2.0.6/HDFS/test_zkfc.py
+++ b/ambari-server/src/test/python/stacks/2.0.6/HDFS/test_zkfc.py
@@ -381,104 +381,4 @@ class TestZkfc(RMFTestCase):
 environment = {'HADOOP_LIBEXEC_DIR': '/usr/lib/hadoop/libexec'},
 not_if = "ambari-sudo.sh [RMF_ENV_PLACEHOLDER] -H -E test -f 
/var/run/hadoop/hdfs/hadoop-hdfs-zkfc.pid && ambari-sudo.sh 
[RMF_ENV_PLACEHOLDER] -H -E pgrep -F /var/run/hadoop/hdfs/hadoop-hdfs-zkfc.pid",
 )
-self.assertNoMoreResources()
-
-
-  
@patch("resource_management.libraries.functions.security_commons.build_expectations")
-  
@patch("resource_management.libraries.functions.security_commons.get_params_from_filesystem")
-  
@patch("resource_management.libraries.functions.security_commons.validate_security_config_properties")
-  
@patch("resource_management.libraries.functions.security_commons.cached_kinit_executor")
-  @patch("resource_management.libraries.script.Script.put_structured_out")
-  def test_security_status(self, put_structured_out_mock, 
cached_kinit_executor_mock, validate_security_config_mock, get_params_mock, 
build_exp_mock):
-
-# Test that function works when is called with correct parameters
-security_params = {
-  'core-site': {
-'hadoop.security.authentication': 'kerberos'
-  }
-}
-
-props_value_check = {"hadoop.security.authentication": "kerberos",
- "hadoop.security.authorization": "true"}
-props_empty_check = ["hadoop.security.auth_to_local"]
-props_read_check = None
-
-result_issues = []
-
-get_params_mock.return_value = security_params
-validate_security_config_mock.return_value = result_issues
-
-self.executeScript(self.COMMON_SERVICES_PACKAGE_DIR + 
"/scripts/zkfc_slave.py",
-   classname = "ZkfcSlave",
-   command = "security_status",
-   config_file="secured.json",
-   stack_version = self.STACK_VERSION,
-   target = RMFTestCase.TARGET_COMMON_SERVICES
-)
-
-build_exp_mock.assert_called_with('core-site', props_value_check, 
props_empty_check, props_read_check)
-put_structured_out_mock.assert_called_with({"securityState": 
"SECURED_KERBEROS"})
-cached_kinit_executor_mock.called_with('/usr/bin/kinit',
-   
self.config_dict['configurations']['hadoop-env']['hdfs_user'],
-   
self.config_dict['configurations']['hadoop-env']['hdfs_user_keytab'],
-   
self.config_dict['configurations']['hadoop-env']['hdfs_user_principal_name'],
-   self.config_dict['hostname'],
-   '/tmp')
-
-# Testing that the exception throw by cached_executor is caught
-cached_kinit_executor_mock.reset_mock()
-cached_kinit_executor_mock.side_effect = Exception("Invalid command")
-
-try:
-self.executeScript(self.COMMON_SERVICES_PACKAGE_DIR + 
"/scripts/zkfc_slave.py",
-   classname = "ZkfcSlave",
-   command = "security_status",
-   config_file="secured.json",
-   stack_version = self.STACK_VERSION,
-   target = RMFTestCase.TARGET_COMMON_SERVICES
-)
-except:
-  self.assertTrue(True)
-
-# Testing when hadoop.security.authentication is simple
-security_params['core-site']['hadoop.security.authentication'] = 'simple'
-
-self.executeScript(self.COMMON_SERVICES_PACKAGE_DIR + 
"/scripts/zkfc_slave.py",
-   classname = "ZkfcSlave",
-   command = "security_status",
-   config_file="secured.json",
-   stack_version = self.STACK_VERSION,
-   target = RMFTestCase.TARGET_COMMON_SERVICES
-)
-
-put_structured_out_mock.assert_called_with({"securityState": "UNSECURED"})
-security_params['core-site']['hadoop.security.authentication'] = 'kerberos'
-
-# Testing with not empty result_issues
-result_issues_with_params = {
-  'hdfs-site': "Something bad happened"
-}
-
-validate_security_config_mock.reset_mock()
-get_params_mock.reset_mock()
-validate_security_config_mock.return_value = result_issues_with_params
-get_params_mock.return_value = security_params
-
-self.executeScript(self.COMMON_SERVICES_PACKAGE_DIR + 
"/scripts/zkfc_slave.py",
-   classname = "ZkfcSlave",
-  

[5/5] ambari git commit: AMBARI-20733. /var/log/krb5kdc.log is growing rapidly on the KDC server (echekanskiy)

2017-04-21 Thread echekanskiy
AMBARI-20733. /var/log/krb5kdc.log is growing rapidly on the KDC server 
(echekanskiy)


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/b299641a
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/b299641a
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/b299641a

Branch: refs/heads/trunk
Commit: b299641a076266e8f2a19f55068c89d56bc602b7
Parents: 84d2b3a
Author: Eugene Chekanskiy <echekans...@hortonworks.com>
Authored: Fri Apr 21 17:54:13 2017 +0300
Committer: Eugene Chekanskiy <echekans...@hortonworks.com>
Committed: Fri Apr 21 17:54:13 2017 +0300

--
 .../src/main/python/ambari_agent/ActionQueue.py |   9 +-
 .../ambari_agent/CustomServiceOrchestrator.py   |  33 +-
 .../test/python/ambari_agent/TestActionQueue.py |  13 +--
 .../TestCustomServiceOrchestrator.py|  51 
 .../libraries/script/script.py  |  16 ---
 .../ambari/server/agent/ComponentStatus.java|  28 +
 .../ambari/server/agent/HeartbeatProcessor.java |  20 
 .../package/scripts/accumulo_script.py  |  50 
 .../0.1.0/package/scripts/metrics_collector.py  |  66 +--
 .../package/scripts/metadata_server.py  |  78 -
 .../0.5.0.2.1/package/scripts/falcon_client.py  |  10 --
 .../0.5.0.2.1/package/scripts/falcon_server.py  |  59 --
 .../0.96.0.2.0/package/scripts/hbase_master.py  |  49 
 .../package/scripts/hbase_regionserver.py   |  49 
 .../package/scripts/phoenix_queryserver.py  |   6 +-
 .../HDFS/2.1.0.2.0/package/scripts/datanode.py  |  58 -
 .../2.1.0.2.0/package/scripts/hdfs_client.py|  45 ---
 .../2.1.0.2.0/package/scripts/journalnode.py|  57 -
 .../HDFS/2.1.0.2.0/package/scripts/namenode.py  |  57 -
 .../2.1.0.2.0/package/scripts/nfsgateway.py |  58 -
 .../HDFS/2.1.0.2.0/package/scripts/snamenode.py |  60 --
 .../2.1.0.2.0/package/scripts/zkfc_slave.py |  43 ---
 .../HDFS/3.0.0.3.0/package/scripts/datanode.py  |  58 -
 .../3.0.0.3.0/package/scripts/hdfs_client.py|  45 ---
 .../3.0.0.3.0/package/scripts/journalnode.py|  57 -
 .../HDFS/3.0.0.3.0/package/scripts/namenode.py  |  57 -
 .../3.0.0.3.0/package/scripts/nfsgateway.py |  58 -
 .../HDFS/3.0.0.3.0/package/scripts/snamenode.py |  60 --
 .../3.0.0.3.0/package/scripts/zkfc_slave.py |  43 ---
 .../package/scripts/hive_metastore.py   |  52 -
 .../0.12.0.2.0/package/scripts/hive_server.py   |  61 --
 .../package/scripts/hive_server_interactive.py  |  61 --
 .../package/scripts/webhcat_server.py   |  67 ---
 .../2.1.0.3.0/package/scripts/hive_metastore.py |  52 -
 .../2.1.0.3.0/package/scripts/hive_server.py|  61 --
 .../package/scripts/hive_server_interactive.py  |  61 --
 .../2.1.0.3.0/package/scripts/webhcat_server.py |  67 ---
 .../package/scripts/kerberos_client.py  |  21 
 .../0.5.0.2.2/package/scripts/knox_gateway.py   |  61 --
 .../4.0.0.2.0/package/scripts/oozie_server.py   |  63 --
 .../STORM/0.9.1/package/scripts/drpc_server.py  |  52 -
 .../STORM/0.9.1/package/scripts/nimbus.py   |  45 ---
 .../STORM/0.9.1/package/scripts/pacemaker.py|  52 -
 .../STORM/0.9.1/package/scripts/ui_server.py|  53 -
 .../scripts/application_timeline_server.py  |  61 --
 .../2.1.0.2.0/package/scripts/historyserver.py  |  56 -
 .../2.1.0.2.0/package/scripts/nodemanager.py|  60 --
 .../package/scripts/resourcemanager.py  |  60 --
 .../scripts/application_timeline_server.py  |  61 --
 .../3.0.0.3.0/package/scripts/historyserver.py  |  56 -
 .../3.0.0.3.0/package/scripts/nodemanager.py|  60 --
 .../package/scripts/resourcemanager.py  |  60 --
 .../3.4.5/package/scripts/zookeeper_server.py   |  51 
 .../KERBEROS/package/scripts/kerberos_client.py |  21 
 .../server/agent/HeartbeatProcessorTest.java|   7 --
 .../server/agent/TestHeartbeatHandler.java  |  13 ---
 .../stacks/2.0.6/HBASE/test_hbase_master.py | 102 
 .../2.0.6/HBASE/test_hbase_regionserver.py  | 104 -
 .../python/stacks/2.0.6/HDFS/test_datanode.py   | 111 --
 .../stacks/2.0.6/HDFS/test_hdfs_client.py   | 100 
 .../stacks/2.0.6/HDFS/test_journalnode.py   | 114 --
 .../python/stacks/2.0.6/HDFS/test_namenode.py   | 114 --
 .../python/stacks/2.0.6/HDFS/test_nfsgateway.py | 116 --
 .../python/stacks/2.0.6/HDFS/test_snamenode.py  | 117 +--
 .../test/python/stacks/2.0.6/HDFS/test_zkfc.py  | 102 +---
 .../stacks/2.0.6/HIVE/test_hi

[4/5] ambari git commit: AMBARI-20733. /var/log/krb5kdc.log is growing rapidly on the KDC server (echekanskiy)

2017-04-21 Thread echekanskiy
http://git-wip-us.apache.org/repos/asf/ambari/blob/b299641a/ambari-server/src/main/resources/common-services/HDFS/3.0.0.3.0/package/scripts/namenode.py
--
diff --git 
a/ambari-server/src/main/resources/common-services/HDFS/3.0.0.3.0/package/scripts/namenode.py
 
b/ambari-server/src/main/resources/common-services/HDFS/3.0.0.3.0/package/scripts/namenode.py
index 602dad7..a42ca79 100644
--- 
a/ambari-server/src/main/resources/common-services/HDFS/3.0.0.3.0/package/scripts/namenode.py
+++ 
b/ambari-server/src/main/resources/common-services/HDFS/3.0.0.3.0/package/scripts/namenode.py
@@ -216,63 +216,6 @@ class NameNodeDefault(NameNode):
 try_sleep=10
 )
 
-  def security_status(self, env):
-import status_params
-
-env.set_params(status_params)
-props_value_check = {"hadoop.security.authentication": "kerberos",
- "hadoop.security.authorization": "true"}
-props_empty_check = ["hadoop.security.auth_to_local"]
-props_read_check = None
-core_site_expectations = build_expectations('core-site', 
props_value_check, props_empty_check,
-props_read_check)
-props_value_check = None
-props_empty_check = ['dfs.namenode.kerberos.internal.spnego.principal',
- 'dfs.namenode.keytab.file',
- 'dfs.namenode.kerberos.principal']
-props_read_check = ['dfs.namenode.keytab.file']
-hdfs_site_expectations = build_expectations('hdfs-site', 
props_value_check, props_empty_check,
-props_read_check)
-
-hdfs_expectations = {}
-hdfs_expectations.update(core_site_expectations)
-hdfs_expectations.update(hdfs_site_expectations)
-
-security_params = get_params_from_filesystem(status_params.hadoop_conf_dir,
- {'core-site.xml': 
FILE_TYPE_XML,
-  'hdfs-site.xml': 
FILE_TYPE_XML})
-if 'core-site' in security_params and 'hadoop.security.authentication' in 
security_params['core-site'] and \
-security_params['core-site']['hadoop.security.authentication'].lower() 
== 'kerberos':
-  result_issues = validate_security_config_properties(security_params, 
hdfs_expectations)
-  if not result_issues:  # If all validations passed successfully
-try:
-  # Double check the dict before calling execute
-  if ( 'hdfs-site' not in security_params
-   or 'dfs.namenode.keytab.file' not in 
security_params['hdfs-site']
-   or 'dfs.namenode.kerberos.principal' not in 
security_params['hdfs-site']):
-self.put_structured_out({"securityState": "UNSECURED"})
-self.put_structured_out(
-  {"securityIssuesFound": "Keytab file or principal are not set 
property."})
-return
-  cached_kinit_executor(status_params.kinit_path_local,
-status_params.hdfs_user,
-
security_params['hdfs-site']['dfs.namenode.keytab.file'],
-
security_params['hdfs-site']['dfs.namenode.kerberos.principal'],
-status_params.hostname,
-status_params.tmp_dir)
-  self.put_structured_out({"securityState": "SECURED_KERBEROS"})
-except Exception as e:
-  self.put_structured_out({"securityState": "ERROR"})
-  self.put_structured_out({"securityStateErrorInfo": str(e)})
-  else:
-issues = []
-for cf in result_issues:
-  issues.append("Configuration file %s did not pass the validation. 
Reason: %s" % (cf, result_issues[cf]))
-self.put_structured_out({"securityIssuesFound": ". ".join(issues)})
-self.put_structured_out({"securityState": "UNSECURED"})
-else:
-  self.put_structured_out({"securityState": "UNSECURED"})
-
   def rebalancehdfs(self, env):
 import params
 env.set_params(params)

http://git-wip-us.apache.org/repos/asf/ambari/blob/b299641a/ambari-server/src/main/resources/common-services/HDFS/3.0.0.3.0/package/scripts/nfsgateway.py
--
diff --git 
a/ambari-server/src/main/resources/common-services/HDFS/3.0.0.3.0/package/scripts/nfsgateway.py
 
b/ambari-server/src/main/resources/common-services/HDFS/3.0.0.3.0/package/scripts/nfsgateway.py
index 7ba1f96..602c179 100644
--- 
a/ambari-server/src/main/resources/common-services/HDFS/3.0.0.3.0/package/scripts/nfsgateway.py
+++ 
b/ambari-server/src/main/resources/common-services/HDFS/3.0.0.3.0/package/scripts/nfsgateway.py
@@ -77,64 +77,6 @@ class NFSGateway(Script):
 
 check_process_status(status_params.nfsgateway_pid_file)
 
-  def security_status(self, env):
-import status_params
-
-env.set_params(status_params)
-

[3/5] ambari git commit: AMBARI-20733. /var/log/krb5kdc.log is growing rapidly on the KDC server (echekanskiy)

2017-04-21 Thread echekanskiy
http://git-wip-us.apache.org/repos/asf/ambari/blob/b299641a/ambari-server/src/main/resources/common-services/YARN/2.1.0.2.0/package/scripts/resourcemanager.py
--
diff --git 
a/ambari-server/src/main/resources/common-services/YARN/2.1.0.2.0/package/scripts/resourcemanager.py
 
b/ambari-server/src/main/resources/common-services/YARN/2.1.0.2.0/package/scripts/resourcemanager.py
index b871b68..81b99e6 100644
--- 
a/ambari-server/src/main/resources/common-services/YARN/2.1.0.2.0/package/scripts/resourcemanager.py
+++ 
b/ambari-server/src/main/resources/common-services/YARN/2.1.0.2.0/package/scripts/resourcemanager.py
@@ -132,66 +132,6 @@ class ResourcemanagerDefault(Resourcemanager):
 check_process_status(status_params.resourcemanager_pid_file)
 pass
 
-  def security_status(self, env):
-import status_params
-env.set_params(status_params)
-if status_params.security_enabled:
-  props_value_check = {"yarn.timeline-service.http-authentication.type": 
"kerberos",
-   "yarn.acl.enable": "true"}
-  props_empty_check = ["yarn.resourcemanager.principal",
-   "yarn.resourcemanager.keytab",
-   "yarn.resourcemanager.webapp.spnego-principal",
-   "yarn.resourcemanager.webapp.spnego-keytab-file"]
-
-  props_read_check = ["yarn.resourcemanager.keytab",
-  "yarn.resourcemanager.webapp.spnego-keytab-file"]
-  yarn_site_props = build_expectations('yarn-site', props_value_check, 
props_empty_check,
-   props_read_check)
-
-  yarn_expectations ={}
-  yarn_expectations.update(yarn_site_props)
-
-  security_params = 
get_params_from_filesystem(status_params.hadoop_conf_dir,
-   {'yarn-site.xml': 
FILE_TYPE_XML})
-  result_issues = validate_security_config_properties(security_params, 
yarn_site_props)
-  if not result_issues: # If all validations passed successfully
-try:
-  # Double check the dict before calling execute
-  if ( 'yarn-site' not in security_params
-   or 'yarn.resourcemanager.keytab' not in 
security_params['yarn-site']
-   or 'yarn.resourcemanager.principal' not in 
security_params['yarn-site']) \
-or 'yarn.resourcemanager.webapp.spnego-keytab-file' not in 
security_params['yarn-site'] \
-or 'yarn.resourcemanager.webapp.spnego-principal' not in 
security_params['yarn-site']:
-self.put_structured_out({"securityState": "UNSECURED"})
-self.put_structured_out(
-  {"securityIssuesFound": "Keytab file or principal are not set 
property."})
-return
-
-  cached_kinit_executor(status_params.kinit_path_local,
-status_params.yarn_user,
-
security_params['yarn-site']['yarn.resourcemanager.keytab'],
-
security_params['yarn-site']['yarn.resourcemanager.principal'],
-status_params.hostname,
-status_params.tmp_dir)
-  cached_kinit_executor(status_params.kinit_path_local,
-status_params.yarn_user,
-
security_params['yarn-site']['yarn.resourcemanager.webapp.spnego-keytab-file'],
-
security_params['yarn-site']['yarn.resourcemanager.webapp.spnego-principal'],
-status_params.hostname,
-status_params.tmp_dir)
-  self.put_structured_out({"securityState": "SECURED_KERBEROS"})
-except Exception as e:
-  self.put_structured_out({"securityState": "ERROR"})
-  self.put_structured_out({"securityStateErrorInfo": str(e)})
-  else:
-issues = []
-for cf in result_issues:
-  issues.append("Configuration file %s did not pass the validation. 
Reason: %s" % (cf, result_issues[cf]))
-self.put_structured_out({"securityIssuesFound": ". ".join(issues)})
-self.put_structured_out({"securityState": "UNSECURED"})
-else:
-  self.put_structured_out({"securityState": "UNSECURED"})
-
   def refreshqueues(self, env):
 import params
 

http://git-wip-us.apache.org/repos/asf/ambari/blob/b299641a/ambari-server/src/main/resources/common-services/YARN/3.0.0.3.0/package/scripts/application_timeline_server.py
--
diff --git 
a/ambari-server/src/main/resources/common-services/YARN/3.0.0.3.0/package/scripts/application_timeline_server.py
 
b/ambari-server/src/main/resources/common-services/YARN/3.0.0.3.0/package/scripts/application_timeline_server.py
index 03fff21..b1e0c16 100644
--- 

[1/5] ambari git commit: AMBARI-20733. /var/log/krb5kdc.log is growing rapidly on the KDC server (echekanskiy)

2017-04-21 Thread echekanskiy
Repository: ambari
Updated Branches:
  refs/heads/trunk 84d2b3a0a -> b299641a0


http://git-wip-us.apache.org/repos/asf/ambari/blob/b299641a/ambari-server/src/test/python/stacks/2.1/YARN/test_apptimelineserver.py
--
diff --git 
a/ambari-server/src/test/python/stacks/2.1/YARN/test_apptimelineserver.py 
b/ambari-server/src/test/python/stacks/2.1/YARN/test_apptimelineserver.py
index 5730783..530d1d9 100644
--- a/ambari-server/src/test/python/stacks/2.1/YARN/test_apptimelineserver.py
+++ b/ambari-server/src/test/python/stacks/2.1/YARN/test_apptimelineserver.py
@@ -266,116 +266,6 @@ class TestAppTimelineServer(RMFTestCase):
   group = 'hadoop',
   )
 
-
-  
@patch("resource_management.libraries.functions.security_commons.build_expectations")
-  
@patch("resource_management.libraries.functions.security_commons.get_params_from_filesystem")
-  
@patch("resource_management.libraries.functions.security_commons.validate_security_config_properties")
-  
@patch("resource_management.libraries.functions.security_commons.cached_kinit_executor")
-  @patch("resource_management.libraries.script.Script.put_structured_out")
-  def test_security_status(self, put_structured_out_mock, 
cached_kinit_executor_mock, validate_security_config_mock, get_params_mock, 
build_exp_mock):
-# Test that function works when is called with correct parameters
-
-security_params = {
-  'yarn-site': {
-'yarn.timeline-service.keytab': '/path/to/applicationtimeline/keytab',
-'yarn.timeline-service.principal': 'applicationtimeline_principal',
-'yarn.timeline-service.http-authentication.kerberos.keytab': 
'path/to/timeline/kerberos/keytab',
-'yarn.timeline-service.http-authentication.kerberos.principal': 
'timeline_principal'
-  }
-}
-result_issues = []
-props_value_check = {"yarn.timeline-service.enabled": "true",
- "yarn.timeline-service.http-authentication.type": 
"kerberos",
- "yarn.acl.enable": "true"}
-props_empty_check = ["yarn.timeline-service.principal",
- "yarn.timeline-service.keytab",
- 
"yarn.timeline-service.http-authentication.kerberos.principal",
- 
"yarn.timeline-service.http-authentication.kerberos.keytab"]
-
-props_read_check = ["yarn.timeline-service.keytab",
-
"yarn.timeline-service.http-authentication.kerberos.keytab"]
-
-get_params_mock.return_value = security_params
-validate_security_config_mock.return_value = result_issues
-
-self.executeScript(self.COMMON_SERVICES_PACKAGE_DIR + 
"/scripts/application_timeline_server.py",
-   classname="ApplicationTimelineServer",
-   command="security_status",
-   config_file="secured.json",
-   stack_version = self.STACK_VERSION,
-   target = RMFTestCase.TARGET_COMMON_SERVICES
-)
-
-build_exp_mock.assert_called_with('yarn-site', props_value_check, 
props_empty_check, props_read_check)
-put_structured_out_mock.assert_called_with({"securityState": 
"SECURED_KERBEROS"})
-self.assertTrue(cached_kinit_executor_mock.call_count, 2)
-cached_kinit_executor_mock.assert_called_with('/usr/bin/kinit',
-  
self.config_dict['configurations']['yarn-env']['yarn_user'],
-  
security_params['yarn-site']['yarn.timeline-service.http-authentication.kerberos.keytab'],
-  
security_params['yarn-site']['yarn.timeline-service.http-authentication.kerberos.principal'],
-  self.config_dict['hostname'],
-  '/tmp')
-
-# Testing that the exception throw by cached_executor is caught
-cached_kinit_executor_mock.reset_mock()
-cached_kinit_executor_mock.side_effect = Exception("Invalid command")
-
-try:
-  self.executeScript(self.COMMON_SERVICES_PACKAGE_DIR + 
"/scripts/application_timeline_server.py",
- classname="ApplicationTimelineServer",
- command="security_status",
- config_file="secured.json",
- stack_version = self.STACK_VERSION,
- target = RMFTestCase.TARGET_COMMON_SERVICES
-  )
-except:
-  self.assertTrue(True)
-
-# Testing with a security_params which doesn't contains yarn-site
-empty_security_params = {}
-cached_kinit_executor_mock.reset_mock()
-get_params_mock.reset_mock()
-put_structured_out_mock.reset_mock()
-get_params_mock.return_value = empty_security_params
-
-self.executeScript(self.COMMON_SERVICES_PACKAGE_DIR + 

[3/4] ambari git commit: AMBARI-20733. /var/log/krb5kdc.log is growing rapidly on the KDC server (echekanskiy)

2017-04-21 Thread echekanskiy
http://git-wip-us.apache.org/repos/asf/ambari/blob/712b3d21/ambari-server/src/main/resources/common-services/HIVE/0.12.0.2.0/package/scripts/webhcat_server.py
--
diff --git 
a/ambari-server/src/main/resources/common-services/HIVE/0.12.0.2.0/package/scripts/webhcat_server.py
 
b/ambari-server/src/main/resources/common-services/HIVE/0.12.0.2.0/package/scripts/webhcat_server.py
index ca3b14d..da5e82b 100644
--- 
a/ambari-server/src/main/resources/common-services/HIVE/0.12.0.2.0/package/scripts/webhcat_server.py
+++ 
b/ambari-server/src/main/resources/common-services/HIVE/0.12.0.2.0/package/scripts/webhcat_server.py
@@ -83,73 +83,6 @@ class WebHCatServerDefault(WebHCatServer):
   conf_select.select(params.stack_name, "hadoop", params.version)
   stack_select.select("hive-webhcat", params.version)
 
-  def security_status(self, env):
-import status_params
-env.set_params(status_params)
-
-if status_params.security_enabled:
-  expectations ={}
-  expectations.update(
-build_expectations(
-  'webhcat-site',
-  {
-"templeton.kerberos.secret": "secret"
-  },
-  [
-"templeton.kerberos.keytab",
-"templeton.kerberos.principal"
-  ],
-  [
-"templeton.kerberos.keytab"
-  ]
-)
-  )
-  expectations.update(
-build_expectations(
-  'hive-site',
-  {
-"hive.server2.authentication": "KERBEROS",
-"hive.metastore.sasl.enabled": "true",
-"hive.security.authorization.enabled": "true"
-  },
-  None,
-  None
-)
-  )
-
-  security_params = {}
-  
security_params.update(get_params_from_filesystem(status_params.webhcat_conf_dir,
-{'webhcat-site.xml': 
FILE_TYPE_XML}))
-  result_issues = validate_security_config_properties(security_params, 
expectations)
-  if not result_issues: # If all validations passed successfully
-try:
-  # Double check the dict before calling execute
-  if 'webhcat-site' not in security_params \
-or 'templeton.kerberos.keytab' not in 
security_params['webhcat-site'] \
-or 'templeton.kerberos.principal' not in 
security_params['webhcat-site']:
-self.put_structured_out({"securityState": "UNSECURED"})
-self.put_structured_out({"securityIssuesFound": "Keytab file or 
principal are not set property."})
-return
-
-  cached_kinit_executor(status_params.kinit_path_local,
-status_params.webhcat_user,
-
security_params['webhcat-site']['templeton.kerberos.keytab'],
-
security_params['webhcat-site']['templeton.kerberos.principal'],
-status_params.hostname,
-status_params.tmp_dir)
-  self.put_structured_out({"securityState": "SECURED_KERBEROS"})
-except Exception as e:
-  self.put_structured_out({"securityState": "ERROR"})
-  self.put_structured_out({"securityStateErrorInfo": str(e)})
-  else:
-issues = []
-for cf in result_issues:
-  issues.append("Configuration file %s did not pass the validation. 
Reason: %s" % (cf, result_issues[cf]))
-self.put_structured_out({"securityIssuesFound": ". ".join(issues)})
-self.put_structured_out({"securityState": "UNSECURED"})
-else:
-  self.put_structured_out({"securityState": "UNSECURED"})
-
   def get_log_folder(self):
 import params
 return params.hcat_log_dir

http://git-wip-us.apache.org/repos/asf/ambari/blob/712b3d21/ambari-server/src/main/resources/common-services/KERBEROS/1.10.3-10/package/scripts/kerberos_client.py
--
diff --git 
a/ambari-server/src/main/resources/common-services/KERBEROS/1.10.3-10/package/scripts/kerberos_client.py
 
b/ambari-server/src/main/resources/common-services/KERBEROS/1.10.3-10/package/scripts/kerberos_client.py
index c50c67b..39fdcf5 100644
--- 
a/ambari-server/src/main/resources/common-services/KERBEROS/1.10.3-10/package/scripts/kerberos_client.py
+++ 
b/ambari-server/src/main/resources/common-services/KERBEROS/1.10.3-10/package/scripts/kerberos_client.py
@@ -43,27 +43,6 @@ class KerberosClient(KerberosScript):
   def status(self, env):
 raise ClientComponentHasNoStatus()
 
-  def security_status(self, env):
-import status_params
-if status_params.security_enabled:
-  if status_params.smoke_user and status_params.smoke_user_keytab:
-try:
-  cached_kinit_executor(status_params.kinit_path_local,
-status_params.smoke_user,
-status_params.smoke_user_keytab,
-   

[1/4] ambari git commit: AMBARI-20733. /var/log/krb5kdc.log is growing rapidly on the KDC server (echekanskiy)

2017-04-21 Thread echekanskiy
Repository: ambari
Updated Branches:
  refs/heads/branch-2.5 951bf1986 -> 712b3d21c


http://git-wip-us.apache.org/repos/asf/ambari/blob/712b3d21/ambari-server/src/test/python/stacks/2.1/HIVE/test_hive_metastore.py
--
diff --git 
a/ambari-server/src/test/python/stacks/2.1/HIVE/test_hive_metastore.py 
b/ambari-server/src/test/python/stacks/2.1/HIVE/test_hive_metastore.py
index de2dce8..3556e34 100644
--- a/ambari-server/src/test/python/stacks/2.1/HIVE/test_hive_metastore.py
+++ b/ambari-server/src/test/python/stacks/2.1/HIVE/test_hive_metastore.py
@@ -407,119 +407,6 @@ class TestHiveMetastore(RMFTestCase):
 user = 'hive',
 )
 
-  
@patch("resource_management.libraries.functions.security_commons.build_expectations")
-  
@patch("resource_management.libraries.functions.security_commons.get_params_from_filesystem")
-  
@patch("resource_management.libraries.functions.security_commons.validate_security_config_properties")
-  
@patch("resource_management.libraries.functions.security_commons.cached_kinit_executor")
-  @patch("resource_management.libraries.script.Script.put_structured_out")
-  def test_security_status(self, put_structured_out_mock, 
cached_kinit_executor_mock, validate_security_config_mock, get_params_mock, 
build_exp_mock):
-# Test that function works when is called with correct parameters
-
-security_params = {
-  'hive-site': {
-'hive.server2.authentication': "KERBEROS",
-'hive.metastore.sasl.enabled': "true",
-'hive.security.authorization.enabled': 'true',
-'hive.metastore.kerberos.keytab.file': 'path/to/keytab',
-'hive.metastore.kerberos.principal': 'principal'
-  }
-}
-result_issues = []
-props_value_check = {
-  'hive.server2.authentication': "KERBEROS",
-  'hive.metastore.sasl.enabled': "true",
-  'hive.security.authorization.enabled': 'true'
-}
-props_empty_check = [
-  'hive.metastore.kerberos.keytab.file',
-  'hive.metastore.kerberos.principal'
-]
-props_read_check = [
-  'hive.metastore.kerberos.keytab.file'
-]
-
-get_params_mock.return_value = security_params
-validate_security_config_mock.return_value = result_issues
-
-self.executeScript(self.COMMON_SERVICES_PACKAGE_DIR + 
"/scripts/hive_metastore.py",
-   classname = "HiveMetastore",
-   command = "security_status",
-   config_file="../../2.1/configs/secured.json",
-   stack_version = self.STACK_VERSION,
-   target = RMFTestCase.TARGET_COMMON_SERVICES
-)
-
-get_params_mock.assert_called_with("/usr/hdp/current/hive-server2/conf", 
{'hive-site.xml': "XML"})
-build_exp_mock.assert_called_with('hive-site', props_value_check, 
props_empty_check, props_read_check)
-put_structured_out_mock.assert_called_with({"securityState": 
"SECURED_KERBEROS"})
-self.assertTrue(cached_kinit_executor_mock.call_count, 2)
-cached_kinit_executor_mock.assert_called_with('/usr/bin/kinit',
-  
self.config_dict['configurations']['hive-env']['hive_user'],
-  
security_params['hive-site']['hive.metastore.kerberos.keytab.file'],
-  
security_params['hive-site']['hive.metastore.kerberos.principal'],
-  self.config_dict['hostname'],
-  '/tmp')
-
-# Testing that the exception throw by cached_executor is caught
-cached_kinit_executor_mock.reset_mock()
-cached_kinit_executor_mock.side_effect = Exception("Invalid command")
-
-try:
-  self.executeScript(self.COMMON_SERVICES_PACKAGE_DIR + 
"/scripts/hive_metastore.py",
- classname = "HiveMetastore",
- command = "security_status",
- config_file="../../2.1/configs/secured.json",
- stack_version = self.STACK_VERSION,
- target = RMFTestCase.TARGET_COMMON_SERVICES
-  )
-except:
-  self.assertTrue(True)
-
-# Testing with a security_params which doesn't contains startup
-empty_security_params = {}
-cached_kinit_executor_mock.reset_mock()
-get_params_mock.reset_mock()
-put_structured_out_mock.reset_mock()
-get_params_mock.return_value = empty_security_params
-
-self.executeScript(self.COMMON_SERVICES_PACKAGE_DIR + 
"/scripts/hive_metastore.py",
-   classname = "HiveMetastore",
-   command = "security_status",
-   config_file="../../2.1/configs/secured.json",
-   stack_version = self.STACK_VERSION,
-   target = RMFTestCase.TARGET_COMMON_SERVICES
-)
-

[4/4] ambari git commit: AMBARI-20733. /var/log/krb5kdc.log is growing rapidly on the KDC server (echekanskiy)

2017-04-21 Thread echekanskiy
AMBARI-20733. /var/log/krb5kdc.log is growing rapidly on the KDC server 
(echekanskiy)


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/712b3d21
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/712b3d21
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/712b3d21

Branch: refs/heads/branch-2.5
Commit: 712b3d21c12aec99899cffd54a694b4bcb64dd93
Parents: 951bf19
Author: Eugene Chekanskiy <echekans...@hortonworks.com>
Authored: Fri Apr 21 17:52:18 2017 +0300
Committer: Eugene Chekanskiy <echekans...@hortonworks.com>
Committed: Fri Apr 21 17:52:18 2017 +0300

--
 .../src/main/python/ambari_agent/ActionQueue.py |   9 +-
 .../ambari_agent/CustomServiceOrchestrator.py   |  33 +-
 .../test/python/ambari_agent/TestActionQueue.py |  13 +--
 .../TestCustomServiceOrchestrator.py|  51 
 .../libraries/script/script.py  |  16 ---
 .../ambari/server/agent/ComponentStatus.java|  28 +
 .../ambari/server/agent/HeartbeatProcessor.java |  22 
 .../package/scripts/accumulo_script.py  |  50 
 .../0.1.0/package/scripts/metrics_collector.py  |  66 +--
 .../package/scripts/metadata_server.py  |  78 -
 .../0.5.0.2.1/package/scripts/falcon_client.py  |  10 --
 .../0.5.0.2.1/package/scripts/falcon_server.py  |  59 --
 .../0.96.0.2.0/package/scripts/hbase_master.py  |  49 
 .../package/scripts/hbase_regionserver.py   |  49 
 .../package/scripts/phoenix_queryserver.py  |   6 +-
 .../HDFS/2.1.0.2.0/package/scripts/datanode.py  |  58 -
 .../2.1.0.2.0/package/scripts/hdfs_client.py|  45 ---
 .../2.1.0.2.0/package/scripts/journalnode.py|  57 -
 .../HDFS/2.1.0.2.0/package/scripts/namenode.py  |  57 -
 .../2.1.0.2.0/package/scripts/nfsgateway.py |  58 -
 .../HDFS/2.1.0.2.0/package/scripts/snamenode.py |  60 --
 .../2.1.0.2.0/package/scripts/zkfc_slave.py |  43 ---
 .../package/scripts/hive_metastore.py   |  52 -
 .../0.12.0.2.0/package/scripts/hive_server.py   |  61 --
 .../package/scripts/hive_server_interactive.py  |  61 --
 .../package/scripts/webhcat_server.py   |  67 ---
 .../package/scripts/kerberos_client.py  |  21 
 .../0.5.0.2.2/package/scripts/knox_gateway.py   |  61 --
 .../4.0.0.2.0/package/scripts/oozie_server.py   |  63 --
 .../STORM/0.9.1/package/scripts/drpc_server.py  |  52 -
 .../STORM/0.9.1/package/scripts/nimbus.py   |  45 ---
 .../STORM/0.9.1/package/scripts/pacemaker.py|  52 -
 .../STORM/0.9.1/package/scripts/ui_server.py|  53 -
 .../scripts/application_timeline_server.py  |  61 --
 .../2.1.0.2.0/package/scripts/historyserver.py  |  56 -
 .../2.1.0.2.0/package/scripts/nodemanager.py|  60 --
 .../package/scripts/resourcemanager.py  |  60 --
 .../3.4.5/package/scripts/zookeeper_server.py   |  51 
 .../KERBEROS/package/scripts/kerberos_client.py |  21 
 .../server/agent/HeartbeatProcessorTest.java|  10 +-
 .../server/agent/TestHeartbeatHandler.java  |  13 ---
 .../stacks/2.0.6/HBASE/test_hbase_master.py | 102 
 .../2.0.6/HBASE/test_hbase_regionserver.py  | 104 -
 .../python/stacks/2.0.6/HDFS/test_datanode.py   | 111 --
 .../stacks/2.0.6/HDFS/test_hdfs_client.py   | 100 
 .../stacks/2.0.6/HDFS/test_journalnode.py   | 114 --
 .../python/stacks/2.0.6/HDFS/test_namenode.py   | 114 --
 .../python/stacks/2.0.6/HDFS/test_nfsgateway.py | 116 --
 .../python/stacks/2.0.6/HDFS/test_snamenode.py  | 117 +--
 .../test/python/stacks/2.0.6/HDFS/test_zkfc.py  | 100 
 .../stacks/2.0.6/HIVE/test_hive_server.py   | 112 --
 .../stacks/2.0.6/HIVE/test_webhcat_server.py| 116 --
 .../stacks/2.0.6/OOZIE/test_oozie_server.py | 113 --
 .../stacks/2.0.6/YARN/test_historyserver.py | 106 -
 .../stacks/2.0.6/YARN/test_nodemanager.py   | 109 -
 .../stacks/2.0.6/YARN/test_resourcemanager.py   | 108 -
 .../2.0.6/ZOOKEEPER/test_zookeeper_server.py| 103 
 .../stacks/2.1/FALCON/test_falcon_client.py |  24 
 .../stacks/2.1/FALCON/test_falcon_server.py | 109 -
 .../stacks/2.1/HIVE/test_hive_metastore.py  | 113 --
 .../stacks/2.1/STORM/test_storm_drpc_server.py  | 104 -
 .../stacks/2.1/STORM/test_storm_nimbus.py   | 103 
 .../stacks/2.1/STORM/test_storm_ui_server.py|  82 -
 .../stacks/2.1/YARN/test_apptimelineserve

[2/4] ambari git commit: AMBARI-20733. /var/log/krb5kdc.log is growing rapidly on the KDC server (echekanskiy)

2017-04-21 Thread echekanskiy
http://git-wip-us.apache.org/repos/asf/ambari/blob/712b3d21/ambari-server/src/test/python/stacks/2.0.6/HDFS/test_journalnode.py
--
diff --git 
a/ambari-server/src/test/python/stacks/2.0.6/HDFS/test_journalnode.py 
b/ambari-server/src/test/python/stacks/2.0.6/HDFS/test_journalnode.py
index 4b63de4..2202661 100644
--- a/ambari-server/src/test/python/stacks/2.0.6/HDFS/test_journalnode.py
+++ b/ambari-server/src/test/python/stacks/2.0.6/HDFS/test_journalnode.py
@@ -369,120 +369,6 @@ class TestJournalnode(RMFTestCase):
 except:
   pass
 
-  
@patch("resource_management.libraries.functions.security_commons.build_expectations")
-  
@patch("resource_management.libraries.functions.security_commons.get_params_from_filesystem")
-  
@patch("resource_management.libraries.functions.security_commons.validate_security_config_properties")
-  
@patch("resource_management.libraries.functions.security_commons.cached_kinit_executor")
-  @patch("resource_management.libraries.script.Script.put_structured_out")
-  def test_security_status(self, put_structured_out_mock, 
cached_kinit_executor_mock, validate_security_config_mock, get_params_mock, 
build_exp_mock):
-# Test that function works when is called with correct parameters
-security_params = {
-  'core-site': {
-'hadoop.security.authentication': 'kerberos'
-  },
-  'hdfs-site': {
-'dfs.journalnode.kerberos.keytab.file': 
'path/to/journalnode/keytab/file',
-'dfs.journalnode.kerberos.principal': 'journalnode_principal'
-  }
-}
-
-props_value_check = None
-props_empty_check = ['dfs.journalnode.keytab.file',
- 'dfs.journalnode.kerberos.principal']
-props_read_check = ['dfs.journalnode.keytab.file']
-
-result_issues = []
-
-get_params_mock.return_value = security_params
-validate_security_config_mock.return_value = result_issues
-
-self.executeScript(self.COMMON_SERVICES_PACKAGE_DIR + 
"/scripts/journalnode.py",
-   classname = "JournalNode",
-   command = "security_status",
-   config_file="secured.json",
-   stack_version = self.STACK_VERSION,
-   target = RMFTestCase.TARGET_COMMON_SERVICES
-)
-
-build_exp_mock.assert_called_with('hdfs-site', props_value_check, 
props_empty_check, props_read_check)
-put_structured_out_mock.assert_called_with({"securityState": 
"SECURED_KERBEROS"})
-cached_kinit_executor_mock.called_with('/usr/bin/kinit',
-   
self.config_dict['configurations']['hadoop-env']['hdfs_user'],
-   
security_params['hdfs-site']['dfs.journalnode.kerberos.keytab.file'],
-   
security_params['hdfs-site']['dfs.journalnode.kerberos.principal'],
-   self.config_dict['hostname'],
-   '/tmp')
-
-# Testing when hadoop.security.authentication is simple
-security_params['core-site']['hadoop.security.authentication'] = 'simple'
-
-self.executeScript(self.COMMON_SERVICES_PACKAGE_DIR + 
"/scripts/journalnode.py",
-   classname = "JournalNode",
-   command = "security_status",
-   config_file="secured.json",
-   stack_version = self.STACK_VERSION,
-   target = RMFTestCase.TARGET_COMMON_SERVICES
-)
-
-put_structured_out_mock.assert_called_with({"securityState": "UNSECURED"})
-security_params['core-site']['hadoop.security.authentication'] = 'kerberos'
-
-# Testing that the exception throw by cached_executor is caught
-cached_kinit_executor_mock.reset_mock()
-cached_kinit_executor_mock.side_effect = Exception("Invalid command")
-
-try:
-  self.executeScript(self.COMMON_SERVICES_PACKAGE_DIR + 
"/scripts/journalnode.py",
- classname = "JournalNode",
- command = "security_status",
- config_file="secured.json",
- stack_version = self.STACK_VERSION,
- target = RMFTestCase.TARGET_COMMON_SERVICES
-  )
-except:
-  self.assertTrue(True)
-
-# Testing with a security_params which doesn't contains hdfs-site
-empty_security_params = {
-  'core-site': {
-'hadoop.security.authentication': 'kerberos'
-  }
-}
-cached_kinit_executor_mock.reset_mock()
-get_params_mock.reset_mock()
-put_structured_out_mock.reset_mock()
-get_params_mock.return_value = empty_security_params
-
-self.executeScript(self.COMMON_SERVICES_PACKAGE_DIR + 
"/scripts/journalnode.py",
-   classname = "JournalNode",
-   command = "security_status",
-   

ambari git commit: AMBARI-20670. Node manager start extremely slow when YARN NM local dirs are very large - ut fix (dgrinenko via echekanskiy)

2017-04-10 Thread echekanskiy
Repository: ambari
Updated Branches:
  refs/heads/branch-2.5 017f37283 -> 10da7925d


AMBARI-20670. Node manager start extremely slow when YARN NM local dirs are 
very large - ut fix (dgrinenko via echekanskiy)


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/10da7925
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/10da7925
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/10da7925

Branch: refs/heads/branch-2.5
Commit: 10da7925debade3610d1e500aeec08d125cdf2c8
Parents: 017f372
Author: Eugene Chekanskiy <echekans...@hortonworks.com>
Authored: Mon Apr 10 17:12:32 2017 +0300
Committer: Eugene Chekanskiy <echekans...@hortonworks.com>
Committed: Mon Apr 10 17:12:32 2017 +0300

--
 .../src/test/python/stacks/2.0.6/YARN/test_nodemanager.py  | 6 ++
 1 file changed, 2 insertions(+), 4 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/ambari/blob/10da7925/ambari-server/src/test/python/stacks/2.0.6/YARN/test_nodemanager.py
--
diff --git 
a/ambari-server/src/test/python/stacks/2.0.6/YARN/test_nodemanager.py 
b/ambari-server/src/test/python/stacks/2.0.6/YARN/test_nodemanager.py
index 10edb4b..ab5e2cd 100644
--- a/ambari-server/src/test/python/stacks/2.0.6/YARN/test_nodemanager.py
+++ b/ambari-server/src/test/python/stacks/2.0.6/YARN/test_nodemanager.py
@@ -169,8 +169,7 @@ class TestNodeManager(RMFTestCase):
   mode = 0755,
   create_parents = True,
   ignore_failures = True,
-  cd_access='a',
-  recursive_mode_flags = {'d': 'a+rwx', 'f': 
'a+rw'},
+  cd_access='a'
   )
 self.assertResourceCalled('Directory', '/hadoop/yarn/local1',
   owner = 'yarn',
@@ -178,8 +177,7 @@ class TestNodeManager(RMFTestCase):
   group = 'hadoop',
   ignore_failures = True,
   mode = 0755,
-  cd_access='a',
-  recursive_mode_flags = {'d': 'a+rwx', 'f': 
'a+rw'}
+  cd_access='a'
   )
 self.assertResourceCalled('File', 
'/var/lib/ambari-agent/data/yarn/yarn_local_dir_mount.hist',
 content = '\n# This file keeps track of the last known mount-point for 
each dir.\n# It is safe to delete, since it will get regenerated the next time 
that the component of the service starts.\n# However, it is not advised to 
delete this file since Ambari may\n# re-create a dir that used to be mounted on 
a drive but is now mounted on the root.\n# Comments begin with a hash (#) 
symbol\n# dir,mount_point\n',
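
The asserted file content above documents a small format: comment lines start
with '#', and each data line is a "dir,mount_point" pair. A hedged sketch of a
reader for that format; the function name and error handling are illustrative.

def read_dir_mount_points(path):
  # Returns {dir: mount_point} parsed from a yarn_local_dir_mount.hist-style file.
  mapping = {}
  with open(path) as hist_file:
    for line in hist_file:
      line = line.strip()
      if not line or line.startswith("#"):
        continue  # skip blank lines and comments
      directory, mount_point = line.split(",", 1)
      mapping[directory] = mount_point
  return mapping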



ambari git commit: AMBARI-20670. Node manager start extremely slow when YARN NM local dirs are very large - ut fix (dgrinenko via echekanskiy)

2017-04-10 Thread echekanskiy
Repository: ambari
Updated Branches:
  refs/heads/trunk 5377e6644 -> 753b38c74


AMBARI-20670. Node manager start extremely slow when YARN NM local dirs are 
very large - ut fix (dgrinenko via echekanskiy)


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/753b38c7
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/753b38c7
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/753b38c7

Branch: refs/heads/trunk
Commit: 753b38c74723e2ef24ead5112dacb54b127c0d1a
Parents: 5377e66
Author: Eugene Chekanskiy <echekans...@hortonworks.com>
Authored: Mon Apr 10 17:08:50 2017 +0300
Committer: Eugene Chekanskiy <echekans...@hortonworks.com>
Committed: Mon Apr 10 17:08:50 2017 +0300

--
 .../src/test/python/stacks/2.0.6/YARN/test_nodemanager.py  | 6 ++
 1 file changed, 2 insertions(+), 4 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/ambari/blob/753b38c7/ambari-server/src/test/python/stacks/2.0.6/YARN/test_nodemanager.py
--
diff --git 
a/ambari-server/src/test/python/stacks/2.0.6/YARN/test_nodemanager.py 
b/ambari-server/src/test/python/stacks/2.0.6/YARN/test_nodemanager.py
index 10edb4b..ab5e2cd 100644
--- a/ambari-server/src/test/python/stacks/2.0.6/YARN/test_nodemanager.py
+++ b/ambari-server/src/test/python/stacks/2.0.6/YARN/test_nodemanager.py
@@ -169,8 +169,7 @@ class TestNodeManager(RMFTestCase):
   mode = 0755,
   create_parents = True,
   ignore_failures = True,
-  cd_access='a',
-  recursive_mode_flags = {'d': 'a+rwx', 'f': 
'a+rw'},
+  cd_access='a'
   )
 self.assertResourceCalled('Directory', '/hadoop/yarn/local1',
   owner = 'yarn',
@@ -178,8 +177,7 @@ class TestNodeManager(RMFTestCase):
   group = 'hadoop',
   ignore_failures = True,
   mode = 0755,
-  cd_access='a',
-  recursive_mode_flags = {'d': 'a+rwx', 'f': 
'a+rw'}
+  cd_access='a'
   )
 self.assertResourceCalled('File', 
'/var/lib/ambari-agent/data/yarn/yarn_local_dir_mount.hist',
 content = '\n# This file keeps track of the last known mount-point for 
each dir.\n# It is safe to delete, since it will get regenerated the next time 
that the component of the service starts.\n# However, it is not advised to 
delete this file since Ambari may\n# re-create a dir that used to be mounted on 
a drive but is now mounted on the root.\n# Comments begin with a hash (#) 
symbol\n# dir,mount_point\n',



ambari git commit: AMBARI-20632. With multi-process StatusCommandsExecutor, Status commands are taking too long to report back (echekanskiy)

2017-04-03 Thread echekanskiy
Repository: ambari
Updated Branches:
  refs/heads/trunk f389293bf -> f8827fea4


AMBARI-20632. With multi-process StatusCommandsExecutor, Status commands are 
taking too long to report back (echekanskiy)


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/f8827fea
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/f8827fea
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/f8827fea

Branch: refs/heads/trunk
Commit: f8827fea405482289a8849ebe2a7dcf2fee11294
Parents: f389293
Author: Eugene Chekanskiy <echekans...@hortonworks.com>
Authored: Mon Apr 3 16:46:27 2017 +0300
Committer: Eugene Chekanskiy <echekans...@hortonworks.com>
Committed: Mon Apr 3 16:46:27 2017 +0300

--
 .../ambari_agent/StatusCommandsExecutor.py  | 209 +--
 .../src/main/python/ambari_agent/main.py|   5 +-
 2 files changed, 98 insertions(+), 116 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/ambari/blob/f8827fea/ambari-agent/src/main/python/ambari_agent/StatusCommandsExecutor.py
--
diff --git 
a/ambari-agent/src/main/python/ambari_agent/StatusCommandsExecutor.py 
b/ambari-agent/src/main/python/ambari_agent/StatusCommandsExecutor.py
index 04a3e85..142e7ca 100644
--- a/ambari-agent/src/main/python/ambari_agent/StatusCommandsExecutor.py
+++ b/ambari-agent/src/main/python/ambari_agent/StatusCommandsExecutor.py
@@ -49,7 +49,7 @@ class 
SingleProcessStatusCommandsExecutor(StatusCommandsExecutor):
 self.config = config
 self.actionQueue = actionQueue
 self.statusCommandQueue = Queue.Queue()
-self.need_relaunch = False
+self.need_relaunch = (False, None)  # tuple (bool, str|None): relaunch flag and the reason for relaunching
 
   def put_commands(self, commands):
 with self.statusCommandQueue.mutex:
@@ -88,12 +88,13 @@ class 
MultiProcessStatusCommandsExecutor(StatusCommandsExecutor):
 self.config = config
 self.actionQueue = actionQueue
 
-self._can_relaunch_lock = threading.RLock()
-self._can_relaunch = True
+self.can_relaunch = True
 
 # used to prevent queues from being used during the creation of a new one, so that
 # threads do not mess up with a combination of old and new queues
 self.usage_lock = threading.RLock()
+# protects against simultaneous killing/creating from different threads.
+self.kill_lock = threading.RLock()
 
 self.status_command_timeout = int(self.config.get('agent', 
'status_command_timeout', 5))
 self.customServiceOrchestrator = self.actionQueue.customServiceOrchestrator
@@ -107,42 +108,32 @@ class 
MultiProcessStatusCommandsExecutor(StatusCommandsExecutor):
 self.mp_result_logs = multiprocessing.Queue()
 self.mp_task_queue = multiprocessing.Queue()
 
-  @property
-  def can_relaunch(self):
-with self._can_relaunch_lock:
-  return self._can_relaunch
-
-  @can_relaunch.setter
-  def can_relaunch(self, value):
-with self._can_relaunch_lock:
-  self._can_relaunch = value
-
-  def _log_message(self, level, message, exception=None):
-"""
-Put log message to logging queue. Must be used only for logging from child 
process(in _worker_process_target).
-
-:param level:
-:param message:
-:param exception:
-:return:
+  def _drain_queue(self, target_queue, max_time=5, max_empty_count=15, read_break=.001):
 """
-result_message = "StatusCommandExecutor reporting at {0}: 
".format(time.time()) + message
-self.mp_result_logs.put((level, result_message, exception))
-
-  def _get_log_messages(self):
-"""
-Returns list of (level, message, exception) log messages.
-
-:return: list of (level, message, exception)
+Read everything currently available in the queue. This uses the unreliable
+multiprocessing.Queue methods (qsize, empty), so it contains a very simple
+protection against blocking in this method for too long: it will try to get all
+available items for no more than ``max_time`` seconds, and will return after
+``max_empty_count`` calls to ``target_queue.get(False)`` that raised the
+``Queue.Empty`` exception. Note the ``read_break`` argument: with the default
+values this method is able to read ~4500 ``range(1,1)`` objects in 5 seconds,
+so do not fill the queue too fast.
+
+:param target_queue: queue to read from
+:param max_time: maximum time to spend in this method call
+:param max_empty_count: maximum allowed ``Queue.Empty`` in a row
+:param read_break: time to wait before next read cycle iteration
+:return: list of resulting objects
 """
 results = []
+_empty = 0
+_start = time.time()
 with self.usage_lock:
   try:
-while not self.mp_result_logs.empty():

ambari git commit: AMBARI-20632. With multi-process StatusCommandsExecutor, Status commands are taking too long to report back (echekanskiy)

2017-04-03 Thread echekanskiy
Repository: ambari
Updated Branches:
  refs/heads/branch-2.5 7f472bab0 -> bb8b73e84


AMBARI-20632. With multi-process StatusCommandsExecutor, Status commands are 
taking too long to report back (echekanskiy)


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/bb8b73e8
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/bb8b73e8
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/bb8b73e8

Branch: refs/heads/branch-2.5
Commit: bb8b73e844e733f36715c9b9f4d985c12bcdf074
Parents: 7f472ba
Author: Eugene Chekanskiy <echekans...@hortonworks.com>
Authored: Mon Apr 3 16:44:25 2017 +0300
Committer: Eugene Chekanskiy <echekans...@hortonworks.com>
Committed: Mon Apr 3 16:44:25 2017 +0300

--
 .../main/python/ambari_agent/AmbariConfig.py|   2 +-
 .../ambari_agent/StatusCommandsExecutor.py  | 216 +--
 .../src/main/python/ambari_agent/main.py|   5 +-
 3 files changed, 104 insertions(+), 119 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/ambari/blob/bb8b73e8/ambari-agent/src/main/python/ambari_agent/AmbariConfig.py
--
diff --git a/ambari-agent/src/main/python/ambari_agent/AmbariConfig.py 
b/ambari-agent/src/main/python/ambari_agent/AmbariConfig.py
index 11cf101..2fafa3b 100644
--- a/ambari-agent/src/main/python/ambari_agent/AmbariConfig.py
+++ b/ambari-agent/src/main/python/ambari_agent/AmbariConfig.py
@@ -293,7 +293,7 @@ class AmbariConfig:
 return int(self.get('agent', 'parallel_execution', 0))
 
   def get_multiprocess_status_commands_executor_enabled(self):
-return bool(int(self.get('agent', 
'multiprocess_status_commands_executor_enabled', 0)))
+return bool(int(self.get('agent', 
'multiprocess_status_commands_executor_enabled', 1)))
 
   def update_configuration_from_registration(self, reg_resp):
 if reg_resp and AmbariConfig.AMBARI_PROPERTIES_CATEGORY in reg_resp:

http://git-wip-us.apache.org/repos/asf/ambari/blob/bb8b73e8/ambari-agent/src/main/python/ambari_agent/StatusCommandsExecutor.py
--
diff --git 
a/ambari-agent/src/main/python/ambari_agent/StatusCommandsExecutor.py 
b/ambari-agent/src/main/python/ambari_agent/StatusCommandsExecutor.py
index eaa24c2..5a8e4ce 100644
--- a/ambari-agent/src/main/python/ambari_agent/StatusCommandsExecutor.py
+++ b/ambari-agent/src/main/python/ambari_agent/StatusCommandsExecutor.py
@@ -49,11 +49,14 @@ class 
SingleProcessStatusCommandsExecutor(StatusCommandsExecutor):
 self.config = config
 self.actionQueue = actionQueue
 self.statusCommandQueue = Queue.Queue()
-self.need_relaunch = False
+self.need_relaunch = (False, None)  # tuple (bool, str|None): relaunch flag and the reason for relaunching
 
   def put_commands(self, commands):
-while not self.statusCommandQueue.empty():
-  self.statusCommandQueue.get()
+with self.statusCommandQueue.mutex:
+  qlen = len(self.statusCommandQueue.queue)
+  if qlen:
+logger.info("Removing %s stale status commands from queue", qlen)
+  self.statusCommandQueue.queue.clear()
 
 for command in commands:
   logger.info("Adding " + command['commandType'] + " for component " + \
@@ -85,12 +88,13 @@ class 
MultiProcessStatusCommandsExecutor(StatusCommandsExecutor):
 self.config = config
 self.actionQueue = actionQueue
 
-self._can_relaunch_lock = threading.RLock()
-self._can_relaunch = True
+self.can_relaunch = True
 
 # used to prevent queues from being used during the creation of a new one, so that
 # threads do not mess up with a combination of old and new queues
 self.usage_lock = threading.RLock()
+# protects against simultaneous killing/creating from different threads.
+self.kill_lock = threading.RLock()
 
 self.status_command_timeout = int(self.config.get('agent', 
'status_command_timeout', 5))
 self.customServiceOrchestrator = self.actionQueue.customServiceOrchestrator
@@ -104,42 +108,32 @@ class 
MultiProcessStatusCommandsExecutor(StatusCommandsExecutor):
 self.mp_result_logs = multiprocessing.Queue()
 self.mp_task_queue = multiprocessing.Queue()
 
-  @property
-  def can_relaunch(self):
-with self._can_relaunch_lock:
-  return self._can_relaunch
-
-  @can_relaunch.setter
-  def can_relaunch(self, value):
-with self._can_relaunch_lock:
-  self._can_relaunch = value
-
-  def _log_message(self, level, message, exception=None):
-"""
-Put log message to logging queue. Must be used only for logging from child 
process(in _worker_process_target).
-
-:param level:
-:param message:
-:param exception:
-:return:
-"""
-re

ambari git commit: AMBARI-20582 Single process executor possibly can cause deadlock.

2017-03-27 Thread echekanskiy
Repository: ambari
Updated Branches:
  refs/heads/trunk d69b58234 -> db9e72912


AMBARI-20582 Single process executor possibly can cause deadlock.


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/db9e7291
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/db9e7291
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/db9e7291

Branch: refs/heads/trunk
Commit: db9e72912e6948352bc15f3e146f96631e1157c8
Parents: d69b582
Author: Eugene Chekanskiy 
Authored: Mon Mar 27 18:24:44 2017 +0300
Committer: Eugene Chekanskiy 
Committed: Mon Mar 27 18:24:44 2017 +0300

--
 .../src/main/python/ambari_agent/StatusCommandsExecutor.py| 7 +--
 1 file changed, 5 insertions(+), 2 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/ambari/blob/db9e7291/ambari-agent/src/main/python/ambari_agent/StatusCommandsExecutor.py
--
diff --git 
a/ambari-agent/src/main/python/ambari_agent/StatusCommandsExecutor.py 
b/ambari-agent/src/main/python/ambari_agent/StatusCommandsExecutor.py
index eaa24c2..04a3e85 100644
--- a/ambari-agent/src/main/python/ambari_agent/StatusCommandsExecutor.py
+++ b/ambari-agent/src/main/python/ambari_agent/StatusCommandsExecutor.py
@@ -52,8 +52,11 @@ class 
SingleProcessStatusCommandsExecutor(StatusCommandsExecutor):
 self.need_relaunch = False
 
   def put_commands(self, commands):
-while not self.statusCommandQueue.empty():
-  self.statusCommandQueue.get()
+with self.statusCommandQueue.mutex:
+  qlen = len(self.statusCommandQueue.queue)
+  if qlen:
+logger.info("Removing %s stale status commands from queue", qlen)
+  self.statusCommandQueue.queue.clear()
 
 for command in commands:
   logger.info("Adding " + command['commandType'] + " for component " + \



ambari git commit: AMBARI-20504. Ambari-agent log rotation is broken (echekanskiy)

2017-03-20 Thread echekanskiy
Repository: ambari
Updated Branches:
  refs/heads/branch-2.5 bc37bf748 -> 9218fbc2e


AMBARI-20504. Ambari-agent log rotation is broken (echekanskiy)


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/9218fbc2
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/9218fbc2
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/9218fbc2

Branch: refs/heads/branch-2.5
Commit: 9218fbc2ee2180ed9f6d287913710d75130af215
Parents: bc37bf7
Author: Eugene Chekanskiy <echekans...@hortonworks.com>
Authored: Mon Mar 20 20:18:07 2017 +0200
Committer: Eugene Chekanskiy <echekans...@hortonworks.com>
Committed: Mon Mar 20 20:18:07 2017 +0200

--
 ambari-agent/src/main/python/ambari_agent/main.py | 12 ++--
 1 file changed, 10 insertions(+), 2 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/ambari/blob/9218fbc2/ambari-agent/src/main/python/ambari_agent/main.py
--
diff --git a/ambari-agent/src/main/python/ambari_agent/main.py 
b/ambari-agent/src/main/python/ambari_agent/main.py
index 604b234..272cc06 100644
--- a/ambari-agent/src/main/python/ambari_agent/main.py
+++ b/ambari-agent/src/main/python/ambari_agent/main.py
@@ -106,6 +106,7 @@ import HeartbeatHandlers
 from HeartbeatHandlers import bind_signal_handlers
 from ambari_commons.constants import AMBARI_SUDO_BINARY
 from resource_management.core.logger import Logger
+
 logger = logging.getLogger()
 alerts_logger = logging.getLogger('ambari_alerts')
 
@@ -124,10 +125,17 @@ IS_LINUX = platform.system() == "Linux"
 SYSLOG_FORMAT_STRING = ' ambari_agent - %(filename)s - [%(process)d] - 
%(name)s - %(levelname)s - %(message)s'
 SYSLOG_FORMATTER = logging.Formatter(SYSLOG_FORMAT_STRING)
 
+_file_logging_handlers ={}
+
 def setup_logging(logger, filename, logging_level):
   formatter = logging.Formatter(formatstr)
-  rotateLog = logging.handlers.RotatingFileHandler(filename, "a", 1000, 25)
-  rotateLog.setFormatter(formatter)
+
+  if filename in _file_logging_handlers:
+rotateLog = _file_logging_handlers[filename]
+  else:
+rotateLog = logging.handlers.RotatingFileHandler(filename, "a", 1000, 
25)
+rotateLog.setFormatter(formatter)
+_file_logging_handlers[filename] = rotateLog
   logger.addHandler(rotateLog)
   
   logging.basicConfig(format=formatstr, level=logging_level, filename=filename)
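
A quick illustration of the failure mode the hunk above addresses, assuming
setup_logging can run more than once in a process: each call used to attach a
fresh RotatingFileHandler, so one logger ends up with several handlers on the
same file, and once one of them rolls the file over the others keep writing to
the renamed copy. Caching one handler per filename keeps the handler count at
one. The sizes and paths below are illustrative.

import logging
import logging.handlers

def naive_setup(logger, filename):
  # Pre-fix behaviour: every call stacks another handler for the same file.
  handler = logging.handlers.RotatingFileHandler(filename, "a", maxBytes=1000, backupCount=2)
  logger.addHandler(handler)

log = logging.getLogger("rotation-demo")
naive_setup(log, "/tmp/rotation-demo.log")
naive_setup(log, "/tmp/rotation-demo.log")
print(len(log.handlers))  # 2: duplicate handlers, which is what breaks rotation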



ambari git commit: AMBARI-20504. Ambari-agent log rotation is broken (echekanskiy)

2017-03-20 Thread echekanskiy
Repository: ambari
Updated Branches:
  refs/heads/trunk 5a7adeb0e -> 97d9b0595


AMBARI-20504. Ambari-agent log rotation is broken (echekanskiy)


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/97d9b059
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/97d9b059
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/97d9b059

Branch: refs/heads/trunk
Commit: 97d9b05956a74a34710a9e2fe77f545e636e7c35
Parents: 5a7adeb
Author: Eugene Chekanskiy <echekans...@hortonworks.com>
Authored: Mon Mar 20 20:16:31 2017 +0200
Committer: Eugene Chekanskiy <echekans...@hortonworks.com>
Committed: Mon Mar 20 20:16:31 2017 +0200

--
 ambari-agent/src/main/python/ambari_agent/main.py | 12 ++--
 1 file changed, 10 insertions(+), 2 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/ambari/blob/97d9b059/ambari-agent/src/main/python/ambari_agent/main.py
--
diff --git a/ambari-agent/src/main/python/ambari_agent/main.py 
b/ambari-agent/src/main/python/ambari_agent/main.py
index 8e55916..236de25 100644
--- a/ambari-agent/src/main/python/ambari_agent/main.py
+++ b/ambari-agent/src/main/python/ambari_agent/main.py
@@ -107,6 +107,7 @@ import HeartbeatHandlers
 from HeartbeatHandlers import bind_signal_handlers
 from ambari_commons.constants import AMBARI_SUDO_BINARY
 from resource_management.core.logger import Logger
+
 logger = logging.getLogger()
 alerts_logger = logging.getLogger('ambari_alerts')
 
@@ -125,10 +126,17 @@ IS_LINUX = platform.system() == "Linux"
 SYSLOG_FORMAT_STRING = ' ambari_agent - %(filename)s - [%(process)d] - 
%(name)s - %(levelname)s - %(message)s'
 SYSLOG_FORMATTER = logging.Formatter(SYSLOG_FORMAT_STRING)
 
+_file_logging_handlers ={}
+
 def setup_logging(logger, filename, logging_level):
   formatter = logging.Formatter(formatstr)
-  rotateLog = logging.handlers.RotatingFileHandler(filename, "a", 1000, 25)
-  rotateLog.setFormatter(formatter)
+
+  if filename in _file_logging_handlers:
+    rotateLog = _file_logging_handlers[filename]
+  else:
+    rotateLog = logging.handlers.RotatingFileHandler(filename, "a", 1000, 25)
+    rotateLog.setFormatter(formatter)
+    _file_logging_handlers[filename] = rotateLog
   logger.addHandler(rotateLog)
   
   logging.basicConfig(format=formatstr, level=logging_level, filename=filename)



ambari git commit: AMBARI-20372. RU: Oozie LR job failed (dgrinenko via echekanskiy)

2017-03-12 Thread echekanskiy
Repository: ambari
Updated Branches:
  refs/heads/branch-2.5 18aa01435 -> ad566344a


AMBARI-20372. RU: Oozie LR job failed (dgrinenko via echekanskiy)


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/ad566344
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/ad566344
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/ad566344

Branch: refs/heads/branch-2.5
Commit: ad566344a0cc956daad3c7f238b3a040b764993b
Parents: 18aa014
Author: Eugene Chekanskiy <echekans...@hortonworks.com>
Authored: Sun Mar 12 16:19:15 2017 +0200
Committer: Eugene Chekanskiy <echekans...@hortonworks.com>
Committed: Sun Mar 12 16:19:15 2017 +0200

--
 .../YARN/configuration-mapred/mapred-site.xml   | 28 
 1 file changed, 28 insertions(+)
--


http://git-wip-us.apache.org/repos/asf/ambari/blob/ad566344/ambari-server/src/main/resources/stacks/HDP/2.6/services/YARN/configuration-mapred/mapred-site.xml
--
diff --git 
a/ambari-server/src/main/resources/stacks/HDP/2.6/services/YARN/configuration-mapred/mapred-site.xml
 
b/ambari-server/src/main/resources/stacks/HDP/2.6/services/YARN/configuration-mapred/mapred-site.xml
new file mode 100644
index 000..bbfdc24
--- /dev/null
+++ 
b/ambari-server/src/main/resources/stacks/HDP/2.6/services/YARN/configuration-mapred/mapred-site.xml
@@ -0,0 +1,28 @@
+<?xml version="1.0"?>
+<configuration>
+  <property>
+    <name>yarn.app.mapreduce.client.job.max-retries</name>
+    <value>30</value>
+    <description>
+      The number of retries the client will make for getJob and dependent calls.
+    </description>
+  </property>
+</configuration>
\ No newline at end of file



ambari git commit: AMBARI-20372. RU: Oozie LR job failed (dgrinenko via echekanskiy)

2017-03-12 Thread echekanskiy
Repository: ambari
Updated Branches:
  refs/heads/trunk ff235b490 -> b13467fb4


AMBARI-20372. RU: Oozie LR job failed (dgrinenko via echekanskiy)


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/b13467fb
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/b13467fb
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/b13467fb

Branch: refs/heads/trunk
Commit: b13467fb470ec4c1caf64fad4a5a88f779a14b5e
Parents: ff235b4
Author: Eugene Chekanskiy <echekans...@hortonworks.com>
Authored: Sun Mar 12 16:17:04 2017 +0200
Committer: Eugene Chekanskiy <echekans...@hortonworks.com>
Committed: Sun Mar 12 16:17:04 2017 +0200

--
 .../YARN/configuration-mapred/mapred-site.xml   | 28 
 1 file changed, 28 insertions(+)
--


http://git-wip-us.apache.org/repos/asf/ambari/blob/b13467fb/ambari-server/src/main/resources/stacks/HDP/2.6/services/YARN/configuration-mapred/mapred-site.xml
--
diff --git 
a/ambari-server/src/main/resources/stacks/HDP/2.6/services/YARN/configuration-mapred/mapred-site.xml
 
b/ambari-server/src/main/resources/stacks/HDP/2.6/services/YARN/configuration-mapred/mapred-site.xml
new file mode 100644
index 000..bbfdc24
--- /dev/null
+++ 
b/ambari-server/src/main/resources/stacks/HDP/2.6/services/YARN/configuration-mapred/mapred-site.xml
@@ -0,0 +1,28 @@
+<?xml version="1.0"?>
+<configuration>
+  <property>
+    <name>yarn.app.mapreduce.client.job.max-retries</name>
+    <value>30</value>
+    <description>
+      The number of retries the client will make for getJob and dependent calls.
+    </description>
+  </property>
+</configuration>
\ No newline at end of file



ambari git commit: AMBARI-20323. Commands timed-out on ambari host without any error logs - addendum patch (echekanskiy)

2017-03-10 Thread echekanskiy
Repository: ambari
Updated Branches:
  refs/heads/trunk 0471b0c37 -> b69ac43a6


AMBARI-20323. Commands timed-out on ambari host without any error logs - 
addendum patch (echekanskiy)


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/b69ac43a
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/b69ac43a
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/b69ac43a

Branch: refs/heads/trunk
Commit: b69ac43a6a7aa5c7810bca0bc1204e6641634c35
Parents: 0471b0c
Author: Eugene Chekanskiy <echekans...@hortonworks.com>
Authored: Sat Mar 11 00:08:27 2017 +0200
Committer: Eugene Chekanskiy <echekans...@hortonworks.com>
Committed: Sat Mar 11 00:08:27 2017 +0200

--
 .../src/main/python/ambari_agent/Controller.py  |  2 +-
 .../src/main/python/ambari_agent/ExitHelper.py  |  3 ++
 .../ambari_agent/StatusCommandsExecutor.py  | 36 
 .../src/main/python/ambari_agent/main.py|  4 +--
 4 files changed, 35 insertions(+), 10 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/ambari/blob/b69ac43a/ambari-agent/src/main/python/ambari_agent/Controller.py
--
diff --git a/ambari-agent/src/main/python/ambari_agent/Controller.py 
b/ambari-agent/src/main/python/ambari_agent/Controller.py
index c152f64..c1a5f1b 100644
--- a/ambari-agent/src/main/python/ambari_agent/Controller.py
+++ b/ambari-agent/src/main/python/ambari_agent/Controller.py
@@ -477,7 +477,7 @@ class Controller(threading.Thread):
 try:
   self.actionQueue = ActionQueue(self.config, controller=self)
   self.statusCommandsExecutor = StatusCommandsExecutor(self.config, 
self.actionQueue)
-      ExitHelper().register(self.statusCommandsExecutor.kill, "CLEANUP_KILLING")
+      ExitHelper().register(self.statusCommandsExecutor.kill, "CLEANUP_KILLING", can_relaunch=False)
   self.actionQueue.start()
   self.register = Register(self.config)
   self.heartbeat = Heartbeat(self.actionQueue, self.config, 
self.alert_scheduler_handler.collector())

http://git-wip-us.apache.org/repos/asf/ambari/blob/b69ac43a/ambari-agent/src/main/python/ambari_agent/ExitHelper.py
--
diff --git a/ambari-agent/src/main/python/ambari_agent/ExitHelper.py 
b/ambari-agent/src/main/python/ambari_agent/ExitHelper.py
index e51646f..66e29e6 100644
--- a/ambari-agent/src/main/python/ambari_agent/ExitHelper.py
+++ b/ambari-agent/src/main/python/ambari_agent/ExitHelper.py
@@ -39,6 +39,9 @@ class ExitHelper(object):
   """
   Class to cleanup resources before exiting. Replacement for atexit module. 
sys.exit(code) works only from threads and
   os._exit(code) will ignore atexit and cleanup will be ignored.
+
+  WARNING: always import as `ambari_agent.ExitHelper import ExitHelper`, 
otherwise it will be imported twice and nothing
+  will work as expected.
   """
   __metaclass__ = _singleton
 

http://git-wip-us.apache.org/repos/asf/ambari/blob/b69ac43a/ambari-agent/src/main/python/ambari_agent/StatusCommandsExecutor.py
--
diff --git 
a/ambari-agent/src/main/python/ambari_agent/StatusCommandsExecutor.py 
b/ambari-agent/src/main/python/ambari_agent/StatusCommandsExecutor.py
index 3f7ef4c..5c1c54a 100644
--- a/ambari-agent/src/main/python/ambari_agent/StatusCommandsExecutor.py
+++ b/ambari-agent/src/main/python/ambari_agent/StatusCommandsExecutor.py
@@ -37,6 +37,9 @@ class StatusCommandsExecutor(object):
 self.config = config
 self.actionQueue = actionQueue
 
+    self._can_relaunch_lock = threading.RLock()
+    self._can_relaunch = True
+
 # used to prevent queues from been used during creation of new one to 
prevent threads messing up with combination of
 # old and new queues
 self.usage_lock = threading.RLock()
@@ -53,6 +56,16 @@ class StatusCommandsExecutor(object):
 self.mp_result_logs = multiprocessing.Queue()
 self.mp_task_queue = multiprocessing.Queue()
 
+  @property
+  def can_relaunch(self):
+    with self._can_relaunch_lock:
+      return self._can_relaunch
+
+  @can_relaunch.setter
+  def can_relaunch(self, value):
+    with self._can_relaunch_lock:
+      self._can_relaunch = value
+
   def _log_message(self, level, message, exception=None):
 """
 Put log message to logging queue. Must be used only for logging from child 
process(in _worker_process_target).
@@ -163,7 +176,7 @@ class StatusCommandsExecutor(object):
   self._log_message(logging.ERROR, "StatusCommandsExecutor process failed 
with exception:", e)
   raise
 
-self._log_message(logging.WARN, "StatusCommandsExecutor subproces
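
The new can_relaunch flag is read from the main thread and cleared from the agent's exit hook, so the patch guards it with an RLock behind a property/setter pair. A self-contained sketch of that pattern (the class name is made up for illustration; only the lock/property shape mirrors the patch):

import threading

class RelaunchGuard(object):
  """Illustrative stand-in for the executor's relaunch bookkeeping."""

  def __init__(self):
    self._can_relaunch_lock = threading.RLock()
    self._can_relaunch = True

  @property
  def can_relaunch(self):
    with self._can_relaunch_lock:
      return self._can_relaunch

  @can_relaunch.setter
  def can_relaunch(self, value):
    with self._can_relaunch_lock:
      self._can_relaunch = value

# Typical use: the cleanup hook flips the flag so no new child process is spawned.
guard = RelaunchGuard()
guard.can_relaunch = False
assert guard.can_relaunch is False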

ambari git commit: AMBARI-20323. Commands timed-out on ambari host without any error logs - addendum patch (echekanskiy)

2017-03-10 Thread echekanskiy
Repository: ambari
Updated Branches:
  refs/heads/branch-2.5 0bd7b8643 -> 290218f76


AMBARI-20323. Commands timed-out on ambari host without any error logs - 
addendum patch (echekanskiy)


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/290218f7
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/290218f7
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/290218f7

Branch: refs/heads/branch-2.5
Commit: 290218f761f6c19355b79379041634fabd5bcbb2
Parents: 0bd7b86
Author: Eugene Chekanskiy <echekans...@hortonworks.com>
Authored: Sat Mar 11 00:08:59 2017 +0200
Committer: Eugene Chekanskiy <echekans...@hortonworks.com>
Committed: Sat Mar 11 00:08:59 2017 +0200

--
 .../src/main/python/ambari_agent/Controller.py  |  2 +-
 .../src/main/python/ambari_agent/ExitHelper.py  |  3 ++
 .../ambari_agent/StatusCommandsExecutor.py  | 36 
 .../src/main/python/ambari_agent/main.py|  4 +--
 4 files changed, 35 insertions(+), 10 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/ambari/blob/290218f7/ambari-agent/src/main/python/ambari_agent/Controller.py
--
diff --git a/ambari-agent/src/main/python/ambari_agent/Controller.py 
b/ambari-agent/src/main/python/ambari_agent/Controller.py
index 301ad43..a123f2f 100644
--- a/ambari-agent/src/main/python/ambari_agent/Controller.py
+++ b/ambari-agent/src/main/python/ambari_agent/Controller.py
@@ -477,7 +477,7 @@ class Controller(threading.Thread):
 try:
   self.actionQueue = ActionQueue(self.config, controller=self)
   self.statusCommandsExecutor = StatusCommandsExecutor(self.config, 
self.actionQueue)
-      ExitHelper().register(self.statusCommandsExecutor.kill, "CLEANUP_KILLING")
+      ExitHelper().register(self.statusCommandsExecutor.kill, "CLEANUP_KILLING", can_relaunch=False)
   self.actionQueue.start()
   self.register = Register(self.config)
   self.heartbeat = Heartbeat(self.actionQueue, self.config, 
self.alert_scheduler_handler.collector())

http://git-wip-us.apache.org/repos/asf/ambari/blob/290218f7/ambari-agent/src/main/python/ambari_agent/ExitHelper.py
--
diff --git a/ambari-agent/src/main/python/ambari_agent/ExitHelper.py 
b/ambari-agent/src/main/python/ambari_agent/ExitHelper.py
index e51646f..66e29e6 100644
--- a/ambari-agent/src/main/python/ambari_agent/ExitHelper.py
+++ b/ambari-agent/src/main/python/ambari_agent/ExitHelper.py
@@ -39,6 +39,9 @@ class ExitHelper(object):
   """
   Class to cleanup resources before exiting. Replacement for atexit module. 
sys.exit(code) works only from threads and
   os._exit(code) will ignore atexit and cleanup will be ignored.
+
+  WARNING: always import as `ambari_agent.ExitHelper import ExitHelper`, 
otherwise it will be imported twice and nothing
+  will work as expected.
   """
   __metaclass__ = _singleton
 

http://git-wip-us.apache.org/repos/asf/ambari/blob/290218f7/ambari-agent/src/main/python/ambari_agent/StatusCommandsExecutor.py
--
diff --git 
a/ambari-agent/src/main/python/ambari_agent/StatusCommandsExecutor.py 
b/ambari-agent/src/main/python/ambari_agent/StatusCommandsExecutor.py
index 3f7ef4c..5c1c54a 100644
--- a/ambari-agent/src/main/python/ambari_agent/StatusCommandsExecutor.py
+++ b/ambari-agent/src/main/python/ambari_agent/StatusCommandsExecutor.py
@@ -37,6 +37,9 @@ class StatusCommandsExecutor(object):
 self.config = config
 self.actionQueue = actionQueue
 
+    self._can_relaunch_lock = threading.RLock()
+    self._can_relaunch = True
+
 # used to prevent queues from been used during creation of new one to 
prevent threads messing up with combination of
 # old and new queues
 self.usage_lock = threading.RLock()
@@ -53,6 +56,16 @@ class StatusCommandsExecutor(object):
 self.mp_result_logs = multiprocessing.Queue()
 self.mp_task_queue = multiprocessing.Queue()
 
+  @property
+  def can_relaunch(self):
+    with self._can_relaunch_lock:
+      return self._can_relaunch
+
+  @can_relaunch.setter
+  def can_relaunch(self, value):
+    with self._can_relaunch_lock:
+      self._can_relaunch = value
+
   def _log_message(self, level, message, exception=None):
 """
 Put log message to logging queue. Must be used only for logging from child 
process(in _worker_process_target).
@@ -163,7 +176,7 @@ class StatusCommandsExecutor(object):
   self._log_message(logging.ERROR, "StatusCommandsExecutor process failed 
with exception:", e)
   raise
 
-self._log_message(logging.WARN, "StatusCommandsExecutor

ambari git commit: AMBARI-20323. Commands timed-out on ambari host without any error logs (echekanskiy)

2017-03-09 Thread echekanskiy
Repository: ambari
Updated Branches:
  refs/heads/branch-2.5 c6a9a3ca4 -> 17ef55594


AMBARI-20323. Commands timed-out on ambari host without any error logs 
(echekanskiy)


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/17ef5559
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/17ef5559
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/17ef5559

Branch: refs/heads/branch-2.5
Commit: 17ef555940758b73cd09ddcc9fc8a3461604c085
Parents: c6a9a3c
Author: Eugene Chekanskiy <echekans...@hortonworks.com>
Authored: Thu Mar 9 18:30:20 2017 +0200
Committer: Eugene Chekanskiy <echekans...@hortonworks.com>
Committed: Thu Mar 9 18:30:20 2017 +0200

--
 .../src/main/python/ambari_agent/ActionQueue.py |  52 +---
 .../src/main/python/ambari_agent/Controller.py  |  54 +---
 .../ambari_agent/StatusCommandsExecutor.py  | 307 +++
 .../src/main/python/ambari_agent/main.py|  12 +-
 .../test/python/ambari_agent/TestActionQueue.py |   4 +-
 .../test/python/ambari_agent/TestController.py  |   3 -
 .../src/test/python/ambari_agent/TestMain.py|   9 +-
 7 files changed, 280 insertions(+), 161 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/ambari/blob/17ef5559/ambari-agent/src/main/python/ambari_agent/ActionQueue.py
--
diff --git a/ambari-agent/src/main/python/ambari_agent/ActionQueue.py 
b/ambari-agent/src/main/python/ambari_agent/ActionQueue.py
index 5300b52..15ae03d 100644
--- a/ambari-agent/src/main/python/ambari_agent/ActionQueue.py
+++ b/ambari-agent/src/main/python/ambari_agent/ActionQueue.py
@@ -76,10 +76,6 @@ class ActionQueue(threading.Thread):
   def __init__(self, config, controller):
 super(ActionQueue, self).__init__()
 self.commandQueue = Queue.Queue()
-self.statusCommandQueue = None # the queue this field points to is 
re-created whenever
-   # a new StatusCommandExecutor child process 
is spawned
-   # by Controller
-# multiprocessing.Queue()
 self.statusCommandResultQueue = multiprocessing.Queue() # this queue is 
filled by StatuCommandsExecutor.
 self.backgroundCommandQueue = Queue.Queue()
 self.commandStatuses = CommandStatusDict(callback_action =
@@ -102,25 +98,7 @@ class ActionQueue(threading.Thread):
 return self._stop.isSet()
 
   def put_status(self, commands):
-    if not self.statusCommandQueue.empty():
-      #Clear all status commands. Was supposed that we got all set of statuses, we don't need to keep old ones
-      statusCommandQueueSize = 0
-      try:
-        while not self.statusCommandQueue.empty():
-          self.statusCommandQueue.get(False)
-          statusCommandQueueSize = statusCommandQueueSize + 1
-      except Queue.Empty:
-        pass
-
-      logger.info("Number of status commands removed from queue : " + str(statusCommandQueueSize))
-
-    for command in commands:
-      logger.info("Adding " + command['commandType'] + " for component " + \
-                  command['componentName'] + " of service " + \
-                  command['serviceName'] + " of cluster " + \
-                  command['clusterName'] + " to the queue.")
-      self.statusCommandQueue.put(command)
-      logger.debug(pprint.pformat(command))
+    self.controller.statusCommandsExecutor.put_commands(commands)
 
   def put(self, commands):
 for command in commands:
@@ -167,8 +145,8 @@ class ActionQueue(threading.Thread):
   def run(self):
 try:
   while not self.stopped():
-        self.processBackgroundQueueSafeEmpty();
-        self.processStatusCommandResultQueueSafeEmpty();
+        self.processBackgroundQueueSafeEmpty()
+        self.process_status_command_results()
 try:
   if self.parallel_execution == 0:
 command = self.commandQueue.get(True, 
self.EXECUTION_COMMAND_WAIT_TIME)
@@ -212,23 +190,13 @@ class ActionQueue(threading.Thread):
   except Queue.Empty:
 pass
 
-  def processStatusCommandResultQueueSafeEmpty(self):
-    try:
-      while not self.statusCommandResultQueue.empty():
-        try:
-          result = self.statusCommandResultQueue.get(False)
-          self.process_status_command_result(result)
-        except Queue.Empty:
-          pass
-        except IOError:
-          # on race condition in multiprocessing.Queue if get/put and thread kill are executed at the same time.
-          # During queue.close IOError will be thrown (this prevents from permanently dead-locked get).
-          pass
-        except UnicodeDecodeError:
-          pass
-    except IOError:
-  # queue.empty() ma

ambari git commit: AMBARI-20323. Commands timed-out on ambari host without any error logs (echekanskiy)

2017-03-09 Thread echekanskiy
Repository: ambari
Updated Branches:
  refs/heads/trunk 8ce47e0cb -> da0b7ad8f


AMBARI-20323. Commands timed-out on ambari host without any error logs 
(echekanskiy)


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/da0b7ad8
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/da0b7ad8
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/da0b7ad8

Branch: refs/heads/trunk
Commit: da0b7ad8f8974dfdeaf0e99d86339c7562cdd9f2
Parents: 8ce47e0
Author: Eugene Chekanskiy <echekans...@hortonworks.com>
Authored: Thu Mar 9 18:31:49 2017 +0200
Committer: Eugene Chekanskiy <echekans...@hortonworks.com>
Committed: Thu Mar 9 18:31:49 2017 +0200

--
 .../src/main/python/ambari_agent/ActionQueue.py |  52 +---
 .../src/main/python/ambari_agent/Controller.py  |  54 +---
 .../ambari_agent/StatusCommandsExecutor.py  | 307 +++
 .../src/main/python/ambari_agent/main.py|  12 +-
 .../test/python/ambari_agent/TestActionQueue.py |   4 +-
 .../test/python/ambari_agent/TestController.py  |   3 -
 .../src/test/python/ambari_agent/TestMain.py|   9 +-
 7 files changed, 280 insertions(+), 161 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/ambari/blob/da0b7ad8/ambari-agent/src/main/python/ambari_agent/ActionQueue.py
--
diff --git a/ambari-agent/src/main/python/ambari_agent/ActionQueue.py 
b/ambari-agent/src/main/python/ambari_agent/ActionQueue.py
index 5300b52..15ae03d 100644
--- a/ambari-agent/src/main/python/ambari_agent/ActionQueue.py
+++ b/ambari-agent/src/main/python/ambari_agent/ActionQueue.py
@@ -76,10 +76,6 @@ class ActionQueue(threading.Thread):
   def __init__(self, config, controller):
 super(ActionQueue, self).__init__()
 self.commandQueue = Queue.Queue()
-self.statusCommandQueue = None # the queue this field points to is 
re-created whenever
-   # a new StatusCommandExecutor child process 
is spawned
-   # by Controller
-# multiprocessing.Queue()
 self.statusCommandResultQueue = multiprocessing.Queue() # this queue is 
filled by StatuCommandsExecutor.
 self.backgroundCommandQueue = Queue.Queue()
 self.commandStatuses = CommandStatusDict(callback_action =
@@ -102,25 +98,7 @@ class ActionQueue(threading.Thread):
 return self._stop.isSet()
 
   def put_status(self, commands):
-    if not self.statusCommandQueue.empty():
-      #Clear all status commands. Was supposed that we got all set of statuses, we don't need to keep old ones
-      statusCommandQueueSize = 0
-      try:
-        while not self.statusCommandQueue.empty():
-          self.statusCommandQueue.get(False)
-          statusCommandQueueSize = statusCommandQueueSize + 1
-      except Queue.Empty:
-        pass
-
-      logger.info("Number of status commands removed from queue : " + str(statusCommandQueueSize))
-
-    for command in commands:
-      logger.info("Adding " + command['commandType'] + " for component " + \
-                  command['componentName'] + " of service " + \
-                  command['serviceName'] + " of cluster " + \
-                  command['clusterName'] + " to the queue.")
-      self.statusCommandQueue.put(command)
-      logger.debug(pprint.pformat(command))
+    self.controller.statusCommandsExecutor.put_commands(commands)
 
   def put(self, commands):
 for command in commands:
@@ -167,8 +145,8 @@ class ActionQueue(threading.Thread):
   def run(self):
 try:
   while not self.stopped():
-        self.processBackgroundQueueSafeEmpty();
-        self.processStatusCommandResultQueueSafeEmpty();
+        self.processBackgroundQueueSafeEmpty()
+        self.process_status_command_results()
 try:
   if self.parallel_execution == 0:
 command = self.commandQueue.get(True, 
self.EXECUTION_COMMAND_WAIT_TIME)
@@ -212,23 +190,13 @@ class ActionQueue(threading.Thread):
   except Queue.Empty:
 pass
 
-  def processStatusCommandResultQueueSafeEmpty(self):
-    try:
-      while not self.statusCommandResultQueue.empty():
-        try:
-          result = self.statusCommandResultQueue.get(False)
-          self.process_status_command_result(result)
-        except Queue.Empty:
-          pass
-        except IOError:
-          # on race condition in multiprocessing.Queue if get/put and thread kill are executed at the same time.
-          # During queue.close IOError will be thrown (this prevents from permanently dead-locked get).
-          pass
-        except UnicodeDecodeError:
-          pass
-    except IOError:
-      # queue.empty() may also throw IOError
-      pass
+  def process_status_comm
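
Both branches keep the same defensive idea when the agent drains status results coming back from the child process: multiprocessing.Queue.empty() and get() can race with the worker process being killed, so the drain loop swallows Queue.Empty and IOError rather than letting the heartbeat thread die. A rough standalone sketch of that drain pattern, in Python 2 style to match the agent code (function and handler names are illustrative):

import Queue  # the module is named "queue" on Python 3
import multiprocessing

def drain_results(result_queue, handle_result):
  """Consume everything currently sitting in result_queue without blocking."""
  try:
    while not result_queue.empty():
      try:
        handle_result(result_queue.get(False))
      except Queue.Empty:
        # a racing empty()/get() already took the item; nothing to do
        pass
  except IOError:
    # raised if the queue's pipe is closed while we poll it, e.g. when the
    # producer process is being killed at the same time
    pass

results = multiprocessing.Queue()
drain_results(results, lambda result: None)  # no-op handler, purely for illustration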

ambari git commit: AMBARI-20237. After regenerate keytabs post Ambari upgrade yarn.nodemanager.linux-container-executor.cgroups.mount-path property got added with blank value (echekanskiy)

2017-03-02 Thread echekanskiy
Repository: ambari
Updated Branches:
  refs/heads/branch-2.5 6da1c530e -> 098e4fc09


AMBARI-20237. After regenerate keytabs post Ambari upgrade 
yarn.nodemanager.linux-container-executor.cgroups.mount-path property got added 
with blank value (echekanskiy)


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/098e4fc0
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/098e4fc0
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/098e4fc0

Branch: refs/heads/branch-2.5
Commit: 098e4fc091f02922727ed4f0954f4d2ee4ecd08e
Parents: 6da1c53
Author: Eugene Chekanskiy <echekans...@hortonworks.com>
Authored: Thu Mar 2 19:38:04 2017 +0200
Committer: Eugene Chekanskiy <echekans...@hortonworks.com>
Committed: Thu Mar 2 19:38:04 2017 +0200

--
 .../server/upgrade/UpgradeCatalog250.java   |  14 ++
 .../server/upgrade/UpgradeCatalog250Test.java   |  10 +-
 ...test_kerberos_descriptor_2_5_infra_solr.json | 212 ++-
 3 files changed, 226 insertions(+), 10 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/ambari/blob/098e4fc0/ambari-server/src/main/java/org/apache/ambari/server/upgrade/UpgradeCatalog250.java
--
diff --git 
a/ambari-server/src/main/java/org/apache/ambari/server/upgrade/UpgradeCatalog250.java
 
b/ambari-server/src/main/java/org/apache/ambari/server/upgrade/UpgradeCatalog250.java
index 39a129d..a597a63 100644
--- 
a/ambari-server/src/main/java/org/apache/ambari/server/upgrade/UpgradeCatalog250.java
+++ 
b/ambari-server/src/main/java/org/apache/ambari/server/upgrade/UpgradeCatalog250.java
@@ -42,6 +42,7 @@ import org.apache.ambari.server.state.Cluster;
 import org.apache.ambari.server.state.Clusters;
 import org.apache.ambari.server.state.Config;
 import org.apache.ambari.server.state.kerberos.KerberosComponentDescriptor;
+import org.apache.ambari.server.state.kerberos.KerberosConfigurationDescriptor;
 import org.apache.ambari.server.state.kerberos.KerberosDescriptor;
 import org.apache.ambari.server.state.kerberos.KerberosDescriptorFactory;
 import org.apache.ambari.server.state.kerberos.KerberosIdentityDescriptor;
@@ -442,6 +443,19 @@ public class UpgradeCatalog250 extends 
AbstractUpgradeCatalog {
   }
 }
   }
+  KerberosServiceDescriptor yarnKerberosDescriptor = 
kerberosDescriptor.getService("YARN");
+  if (yarnKerberosDescriptor != null) {
+Map<String, KerberosConfigurationDescriptor> configs = 
yarnKerberosDescriptor.getConfigurations();
+KerberosConfigurationDescriptor yarnSiteConfigDescriptor = 
configs.get("yarn-site");
+if (yarnSiteConfigDescriptor != null) {
+  Map<String, String> properties = 
yarnSiteConfigDescriptor.getProperties();
+  if (properties != null && 
properties.containsKey(YARN_LCE_CGROUPS_MOUNT_PATH)) {
+properties.remove(YARN_LCE_CGROUPS_MOUNT_PATH);
+artifactEntity.setArtifactData(kerberosDescriptor.toMap());
+artifactDAO.merge(artifactEntity);
+  }
+}
+  }
 }
   }
 }

http://git-wip-us.apache.org/repos/asf/ambari/blob/098e4fc0/ambari-server/src/test/java/org/apache/ambari/server/upgrade/UpgradeCatalog250Test.java
--
diff --git 
a/ambari-server/src/test/java/org/apache/ambari/server/upgrade/UpgradeCatalog250Test.java
 
b/ambari-server/src/test/java/org/apache/ambari/server/upgrade/UpgradeCatalog250Test.java
index 64536cb..529ac5c 100644
--- 
a/ambari-server/src/test/java/org/apache/ambari/server/upgrade/UpgradeCatalog250Test.java
+++ 
b/ambari-server/src/test/java/org/apache/ambari/server/upgrade/UpgradeCatalog250Test.java
@@ -1601,6 +1601,7 @@ public class UpgradeCatalog250Test {
 
   @Test
   public void testUpdateKerberosDescriptorArtifact() throws Exception {
+    final String propertyToRemove = "yarn.nodemanager.linux-container-executor.cgroups.mount-path";
 final KerberosDescriptorFactory kerberosDescriptorFactory = new 
KerberosDescriptorFactory();
 
 KerberosServiceDescriptor serviceDescriptor;
@@ -1628,8 +1629,7 @@ public class UpgradeCatalog250Test {
 Assert.assertNotNull(serviceDescriptor);
 Assert.assertNotNull(serviceDescriptor.getComponent("NIMBUS"));
 
-    UpgradeCatalog250 upgradeMock = createMockBuilder(UpgradeCatalog250.class).createMock();
-
+    UpgradeCatalog250 upgradeMock = createMockBuilder(UpgradeCatalog250.class).withConstructor(injector).createMock();
 
 ArtifactEntity artifactEntity = createNiceMock(ArtifactEntity.class);
 expect(artifactEntity.getArtifactData())
@@ -1638,10 +1638,1

ambari git commit: AMBARI-20237. After regenerate keytabs post Ambari upgrade yarn.nodemanager.linux-container-executor.cgroups.mount-path property got added with blank value (echekanskiy)

2017-03-02 Thread echekanskiy
Repository: ambari
Updated Branches:
  refs/heads/trunk f8fe522b9 -> d481b788b


AMBARI-20237. After regenerate keytabs post Ambari upgrade 
yarn.nodemanager.linux-container-executor.cgroups.mount-path property got added 
with blank value (echekanskiy)


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/d481b788
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/d481b788
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/d481b788

Branch: refs/heads/trunk
Commit: d481b788b6f819280b849c015eb42acc2e3d16b3
Parents: f8fe522
Author: Eugene Chekanskiy <echekans...@hortonworks.com>
Authored: Thu Mar 2 19:35:45 2017 +0200
Committer: Eugene Chekanskiy <echekans...@hortonworks.com>
Committed: Thu Mar 2 19:35:45 2017 +0200

--
 .../server/upgrade/UpgradeCatalog250.java   |  14 ++
 .../server/upgrade/UpgradeCatalog250Test.java   |  10 +-
 ...test_kerberos_descriptor_2_5_infra_solr.json | 212 ++-
 3 files changed, 226 insertions(+), 10 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/ambari/blob/d481b788/ambari-server/src/main/java/org/apache/ambari/server/upgrade/UpgradeCatalog250.java
--
diff --git 
a/ambari-server/src/main/java/org/apache/ambari/server/upgrade/UpgradeCatalog250.java
 
b/ambari-server/src/main/java/org/apache/ambari/server/upgrade/UpgradeCatalog250.java
index d6ff241..ad7490b 100644
--- 
a/ambari-server/src/main/java/org/apache/ambari/server/upgrade/UpgradeCatalog250.java
+++ 
b/ambari-server/src/main/java/org/apache/ambari/server/upgrade/UpgradeCatalog250.java
@@ -51,6 +51,7 @@ import org.apache.ambari.server.state.Cluster;
 import org.apache.ambari.server.state.Clusters;
 import org.apache.ambari.server.state.Config;
 import org.apache.ambari.server.state.kerberos.KerberosComponentDescriptor;
+import org.apache.ambari.server.state.kerberos.KerberosConfigurationDescriptor;
 import org.apache.ambari.server.state.kerberos.KerberosDescriptor;
 import org.apache.ambari.server.state.kerberos.KerberosDescriptorFactory;
 import org.apache.ambari.server.state.kerberos.KerberosIdentityDescriptor;
@@ -498,6 +499,19 @@ public class UpgradeCatalog250 extends 
AbstractUpgradeCatalog {
   }
 }
   }
+      KerberosServiceDescriptor yarnKerberosDescriptor = kerberosDescriptor.getService("YARN");
+      if (yarnKerberosDescriptor != null) {
+        Map<String, KerberosConfigurationDescriptor> configs = yarnKerberosDescriptor.getConfigurations();
+        KerberosConfigurationDescriptor yarnSiteConfigDescriptor = configs.get("yarn-site");
+        if (yarnSiteConfigDescriptor != null) {
+          Map<String, String> properties = yarnSiteConfigDescriptor.getProperties();
+          if (properties != null && properties.containsKey(YARN_LCE_CGROUPS_MOUNT_PATH)) {
+            properties.remove(YARN_LCE_CGROUPS_MOUNT_PATH);
+            artifactEntity.setArtifactData(kerberosDescriptor.toMap());
+            artifactDAO.merge(artifactEntity);
+          }
+        }
+      }
 }
   }
 }

http://git-wip-us.apache.org/repos/asf/ambari/blob/d481b788/ambari-server/src/test/java/org/apache/ambari/server/upgrade/UpgradeCatalog250Test.java
--
diff --git 
a/ambari-server/src/test/java/org/apache/ambari/server/upgrade/UpgradeCatalog250Test.java
 
b/ambari-server/src/test/java/org/apache/ambari/server/upgrade/UpgradeCatalog250Test.java
index 39d8785..7ee66ef 100644
--- 
a/ambari-server/src/test/java/org/apache/ambari/server/upgrade/UpgradeCatalog250Test.java
+++ 
b/ambari-server/src/test/java/org/apache/ambari/server/upgrade/UpgradeCatalog250Test.java
@@ -1602,6 +1602,7 @@ public class UpgradeCatalog250Test {
 
   @Test
   public void testUpdateKerberosDescriptorArtifact() throws Exception {
+    final String propertyToRemove = "yarn.nodemanager.linux-container-executor.cgroups.mount-path";
 final KerberosDescriptorFactory kerberosDescriptorFactory = new 
KerberosDescriptorFactory();
 
 KerberosServiceDescriptor serviceDescriptor;
@@ -1629,8 +1630,7 @@ public class UpgradeCatalog250Test {
 Assert.assertNotNull(serviceDescriptor);
 Assert.assertNotNull(serviceDescriptor.getComponent("NIMBUS"));
 
-    UpgradeCatalog250 upgradeMock = createMockBuilder(UpgradeCatalog250.class).createMock();
-
+    UpgradeCatalog250 upgradeMock = createMockBuilder(UpgradeCatalog250.class).withConstructor(injector).createMock();
 
 ArtifactEntity artifactEntity = createNiceMock(ArtifactEntity.class);
 expect(artifactEntity.getArtifactData())
@@ -1639,10 +1639,1

ambari git commit: AMBARI-20032. HDFS service check fails after Stopping one Namenode in HA cluster (echekanskiy)

2017-02-28 Thread echekanskiy
Repository: ambari
Updated Branches:
  refs/heads/branch-2.5 2081ca0fa -> 0b8827b7e


AMBARI-20032. HDFS service check fails after Stopping one Namenode in HA 
cluster (echekanskiy)


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/0b8827b7
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/0b8827b7
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/0b8827b7

Branch: refs/heads/branch-2.5
Commit: 0b8827b7e17d50758f5cb0d4d0cf2fe6bac70b20
Parents: 2081ca0
Author: Eugene Chekanskiy <echekans...@hortonworks.com>
Authored: Tue Feb 28 14:06:28 2017 +0200
Committer: Eugene Chekanskiy <echekans...@hortonworks.com>
Committed: Tue Feb 28 14:06:28 2017 +0200

--
 .../HDFS/2.1.0.2.0/package/scripts/service_check.py   | 10 --
 .../test/python/stacks/2.0.6/HDFS/test_service_check.py   |  8 
 2 files changed, 18 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/ambari/blob/0b8827b7/ambari-server/src/main/resources/common-services/HDFS/2.1.0.2.0/package/scripts/service_check.py
--
diff --git 
a/ambari-server/src/main/resources/common-services/HDFS/2.1.0.2.0/package/scripts/service_check.py
 
b/ambari-server/src/main/resources/common-services/HDFS/2.1.0.2.0/package/scripts/service_check.py
index dffa077..3d798a3 100644
--- 
a/ambari-server/src/main/resources/common-services/HDFS/2.1.0.2.0/package/scripts/service_check.py
+++ 
b/ambari-server/src/main/resources/common-services/HDFS/2.1.0.2.0/package/scripts/service_check.py
@@ -37,20 +37,10 @@ class HdfsServiceCheckDefault(HdfsServiceCheck):
 dir = params.hdfs_tmp_dir
 tmp_file = format("{dir}/{unique}")
 
-    safemode_command = format("dfsadmin -fs {namenode_address} -safemode get | grep OFF")
-
     if params.security_enabled:
       Execute(format("{kinit_path_local} -kt {hdfs_user_keytab} {hdfs_principal_name}"),
         user=params.hdfs_user
       )
-    ExecuteHadoop(safemode_command,
-                  user=params.hdfs_user,
-                  logoutput=True,
-                  conf_dir=params.hadoop_conf_dir,
-                  try_sleep=3,
-                  tries=20,
-                  bin_dir=params.hadoop_bin_dir
-    )
 params.HdfsResource(dir,
 type="directory",
 action="create_on_execute",

http://git-wip-us.apache.org/repos/asf/ambari/blob/0b8827b7/ambari-server/src/test/python/stacks/2.0.6/HDFS/test_service_check.py
--
diff --git 
a/ambari-server/src/test/python/stacks/2.0.6/HDFS/test_service_check.py 
b/ambari-server/src/test/python/stacks/2.0.6/HDFS/test_service_check.py
index bbc1b3a..5ad836f 100644
--- a/ambari-server/src/test/python/stacks/2.0.6/HDFS/test_service_check.py
+++ b/ambari-server/src/test/python/stacks/2.0.6/HDFS/test_service_check.py
@@ -52,14 +52,6 @@ class TestServiceCheck(RMFTestCase):
 self.assertNoMoreResources()
 
   def assert_service_check(self):
-    self.assertResourceCalled('ExecuteHadoop', 'dfsadmin -fs hdfs://c6401.ambari.apache.org:8020 -safemode get | grep OFF',
-        logoutput = True,
-        tries = 20,
-        conf_dir = '/etc/hadoop/conf',
-        try_sleep = 3,
-        bin_dir = '/usr/bin',
-        user = 'hdfs',
-    )
 self.assertResourceCalled('HdfsResource', '/tmp',
 immutable_paths = self.DEFAULT_IMMUTABLE_PATHS,
 security_enabled = False,



ambari git commit: AMBARI-20126. Add support for Spark2 upgrade from HDP-2.5 (dgrinenko via echekankiy)

2017-02-24 Thread echekanskiy
Repository: ambari
Updated Branches:
  refs/heads/branch-2.5 681adc095 -> 6d872ad31


AMBARI-20126. Add support for Spark2 upgrade from HDP-2.5 (dgrinenko via 
echekankiy)


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/6d872ad3
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/6d872ad3
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/6d872ad3

Branch: refs/heads/branch-2.5
Commit: 6d872ad3141462962e6ce71423e3c431047a20a0
Parents: 681adc0
Author: Eugene Chekanskiy 
Authored: Fri Feb 24 11:20:10 2017 +0200
Committer: Eugene Chekanskiy 
Committed: Fri Feb 24 11:20:10 2017 +0200

--
 .../2.0.0/package/scripts/job_history_server.py |  2 +-
 .../2.0.0/package/scripts/livy2_server.py   |  2 +-
 .../2.0.0/package/scripts/spark_client.py   |  2 +-
 .../package/scripts/spark_thrift_server.py  |  9 ++--
 .../stacks/HDP/2.5/upgrades/config-upgrade.xml  | 11 +
 .../HDP/2.5/upgrades/nonrolling-upgrade-2.6.xml | 50 +++
 .../stacks/HDP/2.5/upgrades/upgrade-2.6.xml | 39 +++
 .../HDP/2.6/upgrades/nonrolling-upgrade-2.6.xml | 51 
 .../stacks/HDP/2.6/upgrades/upgrade-2.6.xml | 41 
 9 files changed, 199 insertions(+), 8 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/ambari/blob/6d872ad3/ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/job_history_server.py
--
diff --git 
a/ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/job_history_server.py
 
b/ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/job_history_server.py
index 61ba661..d2b32ae 100755
--- 
a/ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/job_history_server.py
+++ 
b/ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/job_history_server.py
@@ -42,7 +42,7 @@ class JobHistoryServer(Script):
 
 self.install_packages(env)
 
-  def configure(self, env, upgrade_type=None):
+  def configure(self, env, upgrade_type=None, config_dir=None):
 import params
 env.set_params(params)
 

http://git-wip-us.apache.org/repos/asf/ambari/blob/6d872ad3/ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/livy2_server.py
--
diff --git 
a/ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/livy2_server.py
 
b/ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/livy2_server.py
index 8c66998..cb4f5ee 100644
--- 
a/ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/livy2_server.py
+++ 
b/ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/livy2_server.py
@@ -44,7 +44,7 @@ class LivyServer(Script):
 
 self.install_packages(env)
 
-  def configure(self, env, upgrade_type=None):
+  def configure(self, env, upgrade_type=None, config_dir=None):
 import params
 env.set_params(params)
 

http://git-wip-us.apache.org/repos/asf/ambari/blob/6d872ad3/ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/spark_client.py
--
diff --git 
a/ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/spark_client.py
 
b/ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/spark_client.py
index 434b4b1..0a43750 100755
--- 
a/ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/spark_client.py
+++ 
b/ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/spark_client.py
@@ -35,7 +35,7 @@ class SparkClient(Script):
 self.install_packages(env)
 self.configure(env)
 
-  def configure(self, env, upgrade_type=None):
+  def configure(self, env, upgrade_type=None, config_dir=None):
 import params
 env.set_params(params)
 

http://git-wip-us.apache.org/repos/asf/ambari/blob/6d872ad3/ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/spark_thrift_server.py
--
diff --git 
a/ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/spark_thrift_server.py
 
b/ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/spark_thrift_server.py
index be65986..6fdd324 100755
--- 
a/ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/spark_thrift_server.py
+++ 

ambari git commit: AMBARI-20126. Add support for Spark2 upgrade from HDP-2.5 (dgrinenko via echekankiy)

2017-02-24 Thread echekanskiy
Repository: ambari
Updated Branches:
  refs/heads/trunk 043d6c06c -> 19da5823c


AMBARI-20126. Add support for Spark2 upgrade from HDP-2.5 (dgrinenko via 
echekankiy)


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/19da5823
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/19da5823
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/19da5823

Branch: refs/heads/trunk
Commit: 19da5823c8adc8cd18048040c93f43941c0e568f
Parents: 043d6c0
Author: Eugene Chekanskiy 
Authored: Fri Feb 24 11:18:33 2017 +0200
Committer: Eugene Chekanskiy 
Committed: Fri Feb 24 11:18:33 2017 +0200

--
 .../2.0.0/package/scripts/job_history_server.py |  2 +-
 .../2.0.0/package/scripts/livy2_server.py   |  2 +-
 .../2.0.0/package/scripts/spark_client.py   |  2 +-
 .../package/scripts/spark_thrift_server.py  |  9 ++--
 .../stacks/HDP/2.5/upgrades/config-upgrade.xml  | 11 +
 .../HDP/2.5/upgrades/nonrolling-upgrade-2.6.xml | 50 +++
 .../stacks/HDP/2.5/upgrades/upgrade-2.6.xml | 39 +++
 .../HDP/2.6/upgrades/nonrolling-upgrade-2.6.xml | 51 
 .../stacks/HDP/2.6/upgrades/upgrade-2.6.xml | 41 
 9 files changed, 199 insertions(+), 8 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/ambari/blob/19da5823/ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/job_history_server.py
--
diff --git 
a/ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/job_history_server.py
 
b/ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/job_history_server.py
index 154c83d..2631b49 100755
--- 
a/ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/job_history_server.py
+++ 
b/ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/job_history_server.py
@@ -41,7 +41,7 @@ class JobHistoryServer(Script):
 
 self.install_packages(env)
 
-  def configure(self, env, upgrade_type=None):
+  def configure(self, env, upgrade_type=None, config_dir=None):
 import params
 env.set_params(params)
 

http://git-wip-us.apache.org/repos/asf/ambari/blob/19da5823/ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/livy2_server.py
--
diff --git 
a/ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/livy2_server.py
 
b/ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/livy2_server.py
index 8c66998..cb4f5ee 100644
--- 
a/ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/livy2_server.py
+++ 
b/ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/livy2_server.py
@@ -44,7 +44,7 @@ class LivyServer(Script):
 
 self.install_packages(env)
 
-  def configure(self, env, upgrade_type=None):
+  def configure(self, env, upgrade_type=None, config_dir=None):
 import params
 env.set_params(params)
 

http://git-wip-us.apache.org/repos/asf/ambari/blob/19da5823/ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/spark_client.py
--
diff --git 
a/ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/spark_client.py
 
b/ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/spark_client.py
index 2c19b88..563b7e9 100755
--- 
a/ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/spark_client.py
+++ 
b/ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/spark_client.py
@@ -34,7 +34,7 @@ class SparkClient(Script):
 self.install_packages(env)
 self.configure(env)
 
-  def configure(self, env, upgrade_type=None):
+  def configure(self, env, upgrade_type=None, config_dir=None):
 import params
 env.set_params(params)
 

http://git-wip-us.apache.org/repos/asf/ambari/blob/19da5823/ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/spark_thrift_server.py
--
diff --git 
a/ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/spark_thrift_server.py
 
b/ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/spark_thrift_server.py
index 426c05c..72307cb 100755
--- 
a/ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/spark_thrift_server.py
+++ 
b/ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/spark_thrift_server.py
@@ -40,7 

ambari git commit: AMBARI-20052. Spark2 service check is failing in secure cluster (echekanskiy)

2017-02-20 Thread echekanskiy
Repository: ambari
Updated Branches:
  refs/heads/branch-2.5 f96d43ccc -> 1169694ec


AMBARI-20052. Spark2 service check is failing in secure cluster (echekanskiy)


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/1169694e
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/1169694e
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/1169694e

Branch: refs/heads/branch-2.5
Commit: 1169694ecdc77c5f317d105a0c17da6b0b806bdf
Parents: f96d43c
Author: Eugene Chekanskiy <echekans...@hortonworks.com>
Authored: Mon Feb 20 16:52:42 2017 +0200
Committer: Eugene Chekanskiy <echekans...@hortonworks.com>
Committed: Mon Feb 20 16:52:42 2017 +0200

--
 .../common-services/SPARK2/2.0.0/package/scripts/params.py   | 2 ++
 .../SPARK2/2.0.0/package/scripts/service_check.py| 4 ++--
 2 files changed, 4 insertions(+), 2 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/ambari/blob/1169694e/ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/params.py
--
diff --git 
a/ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/params.py
 
b/ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/params.py
index 66c6100..e508a0d 100755
--- 
a/ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/params.py
+++ 
b/ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/params.py
@@ -135,6 +135,8 @@ security_enabled = 
config['configurations']['cluster-env']['security_enabled']
 kinit_path_local = 
get_kinit_path(default('/configurations/kerberos-env/executable_search_paths', 
None))
 spark_kerberos_keytab =  
config['configurations']['spark2-defaults']['spark.history.kerberos.keytab']
 spark_kerberos_principal =  
config['configurations']['spark2-defaults']['spark.history.kerberos.principal']
+smoke_user_keytab = config['configurations']['cluster-env']['smokeuser_keytab']
+smokeuser_principal =  
config['configurations']['cluster-env']['smokeuser_principal_name']
 
 spark_thriftserver_hosts = 
default("/clusterHostInfo/spark2_thriftserver_hosts", [])
 has_spark_thriftserver = not len(spark_thriftserver_hosts) == 0

http://git-wip-us.apache.org/repos/asf/ambari/blob/1169694e/ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/service_check.py
--
diff --git 
a/ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/service_check.py
 
b/ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/service_check.py
index bcdf118..e6bd120 100755
--- 
a/ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/service_check.py
+++ 
b/ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/service_check.py
@@ -32,7 +32,7 @@ class SparkServiceCheck(Script):
     if params.security_enabled:
       spark_kinit_cmd = format("{kinit_path_local} -kt {spark_kerberos_keytab} {spark_principal}; ")
       Execute(spark_kinit_cmd, user=params.spark_user)
-      if (params.has_livyserver):
+      if params.has_livyserver:
         livy_kinit_cmd = format("{kinit_path_local} -kt {smoke_user_keytab} {smokeuser_principal}; ")
         Execute(livy_kinit_cmd, user=params.livy2_user)
 
@@ -42,7 +42,7 @@ class SparkServiceCheck(Script):
 logoutput=True
 )
 if params.has_livyserver:
-      live_livyserver_host = "";
+      live_livyserver_host = ""
       for livyserver_host in params.livy2_livyserver_hosts:
         try:
           Execute(format("curl -s -o /dev/null -w'%{{http_code}}' --negotiate -u: -k http://{livyserver_host}:{livy2_livyserver_port}/sessions | grep 200"),
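
The probe itself is simple: after the smoke-user kinit, each configured Livy server's /sessions endpoint is polled and the first HTTP 200 counts as a live server. A rough plain-Python equivalent of that loop, without the curl pipeline or the --negotiate SPNEGO handling the real check relies on (host list and port are placeholders):

import urllib2  # Python 2, matching the service-check scripts

def find_live_livy_server(hosts, port, timeout=10):
  """Return the first host whose /sessions endpoint answers with HTTP 200, else None."""
  for host in hosts:
    url = "http://%s:%s/sessions" % (host, port)
    try:
      if urllib2.urlopen(url, timeout=timeout).getcode() == 200:
        return host
    except Exception:
      continue  # unreachable or unhealthy, try the next configured server
  return None

live_host = find_live_livy_server(["livy-host-1.example.com"], 8999)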



ambari git commit: AMBARI-20052. Spark2 service check is failing in secure cluster (echekanskiy)

2017-02-20 Thread echekanskiy
Repository: ambari
Updated Branches:
  refs/heads/trunk ece4d36cb -> 1ecdee0d5


AMBARI-20052. Spark2 service check is failing in secure cluster (echekanskiy)


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/1ecdee0d
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/1ecdee0d
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/1ecdee0d

Branch: refs/heads/trunk
Commit: 1ecdee0d559540204c66634871e6252d622552cd
Parents: ece4d36
Author: Eugene Chekanskiy <echekans...@hortonworks.com>
Authored: Mon Feb 20 16:53:25 2017 +0200
Committer: Eugene Chekanskiy <echekans...@hortonworks.com>
Committed: Mon Feb 20 16:53:25 2017 +0200

--
 .../common-services/SPARK2/2.0.0/package/scripts/params.py   | 2 ++
 .../SPARK2/2.0.0/package/scripts/service_check.py| 4 ++--
 2 files changed, 4 insertions(+), 2 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/ambari/blob/1ecdee0d/ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/params.py
--
diff --git 
a/ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/params.py
 
b/ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/params.py
index b6889e4..45a8c07 100755
--- 
a/ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/params.py
+++ 
b/ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/params.py
@@ -131,6 +131,8 @@ security_enabled = 
config['configurations']['cluster-env']['security_enabled']
 kinit_path_local = 
get_kinit_path(default('/configurations/kerberos-env/executable_search_paths', 
None))
 spark_kerberos_keytab =  
config['configurations']['spark2-defaults']['spark.history.kerberos.keytab']
 spark_kerberos_principal =  
config['configurations']['spark2-defaults']['spark.history.kerberos.principal']
+smoke_user_keytab = config['configurations']['cluster-env']['smokeuser_keytab']
+smokeuser_principal =  
config['configurations']['cluster-env']['smokeuser_principal_name']
 
 spark_thriftserver_hosts = 
default("/clusterHostInfo/spark2_thriftserver_hosts", [])
 has_spark_thriftserver = not len(spark_thriftserver_hosts) == 0

http://git-wip-us.apache.org/repos/asf/ambari/blob/1ecdee0d/ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/service_check.py
--
diff --git 
a/ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/service_check.py
 
b/ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/service_check.py
index acd3340..8e7a766 100755
--- 
a/ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/service_check.py
+++ 
b/ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/service_check.py
@@ -31,7 +31,7 @@ class SparkServiceCheck(Script):
     if params.security_enabled:
       spark_kinit_cmd = format("{kinit_path_local} -kt {spark_kerberos_keytab} {spark_principal}; ")
       Execute(spark_kinit_cmd, user=params.spark_user)
-      if (params.has_livyserver):
+      if params.has_livyserver:
         livy_kinit_cmd = format("{kinit_path_local} -kt {smoke_user_keytab} {smokeuser_principal}; ")
         Execute(livy_kinit_cmd, user=params.livy2_user)
 
@@ -41,7 +41,7 @@ class SparkServiceCheck(Script):
 logoutput=True
 )
 if params.has_livyserver:
-      live_livyserver_host = "";
+      live_livyserver_host = ""
       for livyserver_host in params.livy2_livyserver_hosts:
         try:
           Execute(format("curl -s -o /dev/null -w'%{{http_code}}' --negotiate -u: -k http://{livyserver_host}:{livy2_livyserver_port}/sessions | grep 200"),