[ 
https://issues.apache.org/jira/browse/AMBARI-14926?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

matteo brancaleoni updated AMBARI-14926:
----------------------------------------
    Description: 
On systems where the DataNode and JournalNode are monitored by ambari-agent, having the hdfs library ( 
https://pypi.python.org/pypi/hdfs/ ) installed on the standard Python path makes DataNode and JournalNode 
monitoring fail:
* the datanodes appear as up in the global live datanodes count
* but the services show as down in the host service list (a restart works, but 
they remain marked as stopped)

looking into the logs I get:

DEBUG 2016-02-04 16:54:49,160 PythonReflectiveExecutor.py:47 - Running command reflectively ['/usr/bin/python2',
 u'/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/journalnode.py',
 'SECURITY_STATUS',
--
 '/var/lib/ambari-agent/tmp']
DEBUG 2016-02-04 16:54:49,170 PythonReflectiveExecutor.py:61 - Reflective command failed with exception:
Traceback (most recent call last):
  File "/usr/lib/python2.6/site-packages/ambari_agent/PythonReflectiveExecutor.py", line 55, in run_file
    imp.load_source('__main__', script)
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/journalnode.py", line 30, in <module>
    from utils import service
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/utils.py", line 37, in <module>
    from zkfc_slave import ZkfcSlave
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/zkfc_slave.py", line 21, in <module>
    from hdfs import hdfs
ImportError: cannot import name hdfs
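The ImportError above can be reproduced outside Ambari with a minimal shadowing experiment; the directory names below are hypothetical stand-ins for site-packages and the agent's cache:

```python
import os
import sys
import tempfile

# Two directories, each providing a module named "hdfs":
# "site" mimics the pip-installed hdfs client (no hdfs() function),
# "cache" mimics Ambari's service scripts (which do define hdfs()).
root = tempfile.mkdtemp()
site = os.path.join(root, "site")
cache = os.path.join(root, "cache")
os.mkdir(site)
os.mkdir(cache)

with open(os.path.join(site, "hdfs.py"), "w") as f:
    f.write("class Client(object): pass\n")  # pip-style module
with open(os.path.join(cache, "hdfs.py"), "w") as f:
    f.write("def hdfs(): return 'service script'\n")  # Ambari-style module

# With the site dir ahead of the cache dir (as on the reporter's
# system), the pip-style module wins the import race and
# `from hdfs import hdfs` fails exactly as in the traceback.
sys.path[:0] = [site, cache]
try:
    from hdfs import hdfs
except ImportError as e:
    print("ImportError:", e)
```

Swapping the order of `site` and `cache` in the `sys.path` prepend makes the import succeed, which is what points at path ordering as the culprit.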

I think ambari appends its cached script directory after the standard Python path, so when 
an hdfs module is present on the Python path the scripts resolve the wrong module 
(they pick up the system hdfs package instead of Ambari's own hdfs.py).

Maybe the cache files path should be inserted before the Python path, or the imports made 
explicitly local?
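A minimal sketch of the first suggestion, assuming the agent can adjust sys.path before reflectively loading a service script (the helper name is hypothetical; the path is the one from the traceback):

```python
import sys

def prioritize_script_dir(script_dir):
    """Ensure a script's own directory wins over site-packages.

    Prepending (index 0) rather than appending means a module named
    "hdfs" in script_dir shadows any pip-installed hdfs package,
    instead of the other way around. Removing an existing entry
    first keeps sys.path free of duplicates.
    """
    if script_dir in sys.path:
        sys.path.remove(script_dir)
    sys.path.insert(0, script_dir)

# Hypothetical usage mirroring the agent's cache layout:
prioritize_script_dir(
    "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts")
```

The second suggestion (explicitly local imports) would instead require the service scripts themselves to disambiguate their sibling modules, which touches many files; the ordering fix is a single change in the executor.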



> ambari cached hdfs.py conflicts with python hdfs lib resulting into 
> monitoring errors
> -------------------------------------------------------------------------------------
>
>                 Key: AMBARI-14926
>                 URL: https://issues.apache.org/jira/browse/AMBARI-14926
>             Project: Ambari
>          Issue Type: Bug
>          Components: ambari-agent
>    Affects Versions: 2.2.0
>         Environment: Linux CentOS 7 x86_64
>            Reporter: matteo brancaleoni
>



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
