Hi all,

I am having issues collecting Lustre file system metrics.

I recently updated the ganglia-gmond-python package on the client, and I am now
seeing the following messages in the logs. python_module shows as loaded (the
full list of loaded modules is pasted below), so I am not sure what I am missing.


starting GANGLIA gmond:                                    [  OK  ]
Sep  3 16:04:31  /usr/sbin/gmond[21825]: Unable to find the metric
information for 'opt-read_bytes'. Possible that the module has not been
loaded.#012
Sep  3 16:04:31  /usr/sbin/gmond[21825]: Unable to find the metric
information for 'opt-write_bytes'. Possible that the module has not been
loaded.#012

Here is what my modpython.conf file looks like:

/*
  params - path to the directory where mod_python
           should look for python metric modules

  the "pyconf" files in the include directory below
  will be scanned for configurations for those modules
*/
modules {
  module {
    name = "python_module"
    path = "modpython.so"
    params = "/usr/lib64/ganglia/python_modules"
  }
}

include ("/etc/ganglia/conf.d/*.pyconf")


Running the command below gives me the following output:

/usr/sbin/gmond -c /etc/ganglia/gmond.conf -p /var/run/gmond.pid -d 9
loaded module: core_metrics
loaded module: cpu_module
loaded module: disk_module
loaded module: load_module
loaded module: mem_module
loaded module: net_module
loaded module: proc_module
loaded module: sys_module
loaded module: python_module
loaded module: ib_module
loaded module: multicpu_module
Invalid parameter given or out of range for '-l'.
tcp_accept_channel bind=NULL port=8649 gzip_output=0
Unable to create tcp_accept_channel. Exiting.
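
For reference, my understanding of the python_module interface is that each
module under /usr/lib64/ganglia/python_modules exports metric_init(params),
which returns a list of metric descriptors, and the "Unable to find the metric
information" message appears when a metric name declared in a .pyconf has no
matching descriptor from any loaded module. Below is a stripped-down sketch of
that interface (purely illustrative, not the shipped Lustre module) that can
also be run standalone as a sanity check:

# Illustrative sketch of a gmond python metric module.
# Not the actual Lustre module; values are placeholders.

descriptors = []

def read_bytes_handler(name):
    # A real module would read the Lustre stats here (e.g. under
    # /proc/fs/lustre/...); this sketch just returns a dummy value.
    return 0

def metric_init(params):
    # 'params' comes from the module block in the .pyconf.
    global descriptors
    descriptors = [{
        'name': 'opt-read_bytes',   # must match the .pyconf metric name exactly
        'call_back': read_bytes_handler,
        'time_max': 90,
        'value_type': 'uint',
        'units': 'bytes',
        'slope': 'both',
        'format': '%u',
        'description': 'Lustre bytes read',
        'groups': 'lustre',
    }]
    return descriptors

def metric_cleanup():
    pass

if __name__ == '__main__':
    # Quick standalone check outside gmond.
    for d in metric_init({}):
        print('%s = %s' % (d['name'], d['call_back'](d['name'])))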


Please let me know if I need to provide any additional information. I
appreciate any help with this issue.