Hello folks,
Is there REST API support for bulk removal of components from multiple hosts?
I know Ambari supports bulk installing components on multiple hosts and bulk
removal of empty hosts from a cluster. It seems to me it would also make
sense to be able to bulk remove components.
Thanks
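As a workaround until (or unless) a bulk endpoint exists, the documented per-host DELETE on the host_components resource can be looped over. A minimal sketch — the cluster, host, and component names are placeholders, and `delete_urls` is a hypothetical helper; Ambari itself only documents the single-host call:

```python
# Hypothetical helper: bulk-remove a component by looping the per-host
# DELETE endpoint Ambari's REST API documents:
#   DELETE /api/v1/clusters/{cluster}/hosts/{host}/host_components/{component}

def delete_urls(base, cluster, hosts, component):
    """Build one DELETE URL per host for the given component."""
    return [
        "%s/api/v1/clusters/%s/hosts/%s/host_components/%s"
        % (base, cluster, host, component)
        for host in hosts
    ]

urls = delete_urls("http://ambari:8080", "c1", ["h1", "h2"], "DATANODE")
# Each URL would then be sent as an authenticated DELETE, e.g.
#   curl -u admin:admin -H 'X-Requested-By: ambari' -X DELETE <url>
```

(The component must usually be stopped/decommissioned first before the DELETE succeeds.)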
Ambari Devs,
It appears your PR job in Jenkins has had some 20-odd jobs in its queue
because the job is pinned to the 'hadoop' label. I'm unsure why you've
tied your job to that label, but I've changed it to the more generic
'ubuntu'. The Hadoop nodes are dedicated to the main Hadoop projects and
since we'v
And this is after you had tried the *logoutput* flag in Execute(...)?
On Fri, Mar 23, 2018 at 6:49 AM, wrote:
> Hi!
>
> I have some shell scripts and want to call them in service python scripts.
>
> I saw an example which uses Execute to call a shell script, like
> Execute("/path/do_something").
>
>
I don't think scripts.py has an uninstall function interface, so don't expect
Ambari to magically become aware of a function you wrote in your code and start
calling it. You can, of course, implement the Decommission custom command or
include your own custom command in metainfo.xml.
On Thu, Mar 22, 2018 at 9
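To illustrate the custom-command route mentioned above: Ambari dispatches a custom command declared in metainfo.xml to a same-named method on the service's Script subclass. A rough, self-contained sketch with a stand-in base class — in a real service the Script base and Execute come from resource_management, and `uninstall` here is just an illustrative command name:

```python
# Stand-in for resource_management.Script, only so this sketch runs standalone.
class Script(object):
    pass

class MyService(Script):
    def install(self, env):
        pass  # normal lifecycle command

    # A custom command: the method name matches the <name> of the
    # <customCommand> entry declared for this component in metainfo.xml.
    def uninstall(self, env):
        # A real script would call Execute("/path/do_cleanup") etc. here.
        return "uninstall called"

if __name__ == "__main__":
    MyService().uninstall(env=None)
```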
The only scripts which should use /usr/hdp as a hard-coded value are those
which belong specifically to the HDP stack. Otherwise it's a bug and needs to
be corrected.
> On Mar 22, 2018, at 9:34 PM, xiang@sky-data.cn wrote:
>
> But I find some Python scripts use "/usr/hdp" as a hard-coded value; is th
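To make the point concrete: stack scripts normally resolve the root from configuration (e.g. the `stack_root` property in cluster-env) rather than hard-coding /usr/hdp. A simplified, self-contained sketch — the plain dict and the `stack_root` helper are illustrative; real scripts read configuration via Script.get_config():

```python
# Illustrative only: real Ambari scripts obtain configuration from
# Script.get_config(), not a literal dict like this one.
config = {"configurations": {"cluster-env": {"stack_root": "/usr/hdp"}}}

def stack_root(cfg, default="/usr/hdp"):
    """Resolve the stack root from configuration, falling back to a default."""
    return cfg.get("configurations", {}).get("cluster-env", {}).get(
        "stack_root", default)

# Paths are then built from the resolved root instead of a literal /usr/hdp:
hadoop_bin = "%s/current/hadoop-client/bin" % stack_root(config)
```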
Hi!
I have some shell scripts and want to call them from my service Python scripts.
I saw an example which uses Execute to call a shell script, like
Execute("/path/do_something").
But I notice that if the log level is INFO, the shell script's own log output is
not printed; it only shows up if I change the log
level to DEBUG.
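This is the behavior the earlier reply's *logoutput* suggestion addresses: with Execute(..., logoutput=True) the child process output is echoed into the log even at INFO, while by default it is only visible at DEBUG. A self-contained sketch of the same idea using only the standard library — subprocess and the `run` helper stand in for Ambari's Execute; this is an analogy, not Ambari's actual implementation:

```python
import logging
import subprocess

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("sketch")

def run(cmd, logoutput=False):
    """Run a shell command; echo its output at INFO when logoutput is True,
    otherwise log it only at DEBUG (hidden when the level is INFO)."""
    out = subprocess.check_output(cmd, shell=True, text=True)
    if logoutput:
        log.info(out.strip())   # visible at INFO, like Execute(..., logoutput=True)
    else:
        log.debug(out.strip())  # hidden unless the log level is DEBUG
    return out

run("echo from-the-script", logoutput=True)
```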