Thanks Sumit and Bhuvnesh; Execute is exactly what I want.
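
For the record, a minimal sketch of what the Spark service check could look
like with Execute (untested; spark_home, the SparkPi jar path, and the
"spark" user are illustrative assumptions, not the final script):

from resource_management import *  # Ambari's Execute resource and format()

spark_home = "/usr/hdp/current/spark-client"  # assumption; adjust per install

# Submit the SparkPi example; the service check fails if spark-submit
# exits non-zero, since Execute raises on a non-zero return code.
spark_submit_cmd = format("{spark_home}/bin/spark-submit "
                          "--class org.apache.spark.examples.SparkPi "
                          "{spark_home}/lib/spark-examples.jar 1")
Execute(spark_submit_cmd,
        user="spark",    # or params.spark_user in a real service script
        logoutput=True,  # stream command output to the agent log
        tries=3,         # retry transient failures
        try_sleep=5)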
On Tue, Jan 5, 2016 at 1:43 PM, Sumit Mohanty <[email protected]>
wrote:
> ExecuteHadoop eventually calls Execute with "hdfs ...".
>
> Execute can be used to run any command.
>
> E.g.
>
> regiondrainer_cmd = format("cmd /c {hbase_executable} org.jruby.Main {region_drainer} remove {host}")
> Execute(regiondrainer_cmd, user=params.hbase_user, logoutput=True)
>
> ________________________________________
> From: Jeff Zhang <[email protected]>
> Sent: Monday, January 04, 2016 8:50 PM
> To: [email protected]
> Subject: How to execute a non-hadoop command?
>
> I want to create a service check for Spark, but Spark doesn't use the hadoop
> script as its launch script. I found that other components use ExecuteHadoop
> to launch a Hadoop job to verify the service, so I am wondering whether there
> is any API for non-Hadoop commands. BTW, I checked the source code of
> execute_hadoop.py but couldn't find how it is tied to Hadoop.
>
>
> --
> Best Regards
>
> Jeff Zhang
>
--
Best Regards
Jeff Zhang