JiaLiangC commented on PR #3683:
URL: https://github.com/apache/ambari/pull/3683#issuecomment-1501296760

   The same issue occurs with any other component added to a kerberized cluster:
   once a component is added, its service check fails because the Hadoop
   proxy-user configuration has not been refreshed.
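   For context, the proxy-user settings in question are the `hadoop.proxyuser.*`
   properties in core-site.xml. An illustrative entry for the Hive service user
   (host and group values are examples, not a recommendation) might look like:

   ```xml
   <!-- Illustrative only: which hosts/groups the hive user may impersonate from. -->
   <property>
     <name>hadoop.proxyuser.hive.hosts</name>
     <value>hive-host.example.com</value>
   </property>
   <property>
     <name>hadoop.proxyuser.hive.groups</name>
     <value>*</value>
   </property>
   ```

   Until the NameNode re-reads these properties, impersonation requests from the
   newly added component are rejected, which is why the service check fails.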
   
   Although the configuration is refreshed from the Hive service, the Ambari
   web UI still prompts to restart HDFS. Therefore, to completely resolve this
   issue, we need coordinated work on both the frontend and the backend.
   
   Perhaps we can raise this issue on the dev mailing list and create a few 
tasks to address it:
   
   1. The frontend should check whether the current configuration change includes
   a "refresh" command; if the backend refresh succeeds, the frontend should no
   longer prompt to restart the HDFS cluster.
   Note that if no HDFS component exists on the nodes where Hive is deployed,
   kinit cannot find the HDFS keytab and will fail. Therefore, it is more
   appropriate for the HDFS service itself to run the refresh command.
   2. The HDFS frontend menu already has a "refresh configuration" button. When a
   new component is added, and only when the proxy-user configuration changes,
   that same interface could be called automatically to perform the refresh.
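   For reference, the backend refresh described above corresponds to the standard
   Hadoop admin commands, run as the HDFS service user after a kinit against the
   HDFS keytab (the keytab path and principal below are illustrative and vary
   per cluster):

   ```shell
   # Illustrative only: keytab path and principal differ per cluster.
   kinit -kt /etc/security/keytabs/hdfs.headless.keytab hdfs@EXAMPLE.COM

   # Re-read hadoop.proxyuser.* settings on the NameNode without a restart.
   hdfs dfsadmin -refreshSuperUserGroupsConfiguration

   # The ResourceManager keeps its own copy of the proxy-user settings.
   yarn rmadmin -refreshSuperUserGroupsConfiguration
   ```

   This is why running the refresh from an HDFS host is the natural choice:
   those hosts are guaranteed to have the HDFS keytab available for kinit.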


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: dev-unsubscr...@ambari.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

