> On Feb. 1, 2016, 1:30 p.m., Nate Cole wrote:
> > ambari-server/src/main/resources/common-services/HIVE/0.12.0.2.0/package/scripts/params_linux.py, lines 417-418
> > <https://reviews.apache.org/r/43050/diff/1/?file=1228061#file1228061line417>
> >
> >     How could these ever get set? I wouldn't think Ambari agent code execution is setting any environment variables like this.
> 
> Tom Beerbower wrote:
>     I guess not. I cut and pasted this code from the Atlas params.py file. So, how in the Hive params python script can I determine the atlas directories? Thanks.
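
On the question of deriving the Atlas directories inside the Hive params script: something along the following lines might work. This is only a rough sketch -- the atlas-env configuration keys and fallback paths are assumptions for illustration, not taken from this patch.

    # Rough sketch only: the atlas-env keys and fallback paths below are assumptions.
    from resource_management.libraries.functions.default import default

    # Read the Atlas locations from the configurations that the server pushes to the
    # agent, falling back to conventional install locations if the keys are absent.
    atlas_conf_dir = default('/configurations/atlas-env/metadata_conf_dir', '/etc/atlas/conf')
    atlas_home_dir = default('/configurations/atlas-env/metadata_home_dir', '/usr/hdp/current/atlas-server')
    atlas_hive_hook_dir = atlas_home_dir + '/hook/hive'

    # These could then feed the HADOOP_CLASSPATH export written to hive-env, e.g.
    # atlas_conf_dir + ':' + atlas_hive_hook_dir + ':${HADOOP_CLASSPATH}'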
I see that you're setting a bunch of stuff in params.py and params_linux.py, but no script uses them - they're all used for writing out the xml file? It should be possible to capture the XmlConfig (or whatever) to make sure the substitution makes it into the persisted XML file. If it's too daunting, I won't hold up the review for it.


- Nate


-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/43050/#review117234
-----------------------------------------------------------


On Feb. 1, 2016, 12:05 p.m., Tom Beerbower wrote:
> 
> -----------------------------------------------------------
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/43050/
> -----------------------------------------------------------
> 
> (Updated Feb. 1, 2016, 12:05 p.m.)
> 
> 
> Review request for Ambari, John Speidel, Nate Cole, and Robert Levas.
> 
> 
> Bugs: AMBARI-14853
>     https://issues.apache.org/jira/browse/AMBARI-14853
> 
> 
> Repository: ambari
> 
> 
> Description
> -------
> 
> Three additional steps need to be done to install Atlas 0.6 via Ambari.
> 
> 1. Add the new Atlas Kafka related properties to the Atlas configuration ‘application.properties’:
> 
>     atlas.notification.embedded = false
>     atlas.kafka.data = /tmp
>     atlas.kafka.bootstrap.servers = c6401.ambari.apache.org:6667
>     atlas.kafka.zookeeper.connect = c6401.ambari.apache.org:2181
>     atlas.kafka.hook.group.id = atlas
>     atlas.kafka.entities.group.id = entities
> 
>     * Note: For “atlas.kafka.bootstrap.servers” and “atlas.kafka.zookeeper.connect”, modify the host names based on your cluster topology. The directory specified in “atlas.kafka.data” should exist.
> 
> 2. Add an export of HADOOP_CLASSPATH, which includes the required Atlas directories, to hive-env.xml in the HDP 2.3 stack:
> 
>     export HADOOP_CLASSPATH=/etc/atlas/conf:/usr/hdp/current/atlas-server/hook/hive:${HADOOP_CLASSPATH}
> 
>     * Note: It is important that the Atlas directories are prepended to the existing classpath.
> 
> 3. Restart the Atlas and Hive services after the cluster is fully provisioned.
> 
> 
> Diffs
> -----
> 
>   ambari-server/src/main/resources/common-services/ATLAS/0.1.0.2.3/configuration/application-properties.xml 82dacb6
>   ambari-server/src/main/resources/common-services/ATLAS/0.1.0.2.3/metainfo.xml 2600fc4
>   ambari-server/src/main/resources/common-services/ATLAS/0.1.0.2.3/package/scripts/params.py 1a0c67b
>   ambari-server/src/main/resources/common-services/HIVE/0.12.0.2.0/configuration/hive-env.xml 6db42c9
>   ambari-server/src/main/resources/common-services/HIVE/0.12.0.2.0/package/scripts/params_linux.py a2131b0
>   ambari-server/src/main/resources/stacks/HDP/2.3/services/HIVE/configuration/hive-env.xml 92c0c03
>   ambari-server/src/test/python/stacks/2.3/configs/default.json 21bff13
> 
> Diff: https://reviews.apache.org/r/43050/diff/
> 
> 
> Testing
> -------
> 
> Manual test and verify configuration and Atlas operation.
> 
> mvn clean test : all tests pass
> 
> [INFO] ------------------------------------------------------------------------
> [INFO] BUILD SUCCESS
> [INFO] ------------------------------------------------------------------------
> [INFO] Total time: 01:02 h
> [INFO] Finished at: 2016-02-01T11:56:24-05:00
> [INFO] Final Memory: 44M/1696M
> [INFO] ------------------------------------------------------------------------
> 
> 
> Thanks,
> 
> Tom Beerbower
> 
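
As for capturing the write of application.properties in a test, below is a minimal sketch of the kind of assertion the Python stack tests allow, assuming the RMFTestCase harness under ambari-server/src/test/python. The script path, class name and expected property values are placeholders, and a real test would have to list every argument the resource is declared with.

    # Rough sketch only: paths, class name and expected values are placeholders.
    from stacks.utils.RMFTestCase import RMFTestCase

    class TestMetadataServer(RMFTestCase):
      def test_configure_persists_kafka_properties(self):
        # Drive the Atlas service script with the test configuration (default.json).
        self.executeScript("2.3/services/ATLAS/package/scripts/metadata_server.py",
                           classname = "MetadataServer",
                           command = "configure",
                           config_file = "default.json")
        # Capture the resource that writes application.properties and check that the
        # substituted Kafka values end up in the persisted file.
        self.assertResourceCalled('PropertiesFile', '/etc/atlas/conf/application.properties',
                                  properties = {'atlas.notification.embedded': 'false',
                                                'atlas.kafka.data': '/tmp'})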
