[jira] [Created] (HADOOP-12346) Increase some default timeouts / retries for S3a connector
Sean Mackrory created HADOOP-12346:
--------------------------------------

Summary: Increase some default timeouts / retries for S3a connector
Key: HADOOP-12346
URL: https://issues.apache.org/jira/browse/HADOOP-12346
Project: Hadoop Common
Issue Type: Bug
Components: fs/s3
Reporter: Sean Mackrory

I've been seeing some flakiness in jobs running against S3a, both first hand and with other accounts, for which increasing fs.s3a.connection.timeout and fs.s3a.attempts.maximum have been a reliable solution. I propose we increase the defaults.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
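Until the defaults change, the two properties can be raised per cluster in core-site.xml. The values below are illustrative examples only, not the new defaults proposed in the JIRA:

```xml
<!-- Illustrative core-site.xml overrides for S3a flakiness; pick values
     appropriate to your environment. -->
<property>
  <name>fs.s3a.connection.timeout</name>
  <value>200000</value> <!-- socket timeout, in milliseconds -->
</property>
<property>
  <name>fs.s3a.attempts.maximum</name>
  <value>20</value> <!-- max retries before the client gives up -->
</property>
```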
Jenkins build is back to normal : Hadoop-Common-trunk #1595
See https://builds.apache.org/job/Hadoop-Common-trunk/1595/changes
[VOTE] Using rebase and merge for feature branch development
Hi common-dev,

As promised, here is an official vote thread. Let's run it for the standard 7 days, closing on Aug 28th at noon. Only PMC members have binding votes, but of course everyone's input is welcomed. If the vote passes, I'll put the text on the website somewhere as recommended by Steve.

Previous discussion threads:
http://mail-archives.apache.org/mod_mbox/hadoop-common-dev/201508.mbox/%3CCAGB5D2bPWeV2Hk%2B67%3DDamWpVfLTM6nkjb_wG3n4%3DWAN890zqfA%40mail.gmail.com%3E
http://mail-archives.apache.org/mod_mbox/hadoop-common-dev/201508.mbox/%3CCAGB5D2aDXujQjwdmadVtg2-qrPAJeOgCS2_NHydv8jke8or1UA%40mail.gmail.com%3E

Proposal:

Feature branch development can use either a merge or rebase workflow, as decided by contributors working on the branch.

When using a rebase workflow, the feature branch is periodically rebased on trunk via git rebase trunk and force-pushed. Before performing a force-push, a tag should be created of the current feature branch HEAD to preserve history. The tag should identify the feature and the date of the most recent commit, e.g. tag_feature_HDFS-7285_2015-08-11. It can also be convenient to use a temporary branch to review rebase conflict resolution before force-pushing the main feature branch, e.g. HDFS-7285-rebase. Temporary branches should be deleted after they are force-pushed over the feature branch.

Developers are allowed to squash and reorder commits to make rebasing easier. Use this judiciously. When squashing, please maintain the original commit messages in the squashed commit message to preserve history.

When using a merge workflow, changes are periodically integrated from trunk to the branch via git merge trunk. Merge conflict resolution can be reviewed by posting the diff of the merge commit.

For both rebase and merge workflows, integration of the branch into trunk should happen via git merge --no-ff. --no-ff is important since it generates a merge commit even if the branch applies cleanly on top of trunk. This clearly denotes the set of commits that were made on the branch, and makes it easier to revert the branch if necessary. git merge --no-ff is also the preferred way of integrating a feature branch into other branches, e.g. branch-2.

Thanks,
Andrew
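The proposed rebase workflow and the --no-ff integration can be sketched with plain git commands. The branch name HDFS-9999 and the throwaway repository below are hypothetical, for illustration only; on a real feature branch the force-push would target the shared remote after the conflict resolution has been reviewed.

```shell
# Sketch of the rebase workflow plus --no-ff integration, run against a
# scratch repository. Branch name HDFS-9999 is invented for this example.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email dev@example.com
git config user.name Dev
git checkout -qb trunk
echo base > file.txt && git add file.txt && git commit -qm "base commit"

git checkout -qb HDFS-9999                        # feature branch
echo feature > feature.txt && git add feature.txt
git commit -qm "HDFS-9999. Feature work"

git checkout -q trunk                             # trunk moves on meanwhile
echo more >> file.txt && git commit -qam "trunk commit"

git checkout -q HDFS-9999
git tag "tag_feature_HDFS-9999_$(date +%Y-%m-%d)" # preserve pre-rebase HEAD
git rebase -q trunk                               # then: git push -f origin HDFS-9999

git checkout -q trunk
git merge --no-ff -q -m "Merge HDFS-9999 to trunk" HDFS-9999
# --no-ff guarantees a merge commit even though the rebased branch would
# have fast-forwarded; the merge commit has two parents:
git rev-list --parents -n 1 HEAD
```

The tag keeps the pre-rebase history reachable, so the force-push never discards commits permanently.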
Build failed in Jenkins: Hadoop-common-trunk-Java8 #294
See https://builds.apache.org/job/Hadoop-common-trunk-Java8/294/changes

Changes:

[yliu] HDFS-8884. Fail-fast check in BlockPlacementPolicyDefault#chooseTarget. (yliu)
[yliu] HDFS-8863. The remaining space check in BlockPlacementPolicyDefault is flawed. (Kihwal Lee via yliu)
[yzhang] HDFS-8828. Utilize Snapshot diff report to build diff copy list in distcp. (Yufei Gu via Yongjun Zhang)
[wangda] YARN-2923. Support configuration based NodeLabelsProvider Service in Distributed Node Label Configuration Setup. (Naganarasimha G R)
[cmccabe] HDFS-8922. Link the native_mini_dfs test library with libdl, since IBM Java requires it (Ayappan via Colin P. McCabe)
[jing9] HDFS-8809. HDFS fsck reports under construction blocks as CORRUPT. Contributed by Jing Zhao.
[vinodkv] Creating 2.6.2 entries in CHANGES.txt files.
[aajisaka] MAPREDUCE-6357. MultipleOutputs.write() API should document that output committing is not utilized when input path is absolute. Contributed by Dustin Cote.
[cdouglas] HDFS-8891. HDFS concat should keep srcs order. Contributed by Yong Zhang.
[rohithsharmaks] YARN-3986. getTransferredContainers in AbstractYarnScheduler should be present in YarnScheduler interface

--
[...truncated 3861 lines...]
[INFO] preparing 'analyze-report' report requires 'test-compile' forked phase execution
[INFO]
[INFO] maven-dependency-plugin:2.8:analyze-report @ hadoop-auth
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-auth ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ hadoop-auth ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory https://builds.apache.org/job/Hadoop-common-trunk-Java8/ws/hadoop-common-project/hadoop-auth/src/main/resources
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hadoop-auth ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ hadoop-auth ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory https://builds.apache.org/job/Hadoop-common-trunk-Java8/ws/hadoop-common-project/hadoop-auth/src/test/resources
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hadoop-auth ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] maven-dependency-plugin:2.8:analyze-report @ hadoop-auth
[WARNING] No project URL defined - decoration links will not be relativized!
[INFO] Rendering site with org.apache.maven.skins:maven-stylus-skin:jar:1.2 skin.
[INFO] Rendering 4 Doxia documents: 4 markdown
[INFO] Generating Dependency Analysis report --- maven-dependency-plugin:2.8:analyze-report
[INFO]
[INFO] --- maven-project-info-reports-plugin:2.7:dependencies (default) @ hadoop-auth ---
[ERROR] Artifact: jdk.tools:jdk.tools:jar:1.8 has no file.
[INFO]
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-auth ---
[INFO] Loading source files for package org.apache.hadoop.util...
Loading source files for package org.apache.hadoop.security.authentication.util...
Loading source files for package org.apache.hadoop.security.authentication.server...
Loading source files for package org.apache.hadoop.security.authentication.client...
Constructing Javadoc information...
Standard Doclet version 1.8.0
Building tree for all the packages and classes...
Generating https://builds.apache.org/job/Hadoop-common-trunk-Java8/ws/hadoop-common-project/hadoop-auth/target/org/apache/hadoop/util/PlatformName.html...
Generating https://builds.apache.org/job/Hadoop-common-trunk-Java8/ws/hadoop-common-project/hadoop-auth/target/org/apache/hadoop/security/authentication/util/AuthToken.html...
Generating https://builds.apache.org/job/Hadoop-common-trunk-Java8/ws/hadoop-common-project/hadoop-auth/target/org/apache/hadoop/security/authentication/util/CertificateUtil.html...
Generating https://builds.apache.org/job/Hadoop-common-trunk-Java8/ws/hadoop-common-project/hadoop-auth/target/org/apache/hadoop/security/authentication/util/FileSignerSecretProvider.html...
Generating https://builds.apache.org/job/Hadoop-common-trunk-Java8/ws/hadoop-common-project/hadoop-auth/target/org/apache/hadoop/security/authentication/util/KerberosName.html...
Generating https://builds.apache.org/job/Hadoop-common-trunk-Java8/ws/hadoop-common-project/hadoop-auth/target/org/apache/hadoop/security/authentication/util/KerberosName.BadFormatString.html...
Generating https://builds.apache.org/job/Hadoop-common-trunk-Java8/ws/hadoop-common-project/hadoop-auth/target/org/apache/hadoop/security/authentication/util/KerberosName.NoMatchingRule.html...
Generating
Re: [VOTE] Using rebase and merge for feature branch development
> I know this is a different topic than the main reason for this vote, but has there been a discussion of using a squashed merge as opposed to a normal merge when feature branches merge to trunk? Squash merges have some advantages, including not complicating the branch tree.

Since this [VOTE] explicitly allows a rebase+squash workflow, you can do this by first squashing the branch with rebase then merging it. We discussed a lot about the importance of preserving history though, which is the point of using merge and also carefully annotating squashed commits when rebasing. This would apply to merge --squash too.

Ultimately my goal though is to leave it to the contributors to decide. The dev workflow and integration plan should all be discussed publicly, so we can figure out what works best in each situation.

Best,
Andrew
[jira] [Created] (HADOOP-12347) Fix mismatch parameter name in javadocs of AuthToken#setMaxInactives
Xiaoyu Yao created HADOOP-12347:
-----------------------------------

Summary: Fix mismatch parameter name in javadocs of AuthToken#setMaxInactives
Key: HADOOP-12347
URL: https://issues.apache.org/jira/browse/HADOOP-12347
Project: Hadoop Common
Issue Type: Bug
Reporter: Xiaoyu Yao
Assignee: Xiaoyu Yao
Priority: Trivial

This was introduced by HADOOP-12050 as evident in the build errors:

{code}
https://builds.apache.org/job/Hadoop-common-trunk-Java8/ws/hadoop-common-project/hadoop-auth/src/main/java/org/apache/hadoop/security/authentication/util/AuthToken.java:101: warning: no @param for interval
public void setMaxInactives(long interval) {
            ^
https://builds.apache.org/job/Hadoop-common-trunk-Java8/ws/hadoop-common-project/hadoop-auth/src/main/java/org/apache/hadoop/security/authentication/server/AuthenticationToken.java:65: error: @param name not found
* @param max inactive time of the token in milliseconds since the epoch.
{code}

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
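The fix is simply to make the javadoc @param name match the declared parameter. A minimal sketch, with a simplified stand-in class rather than the real AuthToken implementation; the field and getter here are invented for illustration:

```java
// Simplified stand-in for AuthToken. Only the javadoc mismatch reported by
// the build is taken from the JIRA; the field and getter are illustrative.
public class AuthToken {
    private long maxInactives;

    /**
     * Before the fix the tag read "@param max ..." while the signature
     * declares "interval", so javadoc reported both "no @param for
     * interval" and "@param name not found". Matching the names fixes it.
     *
     * @param interval max inactive time of the token in milliseconds
     */
    public void setMaxInactives(long interval) {
        this.maxInactives = interval;
    }

    public long getMaxInactives() {
        return maxInactives;
    }
}
```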
Re: [VOTE] Using rebase and merge for feature branch development
On Fri, Aug 21, 2015 at 1:44 PM, Andrew Wang andrew.w...@cloudera.com wrote:

> Hi common-dev,
>
> As promised, here is an official vote thread. Let's run it for the standard 7 days, closing on Aug 28th at noon. Only PMC members have binding votes, but of course everyone's input is welcomed. If the vote passes, I'll put the text on the website somewhere as recommended by Steve.
>
> Previous discussion threads:
> http://mail-archives.apache.org/mod_mbox/hadoop-common-dev/201508.mbox/%3CCAGB5D2bPWeV2Hk%2B67%3DDamWpVfLTM6nkjb_wG3n4%3DWAN890zqfA%40mail.gmail.com%3E
> http://mail-archives.apache.org/mod_mbox/hadoop-common-dev/201508.mbox/%3CCAGB5D2aDXujQjwdmadVtg2-qrPAJeOgCS2_NHydv8jke8or1UA%40mail.gmail.com%3E
>
> Proposal:
>
> Feature branch development can use either a merge or rebase workflow, as decided by contributors working on the branch.
>
> When using a rebase workflow, the feature branch is periodically rebased on trunk via git rebase trunk and force-pushed. Before performing a force-push, a tag should be created of the current feature branch HEAD to preserve history. The tag should identify the feature and the date of the most recent commit, e.g. tag_feature_HDFS-7285_2015-08-11. It can also be convenient to use a temporary branch to review rebase conflict resolution before force-pushing the main feature branch, e.g. HDFS-7285-rebase. Temporary branches should be deleted after they are force-pushed over the feature branch.
>
> Developers are allowed to squash and reorder commits to make rebasing easier. Use this judiciously. When squashing, please maintain the original commit messages in the squashed commit message to preserve history.
>
> When using a merge workflow, changes are periodically integrated from trunk to the branch via git merge trunk. Merge conflict resolution can be reviewed by posting the diff of the merge commit.
>
> For both rebase and merge workflows, integration of the branch into trunk should happen via git merge --no-ff. --no-ff is important since it generates a merge commit even if the branch applies cleanly on top of trunk. This clearly denotes the set of commits that were made on the branch, and makes it easier to revert the branch if necessary. git merge --no-ff is also the preferred way of integrating a feature branch into other branches, e.g. branch-2.

I know this is a different topic than the main reason for this vote, but has there been a discussion of using a squashed merge as opposed to a normal merge when feature branches merge to trunk? Squash merges have some advantages, including not complicating the branch tree.

> Thanks,
> Andrew
Re: [VOTE] Using rebase and merge for feature branch development
Understood, and agreed on the need to preserve the commit history. The only reason I thought I'd comment is that the current proposal explicitly mentions git merge --no-ff when merging the branch to trunk. --squash cannot be combined with --no-ff, thus I wanted some clarification.

On Fri, Aug 21, 2015 at 3:40 PM, Andrew Wang andrew.w...@cloudera.com wrote:

>> I know this is a different topic than the main reason for this vote, but has there been a discussion of using a squashed merge as opposed to a normal merge when feature branches merge to trunk? Squash merges have some advantages, including not complicating the branch tree.
>
> Since this [VOTE] explicitly allows a rebase+squash workflow, you can do this by first squashing the branch with rebase then merging it. We discussed a lot about the importance of preserving history though, which is the point of using merge and also carefully annotating squashed commits when rebasing. This would apply to merge --squash too.
>
> Ultimately my goal though is to leave it to the contributors to decide. The dev workflow and integration plan should all be discussed publicly, so we can figure out what works best in each situation.
>
> Best,
> Andrew
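The incompatibility is easy to confirm in a scratch repository: git rejects the flag combination outright, because --squash suppresses exactly the merge commit that --no-ff exists to force. Branch names below are invented for the demonstration.

```shell
# Scratch-repo check that --squash and --no-ff conflict, and that a squash
# merge records no merge commit. All names here are illustrative.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email dev@example.com
git config user.name Dev
git checkout -qb trunk
echo a > f.txt && git add f.txt && git commit -qm "base"
git checkout -qb feature
echo b > g.txt && git add g.txt && git commit -qm "feature work"
git checkout -q trunk

# git refuses the combination with a fatal error (message wording varies
# slightly across git versions):
if git merge --squash --no-ff feature 2> err.txt; then
    echo "unexpected: flags were accepted"
fi
cat err.txt

# A plain squash merge stages the branch's changes but creates no merge
# commit; the resulting history stays linear (one parent on HEAD):
git merge --squash feature > /dev/null
git commit -qm "squashed feature"
git rev-list --parents -n 1 HEAD
```

So a team that wants squashed integration under this proposal would squash on the branch first (via rebase), then still land it with git merge --no-ff.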
Build failed in Jenkins: Hadoop-common-trunk-Java8 #296
See https://builds.apache.org/job/Hadoop-common-trunk-Java8/296/changes

Changes:

[lei] HDFS-8924. Add pluggable interface for reading replicas in DFSClient. (Colin Patrick McCabe via Lei Xu)

--
[...truncated 3898 lines...]
[INFO]
[INFO] maven-source-plugin:2.3:jar (default) @ hadoop-auth
[INFO]
[INFO] --- maven-source-plugin:2.3:jar (default) @ hadoop-auth ---
[INFO] Building jar: https://builds.apache.org/job/Hadoop-common-trunk-Java8/ws/hadoop-common-project/hadoop-auth/target/hadoop-auth-3.0.0-SNAPSHOT-sources.jar
[INFO]
[INFO] --- maven-jar-plugin:2.5:jar (prepare-jar) @ hadoop-auth ---
[INFO] Building jar: https://builds.apache.org/job/Hadoop-common-trunk-Java8/ws/hadoop-common-project/hadoop-auth/target/hadoop-auth-3.0.0-SNAPSHOT.jar
[INFO]
[INFO] --- maven-jar-plugin:2.5:test-jar (prepare-test-jar) @ hadoop-auth ---
[INFO] Building jar: https://builds.apache.org/job/Hadoop-common-trunk-Java8/ws/hadoop-common-project/hadoop-auth/target/hadoop-auth-3.0.0-SNAPSHOT-tests.jar
[INFO]
[INFO] --- maven-jar-plugin:2.5:jar (default-jar) @ hadoop-auth ---
[INFO]
[INFO] --- maven-source-plugin:2.3:jar-no-fork (hadoop-java-sources) @ hadoop-auth ---
[INFO]
[INFO] --- maven-source-plugin:2.3:test-jar-no-fork (hadoop-java-sources) @ hadoop-auth ---
[INFO] Building jar: https://builds.apache.org/job/Hadoop-common-trunk-Java8/ws/hadoop-common-project/hadoop-auth/target/hadoop-auth-3.0.0-SNAPSHOT-test-sources.jar
[INFO]
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-auth ---
[INFO]
[INFO] --- maven-site-plugin:3.4:attach-descriptor (attach-descriptor) @ hadoop-auth ---
[INFO]
[INFO] --- maven-site-plugin:3.4:site (default) @ hadoop-auth ---
[INFO] configuring report plugin org.apache.maven.plugins:maven-dependency-plugin:2.8
[INFO] preparing 'analyze-report' report requires 'test-compile' forked phase execution
[INFO]
[INFO] maven-dependency-plugin:2.8:analyze-report @ hadoop-auth
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-auth ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ hadoop-auth ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory https://builds.apache.org/job/Hadoop-common-trunk-Java8/ws/hadoop-common-project/hadoop-auth/src/main/resources
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hadoop-auth ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ hadoop-auth ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory https://builds.apache.org/job/Hadoop-common-trunk-Java8/ws/hadoop-common-project/hadoop-auth/src/test/resources
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hadoop-auth ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] maven-dependency-plugin:2.8:analyze-report @ hadoop-auth
[WARNING] No project URL defined - decoration links will not be relativized!
[INFO] Rendering site with org.apache.maven.skins:maven-stylus-skin:jar:1.2 skin.
[INFO] Rendering 4 Doxia documents: 4 markdown
[INFO] Generating Dependency Analysis report --- maven-dependency-plugin:2.8:analyze-report
[INFO]
[INFO] --- maven-project-info-reports-plugin:2.7:dependencies (default) @ hadoop-auth ---
[ERROR] Artifact: jdk.tools:jdk.tools:jar:1.8 has no file.
[INFO]
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-auth ---
[INFO] Loading source files for package org.apache.hadoop.util...
Loading source files for package org.apache.hadoop.security.authentication.util...
Loading source files for package org.apache.hadoop.security.authentication.client...
Loading source files for package org.apache.hadoop.security.authentication.server...
Constructing Javadoc information...
Standard Doclet version 1.8.0
Building tree for all the packages and classes...
Generating https://builds.apache.org/job/Hadoop-common-trunk-Java8/ws/hadoop-common-project/hadoop-auth/target/org/apache/hadoop/util/PlatformName.html...
Generating https://builds.apache.org/job/Hadoop-common-trunk-Java8/ws/hadoop-common-project/hadoop-auth/target/org/apache/hadoop/security/authentication/util/AuthToken.html...
Generating https://builds.apache.org/job/Hadoop-common-trunk-Java8/ws/hadoop-common-project/hadoop-auth/target/org/apache/hadoop/security/authentication/util/CertificateUtil.html...
Generating https://builds.apache.org/job/Hadoop-common-trunk-Java8/ws/hadoop-common-project/hadoop-auth/target/org/apache/hadoop/security/authentication/util/FileSignerSecretProvider.html...
Generating https://builds.apache.org/job/Hadoop-common-trunk-Java8/ws/hadoop-common-project/hadoop-auth/target/org/apache/hadoop/security/authentication/util/KerberosName.html...
Generating
Build failed in Jenkins: Hadoop-common-trunk-Java8 #295
See https://builds.apache.org/job/Hadoop-common-trunk-Java8/295/changes

Changes:

[xyao] HADOOP-12347. Fix mismatch parameter name in javadocs of AuthToken#setMaxInactives. Contributed by Xiaoyu Yao

--
[...truncated 3898 lines...]
[INFO]
[INFO] maven-source-plugin:2.3:jar (default) @ hadoop-auth
[INFO]
[INFO] --- maven-source-plugin:2.3:jar (default) @ hadoop-auth ---
[INFO] Building jar: https://builds.apache.org/job/Hadoop-common-trunk-Java8/ws/hadoop-common-project/hadoop-auth/target/hadoop-auth-3.0.0-SNAPSHOT-sources.jar
[INFO]
[INFO] --- maven-jar-plugin:2.5:jar (prepare-jar) @ hadoop-auth ---
[INFO] Building jar: https://builds.apache.org/job/Hadoop-common-trunk-Java8/ws/hadoop-common-project/hadoop-auth/target/hadoop-auth-3.0.0-SNAPSHOT.jar
[INFO]
[INFO] --- maven-jar-plugin:2.5:test-jar (prepare-test-jar) @ hadoop-auth ---
[INFO] Building jar: https://builds.apache.org/job/Hadoop-common-trunk-Java8/ws/hadoop-common-project/hadoop-auth/target/hadoop-auth-3.0.0-SNAPSHOT-tests.jar
[INFO]
[INFO] --- maven-jar-plugin:2.5:jar (default-jar) @ hadoop-auth ---
[INFO]
[INFO] --- maven-source-plugin:2.3:jar-no-fork (hadoop-java-sources) @ hadoop-auth ---
[INFO]
[INFO] --- maven-source-plugin:2.3:test-jar-no-fork (hadoop-java-sources) @ hadoop-auth ---
[INFO] Building jar: https://builds.apache.org/job/Hadoop-common-trunk-Java8/ws/hadoop-common-project/hadoop-auth/target/hadoop-auth-3.0.0-SNAPSHOT-test-sources.jar
[INFO]
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-auth ---
[INFO]
[INFO] --- maven-site-plugin:3.4:attach-descriptor (attach-descriptor) @ hadoop-auth ---
[INFO]
[INFO] --- maven-site-plugin:3.4:site (default) @ hadoop-auth ---
[INFO] configuring report plugin org.apache.maven.plugins:maven-dependency-plugin:2.8
[INFO] preparing 'analyze-report' report requires 'test-compile' forked phase execution
[INFO]
[INFO] maven-dependency-plugin:2.8:analyze-report @ hadoop-auth
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-auth ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ hadoop-auth ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory https://builds.apache.org/job/Hadoop-common-trunk-Java8/ws/hadoop-common-project/hadoop-auth/src/main/resources
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hadoop-auth ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ hadoop-auth ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory https://builds.apache.org/job/Hadoop-common-trunk-Java8/ws/hadoop-common-project/hadoop-auth/src/test/resources
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hadoop-auth ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] maven-dependency-plugin:2.8:analyze-report @ hadoop-auth
[WARNING] No project URL defined - decoration links will not be relativized!
[INFO] Rendering site with org.apache.maven.skins:maven-stylus-skin:jar:1.2 skin.
[INFO] Rendering 4 Doxia documents: 4 markdown
[INFO] Generating Dependency Analysis report --- maven-dependency-plugin:2.8:analyze-report
[INFO]
[INFO] --- maven-project-info-reports-plugin:2.7:dependencies (default) @ hadoop-auth ---
[ERROR] Artifact: jdk.tools:jdk.tools:jar:1.8 has no file.
[INFO]
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-auth ---
[INFO] Loading source files for package org.apache.hadoop.security.authentication.util...
Loading source files for package org.apache.hadoop.security.authentication.server...
Loading source files for package org.apache.hadoop.security.authentication.client...
Loading source files for package org.apache.hadoop.util...
Constructing Javadoc information...
Standard Doclet version 1.8.0
Building tree for all the packages and classes...
Generating https://builds.apache.org/job/Hadoop-common-trunk-Java8/ws/hadoop-common-project/hadoop-auth/target/org/apache/hadoop/security/authentication/util/AuthToken.html...
Generating https://builds.apache.org/job/Hadoop-common-trunk-Java8/ws/hadoop-common-project/hadoop-auth/target/org/apache/hadoop/security/authentication/util/CertificateUtil.html...
Generating https://builds.apache.org/job/Hadoop-common-trunk-Java8/ws/hadoop-common-project/hadoop-auth/target/org/apache/hadoop/security/authentication/util/FileSignerSecretProvider.html...
Generating https://builds.apache.org/job/Hadoop-common-trunk-Java8/ws/hadoop-common-project/hadoop-auth/target/org/apache/hadoop/security/authentication/util/KerberosName.html...
Generating