ccaominh opened a new issue #9176: hdfs-storage extension is missing org.codehaus.jackson.map.ObjectMapper
URL: https://github.com/apache/druid/issues/9176
 
 
   ### Affected Version
   
   master and 0.17.0
   
   ### Description
   
   `extensions-core/hdfs-storage/pom.xml` currently excludes `jackson-mapper-asl` to remove a security vulnerability, but Hadoop's delegation-token code still needs `jackson-mapper-asl` to provide `org.codehaus.jackson.map.ObjectMapper`, so HDFS ingestion fails with a `NoClassDefFoundError`:
   
   ```
   2020-01-13T16:00:06,327 WARN [qtp415793386-164] org.apache.druid.segment.indexing.DataSchema - No granularitySpec has been specified. Using UniformGranularitySpec as default.
   2020-01-13T16:00:06,327 WARN [qtp415793386-164] org.apache.druid.segment.indexing.DataSchema - No metricsSpec has been specified. Are you sure this is what you want?
   2020-01-13T16:00:06,356 INFO [qtp415793386-164] org.apache.hadoop.hdfs.DFSClient - Created HDFS_DELEGATION_TOKEN token 161877450 for druid on ha-hdfs:titan
   2020-01-13T16:00:06,393 ERROR [qtp415793386-164] com.sun.jersey.spi.container.ContainerResponse - The exception contained within MappableContainerException could not be mapped to a response, re-throwing to the HTTP container
   java.lang.NoClassDefFoundError: org/codehaus/jackson/map/ObjectMapper
           at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticator.doDelegationTokenOperation(DelegationTokenAuthenticator.java:320) ~[?:?]
           at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticator.getDelegationToken(DelegationTokenAuthenticator.java:182) ~[?:?]
           at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticatedURL.getDelegationToken(DelegationTokenAuthenticatedURL.java:382) ~[?:?]
           at org.apache.hadoop.crypto.key.kms.KMSClientProvider$4.run(KMSClientProvider.java:1029) ~[?:?]
           at org.apache.hadoop.crypto.key.kms.KMSClientProvider$4.run(KMSClientProvider.java:1023) ~[?:?]
           at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_222]
           at javax.security.auth.Subject.doAs(Subject.java:422) ~[?:1.8.0_222]
           at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1844) ~[?:?]
           at org.apache.hadoop.crypto.key.kms.KMSClientProvider.addDelegationTokens(KMSClientProvider.java:1023) ~[?:?]
           at org.apache.hadoop.crypto.key.kms.LoadBalancingKMSClientProvider$1.call(LoadBalancingKMSClientProvider.java:193) ~[?:?]
           at org.apache.hadoop.crypto.key.kms.LoadBalancingKMSClientProvider$1.call(LoadBalancingKMSClientProvider.java:190) ~[?:?]
           at org.apache.hadoop.crypto.key.kms.LoadBalancingKMSClientProvider.doOp(LoadBalancingKMSClientProvider.java:123) ~[?:?]
           at org.apache.hadoop.crypto.key.kms.LoadBalancingKMSClientProvider.addDelegationTokens(LoadBalancingKMSClientProvider.java:190) ~[?:?]
           at org.apache.hadoop.crypto.key.KeyProviderDelegationTokenExtension.addDelegationTokens(KeyProviderDelegationTokenExtension.java:110) ~[?:?]
           at org.apache.hadoop.hdfs.HdfsKMSUtil.addDelegationTokensForKeyProvider(HdfsKMSUtil.java:83) ~[?:?]
           at org.apache.hadoop.hdfs.DistributedFileSystem.addDelegationTokens(DistributedFileSystem.java:2516) ~[?:?]
           at org.apache.hadoop.mapreduce.security.TokenCache.obtainTokensForNamenodesInternal(TokenCache.java:143) ~[?:?]
           at org.apache.hadoop.mapreduce.security.TokenCache.obtainTokensForNamenodesInternal(TokenCache.java:102) ~[?:?]
           at org.apache.hadoop.mapreduce.security.TokenCache.obtainTokensForNamenodes(TokenCache.java:81) ~[?:?]
           at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.listStatus(FileInputFormat.java:249) ~[?:?]
           at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.getSplits(FileInputFormat.java:393) ~[?:?]
           at org.apache.druid.inputsource.hdfs.HdfsInputSource.getPaths(HdfsInputSource.java:113) ~[?:?]
           at org.apache.druid.inputsource.hdfs.HdfsInputSource.cachePathsIfNeeded(HdfsInputSource.java:199) ~[?:?]
           at org.apache.druid.inputsource.hdfs.HdfsInputSource.createSplits(HdfsInputSource.java:173) ~[?:?]
           at org.apache.druid.inputsource.hdfs.HdfsInputSource.formattableReader(HdfsInputSource.java:155) ~[?:?]
           at org.apache.druid.data.input.AbstractInputSource.reader(AbstractInputSource.java:42) ~[druid-core-0.17.0-incubating-iap-preview4.jar:0.17.0-incubating-iap-preview4]
   ```
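
   For reference, the exclusion in question has roughly the following shape (a sketch only; the exact Hadoop dependency the exclusion is attached to in `extensions-core/hdfs-storage/pom.xml` may differ):

   ```xml
   <dependency>
     <groupId>org.apache.hadoop</groupId>
     <artifactId>hadoop-client</artifactId>
     <exclusions>
       <!-- Excluded for the security vulnerability, but Hadoop's
            DelegationTokenAuthenticator still loads
            org.codehaus.jackson.map.ObjectMapper at runtime -->
       <exclusion>
         <groupId>org.codehaus.jackson</groupId>
         <artifactId>jackson-mapper-asl</artifactId>
       </exclusion>
     </exclusions>
   </dependency>
   ```

   Because `NoClassDefFoundError` is only thrown when the class is first referenced, the missing dependency surfaces at ingestion time rather than at startup.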
