[ https://issues.apache.org/jira/browse/HDFS-14819?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16923621#comment-16923621 ]
Erik Krogen commented on HDFS-14819:
------------------------------------

Hi [~soyamiyoshi], thanks a lot for reporting this! The change overall LGTM, I just have some minor comments:
* There should be a space between the closing parenthesis and the opening curly brace on L140 in {{AuditLogDirectParser}}.
* Why do we use {{Arrays.asList()}}? Can't we just do:
{code}
String[] splitMessage = auditMessage.split("=", 2);
parameterMap.put(splitMessage[0], splitMessage[1]);
{code}
While we're at it, we should probably add a sanity check that the split message has at least two entries, to avoid an {{ArrayIndexOutOfBoundsException}}.

> [Dynamometer] Cannot parse audit logs with '=' in unexpected places when
> starting a workload
> ----------------------------------------------------------------------------------------------
>
> Key: HDFS-14819
> URL: https://issues.apache.org/jira/browse/HDFS-14819
> Project: Hadoop HDFS
> Issue Type: Sub-task
> Reporter: Soya Miyoshi
> Assignee: Soya Miyoshi
> Priority: Major
> Attachments: HDFS-14819.001.patch
>
>
> When launching a workload job, if any value in the given audit logs
> contains `=` anywhere other than immediately after a key (such as `ugi` or `src`),
> the audit log cannot be parsed and an exception is thrown.
> For example, this audit log results in an exception, because it contains `=` in
> the `src` value ("/projects/date=0822"):
> {code:|borderStyle=solid}
> 2019-08-22 01:00:00,186 INFO FSNamesystem.audit: allowed=true ugi=feed
> (auth:aaaaa) ip=/119.472.323.333 cmd=getfileinfo
> src=/projects/date=0822 dst=null
> perm=null proto=rpc
> {code}
> If the second `=` in `src=/projects/date=0822` is removed, it works fine.
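To illustrate the suggestion above, here is a minimal, hedged sketch of the parsing idea (not the actual {{AuditLogDirectParser}} code; the class name and whitespace tokenization are simplifying assumptions): splitting each token on the *first* `=` only, via the two-argument {{String.split}}, keeps any later `=` inside the value, and a length check avoids the {{ArrayIndexOutOfBoundsException}}:

```java
import java.util.HashMap;
import java.util.Map;

public class AuditEntrySketch {
    // Hypothetical helper for illustration only: parse "key=value" tokens
    // from a whitespace-separated audit message fragment.
    static Map<String, String> parse(String auditMessage) {
        Map<String, String> parameterMap = new HashMap<>();
        for (String token : auditMessage.split("\\s+")) {
            // limit=2 splits on the FIRST '=' only, so a value like
            // "/projects/date=0822" survives intact.
            String[] splitMessage = token.split("=", 2);
            if (splitMessage.length < 2) {
                // Sanity check: skip malformed tokens instead of
                // hitting an ArrayIndexOutOfBoundsException.
                continue;
            }
            parameterMap.put(splitMessage[0], splitMessage[1]);
        }
        return parameterMap;
    }

    public static void main(String[] args) {
        Map<String, String> m =
            parse("allowed=true cmd=getfileinfo src=/projects/date=0822");
        System.out.println(m.get("src")); // prints /projects/date=0822
    }
}
```

Note the real audit-log format also contains fields with embedded spaces (e.g. `ugi=feed (auth:aaaaa)`), so the actual parser tokenizes differently; the sketch only demonstrates the first-`=` split plus the length guard.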