[ https://issues.apache.org/jira/browse/SPARK-30701?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17028026#comment-17028026 ]

Guram Savinov commented on SPARK-30701:
---------------------------------------

So the problem is that the backslash character isn't included in allowedChars; see the attached HadoopGroupTest.java.
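
A minimal standalone sketch of the failing check (the attached HadoopGroupTest.java is the authoritative reproduction; the character class below is copied from the Hadoop 2.7.x FsShellPermissions source as I read it, and the group name is the one from the warnings):

{code:java}
import java.util.regex.Pattern;

public class ChgrpPatternSketch {
    // allowedChars as defined in Hadoop 2.7.x FsShellPermissions for Windows:
    // a space is allowed in group names, but a backslash is not.
    private static final String ALLOWED_CHARS = "[-_./@a-zA-Z0-9 ]";

    // chgrp validates the group argument against this pattern.
    private static final Pattern CHGRP_PATTERN =
            Pattern.compile("^\\s*(" + ALLOWED_CHARS + "+)\\s*$");

    public static void main(String[] args) {
        // Domain-qualified Windows group from the warnings below.
        String group = "TEST\\Domain users";
        // Prints false: the backslash fails the match, which is what
        // produces "does not match expected pattern for group".
        System.out.println(CHGRP_PATTERN.matcher(group).matches());
    }
}
{code}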

> SQL test running on Windows: hadoop chgrp warnings
> --------------------------------------------------
>
>                 Key: SPARK-30701
>                 URL: https://issues.apache.org/jira/browse/SPARK-30701
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.4.4
>         Environment: Windows 10
> Winutils 2.7.1: 
> [https://github.com/steveloughran/winutils/tree/master/hadoop-2.7.1]
> Oracle JavaSE 8
> SparkSQL 2.4.4
> Using: -Dhive.exec.scratchdir=C:\Users\OSUser\hadoop\tmp\hive
> Set: winutils chmod -R 777 \Users\OSUser\hadoop\tmp\hive
>            Reporter: Guram Savinov
>            Assignee: Felix Cheung
>            Priority: Major
>              Labels: Windows, hive, unit-test
>         Attachments: HadoopGroupTest.java
>
>
> Running SparkSQL embedded unit tests locally on Windows 10, using winutils,
> produces warnings from 'hadoop chgrp'.
> See the environment info above.
> {code:bash}
> -chgrp: 'TEST\Domain users' does not match expected pattern for group
> Usage: hadoop fs [generic options] -chgrp [-R] GROUP PATH...
> -chgrp: 'TEST\Domain users' does not match expected pattern for group
> Usage: hadoop fs [generic options] -chgrp [-R] GROUP PATH...
> -chgrp: 'TEST\Domain users' does not match expected pattern for group
> Usage: hadoop fs [generic options] -chgrp [-R] GROUP PATH...
> {code}
> Related info on Stack Overflow: 
> https://stackoverflow.com/questions/48605907/error-in-pyspark-when-insert-data-in-hive
> The problem seems to be here: 
> hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/FsShellPermissions.java:210
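> The relevant definition, paraphrased below from the Hadoop 2.7.x source (exact line numbers vary across versions; Shell.WINDOWS is org.apache.hadoop.util.Shell.WINDOWS), builds the validation pattern from a character class that allows spaces on Windows but never a backslash:
> {code:java}
> // org.apache.hadoop.fs.FsShellPermissions (Hadoop 2.7.x, paraphrased sketch)
> // The Windows variant permits spaces but not '\', so a domain-qualified
> // group such as TEST\Domain users is rejected by chgrp/chown validation.
> static private String allowedChars = Shell.WINDOWS ?
>     "[-_./@a-zA-Z0-9 ]" : "[-_./@a-zA-Z0-9]";
> // A possible fix (an assumption, not a committed patch) would be to add
> // the backslash to the Windows class, e.g. "[-_./@a-zA-Z0-9 \\\\]".
> {code}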


