[GitHub] spark pull request #19808: [SPARK-22597][SQL] Add spark-sql cmd script for W...
Github user asfgit closed the pull request at: https://github.com/apache/spark/pull/19808

- To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
- For additional commands, e-mail: reviews-h...@spark.apache.org
Github user HyukjinKwon commented on a diff in the pull request: https://github.com/apache/spark/pull/19808#discussion_r152953457

--- Diff: bin/spark-sql.cmd ---
@@ -0,0 +1,25 @@
+@echo off
+
+rem
+rem Licensed to the Apache Software Foundation (ASF) under one or more
+rem contributor license agreements. See the NOTICE file distributed with
+rem this work for additional information regarding copyright ownership.
+rem The ASF licenses this file to You under the Apache License, Version 2.0
+rem (the "License"); you may not use this file except in compliance with
+rem the License. You may obtain a copy of the License at
+rem
+rem    http://www.apache.org/licenses/LICENSE-2.0
+rem
+rem Unless required by applicable law or agreed to in writing, software
+rem distributed under the License is distributed on an "AS IS" BASIS,
+rem WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+rem See the License for the specific language governing permissions and
+rem limitations under the License.
+rem
+
+rem This is the entry point for running SparkSQL. To avoid polluting the
+rem environment, it just launches a new cmd to do the real work.
+
+rem The outermost quotes are used to prevent Windows command line parse error
+rem when there are some quotes in parameters, see SPARK-21877.
+cmd /V /E /C ""%~dp0spark-sql2.cmd" %*"
--- End diff --

BTW, to be honest, I didn't know that my little Windows dev-environment experience from long ago could end up being helpful to Spark :).
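The double-quoting trick in `spark-sql.cmd` exists to forward arguments intact even when they contain quotes. A rough POSIX-sh counterpart is quoting `"$@"` when passing arguments through; the sketch below uses a hypothetical `forward` function purely for illustration:

```shell
#!/bin/sh
# Hypothetical forwarder: prints each argument it receives in brackets,
# so we can see whether argument boundaries survived.
forward() {
  for arg in "$@"; do
    printf '[%s]\n' "$arg"
  done
}

# Quoting "$@" keeps "a b" as a single argument and preserves the
# embedded double quote in the second argument.
forward "a b" 'c"d'
# prints:
# [a b]
# [c"d]
```

Without the quotes (`$@` bare), the shell would re-split on whitespace and `a b` would arrive as two arguments, which is the sh-side analogue of the parse errors SPARK-21877 describes on Windows.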
Github user HyukjinKwon commented on a diff in the pull request: https://github.com/apache/spark/pull/19808#discussion_r152951398

--- Diff: bin/spark-sql.cmd ---
@@ -0,0 +1,25 @@
...
+rem The outermost quotes are used to prevent Windows command line parse error
+rem when there are some quotes in parameters, see SPARK-21877.
+cmd /V /E /C ""%~dp0spark-sql2.cmd" %*"
--- End diff --

AFAIK, `export` in bash affects its subprocesses, so it doesn't affect the parent. I somehow knew this but hadn't thought about it deeply before. I think this is just a difference between OSes; Windows may simply be missing the concept of forking.
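The bash behavior described here can be shown in a few lines: an exported variable is copied into child processes, but a change made in a child never propagates back to the parent. A minimal sketch (variable name is illustrative):

```shell
#!/bin/sh
FOO=parent

# The subshell (a child process) can set and export FOO, but the change
# is confined to the child.
(
  FOO=child
  export FOO
  echo "in subshell: $FOO"   # prints: in subshell: child
)

echo "in parent:   $FOO"     # prints: in parent:   parent
```

On Windows, `call other.cmd` runs in the same interpreter rather than a forked child, which is why the wrapper script has to launch a fresh `cmd` to get the same isolation.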
Github user cloud-fan commented on a diff in the pull request: https://github.com/apache/spark/pull/19808#discussion_r152947466

--- Diff: bin/spark-sql.cmd ---
@@ -0,0 +1,25 @@
...
+rem The outermost quotes are used to prevent Windows command line parse error
+rem when there are some quotes in parameters, see SPARK-21877.
+cmd /V /E /C ""%~dp0spark-sql2.cmd" %*"
--- End diff --

Just curious: why doesn't the shell script need a separate script?
Github user HyukjinKwon commented on a diff in the pull request: https://github.com/apache/spark/pull/19808#discussion_r152936109

--- Diff: bin/sparkR2.cmd ---
@@ -21,6 +21,5 @@
 rem Figure out where the Spark framework is installed
 call "%~dp0find-spark-home.cmd"
 call "%SPARK_HOME%\bin\load-spark-env.cmd"
-
-
+set _SPARK_CMD_USAGE=Usage: .\bin\sparkR [options]
 call "%SPARK_HOME%\bin\spark-submit2.cmd" sparkr-shell-main %*
--- End diff --

Just checked; it prints:

```cmd
C:\...>.\bin\sparkR --help
Usage: .\bin\sparkR [options]

Options:
  --master MASTER_URL    spark://host:port, mesos://host:port, yarn, or loc
...
```
Github user HyukjinKwon commented on a diff in the pull request: https://github.com/apache/spark/pull/19808#discussion_r152919825

--- Diff: bin/spark-sql2.cmd ---
@@ -0,0 +1,25 @@
+@echo off
+
+rem
+rem Licensed to the Apache Software Foundation (ASF) under one or more
+rem contributor license agreements. See the NOTICE file distributed with
+rem this work for additional information regarding copyright ownership.
+rem The ASF licenses this file to You under the Apache License, Version 2.0
+rem (the "License"); you may not use this file except in compliance with
+rem the License. You may obtain a copy of the License at
+rem
+rem    http://www.apache.org/licenses/LICENSE-2.0
+rem
+rem Unless required by applicable law or agreed to in writing, software
+rem distributed under the License is distributed on an "AS IS" BASIS,
+rem WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+rem See the License for the specific language governing permissions and
+rem limitations under the License.
+rem
+
+rem Figure out where the Spark framework is installed
+call "%~dp0find-spark-home.cmd"
+
+set _SPARK_CMD_USAGE=Usage: .\bin\spark-sql [options] [cli option]
--- End diff --

Yup, omitting `.cmd` works fine. It is just like `python.exe` or `pyspark.cmd`.
Github user viirya commented on a diff in the pull request: https://github.com/apache/spark/pull/19808#discussion_r152919393

--- Diff: bin/spark-sql2.cmd ---
@@ -0,0 +1,25 @@
...
+set _SPARK_CMD_USAGE=Usage: .\bin\spark-sql [options] [cli option]
--- End diff --

Missing `.cmd`?
Github user viirya commented on a diff in the pull request: https://github.com/apache/spark/pull/19808#discussion_r152919445

--- Diff: bin/run-example.cmd ---
@@ -20,7 +20,7 @@ rem
 rem Figure out where the Spark framework is installed
 call "%~dp0find-spark-home.cmd"
-set _SPARK_CMD_USAGE=Usage: ./bin/run-example [options] example-class [example args]
+set _SPARK_CMD_USAGE=Usage: .\bin\run-example [options] example-class [example args]
--- End diff --

Missing `.cmd` too?
Github user viirya commented on a diff in the pull request: https://github.com/apache/spark/pull/19808#discussion_r152917404

--- Diff: bin/spark-sql2.cmd ---
@@ -0,0 +1,25 @@
...
+set _SPARK_CMD_USAGE="Usage: .\bin\spark-sql [options] [cli option]"
--- End diff --

By the way, `_SPARK_CMD_USAGE` in `bin/run-example.cmd` is also wrong; maybe fix it together?
Github user viirya commented on a diff in the pull request: https://github.com/apache/spark/pull/19808#discussion_r152917177

--- Diff: bin/spark-sql2.cmd ---
@@ -0,0 +1,25 @@
...
+set _SPARK_CMD_USAGE="Usage: .\bin\spark-sql [options] [cli option]"
--- End diff --

`set _SPARK_CMD_USAGE=Usage: .\bin\spark-sql.cmd [options] [cli option]`?
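The quoting issue here stems from a difference between the two shells: in cmd, `set X="value"` keeps the quote characters as part of the value, whereas in POSIX sh the quotes in an assignment are syntax and are stripped before the value is stored. A small sh sketch of the contrasting behavior (variable name is illustrative):

```shell
#!/bin/sh
# In POSIX sh, these quotes are removed by the shell before assignment,
# so $USAGE contains no literal quote characters. In cmd,
# `set _SPARK_CMD_USAGE="Usage: ..."` would keep them, which is why the
# quotes had to be dropped from the .cmd scripts.
USAGE="Usage: .\bin\spark-sql [options] [cli option]"
printf '%s\n' "$USAGE"
# prints: Usage: .\bin\spark-sql [options] [cli option]
```

`printf '%s\n'` is used instead of `echo` because some `echo` implementations interpret backslash escapes such as `\b` in the Windows-style path.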
Github user HyukjinKwon commented on a diff in the pull request: https://github.com/apache/spark/pull/19808#discussion_r152891440

--- Diff: bin/find-spark-home.cmd ---
@@ -32,7 +32,7 @@ if not "x%PYSPARK_PYTHON%"=="x" (
 )

 rem If there is python installed, trying to use the root dir as SPARK_HOME
-where %PYTHON_RUNNER% > nul 2>$1
+where %PYTHON_RUNNER% > nul 2>&1
--- End diff --

This was a mistake. It should be `&`; otherwise the command creates an empty file called `$1`.
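The corrected line silences both output streams and leaves only the exit status of `where` to test. The same pattern exists in POSIX sh, sketched below with `sh` standing in for the probed program:

```shell
#!/bin/sh
# `>/dev/null 2>&1` sends stdout to /dev/null and then points stderr at
# the same destination. The typo'd form `2>$1` would instead redirect
# stderr to a file (named literally `$1` in cmd), leaving debris on disk.
if command -v sh >/dev/null 2>&1; then
  echo "found"
else
  echo "not found"
fi
# prints: found
```

Only the exit status of `command -v` drives the branch, mirroring how `find-spark-home.cmd` uses the result of `where` without wanting any of its output.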
Github user HyukjinKwon commented on a diff in the pull request: https://github.com/apache/spark/pull/19808#discussion_r152891405

--- Diff: bin/spark-sql.cmd ---
@@ -0,0 +1,25 @@
...
+rem The outermost quotes are used to prevent Windows command line parse error
+rem when there are some quotes in parameters, see SPARK-21877.
+cmd /V /E /C ""%~dp0spark-sql2.cmd" %*"
--- End diff --

A separate script is required. Otherwise it would actually set the environment variable `SPARK_HOME` in the caller's session. See SPARK-3943; I also tested it manually.
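The difference can be mimicked in POSIX sh: cmd's `call other.cmd` behaves like sourcing (`.`), so variables set by the called script land in the caller, while launching a fresh interpreter keeps them isolated. A sketch under that analogy (the temp file and variable name are illustrative):

```shell
#!/bin/sh
# Write a tiny script that sets a variable, like spark-sql2.cmd setting
# SPARK_HOME and friends.
script=$(mktemp)
echo 'DEMO_VAR=polluted' > "$script"

. "$script"                      # like `call`: leaks into this shell
echo "after sourcing: ${DEMO_VAR:-unset}"   # prints: after sourcing: polluted

unset DEMO_VAR
sh "$script"                     # like a fresh `cmd /C`: isolated child
echo "after child:    ${DEMO_VAR:-unset}"   # prints: after child:    unset

rm -f "$script"
```

This is why the thin `spark-sql.cmd` wrapper exists at all: the real work happens in `spark-sql2.cmd` inside a new `cmd`, so none of its environment changes survive in the user's console.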
GitHub user HyukjinKwon opened a pull request: https://github.com/apache/spark/pull/19808

[SPARK-22597][SQL] Add spark-sql cmd script for Windows users

## What changes were proposed in this pull request?

This PR proposes to add cmd scripts so that Windows users can also run the `spark-sql` script.

## How was this patch tested?

Manually tested on Windows.

Before:

```cmd
C:\...\spark>.\bin\spark-sql
'.\bin\spark-sql' is not recognized as an internal or external command,
operable program or batch file.

C:\...\spark>.\bin\spark-sql.cmd
'.\bin\spark-sql.cmd' is not recognized as an internal or external command,
operable program or batch file.
```

After:

```cmd
C:\...\spark>.\bin\spark-sql
...
spark-sql> SELECT 'Hello World !!';
...
Hello World !!
Time taken: 4.022 seconds, Fetched 1 row(s)
```

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/HyukjinKwon/spark spark-sql-cmd

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/19808.patch

To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message:

    This closes #19808

commit d1ddede1ac34deb3732670e7425ec0b84b6d0508
Author: hyukjinkwon
Date: 2017-11-24T01:10:47Z

    Add spark-sql script for Windows users