Github user tritab commented on the pull request:

    https://github.com/apache/spark/pull/10789#issuecomment-173086028
  
    I made a number of updates to all of the .cmd files; @tsudukim, please
review to see how they work in your environment. They appear to handle a Spark
install under "Program Files" now. However, I have a couple of outstanding
concerns that could use some work.
    
    1. If you ctrl-c out of a command, the prompt is left in the spark
directory rather than where you started. This isn't optimal.
    2. pyspark wasn't working for me, but I have Python 3.4 installed, which
may be part of the issue.
    3. I wasn't able to test R, so that may not be fully functional.
    
    Any feedback is welcome. 
    
    @JoshRosen I looked briefly, but wasn't able to find a tool similar to 
shellcheck for Windows. That certainly would be helpful though. 


