Rafnel commented on code in PR #43706:
URL: https://github.com/apache/spark/pull/43706#discussion_r1534103893


##########
bin/spark-class2.cmd:
##########
@@ -64,12 +64,9 @@ if not "x%JAVA_HOME%"=="x" (
 rem The launcher library prints the command to be executed in a single line suitable for being
 rem executed by the batch interpreter. So read all the output of the launcher into a variable.
 :gen
-set LAUNCHER_OUTPUT=%temp%\spark-class-launcher-output-%RANDOM%%TIME::=0%.txt
-rem Remove space between the RANDOM and TIME output
-set LAUNCHER_OUTPUT=%LAUNCHER_OUTPUT: =%
-rem SPARK-28302: %RANDOM% would return the same number if we call it instantly after last call,
-rem so we should make it sure to generate unique file to avoid process collision of writing into
-rem the same file concurrently.
+FOR /F %%a IN ('POWERSHELL -COMMAND "$([guid]::NewGuid().ToString())"') DO (SET NEWGUID=%%a)

Review Comment:
   > In addition, the current logic strongly relies on `Powershell`. I am 
worried about that the lack of `Powershell` on `older` versions of Windows will 
cause launch Spark failures. Can we `eliminate` this strong dependence?
   
   As for this concern, I think this is the most lightweight way to generate a 
GUID in a .cmd script (without installing any dependencies). Powershell is 
available in Windows 7+ and Windows Server 2008R2+. I would be surprised if 
there were users running Spark on Windows XP/Vista or Server versions older 
than 2008. In its current state, you can't reliably instantiate multiple Spark 
instances on Windows, so I think it's a worthwhile tradeoff.
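   
   If the dependency is still a concern, one option would be to probe for PowerShell and fall back to the old `%RANDOM%`/`%TIME%`-based name when it is absent. This is only a sketch (untested, and it assumes `where` is on the PATH, which holds on Windows 7+ / Server 2008R2+):
   
   ```bat
   rem Sketch: prefer a PowerShell GUID; fall back to the legacy
   rem RANDOM/TIME-based name if PowerShell is not installed.
   where /q powershell
   if %ERRORLEVEL% equ 0 (
     for /f %%a in ('powershell -command "$([guid]::NewGuid().ToString())"') do (set NEWGUID=%%a)
   ) else (
     rem Legacy scheme from the removed lines; still subject to the
     rem SPARK-28302 collision risk, but only used as a last resort.
     set NEWGUID=%RANDOM%%TIME::=0%
     rem Remove the space between the RANDOM and TIME output
     set NEWGUID=%NEWGUID: =%
   )
   set LAUNCHER_OUTPUT=%temp%\spark-class-launcher-output-%NEWGUID%.txt
   ```
   
   That keeps the GUID path as the default while degrading gracefully on very old systems.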



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

