[ 
https://issues.apache.org/jira/browse/BEAM-9872?focusedWorklogId=445155&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-445155
 ]

ASF GitHub Bot logged work on BEAM-9872:
----------------------------------------

                Author: ASF GitHub Bot
            Created on: 12/Jun/20 20:37
            Start Date: 12/Jun/20 20:37
    Worklog Time Spent: 10m 
      Work Description: ibzib commented on a change in pull request #12002:
URL: https://github.com/apache/beam/pull/12002#discussion_r439631658



##########
File path: sdks/python/test-suites/portable/common.gradle
##########
@@ -99,34 +101,110 @@ task flinkTriggerTranscript() {
   }
 }
 
+
+task createProcessWorker {
+  dependsOn ':sdks:python:container:build'
+  dependsOn 'setupVirtualenv'
+  def sdkWorkerFile = file("${buildDir}/sdk_worker.sh")
+  def osType = 'linux'
+  if (Os.isFamily(Os.FAMILY_MAC))
+    osType = 'darwin'
+  def workerScript = "${project(":sdks:python:container:").buildDir.absolutePath}/target/launcher/${osType}_amd64/boot"
+  def sdkWorkerFileCode = "sh -c \"pip=`which pip` . ${envdir}/bin/activate && ${workerScript} \$* \""
+  outputs.file sdkWorkerFile
+  doLast {
+    sdkWorkerFile.write sdkWorkerFileCode
+    exec {
+      commandLine('sh', '-c', ". ${envdir}/bin/activate && cd ${pythonRootDir} && pip install -e .[test]")
+    }
+    exec {
+      commandLine('chmod', '+x', sdkWorkerFile)
+    }
+  }
+}
+
+def sparkCompatibilityMatrix = {
+  def config = it ? it as CompatibilityMatrixConfig : new CompatibilityMatrixConfig()
+  def workerType = config.workerType.name()
+  def streaming = config.streaming
+  // def environment_config = config.workerType == CompatibilityMatrixConfig.SDK_WORKER_TYPE.PROCESS ? "--environment_config='{\"command\": \"${buildDir.absolutePath}/sdk_worker.sh\"}'" : ""
+  def name = "sparkCompatibilityMatrix${streaming ? 'Streaming' : 'Batch'}${config.preOptimize ? 'PreOptimize' : ''}${workerType}"
+  def extra_experiments = []
+  if (config.preOptimize)
+    extra_experiments.add('pre_optimize=all')
+  tasks.create(name: name) {
+    dependsOn 'createProcessWorker'
+    dependsOn 'setupVirtualenv'
+    dependsOn ':runners:spark:job-server:shadowJar'
+    doLast {
+      def environment_config = "'{\"command\": \"${buildDir.absolutePath}/sdk_worker.sh\"}'"
+      def argMap = [
+              "environment_type"    : "PROCESS",

Review comment:
       You'll need to change `environment_type` and `environment_config` according to the value of `workerType`. As written, the task always passes `"PROCESS"` and the process-worker command, even when the matrix entry requests a different worker type.
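
One possible shape for that change, as a rough sketch only (the `DOCKER` branch, the enum comparison, and leaving `environment_config` empty for the Docker case are assumptions for illustration, not code from this PR):

```groovy
// Hypothetical sketch: derive both settings from workerType instead of
// hard-coding "PROCESS". Enum names mirror the commented-out line above;
// the DOCKER fallback behavior is an assumption.
def isProcess = config.workerType == CompatibilityMatrixConfig.SDK_WORKER_TYPE.PROCESS
def environment_type = isProcess ? "PROCESS" : "DOCKER"
def environment_config = isProcess
    ? "'{\"command\": \"${buildDir.absolutePath}/sdk_worker.sh\"}'"
    : ""  // assumed: DOCKER gets its config from the container image instead
def argMap = [
        "environment_type": environment_type,
]
// Only pass environment_config when there is one to pass.
if (environment_config) {
  argMap["environment_config"] = environment_config
}
```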




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]


Issue Time Tracking
-------------------

    Worklog Id:     (was: 445155)
    Time Spent: 1h 20m  (was: 1h 10m)

> Upgrade Spark Python validates runner tests to Python 3
> -------------------------------------------------------
>
>                 Key: BEAM-9872
>                 URL: https://issues.apache.org/jira/browse/BEAM-9872
>             Project: Beam
>          Issue Type: Improvement
>          Components: runner-spark, testing
>            Reporter: Kyle Weaver
>            Assignee: Anna Qin
>            Priority: P2
>              Labels: portability-spark
>          Time Spent: 1h 20m
>  Remaining Estimate: 0h
>
> beam_PostCommit_Python_VR_Spark is only running on Python 2.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
