xiedidan opened a new issue, #17633: URL: https://github.com/apache/dolphinscheduler/issues/17633
### Search before asking

- [x] I had searched in the [issues](https://github.com/apache/dolphinscheduler/issues?q=is%3Aissue) and found no similar issues.

### What happened

I'm deploying DolphinScheduler 3.2.2 with docker-compose. I have installed python3, placed DataX under `/opt/soft`, and created an environment called `datax` with the following exports:

```
export PYTHON_HOME=/usr/bin/python3
export PYTHON_LAUNCHER=/usr/bin/python3
export DATAX_HOME=/opt/soft/datax/bin/datax.py
```

But the DataX task instance failed:

```
[LOG-PATH]: /opt/dolphinscheduler/logs/20251031/156139386499360/2/4/4.log, [HOST]: 172.21.0.6:1234
[INFO] 2025-10-31 19:37:34.813 +0800 - ***********************************************************************************************
[INFO] 2025-10-31 19:37:34.817 +0800 - ********************************* Initialize task context ***********************************
[INFO] 2025-10-31 19:37:34.818 +0800 - ***********************************************************************************************
[INFO] 2025-10-31 19:37:34.818 +0800 - Begin to initialize task
[INFO] 2025-10-31 19:37:34.819 +0800 - Set task startTime: 1761910654818
[INFO] 2025-10-31 19:37:34.819 +0800 - Set task appId: 4_4
[INFO] 2025-10-31 19:37:34.820 +0800 - End initialize task { "taskInstanceId" : 4, "taskName" : "test", "firstSubmitTime" : 1761910654785, "startTime" : 1761910654818, "taskType" : "DATAX", "workflowInstanceHost" : "172.21.0.7:5678", "host" : "172.21.0.6:1234", "logPath" : "/opt/dolphinscheduler/logs/20251031/156139386499360/2/4/4.log", "processId" : 0, "processDefineCode" : 156139386499360, "processDefineVersion" : 2, "processInstanceId" : 4, "scheduleTime" : 0, "executorId" : 1, "cmdTypeIfComplement" : 0, "tenantCode" : "supermicro", "processDefineId" : 0, "projectId" : 0, "projectCode" : 156139270707488, "taskParams" : "{\"localParams\":[],\"resourceList\":[],\"customConfig\":0,\"dsType\":\"POSTGRESQL\",\"dataSource\":1,\"dtType\":\"POSTGRESQL\",\"dataTarget\":1,\"sql\":\"select '1' as name;\",\"targetTable\":\"test\",\"jobSpeedByte\":0,\"jobSpeedRecord\":1000,\"preStatements\":[],\"postStatements\":[],\"xms\":1,\"xmx\":1}", "environmentConfig" : "export PYTHON_HOME=/usr/bin/python3\nexport PYTHON_LAUNCHER=/usr/bin/python3\nexport DATAX_HOME=/opt/soft/datax/bin/datax.py", "prepareParamsMap" : { "system.task.definition.name" : { "prop" : "system.task.definition.name", "direct" : "IN", "type" : "VARCHAR", "value" : "test" }, "system.project.name" : { "prop" : "system.project.name", "direct" : "IN", "type" : "VARCHAR", "value" : null }, "system.project.code" : { "prop" : "system.project.code", "direct" : "IN", "type" : "VARCHAR", "value" : "156139270707488" }, "system.workflow.instance.id" : { "prop" : "system.workflow.instance.id", "direct" : "IN", "type" : "VARCHAR", "value" : "4" }, "system.biz.curdate" : { "prop" : "system.biz.curdate", "direct" : "IN", "type" : "VARCHAR", "value" : "20251031" }, "system.biz.date" : { "prop" : "system.biz.date", "direct" : "IN", "type" : "VARCHAR", "value" : "20251030" }, "system.task.instance.id" : { "prop" : "system.task.instance.id", "direct" : "IN", "type" : "VARCHAR", "value" : "4" }, "system.workflow.definition.name" : { "prop" : "system.workflow.definition.name", "direct" : "IN", "type" : "VARCHAR", "value" : "test-datax" }, "system.task.definition.code" : { "prop" : "system.task.definition.code", "direct" : "IN", "type" : "VARCHAR", "value" : "156139351336224" }, "system.workflow.definition.code" : { "prop" : "system.workflow.definition.code", "direct" : "IN", "type" : "VARCHAR", "value" : "156139386499360" }, "system.datetime" : { "prop" : "system.datetime", "direct" : "IN", "type" : "VARCHAR", "value" : "20251031193734" } }, "taskAppId" : "4_4", "taskTimeout" : 2147483647, "workerGroup" : "default", "currentExecutionStatus" : "SUBMITTED_SUCCESS", "resourceParametersHelper" : { "resourceMap" : { "DATASOURCE" : { "1" : { "resourceType" : "DATASOURCE", "type" : "POSTGRESQL", "connectionParams" : "{\"user\":\"root\",\"password\":\"****\",\"address\":\"jdbc:postgresql://192.168.101.15:15433\",\"database\":\"postgres\",\"jdbcUrl\":\"jdbc:postgresql://192.168.101.15:15433/postgres\",\"driverClassName\":\"org.postgresql.Driver\",\"validationQuery\":\"select version()\"}", "DATASOURCE" : null } } } }, "endTime" : 0, "dryRun" : 0, "paramsMap" : { }, "cpuQuota" : -1, "memoryMax" : -1, "testFlag" : 0, "logBufferEnable" : false, "dispatchFailTimes" : 0 }
[INFO] 2025-10-31 19:37:34.822 +0800 - ***********************************************************************************************
[INFO] 2025-10-31 19:37:34.823 +0800 - ********************************* Load task instance plugin *********************************
[INFO] 2025-10-31 19:37:34.824 +0800 - ***********************************************************************************************
[INFO] 2025-10-31 19:37:34.833 +0800 - Send task status RUNNING_EXECUTION master: 172.21.0.6:1234
[INFO] 2025-10-31 19:37:34.834 +0800 - TenantCode: supermicro check successfully
[INFO] 2025-10-31 19:37:34.835 +0800 - WorkflowInstanceExecDir: /tmp/dolphinscheduler/exec/process/supermicro/156139270707488/156139386499360_2/4/4 check successfully
[INFO] 2025-10-31 19:37:34.836 +0800 - Create TaskChannel: org.apache.dolphinscheduler.plugin.task.datax.DataxTaskChannel successfully
[INFO] 2025-10-31 19:37:34.836 +0800 - Download resources successfully: ResourceContext(resourceItemMap={})
[INFO] 2025-10-31 19:37:34.837 +0800 - Download upstream files: [] successfully
[INFO] 2025-10-31 19:37:34.837 +0800 - Task plugin instance: DATAX create successfully
[INFO] 2025-10-31 19:37:34.838 +0800 - Initialize datax task params { "localParams" : [ ], "varPool" : [ ], "customConfig" : 0, "json" : null, "dsType" : "POSTGRESQL", "dataSource" : 1, "dtType" : "POSTGRESQL", "dataTarget" : 1, "sql" : "select '1' as name;", "targetTable" : "test", "preStatements" : [ ], "postStatements" : [ ], "jobSpeedByte" : 0, "jobSpeedRecord" : 1000, "xms" : 1, "xmx" : 1, "resourceList" : [ ] }
[INFO] 2025-10-31 19:37:34.839 +0800 - Success initialized task plugin instance successfully
[INFO] 2025-10-31 19:37:34.839 +0800 - Set taskVarPool: null successfully
[INFO] 2025-10-31 19:37:34.840 +0800 - ***********************************************************************************************
[INFO] 2025-10-31 19:37:34.840 +0800 - ********************************* Execute task instance *************************************
[INFO] 2025-10-31 19:37:34.841 +0800 - ***********************************************************************************************
[INFO] 2025-10-31 19:37:34.850 +0800 - Final Shell file is:
[INFO] 2025-10-31 19:37:34.850 +0800 - ****************************** Script Content *****************************************************************
[INFO] 2025-10-31 19:37:34.851 +0800 - #!/bin/bash
BASEDIR=$(cd `dirname $0`; pwd)
cd $BASEDIR
export PYTHON_HOME=/usr/bin/python3
export PYTHON_LAUNCHER=/usr/bin/python3
export DATAX_HOME=/opt/soft/datax/bin/datax.py
${PYTHON_LAUNCHER} ${DATAX_LAUNCHER} --jvm="-Xms1G -Xmx1G" -p "-Dsystem.task.definition.name='test' -Dsystem.project.name='null' -Dsystem.project.code='156139270707488' -Dsystem.workflow.instance.id='4' -Dsystem.biz.curdate='20251031' -Dsystem.biz.date='20251030' -Dsystem.task.instance.id='4' -Dsystem.workflow.definition.name='test-datax' -Dsystem.task.definition.code='156139351336224' -Dsystem.workflow.definition.code='156139386499360' -Dsystem.datetime='20251031193734'" /tmp/dolphinscheduler/exec/process/supermicro/156139270707488/156139386499360_2/4/4/4_4_job.json
[INFO] 2025-10-31 19:37:34.852 +0800 - ****************************** Script Content *****************************************************************
[INFO] 2025-10-31 19:37:34.852 +0800 - Executing shell command : sudo -u supermicro -i /tmp/dolphinscheduler/exec/process/supermicro/156139270707488/156139386499360_2/4/4/4_4.sh
[INFO] 2025-10-31 19:37:34.858 +0800 - process start, process id is: 4118
[INFO] 2025-10-31 19:37:35.859 +0800 - -> unknown option --jvm=-Xms1G -Xmx1G
usage: /usr/bin/python3 [option] ... [-c cmd | -m mod | file | -] [arg] ...
Try `python -h' for more information.
[INFO] 2025-10-31 19:37:35.861 +0800 - process has exited. execute path:/tmp/dolphinscheduler/exec/process/supermicro/156139270707488/156139386499360_2/4/4, processId:4118 ,exitStatusCode:2 ,processWaitForStatus:true ,processExitValue:2
[INFO] 2025-10-31 19:37:35.862 +0800 - ***********************************************************************************************
[INFO] 2025-10-31 19:37:35.863 +0800 - ********************************* Finalize task instance ************************************
[INFO] 2025-10-31 19:37:35.863 +0800 - ***********************************************************************************************
[INFO] 2025-10-31 19:37:35.864 +0800 - Upload output files: [] successfully
[INFO] 2025-10-31 19:37:35.870 +0800 - Send task execute status: FAILURE to master : 172.21.0.6:1234
[INFO] 2025-10-31 19:37:35.871 +0800 - Remove the current task execute context from worker cache
[INFO] 2025-10-31 19:37:35.871 +0800 - The current execute mode isn't develop mode, will clear the task execute file: /tmp/dolphinscheduler/exec/process/supermicro/156139270707488/156139386499360_2/4/4
[INFO] 2025-10-31 19:37:35.873 +0800 - Success clear the task execute file: /tmp/dolphinscheduler/exec/process/supermicro/156139270707488/156139386499360_2/4/4
[INFO] 2025-10-31 19:37:35.874 +0800 - FINALIZE_SESSION
```

It seems python3 itself was invoked with the JVM parameters, which caused the failure. Note that the generated script runs `${PYTHON_LAUNCHER} ${DATAX_LAUNCHER} --jvm=...`, but the environment only exports `DATAX_HOME`, so `${DATAX_LAUNCHER}` expands to nothing and `--jvm=...` becomes the first argument to python3:

```
[INFO] 2025-10-31 19:37:35.859 +0800 - -> unknown option --jvm=-Xms1G -Xmx1G
usage: /usr/bin/python3 [option] ... [-c cmd | -m mod | file | -] [arg] ...
Try `python -h' for more information.
```

### What you expected to happen

The DataX task should run successfully.

### How to reproduce

Copy DataX into the worker container:

```
docker cp datax docker-dolphinscheduler-worker-1:/opt/soft/
```

Inside docker-dolphinscheduler-worker-1, install python3:

```
sudo apt install python3
```

Then create an environment called `datax`:

```
export PYTHON_HOME=/usr/bin/python3
export PYTHON_LAUNCHER=/usr/bin/python3
export DATAX_HOME=/opt/soft/datax/bin/datax.py
```

Create a workflow, add a DataX node, and run it.

### Anything else

_No response_

### Version

3.2.x

### Are you willing to submit PR?

- [ ] Yes I am willing to submit a PR!

### Code of Conduct

- [x] I agree to follow this project's [Code of Conduct](https://www.apache.org/foundation/policies/conduct)
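The failure mode can be sketched outside DolphinScheduler. This is a minimal hypothetical illustration (not the scheduler's own code), assuming the generated task script expands `${DATAX_LAUNCHER}` while the environment only exports `DATAX_HOME`, as the script content in the log suggests:

```shell
#!/bin/sh
# Hypothetical sketch: only DATAX_HOME is exported, matching the `datax`
# environment above; DATAX_LAUNCHER is left unset.
unset DATAX_LAUNCHER
export DATAX_HOME=/opt/soft/datax/bin/datax.py

# The generated task script builds a command like this. With DATAX_LAUNCHER
# unset, it expands to an empty string, so python3 would receive --jvm as
# its first option and exit with "unknown option --jvm".
cmd="python3 ${DATAX_LAUNCHER} --jvm=-Xms1G"
printf '%s\n' "$cmd"
```

If the 3.2.x DataX plugin really resolves `${DATAX_LAUNCHER}`, a possible workaround may be to also export `DATAX_LAUNCHER=/opt/soft/datax/bin/datax.py` in the `datax` environment, though I have not verified this.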
