zhangshenghang commented on code in PR #10102:
URL: https://github.com/apache/seatunnel/pull/10102#discussion_r2558354424


##########
seatunnel-engine/seatunnel-engine-core/src/main/java/org/apache/seatunnel/engine/core/parse/MultipleTableJobConfigParser.java:
##########
@@ -364,15 +430,55 @@ private int getParallelism(ReadonlyConfig config) {
                         .orElse(envOptions.get(EnvCommonOptions.PARALLELISM)));
     }
 
+    @VisibleForTesting
+    public int getParallelism(ReadonlyConfig config, SeaTunnelSource<?, ?, ?> source) {
+
+        if (config.getOptional(EnvCommonOptions.PARALLELISM).isPresent()) {
+            return Math.max(1, config.get(EnvCommonOptions.PARALLELISM));
+        }
+        if (envOptions.getOptional(EnvCommonOptions.PARALLELISM).isPresent()) {
+            return Math.max(1, envOptions.get(EnvCommonOptions.PARALLELISM));
+        }
+
+        ParallelismInferenceConfig parallelismInferenceConfig =
+                engineConfig.getParallelismInferenceConfig();
+        boolean inferenceEnabled =
+                envOptions
+                        .getOptional(EnvCommonOptions.PARALLELISM_INFERENCE_ENABLED)
+                        .orElse(parallelismInferenceConfig.isEnabled());
+
+        if (inferenceEnabled && source instanceof SupportParallelismInference) {
+            int inferredParallelism = ((SupportParallelismInference) source).inferParallelism();

Review Comment:
   What if we roll the parallelism back to the default when the inferred value has issues?
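   A minimal, self-contained sketch of one way such a rollback could look, assuming "has issues" means a non-positive result or a runtime exception from inference; the interface, method, and default value below are illustrative stand-ins, not the actual SeaTunnel API:
   
   ```java
   // Illustrative only: a hypothetical stand-in for SupportParallelismInference,
   // so the sketch compiles without SeaTunnel on the classpath.
   interface ParallelismInference {
       int inferParallelism();
   }
   
   public final class ParallelismFallbackSketch {
   
       /**
        * Prefer the inferred parallelism, but roll back to the default when the
        * inferred value is non-positive or inference throws at runtime.
        */
       static int resolveParallelism(ParallelismInference inference, int defaultParallelism) {
           try {
               int inferred = inference.inferParallelism();
               if (inferred > 0) {
                   return inferred;
               }
               // Non-positive result: treat as a bad inference and fall through to the default.
           } catch (RuntimeException e) {
               // Inference failed: fall back instead of failing job submission.
           }
           return Math.max(1, defaultParallelism);
       }
   
       public static void main(String[] args) {
           System.out.println(resolveParallelism(() -> 8, 2));   // healthy inference -> 8
           System.out.println(resolveParallelism(() -> -1, 2));  // invalid result    -> 2
           System.out.println(resolveParallelism(
                   () -> { throw new IllegalStateException("inference broke"); }, 2)); // exception -> 2
       }
   }
   ```
   
   In the code path above, this could amount to guarding the `inferParallelism()` call and falling through to the existing default branch instead of returning the inferred value directly.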


