liaoxin01 commented on code in PR #56175:
URL: https://github.com/apache/doris/pull/56175#discussion_r2363520585
##########
fe/fe-core/src/main/java/org/apache/doris/qe/SessionVariable.java:
##########
@@ -4761,7 +4761,7 @@ public void readFromJson(String json) throws IOException {
field.set(this, root.get(attr.name()));
break;
case "double":
- field.set(this, root.get(attr.name()));
+ field.set(this, Double.valueOf(root.get(attr.name()).toString()));
Review Comment:
why change this?
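   For context on what the `Double.valueOf(...)` round-trip can guard against, a minimal standalone sketch (not Doris code): `Field.set` on a primitive `double` field only accepts values that unwrap and widen to `double`, so if the JSON parser hands back e.g. a `BigDecimal` or a `String` (an assumption about `readFromJson`'s parser, not something the diff states), the direct set throws:
   ```java
   // Standalone sketch, not Doris code. Field.set on a primitive double field only accepts
   // values that unwrap (Double, Integer, ...) and widen to double; a BigDecimal or String
   // coming out of the JSON parser would make the direct set throw. Whether readFromJson's
   // parser actually returns such a type here is an assumption.
   import java.lang.reflect.Field;
   import java.math.BigDecimal;

   public class DoubleFieldDemo {
       public double scaleFactor;  // stands in for a double-typed session variable

       public static void main(String[] args) throws Exception {
           DoubleFieldDemo demo = new DoubleFieldDemo();
           Field field = DoubleFieldDemo.class.getField("scaleFactor");

           Object fromJson = new BigDecimal("1.5");  // hypothetical parser output for a JSON number

           try {
               field.set(demo, fromJson);  // IllegalArgumentException: BigDecimal is not a wrapper type
           } catch (IllegalArgumentException e) {
               System.out.println("direct set failed: " + e);
           }

           // Round-tripping through toString() always yields a Double, so this assignment succeeds.
           field.set(demo, Double.valueOf(fromJson.toString()));
           System.out.println(demo.scaleFactor);  // 1.5
       }
   }
   ```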
##########
fe/fe-core/src/main/java/org/apache/doris/job/manager/JobManager.java:
##########
@@ -328,6 +373,18 @@ public void replayUpdateJob(T job) {
.add("msg", "replay update scheduler job").build());
}
+ public void replayUpdateStreamingJob(AlterStreamingJobOperationLog log) {
Review Comment:
Can the update operations for streaming jobs and other jobs be unified
without adding a new AlterStreamingJobOperationLog?
##########
fe/fe-core/src/main/java/org/apache/doris/datasource/property/storage/StorageProperties.java:
##########
@@ -146,17 +146,62 @@ public static List<StorageProperties> createAll(Map<String, String> origProps) t
* @throws RuntimeException if no supported storage type is found
*/
public static StorageProperties createPrimary(Map<String, String> origProps) {
- for (Function<Map<String, String>, StorageProperties> func : PROVIDERS) {
- StorageProperties p = func.apply(origProps);
- if (p != null) {
- p.initNormalizeAndCheckProps();
- p.initializeHadoopStorageConfig();
- return p;
+ StorageProperties p = createPrimaryInternal(origProps);
Review Comment:
This modification may not be needed.
##########
fe/fe-core/src/main/java/org/apache/doris/nereids/trees/plans/commands/insert/OlapInsertExecutor.java:
##########
@@ -224,6 +226,18 @@ protected void onComplete() throws UserException {
}
}
+ private void setTxnCallbackId() {
Review Comment:
Does the cloud mode also follow this branch?
##########
fe/fe-core/src/main/java/org/apache/doris/fs/obj/S3ObjStorage.java:
##########
@@ -528,12 +529,50 @@ ListObjectsV2Response listObjectsV2(ListObjectsV2Request request) throws UserExc
* Copy from `AzureObjStorage.GlobList`
*/
public Status globList(String remotePath, List<RemoteFile> result, boolean fileNameOnly) {
+ GlobListResult globListResult = globListInternal(remotePath, result, fileNameOnly, null, -1, -1);
+ return globListResult.getStatus();
+ }
+
+ /**
+ * List all files under the given path with glob pattern.
+ * For example, if the path is "s3://bucket/path/to/*.csv",
+ * it will list all files under "s3://bucket/path/to/" with ".csv" suffix.
+ * <p>
+ * Limit: Starting from startFile, until the total file size is greater than fileSizeLimit,
+ * or the number of files is greater than fileNumLimit.
+ *
+ * @return The largest file name after listObject this time
+ */
+ public String globListWithLimit(String remotePath, List<RemoteFile> result, String startFile,
+ long fileSizeLimit, long fileNumLimit) {
+ GlobListResult globListResult = globListInternal(remotePath, result, true, startFile, fileSizeLimit,
+ fileNumLimit);
+ return globListResult.getMaxFile();
+ }
+
+ /**
+ * List all files under the given path with glob pattern.
+ * For example, if the path is "s3://bucket/path/to/*.csv",
+ * it will list all files under "s3://bucket/path/to/" with ".csv" suffix.
+ * <p>
+ * Copy from `AzureObjStorage.GlobList`
+ */
+ private GlobListResult globListInternal(String remotePath, List<RemoteFile> result, boolean fileNameOnly,
Review Comment:
Are there any restrictions on the path of the job URI? For example, can paths like `s3://bucket/*/ab/*` and `s3://bucket/abc.csv` be used?
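   Not an answer to the glob question, but for reference a hedged usage sketch of the new `globListWithLimit` method, based only on the javadoc in the diff above; the batching loop, the `null` start value, and the `RemoteFile` import path are assumptions rather than verified behavior:
   ```java
   // Hedged usage sketch for the globListWithLimit API shown in the diff above.
   // S3ObjStorage construction is omitted, the Status of each call is not checked,
   // and the org.apache.doris.fs.remote.RemoteFile import path is assumed.
   import java.util.ArrayList;
   import java.util.List;

   import org.apache.doris.fs.obj.S3ObjStorage;
   import org.apache.doris.fs.remote.RemoteFile;

   public class GlobListWithLimitSketch {
       public static void listInBatches(S3ObjStorage storage) {
           String pattern = "s3://bucket/path/to/*.csv";  // glob example taken from the javadoc
           String startFile = null;                       // assumed: null means "start from the beginning"
           long fileSizeLimit = 10L * 1024 * 1024 * 1024; // stop a batch once total size exceeds 10 GB
           long fileNumLimit = 1000;                      // or once more than 1000 files are collected

           while (true) {
               List<RemoteFile> batch = new ArrayList<>();
               // Per the javadoc: lists matching files starting from startFile until one of the
               // limits is exceeded, and returns the largest file name seen in this round.
               String maxFile = storage.globListWithLimit(pattern, batch, startFile,
                       fileSizeLimit, fileNumLimit);
               if (batch.isEmpty()) {
                   break;  // assumed: an empty batch means nothing new matched the pattern
               }
               // ... hand this batch to the streaming job for loading ...
               startFile = maxFile;  // assumed: resume after the largest file already listed
           }
       }
   }
   ```
   The idea sketched here is that the returned max file name is fed back as the next `startFile` to page through a large prefix; whether the PR intends exactly this pattern is not confirmed by the diff.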
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]