This is an automated email from the ASF dual-hosted git repository.
hxd pushed a commit to branch rel/0.11
in repository https://gitbox.apache.org/repos/asf/iotdb.git
The following commit(s) were added to refs/heads/rel/0.11 by this push:
new 9b89eb1 using session to load data from csv; support long format or string format for timestamp column (#2430)
9b89eb1 is described below
commit 9b89eb1cf5c8c648a4653d9db583a4f79ef5a856
Author: Boris <[email protected]>
AuthorDate: Thu Jan 7 20:52:05 2021 +0800
using session to load data from csv; support long format or string format for timestamp column (#2430)
---
README.md | 54 ---
README_ZH.md | 56 ---
cli/pom.xml | 5 +
.../java/org/apache/iotdb/cli/AbstractCli.java | 11 +-
.../org/apache/iotdb/tool/AbstractCsvTool.java | 44 +-
.../main/java/org/apache/iotdb/tool/ExportCsv.java | 211 ++++-----
.../main/java/org/apache/iotdb/tool/ImportCsv.java | 526 ++++++++-------------
.../java/org/apache/iotdb/cli/AbstractScript.java | 3 +-
.../org/apache/iotdb/tool/ExportCsvTestIT.java | 12 +-
.../org/apache/iotdb/tool/ImportCsvTestIT.java | 12 +-
docs/UserGuide/System Tools/CSV Tool.md | 85 ++++
docs/zh/UserGuide/System Tools/CSV Tool.md | 88 ++++
.../main/java/org/apache/iotdb/rpc/RpcUtils.java | 1 -
13 files changed, 521 insertions(+), 587 deletions(-)
diff --git a/README.md b/README.md
index 0b35840..e71ec0c 100644
--- a/README.md
+++ b/README.md
@@ -333,59 +333,5 @@ Under the root path of iotdb:
After being built, the IoTDB cli is located at the folder "cli/target/iotdb-cli-{project.version}".
-## Usage of import-csv.sh
-
-### Create metadata
-```
-SET STORAGE GROUP TO root.fit.d1;
-SET STORAGE GROUP TO root.fit.d2;
-SET STORAGE GROUP TO root.fit.p;
-CREATE TIMESERIES root.fit.d1.s1 WITH DATATYPE=INT32,ENCODING=RLE;
-CREATE TIMESERIES root.fit.d1.s2 WITH DATATYPE=TEXT,ENCODING=PLAIN;
-CREATE TIMESERIES root.fit.d2.s1 WITH DATATYPE=INT32,ENCODING=RLE;
-CREATE TIMESERIES root.fit.d2.s3 WITH DATATYPE=INT32,ENCODING=RLE;
-CREATE TIMESERIES root.fit.p.s1 WITH DATATYPE=INT32,ENCODING=RLE;
-```
-
-### An example of import csv file
-
-```
-Time,root.fit.d1.s1,root.fit.d1.s2,root.fit.d2.s1,root.fit.d2.s3,root.fit.p.s1
-1,100,'hello',200,300,400
-2,500,'world',600,700,800
-3,900,'IoTDB',1000,1100,1200
-```
-
-### Run import shell
-```
-# Unix/OS X
-> tools/import-csv.sh -h <ip> -p <port> -u <username> -pw <password> -f <xxx.csv>
-
-# Windows
-> tools\import-csv.bat -h <ip> -p <port> -u <username> -pw <password> -f <xxx.csv>
-```
-
-### Error data file
-
-`csvInsertError.error`
-
-## Usage of export-csv.sh
-
-### Run export shell
-```
-# Unix/OS X
-> tools/export-csv.sh -h <ip> -p <port> -u <username> -pw <password> -td <directory> [-tf <time-format>]
-
-# Windows
-> tools\export-csv.bat -h <ip> -p <port> -u <username> -pw <password> -td <directory> [-tf <time-format>]
-```
-
-### Input query
-
-```
-select * from root.fit.d1
-```
-
-
# Frequent Questions for Compiling
see [Frequent Questions when Compiling the Source Code](https://iotdb.apache.org/Development/ContributeGuide.html#_Frequent-Questions-when-Compiling-the-Source-Code)
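The commit's headline change is that the Time column of an imported CSV may now hold either a raw epoch long or a formatted date string. A minimal sketch of such dual-format parsing, assuming a UTC zone and a single date pattern (TimeColumnDemo is an illustrative helper, not the tool's actual parseTime):

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.TimeZone;

// Illustrative helper: the Time column may hold either a raw
// epoch-millisecond long or a formatted date-time string.
public class TimeColumnDemo {
    static long parseTime(String field) {
        try {
            return Long.parseLong(field);                 // long format, e.g. "1609459200000"
        } catch (NumberFormatException notALong) {
            SimpleDateFormat fmt = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
            fmt.setTimeZone(TimeZone.getTimeZone("UTC")); // fixed zone for a deterministic result
            try {
                return fmt.parse(field).getTime();        // string format, e.g. "2021-01-01 00:00:00"
            } catch (ParseException badString) {
                throw new IllegalArgumentException("unparseable time: " + field, badString);
            }
        }
    }
}
```

Trying the long form first keeps the common numeric case cheap and unambiguous.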
diff --git a/README_ZH.md b/README_ZH.md
index 9545377..87d58be 100644
--- a/README_ZH.md
+++ b/README_ZH.md
@@ -333,61 +333,5 @@ the server can be stopped with "ctrl-C" or by running the script below:
After being built, the IoTDB cli is located at "cli/target/iotdb-cli-{project.version}".
-## Usage of import-csv.sh
-
-### Create metadata
-
-```
-SET STORAGE GROUP TO root.fit.d1;
-SET STORAGE GROUP TO root.fit.d2;
-SET STORAGE GROUP TO root.fit.p;
-CREATE TIMESERIES root.fit.d1.s1 WITH DATATYPE=INT32,ENCODING=RLE;
-CREATE TIMESERIES root.fit.d1.s2 WITH DATATYPE=TEXT,ENCODING=PLAIN;
-CREATE TIMESERIES root.fit.d2.s1 WITH DATATYPE=INT32,ENCODING=RLE;
-CREATE TIMESERIES root.fit.d2.s3 WITH DATATYPE=INT32,ENCODING=RLE;
-CREATE TIMESERIES root.fit.p.s1 WITH DATATYPE=INT32,ENCODING=RLE;
-```
-
-### An example of importing data from a csv file
-
-```
-Time,root.fit.d1.s1,root.fit.d1.s2,root.fit.d2.s1,root.fit.d2.s3,root.fit.p.s1
-1,100,'hello',200,300,400
-2,500,'world',600,700,800
-3,900,'IoTDB',1000,1100,1200
-```
-
-### Run import shell
-```
-# Unix/OS X
-> tools/import-csv.sh -h <ip> -p <port> -u <username> -pw <password> -f <xxx.csv>
-
-# Windows
-> tools\import-csv.bat -h <ip> -p <port> -u <username> -pw <password> -f <xxx.csv>
-```
-
-### Error data file
-
-`csvInsertError.error`
-
-## Usage of export-csv.sh
-
-### Run export shell
-
-```
-# Unix/OS X
-> tools/export-csv.sh -h <ip> -p <port> -u <username> -pw <password> -td <directory> [-tf <time-format>]
-
-# Windows
-> tools\export-csv.bat -h <ip> -p <port> -u <username> -pw <password> -td <directory> [-tf <time-format>]
-```
-
-### Input query
-
-```
-select * from root.fit.d1
-```
-
-
# Frequent Compiling Errors
see [Frequent Questions when Compiling the Source Code](https://iotdb.apache.org/zh/Development/ContributeGuide.html#%E5%B8%B8%E8%A7%81%E7%BC%96%E8%AF%91%E9%94%99%E8%AF%AF)
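The rewritten ImportCsv (diffed further below) no longer builds per-row SQL; it accumulates rows and flushes them to the server with Session.insertRecords every 10,000 lines. The accumulate-and-flush pattern, sketched generically with a stand-in flush target (BatchFlushDemo and FLUSH_EVERY=3 are illustrative; only the every-N flush mirrors the diff):

```java
import java.util.ArrayList;
import java.util.List;

// Generic sketch of accumulate-and-flush batching; the real tool
// flushes by calling session.insertRecords(...) instead.
public class BatchFlushDemo {
    static final int FLUSH_EVERY = 3; // the tool uses 10_000

    static int flushes;

    static void flush(List<String> batch) {
        if (!batch.isEmpty()) {
            flushes++;       // stand-in for the actual server write
            batch.clear();
        }
    }

    static int importLines(List<String> lines) {
        flushes = 0;
        List<String> batch = new ArrayList<>();
        int lineNumber = 0;
        for (String line : lines) {
            batch.add(line);
            lineNumber++;
            if (lineNumber % FLUSH_EVERY == 0) {
                flush(batch);
            }
        }
        flush(batch);        // final partial batch
        return flushes;
    }
}
```

Flushing in fixed-size batches bounds memory while amortizing the per-request cost of the server round trip.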
diff --git a/cli/pom.xml b/cli/pom.xml
index 64ca2af..6c9aac9 100644
--- a/cli/pom.xml
+++ b/cli/pom.xml
@@ -70,6 +70,11 @@
<version>0.11.3-SNAPSHOT</version>
<scope>test</scope>
</dependency>
+ <dependency>
+ <groupId>org.apache.iotdb</groupId>
+ <artifactId>iotdb-session</artifactId>
+ <version>${project.version}</version>
+ </dependency>
</dependencies>
<build>
<plugins>
diff --git a/cli/src/main/java/org/apache/iotdb/cli/AbstractCli.java b/cli/src/main/java/org/apache/iotdb/cli/AbstractCli.java
index e413f53..f385799 100644
--- a/cli/src/main/java/org/apache/iotdb/cli/AbstractCli.java
+++ b/cli/src/main/java/org/apache/iotdb/cli/AbstractCli.java
@@ -536,14 +536,9 @@ public abstract class AbstractCli {
+ "Noted that your file path cannot contain any space character)");
return;
}
- try {
- println(cmd.split(" ")[1]);
-      ImportCsv.importCsvFromFile(host, port, username, password, cmd.split(" ")[1],
- connection.getTimeZone());
- } catch (SQLException e) {
- println(String.format("Failed to import from %s because %s",
- cmd.split(" ")[1], e.getMessage()));
- }
+ println(cmd.split(" ")[1]);
+    ImportCsv.importCsvFromFile(host, port, username, password, cmd.split(" ")[1],
+ connection.getTimeZone());
}
private static void executeQuery(IoTDBConnection connection, String cmd) {
diff --git a/cli/src/main/java/org/apache/iotdb/tool/AbstractCsvTool.java b/cli/src/main/java/org/apache/iotdb/tool/AbstractCsvTool.java
index d7e3074..7b50924 100644
--- a/cli/src/main/java/org/apache/iotdb/tool/AbstractCsvTool.java
+++ b/cli/src/main/java/org/apache/iotdb/tool/AbstractCsvTool.java
@@ -22,10 +22,12 @@ import java.io.IOException;
import java.time.ZoneId;
import jline.console.ConsoleReader;
import org.apache.commons.cli.CommandLine;
+import org.apache.commons.cli.Option;
+import org.apache.commons.cli.Options;
import org.apache.iotdb.exception.ArgsErrorException;
-import org.apache.iotdb.jdbc.IoTDBConnection;
-import org.apache.iotdb.jdbc.IoTDBSQLException;
-import org.apache.thrift.TException;
+import org.apache.iotdb.rpc.IoTDBConnectionException;
+import org.apache.iotdb.rpc.StatementExecutionException;
+import org.apache.iotdb.session.Session;
public abstract class AbstractCsvTool {
@@ -69,7 +71,7 @@ public abstract class AbstractCsvTool {
protected static String timeZoneID;
protected static String timeFormat;
- protected static IoTDBConnection connection;
+ protected static Session session;
AbstractCsvTool() {}
@@ -85,11 +87,11 @@ public abstract class AbstractCsvTool {
return str;
}
- protected static void setTimeZone() throws IoTDBSQLException, TException {
+  protected static void setTimeZone() throws IoTDBConnectionException, StatementExecutionException {
if (timeZoneID != null) {
- connection.setTimeZone(timeZoneID);
+ session.setTimeZone(timeZoneID);
}
- zoneId = ZoneId.of(connection.getTimeZone());
+ zoneId = ZoneId.of(session.getTimeZone());
}
  protected static void parseBasicParams(CommandLine commandLine, ConsoleReader reader)
@@ -110,8 +112,32 @@ public abstract class AbstractCsvTool {
return true;
}
}
- System.out.println(String.format("Input time format %s is not supported, "
-        + "please input like yyyy-MM-dd\\ HH:mm:ss.SSS or yyyy-MM-dd'T'HH:mm:ss.SSS", timeFormat));
+ System.out.printf("Input time format %s is not supported, "
+        + "please input like yyyy-MM-dd\\ HH:mm:ss.SSS or yyyy-MM-dd'T'HH:mm:ss.SSS%n", timeFormat);
return false;
}
+
+ protected static Options createNewOptions() {
+ Options options = new Options();
+
+    Option opHost = Option.builder(HOST_ARGS).longOpt(HOST_NAME).required().argName(HOST_NAME)
+ .hasArg()
+ .desc("Host Name (required)").build();
+ options.addOption(opHost);
+
+    Option opPort = Option.builder(PORT_ARGS).longOpt(PORT_NAME).required().argName(PORT_NAME)
+ .hasArg()
+ .desc("Port (required)").build();
+ options.addOption(opPort);
+
+    Option opUsername = Option.builder(USERNAME_ARGS).longOpt(USERNAME_NAME).required()
+ .argName(USERNAME_NAME)
+ .hasArg().desc("Username (required)").build();
+ options.addOption(opUsername);
+
+    Option opPassword = Option.builder(PASSWORD_ARGS).longOpt(PASSWORD_NAME).optionalArg(true)
+ .argName(PASSWORD_NAME).hasArg().desc("Password (optional)").build();
+ options.addOption(opPassword);
+ return options;
+ }
}
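The reworked setTimeZone() above resolves the session's zone string with ZoneId.of instead of going through the JDBC connection. A small stdlib illustration of that resolution and of rendering an epoch timestamp in the resolved zone (ZoneDemo is a hypothetical helper):

```java
import java.time.Instant;
import java.time.ZoneId;
import java.time.ZonedDateTime;

// Resolve a zone string (such as the session's "+08:00") into a ZoneId,
// then render an epoch-millisecond timestamp in that zone.
public class ZoneDemo {
    static String render(long epochMilli, String zoneString) {
        ZoneId zoneId = ZoneId.of(zoneString); // same call setTimeZone() uses
        return ZonedDateTime.ofInstant(Instant.ofEpochMilli(epochMilli), zoneId).toString();
    }
}
```

ZoneId.of accepts both fixed offsets ("+08:00") and region IDs ("Asia/Shanghai"), which is why the tool can pass the server's zone string through unchanged.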
diff --git a/cli/src/main/java/org/apache/iotdb/tool/ExportCsv.java b/cli/src/main/java/org/apache/iotdb/tool/ExportCsv.java
index 7a33eb4..b89bea7 100644
--- a/cli/src/main/java/org/apache/iotdb/tool/ExportCsv.java
+++ b/cli/src/main/java/org/apache/iotdb/tool/ExportCsv.java
@@ -25,16 +25,9 @@ import java.io.File;
import java.io.FileReader;
import java.io.FileWriter;
import java.io.IOException;
-import java.sql.DriverManager;
-import java.sql.ResultSet;
-import java.sql.ResultSetMetaData;
-import java.sql.SQLException;
-import java.sql.Statement;
-import java.sql.Types;
import java.time.Instant;
import java.time.ZonedDateTime;
import java.time.format.DateTimeFormatter;
-import java.util.ArrayList;
import java.util.List;
import jline.console.ConsoleReader;
import org.apache.commons.cli.CommandLine;
@@ -46,9 +39,13 @@ import org.apache.commons.cli.Options;
import org.apache.commons.cli.ParseException;
import org.apache.iotdb.cli.AbstractCli;
import org.apache.iotdb.exception.ArgsErrorException;
-import org.apache.iotdb.jdbc.Config;
-import org.apache.iotdb.jdbc.IoTDBConnection;
-import org.apache.thrift.TException;
+import org.apache.iotdb.rpc.IoTDBConnectionException;
+import org.apache.iotdb.rpc.StatementExecutionException;
+import org.apache.iotdb.session.Session;
+import org.apache.iotdb.session.SessionDataSet;
+import org.apache.iotdb.tsfile.file.metadata.enums.TSDataType;
+import org.apache.iotdb.tsfile.read.common.Field;
+import org.apache.iotdb.tsfile.read.common.RowRecord;
/**
* Export CSV file.
@@ -75,20 +72,16 @@ public class ExportCsv extends AbstractCsvTool {
private static final int EXPORT_PER_LINE_COUNT = 10000;
- private static String TIMESTAMP_PRECISION = "ms";
-
- private static List<Integer> typeList = new ArrayList<>();
-
/**
* main function of export csv tool.
*/
- public static void main(String[] args) throws IOException, SQLException {
+ public static void main(String[] args) throws IOException {
Options options = createOptions();
HelpFormatter hf = new HelpFormatter();
- hf.setOptionComparator(null); // avoid reordering
- hf.setWidth(MAX_HELP_CONSOLE_WIDTH);
CommandLine commandLine;
CommandLineParser parser = new DefaultParser();
+ hf.setOptionComparator(null); // avoid reordering
+ hf.setWidth(MAX_HELP_CONSOLE_WIDTH);
if (args == null || args.length == 0) {
      System.out.println("Too few params input, please check the following hint.");
@@ -116,13 +109,11 @@ public class ExportCsv extends AbstractCsvTool {
if (!checkTimeFormat()) {
return;
}
- Class.forName(Config.JDBC_DRIVER_NAME);
String sqlFile = commandLine.getOptionValue(SQL_FILE_ARGS);
String sql;
-
-    connection = (IoTDBConnection) DriverManager
-        .getConnection(Config.IOTDB_URL_PREFIX + host + ":" + port + "/", username, password);
+ session = new Session(host, Integer.parseInt(port), username, password);
+ session.open(false);
setTimeZone();
if (sqlFile == null) {
@@ -134,21 +125,21 @@ public class ExportCsv extends AbstractCsvTool {
} else {
dumpFromSqlFile(sqlFile);
}
- } catch (ClassNotFoundException e) {
-      System.out.println("Failed to export data because cannot find IoTDB JDBC Driver, "
-          + "please check whether you have imported driver or not: " + e.getMessage());
-    } catch (TException e) {
-      System.out.println("Encounter an error when connecting to server, because " + e.getMessage());
-    } catch (SQLException e) {
-      System.out.println("Encounter an error when exporting data, error is: " + e.getMessage());
} catch (IOException e) {
      System.out.println("Failed to operate on file, because " + e.getMessage());
} catch (ArgsErrorException e) {
System.out.println("Invalid args: " + e.getMessage());
+ } catch (IoTDBConnectionException | StatementExecutionException e) {
+ System.out.println("Connect failed because " + e.getMessage());
} finally {
reader.close();
- if (connection != null) {
- connection.close();
+ if (session != null) {
+ try {
+ session.close();
+ } catch (IoTDBConnectionException e) {
+ System.out
+            .println("Encounter an error when closing session, error is: " + e.getMessage());
+ }
}
}
}
@@ -176,26 +167,7 @@ public class ExportCsv extends AbstractCsvTool {
* @return object Options
*/
private static Options createOptions() {
- Options options = new Options();
-
-    Option opHost = Option.builder(HOST_ARGS).longOpt(HOST_NAME).required().argName(HOST_NAME)
-        .hasArg()
-        .desc("Host Name (required)").build();
-    options.addOption(opHost);
-
-    Option opPort = Option.builder(PORT_ARGS).longOpt(PORT_NAME).required().argName(PORT_NAME)
-        .hasArg()
-        .desc("Port (required)").build();
-    options.addOption(opPort);
-
-    Option opUsername = Option.builder(USERNAME_ARGS).longOpt(USERNAME_NAME).required()
-        .argName(USERNAME_NAME)
-        .hasArg().desc("Username (required)").build();
-    options.addOption(opUsername);
-
-    Option opPassword = Option.builder(PASSWORD_ARGS).longOpt(PASSWORD_NAME).optionalArg(true)
-        .argName(PASSWORD_NAME).hasArg().desc("Password (optional)").build();
-    options.addOption(opPassword);
+ Options options = createNewOptions();
    Option opTargetFile = Option.builder(TARGET_DIR_ARGS).required().argName(TARGET_DIR_NAME)
.hasArg()
@@ -234,12 +206,7 @@ public class ExportCsv extends AbstractCsvTool {
String sql;
int index = 0;
while ((sql = reader.readLine()) != null) {
- try {
- dumpResult(sql, index);
- } catch (SQLException e) {
- System.out
-            .println("Cannot dump data for statement " + sql + ", because : " + e.getMessage());
- }
+ dumpResult(sql, index);
index++;
}
}
@@ -248,12 +215,10 @@ public class ExportCsv extends AbstractCsvTool {
/**
* Dump files from database to CSV file.
*
- * @param sql export the result of executing the sql
+ * @param sql export the result of executing the sql
* @param index use to create dump file name
- * @throws SQLException if SQL is not valid
*/
- private static void dumpResult(String sql, int index)
- throws SQLException {
+ private static void dumpResult(String sql, int index) {
final String path = targetDirectory + targetFile + index + ".csv";
File tf = new File(path);
@@ -263,82 +228,72 @@ public class ExportCsv extends AbstractCsvTool {
return;
}
} catch (IOException e) {
-      System.out.println("Cannot create dump file " + path + "because: " + e.getMessage());
+      System.out.println("Cannot create dump file " + path + " " + "because: " + e.getMessage());
return;
}
System.out.println("Start to export data from sql statement: " + sql);
- try (Statement statement = connection.createStatement();
- ResultSet rs = statement.executeQuery(sql);
- BufferedWriter bw = new BufferedWriter(new FileWriter(tf))) {
- ResultSetMetaData metadata = rs.getMetaData();
+ try (BufferedWriter bw = new BufferedWriter(new FileWriter(tf))) {
+ SessionDataSet sessionDataSet = session.executeQueryStatement(sql);
long startTime = System.currentTimeMillis();
-
- int count = metadata.getColumnCount();
// write data in csv file
- writeMetadata(bw, count, metadata);
+ writeMetadata(bw, sessionDataSet.getColumnNames());
- int line = writeResultSet(rs, bw, count);
+ int line = writeResultSet(sessionDataSet, bw);
System.out
-          .println(String.format("Statement [%s] has dumped to file %s successfully! It costs "
-              + "%dms to export %d lines.", sql, path, System.currentTimeMillis() - startTime,
-              line));
-    } catch (IOException e) {
+          .printf("Statement [%s] has dumped to file %s successfully! It costs "
+              + "%dms to export %d lines.%n", sql, path, System.currentTimeMillis() - startTime,
+              line);
+    } catch (IOException | StatementExecutionException | IoTDBConnectionException e) {
System.out.println("Cannot dump result because: " + e.getMessage());
}
}
-  private static void writeMetadata(BufferedWriter bw, int count, ResultSetMetaData metadata)
- throws SQLException, IOException {
- for (int i = 1; i <= count; i++) {
- if (i < count) {
- bw.write(metadata.getColumnLabel(i) + ",");
- } else {
- bw.write(metadata.getColumnLabel(i) + "\n");
- }
- typeList.add(metadata.getColumnType(i));
+  private static void writeMetadata(BufferedWriter bw, List<String> columnNames)
+ throws IOException {
+ if (!columnNames.get(0).equals("Time")) {
+ bw.write("Time" + ",");
+ }
+ for (int i = 0; i < columnNames.size() - 1; i++) {
+ bw.write(columnNames.get(i) + ",");
}
+ bw.write(columnNames.get(columnNames.size() - 1) + "\n");
}
- private static int writeResultSet(ResultSet rs, BufferedWriter bw, int count)
- throws SQLException, IOException {
+ private static int writeResultSet(SessionDataSet rs, BufferedWriter bw)
+      throws IOException, StatementExecutionException, IoTDBConnectionException {
int line = 0;
long timestamp = System.currentTimeMillis();
- while (rs.next()) {
- if (rs.getString(1) == null ||
- "null".equalsIgnoreCase(rs.getString(1))) {
- bw.write(",");
- } else {
- writeTime(rs, bw);
- writeValue(rs, count, bw);
- }
+ while (rs.hasNext()) {
+ RowRecord rowRecord = rs.next();
+ List<Field> fields = rowRecord.getFields();
+ writeTime(rowRecord.getTimestamp(), bw);
+ writeValue(fields, bw);
line++;
if (line % EXPORT_PER_LINE_COUNT == 0) {
long tmp = System.currentTimeMillis();
- System.out.println(
-            String.format("%d lines have been exported, it takes %dms", line, (tmp - timestamp)));
+        System.out.printf("%d lines have been exported, it takes %dms%n", line, (tmp - timestamp));
timestamp = tmp;
}
}
return line;
}
-  private static void writeTime(ResultSet rs, BufferedWriter bw) throws SQLException, IOException {
+  private static void writeTime(Long time, BufferedWriter bw) throws IOException {
ZonedDateTime dateTime;
+ String timestampPrecision = "ms";
switch (timeFormat) {
case "default":
- long timestamp = rs.getLong(1);
- String str = AbstractCli
-            .parseLongToDateWithPrecision(DateTimeFormatter.ISO_OFFSET_DATE_TIME, timestamp, zoneId,
- TIMESTAMP_PRECISION);
+        String str = AbstractCli.parseLongToDateWithPrecision(DateTimeFormatter.ISO_OFFSET_DATE_TIME, time, zoneId,
+ timestampPrecision);
bw.write(str + ",");
break;
case "timestamp":
case "long":
case "nubmer":
- bw.write(rs.getLong(1) + ",");
+ bw.write(time + ",");
break;
default:
- dateTime = ZonedDateTime.ofInstant(Instant.ofEpochMilli(rs.getLong(1)),
+ dateTime = ZonedDateTime.ofInstant(Instant.ofEpochMilli(time),
zoneId);
        bw.write(dateTime.format(DateTimeFormatter.ofPattern(timeFormat)) + ",");
break;
@@ -346,29 +301,49 @@ public class ExportCsv extends AbstractCsvTool {
}
  @SuppressWarnings("squid:S3776") // Suppress high Cognitive Complexity warning
- private static void writeValue(ResultSet rs, int count, BufferedWriter bw)
- throws SQLException, IOException {
- for (int j = 2; j <= count; j++) {
- if (j < count) {
- if ("null".equals(rs.getString(j))) {
- bw.write(",");
- } else {
- if(typeList.get(j-1) == Types.VARCHAR) {
- bw.write("\'" + rs.getString(j) + "\'"+ ",");
+  private static void writeValue(List<Field> fields, BufferedWriter bw) throws IOException {
+ for (int j = 0; j < fields.size() - 1; j++) {
+ String value = fields.get(j).getStringValue();
+ if ("null".equalsIgnoreCase(value)) {
+ bw.write(",");
+ } else {
+ if (fields.get(j).getDataType() == TSDataType.TEXT) {
+ int location = value.indexOf("\"");
+ if (location > -1) {
+ if (location == 0 || value.charAt(location - 1) != '\\') {
+ bw.write("\"" + value.replace("\"", "\\\"") + "\",");
+ } else {
+ bw.write("\"" + value + "\",");
+ }
+ } else if (value.contains(",")) {
+ bw.write("\"" + value + "\",");
} else {
- bw.write(rs.getString(j) + ",");
+ bw.write(value + ",");
}
- }
- } else {
- if ("null".equals(rs.getString(j))) {
- bw.write("\n");
} else {
- if(typeList.get(j-1) == Types.VARCHAR) {
- bw.write("\'" + rs.getString(j) + "\'"+ "\n");
+ bw.write(value + ",");
+ }
+ }
+ }
+ String lastValue = fields.get(fields.size() - 1).getStringValue();
+ if ("null".equalsIgnoreCase(lastValue)) {
+ bw.write("\n");
+ } else {
+ if (fields.get(fields.size() - 1).getDataType() == TSDataType.TEXT) {
+ int location = lastValue.indexOf("\"");
+ if (location > -1) {
+ if (location == 0 || lastValue.charAt(location - 1) != '\\') {
+ bw.write("\"" + lastValue.replace("\"", "\\\"") + "\"\n");
} else {
- bw.write(rs.getString(j) + "\n");
+ bw.write("\"" + lastValue + "\"\n");
}
+ } else if (lastValue.contains(",")) {
+ bw.write("\"" + lastValue + "\"\n");
+ } else {
+ bw.write(lastValue + "\n");
}
+ } else {
+ bw.write(lastValue + "\n");
}
}
}
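The new writeValue in ExportCsv above wraps TEXT values in double quotes when they contain a comma or a double quote, backslash-escaping embedded quotes. The rule in isolation (a simplified sketch: it drops the diff's extra check for quotes that are already escaped):

```java
// Simplified version of the quoting rule in writeValue: wrap a field
// in double quotes when it contains a quote or a comma, escaping
// embedded double quotes with a backslash.
public class CsvQuoteDemo {
    static String quote(String value) {
        if (value.contains("\"")) {
            return "\"" + value.replace("\"", "\\\"") + "\"";
        }
        if (value.contains(",")) {
            return "\"" + value + "\"";
        }
        return value;
    }
}
```

Without this, a TEXT value containing a comma would shift every later column of the exported row.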
diff --git a/cli/src/main/java/org/apache/iotdb/tool/ImportCsv.java b/cli/src/main/java/org/apache/iotdb/tool/ImportCsv.java
index ee83572..69448a0 100644
--- a/cli/src/main/java/org/apache/iotdb/tool/ImportCsv.java
+++ b/cli/src/main/java/org/apache/iotdb/tool/ImportCsv.java
@@ -19,22 +19,17 @@
package org.apache.iotdb.tool;
import java.io.BufferedReader;
-import java.io.BufferedWriter;
import java.io.File;
import java.io.FileNotFoundException;
import java.io.FileReader;
-import java.io.FileWriter;
import java.io.IOException;
import java.io.LineNumberReader;
-import java.sql.DriverManager;
-import java.sql.ResultSet;
-import java.sql.SQLException;
-import java.sql.Statement;
+import java.text.SimpleDateFormat;
import java.util.ArrayList;
import java.util.HashMap;
-import java.util.Iterator;
import java.util.List;
import java.util.Map;
+import java.util.Map.Entry;
import jline.console.ConsoleReader;
import me.tongfei.progressbar.ProgressBar;
import org.apache.commons.cli.CommandLine;
@@ -44,34 +39,31 @@ import org.apache.commons.cli.HelpFormatter;
import org.apache.commons.cli.Option;
import org.apache.commons.cli.Options;
import org.apache.commons.cli.ParseException;
-import org.apache.commons.io.FileUtils;
import org.apache.iotdb.exception.ArgsErrorException;
-import org.apache.iotdb.jdbc.Config;
-import org.apache.iotdb.jdbc.IoTDBConnection;
-import org.apache.iotdb.rpc.TSStatusCode;
-import org.apache.thrift.TException;
+import org.apache.iotdb.rpc.IoTDBConnectionException;
+import org.apache.iotdb.rpc.StatementExecutionException;
+import org.apache.iotdb.session.Session;
+import org.apache.iotdb.tsfile.common.constant.TsFileConstant;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
/**
* read a CSV formatted data File and insert all the data into IoTDB.
*/
public class ImportCsv extends AbstractCsvTool {
+
private static final String FILE_ARGS = "f";
private static final String FILE_NAME = "file or folder";
private static final String FILE_SUFFIX = "csv";
+  private static final Logger logger = LoggerFactory.getLogger(ImportCsv.class);
+ private static final String TIME_TYPE = "It isn't a {} time type";
private static final String TSFILEDB_CLI_PREFIX = "ImportCsv";
- private static final String ERROR_INFO_STR = "csvInsertError.error";
-
- private static final String STRING_DATA_TYPE = "TEXT";
- private static final int BATCH_EXECUTE_COUNT = 100;
-
- private static String errorInsertInfo = "";
- private static boolean errorFlag;
-
- private static String IOTDB_CLI_HOME = "IOTDB_CLI_HOME";
+ private static final String ILLEGAL_PATH_ARGUMENT = "Path parameter is null";
- private static int count;
- private static Statement statement;
+  // put these variable in here, because sonar fails. have to extract some code into a function. nextNode method.
+ private static int i;
+ private static int startIndex;
/**
* create the commandline options.
@@ -79,24 +71,7 @@ public class ImportCsv extends AbstractCsvTool {
* @return object Options
*/
private static Options createOptions() {
- Options options = new Options();
-
- Option opHost = Option.builder(HOST_ARGS).longOpt(HOST_NAME).required()
- .argName(HOST_NAME).hasArg().desc("Host Name (required)").build();
- options.addOption(opHost);
-
- Option opPort = Option.builder(PORT_ARGS).longOpt(PORT_NAME).required()
- .argName(PORT_NAME).hasArg().desc("Port (required)").build();
- options.addOption(opPort);
-
- Option opUsername = Option.builder(USERNAME_ARGS).longOpt(USERNAME_NAME)
- .required().argName(USERNAME_NAME)
- .hasArg().desc("Username (required)").build();
- options.addOption(opUsername);
-
- Option opPassword = Option.builder(PASSWORD_ARGS).longOpt(PASSWORD_NAME)
-        .optionalArg(true).argName(PASSWORD_NAME).hasArg().desc("Password (optional)").build();
- options.addOption(opPassword);
+ Options options = createNewOptions();
    Option opFile = Option.builder(FILE_ARGS).required().argName(FILE_NAME).hasArg().desc(
"If input a file path, load a csv file, "
@@ -119,8 +94,7 @@ public class ImportCsv extends AbstractCsvTool {
/**
* Data from csv To tsfile.
*/
- private static void loadDataFromCSV(File file, int index) {
- statement = null;
+ private static void loadDataFromCSV(File file) {
int fileLine;
try {
fileLine = getFileLineCount(file);
@@ -128,287 +102,78 @@ public class ImportCsv extends AbstractCsvTool {
System.out.println("Failed to import file: " + file.getName());
return;
}
- File errorFile = new File(errorInsertInfo + index);
- if (!errorFile.exists()) {
- try {
- errorFile.createNewFile();
- } catch (IOException e) {
-        System.out.println("Cannot create a errorFile because: " + e.getMessage());
- return;
- }
- }
System.out.println("Start to import data from: " + file.getName());
- errorFlag = true;
- try(BufferedReader br = new BufferedReader(new FileReader(file));
- BufferedWriter bw = new BufferedWriter(new FileWriter(errorFile));
+ try (BufferedReader br = new BufferedReader(new FileReader(file));
        ProgressBar pb = new ProgressBar("Import from: " + file.getName(), fileLine)) {
pb.setExtraMessage("Importing...");
String header = br.readLine();
-
- bw.write("From " + file.getAbsolutePath());
- bw.newLine();
- bw.newLine();
- bw.write(header);
- bw.newLine();
- bw.newLine();
-
- // storage csv table head info
- Map<String, ArrayList<Integer>> deviceToColumn = new HashMap<>();
- // storage csv table head info
- List<String> colInfo = new ArrayList<>();
- // storage csv device sensor info, corresponding csv table head
- List<String> headInfo = new ArrayList<>();
-
- String[] strHeadInfo = header.split(",");
- if (strHeadInfo.length <= 1) {
-        System.out.println("The CSV file "+ file.getName() +" illegal, please check first line");
+      String[] cols = splitCsvLine(header);
+      if (cols.length <= 1) {
+        System.out.println("The CSV file " + file.getName() + " illegal, please check first line");
return;
}
- long startTime = System.currentTimeMillis();
- Map<String, String> timeseriesDataType = new HashMap<>();
+ List<String> devices = new ArrayList<>();
+ List<Long> times = new ArrayList<>();
+ List<List<String>> measurementsList = new ArrayList<>();
+ List<List<String>> valuesList = new ArrayList<>();
+ Map<String, List<Integer>> devicesToPositions = new HashMap<>();
+ Map<String, List<String>> devicesToMeasurements = new HashMap<>();
-      boolean success = queryDatabaseMeta(strHeadInfo, file, bw, timeseriesDataType, headInfo,
- deviceToColumn, colInfo);
- if (!success) {
- errorFlag = false;
- return;
+ for (int i = 1; i < cols.length; i++) {
+        splitColToDeviceAndMeasurement(cols[i], devicesToPositions, devicesToMeasurements, i);
}
- statement = connection.createStatement();
-
+ int lineNumber = 0;
+ String line;
+ while ((line = br.readLine()) != null) {
+ cols = splitCsvLine(line);
+ lineNumber++;
+        for (Entry<String, List<Integer>> deviceToPositions : devicesToPositions
+ .entrySet()) {
+ String device = deviceToPositions.getKey();
+ devices.add(device);
+
+ times.add(parseTime(cols[0]));
+
+ List<String> values = new ArrayList<>();
+ for (int position : deviceToPositions.getValue()) {
+ values.add(cols[position]);
+ }
+ valuesList.add(values);
- List<String> tmp = new ArrayList<>();
-      success = readAndGenSqls(br, timeseriesDataType, deviceToColumn, colInfo, headInfo,
- bw, tmp, pb);
- if (!success) {
- return;
+ measurementsList.add(devicesToMeasurements.get(device));
+ }
+ if (lineNumber % 10000 == 0) {
+ session.insertRecords(devices, times, measurementsList, valuesList);
+ devices = new ArrayList<>();
+ times = new ArrayList<>();
+ measurementsList = new ArrayList<>();
+ valuesList = new ArrayList<>();
+ }
}
-
- executeSqls(bw, tmp, startTime, file);
+ session.insertRecords(devices, times, measurementsList, valuesList);
+ System.out.println("Insert csv successfully!");
pb.stepTo(fileLine);
} catch (FileNotFoundException e) {
-      System.out.println("Cannot find " + file.getName() + " because: "+e.getMessage());
+      System.out.println("Cannot find " + file.getName() + " because: " + e.getMessage());
} catch (IOException e) {
System.out.println("CSV file read exception because: " + e.getMessage());
- } catch (SQLException e) {
-      System.out.println("Database connection exception because: " + e.getMessage());
+    } catch (IoTDBConnectionException | StatementExecutionException e) {
+      System.out.println("Meet error when insert csv because " + e.getMessage());
} finally {
try {
- if (statement != null) {
- statement.close();
- }
- if (errorFlag) {
- FileUtils.forceDelete(errorFile);
- } else {
-          System.out.println("Format of some lines in "+ file.getAbsolutePath() + " error, please "
- + "check "+errorFile.getAbsolutePath()+" for more information");
+ if (session != null) {
+ session.close();
}
- } catch (SQLException e) {
+ } catch (IoTDBConnectionException e) {
        System.out.println("Sql statement can not be closed because: " + e.getMessage());
- } catch (IOException e) {
- System.out.println("Close file error because: " + e.getMessage());
- }
- }
- }
-
-  private static void executeSqls(BufferedWriter bw, List<String> tmp, long startTime, File file)
- throws IOException {
- try {
- int[] result = statement.executeBatch();
- for (int i = 0; i < result.length; i++) {
-        if (result[i] != TSStatusCode.SUCCESS_STATUS.getStatusCode() && i < tmp.size()) {
- bw.write(tmp.get(i));
- bw.newLine();
- errorFlag = false;
- }
- }
- statement.clearBatch();
- tmp.clear();
- } catch (SQLException e) {
- bw.write(e.getMessage());
- bw.newLine();
- errorFlag = false;
- System.out.println("Cannot execute sql because: " + e.getMessage());
- }
- }
-
-  private static boolean readAndGenSqls(BufferedReader br, Map<String, String> timeseriesDataType,
- Map<String, ArrayList<Integer>> deviceToColumn, List<String> colInfo,
-      List<String> headInfo, BufferedWriter bw, List<String> tmp, ProgressBar pb) throws IOException {
- String line;
- count = 0;
- while ((line = br.readLine()) != null) {
- List<String> sqls;
- try {
-        sqls = createInsertSQL(line, timeseriesDataType, deviceToColumn, colInfo, headInfo);
- } catch (Exception e) {
-        bw.write(String.format("error input line, maybe it is not complete: %s", line));
- bw.newLine();
-        System.out.println("Cannot create sql for " + line + " because: " + e.getMessage());
- errorFlag = false;
- return false;
- }
- boolean success = addSqlsToBatch(sqls, tmp, bw);
- pb.step();
- if (!success) {
- return false;
- }
- }
- return true;
- }
-
-  private static boolean addSqlsToBatch(List<String> sqls, List<String> tmp, BufferedWriter bw)
- throws IOException {
- for (String str : sqls) {
- try {
- count++;
- statement.addBatch(str);
- tmp.add(str);
- checkBatchSize(bw, tmp);
-
- } catch (SQLException e) {
- bw.write(e.getMessage());
- bw.newLine();
- errorFlag = false;
- System.out.println("Cannot execute sql because: " + e.getMessage());
- return false;
- }
- }
- return true;
- }
-
-
- private static void checkBatchSize(BufferedWriter bw, List<String> tmp)
- throws SQLException, IOException {
- if (count == BATCH_EXECUTE_COUNT) {
- int[] result = statement.executeBatch();
- for (int i = 0; i < result.length; i++) {
- if (result[i] != Statement.SUCCESS_NO_INFO && i < tmp.size()) {
- bw.write(tmp.get(i));
- bw.newLine();
- errorFlag = false;
- }
}
- statement.clearBatch();
- count = 0;
- tmp.clear();
}
}
- private static boolean queryDatabaseMeta(String[] strHeadInfo, File file,
BufferedWriter bw,
- Map<String, String> timeseriesDataType, List<String> headInfo,
- Map<String, ArrayList<Integer>> deviceToColumn,
- List<String> colInfo)
- throws SQLException, IOException {
- try (Statement statement = connection.createStatement()) {
-
- for (int i = 1; i < strHeadInfo.length; i++) {
- statement.execute("show timeseries " + strHeadInfo[i]);
- ResultSet resultSet = statement.getResultSet();
- try {
- if (resultSet.next()) {
-          /*
-           * now the resultSet is like the following, so the index of dataType is 4
-           * +--------------+-----+-------------+--------+--------+-----------+
-           * |    timeseries|alias|storage group|dataType|encoding|compression|
-           * +--------------+-----+-------------+--------+--------+-----------+
-           * |root.fit.d1.s1| null|  root.fit.d1|   INT32|     RLE|     SNAPPY|
-           * |root.fit.d1.s2| null|  root.fit.d1|    TEXT|   PLAIN|     SNAPPY|
-           * +--------------+-----+-------------+--------+--------+-----------+
-           */
- timeseriesDataType.put(strHeadInfo[i], resultSet.getString(4));
- } else {
- String errorInfo = String.format("Database cannot find %s in %s,
stop import!",
- strHeadInfo[i], file.getAbsolutePath());
- System.out.println("Database cannot find " + strHeadInfo[i] + " in
" + file.getAbsolutePath() + ", "
- + "stop import!");
- bw.write(errorInfo);
- return false;
- }
- } finally {
- resultSet.close();
- }
- headInfo.add(strHeadInfo[i]);
- String deviceInfo = strHeadInfo[i].substring(0,
strHeadInfo[i].lastIndexOf('.'));
-
- if (!deviceToColumn.containsKey(deviceInfo)) {
- deviceToColumn.put(deviceInfo, new ArrayList<>());
- }
- // storage every device's sensor index info
- deviceToColumn.get(deviceInfo).add(i - 1);
- colInfo.add(strHeadInfo[i].substring(strHeadInfo[i].lastIndexOf('.') +
1));
- }
- }
- return true;
- }
-
- private static List<String> createInsertSQL(String line, Map<String, String>
timeseriesToType,
- Map<String, ArrayList<Integer>> deviceToColumn,
- List<String> colInfo, List<String> headInfo) {
- String[] data = line.split(",", headInfo.size() + 1);
- List<String> sqls = new ArrayList<>();
- Iterator<Map.Entry<String, ArrayList<Integer>>> it =
deviceToColumn.entrySet().iterator();
- while (it.hasNext()) {
- Map.Entry<String, ArrayList<Integer>> entry = it.next();
- String sql = createOneSql(entry, data, colInfo, timeseriesToType,
headInfo);
- if (sql != null) {
- sqls.add(sql);
- }
- }
- return sqls;
- }
-
- private static String createOneSql(Map.Entry<String, ArrayList<Integer>>
entry, String[] data,
- List<String> colInfo, Map<String, String> timeseriesToType, List<String>
headInfo) {
- StringBuilder sbd = new StringBuilder();
- ArrayList<Integer> colIndex = entry.getValue();
- sbd.append("insert into ").append(entry.getKey()).append("(timestamp");
- int skipCount = 0;
- for (int j = 0; j < colIndex.size(); ++j) {
- if ("".equals(data[entry.getValue().get(j) + 1])) {
- skipCount++;
- continue;
- }
- sbd.append(", ").append(colInfo.get(colIndex.get(j)));
- }
- // define every device null value' number, if the number equal the
- // sensor number, the insert operation stop
- if (skipCount == entry.getValue().size()) {
- return null;
- }
-
- // TODO when timestampsStr is empty
- String timestampsStr = data[0];
- sbd.append(") values(").append(timestampsStr.trim().isEmpty()
- ? "NO TIMESTAMP" : timestampsStr);
- for (int j = 0; j < colIndex.size(); ++j) {
- if ("".equals(data[entry.getValue().get(j) + 1])) {
- continue;
- }
- if
(timeseriesToType.get(headInfo.get(colIndex.get(j))).equals(STRING_DATA_TYPE)) {
- /**
- * like csv line 1,100,'hello',200,300,400, we will read the third
field as 'hello',
- * so, if we add '', the field will be ''hello'', and IoTDB will be
failed
- * to insert the field.
- * Now, if we meet the string which is enclosed in quotation marks, we
should not add ''
- */
- if ((data[colIndex.get(j) + 1].startsWith("'") && data[colIndex.get(j)
+ 1].endsWith("'")) ||
- (data[colIndex.get(j) + 1].startsWith("\"") &&
data[colIndex.get(j) + 1].endsWith("\""))) {
- sbd.append(",").append(data[colIndex.get(j) + 1]);
- } else {
- sbd.append(", \'").append(data[colIndex.get(j) + 1]).append("\'");
- }
- } else {
- sbd.append(",").append(data[colIndex.get(j) + 1]);
- }
- }
- sbd.append(")");
- return sbd.toString();
- }
-
- public static void main(String[] args) throws IOException, SQLException {
+ public static void main(String[] args) throws IOException {
Options options = createOptions();
HelpFormatter hf = new HelpFormatter();
hf.setOptionComparator(null);
@@ -453,24 +218,49 @@ public class ImportCsv extends AbstractCsvTool {
}
}
+ private static long parseTime(String str) {
+
+ SimpleDateFormat format1 = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss");
+    SimpleDateFormat format2 = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
+ SimpleDateFormat format3 = new
SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss.SSSZ");
+
+ try {
+ return Long.parseLong(str);
+ } catch (Exception ignored) {
+ logger.debug("It isn't a long time type");
+ }
+
+ try {
+ return format1.parse(str).getTime();
+ } catch (java.text.ParseException ignored) {
+ logger.debug(TIME_TYPE, format1.toPattern());
+ }
+
+ try {
+ return format2.parse(str).getTime();
+ } catch (java.text.ParseException ignored) {
+ logger.debug(TIME_TYPE, format2.toPattern());
+ }
+
+ try {
+ return format3.parse(str).getTime();
+ } catch (java.text.ParseException ignored) {
+ logger.debug(TIME_TYPE, format3.toPattern());
+ }
+
+    throw new IllegalArgumentException("Input time format " + str + " error. Input a long timestamp or a datetime like yyyy-MM-dd HH:mm:ss, yyyy-MM-dd'T'HH:mm:ss or yyyy-MM-dd'T'HH:mm:ss.SSSZ");
+ }
+
private static void parseSpecialParams(CommandLine commandLine) {
timeZoneID = commandLine.getOptionValue(TIME_ZONE_ARGS);
}
public static void importCsvFromFile(String ip, String port, String username,
String password, String filename,
- String timeZone) throws SQLException {
- String property = System.getProperty(IOTDB_CLI_HOME);
- if (property == null) {
- errorInsertInfo = ERROR_INFO_STR;
- } else {
- errorInsertInfo = property + File.separatorChar + ERROR_INFO_STR;
- }
+ String timeZone) {
try {
- Class.forName(Config.JDBC_DRIVER_NAME);
- connection = (IoTDBConnection)
DriverManager.getConnection(Config.IOTDB_URL_PREFIX
- + ip + ":" + port + "/",
- username, password);
+ session = new Session(ip, Integer.parseInt(port), username, password);
+ session.open(false);
timeZoneID = timeZone;
setTimeZone();
@@ -480,33 +270,33 @@ public class ImportCsv extends AbstractCsvTool {
} else if (file.isDirectory()) {
importFromDirectory(file);
}
-
- } catch (ClassNotFoundException e) {
- System.out.println("Failed to import data because cannot find IoTDB JDBC
Driver, "
- + "please check whether you have imported driver or not: " +
e.getMessage());
- } catch (TException e) {
+ } catch (IoTDBConnectionException e) {
System.out.println("Encounter an error when connecting to server,
because " + e.getMessage());
- } catch (SQLException e){
- System.out.println("Encounter an error when importing data, error is: "
+ e.getMessage());
- } catch (Exception e) {
- System.out.println("Encounter an error, because: " + e.getMessage());
+ } catch (StatementExecutionException e) {
+ System.out
+ .println("Encounter an error when executing the statement, because "
+ e.getMessage());
} finally {
- if (connection != null) {
- connection.close();
+ if (session != null) {
+ try {
+ session.close();
+ } catch (IoTDBConnectionException e) {
+ System.out
+ .println("Encounter an error when closing the connection,
because " + e.getMessage());
+ }
}
}
}
private static void importFromSingleFile(File file) {
if (file.getName().endsWith(FILE_SUFFIX)) {
- loadDataFromCSV(file, 1);
+ loadDataFromCSV(file);
} else {
- System.out.println("File "+ file.getName() +" should ends with '.csv'
if you want to import");
+      System.out
+          .println("File " + file.getName() + " should end with '.csv' if you want to import");
}
}
private static void importFromDirectory(File file) {
- int i = 1;
File[] files = file.listFiles();
if (files == null) {
return;
@@ -515,17 +305,17 @@ public class ImportCsv extends AbstractCsvTool {
for (File subFile : files) {
if (subFile.isFile()) {
if (subFile.getName().endsWith(FILE_SUFFIX)) {
- loadDataFromCSV(subFile, i);
- i++;
+ loadDataFromCSV(subFile);
} else {
- System.out.println("File " + file.getName() + " should ends with
'.csv' if you want to import");
+          System.out
+              .println("File " + subFile.getName() + " should end with '.csv' if you want to import");
}
}
}
}
private static int getFileLineCount(File file) throws IOException {
- int line = 0;
+ int line;
try (LineNumberReader count = new LineNumberReader(new FileReader(file))) {
while (count.skip(Long.MAX_VALUE) > 0) {
// Loop just in case the file is > Long.MAX_VALUE or skip() decides to
not read the entire file
@@ -535,4 +325,84 @@ public class ImportCsv extends AbstractCsvTool {
}
return line;
}
+
+ private static void splitColToDeviceAndMeasurement(String col, Map<String,
List<Integer>> devicesToPositions, Map<String, List<String>>
devicesToMeasurements, int position) {
+ if (col.length() > 0) {
+ if (col.charAt(col.length() - 1) == TsFileConstant.DOUBLE_QUOTE) {
+ int endIndex = col.lastIndexOf('"', col.length() - 2);
+ // if a double quotes with escape character
+ while (endIndex != -1 && col.charAt(endIndex - 1) == '\\') {
+ endIndex = col.lastIndexOf('"', endIndex - 2);
+ }
+ if (endIndex != -1 && (endIndex == 0 || col.charAt(endIndex - 1) ==
'.')) {
+ putDeviceAndMeasurement(col.substring(0, endIndex - 1),
col.substring(endIndex),
+ devicesToPositions, devicesToMeasurements, position);
+ } else {
+ throw new IllegalArgumentException(ILLEGAL_PATH_ARGUMENT);
+ }
+ } else if (col.charAt(col.length() - 1) != TsFileConstant.DOUBLE_QUOTE
+ && col.charAt(col.length() - 1) !=
TsFileConstant.PATH_SEPARATOR_CHAR) {
+ int endIndex = col.lastIndexOf(TsFileConstant.PATH_SEPARATOR_CHAR);
+ if (endIndex < 0) {
+ putDeviceAndMeasurement("", col, devicesToPositions,
devicesToMeasurements, position);
+ } else {
+ putDeviceAndMeasurement(col.substring(0, endIndex),
col.substring(endIndex + 1),
+ devicesToPositions, devicesToMeasurements, position);
+ }
+ } else {
+ throw new IllegalArgumentException(ILLEGAL_PATH_ARGUMENT);
+ }
+ } else {
+ putDeviceAndMeasurement("", col, devicesToPositions,
devicesToMeasurements, position);
+ }
+ }
+
+ private static void putDeviceAndMeasurement(String device, String
measurement, Map<String, List<Integer>> devicesToPositions, Map<String,
List<String>> devicesToMeasurements, int position) {
+ if (devicesToMeasurements.get(device) == null &&
devicesToPositions.get(device) == null) {
+ List<String> measurements = new ArrayList<>();
+ measurements.add(measurement);
+ devicesToMeasurements.put(device, measurements);
+ List<Integer> positions = new ArrayList<>();
+ positions.add(position);
+ devicesToPositions.put(device, positions);
+ } else {
+ devicesToMeasurements.get(device).add(measurement);
+ devicesToPositions.get(device).add(position);
+ }
+ }
+
+ public static String[] splitCsvLine(String path) {
+ List<String> nodes = new ArrayList<>();
+ startIndex = 0;
+ for (i = 0; i < path.length(); i++) {
+ if (path.charAt(i) == ',') {
+ nodes.add(path.substring(startIndex, i));
+ startIndex = i + 1;
+ } else if (path.charAt(i) == '"') {
+ nextNode(path, nodes, '"');
+ } else if (path.charAt(i) == '\'') {
+ nextNode(path, nodes, '\'');
+ }
+ }
+ if (startIndex <= path.length() - 1) {
+ nodes.add(path.substring(startIndex));
+ }
+ return nodes.toArray(new String[0]);
+ }
+
+ public static void nextNode(String path, List<String> nodes, char enclose) {
+ int endIndex = path.indexOf(enclose, i + 1);
+ // if a double quotes with escape character
+ while (endIndex != -1 && path.charAt(endIndex - 1) == '\\') {
+ endIndex = path.indexOf(enclose, endIndex + 1);
+ }
+ if (endIndex != -1 && (endIndex == path.length() - 1 ||
path.charAt(endIndex + 1) == ',')) {
+ nodes.add(path.substring(startIndex + 1, endIndex));
+ i = endIndex + 1;
+ startIndex = endIndex + 2;
+    } else {
+      throw new IllegalArgumentException("Illegal csv line: " + path);
+    }
+ }
+
}
diff --git a/cli/src/test/java/org/apache/iotdb/cli/AbstractScript.java
b/cli/src/test/java/org/apache/iotdb/cli/AbstractScript.java
index 1843c08..c15aa28 100644
--- a/cli/src/test/java/org/apache/iotdb/cli/AbstractScript.java
+++ b/cli/src/test/java/org/apache/iotdb/cli/AbstractScript.java
@@ -19,6 +19,7 @@
package org.apache.iotdb.cli;
import static org.junit.Assert.assertEquals;
+import static org.junit.Assert.assertTrue;
import java.io.*;
import java.util.ArrayList;
@@ -45,7 +46,7 @@ public abstract class AbstractScript {
p.destroy();
for (int i = 0; i < output.length; i++) {
- assertEquals(output[output.length - 1 - i],
outputList.get(outputList.size() - 1 - i));
+ assertTrue(outputList.get(outputList.size() - 1 -
i).startsWith(output[output.length - 1 - i]));
}
}
diff --git a/cli/src/test/java/org/apache/iotdb/tool/ExportCsvTestIT.java
b/cli/src/test/java/org/apache/iotdb/tool/ExportCsvTestIT.java
index 96ea030..2673979 100644
--- a/cli/src/test/java/org/apache/iotdb/tool/ExportCsvTestIT.java
+++ b/cli/src/test/java/org/apache/iotdb/tool/ExportCsvTestIT.java
@@ -28,11 +28,11 @@ import org.junit.Test;
public class ExportCsvTestIT extends AbstractScript{
@Before
- public void setUp() throws Exception {
+ public void setUp() {
}
@After
- public void tearDown() throws Exception {
+ public void tearDown() {
}
@Test
@@ -50,8 +50,8 @@ public class ExportCsvTestIT extends AbstractScript{
final String[] output =
{"````````````````````````````````````````````````",
"Starting IoTDB Client Export Script",
"````````````````````````````````````````````````",
- "Encounter an error when exporting data, error is: Connection Error, "
- + "please check whether the network is available or the server has
started."};
+ "Connect failed because
org.apache.thrift.transport.TTransportException: "
+ + "java.net.ConnectException: Connection refused"};
String dir = getCliPath();
ProcessBuilder builder = new ProcessBuilder("cmd.exe", "/c",
dir + File.separator + "tools" + File.separator + "export-csv.bat",
@@ -64,8 +64,8 @@ public class ExportCsvTestIT extends AbstractScript{
final String[] output = {"------------------------------------------",
"Starting IoTDB Client Export Script",
"------------------------------------------",
- "Encounter an error when exporting data, error is: Connection Error, "
- + "please check whether the network is available or the server has
started."};
+ "Connect failed because
org.apache.thrift.transport.TTransportException: "
+ + "java.net.ConnectException: Connection refused"};
String dir = getCliPath();
ProcessBuilder builder = new ProcessBuilder("sh",
dir + File.separator + "tools" + File.separator + "export-csv.sh",
diff --git a/cli/src/test/java/org/apache/iotdb/tool/ImportCsvTestIT.java
b/cli/src/test/java/org/apache/iotdb/tool/ImportCsvTestIT.java
index 4b3bb26..6f7d2f4 100644
--- a/cli/src/test/java/org/apache/iotdb/tool/ImportCsvTestIT.java
+++ b/cli/src/test/java/org/apache/iotdb/tool/ImportCsvTestIT.java
@@ -28,11 +28,11 @@ import org.junit.Test;
public class ImportCsvTestIT extends AbstractScript {
@Before
- public void setUp() throws Exception {
+ public void setUp() {
}
@After
- public void tearDown() throws Exception {
+ public void tearDown() {
}
@Test
@@ -50,8 +50,8 @@ public class ImportCsvTestIT extends AbstractScript {
final String[] output =
{"````````````````````````````````````````````````",
"Starting IoTDB Client Import Script",
"````````````````````````````````````````````````",
- "Encounter an error when importing data, error is: Connection Error,
please check whether "
- + "the network is available or the server has started."};
+ "Encounter an error when connecting to server, because
org.apache.thrift.transport.TTransportException: "
+ + "java.net.ConnectException: Connection refused"};
String dir = getCliPath();
ProcessBuilder builder = new ProcessBuilder("cmd.exe", "/c",
dir + File.separator + "tools" + File.separator + "import-csv.bat",
@@ -64,8 +64,8 @@ public class ImportCsvTestIT extends AbstractScript {
final String[] output = {"------------------------------------------",
"Starting IoTDB Client Import Script",
"------------------------------------------",
- "Encounter an error when importing data, error is: Connection Error,
please check whether "
- + "the network is available or the server has started."};
+ "Encounter an error when connecting to server, because
org.apache.thrift.transport.TTransportException: "
+ + "java.net.ConnectException: Connection refused"};
String dir = getCliPath();
ProcessBuilder builder = new ProcessBuilder("sh",
dir + File.separator + "tools" + File.separator + "import-csv.sh",
diff --git a/docs/UserGuide/System Tools/CSV Tool.md b/docs/UserGuide/System
Tools/CSV Tool.md
new file mode 100644
index 0000000..29fc167
--- /dev/null
+++ b/docs/UserGuide/System Tools/CSV Tool.md
@@ -0,0 +1,85 @@
+<!--
+
+ Licensed to the Apache Software Foundation (ASF) under one
+ or more contributor license agreements. See the NOTICE file
+ distributed with this work for additional information
+ regarding copyright ownership. The ASF licenses this file
+ to you under the Apache License, Version 2.0 (the
+ "License"); you may not use this file except in compliance
+ with the License. You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing,
+ software distributed under the License is distributed on an
+ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ KIND, either express or implied. See the License for the
+ specific language governing permissions and limitations
+ under the License.
+
+-->
+# CSV Tool
+
+The CSV tool lets you import CSV files into IoTDB and export data from IoTDB to CSV files.
+
+## Usage of import-csv.sh
+
+### Create metadata (optional)
+
+```
+SET STORAGE GROUP TO root.fit.d1;
+SET STORAGE GROUP TO root.fit.d2;
+SET STORAGE GROUP TO root.fit.p;
+CREATE TIMESERIES root.fit.d1.s1 WITH DATATYPE=INT32,ENCODING=RLE;
+CREATE TIMESERIES root.fit.d1.s2 WITH DATATYPE=TEXT,ENCODING=PLAIN;
+CREATE TIMESERIES root.fit.d2.s1 WITH DATATYPE=INT32,ENCODING=RLE;
+CREATE TIMESERIES root.fit.d2.s3 WITH DATATYPE=INT32,ENCODING=RLE;
+CREATE TIMESERIES root.fit.p.s1 WITH DATATYPE=INT32,ENCODING=RLE;
+```
+
+IoTDB has the ability of type inference, so it is not necessary to create
metadata before data import. However, we still recommend creating metadata
before importing data using the CSV import tool, as this can avoid unnecessary
type conversion errors.
+
+### An example CSV file for import
+
+```
+Time,root.fit.d1.s1,root.fit.d1.s2,root.fit.d2.s1,root.fit.d2.s3,root.fit.p.s1
+1,100,hello,200,300,400
+2,500,world,600,700,800
+3,900,"hello, \"world\"",1000,1100,1200
+```
+> Note that the following special characters in fields need to be checked
before importing:
+> 1. `,` : fields containing `,` should be quoted by a pair of `"` or a pair
of `'`.
+> 2. `"` : `"` in fields should be replaced by `\"` or fields should be
enclosed by `'`.
+> 3. `'` : `'` in fields should be replaced by `\'` or fields should be
enclosed by `"`.
+
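The quoting rules above can also be applied programmatically when preparing a CSV file for import. The following is a minimal sketch, not part of the tool itself (the class name `CsvEscape` is illustrative): it wraps a field in double quotes and escapes embedded quotes as `\"` when the field contains `,` or `"`.

```java
public class CsvEscape {

  // Quote a field per the rules above: if it contains a comma or a double
  // quote, wrap it in double quotes and escape embedded quotes as \".
  static String escapeField(String field) {
    if (field.contains(",") || field.contains("\"")) {
      return "\"" + field.replace("\"", "\\\"") + "\"";
    }
    return field;
  }

  public static void main(String[] args) {
    // A field with both special characters, as in line 3 of the example CSV
    System.out.println(escapeField("hello, \"world\""));
  }
}
```

Plain fields pass through unchanged, so the transformation is safe to apply to every column before writing a row.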
+### Run import shell
+```
+# Unix/OS X
+> tools/import-csv.sh -h <ip> -p <port> -u <username> -pw <password> -f
<xxx.csv>
+
+# Windows
+> tools\import-csv.bat -h <ip> -p <port> -u <username> -pw <password> -f
<xxx.csv>
+```
+
+## Usage of export-csv.sh
+
+### Run export shell
+```
+# Unix/OS X
+> tools/export-csv.sh -h <ip> -p <port> -u <username> -pw <password> -td
<directory> [-tf <time-format> -s <sqlfile>]
+
+# Windows
+> tools\export-csv.bat -h <ip> -p <port> -u <username> -pw <password> -td
<directory> [-tf <time-format> -s <sqlfile>]
+```
+
+After running the export shell, you need to input a query, like `select * from root`, or specify a SQL file. If a SQL file contains multiple SQL statements, they should be separated by newlines.
+
+An example SQL file:
+
+```
+select * from root.fit.d1
+select * from root.sg1.d1
+```
+> Note that if fields exported by the export tool have the following special
characters:
+> 1. `,`: the field will be enclosed by `"`.
+> 2. `"`: the field will be enclosed by `"` and the original characters `"` in
the field will be replaced by `\"`.
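Conversely, a consumer of the exported CSV has to undo that quoting. Below is a minimal quote-aware splitter; this is an illustrative sketch under the escaping rules described above, not the tool's actual parser, and the class name `CsvLineSplit` is assumed.

```java
import java.util.ArrayList;
import java.util.List;

public class CsvLineSplit {

  // Split one CSV line, treating `"..."` as a quoted field and `\"`
  // inside it as an escaped double quote.
  static List<String> split(String line) {
    List<String> fields = new ArrayList<>();
    StringBuilder cur = new StringBuilder();
    boolean inQuotes = false;
    for (int i = 0; i < line.length(); i++) {
      char c = line.charAt(i);
      if (c == '\\' && i + 1 < line.length() && line.charAt(i + 1) == '"') {
        cur.append('"'); // unescape \" back to a literal quote
        i++;
      } else if (c == '"') {
        inQuotes = !inQuotes; // toggle quoted state, drop the delimiter
      } else if (c == ',' && !inQuotes) {
        fields.add(cur.toString()); // field boundary outside quotes
        cur.setLength(0);
      } else {
        cur.append(c);
      }
    }
    fields.add(cur.toString());
    return fields;
  }

  public static void main(String[] args) {
    // Line 3 of the example CSV from the import section
    System.out.println(split("3,900,\"hello, \\\"world\\\"\",1000"));
  }
}
```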
diff --git a/docs/zh/UserGuide/System Tools/CSV Tool.md
b/docs/zh/UserGuide/System Tools/CSV Tool.md
new file mode 100644
index 0000000..2817c3f
--- /dev/null
+++ b/docs/zh/UserGuide/System Tools/CSV Tool.md
@@ -0,0 +1,88 @@
+<!--
+
+ Licensed to the Apache Software Foundation (ASF) under one
+ or more contributor license agreements. See the NOTICE file
+ distributed with this work for additional information
+ regarding copyright ownership. The ASF licenses this file
+ to you under the Apache License, Version 2.0 (the
+ "License"); you may not use this file except in compliance
+ with the License. You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing,
+ software distributed under the License is distributed on an
+ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ KIND, either express or implied. See the License for the
+ specific language governing permissions and limitations
+ under the License.
+
+-->
+
+# CSV 工具
+
+CSV 工具用于将 CSV 文件导入 IoTDB,或从 IoTDB 导出数据为 CSV 文件。
+
+## 使用 import-csv.sh
+
+### 创建元数据(可选)
+
+```
+SET STORAGE GROUP TO root.fit.d1;
+SET STORAGE GROUP TO root.fit.d2;
+SET STORAGE GROUP TO root.fit.p;
+CREATE TIMESERIES root.fit.d1.s1 WITH DATATYPE=INT32,ENCODING=RLE;
+CREATE TIMESERIES root.fit.d1.s2 WITH DATATYPE=TEXT,ENCODING=PLAIN;
+CREATE TIMESERIES root.fit.d2.s1 WITH DATATYPE=INT32,ENCODING=RLE;
+CREATE TIMESERIES root.fit.d2.s3 WITH DATATYPE=INT32,ENCODING=RLE;
+CREATE TIMESERIES root.fit.p.s1 WITH DATATYPE=INT32,ENCODING=RLE;
+```
+IoTDB具有类型推断的能力,因此在数据导入前创建元数据不是必须的。但我们仍然推荐在使用CSV导入工具导入数据前创建元数据,因为这可以避免不必要的类型转换错误。
+
+### 从 csv 文件导入数据的示例
+
+```
+Time,root.fit.d1.s1,root.fit.d1.s2,root.fit.d2.s1,root.fit.d2.s3,root.fit.p.s1
+1,100,hello,200,300,400
+2,500,world,600,700,800
+3,900,"hello, \"world\"",1000,1100,1200
+```
+
+> 注意,在导入数据前,需要特殊处理下列的字符:
+> 1. `,` : 包含`,`的字段需要使用单引号或者双引号括起来
+> 2. `"` : 字段中的`"`需要被替换成转义字符`\"`,或者用`'`将字段括起来。
+> 3. `'` : 字段中的`'`需要被替换成转义字符`\'`,或者用`"`将字段括起来。
+
+### 运行 import shell
+```
+# Unix/OS X
+> tools/import-csv.sh -h <ip> -p <port> -u <username> -pw <password> -f
<xxx.csv>
+
+# Windows
+> tools\import-csv.bat -h <ip> -p <port> -u <username> -pw <password> -f
<xxx.csv>
+```
+
+## 使用 export-csv.sh
+
+### 运行 export shell
+
+```
+# Unix/OS X
+> tools/export-csv.sh -h <ip> -p <port> -u <username> -pw <password> -td
<directory> [-tf <time-format> -s <sqlfile>]
+
+# Windows
+> tools\export-csv.bat -h <ip> -p <port> -u <username> -pw <password> -td
<directory> [-tf <time-format> -s <sqlfile>]
+```
+
+在运行导出脚本之后,您需要输入一个查询(如 `select * from root`)或指定一个 SQL 文件。如果一个 SQL 文件中包含多条 SQL 语句,语句之间应以换行符分隔。
+
+一个 SQL 文件示例:
+
+```
+select * from root.fit.d1
+select * from root.sg1.d1
+```
+
+> 注意,如果导出字段存在如下特殊字符:
+> 1. `,` : 整个字段会被用`"`括起来。
+> 2. `"` : 整个字段会被用`"`括起来且`"`会被替换为`\"`。
\ No newline at end of file
diff --git a/service-rpc/src/main/java/org/apache/iotdb/rpc/RpcUtils.java
b/service-rpc/src/main/java/org/apache/iotdb/rpc/RpcUtils.java
index 2bd09c5..2ab1bd9 100644
--- a/service-rpc/src/main/java/org/apache/iotdb/rpc/RpcUtils.java
+++ b/service-rpc/src/main/java/org/apache/iotdb/rpc/RpcUtils.java
@@ -125,5 +125,4 @@ public class RpcUtils {
resp.setStatus(tsStatus);
return resp;
}
-
}