ZhongDeyin created SQOOP-1315:
---------------------------------
Summary: Cannot export data from HDFS to MySQL when record count is
rowsPerBatch*N
Key: SQOOP-1315
URL: https://issues.apache.org/jira/browse/SQOOP-1315
Project: Sqoop
Issue Type: Bug
Components: sqoop2-server
Affects Versions: 1.99.3
Reporter: ZhongDeyin
When the number of records in HDFS is rowsPerBatch*N, the data cannot be
exported to MySQL (default rowsPerBatch = 100, N is a positive integer).
Source Code [GenericJdbcExportLoader.java]:
public static final int DEFAULT_ROWS_PER_BATCH = 100;
public static final int DEFAULT_BATCHES_PER_TRANSACTION = 100;
private int rowsPerBatch = DEFAULT_ROWS_PER_BATCH;
private int batchesPerTransaction = DEFAULT_BATCHES_PER_TRANSACTION;
..................................
..................................
int numberOfRows = 0;
int numberOfBatches = 0;
Object[] array;
while ((array = context.getDataReader().readArrayRecord()) != null) {
  numberOfRows++;
  executor.addBatch(array);
  if (numberOfRows == rowsPerBatch) {
    numberOfBatches++;
    if (numberOfBatches == batchesPerTransaction) {
      executor.executeBatch(true);
      numberOfBatches = 0;
    } else {
      executor.executeBatch(false); // no commit, only prepare preparedStatement
    }
    numberOfRows = 0;
  }
}
if (numberOfRows != 0) {
  // execute and commit the remaining rows
  executor.executeBatch(true);
}
executor.endBatch();
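A minimal sketch of one possible fix (not a committed patch): also trigger the
final commit when numberOfBatches is non-zero, i.e. when earlier
executeBatch(false) calls left an open transaction. This assumes the executor
tolerates executeBatch(true) with no rows queued, in which case it would only
issue the commit:

  // Sketch: commit when either leftover rows or uncommitted batches remain.
  if (numberOfRows != 0 || numberOfBatches != 0) {
    // executes a (possibly empty) batch and commits the open transaction
    executor.executeBatch(true);
  }
  executor.endBatch();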
Source Code [GenericJdbcExecutor.java]:
public void endBatch() {
  try {
    if (preparedStatement != null) {
      preparedStatement.close();
    }
  } catch (SQLException e) {
    throw new SqoopException(GenericJdbcConnectorError.GENERIC_JDBC_CONNECTOR_0002, e);
  }
}
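Alternatively (again only a sketch, assuming auto-commit is disabled on the
export connection), endBatch() itself could commit any transaction that
executeBatch(false) left open before closing the statement;
java.sql.Statement.getConnection() exposes the underlying connection without
assuming a field name:

  public void endBatch() {
    try {
      if (preparedStatement != null) {
        // commit whatever earlier executeBatch(false) calls left open
        preparedStatement.getConnection().commit();
        preparedStatement.close();
      }
    } catch (SQLException e) {
      throw new SqoopException(GenericJdbcConnectorError.GENERIC_JDBC_CONNECTOR_0002, e);
    }
  }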
For example: with 300 records in HDFS and rowsPerBatch = 100, 300 % 100 == 0,
so executor.executeBatch(false) is called three times and the transaction is
never committed. When the loop ends, numberOfRows == 0 and numberOfBatches == 3,
so the final executor.executeBatch(true) is skipped; executor.endBatch() is then
called, but that method does not commit either, so no data is exported to MySQL.
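To make the control flow concrete, here is a small self-contained simulation;
CountingExecutor is a hypothetical stand-in for GenericJdbcExecutor that only
counts commits. Run with 300 records it prints commits = 0, matching the
reported behavior:

  public class BatchCommitDemo {
    static class CountingExecutor {
      int commits = 0;
      void addBatch(Object[] row) { /* queue the row; no-op in the demo */ }
      void executeBatch(boolean commit) { if (commit) commits++; }
      void endBatch() { /* closes the statement; no commit, as in the bug */ }
    }

    public static void main(String[] args) {
      final int rowsPerBatch = 100;
      final int batchesPerTransaction = 100;
      CountingExecutor executor = new CountingExecutor();
      int numberOfRows = 0;
      int numberOfBatches = 0;
      // 300 records = rowsPerBatch * 3, the failing case from the report
      for (int record = 0; record < 300; record++) {
        numberOfRows++;
        executor.addBatch(new Object[0]);
        if (numberOfRows == rowsPerBatch) {
          numberOfBatches++;
          if (numberOfBatches == batchesPerTransaction) {
            executor.executeBatch(true);
            numberOfBatches = 0;
          } else {
            executor.executeBatch(false);
          }
          numberOfRows = 0;
        }
      }
      if (numberOfRows != 0) {
        executor.executeBatch(true);
      }
      executor.endBatch();
      System.out.println("commits = " + executor.commits); // prints: commits = 0
    }
  }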