I tried to import ~10,000,000 records from ~2,000
files via the Java DB Loader API:
Demo:

    ...
    loader = new Loader(prop);
    String command = "USE USER TEST TEST";
    loader.cmd(command);
    loader.cmd("AUTOCOMMIT OFF");
    File directory = new File(inputDirectory);
    File[] files = directory.listFiles();
    for (int i = 0; i < files.length; i++) {
        String fileName = files[i].getAbsolutePath();
        command = "DATALOAD TABLE test INFILE '" + fileName + "'";
        loader.cmd(command);
        loader.cmd("COMMIT");
    } // for
    ...

Result:
    java.lang.StringIndexOutOfBoundsException: String index out of range: -4
        at java.lang.String.checkBounds(String.java:292)
        at java.lang.String.<init>(String.java:330)
        at com.sap.dbtech.util.StructuredBytes.getString(StructuredBytes.java:225)
        at com.sap.dbtech.powertoys.LoaderException.create(LoaderException.java:72)
        at com.sap.dbtech.powertoys.Loader.cmd(Loader.java:126)
        at Import.main(Import.java:103)

Has somebody tried to use SAP DB with such an amount of data? How do you handle a "large" amount? At least, how do you load it?
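For reference, here is a self-contained sketch of the command-building part of my loop, with a null check on listFiles() (it returns null when the path is not a readable directory). The buildCommands helper is my own name, and the real com.sap.dbtech.powertoys.Loader is left out; this only constructs the DATALOAD strings that loader.cmd(...) would receive:

```java
import java.io.File;
import java.util.ArrayList;
import java.util.List;

public class DataLoadCommands {
    // Build one DATALOAD command per file in the input directory,
    // mirroring the loop in the post. The real Loader would execute
    // each string via loader.cmd(command) followed by loader.cmd("COMMIT").
    static List<String> buildCommands(File directory) {
        List<String> commands = new ArrayList<String>();
        File[] files = directory.listFiles();
        if (files == null) {
            return commands; // not a directory, or an I/O error occurred
        }
        for (File f : files) {
            commands.add("DATALOAD TABLE test INFILE '" + f.getAbsolutePath() + "'");
        }
        return commands;
    }

    public static void main(String[] args) throws Exception {
        // Demo: create a scratch directory with one dummy input file.
        File tmp = new File(System.getProperty("java.io.tmpdir"), "dataload-demo");
        tmp.mkdirs();
        new File(tmp, "part1.dat").createNewFile();

        for (String cmd : buildCommands(tmp)) {
            System.out.println(cmd);
        }
    }
}
```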
/Pranas
P.S. There is enough space for the data.
