Hi community,
I have tried to import data into an ArangoDB 3 cluster on DC/OS using
https://github.com/arangodb-helper/pokec-import. When I import
relations.tsv I run into the following errors:
1. When I use the default --batch-size setting, i.e.

arangoimp --server.endpoint tcp://10.0.0.140:1027 --server.database soc-pokec \
  --from-collection-prefix profiles --to-collection-prefix profiles \
  --file relations.tsv --type tsv --collection "relations" \
  --create-collection true --create-collection-type edge

I get the following error:
Starting TSV import...
2017-02-23T06:17:09Z [370] INFO processed 31096832 bytes (3%) of input file
2017-02-23T06:17:09Z [370] INFO processed 62193664 bytes (6%) of input file
2017-02-23T06:17:10Z [370] INFO processed 93290496 bytes (9%) of input file
2017-02-23T06:17:10Z [370] INFO processed 124387328 bytes (12%) of input file
2017-02-23T06:17:11Z [370] INFO processed 155484160 bytes (15%) of input file
2017-02-23T06:17:11Z [370] INFO processed 186580992 bytes (18%) of input file
2017-02-23T06:17:12Z [370] INFO processed 217645056 bytes (21%) of input file
2017-02-23T06:17:12Z [370] INFO processed 248741888 bytes (24%) of input file
2017-02-23T06:17:13Z [370] INFO processed 279838720 bytes (27%) of input file
2017-02-23T06:17:14Z [370] INFO processed 310935552 bytes (30%) of input file
2017-02-23T06:17:15Z [370] INFO processed 342032384 bytes (33%) of input file
2017-02-23T06:17:15Z [370] INFO processed 373129216 bytes (36%) of input file
2017-02-23T06:17:15Z [370] INFO processed 404193280 bytes (39%) of input file
2017-02-23T06:20:42Z [370] ERROR error message: timeout in cluster operation
2. When I set --batch-size to 512 MB or more, the progress output runs all
the way through, as if the data had been imported:

arangoimp --server.endpoint tcp://10.0.0.140:1027 --server.database soc-pokec \
  --batch-size 536870912 \
  --from-collection-prefix profiles --to-collection-prefix profiles \
  --file relations.tsv --type tsv --collection "relations" \
  --create-collection true --create-collection-type edge
Starting TSV import...
2017-02-23T06:04:49Z [368] INFO processed 31096832 bytes (3%) of input file
2017-02-23T06:04:49Z [368] INFO processed 62193664 bytes (6%) of input file
2017-02-23T06:04:50Z [368] INFO processed 93290496 bytes (9%) of input file
2017-02-23T06:04:50Z [368] INFO processed 124387328 bytes (12%) of input file
2017-02-23T06:04:51Z [368] INFO processed 155484160 bytes (15%) of input file
2017-02-23T06:04:51Z [368] INFO processed 186580992 bytes (18%) of input file
2017-02-23T06:04:52Z [368] INFO processed 217645056 bytes (21%) of input file
2017-02-23T06:04:52Z [368] INFO processed 248741888 bytes (24%) of input file
2017-02-23T06:04:53Z [368] INFO processed 279838720 bytes (27%) of input file
2017-02-23T06:04:54Z [368] INFO processed 310935552 bytes (30%) of input file
2017-02-23T06:04:55Z [368] INFO processed 342032384 bytes (33%) of input file
2017-02-23T06:04:55Z [368] INFO processed 373129216 bytes (36%) of input file
2017-02-23T06:04:56Z [368] INFO processed 404193280 bytes (39%) of input file
2017-02-23T06:08:20Z [368] INFO processed 435290112 bytes (42%) of input file
2017-02-23T06:08:20Z [368] INFO processed 466386944 bytes (45%) of input file
2017-02-23T06:08:21Z [368] INFO processed 497483776 bytes (48%) of input file
2017-02-23T06:08:21Z [368] INFO processed 528580608 bytes (51%) of input file
2017-02-23T06:08:21Z [368] INFO processed 559677440 bytes (54%) of input file
2017-02-23T06:08:22Z [368] INFO processed 590774272 bytes (57%) of input file
2017-02-23T06:08:22Z [368] INFO processed 621838336 bytes (60%) of input file
2017-02-23T06:08:23Z [368] INFO processed 652935168 bytes (63%) of input file
2017-02-23T06:08:23Z [368] INFO processed 684032000 bytes (66%) of input file
2017-02-23T06:08:24Z [368] INFO processed 715128832 bytes (69%) of input file
2017-02-23T06:08:24Z [368] INFO processed 746225664 bytes (72%) of input file
2017-02-23T06:08:24Z [368] INFO processed 777322496 bytes (75%) of input file
2017-02-23T06:08:25Z [368] INFO processed 808386560 bytes (78%) of input file
2017-02-23T06:08:25Z [368] INFO processed 839483392 bytes (81%) of input file
2017-02-23T06:08:26Z [368] INFO processed 870580224 bytes (84%) of input file
2017-02-23T06:08:26Z [368] INFO processed 901677056 bytes (87%) of input file
2017-02-23T06:08:27Z [368] INFO processed 932773888 bytes (90%) of input file
2017-02-23T06:08:27Z [368] INFO processed 963870720 bytes (93%) of input file
2017-02-23T06:08:27Z [368] INFO processed 994967552 bytes (96%) of input file
2017-02-23T06:08:28Z [368] INFO processed 1026031616 bytes (99%) of input file
created: 0
warnings/errors: 0
updated/replaced: 0
ignored: 0
lines read: 30622565
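
Since it reports created: 0 even though every line was read, a quick way to
double-check what actually landed in the edge collection is to count it from
arangosh (a minimal check, using the same endpoint and database as the import
above):

arangosh --server.endpoint tcp://10.0.0.140:1027 --server.database soc-pokec \
  --javascript.execute-string "print(db._collection('relations').count())"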
However, when I open the arangodb3 service UI, I find that one of the
DBServers is suspended and unhealthy. Its log looks as follows (note that I
had already imported the profiles documents before this):
2017-02-23T03:55:44Z [1] WARNING {heartbeat} DBServerAgencySync::execute took longer than 30s to execute handlePlanChange()
2017-02-23T03:55:45Z [1] ERROR synchronizeOneShard: long call to syncCollection for shard s100018 {"barrierId":"19462442","lastLogTick":"19462214","collections":[{"id":"1100006","name":"s100018"}]} start time: Thu Feb 23 2017 03:55:39 GMT+0000 (UTC) end time: Thu Feb 23 2017 03:55:45 GMT+0000 (UTC)
2017-02-23T03:55:45Z [1] INFO {replication} connected to master at tcp://10.2.0.146:1026, id 27822372102983, version 3.1, last log tick 19462487
2017-02-23T03:56:47Z [1] WARNING {heartbeat} DBServerAgencySync::execute took longer than 30s to execute handlePlanChange()
2017-02-23T03:57:32Z [1] WARNING {heartbeat} DBServerAgencySync::execute took longer than 30s to execute handlePlanChange()
2017-02-23T04:02:18Z [1] INFO {replication} connected to master at tcp://10.2.0.146:1026, id 27822372102983, version 3.1, last log tick 35157072
2017-02-23T04:02:28Z [1] ERROR synchronizeOneShard: long call to syncCollection for shard s100018 {"barrierId":"35199249","lastLogTick":"35157072","collections":[{"id":"1100006","name":"s100018"}]} start time: Thu Feb 23 2017 04:02:17 GMT+0000 (UTC) end time: Thu Feb 23 2017 04:02:28 GMT+0000 (UTC)
2017-02-23T04:02:28Z [1] ERROR syncCollectionFinalize: remove { "tick" : "35157078", "type" : 2302, "tid" : "0", "database" : "1", "cid" : "1100006", "cname" : "s100018", "data" : { "_key" : "14430444", "_rev" : "_UkigPLu--C" } } {"errorNum":1202,"errorMessage":"document not found"}
2017-02-23T04:02:28Z [1] ERROR syncCollectionFinalize: remove { "tick" : "35157081", "type" : 2302, "tid" : "0", "database" : "1", "cid" : "1100006", "cname" : "s100018", "data" : { "_key" : "119440", "_rev" : "_UkigPli--A" } } {"errorNum":1202,"errorMessage":"document not found"}
2017-02-23T04:02:28Z [1] ERROR syncCollectionFinalize: remove { "tick" : "35157083", "type" : 2302, "tid" : "0", "database" : "1", "cid" : "1100006", "cname" : "s100018", "data" : { "_key" : "14430446", "_rev" : "_UkigPli--C" } } {"errorNum":1202,"errorMessage":"document not found"}
2017-02-23T04:02:28Z [1] ERROR syncCollectionFinalize: remove { "tick" : "35157085", "type" : 2302, "tid" : "0", "database" : "1", "cid" : "1100006", "cname" : "s100018", "data" : { "_key" : "119442", "_rev" : "_UkigPlm--A" } } {"errorNum":1202,"errorMessage":"document not found"}
.....
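
Side note: the same log output can presumably also be tailed from the DC/OS
CLI instead of the service UI, along these lines (the task name is a
placeholder for whatever the DBServer task is called in the cluster):

dcos task log --lines=50 <dbserver-task>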
Finally, the arangodb3 cluster configuration is:
2 x Coordinator, each with 1 CPU and 4 GB memory
2 x DBServer, each with 1 CPU and 4 GB memory
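
One more detail, in case it matters: I let arangoimp create the "relations"
edge collection itself. Pre-creating it from arangosh with explicit sharding
would look roughly like this (just a sketch I have not actually run; the shard
count of 2 is only a guess to match the two DBServers):

arangosh --server.endpoint tcp://10.0.0.140:1027 --server.database soc-pokec \
  --javascript.execute-string "db._createEdgeCollection('relations', { numberOfShards: 2 })"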
I hope you can help. Thanks!