Hello everyone,
I have been running some experiments on the current AsterixDB master with
the default merge policy. I am using simple OpenStreetMap and Twitter
datasets, with id as the primary key and an R-tree index on the location
attribute. I then used a feed to ingest the data into the database. After
running for some time (about 2 million insertions), the R-tree index
(LSMRTreeWithAntiMatterTuples) throws a strange "file already exists"
exception. After the first occurrence, the cluster keeps throwing this
exception repeatedly and becomes unusable:
20:36:55.980 [Executor-24:asterix_nc1] ERROR
org.apache.hyracks.storage.am.lsm.common.impls.LSMHarness - Failed merge
operation on {"class" : "LSMRTreeWithAntiMatterTuples", "dir" :
"/home/mohiuddin/asterix-hyracks/asterixdb/target/io/dir/asterix_nc1/target/tmp/asterix_nc1/iodevice1/storage/partition_0/experiments/OpenStreetMap/0/OSMlocation",
"memory" : 2, "disk" : 5}
org.apache.hyracks.api.exceptions.HyracksDataException: HYR0082: Failed to
create the file
/home/mohiuddin/asterix-hyracks/asterixdb/target/io/dir/asterix_nc1/target/tmp/asterix_nc1/iodevice1/storage/partition_0/experiments/OpenStreetMap/0/OSMlocation/2018-07-14-20-36-09-733_2018-07-14-20-34-55-555
because it already exists
at
org.apache.hyracks.api.exceptions.HyracksDataException.create(HyracksDataException.java:55)
~[classes/:?]
at org.apache.hyracks.api.util.IoUtil.create(IoUtil.java:87)
~[classes/:?]
at
org.apache.hyracks.storage.common.buffercache.BufferCache.createFile(BufferCache.java:809)
~[classes/:?]
at
org.apache.hyracks.storage.am.common.impls.AbstractTreeIndex.create(AbstractTreeIndex.java:83)
~[classes/:?]
at
org.apache.hyracks.storage.am.lsm.common.impls.AbstractLSMDiskComponent.activate(AbstractLSMDiskComponent.java:158)
~[classes/:?]
at
org.apache.hyracks.storage.am.lsm.common.impls.AbstractLSMIndex.createDiskComponent(AbstractLSMIndex.java:427)
~[classes/:?]
at
org.apache.hyracks.storage.am.lsm.rtree.impls.LSMRTreeWithAntiMatterTuples.doMerge(LSMRTreeWithAntiMatterTuples.java:237)
~[classes/:?]
at
org.apache.hyracks.storage.am.lsm.common.impls.AbstractLSMIndex.merge(AbstractLSMIndex.java:728)
~[classes/:?]
at
org.apache.hyracks.storage.am.lsm.common.impls.LSMHarness.merge(LSMHarness.java:645)
[classes/:?]
at
org.apache.hyracks.storage.am.lsm.common.impls.LSMTreeIndexAccessor.merge(LSMTreeIndexAccessor.java:128)
[classes/:?]
at
org.apache.hyracks.storage.am.lsm.common.impls.MergeOperation.call(MergeOperation.java:45)
[classes/:?]
at
org.apache.hyracks.storage.am.lsm.common.impls.MergeOperation.call(MergeOperation.java:30)
[classes/:?]
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[?:1.8.0_45-internal]
at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
[?:1.8.0_45-internal]
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
[?:1.8.0_45-internal]
at java.lang.Thread.run(Thread.java:745) [?:1.8.0_45-internal]
I have tried reinstalling everything multiple times, removing all old
storage files, and pausing before and during feed ingestion. Nothing seems
to work; the exception occurs every time after some amount of ingestion.
Does anyone have an idea of what is happening? I have attached the DDL I
was using.
--
Regards,
Mohiuddin Abdul Qader
Dept of Computer Science
University of California Riverside
drop dataverse experiments if exists;
create dataverse experiments;
use experiments;
create type OpenStreetMapType as closed {
    id: int64,
    location: point,
    body: string
};
create dataset OpenStreetMap(OpenStreetMapType) primary key id;
create index OSMlocation on OpenStreetMap(location) type rtree;
use experiments;
create feed OSMFeed with
{
    "adapter-name" : "socket_adapter",
    "sockets" : "127.0.0.1:10001",
    "address-type" : "IP",
    "type-name" : "OpenStreetMapType",
    "format" : "adm"
};
connect feed OSMFeed to dataset OpenStreetMap;
start feed OSMFeed;
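For reference, the ingestion client I drive the socket feed with looks roughly like the sketch below. It is a minimal, illustrative version, not my exact script: the record values are made up, there is no quote escaping in the body field, and the host/port simply mirror the "sockets" parameter in the feed DDL above.

```python
# Minimal sketch of a socket-feed ingestion client for the DDL above.
# Records follow OpenStreetMapType: id (int64), location (point), body (string).
import socket


def adm_record(rec_id, lon, lat, body):
    """Format one record as an ADM object literal (no escaping; illustrative only)."""
    return ('{"id": %d, "location": point("%f,%f"), "body": "%s"}\n'
            % (rec_id, lon, lat, body))


def ingest(records, host="127.0.0.1", port=10001):
    """Stream ADM records into the running socket feed."""
    with socket.create_connection((host, port)) as sock:
        for rec in records:
            sock.sendall(rec.encode("utf-8"))


# Usage (with the feed started and listening on 127.0.0.1:10001):
#   ingest(adm_record(i, -117.3, 33.9, "osm node") for i in range(2_000_000))
```

The exception shows up regardless of how fast this client pushes records, which is why I also tried inserting pauses during ingestion.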