Thanks for your reply.

1. Yes, I have persistence enabled.
2. I don't think the cache store is the bottleneck, because skipStore is enabled when loading data:

IgniteDataStreamer<DpKey, BinaryObject> streamer = ignite.dataStreamer(IgniteCacheKey.DATA_POINT_NEW.getCode());
streamer.skipStore(true);
streamer.keepBinary(true);
streamer.perNodeBufferSize(10000);
streamer.perNodeParallelOperations(32);
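For completeness, this is roughly how the streamer is driven during the load (a minimal sketch, not the actual loading code; DpKey, IgniteCacheKey and the record map are simplified placeholders from my own model classes):

import java.util.Map;
import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteDataStreamer;
import org.apache.ignite.binary.BinaryObject;

public class DataPointLoader {
    /** Streams pre-built binary records into the data-point cache, bypassing the cache store. */
    public static void load(Ignite ignite, Map<DpKey, BinaryObject> records) {
        try (IgniteDataStreamer<DpKey, BinaryObject> streamer =
                 ignite.dataStreamer(IgniteCacheKey.DATA_POINT_NEW.getCode())) {
            streamer.skipStore(true);               // bypass the 3rd-party store during the bulk load
            streamer.keepBinary(true);              // keep values in binary form, no deserialization
            streamer.perNodeBufferSize(10000);
            streamer.perNodeParallelOperations(32);

            for (Map.Entry<DpKey, BinaryObject> e : records.entrySet())
                streamer.addData(e.getKey(), e.getValue());

            streamer.flush();                       // close() at the end of try-with-resources also flushes
        }
    }
}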
------------------ Original Message ------------------
From: "Ilya Kasnacheev" <[email protected]>
Sent: Wednesday, February 27, 2019, 9:59 PM
To: "user" <[email protected]>
Subject: Re: Ignite Data Streamer Hung after a period

Hello!

It's hard to say. Do you have persistence? Are you sure that the cache store is not the bottleneck?

I would start with gathering thread dumps from the whole cluster when it is in the stuck state.

Regards,
--
Ilya Kasnacheev


On Wed, 27 Feb 2019 at 15:06, Justin Ji <[email protected]> wrote:

Dmitry -

I also encountered this problem. I use both persistence and indexing. After I had loaded 20 million records, the loading speed became much slower than before, but the CPU usage on the Ignite servers stayed low.

<http://apache-ignite-users.70518.x6.nabble.com/file/t2000/WX20190227-200059.png>

Here is my cache configuration:

CacheConfiguration<K, V> cacheCfg = new CacheConfiguration();
cacheCfg.setName(cacheName);
cacheCfg.setCacheMode(CacheMode.PARTITIONED);
cacheCfg.setBackups(1);
cacheCfg.setAtomicityMode(CacheAtomicityMode.ATOMIC);
cacheCfg.setCacheStoreFactory(FactoryBuilder.factoryOf(DataPointCacheStore.class));
cacheCfg.setWriteThrough(true);
cacheCfg.setWriteBehindEnabled(true);
cacheCfg.setWriteBehindFlushThreadCount(2);
cacheCfg.setWriteBehindFlushFrequency(15 * 1000);
cacheCfg.setWriteBehindFlushSize(409600);
cacheCfg.setWriteBehindBatchSize(1024);
cacheCfg.setStoreKeepBinary(true);
cacheCfg.setQueryParallelism(16);
cacheCfg.setRebalanceBatchSize(2 * 1024 * 1024);
cacheCfg.setRebalanceThrottle(100);

CacheKeyConfiguration cacheKeyConfiguration = new CacheKeyConfiguration(DpKey.class);
cacheCfg.setKeyConfiguration(cacheKeyConfiguration);

List<QueryEntity> entities = Lists.newArrayList();
QueryEntity entity = new QueryEntity(DpKey.class.getName(), DpCache.class.getName());
entity.setTableName(IgniteTableKey.T_DATA_POINT_NEW.getCode());

LinkedHashMap<String, String> map = new LinkedHashMap<>();
map.put("id", "java.lang.String");
map.put("gmtCreate", "java.lang.Long");
map.put("gmtModified", "java.lang.Long");
map.put("devId", "java.lang.String");
map.put("dpId", "java.lang.Integer");
map.put("code", "java.lang.String");
map.put("name", "java.lang.String");
map.put("customName", "java.lang.String");
map.put("mode", "java.lang.String");
map.put("type", "java.lang.String");
map.put("value", "java.lang.String");
map.put("rawValue", byte[].class.getName());
map.put("time", "java.lang.Long");
map.put("status", "java.lang.Boolean");
map.put("uuid", "java.lang.String");
entity.setFields(map);

QueryIndex devIdIdx = new QueryIndex("devId");
devIdIdx.setName("idx_devId");
devIdIdx.setInlineSize(128);
List<QueryIndex> indexes = Lists.newArrayList(devIdIdx);
entity.setIndexes(indexes);
entities.add(entity);
cacheCfg.setQueryEntities(entities);

Can you give me some advice on where to start solving this problem?

--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/
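P.S. Following the thread-dump suggestion above, I plan to capture dumps on each server node next time the streamer gets stuck. A minimal sketch using the standard JDK ThreadMXBean (class name and output path are just placeholders; plain jstack against the Ignite PID would give equivalent output):

import java.io.IOException;
import java.io.PrintWriter;
import java.lang.management.ManagementFactory;
import java.lang.management.ThreadInfo;
import java.lang.management.ThreadMXBean;
import java.nio.file.Files;
import java.nio.file.Paths;

public class ThreadDumpWriter {
    /** Writes a thread dump of the local JVM to the given file. */
    public static void dump(String path) throws IOException {
        ThreadMXBean mx = ManagementFactory.getThreadMXBean();
        // true/true also captures locked monitors and ownable synchronizers.
        ThreadInfo[] infos = mx.dumpAllThreads(true, true);

        try (PrintWriter out = new PrintWriter(Files.newBufferedWriter(Paths.get(path)))) {
            for (ThreadInfo info : infos) {
                out.println("\"" + info.getThreadName() + "\" state=" + info.getThreadState());
                for (StackTraceElement ste : info.getStackTrace())
                    out.println("    at " + ste);
                out.println();
            }
        }
    }

    public static void main(String[] args) throws IOException {
        dump(args.length > 0 ? args[0] : "ignite-threads.txt");
    }
}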
