790477691 opened a new issue, #23526:
URL: https://github.com/apache/shardingsphere/issues/23526
The service fails at startup with a data source loading error:
```text
2023-01-12 15:55:20.059 INFO DESKTOP-D3UDU4O - [ main]
c.a.d.p.DruidDataSource : {dataSource-1} inited
2023-01-12 15:55:20.062 INFO DESKTOP-D3UDU4O - [ main]
c.h.v.d.DataSourceAwareService : Initializing base data source: DataSource-345723898 => {
CreateTime:"2023-01-12 15:55:17",
ActiveCount:0,
PoolingCount:5,
CreateCount:5,
DestroyCount:0,
CloseCount:0,
ConnectCount:0,
Connections:[
{ID:876420389, ConnectTime:"2023-01-12 15:55:18", UseCount:0,
LastActiveTime:"2023-01-12 15:55:18"},
{ID:1982703147, ConnectTime:"2023-01-12 15:55:18", UseCount:0,
LastActiveTime:"2023-01-12 15:55:18"},
{ID:15058406, ConnectTime:"2023-01-12 15:55:19", UseCount:0,
LastActiveTime:"2023-01-12 15:55:19"},
{ID:1994143461, ConnectTime:"2023-01-12 15:55:19", UseCount:0,
LastActiveTime:"2023-01-12 15:55:19"},
{ID:710199598, ConnectTime:"2023-01-12 15:55:19", UseCount:0,
LastActiveTime:"2023-01-12 15:55:19"}
]
}
[
{
ID:876420389,
poolStatements:[
]
},
{
ID:1982703147,
poolStatements:[
]
},
{
ID:15058406,
poolStatements:[
]
},
{
ID:1994143461,
poolStatements:[
]
},
{
ID:710199598,
poolStatements:[
]
}
]
2023-01-12 15:55:20.190 INFO DESKTOP-D3UDU4O - [ main]
c.z.h.HikariDataSource : HikariPool-1 - Starting...
2023-01-12 15:55:20.300 INFO DESKTOP-D3UDU4O - [ main]
c.z.h.HikariDataSource : HikariPool-1 - Start completed.
2023-01-12 15:55:20.765 WARN DESKTOP-D3UDU4O - [ main]
ConfigServletWebServerApplicationContext : Exception encountered during context
initialization - cancelling refresh attempt:
org.springframework.beans.factory.BeanCreationException: Error creating bean
with name 'dataSourceAwareService': Injection of autowired dependencies failed;
nested exception is Can't construct a java object for
tag:yaml.org,2002:com.alibaba.druid.filter.stat.StatFilter; exception=Class
`com.alibaba.druid.filter.stat.StatFilter` is not accepted
in 'string', line 76, column 5:
- &id001 !!com.alibaba.druid.filte ...
^
2023-01-12 15:55:20.765 INFO DESKTOP-D3UDU4O - [ main]
o.a.k.c.p.KafkaProducer : [Producer clientId=producer-1]
Closing the Kafka producer with timeoutMillis = 30000 ms.
2023-01-12 15:55:20.770 INFO DESKTOP-D3UDU4O - [ main]
o.a.c.c.StandardService : Stopping service [Tomcat]
2023-01-12 15:55:20.779 INFO DESKTOP-D3UDU4O - [ main]
ConditionEvaluationReportLoggingListener :
Error starting ApplicationContext. To display the conditions report re-run
your application with 'debug' enabled.
2023-01-12 15:55:20.789 ERROR DESKTOP-D3UDU4O - [ main]
o.s.b.SpringApplication : Application run failed
org.springframework.beans.factory.BeanCreationException: Error creating bean
with name 'dataSourceAwareService': Injection of autowired dependencies failed;
nested exception is Can't construct a java object for
tag:yaml.org,2002:com.alibaba.druid.filter.stat.StatFilter; exception=Class
`com.alibaba.druid.filter.stat.StatFilter` is not accepted
in 'string', line 76, column 5:
- &id001 !!com.alibaba.druid.filte ...
^
at
org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor.postProcessProperties(AutowiredAnnotationBeanPostProcessor.java:380)
at
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.populateBean(AbstractAutowireCapableBeanFactory.java:1411)
at
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:592)
at
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:515)
at
org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:320)
at
org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222)
at
org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:318)
at
org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:199)
at
org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:845)
at
org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:877)
at
org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:549)
at
org.springframework.boot.web.servlet.context.ServletWebServerApplicationContext.refresh(ServletWebServerApplicationContext.java:141)
at
org.springframework.boot.SpringApplication.refresh(SpringApplication.java:744)
at
org.springframework.boot.SpringApplication.refreshContext(SpringApplication.java:391)
at
org.springframework.boot.SpringApplication.run(SpringApplication.java:312)
at
org.springframework.boot.builder.SpringApplicationBuilder.run(SpringApplicationBuilder.java:140)
at com.hnmjm.vehicle.spring.SpringBoot.startWeb(SpringBoot.java:46)
at com.hnmjm.vehicle.midsrv.MiddleServer.main(MiddleServer.java:53)
Caused by: Can't construct a java object for
tag:yaml.org,2002:com.alibaba.druid.filter.stat.StatFilter; exception=Class
`com.alibaba.druid.filter.stat.StatFilter` is not accepted
in 'string', line 76, column 5:
- &id001 !!com.alibaba.druid.filte ...
^
at
org.yaml.snakeyaml.constructor.Constructor$ConstructYamlObject.construct(Constructor.java:364)
at
org.yaml.snakeyaml.constructor.BaseConstructor.constructObjectNoCheck(BaseConstructor.java:270)
at
org.yaml.snakeyaml.constructor.BaseConstructor.constructObject(BaseConstructor.java:253)
at
org.yaml.snakeyaml.constructor.BaseConstructor.constructSequenceStep2(BaseConstructor.java:469)
at
org.yaml.snakeyaml.constructor.BaseConstructor.constructSequence(BaseConstructor.java:435)
at
org.yaml.snakeyaml.constructor.SafeConstructor$ConstructYamlSeq.construct(SafeConstructor.java:577)
at
org.yaml.snakeyaml.constructor.BaseConstructor.constructObjectNoCheck(BaseConstructor.java:270)
at
org.yaml.snakeyaml.constructor.BaseConstructor.constructObject(BaseConstructor.java:253)
at
org.yaml.snakeyaml.constructor.BaseConstructor.constructMapping2ndStep(BaseConstructor.java:581)
at
org.yaml.snakeyaml.constructor.SafeConstructor.constructMapping2ndStep(SafeConstructor.java:213)
at
org.yaml.snakeyaml.constructor.BaseConstructor.constructMapping(BaseConstructor.java:557)
at
org.yaml.snakeyaml.constructor.SafeConstructor$ConstructYamlMap.construct(SafeConstructor.java:600)
at
org.yaml.snakeyaml.constructor.BaseConstructor.constructObjectNoCheck(BaseConstructor.java:270)
at
org.yaml.snakeyaml.constructor.BaseConstructor.constructObject(BaseConstructor.java:253)
at
org.yaml.snakeyaml.constructor.BaseConstructor.constructMapping2ndStep(BaseConstructor.java:581)
at
org.yaml.snakeyaml.constructor.SafeConstructor.constructMapping2ndStep(SafeConstructor.java:213)
at
org.yaml.snakeyaml.constructor.BaseConstructor.constructMapping(BaseConstructor.java:557)
at
org.yaml.snakeyaml.constructor.Constructor$ConstructMapping.construct(Constructor.java:193)
at
org.yaml.snakeyaml.constructor.Constructor$ConstructYamlObject.construct(Constructor.java:358)
at
org.yaml.snakeyaml.constructor.BaseConstructor.constructObjectNoCheck(BaseConstructor.java:270)
at
org.yaml.snakeyaml.constructor.BaseConstructor.constructObject(BaseConstructor.java:253)
at
org.yaml.snakeyaml.constructor.BaseConstructor.constructDocument(BaseConstructor.java:207)
at
org.yaml.snakeyaml.constructor.BaseConstructor.getSingleData(BaseConstructor.java:191)
at org.yaml.snakeyaml.Yaml.loadFromReader(Yaml.java:477)
at org.yaml.snakeyaml.Yaml.loadAs(Yaml.java:457)
at
org.apache.shardingsphere.infra.util.yaml.YamlEngine.unmarshal(YamlEngine.java:83)
at
org.apache.shardingsphere.mode.metadata.persist.service.config.database.DataSourcePersistService.getDataSourceProperties(DataSourcePersistService.java:83)
at
org.apache.shardingsphere.mode.metadata.persist.service.config.database.DataSourcePersistService.load(DataSourcePersistService.java:71)
at
org.apache.shardingsphere.mode.metadata.persist.MetaDataPersistService.getEffectiveDataSources(MetaDataPersistService.java:115)
at
org.apache.shardingsphere.mode.metadata.MetaDataContextsFactory.createEffectiveDatabaseConfiguration(MetaDataContextsFactory.java:104)
at
org.apache.shardingsphere.mode.metadata.MetaDataContextsFactory.lambda$createEffectiveDatabaseConfigurations$1(MetaDataContextsFactory.java:99)
at java.util.stream.Collectors.lambda$toMap$58(Collectors.java:1321)
at java.util.stream.ReduceOps$3ReducingSink.accept(ReduceOps.java:169)
at java.util.Collections$2.tryAdvance(Collections.java:4719)
at java.util.Collections$2.forEachRemaining(Collections.java:4727)
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:482)
at
java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:472)
at
java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
at
java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:499)
at
org.apache.shardingsphere.mode.metadata.MetaDataContextsFactory.createEffectiveDatabaseConfigurations(MetaDataContextsFactory.java:98)
at
org.apache.shardingsphere.mode.metadata.MetaDataContextsFactory.create(MetaDataContextsFactory.java:86)
at
org.apache.shardingsphere.mode.metadata.MetaDataContextsFactory.create(MetaDataContextsFactory.java:68)
at
org.apache.shardingsphere.mode.manager.standalone.StandaloneContextManagerBuilder.build(StandaloneContextManagerBuilder.java:54)
at
org.apache.shardingsphere.driver.jdbc.core.datasource.ShardingSphereDataSource.createContextManager(ShardingSphereDataSource.java:81)
at
org.apache.shardingsphere.driver.jdbc.core.datasource.ShardingSphereDataSource.<init>(ShardingSphereDataSource.java:66)
at
org.apache.shardingsphere.driver.api.ShardingSphereDataSourceFactory.createDataSource(ShardingSphereDataSourceFactory.java:93)
at
org.apache.shardingsphere.driver.api.ShardingSphereDataSourceFactory.createDataSource(ShardingSphereDataSourceFactory.java:77)
at
org.apache.shardingsphere.driver.api.ShardingSphereDataSourceFactory.createDataSource(ShardingSphereDataSourceFactory.java:137)
at
com.hnmjm.vehicle.db.DataSourceAwareService.setDataSourceInitAware(DataSourceAwareService.java:133)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at
org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor$AutowiredMethodElement.inject(AutowiredAnnotationBeanPostProcessor.java:708)
at
org.springframework.beans.factory.annotation.InjectionMetadata.inject(InjectionMetadata.java:90)
at
org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor.postProcessProperties(AutowiredAnnotationBeanPostProcessor.java:374)
... 17 more
Caused by: java.lang.IllegalArgumentException: Class
`com.alibaba.druid.filter.stat.StatFilter` is not accepted
at
com.google.common.base.Preconditions.checkArgument(Preconditions.java:220)
at
org.apache.shardingsphere.infra.util.yaml.constructor.ShardingSphereYamlConstructor.getClassForName(ShardingSphereYamlConstructor.java:62)
at
org.yaml.snakeyaml.constructor.Constructor.getClassForNode(Constructor.java:683)
at
org.yaml.snakeyaml.constructor.Constructor$ConstructYamlObject.getConstructor(Constructor.java:349)
at
org.yaml.snakeyaml.constructor.Constructor$ConstructYamlObject.construct(Constructor.java:358)
... 73 more
```
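The root cause is ShardingSphere's YAML allow-list. In Standalone mode, ShardingSphere-JDBC persists the data source configuration as YAML; because the Druid pool has `filters=stat` applied, Druid exposes an active `StatFilter` through its `proxyFilters` property, and that object gets serialized into the persisted YAML with a `!!com.alibaba.druid.filter.stat.StatFilter` tag (the `&id001` anchor visible in the error). On the next startup, `ShardingSphereYamlConstructor` only accepts whitelisted classes, so deserialization aborts with `Class ... is not accepted`. The following is a hypothetical reconstruction of the offending fragment of the persisted metadata, only the tag and anchor at "line 76, column 5" are actually visible in the error; the surrounding keys are assumptions:

```yaml
# Hypothetical sketch of the persisted data source metadata (not the actual file)
root1:
  dataSourceClassName: com.alibaba.druid.pool.DruidDataSource
  proxyFilters:
    - &id001 !!com.alibaba.druid.filter.stat.StatFilter   # rejected by the allow-list
      mergeSql: false
```

Deleting the stale persisted metadata and keeping non-whitelisted filter objects off the data sources passed to ShardingSphere are the two obvious directions to investigate.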
```xml
<!-- https://mvnrepository.com/artifact/org.apache.shardingsphere/shardingsphere-jdbc-core -->
<dependency>
<groupId>org.apache.shardingsphere</groupId>
<artifactId>shardingsphere-jdbc-core</artifactId>
<version>5.3.1</version>
<exclusions>
<exclusion>
<groupId>org.yaml</groupId>
<artifactId>snakeyaml</artifactId>
</exclusion>
</exclusions>
</dependency>
<!-- https://mvnrepository.com/artifact/org.yaml/snakeyaml -->
<dependency>
<groupId>org.yaml</groupId>
<artifactId>snakeyaml</artifactId>
<version>1.33</version>
</dependency>
```
```properties
# ============= Base data source configuration (properties shared by all data sources) =============
# Connection pool sizes: initial, minimum, maximum
default.initialSize=2
default.minIdle=1
default.maxActive=500
# Maximum time to wait when acquiring a connection
default.maxWait=60000
# How often to check for idle connections that should be closed, in milliseconds
default.timeBetweenEvictionRunsMillis=3000
# Minimum time a connection stays in the pool before it can be evicted, in milliseconds
default.minEvictableIdleTimeMillis=300000
default.validationQuery=SELECT 'x' FROM DUAL
default.testWhileIdle=true
default.testOnBorrow=false
default.testOnReturn=false
# Enable PSCache and set its size per connection
default.poolPreparedStatements=true
default.maxPoolPreparedStatementPerConnectionSize=20
# Filters for monitoring and statistics interception
default.filters=stat
# Default data source
db.default=root
# ============= Per-data-source configuration (overrides base properties where needed) =============
# Data source 1
root1.driverClassName=${root-driver:@jdbc.driverClassName1@}
root1.url=${root-url:@jdbc.url1@}
root1.username=${root-username:@jdbc.username1@}
root1.password=${root-password:@jdbc.password1@}
root1.initialSize=5
root1.minIdle=2
root1.maxActive=60
# Data source 2
root2.driverClassName=${root-driver:@jdbc.driverClassName2@}
root2.url=${root-url:@jdbc.url2@}
root2.username=${root-username:@jdbc.username2@}
root2.password=${root-password:@jdbc.password2@}
root2.initialSize=5
root2.minIdle=2
root2.maxActive=60
# Data source 3
root3.driverClassName=${root-driver:@jdbc.driverClassName3@}
root3.url=${root-url:@jdbc.url3@}
root3.username=${root-username:@jdbc.username3@}
root3.password=${root-password:@jdbc.password3@}
root3.initialSize=5
root3.minIdle=2
root3.maxActive=60
# MySQL data source
[email protected]@
[email protected]@
[email protected]@
[email protected]@
mysql.initialSize=5
mysql.minIdle=2
mysql.maxActive=60
```
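One possible workaround (an assumption, not verified against this setup): keep the stat filter off the Druid instances that are handed to ShardingSphere, so no `StatFilter` object ends up in the persisted YAML. Since `default.filters=stat` is inherited by every source, the sharded `root*` sources could override it with an empty value:

```properties
# Hypothetical override: disable the stat filter only for the sources
# that go into the ShardingSphere data source map
root1.filters=
root2.filters=
root3.filters=
```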
```java
Map<String, DataSource> dataSourceMap = new HashMap<>(); // mapping for sharded databases
for (String dataSourceName : dataSourceProperties.getDataSourceKeys()) {
    final Properties properties = DataSourceProperties.getPropertiesWithNewPrefix(dataSourceName, "druid");
    DruidDataSource dataSource = DataSourceProperties.getDataSourceTemplate();
    dataSource.setDriverClassName(properties.getProperty("druid.driverClassName"));
    dataSource.setUrl(properties.getProperty("druid.url"));
    dataSource.setUsername(properties.getProperty("druid.username"));
    dataSource.setPassword(properties.getProperty("druid.password"));
    if (dataSourceName.contains("root")) {
        dataSourceMap.put(dataSourceName, dataSource);
    } else {
        dataSource.configFromPropety(properties);
        dataSource.init();
        dataSourceRegistry.registerDataSource(dataSourceName, dataSource);
        LOG.info("Initialized base data source: " + dataSource.getName() + " => " + dataSource);
    }
}
if (!dataSourceMap.isEmpty()) {
    // configure the sharding strategy
    ShardingRuleConfiguration shardingRuleConfiguration = new ShardingRuleConfiguration();
    List<String> broadcastTables = new ArrayList<>();
    broadcastTables.add("MJM_REPORT_LAST_POSITION");
    broadcastTables.add("MJM_8702");
    broadcastTables.add("T_SERVER_DB");
    broadcastTables.add("T_GPS_RECORD_LOG_JOB");
    broadcastTables.add("MJM_PICTURE_MATCH");
    shardingRuleConfiguration.getBroadcastTables().addAll(broadcastTables);
    List<String> tableRuleConfigs = new ArrayList<>();
    tableRuleConfigs.add("MJM_GPS_RECORD");
    tableRuleConfigs.add("MJM_JS_ATTACHMENT");
    tableRuleConfigs.add("MJM_JS_BLIND_AREA");
    tableRuleConfigs.add("MJM_JS_DRIVER_STATUS");
    tableRuleConfigs.add("MJM_JS_DRIVING_ASS");
    tableRuleConfigs.add("MJM_WARNING");
    // tableRuleConfigs.add("WARNING_COLLECT_DAY");
    for (String tableName : tableRuleConfigs) {
        ShardingTableRuleConfiguration tableRuleConfiguration =
                new ShardingTableRuleConfiguration(tableName, "root$->{1..3}." + tableName);
        tableRuleConfiguration.setDatabaseShardingStrategy(
                new StandardShardingStrategyConfiguration("ID", "inline"));
        tableRuleConfiguration.setTableShardingStrategy(new NoneShardingStrategyConfiguration());
        shardingRuleConfiguration.getTables().add(tableRuleConfiguration);
        Properties props = new Properties();
        props.setProperty("algorithm-expression", "root${ID % 3 + 1}");
        shardingRuleConfiguration.getShardingAlgorithms().put("inline",
                new AlgorithmConfiguration("INLINE", props));
    }
    shardingRuleConfiguration.getBindingTableGroups().addAll(tableRuleConfigs);
    DataSource shardingDataSource = ShardingSphereDataSourceFactory.createDataSource(
            dataSourceMap, Collections.singleton(shardingRuleConfiguration), getProperties());
    ShardingSphereDataSource shardingDataSource1 = (ShardingSphereDataSource) shardingDataSource;
    Connection connection = shardingDataSource1.getConnection();
    ShardingSphereConnection connection1 = (ShardingSphereConnection) connection;
    dataSourceRegistry.registerDataSource("root", shardingDataSource);
    LOG.info("Initialized ShardingSphere-JDBC data source: " + shardingDataSource);
}
```
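Alternatively, the filters could be stripped programmatically before `dataSourceMap` is handed to `ShardingSphereDataSourceFactory`, so the template's stat filter never reaches the persisted metadata. A sketch of this idea (it reuses `dataSourceMap` from the snippet above; that Druid's `setProxyFilters` behaves this way here is an assumption and untested):

```java
// Hypothetical: clear proxy filters on the sharded sources so the YAML
// that ShardingSphere persists contains no StatFilter entries.
for (DataSource ds : dataSourceMap.values()) {
    if (ds instanceof DruidDataSource) {
        ((DruidDataSource) ds).setProxyFilters(Collections.emptyList());
    }
}
```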