The project uses a Spring Boot + MyBatis + Druid stack, and a second data source needs to be added alongside the existing one.
During debugging we found that, after adding a SqlSessionFactoryBean, some fields from the original data source could no longer be populated. The cause turned out to be that the MyBatis settings in application.yml were no longer taking effect, so those settings have to be supplied on the SqlSessionFactoryBean instead.
1. Configure two sets of data source properties in application.yml
Default data source:
spring:
  datasource:
    type: com.alibaba.druid.pool.DruidDataSource
    driverClassName: com.mysql.jdbc.Driver
    url: jdbc:mysql://192.168.1.10:3306/database1?useUnicode=true&characterEncoding=utf8
    username: root
    password: ****
    initialSize: 1
    minIdle: 3
    maxActive: 20
    # Maximum time to wait when acquiring a connection, in milliseconds
    maxWait: 60000
    # Interval between eviction runs that check for idle connections to close, in milliseconds
    timeBetweenEvictionRunsMillis: 60000
    # Minimum time a connection must sit idle in the pool before it may be evicted, in milliseconds
    minEvictableIdleTimeMillis: 30000
    validationQuery: select 'x'
    testWhileIdle: true
    testOnBorrow: false
    testOnReturn: false
New data source:
hana:
  datasource:
    type: com.alibaba.druid.pool.DruidDataSource
    driverClassName: com.mysql.jdbc.Driver
    url: jdbc:mysql://192.168.6.12:3306/hana?useUnicode=true&characterEncoding=utf8
    username: root
    password: 1234
    initialSize: 1
    minIdle: 3
    maxActive: 20
    # Maximum time to wait when acquiring a connection, in milliseconds
    maxWait: 60000
    # Interval between eviction runs that check for idle connections to close, in milliseconds
    timeBetweenEvictionRunsMillis: 60000
    # Minimum time a connection must sit idle in the pool before it may be evicted, in milliseconds
    minEvictableIdleTimeMillis: 30000
    validationQuery: select 'x'
    testWhileIdle: true
    testOnBorrow: false
    testOnReturn: false
2. Write a configuration class for each data source (DruidDBConfig.java and HanaDruidDBConfig.java)
Key points:
a. basePackages specifies the package containing the MyBatis Mapper interfaces for this data source, meaning database operations under that package use this data source:
@MapperScan(basePackages = {"com.simba.*.dao"}, sqlSessionFactoryRef = "masterSqlSessionFactory")
sqlSessionFactoryRef points to the bean defined below.
b. Define a masterSqlSessionFactory bean.
The following code sets the location of the MyBatis XML mapper files and the mapUnderscoreToCamelCase property:
Resource[] resource = new PathMatchingResourcePatternResolver().getResources("classpath:mybatis/**/*Mapper.xml");
sqlSessionFactoryBean.setMapperLocations(resource);
org.apache.ibatis.session.Configuration configuration = new org.apache.ibatis.session.Configuration();
configuration.setMapUnderscoreToCamelCase(true);
These calls replace the settings that used to live in application.yml. Note that once the sqlSessionFactory bean is added, the configuration below no longer takes effect, so the corresponding settings must be made on the SqlSessionFactoryBean.
#[Deprecated] Removed because of the multi-data-source setup; see DruidDBConfig for the actual MyBatis configuration
#mybatis:
#  configuration:
#    map-underscore-to-camel-case: true
#  mapper-locations: mybatis/**/*Mapper.xml
#  typeAliasesPackage: com.simba.**.domain
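Note that the deprecated block above also set typeAliasesPackage. If the project relies on type aliases in its mapper XML files, that setting must likewise be moved onto the SqlSessionFactoryBean; a minimal sketch, reusing the package name from the old configuration:

sqlSessionFactoryBean.setTypeAliasesPackage("com.simba.**.domain");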
Configuration bean for the default data source:
@Configuration
@MapperScan(basePackages = {"com.simba.*.dao"}, sqlSessionFactoryRef = "masterSqlSessionFactory") // package containing this data source's MyBatis Mapper interfaces
public class DruidDBConfig {

    private Logger logger = LoggerFactory.getLogger(DruidDBConfig.class);

    @Value("${spring.datasource.url}")
    private String dbUrl;
    @Value("${spring.datasource.username}")
    private String username;
    @Value("${spring.datasource.password}")
    private String password;
    @Value("${spring.datasource.driverClassName}")
    private String driverClassName;
    @Value("${spring.datasource.initialSize}")
    private int initialSize;
    @Value("${spring.datasource.minIdle}")
    private int minIdle;
    @Value("${spring.datasource.maxActive}")
    private int maxActive;
    @Value("${spring.datasource.maxWait}")
    private int maxWait;
    @Value("${spring.datasource.timeBetweenEvictionRunsMillis}")
    private int timeBetweenEvictionRunsMillis;
    @Value("${spring.datasource.minEvictableIdleTimeMillis}")
    private int minEvictableIdleTimeMillis;
    @Value("${spring.datasource.validationQuery}")
    private String validationQuery;
    @Value("${spring.datasource.testWhileIdle}")
    private boolean testWhileIdle;
    @Value("${spring.datasource.testOnBorrow}")
    private boolean testOnBorrow;
    @Value("${spring.datasource.testOnReturn}")
    private boolean testOnReturn;

    @Bean(initMethod = "init", destroyMethod = "close") // register the DruidDataSource as a Spring bean
    @Primary // among multiple DataSource beans, the annotated one is injected by default
    public DataSource dataSource() {
        DruidDataSource datasource = new DruidDataSource();
        datasource.setUrl(this.dbUrl);
        datasource.setUsername(username);
        datasource.setPassword(password);
        datasource.setDriverClassName(driverClassName);
        // pool configuration
        datasource.setInitialSize(initialSize);
        datasource.setMinIdle(minIdle);
        datasource.setMaxActive(maxActive);
        datasource.setMaxWait(maxWait);
        datasource.setTimeBetweenEvictionRunsMillis(timeBetweenEvictionRunsMillis);
        datasource.setMinEvictableIdleTimeMillis(minEvictableIdleTimeMillis);
        datasource.setValidationQuery(validationQuery);
        datasource.setTestWhileIdle(testWhileIdle);
        datasource.setTestOnBorrow(testOnBorrow);
        datasource.setTestOnReturn(testOnReturn);
        return datasource;
    }

    // build the SqlSessionFactory
    @Bean(name = "masterSqlSessionFactory")
    @Primary
    public SqlSessionFactory masterSqlSessionFactory(@Qualifier("dataSource") DataSource dataSource) throws Exception {
        final SqlSessionFactoryBean sqlSessionFactoryBean = new SqlSessionFactoryBean();
        sqlSessionFactoryBean.setDataSource(dataSource);
        Resource[] resource = new PathMatchingResourcePatternResolver().getResources("classpath:mybatis/**/*Mapper.xml"); // location of the MyBatis mapper XML files
        sqlSessionFactoryBean.setMapperLocations(resource);
        org.apache.ibatis.session.Configuration configuration = new org.apache.ibatis.session.Configuration();
        configuration.setMapUnderscoreToCamelCase(true);
        sqlSessionFactoryBean.setConfiguration(configuration);
        return sqlSessionFactoryBean.getObject();
    }
}
Configuration bean for the new data source:
@Configuration
@MapperScan(basePackages = "com.hana.dao", sqlSessionFactoryRef = "hanaSqlSessionFactory")
public class HanaDruidDBConfig {

    private Logger logger = LoggerFactory.getLogger(HanaDruidDBConfig.class);

    @Value("${hana.datasource.url}")
    private String dbUrl;
    @Value("${hana.datasource.username}")
    private String username;
    @Value("${hana.datasource.password}")
    private String password;
    @Value("${hana.datasource.driverClassName}")
    private String driverClassName;
    @Value("${hana.datasource.initialSize}")
    private int initialSize;
    @Value("${hana.datasource.minIdle}")
    private int minIdle;
    @Value("${hana.datasource.maxActive}")
    private int maxActive;
    @Value("${hana.datasource.maxWait}")
    private int maxWait;
    @Value("${hana.datasource.timeBetweenEvictionRunsMillis}")
    private int timeBetweenEvictionRunsMillis;
    @Value("${hana.datasource.minEvictableIdleTimeMillis}")
    private int minEvictableIdleTimeMillis;
    @Value("${hana.datasource.validationQuery}")
    private String validationQuery;
    @Value("${hana.datasource.testWhileIdle}")
    private boolean testWhileIdle;
    @Value("${hana.datasource.testOnBorrow}")
    private boolean testOnBorrow;
    @Value("${hana.datasource.testOnReturn}")
    private boolean testOnReturn;

    @Bean(initMethod = "init", destroyMethod = "close") // register the DruidDataSource as a Spring bean
    public DataSource hanaDataSource() {
        DruidDataSource datasource = new DruidDataSource();
        datasource.setUrl(this.dbUrl);
        datasource.setUsername(username);
        datasource.setPassword(password);
        datasource.setDriverClassName(driverClassName);
        // pool configuration
        datasource.setInitialSize(initialSize);
        datasource.setMinIdle(minIdle);
        datasource.setMaxActive(maxActive);
        datasource.setMaxWait(maxWait);
        datasource.setTimeBetweenEvictionRunsMillis(timeBetweenEvictionRunsMillis);
        datasource.setMinEvictableIdleTimeMillis(minEvictableIdleTimeMillis);
        datasource.setValidationQuery(validationQuery);
        datasource.setTestWhileIdle(testWhileIdle);
        datasource.setTestOnBorrow(testOnBorrow);
        datasource.setTestOnReturn(testOnReturn);
        return datasource;
    }

    // build the SqlSessionFactory
    @Bean(name = "hanaSqlSessionFactory")
    public SqlSessionFactory hanaSqlSessionFactory(@Qualifier("hanaDataSource") DataSource hanaDataSource) throws Exception {
        final SqlSessionFactoryBean sqlSessionFactoryBean = new SqlSessionFactoryBean();
        sqlSessionFactoryBean.setDataSource(hanaDataSource);
        /*Resource[] resource = new PathMatchingResourcePatternResolver().getResources(ClusterConfig.MAPPER_LOCATION);
        sqlSessionFactoryBean.setMapperLocations(resource);*/
        return sqlSessionFactoryBean.getObject();
    }
}
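A related point the classes above do not cover: if declarative transactions (@Transactional) are used against these data sources, each data source usually needs its own transaction manager bound to it. A minimal sketch of a bean that could be added to DruidDBConfig (the bean name masterTransactionManager is illustrative; the hana side would get an analogous bean wired to hanaDataSource):

@Bean(name = "masterTransactionManager")
@Primary
public DataSourceTransactionManager masterTransactionManager(@Qualifier("dataSource") DataSource dataSource) {
    // org.springframework.jdbc.datasource.DataSourceTransactionManager from spring-jdbc
    return new DataSourceTransactionManager(dataSource);
}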
With this, the setup is complete: based on the package specified in each @MapperScan, the corresponding data source is selected automatically for database operations.
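As an illustration (the interface, method, and table names below are hypothetical; only the scanned package prefixes come from the configuration above), two mapper interfaces in separate files would be routed as follows:

// File com/simba/user/dao/UserDao.java: matched by DruidDBConfig's @MapperScan("com.simba.*.dao"),
// so its statements run against the default data source (database1 on 192.168.1.10);
// its SQL can live in a *Mapper.xml under classpath:mybatis/, which masterSqlSessionFactory loads.
package com.simba.user.dao;

public interface UserDao {
    String selectNameById(Long id);
}

// File com/hana/dao/HanaOrderDao.java: matched by HanaDruidDBConfig's @MapperScan("com.hana.dao"),
// so its statements run against the hana data source (hana on 192.168.6.12);
// since hanaSqlSessionFactory configures no mapper XML locations, an annotation-based statement is shown.
package com.hana.dao;

import org.apache.ibatis.annotations.Select;

public interface HanaOrderDao {
    @Select("select count(*) from orders")
    Integer countOrders();
}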