Configuring multiple data sources in a Spring Boot project: integrating Hive and MySQL
In a recent project the database was originally MySQL, but the requirements changed, and after some discussion we decided to bring Hive in as well: a batch of data needed to be loaded into Hive.
After an evening of experimenting, plus some research online, the integration finally works.
The end result is that MySQL and Hive can each be used independently without affecting the other, and MyBatis keeps working as before, which is presumably what anyone wants from this kind of setup.
There is quite a bit of code, so don't worry, let's go through it one step at a time.
Step 1, the pom file: if you already have the jars below there is no need to add them again; if not, add them. The Hive version number must match the version of your Hive server.
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-configuration-processor</artifactId>
    <optional>true</optional>
</dependency>
<!-- connection pool -->
<dependency>
    <groupId>com.alibaba</groupId>
    <artifactId>druid</artifactId>
    <version>1.0.29</version>
</dependency>
<!-- Spring bean management -->
<dependency>
    <groupId>org.springframework</groupId>
    <artifactId>spring-beans</artifactId>
</dependency>
<!-- Hadoop dependencies -->
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.6.0</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-mapreduce-client-core</artifactId>
    <version>2.6.0</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-mapreduce-client-common</artifactId>
    <version>2.6.0</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs</artifactId>
    <version>2.6.0</version>
</dependency>
<!-- Hive dependency -->
<dependency>
    <groupId>org.apache.hive</groupId>
    <artifactId>hive-jdbc</artifactId>
    <version>1.2.1</version>
    <exclusions>
        <exclusion>
            <groupId>org.eclipse.jetty.aggregate</groupId>
            <artifactId>*</artifactId>
        </exclusion>
    </exclusions>
</dependency>
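One dependency worth double-checking: the `JdbcTemplate` used by the DAO classes later in this post comes from spring-jdbc. An existing Spring Boot + MyBatis project usually pulls it in already; if yours does not, a starter such as the following would cover it (the version is managed by the Spring Boot parent):

```xml
<!-- Provides spring-jdbc (JdbcTemplate) and JDBC auto-configuration -->
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-jdbc</artifactId>
</dependency>
```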
Step 2, the yml configuration. The properties classes in step 3 map to these keys; the point is to keep the mysql and hive sections separate. Nothing else in your configuration needs to change.
spring:
  application:
    name: zhongtai
  datasource:
    mysqlMain: # primary MySQL data source
      type: com.alibaba.druid.pool.DruidDataSource
      password: 123456
      driver-class-name: com.mysql.cj.jdbc.Driver
      url: jdbc:mysql://<host>:3306/<database>?useUnicode=true&characterEncoding=utf8&useSSL=false&allowMultiQueries=true
      username: root
    hive: # Hive data source
      url: jdbc:hive2://<host>:10000/<database>
      type: com.alibaba.druid.pool.DruidDataSource
      username: root
      password: ******
      driver-class-name: org.apache.hive.jdbc.HiveDriver
    common-config: # shared connection pool settings, applied to every data source
      initialSize: 1
      minIdle: 1
      maxIdle: 5
      maxActive: 50
      maxWait: 10000
      timeBetweenEvictionRunsMillis: 10000
      minEvictableIdleTimeMillis: 300000
      validationQuery: select 'x'
      testWhileIdle: true
      testOnBorrow: false
      testOnReturn: false
      poolPreparedStatements: true
      maxOpenPreparedStatements: 20
      filters: stat
Step 3: create the classes that manage this configuration. Their location doesn't really matter, but to keep things organized it's best to put them in one package.
1. HiveDruidConfig.java: the Hive data source configuration class
package cn.**.**.system.config.druid.hive;

import javax.sql.DataSource;
import java.sql.SQLException;

import cn.cnic.zhongtai.system.config.DataSourceCommonProperties;
import cn.cnic.zhongtai.system.config.DataSourceProperties;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.boot.context.properties.EnableConfigurationProperties;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import com.alibaba.druid.pool.DruidDataSource;

/**
 * Hive data source configuration class.
 * Created by wdd on 2019/11/5
 */
@Configuration
@EnableConfigurationProperties({DataSourceProperties.class, DataSourceCommonProperties.class}) // registers the properties classes as beans so @ConfigurationProperties takes effect
public class HiveDruidConfig {

    private static Logger logger = LoggerFactory.getLogger(HiveDruidConfig.class);

    @Autowired
    private DataSourceProperties dataSourceProperties;

    @Autowired
    private DataSourceCommonProperties dataSourceCommonProperties;

    @Bean("hiveDruidDataSource") // creates the bean instance
    @Qualifier("hiveDruidDataSource") // qualifier used when injecting it
    public DataSource dataSource() {
        DruidDataSource datasource = new DruidDataSource();
        // data-source-specific properties
        datasource.setUrl(dataSourceProperties.getHive().get("url"));
        datasource.setUsername(dataSourceProperties.getHive().get("username"));
        datasource.setPassword(dataSourceProperties.getHive().get("password"));
        datasource.setDriverClassName(dataSourceProperties.getHive().get("driver-class-name"));
        // shared pool properties
        datasource.setInitialSize(dataSourceCommonProperties.getInitialSize());
        datasource.setMinIdle(dataSourceCommonProperties.getMinIdle());
        datasource.setMaxActive(dataSourceCommonProperties.getMaxActive());
        datasource.setMaxWait(dataSourceCommonProperties.getMaxWait());
        datasource.setTimeBetweenEvictionRunsMillis(dataSourceCommonProperties.getTimeBetweenEvictionRunsMillis());
        datasource.setMinEvictableIdleTimeMillis(dataSourceCommonProperties.getMinEvictableIdleTimeMillis());
        datasource.setValidationQuery(dataSourceCommonProperties.getValidationQuery());
        datasource.setTestWhileIdle(dataSourceCommonProperties.isTestWhileIdle());
        datasource.setTestOnBorrow(dataSourceCommonProperties.isTestOnBorrow());
        datasource.setTestOnReturn(dataSourceCommonProperties.isTestOnReturn());
        datasource.setPoolPreparedStatements(dataSourceCommonProperties.isPoolPreparedStatements());
        try {
            datasource.setFilters(dataSourceCommonProperties.getFilters());
        } catch (SQLException e) {
            logger.error("Druid configuration initialization filter error.", e);
        }
        return datasource;
    }
}
2. MysqlMainDruidConfig.java: configures the primary MySQL data source; MySQL connections go through the primary data source by default
package cn.**.**.system.config.druid.mysql;

import java.sql.SQLException;
import javax.sql.DataSource;

import cn.cnic.zhongtai.system.config.DataSourceCommonProperties;
import cn.cnic.zhongtai.system.config.DataSourceProperties;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.boot.context.properties.EnableConfigurationProperties;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Primary;
import com.alibaba.druid.pool.DruidDataSource;

/**
 * Configures the primary MySQL data source; MySQL connections use the primary data source by default.
 * @author wdd
 * @date 2019-11-04
 */
@Configuration
@EnableConfigurationProperties({DataSourceProperties.class, DataSourceCommonProperties.class}) // registers the properties classes as beans so @ConfigurationProperties takes effect
public class MysqlMainDruidConfig {

    private static Logger logger = LoggerFactory.getLogger(MysqlMainDruidConfig.class);

    @Autowired
    private DataSourceProperties dataSourceProperties;

    @Autowired
    private DataSourceCommonProperties dataSourceCommonProperties;

    @Primary // marks the primary data source; only one bean may be primary, and MyBatis connects to it by default
    @Bean("mysqlDruidDataSource") // creates the bean instance
    @Qualifier("mysqlDruidDataSource") // qualifier used when injecting it
    public DataSource dataSource() {
        DruidDataSource datasource = new DruidDataSource();
        // data-source-specific properties
        datasource.setUrl(dataSourceProperties.getMysqlMain().get("url"));
        datasource.setUsername(dataSourceProperties.getMysqlMain().get("username"));
        datasource.setPassword(dataSourceProperties.getMysqlMain().get("password"));
        datasource.setDriverClassName(dataSourceProperties.getMysqlMain().get("driver-class-name"));
        // shared pool properties
        datasource.setInitialSize(dataSourceCommonProperties.getInitialSize());
        datasource.setMinIdle(dataSourceCommonProperties.getMinIdle());
        datasource.setMaxActive(dataSourceCommonProperties.getMaxActive());
        datasource.setMaxWait(dataSourceCommonProperties.getMaxWait());
        datasource.setTimeBetweenEvictionRunsMillis(dataSourceCommonProperties.getTimeBetweenEvictionRunsMillis());
        datasource.setMinEvictableIdleTimeMillis(dataSourceCommonProperties.getMinEvictableIdleTimeMillis());
        datasource.setValidationQuery(dataSourceCommonProperties.getValidationQuery());
        datasource.setTestWhileIdle(dataSourceCommonProperties.isTestWhileIdle());
        datasource.setTestOnBorrow(dataSourceCommonProperties.isTestOnBorrow());
        datasource.setTestOnReturn(dataSourceCommonProperties.isTestOnReturn());
        datasource.setPoolPreparedStatements(dataSourceCommonProperties.isPoolPreparedStatements());
        try {
            datasource.setFilters(dataSourceCommonProperties.getFilters());
        } catch (SQLException e) {
            logger.error("Druid configuration initialization filter error.", e);
        }
        return datasource;
    }
}
3. DataSourceCommonProperties.java: shared connection pool properties, applied to all data sources
package cn.**.**.system.config;

import org.springframework.boot.context.properties.ConfigurationProperties;

/**
 * Shared connection pool properties, applied to all data sources.
 * @author wdd
 * @date 2019-11-04
 */
@ConfigurationProperties(prefix = DataSourceCommonProperties.DS, ignoreUnknownFields = false)
public class DataSourceCommonProperties {

    final static String DS = "spring.datasource.common-config";

    private int initialSize = 10;
    private int minIdle;
    private int maxIdle;
    private int maxActive;
    private int maxWait;
    private int timeBetweenEvictionRunsMillis;
    private int minEvictableIdleTimeMillis;
    private String validationQuery;
    private boolean testWhileIdle;
    private boolean testOnBorrow;
    private boolean testOnReturn;
    private boolean poolPreparedStatements;
    private int maxOpenPreparedStatements;
    private String filters;
    private String mapperLocations;
    private String typeAliasPackage;

    // Getters and setters omitted; generate them yourself.
}
4. DataSourceProperties.java: the unified properties class that binds the data source sections of the configuration file
package cn.**.**.system.config;

import java.util.Map;
import org.springframework.boot.context.properties.ConfigurationProperties;

/**
 * Unified properties class; binds the data source settings from the configuration file.
 * @author wdd
 * @date 2019-11-04
 */
@ConfigurationProperties(prefix = DataSourceProperties.DS, ignoreUnknownFields = false)
public class DataSourceProperties {

    final static String DS = "spring.datasource";

    private Map<String, String> mysqlMain;
    private Map<String, String> hive;
    private Map<String, String> commonConfig;

    // Getters and setters omitted; generate them yourself.
}
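A note on why the Druid config classes above can read hyphenated keys like `get("driver-class-name")`: because each data source is bound to a plain `Map<String, String>`, Spring Boot copies the yml keys into the map exactly as they are written, hyphens included. A minimal plain-Java sketch of that lookup (the URL and values here are placeholders, not real connection details):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class BindingSketch {
    public static void main(String[] args) {
        // Simulates what Spring Boot builds for spring.datasource.hive:
        // yml keys land in the map verbatim, including hyphens.
        Map<String, String> hive = new LinkedHashMap<>();
        hive.put("url", "jdbc:hive2://hive-host:10000/test");
        hive.put("driver-class-name", "org.apache.hive.jdbc.HiveDriver");

        // This is why HiveDruidConfig reads the hyphenated key,
        // not a camelCase "driverClassName".
        System.out.println(hive.get("driver-class-name"));
    }
}
```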
5. HiveJdbcBaseDaoImpl.java: wires the Hive data source into a JdbcTemplate
package cn.**.**.system.config.jdbcConfig;

import javax.sql.DataSource;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.stereotype.Repository;

/**
 * Wires the Hive data source into a JdbcTemplate.
 * @author wdd
 * @date 2019-11-04
 */
@Repository
public class HiveJdbcBaseDaoImpl {

    private JdbcTemplate jdbcTemplate;

    public JdbcTemplate getJdbcTemplate() {
        return jdbcTemplate;
    }

    @Autowired
    public void setJdbcTemplate(@Qualifier("hiveDruidDataSource") DataSource dataSource) {
        this.jdbcTemplate = new JdbcTemplate(dataSource);
    }
}
6. MysqlMainJdbcBaseDaoImpl.java: wires the MySQL data source into a JdbcTemplate
package cn.**.**.system.config.jdbcConfig;

import javax.sql.DataSource;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.stereotype.Repository;

/**
 * Wires the MySQL data source into a JdbcTemplate.
 * @author wdd
 * @date 2019-11-04
 */
@Repository
public class MysqlMainJdbcBaseDaoImpl {

    private JdbcTemplate jdbcTemplate;

    public JdbcTemplate getJdbcTemplate() {
        return jdbcTemplate;
    }

    @Autowired
    public void setJdbcTemplate(@Qualifier("mysqlDruidDataSource") DataSource dataSource) {
        this.jdbcTemplate = new JdbcTemplate(dataSource);
    }
}
That completes the setup. Next, let's verify it with a test class.
To use either data source, simply inject the corresponding DAO.
import cn.cnic.zhongtai.ZhongtaiApplication;
import cn.cnic.zhongtai.system.config.jdbcConfig.HiveJdbcBaseDaoImpl;
import cn.cnic.zhongtai.system.config.jdbcConfig.MysqlMainJdbcBaseDaoImpl;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.junit4.SpringRunner;

/**
 * Exercises both the Hive and MySQL data sources.
 */
@RunWith(SpringRunner.class)
@SpringBootTest(classes = {ZhongtaiApplication.class})
public class HiveAndMysqlTest {

    @Autowired
    private HiveJdbcBaseDaoImpl hiveJdbcBaseDaoImpl;

    @Autowired
    private MysqlMainJdbcBaseDaoImpl mysqlMainJdbcBaseDaoImpl;

    @Test
    public void testHiveAndMysql() {
        // fetch one row through the Hive data source
        String sql = "SELECT name FROM test.people LIMIT 1";
        String info = hiveJdbcBaseDaoImpl.getJdbcTemplate().queryForObject(sql, String.class);
        System.out.println("Row fetched from Hive: " + info);

        // count the rows in a table through the MySQL data source
        String sql1 = "SELECT count(*) FROM person";
        int count = mysqlMainJdbcBaseDaoImpl.getJdbcTemplate().queryForObject(sql1, Integer.class);
        System.out.println("Count fetched from MySQL: " + count);
    }
}
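Since the original motivation was loading a batch of data into Hive, here is a sketch of one way to do that through the DAO from step 5. The table and columns (`test.people` with `name` and `age`) are assumptions for illustration only. Note that hive-jdbc 1.2.x does not support JDBC batch execution (`addBatch`/`executeBatch` throw `SQLFeatureNotSupportedException`), so a single multi-row `INSERT ... VALUES` statement, supported since Hive 0.14, is usually the practical route. Building SQL by string concatenation, as here, is only acceptable for trusted internal data:

```java
import java.util.List;
import java.util.stream.Collectors;

public class HiveInsertSqlBuilder {

    // Builds one multi-row INSERT statement, because hive-jdbc 1.2.x
    // cannot batch PreparedStatements the usual JDBC way.
    public static String buildInsert(String table, List<String[]> rows) {
        String values = rows.stream()
                .map(r -> "('" + r[0] + "'," + r[1] + ")")
                .collect(Collectors.joining(","));
        return "INSERT INTO " + table + " (name, age) VALUES " + values;
    }

    public static void main(String[] args) {
        String sql = buildInsert("test.people", List.of(
                new String[]{"alice", "30"},
                new String[]{"bob", "25"}));
        System.out.println(sql);
        // prints: INSERT INTO test.people (name, age) VALUES ('alice',30),('bob',25)
    }
}
```

In the project, the resulting statement would then be run through the DAO: `hiveJdbcBaseDaoImpl.getJdbcTemplate().execute(sql);`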
Test result: both statements execute against their respective data sources.
And that's it: a Spring Boot project with two data sources, Hive and MySQL. Hopefully this was detailed enough.
If something doesn't work, recheck the configuration step by step. Questions and corrections are welcome in the comments, and if this helped you, leave a like before you go.