Here are the three batch insert approaches:

1. Dynamic SQL concatenation

Batch inserting 10,000 rows: 4.7 s

 @Test
 public void dynamicSql() {
     List<NotifyRecordEntity> notifyRecordEntityList = Lists.newArrayList();
     for (int i = 0; i < 10000; i++) {
         NotifyRecordEntity record = new NotifyRecordEntity();
         record.setLastNotifyTime(new Date());
         record.setPartnerNo("1");
         record.setLimitNotifyTimes(1);
         record.setNotifyUrl("1");
         record.setLoanNo("1");
         record.setNotifyContent("1");
         record.setTradeNo("" + i);
         record.setNotifyTimes(1);
         record.setNotifyType(EnumNotifyType.DAIFU);
         record.setNotifyStatus(EnumNotifyStatus.FAIL);
         notifyRecordEntityList.add(record);
     }
     long start = System.currentTimeMillis();
     Map<String, Object> params = Maps.newHashMap();
     params.put("notifyRecordEntityList", notifyRecordEntityList);
     // concatenate all rows into a single INSERT via dynamic SQL
     DaoFactory.notifyRecordDao.insertSelectiveList(params);
     System.out.println(System.currentTimeMillis() - start);
 }
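The insertSelectiveList mapper itself is not shown in the original. As a minimal sketch, assuming a hypothetical table and column subset, such a statement typically uses MyBatis's foreach element to concatenate every row into one multi-row INSERT:

```xml
<!-- Sketch only: the real mapper, table, and columns are not shown in the original -->
<insert id="insertSelectiveList" parameterType="map">
    INSERT INTO notify_record (trade_no, partner_no, notify_url, notify_status)
    VALUES
    <foreach collection="notifyRecordEntityList" item="item" separator=",">
        (#{item.tradeNo}, #{item.partnerNo}, #{item.notifyUrl}, #{item.notifyStatus})
    </foreach>
</insert>
```

The resulting SQL grows with the list size, so very large lists can exceed MySQL's max_allowed_packet and should be split into chunks.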

2. MyBatis ExecutorType.BATCH

Batch inserting 10,000 rows: 2.4 s

 @Test
 public void insertC() {
     SqlSession sqlsession = sqlSessionFactory.openSession(ExecutorType.BATCH, false);
     NotifyRecordDao notifyRecordDao = sqlsession.getMapper(NotifyRecordDao.class);
     int num = 0;
     for (int i = 0; i < 10000; i++) {
         NotifyRecordEntity record = new NotifyRecordEntity();
         record.setLastNotifyTime(new Date());
         record.setPartnerNo("1");
         record.setLimitNotifyTimes(1);
         record.setNotifyUrl("1");
         record.setLoanNo("1");
         record.setNotifyContent("1");
         record.setTradeNo("s" + i);
         record.setNotifyTimes(1);
         record.setNotifyType(EnumNotifyType.DAIFU);
         record.setNotifyStatus(EnumNotifyStatus.FAIL);
         notifyRecordDao.insert(record);
         num++;
     }
     long start = System.currentTimeMillis();
     // with ExecutorType.BATCH, the buffered statements are flushed and executed on commit
     sqlsession.commit();
     sqlsession.clearCache();
     sqlsession.close();
     System.out.println(System.currentTimeMillis() - start);
 }

3. JDBC executeBatch

This is the recommended approach; it is far faster than the previous two.

Notes on this approach:

        1. The JDBC URL must include the rewriteBatchedStatements=true parameter.
        2. With this parameter set, the batched SQL statement must not end with a semicolon, otherwise a java.sql.BatchUpdateException is thrown.
        3. rewriteBatchedStatements=true significantly speeds up DML statements; for DQL statements you need streaming or cursor-based reads instead.

Test data:

1 core / 2 GB:

        Inserting 10,000 rows: 1422 ms
        Inserting 100,000 rows: 6712 ms
        Inserting 500,000 rows: 30965 ms
        Inserting 1,000,000 rows: 57960 ms

    @Test
    public void jdbcBatchInsert() throws SQLException {
        Connection conn = null;
        PreparedStatement ps = null;
        try {
            Class.forName("com.mysql.jdbc.Driver"); // Connector/J 5.x driver class; use com.mysql.cj.jdbc.Driver with 8.x
            // Note: 1. the URL must include rewriteBatchedStatements=true
            // 2. with this parameter set, the batched SQL must not end with a semicolon, or an exception is thrown
            // 3. rewriteBatchedStatements=true speeds up DML; DQL needs streaming or cursor-based reads instead
            conn = DriverManager.getConnection("jdbc:mysql://ip:3306/mysql-demo?useUnicode=true&characterEncoding=utf8&allowMultiQueries=true&rewriteBatchedStatements=true", "username", "password");
            conn.setAutoCommit(false);
            String sql = "INSERT INTO wh_b_inventory(inventory_id,sku_id,warehouse_id,warehouse_area_id,warehouse_location_id,warehouse_location_code,purpose,batch," +
                    "status,inbound_temp_store,warehouse_entry_date,initial_inventory,total_inventory,occupy_inventory,transit_inventory,freeze_inventory,basic_units_id," +
                    "version,receiving_units_id,conversion_ratio,delete_flag,created_by,created_by_name,created_time,last_upd_by,last_upd_by_name,last_upd_time,tenant_id) " +
                    "VALUES (?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?)";
            ps = conn.prepareStatement(sql);
            for (int i = 0; i < 1000000; i++) {
                ps.setLong(1, RandomUtil.randomInt(100000000, 999999999));
                ps.setLong(2, RandomUtil.randomInt(100000000, 999999999));
                ps.setLong(3, RandomUtil.randomInt(100000000, 999999999));
                ps.setLong(4, RandomUtil.randomInt(100000000, 999999999));
                ps.setLong(5, RandomUtil.randomInt(100000000, 999999999));
                ps.setString(6, RandomUtil.randomString(8));
                ps.setLong(7, RandomUtil.randomInt(100000000, 999999999));
                ps.setString(8, RandomUtil.randomString(8));
                ps.setInt(9, 1);
                ps.setInt(10, 1);
                ps.setDate(11, new Date(System.currentTimeMillis()));
                ps.setBigDecimal(12, new BigDecimal(RandomUtil.randomInt(10000, 99999) + ""));
                ps.setBigDecimal(13, new BigDecimal(RandomUtil.randomInt(10000, 99999) + ""));
                ps.setBigDecimal(14, new BigDecimal(RandomUtil.randomInt(10000, 99999) + ""));
                ps.setBigDecimal(15, new BigDecimal(RandomUtil.randomInt(10000, 99999) + ""));
                ps.setBigDecimal(16, new BigDecimal(RandomUtil.randomInt(10000, 99999) + ""));
                ps.setLong(17, RandomUtil.randomInt(100000000, 999999999));
                ps.setInt(18, RandomUtil.randomInt(1, 20));
                ps.setLong(19, RandomUtil.randomInt(100000000, 999999999));
                ps.setBigDecimal(20, new BigDecimal(RandomUtil.randomInt(10000, 99999) + ""));
                ps.setInt(21, 0);
                ps.setLong(22, RandomUtil.randomInt(100000000, 999999999));
                ps.setString(23, RandomUtil.randomString(8));
                ps.setDate(24, new Date(System.currentTimeMillis()));
                ps.setLong(25, RandomUtil.randomInt(100000000, 999999999));
                ps.setString(26, RandomUtil.randomString(8));
                ps.setDate(27, new Date(System.currentTimeMillis()));
                ps.setLong(28, RandomUtil.randomInt(100000000, 999999999));
                ps.addBatch();
            }
            ps.executeBatch();
            conn.commit();

        } catch (Exception e) {
            e.printStackTrace();
        } finally {
            if (Objects.nonNull(ps)) {
                ps.close();
            }
            if (Objects.nonNull(conn)) {
                conn.close();
            }
        }
    }
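For the million-row run above, accumulating every addBatch() call before a single executeBatch() keeps all parameter sets in driver memory. A common refinement (an assumption here, not part of the original test) is to flush every few thousand rows, i.e. call ps.executeBatch() inside the loop whenever i % batchSize == 0, plus once more for the remainder. This small helper sketches where those flushes would land:

```java
import java.util.ArrayList;
import java.util.List;

public class BatchFlushDemo {
    // Sketch (not from the original test): with very large batches, flush every
    // `batchSize` rows instead of buffering everything. This helper records the
    // row counts at which executeBatch() would run.
    static List<Integer> flushPoints(int totalRows, int batchSize) {
        List<Integer> points = new ArrayList<>();
        for (int i = 1; i <= totalRows; i++) {
            // mirrors: ps.addBatch(); if (i % batchSize == 0) ps.executeBatch();
            if (i % batchSize == 0) {
                points.add(i);
            }
        }
        if (totalRows % batchSize != 0) {
            points.add(totalRows); // final executeBatch() for the remainder
        }
        return points;
    }

    public static void main(String[] args) {
        System.out.println(flushPoints(10, 4)); // [4, 8, 10]
    }
}
```

Periodic flushing bounds memory use while keeping most of the round-trip savings.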

About the rewriteBatchedStatements parameter:

        The MySQL JDBC connection URL must include the rewriteBatchedStatements parameter, and the driver must be version 5.1.13 or later, to get high-performance batch inserts.
        By default, the MySQL JDBC driver ignores the intent of executeBatch(): it breaks the batch apart and sends the statements to MySQL one at a time, so a "batch" insert is really a series of single-row inserts, with correspondingly poor performance.
Only when rewriteBatchedStatements is set to true does the driver actually execute the SQL as a batch; the option applies to INSERT, UPDATE, and DELETE.
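The effect can be illustrated with a simplified sketch (the real driver logic is more involved; the table and rows here are made up): with rewriteBatchedStatements=true, the driver merges the batched parameter sets into a single multi-row INSERT instead of sending one statement per row.

```java
import java.util.List;

public class RewriteDemo {
    // Illustrative only: mimics, in simplified form, the rewrite the MySQL
    // driver performs for batched INSERTs when rewriteBatchedStatements=true —
    // one multi-row statement instead of one statement per row.
    static String rewrite(String insertPrefix, List<String> valueTuples) {
        return insertPrefix + " VALUES " + String.join(",", valueTuples);
    }

    public static void main(String[] args) {
        List<String> rows = List.of("(1,'a')", "(2,'b')", "(3,'c')");
        // one round trip to the server instead of three
        System.out.println(rewrite("INSERT INTO t(id,name)", rows));
        // INSERT INTO t(id,name) VALUES (1,'a'),(2,'b'),(3,'c')
    }
}
```

This is why the semicolon restriction exists: the driver concatenates the statements itself, and a trailing semicolon breaks the rewritten SQL.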

Summary:

        Execution times with rewriteBatchedStatements=true, inserting 10,000 rows into the same table (approximate):
        JDBC batch 1422 ms < MyBatis batch 2423 ms < dynamic SQL 4512 ms
        On raw speed, the JDBC batch clearly wins, bypassing MyBatis entirely. In production, though, maintainability also matters: MyBatis keeps the SQL centralized and easier to maintain, so choose according to your situation.