A Deep Dive into Heterogeneous Data Synchronization Best Practices (with Complete Code)!

}

As you can see, this class uses example as its destination. The only change we need to make in this class is to replace the IP address with the IP of our Canal Server.

Specifically, change the following line:

String ip = AddressUtils.getHostIp();

to:

String ip = "192.168.175.100";

Since we did not specify a username or password when configuring Canal, we also need to change the following code:

CanalConnector connector = CanalConnectors.newSingleConnector(
        new InetSocketAddress(ip, 11111),
        destination,
        "canal",
        "canal");

to:

CanalConnector connector = CanalConnectors.newSingleConnector(
        new InetSocketAddress(ip, 11111),
        destination,
        "",
        "");

Once the changes are done, run the main method to start the program.

Testing Data Changes

Next, create a canaldb database in MySQL.

create database canaldb;

At this point, the corresponding log output appears in the IDEA console.


* Batch Id: [7] ,count : [3] , memsize : [149] , Time : 2020-08-05 23:25:35
* Start : [mysql-bin.000007:6180:1540286735000(2020-08-05 23:25:35)]
* End : [mysql-bin.000007:6356:1540286735000(2020-08-05 23:25:35)]
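Each Start/End position printed above follows the pattern binlogFile:offset:executeTime(formatted time), the same string built by buildPositionForDump() later in this article. A minimal parsing sketch (the BinlogPosition class and its fields are my own illustration, not part of Canal's API):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Illustrative only: parses a dump position string such as
// "mysql-bin.000007:6180:1540286735000(2020-08-05 23:25:35)".
class BinlogPosition {
    private static final Pattern POSITION =
            Pattern.compile("([^:]+):(\\d+):(\\d+)\\((.+)\\)");

    final String logfile;    // binlog file name, e.g. mysql-bin.000007
    final long offset;       // byte offset within the binlog file
    final long executeTime;  // event execution time in epoch milliseconds

    BinlogPosition(String logfile, long offset, long executeTime) {
        this.logfile = logfile;
        this.offset = offset;
        this.executeTime = executeTime;
    }

    static BinlogPosition parse(String s) {
        Matcher m = POSITION.matcher(s);
        if (!m.matches()) {
            throw new IllegalArgumentException("not a dump position: " + s);
        }
        return new BinlogPosition(m.group(1),
                Long.parseLong(m.group(2)), Long.parseLong(m.group(3)));
    }

    public static void main(String[] args) {
        BinlogPosition p = BinlogPosition.parse(
                "mysql-bin.000007:6180:1540286735000(2020-08-05 23:25:35)");
        System.out.println(p.logfile + " @ " + p.offset);
    }
}
```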


Next, I create a table in the canaldb database and run insert, update, and delete statements against it. The program prints log output like the following.

# After data changes in MySQL, the MySQL binlog events are shown here.


* Batch Id: [7] ,count : [3] , memsize : [149] , Time : 2020-08-05 23:25:35
* Start : [mysql-bin.000007:6180:1540286735000(2020-08-05 23:25:35)]
* End : [mysql-bin.000007:6356:1540286735000(2020-08-05 23:25:35)]


================> binlog[mysql-bin.000007:6180] , executeTime : 1540286735000(2020-08-05 23:25:35) , gtid : () , delay : 393ms

BEGIN ----> Thread id: 43

----------------> binlog[mysql-bin.000007:6311] , name[canal,canal_table] , eventType : DELETE , executeTime : 1540286735000(2020-08-05 23:25:35) , gtid : () , delay : 393 ms

id : 8 type=int(10) unsigned

name : 512 type=varchar(255)


END ----> transaction id: 249

================> binlog[mysql-bin.000007:6356] , executeTime : 1540286735000(2020-08-05 23:25:35) , gtid : () , delay : 394ms


* Batch Id: [8] ,count : [3] , memsize : [149] , Time : 2020-08-05 23:25:35
* Start : [mysql-bin.000007:6387:1540286869000(2020-08-05 23:25:49)]
* End : [mysql-bin.000007:6563:1540286869000(2020-08-05 23:25:49)]


================> binlog[mysql-bin.000007:6387] , executeTime : 1540286869000(2020-08-05 23:25:49) , gtid : () , delay : 976ms

BEGIN ----> Thread id: 43

----------------> binlog[mysql-bin.000007:6518] , name[canal,canal_table] , eventType : INSERT , executeTime : 1540286869000(2020-08-05 23:25:49) , gtid : () , delay : 976 ms

id : 21 type=int(10) unsigned update=true

name : aaa type=varchar(255) update=true


END ----> transaction id: 250

================> binlog[mysql-bin.000007:6563] , executeTime : 1540286869000(2020-08-05 23:25:49) , gtid : () , delay : 977ms


* Batch Id: [9] ,count : [3] , memsize : [161] , Time : 2020-08-05 23:26:22
* Start : [mysql-bin.000007:6594:1540286902000(2020-08-05 23:26:22)]
* End : [mysql-bin.000007:6782:1540286902000(2020-08-05 23:26:22)]


================> binlog[mysql-bin.000007:6594] , executeTime : 1540286902000(2020-08-05 23:26:22) , gtid : () , delay : 712ms

BEGIN ----> Thread id: 43

----------------> binlog[mysql-bin.000007:6725] , name[canal,canal_table] , eventType : UPDATE , executeTime : 1540286902000(2020-08-05 23:26:22) , gtid : () , delay : 712 ms

id : 21 type=int(10) unsigned

name : aaac type=varchar(255) update=true


END ----> transaction id: 252

================> binlog[mysql-bin.000007:6782] , executeTime : 1540286902000(2020-08-05 23:26:22) , gtid : () , delay : 713ms

Implementing Data Synchronization

Requirement

Capture database changes by parsing the binlog with Canal, and apply them to the Solr index in real time.

Implementation

Creating the Project

Create a Maven project named mykit-canal-demo and add the following configuration to its pom.xml file.

<dependencies>
    <dependency>
        <groupId>com.alibaba.otter</groupId>
        <artifactId>canal.client</artifactId>
        <version>1.0.24</version>
    </dependency>
    <dependency>
        <groupId>com.alibaba.otter</groupId>
        <artifactId>canal.protocol</artifactId>
        <version>1.0.24</version>
    </dependency>
    <dependency>
        <groupId>commons-lang</groupId>
        <artifactId>commons-lang</artifactId>
        <version>2.6</version>
    </dependency>
    <dependency>
        <groupId>org.codehaus.jackson</groupId>
        <artifactId>jackson-mapper-asl</artifactId>
        <version>1.8.9</version>
    </dependency>
    <dependency>
        <groupId>org.apache.solr</groupId>
        <artifactId>solr-solrj</artifactId>
        <version>4.10.3</version>
    </dependency>
    <dependency>
        <groupId>junit</groupId>
        <artifactId>junit</artifactId>
        <version>4.9</version>
        <scope>test</scope>
    </dependency>
</dependencies>

Creating the log4j Configuration File

Create a log4j.properties file under the project's src/main/resources directory, with the following content.

log4j.rootCategory=debug, CONSOLE

# CONSOLE is set to be a ConsoleAppender using a PatternLayout.
log4j.appender.CONSOLE=org.apache.log4j.ConsoleAppender
log4j.appender.CONSOLE.layout=org.apache.log4j.PatternLayout
log4j.appender.CONSOLE.layout.ConversionPattern=%d{ISO8601} %-6r [%15.15t] %-5p %30.30c %x - %m%n

# LOGFILE is set to be a File appender using a PatternLayout.
log4j.appender.LOGFILE=org.apache.log4j.FileAppender
log4j.appender.LOGFILE.File=d:\axis.log
log4j.appender.LOGFILE.Append=true
log4j.appender.LOGFILE.layout=org.apache.log4j.PatternLayout
log4j.appender.LOGFILE.layout.ConversionPattern=%d{ISO8601} %-6r [%15.15t] %-5p %30.30c %x - %m%n

Creating the Entity Class

Create a Book entity class in the io.mykit.canal.demo.bean package to test Canal's data transfer, as shown below.

package io.mykit.canal.demo.bean;

import org.apache.solr.client.solrj.beans.Field;

import java.io.Serializable;
import java.util.Date;

public class Book implements Serializable {

    private static final long serialVersionUID = -6350345408771427834L;

    @Field("id")
    private Integer id;

    @Field("book_name")
    private String name;

    @Field("book_author")
    private String author;

    @Field("book_publishtime")
    private Date publishtime;

    @Field("book_price")
    private Double price;

    @Field("book_publishgroup")
    private String publishgroup;

    public Integer getId() {
        return id;
    }

    public void setId(Integer id) {
        this.id = id;
    }

    public String getName() {
        return name;
    }

    public void setName(String name) {
        this.name = name;
    }

    public String getAuthor() {
        return author;
    }

    public void setAuthor(String author) {
        this.author = author;
    }

    public Date getPublishtime() {
        return publishtime;
    }

    public void setPublishtime(Date publishtime) {
        this.publishtime = publishtime;
    }

    public Double getPrice() {
        return price;
    }

    public void setPrice(Double price) {
        this.price = price;
    }

    public String getPublishgroup() {
        return publishgroup;
    }

    public void setPublishgroup(String publishgroup) {
        this.publishgroup = publishgroup;
    }

    @Override
    public String toString() {
        return "Book{" +
                "id=" + id +
                ", name='" + name + '\'' +
                ", author='" + author + '\'' +
                ", publishtime=" + publishtime +
                ", price=" + price +
                ", publishgroup='" + publishgroup + '\'' +
                '}';
    }
}

Note that in the Book entity class we use Solr's @Field annotation to map the entity fields to Solr index fields.

Implementing the Utility Classes

Next, we create the utility classes in the io.mykit.canal.demo.utils package.

  • BinlogValue

Stores the value of each column of each row parsed from the binlog. The code is shown below.

package io.mykit.canal.demo.utils;

import java.io.Serializable;

/**
 * ClassName: BinlogValue
 *
 * The value of each column of each row parsed from the binlog.
 * For an INSERT: beforeValue and value both hold the new value.
 * For an UPDATE: beforeValue is the value before the change; value is the value after the change.
 * For a DELETE: beforeValue and value both hold the value before deletion; this special case makes it
 * easy to retrieve the pre-deletion value.
 */
public class BinlogValue implements Serializable {

    private static final long serialVersionUID = -6350345408773943086L;

    private String value;
    private String beforeValue;

    /**
     * The value of each column of each row parsed from the binlog.
     * INSERT: value is the new value.
     * UPDATE: value is the value after the change.
     * DELETE: value is the value before deletion; this special case makes it easy to retrieve the pre-deletion value.
     */
    public String getValue() {
        return value;
    }

    public void setValue(String value) {
        this.value = value;
    }

    /**
     * The beforeValue of each column of each row parsed from the binlog.
     * INSERT: beforeValue is the new value.
     * UPDATE: beforeValue is the value before the change.
     * DELETE: beforeValue is the value before deletion.
     */
    public String getBeforeValue() {
        return beforeValue;
    }

    public void setBeforeValue(String beforeValue) {
        this.beforeValue = beforeValue;
    }
}
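To make the before/after semantics above concrete, here is a self-contained sketch using a pared-down stand-in for BinlogValue (the Value class and helper names are mine, for illustration only):

```java
import java.util.HashMap;
import java.util.Map;

class BinlogValueDemo {
    // Pared-down stand-in for BinlogValue, just enough for the demo.
    static class Value {
        final String beforeValue;
        final String value;
        Value(String beforeValue, String value) {
            this.beforeValue = beforeValue;
            this.value = value;
        }
    }

    // UPDATE: beforeValue holds the old column value, value holds the new one.
    static Value update(String oldVal, String newVal) {
        return new Value(oldVal, newVal);
    }

    // INSERT: there is no previous row, so both fields hold the new value.
    static Value insert(String newVal) {
        return new Value(newVal, newVal);
    }

    // DELETE: both fields hold the pre-deletion value, so it stays accessible.
    static Value delete(String oldVal) {
        return new Value(oldVal, oldVal);
    }

    public static void main(String[] args) {
        Map<String, Value> row = new HashMap<String, Value>();
        row.put("name", update("aaa", "aaac"));
        System.out.println(row.get("name").beforeValue + " -> " + row.get("name").value);
    }
}
```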

  • CanalDataParser

Parses the binlog data. The code is shown below.

package io.mykit.canal.demo.utils;

import java.text.SimpleDateFormat;
import java.util.ArrayList;
import java.util.Date;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

import org.apache.commons.lang.SystemUtils;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.util.CollectionUtils;

import com.alibaba.otter.canal.protocol.Message;
import com.alibaba.otter.canal.protocol.CanalEntry.Column;
import com.alibaba.otter.canal.protocol.CanalEntry.Entry;
import com.alibaba.otter.canal.protocol.CanalEntry.EntryType;
import com.alibaba.otter.canal.protocol.CanalEntry.EventType;
import com.alibaba.otter.canal.protocol.CanalEntry.RowChange;
import com.alibaba.otter.canal.protocol.CanalEntry.RowData;
import com.alibaba.otter.canal.protocol.CanalEntry.TransactionBegin;
import com.alibaba.otter.canal.protocol.CanalEntry.TransactionEnd;
import com.google.protobuf.InvalidProtocolBufferException;

/**
 * Parses the data.
 */
public class CanalDataParser {

    protected static final String DATE_FORMAT = "yyyy-MM-dd HH:mm:ss";
    protected static final String yyyyMMddHHmmss = "yyyyMMddHHmmss";
    protected static final String yyyyMMdd = "yyyyMMdd";
    protected static final String SEP = SystemUtils.LINE_SEPARATOR;

    protected static String context_format = null;
    protected static String row_format = null;
    protected static String transaction_format = null;
    protected static String row_log = null;

    private static Logger logger = LoggerFactory.getLogger(CanalDataParser.class);

    static {
        context_format = SEP + "****************************************************" + SEP;
        context_format += "* Batch Id: [{}] ,count : [{}] , memsize : [{}] , Time : {}" + SEP;
        context_format += "* Start : [{}] " + SEP;
        context_format += "* End : [{}] " + SEP;
        context_format += "****************************************************" + SEP;
        row_format = SEP
                + "----------------> binlog[{}:{}] , name[{},{}] , eventType : {} , executeTime : {} , delay : {}ms"
                + SEP;
        transaction_format = SEP + "================> binlog[{}:{}] , executeTime : {} , delay : {}ms" + SEP;
        row_log = "schema[{}], table[{}]";
    }

    public static List<InnerBinlogEntry> convertToInnerBinlogEntry(Message message) {
        List<InnerBinlogEntry> innerBinlogEntryList = new ArrayList<InnerBinlogEntry>();
        if (message == null) {
            logger.info("Received an empty message; ignoring it.");
            return innerBinlogEntryList;
        }
        long batchId = message.getId();
        int size = message.getEntries().size();
        if (batchId == -1 || size == 0) {
            logger.info("Received an empty message[size=" + size + "]; ignoring it.");
            return innerBinlogEntryList;
        }
        printLog(message, batchId, size);
        List<Entry> entrys = message.getEntries();
        // Print the log
        for (Entry entry : entrys) {
            long executeTime = entry.getHeader().getExecuteTime();
            long delayTime = new Date().getTime() - executeTime;
            if (entry.getEntryType() == EntryType.TRANSACTIONBEGIN || entry.getEntryType() == EntryType.TRANSACTIONEND) {
                if (entry.getEntryType() == EntryType.TRANSACTIONBEGIN) {
                    TransactionBegin begin = null;
                    try {
                        begin = TransactionBegin.parseFrom(entry.getStoreValue());
                    } catch (InvalidProtocolBufferException e) {
                        throw new RuntimeException("parse event has an error , data:" + entry.toString(), e);
                    }
                    // Print the transaction header: the id of the executing thread
                    logger.info("BEGIN ----> Thread id: {}", begin.getThreadId());
                    logger.info(transaction_format, new Object[] { entry.getHeader().getLogfileName(),
                            String.valueOf(entry.getHeader().getLogfileOffset()),
                            String.valueOf(entry.getHeader().getExecuteTime()), String.valueOf(delayTime) });
                } else if (entry.getEntryType() == EntryType.TRANSACTIONEND) {
                    TransactionEnd end = null;
                    try {
                        end = TransactionEnd.parseFrom(entry.getStoreValue());
                    } catch (InvalidProtocolBufferException e) {
                        throw new RuntimeException("parse event has an error , data:" + entry.toString(), e);
                    }
                    // Print the transaction commit info: the transaction id
                    logger.info("END ----> transaction id: {}", end.getTransactionId());
                    logger.info(transaction_format, new Object[] { entry.getHeader().getLogfileName(),
                            String.valueOf(entry.getHeader().getLogfileOffset()),
                            String.valueOf(entry.getHeader().getExecuteTime()), String.valueOf(delayTime) });
                }
                continue;
            }
            // Parse the result
            if (entry.getEntryType() == EntryType.ROWDATA) {
                RowChange rowChage = null;
                try {
                    rowChage = RowChange.parseFrom(entry.getStoreValue());
                } catch (Exception e) {
                    throw new RuntimeException("parse event has an error , data:" + entry.toString(), e);
                }
                EventType eventType = rowChage.getEventType();
                logger.info(row_format, new Object[] { entry.getHeader().getLogfileName(),
                        String.valueOf(entry.getHeader().getLogfileOffset()), entry.getHeader().getSchemaName(),
                        entry.getHeader().getTableName(), eventType,
                        String.valueOf(entry.getHeader().getExecuteTime()), String.valueOf(delayTime) });
                // Assemble the result
                if (eventType == EventType.INSERT || eventType == EventType.DELETE || eventType == EventType.UPDATE) {
                    String schemaName = entry.getHeader().getSchemaName();
                    String tableName = entry.getHeader().getTableName();
                    List<Map<String, BinlogValue>> rows = parseEntry(entry);
                    InnerBinlogEntry innerBinlogEntry = new InnerBinlogEntry();
                    innerBinlogEntry.setEntry(entry);
                    innerBinlogEntry.setEventType(eventType);
                    innerBinlogEntry.setSchemaName(schemaName);
                    innerBinlogEntry.setTableName(tableName.toLowerCase());
                    innerBinlogEntry.setRows(rows);
                    innerBinlogEntryList.add(innerBinlogEntry);
                } else {
                    logger.info("Found SQL other than INSERT, DELETE, and UPDATE [" + eventType.toString() + "]");
                }
                continue;
            }
        }
        return innerBinlogEntryList;
    }

    private static List<Map<String, BinlogValue>> parseEntry(Entry entry) {
        List<Map<String, BinlogValue>> rows = new ArrayList<Map<String, BinlogValue>>();
        try {
            String schemaName = entry.getHeader().getSchemaName();
            String tableName = entry.getHeader().getTableName();
            RowChange rowChage = RowChange.parseFrom(entry.getStoreValue());
            EventType eventType = rowChage.getEventType();
            // Process every row in each Entry
            for (RowData rowData : rowChage.getRowDatasList()) {
                StringBuilder rowlog = new StringBuilder("rowlog schema[" + schemaName + "], table[" + tableName + "], event[" + eventType.toString() + "]");
                Map<String, BinlogValue> row = new HashMap<String, BinlogValue>();
                List<Column> beforeColumns = rowData.getBeforeColumnsList();
                List<Column> afterColumns = rowData.getAfterColumnsList();
                if (eventType == EventType.DELETE) { // delete
                    for (Column column : beforeColumns) {
                        BinlogValue binlogValue = new BinlogValue();
                        binlogValue.setValue(column.getValue());
                        binlogValue.setBeforeValue(column.getValue());
                        row.put(column.getName(), binlogValue);
                    }
                } else if (eventType == EventType.UPDATE) { // update
                    for (Column column : beforeColumns) {
                        BinlogValue binlogValue = new BinlogValue();
                        binlogValue.setBeforeValue(column.getValue());
                        row.put(column.getName(), binlogValue);
                    }
                    for (Column column : afterColumns) {
                        BinlogValue binlogValue = row.get(column.getName());
                        if (binlogValue == null) {
                            binlogValue = new BinlogValue();
                        }
                        binlogValue.setValue(column.getValue());
                        row.put(column.getName(), binlogValue);
                    }
                } else { // insert
                    for (Column column : afterColumns) {
                        BinlogValue binlogValue = new BinlogValue();
                        binlogValue.setValue(column.getValue());
                        binlogValue.setBeforeValue(column.getValue());
                        row.put(column.getName(), binlogValue);
                    }
                }
                rows.add(row);
                String rowjson = JacksonUtil.obj2str(row);
                logger.info("#################################### Data Parse Result ####################################");
                logger.info(rowlog + " , " + rowjson);
                logger.info("#################################### Data Parse Result ####################################");
                logger.info("");
            }
        } catch (InvalidProtocolBufferException e) {
            throw new RuntimeException("parseEntry has an error , data:" + entry.toString(), e);
        }
        return rows;
    }

    private static void printLog(Message message, long batchId, int size) {
        long memsize = 0;
        for (Entry entry : message.getEntries()) {
            memsize += entry.getHeader().getEventLength();
        }
        String startPosition = null;
        String endPosition = null;
        if (!CollectionUtils.isEmpty(message.getEntries())) {
            startPosition = buildPositionForDump(message.getEntries().get(0));
            endPosition = buildPositionForDump(message.getEntries().get(message.getEntries().size() - 1));
        }
        SimpleDateFormat format = new SimpleDateFormat(DATE_FORMAT);
        logger.info(context_format, new Object[] { batchId, size, memsize, format.format(new Date()), startPosition, endPosition });
    }

    private static String buildPositionForDump(Entry entry) {
        long time = entry.getHeader().getExecuteTime();
        Date date = new Date(time);
        SimpleDateFormat format = new SimpleDateFormat(DATE_FORMAT);
        return entry.getHeader().getLogfileName() + ":" + entry.getHeader().getLogfileOffset() + ":" + entry.getHeader().getExecuteTime() + "(" + format.format(date) + ")";
    }
}

  • DateUtils

A date/time utility class. The code is shown below.

package io.mykit.canal.demo.utils;

import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;

public class DateUtils {

    private static final String FORMAT_PATTERN = "yyyy-MM-dd HH:mm:ss";

    private static SimpleDateFormat sdf = new SimpleDateFormat(FORMAT_PATTERN);

    public static Date parseDate(String datetime) throws ParseException {
        if (datetime != null && !"".equals(datetime)) {
            return sdf.parse(datetime);
        }
        return null;
    }

    public static String formatDate(Date datetime) throws ParseException {
        if (datetime != null) {
            return sdf.format(datetime);
        }
        return null;
    }

    public static Long formatStringDateToLong(String datetime) throws ParseException {
        if (datetime != null && !"".equals(datetime)) {
            Date d = sdf.parse(datetime);
            return d.getTime();
        }
        return null;
    }

    public static Long formatDateToLong(Date datetime) throws ParseException {
        if (datetime != null) {
            return datetime.getTime();
        }
        return null;
    }
}
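One caveat about DateUtils: SimpleDateFormat is not thread-safe, yet a single instance is shared by every caller. That is fine while parsing happens on one thread, as in this demo, but if the sync loop is ever parallelized, an immutable java.time formatter avoids the issue. A sketch (Java 8+; the SafeDateUtils name is mine, not part of the project):

```java
import java.time.Instant;
import java.time.LocalDateTime;
import java.time.ZoneId;
import java.time.format.DateTimeFormatter;

class SafeDateUtils {
    // DateTimeFormatter is immutable, so one shared instance is safe across threads.
    private static final DateTimeFormatter FORMATTER =
            DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss");

    // Parses "yyyy-MM-dd HH:mm:ss" into epoch milliseconds in the system time zone.
    static Long formatStringDateToLong(String datetime) {
        if (datetime == null || datetime.isEmpty()) {
            return null;
        }
        LocalDateTime ldt = LocalDateTime.parse(datetime, FORMATTER);
        return ldt.atZone(ZoneId.systemDefault()).toInstant().toEpochMilli();
    }

    // Formats epoch milliseconds back to "yyyy-MM-dd HH:mm:ss" in the system time zone.
    static String formatDate(long epochMillis) {
        return Instant.ofEpochMilli(epochMillis)
                .atZone(ZoneId.systemDefault())
                .toLocalDateTime()
                .format(FORMATTER);
    }

    public static void main(String[] args) {
        String s = "2020-08-05 23:25:35";
        System.out.println(formatDate(formatStringDateToLong(s)));
    }
}
```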

  • InnerBinlogEntry

The binlog entity class. The code is shown below.

package io.mykit.canal.demo.utils;

import java.util.ArrayList;
import java.util.List;
import java.util.Map;

import com.alibaba.otter.canal.protocol.CanalEntry.Entry;
import com.alibaba.otter.canal.protocol.CanalEntry.EventType;

public class InnerBinlogEntry {

    /**
     * The native canal Entry.
     */
    private Entry entry;

    /**
     * The name of the table this Entry belongs to.
     */
    private String tableName;

    /**
     * The name of the database (schema) this Entry belongs to.
     */
    private String schemaName;

    /**
     * The operation type of this Entry, mapped to canal's native enum:
     * EventType.INSERT, EventType.UPDATE, EventType.DELETE.
     */
    private EventType eventType;

    private List<Map<String, BinlogValue>> rows = new ArrayList<Map<String, BinlogValue>>();

    public Entry getEntry() {
        return entry;
    }

    public void setEntry(Entry entry) {
        this.entry = entry;
    }

    public String getTableName() {
        return tableName;
    }

    public void setTableName(String tableName) {
        this.tableName = tableName;
    }

    public EventType getEventType() {
        return eventType;
    }

    public void setEventType(EventType eventType) {
        this.eventType = eventType;
    }

    public String getSchemaName() {
        return schemaName;
    }

    public void setSchemaName(String schemaName) {
        this.schemaName = schemaName;
    }

    public List<Map<String, BinlogValue>> getRows() {
        return rows;
    }

    public void setRows(List<Map<String, BinlogValue>> rows) {
        this.rows = rows;
    }
}

  • JacksonUtil

A JSON utility class. The code is shown below.

package io.mykit.canal.demo.utils;

import java.io.IOException;

import org.codehaus.jackson.JsonGenerationException;
import org.codehaus.jackson.JsonParseException;
import org.codehaus.jackson.map.JsonMappingException;
import org.codehaus.jackson.map.ObjectMapper;

public class JacksonUtil {

    private static ObjectMapper mapper = new ObjectMapper();

    public static String obj2str(Object obj) {
        String json = null;
        try {
            json = mapper.writeValueAsString(obj);
        } catch (JsonGenerationException e) {
            e.printStackTrace();
        } catch (JsonMappingException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        }
        return json;
    }

    public static <T> T str2obj(String content, Class<T> valueType) {
        try {
            return mapper.readValue(content, valueType);
        } catch (JsonParseException e) {
            e.printStackTrace();
        } catch (JsonMappingException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        }
        return null;
    }
}

Implementing the Synchronization Program

With the entity class and utility classes in place, we can write the synchronization program that syncs MySQL data to the Solr index in real time. Create a SyncDataBootStart class in the io.mykit.canal.demo.main package, as shown below.

package io.mykit.canal.demo.main;

import io.mykit.canal.demo.bean.Book;
import io.mykit.canal.demo.utils.BinlogValue;
import io.mykit.canal.demo.utils.CanalDataParser;
import io.mykit.canal.demo.utils.DateUtils;
import io.mykit.canal.demo.utils.InnerBinlogEntry;
import com.alibaba.otter.canal.client.CanalConnector;
import com.alibaba.otter.canal.client.CanalConnectors;
import com.alibaba.otter.canal.protocol.CanalEntry;
import com.alibaba.otter.canal.protocol.Message;
import org.apache.solr.client.solrj.SolrServer;
import org.apache.solr.client.solrj.impl.HttpSolrServer;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import java.net.InetSocketAddress;
import java.util.List;
import java.util.Map;

public class SyncDataBootStart {

    private static Logger logger = LoggerFactory.getLogger(SyncDataBootStart.class);

    public static void main(String[] args) throws Exception {
        String hostname = "192.168.175.100";
        Integer port = 11111;
        String destination = "example";
        // Obtain a connection to the Canal Server
        CanalConnector canalConnector = CanalConnectors.newSingleConnector(new InetSocketAddress(hostname, port), destination, "", "");
        // Connect to the Canal Server
        canalConnector.connect();
        // Subscribe to the destination
        canalConnector.subscribe();
        // Poll for data
        Integer batchSize = 5 * 1024;
        while (true) {
            Message message = canalConnector.getWithoutAck(batchSize);
            long messageId = message.getId();
            int size = message.getEntries().size();
            if (messageId == -1 || size == 0) {
                try {
                    Thread.sleep(1000);
                } catch (InterruptedException e) {
                    e.printStackTrace();
                }
            } else {
                // Synchronize the data
                // 1. Parse the Message object
                List<InnerBinlogEntry> innerBinlogEntries = CanalDataParser.convertToInnerBinlogEntry(message);
                // 2. Sync the parsed data into the Solr index
                syncDataToSolr(innerBinlogEntries);
            }
            // Acknowledge the batch
            canalConnector.ack(messageId);
        }
    }

    private static void syncDataToSolr(List<InnerBinlogEntry> innerBinlogEntries) throws Exception {
        // Obtain a Solr connection
        SolrServer solrServer = new HttpSolrServer("http://192.168.175.101:8080/solr");
        // Walk the entries; each entry's event type decides whether to add, update, or delete
        if (innerBinlogEntries != null) {
            for (InnerBinlogEntry innerBinlogEntry : innerBinlogEntries) {
                CanalEntry.EventType eventType = innerBinlogEntry.getEventType();
                // For INSERT and UPDATE, sync the data into the Solr index
                if (eventType == CanalEntry.EventType.INSERT || eventType == CanalEntry.EventType.UPDATE) {
                    List<Map<String, BinlogValue>> rows = innerBinlogEntry.getRows();
                    if (rows != null) {
                        for (Map<String, BinlogValue> row : rows) {
                            BinlogValue id = row.get("id");
                            BinlogValue name = row.get("name");
                            BinlogValue author = row.get("author");
                            BinlogValue publishtime = row.get("publishtime");
                            BinlogValue price = row.get("price");
                            BinlogValue publishgroup = row.get("publishgroup");
                            Book book = new Book();
                            book.setId(Integer.parseInt(id.getValue()));
                            book.setName(name.getValue());
                            book.setAuthor(author.getValue());
                            book.setPrice(Double.parseDouble(price.getValue()));
                            book.setPublishgroup(publishgroup.getValue());
                            book.setPublishtime(DateUtils.parseDate(publishtime.getValue()));
                            // Import the data into the Solr index
                            solrServer.addBean(book);
                            solrServer.commit();
                        }
                    }
                } else if (eventType == CanalEntry.EventType.DELETE) {
                    // For DELETE, remove the data from the Solr index
                    List<Map<String, BinlogValue>> rows = innerBinlogEntry.getRows();
                    if (rows != null) {
                        for (Map<String, BinlogValue> row : rows) {
                            BinlogValue id = row.get("id");
                            // Delete from the Solr index by id
                            solrServer.deleteById(id.getValue());
                            solrServer.commit();
                        }
                    }
                }
            }
        }
    }
}
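To see why this handling is safe to replay, picture the Solr core as a map keyed by document id: INSERT and UPDATE both reduce to an upsert (adding a bean with the same id replaces the existing document, assuming id is the schema's uniqueKey), and DELETE removes the key. A self-contained sketch with a HashMap standing in for the Solr core (class and method names are mine; no Canal or Solr required):

```java
import java.util.HashMap;
import java.util.Map;

class IndexMirrorDemo {
    // The map stands in for the Solr core: key = document id, value = document body.
    private final Map<String, String> index = new HashMap<String, String>();

    // INSERT and UPDATE are both handled as an upsert, mirroring solrServer.addBean(book).
    void upsert(String id, String doc) {
        index.put(id, doc);
    }

    // DELETE mirrors solrServer.deleteById(id).
    void delete(String id) {
        index.remove(id);
    }

    String get(String id) {
        return index.get(id);
    }

    int size() {
        return index.size();
    }

    public static void main(String[] args) {
        IndexMirrorDemo mirror = new IndexMirrorDemo();
        mirror.upsert("21", "aaa");   // INSERT
        mirror.upsert("21", "aaac");  // UPDATE arrives as another upsert
        System.out.println(mirror.get("21") + ", docs=" + mirror.size());
        mirror.delete("21");          // DELETE removes the document by id
    }
}
```

Because every operation is keyed by id, replaying the same batch (for example after a failed ack and re-delivery) converges to the same index state.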

Next, start the main method of SyncDataBootStart. It listens to the Canal Server, while the Canal Server watches the MySQL binlog for changes. As soon as the MySQL binlog changes, SyncDataBootStart immediately receives the change event, parses it into a Book object, and updates the Solr index in real time. Likewise, if data is deleted from the MySQL database, the corresponding documents are deleted from the Solr index in real time.

Parts of this article reference the official Canal documentation: https://github.com/alibaba/canal
