Consistent hashing is widely used in distributed systems, for example in gateway load balancing, database sharding, and distributed caching. Its goal is to route requests for the same resource to the same server as consistently as possible.
The code in this article implements clockwise lookup only.
Core operations:
- Add a node together with N virtual nodes
- Remove a node and its associated virtual nodes
- Find the next node clockwise from a given point
- Get the range of points affected by a node change, along with the next node that will receive those requests
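The clockwise lookup at the heart of the list above reduces to `TreeMap.ceilingEntry` with a wrap-around to the first entry. A minimal standalone sketch (class name and ring values are illustrative, not part of the code below):

```java
import java.util.Map;
import java.util.TreeMap;

public class ClockwiseLookup {
    // Clockwise lookup: the first node at or after the point, wrapping to the first entry
    static String lookup(TreeMap<Integer, String> ring, int point) {
        Map.Entry<Integer, String> entry = ring.ceilingEntry(point);
        return entry != null ? entry.getValue() : ring.firstEntry().getValue();
    }

    public static void main(String[] args) {
        TreeMap<Integer, String> ring = new TreeMap<>();
        ring.put(1000, "A");
        ring.put(5000, "B");
        ring.put(9000, "C");
        System.out.println(lookup(ring, 4000)); // B: first node at or past 4000
        System.out.println(lookup(ring, 9500)); // A: nothing past 9500, so wrap around
    }
}
```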
Algorithm in action
// Create the hash ring
ConsistentHashLoop hashLoop = new ConsistentHashLoop(10000);
ConsistentHash consistentHash = ConsistentHash.fro(hashLoop);
// Add one physical node plus 3 matching virtual nodes: server A
String server1 = "服务器节点A";
consistentHash.putNode(new ConsistentHashNode(consistentHash.getPoint(server1), server1), 3);
// Add one physical node plus 3 matching virtual nodes: server B
String server2 = "服务器节点B";
consistentHash.putNode(new ConsistentHashNode(consistentHash.getPoint(server2), server2), 3);
// Simulate routing requests to nodes
String request1 = "127.0.0.1";
System.out.println("请求:" + request1 + ",分配节点:" + consistentHash.getNode(request1).getTarget());
String request2 = "192.168.0.1";
System.out.println("请求:" + request2 + ",分配节点:" + consistentHash.getNode(request2).getTarget());
// Remove a node
PointScope[] pointScopes = consistentHash.removeNode(consistentHash.getPoint(server1));
System.out.println("删除节点影响范围:" + Arrays.toString(pointScopes));
请求:127.0.0.1,分配节点:服务器节点B
请求:192.168.0.1,分配节点:服务器节点A
删除节点影响范围:[PointScope(start=1582, end=6424, nextNode=ConsistentHashNode(point=6425, target=服务器节点B)), PointScope(start=6965, end=250, nextNode=ConsistentHashNode(point=1523, target=服务器节点B))]
Process finished with exit code 0
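The demo above uses 3 virtual nodes per server; more virtual nodes spread load more evenly around the ring. A self-contained simulation of that effect (it uses `String.hashCode` instead of the MD5-based `getPoint` shown later, purely for illustration):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.TreeMap;

public class VirtualNodeDemo {
    static final int POINT_COUNT = 10000;

    // Simplified point derivation: hash the string, modulo the ring size
    static int point(String s) {
        return Math.abs(s.hashCode() % POINT_COUNT);
    }

    // Builds a ring with the given number of virtual nodes per physical node,
    // then routes 10000 simulated requests and counts each node's load
    static Map<String, Integer> simulate(int virtualPerNode) {
        TreeMap<Integer, String> ring = new TreeMap<>();
        for (String node : new String[]{"A", "B", "C"}) {
            for (int i = 0; i < virtualPerNode; i++) {
                ring.put(point(node + "#V_" + i), node);
            }
        }
        Map<String, Integer> load = new HashMap<>();
        for (int i = 0; i < 10000; i++) {
            Map.Entry<Integer, String> e = ring.ceilingEntry(point("request-" + i));
            String node = e != null ? e.getValue() : ring.firstEntry().getValue();
            load.merge(node, 1, Integer::sum);
        }
        return load;
    }

    public static void main(String[] args) {
        System.out.println(simulate(1));   // few ring points: load is often lopsided
        System.out.println(simulate(50));  // more virtual nodes: load evens out
    }
}
```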
Node entity
Stores the node's information.
/**
* Consistent hash node
*
* @author Bin
* @date 2020-07-09 10:23
*/
@Data
@NoArgsConstructor
@AllArgsConstructor
public class ConsistentHashNode implements Serializable {
private Integer point;
private Object target;
}
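For readers not using Lombok: the @Data, @NoArgsConstructor, and @AllArgsConstructor annotations expand to roughly the following plain-Java class (equals/hashCode/toString omitted for brevity):

```java
import java.io.Serializable;

public class ConsistentHashNode implements Serializable {
    private Integer point;   // position on the hash ring
    private Object target;   // the payload, e.g. a server identifier

    public ConsistentHashNode() { }

    public ConsistentHashNode(Integer point, Object target) {
        this.point = point;
        this.target = target;
    }

    public Integer getPoint() { return point; }
    public void setPoint(Integer point) { this.point = point; }
    public Object getTarget() { return target; }
    public void setTarget(Object target) { this.target = target; }
}
```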
Hash ring entity
The hash ring entity class; the underlying data structure is a TreeMap.
/**
* Consistent hash ring
*
* @author Bin
* @date 2020-07-09 10:18
*/
@Data
public class ConsistentHashLoop implements Serializable {
private Integer pointCount;
private TreeMap<Integer, ConsistentHashNode> nodes;
public ConsistentHashLoop() {
}
public ConsistentHashLoop(Integer pointCount) {
this.pointCount = pointCount;
}
}
Affected-range entity
Returned when a node is added or removed; it contains the start and end points of the affected range, plus the next node that can take over those requests.
/**
* Consistent hash point range
*
* @author Bin
* @date 2020-07-09 22:58
*/
@Data
@NoArgsConstructor
@AllArgsConstructor
public class PointScope {
/**
* Start point (inclusive)
*/
private Integer start;
/**
* End point (inclusive)
* When the end point is less than the start point, the range wraps around the ring and also covers every point up to the end
*/
private Integer end;
/**
* The next node after the end point
*/
private ConsistentHashNode nextNode;
}
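The wrap-around convention documented on `end` can be exercised with a small helper; `contains` here is an illustrative name, not a method of PointScope:

```java
public class ScopeCheck {
    // True if point lies in [start, end] clockwise, wrapping past the ring origin
    static boolean contains(int start, int end, int point) {
        if (start <= end) {
            return point >= start && point <= end;
        }
        // Wrapped range, e.g. start=6965, end=250 from the demo output above
        return point >= start || point <= end;
    }

    public static void main(String[] args) {
        System.out.println(contains(1582, 6424, 2000)); // true: plain range
        System.out.println(contains(6965, 250, 100));   // true: wrapped range
        System.out.println(contains(6965, 250, 500));   // false: outside both arcs
    }
}
```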
Core operations class
Implements the core operations: adding, removing, and looking up nodes.
Replace Assert.class (an assertion utility) and SystemException.class (a custom exception) with your own equivalents.
/**
* Consistent hashing algorithm
*
* @author Bin
* @date 2020-07-09 12:14
*/
public class ConsistentHash {
private ConsistentHashLoop consistentHashLoop;
private ConsistentHash(ConsistentHashLoop consistentHashLoop) {
this.consistentHashLoop = consistentHashLoop;
}
public ConsistentHashLoop getConsistentHashLoop() {
return consistentHashLoop;
}
public static ConsistentHash fro(ConsistentHashLoop consistentHashLoop) {
return new ConsistentHash(consistentHashLoop).init();
}
private ConsistentHash init() {
Assert.nonNull(this.consistentHashLoop).orThrows(() -> new SystemException("loop can't be null"));
TreeMap<Integer, ConsistentHashNode> nodes = this.consistentHashLoop.getNodes();
if (nodes == null) {
this.consistentHashLoop.setNodes(new TreeMap<>());
}
return this;
}
/**
* Get the node sitting exactly on a string's point
*
* @param str string
* @return the node, or null if no node occupies that exact point
*/
public ConsistentHashNode getAccurateNode(String str) {
int point = this.getPoint(str);
ConsistentHashNode node = this.getNode(point);
return node != null && node.getPoint() == point ? node : null;
}
/**
* Find a node, searching clockwise
*
* @param str string
* @return node information
*/
public ConsistentHashNode getNode(String str) {
int point = this.getPoint(str);
return this.getNode(point);
}
/**
* Find a node, searching clockwise
*
* @param point point on the ring
* @return node information
*/
public ConsistentHashNode getNode(Integer point) {
TreeMap<Integer, ConsistentHashNode> nodes = this.consistentHashLoop.getNodes();
if (nodes.size() == 0) {
return null;
}
Map.Entry<Integer, ConsistentHashNode> entry = nodes.ceilingEntry(point);
return entry != null ? entry.getValue() : nodes.firstEntry().getValue();
}
/**
* Map a string to a point on the ring
*
* @param str string
* @return point
*/
public int getPoint(String str) {
int hashCode = DigestUtils.md5DigestAsHex(str.getBytes()).hashCode();
return Math.abs(hashCode % this.consistentHashLoop.getPointCount());
}
/**
* Add a node
*
* @param node node information
* @return affected point range
*/
public PointScope putNode(ConsistentHashNode node) {
PointScope[] pointScopes = putNode(node, 0);
return pointScopes != null && pointScopes.length > 0 ? pointScopes[0] : null;
}
/**
* Add a node
*
* @param node node information
* @param virtualCount number of virtual nodes
* @return affected point ranges
*/
public PointScope[] putNode(ConsistentHashNode node, Integer virtualCount) {
TreeMap<Integer, ConsistentHashNode> nodes = this.consistentHashLoop.getNodes();
if (node.getPoint() == null) {
throw new SystemException("node point can't be null");
}
List<ConsistentHashNode> list = new ArrayList<>();
nodes.put(node.getPoint(), node);
list.add(node);
for (int i = 0; i < virtualCount; i++) {
int point = getPoint(node.getPoint() + "#V_" + i);
ConsistentHashNode virtualNode = new ConsistentHashNode(point, node.getTarget());
nodes.put(point, virtualNode);
list.add(virtualNode);
}
// If these are the first nodes on the ring, there is no affected range to compute
if (nodes.size() == list.size()) {
return new PointScope[0];
}
// Sort ascending by point
list.sort(Comparator.comparingInt(ConsistentHashNode::getPoint));
// Compute the ranges affected by all newly added nodes
Map<ConsistentHashNode, PointScope> pointScopeMap = new HashMap<>(list.size());
for (ConsistentHashNode endNode : list) {
PointScope pointScope = getPointScope(endNode);
if (pointScope != null) {
pointScopeMap.put(endNode, pointScope);
}
}
return pointScopeMap.values().toArray(new PointScope[0]);
}
/**
* Remove a node
*
* @param point the point the node sits on
* @return affected point ranges
*/
public PointScope[] removeNode(int point) {
TreeMap<Integer, ConsistentHashNode> nodes = this.consistentHashLoop.getNodes();
ConsistentHashNode node = nodes.get(point);
if (node == null) {
return null;
}
// Find every node with the same target and remove them all
List<ConsistentHashNode> list = new ArrayList<>();
this.consistentHashLoop.getNodes().forEach((key, value) -> {
if (Objects.equals(value.getTarget(), node.getTarget())) {
list.add(value);
}
});
list.stream().map(ConsistentHashNode::getPoint).forEach(nodes::remove);
// Sort ascending by point
list.sort(Comparator.comparingInt(ConsistentHashNode::getPoint));
// Compute the ranges affected by all removed nodes
Map<ConsistentHashNode, PointScope> pointScopeMap = new HashMap<>(list.size());
Map<ConsistentHashNode, List<ConsistentHashNode>> nextNodeMap = new HashMap<>(list.size());
for (ConsistentHashNode endNode : list) {
PointScope pointScope = getPointScope(endNode);
if (pointScope != null) {
pointScopeMap.put(endNode, pointScope);
if (pointScope.getNextNode() != null) {
nextNodeMap.computeIfAbsent(pointScope.getNextNode(), (obj) -> new ArrayList<>()).add(endNode);
}
}
}
// If only one node was removed, return directly
if (pointScopeMap.size() <= 1) {
return pointScopeMap.values().toArray(new PointScope[0]);
}
// Collapse overlapping affected ranges into one
List<PointScope> result = new ArrayList<>(nextNodeMap.size());
for (Map.Entry<ConsistentHashNode, List<ConsistentHashNode>> entry : nextNodeMap.entrySet()) {
ConsistentHashNode nextNode = entry.getKey();
List<ConsistentHashNode> nodeList = entry.getValue();
if (nodeList.size() == 1) {
result.add(pointScopeMap.get(nodeList.get(0)));
continue;
}
ConsistentHashNode longestNode = geLongestDistance(nodeList, nextNode.getPoint());
PointScope pointScope = pointScopeMap.get(longestNode);
// End just before the next surviving node, wrapping when it sits at point 0
int end = nextNode.getPoint() == 0 ? this.consistentHashLoop.getPointCount() - 1 : nextNode.getPoint() - 1;
pointScope.setEnd(end);
result.add(pointScope);
}
return result.toArray(new PointScope[0]);
}
/**
* Get the affected point range ending at a node
*
* @param endNode end node
* @return affected point range
*/
public PointScope getPointScope(ConsistentHashNode endNode) {
TreeMap<Integer, ConsistentHashNode> nodes = this.consistentHashLoop.getNodes();
if (nodes.size() == 0) {
return null;
}
Map.Entry<Integer, ConsistentHashNode> entry = nodes.lowerEntry(endNode.getPoint());
ConsistentHashNode startNode = entry != null ? entry.getValue() : nodes.lastEntry().getValue();
return getPointScope(startNode, endNode);
}
/**
* Get the affected point range
*
* @param startNode start node
* @param endNode end node
* @return affected point range
*/
public PointScope getPointScope(ConsistentHashNode startNode, ConsistentHashNode endNode) {
if (startNode == endNode || startNode.getPoint().equals(endNode.getPoint())) {
return null;
}
// Find the next node after the affected range
ConsistentHashNode nextNode = null;
TreeMap<Integer, ConsistentHashNode> nodes = this.consistentHashLoop.getNodes();
if (nodes.size() > 0) {
Map.Entry<Integer, ConsistentHashNode> entry = nodes.ceilingEntry(endNode.getPoint());
if (entry == null) {
entry = nodes.firstEntry();
}
nextNode = entry.getValue();
}
// Keep the start point on the ring (valid points run from 0 to pointCount - 1)
Integer pointCount = this.getConsistentHashLoop().getPointCount();
int startPoint = (startNode.getPoint() + 1) % pointCount;
return new PointScope(startPoint, endNode.getPoint(), nextNode);
}
/**
* Clockwise distance from a point to a target point
*
* @param point point
* @param target target point
* @return clockwise distance
*/
private int getTargetDistance(int point, int target) {
if (point <= target) {
return target - point;
}
// Wrap past the ring origin: from the point to the end of the ring, then on to the target
return this.consistentHashLoop.getPointCount() - point + target;
}
/**
* Get the node farthest (clockwise) from the target point
*
* @param points node list
* @param target target point
* @return the node with the longest clockwise distance
*/
private ConsistentHashNode geLongestDistance(List<ConsistentHashNode> points, int target) {
ConsistentHashNode result = null;
int longest = -1;
for (ConsistentHashNode point : points) {
int distance = getTargetDistance(point.getPoint(), target);
if (result == null || distance > longest) {
result = point;
longest = distance;
}
}
return result;
}
}
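`getPoint` above depends on Spring's DigestUtils. A dependency-free sketch of the same mapping using `java.security.MessageDigest` (assuming the intent is simply a stable, roughly uniform string-to-point mapping; the class name is illustrative):

```java
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class PointCalc {
    // Maps an arbitrary string to a point in [0, pointCount)
    static int getPoint(String str, int pointCount) {
        try {
            byte[] digest = MessageDigest.getInstance("MD5").digest(str.getBytes());
            StringBuilder hex = new StringBuilder();
            for (byte b : digest) {
                hex.append(String.format("%02x", b));
            }
            // Same shape as the original: the hex string's hashCode modulo the ring size
            return Math.abs(hex.toString().hashCode() % pointCount);
        } catch (NoSuchAlgorithmException e) {
            throw new IllegalStateException(e);
        }
    }

    public static void main(String[] args) {
        // The same input always lands on the same point
        System.out.println(getPoint("127.0.0.1", 10000));
        System.out.println(getPoint("127.0.0.1", 10000) == getPoint("127.0.0.1", 10000)); // true
    }
}
```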
If this helped you, leave a like before you go~