Five common ways to remove duplicates from a List:
1. HashSet rejects duplicate elements but does not preserve insertion order, so here it is used only as a membership check; the original order is kept because elements are appended to the result in iteration order.
private static void removeDuplicate(List<String> list) {
    HashSet<String> set = new HashSet<String>(list.size());
    List<String> result = new ArrayList<String>(list.size());
    for (String str : list) {
        if (set.add(str)) {
            result.add(str);
        }
    }
    list.clear();
    list.addAll(result);
}
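A minimal usage sketch of the helper above (the class name DedupDemo and the sample data are illustrative):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashSet;
import java.util.List;

public class DedupDemo {
    // Same technique as above: the HashSet filters, the result list keeps order.
    static void removeDuplicate(List<String> list) {
        HashSet<String> set = new HashSet<>(list.size());
        List<String> result = new ArrayList<>(list.size());
        for (String str : list) {
            if (set.add(str)) {   // add() returns false for a duplicate
                result.add(str);
            }
        }
        list.clear();
        list.addAll(result);
    }

    public static void main(String[] args) {
        List<String> list = new ArrayList<>(Arrays.asList("a", "b", "a", "c", "b"));
        removeDuplicate(list);
        System.out.println(list); // [a, b, c]
    }
}
```

Note that the first occurrence of each element wins, so the result is in original encounter order.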
2. LinkedHashSet rejects duplicates and also preserves insertion order:
private static void removeDuplicate(List<String> list) {
    LinkedHashSet<String> set = new LinkedHashSet<String>(list.size());
    set.addAll(list);
    list.clear();
    list.addAll(set);
}
3. Use List's contains method (simple, but O(n²), so it is slow for large lists):
private static void removeDuplicate(List<String> list) {
    List<String> result = new ArrayList<String>(list.size());
    for (String str : list) {
        if (!result.contains(str)) {
            result.add(str);
        }
    }
    list.clear();
    list.addAll(result);
}
4. Remove duplicate Strings with the Stream API's distinct() (requires java.util.stream.Collectors):

List<String> uniqueStr = list.stream().distinct().collect(Collectors.toList());
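A self-contained sketch of the distinct() call (class name and sample data are illustrative):

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class DistinctDemo {
    public static void main(String[] args) {
        List<String> list = Arrays.asList("x", "y", "x", "z");
        // distinct() compares with equals()/hashCode() and keeps encounter order
        List<String> uniqueStr = list.stream().distinct().collect(Collectors.toList());
        System.out.println(uniqueStr); // [x, y, z]
    }
}
```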
5. De-duplicate by an object property (requires static imports of Collectors.collectingAndThen and Collectors.toCollection):

users = users.stream().collect(collectingAndThen(toCollection(() -> new TreeSet<>(Comparator.comparing(UserInDeptDto::getUserId))), ArrayList::new));
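UserInDeptDto is the author's own DTO; a runnable sketch of the same collectingAndThen/TreeSet idiom, using a hypothetical User record in its place:

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;
import java.util.TreeSet;

import static java.util.stream.Collectors.collectingAndThen;
import static java.util.stream.Collectors.toCollection;

public class PropertyDedupDemo {
    // Hypothetical stand-in for UserInDeptDto
    record User(long userId, String name) {}

    public static void main(String[] args) {
        List<User> users = new ArrayList<>(List.of(
                new User(1, "Ann"), new User(2, "Bob"), new User(1, "Ann dup")));
        // The TreeSet is keyed on userId, so later entries with the same id are dropped
        users = users.stream().collect(collectingAndThen(
                toCollection(() -> new TreeSet<>(Comparator.comparing(User::userId))),
                ArrayList::new));
        System.out.println(users.size()); // 2
    }
}
```

Because a TreeSet sorts by the comparator key, the resulting list is ordered by userId rather than by original encounter order; if order matters, collect into a map keyed by the property instead.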