java: incompatible types: com.baomidou.mybatisplus.core.conditions.query.QueryWrapper<org.springframework.security.core.userdetails.User> cannot be converted to com.baomidou.mybatisplus.core.conditions.Wrapper<com.wqt.springsecruitydemo1.entity.User>

In short: a QueryWrapper parameterized with Spring Security's core User cannot be converted to the Wrapper of the project's own User entity that the mapper expects.

@Service("userDetailsService2")
public class MyUserDetailService1 implements UserDetailsService {
    @Autowired
    private UserMapper userMapper;

    @Override
    public UserDetails loadUserByUsername(String username) throws UsernameNotFoundException {


        QueryWrapper<Users> wrapper = new QueryWrapper<>();
        wrapper.eq("username", username);
        Users user = userMapper.selectOne(wrapper);
        if(user == null) {
            throw new UsernameNotFoundException("用户名不存在!");
        }
        System.out.println(user);
        List<GrantedAuthority> auth = AuthorityUtils.commaSeparatedStringToAuthorityList("role");
        return new User("ww", new BCryptPasswordEncoder().encode("222"),auth);
    }
}
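
For reference, the Users entity and UserMapper used above are not shown in the post. A minimal sketch of what they might look like, assuming a plain MyBatis-Plus setup (the table name, fields, and packages are assumptions inferred from the error message and the query):

// com/wqt/springsecruitydemo1/entity/Users.java
package com.wqt.springsecruitydemo1.entity;

import com.baomidou.mybatisplus.annotation.TableName;

@TableName("users") // assumed table name
public class Users {
    private Long id;
    private String username;
    private String password;
    // getters and setters omitted
}

// com/wqt/springsecruitydemo1/mapper/UserMapper.java
package com.wqt.springsecruitydemo1.mapper;

import com.baomidou.mybatisplus.core.mapper.BaseMapper;
import com.wqt.springsecruitydemo1.entity.Users;
import org.apache.ibatis.annotations.Mapper;

@Mapper
public interface UserMapper extends BaseMapper<Users> {
    // selectOne(Wrapper<Users>) is inherited from BaseMapper<Users>,
    // which is why the QueryWrapper passed to it must be typed with Users
}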

Cause of the error:

The class that models the username/password record had been named User, which clashed with the User class already used in this file (Spring Security's org.springframework.security.core.userdetails.User), so QueryWrapper<User> resolved to the wrong type. Renaming the entity to Users removes the conflict and the error goes away.
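
Renaming is not the only way out. If you would rather keep the entity named User, you can refer to it by its fully qualified name inside the service, so the bare name User keeps resolving to Spring Security's class for the return value. A minimal sketch of that variant, assuming the original entity and mapper from the error message:

import com.baomidou.mybatisplus.core.conditions.query.QueryWrapper;
import org.springframework.security.core.userdetails.User; // Spring Security's User (return type)

// ... inside loadUserByUsername(String username), with
// UserMapper extends BaseMapper<com.wqt.springsecruitydemo1.entity.User>:

// Use the fully qualified entity name so it does not collide with the
// imported Spring Security User above.
QueryWrapper<com.wqt.springsecruitydemo1.entity.User> wrapper = new QueryWrapper<>();
wrapper.eq("username", username);
com.wqt.springsecruitydemo1.entity.User dbUser = userMapper.selectOne(wrapper);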
