Hive: a custom UDF encryption function fails with: JCE cannot authenticate the provider BC
Fix: add the matching bcprov-jdk15on-xxx.jar to $HIVE_HOME/lib, then restart HiveServer2.
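The fix above can be sketched as a small shell helper (HIVE_HOME, the jar path, and the function name deploy_bc_jar are assumptions; adjust to your install):

```shell
# Hypothetical helper: drop the Bouncy Castle provider jar into Hive's lib
# directory; HiveServer2 must then be restarted to pick it up.
deploy_bc_jar() {
  local jar="$1" hive_lib="$2"
  cp "$jar" "$hive_lib/" || return 1
  echo "copied $(basename "$jar") to $hive_lib -- now restart HiveServer2"
}

# Example (paths are assumptions):
# deploy_bc_jar /tmp/bcprov-jdk15on-1.64.jar "$HIVE_HOME/lib"
```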
StarRocks: a custom UDF encryption function fails with: JCE cannot authenticate the provider BC
1) On every node, append the following to jre/lib/security/java.security under the JDK directory: security.provider.10=org.bouncycastle.jce.provider.BouncyCastleProvider
2) Copy bcprov-jdk15on-xxx.jar into the JDK's jre/lib/ext directory
3) Restart StarRocks
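Steps 1) and 2) above could be scripted per node roughly like this (the JAVA_HOME layout, jar path, provider slot 10, and the function name register_bc_provider are assumptions; run it on every StarRocks node, e.g. via ansible):

```shell
# Hypothetical per-node setup for the Bouncy Castle JCE provider.
register_bc_provider() {
  local java_home="$1" bc_jar="$2"
  local sec="$java_home/jre/lib/security/java.security"
  # 1) register Bouncy Castle as a JCE provider (slot 10 assumed free)
  grep -q 'BouncyCastleProvider' "$sec" || \
    echo 'security.provider.10=org.bouncycastle.jce.provider.BouncyCastleProvider' >> "$sec"
  # 2) place the signed provider jar on the JDK extension classpath
  cp "$bc_jar" "$java_home/jre/lib/ext/"
}
# 3) then restart StarRocks so the JVM picks up the new provider
```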
An alternative is to drop the Oracle JDK and switch to the bundled JDK instead; that approach is not covered here.
Flink on YARN (CDH): adding encryption/decryption logic to a job fails with: JCE cannot authenticate the provider BC. Fix: replace the bcprov jar shipped in the YARN lib directory on all nodes (back up the old one first):
ansible test-yarn -m shell -a "sudo mv /opt/cloudera/parcels/CDH/lib/hadoop-yarn/lib/bcprov-jdk15on-1.60.jar /opt/cloudera/parcels/CDH/lib/hadoop-yarn/lib/bcprov-jdk15on-1.60.jar.back"
ansible test-yarn -m shell -a "sudo ls -la /opt/cloudera/parcels/CDH/lib/hadoop-yarn/lib/bcprov-jdk15on-1.60.jar.back"
sudo ansible test-yarn -m copy -a "src=/app/sit/bcprov-jdk15on-1.64.jar dest=/opt/cloudera/parcels/CDH/lib/hadoop-yarn/lib/"
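A quick sanity check after the swap: only files still named *.jar should be matched by the lib directory's classpath glob, so the renamed .jar.back copy ought to be inert (verify this assumption against your CDH launcher). A hypothetical helper:

```shell
# Hypothetical check: list the bcprov jars that are still active (*.jar)
# in a lib directory; the 1.60 .jar.back backup should not appear.
active_bcprov_jars() {
  ls "$1"/bcprov-jdk15on-*.jar 2>/dev/null
}

# Example (path is an assumption):
# active_bcprov_jars /opt/cloudera/parcels/CDH/lib/hadoop-yarn/lib
```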
Project configuration (pom.xml):
<dependency>
    <groupId>org.bouncycastle</groupId>
    <artifactId>bcprov-jdk15on</artifactId>
    <version>1.64</version>
    <scope>provided</scope>
</dependency>
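With <scope>provided</scope>, Bouncy Castle classes must come only from the 1.64 jar on the YARN nodes and must not be bundled into the job's fat jar. A hypothetical helper to check a jar's entry listing (feed it the output of `jar tf your-job.jar`):

```shell
# Hypothetical check: given a jar's entry listing, report whether Bouncy
# Castle classes were bundled (they must not be when scope is provided).
check_bc_excluded() {
  if grep -q 'org/bouncycastle' <<<"$1"; then
    echo "WARNING: Bouncy Castle is bundled in the fat jar"
  else
    echo "OK: Bouncy Castle excluded; the cluster's 1.64 jar will be used"
  fi
}

# Example: check_bc_excluded "$(jar tf your-flink-job.jar)"
```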
Originally the dependency was changed to match the version in YARN, but 1.60 triggers this failure:

WARN org.apache.flink.runtime.taskmanager.Task - Source: Custom Source -> Map -> Filter -> (Sink: Unnamed, Sink: Unnamed) (1/3)#358 (8f24bd0650edeacd44d0f115619f960) switched from RUNNING to FAILED with failure cause: java.lang.NoSuchMethodError: org.bouncycastle.crypto.engines.SM2Engine.<init>(Lorg/bouncycastle/crypto/engines/SM2Engine$Mode;)V

The SM2Engine constructor that takes an SM2Engine.Mode argument does not exist in bcprov 1.60, so code built against 1.64 fails at runtime with NoSuchMethodError; that is why the jar on the YARN nodes was upgraded to 1.64 instead.
Continuously updated …