Exception in thread "main" java.lang.RuntimeException: java.lang.NoSuchFieldException: versionID

log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Exception in thread "main" java.lang.RuntimeException: java.lang.NoSuchFieldException: versionID
    at org.apache.hadoop.ipc.RPC.getProtocolVersion(RPC.java:174)
    at org.apache.hadoop.ipc.WritableRpcEngine$Invocation.<init>(WritableRpcEngine.java:114)
    at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:242)
    at com.sun.proxy.$Proxy4.login(Unknown Source)
    at cn.hadoop.rpc.LoginController.main(LoginController.java:13)
Caused by: java.lang.NoSuchFieldException: versionID
    at java.lang.Class.getField(Unknown Source)
    at org.apache.hadoop.ipc.RPC.getProtocolVersion(RPC.java:170)
    ... 4 more



1: This error means that a version number (versionID) must be specified for LoginServiceInterface. The client code in LoginController is:

package cn.hadoop.rpc;

import java.io.IOException;
import java.net.InetSocketAddress;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.ipc.RPC;


public class LoginController {
    public static void main(String[] args) throws IOException {
        // The second argument (1L) is the client-side protocol version; it must
        // match the versionID declared in LoginServiceInterface.
        LoginServiceInterface proxy = RPC.getProxy(LoginServiceInterface.class, 1L,
                new InetSocketAddress("master", 10000), new Configuration());
        String result = proxy.login("qiqi", "123456");

        System.out.println(result);
    }
}
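
The stack trace explains the failure: RPC.getProtocolVersion uses reflection to read a public static field named versionID from the protocol interface, and the missing field's NoSuchFieldException is wrapped in a RuntimeException. The sketch below is only a simplified illustration of that lookup (the class name VersionIdCheck is invented for this post; it is not Hadoop source code):

package cn.hadoop.rpc;

public class VersionIdCheck {
    // Mimics the reflective lookup shown in the stack trace
    // (Class.getField inside RPC.getProtocolVersion).
    static long getProtocolVersion(Class<?> protocol) {
        try {
            // Throws NoSuchFieldException when the interface declares no
            // public static versionID field.
            return protocol.getField("versionID").getLong(null);
        } catch (NoSuchFieldException | IllegalAccessException e) {
            throw new RuntimeException(e);
        }
    }
}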



2: Define a static version number in the LoginServiceInterface interface. It must be the same as the version number (1L) passed in the main method's call RPC.getProxy(LoginServiceInterface.class, 1L, new InetSocketAddress("master", 10000), new Configuration()):

package cn.hadoop.rpc;

public interface LoginServiceInterface {
    // Hadoop's RPC layer reads this public static field reflectively; its value
    // must equal the version number passed to RPC.getProxy on the client.
    public static final long versionID = 1L;

    public String login(String username, String passwd);
}


3: Likewise, add public static final long versionID = 1L; to the LoginServiceInterface in Eclipse on the virtual machine (the server side), and the problem is solved!
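
For completeness, here is a minimal sketch of the server side, assuming the Hadoop 2.x RPC.Builder API. The class name LoginServiceImpl and the body of login are invented for illustration; the bind address master and port 10000 mirror the client code above:

package cn.hadoop.rpc;

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.ipc.RPC;
import org.apache.hadoop.ipc.RPC.Server;

// Hypothetical implementation of LoginServiceInterface, for illustration only.
public class LoginServiceImpl implements LoginServiceInterface {
    @Override
    public String login(String username, String passwd) {
        return "login success: " + username;
    }

    public static void main(String[] args) throws IOException {
        // The bind address and port must match the InetSocketAddress the client uses.
        Server server = new RPC.Builder(new Configuration())
                .setProtocol(LoginServiceInterface.class)
                .setInstance(new LoginServiceImpl())
                .setBindAddress("master")
                .setPort(10000)
                .build();
        server.start();
    }
}

Once the server is running and versionID is present on both sides, the client call proxy.login("qiqi", "123456") completes instead of throwing NoSuchFieldException.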

