After completing the configuration, start-all fails and the DataNode log shows the following FATAL error:

FATAL org.apache.hadoop.hdfs.server.datanode.DataNode: Exception in secureMain
java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.SharedFileDescriptorFactory.createDescriptor0(Ljava/lang/String;Ljava/lang/String;I)Ljava/io/FileDescriptor;
    at org.apache.hadoop.io.nativeio.SharedFileDescriptorFactory.createDescriptor0(Native Method)
    at org.apache.hadoop.io.nativeio.SharedFileDescriptorFactory.create(SharedFileDescriptorFactory.java:87)
    at org.apache.hadoop.hdfs.server.datanode.ShortCircuitRegistry.<init>(ShortCircuitRegistry.java:169)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initDataXceiver(DataNode.java:586)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:773)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:292)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1893)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1780)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1827)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:2003)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:2027)

The same error is reported when a 32-bit JDK is used to start 64-bit Hadoop; switching to a 64-bit JDK fixes it. Reference: http://www.cnblogs.com/fanfanfantasy/p/4123412.html
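Before starting Hadoop, you can confirm which data model the JVM on the PATH is actually running. A minimal sketch (the class name `JvmBitness` is mine; `sun.arch.data.model` is a HotSpot-specific property, while `os.arch` is standard):

```java
// Sketch: print the bitness of the running JVM so a 32-bit JDK can be
// spotted before Hadoop's 64-bit native libraries fail to link.
public class JvmBitness {
    public static void main(String[] args) {
        // "32" or "64" on HotSpot JVMs; may be null on other implementations
        String dataModel = System.getProperty("sun.arch.data.model");
        // standard property, e.g. "amd64"/"x86_64" for 64-bit, "x86" for 32-bit
        String osArch = System.getProperty("os.arch");
        System.out.println("data model: " + dataModel + ", os.arch: " + osArch);
        if ("32".equals(dataModel)) {
            System.out.println(
                "Warning: 64-bit Hadoop native code will throw "
                + "UnsatisfiedLinkError under a 32-bit JVM.");
        }
    }
}
```

Compile and run it with the same `java` that `JAVA_HOME`/`hadoop-env` points to; if it reports 32-bit, that is the cause of the UnsatisfiedLinkError above.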
Installing 64-bit Hadoop on Windows