Reading HDFS from TensorFlow on a Mac: java.lang.NoSuchFieldError: LOG

On a Mac, reading HDFS data from TensorFlow fails with java.lang.NoSuchFieldError: LOG. The error is raised in the getClient method of org.apache.hadoop.ipc.ClientCache. Both TensorFlow (with Hadoop support) and Hadoop were built from source on the Mac, yet the problem persists. The output of hadoop version and hadoop checknative has been checked, along with the environment variable settings, including HADOOP_HOME and other related paths. A solution is sought.

I am trying to read from an external Hadoop cluster from TensorFlow on my Mac. I built TF from source with Hadoop support, and I built Hadoop with native library support on my Mac. I get the following error:

hdfsBuilderConnect(forceNewInstance=0, nn=192.168.60.53:9000, port=0, kerbTicketCachePath=(NULL), userName=(NULL)) error:

java.lang.NoSuchFieldError: LOG
    at org.apache.hadoop.ipc.ClientCache.getClient(ClientCache.java:62)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.<init>(ProtobufRpcEngine.java:145)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.<init>(ProtobufRpcEngine.java:133)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.<init>(ProtobufRpcEngine.java:119)
    at org.apache.hadoop.ipc.ProtobufRpcEngine.getProxy(ProtobufRpcEngine.java:102)
    at org.apache.hadoop.ipc.RPC.getProtocolProxy(RPC.java:579)
    at org.apache.hadoop.hdfs.NameNodeProxies.createNNProxyWithClientProtocol(NameNodeProxies.java:418)
    at org.apache.hadoop.hdfs.NameNodeProxies.createNonHAProxy(NameNodeProxies.java:314)
    at org.apache.hadoop.hdfs.NameNodeProxies.createProxy(NameNodeProxies.java:176)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:678)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:619)
    at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:149)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2669)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:94)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2703)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2685)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:373)
    at org.apache.hadoop.fs.FileSystem$1.run(FileSystem.java:162)
    at org.apache.hadoop.fs.FileSystem$1.run(FileSystem.java:159)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:159)

2018-10-05 16:01:21.867554: W tensorflow/core/kernels/queue_base.cc:277] _0_input_producer: Skipping cancelled enqueue attempt with queue not closed

Traceback (most recent call last):

Here is my code:

[the code snippet from the original post was not preserved]
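The pipeline is along the lines of the following minimal sketch of a TF 1.10 queue-based reader; it is consistent with the _0_input_producer queue named in the warning below, but the HDFS file path is only illustrative and the parsing details may differ.

    import tensorflow as tf

    # Minimal illustrative sketch of a TF 1.10 queue-based HDFS read.
    # The namenode address comes from the error log above; the file path
    # "/tmp/sample.txt" is made up for the example.
    filenames = ["hdfs://192.168.60.53:9000/tmp/sample.txt"]

    filename_queue = tf.train.string_input_producer(filenames, num_epochs=1)
    reader = tf.TextLineReader()
    key, value = reader.read(filename_queue)

    with tf.Session() as sess:
        # num_epochs creates a local variable, so both initializers are needed.
        sess.run([tf.global_variables_initializer(),
                  tf.local_variables_initializer()])
        coord = tf.train.Coordinator()
        threads = tf.train.start_queue_runners(sess=sess, coord=coord)
        try:
            while not coord.should_stop():
                print(sess.run(value))
        except tf.errors.OutOfRangeError:
            pass  # end of input reached
        finally:
            coord.request_stop()
            coord.join(threads)

The connection error is raised as soon as libhdfs tries to reach the namenode (hdfsBuilderConnect), after which the input queue is cancelled, which matches the warning below.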

I have built Hadoop from source on my Mac. Output of $ hadoop version:

Hadoop 2.7.3

Subversion https://github.com/apache/hadoop.git -r baa91f7c6bc9cb92be5982de4719c1c8af91ccff

Compiled by himaprasoon on 2018-10-04T11:09Z

Compiled with protoc 2.5.0

From source with checksum 2e4ce5f957ea4db193bce3734ff29ff4

This command was run using /Users/himaprasoon/git/hadoop/hadoop-dist/target/hadoop-2.7.3/share/hadoop/common/hadoop-common-2.7.3.jar

Output of $ hadoop checknative:

18/10/05 16:15:05 INFO bzip2.Bzip2Factory: Successfully loaded & initialized native-bzip2 library libbz2.dylib

18/10/05 16:15:05 INFO zlib.ZlibFactory: Successfully loaded & initialized native-zlib library

Native library checking:

hadoop: true /Users/himaprasoon/git/hadoop/hadoop-dist/target/hadoop-2.7.3/lib/native/libhadoop.dylib

zlib: true /usr/lib/libz.1.dylib

snappy: true /usr/local/lib/libsnappy.1.dylib

lz4: true revision:99

bzip2: true /usr/lib/libbz2.1.0.dylib

openssl: true /usr/local/lib/libcrypto.dylib

TF version: 1.10.1

Any idea what I am doing wrong?

Here are my environment variables (a small sanity-check snippet is shown after the list):

HADOOP_HOME=/Users/himaprasoon/git/hadoop/hadoop-dist/target/hadoop-2.7.3/

HADOOP_MAPRED_HOME=$HADOOP_HOME

HADOOP_COMMON_HOME=$HADOOP_HOME

HADOOP_HDFS_HOME=$HADOOP_HOME

YARN_HOME=$HADOOP_HOME

HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native

PATH=$PATH:$HADOOP_HOME/sbin:$HADOOP_HOME/bin

HADOOP_INSTALL=$HADOOP_HOME

OPENSSL_ROOT_DIR="/usr/local/opt/openssl"

LDFLAGS="-L${OPENSSL_ROOT_DIR}/lib"

CPPFLAGS="-I${OPENSSL_ROOT_DIR}/include"

PKG_CONFIG_PATH="${OPENSSL_ROOT_DIR}/lib/pkgconfig"

OPENSSL_INCLUDE_DIR="${OPENSSL_ROOT_DIR}/include"

PATH="/usr/local/opt/protobuf@2.5/bin:$PATH

HADOOP_OPTS="-Djava.library.path=${HADOOP_HOME}/lib/native"

LD_LIBRARY_PATH=$LD_LIBRARY_PATH:${HADOOP_HOME}/lib/native

JAVA_LIBRARY_PATH=$JAVA_LIBRARY_PATH:${HADOOP_HOME}/lib/native
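A quick illustrative check that these variables are actually visible to the Python process that loads TensorFlow (this snippet is not part of myfile.py):

    import os

    # Print the Hadoop-related variables as seen by the Python interpreter.
    for var in ("HADOOP_HOME", "HADOOP_HDFS_HOME", "CLASSPATH",
                "LD_LIBRARY_PATH", "JAVA_LIBRARY_PATH"):
        print(var, "=", os.environ.get(var, "<not set>"))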

This is how I run my program:

CLASSPATH=$($HADOOP_HDFS_HOME/bin/hdfs classpath --glob) python3.6 myfile.py
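A smaller smoke test that should exercise the same libhdfs code path is to list an HDFS directory through tf.gfile (the root path "/" is just an example):

    import tensorflow as tf

    # Listing a directory over hdfs:// goes through the same libhdfs/JNI path
    # that raises the NoSuchFieldError above. The namenode address comes from
    # the error log; "/" is only an example path.
    print(tf.gfile.ListDirectory("hdfs://192.168.60.53:9000/"))

It can be run the same way, e.g. CLASSPATH=$($HADOOP_HDFS_HOME/bin/hdfs classpath --glob) python3.6 smoke_test.py (the script name is arbitrary).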

References used for building TF and Hadoop
