On a fresh Pig install, I started local mode with pig -x local and immediately hit:
Exception in thread "main" java.lang.NoClassDefFoundError: org/joda/time/ReadableInstant
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:348)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:205)
Caused by: java.lang.ClassNotFoundException: org.joda.time.ReadableInstant
        at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        ... 3 more
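The missing class, org/joda/time/ReadableInstant, ships in the joda-time jar. Before anything else, it is worth confirming which jar (if any) under the Pig install actually contains the class. A quick generic trick (the paths below are illustrative, not from any Pig documentation): zip entry names are stored uncompressed inside a jar, so a plain grep on the jar files finds them.

```shell
# Hypothetical diagnostic: list every jar under a directory tree that
# contains the given class path in its zip entry table.
find_class_jar() {
  local root="$1" class="$2"
  # grep -l prints only the names of matching (possibly binary) files.
  find "$root" -name '*.jar' -exec grep -l "$class" {} + 2>/dev/null
}

# Example (PIG_HOME location is an assumption):
find_class_jar "${PIG_HOME:-/usr/local/pig}" 'org/joda/time/ReadableInstant'
```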
Searching the web turned up no satisfactory answer, so I had to dig in myself.
1. Checked the environment variables -- nothing wrong there;
2. Checked the bundled dependencies -- the joda jar is right there under lib;
3. Suspected the CLASSPATH; read through the pig shell script and found that it adds every jar under lib to the CLASSPATH;
4. Kept reading the script and found:
# run it
if [ -n "$HADOOP_BIN" ]; then    # $HADOOP_BIN is non-empty: a hadoop install was found
    if [ "$debug" == "true" ]; then
        echo "Find hadoop at $HADOOP_BIN"
    fi
    PIG_JAR=`echo $PIG_HOME/pig*-core-h${HADOOP_VERSION}.jar`
    # for deb/rpm package, add pig jar in /usr/share/pig
    if [ -z "$PIG_JAR" ]; then
        PIG_JAR=`echo $PIG_HOME/share/pig/pig*-core-h${HADOOP_VERSION}.jar`
    fi
    if [ -n "$PIG_JAR" ]; then
        CLASSPATH=${CLASSPATH}:$PIG_JAR
    else
        if [ "$HADOOP_VERSION" == "1" ]; then
            echo "Cannot locate pig-core-h${HADOOP_VERSION}.jar. do 'ant jar', and try again"
        else
            echo "Cannot locate pig-core-h${HADOOP_VERSION}.jar. do 'ant -Dhadoopversion=23 jar', and try again"
        fi
        exit 1
    fi
    for f in $PIG_HOME/lib/h${HADOOP_VERSION}/*.jar; do
        CLASSPATH=${CLASSPATH}:$f;
    done
    export HADOOP_CLASSPATH=$CLASSPATH:$HADOOP_CLASSPATH
    export HADOOP_CLIENT_OPTS="$JAVA_HEAP_MAX $PIG_OPTS $HADOOP_CLIENT_OPTS"
    if [ "$debug" == "true" ]; then
        echo "dry run:"
        echo "HADOOP_CLASSPATH: $HADOOP_CLASSPATH"
        echo "HADOOP_OPTS: $HADOOP_OPTS"
        echo "HADOOP_CLIENT_OPTS: $HADOOP_CLIENT_OPTS"
        echo "$HADOOP_BIN" jar "$PIG_JAR" "${remaining[@]}"
        echo
    else
        exec "$HADOOP_BIN" jar "$PIG_JAR" "${remaining[@]}"
    fi
else    # $HADOOP_BIN is empty: this is the branch pig -x local is supposed to take
    # use hadoop-core.jar to run local mode
    PIG_JAR=`echo $PIG_HOME/pig*-core-h1.jar`
    if [ -n "$PIG_JAR" ]; then
        CLASSPATH="${CLASSPATH}:$PIG_JAR"
    else
        echo "Cannot locate pig.jar. do 'ant jar', and try again"
        exit 1
    fi
    for f in $PIG_HOME/lib/h1/*.jar; do
        CLASSPATH=${CLASSPATH}:$f;
    done
    # Add bundled hadoop-core.jar
    for f in $PIG_HOME/lib/hadoop1-runtime/*.jar; do
        CLASSPATH=${CLASSPATH}:$f;
    done
    if [ "$debug" == "true" ]; then
        echo "Cannot find local hadoop installation, using bundled `java -cp $CLASSPATH org.apache.hadoop.util.VersionInfo | head -1`"
    fi
    CLASS=org.apache.pig.Main
    if [ "$debug" == "true" ]; then
        echo "dry run:"
        echo "$JAVA" $JAVA_HEAP_MAX $PIG_OPTS -classpath "$CLASSPATH" $CLASS "${remaining[@]}"
        echo
    else
        exec "$JAVA" $JAVA_HEAP_MAX $PIG_OPTS -classpath "$CLASSPATH" $CLASS "${remaining[@]}"
    fi
fi
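Instead of reading the whole launcher by eye, you can also let the shell tell you which branch executes. This is a generic xtrace trick, not anything Pig-specific; the grep pattern below is illustrative:

```shell
# Run any shell script with xtrace enabled and keep only the traced
# lines matching a pattern, to see which assignments/branches fire.
trace_branch() {
  bash -x "$1" 2>&1 | grep -E "$2"
}

# e.g. (command name assumed to be on PATH):
#   trace_branch "$(command -v pig)" 'HADOOP_BIN|PIG_JAR|CLASSPATH'
```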
Reading further up in the script, I found:
if which hadoop >/dev/null; then
    HADOOP_BIN=`which hadoop`
fi
This rabbit hole ran deep, but the cause was finally clear.
The Linux server where Pig was installed also had Hadoop installed, with its environment variables (and PATH) set. So "which hadoop" succeeded, HADOOP_BIN got set, and Pig took the hadoop branch instead of ever entering true local mode.
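Given that diagnosis, two workarounds suggest themselves. Neither comes from the Pig documentation, so treat both as sketches; the jar name in option 2 is illustrative:

```shell
# Option 1: drop every PATH entry containing "hadoop" for a single run,
# so the launcher's `which hadoop` fails and pig falls back to true
# local mode.
strip_from_path() {
  # Split PATH on ':', drop entries matching the pattern, rejoin.
  echo "$1" | tr ':' '\n' | grep -v "$2" | paste -sd: -
}
# PATH=$(strip_from_path "$PATH" hadoop) pig -x local

# Option 2: keep hadoop on PATH, but put joda-time on the classpath that
# hadoop hands to Pig (exact jar version is an assumption):
# export HADOOP_CLASSPATH="$PIG_HOME/lib/joda-time-2.9.jar:$HADOOP_CLASSPATH"
# pig -x local
```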