1. When starting the Hadoop cluster, the NameNode starts, and the DataNode on the NameNode's machine starts, but the DataNodes on the other machines fail to start, with an error in their logs (datanode unsigned).
Cause:
The NameNode has been reformatted, or the newly added DataNode previously belonged to another cluster. At startup, Hadoop therefore detects that this DataNode does not belong to the current cluster and, to protect the data, refuses to use the node.
Solution:
Delete the directory where the affected DataNode stores its data, then restart; the node can then join the cluster. (Note: in the DataNode's core-site.xml, the NameNode address must be configured to point at this cluster's NameNode.)
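The cleanup can be sketched as follows; the data directory path is an assumption based on a typical dfs.datanode.data.dir setting, so substitute the real value from your hdfs-site.xml:

```shell
# On the DataNode that fails to start (the path below is an example --
# read the real data directory from dfs.datanode.data.dir in hdfs-site.xml)
DATA_DIR=/home/hadoop/data/dfs/data

# stop the datanode if it is half-started
hadoop-daemon.sh stop datanode

# remove the stale storage (note: this deletes the blocks held on this node)
rm -rf "$DATA_DIR"/*

# restart; the datanode re-registers under the current cluster's ID
hadoop-daemon.sh start datanode
```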
2. Hive table creation fails with a "key too long" error
FAILED: Error in metadata: javax.jdo.JDODataStoreException: An exception was thrown while adding/validating class(es) : Specified key was too long; max key length is 767 bytes
Cause:
Hive's metastore schema runs into MySQL's index-key length limit when the database uses UTF-8 (multi-byte characters push index keys past 767 bytes). Changing the MySQL database's character set fixes it.
Solution:
In MySQL, run:
alter database hive_db character set latin1;
Replace hive_db with your own metastore database name.
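As a sketch, the change can be applied from the shell like this; hive_db is a placeholder for your metastore database name, and the connection options are assumptions:

```shell
# check the current character set, then switch the metastore DB to latin1
mysql -uroot -p <<'SQL'
SELECT default_character_set_name
  FROM information_schema.SCHEMATA
 WHERE schema_name = 'hive_db';
ALTER DATABASE hive_db CHARACTER SET latin1;
SQL
```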
3.Sqoop实现HDFS导入mysql时出错
在执行sqoop export --connect jdbc:mysql://mini1:3306/urlcontentanalyse --username root --password 123456 \
--table urlrule \
--export-dir /wc/output4/ \
--columns url \
--input-fields-terminated-by '\t'
的时候出现错误提示
Exception in thread "main" Java.lang.IncompatibleClassChangeError: Found interface org.apache.hadoop.mapreduce.JobContext, but class was expected
在java中调用sqoop接口进行mysql和hdfs直接数据传输时,遇到以下错误:
Found interface org.apache.hadoop.mapreduce.JobContext, but class was expected
这里需要注意,sqoop有两个版本:
sqoop-1.4.4.bin__hadoop-1.0.0.tar.gz(对应hadoop1版本)
sqoop-1.4.4.bin__hadoop-2.0.4-alpha.tar.gz(对应hadoop2版本)
http://mirrors.tuna.tsinghua.edu.cn/apache/sqoop/1.4.6/
出现上面的错误就是hadoop和对应的sqoop版本不一致,二者保持一致即可解决问题。
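For example, on a Hadoop 2.x cluster the matching build can be fetched from the mirror listed above (the exact archive name is an assumption based on the 1.4.6 release layout):

```shell
# download the Hadoop-2 build of Sqoop 1.4.6 and unpack it
wget http://mirrors.tuna.tsinghua.edu.cn/apache/sqoop/1.4.6/sqoop-1.4.6.bin__hadoop-2.0.4-alpha.tar.gz
tar -xzf sqoop-1.4.6.bin__hadoop-2.0.4-alpha.tar.gz
export SQOOP_HOME=$PWD/sqoop-1.4.6.bin__hadoop-2.0.4-alpha
```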
4. Hive UDF creation fails with UnsupportedClassVersionError
When creating a custom function for Hive with the Java API, the following error appears:
java.lang.UnsupportedClassVersionError: cn/itcast/hive/UDF/UDFtoLower : Unsupported major.minor version 52.0
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:270)
at org.apache.hadoop.hive.ql.exec.FunctionTask.getUdfClass(FunctionTask.java:307)
at org.apache.hadoop.hive.ql.exec.FunctionTask.createTemporaryFunction(FunctionTask.java:174)
at org.apache.hadoop.hive.ql.exec.FunctionTask.execute(FunctionTask.java:74)
at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:85)
at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1604)
at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1364)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1177)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1004)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:994)
at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:247)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:199)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:410)
at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:783)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:677)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:616)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
FAILED: Execution Error, return code -101 from org.apache.hadoop.hive.ql.exec.FunctionTask. cn/itcast/hive/UDF/UDFtoLower : Unsupported major.minor version 52.0
Cause:
"Unsupported major.minor version 52.0" means the UDF class was compiled with a newer JDK (class-file version 52 corresponds to JDK 8) than the JRE that Hive is running on, so the runtime cannot load it.
Solution:
In Eclipse: right-click the project --> Properties,
then Edit the JRE / compiler compliance level and set it one or two versions lower, matching the Java version Hive runs on. That generally resolves it.
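Outside Eclipse, the equivalent fix is to pass an older target to javac. A sketch, assuming the UDF source layout from the error message and a hive-exec jar on the classpath (both are assumptions):

```shell
# first confirm the mismatch: javac may be newer than the JRE Hive runs on
java -version
javac -version

# compile the UDF for Java 7 class files (major version 51) so an
# older runtime can load it; paths and jar names are examples
javac -source 1.7 -target 1.7 \
      -cp "$(hadoop classpath):$HIVE_HOME/lib/hive-exec.jar" \
      cn/itcast/hive/UDF/UDFtoLower.java
```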
5. HBase fails to start with the following error:
2016-12-28 19:00:26,462 WARN [main] util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2016-12-28 19:00:26,875 ERROR [main] master.HMasterCommandLine: Master exiting
java.lang.RuntimeException: Failed construction of Master: class org.apache.hadoop.hbase.master.HMaster
Cause:
According to reports found online, HBase is missing the aws-java-sdk-1.7.4.jar package.
Solution:
Find the jar under Hadoop's share/hadoop/tools/lib directory and copy it into HBase's lib directory.
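Assuming HADOOP_HOME and HBASE_HOME point at the two installs, the copy is a one-liner:

```shell
# copy the missing SDK jar from Hadoop's tools directory into HBase's lib
cp "$HADOOP_HOME/share/hadoop/tools/lib/aws-java-sdk-1.7.4.jar" "$HBASE_HOME/lib/"
```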
6. HBase startup reports that JAVA_HOME cannot be found
hduser@CSLAP106:~$ /usr/local/hbase/hbase-0.94.6.1/bin/start-hbase.sh
+======================================================================+
| Error: JAVA_HOME is not set and Java could not be found |
+----------------------------------------------------------------------+
| Please download the latest Sun JDK from the Sun Java web site |
| > http://java.sun.com/javase/downloads/ < |
| |
| HBase requires Java 1.6 or later. |
| NOTE: This script will find Sun Java whether you install using the |
| binary or the RPM based installer. |
+======================================================================+
This happens even though the Java environment variables are already configured:
the line setting JAVA_HOME in hbase-env.sh is probably still commented out:
#export JAVA_HOME=/usr/jdk/
Removing the comment (and pointing the variable at the actual JDK path) should fix it.
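A sketch of the fix, assuming HBase lives under /usr/local/hbase/hbase-0.94.6.1 as in the prompt above and using the JDK path from the commented line:

```shell
# uncomment the JAVA_HOME line in conf/hbase-env.sh
# (adjust /usr/jdk/ to wherever your JDK actually lives)
HBASE_CONF=/usr/local/hbase/hbase-0.94.6.1/conf/hbase-env.sh
sed -i 's|^#\s*export JAVA_HOME=.*|export JAVA_HOME=/usr/jdk/|' "$HBASE_CONF"
grep JAVA_HOME "$HBASE_CONF"
```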
7. Hive fails to start with:
"main" java.lang.IncompatibleClassChangeError: Found interface jline.Terminal, but class was expected
Logging initialized using configuration in jar:file:/home/hadoop/apps/hive/lib/hive-common-0.14.0.jar!/hive-log4j.properties
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/hadoop/apps/hadoop-2.6.4/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hadoop/apps/hive/lib/hive-jdbc-0.14.0-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Exception in thread "main" java.lang.IncompatibleClassChangeError: Found interface jline.Terminal, but class was expected
at jline.ConsoleReader.<init>(ConsoleReader.java:191)
at jline.ConsoleReader.<init>(ConsoleReader.java:186)
at jline.ConsoleReader.<init>(ConsoleReader.java:174)
at org.apache.hadoop.hive.cli.CliDriver.getConsoleReader(CliDriver.java:798)
at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:746)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:677)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:616)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Cause:
./hadoop-2.6.0/share/hadoop/yarn/lib/jline-0.9.94.jar
./apache-hive-1.2.0-bin/lib/jline-2.12.jar
Hadoop ships an old jline (0.9.94) while Hive ships jline-2.12.jar; the two versions conflict on the classpath.
Solution:
Copy Hive's jline-2.12.jar into Hadoop's yarn/lib directory (and remove or rename the old jline-0.9.94.jar there so it is no longer picked up first).
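With the paths from the cause analysis above, the swap looks like this (HIVE_HOME and HADOOP_HOME are assumptions for the install locations):

```shell
# put Hive's newer jline on Hadoop's classpath...
cp "$HIVE_HOME/lib/jline-2.12.jar" "$HADOOP_HOME/share/hadoop/yarn/lib/"
# ...and park the old one so it can no longer shadow it
mv "$HADOOP_HOME/share/hadoop/yarn/lib/jline-0.9.94.jar"{,.bak}
```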
8. Hive fails to start with:
java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/hadoop/apps/hadoop-2.6.4/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hadoop/apps/hive/lib/hive-jdbc-0.14.0-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Exception in thread "main" java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:444)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:672)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:616)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
Cause:
After investigation, this was likely because the Hive version in use was too new for the environment.
Solution:
Switching to a lower Hive version resolved it.
9. Fixing garbled Chinese characters in Hive comments:
Change the relevant columns of the Hive metastore (in MySQL) to utf8.
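A commonly cited version of this fix switches the comment-bearing metastore columns to utf8. A sketch, assuming a MySQL metastore database named hive_db (the table and column names are from the stock metastore schema):

```shell
mysql -uroot -p hive_db <<'SQL'
-- column comments
ALTER TABLE COLUMNS_V2 MODIFY COLUMN COMMENT varchar(256) CHARACTER SET utf8;
-- table comments (stored as table parameters)
ALTER TABLE TABLE_PARAMS MODIFY COLUMN PARAM_VALUE varchar(4000) CHARACTER SET utf8;
-- partition key comments
ALTER TABLE PARTITION_KEYS MODIFY COLUMN PKEY_COMMENT varchar(4000) CHARACTER SET utf8;
SQL
```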
10. When formatting the cluster with hdfs namenode -format, Hadoop reports:
SHUTDOWN_MSG: Shutting down NameNode at java.net.UnknownHostException: sparkproject1: sparkproject1
Cause: the hostname sparkproject1 cannot be resolved.
Diagnosis: run hostname to check the machine's hostname.
In other words, when formatting HDFS, Hadoop takes the name returned by the hostname command and looks it up in /etc/hosts, but finds no mapping there.
Also check whether the HOSTNAME in /etc/sysconfig/network matches the name in /etc/hosts.
Solution: make the names in the two files consistent, then restart the network with service network restart.
If restarting the network has no effect, rebooting the machine resolves it.
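The check-and-fix sequence can be sketched as follows; the IP address is a placeholder that must be replaced with the machine's real address:

```shell
# 1. see what name Hadoop will try to resolve
hostname                                # e.g. sparkproject1

# 2. make sure /etc/hosts maps that name (192.168.1.100 is a placeholder)
grep sparkproject1 /etc/hosts || \
    echo "192.168.1.100 sparkproject1" | sudo tee -a /etc/hosts

# 3. make sure /etc/sysconfig/network agrees
grep HOSTNAME /etc/sysconfig/network    # should read HOSTNAME=sparkproject1

# 4. restart networking (reboot if this is not enough)
sudo service network restart
```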