I. Frequently used web UIs
Cluster web URL (YARN ResourceManager)
http://127.0.0.1:8088/cluster/
DFS web URL (HDFS NameNode)
http://127.0.0.1:50070/dfshealth.html#tab-overview
Node web URL (YARN NodeManager)
http://127.0.0.1:8042/node/node
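For quick checks, the three addresses above can be built from a tiny helper rather than retyped. A minimal sketch, assuming the default ports shown above (the `ui_url` helper name is made up):

```shell
# Hypothetical helper: compose the web UI URLs for this local pseudo-distributed setup.
ui_url() {
  # $1 = port, $2 = path
  printf 'http://127.0.0.1:%s%s\n' "$1" "$2"
}

ui_url 8088 /cluster/                      # YARN ResourceManager
ui_url 50070 /dfshealth.html               # HDFS NameNode
ui_url 8042 /node/node                     # NodeManager
```

Each printed URL can then be fed to something like `curl -sf` to confirm the daemon answers.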
II. Error log
1. Schema initialization from the MySQL bin directory fails
Locally, change into the directory
/Applications/MAMP/Library/bin
and run the command
./schematool -dbType mysql -initSchema
.......
.......
Error: Table 'ctlgs' already exists (state=42S01,code=1050)
org.apache.hadoop.hive.metastore.HiveMetaException: Schema initialization FAILED! Metastore state would be inconsistent !!
Underlying cause: java.io.IOException : Schema script failed, errorcode 2
Use --verbose for detailed stacktrace.
*** schemaTool failed ***
Cause:
The MySQL version was incompatible, which broke the import of the initialization data.
Fix:
Upgrade to MySQL 8.0.11. (8.0.16 was tried first, but it proved unstable: the database could only be reached via localhost, not via 127.0.0.1.)
After switching to 8.0.11, the initialization ran normally.
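If re-running schematool still reports "Table 'ctlgs' already exists", the half-created schema usually has to be dropped before re-initializing. A sketch of that sequence, echoed rather than executed because it is destructive; the metastore database name "metastore" is an assumption, so check the value configured in hive-site.xml:

```shell
# Dry-run sketch: prints the recovery commands instead of running them.
# Assumes the metastore database is named "metastore" -- verify against hive-site.xml.
reset_metastore() {
  echo "mysql -u root -p -e 'DROP DATABASE IF EXISTS metastore; CREATE DATABASE metastore;'"
  echo "./schematool -dbType mysql -initSchema"
}

reset_metastore
```

Remove the echoes only once you are sure nothing else uses that database.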
2. The Hive CLI starts, but show databases reports an error
Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
Logging initialized using configuration in file:/Users/software/hadoop/hadoop2.9/hive3.1/conf/hive-log4j2.properties Async: true
Hive-on-MR is deprecated in Hive 2 and may not be available in the future versions. Consider using a different execution engine (i.e. spark, tez) or using Hive 1.X releases.
hive> show databases;
FAILED: HiveException java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
hive>
>
>
> show tables;
FAILED: HiveException java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
Cause:
./schematool -dbType mysql -initSchema was never run, so the metastore schema was not initialized.
Fix:
After initializing the schema, logging in and querying work normally.
3. Sqoop import from MySQL fails
bogon:bin$ ./sqoop import \
> --connect jdbc:mysql://127.0.0.1:3306/db \
> --username root --password 11111111 \
> --table testSqoop \
> --hive-import \
> --hive-table testSqoop -m 1
Warning: /Users/software/sqoop/sqoop-1.4.7/bin/../../hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
Warning: /Users/software/sqoop/sqoop-1.4.7/bin/../../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /Users/software/sqoop/sqoop-1.4.7/bin/../../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
Warning: /Users/software/sqoop/sqoop-1.4.7/bin/../../zookeeper does not exist! Accumulo imports will fail.
Please set $ZOOKEEPER_HOME to the root of your Zookeeper installation.
Cause:
Could not find or load main class org.apache.sqoop.Sqoop
Fix:
1. Download the sqoop-1.4.7.bin__hadoop-2.6.0.tar.gz package.
From: http://archive.apache.org/dist/sqoop/1.4.7/
2. Put sqoop-1.4.7.jar into the lib directory under the Sqoop root.
3. Edit the sqoop launcher script in the sqoop/bin directory: change
#exec ${HADOOP_COMMON_HOME}/bin/hadoop org.apache.sqoop.Sqoop "$@"
to
exec ${HADOOP_COMMON_HOME}/bin/hadoop jar /Users/software/sqoop/sqoop-1.4.7/lib/sqoop-1.4.7.jar org.apache.sqoop.Sqoop "$@"
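The launcher edit in step 3 can also be scripted with sed instead of done by hand. A sketch under the paths used in this doc (`patch_sqoop_launcher` is a made-up name; a .bak copy of the script is kept):

```shell
# Patch the sqoop launcher so it invokes "hadoop jar" with an explicit sqoop-1.4.7.jar path.
# The jar path below is the one from this doc; adjust it to your install.
sqoop_jar='/Users/software/sqoop/sqoop-1.4.7/lib/sqoop-1.4.7.jar'

patch_sqoop_launcher() {
  # $1 = path to the sqoop launcher script (e.g. $SQOOP_HOME/bin/sqoop)
  sed -i.bak 's|^#*exec ${HADOOP_COMMON_HOME}/bin/hadoop org.apache.sqoop.Sqoop "$@"$|exec ${HADOOP_COMMON_HOME}/bin/hadoop jar '"$sqoop_jar"' org.apache.sqoop.Sqoop "$@"|' "$1"
}
```

The `#*` in the pattern lets the same substitution work whether or not the exec line was already commented out.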
4. Sqoop import throws java.lang.NoClassDefFoundError: org/apache/avro/LogicalType
bogon:bin$ ./sqoop import --connect jdbc:mysql://127.0.0.1:3306/db --username root --P --table testSqoop --hive-import --hive-table testSqoop -m 1
Warning: /Users/software/sqoop/sqoop-1.4.7/bin/../../hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
Warning: /Users/software/sqoop/sqoop-1.4.7/bin/../../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /Users/software/sqoop/sqoop-1.4.7/bin/../../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
Warning: /Users/software/sqoop/sqoop-1.4.7/bin/../../zookeeper does not exist! Accumulo imports will fail.
Please set $ZOOKEEPER_HOME to the root of your Zookeeper installation.
19/07/10 22:15:12 INFO sqoop.Sqoop: Running Sqoop version: 1.4.7
Enter password:
19/07/10 22:15:15 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
19/07/10 22:15:15 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/avro/LogicalType
at org.apache.sqoop.manager.DefaultManagerFactory.accept(DefaultManagerFactory.java:67)
at org.apache.sqoop.ConnFactory.getManager(ConnFactory.java:184)
at org.apache.sqoop.tool.BaseSqoopTool.init(BaseSqoopTool.java:272)
at org.apache.sqoop.tool.ImportTool.init(ImportTool.java:96)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:616)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.hadoop.util.RunJar.run(RunJar.java:244)
at org.apache.hadoop.util.RunJar.main(RunJar.java:158)
Caused by: java.lang.ClassNotFoundException: org.apache.avro.LogicalType
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 17 more
Cause:
The wrong Sqoop tarball was downloaded. Use the binary distribution, which ships its runtime dependencies (including Avro), rather than the source-only package.
Wrong tarball: sqoop-1.4.7.tar.gz
Correct tarball: sqoop-1.4.7.bin__hadoop-2.6.0.tar.gz
5. sqoop import fails with java.sql.SQLException
bin$ ./sqoop import --connect jdbc:mysql://127.0.0.1:3306/db --username root --P \
--table testSqoop --hive-import --hive-table testSqoop -m 1
…
…
…
19/07/10 22:55:00 ERROR sqoop.Sqoop: Got exception running Sqoop: java.lang.RuntimeException: java.lang.RuntimeException: java.sql.SQLException: The connection property 'zeroDateTimeBehavior' acceptable values are: 'CONVERT_TO_NULL', 'EXCEPTION' or 'ROUND'. The value 'convertToNull' is not acceptable.
java.lang.RuntimeException: java.lang.RuntimeException: java.sql.SQLException: The connection property 'zeroDateTimeBehavior' acceptable values are: 'CONVERT_TO_NULL', 'EXCEPTION' or 'ROUND'. The value 'convertToNull' is not acceptable.
Fix:
MySQL Connector/J 8.x no longer accepts the legacy value convertToNull that Sqoop passes by default, so set the property explicitly on the JDBC URL:
./sqoop import --connect jdbc:mysql://127.0.0.1:3306/db?zeroDateTimeBehavior=CONVERT_TO_NULL --username root --P --table testSqoop --hive-import --hive-table testSqoop -m 1
6. sqoop import fails with java.lang.ClassNotFoundException: Class testSqoop not found
./sqoop import --connect jdbc:mysql://127.0.0.1:3306/db?zeroDateTimeBehavior=CONVERT_TO_NULL --username root --P --table testSqoop --hive-import --hive-table testSqoop -m 1
......
......
......
java.lang.Exception: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class testSqoop not found
at org.apache.hadoop.mapred.LocalJobRunner$Job.runTasks(LocalJobRunner.java:491)
at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:551)
Caused by: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class testSqoop not found
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2395)
at org.apache.sqoop.mapreduce.db.DBConfiguration.getInputClass(DBConfiguration.java:403)
at org.apache.sqoop.mapreduce.db.DataDrivenDBInputFormat.createDBRecordReader(DataDrivenDBInputFormat.java:270)
at org.apache.sqoop.mapreduce.db.DBInputFormat.createRecordReader(DBInputFormat.java:266)
at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.<init>(MapTask.java:521)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:270)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException: Class testSqoop not found
at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2299)
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2393)
... 12 more
19/07/10 23:18:57 INFO mapreduce.Job: Job job_local1144579819_0001 running in uber mode : false
19/07/10 23:18:57 INFO mapreduce.Job: map 0% reduce 0%
19/07/10 23:18:57 INFO mapreduce.Job: Job job_local1144579819_0001 failed with state FAILED due to: NA
19/07/10 23:18:57 INFO mapreduce.Job: Counters: 0
19/07/10 23:18:57 WARN mapreduce.Counters: Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
19/07/10 23:18:57 INFO mapreduce.ImportJobBase: Transferred 0 bytes in 2.047 seconds (0 bytes/sec)
19/07/10 23:18:57 WARN mapreduce.Counters: Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
19/07/10 23:18:57 INFO mapreduce.ImportJobBase: Retrieved 0 records.
19/07/10 23:18:57 ERROR tool.ImportTool: Import failed: Import job failed!
Fix:
Add the --bindir ./ option to the import command, so the class Sqoop generates for the table ends up somewhere the job can find it:
./sqoop import --connect jdbc:mysql://127.0.0.1:3306/db?zeroDateTimeBehavior=CONVERT_TO_NULL --username root --P --table testSqoop --hive-import --hive-table testSqoop --bindir ./ -m 1
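With --bindir ./, Sqoop writes the generated code for the table into the current directory, named after the table. A hedged check for those artifacts (the file names assume the testSqoop table from above):

```shell
# After a run with --bindir ./, generated artifacts for table testSqoop should appear here.
ls ./testSqoop.java ./testSqoop.jar 2>/dev/null || echo "generated artifacts not present"
```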
7. sqoop import fails with ERROR tool.ImportTool: Import failed: java.io.IOException
./sqoop import --connect jdbc:mysql://127.0.0.1:3306/db?zeroDateTimeBehavior=CONVERT_TO_NULL --username root --P --table testSqoop --hive-import --hive-table testSqoop --bindir ./ -m 1
......
......
......
19/07/11 14:23:22 INFO hive.HiveImport: Loading uploaded data into Hive
19/07/11 14:23:22 ERROR hive.HiveConfig: Could not load org.apache.hadoop.hive.conf.HiveConf. Make sure HIVE_CONF_DIR is set correctly.
19/07/11 14:23:22 ERROR tool.ImportTool: Import failed: java.io.IOException: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf
at org.apache.sqoop.hive.HiveConfig.getHiveConf(HiveConfig.java:50)
at org.apache.sqoop.hive.HiveImport.getHiveArgs(HiveImport.java:392)
at org.apache.sqoop.hive.HiveImport.executeExternalHiveScript(HiveImport.java:379)
at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:337)
at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:241)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:537)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:628)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.hadoop.util.RunJar.run(RunJar.java:244)
at org.apache.hadoop.util.RunJar.main(RunJar.java:158)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:264)
at org.apache.sqoop.hive.HiveConfig.getHiveConf(HiveConfig.java:44)
... 18 more
Fix:
This is the common Sqoop-with-Hive failure "ERROR hive.HiveConfig: Could not load org.apache.hadoop.hive.conf.HiveConf. Make sure HIVE_CONF_DIR is set correctly.": Sqoop cannot see Hive's classes. In the $SQOOP_HOME/lib directory, symlink Hive's hive-exec jar (match the version to your Hive install):
ln -s /Users/software/hadoop/hadoop2.9/hive3.1/lib/hive-exec-3.1.1.jar hive-exec-3.1.1.jar
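The symlink step can be wrapped so it works for whatever hive-exec version is installed. A sketch (`link_hive_exec` is a made-up name; the paths in the comment are the ones from this doc):

```shell
# Symlink a hive-exec jar into Sqoop's lib dir, keeping the jar's own file name.
# Example: link_hive_exec /Users/software/hadoop/hadoop2.9/hive3.1/lib/hive-exec-3.1.1.jar \
#                         /Users/software/sqoop/sqoop-1.4.7/lib
link_hive_exec() {
  # $1 = absolute path to hive-exec-<version>.jar, $2 = $SQOOP_HOME/lib
  ln -sf "$1" "$2/$(basename "$1")"
}
```

Using `ln -sf` keeps the call idempotent if the link already exists.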
8. Cannot create directory /tmp/hive/
Logging initialized using configuration in file:/Users/software/hadoop/hadoop2.9/hive3.1/conf/hive-log4j2.properties Async: true
Exception in thread "main" java.lang.RuntimeException: org.apache.hadoop.hdfs.server.namenode.SafeModeException: Cannot create directory /tmp/hive/75606c63-4f07-440a-93b8-748e6b1c9a5e. Name node is in safe mode.
The reported blocks 21 has reached the threshold 0.9990 of total blocks 21. The number of live datanodes 1 has reached the minimum number 0. In safe mode extension. Safe mode will be turned off automatically in 10 seconds. NamenodeHostName:localhost
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.newSafemodeException(FSNamesystem.java:1412)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkNameNodeSafeMode(FSNamesystem.java:1400)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:2989)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:1096)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:652)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:503)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:989)
at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:871)
at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:817)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1893)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2606)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:648)
at org.apache.hadoop.hive.ql.session.SessionState.beginStart(SessionState.java:588)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:747)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:683)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.hadoop.util.RunJar.run(RunJar.java:244)
at org.apache.hadoop.util.RunJar.main(RunJar.java:158)
org.apache.hadoop.hdfs.server.namenode.SafeModeException: Cannot create directory /tmp/hive. Name node is in safe mode.
Fix:
When the NameNode starts it enters safe mode first. If the fraction of blocks missing from DataNode reports exceeds (1 - dfs.safemode.threshold.pct), the system stays in safe mode, i.e. read-only.
dfs.safemode.threshold.pct (default 0.999) means HDFS may leave safe mode at startup only once the DataNodes have reported at least 99.9% of the blocks recorded in the NameNode metadata; until then the filesystem remains read-only. A value greater than 1 keeps HDFS in safe mode permanently.
The line below, excerpted from a NameNode startup log, shows the reported-block ratio 1.0000 reaching the threshold 0.9990:
The ratio of reported blocks 1.0000 has reached the threshold 0.9990. Safe mode will be turned off automatically in 18 seconds.
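The threshold comparison can be illustrated with a few lines of arithmetic (`past_threshold` is a made-up helper; the 21-block figures come from the log message earlier in this section):

```shell
# Succeeds when reported/total blocks reaches the safe-mode threshold.
past_threshold() {
  # $1 = blocks reported by DataNodes, $2 = total blocks in metadata,
  # $3 = dfs.safemode.threshold.pct
  awk -v r="$1" -v t="$2" -v p="$3" 'BEGIN { exit !(t == 0 || r / t >= p) }'
}

past_threshold 21 21 0.999 && echo "NameNode may leave safe mode"
past_threshold 20 21 0.999 || echo "still read-only"
```

With 21 of 21 blocks reported the ratio is 1.0, clearing the 0.999 threshold; with 20 of 21 it is about 0.952, so safe mode persists.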
There are two ways to leave safe mode:
(1) lower dfs.safemode.threshold.pct (default 0.999) to a smaller value;
(2) force the NameNode out with hdfs dfsadmin -safemode leave.
Safe mode can be controlled with dfsadmin -safemode <value>, where value is one of:
enter: enter safe mode
leave: force the NameNode to leave safe mode
get: report whether safe mode is on
wait: block until safe mode ends
9. FAILED: HiveException java.lang.RuntimeException: Unable to instantiate SessionHiveMetaStoreClient
Hive-on-MR is deprecated in Hive 2 and may not be available in the future versions. Consider using a different execution engine (i.e. spark, tez) or using Hive 1.X releases.
hive (default)> show databases;
FAILED: HiveException java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
[2]+ Stopped hive
Fix:
Running hive reported the "Name node is in safe mode." error again.
1. Root cause:
Insufficient disk space. Check usage with df -hl.
2. Fix:
(1) remove no-longer-needed files and directories (rm -rf) to free space;
(2) run
hdfs dfsadmin -safemode leave