Big Data — Pitfalls I've Hit Over the Years (Error and Exception Solutions, Continuously Updated)

1. Sqoop import of a MySQL table into Hive fails with ERROR hive.HiveConfig: Could not load org.apache.hadoop.hive.conf.HiveConf. Make sure HIVE_CONF_DIR is set correctly.

2021-08-03 21:13:28,937 ERROR hive.HiveConfig: Could not load org.apache.hadoop.hive.conf.HiveConf. Make sure HIVE_CONF_DIR is set correctly.
2021-08-03 21:13:28,938 ERROR tool.ImportTool: Import failed: java.io.IOException: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf
	at org.apache.sqoop.hive.HiveConfig.getHiveConf(HiveConfig.java:50)
	at org.apache.sqoop.hive.HiveImport.getHiveArgs(HiveImport.java:392)
	at org.apache.sqoop.hive.HiveImport.executeExternalHiveScript(HiveImport.java:379)
	at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:337)
	at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:241)
	at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:537)
	at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:628)
	at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
	at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
	at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf
	at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at java.lang.Class.forName0(Native Method)
	at java.lang.Class.forName(Class.java:264)
	at org.apache.sqoop.hive.HiveConfig.getHiveConf(HiveConfig.java:44)
	... 12 more

Solution 1:

Append the following line to the end of ~/.bashrc, then reload the configuration:

export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:$HIVE_HOME/lib/*
source ~/.bashrc
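
To confirm the Hive jars actually landed on the Hadoop classpath, a quick check (assuming $HIVE_HOME is set) is:

hadoop classpath | tr ':' '\n' | grep -i hive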

Solution 2:

Copy hive-common-3.1.2.jar from Hive's lib directory into Sqoop's lib directory:

[root@singleNode bin]# find /opt/install/hive/ -name 'hive-common*.jar'
/opt/install/hive/lib/hive-common-3.1.2.jar
[root@singleNode bin]# cp /opt/install/hive/lib/hive-common-3.1.2.jar /opt/install/sqoop/lib/

2. Sqoop import from MySQL to Hive fails with ERROR Could not register mbeans java.security.AccessControlException: access denied ("javax.management.MBeanTrustPermission" "register")

2018-06-22 12:28:32,398 main ERROR Could not register mbeans java.security.AccessControlException: access denied ("javax.management.MBeanTrustPermission" "register")
        at java.security.AccessControlContext.checkPermission(AccessControlContext.java:472)
        at java.lang.SecurityManager.checkPermission(SecurityManager.java:585)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.checkMBeanTrustPermission(DefaultMBeanServerInterceptor.java:1848)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.registerMBean(DefaultMBeanServerInterceptor.java:322)
        at com.sun.jmx.mbeanserver.JmxMBeanServer.registerMBean(JmxMBeanServer.java:522)
        at org.apache.logging.log4j.core.jmx.Server.register(Server.java:379)
        at org.apache.logging.log4j.core.jmx.Server.reregisterMBeansAfterReconfigure(Server.java:171)
        at org.apache.logging.log4j.core.jmx.Server.reregisterMBeansAfterReconfigure(Server.java:147)
        at org.apache.logging.log4j.core.LoggerContext.setConfiguration(LoggerContext.java:457)
        at org.apache.logging.log4j.core.LoggerContext.start(LoggerContext.java:246)
        at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:230)
        at org.apache.logging.log4j.core.config.Configurator.initialize(Configurator.java:140)
        at org.apache.logging.log4j.core.config.Configurator.initialize(Configurator.java:113)
        at org.apache.logging.log4j.core.config.Configurator.initialize(Configurator.java:98)
        at org.apache.logging.log4j.core.config.Configurator.initialize(Configurator.java:156)
        at org.apache.hadoop.hive.common.LogUtils.initHiveLog4jDefault(LogUtils.java:121)
        at org.apache.hadoop.hive.common.LogUtils.initHiveLog4jCommon(LogUtils.java:73)
        at org.apache.hadoop.hive.common.LogUtils.initHiveLog4j(LogUtils.java:54)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:661)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:645)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:331)
        at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:241)
        at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:537)
        at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:628)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:252)

18/06/22 12:28:32 WARN common.LogUtils: hive-site.xml not found on CLASSPATH

Logging initialized using configuration in jar:file:/home/hive/lib/hive-exec-2.0.0.jar!/hive-log4j2.properties
18/06/22 12:28:32 INFO SessionState:
Logging initialized using configuration in jar:file:/home/hive/lib/hive-exec-2.0.0.jar!/hive-log4j2.properties
18/06/22 12:28:32 INFO metastore.HiveMetaStore: 0: Opening raw store with implementation class:org.apache.hadoop.hive.metastore.ObjectStore
18/06/22 12:28:32 INFO metastore.ObjectStore: ObjectStore, initialize called
18/06/22 12:28:33 WARN DataNucleus.General: Plugin (Bundle) "org.datanucleus.api.jdo" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/home/hive/lib/datanucleus-api-jdo-4.2.1.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/home/sqoop/lib/datanucleus-api-jdo-4.2.1.jar."
18/06/22 12:28:33 WARN DataNucleus.General: Plugin (Bundle) "org.datanucleus.store.rdbms" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/home/hive/lib/datanucleus-rdbms-4.1.7.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/home/sqoop/lib/datanucleus-rdbms-4.1.7.jar."
18/06/22 12:28:33 WARN DataNucleus.General: Plugin (Bundle) "org.datanucleus" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/home/hive/lib/datanucleus-core-4.1.6.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/home/sqoop/lib/datanucleus-core-4.1.6.jar."
18/06/22 12:28:33 INFO DataNucleus.Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
18/06/22 12:28:33 INFO DataNucleus.Persistence: Property datanucleus.cache.level2 unknown - will be ignored
18/06/22 12:28:41 ERROR bonecp.BoneCP: Unable to start/stop JMX

Solution 1:

Edit the JDK policy file jdk1.8.0_11/jre/lib/security/java.policy and add the following grant block:

grant {
        permission javax.management.MBeanTrustPermission "register";
};

Solution 2:

Copy hive-site.xml into Sqoop's conf directory.
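
For example, with the install layout used elsewhere in this post (adjust to your own paths):

cp /opt/install/hive/conf/hive-site.xml /opt/install/sqoop/conf/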

3. Docker containers inside a VM are unreachable after the VM reboots: Error response from daemon: Container xxxx is not running

Solution:

Check the state of sysctl net.ipv4.ip_forward; if it is 0, it needs to be changed to 1.


The commands used:

  • echo 'net.ipv4.ip_forward = 1' >> /usr/lib/sysctl.d/50-default.conf
  • sysctl -p /usr/lib/sysctl.d/50-default.conf
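
To verify the change and bring the container back up (the container name below is a placeholder):

sysctl net.ipv4.ip_forward    # should now print: net.ipv4.ip_forward = 1
docker start <container-name>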

4. Sqoop NullPointerException: ERROR sqoop.Sqoop: Got exception running Sqoop: java.lang.NullPointerException

Solution:

1) Check sqoop-site.xml for missing configuration.

2) Add the JSON dependency (you will need to find this jar yourself):

cp /opt/software/java-json.jar /opt/install/sqoop/lib/

5. Sqoop fails at runtime with Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hive/shims/ShimLoader

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hive/shims/ShimLoader
	at org.apache.hadoop.hive.conf.HiveConf$ConfVars.<clinit>(HiveConf.java:370)
	at org.apache.hadoop.hive.conf.HiveConf.<clinit>(HiveConf.java:108)
	at java.lang.Class.forName0(Native Method)
	at java.lang.Class.forName(Class.java:264)
	at org.apache.sqoop.hive.HiveConfig.getHiveConf(HiveConfig.java:44)
	at org.apache.sqoop.hive.HiveImport.getHiveArgs(HiveImport.java:392)
	at org.apache.sqoop.hive.HiveImport.executeExternalHiveScript(HiveImport.java:379)
	at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:337)
	at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:241)
	at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:530)
	at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:621)
	at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
	at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
	at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.shims.ShimLoader

Solution:

Copy the Hive shims jars into Sqoop's lib directory:

cp /opt/install/hive/lib/hive-shims* /opt/install/sqoop/lib/

6. Sqoop output directory already exists: ERROR tool.ImportTool: Import failed: org.apache.hadoop.mapred.FileAlreadyExistsException

2021-08-04 02:02:52,989 ERROR tool.ImportTool: Import failed: org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory hdfs://singleNode:8020/user/root/temp already exists
        at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.checkOutputSpecs(FileOutputFormat.java:164)
        at org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:277)
        at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:143)
        at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1570)
        at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1567)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
        at org.apache.hadoop.mapreduce.Job.submit(Job.java:1567)
        at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1588)
        at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:200)
        at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:173)
        at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:270)
        at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:692)
        at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:127)
        at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:520)
        at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:628)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:252)

Solution:

Delete the output directory named in the error message.
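
For the directory reported above, that would be (double-check the path before deleting):

hdfs dfs -rm -r /user/root/temp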

7. Sqoop NullPointerException when creating an incremental import job from MySQL to Hive: ERROR sqoop.Sqoop: Got exception running Sqoop: java.lang.NullPointerException

Solution:

Delete hive-exec-3.1.2.jar from Sqoop's lib directory:

rm -f /opt/install/sqoop/lib/hive-exec-3.1.2.jar

8. Hive task fails with FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask

 FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask

Cause

Case 1: insufficient YARN resources.
This error comes from how YARN computes virtual memory. In the example above, the program requests 1 GB of physical memory, and YARN multiplies that by a ratio (2.1 by default) to derive the allowed virtual memory. When the virtual memory the program actually uses exceeds that computed limit, YARN raises this error. Raising the ratio solves the problem; the parameter is yarn.nodemanager.vmem-pmem-ratio in yarn-site.xml.
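
As a sketch of that ratio tweak (the value 4 here is only an example; pick one that fits your workload):

<property>
   <name>yarn.nodemanager.vmem-pmem-ratio</name>
   <value>4</value>
   <description>default value is 2.1</description>
</property>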

Solution:

Adjust the following value in Hadoop's yarn-site.xml:

vim /opt/install/hadoop/etc/hadoop/yarn-site.xml
-----------------------------------------------------------

<property>
   <name>yarn.scheduler.minimum-allocation-mb</name>
   <value>2048</value>
   <description>default value is 1024</description>
</property>

Restart HiveServer2.
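
One way to restart it, assuming HiveServer2 was started in the background (the log path is only an example):

ps -ef | grep -i hiveserver2                                    # find the running process and kill it
nohup hive --service hiveserver2 > /tmp/hiveserver2.log 2>&1 &  # start it again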

9. MySQL remote login permission error: java.sql.SQLException: Access denied for user 'root'@'localhost'


Solution:

Grant remote login permission and flush privileges:

# Start the service
systemctl start mysql
# Change the MySQL password
/usr/bin/mysqladmin -u root password 'root'
# Log in to MySQL and update the permissions
mysql -uroot -proot
> update mysql.user set host='%' where host='localhost';
> delete from mysql.user where host<>'%' or user='';
> flush privileges;
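
To confirm the change took effect, the user table should now list root with host '%':

> select host, user from mysql.user;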

10. Flink yarn-session.sh startup fails with java.lang.NoClassDefFoundError: org/apache/hadoop/fs/Path

2021-08-09 02:15:33,593 ERROR org.apache.flink.yarn.cli.FlinkYarnSessionCli                [] - Error while running the Flink session.
java.lang.NoClassDefFoundError: org/apache/hadoop/fs/Path
	at org.apache.flink.yarn.cli.FlinkYarnSessionCli.getLocalFlinkDistPathFromCmd(FlinkYarnSessionCli.java:347) ~[flink-dist_2.12-1.13.2.jar:1.13.2]
	at org.apache.flink.yarn.cli.FlinkYarnSessionCli.applyDescriptorOptionToConfig(FlinkYarnSessionCli.java:473) ~[flink-dist_2.12-1.13.2.jar:1.13.2]
	at org.apache.flink.yarn.cli.FlinkYarnSessionCli.toConfiguration(FlinkYarnSessionCli.java:394) ~[flink-dist_2.12-1.13.2.jar:1.13.2]
	at org.apache.flink.yarn.cli.FlinkYarnSessionCli.run(FlinkYarnSessionCli.java:571) ~[flink-dist_2.12-1.13.2.jar:1.13.2]
	at org.apache.flink.yarn.cli.FlinkYarnSessionCli.lambda$main$4(FlinkYarnSessionCli.java:860) ~[flink-dist_2.12-1.13.2.jar:1.13.2]
	at org.apache.flink.runtime.security.contexts.NoOpSecurityContext.runSecured(NoOpSecurityContext.java:28) ~[flink-dist_2.12-1.13.2.jar:1.13.2]
	at org.apache.flink.yarn.cli.FlinkYarnSessionCli.main(FlinkYarnSessionCli.java:860) [flink-dist_2.12-1.13.2.jar:1.13.2]
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.fs.Path
	at java.net.URLClassLoader.findClass(URLClassLoader.java:381) ~[?:1.8.0_171]
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424) ~[?:1.8.0_171]
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349) ~[?:1.8.0_171]
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357) ~[?:1.8.0_171]
	... 7 more

Solution:

Copy flink-shaded-hadoop-3-uber-3.1.1.7.1.1.0-565-9.0.jar into Flink's lib directory, so that it lands on the classpath:

Link: https://pan.baidu.com/s/1a9bdL2amXq0lyZMDF2GLhw
Extraction code: ltpc
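
An alternative that avoids the extra jar, if Hadoop is installed on the same machine, is to export the Hadoop classpath before starting the session (the approach Flink's documentation recommends for recent releases):

export HADOOP_CLASSPATH=$(hadoop classpath)
$FLINK_HOME/bin/yarn-session.sh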

11. Flink fails to start: Maximum Memory: 1536MB, Requested: 1600MB

Solution:

vim /opt/software/hadoop/etc/hadoop/yarn-site.xml
<!-- maximum memory that may be allocated to a single YARN container -->
<property>
    <name>yarn.scheduler.maximum-allocation-mb</name>
    <value>2048</value>
</property>
<!-- total physical memory the NodeManager may manage for containers -->
<property>
    <name>yarn.nodemanager.resource.memory-mb</name>
    <value>2048</value>
</property>
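
Restart YARN afterwards so the new limits take effect (assuming Hadoop's sbin scripts are on the PATH):

stop-yarn.sh
start-yarn.sh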

12. Azkaban Multiple Executor Mode: configuring the executor host memory limit

Solution:

Edit the web-server configuration in conf/azkaban.properties:

# Executor host filter configuration: remove MinimumFreeMemory from the list
# The MinimumFreeMemory filter checks whether an executor host has more than 6 GB of free memory; if it does not, the web server will not dispatch jobs to that host
azkaban.executorselector.filters=StaticRemainingFlowSize,CpuStatus
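
Then restart the Azkaban web server so the filter change is picked up (script names assume the standard azkaban-web-server layout):

bin/shutdown-web.sh
bin/start-web.sh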
