I recently set up a Spark environment on Windows and implemented several Spark programs in Java. I ran into many problems along the way, and because I was unfamiliar with the setup, solving them took a long time. I have collected the problems here in the hope of helping others avoid the same detours:
1. Error message:
Exception in thread "main" java.lang.SecurityException: class "javax.servlet.FilterRegistration"'s signer information does not match signer information of other classes in the same package
Cause: conflicting jars on the classpath.
Solution: remove servlet-api and keep only javax.servlet-api.
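If the project is built with Maven, the duplicate servlet-api usually arrives transitively. A minimal sketch of excluding it, assuming the conflicting jar is pulled in through a Hadoop dependency (the exact groupId/artifactId that drags it in may differ in your pom, so check `mvn dependency:tree` first):

```xml
<!-- Sketch: exclude the old servlet-api from the dependency that
     transitively pulls it in, keeping only javax.servlet-api. -->
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>2.7.4</version>
    <exclusions>
        <exclusion>
            <groupId>javax.servlet</groupId>
            <artifactId>servlet-api</artifactId>
        </exclusion>
    </exclusions>
</dependency>
```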
2. Error message:
java.io.IOException: Could not locate executable D:\YSC\workspace\hadoop-2.7.4\bin\bin\winutils.exe in the Hadoop binaries.
Cause: winutils.exe is not included in the Hadoop distribution, so we need to download a Windows bin folder for Hadoop (download link) and then replace the bin folder under the local Hadoop path with it.
3. Error message:
Caused by: java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: ------
Cause: the directory is not writable.
Solution:
Open a cmd window and run the following command:
Command format: <absolute path to winutils.exe> chmod 777 <path to make writable>
D:\Program Files\hadoop-2.7.0\bin\winutils.exe chmod 777 D:\tmp\hive
4. Error message:
java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.
at org.apache.hadoop.util.Shell.getQualifiedBinPath(Shell.java:355)
at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:370)
at org.apache.hadoop.util.Shell.<clinit>(Shell.java:363)
at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:79)
at org.apache.hadoop.security.Groups.parseStaticMapping(Groups.java:104)
at org.apache.hadoop.security.Groups.<init>(Groups.java:86)
at org.apache.hadoop.security.Groups.<init>(Groups.java:66)
at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:280)
Cause: HADOOP_HOME is not configured, so the path used to look up winutils.exe starts with null.
Solution:
1) Configure the HADOOP_HOME environment variable, or
2) Add to the code:
System.setProperty("hadoop.home.dir","D:\\hadoop-2.7.4");
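A minimal sketch of where this line belongs in a driver program (the class name and path are illustrative). The key point is that the property must be set before any Hadoop or Spark class is loaded, because org.apache.hadoop.util.Shell reads it in a static initializer, as the stack trace above shows:

```java
public class HadoopHomeFix {

    /** Point Hadoop at a local install so Shell can find bin\winutils.exe. */
    static void configureHadoopHome(String dir) {
        System.setProperty("hadoop.home.dir", dir);
    }

    public static void main(String[] args) {
        // Must run first, before any SparkConf/SparkSession is created.
        configureHadoopHome("D:\\hadoop-2.7.4");

        // Spark initialization would follow here, e.g.:
        // SparkSession spark = SparkSession.builder()
        //         .master("local[*]").appName("demo").getOrCreate();

        System.out.println(System.getProperty("hadoop.home.dir"));
    }
}
```

Setting the property in code is convenient for local debugging, but configuring the HADOOP_HOME environment variable keeps the machine-specific path out of the source.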