Environment variables are configured, yet the error still occurs: Caused by: java.io.FileNotFoundException: HADOOP_HOME and hadoop.home.dir are unset.

java.io.FileNotFoundException: HADOOP_HOME and hadoop.home.dir are unset.
	at org.apache.hadoop.util.Shell.checkHadoopHomeInner(Shell.java:467)
	at org.apache.hadoop.util.Shell.checkHadoopHome(Shell.java:438)
	at org.apache.hadoop.util.Shell.<clinit>(Shell.java:515)
	at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:79)
	at org.apache.hadoop.conf.Configuration.getBoolean(Configuration.java:1712)
	at org.apache.hadoop.security.SecurityUtil.setConfigurationInternal(SecurityUtil.java:99)
	at org.apache.hadoop.security.SecurityUtil.<clinit>(SecurityUtil.java:88)
	at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:312)
	at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:300)
	at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:575)
	at org.apache.spark.util.Utils$.$anonfun$getCurrentUserName$1(Utils.scala:2561)
	at scala.Option.getOrElse(Option.scala:201)
	at org.apache.spark.util.Utils$.getCurrentUserName(Utils.scala:2561)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:316)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:157)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:170)
	at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:64)
	at com.example.demo.componet.spark.sparkConnect.<init>(sparkConnect.java:15)
	at com.example.demo.componet.spark.sparkConnect.main(sparkConnect.java:34)
18:11:09.346 [main] WARN org.apache.hadoop.util.Shell - Did not find winutils.exe: {}
java.io.FileNotFoundException: java.io.FileNotFoundException: HADOOP_HOME and hadoop.home.dir are unset. -see https://wiki.apache.org/hadoop/WindowsProblems
	at org.apache.hadoop.util.Shell.fileNotFoundException(Shell.java:547)
	at org.apache.hadoop.util.Shell.getHadoopHomeDir(Shell.java:568)
	at org.apache.hadoop.util.Shell.getQualifiedBin(Shell.java:591)
	at org.apache.hadoop.util.Shell.<clinit>(Shell.java:688)
	at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:79)
	at org.apache.hadoop.conf.Configuration.getBoolean(Configuration.java:1712)
	at org.apache.hadoop.security.SecurityUtil.setConfigurationInternal(SecurityUtil.java:99)
	at org.apache.hadoop.security.SecurityUtil.<clinit>(SecurityUtil.java:88)
	at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:312)
	at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:300)
	at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:575)
	at org.apache.spark.util.Utils$.$anonfun$getCurrentUserName$1(Utils.scala:2561)
	at scala.Option.getOrElse(Option.scala:201)
	at org.apache.spark.util.Utils$.getCurrentUserName(Utils.scala:2561)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:316)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:157)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:170)
	at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:64)
	at com.example.demo.componet.spark.sparkConnect.<init>(sparkConnect.java:15)
	at com.example.demo.componet.spark.sparkConnect.main(sparkConnect.java:34)
Caused by: java.io.FileNotFoundException: HADOOP_HOME and hadoop.home.dir are unset.
	at org.apache.hadoop.util.Shell.checkHadoopHomeInner(Shell.java:467)
	at org.apache.hadoop.util.Shell.checkHadoopHome(Shell.java:438)
	at org.apache.hadoop.util.Shell.<clinit>(Shell.java:515)
	... 16 common frames omitted
18:11:09.348 [main] DEBUG org.apache.hadoop.util.Shell - Failed to find winutils.exe

Java can call Spark through the org.apache.spark.api.java interface, but this only works in a plain project. For example:

In an earlier post, "Configuring Hadoop on Windows and connecting to local and remote Spark from IDEA", I showed how to use Spark in a plain Maven project. But when the same project is ported to Spring Boot, it stops working and keeps reporting the error in the title:

[org.apache.hadoop.util.Shell] - Failed to detect a valid hadoop home directory
java.io.IOException : HADOOP_HOME or hadoop.home.dir are not set.

Even though the environment variables are already configured, the path still cannot be found.


The workaround is to set the path manually in code:

System.setProperty("hadoop.home.dir","D:\\SoftWares\\Apache\\spark-3.3.1-bin-hadoop3");
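A minimal sketch of the workaround in context (the install path below is illustrative; adjust it to your machine). The key point is that org.apache.hadoop.util.Shell resolves hadoop.home.dir in a static initializer, so the property must be set before any Spark or Hadoop class is loaded. In a Spring Boot app that means the very first statement of main, before SpringApplication.run:

```java
public class SparkConnectFix {
    public static void main(String[] args) {
        // Must run before any Spark/Hadoop class loads: Shell reads
        // hadoop.home.dir in a static initializer, so setting it later
        // has no effect.
        System.setProperty("hadoop.home.dir",
                "D:\\SoftWares\\Apache\\spark-3.3.1-bin-hadoop3");

        // Sanity check: on Windows, Shell also expects
        // <hadoop.home.dir>\bin\winutils.exe to exist.
        java.io.File winutils = new java.io.File(
                System.getProperty("hadoop.home.dir"), "bin\\winutils.exe");
        if (!winutils.exists()) {
            System.err.println("winutils.exe not found at " + winutils
                    + " - download the build matching your Hadoop version.");
        }

        // Then create the context as usual (requires spark-core on the
        // classpath; omitted here so the sketch stays self-contained):
        // SparkConf conf = new SparkConf().setMaster("local[*]").setAppName("demo");
        // JavaSparkContext sc = new JavaSparkContext(conf);
    }
}
```

In a Spring Boot project, place the `System.setProperty` call as the first line of the `@SpringBootApplication` class's main method, ahead of `SpringApplication.run(...)`, so that component scanning cannot trigger Hadoop class loading before the property is set.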