Running Hadoop on Windows: java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows

Exception details:

2018-04-11 16:32:28,514 INFO [org.apache.hadoop.mapreduce.JobSubmitter] - Submitting tokens for job: job_local1975654255_0001
2018-04-11 16:32:28,561 WARN [org.apache.hadoop.conf.Configuration] - file:/tmp/hadoop-Zimo/mapred/staging/Zimo1975654255/.staging/job_local1975654255_0001/job.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.retry.interval;  Ignoring.
2018-04-11 16:32:28,562 WARN [org.apache.hadoop.conf.Configuration] - file:/tmp/hadoop-Zimo/mapred/staging/Zimo1975654255/.staging/job_local1975654255_0001/job.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.attempts;  Ignoring.
2018-04-11 16:32:28,663 DEBUG [org.apache.hadoop.security.UserGroupInformation] - PrivilegedAction as:Zimo (auth:SIMPLE) from:org.apache.hadoop.fs.FileContext.getAbstractFileSystem(FileContext.java:329)
2018-04-11 16:32:28,758 INFO [org.apache.hadoop.mapreduce.JobSubmitter] - Cleaning up the staging area file:/tmp/hadoop-Zimo/mapred/staging/Zimo1975654255/.staging/job_local1975654255_0001
Exception in thread "main" java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
    at org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Native Method)
    at org.apache.hadoop.io.nativeio.NativeIO$Windows.access(NativeIO.java:435)
    at org.apache.hadoop.fs.FileUtil.canRead(FileUtil.java:977)
    at org.apache.hadoop.util.DiskChecker.checkAccessByFileMethods(DiskChecker.java:177)
    at org.apache.hadoop.util.DiskChecker.checkDirAccess(DiskChecker.java:164)
    at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:98)
    at org.apache.hadoop.fs.LocalDirAllocator$AllocatorPerContext.confChanged(LocalDirAllocator.java:285)
    at org.apache.hadoop.fs.LocalDirAllocator$AllocatorPerContext.getLocalPathForWrite(LocalDirAllocator.java:344)
    at org.apache.hadoop.fs.LocalDirAllocator.getLocalPathForWrite(LocalDirAllocator.java:150)
    at org.apache.hadoop.fs.LocalDirAllocator.getLocalPathForWrite(LocalDirAllocator.java:131)
    at org.apache.hadoop.fs.LocalDirAllocator.getLocalPathForWrite(LocalDirAllocator.java:115)
    at org.apache.hadoop.mapred.LocalDistributedCacheManager.setup(LocalDistributedCacheManager.java:131)
    at org.apache.hadoop.mapred.LocalJobRunner$Job.<init>(LocalJobRunner.java:157)
    at org.apache.hadoop.mapred.LocalJobRunner.submitJob(LocalJobRunner.java:636)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:430)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1268)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1265)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1265)
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1286)
    at com.hadoop.phoneStatistics.ExcelPhoneStatistics.run(ExcelPhoneStatistics.java:117)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at com.hadoop.phoneStatistics.ExcelPhoneStatistics.main(ExcelPhoneStatistics.java:128)
2018-04-11 16:32:28,767 DEBUG [org.apache.hadoop.ipc.Client] - Stopping client
2018-04-11 16:32:28,767 DEBUG [org.apache.hadoop.ipc.Client] - IPC Client (1166151249) connection to centpy/192.168.86.134:9000 from Zimo: closed
2018-04-11 16:32:28,768 DEBUG [org.apache.hadoop.ipc.Client] - IPC Client (1166151249) connection to centpy/192.168.86.134:9000 from Zimo: stopped, remaining connections 0
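The failing call, NativeIO$Windows.access0, is a native method that Hadoop resolves from hadoop.dll on Windows; when that DLL (together with winutils.exe) cannot be found through HADOOP_HOME, job submission dies with the UnsatisfiedLinkError above. A quick way to confirm what the submitting JVM actually sees is sketched below; the class name is made up for this check, while the Hadoop call (NativeCodeLoader.isNativeCodeLoaded) is a standard API.

```java
import org.apache.hadoop.util.NativeCodeLoader;

public class NativeLibCheck {
    public static void main(String[] args) {
        // True only if the Hadoop native library (hadoop.dll on Windows) was loaded by this JVM.
        System.out.println("native code loaded : " + NativeCodeLoader.isNativeCodeLoaded());
        // Hadoop looks up winutils.exe via the hadoop.home.dir system property,
        // falling back to the HADOOP_HOME environment variable.
        System.out.println("hadoop.home.dir    : " + System.getProperty("hadoop.home.dir"));
        System.out.println("HADOOP_HOME        : " + System.getenv("HADOOP_HOME"));
        // hadoop.dll itself is loaded through java.library.path, which on Windows includes Path.
        System.out.println("java.library.path  : " + System.getProperty("java.library.path"));
    }
}
```

If the first line prints false, or hadoop.home.dir/HADOOP_HOME is empty, the steps below should fix it.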

Solution:

1. First, download a Hadoop build compiled for Windows; you can find the "hadoop_windows" compiled build in my downloads.
2. Configure the HADOOP_HOME system environment variable and add %HADOOP_HOME%\bin to the Path variable.
3. Check that HADOOP_HOME\bin contains both hadoop.dll and winutils.exe; without these two files the build cannot be used.
4. After completing the steps above, restart the computer and the error will be resolved; a programmatic alternative is sketched after this list.
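If rebooting is inconvenient, a common workaround is to point Hadoop at the Windows build from inside the driver before any Hadoop class touches the local file system. The snippet below is a minimal sketch of that approach, assuming D:\hadoop is where you unpacked the Windows build; the class name and path are placeholders, not part of the original driver.

```java
public class HadoopHomeSetup {
    public static void main(String[] args) {
        // Assumption: D:\hadoop is the unpacked Windows build whose bin folder
        // contains hadoop.dll and winutils.exe; adjust the path to your own layout.
        System.setProperty("hadoop.home.dir", "D:\\hadoop");

        // Set this at the very top of the MapReduce driver's main method
        // (e.g. ExcelPhoneStatistics.main), before the Job is created.
        System.out.println("hadoop.home.dir = " + System.getProperty("hadoop.home.dir"));
    }
}
```

Note that hadoop.dll still has to be loadable by the JVM (keeping %HADOOP_HOME%\bin on the Windows Path usually covers this), because the UnsatisfiedLinkError comes from the DLL not being linked, while winutils.exe is what hadoop.home.dir/HADOOP_HOME resolves.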
