When running the wordcount program from Eclipse, the job fails with the following error:
2020-08-15 16:12:32,580 INFO [main] mapreduce.Job (Job.java:monitorAndPrintJob(1367)) - map 0% reduce 0%
2020-08-15 16:12:52,739 WARN [LocalJobRunner Map Task Executor #0] hdfs.BlockReaderFactory (BlockReaderFactory.java:getRemoteBlockReaderFromTcp(716)) - I/O error constructing remote block reader.
java.net.ConnectException: Connection timed out: no further information
Eclipse could access HDFS normally, and the configuration checked out fine. I then tried running the example jobs bundled with Hadoop, but they also hung at "map 0% reduce 0%". The problem went away after adding the following to mapred-site.xml:
<property>
  <name>mapreduce.job.tracker</name>
  <value>hdfs://hadoop01:9001</value>
  <final>true</final>
</property>
Problem solved!
Cause: when mapreduce.framework.name is set to use YARN for computation, the NodeManager must be running for jobs to execute. If you don't use YARN and instead configure mapreduce.job.tracker, jobs can still run under MRv2, so the NodeManager does not need to be started.
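For comparison, the YARN-based setup mentioned above would instead declare the framework in mapred-site.xml roughly as follows (a minimal sketch; the exact surrounding configuration depends on your cluster):

```xml
<property>
  <name>mapreduce.framework.name</name>
  <value>yarn</value>
</property>
```

With this setting, the YARN daemons (ResourceManager and NodeManager, typically launched via start-yarn.sh) must be running, otherwise the job will hang at "map 0% reduce 0%" exactly as described above; you can check with `jps` whether a NodeManager process is present.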