After a full day of fiddling, I finally figured out a few things:
1. Unit tests for Mappers and Reducers can be run in Eclipse on Windows.
I'm on Hadoop 1.1.1. You need hadoop-core-1.1.1.jar, MRUnit 0.9.0, mockito-all-1.8.5.jar, plus a few of the jars from Hadoop 1.1.1's lib directory.
Write the test classes following the style of the MRUnit Tutorial: https://cwiki.apache.org/confluence/display/MRUNIT/MRUnit+Tutorial
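A minimal sketch of what such a test class looks like, in the tutorial's style. The mapper and its "year<TAB>temperature" record format here are made up for illustration; they are not the book's actual MaxTemperature code:

```java
import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mrunit.mapreduce.MapDriver;
import org.junit.Before;
import org.junit.Test;

public class MaxTemperatureMapperTest {

  // Hypothetical mapper under test: emits (year, temperature) for each
  // tab-separated input line. Illustrative only.
  public static class MaxTemperatureMapper
      extends Mapper<LongWritable, Text, Text, IntWritable> {
    @Override
    protected void map(LongWritable key, Text value, Context context)
        throws IOException, InterruptedException {
      String[] fields = value.toString().split("\t");
      context.write(new Text(fields[0]),
                    new IntWritable(Integer.parseInt(fields[1])));
    }
  }

  private MapDriver<LongWritable, Text, Text, IntWritable> mapDriver;

  @Before
  public void setUp() {
    // MRUnit's MapDriver runs the mapper in-process -- no cluster,
    // no local job runner, so it works fine on Windows.
    mapDriver = MapDriver.newMapDriver(new MaxTemperatureMapper());
  }

  @Test
  public void mapsValidRecord() throws IOException {
    mapDriver
        .withInput(new LongWritable(0), new Text("1950\t22"))
        .withOutput(new Text("1950"), new IntWritable(22))
        .runTest(); // fails the JUnit test if the actual output differs
  }
}
```

Run it as a plain JUnit 4 test from Eclipse; nothing touches the local filesystem, which is why this path works while the local job runner (below) does not.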
2. I could not get "Running Locally on Test Data" ("Running a Job in a Local Job Runner") to work in Eclipse on Windows.
The code on page 157 of Hadoop: The Definitive Guide, 3rd edition, always fails under Eclipse 4.2.1 on Windows:
13/03/11 17:26:58 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
13/03/11 17:26:58 ERROR security.UserGroupInformation: PriviledgedActionException as:2171330005824 cause:java.io.IOException: Failed to set permissions of path: \tmp\hadoop-2171330005824\mapred\staging\2171330005824567325916\.staging to 0700
Exception in thread "main" java.io.IOException: Failed to set permissions of path: \tmp\hadoop-2171330005824\mapred\staging\2171330005824567325916\.staging to 0700
at org.apache.hadoop.fs.FileUtil.checkReturnValue(FileUtil.java:689)
at org.apache.hadoop.fs.FileUtil.setPermission(FileUtil.java:662)
at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:509)
at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:344)
at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:189)
at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:116)
at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:918)
at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:912)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1136)
at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:912)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:500)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:530)
at MaxTemperatureDriver.run(MaxTemperatureDriver.java:39)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
at MaxTemperatureDriver.main(MaxTemperatureDriver.java:43)
It seems the Hadoop code only understands Linux-style file permissions. I wonder whether anyone has managed to develop and debug Hadoop programs on Windows.
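Judging by the top of the stack trace, the exception comes from FileUtil.checkReturnValue, which rejects the return value of Java's File#setReadable/setWritable calls; on NTFS those calls cannot express POSIX 0700 permissions, so the local job runner's chmod on the staging directory reports failure. This is the issue tracked upstream as HADOOP-7682, if memory serves. One workaround that circulated at the time was to rebuild hadoop-core with the check neutered, roughly like this (a sketch against the 1.x source, not an official patch):

```diff
--- a/src/core/org/apache/hadoop/fs/FileUtil.java
+++ b/src/core/org/apache/hadoop/fs/FileUtil.java
@@
   private static void checkReturnValue(boolean rv, File p,
                                        FsPermission permission
                                        ) throws IOException {
-    if (!rv) {
-      throw new IOException("Failed to set permissions of path: " + p +
-                            " to " +
-                            String.format("%04o", permission.toShort()));
-    }
+    // Skipped on Windows: java.io.File cannot apply POSIX permissions,
+    // so the local job runner's chmod attempts fail harmlessly here.
   }
```

I haven't tried this myself; it only papers over the symptom for local debugging and should never ship in a jar used against a real cluster.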
Maybe that clunky Cygwin would work, but it is far too painful to use.
I don't have time to be the guinea pig, so I'll stick to doing Hadoop development and debugging on desktop Ubuntu.