1. http://cw550284.iteye.com/blog/1064844

2. http://lirenjuan.iteye.com/blog/1280729

After installing Cygwin, set the CYGWIN_HOME environment variable and add %CYGWIN_HOME%\bin to PATH.
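For example, from a Windows command prompt (a sketch; the install path E:\cygwin is an assumption, adjust it to your machine):

```bat
:: Sketch: assumes Cygwin is installed at E:\cygwin
setx CYGWIN_HOME "E:\cygwin"
:: setx does not affect the current session, so spell the path out literally
setx PATH "%PATH%;E:\cygwin\bin"
```

Note that setx truncates values longer than 1024 characters; if your PATH is long, editing it through the System Properties dialog is safer. Open a new console for the changes to take effect.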

When setting up passwordless access, you may hit errors that are hard to pin down. Turn on ssh's debug mode to locate the problem:
ssh -vv localhost
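For reference, passwordless login under Cygwin is typically set up by generating a key pair with an empty passphrase and appending the public key to authorized_keys (a sketch, assuming OpenSSH and the default ~/.ssh location):

```shell
# Sketch: passwordless SSH to localhost (assumes OpenSSH under Cygwin)
mkdir -p ~/.ssh && chmod 700 ~/.ssh
# Generate an RSA key pair with an empty passphrase, if none exists yet
[ -f ~/.ssh/id_rsa ] || ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa -q
# Authorize the key for logins to this account
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys
```

If `ssh localhost` still prompts for a password afterwards, rerun with `ssh -vv localhost` and check the file permissions it reports; sshd refuses keys whose files are group- or world-writable.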

The sshd service has not been started.

ssh: Permission denied
Problem: you can't log in to your account. You set the password using passwd, but it still gives you this error.
Solution: sometimes Cygwin does not create your local user in the /etc/passwd file. The fix is simple:
mkpasswd.exe -c > /etc/passwd
Now you should see your Windows user in the passwd file. Use the passwd command to give the Cygwin user a password; this is not the same as the Windows password.

2. Modify the configuration in core-site.xml;

3. Modify the configuration in mapred-site.xml;
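A minimal sketch of the two files for a single-node (pseudo-distributed) setup on this Hadoop generation; the ports 9000/9001 and the tmp path are assumptions, adjust them to your environment:

```xml
<!-- core-site.xml (sketch) -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/cygdrive/e/data/tmp</value>
  </property>
</configuration>

<!-- mapred-site.xml (sketch) -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
```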

Install the hadoop-eclipse-plugin
d. Delete the org.eclipse.update folder under E:\Program Files\eclipse\configuration

The following error appears when running map tasks:
12/03/07 14:56:13 INFO mapred.JobClient: Task Id : attempt_201203071039_0011_m_000001_2, Status : FAILED
java.io.FileNotFoundException: File E:/cygdrive/e/data/tmp/mapred/local/taskTracker/jobcache/job_201203071039_0011/attempt_201203071039_0011_m_000001_2/work/tmp does not exist.
at org.apache.hadoop.mapred.Child.main(Child.java:155)

<property>
  <name>mapred.child.tmp</name>
  <description>To set the value of tmp directory for map and reduce tasks.
  If the value is an absolute path, it is directly assigned. Otherwise, it is
  prepended with task's working directory. The java tasks are executed with
  option -Djava.io.tmpdir='the absolute path of the tmp dir'. Pipes and
  streaming are set with environment variable,
  TMPDIR='the absolute path of the tmp dir'
  </description>
</property>
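Since the relative default is prepended to the task's working directory, which may not resolve correctly under Cygwin, a common workaround is to point mapred.child.tmp at an absolute path in mapred-site.xml (a sketch; the directory is an assumption and must already exist):

```xml
<!-- mapred-site.xml (sketch; pick an existing directory) -->
<property>
  <name>mapred.child.tmp</name>
  <value>/cygdrive/e/data/tmp</value>
</property>
```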

The following error appears when running a MapReduce job:
java.lang.IllegalArgumentException: Can't read partitions file
    at org.apache.hadoop.mapred.MapTask$NewOutputCollector.<init>(MapTask.java:560)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:639)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:323)
    at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:210)
    at org.apache.hadoop.io.SequenceFile$Reader.<init>(SequenceFile.java:1424)
    at org.apache.hadoop.io.SequenceFile$Reader.<init>(SequenceFile.java:1419)
    at org.apache.hadoop.hbase.mapreduce.hadoopbackport.TotalOrderPartitioner.readPartitions(TotalOrderPartitioner.java:296)