1. compile WordCount.java
javac -classpath ./hadoop-core-0.20.203.0.jar -d ../hdp_test -s ../hdp_test ../hdp_test/WordCount.java -Xlint:deprecation
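Conceptually, the job being compiled just tokenizes each input line and sums a count of 1 per word occurrence. A minimal plain-Java sketch of that map/reduce logic (no Hadoop dependency; class and method names are hypothetical) is handy for sanity-checking the expected output:

```java
import java.util.Map;
import java.util.StringTokenizer;
import java.util.TreeMap;

public class WordCountSketch {
    // "map" phase: emit (word, 1) for every token;
    // "reduce" phase: sum the counts per word.
    static Map<String, Integer> count(String[] lines) {
        Map<String, Integer> counts = new TreeMap<>();
        for (String line : lines) {
            StringTokenizer tok = new StringTokenizer(line);
            while (tok.hasMoreTokens()) {
                counts.merge(tok.nextToken(), 1, Integer::sum);
            }
        }
        return counts;
    }

    public static void main(String[] args) {
        // Print in the same "word TAB count" format as the job's output file.
        for (Map.Entry<String, Integer> e : count(new String[] {
                "Hello World Bye World",
                "Hello Hadoop Goodbye Hadoop" }).entrySet()) {
            System.out.println(e.getKey() + "\t" + e.getValue());
        }
    }
}
```

The real job computes the same counts, only distributed across mappers and reducers.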
2. jar it
jar -cvf ~/hdp_test/WordCount.jar .
3. change core-site.xml
add the hadoop.tmp.dir property:
by default HDFS keeps its formatted metadata under /tmp, which is cleared when the machine reboots,
so point it at a persistent directory instead.
change the property block:
<property>
<name>hadoop.tmp.dir</name>
<value>/home/chenglun/tmp</value>
<description>A base for other temporary directories</description>
</property>
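For reference, that property block goes inside the <configuration> element of core-site.xml; a pseudo-distributed setup typically also names the default filesystem there (the host/port below are assumptions, adjust to your setup):

```xml
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/home/chenglun/tmp</value>
    <description>A base for other temporary directories</description>
  </property>
</configuration>
```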
4. format the namenode
hadoop namenode -format
5. run it
5.0 prepare the input data
hadoop jar WordCount.jar org.myorg.WordCount /home/chenglun/input /home/chenglun/output
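The input directory has to exist in HDFS before the job runs. A sketch of step 5.0 using the fs commands from the appendix (it needs the HDFS set up in steps 3-4 to be running; the sample file name is hypothetical):

```shell
# create some local sample input
echo "Hello World Bye World" > /tmp/file01
# copy it into the HDFS input directory used by the run command
hadoop fs -mkdir /home/chenglun/input
hadoop fs -copyFromLocal /tmp/file01 /home/chenglun/input
# after the job finishes, inspect the result
hadoop fs -cat /home/chenglun/output/part-00000
```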
Appendix: hdfs shell:
hadoop fs -ls [path]
hadoop fs -cat [path]
hadoop fs -mkdir [dir]
hadoop fs -copyFromLocal <localsrc> <dst>