Setting up YARN and submitting a MapReduce job

Version used: hadoop-2.6.0-cdh5.7.0

yarn-site.xml

mapred-site.xml

Starting and stopping YARN

 

cd /home/hadoop/app/hadoop-2.6.0-cdh5.7.0/etc/hadoop

Only a mapred-site.xml.template file exists, so copy it first:

cp mapred-site.xml.template mapred-site.xml

mapred-site.xml 

<configuration>
    <property>
        <name>mapreduce.framework.name</name>
        <value>yarn</value>
    </property>
</configuration>

 

yarn-site.xml

<configuration>
    <property>
        <name>yarn.nodemanager.aux-services</name>
        <value>mapreduce_shuffle</value>
    </property>
</configuration>

 

Start YARN

$ sbin/start-yarn.sh

Verify that it started successfully:

jps

8274 NameNode

8402 DataNode

9033 NodeManager

9370 Jps

8571 SecondaryNameNode

8892 ResourceManager

 

Verify in the web UI

http://192.168.0.66:8088/cluster

 

Stop YARN

  $ sbin/stop-yarn.sh

 

Create a data folder under the root directory and add a hello.txt file to it.

The file contents:

hello world welcome

hello welcome

 

Create a directory on HDFS:

hadoop fs -mkdir -p /input/wc

 

Copy hello.txt into the HDFS directory:

hadoop fs -put hello.txt /input/wc

 

View the file:

hadoop fs -text /input/wc/hello.txt

 

Submit the MR job (wordcount) to YARN. The examples jar is located under:

/home/hadoop/app/hadoop-2.6.0-cdh5.7.0/share/hadoop/mapreduce

hadoop-mapreduce-examples-2.6.0-cdh5.7.0.jar

 

hadoop jar /home/hadoop/app/hadoop-2.6.0-cdh5.7.0/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.6.0-cdh5.7.0.jar wordcount /input/wc/hello.txt /output/wc

 

Console output:

19/07/14 23:21:54 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

19/07/14 23:21:55 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032

19/07/14 23:21:57 INFO input.FileInputFormat: Total input paths to process : 1

19/07/14 23:21:57 INFO mapreduce.JobSubmitter: number of splits:1

19/07/14 23:21:57 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1563114486260_0002

19/07/14 23:21:58 INFO impl.YarnClientImpl: Submitted application application_1563114486260_0002

19/07/14 23:21:58 INFO mapreduce.Job: The url to track the job: http://hadoop001:8088/proxy/application_1563114486260_0002/

19/07/14 23:21:58 INFO mapreduce.Job: Running job: job_1563114486260_0002

19/07/14 23:22:07 INFO mapreduce.Job: Job job_1563114486260_0002 running in uber mode : false

19/07/14 23:22:07 INFO mapreduce.Job:  map 0% reduce 0%

19/07/14 23:22:13 INFO mapreduce.Job:  map 100% reduce 0%

19/07/14 23:22:21 INFO mapreduce.Job:  map 100% reduce 100%

19/07/14 23:22:23 INFO mapreduce.Job: Job job_1563114486260_0002 completed successfully

19/07/14 23:22:23 INFO mapreduce.Job: Counters: 49

    File System Counters

        FILE: Number of bytes read=44

        FILE: Number of bytes written=222939

        FILE: Number of read operations=0

        FILE: Number of large read operations=0

        FILE: Number of write operations=0

        HDFS: Number of bytes read=139

        HDFS: Number of bytes written=26

        HDFS: Number of read operations=6

        HDFS: Number of large read operations=0

        HDFS: Number of write operations=2

    Job Counters

        Launched map tasks=1

        Launched reduce tasks=1

        Data-local map tasks=1

        Total time spent by all maps in occupied slots (ms)=4633

        Total time spent by all reduces in occupied slots (ms)=6126

        Total time spent by all map tasks (ms)=4633

        Total time spent by all reduce tasks (ms)=6126

        Total vcore-seconds taken by all map tasks=4633

        Total vcore-seconds taken by all reduce tasks=6126

        Total megabyte-seconds taken by all map tasks=4744192

        Total megabyte-seconds taken by all reduce tasks=6273024

    Map-Reduce Framework

        Map input records=2

        Map output records=5

        Map output bytes=54

        Map output materialized bytes=44

        Input split bytes=105

        Combine input records=5

        Combine output records=3

        Reduce input groups=3

        Reduce shuffle bytes=44

        Reduce input records=3

        Reduce output records=3

        Spilled Records=6

        Shuffled Maps =1

        Failed Shuffles=0

        Merged Map outputs=1

        GC time elapsed (ms)=214

        CPU time spent (ms)=2720

        Physical memory (bytes) snapshot=353509376

        Virtual memory (bytes) snapshot=2702671872

        Total committed heap usage (bytes)=284164096

    Shuffle Errors

        BAD_ID=0

        CONNECTION=0

        IO_ERROR=0

        WRONG_LENGTH=0

        WRONG_MAP=0

        WRONG_REDUCE=0

    File Input Format Counters

        Bytes Read=34

    File Output Format Counters

        Bytes Written=26
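The Map-Reduce Framework counters line up with the input: the two lines contain five words in total (Map output records=5), three of which are distinct (Combine output records=3, Reduce input/output records=3). As a rough local sanity check, assuming the same hello.txt contents as above, the two numbers can be reproduced with a plain shell pipeline:

```shell
# Local sanity check of the counters (assumes the same hello.txt contents).
printf 'hello world welcome\nhello welcome\n' > /tmp/hello.txt

# Total words -> matches Map output records = 5
tr ' ' '\n' < /tmp/hello.txt | wc -l

# Distinct words -> matches Combine output records = 3
tr ' ' '\n' < /tmp/hello.txt | sort -u | wc -l
```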



View the job in the web UI

The newly submitted YARN application is visible there.

 

Run:

hadoop fs -ls /output/wc/

to list the generated output:

-rw-r--r--   1 root supergroup          0 2019-07-14 23:10 /output/wc/_SUCCESS

-rw-r--r--   1 root supergroup         26 2019-07-14 23:10 /output/wc/part-r-00000

 

Run:

hadoop fs -text /output/wc/part-r-00000

to view the result:

hello    2

welcome    2

world    1
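These counts can be cross-checked locally with a plain shell pipeline over the same two input lines (a quick verification only, not how the MR job computes them):

```shell
# Reproduce the wordcount result locally from the same two input lines.
printf 'hello world welcome\nhello welcome\n' \
  | tr ' ' '\n' \
  | sort \
  | uniq -c \
  | awk '{print $2, $1}'
# hello 2
# welcome 2
# world 1
```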

 

If we run the job again, it fails with an error:

WARN security.UserGroupInformation: PriviledgedActionException as:root (auth:SIMPLE) cause:org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory hdfs://hadoop001:8020/output/wc already exists

org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory hdfs://hadoop001:8020/output/wc already exists

    at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.checkOutputSpecs(FileOutputFormat.java:146)

    at org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:270)

    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:143)

    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1307)

    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1304)

    at java.security.AccessController.doPrivileged(Native Method)

    at javax.security.auth.Subject.doAs(Subject.java:422)

    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1693)

    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1304)

    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1325)

    at org.apache.hadoop.examples.WordCount.main(WordCount.java:87)

    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)

    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)

    at java.lang.reflect.Method.invoke(Method.java:498)

    at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:71)

    at org.apache.hadoop.util.ProgramDriver.run(ProgramDriver.java:144)

    at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:74)

    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)

    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)

    at java.lang.reflect.Method.invoke(Method.java:498)

    at org.apache.hadoop.util.RunJar.run(RunJar.java:221)

    at org.apache.hadoop.util.RunJar.main(RunJar.java:136)

 

To run the job again, first delete the /output/wc directory and the files under it:

hadoop fs -rm -r /output/wc

 
