Developing MapReduce Programs for HBase on a Hadoop Distributed Cluster


  After getting the HBase distributed cluster up and running, I spent four or five days experimenting with client-side Map/Reduce program development. Sample code for this is all over the web, and writing a test program is easy; actually getting it to run on the cluster, however, was a string of setbacks. Below are the lessons I learned while finally getting my program debugged and running on the cluster.

  I. First, my environment:

  1. The cluster configuration is described in an earlier post.

  2. Development client: the OS is CentOS 6.5, the JDK is 1.7.0-60, and the IDE is Eclipse (originally the ADT bundle downloaded from Google's ADT site, later extended with the Java EE development tools; the splash screen is still ADT's :)).

  3. All of the cluster's hosts were added to /etc/hosts.
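For reference, the client's /etc/hosts would contain entries such as the following (hostnames and addresses here are the ones that appear in the logs later in this post; adjust them to your own cluster):

```text
192.168.5.111  dn1
192.168.5.112  dn2
192.168.5.113  dn3
192.168.5.114  dn4
192.168.5.115  dn5
192.168.5.116  dn6
```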

  II. Necessary preparation:

  1. Java libraries: I scp'd everything under hadoop/share and everything under hbase/lib from the cluster to /home/username/lib on the development machine (where username is the development account). In Eclipse, choose Window -> Preferences, navigate to Java -> Build Path -> User Libraries in the left-hand tree, and create three User Libraries: "hadoop-common", "hadoop-yarn", and "hbase". Then add the jars listed below to each one (this is the User Libraries export file; it can be imported in that same dialog).
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<eclipse-userlibraries version="2">
    <library name="hadoop-common" systemlibrary="false">
        <archive path="/home/gzg/javadev/lib/hadoop/mapreduce/hadoop-mapreduce-client-app-2.3.0-cdh5.0.2.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/common/lib/hadoop-auth-2.3.0-cdh5.0.2.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/common/lib/hadoop-annotations-2.3.0-cdh5.0.2.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/common/hadoop-common-2.3.0-cdh5.0.2.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/common/lib/commons-logging-1.1.3.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/common/lib/commons-compress-1.4.1.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/common/lib/guava-11.0.2.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/common/lib/commons-collections-3.2.1.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/common/lib/slf4j-api-1.7.5.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/common/lib/slf4j-log4j12-1.7.5.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/common/lib/zookeeper-3.4.5-cdh5.0.2.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/common/lib/commons-configuration-1.6.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/common/lib/commons-math3-3.1.1.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/common/lib/activation-1.1.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/common/lib/asm-3.2.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/common/lib/avro-1.7.5-cdh5.0.2.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/common/lib/commons-beanutils-1.7.0.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/common/lib/commons-beanutils-core-1.8.0.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/common/lib/commons-cli-1.2.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/common/lib/commons-codec-1.4.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/common/lib/commons-digester-1.8.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/common/lib/commons-el-1.0.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/common/lib/commons-httpclient-3.1.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/common/lib/commons-io-2.4.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/common/lib/commons-lang-2.6.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/common/lib/commons-net-3.1.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/common/lib/httpclient-4.2.5.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/common/lib/httpcore-4.2.5.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/common/lib/jackson-core-asl-1.8.8.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/common/lib/jackson-jaxrs-1.8.8.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/common/lib/jackson-mapper-asl-1.8.8.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/common/lib/jackson-xc-1.8.8.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/common/lib/jasper-compiler-5.5.23.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/common/lib/jasper-runtime-5.5.23.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/common/lib/java-xmlbuilder-0.4.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/common/lib/jaxb-api-2.2.2.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/common/lib/jaxb-impl-2.2.3-1.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/common/lib/jersey-core-1.9.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/common/lib/jersey-json-1.9.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/common/lib/jersey-server-1.9.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/common/lib/jets3t-0.9.0.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/common/lib/jettison-1.1.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/common/lib/jetty-6.1.26.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/common/lib/jetty-util-6.1.26.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/common/lib/jsch-0.1.42.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/common/lib/jsp-api-2.1.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/common/lib/jsr305-1.3.9.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/common/lib/junit-4.8.2.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/common/lib/log4j-1.2.17.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/common/lib/mockito-all-1.8.5.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/common/lib/paranamer-2.3.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/common/lib/protobuf-java-2.5.0.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/common/lib/servlet-api-2.5.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/common/lib/snappy-java-1.0.4.1.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/common/lib/stax-api-1.0-2.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/common/lib/xmlenc-0.52.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/common/lib/xz-1.0.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/mapreduce/lib/netty-3.6.2.Final.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/hdfs/hadoop-hdfs-2.3.0-cdh5.0.2.jar"/>
    </library>
    <library name="hadoop-yarn" systemlibrary="false">
        <archive path="/home/gzg/javadev/lib/hadoop/mapreduce2/hadoop-mapreduce-client-app-2.3.0-cdh5.0.2.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/mapreduce2/hadoop-mapreduce-client-core-2.3.0-cdh5.0.2.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/mapreduce2/hadoop-mapreduce-client-jobclient-2.3.0-cdh5.0.2.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/mapreduce2/hadoop-mapreduce-client-common-2.3.0-cdh5.0.2.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/mapreduce2/hadoop-mapreduce-client-shuffle-2.3.0-cdh5.0.2.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/yarn/hadoop-yarn-api-2.3.0-cdh5.0.2.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/yarn/hadoop-yarn-client-2.3.0-cdh5.0.2.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/yarn/hadoop-yarn-common-2.3.0-cdh5.0.2.jar"/>
        <archive path="/home/gzg/javadev/lib/hadoop/yarn/hadoop-yarn-applications-distributedshell-2.3.0-cdh5.0.2.jar"/>
    </library>
    <library name="hbase" systemlibrary="false">
        <archive path="/home/gzg/javadev/lib/hbase/hbase-client-0.96.1.1-cdh5.0.2.jar"/>
        <archive path="/home/gzg/javadev/lib/hbase/hbase-common-0.96.1.1-cdh5.0.2.jar"/>
        <archive path="/home/gzg/javadev/lib/hbase/hbase-hadoop2-compat-0.96.1.1-cdh5.0.2.jar"/>
        <archive path="/home/gzg/javadev/lib/hbase/hbase-hadoop-compat-0.96.1.1-cdh5.0.2.jar"/>
        <archive path="/home/gzg/javadev/lib/hbase/hbase-it-0.96.1.1-cdh5.0.2.jar"/>
        <archive path="/home/gzg/javadev/lib/hbase/hbase-prefix-tree-0.96.1.1-cdh5.0.2.jar"/>
        <archive path="/home/gzg/javadev/lib/hbase/hbase-protocol-0.96.1.1-cdh5.0.2.jar"/>
        <archive path="/home/gzg/javadev/lib/hbase/hbase-testing-util-0.96.1.1-cdh5.0.2.jar"/>
        <archive path="/home/gzg/javadev/lib/hbase/hbase-thrift-0.96.1.1-cdh5.0.2.jar"/>
        <archive path="/home/gzg/javadev/lib/hbase/protobuf-java-2.5.0.jar"/>
        <archive path="/home/gzg/javadev/lib/hbase/hbase-server-0.96.1.1-cdh5.0.2.jar"/>
        <archive path="/home/gzg/javadev/lib/hbase/hbase-shell-0.96.1.1-cdh5.0.2.jar"/>
        <archive path="/home/gzg/javadev/lib/hbase/htrace-core-2.01.jar"/>
    </library>
</eclipse-userlibraries>

 Note: this is quite possibly not the minimal set of libraries needed to run an HBase Map/Reduce job; I haven't tested them one by one.

  2. Create a new Java Application project (mine is simply called hbase1). Then right-click the project in the Project view, choose "Build Path" -> "Add Libraries..." -> "User Library", and select the three User Libraries added above.

  3. Copy the cluster's three configuration files into the project's src directory. The three files are:

core-site.xml  hdfs-site.xml  yarn-site.xml

   III. Writing the code. Here I follow the sequence of setbacks I ran into and refine the code step by step (omitting trivial problems such as missing libraries):

   1. Hitting a NullPointerException in reverseDNS

Contents of the main file:

import java.io.IOException;

/**
 * @author gzg
 */
public class Test1 {

    /**
     * @param args
     */
    public static void main(String[] args) {
        try {
            new TestHBaseMapred().run(false);
        } catch (yh.utils.system.CommandError cep) {
            cep.printStackTrace();
        } catch (ClassNotFoundException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
    }
}

 Contents of TestHBaseMapred.java:

import java.io.IOException;

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.Cell;
import org.apache.hadoop.hbase.CellUtil;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;

public class TestHBaseMapred {
    public void run(boolean toFile) throws ClassNotFoundException, IOException, InterruptedException {
        org.apache.hadoop.conf.Configuration conf = new org.apache.hadoop.conf.Configuration();
        // set the ZooKeeper quorum addresses
        conf.set("hbase.zookeeper.quorum", "zk1,zk2,zk3");
        // this setting doesn't seem to have any effect
        conf.set("hadoop.job.user", "hadoop");
        org.apache.hadoop.mapreduce.Job job = null;
        String tmpTable = "_mrtest1";
        yh.utils.ElapsedTimeMeasure etm = yh.utils.ElapsedTimeMeasure.createAndStart();
        try {
            job = Job.getInstance(conf);
            job.setJobName("count meter");

            if (toFile == false) {
                // write results back into a temporary HBase table
                HBaseUtil.dropTable(conf, tmpTable);
                HBaseUtil.createTable(conf, tmpTable, "c");
                job.setOutputFormatClass(org.apache.hadoop.hbase.mapreduce.TableOutputFormat.class);
                job.getConfiguration().set(org.apache.hadoop.hbase.mapreduce.TableOutputFormat.OUTPUT_TABLE, tmpTable);
                job.setReducerClass(TblReduce.class);
            } else {
                // write results into a text file under "result"
                job.setOutputFormatClass(org.apache.hadoop.mapreduce.lib.output.TextOutputFormat.class);
                Path op = new Path("result");
                org.apache.hadoop.mapreduce.lib.output.TextOutputFormat.setOutputPath(job, op);
                job.setReducerClass(TxtReduce.class);
            }

            Scan scan = new Scan();
            scan.addFamily(Bytes.toBytes("c"));
            org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil.initTableMapperJob("dzdl_t1", scan, TblMap.class, Text.class, LongWritable.class, job);
            job.waitForCompletion(true);
        } catch (IOException e) {
            e.printStackTrace();
        }
        System.out.println("time elapsed :" + etm.stop().toString());
    }

    public static class TblMap extends org.apache.hadoop.hbase.mapreduce.TableMapper<Text, LongWritable> {

        @Override
        protected void map(ImmutableBytesWritable key, Result result, org.apache.hadoop.mapreduce.Mapper.Context ctx) throws IOException, InterruptedException {
            Cell c = result.getColumnLatestCell(Bytes.toBytes("c"), Bytes.toBytes("col3"));
            String col = Bytes.toString(CellUtil.cloneQualifier(c));
            // every row contributes a 1 under the single key "count_result"
            Text kv = new Text();
            kv.set("count_result");
            LongWritable lv = new LongWritable();
            lv.set(1);
            ctx.write(kv, lv);
        }
    }

    public static class TblReduce extends org.apache.hadoop.hbase.mapreduce.TableReducer<Text, LongWritable, ImmutableBytesWritable> {
        @Override
        protected void reduce(Text key, Iterable<LongWritable> values, Context ctx) throws IOException, InterruptedException {
            long t = 0;
            for (LongWritable v : values) {
                t += v.get();
            }
            byte[] ok = Bytes.toBytes("count_result");
            Put p = new Put(ok);
            p.add(Bytes.toBytes("c"), key.copyBytes(), Bytes.toBytes(t));
            p.add(Bytes.toBytes("c"), Bytes.toBytes("14112896"), Bytes.toBytes(14112896));
            ctx.write(new ImmutableBytesWritable(ok), p);
        }
    }

    public static class TxtReduce extends org.apache.hadoop.mapreduce.lib.reduce.LongSumReducer<Text> {
        @Override
        public void reduce(Text key, Iterable<LongWritable> values, Context ctx) throws IOException, InterruptedException {
            long t = 0;
            for (LongWritable v : values) {
                t += v.get();
            }
            LongWritable l = new LongWritable();
            l.set(t);
            ctx.write(key, l);
        }
    }
}
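Stripped of the Hadoop and HBase machinery, the job above is just a row counter: TblMap emits the single key "count_result" with value 1 for every row, and the reducers sum the values per key. A plain-Java sketch of that counting logic (the class and method names here are made up for illustration):

```java
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class CountSketch {
    // Mimics TblMap + TblReduce/TxtReduce: every input row emits
    // ("count_result", 1) and the reduce step sums the values per key.
    static Map<String, Long> countRows(List<String> rows) {
        Map<String, Long> totals = new HashMap<String, Long>();
        for (String row : rows) {
            Long t = totals.get("count_result");               // map step: emit 1 per row
            totals.put("count_result", t == null ? 1L : t + 1L); // reduce step: sum per key
        }
        return totals;
    }

    public static void main(String[] args) {
        System.out.println(countRows(Arrays.asList("r1", "r2", "r3")));
    }
}
```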

 Running at this point produces the following error:

Exception in thread "main" java.lang.NullPointerException
     at org.apache.hadoop.net.DNS.reverseDns(DNS.java:92)
     at org.apache.hadoop.hbase.mapreduce.TableInputFormatBase.reverseDNS(TableInputFormatBase.java:218)
     at org.apache.hadoop.hbase.mapreduce.TableInputFormatBase.getSplits(TableInputFormatBase.java:184)
     at org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:493)
     at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:510)
     at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:394)
     at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1295)
     at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1292)
     at java.security.AccessController.doPrivileged(Native Method)
     at javax.security.auth.Subject.doAs(Subject.java:415)
     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
     at org.apache.hadoop.mapreduce.Job.submit(Job.java:1292)
     at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1313)
     at hbase1.TestHBaseMapred.run(TestHBaseMapred.java:82)
     at hbase1.Test1.main(Test1.java:69)

Looking at the error, it was not obvious why a DNS service was being called at all; the hosts file appeared to be ignored. After a long search I finally found the answer in a question posted by someone abroad: adding the following setting to conf fixes it.

conf.set("hbase.nameserver.address", "nn1,nn2");

 With this added, the run still prints the errors below, but they don't seem to stop the program from continuing:

2014-08-21 14:05:04,158 ERROR [org.apache.hadoop.hbase.mapreduce.TableInputFormatBase] - Cannot resolve the host name for dn1/192.168.5.111 because of javax.naming.ConfigurationException: Unknown DNS server: nn1,nn2 [Root exception is java.net.UnknownHostException: nn1,nn2: Name or service not known]; remaining name '111.5.168.192.in-addr.arpa'
2014-08-21 14:05:04,170 ERROR [org.apache.hadoop.hbase.mapreduce.TableInputFormatBase] - Cannot resolve the host name for dn6/192.168.5.116 because of javax.naming.ConfigurationException: Unknown DNS server: nn1,nn2 [Root exception is java.net.UnknownHostException: nn1,nn2]; remaining name '116.5.168.192.in-addr.arpa'
2014-08-21 14:05:04,171 ERROR [org.apache.hadoop.hbase.mapreduce.TableInputFormatBase] - Cannot resolve the host name for dn3/192.168.5.113 because of javax.naming.ConfigurationException: Unknown DNS server: nn1,nn2 [Root exception is java.net.UnknownHostException: nn1,nn2]; remaining name '113.5.168.192.in-addr.arpa'
2014-08-21 14:05:04,172 ERROR [org.apache.hadoop.hbase.mapreduce.TableInputFormatBase] - Cannot resolve the host name for dn4/192.168.5.114 because of javax.naming.ConfigurationException: Unknown DNS server: nn1,nn2 [Root exception is java.net.UnknownHostException: nn1,nn2]; remaining name '114.5.168.192.in-addr.arpa'
2014-08-21 14:05:04,173 ERROR [org.apache.hadoop.hbase.mapreduce.TableInputFormatBase] - Cannot resolve the host name for dn4/192.168.5.114 because of javax.naming.ConfigurationException: Unknown DNS server: nn1,nn2 [Root exception is java.net.UnknownHostException: nn1,nn2]; remaining name '114.5.168.192.in-addr.arpa'
2014-08-21 14:05:04,173 ERROR [org.apache.hadoop.hbase.mapreduce.TableInputFormatBase] - Cannot resolve the host name for dn1/192.168.5.111 because of javax.naming.ConfigurationException: Unknown DNS server: nn1,nn2 [Root exception is java.net.UnknownHostException: nn1,nn2]; remaining name '111.5.168.192.in-addr.arpa'
2014-08-21 14:05:04,174 ERROR [org.apache.hadoop.hbase.mapreduce.TableInputFormatBase] - Cannot resolve the host name for dn6/192.168.5.116 because of javax.naming.ConfigurationException: Unknown DNS server: nn1,nn2 [Root exception is java.net.UnknownHostException: nn1,nn2]; remaining name '116.5.168.192.in-addr.arpa'
2014-08-21 14:05:04,174 ERROR [org.apache.hadoop.hbase.mapreduce.TableInputFormatBase] - Cannot resolve the host name for dn4/192.168.5.114 because of javax.naming.ConfigurationException: Unknown DNS server: nn1,nn2 [Root exception is java.net.UnknownHostException: nn1,nn2]; remaining name '114.5.168.192.in-addr.arpa'
2014-08-21 14:05:04,175 ERROR [org.apache.hadoop.hbase.mapreduce.TableInputFormatBase] - Cannot resolve the host name for dn5/192.168.5.115 because of javax.naming.ConfigurationException: Unknown DNS server: nn1,nn2 [Root exception is java.net.UnknownHostException: nn1,nn2]; remaining name '115.5.168.192.in-addr.arpa'
2014-08-21 14:05:04,175 ERROR [org.apache.hadoop.hbase.mapreduce.TableInputFormatBase] - Cannot resolve the host name for dn6/192.168.5.116 because of javax.naming.ConfigurationException: Unknown DNS server: nn1,nn2 [Root exception is java.net.UnknownHostException: nn1,nn2]; remaining name '116.5.168.192.in-addr.arpa'
2014-08-21 14:05:04,199 ERROR [org.apache.hadoop.hbase.mapreduce.TableInputFormatBase] - Cannot resolve the host name for dn5/192.168.5.115 because of javax.naming.ConfigurationException: Unknown DNS server: nn1,nn2 [Root exception is java.net.UnknownHostException: nn1,nn2]; remaining name '115.5.168.192.in-addr.arpa'
2014-08-21 14:05:04,200 ERROR [org.apache.hadoop.hbase.mapreduce.TableInputFormatBase] - Cannot resolve the host name for dn2/192.168.5.112 because of javax.naming.ConfigurationException: Unknown DNS server: nn1,nn2 [Root exception is java.net.UnknownHostException: nn1,nn2]; remaining name '112.5.168.192.in-addr.arpa'
2014-08-21 14:05:04,201 ERROR [org.apache.hadoop.hbase.mapreduce.TableInputFormatBase] - Cannot resolve the host name for dn5/192.168.5.115 because of javax.naming.ConfigurationException: Unknown DNS server: nn1,nn2 [Root exception is java.net.UnknownHostException: nn1,nn2]; remaining name '115.5.168.192.in-addr.arpa'
2014-08-21 14:05:04,201 ERROR [org.apache.hadoop.hbase.mapreduce.TableInputFormatBase] - Cannot resolve the host name for dn2/192.168.5.112 because of javax.naming.ConfigurationException: Unknown DNS server: nn1,nn2 [Root exception is java.net.UnknownHostException: nn1,nn2]; remaining name '112.5.168.192.in-addr.arpa'
2014-08-21 14:05:04,202 ERROR [org.apache.hadoop.hbase.mapreduce.TableInputFormatBase] - Cannot resolve the host name for dn3/192.168.5.113 because of javax.naming.ConfigurationException: Unknown DNS server: nn1,nn2 [Root exception is java.net.UnknownHostException: nn1,nn2]; remaining name '113.5.168.192.in-addr.arpa'
2014-08-21 14:05:04,539 INFO [org.apache.hadoop.mapreduce.JobSubmitter] - number of splits:15
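The "remaining name" strings in these errors are standard reverse-DNS (PTR) query names: the IPv4 octets reversed, with .in-addr.arpa appended. A quick sketch of how such a name is formed (the class and method names are made up for illustration):

```java
public class ReverseDnsName {
    // Builds the PTR query name for an IPv4 address, e.g.
    // 192.168.5.111 -> "111.5.168.192.in-addr.arpa" as seen in the log above.
    static String ptrName(String ipv4) {
        String[] o = ipv4.split("\\.");
        return o[3] + "." + o[2] + "." + o[1] + "." + o[0] + ".in-addr.arpa";
    }

    public static void main(String[] args) {
        System.out.println(ptrName("192.168.5.111"));
    }
}
```

This also explains why the setting above produces "Unknown DNS server: nn1,nn2": the whole comma-separated string is treated as a single server name rather than a list.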

    2. The job only runs locally. With the problem above solved the program finally ran, but I could not see my job in the RM's web UI. Another round of searching on Bing turned up the cause: the following setting is required:

// this must be set, otherwise the job just runs in the local client and
// never shows up in the RM's web management UI!
conf.set("mapreduce.framework.name", "yarn");
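Alternatively, this property can be supplied through configuration on the client classpath rather than in code; a sketch of such a fragment (whether it is picked up depends on the file being on the classpath alongside the other *-site.xml files copied into src earlier):

```xml
<configuration>
  <property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
  </property>
</configuration>
```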

    3. With that line added, a new problem appeared; running the program now produces a new error:

2014 - 08 - 21 14 : 15 : 08 , 887 INFO [org.apache.hadoop.mapreduce.Job] - The url to track the job: http: //rm2:23188/proxy/application_1408378794442_0014/
2014 - 08 - 21 14 : 15 : 08 , 888 INFO [org.apache.hadoop.mapreduce.Job] - Running job: job_1408378794442_0014
2014 - 08 - 21 14 : 15 : 21 , 466 INFO [org.apache.hadoop.mapreduce.Job] - Job job_1408378794442_0014 running in uber mode : false
2014 - 08 - 21 14 : 15 : 21 , 469 INFO [org.apache.hadoop.mapreduce.Job] -  map 0 % reduce 0 %
2014 - 08 - 21 14 : 15 : 34 , 988 INFO [org.apache.hadoop.mapreduce.Job] - Task Id : attempt_1408378794442_0014_m_000013_0, Status : FAILED
2014 - 08 - 21 14 : 15 : 35 , 023 INFO [org.apache.hadoop.mapreduce.Job] - Task Id : attempt_1408378794442_0014_m_000011_0, Status : FAILED
Error: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class hbase1.TestHBaseMapred$TblMap not found
     at org.apache.hadoop.conf.Configuration.getClass(Configuration.java: 1895 )
     at org.apache.hadoop.mapreduce.task.JobContextImpl.getMapperClass(JobContextImpl.java: 196 )
     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java: 722 )
     at org.apache.hadoop.mapred.MapTask.run(MapTask.java: 340 )
     at org.apache.hadoop.mapred.YarnChild$ 2 .run(YarnChild.java: 168 )
     at java.security.AccessController.doPrivileged(Native Method)
     at javax.security.auth.Subject.doAs(Subject.java: 415 )
     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java: 1548 )
     at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java: 163 )
Caused by: java.lang.ClassNotFoundException: Class hbase1.TestHBaseMapred$TblMap not found
     at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java: 1801 )
     at org.apache.hadoop.conf.Configuration.getClass(Configuration.java: 1893 )
     ... 8 more
 
Error: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class hbase1.TestHBaseMapred$TblMap not found
     at org.apache.hadoop.conf.Configuration.getClass(Configuration.java: 1895 )
     at org.apache.hadoop.mapreduce.task.JobContextImpl.getMapperClass(JobContextImpl.java: 196 )
     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java: 722 )
     at org.apache.hadoop.mapred.MapTask.run(MapTask.java: 340 )
     at org.apache.hadoop.mapred.YarnChild$ 2 .run(YarnChild.java: 168 )
     at java.security.AccessController.doPrivileged(Native Method)
     at javax.security.auth.Subject.doAs(Subject.java: 415 )
     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java: 1548 )
     at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java: 163 )
Caused by: java.lang.ClassNotFoundException: Class hbase1.TestHBaseMapred$TblMap not found
     at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java: 1801 )
     at org.apache.hadoop.conf.Configuration.getClass(Configuration.java: 1893 )
     ... 8 more
 
2014 - 08 - 21 14 : 15 : 39 , 485 INFO [org.apache.hadoop.mapreduce.Job] -  map 7 % reduce 0 %
2014 - 08 - 21 14 : 15 : 39 , 490 INFO [org.apache.hadoop.mapreduce.Job] - Task Id : attempt_1408378794442_0014_m_000003_0, Status : FAILED
Error: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class hbase1.TestHBaseMapred$TblMap not found
     at org.apache.hadoop.conf.Configuration.getClass(Configuration.java: 1895 )
     at org.apache.hadoop.mapreduce.task.JobContextImpl.getMapperClass(JobContextImpl.java: 196 )
     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java: 722 )
     at org.apache.hadoop.mapred.MapTask.run(MapTask.java: 340 )
     at org.apache.hadoop.mapred.YarnChild$ 2 .run(YarnChild.java: 168 )
     at java.security.AccessController.doPrivileged(Native Method)
     at javax.security.auth.Subject.doAs(Subject.java: 415 )
     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java: 1548 )
     at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java: 163 )
Caused by: java.lang.ClassNotFoundException: Class hbase1.TestHBaseMapred$TblMap not found
     at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java: 1801 )
     at org.apache.hadoop.conf.Configuration.getClass(Configuration.java: 1893 )
     ... 8 more
2014 - 08 - 21 14 : 15 : 39 , 497 INFO [org.apache.hadoop.mapreduce.Job] - Task Id : attempt_1408378794442_0014_m_000006_0, Status : FAILED
 
Error: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class hbase1.TestHBaseMapred$TblMap not found
     at org.apache.hadoop.conf.Configuration.getClass(Configuration.java: 1895 )
     at org.apache.hadoop.mapreduce.task.JobContextImpl.getMapperClass(JobContextImpl.java: 196 )
     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java: 722 )
     at org.apache.hadoop.mapred.MapTask.run(MapTask.java: 340 )
     at org.apache.hadoop.mapred.YarnChild$ 2 .run(YarnChild.java: 168 )
     at java.security.AccessController.doPrivileged(Native Method)
     at javax.security.auth.Subject.doAs(Subject.java: 415 )
     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java: 1548 )
     at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java: 163 )
Caused by: java.lang.ClassNotFoundException: Class hbase1.TestHBaseMapred$TblMap not found
     at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java: 1801 )
     at org.apache.hadoop.conf.Configuration.getClass(Configuration.java: 1893 )
     ... 8 more
 
2014 - 08 - 21 14 : 15 : 39 , 505 INFO [org.apache.hadoop.mapreduce.Job] - Task Id : attempt_1408378794442_0014_m_000001_0, Status : FAILED
Error: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class hbase1.TestHBaseMapred$TblMap not found
     at org.apache.hadoop.conf.Configuration.getClass(Configuration.java: 1895 )
     at org.apache.hadoop.mapreduce.task.JobContextImpl.getMapperClass(JobContextImpl.java: 196 )
     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java: 722 )
     at org.apache.hadoop.mapred.MapTask.run(MapTask.java: 340 )
     at org.apache.hadoop.mapred.YarnChild$ 2 .run(YarnChild.java: 168 )
     at java.security.AccessController.doPrivileged(Native Method)
     at javax.security.auth.Subject.doAs(Subject.java: 415 )
     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java: 1548 )
     at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java: 163 )
Caused by: java.lang.ClassNotFoundException: Class hbase1.TestHBaseMapred$TblMap not found
     at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java: 1801 )
     at org.apache.hadoop.conf.Configuration.getClass(Configuration.java: 1893 )
     ... 8 more
 
2014 - 08 - 21 14 : 15 : 40 , 663 INFO [org.apache.hadoop.mapreduce.Job] -  map 0 % reduce 0 %
2014 - 08 - 21 14 : 15 : 40 , 736 INFO [org.apache.hadoop.mapreduce.Job] - Task Id : attempt_1408378794442_0014_m_000004_0, Status : FAILED
Error: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class hbase1.TestHBaseMapred$TblMap not found
     at org.apache.hadoop.conf.Configuration.getClass(Configuration.java: 1895 )
     at org.apache.hadoop.mapreduce.task.JobContextImpl.getMapperClass(JobContextImpl.java: 196 )
     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java: 722 )
     at org.apache.hadoop.mapred.MapTask.run(MapTask.java: 340 )
     at org.apache.hadoop.mapred.YarnChild$ 2 .run(YarnChild.java: 168 )
     at java.security.AccessController.doPrivileged(Native Method)
     at javax.security.auth.Subject.doAs(Subject.java: 415 )
     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java: 1548 )
     at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java: 163 )
Caused by: java.lang.ClassNotFoundException: Class hbase1.TestHBaseMapred$TblMap not found
     at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java: 1801 )
     at org.apache.hadoop.conf.Configuration.getClass(Configuration.java: 1893 )
     ... 8 more
 
(Every remaining map task attempt failed with the identical ClassNotFoundException stack trace; the duplicate traces are omitted here. Each failed attempt's container was then torn down with:)

Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2014-08-21 14:16:23,352 INFO [org.apache.hadoop.mapreduce.Job] - map 100% reduce 100%
2014-08-21 14:16:24,378 INFO [org.apache.hadoop.mapreduce.Job] - Job job_1408378794442_0014 failed with state FAILED due to: Task failed task_1408378794442_0014_m_000013
Job failed as tasks failed. failedMaps:1 failedReduces:0

2014-08-21 14:16:24,584 INFO [org.apache.hadoop.mapreduce.Job] - Counters: 13
     Job Counters
         Failed map tasks=31
         Killed map tasks=13
         Launched map tasks=44
         Other local map tasks=29
         Data-local map tasks=15
         Total time spent by all maps in occupied slots (ms)=1665094
         Total time spent by all reduces in occupied slots (ms)=0
         Total time spent by all map tasks (ms)=832547
         Total vcore-seconds taken by all map tasks=832547
         Total megabyte-seconds taken by all map tasks=1278792192
     Map-Reduce Framework
         CPU time spent (ms)=0
         Physical memory (bytes) snapshot=0
         Virtual memory (bytes) snapshot=0
*** measure elapsed time, stopped at 2014-08-21 14:16:24:588
time elapsed: 0 days 0 hours 1 minutes 29 seconds 879 milliseconds [89.879 seconds]

 I tried setting job.setJarByClass(Test1.class); it had no effect. This problem troubled me for two days. Was the only option to upload the jar to the cluster and run it there? That would be far too inconvenient. After a long search I finally had the luck to find the article "MapReduce提交作业常见问题" on 小鱼儿's blog, which pointed to a solution. I adopted the third approach described there: before the job starts running, the program packages itself into a jar, by invoking a pre-written shell script. Both files are shown below; they live in the dist directory, at the same level as src.
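The "package yourself before submitting" idea can be sketched with nothing but the JDK's java.util.jar API. This is a minimal illustration, not the blog's actual makejar logic (which shells out to a script instead); the classes directory and output path below are hypothetical placeholders:

```java
// Sketch: build a jar from a compiled-classes directory, so the client
// has a real job jar to ship to the cluster. Paths are placeholders.
import java.io.*;
import java.util.jar.JarEntry;
import java.util.jar.JarOutputStream;

public class SelfJar {
    // Recursively add every .class file under `dir` to the jar, using
    // paths relative to `root` as entry names so package dirs are kept.
    static void addDir(JarOutputStream jar, File root, File dir) throws IOException {
        File[] kids = dir.listFiles();
        if (kids == null) return;
        for (File f : kids) {
            if (f.isDirectory()) {
                addDir(jar, root, f);
            } else if (f.getName().endsWith(".class")) {
                String entry = root.toURI().relativize(f.toURI()).getPath();
                jar.putNextEntry(new JarEntry(entry));
                try (InputStream in = new FileInputStream(f)) {
                    byte[] buf = new byte[8192];
                    int n;
                    while ((n = in.read(buf)) > 0) jar.write(buf, 0, n);
                }
                jar.closeEntry();
            }
        }
    }

    // Package everything under classesDir into jarPath and return jarPath.
    public static String pack(String classesDir, String jarPath) throws IOException {
        try (JarOutputStream jar = new JarOutputStream(new FileOutputStream(jarPath))) {
            File root = new File(classesDir);
            addDir(jar, root, root);
        }
        return jarPath;
    }
}
```

The driver would then hand the resulting path to the job before submission, e.g. `job.setJar(SelfJar.pack("bin", "/tmp/hbase1-job.jar"));` (both arguments hypothetical), so YARN distributes a real jar to every container and the mapper class can be resolved.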

[gzg@centos dist]$ cat manifest.mf
Manifest-Version: 1.0
Class-Path: . $HADOOP_HOME/lib
  hbase1_lib/hadoop-mapreduce-client-app-2.3.0-cdh5.0.2.jar
  hbase1_lib/hadoop-auth-2.3.0-cdh5.0.2.jar
  hbase1_lib/hadoop-annotations-2.3.0-cdh5.0.2.jar
  hbase1_lib/hadoop-common-2.3.0-cdh5.0.2.jar
  hbase1_lib/commons-logging-1.1.3.jar
  hbase1_lib/commons-compress-1.4.1.jar
  hbase1_lib/guava-11.0.2.jar
  hbase1_lib/commons-collections-3.2.1.jar
  hbase1_lib/slf4j-api-1.7.5.jar
  hbase1_lib/slf4j-log4j12-1.7.5.jar
  hbase1_lib/zookeeper-3.4.5-cdh5.0.2.jar
  hbase1_lib/commons-configuration-1.6.jar
  hbase1_lib/commons-math3-3.1.1.jar
  hbase1_lib/activation-1.1.jar
  hbase1_lib/asm-3.2.jar
  hbase1_lib/avro-1.7.5-cdh5.0.2.jar
  hbase1_lib/commons-beanutils-1.7.0.jar
  hbase1_lib/commons-beanutils-core-1.8.0.jar
  hbase1_lib/commons-cli-1.2.jar
  hbase1_lib/commons-codec-1.4.jar
  hbase1_lib/commons-digester-1.8.jar
  hbase1_lib/commons-el-1.0.jar
  hbase1_lib/commons-httpclient-3.1.jar
  hbase1_lib/commons-io-2.4.jar
  hbase1_lib/commons-lang-2.6.jar
  hbase1_lib/commons-net-3.1.jar
  hbase1_lib/httpclient-4.2.5.jar
  hbase1_lib/httpcore-4.2.5.jar
  hbase1_lib/jackson-core-asl-1.8.8.jar
  hbase1_lib/jackson-jaxrs-1.8.8.jar
  hbase1_lib/jackson-mapper-asl-1.8.8.jar
  hbase1_lib/jackson-xc-1.8.8.jar
  hbase1_lib/jasper-compiler-5.5.23.jar
  hbase1_lib/jasper-runtime-5.5.23.jar
  hbase1_lib/java-xmlbuilder-0.4.jar
  hbase1_lib/jaxb-api-2.2.2.jar
  hbase1_lib/jaxb-impl-2.2.3-1.jar
  hbase1_lib/jersey-core-1.9.jar
  hbase1_lib/jersey-json-1.9.jar
  hbase1_lib/jersey-server-1.9.jar
  hbase1_lib/jets3t-0.9.0.jar
  hbase1_lib/jettison-1.1.jar
  hbase1_lib/jetty-6.1.26.jar
  hbase1_lib/jetty-util-6.1.26.jar
  hbase1_lib/jsch-0.1.42.jar
  hbase1_lib/jsp-api-2.1.jar
  hbase1_lib/jsr305-1.3.9.jar
  hbase1_lib/junit-4.8.2.jar
  hbase1_lib/log4j-1.2.17.jar
  hbase1_lib/mockito-all-1.8.5.jar
  hbase1_lib/paranamer-2.3.jar
  hbase1_lib/protobuf-java-2.5.0.jar
  hbase1_lib/servlet-api-2.5.jar
  hbase1_lib/snappy-java-1.0.4.1.jar
  hbase1_lib/stax-api-1.0-2.jar
  hbase1_lib/xmlenc-0.52.jar
  hbase1_lib/xz-1.0.jar
  hbase1_lib/netty-3.6.2.Final.jar
  hbase1_lib/hadoop-hdfs-2.3.0-cdh5.0.2.jar
  hbase1_lib/hadoop-mapreduce-client-app-2.3.0-cdh5.0.2_2.jar
  hbase1_lib/hadoop-mapreduce-client-core-2.3.0-cdh5.0.2.jar
  hbase1_lib/hadoop-mapreduce-client-jobclient-2.3.0-cdh5.0.2.jar
  hbase1_lib/hadoop-mapreduce-client-common-2.3.0-cdh5.0.2.jar
  hbase1_lib/hadoop-mapreduce-client-shuffle-2.3.0-cdh5.0.2.jar
  hbase1_lib/hadoop-yarn-api-2.3.0-cdh5.0.2.jar
  hbase1_lib/hadoop-yarn-client-2.3.0-cdh5.0.2.jar
  hbase1_lib/hadoop-yarn-common-2.3.0-cdh5.0.2.jar
  hbase1_lib/hadoop-yarn-applications-distributedshell-2.3.0-cdh5.0.2.jar
  hbase1_lib/hbase-client-0.96.1.1-cdh5.0.2.jar
  hbase1_lib/hbase-common-0.96.1.1-cdh5.0.2.jar
  hbase1_lib/hbase-hadoop2-compat-0.96.1.1-cdh5.0.2.jar
  hbase1_lib/hbase-hadoop-compat-0.96.1.1-cdh5.0.2.jar
  hbase1_lib/hbase-it-0.96.1.1-cdh5.0.2.jar
  hbase1_lib/hbase-prefix-tree-0.96.1.1-cdh5.0.2.jar
  hbase1_lib/hbase-protocol-0.96.1.1-cdh5.0.2.jar
  hbase1_lib/hbase-shell-0.96.1.1-cdh5.0.2.jar
  hbase1_lib/hbase-testing-util-0.96.1.1-cdh5.0.2.jar
  hbase1_lib/hbase-thrift-0.96.1.1-cdh5.0.2.jar
  hbase1_lib/protobuf-java-2.5.0_2.jar
  hbase1_lib/htrace-core-2.01.jar
  hbase1_lib/hbase-server-0.96.1.1-cdh5.0.2.jar
Main-Class: hbase1.Test1

 

[gzg@centos dist]$ cat makejar
cd "$(dirname $0)"
rm -f hbase1.jar
jar cvfm hbase1.jar manifest.mf -C "../bin/" .
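The manifest above wraps its Class-Path across continuation lines because the JAR specification limits manifest lines to 72 bytes; each continuation line begins with a single space that the reader strips before concatenating the pieces. A minimal sketch of generating such a wrapped manifest (the jar names here are illustrative, not the full list above):

```shell
# Sketch: build a spec-compliant wrapped manifest from a jar list.
# fold -b -w 71 splits the long Class-Path value at 71 bytes so that the
# required leading space on continuation lines keeps each line <= 72 bytes.
tmpd=$(mktemp -d)
jars='hbase1_lib/hadoop-common-2.3.0-cdh5.0.2.jar hbase1_lib/hbase-client-0.96.1.1-cdh5.0.2.jar hbase1_lib/zookeeper-3.4.5-cdh5.0.2.jar'
cp_line='Class-Path: . $HADOOP_HOME/lib '"$jars"
wrapped=$(printf '%s\n' "$cp_line" | fold -b -w 71 | sed '2,$s/^/ /')
{
  printf 'Manifest-Version: 1.0\n'
  printf '%s\n' "$wrapped"
  printf 'Main-Class: hbase1.Test1\n'
} > "$tmpd/manifest.mf"
cat "$tmpd/manifest.mf"
```

Note that fold may split a jar name mid-word; that is harmless, because a manifest reader strips only the single leading space of a continuation line and concatenates the rest byte-for-byte, which is exactly why the manifest shown above breaks names across lines.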

 At the same time, I changed the signature of TestHBaseMapred.run to take the jar path as a parameter; the new run method is as follows:

public void run(String appJarFile, boolean toFile) throws ClassNotFoundException, IOException, InterruptedException {
    //org.apache.hadoop.conf.Configuration conf = yh.utils.hbase.HBaseUtil.getConfiguration("zk1,zk2,zk3");
    org.apache.hadoop.conf.Configuration conf = new org.apache.hadoop.conf.Configuration();
    conf.set("hbase.zookeeper.quorum", "zk1,zk2,zk3");
    conf.set("hadoop.job.user", "hadoop");
    conf.set("mapreduce.framework.name", "yarn");

    conf.set("hbase.nameserver.address", "nn1,nn2");

    org.apache.hadoop.mapreduce.Job job = null;
    String tmpTable = "_mrtest1";
    yh.utils.ElapsedTimeMeasure etm = yh.utils.ElapsedTimeMeasure.createAndStart();
    try {
        job = Job.getInstance(conf);
        job.setJobName("count meter");
        //job.setJarByClass(Test1.class);
        job.setJar(appJarFile);

        if (toFile == false) {
            HBaseUtil.dropTable(conf, tmpTable);
            HBaseUtil.createTable(conf, tmpTable, "c");
            job.setOutputFormatClass(org.apache.hadoop.hbase.mapreduce.TableOutputFormat.class);
            job.getConfiguration().set(org.apache.hadoop.hbase.mapreduce.TableOutputFormat.OUTPUT_TABLE, tmpTable);
            //
            job.setReducerClass(TblReduce.class);
        } else {
            job.setOutputFormatClass(org.apache.hadoop.mapreduce.lib.output.TextOutputFormat.class);
            Path op = new Path("result");
            org.apache.hadoop.mapreduce.lib.output.TextOutputFormat.setOutputPath(job, op);
            //
            job.setReducerClass(TxtReduce.class);
        }

        Scan scan = new Scan();
        scan.addFamily(Bytes.toBytes("c"));
        org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil.initTableMapperJob("dzdl_t1", scan, TblMap.class, Text.class, LongWritable.class, job);
        job.waitForCompletion(true);
    } catch (IOException e) {
        // TODO Auto-generated catch block
        e.printStackTrace();
    }
    System.out.println("time elapsed :" + etm.stop().toString());
}

    4, When using the Reducer that writes to a file, I ran into permission problems again. My solution was to grant read/write permission to all users; note that /tmp must be opened up as well, since it is used at run time:

[hadoop@zk1 hbase]$ hdfs dfs -chmod 777 /user
[hadoop@zk1 hbase]$ hdfs dfs -chmod 777 /tmp

    One last note: set the level in log4j.properties to INFO rather than WARN; otherwise a lot of information is suppressed, which makes it much harder to find the cause of errors.

# Configure logging for testing: optionally with log file
#log4j.rootLogger=WARN, stdout
log4j.rootLogger=INFO, stdout
# log4j.rootLogger=WARN, stdout, logfile
 
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d %p [%c] - %m%n
 
log4j.appender.logfile=org.apache.log4j.FileAppender
log4j.appender.logfile.File=target/spring.log
log4j.appender.logfile.layout=org.apache.log4j.PatternLayout
log4j.appender.logfile.layout.ConversionPattern=%d %p [%c] - %m%n

 

    四、At last, it all finally works. Let's look at the map/reduce run:

*** measure elapsed time,started at 2014-08-21 14:48:35:880
2014-08-21 14:48:36,601 WARN [org.apache.hadoop.util.NativeCodeLoader] - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2014-08-21 14:48:36,682 INFO [org.apache.hadoop.conf.Configuration.deprecation] - hadoop.native.lib is deprecated. Instead, use io.native.lib.available
2014-08-21 14:48:42,227 INFO [org.apache.zookeeper.ZooKeeper] - Client environment:zookeeper.version=3.4.5-cdh5.0.2--1, built on 06/09/2014 16:09 GMT
2014-08-21 14:48:42,227 INFO [org.apache.zookeeper.ZooKeeper] - Client environment:host.name=211.137.170.246
2014-08-21 14:48:42,227 INFO [org.apache.zookeeper.ZooKeeper] - Client environment:java.version=1.7.0_60
2014-08-21 14:48:42,228 INFO [org.apache.zookeeper.ZooKeeper] - Client environment:java.vendor=Oracle Corporation
2014-08-21 14:48:42,228 INFO [org.apache.zookeeper.ZooKeeper] - Client environment:java.home=/usr/java/jdk1.7.0_60/jre
2014-08-21 14:48:42,228 INFO [org.apache.zookeeper.ZooKeeper] - Client environment:java.class.path=/home/gzg/javadev/lib/hadoop/mapreduce/hadoop-mapreduce-client-app-2.3.0-cdh5.0.2.jar:/home/gzg/javadev/lib/hadoop/common/lib/hadoop-auth-2.3.0-cdh5.0.2.jar:/home/gzg/javadev/lib/hadoop/common/lib/hadoop-annotations-2.3.0-cdh5.0.2.jar:/home/gzg/javadev/lib/hadoop/common/hadoop-common-2.3.0-cdh5.0.2.jar:/home/gzg/javadev/lib/hadoop/common/lib/commons-logging-1.1.3.jar:/home/gzg/javadev/lib/hadoop/common/lib/commons-compress-1.4.1.jar:/home/gzg/javadev/lib/hadoop/common/lib/guava-11.0.2.jar:/home/gzg/javadev/lib/hadoop/common/lib/commons-collections-3.2.1.jar:/home/gzg/javadev/lib/hadoop/common/lib/slf4j-api-1.7.5.jar:/home/gzg/javadev/lib/hadoop/common/lib/slf4j-log4j12-1.7.5.jar:/home/gzg/javadev/lib/hadoop/common/lib/zookeeper-3.4.5-cdh5.0.2.jar:/home/gzg/javadev/lib/hadoop/common/lib/commons-configuration-1.6.jar:/home/gzg/javadev/lib/hadoop/common/lib/commons-math3-3.1.1.jar:/home/gzg/javadev/lib/hadoop/common/lib/activation-1.1.jar:/home/gzg/javadev/lib/hadoop/common/lib/asm-3.2.jar:/home/gzg/javadev/lib/hadoop/common/lib/avro-1.7.5-cdh5.0.2.jar:/home/gzg/javadev/lib/hadoop/common/lib/commons-beanutils-1.7.0.jar:/home/gzg/javadev/lib/hadoop/common/lib/commons-beanutils-core-1.8.0.jar:/home/gzg/javadev/lib/hadoop/common/lib/commons-cli-1.2.jar:/home/gzg/javadev/lib/hadoop/common/lib/commons-codec-1.4.jar:/home/gzg/javadev/lib/hadoop/common/lib/commons-digester-1.8.jar:/home/gzg/javadev/lib/hadoop/common/lib/commons-el-1.0.jar:/home/gzg/javadev/lib/hadoop/common/lib/commons-httpclient-3.1.jar:/home/gzg/javadev/lib/hadoop/common/lib/commons-io-2.4.jar:/home/gzg/javadev/lib/hadoop/common/lib/commons-lang-2.6.jar:/home/gzg/javadev/lib/hadoop/common/lib/commons-net-3.1.jar:/home/gzg/javadev/lib/hadoop/common/lib/httpclient-4.2.5.jar:/home/gzg/javadev/lib/hadoop/common/lib/httpcore-4.2.5.jar:/home/gzg/javadev/lib/hadoop/common/lib/jackson-core-asl-1.8.8.jar:/home/gzg/javadev/lib/hadoop/common/lib/jackson-jaxrs-1.8.8.jar:/home/gzg/javadev/lib/hadoop/common/lib/jackson-mapper-asl-1.8.8.jar:/home/gzg/javadev/lib/hadoop/common/lib/jackson-xc-1.8.8.jar:/home/gzg/javadev/lib/hadoop/common/lib/jasper-compiler-5.5.23.jar:/home/gzg/javadev/lib/hadoop/common/lib/jasper-runtime-5.5.23.jar:/home/gzg/javadev/lib/hadoop/common/lib/java-xmlbuilder-0.4.jar:/home/gzg/javadev/lib/hadoop/common/lib/jaxb-api-2.2.2.jar:/home/gzg/javadev/lib/hadoop/common/lib/jaxb-impl-2.2.3-1.jar:/home/gzg/javadev/lib/hadoop/common/lib/jersey-core-1.9.jar:/home/gzg/javadev/lib/hadoop/common/lib/jersey-json-1.9.jar:/home/gzg/javadev/lib/hadoop/common/lib/jersey-server-1.9.jar:/home/gzg/javadev/lib/hadoop/common/lib/jets3t-0.9.0.jar:/home/gzg/javadev/lib/hadoop/common/lib/jettison-1.1.jar:/home/gzg/javadev/lib/hadoop/common/lib/jetty-6.1.26.jar:/home/gzg/javadev/lib/hadoop/common/lib/jetty-util-6.1.26.jar:/home/gzg/javadev/lib/hadoop/common/lib/jsch-0.1.42.jar:/home/gzg/javadev/lib/hadoop/common/lib/jsp-api-2.1.jar:/home/gzg/javadev/lib/hadoop/common/lib/jsr305-1.3.9.jar:/home/gzg/javadev/lib/hadoop/common/lib/junit-4.8.2.jar:/home/gzg/javadev/lib/hadoop/common/lib/log4j-1.2.17.jar:/home/gzg/javadev/lib/hadoop/common/lib/mockito-all-1.8.5.jar:/home/gzg/javadev/lib/hadoop/common/lib/paranamer-2.3.jar:/home/gzg/javadev/lib/hadoop/common/lib/protobuf-java-2.5.0.jar:/home/gzg/javadev/lib/hadoop/common/lib/servlet-api-2.5.jar:/home/gzg/javadev/lib/hadoop/common/lib/snappy-java-1.0.4.1.jar:/home/gzg/javadev/lib/hadoop/common/lib/stax-api-1.0-2.jar:/home/gzg/javadev/lib/hadoop/common/lib/xmlenc-0.52.jar:/home/gzg/javadev/lib/hadoop/common/lib/xz-1.0.jar:/home/gzg/javadev/lib/hadoop/mapreduce/lib/netty-3.6.2.Final.jar:/home/gzg/javadev/lib/hadoop/hdfs/hadoop-hdfs-2.3.0-cdh5.0.2.jar:/home/gzg/javadev/lib/hadoop/mapreduce2/hadoop-mapreduce-client-app-2.3.0-cdh5.0.2.jar:/home/gzg/javadev/lib/hadoop/mapreduce2/hadoop-mapreduce-client-core-2.3.0-cdh5.0.2.jar:/home/gzg/javadev/lib/hadoop/mapreduce2/hadoop-mapreduce-client-jobclient-2.3.0-cdh5.0.2.jar:/home/gzg/javadev/lib/hadoop/mapreduce2/hadoop-mapreduce-client-common-2.3.0-cdh5.0.2.jar:/home/gzg/javadev/lib/hadoop/mapreduce2/hadoop-mapreduce-client-shuffle-2.3.0-cdh5.0.2.jar:/home/gzg/javadev/lib/hadoop/yarn/hadoop-yarn-api-2.3.0-cdh5.0.2.jar:/home/gzg/javadev/lib/hadoop/yarn/hadoop-yarn-client-2.3.0-cdh5.0.2.jar:/home/gzg/javadev/lib/hadoop/yarn/hadoop-yarn-common-2.3.0-cdh5.0.2.jar:/home/gzg/javadev/lib/hadoop/yarn/hadoop-yarn-applications-distributedshell-2.3.0-cdh5.0.2.jar:/home/gzg/javadev/lib/hbase/hbase-client-0.96.1.1-cdh5.0.2.jar:/home/gzg/javadev/lib/hbase/hbase-common-0.96.1.1-cdh5.0.2.jar:/home/gzg/javadev/lib/hbase/hbase-hadoop2-compat-0.96.1.1-cdh5.0.2.jar:/home/gzg/javadev/lib/hbase/hbase-hadoop-compat-0.96.1.1-cdh5.0.2.jar:/home/gzg/javadev/lib/hbase/hbase-it-0.96.1.1-cdh5.0.2.jar:/home/gzg/javadev/lib/hbase/hbase-prefix-tree-0.96.1.1-cdh5.0.2.jar:/home/gzg/javadev/lib/hbase/hbase-protocol-0.96.1.1-cdh5.0.2.jar:/home/gzg/javadev/lib/hbase/hbase-testing-util-0.96.1.1-cdh5.0.2.jar:/home/gzg/javadev/lib/hbase/hbase-thrift-0.96.1.1-cdh5.0.2.jar:/home/gzg/javadev/lib/hbase/protobuf-java-2.5.0.jar:/home/gzg/javadev/lib/hbase/hbase-server-0.96.1.1-cdh5.0.2.jar:/home/gzg/javadev/lib/hbase/hbase-shell-0.96.1.1-cdh5.0.2.jar:/home/gzg/javadev/lib/hbase/htrace-core-2.01.jar:/home/gzg/javadev/hbase1/bin:/usr/java/jdk1.7.0_60/jre/lib:/usr/lib64
2014-08-21 14:48:42,228 INFO [org.apache.zookeeper.ZooKeeper] - Client environment:java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
2014-08-21 14:48:42,228 INFO [org.apache.zookeeper.ZooKeeper] - Client environment:java.io.tmpdir=/tmp
2014-08-21 14:48:42,228 INFO [org.apache.zookeeper.ZooKeeper] - Client environment:java.compiler=<NA>
2014-08-21 14:48:42,228 INFO [org.apache.zookeeper.ZooKeeper] - Client environment:os.name=Linux
2014-08-21 14:48:42,228 INFO [org.apache.zookeeper.ZooKeeper] - Client environment:os.arch=amd64
2014-08-21 14:48:42,228 INFO [org.apache.zookeeper.ZooKeeper] - Client environment:os.version=2.6.32-431.el6.x86_64
2014-08-21 14:48:42,228 INFO [org.apache.zookeeper.ZooKeeper] - Client environment:user.name=gzg
2014-08-21 14:48:42,228 INFO [org.apache.zookeeper.ZooKeeper] - Client environment:user.home=/home/gzg
2014-08-21 14:48:42,228 INFO [org.apache.zookeeper.ZooKeeper] - Client environment:user.dir=/home/gzg/javadev/hbase1
2014-08-21 14:48:42,230 INFO [org.apache.zookeeper.ZooKeeper] - Initiating client connection, connectString=zk2:2181,zk1:2181,zk3:2181 sessionTimeout=180000 watcher=hconnection-0x7f120aa6, quorum=zk2:2181,zk1:2181,zk3:2181, baseZNode=/hbase
2014-08-21 14:48:42,254 INFO [org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper] - Process identifier=hconnection-0x7f120aa6 connecting to ZooKeeper ensemble=zk2:2181,zk1:2181,zk3:2181
2014-08-21 14:48:42,261 INFO [org.apache.zookeeper.ClientCnxn] - Opening socket connection to server zk2/192.168.5.122:2181. Will not attempt to authenticate using SASL (unknown error)
2014-08-21 14:48:42,276 INFO [org.apache.zookeeper.ClientCnxn] - Socket connection established to zk2/192.168.5.122:2181, initiating session
2014-08-21 14:48:42,335 INFO [org.apache.zookeeper.ClientCnxn] - Session establishment complete on server zk2/192.168.5.122:2181, sessionid = 0x1647ed72def6009d, negotiated timeout = 40000
2014-08-21 14:48:43,407 INFO [org.apache.zookeeper.ZooKeeper] - Initiating client connection, connectString=zk2:2181,zk1:2181,zk3:2181 sessionTimeout=180000 watcher=catalogtracker-on-hconnection-0x7f120aa6, quorum=zk2:2181,zk1:2181,zk3:2181, baseZNode=/hbase
2014-08-21 14:48:43,408 INFO [org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper] - Process identifier=catalogtracker-on-hconnection-0x7f120aa6 connecting to ZooKeeper ensemble=zk2:2181,zk1:2181,zk3:2181
2014-08-21 14:48:43,409 INFO [org.apache.zookeeper.ClientCnxn] - Opening socket connection to server zk1/192.168.5.121:2181. Will not attempt to authenticate using SASL (unknown error)
2014-08-21 14:48:43,412 INFO [org.apache.zookeeper.ClientCnxn] - Socket connection established to zk1/192.168.5.121:2181, initiating session
2014-08-21 14:48:43,421 INFO [org.apache.zookeeper.ClientCnxn] - Session establishment complete on server zk1/192.168.5.121:2181, sessionid = 0x15478ef8bf2f016f, negotiated timeout = 40000
2014-08-21 14:48:53,961 INFO [org.apache.zookeeper.ZooKeeper] - Session: 0x15478ef8bf2f016f closed
2014-08-21 14:48:53,961 INFO [org.apache.zookeeper.ClientCnxn] - EventThread shut down
2014-08-21 14:48:53,985 INFO [org.apache.hadoop.hbase.client.HBaseAdmin] - Started disable of _mrtest1
2014-08-21 14:48:54,051 INFO [org.apache.zookeeper.ZooKeeper] - Initiating client connection, connectString=zk2:2181,zk1:2181,zk3:2181 sessionTimeout=180000 watcher=catalogtracker-on-hconnection-0x7f120aa6, quorum=zk2:2181,zk1:2181,zk3:2181, baseZNode=/hbase
2014-08-21 14:48:54,053 INFO [org.apache.zookeeper.ClientCnxn] - Opening socket connection to server zk3/192.168.5.123:2181. Will not attempt to authenticate using SASL (unknown error)
2014-08-21 14:48:54,057 INFO [org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper] - Process identifier=catalogtracker-on-hconnection-0x7f120aa6 connecting to ZooKeeper ensemble=zk2:2181,zk1:2181,zk3:2181
2014-08-21 14:48:54,067 INFO [org.apache.zookeeper.ClientCnxn] - Socket connection established to zk3/192.168.5.123:2181, initiating session
2014-08-21 14:48:54,072 INFO [org.apache.zookeeper.ClientCnxn] - Session establishment complete on server zk3/192.168.5.123:2181, sessionid = 0x174786daf57f01d0, negotiated timeout = 40000
2014-08-21 14:48:54,119 INFO [org.apache.zookeeper.ZooKeeper] - Session: 0x174786daf57f01d0 closed
2014-08-21 14:48:54,119 INFO [org.apache.zookeeper.ClientCnxn] - EventThread shut down
2014-08-21 14:48:54,235 INFO [org.apache.zookeeper.ZooKeeper] - Initiating client connection, connectString=zk2:2181,zk1:2181,zk3:2181 sessionTimeout=180000 watcher=catalogtracker-on-hconnection-0x7f120aa6, quorum=zk2:2181,zk1:2181,zk3:2181, baseZNode=/hbase
2014-08-21 14:48:54,236 INFO [org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper] - Process identifier=catalogtracker-on-hconnection-0x7f120aa6 connecting to ZooKeeper ensemble=zk2:2181,zk1:2181,zk3:2181
2014-08-21 14:48:54,237 INFO [org.apache.zookeeper.ClientCnxn] - Opening socket connection to server zk2/192.168.5.122:2181. Will not attempt to authenticate using SASL (unknown error)
2014-08-21 14:48:54,243 INFO [org.apache.zookeeper.ClientCnxn] - Socket connection established to zk2/192.168.5.122:2181, initiating session
2014-08-21 14:48:54,281 INFO [org.apache.zookeeper.ClientCnxn] - Session establishment complete on server zk2/192.168.5.122:2181, sessionid = 0x1647ed72def6009e, negotiated timeout = 40000
2014-08-21 14:48:54,325 INFO [org.apache.zookeeper.ZooKeeper] - Session: 0x1647ed72def6009e closed
2014-08-21 14:48:54,326 INFO [org.apache.zookeeper.ClientCnxn] - EventThread shut down
2014-08-21 14:48:54,558 INFO [org.apache.zookeeper.ZooKeeper] - Initiating client connection, connectString=zk2:2181,zk1:2181,zk3:2181 sessionTimeout=180000 watcher=catalogtracker-on-hconnection-0x7f120aa6, quorum=zk2:2181,zk1:2181,zk3:2181, baseZNode=/hbase
2014-08-21 14:48:54,560 INFO [org.apache.zookeeper.ClientCnxn] - Opening socket connection to server zk3/192.168.5.123:2181. Will not attempt to authenticate using SASL (unknown error)
2014-08-21 14:48:54,561 INFO [org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper] - Process identifier=catalogtracker-on-hconnection-0x7f120aa6 connecting to ZooKeeper ensemble=zk2:2181,zk1:2181,zk3:2181
2014-08-21 14:48:54,562 INFO [org.apache.zookeeper.ClientCnxn] - Socket connection established to zk3/192.168.5.123:2181, initiating session
2014-08-21 14:48:54,568 INFO [org.apache.zookeeper.ClientCnxn] - Session establishment complete on server zk3/192.168.5.123:2181, sessionid = 0x174786daf57f01d1, negotiated timeout = 40000
2014-08-21 14:48:54,584 INFO [org.apache.zookeeper.ZooKeeper] - Session: 0x174786daf57f01d1 closed
2014-08-21 14:48:54,584 INFO [org.apache.zookeeper.ClientCnxn] - EventThread shut down
2014-08-21 14:48:54,894 INFO [org.apache.zookeeper.ZooKeeper] - Initiating client connection, connectString=zk2:2181,zk1:2181,zk3:2181 sessionTimeout=180000 watcher=catalogtracker-on-hconnection-0x7f120aa6, quorum=zk2:2181,zk1:2181,zk3:2181, baseZNode=/hbase
2014-08-21 14:48:54,899 INFO [org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper] - Process identifier=catalogtracker-on-hconnection-0x7f120aa6 connecting to ZooKeeper ensemble=zk2:2181,zk1:2181,zk3:2181
2014-08-21 14:48:54,900 INFO [org.apache.zookeeper.ClientCnxn] - Opening socket connection to server zk3/192.168.5.123:2181. Will not attempt to authenticate using SASL (unknown error)
2014-08-21 14:48:54,901 INFO [org.apache.zookeeper.ClientCnxn] - Socket connection established to zk3/192.168.5.123:2181, initiating session
2014-08-21 14:48:54,906 INFO [org.apache.zookeeper.ClientCnxn] - Session establishment complete on server zk3/192.168.5.123:2181, sessionid = 0x174786daf57f01d2, negotiated timeout = 40000
2014-08-21 14:48:54,922 INFO [org.apache.zookeeper.ZooKeeper] - Session: 0x174786daf57f01d2 closed
2014-08-21 14:48:54,922 INFO [org.apache.zookeeper.ClientCnxn] - EventThread shut down
2014-08-21 14:48:55,430 INFO [org.apache.zookeeper.ZooKeeper] - Initiating client connection, connectString=zk2:2181,zk1:2181,zk3:2181 sessionTimeout=180000 watcher=catalogtracker-on-hconnection-0x7f120aa6, quorum=zk2:2181,zk1:2181,zk3:2181, baseZNode=/hbase
2014-08-21 14:48:55,431 INFO [org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper] - Process identifier=catalogtracker-on-hconnection-0x7f120aa6 connecting to ZooKeeper ensemble=zk2:2181,zk1:2181,zk3:2181
2014-08-21 14:48:55,432 INFO [org.apache.zookeeper.ClientCnxn] - Opening socket connection to server zk1/192.168.5.121:2181. Will not attempt to authenticate using SASL (unknown error)
2014-08-21 14:48:55,433 INFO [org.apache.zookeeper.ClientCnxn] - Socket connection established to zk1/192.168.5.121:2181, initiating session
2014-08-21 14:48:55,439 INFO [org.apache.zookeeper.ClientCnxn] - Session establishment complete on server zk1/192.168.5.121:2181, sessionid = 0x15478ef8bf2f0170, negotiated timeout = 40000
2014-08-21 14:48:55,456 INFO [org.apache.zookeeper.ZooKeeper] - Session: 0x15478ef8bf2f0170 closed
2014-08-21 14:48:55,457 INFO [org.apache.zookeeper.ClientCnxn] - EventThread shut down
2014-08-21 14:48:55,459 INFO [org.apache.hadoop.hbase.client.HBaseAdmin] - Disabled _mrtest1
2014-08-21 14:48:55,638 INFO [org.apache.hadoop.hbase.client.HBaseAdmin] - Deleted _mrtest1
2014-08-21 14:48:55,638 INFO [org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation] - Closing master protocol: MasterService
2014-08-21 14:48:55,638 INFO [org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation] - Closing zookeeper sessionid=0x1647ed72def6009d
2014-08-21 14:48:55,658 INFO [org.apache.zookeeper.ZooKeeper] - Session: 0x1647ed72def6009d closed
2014-08-21 14:48:55,659 INFO [org.apache.zookeeper.ClientCnxn] - EventThread shut down
2014-08-21 14:48:55,824 INFO [org.apache.zookeeper.ZooKeeper] - Initiating client connection, connectString=zk2:2181,zk1:2181,zk3:2181 sessionTimeout=180000 watcher=hconnection-0x30dd1e89, quorum=zk2:2181,zk1:2181,zk3:2181, baseZNode=/hbase
2014-08-21 14:48:55,826 INFO [org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper] - Process identifier=hconnection-0x30dd1e89 connecting to ZooKeeper ensemble=zk2:2181,zk1:2181,zk3:2181
2014-08-21 14:48:55,826 INFO [org.apache.zookeeper.ClientCnxn] - Opening socket connection to server zk2/192.168.5.122:2181. Will not attempt to authenticate using SASL (unknown error)
2014-08-21 14:48:55,836 INFO [org.apache.zookeeper.ClientCnxn] - Socket connection established to zk2/192.168.5.122:2181, initiating session
2014-08-21 14:48:55,855 INFO [org.apache.zookeeper.ClientCnxn] - Session establishment complete on server zk2/192.168.5.122:2181, sessionid = 0x1647ed72def6009f, negotiated timeout = 40000
2014-08-21 14:48:56,473 INFO [org.apache.zookeeper.ZooKeeper] - Initiating client connection, connectString=zk2:2181,zk1:2181,zk3:2181 sessionTimeout=180000 watcher=catalogtracker-on-hconnection-0x30dd1e89, quorum=zk2:2181,zk1:2181,zk3:2181, baseZNode=/hbase
2014-08-21 14:48:56,474 INFO [org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper] - Process identifier=catalogtracker-on-hconnection-0x30dd1e89 connecting to ZooKeeper ensemble=zk2:2181,zk1:2181,zk3:2181
2014-08-21 14:48:56,475 INFO [org.apache.zookeeper.ClientCnxn] - Opening socket connection to server zk2/192.168.5.122:2181. Will not attempt to authenticate using SASL (unknown error)
2014-08-21 14:48:56,476 INFO [org.apache.zookeeper.ClientCnxn] - Socket connection established to zk2/192.168.5.122:2181, initiating session
2014-08-21 14:48:56,482 INFO [org.apache.zookeeper.ClientCnxn] - Session establishment complete on server zk2/192.168.5.122:2181, sessionid = 0x1647ed72def600a0, negotiated timeout = 40000
2014-08-21 14:48:56,500 INFO [org.apache.zookeeper.ZooKeeper] - Session: 0x1647ed72def600a0 closed
2014-08-21 14:48:56,501 INFO [org.apache.zookeeper.ClientCnxn] - EventThread shut down
2014-08-21 14:48:56,502 INFO [org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation] - Closing master protocol: MasterService
2014-08-21 14:48:56,502 INFO [org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation] - Closing zookeeper sessionid=0x1647ed72def6009f
2014-08-21 14:48:56,506 INFO [org.apache.zookeeper.ZooKeeper] - Session: 0x1647ed72def6009f closed
2014-08-21 14:48:56,507 INFO [org.apache.zookeeper.ClientCnxn] - EventThread shut down
2014-08-21 14:48:56,508 INFO [root] - Table _mrtest1 created!
2014-08-21 14:48:56,909 INFO [org.apache.zookeeper.ZooKeeper] - Initiating client connection, connectString=zk2:2181,zk1:2181,zk3:2181 sessionTimeout=90000 watcher=hconnection-0x6a86ad83, quorum=zk2:2181,zk1:2181,zk3:2181, baseZNode=/hbase
2014-08-21 14:48:56,911 INFO [org.apache.zookeeper.ClientCnxn] - Opening socket connection to server zk3/192.168.5.123:2181. Will not attempt to authenticate using SASL (unknown error)
2014-08-21 14:48:56,912 INFO [org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper] - Process identifier=hconnection-0x6a86ad83 connecting to ZooKeeper ensemble=zk2:2181,zk1:2181,zk3:2181
2014-08-21 14:48:56,915 INFO [org.apache.zookeeper.ClientCnxn] - Socket connection established to zk3/192.168.5.123:2181, initiating session
2014-08-21 14:48:56,922 INFO [org.apache.zookeeper.ClientCnxn] - Session establishment complete on server zk3/192.168.5.123:2181, sessionid = 0x174786daf57f01d3, negotiated timeout = 40000
2014-08-21 14:48:56,965 INFO [org.apache.hadoop.hbase.mapreduce.TableOutputFormat] - Created table instance for _mrtest1
2014 - 08 - 21 14 : 49 : 07 , 111 INFO [org.apache.hadoop.yarn.client.ConfiguredRMFailoverProxyProvider] - Failing over to rm2
2014 - 08 - 21 14 : 49 : 12 , 459 WARN [org.apache.hadoop.mapreduce.JobSubmitter] - Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this .
2014 - 08 - 21 14 : 49 : 25 , 381 ERROR [org.apache.hadoop.hbase.mapreduce.TableInputFormatBase] - Cannot resolve the host name for dn1/ 192.168 . 5.111 because of javax.naming.ConfigurationException: Unknown DNS server: nn1,nn2 [Root exception is java.net.UnknownHostException: nn1,nn2: Name or service not known]; remaining name '111.5.168.192.in-addr.arpa'
2014 - 08 - 21 14 : 49 : 25 , 385 ERROR [org.apache.hadoop.hbase.mapreduce.TableInputFormatBase] - Cannot resolve the host name for dn6/ 192.168 . 5.116 because of javax.naming.ConfigurationException: Unknown DNS server: nn1,nn2 [Root exception is java.net.UnknownHostException: nn1,nn2]; remaining name '116.5.168.192.in-addr.arpa'
2014 - 08 - 21 14 : 49 : 25 , 386 ERROR [org.apache.hadoop.hbase.mapreduce.TableInputFormatBase] - Cannot resolve the host name for dn3/ 192.168 . 5.113 because of javax.naming.ConfigurationException: Unknown DNS server: nn1,nn2 [Root exception is java.net.UnknownHostException: nn1,nn2]; remaining name '113.5.168.192.in-addr.arpa'
2014 - 08 - 21 14 : 49 : 25 , 387 ERROR [org.apache.hadoop.hbase.mapreduce.TableInputFormatBase] - Cannot resolve the host name for dn4/ 192.168 . 5.114 because of javax.naming.ConfigurationException: Unknown DNS server: nn1,nn2 [Root exception is java.net.UnknownHostException: nn1,nn2]; remaining name '114.5.168.192.in-addr.arpa'
2014-08-21 14:49:25,387 ERROR [org.apache.hadoop.hbase.mapreduce.TableInputFormatBase] - Cannot resolve the host name for dn4/192.168.5.114 because of javax.naming.ConfigurationException: Unknown DNS server: nn1,nn2 [Root exception is java.net.UnknownHostException: nn1,nn2]; remaining name '114.5.168.192.in-addr.arpa'
2014-08-21 14:49:25,388 ERROR [org.apache.hadoop.hbase.mapreduce.TableInputFormatBase] - Cannot resolve the host name for dn1/192.168.5.111 because of javax.naming.ConfigurationException: Unknown DNS server: nn1,nn2 [Root exception is java.net.UnknownHostException: nn1,nn2]; remaining name '111.5.168.192.in-addr.arpa'
2014-08-21 14:49:25,389 ERROR [org.apache.hadoop.hbase.mapreduce.TableInputFormatBase] - Cannot resolve the host name for dn6/192.168.5.116 because of javax.naming.ConfigurationException: Unknown DNS server: nn1,nn2 [Root exception is java.net.UnknownHostException: nn1,nn2]; remaining name '116.5.168.192.in-addr.arpa'
2014-08-21 14:49:25,389 ERROR [org.apache.hadoop.hbase.mapreduce.TableInputFormatBase] - Cannot resolve the host name for dn4/192.168.5.114 because of javax.naming.ConfigurationException: Unknown DNS server: nn1,nn2 [Root exception is java.net.UnknownHostException: nn1,nn2]; remaining name '114.5.168.192.in-addr.arpa'
2014-08-21 14:49:25,390 ERROR [org.apache.hadoop.hbase.mapreduce.TableInputFormatBase] - Cannot resolve the host name for dn5/192.168.5.115 because of javax.naming.ConfigurationException: Unknown DNS server: nn1,nn2 [Root exception is java.net.UnknownHostException: nn1,nn2]; remaining name '115.5.168.192.in-addr.arpa'
2014-08-21 14:49:25,390 ERROR [org.apache.hadoop.hbase.mapreduce.TableInputFormatBase] - Cannot resolve the host name for dn6/192.168.5.116 because of javax.naming.ConfigurationException: Unknown DNS server: nn1,nn2 [Root exception is java.net.UnknownHostException: nn1,nn2]; remaining name '116.5.168.192.in-addr.arpa'
2014-08-21 14:49:25,413 ERROR [org.apache.hadoop.hbase.mapreduce.TableInputFormatBase] - Cannot resolve the host name for dn5/192.168.5.115 because of javax.naming.ConfigurationException: Unknown DNS server: nn1,nn2 [Root exception is java.net.UnknownHostException: nn1,nn2]; remaining name '115.5.168.192.in-addr.arpa'
2014-08-21 14:49:25,414 ERROR [org.apache.hadoop.hbase.mapreduce.TableInputFormatBase] - Cannot resolve the host name for dn2/192.168.5.112 because of javax.naming.ConfigurationException: Unknown DNS server: nn1,nn2 [Root exception is java.net.UnknownHostException: nn1,nn2]; remaining name '112.5.168.192.in-addr.arpa'
2014-08-21 14:49:25,415 ERROR [org.apache.hadoop.hbase.mapreduce.TableInputFormatBase] - Cannot resolve the host name for dn5/192.168.5.115 because of javax.naming.ConfigurationException: Unknown DNS server: nn1,nn2 [Root exception is java.net.UnknownHostException: nn1,nn2]; remaining name '115.5.168.192.in-addr.arpa'
2014-08-21 14:49:25,415 ERROR [org.apache.hadoop.hbase.mapreduce.TableInputFormatBase] - Cannot resolve the host name for dn2/192.168.5.112 because of javax.naming.ConfigurationException: Unknown DNS server: nn1,nn2 [Root exception is java.net.UnknownHostException: nn1,nn2]; remaining name '112.5.168.192.in-addr.arpa'
2014-08-21 14:49:25,416 ERROR [org.apache.hadoop.hbase.mapreduce.TableInputFormatBase] - Cannot resolve the host name for dn3/192.168.5.113 because of javax.naming.ConfigurationException: Unknown DNS server: nn1,nn2 [Root exception is java.net.UnknownHostException: nn1,nn2]; remaining name '113.5.168.192.in-addr.arpa'
2014-08-21 14:49:25,595 INFO [org.apache.hadoop.mapreduce.JobSubmitter] - number of splits: 15
2014-08-21 14:49:25,925 INFO [org.apache.hadoop.mapreduce.JobSubmitter] - Submitting tokens for job: job_1408378794442_0015
2014-08-21 14:49:26,170 INFO [org.apache.hadoop.yarn.client.api.impl.YarnClientImpl] - Submitted application application_1408378794442_0015
2014-08-21 14:49:26,219 INFO [org.apache.hadoop.mapreduce.Job] - The url to track the job: http://rm2:23188/proxy/application_1408378794442_0015/
2014-08-21 14:49:26,220 INFO [org.apache.hadoop.mapreduce.Job] - Running job: job_1408378794442_0015
2014-08-21 14:49:38,592 INFO [org.apache.hadoop.mapreduce.Job] - Job job_1408378794442_0015 running in uber mode : false
2014-08-21 14:49:38,594 INFO [org.apache.hadoop.mapreduce.Job] -  map 0% reduce 0%
2014-08-21 14:50:40,813 INFO [org.apache.hadoop.mapreduce.Job] -  map 7% reduce 0%
2014-08-21 14:50:47,911 INFO [org.apache.hadoop.mapreduce.Job] -  map 11% reduce 0%
2014-08-21 14:50:49,945 INFO [org.apache.hadoop.mapreduce.Job] -  map 13% reduce 0%
2014-08-21 14:51:03,145 INFO [org.apache.hadoop.mapreduce.Job] -  map 20% reduce 0%
2014-08-21 14:51:04,159 INFO [org.apache.hadoop.mapreduce.Job] -  map 20% reduce 4%
2014-08-21 14:51:10,265 INFO [org.apache.hadoop.mapreduce.Job] -  map 20% reduce 7%
2014-08-21 14:51:28,566 INFO [org.apache.hadoop.mapreduce.Job] -  map 24% reduce 7%
2014-08-21 14:51:30,598 INFO [org.apache.hadoop.mapreduce.Job] -  map 33% reduce 7%
2014-08-21 14:51:31,613 INFO [org.apache.hadoop.mapreduce.Job] -  map 33% reduce 9%
2014-08-21 14:51:34,656 INFO [org.apache.hadoop.mapreduce.Job] -  map 33% reduce 11%
2014-08-21 14:51:42,840 INFO [org.apache.hadoop.mapreduce.Job] -  map 40% reduce 11%
2014-08-21 14:51:46,912 INFO [org.apache.hadoop.mapreduce.Job] -  map 40% reduce 13%
2014-08-21 14:51:55,050 INFO [org.apache.hadoop.mapreduce.Job] -  map 47% reduce 13%
2014-08-21 14:51:56,074 INFO [org.apache.hadoop.mapreduce.Job] -  map 47% reduce 16%
2014-08-21 14:52:03,177 INFO [org.apache.hadoop.mapreduce.Job] -  map 53% reduce 16%
2014-08-21 14:52:08,241 INFO [org.apache.hadoop.mapreduce.Job] -  map 53% reduce 18%
2014-08-21 14:52:12,303 INFO [org.apache.hadoop.mapreduce.Job] -  map 60% reduce 18%
2014-08-21 14:52:14,343 INFO [org.apache.hadoop.mapreduce.Job] -  map 60% reduce 20%
2014-08-21 14:52:26,528 INFO [org.apache.hadoop.mapreduce.Job] -  map 64% reduce 20%
2014-08-21 14:52:27,551 INFO [org.apache.hadoop.mapreduce.Job] -  map 73% reduce 20%
2014-08-21 14:52:29,581 INFO [org.apache.hadoop.mapreduce.Job] -  map 73% reduce 22%
2014-08-21 14:52:32,621 INFO [org.apache.hadoop.mapreduce.Job] -  map 73% reduce 24%
2014-08-21 14:52:44,811 INFO [org.apache.hadoop.mapreduce.Job] -  map 80% reduce 24%
2014-08-21 14:52:47,848 INFO [org.apache.hadoop.mapreduce.Job] -  map 80% reduce 27%
2014-08-21 14:52:56,950 INFO [org.apache.hadoop.mapreduce.Job] -  map 84% reduce 27%
2014-08-21 14:52:57,966 INFO [org.apache.hadoop.mapreduce.Job] -  map 87% reduce 27%
2014-08-21 14:52:59,997 INFO [org.apache.hadoop.mapreduce.Job] -  map 87% reduce 29%
2014-08-21 14:53:42,487 INFO [org.apache.hadoop.mapreduce.Job] -  map 91% reduce 29%
2014-08-21 14:53:43,506 INFO [org.apache.hadoop.mapreduce.Job] -  map 93% reduce 29%
2014-08-21 14:53:45,540 INFO [org.apache.hadoop.mapreduce.Job] -  map 93% reduce 31%
2014-08-21 14:53:47,575 INFO [org.apache.hadoop.mapreduce.Job] -  map 100% reduce 31%
2014-08-21 14:53:48,584 INFO [org.apache.hadoop.mapreduce.Job] -  map 100% reduce 33%
2014-08-21 14:53:51,611 INFO [org.apache.hadoop.mapreduce.Job] -  map 100% reduce 44%
2014-08-21 14:53:54,642 INFO [org.apache.hadoop.mapreduce.Job] -  map 100% reduce 55%
2014-08-21 14:53:57,679 INFO [org.apache.hadoop.mapreduce.Job] -  map 100% reduce 66%
2014-08-21 14:54:00,710 INFO [org.apache.hadoop.mapreduce.Job] -  map 100% reduce 71%
2014-08-21 14:54:03,749 INFO [org.apache.hadoop.mapreduce.Job] -  map 100% reduce 79%
2014-08-21 14:54:06,791 INFO [org.apache.hadoop.mapreduce.Job] -  map 100% reduce 88%
2014-08-21 14:54:09,829 INFO [org.apache.hadoop.mapreduce.Job] -  map 100% reduce 96%
2014-08-21 14:54:11,854 INFO [org.apache.hadoop.mapreduce.Job] -  map 100% reduce 100%
2014-08-21 14:54:11,873 INFO [org.apache.hadoop.mapreduce.Job] - Job job_1408378794442_0015 completed successfully
2014-08-21 14:54:12,082 INFO [org.apache.hadoop.mapreduce.Job] - Counters: 61
     File System Counters
         FILE: Number of bytes read=324596614
         FILE: Number of bytes written=651096206
         FILE: Number of read operations=0
         FILE: Number of large read operations=0
         FILE: Number of write operations=0
         HDFS: Number of bytes read=1105
         HDFS: Number of bytes written=0
         HDFS: Number of read operations=15
         HDFS: Number of large read operations=0
         HDFS: Number of write operations=0
     Job Counters
         Killed map tasks=6
         Launched map tasks=21
         Launched reduce tasks=1
         Data-local map tasks=15
         Rack-local map tasks=6
         Total time spent by all maps in occupied slots (ms)=5299964
         Total time spent by all reduces in occupied slots (ms)=413798
         Total time spent by all map tasks (ms)=2649982
         Total time spent by all reduce tasks (ms)=206899
         Total vcore-seconds taken by all map tasks=2649982
         Total vcore-seconds taken by all reduce tasks=206899
         Total megabyte-seconds taken by all map tasks=4070372352
         Total megabyte-seconds taken by all reduce tasks=423729152
     Map-Reduce Framework
         Map input records=14112896
         Map output records=14112896
         Map output bytes=296370816
         Map output materialized bytes=324596698
         Input split bytes=1105
         Combine input records=0
         Combine output records=0
         Reduce input groups=1
         Reduce shuffle bytes=324596698
         Reduce input records=14112896
         Reduce output records=1
         Spilled Records=28225792
         Shuffled Maps =15
         Failed Shuffles=0
         Merged Map outputs=15
         GC time elapsed (ms)=13168
         CPU time spent (ms)=403830
         Physical memory (bytes) snapshot=6978674688
         Virtual memory (bytes) snapshot=29039456256
         Total committed heap usage (bytes)=4911529984
     HBase Counters
         BYTES_IN_REMOTE_RESULTS=194636060
         BYTES_IN_RESULTS=5490524772
         MILLIS_BETWEEN_NEXTS=1834495
         NOT_SERVING_REGION_EXCEPTION=0
         NUM_SCANNER_RESTARTS=0
         REGIONS_SCANNED=15
         REMOTE_RPC_CALLS=5050
         REMOTE_RPC_RETRIES=0
         RPC_CALLS=141167
         RPC_RETRIES=0
     Shuffle Errors
         BAD_ID=0
         CONNECTION=0
         IO_ERROR=0
         WRONG_LENGTH=0
         WRONG_MAP=0
         WRONG_REDUCE=0
     File Input Format Counters
         Bytes Read=0
     File Output Format Counters
         Bytes Written=0
*** measure elapsed time, stopped at 2014-08-21 14:54:12:089
time elapsed: 0 days 0 hours 5 minutes 36 seconds 209 milliseconds [336.209 seconds]
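Note the ERROR lines at the top of the log: TableInputFormatBase performs a reverse-DNS lookup for each region server while computing splits, and here the "nameserver" it was handed is the HDFS nameservice string `nn1,nn2`, which is not a real DNS server. The errors are non-fatal in this run (the job still completed; only the region-locality hints fall back), but they can be silenced. A plausible fix, assuming your HBase version reads the `hbase.nameserver.address` property for these lookups (the IP below is a placeholder, substitute a real DNS server on your network):

```xml
<!-- hbase-site.xml on the client: point HBase's reverse-DNS lookups
     at an actual DNS server instead of the HDFS nameservice "nn1,nn2".
     192.168.5.1 is a placeholder value for illustration only. -->
<property>
  <name>hbase.nameserver.address</name>
  <value>192.168.5.1</value>
</property>
```

Since every cluster host is already listed in the client's /etc/hosts (see the preparation steps above), leaving the property unset so that the system resolver is used should also work.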

 
