Customizing MapReduce Output

This post uses the Redis database as an example.

The example: I want to count, per hour, the number of accesses on a given day from a log file. The log format is:

2014-02-10 04:52:34 127.0.0.1 xxx
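For reference, the hour bucket can be pulled out of such a line with plain string splitting; this is a standalone sketch of the same parsing the mapper below performs:

```java
public class LogLineParse {
    public static void main(String[] args) {
        String line = "2014-02-10 04:52:34 127.0.0.1 xxx";
        String[] fields = line.split(" ");
        String date = fields[0];                 // "2014-02-10"
        String hour = fields[1].split(":")[0];   // "04"
        // date + hour becomes the map output key
        System.out.println(date + "-" + hour);   // prints "2014-02-10-04"
    }
}
```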

As we know, when writing a MapReduce job we configure the input and output and write the Mapper and Reducer classes. By default, Hadoop writes the output to files on HDFS, for example:

job.setOutputFormatClass(TextOutputFormat.class);

Now, what if we want to write the computed results to a database (Redis) instead? We can extend FileOutputFormat with our own class. Here is the code:

import java.io.IOException;

import org.apache.hadoop.mapreduce.RecordWriter;
import org.apache.hadoop.mapreduce.TaskAttemptContext;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

import redis.clients.jedis.Jedis;

public class LoginLogOutputFormat<K, V> extends FileOutputFormat<K, V> {
    /**
     * The key piece is a custom RecordWriter: every record emitted by the
     * reducer is handed to write(), where we push it into the database.
     */
    protected static class RedisRecordWriter<K, V> extends RecordWriter<K, V> {
        private Jedis jedis; // Redis client instance

        public RedisRecordWriter(Jedis jedis) {
            this.jedis = jedis;
        }

        @Override
        public void write(K key, V value) throws IOException, InterruptedException {
            if (key == null || value == null) return;

            String[] sKey = key.toString().split("-");
            // zset key is yyyy-MM-dd_login_stat
            String outKey = sKey[0] + "-" + sKey[1] + "-" + sKey[2] + "_login_stat";
            // zadd; the member format is hour:count
            jedis.zadd(outKey.getBytes("UTF-8"), -1,
                    (sKey[3] + ":" + value).getBytes("UTF-8"));
        }

        @Override
        public void close(TaskAttemptContext context) throws IOException, InterruptedException {
            if (jedis != null) jedis.disconnect(); // close the connection
        }
    }

    @Override
    public RecordWriter<K, V> getRecordWriter(TaskAttemptContext job)
            throws IOException, InterruptedException {
        // Build a Redis client; construct your own connection object as appropriate
        Jedis jedis = RedisClient.newJedis();
        return new RedisRecordWriter<K, V>(jedis);
    }
}
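The RedisClient.newJedis() helper used above is not shown in the original. A minimal sketch, assuming the Jedis client library and a hardcoded localhost address as a placeholder, might look like this:

```java
import redis.clients.jedis.Jedis;

// Hypothetical helper assumed by LoginLogOutputFormat.getRecordWriter().
// Host and port are placeholders here; in practice you would read them
// from the job Configuration or a properties file.
public class RedisClient {
    private static final String HOST = "127.0.0.1"; // assumption
    private static final int PORT = 6379;           // default Redis port

    public static Jedis newJedis() {
        return new Jedis(HOST, PORT);
    }
}
```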
And here is the full job implementation:
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class LoginLogStatTask extends Configured implements Tool {
    public static class MyMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            if (value == null || value.toString().trim().isEmpty()) return;
            // Parse value, e.g.: 2014-02-10 04:52:34 127.0.0.1 xxx
            String[] fields = value.toString().split(" ");
            String date = fields[0];
            String time = fields[1];
            String hour = time.split(":")[0];
            String outKey = date + "-" + hour;
            context.write(new Text(outKey), new IntWritable(1));
        }
    }

    public static class MyReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int count = 0;
            for (IntWritable ignored : values) { // count the records
                count++;
            }
            context.write(key, new IntWritable(count));
        }
    }

    @Override
    public int run(String[] args) throws Exception {
        Configuration conf = getConf();
        List<Path> inputs = new ArrayList<>();
        String inputPath = args[0];
        if (inputPath.endsWith("/")) { // a directory
            // HdfsUtil.listFiles is a user utility (not shown) that lists the files underneath
            inputs.addAll(HdfsUtil.listFiles(inputPath, conf));
        } else { // a single file
            inputs.add(new Path(inputPath));
        }
        long ts = System.currentTimeMillis();
        String jobName = "login_logs_stat_job_" + ts;
        Job job = Job.getInstance(conf, jobName);
        job.setJarByClass(LoginLogStatTask.class);
        // add the input file paths
        for (Path p : inputs) {
            FileInputFormat.addInputPath(job, p);
        }
        // set the output path (jobName.out); still required since our format extends FileOutputFormat
        Path out = new Path(jobName + ".out");
        FileOutputFormat.setOutputPath(job, out);
        // set the mapper and reducer
        job.setMapperClass(MyMapper.class);
        job.setReducerClass(MyReducer.class);

        // set the input format
        job.setInputFormatClass(TextInputFormat.class);
        // set our custom output format
        job.setOutputFormatClass(LoginLogOutputFormat.class);
        // set the output key and value types
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        job.waitForCompletion(true);
        return job.isSuccessful() ? 0 : 1;
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        int res = ToolRunner.run(conf, new LoginLogStatTask(), args);
        System.exit(res);
    }
}
After the job runs, the corresponding keys will be present in the Redis database.
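For example, for the sample log line above, the zset key and member are formatted as follows (pure string manipulation mirroring RedisRecordWriter.write; the count of 123 is a hypothetical value):

```java
public class KeyLayout {
    public static void main(String[] args) {
        String reduceKey = "2014-02-10-04"; // reducer output key: date-hour
        int count = 123;                    // hypothetical access count for that hour
        String[] sKey = reduceKey.split("-");
        String zsetKey = sKey[0] + "-" + sKey[1] + "-" + sKey[2] + "_login_stat";
        String member = sKey[3] + ":" + count;
        System.out.println(zsetKey + " -> " + member);
        // prints "2014-02-10_login_stat -> 04:123"
    }
}
```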
