云盘万能钥匙 For Chrome: Download and Installation

This is a code implementation of cloud-disk upload and download built on Hadoop, using the HDFS and MapReduce frameworks. The snippets below assume the usual `org.apache.hadoop.*` imports plus `java.io` and `java.nio.file` from the JDK.

Upload part:

1. First, define an upload Mapper class that extends Hadoop's `Mapper` and implements the `map` function.

```java
public static class UploadMapper extends Mapper<LongWritable, Text, Text, BytesWritable> {
    private final Text filename = new Text();

    @Override
    public void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        // Each input line is the path of a local file to upload.
        String filepath = value.toString();
        File file = new File(filepath);
        filename.set(file.getName());
        // Read the whole file. Files.readAllBytes loops until EOF, avoiding
        // the short-read bug of a single InputStream.read(byte[]) call.
        byte[] data = Files.readAllBytes(file.toPath());
        context.write(filename, new BytesWritable(data));
    }
}
```

2. Then, define an upload Reducer class that extends Hadoop's `Reducer` and implements the `reduce` function.

```java
public static class UploadReducer extends Reducer<Text, BytesWritable, Text, Text> {
    @Override
    public void reduce(Text key, Iterable<BytesWritable> values, Context context)
            throws IOException, InterruptedException {
        Configuration conf = context.getConfiguration();
        // Target HDFS directory, passed in via the "hdfsPath" property.
        String hdfsPath = conf.get("hdfsPath");
        Path path = new Path(hdfsPath + "/" + key.toString());
        FileSystem fs = path.getFileSystem(conf);
        // try-with-resources closes the stream even if a write fails.
        try (FSDataOutputStream out = fs.create(path)) {
            for (BytesWritable value : values) {
                out.write(value.getBytes(), 0, value.getLength());
            }
        }
        context.write(key, new Text("Upload completed!"));
    }
}
```

3. Finally, define the upload Driver class, which extends Hadoop's `Configured` class, implements the `Tool` interface, and provides the `run` function.

```java
@Override
public int run(String[] args) throws Exception {
    Configuration conf = getConf();
    Job job = Job.getInstance(conf, "UploadFile");
    job.setJarByClass(UploadFile.class);
    job.setMapperClass(UploadMapper.class);
    job.setReducerClass(UploadReducer.class);
    // The mapper emits <Text, BytesWritable> while the reducer emits
    // <Text, Text>, so the two sets of output classes must be declared
    // separately (declaring only one pair makes the job fail at runtime).
    job.setMapOutputKeyClass(Text.class);
    job.setMapOutputValueClass(BytesWritable.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(Text.class);
    job.setInputFormatClass(TextInputFormat.class);
    job.setOutputFormatClass(TextOutputFormat.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    return job.waitForCompletion(true) ? 0 : 1;
}
```

Download part:

1. First, define a download Mapper class that extends Hadoop's `Mapper` and implements the `map` function.

```java
public static class DownloadMapper extends Mapper<Text, BytesWritable, Text, Text> {
    @Override
    public void map(Text key, BytesWritable value, Context context)
            throws IOException, InterruptedException {
        Configuration conf = context.getConfiguration();
        // Target local directory, passed in via the "localPath" property.
        String localPath = conf.get("localPath");
        String filepath = localPath + "/" + key.toString();
        try (FileOutputStream fos = new FileOutputStream(filepath)) {
            fos.write(value.getBytes(), 0, value.getLength());
        }
        context.write(key, new Text("Download completed!"));
    }
}
```

2. Then, define the download Driver class, which extends Hadoop's `Configured` class, implements the `Tool` interface, and provides the `run` function.

```java
@Override
public int run(String[] args) throws Exception {
    Configuration conf = getConf();
    Job job = Job.getInstance(conf, "DownloadFile");
    job.setJarByClass(DownloadFile.class);
    job.setMapperClass(DownloadMapper.class);
    // Map-only job: downloading needs no shuffle or reduce phase.
    job.setNumReduceTasks(0);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(Text.class);
    // The mapper expects <Text, BytesWritable> records, so the input must be
    // a SequenceFile of that type; KeyValueTextInputFormat would deliver
    // <Text, Text> pairs and fail with a ClassCastException.
    job.setInputFormatClass(SequenceFileInputFormat.class);
    job.setOutputFormatClass(TextOutputFormat.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    return job.waitForCompletion(true) ? 0 : 1;
}
```

The above is the complete Hadoop-based cloud-disk upload and download implementation.
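The short-read pitfall fixed in `UploadMapper` is easy to demonstrate with plain JDK I/O, no Hadoop required: a single `InputStream.read(byte[])` call may return fewer bytes than the buffer holds, while `Files.readAllBytes` always returns the complete file. A minimal, self-contained sketch (class and file names here are illustrative, not from the original post):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class ReadFullyDemo {
    // Reads a whole file into memory the way UploadMapper should:
    // Files.readAllBytes loops internally until every byte is consumed,
    // unlike a single InputStream.read(byte[]) call.
    public static byte[] readFully(Path file) throws IOException {
        return Files.readAllBytes(file);
    }

    public static void main(String[] args) throws IOException {
        // Write a small test file, then read it back completely.
        Path tmp = Files.createTempFile("upload-demo", ".bin");
        byte[] payload = {1, 2, 3, 4, 5};
        Files.write(tmp, payload);

        byte[] data = readFully(tmp);
        System.out.println("read " + data.length + " bytes"); // read 5 bytes
        Files.deleteIfExists(tmp);
    }
}
```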
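Because both drivers implement `Tool`, running them through `ToolRunner` lets `-D key=value` generic options populate the `hdfsPath` and `localPath` properties read by the reducer and mapper. A hedged sketch of how the two jobs might be launched (the jar name, class names, and paths below are assumptions for illustration, not from the original post):

```shell
# Upload: the input is a text file listing local paths, one per line;
# "hdfsPath" is the HDFS target directory read by UploadReducer.
hadoop jar clouddisk.jar UploadFile -D hdfsPath=/clouddisk/files \
    /input/filelist.txt /output/upload

# Download: the input is a SequenceFile of <filename, bytes> records;
# "localPath" is the local target directory read by DownloadMapper.
hadoop jar clouddisk.jar DownloadFile -D localPath=/tmp/downloads \
    /clouddisk/seq /output/download
```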


Author: COCO56 (徐可可)