Note: all of the code below was written in Eclipse on Linux.
1. First, test downloading a file from HDFS.
Download code (downloads hdfs://localhost:9000/jdk-7u65-linux-i586.tar.gz to the local file /opt/download/doload.tgz):
package cn.qlq.hdfs;

import java.io.FileOutputStream;
import java.io.IOException;

import org.apache.commons.compress.utils.IOUtils;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsUtil {
    public static void main(String[] args) throws IOException {
        // download a file from HDFS to the local filesystem
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);
        Path path = new Path("hdfs://localhost:9000/jdk-7u65-linux-i586.tar.gz");
        FSDataInputStream input = fs.open(path);
        FileOutputStream output = new FileOutputStream("/opt/download/doload.tgz");
        IOUtils.copy(input, output);
        input.close();
        output.close();
    }
}
Running this directly throws an error.
The cause: with an empty Configuration, FileSystem.get(conf) returns the local file system (file:///), which does not recognize a path like hdfs://localhost:9000/jdk-7u65-linux-i586.tar.gz.
log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Exception in thread "main" java.lang.IllegalArgumentException: Wrong FS: hdfs://localhost:9000/jdk-7u65-linux-i586.tar.gz, expected: file:///
at org.apache.hadoop.fs.FileSystem.checkP
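The "Wrong FS ... expected: file:///" message comes from a path check inside Hadoop's FileSystem: the scheme of the requested path must match the scheme of the filesystem instance's own URI, and here file:/// does not match hdfs://. A minimal pure-Java sketch of that scheme comparison (the class and method names below are illustrative, not Hadoop's actual implementation):

```java
import java.net.URI;

public class WrongFsDemo {
    // Illustrative version of the check: a path's scheme must either be
    // absent (a relative path) or match the filesystem's own URI scheme.
    static boolean schemeMatches(URI fsUri, URI pathUri) {
        return pathUri.getScheme() == null
                || pathUri.getScheme().equalsIgnoreCase(fsUri.getScheme());
    }

    public static void main(String[] args) {
        // What FileSystem.get(conf) resolves to with an empty Configuration:
        URI defaultFs = URI.create("file:///");
        URI path = URI.create("hdfs://localhost:9000/jdk-7u65-linux-i586.tar.gz");
        // "hdfs" vs "file" -> mismatch, hence the Wrong FS exception
        System.out.println(schemeMatches(defaultFs, path)); // prints "false"
    }
}
```

One common fix, assuming HDFS really is listening on localhost:9000, is to point the default filesystem at it before calling FileSystem.get, e.g. conf.set("fs.defaultFS", "hdfs://localhost:9000"), or to put the cluster's core-site.xml on the classpath so the Configuration picks it up automatically.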