Some people call this the "HDFS stored procedure"; I'm not in the habit of writing it that way, since it is too easily confused with Oracle stored procedures.
HDFS stores data in units of blocks; the default block size is 64 MB (configurable).
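As a quick sanity check on what the block size means in practice, here is a small arithmetic sketch (plain Java, no Hadoop dependency; the 64 MB figure is the default assumed above):

```java
public class BlockCount {
    static final long BLOCK_SIZE = 64L * 1024 * 1024; // assumed 64 MB default

    // Number of HDFS blocks a file of the given length would occupy.
    static long blockCount(long fileSize) {
        return (fileSize + BLOCK_SIZE - 1) / BLOCK_SIZE; // ceiling division
    }

    public static void main(String[] args) {
        // A 1 GB file splits into 16 blocks of 64 MB each.
        System.out.println(blockCount(1024L * 1024 * 1024)); // prints 16
        // A 1 KB file still occupies one block entry in the namenode,
        // but only 1 KB of actual disk space.
        System.out.println(blockCount(1024)); // prints 1
    }
}
```

Unlike a regular filesystem, a file smaller than one block does not waste a full block of disk; only the namenode metadata cost is per-block.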
Checking blocks:
hadoop fsck / -files -blocks
Command-line interface:
hadoop fs -copyFromLocal input/docs/quangle.txt quangle.txt
1.1.1 Note Hadoop permissions
Create a directory:
hadoop fs -mkdir dzg_hadoop_data
Grant permissions:
hadoop fs -chmod 777 /dzg_hadoop_data
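To make the 777 explicit: each octal digit encodes the read/write/execute bits for owner, group, and others, the same convention as Unix chmod. A small plain-Java sketch (no Hadoop dependency) that decodes an octal mode string:

```java
public class ModeString {
    // Decode a 3-digit octal mode such as "777" into rwx notation.
    static String rwx(String octal) {
        StringBuilder sb = new StringBuilder();
        for (char c : octal.toCharArray()) {
            int d = c - '0';
            sb.append((d & 4) != 0 ? 'r' : '-'); // read bit
            sb.append((d & 2) != 0 ? 'w' : '-'); // write bit
            sb.append((d & 1) != 0 ? 'x' : '-'); // execute bit
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(rwx("777")); // rwxrwxrwx: everyone can read/write/execute
        System.out.println(rwx("644")); // rw-r--r--: a more restrictive alternative
    }
}
```

777 opens the directory to everyone, which is convenient for testing but too permissive for anything shared.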
Reading a file
import java.io.InputStream;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

public class FileSystemCat {
    public static void main(String[] args) throws Exception {
        String uri = "hdfs://192.168.121.133:9000/hadoop/data/input";
        Configuration conf = new Configuration();
        // Obtain a FileSystem handle for the HDFS instance named in the URI.
        FileSystem fs = FileSystem.get(URI.create(uri), conf);
        InputStream in = null;
        try {
            // Open the file and stream its contents to stdout in 4 KB chunks.
            in = fs.open(new Path(uri));
            IOUtils.copyBytes(in, System.out, 4096, false);
        } finally {
            // Close in finally so the stream is released even on failure.
            IOUtils.closeStream(in);
        }
    }
}
*** Make sure the path and file exist; here input is a file with no extension.
1.1.2 Writing a file
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

public class FileSystemCreate {
    static String uri = "hdfs://192.168.121.133:9000/hadoop/data/";
    static Configuration conf = new Configuration();
    static FileSystem fs = null;

    public static void main(String[] args) throws Exception {
        fs = FileSystem.get(URI.create(uri), conf);
        FSDataOutputStream out = null;
        try {
            // Create (or overwrite) the target file and write a test string.
            out = fs.create(new Path(uri + "data_create_test"));
            out.writeBytes("Hello Hadoop create file ... Hello Hadoop create file ... Hello Hadoop create file ... ");
        } finally {
            // Close in finally so the stream is released even if the write fails.
            IOUtils.closeStream(out);
        }
    }
}
So far I still haven't really felt Hadoop's contribution to big data; to be continued...