File-Based Data Structures
Two file formats:
1. SequenceFile
2. MapFile
SequenceFile
1. A SequenceFile is a flat file designed by Hadoop for storing binary <key, value> pairs.
2. A SequenceFile can be used as a container: packing many small files into one SequenceFile makes it possible to store and process them efficiently (see the sketch after this list).
3. A SequenceFile does not store its records sorted by key; its inner class SequenceFile.Writer provides an append method.
4. The key and value in a SequenceFile can be any Writable type, including user-defined Writable types.
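To make point 2 concrete, here is a minimal sketch of packing small files into one SequenceFile as (file name, file bytes) records. The HDFS URI is taken from the full example later in this article; the class name, input paths, and output path are hypothetical, and error handling is omitted.

```java
package SequenceFile;

import java.io.IOException;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.BytesWritable;
import org.apache.hadoop.io.IOUtils;
import org.apache.hadoop.io.SequenceFile;
import org.apache.hadoop.io.Text;

public class PackSmallFiles {
    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create("hdfs://liguodong:8020/liguodong"), conf);
        Path out = new Path("/smallfiles.seq");                            // hypothetical output path
        Path[] inputs = { new Path("/in/a.txt"), new Path("/in/b.txt") };  // hypothetical input files

        Text key = new Text();
        BytesWritable value = new BytesWritable();
        @SuppressWarnings("deprecation")
        SequenceFile.Writer writer =
                SequenceFile.createWriter(fs, conf, out, Text.class, BytesWritable.class);
        for (Path p : inputs) {
            // Read the whole small file into memory and append it as one record.
            byte[] bytes = new byte[(int) fs.getFileStatus(p).getLen()];
            try (FSDataInputStream in = fs.open(p)) {
                in.readFully(bytes);
            }
            key.set(p.getName());
            value.set(bytes, 0, bytes.length);
            writer.append(key, value);
        }
        IOUtils.closeStream(writer);
    }
}
```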
SequenceFile Compression
1. The internal format of a SequenceFile depends on whether compression is enabled and, if so, whether record compression or block compression is used.
2. Three formats:
A. Uncompressed: if compression is not enabled (the default), each record consists of its record length (in bytes), the key length, the key, and the value. The length fields are four bytes each.
B. Record-compressed: essentially the same as the uncompressed format, except that the value bytes are compressed with the codec defined in the file header. Note that keys are not compressed.
C. Block-compressed: compresses multiple records at once, so it is more compact than record compression and is generally the preferred choice. Records are added to a block until it reaches a minimum size in bytes, defined by the io.seqfile.compress.blocksize property (default: 1,000,000 bytes; see the configuration sketch below). The block layout is: record count, key lengths, keys, value lengths, values.
(Figure: uncompressed format and record-compressed format)
(Figure: block-compressed format)
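As a minimal sketch (the 4 MB value is purely illustrative, not a recommendation), the block size threshold can be raised on the Configuration before the writer is created:

```java
Configuration conf = new Configuration();
// Raise the minimum block size for block-compressed SequenceFiles
// from the default 1,000,000 bytes to 4 MB (illustrative value).
conf.setInt("io.seqfile.compress.blocksize", 4 * 1024 * 1024);
```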
Advantages of the SequenceFile format:
A. Supports record-based (Record) or block-based (Block) data compression.
B. Splittable, so it can be used as MapReduce input splits.
C. Simple to modify: you mainly adjust the relevant business logic, without having to worry about the underlying storage format.
Drawbacks of the SequenceFile format:
It requires a merge step to pack files together, and the merged file is inconvenient to inspect because it is a binary file.
Reading and Writing a SequenceFile
Write procedure:
1) Create a Configuration
2) Get a FileSystem
3) Create the output file Path
4) Call SequenceFile.createWriter to obtain a SequenceFile.Writer object
5) Call SequenceFile.Writer.append to append records to the file
6) Close the stream
Read procedure:
1) Create a Configuration
2) Get a FileSystem
3) Create the Path of the file to read
4) Construct a SequenceFile.Reader to read the file
5) Get the keyClass and valueClass and iterate over the records
6) Close the stream
<code class="hljs livecodeserver has-numbering">org.apache.hadoop.io Class SequenceFile There are <span class="hljs-constant">three</span> SequenceFile Writers based <span class="hljs-command"><span class="hljs-keyword">on</span> <span class="hljs-title">the</span> <span class="hljs-title">SequenceFile</span>.<span class="hljs-title">CompressionType</span> <span class="hljs-title">used</span> <span class="hljs-title">to</span> <span class="hljs-title">compress</span> <span class="hljs-title">key</span>/<span class="hljs-title">value</span> <span class="hljs-title">pairs</span>: </span> <span class="hljs-number">1</span>、Writer : Uncompressed records. <span class="hljs-number">2</span>、RecordCompressWriter : Record-compressed <span class="hljs-built_in">files</span>, only <span class="hljs-built_in">compress</span> values. <span class="hljs-number">3</span>、BlockCompressWriter : Block-compressed <span class="hljs-built_in">files</span>, both <span class="hljs-built_in">keys</span> & values are collected <span class="hljs-operator">in</span> <span class="hljs-string">'blocks'</span> separately <span class="hljs-operator">and</span> compressed. The size <span class="hljs-operator">of</span> <span class="hljs-operator">the</span> <span class="hljs-string">'block'</span> is configurable</code>
Example: writing with no compression, record compression, and block compression
<code class="hljs avrasm has-numbering">package SequenceFile<span class="hljs-comment">;</span> import java<span class="hljs-preprocessor">.io</span><span class="hljs-preprocessor">.IOException</span><span class="hljs-comment">;</span> import java<span class="hljs-preprocessor">.net</span><span class="hljs-preprocessor">.URI</span><span class="hljs-comment">;</span> import org<span class="hljs-preprocessor">.apache</span><span class="hljs-preprocessor">.hadoop</span><span class="hljs-preprocessor">.conf</span><span class="hljs-preprocessor">.Configuration</span><span class="hljs-comment">;</span> import org<span class="hljs-preprocessor">.apache</span><span class="hljs-preprocessor">.hadoop</span><span class="hljs-preprocessor">.fs</span><span class="hljs-preprocessor">.FileSystem</span><span class="hljs-comment">;</span> import org<span class="hljs-preprocessor">.apache</span><span class="hljs-preprocessor">.hadoop</span><span class="hljs-preprocessor">.fs</span><span class="hljs-preprocessor">.Path</span><span class="hljs-comment">;</span> import org<span class="hljs-preprocessor">.apache</span><span class="hljs-preprocessor">.hadoop</span><span class="hljs-preprocessor">.io</span><span class="hljs-preprocessor">.IOUtils</span><span class="hljs-comment">;</span> import org<span class="hljs-preprocessor">.apache</span><span class="hljs-preprocessor">.hadoop</span><span class="hljs-preprocessor">.io</span><span class="hljs-preprocessor">.IntWritable</span><span class="hljs-comment">;</span> import org<span class="hljs-preprocessor">.apache</span><span class="hljs-preprocessor">.hadoop</span><span class="hljs-preprocessor">.io</span><span class="hljs-preprocessor">.SequenceFile</span><span class="hljs-comment">;</span> import org<span class="hljs-preprocessor">.apache</span><span class="hljs-preprocessor">.hadoop</span><span class="hljs-preprocessor">.io</span><span class="hljs-preprocessor">.SequenceFile</span><span class="hljs-preprocessor">.CompressionType</span><span class="hljs-comment">;</span> import org<span class="hljs-preprocessor">.apache</span><span class="hljs-preprocessor">.hadoop</span><span class="hljs-preprocessor">.io</span><span class="hljs-preprocessor">.Text</span><span class="hljs-comment">;</span> import org<span class="hljs-preprocessor">.apache</span><span class="hljs-preprocessor">.hadoop</span><span class="hljs-preprocessor">.io</span><span class="hljs-preprocessor">.Writable</span><span class="hljs-comment">;</span> import org<span class="hljs-preprocessor">.apache</span><span class="hljs-preprocessor">.hadoop</span><span class="hljs-preprocessor">.io</span><span class="hljs-preprocessor">.compress</span><span class="hljs-preprocessor">.BZip</span>2Codec<span class="hljs-comment">;</span> import org<span class="hljs-preprocessor">.apache</span><span class="hljs-preprocessor">.hadoop</span><span class="hljs-preprocessor">.util</span><span class="hljs-preprocessor">.ReflectionUtils</span><span class="hljs-comment">;</span> public class Demo01 { final static String uri= <span class="hljs-string">"hdfs://liguodong:8020/liguodong"</span><span class="hljs-comment">;</span> final static String[] data = { <span class="hljs-string">"apache,software"</span>,<span class="hljs-string">"chinese,good"</span>,<span class="hljs-string">"james,NBA"</span>,<span class="hljs-string">"index,pass"</span> }<span class="hljs-comment">;</span> public static void main(String[] args) throws IOException { //<span class="hljs-number">1</span> Configuration configuration = new 
Configuration()<span class="hljs-comment">;</span> //<span class="hljs-number">2</span> FileSystem fs = FileSystem<span class="hljs-preprocessor">.get</span>(URI<span class="hljs-preprocessor">.create</span>(uri),configuration)<span class="hljs-comment">;</span> //<span class="hljs-number">3</span> Path path = new Path(<span class="hljs-string">"/tmp.seq"</span>)<span class="hljs-comment">;</span> write(fs,configuration,path)<span class="hljs-comment">;</span> read(fs,configuration,path)<span class="hljs-comment">;</span> } public static void write(FileSystem fs,Configuration configuration,Path path) throws IOException{ //<span class="hljs-number">4</span> IntWritable key = new IntWritable()<span class="hljs-comment">;</span> Text value = new Text()<span class="hljs-comment">;</span> //无压缩 <span class="hljs-comment">/*@SuppressWarnings("deprecation") SequenceFile.Writer writer = SequenceFile.createWriter (fs,configuration,path,key.getClass(),value.getClass());*/</span> //记录压缩 @SuppressWarnings(<span class="hljs-string">"deprecation"</span>) SequenceFile<span class="hljs-preprocessor">.Writer</span> writer = SequenceFile<span class="hljs-preprocessor">.createWriter</span> (fs,configuration,path,key<span class="hljs-preprocessor">.getClass</span>(), value<span class="hljs-preprocessor">.getClass</span>(),CompressionType<span class="hljs-preprocessor">.RECORD</span>,new BZip2Codec())<span class="hljs-comment">;</span> //块压缩 <span class="hljs-comment">/*@SuppressWarnings("deprecation") SequenceFile.Writer writer = SequenceFile.createWriter (fs,configuration,path,key.getClass(), value.getClass(),CompressionType.BLOCK,new BZip2Codec());*/</span> //<span class="hljs-number">5</span> for (int i = <span class="hljs-number">0</span><span class="hljs-comment">; i < 30; i++) {</span> key<span class="hljs-preprocessor">.set</span>(<span class="hljs-number">100</span>-i)<span class="hljs-comment">;</span> value<span class="hljs-preprocessor">.set</span>(data[i%data<span class="hljs-preprocessor">.length</span>])<span class="hljs-comment">;</span> writer<span class="hljs-preprocessor">.append</span>(key, value)<span class="hljs-comment">;</span> } //<span class="hljs-number">6</span>、关闭流 IOUtils<span class="hljs-preprocessor">.closeStream</span>(writer)<span class="hljs-comment">; </span> } public static void read(FileSystem fs,Configuration configuration,Path path) throws IOException { //<span class="hljs-number">4</span> @SuppressWarnings(<span class="hljs-string">"deprecation"</span>) SequenceFile<span class="hljs-preprocessor">.Reader</span> reader = new SequenceFile<span class="hljs-preprocessor">.Reader</span>(fs, path,configuration)<span class="hljs-comment">;</span> //<span class="hljs-number">5</span> Writable key = (Writable) ReflectionUtils<span class="hljs-preprocessor">.newInstance</span> (reader<span class="hljs-preprocessor">.getKeyClass</span>(), configuration)<span class="hljs-comment">;</span> Writable value = (Writable) ReflectionUtils<span class="hljs-preprocessor">.newInstance</span> (reader<span class="hljs-preprocessor">.getValueClass</span>(), configuration)<span class="hljs-comment">;</span> while(reader<span class="hljs-preprocessor">.next</span>(key,value)){ System<span class="hljs-preprocessor">.out</span><span class="hljs-preprocessor">.println</span>(<span class="hljs-string">"key = "</span> + key)<span class="hljs-comment">;</span> System<span class="hljs-preprocessor">.out</span><span class="hljs-preprocessor">.println</span>(<span class="hljs-string">"value = "</span> + 
value)<span class="hljs-comment">;</span> System<span class="hljs-preprocessor">.out</span><span class="hljs-preprocessor">.println</span>(<span class="hljs-string">"position = "</span>+ reader<span class="hljs-preprocessor">.getPosition</span>())<span class="hljs-comment">;</span> } IOUtils<span class="hljs-preprocessor">.closeStream</span>(reader)<span class="hljs-comment">;</span> } }</code>
Output:
<code class="hljs makefile has-numbering"><span class="hljs-constant">key</span> = 100 <span class="hljs-constant">value</span> = apache,software <span class="hljs-constant">position</span> = 164 <span class="hljs-constant">key</span> = 99 <span class="hljs-constant">value</span> = chinese,good <span class="hljs-constant">position</span> = 197 <span class="hljs-constant">key</span> = 98 <span class="hljs-constant">value</span> = james,NBA <span class="hljs-constant">position</span> = 227 <span class="hljs-constant">key</span> = 97 <span class="hljs-constant">value</span> = index,pass <span class="hljs-constant">position</span> = 258 <span class="hljs-constant">key</span> = 96 <span class="hljs-constant">value</span> = apache,software <span class="hljs-constant">position</span> = 294 <span class="hljs-constant">key</span> = 95 <span class="hljs-constant">value</span> = chinese,good <span class="hljs-constant">position</span> = 327 ...... <span class="hljs-constant">key</span> = 72 <span class="hljs-constant">value</span> = apache,software <span class="hljs-constant">position</span> = 1074 <span class="hljs-constant">key</span> = 71 <span class="hljs-constant">value</span> = chinese,good <span class="hljs-constant">position</span> = 1107</code>
MapFile
<code class="hljs java has-numbering"><span class="hljs-keyword">public</span> <span class="hljs-class"><span class="hljs-keyword">class</span> <span class="hljs-title">MapFile</span> {</span> <span class="hljs-javadoc">/** The name of the index file. */</span> <span class="hljs-keyword">public</span> <span class="hljs-keyword">static</span> <span class="hljs-keyword">final</span> String INDEX_FILE_NAME = <span class="hljs-string">"index"</span>; <span class="hljs-javadoc">/** The name of the data file. */</span> <span class="hljs-keyword">public</span> <span class="hljs-keyword">static</span> <span class="hljs-keyword">final</span> String DATA_FILE_NAME = <span class="hljs-string">"data"</span>; }</code>
A MapFile is a sorted, indexed SequenceFile that supports lookups by key.
Unlike a SequenceFile, a MapFile's key must implement the WritableComparable interface, i.e. the keys are comparable, while the value is a Writable type.
The MapFile.fix() method can be used to rebuild the index and turn a SequenceFile into a MapFile (see the sketch below).
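A minimal sketch of that conversion, assuming the SequenceFile was written with IntWritable keys and Text values in ascending key order (which a MapFile requires), and that the surrounding method declares throws Exception because MapFile.fix() does; the paths are illustrative:

```java
Configuration conf = new Configuration();
FileSystem fs = FileSystem.get(URI.create("hdfs://liguodong:8020/liguodong"), conf);

Path mapDir = new Path("/numbers.map");                    // hypothetical MapFile directory
Path dataFile = new Path(mapDir, MapFile.DATA_FILE_NAME);  // /numbers.map/data

// Step 1: place the existing SequenceFile where a MapFile expects its data file.
fs.mkdirs(mapDir);
fs.rename(new Path("/numbers.seq"), dataFile);             // hypothetical source file

// Step 2: rebuild the index (dryrun = false actually writes the index file).
long entries = MapFile.fix(fs, mapDir, IntWritable.class, Text.class, false, conf);
System.out.println("Indexed " + entries + " entries");
```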
MapFile has two static member fields:
<code class="hljs java has-numbering"><span class="hljs-keyword">static</span> <span class="hljs-keyword">final</span> String INDEX_FILE_NAME <span class="hljs-keyword">static</span> <span class="hljs-keyword">final</span> String DATA_FILE_NAME</code>
Looking at its directory structure shows that a MapFile consists of two parts: data and index.
The index file is the data index of the file; it records the key of each Record and that Record's offset within the file.
When a MapFile is accessed, the index file is loaded into memory, and the index mapping makes it possible to quickly locate the position of a given Record in the file.
Compared with a SequenceFile, therefore, MapFile retrieval is efficient; the downside is that some memory is consumed to hold the index data.
Note that a MapFile does not record every Record in the index; by default one index entry is stored for every 128 records. This interval can be changed, either through MapFile.Writer's setIndexInterval() method or by modifying the io.map.index.interval property (see the sketch below).
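A minimal sketch of adjusting the interval, reusing the configuration and fs variables from the examples above; the directory name and the interval of 256 are illustrative:

```java
// Option 1: set the property on the Configuration before any writer is created.
configuration.setInt("io.map.index.interval", 256);

// Option 2: set it on an individual MapFile.Writer instance.
@SuppressWarnings("deprecation")
MapFile.Writer writer = new MapFile.Writer(configuration, fs, "/indexed.map",
        IntWritable.class, Text.class);
writer.setIndexInterval(256);   // one index entry per 256 appended records
```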
Reading and Writing a MapFile
Write procedure:
1) Create a Configuration
2) Get a FileSystem
3) Create the output file Path
4) Construct a MapFile.Writer object
5) Call MapFile.Writer.append to append records to the file (keys must be added in sorted order)
6) Close the stream
Read procedure:
1) Create a Configuration
2) Get a FileSystem
3) Create the Path of the file to read
4) Construct a MapFile.Reader to read the file
5) Get the keyClass and valueClass and iterate, or look records up by key
6) Close the stream
The concrete operations are similar to those for SequenceFile; a minimal sketch follows.
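The sketch below is modeled on the Demo01 example above; the class name, the MapFile directory /tmp.map, and the lookup key 7 are illustrative, not part of the original article.

```java
package SequenceFile;

import java.io.IOException;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.MapFile;
import org.apache.hadoop.io.Text;

public class MapFileDemo {
    final static String uri = "hdfs://liguodong:8020/liguodong";
    final static String[] data = { "apache,software", "chinese,good", "james,NBA", "index,pass" };

    public static void main(String[] args) throws IOException {
        // 1. Create Configuration  2. Get FileSystem
        Configuration configuration = new Configuration();
        FileSystem fs = FileSystem.get(URI.create(uri), configuration);
        // 3. A MapFile path is a directory that will contain "data" and "index"
        Path path = new Path("/tmp.map");

        // Write: 4. create the Writer, 5. append in ascending key order, 6. close
        IntWritable key = new IntWritable();
        Text value = new Text();
        @SuppressWarnings("deprecation")
        MapFile.Writer writer = new MapFile.Writer(configuration, fs, path.toString(),
                IntWritable.class, Text.class);
        for (int i = 0; i < 30; i++) {
            key.set(i);                          // keys must increase monotonically
            value.set(data[i % data.length]);
            writer.append(key, value);
        }
        IOUtils.closeStream(writer);

        // Read: 4. create the Reader, 5. iterate or look up by key, 6. close
        @SuppressWarnings("deprecation")
        MapFile.Reader reader = new MapFile.Reader(fs, path.toString(), configuration);
        IntWritable readKey = new IntWritable();
        Text readValue = new Text();
        while (reader.next(readKey, readValue)) {
            System.out.println("key = " + readKey + ", value = " + readValue);
        }
        // Random access by key, served from the in-memory index
        reader.get(new IntWritable(7), readValue);
        System.out.println("lookup key 7 -> " + readValue);
        IOUtils.closeStream(reader);
    }
}
```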
Viewing binary files from the command line
hdfs dfs -text /liguodong/tmp.seq
Originally published at: http://blog.csdn.net/scgaliguodong123_/article/details/46391061