Use case:
pako can compress and decompress strings or Uint8Array data, trading some read/write performance for much lower storage and transfer size.
On JSON-like text the compression ratio is often around 8-10x, so the space saved is considerable. Compression takes a noticeable amount of time, but decompression is quite fast.
Install:
npm install pako
Options:
Additional options, for internal needs:
chunkSize - size of generated data chunks (16K by default)
raw (Boolean) - do raw deflate
gzip (Boolean) - create gzip wrapper
to (String) - if equal to 'string', then result will be "binary string" (each char code [0..255])
header (Object) - custom header for gzip, with the following fields:
    text (Boolean) - true if compressed data believed to be text
    time (Number) - modification time, unix timestamp
    os (Number) - operating system code
    extra (Array) - array of bytes with extra data (max 65536)
    name (String) - file name (binary string)
    comment (String) - comment (binary string)
    hcrc (Boolean) - true if header crc should be added
Usage:
var fs = require('fs');
var pako = require('pako');

// Compress a JSON (or any UTF-8 text) file into a binary .bin file.
var encodeJson = function (sourceFile = 'res/all.json', targetFile = 'res/all.bin') {
    let contents = fs.readFileSync(sourceFile, { encoding: 'utf8' });
    // pako.deflate accepts a string directly (encoded as UTF-8) and returns a Uint8Array.
    // Note: no JSON.stringify here -- contents is already JSON text.
    let compressed = pako.deflate(contents);
    fs.writeFileSync(targetFile, Buffer.from(compressed));
};

// Decompress the .bin file back into the original JSON text.
var decodeJson = function (sourceFile = 'res/all.bin', targetFile = 'res/all1.json') {
    // Read raw bytes (a Buffer), not a 'binary'-encoded string.
    let compressed = fs.readFileSync(sourceFile);
    let contents = pako.inflate(compressed, { to: 'string' });
    fs.writeFileSync(targetFile, contents);
};

// encodeJson();
decodeJson();