Why Hadoop Performs Poorly on Small Amounts of Data
As a rule of thumb, each file, directory, and block takes about 150 bytes. So, for example,
if you had one million files, each taking one block, you would need at least
300 MB of memory.
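The arithmetic behind that figure can be sketched as follows. This is a rough back-of-the-envelope estimate, not an exact NameNode accounting: the 150-byte figure is the rule of thumb quoted above, and the helper function name is illustrative, not a real Hadoop API.

```python
# Rough estimate of HDFS NameNode memory used for namespace metadata,
# assuming ~150 bytes per in-memory object (file, directory, or block).
BYTES_PER_OBJECT = 150

def namenode_memory_bytes(num_files, blocks_per_file=1, num_dirs=0):
    """Each file, each directory, and each block is a separate object."""
    num_objects = num_files + num_dirs + num_files * blocks_per_file
    return num_objects * BYTES_PER_OBJECT

# One million files, one block each: 2,000,000 objects * 150 B = 300 MB.
print(namenode_memory_bytes(1_000_000) / 1_000_000)  # → 300.0 (MB)
```

This also shows why many small files hurt: the memory cost scales with the number of objects, not the amount of data, so a million 1 KB files consume the same NameNode memory as a million files that each fill a full block.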