You can check the storage size of HDFS, Hive, and HBase files with:
hadoop fs -count <path>
or
hdfs dfs -count /51JOB
[root@chinadaas01 ~]# hadoop fs -count /hbase1/zzjg_detail
The output columns are: directory count, file count, total bytes of the files under the directory, and the path.
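As a sketch of what the `-count` output looks like and how to pick out a single column, here is a made-up output line (the counts and path are illustrative, not from a real cluster) piped through awk to extract just the byte total:

```shell
# `hadoop fs -count` prints: DIR_COUNT FILE_COUNT CONTENT_SIZE PATHNAME
# Simulated output line; awk pulls out the third field (total bytes)
printf '12 340 1073741824 /hbase1/zzjg_detail\n' | awk '{ print $3 }'
# prints 1073741824
```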
To see the total size of each item under a directory:
hdfs dfs -du -h /user/hive/warehouse/ste_qyjs.db/
This lists the total size of every table in the database, displayed in human-readable units (G or K).
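A common follow-up is finding the largest tables. One sketch, assuming you use the byte-valued `-du` output (without `-h`) so plain numeric sorting works; the sizes and paths below are simulated for illustration:

```shell
# Sort per-table sizes (bytes) largest first and keep the top entries.
# In practice the printf would be: hdfs dfs -du /user/hive/warehouse/ste_qyjs.db/
printf '1181116006 /db/t1\n358612 /db/t2\n99 /db/t3\n' | sort -n -r | head -2
# prints the two largest entries, /db/t1 then /db/t2
```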
Or go straight to the third command:
[bigdata@task1-sandbox tfb]$ hadoop fs -du -s -h /user/hive/warehouse/dev_treasury.db
1.1 T 3.3 T /user/hive/warehouse/dev_treasury.db
Here the first column (1.1 T) is the raw data size, and the second (3.3 T) is the disk space consumed including all replicas.
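Since the second column includes replicas, dividing it by the first recovers the effective replication factor. A minimal sketch with simulated byte-valued `-du -s` output (1 GiB raw, 3 GiB replicated, hypothetical numbers):

```shell
# Ratio of replicated size to raw size = effective replication factor
printf '1073741824 3221225472 /user/hive/warehouse/dev_treasury.db\n' \
  | awk '{ print $2 / $1 }'
# prints 3
```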
To get the total size of a partitioned table:
hadoop fs -du /user/hive/warehouse/dev_treasury.db/ods_common_users|awk '{ SUM += $1 } END { print SUM/(1024*1024*1024)}'
awk runs the END block only after it has processed every input line, so the command above sums the sizes of all partitions (and any non-partitioned files) under the table directory and prints the total in GB.
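The aggregation step can be tried locally by feeding awk the same shape of input that `hadoop fs -du` produces. The two partition lines below are simulated (1 GiB and 2 GiB, made-up paths):

```shell
# Sum column 1 (bytes) across all lines, then convert to GB in the END block
printf '1073741824 3221225472 /warehouse/t/dt=2020-01-01\n2147483648 6442450944 /warehouse/t/dt=2020-01-02\n' \
  | awk '{ SUM += $1 } END { print SUM/(1024*1024*1024) }'
# prints 3
```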