1. File Quotas
HDFS quota configuration allows us to limit either the number of files or the total size of the data stored under a directory. This works much like cloud-storage services such as Baidu Netdisk, which cap the amount each user can upload.
1.1 Viewing Quota Information
```shell
hdfs dfs -count -q -h <path>
```
Example:
```shell
[hadoop@hadoop181 ~]$ hdfs dfs -count -q -h /data/client
none inf none inf 1 0 0 /data/client
[hadoop@hadoop181 ~]$
```
Output field descriptions:

| arg1 | arg2 | arg3 | arg4 | arg5 | arg6 | arg7 | arg8 |
|---|---|---|---|---|---|---|---|
| none | inf | none | inf | 1 | 0 | 0 | /data/client |
| file-count quota | remaining file-count quota | space (size) quota | remaining space quota | directory count | file count | content size | path |
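When the fields are needed programmatically, the eight columns can be split with `awk`. A minimal sketch that parses a sample output line (the values are the ones shown above) rather than querying a live cluster:

```shell
#!/bin/sh
# Parse one line of `hdfs dfs -count -q` output into named fields.
# A sample line is used here instead of a live HDFS cluster.
line="none inf none inf 1 0 0 /data/client"
echo "$line" | awk '{
  printf "file-count quota:     %s\n", $1
  printf "remaining file quota: %s\n", $2
  printf "space quota:          %s\n", $3
  printf "remaining space:      %s\n", $4
  printf "dirs=%s files=%s bytes=%s path=%s\n", $5, $6, $7, $8
}'
```

The same one-liner works on real output by piping `hdfs dfs -count -q <path>` into the `awk` program.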
1.2 Setting a File-Count Quota
(1) Create a demo directory
```shell
[hadoop@hadoop181 ~]$ hdfs dfs -mkdir -p /data/shell
```
(2) Set the file-count quota
```shell
[hadoop@hadoop181 ~]$ hdfs dfsadmin -setQuota 5 /data/shell
```
(3) Generate demo files
```shell
[hadoop@hadoop181 ~]$ echo "1">a.txt
[hadoop@hadoop181 ~]$ echo "2">b.txt
[hadoop@hadoop181 ~]$ echo "3">c.txt
[hadoop@hadoop181 ~]$ echo "4">d.txt
[hadoop@hadoop181 ~]$ echo "5">e.txt
```
(4) Upload the files

With a quota of 5, the directory itself counts as one inode, so only 4 files can actually be uploaded; the fifth upload fails, as shown below:
```shell
[hadoop@hadoop181 ~]$ hdfs dfs -put a.txt /data/shell/
2020-09-11 09:20:55,330 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
[hadoop@hadoop181 ~]$ hdfs dfs -put b.txt /data/shell/
2020-09-11 09:21:03,631 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
[hadoop@hadoop181 ~]$ hdfs dfs -put c.txt /data/shell/
2020-09-11 09:21:10,985 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
[hadoop@hadoop181 ~]$ hdfs dfs -put d.txt /data/shell/
2020-09-11 09:21:16,935 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
[hadoop@hadoop181 ~]$ hdfs dfs -put e.txt /data/shell/
put: The NameSpace quota (directories and files) of directory /data/shell is exceeded: quota=5 file count=6
```
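The failure above follows directly from how the namespace quota is counted: every inode under the directory, including the directory itself, counts toward the quota, so the number of usable file slots is the quota minus one. A minimal arithmetic sketch:

```shell
#!/bin/sh
# A namespace quota of 5 covers the directory inode plus its files,
# so a quota of 5 leaves room for only 4 files.
quota=5
dir_count=1   # the directory itself counts toward the quota
max_files=$((quota - dir_count))
echo "usable file slots: $max_files"   # prints 4
```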
Check the quota information on the directory:

```shell
[hadoop@hadoop181 ~]$ hdfs dfs -count -q -h /data/shell
```
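Beyond the file-count quota used above, `hdfs dfsadmin` also provides commands to set a space quota and to clear quotas. These operate on a live cluster, so the snippet below is for reference only:

```shell
# Limit the total content size under the directory. Note that the space
# quota counts replicated bytes, so 1 GB with replication factor 3
# allows roughly 341 MB of actual data.
hdfs dfsadmin -setSpaceQuota 1g /data/shell

# Remove the file-count quota set earlier.
hdfs dfsadmin -clrQuota /data/shell

# Remove the space quota.
hdfs dfsadmin -clrSpaceQuota /data/shell
```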