1. View Redis info
# After connecting to Redis, run:
info
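INFO also accepts an optional section name, which is handy when you only want one block of stats. A small sketch, assuming a server on localhost:6379:

```shell
#!/bin/sh
# Pull just one INFO section instead of the full dump
# (assumes a Redis server is running on localhost:6379)
redis-cli info memory | grep used_memory_human
redis-cli info replication | grep ^role
```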
2. Execute commands in batch
#1. Using a pipe
$ cat cmds.txt
set foo1 bar1
set foo2 bar2
set foo3 bar3
......
$ cat cmds.txt | redis-cli
OK
OK
OK
#2. Using input redirection
redis-cli < cmds.txt
#3. Batch delete (use with caution — KEYS blocks the server while it runs)
redis-cli KEYS "prefix:*" | xargs redis-cli DEL
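For large batches, redis-cli's `--pipe` mode streams the commands over the raw protocol and is much faster than plain stdin; and `--scan` iterates with cursor-based SCAN instead of the blocking KEYS. A sketch using a hypothetical `batch:` key prefix:

```shell
#!/bin/sh
# Generate a batch of SET commands (hypothetical "batch:" prefix)
for i in $(seq 1 1000); do echo "set batch:key$i val$i"; done > cmds.txt

# --pipe sends them over the RESP protocol directly (requires a running server)
cat cmds.txt | redis-cli --pipe

# Safer batch delete: --scan uses cursor-based SCAN instead of blocking KEYS
redis-cli --scan --pattern 'batch:*' | xargs -L 100 redis-cli del
```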
3. SET a multi-line string
# Use the -x option, which reads the value from standard input
$ cat str.txt
Ernest Hemingway once wrote,
"The world is a fine place and worth fighting for."
I agree with the second part.
$ redis-cli -x set foo < str.txt
OK
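Since -x takes the last argument from stdin, the value can come from any command's output, not just a file. A sketch with a hypothetical key `host:info`, assuming a local server:

```shell
#!/bin/sh
# Store another command's multi-line output as a single string value
# (hypothetical key "host:info"; assumes a local Redis server)
uname -a | redis-cli -x set host:info
redis-cli get host:info
```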
4. Repeat a command
# Run 5 times at 1s intervals and watch the QPS change
$ redis-cli -r 5 -i 1 info | grep ops
instantaneous_ops_per_sec:43469
instantaneous_ops_per_sec:47460
instantaneous_ops_per_sec:47699
instantaneous_ops_per_sec:46434
instantaneous_ops_per_sec:47216
# In interactive mode, prefix the command with a repeat count
127.0.0.1:6379> 5 ping
PONG
PONG
PONG
PONG
PONG
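`-r -1` repeats forever, which turns redis-cli into a simple poller. A sketch, assuming a local server:

```shell
#!/bin/sh
# A finite repeat works as a crude liveness probe (assumes a local server)
redis-cli -r 3 -i 1 ping
# -r -1 repeats until interrupted, making a simple poller (Ctrl-C to stop):
# redis-cli -r -1 -i 2 info clients | grep connected_clients
```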
5. Export CSV
# The --csv option
$ redis-cli rpush lfoo a b c d e f g
(integer) 7
$ redis-cli --csv lrange lfoo 0 -1
"a","b","c","d","e","f","g"
6. Run Lua scripts
#1. eval
eval "return redis.pcall('mget', KEYS[1], KEYS[2])" 2 foo1 foo2
#2. Save the script to a file and use --eval
$ cat mset.txt
return redis.pcall('mset', KEYS[1], ARGV[1], KEYS[2], ARGV[2])
$ cat mget.txt
return redis.pcall('mget', KEYS[1], KEYS[2])
$ redis-cli --eval mset.txt foo1 foo2 , bar1 bar2
OK
$ redis-cli --eval mget.txt foo1 foo2
1) "bar1"
2) "bar2"
7. Monitor server status
# redis-cli --stat prints one line per second by default; adjust the interval with -i
>redis-cli --stat
------- data ------ --------------------- load -------------------- - child -
keys       mem      clients blocked requests            connections
137376     6.56G    13      0       4318451996 (+0)     342         SAVE
137374     6.56G    13      0       4318452037 (+41)    342         SAVE
137375     6.56G    13      0       4318452198 (+161)   342         SAVE
137375     6.56G    13      0       4318452497 (+299)   342         SAVE
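redis-cli ships with other built-in monitors besides `--stat`; all of them run until interrupted, so they are shown commented out here. A sketch, assuming a local server:

```shell
#!/bin/sh
# Other built-in monitors (all run until Ctrl-C; assumes a local server):
# redis-cli --stat -i 5        # the same columns, sampled every 5s
# redis-cli --latency          # continuously sample round-trip latency
# redis-cli --latency-history  # latency aggregated in 15s windows
```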
8. Scan for big keys
# --bigkeys quickly finds the biggest key of each type in memory. Use -i to sleep between SCAN batches so the scan doesn't spike the server's OPS and trigger alerts.
redis-cli --bigkeys -i 0.01
# Scanning the entire keyspace to find biggest keys as well as
# average sizes per key type. You can use -i 0.1 to sleep 0.1 sec
# per 100 SCAN commands (not usually needed).
[00.00%] Biggest string found so far 'dataCallback:26957:idfa:46E8C4CF-2F3D-42C0-8CCF-36D96ABC31A4' with 275 bytes
[00.00%] Biggest string found so far 'dataCallback:25173:idfa:DA816D59-7CBF-4C95-886F-A5FCA821E0BC' with 284 bytes
[00.00%] Biggest string found so far 'callback:appId:channelId:idfa:27055:1:FE82819D-7970-43F3-9861-7DB252408D3B' with 504 bytes
[00.01%] Biggest hash found so far 'ip:monitor:appid:result:26292:2019-12-19' with 105 fields
[00.01%] Biggest hash found so far 'monitorReportIdfa:date:appId:2020-01-20:25319' with 1891 fields
[00.02%] Biggest hash found so far 'filterCache:1488279748' with 4647 fields
[00.02%] Biggest string found so far 'callback:appId:channelId:idfa:25173:1790:46D67B36-DF8F-42B0-9D3D-CC815E27D834' with 569 bytes
[00.03%] Biggest string found so far 'callback:appId:channelId:idfa:23391:1414:AE8561D9-A0EE-48ED-B21A-CBA74AF61891' with 586 bytes
[00.05%] Biggest string found so far 'doc:app:25855:channel:1' with 7502 bytes
[00.14%] Biggest hash found so far 'filterCache:1463027890' with 10894 fields
[00.18%] Biggest string found so far 'doc:app:26256:channel:1' with 7556 bytes
[00.18%] Biggest set found so far 'reportCache:27075' with 46 members
[00.18%] Biggest set found so far 'submitCache:458032309' with 7853 members
[00.20%] Biggest set found so far 'submitCache:1445976208' with 190628 members
[00.21%] Biggest hash found so far 'filterCache:1279715325' with 89592 fields
[00.31%] Biggest string found so far 'doc:app:25126:channel:1931' with 7574 bytes
[00.45%] Biggest set found so far 'submitCache:1445388401' with 724829 members
[00.68%] Biggest string found so far 'doc:app:25055:channel:1633' with 7580 bytes
[00.71%] Biggest hash found so far 'filterCache:333903271' with 270836 fields
[02.71%] Biggest string found so far 'doc:app:25590:channel:1789' with 7586 bytes
[03.21%] Biggest string found so far 'doc:app:25993:channel:1908' with 7634 bytes
[06.54%] Biggest string found so far 'doc:app:24357:channel:1831' with 7752 bytes
[06.83%] Biggest hash found so far 'filterCache:1448347483' with 501539 fields
[13.23%] Biggest set found so far 'submitCache:852917296' with 1650633 members
[14.85%] Biggest string found so far 'launch:118638' with 8829 bytes
[54.28%] Biggest set found so far 'submitCache:382201985' with 7215796 members
[59.54%] Biggest hash found so far 'filterCache:1008000951' with 612354 fields
[84.78%] Biggest hash found so far 'filterCache:1159233677' with 1077558 fields
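Note that `--bigkeys` ranks keys by element count, not by bytes. On Redis 4.0+ you can check a suspect key's actual memory footprint with MEMORY USAGE, and newer redis-cli versions also offer a `--memkeys` scan that samples byte sizes. A sketch with a hypothetical key name, assuming a local server:

```shell
#!/bin/sh
# Actual memory footprint of one key, in bytes (Redis 4.0+; hypothetical key)
redis-cli memory usage submitCache:382201985
# Newer redis-cli versions can scan by byte size instead of element count:
# redis-cli --memkeys
```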
9. Disable commands
# In the config file, rename the keys command to abckeysabc, or rename it to "" to remove it entirely (takes effect after a restart)
rename-command keys abckeysabc
rename-command keys ""