Use case: the business system uses a ClickHouse database as its big-data warehouse. Since going into production it has accumulated 700 GB of raw and cleaned data, which urgently needs to be backed up.
Download and install clickhouse-backup
[root@clickhouse02 ~]# wget https://github.com/AlexAkulov/clickhouse-backup/releases/download/v2.1.3/clickhouse-backup-2.1.3-1.x86_64.rpm
[root@clickhouse02 ~]# rpm -ivh clickhouse-backup-2.1.3-1.x86_64.rpm
After installation the configuration files live in /etc/clickhouse-backup; copy the example config out and edit it:
[root@clickhouse02 ~]# cp /etc/clickhouse-backup/config.yml.example /etc/clickhouse-backup/config.yml
[root@clickhouse02 ~]# vim /etc/clickhouse-backup/config.yml
general:
  remote_storage: none
  max_file_size: 107374182400 # this value needs to be set
  disable_progress_bar: true
  backups_to_keep_local: 0
  backups_to_keep_remote: 0
  log_level: info
  allow_empty_backups: false
  download_concurrency: 1
  upload_concurrency: 1
  use_resumable_state: false
  restore_schema_on_cluster: ""
  upload_by_part: true
  download_by_part: true
  restore_database_mapping: {}
  retries_on_failure: 3
  upload_retries_pause: 30s
  watch_interval: 1h
  full_interval: 24h
  watch_backup_name_template: shard{shard}-{type}-{time:20060102150405}
  retriesduration: 100ms
  watchduration: 1h0m0s
  fullduration: 24h0m0s
clickhouse:
  username: default
  password: "********" # set the ClickHouse password here
  host: localhost # clickhouse-backup is installed on the ClickHouse server itself, so the default is fine
  port: 9000
  disk_mapping: {}
  skip_tables:
.
.
.
After editing, save and exit with :wq.
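For reference, the max_file_size value set above is exactly 100 GiB; the arithmetic can be checked directly in the shell:

```shell
# 107374182400 bytes = 100 * 1024^3, i.e. 100 GiB
echo $((100 * 1024 * 1024 * 1024))   # prints 107374182400
```

Adjust this limit to suit your own table sizes before running a backup.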
Using clickhouse-backup
Here I hit an error when creating the backup: no such file or directory.
[root@clickhouse02 ~]# clickhouse-backup create # full backup of the database
2023/03/14 14:48:09 Freeze 'default.test_iot_water_law_original'
2023/03/14 14:48:09 Freeze 'default.test_iot_water_law_stat_data'
2023/03/14 14:48:09 Freeze 'default.test_mv'
2023/03/14 14:48:09 Freeze 'default.test_report_revenue_price_type'
2023/03/14 14:48:09 Freeze 'default.unsettled_dwd_revenue_dim_pay_off_time'
2023/03/14 14:48:09 Skip 'system.asynchronous_metric_log'
2023/03/14 14:48:09 Skip 'system.metric_log'
2023/03/14 14:48:09 Skip 'system.query_log'
2023/03/14 14:48:09 Skip 'system.query_thread_log'
2023/03/14 14:48:09 Skip 'system.trace_log'
2023/03/14 14:48:09 Copy metadata
2023/03/14 14:48:09 Done.
2023/03/14 14:48:09 Move shadow
2023/03/14 14:48:09 open /var/lib/clickhouse/store/shadow: no such file or directory
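For context on what the failing step does: the Freeze lines above come from `ALTER TABLE ... FREEZE`, which hard-links each data part into a shadow/ directory under the ClickHouse data path, and the "Move shadow" step is clickhouse-backup moving those links into its own backup folder. That freeze-then-move pattern can be sketched with ordinary files (temporary paths below are hypothetical stand-ins, not the real /var/lib/clickhouse layout):

```shell
tmp=$(mktemp -d)
mkdir -p "$tmp/data" "$tmp/shadow" "$tmp/backup"
echo "part contents" > "$tmp/data/part.bin"
# FREEZE step: hard-link the part into shadow/ -- no data is copied
ln "$tmp/data/part.bin" "$tmp/shadow/part.bin"
# clickhouse-backup step: move the frozen link into the backup folder
mv "$tmp/shadow/part.bin" "$tmp/backup/part.bin"
# the live part is untouched; the backup shares the same on-disk blocks
cat "$tmp/backup/part.bin"   # prints "part contents"
```

Because the whole chain runs inside the ClickHouse data directory, the process running clickhouse-backup must be able to traverse and write those paths, which is why the error above points at a permissions/path problem rather than missing data.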
I racked my brain and flipped through the company's reference books without finding anything wrong. It later occurred to me that this might be a user-permission issue, so I used sudo to give clickhouse-backup the root privileges it needed. After running the following command, the backup completed normally:
[root@clickhouse02 ~]# sudo clickhouse-backup create
.
.
.
2023/03/14 11:01:48.644651 info done backup=2023-03-14T03-01-34 logger=backuper operation=create table=default.app_customer_address
2023/03/14 11:01:48.644805 info done backup=2023-03-14T03-01-34 logger=backuper operation=create table=default.app_configuration_all
2023/03/14 11:01:48.644853 info ALTER TABLE `default`.`app_configuration` FREEZE WITH NAME 'f279b90643cd4048b5333e6fe6831a09'; logger=clickhouse
2023/03/14 11:01:48.648322 info done backup=2023-03-14T03-01-34 logger=backuper operation=create table=default.app_configuration
2023/03/14 11:01:48.648519 info done backup=2023-03-14T03-01-34 logger=backuper operation=create table=default.app_business_area_meter_all
2023/03/14 11:01:48.648568 info ALTER TABLE `default`.`app_business_area_meter` FREEZE WITH NAME '694ff1cb65d747c39a07bfdb6f55fb9c'; logger=clickhouse
2023/03/14 11:01:48.653602 info done backup=2023-03-14T03-01-34 logger=backuper operation=create table=default.app_business_area_meter
2023/03/14 11:01:48.653669 info SELECT value FROM `system`.`build_options` where name='VERSION_DESCRIBE' logger=clickhouse
2023/03/14 11:01:48.658686 info done backup=2023-03-14T03-01-34 duration=14.095s logger=backuper operation=create
2023/03/14 11:01:48.658753 info clickhouse connection closed logger=clickhouse
Problem solved.
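With local backups working, the same tool can also push them to remote storage. A hedged sketch, based on the s3 section in the config.yml.example shipped with clickhouse-backup v2.1.x (the bucket name, keys, and endpoint below are placeholders you must replace):

```yaml
# in the general section, change:
#   remote_storage: s3    # instead of "none"
s3:
  access_key: "YOUR_ACCESS_KEY"   # placeholder
  secret_key: "YOUR_SECRET_KEY"   # placeholder
  bucket: "clickhouse-backups"    # placeholder bucket name
  endpoint: ""                    # leave empty for AWS; set for MinIO and similar
  region: us-east-1
  path: "backup"
```

With that in place, clickhouse-backup create_remote creates a local backup and uploads it in one step, clickhouse-backup list remote shows what is stored remotely, and backups_to_keep_remote in the general section controls rotation.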