The test environment is unstable for various reasons, so I need to deploy a local instance.
1. Download and install
Download from:
TDengine 发布历史及下载链接 | TDengine 文档 | 涛思数据
3.0.7.1 is the last version that supports Windows; later releases dropped Windows support.
Download these two installers and install with the defaults (both install to the C: drive by default):
TDengine-server-3.0.7.0-Windows-x64.exe
TDengine-client-3.0.7.0-Windows-x64.exe
2. Edit the configuration
Add the following entry to the hosts file:
192.168.123.5 tdserver
Go to the installation directory C:\TDengine\cfg and edit the configuration file (a full copy of mine follows below). The values I set:
locale c
charset utf-8
firstEp tdserver
fqdn tdserver
logDir D:\code\TDengine\log
dataDir D:\code\TDengine\data
########################################################
# #
# Configuration #
# #
########################################################
######### 0. Client only configurations #############
# The interval for CLI to send heartbeat to mnode
# shellActivityTimer 3
############### 1. Cluster End point ############################
# The end point of the first dnode in the cluster to be connected to when this dnode or the CLI utility is started
#firstEp tdserver
# The end point of the second dnode to be connected to if the firstEp is not available
# secondEp
############### 2. Configuration Parameters of current dnode #####
# The FQDN of the host on which this dnode will be started. It can be IP address
#fqdn tdserver
# The port for external access after this dnode is started
# serverPort 6030
# The maximum number of connections a dnode can accept
# maxShellConns 5000
# The directory for writing log files, if you are using Windows platform please change to Windows path
#logDir D:\code\TDengine\log
# All data files are stored in this directory, if you are using Windows platform please change to Windows path
#dataDir D:\code\TDengine\data
# temporary file's directory, if you are using Windows platform please change to Windows path
# tempDir /tmp/
# Switch for allowing to collect and report service usage information
# telemetryReporting 1
# Switch for allowing to collect and report crash information
# crashReporting 1
# The maximum number of vnodes supported by this dnode
# supportVnodes 0
# The interval of this dnode reporting status to mnode, [1..10] seconds
# statusInterval 1
# The minimum sliding window time, milli-second
# minSlidingTime 10
# The minimum time window, milli-second
# minIntervalTime 10
# The maximum allowed query buffer size in MB during query processing for each data node
# -1 no limit (default)
# 0 no query allowed, queries are disabled
# queryBufferSize -1
# The compressed rpc message, option:
# -1 (no compression)
# 0 (all message compressed),
# > 0 (rpc message body which larger than this value will be compressed)
# compressMsgSize -1
# query retrieved column data compression option:
# -1 (no compression)
# 0 (all retrieved column data compressed),
# > 0 (any retrieved column size greater than this value all data will be compressed.)
# compressColData -1
# system time zone
# timezone UTC-8
# system time zone (for windows 10)
# timezone Asia/Shanghai (CST, +0800)
# system locale
# locale en_US.UTF-8
# system charset
# charset UTF-8
# stop writing logs when the disk size of the log folder is less than this value
# minimalLogDirGB 1.0
# stop writing temporary files when the disk size of the tmp folder is less than this value
# minimalTmpDirGB 1.0
# if free disk space is less than this value, this dnode will fail to start
# minimalDataDirGB 2.0
# enable/disable system monitor
# monitor 1
# The following parameter is used to limit the maximum number of lines in log files.
# max number of lines per log filters
# numOfLogLines 10000000
# write log in async way: 1 - async, 0 - sync
# asyncLog 1
# time period of keeping log files, in days
# logKeepDays 0
############ 3. Debug Flag and levels #############################################
# The following parameters are used for debug purpose only by this dnode.
# debugFlag is a 8 bits mask: FILE-SCREEN-UNUSED-HeartBeat-DUMP-TRACE_WARN-ERROR
# Available debug levels are:
# 131: output warning and error
# 135: output debug, warning and error
# 143: output trace, debug, warning and error to log
# 199: output debug, warning and error to both screen and file
# 207: output trace, debug, warning and error to both screen and file
# debug flag for all log type, take effect when non-zero value
# debugFlag 0
# debug flag for timer
# tmrDebugFlag 131
# debug flag for util
# uDebugFlag 131
# debug flag for rpc
# rpcDebugFlag 131
# debug flag for jni
# jniDebugFlag 131
# debug flag for query
# qDebugFlag 131
# debug flag for client driver
# cDebugFlag 131
# debug flag for dnode messages
# dDebugFlag 135
# debug flag for vnode
# vDebugFlag 131
# debug flag for meta management messages
# mDebugFlag 135
# debug flag for wal
# wDebugFlag 135
# debug flag for sync module
# sDebugFlag 135
# debug flag for tsdb
# tsdbDebugFlag 131
# debug flag for tq
# tqDebugFlag 131
# debug flag for fs
# fsDebugFlag 131
# debug flag for udf
# udfDebugFlag 131
# debug flag for sma
# smaDebugFlag 131
# debug flag for index
# idxDebugFlag 131
# debug flag for tdb
# tdbDebugFlag 131
# debug flag for meta
# metaDebugFlag 131
# generate core file when service crash
# enableCoreFile 1
The parameters to modify:
firstEp, fqdn, logDir, dataDir
3. Run and change the password
In the C:\TDengine directory:
Open a cmd window and run taosd.exe under C:\TDengine to start the server.
Open another cmd window and run taosadapter.exe under C:\TDengine; it is required for the DBeaver connection.
In a third cmd window (if the taos command hangs, close the taosd.exe cmd window and start it again), run:
taos
alter user root pass 'your password';
For other user management commands, see the TDengine docs.
4. Connect to TDengine with DBeaver
Enter root and your password.
DBeaver's support for TDengine is limited; it cannot really display the relationship between super tables and sub tables.
Use:
TDengine Java Connector | TDengine 文档 | 涛思数据
TDengine component versions are incompatible with each other in various ways, so read the documentation carefully.
<dependency>
    <groupId>com.taosdata.jdbc</groupId>
    <artifactId>taos-jdbcdriver</artifactId>
    <version>3.2.1</version>
</dependency>
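With taosd and taosAdapter running as in step 3, a minimal connection sketch over the REST (taosAdapter) endpoint might look like this. The host `tdserver` comes from the hosts entry above; the default taosAdapter port 6041 and the URL helper are assumptions of this sketch, and the actual `open` call requires taos-jdbcdriver on the classpath plus a running server:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;

public class TdRestConnect {
    // Build a REST (taosAdapter) JDBC URL; taosAdapter listens on 6041 by default
    static String restUrl(String host, int port, String db) {
        return "jdbc:TAOS-RS://" + host + ":" + port + "/" + db;
    }

    // Opens a connection; needs taos-jdbcdriver on the classpath and a running taosAdapter
    static Connection open(String host, String user, String password) throws SQLException {
        return DriverManager.getConnection(
                restUrl(host, 6041, "") + "?user=" + user + "&password=" + password);
    }
}
```

The REST URL avoids the native client library, which is convenient when only taosAdapter is exposed; the native form (`jdbc:TAOS://host:6030/...`) needs the TDengine client installed.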
TDengine/examples/JDBC at 3.0 · taosdata/TDengine · GitHub
1. Export
You can export all the data of a super table to a CSV file with DBeaver.
But! DBeaver does not support importing into super tables, only into ordinary tables and sub tables.
1. Exporting a super table with DBeaver: just keep clicking Next and change the output location if you like; the default encoding is UTF-8.
2. Check the documentation:
this only works for ordinary tables and sub tables.
Then check further:
it only supports Linux.
Fine, I'll write the code myself.
Java dependency for CSV:
<dependency>
    <groupId>com.opencsv</groupId>
    <artifactId>opencsv</artifactId>
    <version>5.6</version>
</dependency>
@CsvBindByName(column = "ts") specifies which CSV header a field binds to.
@PostMapping("/importData")
public String importData(MultipartFile file) throws IOException {
    // SuperMetricImportVO fields are annotated with @CsvBindByName
    try (InputStreamReader reader = new InputStreamReader(file.getInputStream(), StandardCharsets.UTF_8)) {
        CsvToBean<SuperMetricImportVO> csvToBean = new CsvToBeanBuilder<SuperMetricImportVO>(reader)
                .withType(SuperMetricImportVO.class)
                .build();
        List<SuperMetricImportVO> list = csvToBean.parse();
        // Create the super table manually in advance; create sub tables dynamically from the parsed tags
        // then insert the rows
        return "success!";
    }
}
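The "create sub tables dynamically from the parsed tags" step in the comment above boils down to building a `CREATE TABLE ... USING ... TAGS (...)` statement per distinct tag set and executing it before the inserts. A minimal sketch of the SQL building (the table names are hypothetical; the tag literals must already be formatted and in the super table's tag order):

```java
import java.util.List;

public class SubTableSqlBuilder {
    // Sketch: build a statement that creates a sub table of an existing super table.
    // tagLiterals are pre-formatted SQL literals, e.g. "'SanFrancisco'" or "2".
    static String createSubTableSql(String superTable, String subTable, List<String> tagLiterals) {
        return "CREATE TABLE IF NOT EXISTS " + subTable
                + " USING " + superTable
                + " TAGS (" + String.join(", ", tagLiterals) + ")";
    }
}
```

`IF NOT EXISTS` makes the statement safe to run once per CSV row without first checking whether the sub table exists.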
For the import itself you can refer to the code above.
Import rate: about 30,000 rows/s.
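Getting anywhere near that rate requires writing many rows per statement rather than one INSERT per row; TDengine accepts multiple value tuples in a single INSERT. A sketch of building such a statement (table and values are hypothetical; in real code the batch size would be capped to stay under the SQL length limit):

```java
import java.util.List;

public class BatchInsertSql {
    // Sketch: build one multi-row INSERT for a sub table. Each element of rows is a
    // pre-formatted tuple body such as "now, 1.0"; TDengine separates tuples with spaces.
    static String multiRowInsert(String table, List<String> rows) {
        StringBuilder sb = new StringBuilder("INSERT INTO ").append(table).append(" VALUES");
        for (String row : rows) {
            sb.append(" (").append(row).append(")");
        }
        return sb.toString();
    }
}
```

The resulting string can be executed with a plain `Statement`; for even higher throughput the connector also offers parameter binding, which the docs describe.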