DataFlux is a unified big-data analytics platform developed in-house by Shanghai Zhuyun. It monitors, analyzes, and processes real-time data from any source, of any type, and at any scale, helping unlock the value of data.
DataFlux consists of five functional modules:
- DataKit, the collector
- DataWay, the data gateway
- DataFlux Studio, the real-time data insight platform
- DataFlux Admin Console, the administration console
- DataFlux.f(x), the real-time data processing and development platform
Together they give enterprises data-insight and analytics capabilities across all scenarios, and they are real-time, flexible, easy to scale, and easy to deploy.
Installing DataKit
Note: the following uses Linux as an example.
Step 1: run the installation command
DataKit installation command:
DK_FTDATAWAY=[your DataWay gateway address] bash -c "$(curl https://static.dataflux.cn/datakit/install.sh)"
Fill in your DataWay gateway address in the command, then copy the command to the target host and run it.
For example, if your DataWay gateway IP is 1.2.3.4 and the port is 9528 (the default), the gateway address is
http://1.2.3.4:9528/v1/write/metrics, and the installation command is:
DK_FTDATAWAY=http://1.2.3.4:9528/v1/write/metrics bash -c "$(curl https://static.dataflux.cn/datakit/install.sh)"
Once installation completes, DataKit starts automatically by default, and the terminal prints DataKit's status-management commands.
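To confirm the collector is up, a minimal sketch of a status check, assuming DataKit was registered as a systemd service named datakit (verify against the commands the installer actually prints on your host):
# Check that the DataKit service is running (systemd and the service name "datakit" are assumptions)
systemctl status datakit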
Docker Metric Collection
Collect Docker metrics and report them to DataFlux.
- DataKit is installed (see the DataKit installation documentation)
Open the DataKit input configuration folder (by default, the conf.d folder under the DataKit installation directory), find the docker folder, and open the docker.conf file inside it.
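For reference, a minimal sketch of opening the file from a shell, where <DataKit install dir> is a placeholder for your actual installation path:
# Edit the Docker input configuration (<DataKit install dir> is a placeholder, not a real path)
vi <DataKit install dir>/conf.d/docker/docker.conf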
Set the following:
# Read metrics about docker containers
[[inputs.docker]]
## Docker Endpoint
## To use TCP, set endpoint = "tcp://[ip]:[port]"
## To use environment variables (ie, docker-machine), set endpoint = "ENV"
endpoint = "unix:///var/run/docker.sock"
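## Note: with the unix socket endpoint above, the account running DataKit needs
## read access to /var/run/docker.sock (typically root or a member of the docker group).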
## Set to true to collect Swarm metrics (desired_replicas, running_replicas)
## Note: configure this in one of the manager nodes in a Swarm cluster.
## configuring in multiple Swarm managers results in duplication of metrics.
gather_services = false
## Only collect metrics for these containers. Values will be appended to
## container_name_include.
## Deprecated (1.4.0), use container_name_include
container_names = []
## Set the source tag for the metrics to the container ID hostname, eg first 12 chars
source_tag = false
## Containers to include and exclude. Collect all if empty. Globs accepted.
container_name_include = []
container_name_exclude = []
## Container states to include and exclude. Globs accepted.
## When empty only containers in the "running" state will be captured.
## example: container_state_include = ["created", "restarting", "running", "removing", "paused", "exited", "dead"]
## example: container_state_exclude = ["created", "restarting", "running", "removing", "paused", "exited", "dead"]
# container_state_include = []
# container_state_exclude = []
## Timeout for docker list, info, and stats commands
timeout = "5s"
## Whether to report for each container per-device blkio (8:0, 8:1...) and
## network (eth0, eth1, ...) stats or not
perdevice = true
## Whether to report for each container total blkio and network stats or not
total = false
## docker labels to include and exclude as tags. Globs accepted.
## Note that an empty array for both will include all labels as tags
docker_label_include = []
docker_label_exclude = []
## Which environment variables should we use as a tag
tag_env = ["JAVA_HOME", "HEAP_SIZE"]
## Optional TLS Config
# tls_ca = "/etc/telegraf/ca.pem"
# tls_cert = "/etc/telegraf/cert.pem"
# tls_key = "/etc/telegraf/key.pem"
## Use TLS but skip chain & host verification
# insecure_skip_verify = false
After the configuration is saved, restart DataKit for the changes to take effect.
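For example, assuming DataKit runs as a systemd service named datakit (as on most Linux installs), the restart might look like:
# Restart DataKit so the updated docker.conf is picked up (service name is an assumption)
systemctl restart datakit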
Verifying Data Reporting
After setting up collection, verify that the data is actually being collected and reported to DataWay so that it can be analyzed and visualized later.
Steps: log in to DataFlux, go to Data Management > Metric Browsing, and check whether the data has been collected.
Docker metrics:
Using DataFlux for Data Insight
Design data-insight views based on the collected metrics, for example:
Docker monitoring view
Built on its in-house DataKit (collector), DataFlux can already integrate with more than 200 data protocols, covering cloud data collection, application data collection, log collection, time-series data reporting, and data aggregation from common databases, helping enterprises achieve unified IT monitoring in the most convenient way.