This topic describes the data consumption format definitions and examples for the real-time data subscription feature.
Prerequisites
You have completed the client-side configuration (Kafka / Prometheus monitoring system / API endpoint), public network connectivity is working, and the inbound ports on the client server allow access.
Note: This topic uses Kafka data consumption as an example. The basic steps are as follows:
Install the Kafka client and update the relevant configuration.
Configure the client server's network to allow inbound access.
Start the Kafka client program.
In the cloud monitoring console, create a data subscription task, filling in the client's Kafka configuration.
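The steps above can be sketched in Java. The broker address, group id, and topic name below are hypothetical placeholders, and the commented-out poll loop assumes the kafka-clients library is on the classpath:

```java
import java.util.Properties;

public class SubscriptionConsumerConfig {
    // Build the consumer configuration that matches what you fill in when
    // creating the subscription task. Broker address, group id, and topic
    // below are placeholders, not values from this document.
    static Properties buildConsumerProps(String bootstrapServers) {
        Properties props = new Properties();
        props.put("bootstrap.servers", bootstrapServers);
        props.put("group.id", "cloud-monitor-subscription");
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("auto.offset.reset", "earliest");
        return props;
    }

    public static void main(String[] args) {
        Properties props = buildConsumerProps("192.0.2.10:9092"); // placeholder address
        System.out.println(props.getProperty("group.id"));
        // With the kafka-clients dependency on the classpath, a poll loop would look like:
        // try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
        //     consumer.subscribe(java.util.List.of("monitor-metrics")); // placeholder topic
        //     while (true) {
        //         for (ConsumerRecord<String, String> r : consumer.poll(Duration.ofSeconds(1))) {
        //             System.out.println(r.value());
        //         }
        //     }
        // }
    }
}
```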
Data consumption definitions
Metric data format (Kafka message queue)
Alarm data is not compressed and can be consumed directly.
Metric data written to Kafka by vminsert is compressed with zstd, and a single message may contain multiple metrics. After consuming a message, decompress it first; the decompressed data looks like this:
{
"metrics":[{
"tags":{
"__report_by__":"harvest",
"cpu":"cpu-total",
"host":"test-2dd4bfrv",
"idc":"neimengaz03",
"job":"virtual_machine",
"__name__":"cpu_util",
"uuid":"test-8194e630-fec1"
},
"fields":{
"cpu_util":0.26697814117055
},
"name":"prometheus_remote_write",
"timestamp":1713482161
}]
}
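A minimal sketch of handling such a message: decompression is typically done with a zstd binding such as zstd-jni (shown only as a comment, since it is a third-party dependency and an assumption here). For a self-contained illustration, the code below pulls one numeric field out of the already-decompressed JSON with a naive string scan; a real consumer should use a JSON library such as Jackson instead:

```java
public class MetricMessageSketch {
    // Extract the numeric value of a field like "cpu_util":0.26697814117055
    // from the decompressed JSON. This naive scan is for illustration only;
    // use a real JSON parser (e.g. Jackson) in production code.
    static double extractField(String json, String field) {
        String key = "\"" + field + "\":";
        int start = json.indexOf(key);
        if (start < 0) throw new IllegalArgumentException("field not found: " + field);
        start += key.length();
        int end = start;
        while (end < json.length() && "0123456789.-eE".indexOf(json.charAt(end)) >= 0) end++;
        return Double.parseDouble(json.substring(start, end));
    }

    public static void main(String[] args) {
        // In a real consumer the bytes come from Kafka and are decompressed first,
        // e.g. with zstd-jni: byte[] plain = Zstd.decompress(messageBytes, originalSize);
        String decompressed =
            "{\"metrics\":[{\"fields\":{\"cpu_util\":0.26697814117055},"
            + "\"name\":\"prometheus_remote_write\",\"timestamp\":1713482161}]}";
        System.out.println(extractField(decompressed, "cpu_util")); // prints the cpu_util value
    }
}
```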
Metric data format (remote-write API)
{
"metric":{
"__name__":"cpu_util",
"__report_by__":"harvest",
"cpu":"cpu-total",
"from":"subscription_translate",
"host":"ecm-dd23",
"idc":"neimengaz03",
"job":"virtual_machine",
"region_id":"test",
"uuid":"test-8194e630-fec1"
},
"value":[1743406897.472,"0.2334889928784665"]
}
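In this format, value is a two-element array: a floating-point epoch timestamp in seconds and the sample value as a string. A minimal sketch of decoding that pair with the standard library (the pair is taken from the sample above):

```java
import java.time.Instant;

public class RemoteWriteValue {
    // Convert the remote-write "value" pair [epochSeconds, "sampleValue"]
    // into a typed timestamp and a double.
    static Instant toInstant(double epochSeconds) {
        long seconds = (long) epochSeconds;
        long nanos = Math.round((epochSeconds - seconds) * 1_000_000_000L);
        return Instant.ofEpochSecond(seconds, nanos);
    }

    public static void main(String[] args) {
        // Pair taken from the sample payload above.
        double ts = 1743406897.472;
        String raw = "0.2334889928784665";
        Instant when = toInstant(ts);
        double value = Double.parseDouble(raw);
        System.out.println(when + " cpu_util=" + value);
    }
}
```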
Alarm data format
[
{
"service":"cstor_sfs",
"dimension":"oceanfs",
"region_id":"test",
"idc":"neimengaz03",
"key":"fd8ef95badd54206bc243cd4fe6da114",
"model_id":"405f3438-fba8-52d1-a7e5-8e67f26f92b6",
"issue_id":"67b0aead8d4ae0276248dd21",
"info_id":"67b0aead8d4ae0276248dd22",
"name":"资源分组海量文件规则",
"alarm_type":"series",
"status":0,
"ctime":1739632301,
"value":104857600,
"resource":[{
"name":"uuid",
"value":"56m0a9zta02dkxx7"
},{
"name":"instancename",
"value":"oceanfs-ac40"
},{
"name":"console_resource_id",
"value":"56m0a9zta02dkxx7"
}],
"metric":"fs_capacity_total",
"alarm_name":"资源分组海量文件规则",
"threshold":"0",
"operator":"ge",
"unit":"MB"
},
...
{
"service":"cstor_sfs",
"dimension":"oceanfs",
"region_id":"81f7728662dd11ec810800155d307d5b",
"idc":"neimengaz03",
"key":"fd8ef95badd54206bc243cd4fe6da114",
"model_id":"405f3438-fba8-52d1-a7e5-8e67f26f92b6",
"issue_id":"67b0aead8d4ae0276248dd21",
"info_id":"67b0aead8d4ae0276248dd22",
"name":"资源分组海量文件规则",
"alarm_type":"series",
"status":0,
"ctime":1739632301,
"value":104857600,
"resource":[{
"name":"uuid",
"value":"56m0a9zta02dkxx7"
},{
"name":"instancename",
"value":"oceanfs-ac40"
},{
"name":"console_resource_id",
"value":"56m0a9zta02dkxx7"
}],
"metric":"fs_capacity_total",
"alarm_name":"资源分组海量文件规则",
"threshold":"0",
"operator":"ge",
"unit":"MB"
}
]
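Each alarm record carries the measured value together with operator and threshold, so a consumer can re-evaluate the condition locally. A small sketch, assuming the operator codes work as their names suggest (only ge appears in the sample above; gt, le, lt, and eq are assumptions here):

```java
public class AlarmCondition {
    // Re-evaluate an alarm condition from the subscribed payload.
    // "ge" appears in the sample; the other operator codes are assumed.
    static boolean matches(double value, String operator, double threshold) {
        switch (operator) {
            case "ge": return value >= threshold;
            case "gt": return value > threshold;
            case "le": return value <= threshold;
            case "lt": return value < threshold;
            case "eq": return value == threshold;
            default: throw new IllegalArgumentException("unknown operator: " + operator);
        }
    }

    public static void main(String[] args) {
        // From the sample alarm above: value=104857600, operator="ge", threshold="0".
        double value = 104857600;
        double threshold = Double.parseDouble("0"); // threshold is delivered as a string
        System.out.println(matches(value, "ge", threshold)); // prints true
    }
}
```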
Event data format
[
{
"specversion":"1.0",
"id":"test-8194e630-fec1",
"source":"ctyun.site_monitor",
"type":"site_monitor:response_timeout",
"subject":"ctyun.site_monitor:fdda0184-2211-40dd-8098-02bd4e1f0452:response_timeout",
"datacontenttype":"application/json",
"time":"2025-01-16T14:06:52.368536895Z",
"data":{
"taskID":"fdda0184-2211-40dd-8098-02bd4e1f0452",
"pointID":"1e138ae8-b991-4c25-a139-b0ad3ac2f22a",
"targetAddr":"203.83.233.26",
"targetIP":"",
"dnsServer":null,
"responseTime":"",
"accountID":"",
"protocol":"ping",
"interval":60,
"err":"fdda0184-2211-40dd-8098-02bd4e1f0452 Failed to sent any pkg or receive any valid pkg, sent: 10, receive: 0"
},
"reportidc":"guizhou03",
"ctyunregion":"guizhou03"
},
...
{
"specversion":"1.0",
"id":"1d7554df-b7c4-4109-be17-7da11a09dd6a",
"source":"ctyun.site_monitor",
"type":"site_monitor:response_timeout",
"subject":"ctyun.site_monitor:fdda0184-2211-40dd-8098-02bd4e1f0452:response_timeout",
"datacontenttype":"application/json",
"time":"2025-01-16T14:06:52.368536895Z",
"data":{
"taskID":"fdda0184-2211-40dd-8098-02bd4e1f0452",
"pointID":"1e138ae8-b991-4c25-a139-b0ad3ac2f22a",
"targetAddr":"203.83.233.26",
"targetIP":"",
"dnsServer":null,
"responseTime":"",
"accountID":"",
"protocol":"ping",
"interval":60,
"err":"fdda0184-2211-40dd-8098-02bd4e1f0452 Failed to sent any pkg or receive any valid pkg, sent: 10, receive: 0"
},
"reportidc":"guizhou03",
"ctyunregion":"guizhou03"
}
]
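The event envelope follows the CloudEvents 1.0 attribute names (specversion, id, source, type, subject, time). In the samples above, subject concatenates the source, task ID, and event type with colons, and time is an RFC 3339 timestamp; both can be unpacked with the standard library:

```java
import java.time.Instant;

public class EventEnvelope {
    public static void main(String[] args) {
        // Fields taken from the sample event above.
        String subject = "ctyun.site_monitor:fdda0184-2211-40dd-8098-02bd4e1f0452:response_timeout";
        String time = "2025-01-16T14:06:52.368536895Z";

        // subject = <source>:<taskID>:<event type>
        String[] parts = subject.split(":");
        Instant when = Instant.parse(time); // RFC 3339 / ISO-8601 with nanoseconds

        System.out.println("source=" + parts[0]
                + " taskID=" + parts[1]
                + " type=" + parts[2]
                + " time=" + when);
    }
}
```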
Notes
Subscription data falls into three categories: metric data, event data, and alarm data.
Metric data: quantitative measurements of systems, services, or resources, collected in real time or periodically. These values are typically numeric and describe the system's running state, performance, and resource usage.
Event data: discrete records of state changes in cloud resources or services, usually generated automatically by the system to describe an operation or state change that occurred at a specific moment.
Alarm data: generated from metric or event data when the system detects that an alarm rule is matched. An alarm record typically includes the alarm type, severity, occurrence time, related resource information, and a description of the alarm.
Garbled consumed data
For Java producers and consumers, set the character encoding in the configuration:
props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
props.put("serializer.encoding", "UTF-8"); // set the character encoding
props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
props.put("deserializer.encoding", "UTF-8"); // set the character encoding
If you are using the command-line tools, make sure the console's character encoding matches the encoding of the Kafka messages.
On Windows, switch the console's default code page to UTF-8:
chcp 65001
On Linux, set an environment variable to adjust the character encoding:
export LANG=en_US.UTF-8