Install Logstash
First download logstash-6.4.3.
Link: https://pan.baidu.com/s/1PirEKgby6OOIVJUwI4h6wg
Extraction code: xw4n
After extracting logstash-6.4.3, add the following file: put mysql.conf in the root directory (the startup command below references it as /mysql.conf).
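A minimal sketch of the download-and-extract step, assuming the file behind the link is the standard logstash-6.4.3.tar.gz archive (archive name and paths are assumptions):

# extract the archive and enter the Logstash directory
tar -zxvf logstash-6.4.3.tar.gz
cd logstash-6.4.3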
Integrating with a database
Install MySQL.
Download the mysql-connector-java-8.0.19.jar file (the JDBC driver that Logstash will load). mysql.conf:
input {
  jdbc {
    # Path to the MySQL JDBC driver jar
    jdbc_driver_library => "/root/mysql-connector-java-8.0.19.jar"
    # With Connector/J 8.x the driver class is com.mysql.cj.jdbc.Driver;
    # the legacy name below is still registered and only logs a deprecation warning
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://121.43.147.173:3306/test"
    jdbc_user => "root"
    jdbc_password => "root"
    # Cron-style schedule: run the query once every minute
    schedule => "* * * * *"
    # Incremental sync: only fetch rows changed since the last run
    statement => "SELECT * FROM user WHERE update_time >= :sql_last_value"
    use_column_value => true
    tracking_column_type => "timestamp"
    tracking_column => "update_time"
    # File where the last sync point (:sql_last_value) is persisted between runs
    last_run_metadata_path => "syncpoint_table"
  }
}
output {
  elasticsearch {
    # Elasticsearch addresses and ports
    hosts => ["127.0.0.1:9200","127.0.0.1:9201"]
    # Index name, can be customized
    index => "user"
    # The table has an id column; use it as the document id so updates overwrite the same document
    document_id => "%{id}"
    document_type => "user"
  }
  stdout {
    # Output in JSON format
    codec => json_lines
  }
}
Table
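The table definition itself is not included here; a minimal sketch that matches the columns mysql.conf relies on (id as the primary key and document id, update_time maintained automatically; the name column is a made-up placeholder):

CREATE TABLE `user` (
  `id` INT NOT NULL AUTO_INCREMENT,
  `name` VARCHAR(50),
  -- updated automatically on every change, so incremental sync can track it
  `update_time` TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
  PRIMARY KEY (`id`)
);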
Start Logstash
./bin/logstash -f /mysql.conf
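Once the one-minute schedule has fired, the rows should appear in Elasticsearch; a quick check against the index configured above:

GET /user/_search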
Multiple tables
1. Add a mysql1.conf file (similar to mysql.conf); see the sketch after this list.
2. Add the pipeline configuration to config/pipelines.yml (mind the format; no extra spaces or stray characters):
- pipeline.id: table1
  path.config: "/mysql.conf"
- pipeline.id: table2
  path.config: "/mysql1.conf"
3. Start with ./bin/logstash (no -f option, so Logstash picks up pipelines.yml).
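mysql1.conf follows the same structure as mysql.conf; only the parts tied to the second table change. A hypothetical example of the lines that differ (the orders table and index name are made up):

statement => "SELECT * FROM orders WHERE update_time >= :sql_last_value"
index => "orders"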
Integrating with files
Create a log_config01.conf file and put tomcat.log in the root directory:
input {
  # Read log entries from a file and ship them to the console and Elasticsearch
  file {
    path => "/tomcat.log"
    codec => "json"            # parse each log line as JSON
    type => "elasticsearch"
    start_position => "beginning"
  }
}
# filter {
#
# }
output {
  # Plain standard output
  # stdout {}
  # Pretty-print each event with the rubydebug codec
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => ["192.168.212.253:9200","192.168.212.253:9201"]
    index => "es-%{+YYYY.MM.dd}"
  }
}
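Because the input uses codec => "json", each line of tomcat.log must be a self-contained JSON object. A hypothetical sample line (the fields are made up):

{"level":"INFO","message":"Server startup in 1234 ms","timestamp":"2021-04-30T12:00:00"}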
Start
./bin/logstash -f /log_config01.conf
Query the index that gets created (the date suffix is the day the logs were ingested):
GET /es-2021.04.30/_search
Integrating with Kafka
Install Kafka and Zookeeper with Docker (a sketch follows), then create log_config01.conf with the content below and put the file in the root directory.
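The Docker setup is not spelled out in the original; a minimal sketch using the wurstmeister images (image names, container names, and the advertised listener address are assumptions, adjust to your host):

# Zookeeper first, then a single Kafka broker that advertises itself on 127.0.0.1:9092
docker run -d --name zookeeper -p 2181:2181 wurstmeister/zookeeper
docker run -d --name kafka -p 9092:9092 \
  --link zookeeper \
  -e KAFKA_ZOOKEEPER_CONNECT=zookeeper:2181 \
  -e KAFKA_LISTENERS=PLAINTEXT://0.0.0.0:9092 \
  -e KAFKA_ADVERTISED_LISTENERS=PLAINTEXT://127.0.0.1:9092 \
  wurstmeister/kafka

With the brokers running, log_config01.conf for the Kafka input: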
input {
  kafka {
    # Kafka broker address
    bootstrap_servers => "127.0.0.1:9092"
    # Topic(s) to consume from
    topics => ["my_log"]
  }
}
output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => ["127.0.0.1:9200","127.0.0.1:9201"]
    index => "my_log"
  }
}
./bin/logstash -f /log_config01.conf
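To see data flow end to end, send a test message to the my_log topic and then query the index. The producer invocation assumes the wurstmeister container from the sketch above (the in-container path /opt/kafka/bin is also an assumption):

# open a console producer inside the Kafka container and type a test line, e.g. {"message":"hello from kafka"}
docker exec -it kafka /opt/kafka/bin/kafka-console-producer.sh --broker-list 127.0.0.1:9092 --topic my_log

GET /my_log/_search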