ELK - Collecting Logs with Logstash

Collecting logs with Logstash

 

As you would expect, logs are normally stored in log files, so when collecting logs with an input plugin we use the file plugin to read the log content from a file. In this section the collected content is written out to another file, which lets us consolidate log files into a single directory where they are easy to find.

Note: Unlike other services, Logstash requires us to write the log-collection configuration file ourselves, based on the actual environment.
Prerequisite: Logstash must have read permission on the log files being collected, and write permission on the file it writes to.
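
Before going further it is worth confirming those permissions. A minimal sketch, assuming a package install where Logstash runs as the logstash user (this matters when it runs as a service; when started from a root shell as below, root's own permissions apply - adjust users and paths to your environment):

#check that the logstash user can read the source log and write to /tmp
[root@localhost ~]# sudo -u logstash head -n 1 /var/log/messages
[root@localhost ~]# sudo -u logstash touch /tmp/permission-test && rm -f /tmp/permission-test
#if the read fails, one option is an ACL (requires acl support on the filesystem)
[root@localhost ~]# setfacl -m u:logstash:r /var/log/messages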

 

The configuration file read by default

[root@localhost ~]# more /etc/logstash/logstash.yml

#normally we only care about the active (uncommented) settings

[root@localhost ~]# grep "^[a-Z]" /etc/logstash/logstash.yml
path.data: /var/lib/logstash
pipeline.ordered: auto
path.logs: /var/log/logstash


#from now on, all pipeline configuration files go under conf.d
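
On package installs this works because /etc/logstash/pipelines.yml points the main pipeline at that directory. A sketch of what the default file usually looks like (check your own copy; the path pattern may differ):

[root@localhost ~]# more /etc/logstash/pipelines.yml
- pipeline.id: main
  path.config: "/etc/logstash/conf.d/*.conf"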

1. Configure Logstash

 

[root@localhost ~]# more /etc/logstash/conf.d/system-log.conf
input {
  file {
    type => "messagelog"
    path => "/var/log/messages"
    start_position => "beginning"   #read from the beginning on the first run; afterwards only newly appended lines are collected
  }
}
output {
  file {
    path => "/tmp/%{type}.%{+yyyy.MM.dd}"
  }
}
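
While troubleshooting a new pipeline it can also help to print each event to the console. A minimal optional variant of the output section, adding a stdout output with the rubydebug codec (not required for the setup above):

output {
  file {
    path => "/tmp/%{type}.%{+yyyy.MM.dd}"
  }
  stdout {
    codec => rubydebug   #pretty-print every event while debugging
  }
}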

 

Check the configuration

[root@localhost ~]# /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/system-log.conf -t
Configuration OK
[INFO ] 2021-06-15 22:35:02.583 [LogStash::Runner] runner - Using config.test_and_exit mode. Config Validation Result: OK. Exiting Logstash

Start Logstash

[root@localhost ~]# /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/system-log.conf &
[1] 19626

[INFO ] 2021-06-16 01:14:13.255 [Api Webserver] agent - Successfully started Logstash API endpoint {:port=>9601}
[INFO ] 2021-06-16 01:14:14.165 [Converge PipelineAction::Create<main>] Reflections - Reflections took 57 ms to scan 1 urls, producing 23 keys and 47 values
[INFO ] 2021-06-16 01:14:14.657 [[main]-pipeline-manager] javapipeline - Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1000, "pipeline.sources"=>["/etc/logstash/conf.d/system-log.conf"], :thread=>"#<Thread:0x2d4569b1 run>"}
[INFO ] 2021-06-16 01:14:15.387 [[main]-pipeline-manager] javapipeline - Pipeline Java execution initialization time {"seconds"=>0.73}
[INFO ] 2021-06-16 01:14:15.553 [[main]-pipeline-manager] file - No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/usr/share/logstash/data/plugins/inputs/file/.sincedb_452905a167cf4509fd08acb964fdb20c", :path=>["/var/log/messages"]}
[INFO ] 2021-06-16 01:14:15.581 [[main]-pipeline-manager] javapipeline - Pipeline started {"pipeline.id"=>"main"}
[INFO ] 2021-06-16 01:14:15.637 [[main]<file] observingtail - START, creating Discoverer, Watch with file and sincedb collections
[INFO ] 2021-06-16 01:14:15.708 [Agent thread] agent - Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[INFO ] 2021-06-16 01:14:16.446 [[main]>worker0] file - Opening file {:path=>"/tmp/messagelog.2021.06.16"}
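
Starting the binary in the foreground with & is convenient for testing. For day-to-day use you would normally run Logstash as a service instead, which on package installs loads every *.conf under /etc/logstash/conf.d (a sketch, assuming the systemd unit shipped with the package):

[root@localhost ~]# systemctl enable --now logstash
[root@localhost ~]# systemctl status logstash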

 

Test log collection

#watch the collected log file

[root@localhost ~]# tail -f /tmp/messagelog.2021.06.16

#append a test line to the messages log

[root@localhost ~]# echo 11111 >> /var/log/messages
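
The file output writes events with the json_lines codec by default, so the tail window should print the new event as one JSON line, roughly like the following (illustrative only; the timestamp, host value, and exact field set depend on your Logstash version and ECS settings):

{"@timestamp":"2021-06-16T01:20:31.000Z","message":"11111","path":"/var/log/messages","host":"localhost","type":"messagelog","@version":"1"}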
