ELK Log Collection: Using Logstash

一、Using Logstash

1. Collecting file logs with Logstash

Logs are usually stored in log files, so when collecting them we use the file input plugin to read the log content from a file. This section first shows how to write that content out to another file, which lets us gather log files under a single directory where they are easy to find.

Note: unlike other services, Logstash has no ready-made collection config; we write the collection config files ourselves according to the actual situation.
Prerequisite: Logstash needs read permission on the log files it collects and write permission on the files it writes to (see the check sketched below).
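
A quick way to check this (an illustrative sketch, not part of the original steps; it assumes the logstash package user exists and that setfacl is available):

# Can the logstash user read the source file?
[root@logstash ~]# sudo -u logstash head -n 1 /var/log/messages
# If not, one option is to grant read access with an ACL
[root@logstash ~]# setfacl -m u:logstash:r /var/log/messages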

2. Configuring Logstash

# Default configuration file
[root@logstash ~]# vim /etc/logstash/logstash.yml
# On startup, Logstash reads the config files under conf.d
path.config: /etc/logstash/conf.d
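
When Logstash is started without -f (for example via the service unit), it loads every config file under path.config and, by default, merges them into a single pipeline; the examples below pass one specific file with -f instead. A quick way to see what the service would load (a sketch):

[root@logstash ~]# ls /etc/logstash/conf.d/     # every .conf here would be merged into one pipeline
[root@logstash ~]# systemctl start logstash     # run as a service, reading path.config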

3. Configuring Logstash to collect a log file into another file

1) Configuration

[root@logstash ~]# vim /etc/logstash/conf.d/message.conf
input {
  file {
    path => "/var/log/messages"
    start_position => "beginning"
  }
}

output {
  file {
    path => "/tmp/message_%{+YYYY.MM.dd}.log"
  }
}

2) Starting Logstash

# Check the syntax first
[root@logstash ~]# /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/message.conf -t

# Start
[root@logstash ~]# /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/message.conf &

3) Checking the contents of the new file

[root@logstash ~]# tail /var/log/messages
Jul 17 15:01:01 logstash systemd: Started Session 448 of user root.
Jul 17 15:05:01 logstash systemd: Started Session 449 of user root.
[root@logstash ~]# tail /tmp/message_2020.07.17.log 
{"@version":"1","path":"/var/log/messages","message":"Jul 17 15:01:01 logstash systemd: Started Session 448 of user root.","@timestamp":"2020-07-17T07:05:42.341Z","host":"logstash"}
{"@version":"1","path":"/var/log/messages","message":"Jul 17 15:05:01 logstash systemd: Started Session 449 of user root.","@timestamp":"2020-07-17T07:05:42.341Z","host":"logstash"}

4. Configuring collection into Elasticsearch

1) Configuration

[root@logstash tmp]# vim /etc/logstash/conf.d/message_es.conf 
input {
  file {
    path => "/var/log/messages"
    start_position => "beginning"
  }
}

output {
  elasticsearch {
    hosts => ["10.0.0.51:9200"]
    index => "messages_%{+YYYY-MM-dd}.log"
  }
}

2) Starting Logstash

# Check the syntax first
[root@logstash ~]# /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/message_es.conf -t

# Start
[root@logstash ~]# /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/message_es.conf &
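
After starting, you can confirm that the index exists in Elasticsearch (a quick check; output omitted):

[root@logstash ~]# curl '10.0.0.51:9200/_cat/indices?v'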

Running more than one Logstash process requires a separate data directory for each; otherwise you will see an error like this:

[ERROR] 2020-07-20 11:59:22.363 [LogStash::Runner] Logstash - java.lang.IllegalStateException: Logstash stopped processing because of an error: (SystemExit) exit

5. Starting multiple Logstash instances

1) Creating data directories for each instance

[root@logstash ~]# mkdir /data/logstash/{message_file,secure_file} -p
# Give the logstash user ownership of the directories
[root@logstash ~]# chown -R logstash.logstash /data/logstash/

2) Starting the instances

# To start multiple instances, add the --path.data parameter so each instance gets its own data directory

[root@logstash ~]# /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/message_es.conf --path.data=/data/logstash/message_file &

[root@logstash tmp]# /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/secure_es.conf --path.data=/data/logstash/secure_file &
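
The secure_es.conf referenced above is not shown earlier in this article; a minimal version, mirroring message_es.conf but reading /var/log/secure, might look like this:

[root@logstash ~]# vim /etc/logstash/conf.d/secure_es.conf
input {
  file {
    path => "/var/log/secure"
    start_position => "beginning"
  }
}

output {
  elasticsearch {
    hosts => ["10.0.0.51:9200"]
    index => "secure_%{+YYYY-MM-dd}.log"
  }
}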

6. Collecting multiple logs with a single process

1) Stop the old processes and delete the old indices, for example:
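
A sketch of how that might look (the wildcard deletes assume the cluster allows deleting indices by wildcard; the index names come from the configs above):

[root@logstash ~]# pkill -f logstash                          # stop the background Logstash instances
[root@logstash ~]# curl -XDELETE '10.0.0.51:9200/messages_*'  # remove the indices created earlier
[root@logstash ~]# curl -XDELETE '10.0.0.51:9200/secure_*'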

2) Configuration, method 1:

[root@logstash ~]# vim /etc/logstash/conf.d/double_es.conf 
input {
  file {
    type => "messages_log"
    path => "/var/log/messages"
    start_position => "beginning"
  } 
  file {
    type => "secure_log"
    path => "/var/log/secure"
    start_position => "beginning"
  } 
} 

output {
  if [type] == "messages_log" {
    elasticsearch {
      hosts => ["10.0.0.51:9200"]
      index => "messages_%{+YYYY-MM-dd}.log"
    }
  }
  if [type] == "secure_log" {
    elasticsearch {
      hosts => ["10.0.0.51:9200"]
      index => "secure_%{+YYYY-MM-dd}.log"
    }
  }
}

3) Configuration, method 2:

[root@logstash ~]# vim /etc/logstash/conf.d/doubles_es.conf
input {
  file {
    type => "messages_log"
    path => "/var/log/messages"
    start_position => "beginning"
  }
  file {
    type => "secure_log"
    path => "/var/log/secure"
    start_position => "beginning"
  }
}

output {
  elasticsearch {
    hosts => ["10.0.0.51:9200"]
    index => "%{type}_%{+YYYY-MM-dd}.log"
  }
}

4) Start

[root@logstash ~]# /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/doubles_es.conf

二、Collecting Tomcat logs

1. Installing Tomcat

# Upload the package
# Install a Java environment (sketched after this block)
# Extract the package
[root@logstash ~]# tar xf apache-tomcat-9.0.30.tar.gz
# Move it and create a symlink
[root@logstash ~]# mv apache-tomcat-9.0.30 /usr/local/
[root@logstash ~]# ln -s /usr/local/apache-tomcat-9.0.30 /usr/local/tomcat
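
The "Install a Java environment" step above is not spelled out; a common approach on CentOS (an assumption, any JDK 8 or newer works for Tomcat 9) is:

[root@logstash ~]# yum install -y java-1.8.0-openjdk
[root@logstash ~]# java -version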

2. Starting Tomcat

# Create a test page
[root@logstash ~]# echo "test logstash log" > /usr/local/tomcat/webapps/ROOT/index.html

# Start
[root@logstash ~]# /usr/local/tomcat/bin/startup.sh 

[root@logstash ~]# netstat -lntp        
tcp6       0      0 :::8080                 :::*                    LISTEN      84967/java
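
A quick check that the test page is being served (a sketch):

[root@logstash ~]# curl 127.0.0.1:8080    # should return: test logstash log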

3. Configuring Logstash to collect Tomcat logs

[root@logstash ~]# vim /etc/logstash/conf.d/tomcat_es.conf 
input {
  file {
    # The file input does not expand date variables; old catalina.<date>.log files stop being
    # written to after their day passes, so a * glob is enough to pick up each day's log
    path => "/usr/local/tomcat/logs/catalina.*.log"
    start_position => "beginning"
  }
}

output {
  elasticsearch {
    hosts => ["10.0.0.51:9200"]
    index => "tomcat_%{+YYYY-MM-dd}.log"
  }
}

[root@logstash ~]# vim /etc/logstash/conf.d/tomcat_access_es.conf
input {
  file {
    path => "/usr/local/tomcat/logs/localhost_access_log.*.txt"
    start_position => "beginning"
  }
}

output {
  elasticsearch {
    hosts => ["10.0.0.51:9200"]
    index => "tomcat_access_%{+YYYY-MM-dd}.log"
  }
}

4. Start

[root@logstash ~]# /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/tomcat_access_es.conf

5. Collecting Tomcat error logs

1) The problem

When collecting Tomcat error logs, a single error (for example a Java stack trace) can span many lines; collected line by line it turns into many separate events, which is awkward to read.

# Solutions
1. Work with the developers to switch the Tomcat log format to JSON, then collect it directly.
2. Use Logstash's multiline codec to merge the related lines back into one event.

2) Method 1:

# Go to the Tomcat configuration directory
[root@elkstack03 ~]# cd /usr/local/tomcat/conf
# Edit the server config file
[root@elkstack03 conf]# vim server.xml
# At around line 138, add the following
<Valve className="org.apache.catalina.valves.AccessLogValve" directory="logs"
               prefix="tomcat_access_log" suffix=".log"
               pattern="{&quot;clientip&quot;:&quot;%h&quot;,&quot;ClientUser&quot;:&quot;%l&quot;,&quot;authenticated&quot;:&quot;%u&quot;,&quot;AccessTime&quot;:&quot;%t&quot;,&quot;method&quot;:&quot;%r&quot;,&quot;status&quot;:&quot;%s&quot;,&quot;SendBytes&quot;:&quot;%b&quot;,&quot;Query?string&quot;:&quot;%q&quot;,&quot;partner&quot;:&quot;%{Referer}i&quot;,&quot;AgentVersion&quot;:&quot;%{User-Agent}i&quot;}"/> 

3) Method 2:

[root@logstash ~]# vim /etc/logstash/conf.d/tomcat_mutiline_es.conf 
input {
  file {
    type => "java_log"
    path => "/usr/local/tomcat/logs/localhost_access_log.*.txt"
    start_position => "beginning"
    codec => multiline {
      pattern => "^\["
      negate => true
      what => "previous"
    }
  }
}
output {
  elasticsearch {
    hosts => ["10.0.0.51:9200"]
    index => "tomcat_mutiline_%{+YYYY-MM-dd}.log"
  }
}

# Annotated version (reading from stdin so the options are easy to explain):
[root@elkstack03 ~]# vim /etc/logstash/conf.d/java.conf
input {
  stdin {
    codec => multiline {
      # Merge lines into one event until a new line starting with [ arrives
      pattern => "^\["
      # negate => true: the merge applies to lines that do NOT match the pattern;
      # negate => false would apply it to lines that DO match
      negate => true
      # "previous" appends such lines to the preceding line; "next" would attach them to the following line
      what => "previous"
    }
  }
}
output {
  stdout {
    codec => rubydebug
  }
}
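
The stdin version can be tested by piping in a fake stack trace; the indented line does not start with [ and is therefore folded into the previous event (a sketch; the class name is made up):

[root@elkstack03 ~]# printf '[ERROR] something failed\n\tat com.example.Foo.bar(Foo.java:42)\n[INFO] next event\n' | /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/java.conf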