MySQL High-Load Troubleshooting
Identifying the Problem
Checking the server with top showed mysqld at over 200% CPU. Load this high on a MySQL server is usually caused by missing indexes or by some pathological SQL statements.
Troubleshooting Approach
1. Determine the type of load: use top to see whether the load is CPU or IO.
2. In the mysql client, check the current connection count and the SQL statements being executed.
3. Check the slow query log; slow queries may be what is driving the load.
4. Check for hardware problems, e.g. whether a disk fault is the cause.
5. Check the monitoring platform and compare this machine's load at different times.
Determining the Load Type (top)
top - :: up days, :, user, load average: 124.17, 55.88, 24.70
Tasks: total, running, sleeping, stopped, zombie
Cpu(s): 2.4%us, 1.0%sy, 0.0%ni, 95.2%id, 2.0%wa, 0.1%hi, 0.2%si, 0.0%st
Mem: 3090528k total, 2965772k used, 124756k free, 93332k buffers
Swap: 4192956k total, 2425132k used, 1767824k free, 756524k cached

PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND
mysql 6250m .5g S 257.1 49.9 :34.45 mysqld

(Several numeric columns were lost when this output was captured; the key surviving figures are the load average of 124.17 and mysqld at 257.1% CPU and 49.9% memory.)
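Numbers like these can be confusing at first glance: the load average is 124 yet the Cpu(s) line shows 95.2% idle and only 2.0% iowait. A rough classification of a captured Cpu(s) line can be scripted as below; the thresholds (20% iowait, 70% user+system) are arbitrary illustration values, not official cutoffs:

```shell
#!/bin/sh
# Classify a "Cpu(s):" line from top with a rough heuristic:
# high %wa -> IO-bound, high %us+%sy -> CPU-bound, otherwise the load
# likely comes from the run-queue (blocked threads), not raw CPU usage.
classify_cpu_line() {
    printf '%s\n' "$1" | awk -F'[:,]' '{
        for (i = 2; i <= NF; i++) { gsub(/ /, "", $i); split($i, p, "%"); v[p[2]] = p[1] }
        if (v["wa"] + 0 > 20)            print "io-bound"
        else if (v["us"] + v["sy"] > 70) print "cpu-bound"
        else                             print "idle-cpu-high-load"
    }'
}

# Sample line taken from the top output above
classify_cpu_line 'Cpu(s): 2.4%us, 1.0%sy, 0.0%ni, 95.2%id, 2.0%wa, 0.1%hi, 0.2%si, 0.0%st'
# prints "idle-cpu-high-load": a mostly idle CPU with load 124 suggests
# many threads blocked (IO or lock waits) rather than pure CPU burn
```

A single top snapshot is only one sample; watching a few refreshes (or iostat/vmstat) gives a steadier picture.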
Checking Current Connections and Running SQL
show processlist;
Id User Host db Command Time State Info
slave 8.8.8.142: NULL Binlog Dump Has sent all binlog to slave; waiting for binlog to be updated NULL
slave 8.8.8.120: NULL Binlog Dump Has sent all binlog to slave; waiting for binlog to be updated NULL
biotherm 8.8.8.46: biotherm Query Sending data SELECT * FROM xxx_list WHERE tid = '' AND del = ORDER BY id DESC LIMIT ,
biotherm 8.8.8.49: biotherm Query Sending data SELECT * FROM xxx_list WHERE tid = '' AND del = ORDER BY id DESC LIMIT ,
.............................................. (many more threads running the same query)
biotherm 8.8.8.42: biotherm Query Sending data SELECT * FROM xxx_list WHERE tid = '' AND del =
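When dozens of threads sit in the same state on the same statement (here, Sending data on one SELECT), counting them makes the pattern obvious. A small sketch over a saved dump of the output; the file path and the simplified line format are assumptions for illustration (a real dump is tab-separated):

```shell
#!/bin/sh
# Count how many processlist threads are in each of a few common states.
summarize_states() {
    grep -oE 'Sending data|Binlog Dump|Copying to tmp table|Locked' "$1" \
        | sort | uniq -c | sort -rn
}

# Small sample dump mirroring the processlist above (path is an example)
cat > /tmp/processlist.sample <<'EOF'
slave 8.8.8.142 NULL Binlog Dump Has sent all binlog to slave
slave 8.8.8.120 NULL Binlog Dump Has sent all binlog to slave
biotherm 8.8.8.46 biotherm Query Sending data SELECT * FROM xxx_list
biotherm 8.8.8.49 biotherm Query Sending data SELECT * FROM xxx_list
biotherm 8.8.8.42 biotherm Query Sending data SELECT * FROM xxx_list
EOF
summarize_states /tmp/processlist.sample
# the busiest state/query combination floats to the top of the output
```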
Logging Slow Queries
Edit the MySQL configuration file (my.cnf) and add the following lines under the [mysqld] section:
log_slow_queries = /usr/local/mysql/var/slow_queries.log # path of the slow query log
long_query_time = # log statements that run longer than 10s
log-queries-not-using-indexes = # log SQL that uses no index
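These option names come from older MySQL releases; log_slow_queries was removed in MySQL 5.6. On 5.5 and later, the equivalent settings (reusing the same path and the 10-second threshold described above) would be:

```ini
[mysqld]
slow_query_log                = 1
slow_query_log_file           = /usr/local/mysql/var/slow_queries.log
long_query_time               = 10
log_queries_not_using_indexes = 1
```

Changing these in my.cnf requires a server restart; recent versions also accept them at runtime via SET GLOBAL.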
Viewing the Slow Query Log
tail /usr/local/mysql/var/slow_queries.log
# Time: ::
# User@Host: biotherm[biotherm] @ [8.8.8.45]
# Query_time: 1294.881407 Lock_time: 0.000179 Rows_sent: Rows_examined:
SET timestamp=;
SELECT * FROM xxx_list WHERE tid = '11xx' AND del = ORDER BY id DESC LIMIT , ;
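Once the log accumulates entries, ranking them worst-first helps. mysqldumpslow (shipped with MySQL) aggregates the log properly; a minimal stand-in that just sorts raw entries by Query_time can be sketched like this. The sample file is invented: its first line echoes the Query_time and Lock_time from the entry above, while the Rows_sent/Rows_examined values are made up for illustration:

```shell
#!/bin/sh
# Print the slowest entries of a slow query log, sorted by Query_time.
worst_queries() {
    grep '^# Query_time:' "$1" | sort -t: -k2 -rn | head -n 5
}

# Hypothetical two-entry sample (path and most numbers are illustrative)
cat > /tmp/slow.sample <<'EOF'
# Query_time: 1294.881407  Lock_time: 0.000179 Rows_sent: 5  Rows_examined: 508000
# Query_time: 0.004512  Lock_time: 0.000100 Rows_sent: 1  Rows_examined: 54
EOF
worst_queries /tmp/slow.sample
# the 1294-second entry sorts to the top
```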
The Four Fields
Query_time: 0 Lock_time: 0 Rows_sent: 1 Rows_examined: 54
These mean, in order: query time, lock time, rows returned, and rows examined. Focus on the statements that examine the most rows, add the matching indexes in the database, and then rewrite the worst SQL.
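As a sketch of that fix for the query seen above: a composite index covering the WHERE columns plus the ORDER BY column lets MySQL both filter and sort via the index. The index name, the LIMIT value, and the seek-based pagination rewrite are illustrative assumptions, not commands taken from the incident:

```sql
-- Hypothetical index; the column names (tid, del, id) come from the slow query above
ALTER TABLE xxx_list ADD INDEX idx_tid_del_id (tid, del, id);

-- For deep pages, seeking from the last seen id avoids scanning a huge OFFSET
-- (:del and :last_id are placeholders for the application's actual values)
SELECT * FROM xxx_list
WHERE tid = '11xx' AND del = :del AND id < :last_id
ORDER BY id DESC
LIMIT 20;
```

Verify with EXPLAIN that the new index is actually chosen before assuming the problem is solved.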
Extreme Case: Killing the SQL Thread
Find the SQL statement that has been eating CPU time for too long by running the following in the mysql client:
show processlist;
Once you confirm that a statement is in the Query state and its Time is excessive, note its Id and run:
kill QUERY <Id>;
Note: killing a SQL thread may cause data loss, so weigh how important the data is before you do it.
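To avoid killing the wrong thread, the candidate list can be built first and reviewed before anything is executed. Below is a sketch that generates (but does not run) KILL QUERY statements from a saved, tab-separated processlist dump; the file path, column layout, and sample thread ids are all invented for illustration. Note also that KILL QUERY aborts only the running statement, while KILL CONNECTION (or a bare KILL) drops the whole connection:

```shell
#!/bin/sh
# Emit KILL QUERY statements for threads stuck in the Query state too long.
# Assumed columns (tab-separated): Id User Host db Command Time State Info
build_kills() {
    awk -F'\t' -v limit="$2" \
        '$5 == "Query" && $6 + 0 > limit { print "KILL QUERY " $1 ";" }' "$1"
}

# Invented two-thread sample: one long runner (900s), one fresh query (3s)
printf '101\tbiotherm\t8.8.8.46\tbiotherm\tQuery\t900\tSending data\tSELECT 1\n' >  /tmp/pl.tsv
printf '102\tbiotherm\t8.8.8.49\tbiotherm\tQuery\t3\tSending data\tSELECT 1\n'  >> /tmp/pl.tsv
build_kills /tmp/pl.tsv 300
# → KILL QUERY 101;
```

Reviewing the generated statements by eye, then pasting them into the mysql client, keeps a human in the loop for a destructive operation.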