Web log analysis scripts (nginx & httpd)
1. HTTP log analysis
#!/bin/bash
# Summarize one or more HTTP access logs: request totals, unique client IPs,
# and a per-URL tally of the quoted referer/URL field.
for i in "$@"; do
    echo "===================== $i =============================" >> weblog.txt
    echo "IP data" >> weblog.txt
    # Total requests (PV), then number of unique client IPs
    awk '{print $1}' "$i" | wc -l >> weblog.txt
    awk '{print $1}' "$i" | sort | uniq -c | wc -l >> weblog.txt
    echo "socket data" >> weblog.txt
    # Total and unique values of the 8th quote-delimited field
    # (its position depends on the log format; see the note after section 2)
    awk -F'"' '{print $8}' "$i" | grep -v "^-" | wc -l >> weblog.txt
    awk -F'"' '{print $8}' "$i" | grep -v "^-" | sort | uniq -c | wc -l >> weblog.txt
    echo "socket dedup" >> weblog.txt
    # Tally values that start with http and write "url , count" lines to a CSV
    awk -F'"' '{print $8}' "$i" | grep -v "^-" | \
        awk '/^http/ {++state[$NF]} END {for (key in state) print key, ",", state[key]}' >> "$i".csv
    # Blank-line separator between logs in the report
    echo -e "\n\n" >> weblog.txt
done
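A minimal usage sketch, assuming the script above is saved as weblog_stat.sh (the script name and log paths are placeholders): the summary is appended to weblog.txt in the current directory, and one .csv tally is written next to each input log.

chmod +x weblog_stat.sh
./weblog_stat.sh /var/log/httpd/access_log.20140501 /var/log/httpd/access_log.20140502
cat weblog.txt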
2. Nginx log analysis
#!/bin/bash
# Same report for nginx access logs; the only difference is that here the
# quoted URL/referer sits in the 9th quote-delimited field ($9 instead of $8).
for i in "$@"; do
    echo "===================== $i =============================" >> weblog.txt
    echo "IP data" >> weblog.txt
    awk '{print $1}' "$i" | wc -l >> weblog.txt
    awk '{print $1}' "$i" | sort | uniq -c | wc -l >> weblog.txt
    echo "socket data" >> weblog.txt
    awk -F'"' '{print $9}' "$i" | grep -v "^-" | wc -l >> weblog.txt
    awk -F'"' '{print $9}' "$i" | grep -v "^-" | sort | uniq -c | wc -l >> weblog.txt
    echo "socket dedup" >> weblog.txt
    awk -F'"' '{print $9}' "$i" | grep -v "^-" | \
        awk '/^http/ {++state[$NF]} END {for (key in state) print key, ",", state[key]}' >> "$i".csv
    echo -e "\n\n" >> weblog.txt
done
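The CSV written by the last pipeline contains lines of the form "url , count". One quick way to rank it, assuming an input log named access.log (so the output file is access.log.csv):

# Sort the per-URL tally by count, highest first, and show the top 20
sort -t',' -k2 -rn access.log.csv | head -20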
Note: which fields hold the client IP and the domain/URL of the requested interface depends on the order of the fields in the log format defined in the web server's configuration file, which is why the two scripts read different field numbers ($8 vs $9); see the sketch below.
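For reference, a minimal sketch of the widely used "combined"-style format (shown here under the name main, as in many default nginx configs). Your own log_format may add more quoted fields, which is why the scripts above read $8 or $9; the log path below is only an example.

log_format main '$remote_addr - $remote_user [$time_local] '
                '"$request" $status $body_bytes_sent '
                '"$http_referer" "$http_user_agent"';

# With this format, splitting a line on double quotes (awk -F'"') gives:
#   $2 = request line, $4 = referer, $6 = user agent
# Verify which field holds the URL/referer in your own logs, e.g.:
awk -F'"' '{print $4}' /var/log/nginx/access.log | head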
3. IP and PV analysis of a web log for a given time period
grep "01\/May\/2014:20:.* +0800" access_log.20140501 >> /data/httpd/fenxi.log
awk ‘{print $1}‘ fenxi.log | wc -l |more
awk ‘{print $1}‘ fenxi.log | sort | uniq -c |wc -l
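As a follow-up, the same extracted file can be used to rank the busiest clients in that time window (a small sketch reusing the fenxi.log produced above):

# Top 10 client IPs by request count within the extracted hour
awk '{print $1}' /data/httpd/fenxi.log | sort | uniq -c | sort -rn | head -10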
This article is from the blog "我的运维博客" (My Ops Blog); please keep this attribution: http://linuxpython.blog.51cto.com/10015972/1643732