
Web Server Log Analysis

If you run a web app on a web server, analysing the access and error logs is one of the most useful ways to understand how it is functioning. Systems administrators use various tools to analyse HTTP statistics, but not all of them provide enough detail. For instance, Google Analytics cannot show client IP addresses, which can be really useful when you want to investigate what was going on. By looking into a website access log, it is possible to extract client IP addresses and analyse their reputation. That is why dedicated instruments for log analysis are essential. As an illustration, RST Cloud shows which malicious sources, ones that have repeatedly attacked other websites, have also contacted your web server.

[Screenshot: IP reputation and web attacks]
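
As a rough illustration of the idea (not RST Cloud's own code), the sketch below pulls client IPs out of a combined-format access log and flags any that appear on a small example blocklist; the file name and the blocklist entries are placeholders.

```python
# Minimal sketch: extract client IPs from a combined-format access log,
# count them, and flag any that appear on an example blocklist.
import re
from collections import Counter

LOG_PATH = "access.log"                               # assumed log location
IP_RE = re.compile(r"^(\d{1,3}(?:\.\d{1,3}){3})\s")   # client IP is the first field

KNOWN_BAD = {"203.0.113.7", "198.51.100.23"}          # example reputation list

hits = Counter()
with open(LOG_PATH) as log:
    for line in log:
        match = IP_RE.match(line)
        if match:
            hits[match.group(1)] += 1

for ip, count in hits.most_common(10):
    flag = "SUSPICIOUS" if ip in KNOWN_BAD else ""
    print(f"{ip:15} {count:6} {flag}")
```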

Moreover, sometimes there are multiple web servers processing requests for one site, or multiple sites hosted on a single server, so it is essential to merge or split the logs for each site on the fly. If you run the Apache web server, you typically use the VirtualHost directive to host multiple sites on one server; in Nginx the equivalent is a Server Block. To set up log harvesting for these situations, you create a dedicated log source definition and RST Cloud processes the data automatically. To import the log data you can use FTP, Syslog UDP, Syslog TCP or an agent-based connector.

[Screenshot: Add log source]
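
To give a sense of what splitting by site involves, here is a minimal sketch that assumes the Apache %v (virtual host) token is written as the first field of each line; RST Cloud does this automatically, and the file names here are hypothetical.

```python
# Minimal sketch: split one combined log into per-site files, assuming the
# virtual host name is logged as the first field of each line.
from pathlib import Path

COMBINED_LOG = Path("vhost_combined.log")   # hypothetical input file
OUT_DIR = Path("per_site_logs")
OUT_DIR.mkdir(exist_ok=True)

handles = {}
with COMBINED_LOG.open() as log:
    for line in log:
        vhost, _, rest = line.partition(" ")
        if vhost not in handles:
            handles[vhost] = (OUT_DIR / f"{vhost}.log").open("a")
        handles[vhost].write(rest)

for fh in handles.values():
    fh.close()
```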

It is easy to gather information from the logs about how many hits each file is getting and which requests are generating 404s, 500s and other errors. And if you need statistics on static file downloads, only the web server logs will give you exact figures.

[Screenshot: Status codes]
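
A minimal sketch of that kind of tally, assuming a combined-format log named access.log, might look like this:

```python
# Minimal sketch: per-path hit counts and status-code totals from a
# combined-format access log (file name and field layout are assumptions).
import re
from collections import Counter

LINE_RE = re.compile(r'"(?:GET|POST|HEAD|PUT|DELETE)\s+(\S+)[^"]*"\s+(\d{3})')

paths, statuses = Counter(), Counter()
with open("access.log") as log:
    for line in log:
        m = LINE_RE.search(line)
        if m:
            paths[m.group(1)] += 1
            statuses[m.group(2)] += 1

print("Status codes:", dict(statuses))
print("Top files:", paths.most_common(5))
```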

Although Apache and Nginx access logs are similar, IIS logs use a different format. RST Cloud has built-in parsers for these log types too, and if you use a custom format, the RST Support Team can add a specific regex pattern for you.

[Screenshot: Choosing the log type in RST Cloud]
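
For example, the Apache/Nginx combined format can be described with a named-group regex like the one below; a custom format would need its own pattern along the same lines (this is only an illustration, not the exact pattern RST Cloud uses).

```python
# Minimal sketch: a named-group regex for the Apache/Nginx "combined" format.
import re

COMBINED_RE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) (?P<proto>[^"]+)" '
    r'(?P<status>\d{3}) (?P<size>\S+) "(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
)

sample = ('198.51.100.23 - - [28/May/2016:10:15:32 +0000] '
          '"GET /index.html HTTP/1.1" 200 1043 "-" "Mozilla/5.0"')
print(COMBINED_RE.match(sample).groupdict())
```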

RST Cloud uses machine learning to analyse anomalies and predict behaviour based on previously observed activity. This helps to profile crawler visits and to detect when a crawler starts to behave abnormally. The same techniques, combined with a signature-based detection subsystem, make it possible to look into the log data and uncover attacks and other malicious attempts to hack your site.

[Screenshot: Predicting crawler activity]
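
RST Cloud's actual models are not shown here, but as a rough illustration of anomaly detection on crawler traffic, the sketch below flags an hour in which a crawler's request count deviates strongly from its own historical average; the numbers are invented.

```python
# Rough illustration only (not RST Cloud's model): flag hours where a
# crawler's request count deviates strongly from its historical mean.
from statistics import mean, stdev

# Hypothetical hourly request counts for one crawler user agent.
hourly_hits = [40, 38, 45, 42, 39, 41, 44, 37, 43, 310]

mu, sigma = mean(hourly_hits[:-1]), stdev(hourly_hits[:-1])
latest = hourly_hits[-1]
z = (latest - mu) / sigma if sigma else 0.0

if z > 3:
    print(f"Anomalous crawler activity: {latest} requests (z-score {z:.1f})")
```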

To conclude, the ability to search and analyse web server log data online with RST Cloud gives you a number of advantages:

 - lets you view client IP addresses and their reputation;
 - automatically splits or merges log data to show analytics per site, regardless of the number of virtual hosts (Apache) or server blocks (Nginx);
 - is the most accurate way to analyse requests, especially when you need to know about static content usage;
 - supports IIS logs as well as Apache and Nginx;
 - provides comprehensive analytics based on machine learning;
 - detects anomalous requests and web attacks.



Posted on May 28, 2016 by Yury Sergeev