
How to analyze website logs

Every request to your web resource is recorded, whether it comes from a user or from a search crawler. This lets you see which search engines crawl your website, how visitors behave on it, and more.
Site logs are, as a rule, valuable for a technical website audit and can be extremely useful for SEO. Server log analysis:

  • shows how much crawl budget is wasted and where;
  • helps to identify and correctly handle 404, 500, and other errors;
  • allows you to find pages that are rarely crawled or ignored by search engines;
  • offers many other insights.

How web server log file works

When a user enters a URL such as https://your_site_address.com/example.html into a browser, the browser first breaks it into three components.
In this case, the browser understands that https is the protocol, your_site_address.com is the name of the server, and example.html is the name of the file.

The server name is converted to an IP address through the Domain Name System (DNS). An HTTP GET request is then sent to the web server over the appropriate protocol for the requested page or file; the HTML is returned to the browser, which interprets it to render the visible page on the screen. Each of these requests is recorded in the web server's log file.

To put it simply, the process looks like this: the visitor requests a page, the browser passes the request to the server on which the website is located, the server returns the requested page, and then it records everything that happened in the log file.
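The sequence above can be sketched in Python: a minimal, hypothetical helper that writes one request as a line in the combined log format used by servers such as Apache and Nginx (the IP address, path, and sizes below are made-up sample values):

```python
from datetime import datetime, timezone, timedelta

def format_log_entry(ip, method, path, status, size, referrer, user_agent, when):
    """Format one request as a combined-log-format line,
    the way a web server writes it to its access log."""
    timestamp = when.strftime("%d/%b/%Y:%H:%M:%S %z")
    return (f'{ip} - - [{timestamp}] "{method} {path} HTTP/1.1" '
            f'{status} {size} "{referrer}" "{user_agent}"')

entry = format_log_entry(
    ip="203.0.113.7",          # sample client IP (documentation range)
    method="GET",
    path="/example.html",
    status=200,                 # HTTP status code returned
    size=5120,                  # bytes sent to the client
    referrer="-",               # "-" means no referrer was passed
    user_agent="Mozilla/5.0",
    when=datetime(2018, 10, 12, 1, 2, 3,
                  tzinfo=timezone(timedelta(hours=-1))),
)
print(entry)
# → 203.0.113.7 - - [12/Oct/2018:01:02:03 -0100] "GET /example.html HTTP/1.1" 200 5120 "-" "Mozilla/5.0"
```

Real servers do this formatting themselves; the sketch only shows which request details end up in each field of a log line.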

All you need to analyze how search engines crawl a website is to export the data and filter out the requests made by a robot, for instance, Googlebot. The most convenient way is to filter by the user-agent string and the crawler's IP range.
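As a minimal sketch, filtering exported log lines down to Googlebot requests can be done by matching the user-agent field (the sample lines are invented; note that user agents can be spoofed, which is why checking the IP range as well is recommended):

```python
def googlebot_lines(log_lines):
    """Yield only the log entries whose user-agent field mentions Googlebot."""
    for line in log_lines:
        if "Googlebot" in line:
            yield line

sample = [
    '66.249.66.1 - - [12/Oct/2018:01:02:03 -0100] "GET / HTTP/1.1" 200 512 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '198.51.100.4 - - [12/Oct/2018:01:05:00 -0100] "GET / HTTP/1.1" 200 512 "-" '
    '"Mozilla/5.0 (Windows NT 10.0) Chrome/69.0"',
]
hits = list(googlebot_lines(sample))
print(len(hits))  # only the first entry is from Googlebot
```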

The log file itself is raw information: plain, unstructured text. But proper processing and analysis turn it into a rich source of insights.

Log file content and structure

The log file structure ultimately depends on the type of server used and its configurations. For instance, Apache log analysis will be different from Nginx log analysis. But there are several common attributes that are almost always present in the log file:

  • IP address request;
  • date and time;
  • geography;
  • GET/POST method;
  • URL request;
  • HTTP status code;
  • browser.

See a record example including the data above:

- - [12/Oct/2018:01:02:03 -0100] "GET /resources/whitepapers/retail-whitepaper/ HTTP/1.1" 200 - "-" "Opera/1.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
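A record like the one above can be split into its fields with a regular expression. This is a sketch for the common combined log format; the field names are my own choice, and the sample line is invented:

```python
import re

# One capture group per field of the combined log format.
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ '                              # client IP, identd, user
    r'\[(?P<time>[^\]]+)\] '                             # date, time, UTC offset
    r'"(?P<method>\S+) (?P<url>\S+) (?P<proto>[^"]+)" '  # request line
    r'(?P<status>\d{3}) (?P<size>\S+) '                  # status code, bytes sent
    r'"(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'          # referrer, user agent
)

line = ('66.249.66.1 - - [12/Oct/2018:01:02:03 -0100] '
        '"GET /resources/whitepapers/retail-whitepaper/ HTTP/1.1" 200 1043 "-" '
        '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"')

match = LOG_PATTERN.match(line)
fields = match.groupdict()
print(fields["ip"], fields["method"], fields["url"], fields["status"])
```

Once each line is a dictionary of fields, sorting and filtering by status code, URL, or user agent becomes straightforward.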

Additional attributes that can sometimes be available include the following:

  • host name;
  • request/client IP address;
  • loaded bytes;
  • time spent.

WordPress log file export

In order for such a file to appear on a WordPress website, you must enable the logging function. To do this, find the file named wp-config.php in the root folder of the website and download it to your computer for editing.

Next, find the line "That's all, stop editing! Happy blogging." and add a new line of code before it:
define( 'WP_DEBUG', true );
This switches the website into debug mode, which enables the display of error notifications.

Now enable error recording to the log file. To do this, add one more line right below the previous one:
define( 'WP_DEBUG_LOG', true );
To access the WordPress website's log file, connect via FTP or open the file manager. Next, open the wp-content folder in the website's root directory and find the file called debug.log, as you can see in the screenshot:
How to find a logfile on WordPress
On some websites, the file may be located in the logs or access logs folders.

Open the log file, then copy its contents into Excel for easier sorting. Data covering a month is usually used for analysis.
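Instead of Excel, a month of exported log lines can also be summarized with a few lines of Python. This sketch counts HTTP status codes across the entries (the sample lines are invented):

```python
import re
from collections import Counter

# The status code is the three-digit number right after the quoted request line.
STATUS_RE = re.compile(r'" (\d{3}) ')

def status_summary(log_lines):
    """Count HTTP status codes across the exported log lines."""
    counts = Counter()
    for line in log_lines:
        found = STATUS_RE.search(line)
        if found:
            counts[found.group(1)] += 1
    return counts

sample = [
    '- - [12/Oct/2018:01:02:03 -0100] "GET /a HTTP/1.1" 200 512 "-" "Googlebot"',
    '- - [12/Oct/2018:01:02:04 -0100] "GET /b HTTP/1.1" 404 0 "-" "Googlebot"',
    '- - [12/Oct/2018:01:02:05 -0100] "GET /a HTTP/1.1" 200 512 "-" "Googlebot"',
]
print(status_summary(sample))  # counts: two 200 responses, one 404
```

A summary like this quickly surfaces the 404 and 500 errors mentioned earlier, without any spreadsheet work.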

Log file analyzers

As more companies move to the cloud, log analytics, log analysis, and log management tools are increasingly in demand.

Some people check log files manually: webmasters export the file and analyze it in Excel. In this case, you only need sorting and a few formulas, but this approach is outdated and won't be covered in this article.

Using special tools to analyze log files makes processing large volumes of information much easier. Some webmasters install a log analyzer on the server itself. This method is convenient for projects hosted on their own servers, since the logs can then be stored indefinitely. Websites on third-party hosting typically keep logs for a maximum of one month, so rotation and archiving are required.

Screaming Frog Log File Analyzer

For example, let's analyze log files through Screaming Frog Log File Analyzer.

The tool has a free version limited to one thousand log events. For a small project, this amount will be enough.

Download and install the software on your computer, then upload the log files or a list of all the URLs present on the website. The file export is described earlier in this article. Open the tool and create a new project using the New button on the top panel:
Logs analysis in Screaming Frog Log File Analyzer
This will open the main control window, which collects information about visits to the web resource by search robots.
Logs analyzer Screaming Frog
See the report tabs for more information.
How to analyze logs online using Screaming Frog Log File Analyzer
The analysis of the logs will show, in a table, the page response codes, the dates of the last crawler visits, the content types, the number of hits by search bots, and more.

Here is a brief overview of other analyzers.

GoAccess is designed for quick data analysis. Its main idea is to quickly look at the server logs and analyze them in real time without using your browser.

Splunk lets you process up to 500 MB of data per day for free. This is a great way to collect, store, search, compare and analyze website logs.

Logmatic.io is a log analysis tool designed specifically to improve software performance. The focus is on application data, which includes logs. Currently, the tool offers only a paid version.

Logstash is a free, open-source tool for managing events and logs. It can be used to collect, store, and analyze logs.


Log file analysis is useful primarily for a website's SEO. To carry out the analysis, you will need to export the log file from the server.

In most cases, the tool contains the following data:

  • IP address request;
  • date and time;
  • geography;
  • GET/POST method;
  • URL request;
  • HTTP status code;
  • browser.

To analyze a file, sort it manually in an Excel spreadsheet, or install Screaming Frog Log File Analyzer or a similar tool. There are also tools that are installed directly on the website's server; this option is suitable if you have your own servers.

This article is a part of Serpstat's Checklist tool
Checklist is a ready-made to-do list that helps you track work progress on a specific project. The tool contains templates with an extensive list of project development parameters, to which you can also add your own items and plans.
Try Checklist now

