
How-to 13 min read December 30, 2021

What's the purpose of website log analysis?

Every request to your website is recorded, whether it comes from a user or from a search crawler. This lets you see which search engines crawl your website, how visitors behave on it, and more.
Site logs are, as a rule, valuable for a technical audit of a website and can be extremely useful for SEO. Server log analysis:

  • shows how much crawl budget is wasted and where;
  • helps to identify and correctly handle 404, 500, and other errors;
  • allows you to find pages that are rarely crawled or ignored by search engines;
  • offers many other insights.

What is a website log file?

Website log files are a major data source for network observability. A server log file is a computer-generated data file that provides information on an operating system's, application's, server's, or other device's usage patterns, activities, and operations.

Why analyze the log?

If you carry out log file analysis, you'll be able to spot errors before your users even see them. Since a business generates a large volume of log data across its systems, ML-powered log analytics software is the ideal option if you don't want to waste time evaluating logs manually. You can always pick a log analysis tool that meets your needs.

How to get log files of a website

Since log files are kept on a web server, you must first log in. You can do so by following these steps:

  1. Using your hosting provider's control panel. A file manager is included with several hosting platforms. Look for terms like "file management," "files," "file manager," and so on.

  2. Via FTP. You'll need the following:

  • FTP access to the server;

  • the FTP address, login, and password.

Open an FTP client, establish a new connection to your server, and log in with your username and password. Once you've reached the server's file directory, you can download your access logs.
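The FTP step can also be scripted. Below is a minimal sketch using Python's standard ftplib; the host name, credentials, and paths are placeholders, not real values, so replace them with the details from your hosting provider:

```python
from ftplib import FTP

def download_access_log(host, user, password, remote_path, local_path):
    """Connect to the server over FTP and save the access log locally."""
    with FTP(host) as ftp:           # establish a new connection
        ftp.login(user, password)    # log in with your username and password
        with open(local_path, "wb") as f:
            # RETR streams the remote file's bytes into the local copy
            ftp.retrbinary(f"RETR {remote_path}", f.write)

# Example call (placeholder values -- replace with your own):
# download_access_log("ftp.example.com", "user", "secret",
#                     "/logs/access.log", "access.log")
```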

The most common mistakes

Log analysis can be challenging, and mistakes can be made, wasting resources and posing a security risk. Here are some of the most common log analysis blunders to avoid:

  • Lack of a centralized log system. Despite the fact that you'll be receiving log data from a variety of sources, it's critical to have centralized logs to make maintenance easier.

  • Analyzing logs only after a security breach. By monitoring logs regularly, you can operate proactively and fix a security concern before it grows.

  • Keeping logs for a limited period of time. It is recommended to archive old logs rather than delete them entirely.

How web server log file works

When a user enters a URL into a browser, the browser first breaks it into three components. Take https://example.com/example.html, for instance: the browser understands that https is the protocol, example.com is the name of the server, and example.html is the name of the file.
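This split can be reproduced with Python's standard urllib.parse, used here purely as an illustration of the three components:

```python
from urllib.parse import urlparse

# Break a URL into the three components the browser needs
parts = urlparse("https://example.com/example.html")

print(parts.scheme)  # protocol: "https"
print(parts.netloc)  # server name: "example.com"
print(parts.path)    # file name: "/example.html"
```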

The server name is converted to an IP address by the domain name server. An HTTP GET request for the requested page or file is then sent to the web server over the appropriate protocol; the HTML is returned to the browser, which interprets it to render the visible page on the screen. Each of these requests is recorded in the web server's log file.

To put it simply, the process looks like this: the visitor follows a link on a page, and the browser passes the request to the server hosting the website. In response, the server returns the requested page and then records the event in its log file.

All you need to analyze a website's search engine crawling is to export the data and filter out requests made by a robot, for instance, Googlebot. It is most convenient to filter by user agent and IP range.
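A minimal filtering pass over an exported log might look like the sketch below. It matches on the user-agent field only; for strict verification you would also check the requesting IP against Google's published ranges. The sample lines are invented for illustration:

```python
def googlebot_lines(lines):
    """Keep only requests whose user-agent field mentions Googlebot."""
    return [line for line in lines if "Googlebot" in line]

# Two invented access-log lines: one crawler hit, one regular visitor
sample = [
    '66.249.66.1 - - [12/Oct/2018:01:02:03 -0100] "GET / HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '203.0.113.7 - - [12/Oct/2018:01:02:04 -0100] "GET / HTTP/1.1" 200 512 "-" "Mozilla/5.0 (Windows NT 10.0)"',
]

for hit in googlebot_lines(sample):
    print(hit.split()[0])  # IP of each crawler request -> 66.249.66.1
```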

The log file itself is raw information: solid, unformatted text. But with proper processing and analysis it becomes an almost unlimited source of information.

Log file content and structure

The log file structure ultimately depends on the type of server used and its configurations. For instance, Apache log analysis will be different from Nginx log analysis. But there are several common attributes that are almost always present in the log file:

  • IP address request;
  • date and time;
  • geography;
  • GET/POST method;
  • URL request;
  • HTTP status code;
  • browser.

See a record example including the data above:

- - [12/Oct/2018:01:02:03 -0100] "GET /resources/whitepapers/retail-whitepaper/ HTTP/1.1" 200 - "-" "Opera/1.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

Additional attributes that can be sometimes available include the following:

  • host name;
  • request/client IP address;
  • loaded bytes;
  • time spent.
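As a sketch, a record like the one above can be parsed with a regular expression. The pattern below assumes Apache's "combined" log format; adjust it for Nginx or custom configurations:

```python
import re

# Pattern for Apache "combined" log format:
# IP, identity, user, [timestamp], "request", status, bytes, "referrer", "user agent"
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<url>\S+) [^"]*" '
    r'(?P<status>\d{3}) (?P<bytes>\S+) "(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
)

def parse_line(line):
    """Return a dict of the common attributes, or None if the line doesn't match."""
    match = LOG_PATTERN.match(line)
    return match.groupdict() if match else None

# Invented example line in the same shape as the record above
record = parse_line(
    '66.249.66.1 - - [12/Oct/2018:01:02:03 -0100] '
    '"GET /resources/whitepapers/retail-whitepaper/ HTTP/1.1" '
    '200 - "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"'
)
print(record["status"], record["agent"])  # 200 Mozilla/5.0 (compatible; Googlebot/2.1)
```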

WordPress log file export

For such a file to appear on a WordPress website, you must enable logging. To do this, find the file named wp-config.php in the root folder of the website and download it to your computer for editing.

Next, find the line "That's all, stop editing! Happy blogging." and add a new line of code before it:
define( 'WP_DEBUG', true );
This switches the website into debug mode, which enables the display of error notifications.

Now enable error recording to the log file. To do this, add another line right below the previous one:
define( 'WP_DEBUG_LOG', true );
To access the WordPress website's log file, go to FTP or the file manager. Then open the wp-content folder in the website's root directory and find the file called debug.log, as shown in the screenshot:
How to find a logfile on WordPress
On some websites, the file may be located in the logs or access logs folders.

This opens the log file; copy it and transfer it to Excel for easier sorting. Data covering a one-month period is usually used for analysis.
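To make the sorting step easier, raw access-log lines can be converted to CSV before opening them in a spreadsheet. This is a minimal sketch that assumes the common space-separated access-log layout; the sample line is invented for illustration:

```python
import csv
import io

def log_to_csv(lines):
    """Turn raw access-log lines into CSV text with ip/date/request/status columns."""
    out = io.StringIO()
    writer = csv.writer(out)
    writer.writerow(["ip", "date", "request", "status"])
    for line in lines:
        parts = line.split('"')      # the request sits between the first pair of quotes
        head = parts[0].split()      # ip - - [date tz]
        ip = head[0]
        date = head[3].lstrip("[")
        request = parts[1]
        status = parts[2].split()[0]
        writer.writerow([ip, date, request, status])
    return out.getvalue()

sample = ['66.249.66.1 - - [12/Oct/2018:01:02:03 -0100] "GET / HTTP/1.1" 200 512']
print(log_to_csv(sample))
```

Save the returned text as a .csv file and Excel will place each attribute in its own sortable column.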

Screaming Frog Log File Analyzer

For example, let's analyze log files through Screaming Frog Log File Analyzer.

The tool has a free version, which limits the event log to one thousand lines. For a small project, this is enough.

Download and install the software on your computer, then upload your log files or a list of all URLs present on the website (exporting the file is described above in this article). Open the tool and create a new project using the New button on the top panel:
Logs analysis in Screaming Frog Log File Analyzer
This will open the main control window, which collects information about visits to the web resource by search robots.
Logs analyzer Screaming Frog
See the report tabs for more information.
How to analyze logs online using Screaming Frog Log File Analyzer
The analysis of the logs shows, in a table, page response codes, dates of the last crawler visits, content, the number of requests by search bots, and more.

Here is a brief review of other analyzers.

GoAccess is designed for quick data analysis. Its main idea is to look at server logs and analyze them in real time without using a browser.

Splunk lets you process up to 500 MB of data per day for free. It is a great way to collect, store, search, compare, and analyze website logs.

Another tool in this category is designed specifically to improve software performance: its focus is on program data, including logs, and it currently offers only a paid version.

Logstash is a free, open-source tool for managing events and logs. It can be used to collect, store, and analyze logs.

FAQ. Common questions about analyzing website logs

How do I find website logs?

On most hosting servers, log files are stored in the /var/log directory, where site_name.access_log is the site's access log and site_name.error_log is the error log.

What do Web server logs show?

A web server log commonly includes request date and time, client IP address, requested page, bytes served, HTTP code, referrer, and user agent.
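For instance, a quick summary of the HTTP codes served over a month can be computed directly from those records. The sample lines below are invented, and the code assumes the status code is the first field after the quoted request:

```python
from collections import Counter

def status_counts(lines):
    """Count HTTP status codes across access-log lines."""
    codes = Counter()
    for line in lines:
        after_request = line.split('"')[2]   # text following the "METHOD URL PROTO" part
        codes[after_request.split()[0]] += 1
    return codes

sample = [
    '1.2.3.4 - - [12/Oct/2018:01:02:03 -0100] "GET / HTTP/1.1" 200 512',
    '1.2.3.4 - - [12/Oct/2018:01:02:04 -0100] "GET /old HTTP/1.1" 404 0',
    '1.2.3.4 - - [12/Oct/2018:01:02:05 -0100] "GET / HTTP/1.1" 200 512',
]
print(status_counts(sample))  # Counter({'200': 2, '404': 1})
```

A spike in 404 or 500 counts here is exactly the kind of error the analysis is meant to surface.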


Conclusion

Log file analysis is useful primarily for a website's SEO. To carry out the analysis, you will need to export the file from the root folder of the website.

In most cases, the log file contains the following data:

  • IP address request;
  • date and time;
  • geography;
  • GET/POST method;
  • URL request;
  • HTTP status code;
  • browser.

To analyze a file, sort it manually in an Excel spreadsheet or install Screaming Frog Log File Analyzer or a similar tool. There are also a number of tools that are installed directly on the website's server; this option is suitable if you run your own servers.

