
How to optimize SPA websites

Single-page application (SPA) websites have become popular thanks to their simplified architecture and convenience for the end user. To promote such a website, however, you need to serve pages to search bots correctly while still applying standard SEO techniques.

What are single-page applications

If you are on a website where the header and footer stay the same while the remaining elements load dynamically, you are looking at a single-page application.

A single-page application is built around a single HTML document that serves as a shell. Variable content is delivered to the user by a particular technology: Ajax, HTML5 APIs, a template engine, or a JavaScript framework.

In fact, after a browser request, the server does not need to build a completely new page. It returns only the part of the content that needs to be replaced according to the query, so the page updates quickly with only a partial reload.
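This partial-update cycle can be sketched as follows. The `/api/content` endpoint and the `#content` container are illustrative assumptions, not part of any specific framework:

```javascript
// A minimal sketch of a partial page update. The data shape
// ({title, body}) and the /api/content endpoint are hypothetical.
function renderFragment(data) {
  // Build only the markup that changes; the page shell stays untouched.
  return `<h1>${data.title}</h1><p>${data.body}</p>`;
}

// In a browser, the surrounding code would look roughly like this:
// fetch('/api/content?page=about')
//   .then((res) => res.json())
//   .then((data) => {
//     document.querySelector('#content').innerHTML = renderFragment(data);
//   });
```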

Advantages of SPA websites

  • A page is not reloaded with every click, so the content on it is not replaced wholesale;

  • information loads faster thanks to the simplified structure;

  • as on regular websites, each view can exist as a separate element with its own structure and functions, and a failure of one view will not affect the entire site.

Drawbacks of SPA websites

  • Bots index them differently from a traditional server-rendered HTML feed;

  • you need to know JavaScript thoroughly to create one;

  • search engine optimization of an SPA website is more complicated than usual;

  • JavaScript must be enabled for the browser to display the site correctly.

Of all the search engines, only Google (and Ask.com, which relies on Google's search results) can render single-page application pages. But as an optimizer, you should keep the rest of the search engines in mind.

To help bots crawl pages, add the <meta name="fragment" content="!"> tag. Google's bots now render pages directly, so in this case you just need to use the correct URL format.

How robots of Google see SPA websites

When accessing a page, the bot first scans it, then analyzes it and determines the total time required for processing. Tests show that if scanning takes more than 5 seconds, the bot leaves the page. The same happens when scanning scripts: if the search engine bot finds the code too complicated, it leaves the page. Given code complexity, obfuscation, and possible errors, bot algorithms may treat JavaScript files as insignificant.

Even a small error can result in the page not being crawled. One option is to create a simplified fallback version of the site for bots: not as functional as what the user sees, but containing the information search engines need for ranking. In any case, don't make your website too heavy.

To check whether a specific page is indexed by Google, use the site: operator. Enter the following in the search bar in incognito mode:
site:{site.com} "{fragment}"
The indexed page will be displayed in the search results. If nothing is displayed, the page has not been scanned by the bot. For direct analysis of a page, it is best to look at the DOM (Document Object Model) right away. To do this, select part of the text, right-click, and choose "Inspect Element" right in your browser.
View Document Object Model in a browser
The Document Object Model tracks dynamic changes on the page and is a hierarchy of tags and text strings. This structure has parent and child nodes that can be changed from the JavaScript application. For example, take this HTML page:
<!doctype html>
<title>Buy a laptop with backlight in Moscow</title>
<h1>Best laptops with backlight</h1>
<p>Welcome to our online store.</p>
<p>You can buy a laptop here. For example,
<a href="http://site.ru">this one</a>.</p>
The structure of the page will look like this:
html
├── head
│   └── title: "Buy a laptop with backlight in Moscow"
└── body
    ├── h1: "Best laptops with backlight"
    ├── p: "Welcome to our online store."
    └── p: "You can buy a laptop here. For example,"
        └── a[href="http://site.ru"]: "this one"
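Changing one of these nodes from JavaScript looks roughly like this. A plain object stands in for the DOM node so the sketch is self-contained; in a browser you would operate on the result of `document.querySelector('h1')`:

```javascript
// Sketch: the kind of change JavaScript makes to a child node. In a
// browser: document.querySelector('h1').textContent = 'New heading';
function setNodeText(node, text) {
  node.textContent = text; // replace the node's text content
  return node;
}
```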
Also, download the Chrome 41 browser: this is the exact version Google uses to render pages. It is useful to check this browser's console for errors from time to time.

If your site hides text behind a "read more" control, use the href attribute in links and do not rely solely on onClick. Without this attribute, the hidden pages will most likely be skipped by the bot. Do not use the plain hash # in the URL; replace it with the hashbang #! so the bot can read the page name.
Before: https://site.com/#url  
After: https://site.com/#!url
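A crawlable link therefore carries both a real href and a click handler. The sketch below generates such a link; `router.navigate` is a hypothetical SPA router call, not a real API:

```javascript
// Sketch: an SPA link with a real href as a fallback for the onClick
// handler. Bots follow the href; the SPA intercepts the click for users.
// "router.navigate" is a hypothetical router method.
function crawlableLink(path, text) {
  return `<a href="${path}" onclick="router.navigate('${path}'); return false;">${text}</a>`;
}
```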
To see the website as the bots see it, go to Google Search Console and use the Fetch and Render tool. Its report is accurate except for the timeout values: for the robot, they are shorter than what is shown there.
Page Report in the Google Search Console

Search engine optimization of SPA websites

Promote JavaScript applications in the traditional way. Sometimes indexing errors are not related to the website architecture at all but are caused by SEO mistakes. In that case, in addition to the standard actions, you need to:

  • Optimize the website scripts so that they are easier to read and faster to load.

  • Avoid using iframes, or create separate URLs for their content.

  • When creating links, use a format that offers static URLs in addition to JavaScript handlers, so that everyone can reach the page, including users with JavaScript disabled.

  • If you need to set up a redirect, feel free to use one. Tests show that the bot is able to follow a redirect to the desired page; a redirect in JavaScript works much like a 301 server-side redirect.
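A client-side redirect of that kind can be sketched as below. The `win` parameter stands in for the browser's window object so the function can be exercised outside a browser:

```javascript
// Sketch of a JavaScript redirect that Googlebot follows much like a
// server-side 301. location.replace() does not leave the old URL in
// history, the closest client-side analogue to a permanent redirect.
function redirectTo(win, newUrl) {
  win.location.replace(newUrl);
}

// In a real page: redirectTo(window, 'https://site.com/new-url');
```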

For optimization purposes, you also need to set up proper 404 pages. By default, a single-page application returns 200 OK even for non-existent pages, which is incorrect. Your task is to ensure that the 404 error is reflected in the page title.
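One way to surface the 404 state in the title is sketched below. The route check is illustrative; a real application would also configure the server to answer unknown URLs with an actual 404 status:

```javascript
// Sketch: put the 404 state into the page title for unknown routes.
// In a browser you would then assign the result to document.title.
function pageTitle(knownRoutes, path) {
  return knownRoutes.includes(path)
    ? `Page: ${path}`
    : '404: Page not found';
}
```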

To track your site in Google Analytics, use Google Tag Manager with the History Change trigger, or send data about each page manually through the set command, replacing the page variable with a new value.
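The manual approach with the classic analytics.js API looks roughly like this; `ga` is the global tracker function analytics.js provides, and the function would be called on every route change:

```javascript
// Sketch of manual virtual-pageview tracking with the analytics.js
// "set" command: overwrite the tracker's page field, then send a hit.
function trackVirtualPageview(ga, path) {
  ga('set', 'page', path);  // replace the tracker's page value
  ga('send', 'pageview');   // send a pageview hit for the new value
}
```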

How to make bots index your web pages

  • Use the ?_escaped_fragment_= parameter (currently not the most relevant option).

  • Serve HTML copies to bots identified through the User-agent header (Google will still see the original pages).

  • Transform the website into an isomorphic JavaScript application.
You can create duplicate pages either on your own server or on a third-party one. The first option suits large projects; the second suits medium and small ones using specialized tools. Of these methods, the third option, creating an isomorphic application, is considered the best. Google even recommends it:
Recommendations for optimization from Google SPA-sites
The essence is that users and bots initially receive the full set of information, onto which the JavaScript is then loaded. As a result, users see dynamic pages while bots get HTML copies of them.
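The User-agent approach from the list above can be sketched as a simple server-side check. The bot list is a small illustrative subset, and the Express-style handler in the comment uses a hypothetical `renderedHtmlSnapshot` helper:

```javascript
// Sketch: serve a prerendered HTML snapshot when the User-Agent
// looks like a search bot. The pattern covers only a few bots.
const BOT_PATTERN = /googlebot|bingbot|yandex|duckduckbot|baiduspider/i;

function shouldServeSnapshot(userAgent) {
  return BOT_PATTERN.test(userAgent || '');
}

// In an Express-style handler (illustrative):
// app.get('*', (req, res) => {
//   if (shouldServeSnapshot(req.headers['user-agent'])) {
//     res.send(renderedHtmlSnapshot(req.path)); // hypothetical helper
//   } else {
//     res.sendFile('index.html');
//   }
// });
```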


An SPA is a single page that downloads all scripts and CSS files up front. During user actions, only parts of the page change, while its shell remains unchanged.

To optimize SPA websites, apply the standard SEO tools and additionally ensure that URLs are formed so that bots can understand them.

Currently, only Google and Ask.com can index SPA websites in their original form. The rest of the search engines require HTML copies of the pages. Ideally, the user enjoys the JavaScript functionality while the bots read simplified, clear duplicates.

This article is part of Serpstat's Checklist tool
Checklist is a ready-made to-do list that helps you track work progress on a specific project. The tool contains templates with an extensive list of project development parameters, and you can also add your own items and plans.
Try Checklist now

