How-to · 10 min read · September 3, 2019

How to optimize SPA websites

SPA websites have become popular thanks to their simplified architecture and convenience for the end user. To promote such websites, however, you need to serve pages to search bots correctly while still applying the standard SEO tools.

What are SPA websites

If you are on a website where the header and footer stay the same while the remaining elements load dynamically, you are looking at a single-page application.

When a single-page application is developed, a single HTML document serves as its shell. The variable content is delivered to the user via a particular technology: Ajax, HTML5, a template engine, or a JavaScript framework.

In fact, after a browser request, the server does not need to build a completely new page. It returns to the browser only the part of the information that must be replaced according to the query, so the partial page update happens quickly.
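As an illustration, here is a minimal sketch of such a partial update; the /api/content endpoint and the #content container are assumptions for the example, not part of any specific framework:

// Load only the fragment that changed and swap it into the page,
// leaving the header and footer untouched.
async function showSection(path) {
  const response = await fetch(`/api/content${path}`); // hypothetical endpoint
  const html = await response.text();
  document.querySelector('#content').innerHTML = html; // replace only the main area
}

showSection('/about');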

Advantages of SPA websites

  • The page is not reloaded with every click, so the information on it is not replaced completely;

  • information loads faster thanks to the simplified structure;

  • as on regular websites, each page can exist as a separate element with its own structure and functions, so a failure on one page will not bring down the entire site.

Drawbacks of SPA websites

  • Indexing by search bots differs from the traditional crawling of server-rendered HTML;

  • you need to study JavaScript thoroughly to create one;

  • search engine optimization of the SPA website is more complicated than usual;

  • JavaScript must be enabled in the browser for the site to display correctly.

Of all the search engines, only Google and Ask.com can render single-page application pages in their search results. As an optimizer, though, you should keep the rest of the search engines in mind.

To help bots crawl the pages, add the <meta name="fragment" content="!"> tag. Google's bots now render pages directly, so in that case you only need to use the correct URL format.
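For reference, the tag belongs in the page head. Under Google's old (now deprecated) AJAX crawling scheme, it told the crawler to request an HTML snapshot of the page at the ?_escaped_fragment_= URL instead:

<head>
<!-- Tells the crawler an HTML snapshot is served at ?_escaped_fragment_= -->
<meta name="fragment" content="!">
</head>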

How Google robots see SPA websites

When accessing a page, the bot first crawls it, then analyzes it and determines the total time required for processing. Tests show that if crawling takes more than 5 seconds, the bot leaves the page. The same happens when it scans scripts: if the search engine bot finds the code too complicated, it leaves. Given the complexity, the obfuscation of code, and possible errors, bot algorithms may treat JavaScript files as insignificant.

Even a small error can result in the page not being crawled. One option is to create a so-called polyfill version: a simplified variant of the site for bots, not as functional as what the user sees, but containing the information needed for ranking by search engines. In any case, do not make your website too heavy.

To check whether a specific page is indexed in Google, use the site: operator. To do this, enter the following in the search bar in incognito mode:
site:site.com "a text fragment from the page"
The indexed page will be displayed in the search results. If nothing is displayed, the page has not been crawled by the bot. For a direct analysis of the page, it is best to look straight at the DOM (Document Object Model). To do this, select a part of the text, right-click, and choose "Inspect Element" in your browser.
[Image: viewing the Document Object Model in a browser]
The Document Object Model tracks dynamic changes on the page and is a hierarchy of tags and text nodes. The structure has parent and child nodes that a JavaScript application can modify. For example, take this HTML page:
<!doctype html>
<html>
  <head>
    <title>Buy a laptop with backlight in Moscow</title>
  </head>
  <body>
    <h1>Best laptops with backlight</h1>
    <p>Welcome to our online store.</p>
    <p>You can buy a laptop here. For example,
      <a href="http://site.ru">this one</a>.</p>
  </body>
</html>
The structure of the page will look like this:

html
  head
    title: "Buy a laptop with backlight in Moscow"
  body
    h1: "Best laptops with backlight"
    p: "Welcome to our online store."
    p: "You can buy a laptop here. For example, ..."
      a (href="http://site.ru"): "this one"
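Any of these nodes can be modified from JavaScript, which is exactly what an SPA does on every view switch. A minimal sketch based on the page above:

// Change the text of the h1 node and append a new paragraph to body.
document.querySelector('h1').textContent = 'Best gaming laptops';

const note = document.createElement('p');
note.textContent = 'Free delivery across Moscow.';
document.body.appendChild(note);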
Also, download the Chrome 41 browser: this is exactly the version Google uses to render pages. It is useful to check for errors in this browser's console from time to time.

If your site hides text under a cut, use the href attribute in the links and do not rely solely on onClick. Without this attribute, the bot will most likely skip the pages under the cut. Also, do not use the plain hash # in the URL; replace it with the hashbang #! so that the bot can read the page name.
Before: https://site.com/#url  
After: https://site.com/#!url
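One common pattern, sketched below, keeps a real URL in href for bots while JavaScript intercepts the click for regular users; loadSection is a hypothetical client-side loader, not a library call:

// Intercept clicks on internal links marked with data-spa and load the
// content dynamically; bots and no-JS users still follow the real href.
document.addEventListener('click', (event) => {
  const link = event.target.closest('a[data-spa]');
  if (!link) return;
  event.preventDefault();
  history.pushState({}, '', link.href); // update the address bar
  loadSection(link.pathname);           // hypothetical content loader
});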
To see the website through the bots' eyes, go to the Google Search Console and find the Fetch and Render tool. The information there is accurate except for the timeout values: the robot's actual timeouts are shorter than what the tool suggests.
[Image: a page report in the Google Search Console]

Search engine optimization of SPA websites

Promote JavaScript applications in the traditional way. Sometimes indexing errors are not directly related to the website architecture but are caused by actions that are improper from the SEO point of view. In that case, in addition to the standard actions, it is necessary to:

1. Optimize the website scripts so that they are easier to read and faster to load.

2. Avoid using iFrames, or create separate URLs for their content.

3. When creating links, use a format that offers static URLs in addition to JavaScript, so that everyone can see the page, including users with JavaScript disabled in the browser (see the click-interception sketch above).
If you need to set up a redirect, feel free to use one. Tests show that the bot is able to follow a redirect to the desired page; that is, a JavaScript redirect works much like a 301 redirect in HTML.
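A minimal sketch of such a redirect; /new-page is a placeholder URL:

// Googlebot follows this much like a server-side 301 redirect.
// replace() does not leave the old URL in the browser history.
window.location.replace('/new-page');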

For optimization purposes, you also need to handle 404 pages properly. By default, single-page applications return 200 OK even for non-existent pages, which is incorrect. Your task is to make sure that the 404 error is at least reflected in the page title.
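One way to do this on the client side, sketched under the assumption of a hypothetical routeExists() lookup, is to mark unknown routes so bots do not index them:

// For an unknown route, reflect the error in the title (as suggested above)
// and ask search engines not to index the error view.
// routeExists() is a hypothetical lookup against the app's route table.
if (!routeExists(window.location.pathname)) {
  document.title = '404 - Page not found';
  const meta = document.createElement('meta');
  meta.name = 'robots';
  meta.content = 'noindex';
  document.head.appendChild(meta);
}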

To track your site in Google Analytics, use the Tag Manager with the History trigger, or send the data about each page manually through the set command, replacing the page variable with a new value.
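With the classic analytics.js snippet, for example, the set command updates the tracked page path before a new pageview is sent (a minimal sketch; with gtag.js or Tag Manager the calls differ):

// After the SPA swaps content, report a virtual page view.
ga('set', 'page', '/laptops'); // replace the tracked page path
ga('send', 'pageview');        // send the hit for the new "page"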

How to make bots index your web pages

1. Use the ?_escaped_fragment_= parameter. This is currently not the most relevant option.

2. Serve HTML copies to bots by detecting them through the User-agent header; Google will still see the original pages (see the sketch after this list).

3. Turn the website into an isomorphic JavaScript application.
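A rough sketch of the second option using Express; the bot pattern and the getSnapshotFor() helper are illustrative assumptions, and the HTML copies served to bots should stay equivalent to the real pages to avoid cloaking:

// Serve a prerendered HTML copy to known bots; everyone else gets the SPA.
const express = require('express');
const app = express();

const BOT_PATTERN = /googlebot|bingbot|yandex|baiduspider/i;

app.use((req, res, next) => {
  if (BOT_PATTERN.test(req.get('user-agent') || '')) {
    res.send(getSnapshotFor(req.path)); // hypothetical snapshot store
  } else {
    next(); // regular users receive the JavaScript application
  }
});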
You can create duplicate pages either on your own server or on a third-party one. The first option suits large projects; the second suits medium and small projects with the help of specialized tools. Of these optimization methods, the third option, creating an isomorphic application, is considered the best. It is even recommended by Google:
[Image: Google's recommendations for optimizing SPA websites]
The essence is that both users and bots initially receive the complete set of content, after which the JavaScript files are loaded. As a result, the user sees dynamic pages while the bots get HTML copies of them.
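A highly simplified sketch of the idea with Express; real projects usually rely on a framework such as Next.js or Nuxt, and renderProduct() here stands in for the render function shared by server and client:

// The server sends fully rendered HTML; the client bundle then takes over
// and re-renders the same markup for dynamic behavior ("hydration").
const express = require('express');
const app = express();

// Shared render function: the same code runs on the server and in the browser.
function renderProduct(product) {
  return '<h1>' + product.title + '</h1><p>' + product.description + '</p>';
}

app.get('/products/:id', (req, res) => {
  const product = { title: 'Laptop with backlight', description: 'Sample data' }; // stub data
  res.send('<!doctype html><html><head><title>' + product.title +
    '</title></head><body><div id="app">' + renderProduct(product) +
    '</div><script src="/bundle.js"></script></body></html>');
});

app.listen(3000);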

Conclusion

An SPA is a single page that downloads all scripts and CSS files at once. During user actions, only parts of the page change, while its shell remains the same.

To optimize SPA websites, you must apply the standard SEO tools and additionally ensure that URLs are shaped so that bots can understand them.

Currently, only Google and Ask.com are able to index SPA websites in their original form. The rest of the search engines require HTML copies of the pages. Ideally, the user enjoys the full JavaScript functionality while the bots read simplified, clear duplicates.

This article is a part of Serpstat's Checklist tool

[Image: the Checklist tool at Serpstat]

Checklist is a ready-made to-do list that helps you track the progress of work on a specific project. The tool contains templates with an extensive list of project development parameters, and you can also add your own items and plans.
Try Checklist now
