SEO · July 3, 2019

The Pitfalls Of Optimizing JavaScript Sites: Are They Worth It?

Stacy Mine
Editor at Serpstat
The optimization of JavaScript-based sites is a constant topic of discussion. On the one hand, these sites offer richer UX options and fast load times; on the other hand, the SEO work they require creates problems. We asked experts to clarify whether SEO and JavaScript fit together, and in which cases it is advisable to build a JS site. I have collected their responses and prepared a summary. Enjoy!

What are the pitfalls of optimizing JS-based sites?

JavaScript and SEO are a hot topic today, as more and more sites use advanced JavaScript frameworks and libraries such as Angular, React, Vue.js, and Polymer.

However, the reality is that what developers build cannot always be adapted to the requirements of the search robot. Although Google declares that it is pretty good at rendering JavaScript sites, those statements differ from the actual state of affairs: no matter how well the optimization is done or how old the domain is, a site written entirely in JS will have difficulties with optimization. Googlebot does run JS scripts, but not in the way we would like; for example, while crawling a JS-based site, the bot may not even reach its second level.

As far as Google is concerned, successful optimization requires simple, accessible sites, and JavaScript can complicate everything. Even though JavaScript should be accessible for indexing by search engines, it is still not worth relying on it to store and deliver content that is important for ranking, since efforts to optimize such content will likely have zero effect.
We can highlight the following problems in optimizing JS sites:
JavaScript is very error-prone. One mistake in your JS code may lead to Google's inability to display your page.
Search engines can recognize content transmitted via JS, but not in all cases. For example, if the code is too complicated, confusing, or contains errors, the search engine may not recognize it.
It takes a lot of time to parse, compile, and run JS files.
If the site is full of JavaScript elements, Googlebot has to wait until all the JS components have loaded, and only then can it index the content.
If content takes more than about five seconds to load via JavaScript, search engines may not see it at all.
Google is often unable to discover new URLs on JS sites and crawl them properly; as a result, some SEO measures simply don't work.
A quite common mistake is blocking scripts from crawling in the robots.txt file; as a result, search engines may not index the dynamically loaded content.
Many features are simply not available to Googlebot.
Sites built on JS engines often use browser onclick events instead of links; search engines see such pseudo-links, but link equity is not passed through them.
Indexing JS content takes more time than indexing HTML content. If the site is not written entirely in JS (which is extremely rare), then when a search bot visits a page, the HTML content is indexed first, and the search engine gets to the JS content only after a few such visits.
There are still many sites that hide parts of their text (scrollers, "read more" buttons, etc.) so that the text is not in the DOM at all and is only injected by a JS script on an onclick event. It is often enough to place such text in the DOM from the start; this usually leads to almost instantaneous indexing.
Using JS on image-heavy sites. Developers often focus on the visual side of such resources, which hurts page-load performance. The problem is usually that JS libraries are included at the very beginning of the page markup. Such errors can be fixed by splitting some of the scripts into separate files that are called only when needed on the right pages, and by moving the loading of the remaining scripts as far down the page as possible (see the sketch after this list).
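
To illustrate the last point, here is a minimal sketch (not from the article) of loading a non-critical script only on pages that actually need it. The file path and the initGallery function are hypothetical placeholders.

```javascript
// Minimal sketch: load a heavy, non-critical script only on pages that need it,
// instead of including it at the top of every page.
// '/js/gallery.js' and window.initGallery are hypothetical placeholders.
function loadScript(src) {
  return new Promise((resolve, reject) => {
    const script = document.createElement('script');
    script.src = src;            // dynamically injected scripts load without blocking HTML parsing
    script.onload = resolve;
    script.onerror = reject;
    document.head.appendChild(script);
  });
}

// Only pages that actually contain a gallery pay the cost of the gallery code.
if (document.querySelector('.image-gallery')) {
  loadScript('/js/gallery.js')
    .then(() => window.initGallery && window.initGallery())
    .catch(() => console.warn('Gallery script failed to load'));
}
```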
As for Yandex, it is no secret that it still cannot correctly index sites built on JS engines, although it is taking its first steps in this direction.
 
As for SEO on JavaScript sites, you should set up server-side rendering for all pages so that search engines get access to the markup (HTML and CSS). With the standard approach (most often PHP sites), the browser or robot loads the full page code from the server; it then only needs to process the HTML (plus CSS) and display a page that is ready for interaction (for a user) or analysis (for a robot). Sites that use server rendering usually have no problems with indexing by search engines.
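
To make this more concrete, here is a minimal sketch of server-side rendering. It assumes a Node.js stack with Express and React; the article itself mentions PHP and does not prescribe a framework, so the stack here is an assumption.

```javascript
// Minimal server-side rendering sketch (assumed stack: Express + React).
const express = require('express');
const React = require('react');
const { renderToString } = require('react-dom/server');

const app = express();

// In a real project this would be your application's root component.
function App({ url }) {
  return React.createElement('h1', null, `Rendered on the server for ${url}`);
}

app.get('*', (req, res) => {
  // The markup is generated on the server, so crawlers receive ready-made HTML
  // instead of an empty shell that still has to execute JavaScript.
  const markup = renderToString(React.createElement(App, { url: req.url }));
  res.send(`<!DOCTYPE html><html><body><div id="root">${markup}</div></body></html>`);
});

app.listen(3000);
```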

Since JavaScript sites often render pages on the client side, there are issues during the initial page load. Before the JavaScript runs, the browser and bots receive an essentially blank page, which then asynchronously loads data from the server side of the application and updates the markup. When using this kind of engine, it is vital to make sure search engines crawl the site's pages correctly, so that a single error in a script does not block a page from being indexed.

You also need to pay attention to how long the engine spends assembling the page code. If it takes more than 4-5 seconds, the search robot simply won't be able to index the page content correctly, and the site will be uncomfortable for users.

You should always analyze your site's cached pages in Google. The text cache shows whether Googlebot sees your site the way you show it to users. You should also inspect pages in Google Search Console as Googlebot to check for errors. It is important to remember that JS events that populate the site are seen differently by users and by Googlebot (for example, clicking a JS link or opening a form with dynamically loaded content). This can be a problem for indexing certain types of content (for example, a block of reviews), or a way to hide content you don't want Googlebot to see.
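
A rough way to approximate such a check is to fetch the raw HTML and see whether the content you care about is in it before any JavaScript runs. This is only a sketch: the URL and phrase are placeholders, it assumes Node 18+ for the global fetch, and real Googlebot does eventually render JavaScript, so this only shows what is available before rendering.

```javascript
// Quick pre-render check: is the important content in the initial HTML,
// or does it only appear after JavaScript runs?
// The URL and the phrase are placeholders.
const url = 'https://example.com/product-page';
const mustContain = 'Customer reviews';

fetch(url, { headers: { 'User-Agent': 'Googlebot' } })
  .then((res) => res.text())
  .then((html) => {
    if (html.includes(mustContain)) {
      console.log('The phrase is present in the initial HTML.');
    } else {
      console.log('The phrase is missing - it is probably injected by JavaScript.');
    }
  });
```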

Typically, implementing such simple solutions can significantly cut down on these failures.
So, what conclusions can we draw:
The main issue with JavaScript is indexing.
JavaScript is prone to errors.
Wrong content can end up in search results.
Technical limitations of Google: many features are not available to Googlebot.
Search engines may not see content on a page if its JS elements take around five seconds or more to load.

Under what conditions should a business consider using a JS engine?

Average sites don't need to consider JS engines, as there are simpler, more suitable solutions for ordinary tasks. Using a JS engine makes sense if you plan to build a fast, feature-rich site for a vast audience. JavaScript sites have a future, but today, if you plan to grow your business through organic search, you should weigh all the pros and cons. A JavaScript site can be launched safely if you expect to get most of your visitors directly, via referral links, or through contextual advertising. If you rely on clicks from search results, optimizing such a site can be complicated, since search engines don't always read and index JavaScript sites correctly.

Services with an emphasis on user engagement should have thought about switching to JavaScript a long time ago. SPA sites help improve engagement and retain users longer thanks to their advantages over ordinary sites. This is a subjective opinion, but it matches the direction in which the eCommerce leaders are currently developing: projects use a single codebase to reach users regardless of device or platform.

If search traffic is significant for the business, building a JS site is not recommended, as the subsequent SEO is too complicated. A JS engine is worth considering if speed is the most important thing for your site (a correctly built JS-based site can guarantee minimal delays on both the server and client side), or if your business simply needs to be on trend (that is, the very existence of such a site is what matters).

One of the main conditions is the willingness to invest financially in the development of the JS engine, since the quality of the product has a direct impact on how well search engines index the site. But a well-designed JavaScript site is one of the components of a successful modern business.

But if you run a small company or agency, there is no reason to spend money on such a site for 1,000-10,000 visitors per month.
Steven van Vessum
VP of Community at ContentKing
Can you successfully optimize a JS website?
Yes, provided you make the JS website easily crawlable and indexable for search engines, using server-side rendering or workarounds like dynamic rendering.

Why? If you're relying on search engines to render the pages, you're going to have a hard time ranking your site. It's like starting a soccer game 5-0 down.

Which aspects should you consider when optimizing JS websites?
Making sure that search engines can easily discover, crawl, and index your content is essential for every website, and that includes websites that rely heavily on JS.

There are multiple ways to go about optimizing JS websites, notably server-side rendering or dynamic rendering.
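
As an illustration of the dynamic-rendering idea, here is a minimal sketch (not ContentKing's setup): bots receive prerendered HTML, while regular visitors get the normal client-side app. getPrerenderedHtml is a hypothetical stand-in for a real prerenderer such as Puppeteer, Rendertron, or a prerendering service.

```javascript
// Minimal dynamic-rendering sketch: serve prerendered HTML to crawlers,
// the regular single-page app to everyone else.
const express = require('express');
const app = express();

const BOT_PATTERN = /googlebot|bingbot|yandex|baiduspider|duckduckbot/i;

// Hypothetical stand-in for a real prerenderer (Puppeteer, Rendertron, etc.).
async function getPrerenderedHtml(url) {
  return `<!DOCTYPE html><html><body><h1>Prerendered ${url}</h1></body></html>`;
}

app.get('*', async (req, res, next) => {
  const userAgent = req.headers['user-agent'] || '';
  if (BOT_PATTERN.test(userAgent)) {
    // Crawlers get fully rendered markup they can index immediately.
    return res.send(await getPrerenderedHtml(req.url));
  }
  next(); // everyone else falls through to the client-side app
});

app.use(express.static('build')); // the regular single-page app bundle
app.listen(3000);
```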

What SEO advantages or disadvantages do these sites have?
In my book, JS websites don't have SEO advantages, only UX advantages.

Example: one of the best-known JS websites is Twitter. It provides a highly interactive interface and continuously shows updates.

When we're talking about the disadvantages of JS websites, and provided you're not using any kind of server-side rendering or dynamic rendering, I'd say the biggest disadvantage is that you're relying on search engines to render the pages. This process is highly inefficient and slow. Rendering pages costs a lot of server resources, so search engines can only allocate a small share of their resources to rendering web pages. This means you shouldn't be surprised if it takes search engines several days, or even weeks, to index your freshly published content. For comparison: with little effort you can have regular HTML pages indexed within a few hours, without doing anything fancy.

In which cases is it worth creating a JS-based site?
The Twitter example I already mentioned fits here: when you want to provide a highly interactive interface and real-time updates. Take, for instance, a website that keeps track of sports results (e.g. soccer scores): you'd want to update your visitors about changes in real time.

It's good to note that there are plenty of use cases in which it totally makes sense to rely heavily on JS when developing sites. For instance, for web apps such as ContentKing, JS is essential to provide the user experience we're aiming for.

Make sure you use JavaScript wisely, and don't rely on search engines to just "figure it out."
Let's draw some conclusions:
JavaScript is not suitable for typical small companies and agencies.
JavaScript sites and SEO can become friends, but it takes more effort and knowledge. The result is worth it, though, and there are many successful examples.
JavaScript sites have great potential for large, expensive projects with extensive features.
A JavaScript website can be launched if the majority of visitors come directly, via referral links or contextual advertising.
Also, JavaScript offers a greater advantage to sites with a high degree of interactivity.
And what do you think about sites on JavaScript engines? Share your experience and leave your comments!
