SEO · 16 min read · September 1, 2020

How A Web Entity Can Grow Traffic And Lose A Google Core Algorithm Update At The Same Time

Евгения Сидорчук
Holistic SEO Expert at Performics
This article is the final part of the Unibaby SEO case study. In the previous parts, I brought together our successes and failures with SEO theories and brainstorming.
No one knows the exact number of Google ranking factors. But if Google has millions of baby algorithms, I am sure the number is far above 200. If, as SEOs, we want to win every Google Core Algorithm Update, we need to think more logically, analytically, and freely about our web entity's situation.
What about the things we haven't even talked about? What did all of these millions of baby algorithms find about our website? Which site created more crawling cost for Google? Which site was easier for Google to understand? Which site's content, services, and functions were faster, more helpful, and easier to use?

When we ask all of these questions, we can understand why we couldn't win the Core Algorithm Update.

To help you imagine this situation, below is a list of the undone tasks for Unibaby.com.tr.
Crawl efficiency and cost along with Page Speed
I always think PageSpeed and crawl efficiency go together because they are connected. If you improve your website's speed, Googlebot can crawl, render, and index your web entity faster, just as users can reach their goals more quickly.
You may see that most of the images on our web pages couldn't be rendered by Googlebot. In my experience, errors like this usually point to crawl quota problems.
Some of the errors related to crawl efficiency and PageSpeed:
We didn't create a dynamic or static sitemap because of technical issues (except the .axd version, which covers 23 percent of the web entity).
We still have internal links returning 301, 302, 404, and 410 status codes in our source code (a quick way to audit these is sketched after this list).
We didn't mark all of the 404 status codes as 410.
We didn't perform CSS refactoring.
We didn't perform JavaScript tree shaking.
We didn't use proper image alt tags.
We didn't use image URLs.
Our web server responds to Googlebot in about 1 second on average, which is quite slow for both users and search engines.
Our category and product pages are still weak in terms of content, context, speed and relevance.
We have a faceted navigation structure that creates duplicate pages that canonicalize themselves.
We didn't clean unused codes from our web pages.
We didn't create bundles to reduce the request count.
We haven't decreased our DOM Size.
We haven't optimized our critical rendering path.
We haven't compressed or minified all of the resources with 100% efficiency.
We haven't started to use HTTP/2 instead of HTTP/1.1.
We haven't compressed our WOFF font files or unified them.
We have implemented lazy loading, but it was not compatible with all user agents.
We haven't fixed the mixed content problems for every web page.
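To make the points above about internal links returning 301/302/404/410 and about slow server responses more concrete, here is a minimal audit sketch. It assumes the Python requests and beautifulsoup4 packages and a hypothetical start URL; it illustrates the idea, not the exact tooling we used.

```python
# Minimal internal-link status and response-time audit (illustrative sketch).
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

START_URL = "https://unibaby.com.tr/"  # hypothetical entry point


def audit_internal_links(start_url: str) -> None:
    """Print internal links that redirect (301/302), are gone (404/410),
    or respond slower than one second."""
    host = urlparse(start_url).netloc
    html = requests.get(start_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    seen = set()
    for a in soup.find_all("a", href=True):
        url = urljoin(start_url, a["href"])
        if urlparse(url).netloc != host or url in seen:
            continue  # skip external links and duplicates
        seen.add(url)
        # allow_redirects=False so 301/302 responses are reported, not followed
        resp = requests.head(url, allow_redirects=False, timeout=10)
        slow = resp.elapsed.total_seconds() > 1
        if resp.status_code in (301, 302, 404, 410) or slow:
            print(f"{resp.status_code}  {resp.elapsed.total_seconds():.2f}s  {url}")


if __name__ == "__main__":
    audit_internal_links(START_URL)
```

In practice you would run this over the full crawl output rather than a single page, but the same status-code and timing checks apply.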
With Serpstat, you may perform a comprehensive SEO Audit in terms of On-page, Off-page, Technical SEO, and Pagespeed. On a single screen, you may find everything you want to know.
Effective Site Audit With Serpstat: Tool Overview
Crawl efficiency and wrong site structure
This is another critical issue that is hard to explain in a few lines. That's why I am focusing on key points without going into search engine theory and practice.
We didn't optimize our site structure: products sit in categories other than their main and actual type, without proper grouping or hierarchy (this also relates to wrong PageRank distribution and category purpose profiling issues).
Our general product category did not correlate with our URL structure and breadcrumbs.
Most of our important pages are more than three clicks deep, which tells Googlebot that these pages are not so important.
We haven't optimized our internal link and PageRank distribution; there are unimportant pages with more internal links than even the homepage.
We haven't used semantic HTML and schema markup to their full potential on every web page (especially the product and eCommerce pages). If you wonder whether this is related to crawl efficiency, the answer is yes (a minimal markup sketch follows this list).
You may see an example of our internal link structure and estimated cumulative PageRank distribution. During these two months, we have narrowed the gap between the homepage, informative content, and product pages, but it is still not enough.
We haven't targeted proper queries for different categories and products, nor used an appropriate hierarchy for both eCommerce and informative blog content.
We haven't used sufficiently optimized cross-linking.
We are using server-side cache and service workers, but we do not use dynamic rendering for better crawl efficiency.
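One of the undone items above is schema markup for the product pages. Below is a minimal, hypothetical sketch of how a Product JSON-LD block could be generated and embedded; the product fields are invented examples, not Unibaby's real data.

```python
# Minimal Product JSON-LD sketch (hypothetical product data).
import json

product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Baby Bottle",
    "image": "https://unibaby.com.tr/images/example.jpg",  # placeholder
    "offers": {
        "@type": "Offer",
        "priceCurrency": "TRY",
        "price": "99.90",
        "availability": "https://schema.org/InStock",
    },
}

# The resulting tag would be embedded in the product page's HTML.
print(f'<script type="application/ld+json">{json.dumps(product, ensure_ascii=False)}</script>')
```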
If you have a wrong site structure, Google can't understand the context of your web pages and the relevance of their main queries. That's why I always recommend using a sitemap index: grouping similar category pages into their own sitemaps can help Google understand their relevance to each other. It will also help you read the Coverage report better.
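Here is a minimal sketch of the sitemap index idea described above: similar category pages are grouped into their own sitemaps, and a sitemap index points at all of them. The categories, URLs, and file names are hypothetical, and the script uses only Python's standard library.

```python
# Minimal sitemap index sketch: one sitemap per category group, plus an index.
from xml.etree.ElementTree import Element, SubElement, ElementTree

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
BASE = "https://unibaby.com.tr"  # assumed domain

# Hypothetical grouping: one sitemap per top-level category
groups = {
    "sitemap-products.xml": [f"{BASE}/products/example-product"],
    "sitemap-blog.xml": [f"{BASE}/blog/example-post"],
}


def write_urlset(filename, urls):
    """Write a standard <urlset> sitemap containing the given URLs."""
    urlset = Element("urlset", xmlns=SITEMAP_NS)
    for url in urls:
        SubElement(SubElement(urlset, "url"), "loc").text = url
    ElementTree(urlset).write(filename, encoding="utf-8", xml_declaration=True)


# Write one sitemap per group, then an index pointing at all of them.
for name, urls in groups.items():
    write_urlset(name, urls)

index = Element("sitemapindex", xmlns=SITEMAP_NS)
for name in groups:
    SubElement(SubElement(index, "sitemap"), "loc").text = f"{BASE}/{name}"
ElementTree(index).write("sitemap-index.xml", encoding="utf-8", xml_declaration=True)
```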
Content, meta tags and User Experience errors
Even content length and unique value affect a domain's authority and crawl efficiency (evaluation budget). Still, since I have already taken up more than enough of your time, I won't go there.
With Serpstat, you may perform a quick one-page audit to get a general picture of your web entity in terms of technical SEO and user experience.
We haven't optimized all of the meta tags for every web page because of the brand's time management issues (a scripted check for the basics is sketched at the end of this section).
We haven't cleaned all of the thin content.
We haven't cleaned all of the pages with a bad layout.
We haven't fixed the User Experience errors on the web site.
We are continuing to use interstitial pop-ups despite Google's guidelines.
We are hiding the interstitial pop-ups from Googlebot because of the third-party script providers.
We didn't follow Google's UX and design recommendations for every web page URL.
We didn't calibrate our color contrast for different page components.
We haven't closed the content gap for every competitor.
We haven't merged the web pages with weak authority.
We haven't properly adjusted our web page layouts for every different screen size.
Most of the old content is not refreshed properly.
Query Deserves Freshness is a motto of Mr. Singhal (if you don't know Mr. Singhal, good luck with your SEO career).
In this section, as before, we only half-completed the tasks. We compressed the images, but the brand couldn't review them properly. Because of this and an intense marketing program, we couldn't fix more than 10% of the image alt tags and URLs.
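As a small illustration of the meta tag item mentioned above, here is a minimal sketch that flags missing or overly long titles and meta descriptions. It assumes the Python requests and beautifulsoup4 packages; the URL list and the length thresholds are illustrative assumptions, not official limits.

```python
# Minimal title / meta-description check (illustrative thresholds).
import requests
from bs4 import BeautifulSoup

URLS = ["https://unibaby.com.tr/"]  # hypothetical pages to check

for url in URLS:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    title = soup.title.string.strip() if soup.title and soup.title.string else ""
    desc_tag = soup.find("meta", attrs={"name": "description"})
    desc = (desc_tag.get("content") or "").strip() if desc_tag else ""

    problems = []
    if not title or len(title) > 60:
        problems.append(f"title length {len(title)}")
    if not desc or len(desc) > 160:
        problems.append(f"description length {len(desc)}")
    if problems:
        print(url, "->", ", ".join(problems))
```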
Web accessibility, social media, user retention and security
Unibaby.com.tr has some security vulnerabilities related to old jQuery versions (imagine a YMYL site with a security issue). Unibaby.com.tr is using jQuery UI 1.12.1 and jQuery 3.5.1, and both have known security vulnerabilities.
Source: Snyk.io
In terms of web accessibility, Unibaby.com.tr has no advantage over its competitors. From image alt attributes to ARIA labels, almost everything is missing. If you wonder whether Google cares about web accessibility, you can be sure that the answer is yes: the Chrome and Google Developers teams are attentive to it. You may also find some Webmaster Hangouts recordings in which Martin Splitt gives details about web accessibility and its importance.

Furthermore, we have not optimized the above-the-fold section for a better user experience on every browser and device. For a better understanding of what above the fold means to Google, you may check Matt Cutts' explanation of the Page Layout Algorithm Update and a related Google patent from 2004.
As for social media, Unibaby.com.tr has not used its best free community-building channels as well as some of its competitors so far. Despite this, Unibaby.com.tr has been doing its best to increase user retention. Since we increased our organic traffic by nearly 200%, most of our users are new, and they are regularly returning.
Another security issue was cross-site cookies: for 14 different cookies, the SameSite attribute wasn't (and still isn't) used properly. For more information, see: Cross-Site Cookies.
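A minimal sketch of how the SameSite issue could be spotted is below. It assumes the Python requests package and a placeholder URL, and it only flags cookies whose Set-Cookie header omits a SameSite attribute entirely (it checks the final response of the request).

```python
# Minimal check for cookies set without a SameSite attribute.
import requests

URL = "https://unibaby.com.tr/"  # hypothetical page to inspect

resp = requests.get(URL, timeout=10)
# Read the raw Set-Cookie headers, since requests' cookie jar drops attributes.
for header in resp.raw.headers.getlist("Set-Cookie"):
    name = header.split("=", 1)[0]
    if "samesite" not in header.lower():
        print(f"Cookie '{name}' is set without a SameSite attribute")
```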
If you are an experienced SEO auditor, you can easily guess why and how these things couldn't be done, so I won't go there. But anticipating the date of a future Core Algorithm Update, following Google's improvements, and calibrating your web entity for the users' benefit and the search engine's opinions should have been a reflex for SEOs since the Medic Update.
Conclusion of this SEO case study
I know this was a detailed article about how Google thinks and evaluates. But if you think that we have done enough research to answer why we couldn't win the Core Algorithm Update, you are wrong. Google has tons of data sources, from Chrome, Gmail, and DoubleClick to Android phones and home speakers. It gathers all of these resources to understand which web entity is better than the others, and it builds detailed algorithms to differentiate the tiniest ranking factors from each other.
"As a result of our work with Performics and Tuğberk, we achieved 500% growth in 5 months."

Ezgi Ergene,
Digital Marketing Specialist,
Eczacıbaşı Consumer Products
This is a comparison of the first 26 days of both April and May. It is similar to the Search Console data graph: there is a 10% drop, which is now recovering. You may see that the sudden drop begins on May 4-6, the dates of the May Core Algorithm Update.
I have only covered some of the major and minor factors behind our growth and decline; I haven't included the micro elements and indirect ranking factors here.

So, let me answer the magic question. After the Core Algorithm Update, did we lose traffic?

Yes, we lost around 10% of our monthly organic traffic after 180% organic traffic growth, and our growth has stopped. The interesting thing is that most of our competitors also lost the Core Algorithm Update.

In this situation, Google changes its preferences in terms of site profiles. It starts to prefer another kind of website, with a different identity and from a different sector, over others. Sometimes it changes its opinion about page layout, content structure, or navigation structure preferences.
This is the comparison of Sessions and New Users data between April and May 2020. We see that the real loss is in New Users; our returning visitor numbers are basically the same. This means that we have a good user retention rate (checked with the Cohort Analysis and User Explorer sections in Google Analytics).
So, we haven't talked about most of these preference-change systems and how to find a pattern between losers and winners with the help of Google patents, updates, declarations, and live examples. But I believe we have shown an example of analytical thinking for the SEO of 2020 and beyond.

Finally, if you are an SEO who knows the importance of thinking analytically, you should ask yourself two questions about Core Updates every day, as I mentioned in my previous SEO case studies.
  • When will the next Google Core Algorithm Update be?
  • What will the next Google Core Algorithm Update be about?
By answering these two questions through everyday developments and Google's official and informal publications, descriptions, and behavior, you can manage your project while competing against time.

Stay safe.
