Cases · 36 min read · September 20, 2021

Importance Of User Experience, Authoritative Content And Page Speed Improvements For Rapid Organic Traffic Growth: Unibaby Project

Евгения Сидорчук
Holistic SEO Expert at Performics
In this case study, we will see how, despite 200% organic traffic growth in two months, a web entity can still fail to win a Google Core Algorithm Update. The causes of the sudden increase in organic traffic and the effects of the Core Algorithm Update are presented methodically, showing both successes and failures.
In the previous article, we discussed the basic technical and analytical SEO items together with Search Engine principles and theories.

In this article, page speed, page layout, content types, structured data, user experience, internal linking, PageRank distribution, social media activity, and user retention will be examined with the same approach.
Exactly eight years ago, a section of the "5 Common SEO Mistakes" video was published on the official Google Webmasters YouTube channel. It says, "Think of SEO completely as a User Experience." UX has always been a fundamental value for SEO.
In particular, I believe this part of the SEO case study has gained even more importance with the Chrome Developers' comments on Web Vitals and Google's official statements on the relationship between user experience and the ranking algorithm.
Page speed and user experience recommendations and improvements
As I said at the beginning, we are also talking about our failures, so I can't say that everything went well; but here you will find what we changed in terms of page speed.
Service Worker and repeated visits
A Service Worker is a JavaScript file that runs in parallel with the browser's main thread and serves a page's components from local storage. Thanks to Service Workers, a web page can work offline, send push notifications, or cache its own resources.
You can examine the Service Workers installed in your Chrome browser from DevTools > Application. You can also unregister or test them there.
On Unibaby.com.tr, there wasn't any caching strategy; at first, I recommended using a traditional browser cache along with a server-side cache. But the development team came up with a better option than my request: they had a ready-to-go Service Worker implementation. When we implemented the Service Worker on the website, it improved our user experience, server performance, and crawl efficiency.
You may see that most of the CDN content is coming from Service Worker.
Googlebot normally visits a website without any browser history, cache, or cookies in its storage. But when Googlebot crawls a group of web pages from the same site, it builds up a cache, history, and cookies. So, we have also improved our cache strategy in terms of search engine friendliness.
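For reference, here is a minimal sketch of how Service Worker registration can look on a page. The /sw.js path is a hypothetical placeholder rather than Unibaby's actual file; the caching strategy itself (typically built on the Cache Storage API) lives inside that worker script.

    <script>
      // Register the Service Worker only in browsers that support it,
      // and only after the page has finished loading its critical resources.
      if ('serviceWorker' in navigator) {
        window.addEventListener('load', function () {
          navigator.serviceWorker.register('/sw.js');
        });
      }
    </script>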
Bulk image optimization and image meta tags
On the Unibaby Web Entity, there were many images without any optimization in terms of resolution, size, and meta tags. The total size of all images on the Web Entity was initially 15 MB; we compressed them down to 3.5 MB. We wrote new alt tags, URLs, and captions for them according to each web page's purpose and the possible intents of the user segments.
I believe that every SEO has a problem with third-party trackers if they don't have dynamic rendering
Also, we planned to use different image formats for different browsers. Because of Safari's WebP incompatibility, dynamic serving was put on the agenda. In this section, I used four different image compression techniques. First, I removed unnecessary pixels from the images and reduced unnecessarily high resolutions. I used Google's squoosh.app and tiny.png together for a better compression level, since every image compressor has different methods and advantages. While creating compressed and optimized images, we paid attention not to impair image quality.
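A minimal sketch of serving different image formats to different browsers with the picture element; the file names, dimensions, and alt text below are hypothetical examples, not Unibaby's actual assets.

    <picture>
      <!-- Browsers that support WebP pick this lighter source -->
      <source srcset="/images/anti-colic-bottle.webp" type="image/webp">
      <!-- Browsers without WebP support (e.g., older Safari) fall back to JPEG -->
      <img src="/images/anti-colic-bottle.jpg"
           alt="Anti-colic baby bottle, 150 ml"
           width="800" height="600" loading="lazy">
    </picture>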
If you want to quickly check whether the developer team has actually shipped any of these changes, I strongly recommend Serpstat's Website SEO Checker Chrome extension.
In this section, as before, we only half-completed the tasks. We compressed the images, but the brand couldn't review them properly. Because of this and an intense marketing program, we couldn't fix more than 10% of the image alt tags and URLs.
Using preconnect and preload for resources
"link rel="preconnect" is a resource prioritization command for the browser. Some of the resource prioritization code lines are not seen as commands mostly. Chrome and Firefox have a clever strategy to protect the user experience and bandwidth efficiency.
During the initial loading, our webpage couldn't give a response to the users.
This means that on some devices and under some network conditions, and depending on your local DNS speed, resource prioritization hints can be ignored. This is usually related to the hierarchy of the resource prioritization hints.

Below you may find the definitions of and differences between the resource prioritization hints:
DNS prefetch: performs only the DNS lookup.
Preconnect: performs the DNS lookup, TCP handshake, and TLS negotiation.
Preload: performs the DNS lookup, TCP handshake, and TLS negotiation, and downloads the resource with high priority for the current page.
Prefetch: fetches, at low priority, a resource that will likely be needed on the next page.
Prerender: loads and renders another web page with all of its resources in the background.
Browsers can ignore the last two on the list if you implement them incorrectly.
I explained these differences to the developer team with solid, real examples from Chrome DevTools on a local server. After a brief presentation, we decided to use preconnect for third-party trackers and preload for the critical CSS and JS resources.
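A minimal sketch of what that decision can look like in the document head; the tracker origin and the stylesheet path are hypothetical examples.

    <head>
      <!-- Open the connection (DNS lookup, TCP handshake, TLS negotiation) to a third-party tracker early -->
      <link rel="preconnect" href="https://www.googletagmanager.com">
      <!-- Fetch the critical stylesheet with high priority, then use it -->
      <link rel="preload" href="/assets/critical.css" as="style">
      <link rel="stylesheet" href="/assets/critical.css">
    </head>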

We implemented both in a short period. But because of new deployments by the developer team and newly added third-party trackers, these improvements stayed active for only one month.
Minifying and compressing JS, CSS and HTML documents
On Unibaby.com.tr, the HTML documents and some CSS and JS resources were not being compressed or minified.

This can be done in many ways: you may delete spaces, comments, unused variables and JavaScript functions, unused CSS variables and properties, and unnecessary DOM elements.

This list could be longer; for instance, you may also focus on cleaning unused code, or you can decrease the request count with bundles. On this project, we mainly concentrated on compressing rather than minifying because of the time limit.
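As a minimal illustration (with hypothetical markup) of the bytes that minification strips from an HTML document without changing what the user sees:

    <!-- Before minification: comments, indentation, and line breaks add weight -->
    <div class="product-card">
        <!-- TODO: ask the design team about the badge -->
        <h3>Anti-colic Baby Bottle</h3>
    </div>

    <!-- After minification: the same markup, fewer bytes on the wire -->
    <div class="product-card"><h3>Anti-colic Baby Bottle</h3></div>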

We compressed all of the HTML documents and JS and CSS resources in this part of the project. But still, during new deployments and publishing processes, some of the old documents or uncompressed resources went live again. This should remind us of the opening sentences of this case study.
As you may see, our TTFB is pretty bad, especially because of the web server.
While minifying the resources, we also deleted the comment lines, but CSS refactoring, JS code optimization, JavaScript tree shaking, and bundling couldn't be done in two months.

A healthy, robust relationship between SEOs and developers and the education of the client are two crucial points for a successful project.
Meta Tag optimization/organization
Meta tags are among the very first ranking and relevance signals for a Search Engine. A Search Engine doesn't always render JavaScript, but it regularly revisits a web page to see whether the page is still live and whether its content or function has changed.

So if you don't create a proper meta tag strategy and a CTR test platform, Google will probably rewrite your meta tags, and you may not reach your target audience.
This is an example breakdown from a small portion of our web page profile.
I performed only four processes for meta tag optimization and organization:
Differentiating all of the duplicate and repetitive meta tags according to the user and the landing page's function.
Creating a CTR test platform for different meta tag strategies.
Fixing the missing meta tags.
Moving the canonical tag, meta title, and meta description to the top of the source code.
I should give a little more information about the fourth action. Googlebot usually crawls a Web Entity according to the Entity's importance for users and its content refresh frequency. And if your web page's content and functionality don't change significantly, Googlebot won't need to render the JavaScript on your web page unless it uses client-side rendering.
During this SEO case study, Serpstat's Chrome extension made it easier to check every aspect of Web Entities.
Most of Googlebot's repeated visits are for seeing the content changes, internal links, meta tags, and status code of the web page.

Because of this situation, Google recommends not using Javascript for Meta Tags, internal links, and main content of the web page.
Image Source: Bill Slawski's Evolution of Search Presentation.
We have paid attention to Entities and their relationships in our articles.
In short, if Googlebot visits your web page mostly to see the main components that matter for the Search Ecosystem and Search Engine users, put these components at the top of the source code so that Googlebot finds them easily. I believe this also improves your crawl efficiency.
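A minimal sketch of what "moving these components to the top of the source code" can look like; the URLs and copy below are hypothetical placeholders.

    <!DOCTYPE html>
    <html lang="tr">
    <head>
      <meta charset="utf-8">
      <!-- The signals Googlebot checks on most revisits sit at the very top of <head> -->
      <title>Anti-Colic Baby Bottles | Unibaby</title>
      <meta name="description" content="Anti-colic baby bottles for newborns: sizes, materials, and usage tips.">
      <link rel="canonical" href="https://www.unibaby.com.tr/anti-colic-baby-bottles">
      <!-- Trackers, widgets, and other non-critical resources come after these lines -->
    </head>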

I have experienced similar problems on some projects, like the one below, and fixed them with this method.
This example is from a client-side rendered web page; I have mostly encountered this error on client-side rendered web pages.
We refreshed more than 600 meta tags and fixed several examples of wrong canonical tag usage.
Heading tag optimization, organization and Google's explanation about heading tags
This section will try to explain why we optimized our heading tags despite Google's latest statements about heading tags. If you want to see what we did rather than why we did it, you may skip this part.
Although Google says that Heading Tags are not important for SEO, Heading Tags are essential for semantic HTML usage. If the vast majority of the internet uses something wrong, it is difficult for a Search Engine to use that thing as a Ranking Factor.

Take the HTML lang attribute: Googlers have said that they don't use it as a ranking or evaluation signal. Most people buy an English theme with an English HTML lang attribute but use it in another language, and this confuses Google. But if you have a semantic and stable Web Entity, the Search Engine can understand you easily and run different, cheaper algorithms for your Web Entity.

The same goes for heading tags. If you use them within semantic HTML in the right way to give hints to Googlebot, it will be easier for Googlebot to crawl, render, and evaluate your content.

In addition, Google sometimes rewrites a web page's title according to its Heading 1 tag. This means that in some cases Google thinks a web page's H1 tag represents the page's content/function better, because most publishers change their meta title for more clicks and a higher CTR without thinking about the actual content of their web page.
On the other hand, the Heading 1 sits directly on the web page. Most publishers and SEOs can't change the H1 as freely as the meta title (because it is not entirely under the SEO's control, or because most publishers simply don't treat the H1 the way they treat the meta title).

So, if Google can use the Heading 1 as the meta title in its experiments and meta tag rewrite systems without telling you, we should still care about heading tags. Furthermore, Google may try new things with heading tags in the future without telling you, just as they stopped using the link rel prev/next tags for years without telling us.

In case your CSS and JS resources can't be fetched, downloaded, or rendered, or Googlebot comes to your site without rendering your JS resources, your HTML document can tell the Search Engine more about your content.

Google may use heading tags in its search algorithms and increase their weight for search rankings without telling you. Do you remember how they had stopped using pagination tags for years without even telling us?

Google's statement can decrease publishers' attention to heading tags, and Google may want exactly this, because this way only the people who care about a proper heading hierarchy will keep using heading tags in their semantic HTML.

Imagine that Google published a statement about social media shares and ranking correlations; we would see a massive explosion in social shares just because of the Search Engine's statement. This is precisely the opposite of what Google wants: natural internet interaction without any manipulative intent.
Usually, websites make errors sitewide; if you check at least one each of the homepage, blog, product, and corporate pages, you can analyze a website quickly.
In this way, the Search Engine can understand which publishers care more about a semantic, stable, and clear web structure and about users' satisfaction.
My last thought on this topic is that Google is not the only Search Engine. Also, Google's statement doesn't say "we don't give any weight to heading tags"; it says that they can understand the web better than in past years.
This is also from the Google Developers website.
With these thoughts, we have reorganized our Heading tags:
We used at least one H1 tag that defines the web page's purpose, with a proper font size and position on the page.

We used an H2 tag for every section of the web page, defining that section's purpose, and H3 tags for the intermediate and minor parts under the H2s.

After a thorough brainstorm, we finally set up a semantic, hierarchical, clear, and easy-to-understand heading structure on our web pages.
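A minimal sketch of the kind of hierarchy we aimed for, with hypothetical headings:

    <h1>Baby Sleep Guide for New Parents</h1>      <!-- one H1 defining the page's purpose -->
      <h2>How Much Sleep Does a Newborn Need?</h2> <!-- an H2 per main section -->
        <h3>0-3 Months</h3>                        <!-- H3s for the minor parts under an H2 -->
        <h3>3-6 Months</h3>
      <h2>Creating a Safe Sleep Environment</h2>
        <h3>Room Temperature and Clothing</h3>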
According to semantic HTML rules, 'strong' and 'bold' are not the same; they have different meanings. I am happy to see that Serpstat lets you check them easily and differentiate them from each other.
Internal links and anchor texts
Internal links and anchor texts are important for context, relevance, PageRank distribution, crawl frequency, and navigation. On Unibaby.com.tr, many things had to be done in terms of internal links. Suppose your site hierarchy and navigation structure, along with your internal link structure, don't send a message that is consistent and correlates with your users' journey on your website. In that case, the Search Engine can cancel out the advantages of internal links for your website because of this wrong implementation.
A description and schema from the "Using Anchor Text with Hyperlink Structures for Web Searches" patent.
Imagine that one of your unimportant, low-traffic URLs has more internal links than your homepage or your most important landing page. Imagine that you have been funneling most of your link equity and PageRank value to a web page that hasn't been visited for six months. If you were a Search Engine, would you let publishers harm their own rankings with this wrong implementation, or would you just discount that website's internal link structure because of the wrong signals?
Google works on the web as it finds it. It does not expect the web to be what it wants.

However, the same mindset as with heading tags applies here. If you use internal links correctly, the Search Engine will give you an advantage. If you use them incorrectly, you will not be able to take advantage of them. If you use them manipulatively, you will be damaged.
By checking a few web pages' external and internal link profiles along with their anchor texts, you can quickly understand the general character and mentality of a Web Entity.
Anchor texts on Unibaby.com.tr were (and still are) also problematic.
Some web pages didn't have any anchor text or internal contextual links. eCommerce pages and blog pages were not linking to each other enough for more revenue and easier, smarter navigation. Some pages had extremely high numbers of internal and external links, along with spammy anchor texts.

Related topics or products and their blog equivalents were not linking to each other. The most used anchor texts were brand keywords and ordinary CTA words instead of product and generic queries. Internal links carry different PageRank and importance values according to their position, size, color, click probability, and dwell time. Unibaby.com.tr hadn't paid attention to its internal links and their interaction with users.
Image Source: Matt Cutts
Our moves for a better PageRank Distribution and Internal Link Structure:
We cleaned up the anchor texts on the web pages and used more relevant, contextual anchor texts between more closely related web pages. We changed our internal link profile in favor of our important web pages to show consistency between the user/click flow and the internal link structure, so that Search Engine crawlers can understand which parts of the site are important. We also changed the position of the internal links within the content.
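A minimal sketch of the kind of change this meant in the markup; the URLs and anchor texts are hypothetical examples.

    <!-- Before: a generic CTA anchor buried in the footer -->
    <a href="/blog/baby-sleep-routine">Click here</a>

    <!-- After: descriptive, contextual anchor text inside the relevant paragraph,
         linking the blog post to the related product category -->
    <p>... a consistent bedtime routine also helps; our
    <a href="/anti-colic-baby-bottles">anti-colic baby bottles</a>
    make night feedings calmer.</p>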
Nofollow and link sculpting for better PageRank distribution
Another dangerous area for an SEO to talk about is PageRank distribution. Google's only clear explanation of link sculpting comes from Matt Cutts, and it is from ten years ago.
The purpose of link sculpting is to direct most of the PageRank value of all web pages to the most important web pages via internal and external links. This way, we can show Googlebot which pages are the most important and of the highest quality.

Most SEOs make three mistakes with link sculpting.
First, they don't care about PageRank calculation and values, even though Google keeps developing different PageRank models to understand web page quality better, in a shorter time, with fewer iterations and lower cost.
Second, they use the "nofollow" attribute on internal links. Nofollow is meant for external links, and it doesn't redirect PageRank flow among internal links the way they expect. Using nofollow on internal links, besides being a wrong usage, can also be a signal of distrust toward your own website's URLs.
Third, they overlook that the only way to cut PageRank flow among web pages is to use robots.txt and its Disallow directives.
I won't explain these rules and my mindset for link sculpting in full here; you may find them in Matt Cutts's original article or in one of my other SEO case studies.
An example of High External Linking Page Count from a small domain crawl.
There are a few important things to know about nofollow as a hint. The nofollow tag is just a hint: Google may still decide that the linking URL should pass some of its PageRank value to the target link. Google hasn't started applying the nofollow tag's new rules and nature across the whole web yet; we know this because we haven't seen any kind of Google Dance related to this update, and also because John Mueller has stated that Google is performing small experiments on a small portion of the web to understand the nature of links better.
Google didn't say, "We will always use nofollow as a hint all over the web"; Google stated, "We might use nofollow as a hint in some situations."

Also, if you have any doubt about nofollow being treated as a hint, you should use the "sponsored" attribute instead of the nofollow or UGC attributes. You may see the reason for this suggestion below:
So, in short:
Link sculpting and PageRank calculation are still important factors for SEO, despite SEOs' negative attitude toward them.
At most, nofollow can be a hint, but Google still doesn't actively apply its newest rules across the whole web.
If you have any suspicion about an external link, you can simply use the "sponsored" attribute for it.
For internal PageRank, using nofollow is a mistake.
Disallowing unimportant internal URLs and categories via robots.txt is the optimal option for link sculpting, along with a proper site hierarchy and internal link structure.
A related recent dialog from Twitter.
In light of these facts and comments, we deleted most of the external links and nofollowed the rest. We also changed the external links' positions and target URLs. Unibaby.com.tr had been linking to its competitors for its most important queries and topics. A brand should be an original resource for its content, function, and service; if you present your competitor as a trusted source to your customers and to Search Engine crawlers, this might harm your rankings.
We used global health organizations and research institutions as information sources instead of ordinary eCommerce competitors.
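A minimal sketch of how the remaining external links can be qualified; the URLs are placeholders.

    <!-- Trusted research or health institution: a normal, followed link -->
    <a href="https://www.who.int/">World Health Organization</a>

    <!-- External link we don't want to vouch for -->
    <a href="https://example.com/article" rel="nofollow">related article</a>

    <!-- Paid or affiliate placement -->
    <a href="https://example.com/offer" rel="sponsored">partner offer</a>

    <!-- User-generated link, e.g., in blog comments -->
    <a href="https://example.com/user-blog" rel="ugc">a reader's blog</a>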
Structured data and content marketing strategy
When we started the project, there was a lot of thin content and many pages with a bad layout. Google performed its first Page Layout update in 2012, which primarily targeted excessive ad usage. But layout-related updates have continued to improve user experience, especially after Google's Mobile-First Indexing and Page Speed updates.

Google can perceive a piece of content's sentiment, the expertise level of its language, its readability and reliability, and the Entity's authority on a topic. Unibaby.com.tr's informative and eCommerce content was neither comprehensive nor a complete guideline; even eCommerce aggregators gave more information about its products.
A recent warning related to Core Algorithm Updates and content structure.
A lot could be said about the content strategy, but you can see a summary for this section below:

  • Performed a Content GAP Analysis with our competitors.
  • Extracted all of the user queries which have more than 1000 Impressions but zero clicks.
  • Extracted top landing pages of top competitors and their queries and compared them to ours.
  • Researched expectant mothers' and fathers' concerns and information needs before and after their baby is born, along with new parents' experiences.
  • Examined the layout of competitors along with their content structure.
  • Created extremely detailed content briefs for writers for every possible informative, navigational, and commercial query.
  • Deleted some of the underperforming content.
  • Created a Data Studio dashboard to detect keyword cannibalization problems.
  • Merged thin and cannibalized content.
This is the comparison of New Users and Bounce Rate: we increased the daily New Users count by 177% and decreased the Bounce Rate by 7.41%.
  • Expanded the product showcase cards with more information and reorganized their meta tags around popular and relevant user queries.
  • Used Google's standard Natural Language processing tool to see whether Google could understand our new content better.
  • Used only present-tense sentences with scientific facts in the new content.
  • Extracted entities, TF-IDF words, and phrases from the content and compared them to competitors' to see whose content was more relevant, concise, and useful for users.
  • Used FAQ structured data to give users what they need without even clicking, while creating brand awareness and gaining more SERP real estate (a minimal markup sketch follows this list).
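A minimal sketch of FAQ structured data as it can be embedded in a page; the question and answer below are hypothetical examples, not Unibaby's actual copy.

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "FAQPage",
      "mainEntity": [{
        "@type": "Question",
        "name": "How often should a newborn be fed?",
        "acceptedAnswer": {
          "@type": "Answer",
          "text": "Most newborns feed every two to three hours; follow your pediatrician's advice for your own baby."
        }
      }]
    }
    </script>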

Perfect content should be as short as possible but as long as needed.
Most of these steps are guided by Amit Singhal's 23 questions related to Google Core Algorithm Updates, which I recommend you read and think about on a regular basis.
Social media activity, user retention, and Google Search Relevance
I know that I have entered enough controversial areas for just one article, but I also see a correlation between successful social media activity/campaigns and Google Search trends. Of course, Google will always deny this, because Google is trying to create its own social media and to turn the Google SERP into a social media platform to become a more important part of our lives.
Matt Cutts, video
In this video, Matt Cutts says that they use Facebook and Twitter web pages as ranking signals just like other web pages. But he has also said, across the video series, that they are trying to understand brand and person identities better thanks to social media accounts and social media links.
Because of this, Google can't tell us that they use their competitors' data and domains to create better and more useful search results; on the other hand, Matt Cutts admitted that they have integrated Twitter links and data into the algorithm.
Nick Peterman, Dan Holt, Inbound Marketing, Page 28.
Also, as an entity-based Search Engine, Google has integrated brands' social media accounts into the SERP and into brand Knowledge Panels.
A Google Patent Image to calculate Influencer Value
There are many other aspects to this topic, such as the demotion of Facebook, Dailymotion, and other social media platforms' pages in the search results. There are also plenty of other patents, explanations, correlations, and causations for the relevance of social media activity to search rankings, but I believe these arguments are enough for this article.
Also, we saw that the Brand's social media is active, but its followers are not as active as its competitors' followers.
Also, we have lots of information from the Matt Cutts era, because in the old days Google shared more information about its algorithms. After a while, they started to hide more information and to direct attention to different areas with keywords such as "User Experience," "Page Speed," and "Great Content." Despite these mottos, I can say that we have to think even more analytically than in the old days precisely because of this effort to hide algorithms, updates, and tips.

Furthermore, below you will find proof that Google used to be more open about its algorithm in the old days.
In the old days, as you can see, Google published every tiny detail of its search algorithm changes publicly. These days, I think they are not as open as they used to be, but I can't blame them after the black-hat years. The biggest danger for a Search Engine is having its own system gamed against it. If they hide these things better, they can protect their algorithms from manipulation and bias better. That's why we need to think more analytically and more independently than in the old days.

P.S.: In 2014, they stopped publishing this information; they said the world had gotten bored of it rather than "we want to hide our algorithm."

Related to social media, you will find a changelog entry below from Google's own official blog. Like Matt Cutts's explanation about Twitter links and author reputation, it says that they are getting better at indexing profiles and creating relationships between them.
Source: Google Webmaster Blog
This conversation could also cover the unsuccessful Google+ and link rel authorship, along with E-A-T and entity-based Search Engine features, but we are here for another issue.

  • I used RivalIQ, Twitter Analytics, YouTube Analytics, and Google Analytics to examine social media traffic to the website and its behavior.
  • I researched the hashtags, engagement rates, and user engagement time.
  • I researched YouTube hashtags, the most common words in comments, video titles, descriptions, and their links, along with info cards.
  • We created links connecting the social media profiles to the domain and prepared Organization structured data that includes them (see the sketch after this list).
  • We researched the social media content that takes organic traffic via Google and its features.
  • We started using all of the social media accounts in parallel, more actively and more semantically, to create better topical relevance and authority for certain types of main queries.
  • We used the brand name, category name, and sub-topic together in social media posts and hashtags, as in the Web Entity's URLs, and linked them to each other.
  • We increased our related index count in Google with the social media accounts.
  • We also recommended opening Medium, Reddit, and Trustpilot accounts.
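A minimal sketch of Organization structured data that connects the domain to its social media profiles through the sameAs property; the profile URLs below are hypothetical placeholders, not the brand's actual handles.

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Organization",
      "name": "Unibaby",
      "url": "https://www.unibaby.com.tr/",
      "sameAs": [
        "https://www.facebook.com/unibaby",
        "https://www.instagram.com/unibaby",
        "https://www.youtube.com/unibaby"
      ]
    }
    </script>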
Our Social Media Cross-channel Report Screenshot
Most of these list items are checked off and done, and some of them are being carried out right now.

This list could be longer for this SEO case study, but we can't cover everything, together with the SEO theories and practices behind it, in just one article. Also, for the sake of readers and editorial standards, I can't go into more topics here.
Conclusions and takeaways
Throughout this second article, we covered page speed in similar terms for both users and Search Engine crawlers. We applied Google's mindset and principles to many issues, such as PageRank distribution, link sculpting, site structure, semantic HTML, social media's importance on the SERP, structured data, and authoritative content marketing, together with what was done within the scope of the project.

We've worked on page layouts, the speed of delivering information to the user, how semantic HTML and structured data can be used together, how site hierarchy can send signals to the Search Engine, and much more.

So, I need to ask the last question and try to answer it properly.

Can a Web Entity grow more than 190% in two months and lose a Google Core Algorithm Update at the same time?

We will see the answer, along with its reasons, in the last article of this SEO case study.
