
How to stop paying for trash in the backlink index?

Research on backlink index data from Serpstat, Ahrefs, MOZ, Megaindex, Semrush, and Majestic.
Zoriana Matiushenko
Customer Support & Education Specialist at Serpstat
Can you be sure that your SEO tool depicts the backlink diversity with 100% objectivity? While developing our own backlink index, we analyzed users' reviews to determine a more effective way to use an SEO platform to get the most accurate data.

And the results truly amazed us! (We honestly tried to avoid clickbait here, but there's no way around it). Long story short, the backlink index volume is not the crucial point.

The bigger the database — the lower the quality. Large databases often suffer from outdated data and duplicates. But wait, there's more! We've gathered a set of backlink insights you just can't afford to miss!


Building your own backlink index is a resource-intensive process, which is why most SEO services buy backlink data from vendors that run their own crawlers. Such data is difficult to replenish and update because all changes happen on the supplier's side. To influence the quality of the link data directly, Serpstat has developed its own backlink index.
The Serpstat Backlink index now includes 1 trillion links from 386 million domains. Crawlers add 2 billion unique links from 2.5 million domains daily.
As a rule, SEO platforms use terms like "the largest index," "the most comprehensive database," or "the best backlink index" to describe their tools' abilities. But does quantity correspond to quality? We decided to go beyond the quantitative parameters of the Semrush, Ahrefs, MOZ, Majestic, and Megaindex services to figure out the value they deliver in the analysis of a site's backlink profile.

How are link indexes different on each platform?


When comparing backlink indexes, it is essential to keep in mind the different approaches to building link databases. For example, a database can be formed from several "subindexes" containing data of varying relevance. Let's take a closer look at the differences in the architecture of the backlink databases of the abovementioned platforms.

Ahrefs' backlink index is built with the following subindexes:

  • Live — links that were found during the last crawl session and remain active until the subsequent crawling;

  • Recent — the subindex contains data on active links, as well as lost links that were listed as "live" during the last 3-4 months. This list may include links that were marked lost due to technical reasons.

  • Historical — all the links of the domain that have been assigned the "live" status, starting from 2016 (since the launch of the Ahrefs index). At the moment, these links might have lost relevance.
Semrush uses a "fresh index." It consists of links recorded during the last six months of crawling.

MOZ displays "fresh" data from the last crawl. Although the data is not updated for all links regularly — higher-quality pages are the first priority. The quality is determined by a machine-learning algorithm.

Majestic's Index consists of the following parts:

  • "Fresh index" is all links considered active during the last crawl. The update rate is from 90 to 120 days.

  • "Historical index" - in addition to fresh-index data, contains all links that have ever been detected within the analyzed domain.
Megaindex stores historical data on links since the launch of its own index in 2015.

The Serpstat backlink index only holds fresh links: those that were active within the last 70 days. This architecture allows updating the entire database within that period. However, if an active link is not found during the next check, information about it is stored for another six months in the "Lost Links" section. Therefore, users only get an up-to-date breakdown (whether it's a competitor's backlink profile or their own website).
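To make that lifecycle concrete, here is a minimal Python sketch of the fresh/lost classification described above. The function name and the exact cut-off constants are illustrative, not Serpstat's actual implementation.

```python
from datetime import datetime, timedelta

FRESH_WINDOW = timedelta(days=70)     # links re-confirmed within this window stay "fresh"
LOST_RETENTION = timedelta(days=180)  # lost links are kept for roughly six more months

def classify_link(last_seen: datetime, now: datetime) -> str:
    """Classify a backlink by how long ago the crawler last confirmed it."""
    age = now - last_seen
    if age <= FRESH_WINDOW:
        return "fresh"
    if age <= FRESH_WINDOW + LOST_RETENTION:
        return "lost"     # shown in the "Lost Links" section
    return "dropped"      # removed from the index entirely

now = datetime(2022, 6, 1)
print(classify_link(datetime(2022, 5, 1), now))   # fresh
print(classify_link(datetime(2022, 1, 1), now))   # lost
print(classify_link(datetime(2021, 1, 1), now))   # dropped
```

The point of the single 70-day window is that the whole index can be re-verified in one cycle, so a report never mixes data of very different ages.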


Backlink index architecture

Backlink index comparison: methodology
Backlink indexes can be compared using different parameters, from database size and data completeness to server and crawler capacity. In this study, we have included several parameters for quantitative and qualitative inspection:

  • Backlink index volume;

  • Data refresh rate;

  • Link accuracy;

  • Match rate (backlink uniqueness).

#expert_perspective


Which parameters do you pay attention to when analyzing the backlink index?

Working with a backlink profile, I pay attention to different aspects of websites. Suppose we exclude such vital points as site ranking, content quality, the way it was formed, the placement of materials with external links, pricing, and other indicators, and focus solely on link metrics. In that case, we initially pay attention to the quality metrics of the overall backlink database. Depending on the service you use, they will include:

  • Domain Rank (DR);
  • AS (Authority Score);
  • Citation Flow (CF) and Trust Flow (TF).

Secondly, it is worth focusing on quantitative indicators:

  • Total number of unique domains;
  • Total number of external links;
  • Number of IP addresses and subnets;
  • Number of unique outbound links from the domain;
  • SEO / ORG link distribution;
  • Backlink dynamics.
Ramazan Mindubaev, Head of SEO at TRINET.Group
The most important parameters for backlink indexes are freshness and relevance. When analyzing backlinks, it's important to see links that are live right now and may therefore be taken into consideration by search engines. I can't rely on links that were live sometime back in 2007.

Secondly, I need to see the links that are relevant to the page. Seeing thousands of auto-generated "top 500 web pages"-links from all those well-known directory pages does not help if I miss recently published links in topic-related forums or user reviews.
    Sebastian Adler, SEO-Manager, Seoseb
    The main parameters:
    1. Link placement. You have to make sure it's not located in a directory with purchased links only (better to check manually);
    2. DR (using Ahrefs API);
    3. The quantity of referring domains and pages (using Serpstat API);
    4. The total amount of outbound domains and pages (using Serpstat API);
    5. The number of keywords for which the site is ranked. (using Serpstat API);
    6. Dynamics of the number of keywords over the last six months (using Serpstat API);
    7. Topic relevance (better to check manually).
    Vladislav Naumov, Head of SEO at Inweb
    When analyzing a backlink profile of a website, I check the Domain Rating or equivalent of the referring domains, whether it's a DoFollow/NoFollow/UGC link, what Anchor Text has been used for the link, and when the link was last seen.

    Knowing how many links a referring domain provides to my domain is also useful, so I can see at a glance whether the link is being spammed or perhaps placed sitewide. I think it's also beneficial to show the overall number of referring domains and backlinks my chosen domain has as a quick overview that can be monitored easily for any sudden spikes or drops.

    Though not essential, the average monthly traffic of a referring domain is a useful indicator to see how popular a website is without having to research it further using other toolsets, as would the statistics about the referring domain's own backlink profile.
      Steph Andrusjak, SEO Specialist, www.seosteph.co.uk
      When running link building campaigns, I pay attention to a varied set of criteria, both qualitative and quantitative. My focus on the different parameters also depends on whether I'm running a campaign "at scale" or taking a more targeted approach.

      When working on an "at scale" campaign, i.e. looking at domains "in bulk", I'm looking first for DR, to some extent the UR, DomPop, TF and CF.
      When I go for the more targeted approach, I'm adding to this more granular criteria (rather qualitative), such as the trend of the visibility index of domains, language, follow/nofollow, or directly looking at the website: freshness of content, where the backlink could go, and other factors.

      If I'm looking at a backlink index for monitoring and reporting, I tend to keep things simple: DR, DomPop, TF and CF. The more data points we bring, the more complex it gets, and the more difficult it is to report to non-SEOs.
        Baptiste Hausmann, SEO Manager at Mollie.com, seosmann.com

        I usually focus on the following parameters:

        • The traffic on the source page;
        • How many keywords from regional TOP-100 are listed on a page;
        • The type of link (not just Dofollow / Nofollow - Sponsored, UGC, link in comments, footer (site-wide link)). If you think about it, you can come up with many more types — link-picture, anchor, non-anchor, etc;
        • Page type (based on micro-markup or domain). For instance: Governmental, Educational, Catalog, Forum, Review, Blog;
        • Number of other outbound links on this page;
        • Date the page was published and the date of the first discovery;
        • Domain's topic;
        • Links' title;
        • Domain's title (useful for filtering out unnecessary domains);
        • Length of content on a given page;
        • Is this domain in the database of link exchange services? It would be handy to catch those automatically;

        We find the data manually if we cannot catch it using a tool. Which, of course, is annoying — after all, we have to gather data from different spreadsheets and analyze it in a pivot.
        Nikolay Shmichkov, SEO specialist at SEOquick
        When analyzing a link profile, we pay attention to the following parameters:

        - DR and UR;
        - number of referring domains, backlinks, acceptors;
        - follow / nofollow proportion;
        - percentage of anchor / non-anchor links, as well as brand anchors;
        - presence of .gov and .edu domains;
        - the volume of traffic and keywords, etc.
        Igor Shulezhko, rankUP
        In off-page SEO, backlinks are the most crucial SERP factor. Before vetting a link for our client, we check whether it was penalized by Google in the past and whether its traffic comes from the countries we're targeting.

        We also check a site's top-performing keywords to confirm it ranks for pay-worthy keywords relevant to our client's niche. The DR metric must be stable, and the site should have fewer outbound links than inbound links.
        Raveel Nadeem, Co-Founder at Backlinks Guru
        The most important conditions for a backlink to be effective are a properly selected donor and a proper text environment for the link. When choosing the donor, we usually pay attention to the Domain Rating, the frequency of publication of new materials, the presence of an SSL certificate, topical relevancy, regionality, the domain history, and the traffic and DR dynamics. The text of the article itself is of great importance: it must be unique, with a proper keyword density. Choosing the type of link (text or image and, respectively, anchor or alt text) is also very important.
        Nikolay Galinov, Head of SEO at Netpeak Bulgaria, netpeak.net

        Large database: advantage or disadvantage?


        To evaluate a link analytics service, it is customary to operate with the backlink index volume — the number of referring domains or pages to a site in the database of a specific platform. Such an indicator may seem convenient and understandable, especially if you must choose one tool from several alternatives. Here are the platforms' statistics from official sources.
        Volume of backlink index
        *Majestic does not use the "referring domains quantity" metric to estimate the index volume, so we do not analyze the statistics for this tool to compare different services correctly.
          However, an extensive database does not always indicate good data quality. Often, the index contains duplicates and outdated backlinks. You might notice spammy links in reports, which most likely appeared in the index once and simply persist. Due to the index's architecture, there may be overlaps between data collected at different times (historical, recent, fresh, and other subindexes).
          Why is the link data so different?

          There are a few reasons:
          1. Frequency and rate of index updates: the time it takes for the entire index to be updated, i.e., for the crawler to go through and refresh all links in the database. In Serpstat's backlink analysis reports, the last indexation date is displayed on the right side of the screen.

          External links indexation date
          2. The platform's speed and ability to find new data on a website, as well as to delete irrelevant backlinks.
          3. Page-crawling specificities:

          • whether robots.txt, meta robots and x-robots-tags, and noindex/nofollow directives are taken into account;
          • the availability of the website during several simultaneous crawling sessions;
          • individual blocking of platforms through the above methods or by HTTP headers, by the type of crawler request to the page (GET, POST, HEAD), by identifier (User-Agent), or at the IP level.
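The robots.txt check in the first point can be illustrated with Python's standard urllib.robotparser module; the user-agent names and rules below are made up for the example. A crawler that respects these rules will simply never see some links that another crawler reports.

```python
from urllib.robotparser import RobotFileParser

# A sample robots.txt that blocks one crawler entirely and
# disallows one directory for everyone else.
robots_txt = """
User-agent: BadBot
Disallow: /

User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("serpstatbot", "https://example.com/article"))    # True
print(rp.can_fetch("serpstatbot", "https://example.com/private/x"))  # False
print(rp.can_fetch("BadBot", "https://example.com/article"))         # False
```

Because each platform's crawler is blocked (or not) individually, two tools crawling the same site can legitimately end up with different link sets.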
          What to do if the SEO tool does not detect your link?
          Take advantage of the Recrawling URL Tool. Submit a list of pages with new backlinks, and Serpstatbot will collect data within 24 hours.
          Considering the differences in architecture and crawler algorithms across the different platforms (crawling depth, accounting for the uniqueness of hosts, IPs, and subnets), it is incorrect to evaluate the indexes solely by the number of backlinks or referring domains.
          An extensive database is purely a marketing goal. If the quantity of backlinks is higher than the competitor's, it does not mean that the data is better. Serpstat can make the index database larger in a month than all competitors combined, but our users will not benefit from this.
          Mikhail Broun, Serpstat
          It's no secret that low-quality, outdated content might be hidden in a huge database. Therefore, data correctness and relevance are essential, rather than just volume.

          #expert_perspective


          Does the size of the backlink index alone play a role? Are the relevance and uniqueness of the data still significant?

          The more data, the easier it is to analyze. That is why experts often turn to several services to get results, and then make a comparison. With a good filtering system, a large database will give you an advantage, but with a bad one, it will only add headaches and confuse you.

          It would be very helpful if the services were integrated with Google Search Console so that you could view which links are visible in the console right in the service.
          Oleksandr K., SEO Specialist at SEOTM Digital Agency
          The size of the backlink index and the relevance are important, but for me, the presence of a donor in the index is more important than just knowing that some garbage resource refers to my site. Ahrefs shows more links than Serpstat, but this difference is primarily due to junk donors that Google doesn't even index. We use the Serpstat API to get donors' data.

          It would be a perfect feature to filter only referring pages in the Google / Yandex index.
          Vladislav Naumov, Head of SEO, Inweb
          Naturally, the size of the database is an essential metric in competitive analysis, as it gives a broad understanding of how competitors work, what types of links they are building, what methods they use. Likewise, the relevance and uniqueness of the database are important parameters. We believe that what matters is not how large, relevant, or unique the database is but how accurately it works and finds the same links taken into account by search engines.
          Ramazan Mindubaev, Head of SEO at TRINET.Group

          No database gives a 100% complete picture, so both the relevance of the data and the size are significant. Some platforms can boast about large numbers of links and domains, but those domains might be duplicates.

          Inside a backlink index tool, you need features for finding these duplicate links, filtering junk links out of site analysis reports, and so on (in Serpstat, this is implemented automatically). Otherwise, you have to analyze massive data tables to find valuable donors. Often, you end up with 100,000 rows, of which fewer than three hundred are actually interesting links.
          Nikolay Shmichkov, SEO specialist, SEOquick
          The quality of the data is the most important. For me, looking at the size of the backlink index is secondary. The most critical aspect of any tool, for that matter, is that the data can be trusted and is sufficient to define strategies and tactics to help my clients grow their business and organic reach. But of course, if an index is limited, then the quality of that data will be impacted.
          Baptiste Hausmann, SEO Manager at Mollie.com, seosmann.com
          For many scenarios, the size is important, as I want to get an overview of all the links pointing to a particular page. An indication if the link is live (and since when) is super helpful, but I tend to check the availability of the backlinks on my own in most cases.

          What can be super frustrating are duplicated links, as they create "noise". I like tools that try to list the "original" link and flag copies automatically.
          Stephan Czysch, SEO Specialist, stephan-czysch.de

          From my standpoint, the size of the backlink index is essential, but even more important is its relevance. For example, all displayed links in the database must be working. Therefore, it is critical to clean the database of broken links regularly.
          Mikhail Shakin, Shakin.ru
          Personal demonstration
          Leave a request, and we will conduct a personal demonstration of the service for you, provide you with a trial period, and offer comfortable conditions for starting to explore the tool

          Data refresh rate


          To determine how often data is being refreshed, we have to consider the whole data refresh cycle. It officially takes 60 to 180 days for the analyzed platforms to update the link information.
          Data refresh rate
          But is this a realistic timeline? After analyzing the data provided by the SEO platforms, we detected outdated entries in the reports that couldn't be there if the declared update frequency were accurate.

          #expert_perspective


          What is the optimal refresh rate for the index? Is it important to get new links right away?

          I think this depends on the use case. For SEO analysis or audits, a monthly updated but accurate index does its job. When it comes to penalty reconsiderations or other urgent use cases, it may be helpful to have a very recent snapshot.
          Sebastian Adler, SEO-Manager, Seoseb, seoseb.de
          Hypothetically, yes, it is essential to update the data in the index. Nevertheless, in reality, it often turns out that websites come with an undeveloped backlink profile, and it is necessary to build a large number of links to the web resource. Thus, when analyzing competitors, we have a lot of work ahead to grow the number of backlinks, so a lag of 1-2 months in the data is acceptable for most tasks. It is much more critical to obtain not more data, but the data on links that search engines have actually indexed.

          I would suggest the services look toward integration with Yandex.Webmaster and Google Search Console accounts, to receive data from there and use it when labeling donors. Backlinks without such labels could then be used for further analysis and statistics. It would be interesting to explore these parameters.
          Ramazan Mindubaev, Head of SEO at TRINET.Group
          For some use cases (like copying your competitors backlink strategy or combating a negative backlink attack), high update frequency is crucial.
          Malte Landwehr, Head of SEO at Idealo, maltelandwehr.de
          If you get a lot of backlinks organically, a link database updated in more or less "real-time" is nice to keep track of. This is also a great monitoring feature to keep track of your competitors' link-building efforts in real-time and set short-term actions.

          Nevertheless, I know that this is only possible to a limited extent, even with the greatest usage of link crawling resources. So a delay of a few days or weeks won't be that bad.

          Some SEO tool providers offer features to import link data from Google Search Console, Bing Webmaster Tools, Yandex.Webmaster, etc. That's particularly useful for getting even more backlink data during link-building campaigns.

          Besides a "real-time" backlink index it's also useful to have a separate index that shows outdated, lost, or broken backlinks.
          Alexander Außermayr, SEO, CRO & Web Analytics, aussermayr.com
          It is enough to update the database once every two weeks. I don't notice any practical value in catching a new inbound link the next day.
          Vladislav Naumov, Head of SEO at Inweb
          I can't claim to know the exact optimal frequency but for me, having the index refreshed once a month or on a quarterly basis is essential.
          Getting new links right away is of course an ultimate goal but not really realistic or practicable.
          Charlotte Maghe, Head of SEO, sortlist.com
          Every SEO specialist always wants to receive links immediately.

          Integration with webmaster tools is super important, so that the service can quickly pull backlinks into the index for the relevant domains. So far, this functionality doesn't exist, and we have to look almost everything up in search manually.

          There is a direct correlation between links and positions, but the devil is in the details. A clean backlink profile in white-hat SEO is more critical than many quantitative metrics; I discovered this in 2016-2017 and afterward. We often build backlinks through outreach, and we do not always receive email confirmations that the links were placed. Therefore, it is necessary to search for them manually, because the platforms' backlink indexes come with a delay.
          Nikolay Shmichkov, SEO specialist at SEOquick
          I suppose, from once a week to once a month is optimal. The ability to get new links immediately for me is not particularly important.
          Mikhail Shakin, author of the shakin.ru blog

          Links accuracy


          No tool can guarantee displaying 100% of links. Services typically track only 50-70% of the total backlinks a domain has, but that is often enough to work with a backlink profile regularly.
          Seeing all the links is not strictly necessary to analyze a backlink profile. Ahrefs, for example, shows a lot of links, but many of them come from "spammy" websites linking from low-quality content.
          Vladislav Naumov, Head of SEO at Inweb https://inweb.ua/
          However, it is still essential to get as much information as possible for some tasks. For example, if we are talking about recovering from a penalty.
          During experiments on building up a link profile, we analyzed the data in three popular tools and compared it with Yandex.Webmaster and Google Search Console (GSC) data. As a result, backlink analyzers found about 40-60% of all links placed for the project, which is quite a small share. And the SEO tools we used could not see approximately 40% of the links visible in Yandex.Webmaster and GSC. Thus, we realized that to cover 80+% of backlink data, you have to analyze reports from all available platforms.
          Ramazan Mindubaev, Head of SEO at TRINET.Group

          Match Rate


          Besides the size of the database, the frequency of updates, and the data coverage, it is crucial to focus on its uniqueness in comparison, especially if there are several link analysis tools to choose from.

          The share of unique referring domains is a parameter showing the percentage of donors that competitors' platforms did not detect in similar reports.
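As a rough sketch, both the match rate and the unique share boil down to plain set operations over exported referring-domain lists; the domains below are made up for illustration.

```python
# Referring-domain exports from two hypothetical tools (illustrative data).
serpstat = {"a.com", "b.com", "c.com", "d.com", "e.com"}
competitor = {"c.com", "d.com", "e.com", "f.com"}

common = serpstat & competitor             # donors both indexes found
unique_to_serpstat = serpstat - competitor # donors only one index found

match_rate = len(common) / len(serpstat) * 100
unique_share = len(unique_to_serpstat) / len(serpstat) * 100

print(f"match rate: {match_rate:.0f}%")      # 60%
print(f"unique share: {unique_share:.0f}%")  # 40%
```

In practice you would deduplicate hosts (www vs. non-www, protocol variants) before intersecting, or the overlap will be understated.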

          For this analysis, we randomly selected ten domains in different niches: from gaming to YMYL, but we tried to diversify the selection in terms of geography, niches, and approximate traffic. List of domains we researched:

          • brightlocal.com (USA) - online marketing
          • worldofwarcraft.com (USA) - gaming
          • namepros.com (USA) - web hosting
          • subaru.com (USA) - the automotive industry
          • ourplanet.com (USA) - environmental protection, government organizations
          • ispringsolutions.com (USA) - IT, programming
          • roccat.org (USA) - video games and consoles
          • 11bitstudios.com (Poland) - game design
          • aeg.com (Ukraine) - equipment
          • ppmco.org (USA) - a government health organization.
          For each domain, we've gathered the following data:

          • Referring domains found by Serpstat;
          • Referring domains found by the competing service;
          • Donors common to Serpstat and the competitor;
          • Donor domains unique to Serpstat.
          Serpstat vs. Ahrefs
          Serpstat vs. Semrush
          Serpstat vs. MOZ
          The Serpstat index contains about 60% of the referring domains included in the indexes of alternative services, plus about 40% of unique data not found on similar platforms.
          By examining a site's link profile using Serpstat, you will get more relevant data for analysis. And, using a combination of several services, you will see the complete picture of your current backlinks promotion.

          #expert_perspective


          Have you compared the number of links in different tools for your projects? If so, what conclusions have you drawn?

          Yes, I almost always compare. Sometimes there are very unexpected results. I use Serpstat more often for projects in the CIS countries, but over the past year, the service has shown very good results in the American market as well.

          Ahrefs is also a good tool, which almost always shows the fullest picture of the link profile. I have also used SEO PowerSuite, a good desktop tool. Each service has its own advantages; it is best when you have the opportunity to work with each one yourself.
          Oleksandr K., SEO Specialist at SEOTM Digital Agency
          At Sortlist, we often check the number of our links in various tools such as Ahrefs or Semrush, it's always better to have multiple sources of data checkpoints.

          But as already mentioned, quantity is not the only criterion to take into account. This data should always be taken with a grain of salt. Spoiler alert: the data still shows a lot of spammy links coming up.

          As main points, I would say that you should always weigh your data and those of others, and try to look at quality rather than quantity, and also relevancy is key! So yes, checking your backlink profile in different tools is always a good idea.
          Charlotte Maghe, Head of SEO, sortlist.com
          Yes, I did. Every SEO should try different tools and choose the one they like best. Some variation from the data in Google Search Console is acceptable for any tool. It's ok.
          Mikhail Shakin, author of the Shakin.ru blog
          I have used a lot of different backlink tools and databases. Most of the time it seemed that having a large index is the most important problem they try to solve.

          Accuracy and the possibility to tweak the way those tools gather their link information are things I always have to struggle with. I do not look at index sizes but try to use databases that represent a data set that gives me the most accurate data.
          Sebastian Adler, SEO-Manager at Seoseb, seoseb.de
          The number of links displayed in different SEO tools can differ wildly. What I have noticed is that Serpstat seems to cut out the organic spam that websites naturally pick up and just shows the links that matter.

          I think Serpstat is the first I've seen to do that out of the different tools I've used over the years. Whilst this is helpful to cut out the noise, knowing about the spam sites is sometimes useful in case those domains are actually impacting your domain's health.
          Steph Andrusjak, SEO Specialist, seosteph.co.uk
          I think it is important to never trust one single tool, but to combine the data from all of them (plus the Google Search Console), to get a wider overview. The one insight I believe is often overlooked is discounting disavowed links and looking for the hidden ones to put them into consideration.
          Timo Fach, Head of SEO at iGaming.com
          Yes, I have done it many times.

          So far, no tool gives clear parameters needed for filtering out, so you have to sort the domains and check them separately.
          For example, I need to check in Google not only NOFOLLOW, but also SPONSORED / UGC content. If you need a domain comparison tool — Serpstat has a more suitable implementation.

          Serpstat has a good link analysis tool, and its data is worth exploring.
          Nevertheless, it would be helpful if there were more integrations with webmaster tools to find your links promptly and, of course, better methods for sorting link data. The same goes for database depth, from historical data to the current state (Serpstat already has these features). And for sorting, I still lack several metrics that no tool offers yet, so I have to check links manually.

          Nikolay Shmichkov, SEO specialist at SEOquick
          Yes, I use different tools. The core insight from me is that you never know the source of the data being presented. Most of the time, every tool shows different results, making it harder to decide which data to rely on. My strategy is to compare, check, and sort the data manually so I can get the most complete overview.
          Gundel Woite, Manager at Jenpix
          In everyday work, I use several tools to obtain more detailed data. For now, no single tool can guarantee a 100% link display. Ahrefs shows a lot of links, but many of them are junk. Some data from Search Console is also useful.

          Serpstat's solution is pretty good so far in terms of convenience (it has an API function for analyzing backlinks and a URL rescan tool). When you work with massive data, it saves a lot of time.

          Aleksey Biba, Head of SEO, privatbank.ua

          Why Serpstat?

          So how do you stop paying for garbage in backlink indexes? As we've seen above, having a lot of link data doesn't matter if it's outdated or incomplete.

          Since links are one of the leading ranking factors in SEO, it is necessary to pay careful attention to link analysis and the quality of the analyzed data.
          When analyzing a backlink profile, I first look at the number of referring domains. After that, I also pay attention to the quantity of the backlinks, but the referring domains are more critical since the links can be site-wide.

          I study the dynamics of backlink diversity growth. It should increase gradually and constantly, without drastic changes (both positive and negative).

          It also makes sense to compare the link diversity with competitors and their growth dynamics.
          Mikhail Shakin, author of the Shakin.ru blog
Serpstat Backlink Analysis will help you obtain the information you need for your research (even if you are already using another tool).

Serpstat focuses on the quality of the index data. In addition to standard link-analysis reports such as:

• Referring domains
• Backlinks
• External links
• External domains
• Link anchors
• Top pages (by link diversity)
the platform offers several unique link-building features: Links Intersect, Redirecting Domains, Malicious Domains, and a URL recrawling tool.

          1
The Serpstat URL Recrawling Tool provides link data on demand. Send a list of pages (up to 500 URLs per day) with new backlinks, and our robot will collect data on them within 2-3 days. The feature is available on any paid plan.

The tool is helpful if you have purchased links and want to track their impact on SDR (Serpstat Domain Rank) right away, so you can remove them if necessary and avoid negative consequences. It is also useful when a client needs up-to-date information "for today" and new links have not yet appeared in the report.
          Recrawling tool
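If you track more pages than the quota allows, the 500-URLs-per-day limit mentioned above means a larger list has to be submitted over several days. A minimal sketch of that batching logic (the helper function is purely illustrative, not part of the Serpstat API; only the daily limit comes from the article):

```python
# Split a list of URLs into daily batches that fit Serpstat's
# stated recrawl quota of 500 URLs per day.
DAILY_LIMIT = 500

def daily_batches(urls, limit=DAILY_LIMIT):
    """Chunk a URL list so each chunk fits the per-day recrawl quota."""
    return [urls[i:i + limit] for i in range(0, len(urls), limit)]

# Example: 1,200 URLs would take three days of submissions.
urls = [f"https://example.com/page-{n}" for n in range(1200)]
batches = daily_batches(urls)
print(len(batches))      # 3 batches, i.e. 3 days of submissions
print(len(batches[-1]))  # 200 URLs left for the final day
```

With a scheduler (cron, CI job, etc.) you can submit one batch per day until the whole list has been recrawled.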
          2
Links Intersect is a tool that compares the link profiles of several domains at once to find donors that are common to, or unique for, each analyzed web resource.

For example, suppose you want to find high-quality, relevant websites for link placement. Previously, you would have to explore each competitor separately and manually search the reports for common sites. Now it is enough to pick several relevant competitors, enter their domains into the tool, and automatically find the donors that link to all of them.
          Links intersect
          3
Redirecting Domains is a report that shows domains that redirect entirely to the analyzed domain while passing link weight. Redirecting entire domains to a website is a popular link-building strategy. Track the influence of this method on the domain and find out which competitors buy dropped domains and attach them to their websites.
          Referring redirecting domains
          4
Malicious domains. If your site has received a link from a questionable resource, you can see the platform and the type of vulnerability targeted by the "infected" donor.
          Malicious donors
          5
The backlink analysis API offers 16 methods covering referring domains and pages, malicious donors and acceptors, link-profile growth dynamics, anchors, top pages by number of backlinks, donor intersections, and more. Use it to automate routine SEO processes and build custom reports.
Do you have technical questions about the link analysis service? Contact the Serpstat support team: our specialists will answer within one minute. Only a few platforms provide free support, and we are among them, with no limits or restrictions even for users without a paid subscription.

          Conclusion

Most of the time, indexes are evaluated by the size of the backlink database (the number of referring domains or pages), and this indicator seems straightforward. However, it is the relevance and completeness of the data that are decisive.

SEO specialists often use several platforms, combining data from multiple reports and indexes for a comprehensive analysis. One such tool in your arsenal can be Serpstat, which, in addition to its own backlink index, offers numerous unique reports for backlink analysis.

          Well, we have shared various insights from the world of backlinks. Leave comments to discuss the article and ask additional questions about backlink diversity analysis in Serpstat :)
