SEO – 14 min read – November 28, 2018
How To Make A Niche Analysis
With Our Crawling Tool
2. Crawling features
3. How to make a niche analysis
3.1. Data storage and extension
3.2. Analyzing crawling results
3.3. Dynamics of positions and traffic changes by domain
3.4. What happens to keywords in certain positions
3.5. Analysis of changes in the top by phrase
3.6. Comparison with competitors
3.7. What we can do with the crawling results
4. Difference between Crawling, Rank Tracking and the API "Top for a keyword"
5. Summing up
- scan the Google top
- choose any region (or several)
- select a convenient scanning schedule
- receive data using the API
- site position (according to the region and schedule you've set)
- target URL
- the number of results
- paid search results for the phrase
A report looks like this:
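Based on the fields listed above, a single row of such a report might look like this (the structure and field names here are illustrative, not the tool's actual schema):

```python
# Hypothetical example of one crawl-report row; all values are invented.
report_row = {
    "keyword": "seo tools",
    "region": "US",
    "date": "2018-11-23",
    "position": 4,                 # site position in the Google top
    "url": "https://example.com/blog/seo-tools",  # target URL
    "results_count": 128_000_000,  # number of results for the phrase
    "ads": 3,                      # paid search results shown for the phrase
}
print(report_row["position"])
```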
- in addition to positions, we can regularly collect search volume for these phrases. With the volume and position of each phrase for your domain and competitor domains, you can already estimate approximate traffic, both overall and per category.
- you can find out which categories lead, which lag behind and, accordingly, which keywords hold your website back.
- which competitors have come closest to the positions of the promoted site, and on which phrases.
- sorting phrases/categories that have moved into the top. Based on this data, you can prioritize these categories in your promotion.
- you can track regional search results to understand in which regions competitors are in the lead and on which topics. It often happens that you lead in large cities, but in some cities niche competitors outrank you on specific topics without your noticing.
- crawling captures the snippet of each site, so you can evaluate competitors' snippets, learn their weaknesses and improve your own.
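Surfacing leading and lagging categories can be sketched as a simple aggregation over per-keyword traffic estimates. A minimal example in plain Python (all domains, categories and numbers are invented for illustration):

```python
# Toy per-keyword traffic estimates, tagged with domain and category
# (all names and numbers are invented for illustration).
rows = [
    ("oursite.com", "audit", 1200),
    ("oursite.com", "links", 300),
    ("rival.com",   "audit", 800),
    ("rival.com",   "links", 900),
]

# Sum traffic per (category, domain) to see where each side leads.
totals = {}
for domain, category, traffic in rows:
    totals[(category, domain)] = totals.get((category, domain), 0) + traffic

for (category, domain), traffic in sorted(totals.items()):
    print(category, domain, traffic)
# In this toy data, "links" is a lagging category: rival.com gets 900 vs our 300.
```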
How to make a niche analysis
Data storage and extension
Let's take a real example. We want to analyze the SEO niche in Google USA, and we have already collected 64 queries for this. Every day we parsed the tops, received volume from the Serpstat API, and pulled CTR data for Google search results by position. We won't go into the technical details of this solution, because you are free to choose how to process and store the JSON objects. I used an R + Google Sheets + Google BigQuery stack, but you can do without R by using Apps Script.
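The author used R for this step; as one possible alternative, here is a minimal Python sketch of flattening daily crawl JSON into table rows. The input structure is an assumption for illustration, not the tool's actual format:

```python
import json

# Assumed shape of one day's crawl output (illustrative only).
raw = json.loads("""
{
  "date": "2018-11-23",
  "keywords": [
    {"phrase": "seo audit", "volume": 5400,
     "top": [{"position": 1, "domain": "oursite.com"},
             {"position": 2, "domain": "rival.com"}]}
  ]
}
""")

# Flatten into one table row per (keyword, domain) pair.
rows = [
    {"date": raw["date"], "phrase": kw["phrase"], "volume": kw["volume"],
     "position": hit["position"], "domain": hit["domain"]}
    for kw in raw["keywords"] for hit in kw["top"]
]
print(rows[0])
```

Rows in this shape are straightforward to append to a Google Sheet or load into BigQuery.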
We'll add three columns to the final table: CTR, Queries (frequency) and Traffic. Then it's time to go to Data Studio and build interactive reports based on this table.
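The Traffic column can be derived from the other two: estimated traffic is search volume multiplied by the CTR at that position. A sketch with an assumed CTR curve (the numbers are made up, not real Google CTR data):

```python
# Assumed CTR-by-position curve (illustrative numbers, not real Google data).
CTR = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

def estimated_traffic(volume: int, position: int) -> float:
    """Estimated traffic for a phrase: search volume times CTR at its position."""
    return volume * CTR.get(position, 0.01)  # small fallback CTR below top 5

print(estimated_traffic(5400, 2))  # 5400 * 0.15, i.e. roughly 810 visits
```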
Analyzing crawling results
We won't dwell on specific numbers, obvious or otherwise; instead, let's see how to work with such a report. We also won't spread positions and traffic across separate report sheets: this way, all of the data fits on one screen. I made up the CTR values for this report myself; any resemblance to real data is coincidental :)
Dynamics of positions and traffic changes by domain
What happens to keywords in certain positions
Analysis of changes in the top by phrase
Comparison with competitors
We take the data for November 23. The total traffic is 237.3 thousand. Udemy's traffic is 2.4 thousand, which is about 1%. Search Engine Watch's traffic is 1.7 thousand, which is about 0.7%.
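The shares above are simple ratios of a domain's estimated traffic to the niche total:

```python
# Traffic figures from the November 23 report, in absolute visits.
total = 237_300   # total estimated traffic for the niche
udemy = 2_400
sew   = 1_700     # Search Engine Watch

print(round(udemy / total * 100, 1))  # about 1% of the niche
print(round(sew / total * 100, 1))    # about 0.7%
```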
What we can do with the crawling results
We also haven't added the analysis of special SERP elements and contextual advertising. With that data, you could apply corrections to the CTR depending on their presence, which would give more accurate traffic estimates.
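One simple way to apply such a correction, assuming you have recorded how many ads each SERP showed. The penalty multiplier here is made up for illustration; in practice you would calibrate it against your own click data:

```python
# Assumed correction: each ad block above the organic results eats some CTR.
def corrected_ctr(base_ctr: float, ads_count: int, penalty: float = 0.1) -> float:
    """Scale the base CTR down by an assumed penalty per ad shown."""
    return base_ctr * max(1.0 - penalty * ads_count, 0.0)

print(corrected_ctr(0.30, ads_count=3))  # roughly 0.21: three ads cut CTR by ~30%
```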
Another possibility is parsing snippets. You can analyze the snippets of sites in the top 10 and mine them for ideas to optimize your pages. For example, if all your competitors offer free shipping right in the snippet but you don't, you lose a competitive advantage.
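A quick check of which top-10 snippets mention an offer you lack can be sketched like this (the snippet texts and domains are invented):

```python
# Invented snippet texts for a top-10; in practice they come from the crawl.
snippets = {
    "rival1.com": "SEO toolkit with free shipping on all swag orders...",
    "rival2.com": "Audit your site in minutes. Free shipping worldwide.",
    "oursite.com": "Professional SEO audits and rank tracking.",
}

offer = "free shipping"
with_offer = [d for d, text in snippets.items() if offer in text.lower()]
print(with_offer)  # domains whose snippet mentions the offer; ours is missing it
```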
You can add all this information to a report in Data Studio, turning it into a comprehensive top-analysis system. It's always handy to keep such a system within reach. Start by collecting the data, then improve the system as needed.