SEO | 9 min read | February 13, 2019

How To Get 34% Increase In Traffic By Optimizing Your Existing Content

Dino Kukic
SEO Manager at Hundred5
Most users work with Google Search Console by logging in to its dashboard and analyzing the predefined reports. One of its main limitations, however, is that the GSC dashboard doesn't show you which queries your individual pages rank for; the new dashboard does, but only one page at a time.
Enter Google Search Console API.

The API comes in handy here because you can pull all of the data and work with it elsewhere. There's no limit on what you can pull: you can export every query together with the page that ranks for it and the related metrics. Not only is this crucial for success in the SERP, it's also a good workaround for common problems such as the "(not provided)" keywords in Google Analytics.

Why you should go back to your old content

The bigger your website gets, the harder it is to manage its presence in the SERP. And, let's be honest, most website owners don't go back to review existing content and check its performance; at best they check it in bulk, and rely on new content as their main tool for driving more traffic. While that's a logical and valid approach, there are often hidden gems in the old content, too.

Not all tactics are that intuitive, either. Sometimes, looking back, you'll see that the majority of your existing content doesn't really drive any traffic, and to reduce the number of pages your link juice gets spread across, you might actually delete it and, against common sense, increase your organic traffic.

We won't go that far here, though. This approach is all about getting on top of your existing content: which keywords you already rank for, and which keywords you could rank for with a minimum of targeted effort.

Getting the data from Google Search Console

There are a few ways to approach this. One is to query the Google Search Console API with a programming language, for which I've chosen R. Another is to use a Google Sheets add-on called 'Search Analytics for Sheets'. If you're less technical, the add-on is the obvious choice, but if you like to get your hands a little dirty, go with R.

I originally started doing this process in R because the add-on hadn't worked for quite a long time after changes to the GSC API. Before writing this I checked again and, surprisingly enough, it worked. However, it often has authorization issues for which I haven't found a solution other than using something else to pull the data. This guide therefore covers both ways.

Pulling the data from GSC using R

While R might not be the first choice for most people, I find it pretty comfortable for working with APIs and manipulating data at scale. For the purpose of optimizing content and this small tutorial, we'll use R only to pull the data from GSC; the rest we'll do in Google Sheets. The package we're using, searchConsoleR, was developed by IIH Nordic's very own Mark Edmondson and has saved many SEOs and web analysts a lot of time.

To use it, you'll first need to obtain a client ID and secret, which you can do in the Google API Console.
# Install and load the packages (installation is only needed once)
install.packages('googleAuthR')
install.packages('searchConsoleR')
library(googleAuthR)
library(searchConsoleR)

# Set the OAuth credentials obtained from the Google API Console
options("searchConsoleR.client_id" = "ADD_YOUR_CLIENT_ID")
options("searchConsoleR.client_secret" = "ADD_YOUR_CLIENT_SECRET")

# Authenticate (opens a browser window)
scr_auth()

# Check the list of websites in GSC (best to copy the URL for the query)
sc_websites <- list_websites()
sc_websites

# Pull the data - you can edit the date range
gsc_data <-
  search_analytics("https://hundred5.com/",
                   "2018-06-14", "2018-09-14",
                   c("query", "page"))

# Create a CSV file based on the query
write.csv(gsc_data, file = "gsc_data_export.csv")

# Find the location of the file
getwd()
The purpose of the last function is to give you the path to the CSV file you've just created; alternatively, you can call setwd() beforehand with the path where you'd like the file to be created.

After you have done this, import the CSV file (or paste the data) into Google Sheets and continue with the section on working with the GSC data.
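If you'd rather stay in code instead of Sheets, the export can also be picked up with pandas. This is just a sketch: the sample rows below stand in for the real gsc_data_export.csv, and the column names (query, page, clicks, impressions, ctr, position) are assumed to match what searchConsoleR writes, plus the unnamed row-number column that write.csv() prepends.

```python
import io
import pandas as pd

# A couple of sample rows standing in for the real gsc_data_export.csv
# (in practice, read the file from the path reported by getwd() above).
sample_csv = io.StringIO(
    '"","query","page","clicks","impressions","ctr","position"\n'
    '"1","hiring tips","https://hundred5.com/blog/hiring-tips",12,340,0.0353,9.4\n'
    '"2","skills test","https://hundred5.com/blog/skills-test",3,210,0.0143,11.2\n'
)

# index_col=0 drops the row-number column that write.csv() adds.
gsc = pd.read_csv(sample_csv, index_col=0)

# Sanity check: one row per query/page pair with its metrics.
print(gsc.columns.tolist())  # ['query', 'page', 'clicks', 'impressions', 'ctr', 'position']
```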

Pulling the data with Search Analytics for Sheets Add-on

In Google Sheets, open the Add-ons menu. Note: at this stage, you won't see the "Search Analytics for Sheets" option yet.

Once you click "Get add-ons", search for "Search Analytics for Sheets" and install it.

Go to Add-ons again; now that the option is there, go to Search Analytics for Sheets and choose 'Open Sidebar'.
At this stage, you only need "Query" and "Page" among the options; then just press "Request Data" at the bottom. However, if you're targeting a specific country with a language other than English, you should also use the "Filter By" option, choose "Country", and enter the country you're targeting.

Working with GSC Data to find low-hanging fruits

From here, you'll have a table with a row for each query and page, along with clicks, impressions, CTR, and position.
Press Cmd + A (or Ctrl + A on Windows) to select the whole table and then, Data > Pivot Table.
As for the settings, the pivot table should contain the following:

Rows:

  • Page, Order: Ascending
  • Query, Order: Descending, Sort By: SUM of Impressions (can be added after adding impressions as values)

Columns: None

Values:

  • Clicks, Summarize By: SUM
  • Impressions, Summarize By: SUM
  • CTR, Summarize By: AVERAGE
  • Position, Summarize By: AVERAGE

Filters: None

To make the table easier to read and work with, select the entire CTR column and format it as a percentage, and reduce the Position column to one or two decimals.
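If you're scripting this step instead of using Sheets, the same pivot can be reproduced with pandas. A sketch on a few illustrative rows (the values are made up; with real data you'd feed in the full export):

```python
import pandas as pd

# Illustrative rows standing in for the full GSC export.
gsc = pd.DataFrame({
    "query":       ["hiring tips", "how to hire", "skills test"],
    "page":        ["https://hundred5.com/blog/hiring-tips",
                    "https://hundred5.com/blog/hiring-tips",
                    "https://hundred5.com/blog/skills-test"],
    "clicks":      [12, 5, 3],
    "impressions": [340, 150, 210],
    "ctr":         [0.0353, 0.0333, 0.0143],
    "position":    [9.4, 8.1, 11.2],
})

# Mirror the Sheets pivot: rows = Page, Query; SUM of clicks and
# impressions, AVERAGE of CTR and position.
pivot = gsc.pivot_table(
    index=["page", "query"],
    aggfunc={"clicks": "sum", "impressions": "sum",
             "ctr": "mean", "position": "mean"},
)

# Sort queries within each page by impressions, descending,
# like the "Sort By: SUM of Impressions" setting.
pivot = pivot.sort_values(["page", "impressions"], ascending=[True, False])

# Format CTR as a percentage and round position to one decimal.
pivot["ctr"] = (pivot["ctr"] * 100).round(1)
pivot["position"] = pivot["position"].round(1)
print(pivot)
```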

Now it's time to select the pages we're going to work with. We're looking for pages with keywords that have a significant search volume (read: impressions) while ranking roughly between 8th and 12th position. To do that, we'll first identify the pages with enough impressions.
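In code, that filter is a couple of lines. A sketch on illustrative rows; the 100-impression threshold is an example, so pick one that fits your site's traffic volumes:

```python
import pandas as pd

# Illustrative rows; in practice this is the aggregated pivot data.
pages = pd.DataFrame({
    "query":       ["hiring tips", "skills test", "niche term"],
    "page":        ["https://hundred5.com/blog/hiring-tips",
                    "https://hundred5.com/blog/skills-test",
                    "https://hundred5.com/blog/niche"],
    "impressions": [340, 210, 40],
    "position":    [9.4, 11.2, 9.0],
})

# Low-hanging fruit: enough impressions, ranking roughly 8th-12th.
MIN_IMPRESSIONS = 100  # example threshold, tune for your site
fruit = pages[
    (pages["impressions"] >= MIN_IMPRESSIONS)
    & pages["position"].between(8, 12)
]
print(fruit["query"].tolist())  # ['hiring tips', 'skills test']
```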
From there, we'll expand each of those pages to see which keywords it ranks for and pick the ones we want to target better. In this particular example, after opening up the queries, I identified three keywords I'd like to target better.
Now that I have keywords where my page ranks in a position from which moving up a few spots would bring significant results, I would do some of the following to get there:

  • Adjust the title to better address the keyword (without killing your rankings for keywords with higher search volume, if you already have them)
  • Address the keyword in the first paragraph
  • Create another paragraph addressing the keyword
  • Build links to the page

Results you can expect with this approach

After an initial small drop in traffic, Google started picking up the changes about two weeks later, and we began to see a significant, positive change in organic traffic. In total, the changes accounted for an increase of 34.17% in users and 34.15% in sessions, as shown in the screenshot below.
As for the individual articles, here are the three most important ones relative to the effort put in.
As you can see, if you run an established blog, you can go back to what you've already published and use this approach to capitalize on existing content. Sometimes a light touch-up is enough; sometimes you'll realize you'd have to write a completely new blog post to address the keywords properly. That's also a valid approach, particularly if the old post has links pointing to its URL.
