Ahrefs, Moz, SEMrush…there are some great SEO tools out there, and with them, you can do a variety of things. You can check backlink profiles. Do keyword research. Find unlinked mentions and guest posting opportunities. You can even run comprehensive SEO audits with the click of a button. But whether you’re at an agency or in-house, a small business or an enterprise—there are certain areas of functionality where those tools fall short. And where they fail, Google Search Console prevails.
While powerful, the garden-variety SEO tool is, or should be, supplemental to your SEO strategy. If you’re in the business of optimizing for organic search, you should be living in Search Console and using other tools to help you complete ancillary tasks. Not totally comfortable with Search Console as a living space? Not to worry. Today, I’m going to teach you how to get nice and cozy with the most pivotal features Search Console has to offer.
Even better: after launching the new and improved Search Console in January, Google officially moved it out of beta last week. So today, while I’m going to be teaching you 7 steps to making the most out of the new Google Search Console, I’ll also discuss how the new and old interfaces differ.
Alrighty, then! Let’s hop in.
Step #1. Add and Verify Your Site
Before we get into functionality, you’re going to want to add and verify your site within Search Console. Head to the dropdown at the top left of your dashboard and click “add property.”
Make sure you enter your site’s URL exactly as it appears in your browser. If you support multiple protocols (http:// and https://), or multiple domains (example.com, m.example.com, and www.example.com), make sure you add each as a separate property. Once your site is added, Search Console will begin collecting data.
Just adding a property won’t give you access, though—you also have to verify that you own it. Head to the Manage Property tab for the property you added on the Search Console home page.
Select “verify property” in the dropdown and choose one of the recommended verification methods. These will vary depending on the makeup of the site you’re verifying. If you’re struggling to implement one of the verification methods, want to change your verification method, or simply want a more in-depth explanation of each process, this page is a great resource on all things site verification.
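For reference, the HTML tag method is one of the simplest verification options: Search Console hands you a meta tag, and you paste it into the <head> of your home page. Here’s a sketch of what that looks like (the content token is a placeholder; Search Console generates your real one):

```html
<head>
  <!-- Verification tag from Search Console; the content value is a placeholder -->
  <meta name="google-site-verification" content="your-unique-token-here" />
  <title>Example Site</title>
</head>
```

Once the tag is live, head back to Search Console and click “Verify”; Google will fetch your home page and look for it.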
Step #2: Indicate a Preferred Domain
Indicating a preferred domain tells Google whether you want your site listed as https://www.example.com or https://example.com. Choosing one over the other is not going to give you any kind of advantage in organic search; however, you do want to make sure you choose one or the other.
Select your property from the Search Console home page (note: we are doing this in the old Search Console). Once in, click the gear icon in the top right of your dashboard and select Site Settings:
In the Preferred Domain section, you will see the option to select between the www.example.com and example.com versions of your site. You will also see the option “Don’t set a preferred domain.” Select this option, and you leave yourself open to the possibility of Google treating “www” and “non-www” URLs as different URLs. Doing so could really splinter the link equity of those pages and hinder search visibility. By instead selecting one version of your site as “preferred,” you are telling Google to treat any non-preferred domains it comes across as your preferred domain.
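Worth noting: the preferred domain setting only tells Google how to treat the versions it encounters; it doesn’t redirect anyone. If you also want visitors and crawlers funneled to one version at the server level, a 301 redirect handles that. Here’s a minimal sketch for an Apache server, assuming mod_rewrite is enabled and the www version is your preferred domain (swap in your own):

```apache
# .htaccess: 301-redirect non-www requests to the www version
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [L,R=301]
```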
Step #3: Integrate Search Console with Google Analytics
Analytics gives you traffic and conversion data; Search Console gives you a look at the causal search factors underlying that data. Linking the two gives you a huge boost in reporting.
To link Search Console and Analytics, head to the admin panel at the bottom left of your Analytics dashboard. From there, you’ll want to click into the Property Settings at the Property level.
Note: If you don’t have access to Property Settings, as I do not here, it means you don’t have Edit permissions at the Property level. You’ll need to acquire that from another owner if you want to link Search Console yourself.
Next, scroll down to Search Console Settings. You’ll see your website’s URL, which confirms that the website is verified in Search Console and you have permission to make changes. Under Search Console, select the reporting view in which you want to see data, click Save, and you’re ready to rock.
You’ll now see a Search Console report within the Audience tab of your Analytics dashboard.
Using that report, you now have the ability to correlate pre-click data like queries and impressions with post-click data like bounce rate and goal completes. The Landing Pages report houses search data for every URL on your site that is displayed in the search results. So if there’s a page you recently updated and you’re hoping better rankings will translate to more traffic for that page, or if there’s a page that is tanking in traffic and you want to find out which specific search metrics are contributing to that tank-age, you can use the Landing Pages report to fully understand those correlations.
How has change in click-through rate over time affected goal completes? How has average position in the SERP affected sessions or time-on-page? Linking Search Console and Analytics allows you to analyze all of these unique relationships. You can also use the Countries report, the Devices report, and the Queries report to analyze these same metric relationships when broken out by country, device, and search query.
Another note: Unfortunately, the Search Console report only shows data as far back as Search Console has been collecting it for your site. Fortunately, while the old Search Console gives you just three months of search data, the new Search Console gives you 16 months. This is moot if you’re only linking Analytics to Search Console now, but over time, it’s definitely helpful to have that extra data to analyze (think of how often you look back past three months at a page’s historical traffic/conversion numbers).
Step #4: Submit a Sitemap
Not sure if you have a sitemap? Head to example.com/sitemap.xml. If there’s nothing there, you don’t have one.
Naturally, you need to have a sitemap if you want to submit one to Search Console. Here are some sitemap generation best practices (you’ll find a bare-bones sample sitemap after the list):
- File size: Less than 50 MB
- Number of URLs: No more than 50,000 per sitemap
- If you have more than 50,000 URLs: Generate multiple sitemaps
- Only include canonical URLs. Exclude URLs you’ve disallowed with robots.txt
- From Google: “XML sitemaps should contain URLs of all pages on your site.” If you have a large site, you can paraphrase this as, “…all valuable pages on your site.” That includes any page with high-quality, original content. It excludes “utility pages”—pages that might be useful to a user, but are not useful as a search landing page.
- Common content management systems (CMSs) like WordPress and Drupal have plugins that help you generate sitemaps. Others, like Squarespace, generate and update them automatically.
- If all else fails, this article gives a rundown on how to build a dynamic site map. If you have a small site, this tool will build one for you.
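As mentioned above, here’s what a bare-bones sitemap looks like, following the standard sitemaps.org protocol (the URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per canonical URL; <lastmod> is optional but helpful -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2018-08-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/blog/sample-post/</loc>
    <lastmod>2018-07-15</lastmod>
  </url>
</urlset>
```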
Ok, so you have your sitemap! Now, to help Google understand the content your site consists of, you’re going to want to submit it. To do so, head to the Sitemap tab in the new Search Console:
Enter your new sitemap URL, click submit, and bam! You’re in business.
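One extra tip: you can also advertise your sitemap’s location in your robots.txt file so that any crawler (not just Googlebot) can find it. Assuming your sitemap lives at the standard path, it’s a one-line directive:

```text
# robots.txt
User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap.xml
```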
Step #5: Leverage the Index Coverage Status Report to Fix Site Errors
The old Search Console housed the Index Status report in the Google Index tab; the new Search Console displays it right in the dashboard, where you can’t miss it.
They’ve also updated the name to the Index Coverage Status report. It looks like this:
Per Google, the new report provides all the same information as the old report, plus detailed crawl status information from the Index. What kind of glorious insights can you glean from this new (but ostensibly the same) report? Let’s run through each of the tabs.
- Error: Runs through all potential site errors so you can go through and make fixes. These could include server errors, redirect errors, robots.txt errors, 404s, and a variety of others.
- Warnings: A warning means a page is indexed, but blocked by robots.txt. If you want to block a page from the index, Google prefers the ‘noindex’ tag over robots.txt. A page blocked via robots.txt can still show up in the index if other pages link to it. These warnings give you the opportunity to go through and correctly de-index those pages (see the snippet after this list for what the preferred tags look like).
- Valid Pages: All of these pages are in the index. If you see the “Indexed, not submitted in sitemap” status, you should make sure you add those URLs to your sitemap. “Indexed; consider marking as canonical” means that page has duplicate URLs, and you should mark it as canonical.
- Excluded Pages: These are pages that have been blocked from the index by a ‘noindex’ directive, a page removal tool, robots.txt, a crawl anomaly, by virtue of being duplicate content, etc.
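Here are the tags referenced in the Warnings and Valid Pages items above, as they’d appear in a page’s <head> (the URL is a placeholder):

```html
<!-- Keep a page out of the index the way Google prefers: a 'noindex' robots meta tag.
     The page must stay crawlable (not blocked by robots.txt) for Google to see this. -->
<meta name="robots" content="noindex">

<!-- Resolve duplicate URLs by pointing them all at the version you want indexed -->
<link rel="canonical" href="https://www.example.com/preferred-page/">
```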
Google gives great insight into what each and every one of these statuses means and how you should go about fixing them. We don’t have room to get into all of them here, but generally speaking, you can get the 411 on each URL by clicking on the tab you want to investigate, then clicking on the description that populates the Details section of the report:
Click the URL in the Examples tab:
That will open up this nice panel that gives you a few different ways to inspect the issue:
Here’s what you can do with each of these functions:
- Inspect URL: Look at the referring page, the last crawl time, whether or not crawling is allowed, whether or not indexing is allowed, whether or not you’ve declared that page as canonical, and whether or not Google views that page as canonical.
- Test Robots.txt blocking: Head to your site’s robots.txt file (example.com/robots.txt) and you can see all the elements on your site blocked from crawling. Naturally, you’re not always going to remember which elements appear on which pages. The robots.txt tester highlights parts of your page that may or may not be triggering robots.txt blocking (there’s a sample robots.txt file after this list).
- Fetch as Google: Allows you to see your page exactly as Google sees it. Googlebot heads immediately to your page and shows you the downloaded HTTP response it reads. Use “Fetch and Render” and you can also see the physical layout of your page as Google sees it.
- View as Search Results: Allows you to see what your page looks like in the Index.
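For context, here’s the kind of file the robots.txt tester checks your pages against; a simple hypothetical example with placeholder paths:

```text
# robots.txt: block all crawlers from the /admin/ and /cart/ sections
User-agent: *
Disallow: /admin/
Disallow: /cart/

# Everything else stays crawlable. Remember: Disallow blocks crawling, not
# indexing; use a 'noindex' tag to keep a crawlable page out of the results.
```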
These functions are available for all of the URLs Search Console picks up in the Index, so you can use them to sort through the statuses associated with your errors, warnings, valid pages, and excluded pages. Make sure to validate your fixes so Google puts a rush on re-crawling the affected page:
That’s it in a nutshell! The Index Coverage Report can be used to detect and remedy every error associated with your site.
Step #6: Leverage the Performance Report to Update Content
Basically, all the metrics that you see in Analytics when you link your account to Search Console come from the Performance Report. The Performance Report replaces the “Search Analytics” report in the old Search Console; like the Index Coverage report, there’s not too much of a difference between the old report and the new one. But you can still do some pretty sweet stuff with it. Let’s take a look.
First, open the Performance Report. It’s the first table you see in your overview.
You’re not limited to tracking these metrics in Analytics; you can use them to look for opportunities to improve performance. The best way to do this is to use the filter function:
Use the tabs to investigate your pre-click metrics at a query, page, country, or device level. You might want to see, for instance, which of your pages sit on page one of the search results but have a lower than site-average CTR in the past six months:
Or maybe you want to find queries for which you rank outside the top 10 but still get a solid amount of impressions, so you can then go back and optimize the corresponding pages in an attempt to gain rank:
Most third-party SEO tools have similar functions by which you can go in and look for keyword opportunities, but it’s nice getting data straight from Google!
Step #7: Use the Links Report to Boost Specific Pages
The Links Report is located at the bottom of your dashboard:
There are a few handy things you can do with it. Here are my two favorites:
1. Boost specific pages using your most linked-to pages. The most linked-to content on your site is where the most link equity lies. Linking internally from those equity-rich pages to pages you want to boost is a great way to increase rank. To find out where the most link juice resides on your site, click into either of the top linked pages sections of the link report:
The external links section allows you to sort by number of referring domains—which is a big ranking factor for Google, so those pages are going to have a lot of inherent equity. Find pages that are on the cusp of driving serious business value, link to them (naturally) from these high-quality pages, and track the results!
2. Disavow links from spammy sites.
Head to “Top linking sites” in your Links Report overview. Expand the list and you can see all the domains linking to your site. Add any low-quality or spammy domains to the file you’ll upload to Google’s disavow links tool. Note: Per Google, you should only disavow links if you are confident they are doing harm to your site. Disavowing links that are boosting performance will harm your site. Still, it’s worth investigating whether or not there are domains pointing to your site that you should be concerned about.
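If you do decide to disavow, the file Google expects is plain text with one directive per line. Here’s a sketch of the format (the domains are hypothetical):

```text
# Disavow file (plain .txt), uploaded via Google's disavow links tool
# Lines starting with '#' are comments

# Disavow every link from an entire domain:
domain:spammy-link-directory.example

# Or disavow a single offending URL:
http://spam.example/links/page37.html
```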
Beyond the 7 Steps
If you’ve grasped each of these 7 steps, you’ve put yourself in a good spot to move closer to Google Search Console mastery. These concepts and reports make up the main thrust of Search Console’s utility as an SEO tool. That said, you can of course go deeper. Backlinko has a super useful guide that gives great insight into some more advanced tactics if you’re interested.