After reading a number of articles about pruning a site’s content – mostly dealing with ecommerce sites – we wanted to run a test of our own on pruning blogs. When I say “pruning,” I mean removing pages that have few or no links, aren’t driving traffic or conversions, and receive very few visits.
If you haven’t read Everett Sizemore’s post on Moz about pruning ecommerce sites and the results, it’s a trip and you should stop reading this and go there first.
Most of the Seer team doesn’t know, but 84 posts vanished from our site a number of weeks ago.
Why? Data from pruning studies of ecommerce sites showed that some sites can improve rankings & traffic for the remaining, valuable pages. It’s called “pruning” because it’s much like a tree benefiting from having sick or dead limbs removed.
What We Removed:
- Duplicate pages – Seer’s blog generated duplicate pages at some point after our redesign a few years ago. These were pages that ended in “-2”. There were 14 of these pages that had no links at all and were 100% duplicates. Gone. Want to see one of them? (http://www.seerinteractive.com/blog/dont-delete-your-myspace-account-sell-it-part-one-2/)
- We had a number of posts that were completely off topic. We found a lot of these just by reading the URLs. Sell your MySpace page? Back when Seer practiced less purposeful link building, this was something we could post to the blog. How does this speak for Seer today? It doesn’t at all. While it’s important to know where we came from, and we’re happy to be transparent about old posts we’ve taken down, some topics just don’t belong anymore. That post also had a number of broken images, which leads us to point 3.
- 70 posts were completely off topic or had all of their images broken. These pages drove little traffic, had higher bounce rates than average, and simply offered a poor UX. Between broken images and jacked-up formatting, these got tossed.
Overall, we removed about 9% of total blog posts.
We reached a quick consensus that Seer was willing to give up about 1% of total blog traffic for this test. No post in the removal pool earned more than 20 organic visits in all of 2016. More importantly, none of them converted. That part isn’t surprising. If our “sell your MySpace” post with broken images had generated a genuine lead form submission, Seer likely wouldn’t want to work with whoever submitted it.
Most of the posts had zero rankings in the first two pages of search results. Many ranked for obscure words in positions 50–100.
Any post that had even one link from a big trusted source, like being included in a Search Engine Journal blog post recap, was kept live on the site. The original removal pool had nearly 100 posts; weighing links and their potential value, we whittled it down to 84.
WordPress has the flexibility to pull live posts back to drafts. If our blog tanked, we could always move them from drafts back to published.
The big difference between this test and Everett’s is that we completely removed posts from the site rather than adding a noindex, follow directive. This was necessary because duplicate, off-brand, and broken posts needed to come down entirely.
There were some positive indicators that this test would be successful.
Indexation Stats:
In the 20 days prior to this test, the site averaged 2,148 pages crawled per day. In the 20 days after launch, it averaged 2,933 pages crawled per day. A 37% increase is a good indicator that Google is spending more time on the site, crawling fewer pages more often. Great! But that’s nowhere close to calling this a success. Let’s dig a layer deeper.
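If you want to run the same comparison on your own crawl stats, the math is simple. Here’s a quick sketch using the daily averages quoted above; `pct_change` is a hypothetical helper, not part of any Search Console API:

```python
# Sketch: computing the crawl-rate change from two daily-average figures.
# The numbers below are the ones quoted in this post.

def pct_change(before: float, after: float) -> float:
    """Percentage change from `before` to `after`."""
    return (after - before) / before * 100

crawled_before = 2148  # avg pages crawled per day, 20 days pre-test
crawled_after = 2933   # avg pages crawled per day, 20 days post-launch

print(f"{pct_change(crawled_before, crawled_after):.0f}% increase")  # 37% increase
```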
Ranking Results:
According to SEMrush, 242 blog posts increased their total keyword rankings, 137 decreased, and 580 remained unchanged. Keep in mind that 84 of those posts were removed, so the true number of decreases is 53, not 137.
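The adjustment above is just backing the removed posts out of the “decreased” bucket, since a deleted post necessarily loses all of its keywords. A minimal sketch with the counts from this test:

```python
# Sketch of the ranking-change tally above, using the SEMrush counts
# quoted in this post. Removed posts always register as "decreased",
# so we back them out to find organic ranking losses.

increased = 242
decreased = 137
unchanged = 580
removed = 84

true_decreases = decreased - removed
print(true_decreases)  # 53
```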
The biggest increase was Chris Berkely’s SEMrush post, moving from 376 total keywords to 470 (+94 keywords). The next largest increase was 89, then 59, 56, 53, and so on down the line.
The biggest decrease was /blog/own-it-how-to-create-a-google-analytics-account/, dropping from 251 total keywords to 229 (-22). The next largest drop was 16, 15, 14, 10, and so on. Much smaller scale than the increases.
So rankings, overall, trended positive.
Traffic Results:
That’s where this test goes south.
Traffic actually decreased slightly. From a baseline week to week 4 of the test, total organic traffic for the entire site fell 5.66%. Total organic traffic for the blog fell 7.6%.
This wasn’t seasonal; traffic remained nearly unchanged over the same February–May window in previous years.
Was the test successful?
It wasn’t unsuccessful, as we learned a few things:
- Google crawled the site 37% more often for about a full month after the test.
- Rankings increased, but they increased for words that were previously unranked and now sit around position 50. That doesn’t drive traffic, but it shows Google allowed us to be slightly more relevant for more searches.
- The site is more on brand and a better experience for searchers AND Google: 14 fewer identical pages for Google to waste time on, and a better user experience for people visiting blog posts without broken images or jacked-up formatting. This could be the difference for someone doing their due diligence when evaluating Seer.
- Less relevant to this test, but we saw that Yahoo & Bing traffic made up just 2% of our overall organic traffic. My gut said that number should be more like 8%. The site could be missing out on some love from these two, and we should check Bing Webmaster Tools to make sure we aren’t running into any big issues.
Does this mean you shouldn’t prune your blog? Prune away if it makes sense to remove off-brand, duplicate, or poorly formatted posts. Run your own test. This one removed only about 9% of our blog posts – there simply may not have been enough low-quality posts for it to make a true difference.