2014 SEO Playbook: On-Page Factors


Welcome to part 2 of my annual SEO playbook. (Click here for part 1.) I have to thank Danny Sullivan and the Search Engine Land team for giving me the perfect outline for the 2014 playbook, the Periodic Table of SEO Success Factors.

Part 2 will cover on-page factors, including content, HTML and architecture. You’ll find more than enough food for thought and some very actionable steps. This is not a step-by-step SEO guide, and it’s pretty informal. Before embarking on a search engine optimization campaign, do your research or consult with an expert.

[Image: The Periodic Table of SEO Success Factors]

Content: Quality

Quality was a big discussion item during 2013, especially around topics like word count and deep content.

After Panda, you’d think we would be well past the age of producing short “fluff” articles. However, too many websites, especially business sites that struggle to post fresh content, continue the practice. Recently, I saw a corporate blog post listing 10 must-read books on a topic — the article consisted of 10 thumbnail images and the names of the books, linked to an online bookstore. You can’t afford to keep putting out cut-rate articles like that; in bulk, they are perfect Panda-penalty bait.

On the opposite end is deep content — pages or articles of around 1,500 words or more. Websites have seen success with this content, so it may make sense to take the time spent creating lots of short, “fluffy” posts and use it instead to produce a few longer, more meaningful articles. Whatever you do, make sure content is well written, with attention to grammar and spelling. Don’t just say something; back it up with thoughtful opinion or researched facts. Put some meat on the bones. Personally, when it comes to article content, if I cannot easily pass 450 words, I will combine it with other content or deem it not worth writing about.

As for e-commerce descriptions, I used to deem 250 words as the sweet spot. Nowadays, I am less concerned about word count and more focused on creating a great list, matching features with benefits.

Content: Keywords

Keyword research is not going anywhere and is still the foundation of all on-site SEO. The difference is, after the Hummingbird update, we are discussing the role of entities, where topics take the place of keywords in the result pages. Google has made great strides in synonym identification and concept grouping — some have even called it the death of the long-tail keyword. (But, as with all the supposed death knells in our field, this, too, is probably an exaggeration.)

My advice is to make sure each page stands on its own as a topic. Do not create multiple pages about the same exact thing in order to optimize for different keywords. Instead, stick to single, well-written, citation-worthy topic pages and optimize them for multiple keywords. This can be another good reason to use long-form content.

Content: Engagement

Engagement is about whether visitors spend time reading your content or bounce quickly away. Once again, meaningful content is key. It’s amazing how it all comes back to quality. Are you publishing something your audience or target personas will want to read, or are you just filling holes in an editorial calendar — or perhaps publishing out of guilt because you have not published anything recently?

Engagement isn’t just limited to text content, either; Web page design is equally important. Words don’t just have to read well to be engaging; they have to look good. Readability includes everything from page layout to font selection to letter and line spacing. Additionally, pay attention to navigation and the presentation of links to other content, as these elements can have a huge impact on bounce rates and other visitor engagement metrics such as time on page and time on site.

Content: Ads

Another part of layout is the placement of ads. Search engines will not ding you for having advertisements. That would be hypocritical. What they will penalize is too many ads or inappropriate ad placements.

I do not foresee big changes in this area beyond the enhancement of current search engine policies. In addition to display ads, be especially wary of text link ads. Make certain they are appropriate to the surrounding content, and that you nofollow them. If you still use automated in-text phrase link advertising, I strongly suggest you consider removing it. If you use interstitial or pop-up advertising, make sure it doesn’t interfere with the ability of search engines to crawl your pages.
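
For reference, marking a paid text link is a one-attribute change; the advertiser URL and anchor text below are placeholders:

<!-- A paid text link, nofollowed so it passes no PageRank -->
<a href="http://advertiser.example.com/widgets" rel="nofollow">Discount widgets</a>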

Content: Freshness

I am a big proponent of fresh content — this includes not just posting about hot topics, but also ensuring that you are publishing new content on a regular or frequent basis. Not only is new content important to attract readership, it also improves crawl frequency and depth. Earlier, I wrote that you should not create content just to check off your editorial calendar. Not to backtrack, but if you do not have an editorial calendar in place, you probably should create one and get to work creating content.

Think of your content as a tool to generate awareness and trust. This means you must get beyond writing about just your company and its products or services. Go broader and become a resource — a real, viable, honest-to-goodness resource — for your target market and the people or companies that your target market serves.

Taking this broad approach will give you more to write about, allowing you to focus on topics of interest to your target market. This is the kind of content you can build an audience with. In my opinion, if you are not trying to build an audience at the top of the marketing funnel, you are probably doing it wrong. Obviously, there are exceptions, though I think far more companies fail here than are genuinely exempt from worrying about it.

HTML: Titles & Headers

Title tags are interesting right now. The usual rules for writing optimized title tags and headers have not changed. I do foresee search engines (Google especially) rewriting more title tags algorithmically. If you see Google rewriting your title tags, test changing your HTML to the same text Google presents in the SERPs. By test, I mean change a judicious few, then observe what happens to performance indicators. If you see improvement, a broader title tag optimization program could prove worthwhile.

Going back to entity search and optimizing for multiple keywords… when you are doing topic optimization, you must be cognizant of which keywords you use in the title and H1 tags. I wish I could give you a surefire formula, but one does not exist. As you look at synonyms, pay attention to which words or phrases received the most exact match searches and trust your intuition when it comes to popular language use.
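
As a purely hypothetical illustration (the store and phrases are invented), a topic page optimized for several related keywords might lead the title with the phrasing that gets the most exact-match searches and work a synonym into the header:

<title>Running Shoes for Flat Feet | Example Running Store</title>
<h1>Choosing Running Shoes for Flat Feet (Overpronation)</h1>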

HTML: Description

I don’t see anything changing with meta description tag optimization. Write unique descriptions for every page. They will not change your rankings, but well-written descriptions can increase click-through rates.

I always pay attention to length, around 150 characters. In reality, the actual length depends on the combined pixel width of all characters, but from a practical standpoint just make sure your descriptions are not getting cut off when they appear in the results.
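
As a made-up example (the copy is invented), a description in that range fits comfortably in one tag and reads as a complete thought:

<meta name="description" content="Compare stability running shoes for flat feet, with fit tips, price ranges and reviews from runners who overpronate.">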

For pages that appear as sitelinks, be sure that the portion of the description shown beneath each link forms a coherent thought. This is a place where many enterprise sites and brands can improve.

HTML: Structured Data Markup

Structured data markup seems to be a big topic every year.

First is the question of whether or not you should use it for organic search engine optimization. Some long-time experts do not like structured markup or machine-readable language because they do not want to help search engines present information directly in the results, where it generates no visits.

For example, if you type in the name of your favorite NFL team, Google will show you information about that team, including their next scheduled game, right on the SERP. Here’s an example I fondly remember: someone once asked, if you ran a zoo website, would you want Google to show your business hours at the top of the search results, or do you want people to visit the website, where they will learn more about current exhibits and events? This is a fair question — to which I think the fair answer is, whatever will get the most bodies through the door.

Google, Bing and Yahoo are going to show the data they want and in the format they desire regardless of how you or I feel. Personally, I’d much rather be a trusted source, even if it means my website information is made available in the SERPs. For this reason, I am a huge proponent of structured data markup like schema.org and RDFa.
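
To make that concrete, here is a minimal, hypothetical schema.org microdata snippet (the zoo, URL and hours are invented) of the kind that lets a search engine surface opening hours right in the results:

<div itemscope itemtype="http://schema.org/Zoo">
  <h1 itemprop="name">Example City Zoo</h1>
  <p>Open <time itemprop="openingHours" datetime="Mo-Su 09:00-17:00">daily, 9am to 5pm</time></p>
  <a itemprop="url" href="http://www.example-city-zoo.org/visit/">Plan your visit</a>
</div>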

Other forms of structured markup, like the author and publisher tags, are not controversial and have entered the realm of best practices. Use them.
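
For reference, the authorship and publisher annotations are single link tags in the head, each pointing to a Google+ profile or page; the URLs below are placeholders:

<!-- Author markup: points to the writer's Google+ profile (placeholder) -->
<link rel="author" href="https://plus.google.com/110000000000000000000">
<!-- Publisher markup: points to the brand's Google+ page (placeholder) -->
<link rel="publisher" href="https://plus.google.com/+ExampleBrand">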

HTML: Keyword Stuffing & Hidden Elements

Negative ranking factors like keyword stuffing and hidden text are so old that many of us practitioners brush them off as search engine optimization 101. Unfortunately, nothing is ever that easy.

Stuffing is definitely a factor in e-commerce shopping cart optimization. It can be tricky not to repeat the same word or phrase over and over when it appears in category names and product descriptions. Different shopping carts offer different levels of control, and some are more easily optimized than others. On category pages, it may be as simple as limiting the number of products you display on each page. Without going into an entire lesson on shopping cart optimization, what I will tell you is this: if you have not done a shopping cart review in the last two years, it is time. Make certain your e-commerce platform is keeping up.

It still surprises me how often I see unintentional cloaking. Usually, it’s a result of the template writer getting around a quirk of the content management system. But I have also seen static links in a template that are cloaked using display: none on some pages while they appear on others, depending on something such as the category. The bottom line is this: if it appears on the page, it should be in the HTML. If it does not appear on the page, it should not appear in the HTML.
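
A simplified, hypothetical sketch of that pattern (the class names and URLs are invented): the template prints the link on every page, then CSS hides it on certain categories, leaving a cloaked link in the HTML. The safer fix is to omit the block from the HTML on pages where it should not appear.

<!-- Shared template: this block is output on every category page... -->
<ul class="related-promos">
  <li><a href="/clearance/">Clearance items</a></li>
</ul>

<style>
  /* ...but hidden on some categories, which cloaks the link */
  .category-books .related-promos { display: none; }
</style>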

Architecture: Crawl

Not enough search engine optimizers pay attention to crawl. I realize this is a pretty broad statement, but too many of us get so caught up in everything else that crawl becomes one of the first things we ignore unless there are red, flashing error messages. Obviously, you want to make sure that search engines can crawl your website and all your pages (at least the ones you want crawled). Keep in mind that if you do not want to disrupt the flow of PageRank through your site, exclude pages with a meta robots noindex, follow tag rather than with robots.txt.
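
For reference, the exclusion method mentioned above is a single meta tag in the head of the page you want kept out of the index:

<!-- Keeps the page out of the index while letting its links be crawled and pass PageRank,
     unlike a robots.txt Disallow, which blocks crawling altogether -->
<meta name="robots" content="noindex, follow">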

The other concern you should have is whether or not search engines crawl and capture updates to existing pages in a timely manner. If they don’t, it could be an overall domain authority issue, or it could be that PageRank is not flowing deep enough in sufficient quantity.

There are tricks to resolve this, such as linking to updated pages from your homepage or a level-one page until the updated deep page gets recrawled. The more wholesome approach is to make sure that content which gets updated sits naturally close to sections with higher authority, or to build legitimate internal links from related content that has its own off-site PageRank.

I am not telling you all your content should be crawled all the time. Search engines budget crawl frequency and depth for good reasons. What I am saying is manage your website crawl budget and use it well; don’t just leave everything up to chance.

Architecture: Duplicate Content

Earlier this year, Matt Cutts stunned the search engine optimization community by telling us not to worry about duplicate content. He assured us that Google will recognize the duplicates, combine the dispersed authority, and present one URL in the SERPs.

This is really not a big surprise, as Google has been working toward this for quite some time. Webmaster Tools has offered automated parameter identification for a while, and Google spokespeople have discussed duplicate content consolidation on many occasions.

To repeat what I have written before, Google is not the only search engine out there and reality does not always work the way Google says it does. The bottom line is: keep managing your duplicate content by preventing or eliminating as much as possible, and as for the rest, put your canonical tags in place.
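
For reference, the canonical tag is one line in the head of each duplicate URL, pointing at the version you want indexed; the example.com URLs are placeholders:

<!-- On the duplicate, e.g. http://www.example.com/widgets?utm_source=newsletter -->
<link rel="canonical" href="http://www.example.com/widgets">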

Speaking of canonical tags, I know a popular hack has been to use one canonical URL, improperly, on all the pages of multipage articles. There are other canonical hacks out there, as well. I’d be wary of these. If you’re using canonical tags, machine-readable content or advanced meta-tags, you’re basically waving a big red flag telling search engines that your website is technically savvy and using search engine optimization. In other words, you’re begging for additional scrutiny.

It would not surprise me if Google becomes fiercer in penalizing websites for this type of technical misdirection. Search engines tend to use a soft touch when levying penalties algorithmically, for fear they will burn innocent websites. But as we have seen with Panda and Penguin, as they become more confident, they also become more aggressive. If you are optimizing for an employer, keep it clean.

Architecture: Speed

Most websites are not going to see an SEO benefit from making their pages faster. Google has always said only a small fraction of sites are affected by this part of the ranking algorithm, a view that seems to be borne out by correlation studies. Honestly, the best test of speed is to take your laptop to the local café and surf around your website. If you are not waiting for pages to load, you are probably okay.

The exceptions (sites that should be concerned about speed) are large enterprise and e-commerce websites. If you optimize for one of these, shaving a few milliseconds from load time may lower bounce rates and increase conversions or sales.

Architecture: URLs

The current best practices for URLs should hold true throughout 2014. Simple and easily readable URLs are not just about search engine optimization. With today’s multi-tabbed browsers, people are more likely to see your URLs than they are your title tags.

I will also add that, when seen in the search engine results pages, readable URLs are more likely to get clicked on than nonsensical ones. If your content management system cannot create readable URLs based on your title tags, or will not let you customize URLs, it is probably time for a CMS review. This is now a basic search engine optimization feature, so if your CMS cannot handle it, I would question the rest of its SEO capabilities.

Architecture: Mobile

2013 was an interesting year for mobile SEO. Google and Bing agree that the ideal configuration is for websites to have a single set of URLs for all devices and to use responsive Web design to present them accordingly. In reality, not all content management systems can handle this, and Web designers have presented case studies of situations where the search engine standard is neither practical nor desirable.

If you can execute what Google and Bing recommend, do so. However, if you cannot or have a good reason not to, be sure to use canonical tags that point to the most complete version of each page, probably your desktop version, and employ redirects based on the visitor's device or screen size.
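
If you do run separate mobile URLs, the bidirectional annotations Google has documented for that configuration look roughly like this (the example.com and m.example.com URLs are placeholders):

<!-- On the desktop page, http://www.example.com/page -->
<link rel="alternate" media="only screen and (max-width: 640px)" href="http://m.example.com/page">

<!-- On the mobile page, http://m.example.com/page -->
<link rel="canonical" href="http://www.example.com/page">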

You will not risk a penalty from the search engines as long as your website treats all visitors equally and doesn’t make exceptions for search engine spiders. Basically, this is similar to automatically redirecting visitors based on their geographic location or language preference.

That about wraps it up for on-page SEO factors in 2014. Be on the lookout for Part 3 of my 2014 SEO Playbook, which will cover off-page SEO factors relating to link building, local search and social media.


Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land.


About The Author

Thomas Schmitz is a longtime digital marketing professional who works with startups, SMBs, enterprise, media and not-for-profit organizations. Regarded as an expert in inbound and content marketing, search engine optimization and social media, Tom’s an innovative growth creator and turnaround specialist. Follow Tom at @TomSchmitz.


