Several years ago, I was evaluating Content Management Systems for a local tech company. As you surely already know, the Web CMS market is highly fragmented. There are literally hundreds of options available.
After a couple of weeks of research, it became apparent to me that no two Web CMSs are the same.
Many of them come with key features built in. Others are extensible via plugins. Some are locked down by design and are only a fit if your specific needs can be met by the features available out of the box.
Aside from the table-stakes functionality, such as WYSIWYG editing and the ability to manage navigation without touching code, the next most important set of features relates to Search Engine Optimization (SEO).
SEO capabilities played a rather large role in my decision-making process back then. Let’s look at the types of features you want to have available in whatever CMS you select for your own website.
On-Page Optimization
When we talk about SEO, most marketers and web professionals will immediately recognize the importance of on-page optimization. Before launching into an Off-Page campaign or even starting to share content on social networks, we all need to ensure that the topic, content and keyword targeting are properly structured on the site.
Any Web CMS worth its salt will include at least the following basic SEO features.
SEO-Friendly URLs
Whether via inherent functionality or a plugin, it is important to control the URL structure for every static page on your website. This can be done manually or automatically.
For manual control, you get to choose the exact words to use within the URL. If you know what you are doing, this provides the most flexible and proactive solution to SEO-friendly URLs.
There are a variety of options for automatically controlling URLs, depending on the Web CMS you select. Joomla, for example, has a feature where you can select to use “SEO Friendly URLs” built right into the core application. For blogs, WordPress offers Permalink control, and one of many plugins can be added to optimize page URLs and even to remove slugs from the URL strings.
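To make the difference concrete, here is a hypothetical before-and-after for the same page (the domain and parameters are invented for illustration):

```text
# Default, non-friendly URL generated by the CMS
http://www.example.com/index.php?option=com_content&view=article&id=72

# SEO-friendly URL with the target keyword in the path
http://www.example.com/blue-widget-reviews/
```

The second version gives both searchers and search engines an immediate hint about the topic of the page.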
If your chosen CMS does not offer SEO-Friendly URLs or complete URL control, you have two options: either move to a different one or have a developer build it for you.
Control over Page Titles and Meta Descriptions
It amazes me to find content management systems available today that do not automatically offer you control over the Page Titles and Meta Descriptions. These two fields play a huge role in a) helping search engines decide what keywords are relevant to the content and b) enhancing click-thru rate within the SERPs (Search Engine Results Pages).
When evaluating a CMS, always investigate whether this is available on your preferred system. If not, look for a plugin or look elsewhere.
For the Page Title, it is not good enough to have the post or page name simply show up as the Page Title. You need the H1 to auto-populate with the page name while the Page Title says whatever you need it to say to rank better.
Think of the meta description as your PPC ad text, but showing up for free in the organic listings. It should include the main target keyword for that page, but it should also tell searchers why to click your listing rather than the other nine organic results, the 8-11 PPC ads, or the countless other Google links on the page. With so much noise, this is your chance to stand out. Don’t just let the CMS post an excerpt from the page.
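Put together, a well-optimized page might render markup along the lines of the following sketch (the keyword and copy are invented for illustration):

```html
<head>
  <!-- Page Title: written to rank and to earn the click in the SERPs -->
  <title>Blue Widget Reviews: The Top 10 Models Compared</title>
  <!-- Meta description: your free "ad text" in the organic listings -->
  <meta name="description" content="We tested the 10 best blue widgets
    on the market. See which models survived our 30-day stress test." />
</head>
<body>
  <!-- H1 auto-populated from the page name by the CMS -->
  <h1>Blue Widget Reviews</h1>
</body>
```

Note that the Page Title and the H1 differ: the H1 matches the page name, while the title tag is crafted for ranking and click-through.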
Crawling Features
While the on-page materials tell search engines what to rank you for, it is paramount that you make the right pages available to search engines in the first place. You can have a good amount of influence over how, when and by whom your website is crawled. Although all of the following features can be managed elsewhere, it is handy to have them available right from the UI of your CMS.
Automated XML Sitemap Generation
The XML sitemap is the equivalent of an index at the back of a college textbook. It tells search engines what pages are on the site, how important they are relative to each other, how often each page changes, and when the most recent update occurred.
This is the single most important way to tell search engine crawlers where to focus their attention. Every site should have an XML sitemap that is automatically submitted to the Webmaster Tools of the leading search engines.
Unfortunately, very few Content Management Systems come standard with automated XML Sitemap capabilities. You can typically integrate this functionality rather easily by way of a plugin. For systems that have no such module, you will either need to have one built or manage the XML sitemap manually.
For sites with a lot of content or pages, I highly discourage taking on manual XML sitemap management. Having done it myself, I can tell you that the task is arduous and highly prone to error. If you can automate, always do so.
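For reference, a minimal sitemap entry follows the sitemaps.org protocol and looks like this (the URL and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/blue-widget-reviews/</loc>
    <lastmod>2013-01-15</lastmod>      <!-- when the page last changed -->
    <changefreq>monthly</changefreq>   <!-- how often it tends to change -->
    <priority>0.8</priority>           <!-- importance relative to other pages -->
  </url>
</urlset>
```

Multiply this by hundreds of pages and it becomes obvious why hand-maintaining the file is so error-prone.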
Access to robots.txt File
The robots.txt file also plays a key role in guiding the crawl process. As opposed to the XML sitemap, which tells crawlers where they should explore, the robots.txt file tells them where to avoid crawling. This is much more powerful than a “noindex” directive on one page, because you can block entire directories from the spiders via the robots.txt.
Many of you may now be asking: “Why would I want to discourage search engines from crawling any part of my website?” Great question!
In many cases, there are files on your server or host that need to be accessed by pages on the site but should not rank as standalone results. For most CMSs, we prefer to block the admin areas so that the crawlers spend their time exploring pages of content that we actually want to rank.
If you are very tech savvy and have control panel or FTP access to the root directory on your hosting server, you can edit the robots.txt file in Notepad or a similar text editor. Most users, however, will only have access to the CMS itself. By exposing the robots.txt file within the CMS, you can do some basic remediation of crawling issues without having to wait in a queue for your web platform or IT group to make an update.
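A typical robots.txt for a CMS-driven site might look like the following sketch (directory names vary by platform; these mirror the familiar WordPress defaults):

```text
# Allow all crawlers, but keep them out of the admin and system areas
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/

# Point crawlers at the XML sitemap
Sitemap: http://www.example.com/sitemap.xml
```

Two lines of Disallow rules block entire directories, which is exactly the kind of broad control a per-page “noindex” tag cannot give you.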
301 Redirect Management
You may not realize it, but 301 redirects are a key piece of SEO management. Pages attract links. If those pages go away, the links lead to 404 errors. We all know that links are very important for domain authority, so you need to preserve them! We do this by permanently redirecting visitors from the old URL to a new page that is relevant to the subject of the old one.
301 redirects are typically managed using the .htaccess file, which also resides in the root directory for your website. The rub is that errors in your .htaccess file can cause major issues, such as making the entire website unavailable and showing a 500 error instead!
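On an Apache server, a single 301 redirect in .htaccess can be as simple as the line below (the paths are hypothetical; test carefully, since one typo in this file can take the whole site down with a 500 error):

```apacheconf
# Permanently redirect a retired page to its closest replacement
Redirect 301 /old-widget-page.html http://www.example.com/blue-widget-reviews/
```

This tells both browsers and search engines that the move is permanent, so the old page’s link equity is passed along to the new URL.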
Some web hosts offer basic 301 redirect capabilities via the control panel for your hosting plan. This is a great way to manage 301s without touching .htaccess, but it is not an option if you don’t have access to the hosting control panel.
To make this smoother, some content management systems such as Drupal offer the ability to manage 301 redirects right in the CMS dashboard. While not a mandatory feature, I recommend you opt for a CMS that offers this capability if your two options are otherwise on even ground.
Summary
With so many Web CMS options on the market today, it is easy to get overwhelmed when evaluating the choices. Include SEO features in your evaluation process. This will not only help reduce the options (and help keep you from getting overwhelmed), but it will also set you up for organic success once you migrate your website to the new platform.
Editor’s Note: Want more of Tommy’s SEO insights? Read Search Engine Optimization Trends to Watch in 2013
Tommy Landry has over 20 years of marketing and general business experience, with a heavy focus on online marketing. Operating out of Austin, TX, he is founder and President of Return On Now, which provides forward-looking Social SEO and Online Demand Generation consulting to companies of all sizes.