6 Ways Web Developers Can Improve SEO


The relationship between web developers and search-engine-optimization teams is sometimes contentious. Seemingly unrelated technical decisions by developers can impact organic search traffic.

But there are enormous SEO benefits in working with developers and plugging into their release-planning and testing cycles. In 13 years, I’ve encountered just one developer who refused to consider SEO recommendations.

When SEO practitioners sit down with developers to discuss opportunities to drive revenue for the site, amazing things can happen. In this post, I’ll list six of them.

How Web Developers Can Improve SEO

SEO self-sufficiency. Identify ways the SEO team can become self-sufficient, freeing developers from mundane work. Can you edit canonical tags yourself via a bulk upload instead of asking developers to do it? Can you 301 redirect individual pages as needed when promotions expire?

Look for small tasks that take developers away from more strategic work. Ask if a tool could be built to enable someone on your team to do the task instead. Implementation will likely be faster, too, since you won’t have to wait for a release cycle.
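To make the bulk-upload idea concrete, here is a minimal TypeScript (Node.js) sketch: a two-column CSV of page paths and canonical URLs feeds a lookup that the page template reads at render time. The file format and function names are illustrative assumptions, not any particular platform’s API.

```typescript
import { readFileSync } from "node:fs";

// Parse a two-column CSV of "page path,canonical URL" into a lookup map.
// The SEO team maintains the CSV; no developer or release cycle involved.
function loadCanonicalMap(csvPath: string): Map<string, string> {
  const map = new Map<string, string>();
  for (const line of readFileSync(csvPath, "utf8").split("\n")) {
    const [path, canonical] = line.split(",").map((s) => s.trim());
    if (path && canonical) map.set(path, canonical);
  }
  return map;
}

// At render time, look up the current page's path and emit the tag.
function canonicalTag(map: Map<string, string>, pagePath: string): string {
  const url = map.get(pagePath);
  return url ? `<link rel="canonical" href="${url}">` : "";
}
```

Once a tool like this exists, updating canonical tags becomes an upload, not a ticket.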

Binding JavaScript. Search engines are increasingly adept at crawling JavaScript. Google claims that it can crawl anything you throw at it, though it still avoids URL fragments (hashtags) and form fields.

Still, if you want to ensure that search engines can crawl your site and associate signals of relevance and authority with each page, ask JavaScript developers to bind the destination URL to the anchor text. This produces a link that behaves much like plain HTML, sending all of the desired SEO signals.
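A minimal sketch of what that binding looks like in the browser, assuming a hypothetical client-side navigation handler. The key point is that the URL lives in the href attribute and the keywords in the anchor text, so a crawler that never runs the click handler still sees a plain HTML link.

```typescript
// A crawlable, SEO-friendly link: the destination URL is bound to the
// href attribute and the keywords to the anchor text, so bots can follow
// it even if they never execute the click handler.
function buildProductLink(url: string, anchorText: string): HTMLAnchorElement {
  const link = document.createElement("a");
  link.href = url;               // crawlable destination
  link.textContent = anchorText; // relevance signal

  // Enhancement for real users; crawlers still get a plain HTML link.
  link.addEventListener("click", (event) => {
    event.preventDefault();
    history.pushState({}, "", url); // hypothetical client-side navigation
    // ...render the new view here...
  });
  return link;
}
```

Compare that with a div carrying only an onclick handler: the user experience is identical, but the crawlable URL and anchor text are gone.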

301 redirects. Of critical importance when migrating content or redesigning a site, 301 redirects are also necessary for everyday changes. For example, renaming a category from singular to plural likely changes the keyword in its URL as well.

Unfortunately, 301 redirects are a pain for developers to write and test. Look for an auto-redirect solution for small changes like this, so that every changed URL instantly redirects to its new location. You won’t have to remember to request the redirect, developers won’t have to implement it, and your natural search performance will be protected automatically.
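One hedged sketch of such an auto-redirect layer, using Express-style middleware. In practice, the lookup table would be written automatically whenever a URL changes, rather than hand-coded as it is here.

```typescript
import express from "express";

const app = express();

// Map of retired URLs to their current equivalents. In a real system this
// would be populated automatically whenever a category or product URL changes.
const redirects = new Map<string, string>([
  ["/category/widget", "/category/widgets"], // singular renamed to plural
]);

// Check every request against the map before normal routing.
app.use((req, res, next) => {
  const target = redirects.get(req.path);
  if (target) {
    res.redirect(301, target); // permanent redirect preserves SEO signals
  } else {
    next();
  }
});

app.listen(3000);
```

Because the middleware runs before normal routing, every retired URL keeps working the moment the map is updated.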

Crawl errors. Errors happen on every site. How many errors are “normal” depends on the type of error and the size of the site. For example, if Amazon had 1,000 404 file-not-found errors, it would be a drop in the bucket given the enormous size of that site. But for a small ecommerce site with 100 products, even 20 errors would be a major problem.

Excessive errors drive down search engines’ algorithmic perception of site quality. If your site has a high percentage of errors relative to its size, the odds increase that a searcher will land on an error page from the search results. That makes both the search engine and your site look bad, and search engines don’t want to take that risk. Thus the more errors on your site, the less likely search engines are to rank it well.

Site quality is an understandable sore spot for developers. They want a clean site, too. Reports of errors can make them defensive. Avoid subjective statements, such as “The site has a lot of errors,” and focus on the specifics. Come to the table with data from Google Search Console or your web crawler showing which pages are throwing errors, so the conversation moves quickly to solutions.
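Even a small script can turn “the site has errors” into a concrete list. A sketch, assuming you can export a list of URLs from your sitemap or crawler; the example URL is hypothetical.

```typescript
// Check a list of URLs and report anything that doesn't return a success
// status, so the conversation with developers starts from concrete data.
async function reportErrors(urls: string[]): Promise<void> {
  for (const url of urls) {
    try {
      const res = await fetch(url, { method: "HEAD" });
      if (res.status >= 400) console.log(`${res.status}  ${url}`);
    } catch (err) {
      console.log(`FAILED ${url}: ${(err as Error).message}`);
    }
  }
}

// Hypothetical usage with URLs pulled from a sitemap export.
reportErrors(["https://www.example.com/products/widget-a"]);
```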

Duplicate content. Search engine bots have a crawl budget: they spend a limited amount of time on your site. If the site is full of duplicate content, bots waste time crawling redundant URLs, search engines have a hard time identifying which version should rank, and new content is discovered more slowly.

There are many ways to resolve duplicate content. Each method comes with risks and benefits. Choosing a solution that is good for SEO and easily implemented by your development team requires discussion.

An added benefit to that discussion is that your developers will likely be able to tell you what’s causing the duplicate content in the first place. You may be able to address the problem at its source.
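Tracking parameters are one frequent source, for example: they spawn many URLs for a single page. Here is a hedged sketch of normalizing them away before URLs are used in internal links or sitemaps; the parameter list is an assumption about one hypothetical site.

```typescript
// Strip parameters that don't change page content, so internal links,
// sitemaps, and canonical tags all agree on a single URL per page.
const IGNORED_PARAMS = ["utm_source", "utm_medium", "utm_campaign", "sessionid"];

function normalizeUrl(raw: string): string {
  const url = new URL(raw);
  for (const param of IGNORED_PARAMS) url.searchParams.delete(param);
  url.searchParams.sort(); // consistent ordering avoids param-order duplicates
  return url.toString();
}

// "https://example.com/shoes?utm_source=mail&color=red" and
// "https://example.com/shoes?color=red&utm_source=feed" both normalize
// to "https://example.com/shoes?color=red".
```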

Speed and security. Google values site speed and security. Both are part of the ranking algorithm. And SEO practitioners must rely on development teams to improve these areas. Remember, speed and security are customer experience issues, too. Your development team could already have them on the roadmap.

In the case of site speed, developers likely have people harping on them companywide. Adding your voice to the mix isn’t likely to produce instant results. Instead, ask about their plans to improve the situation.

Also, read up on what it takes to implement these changes (they’re not easy fixes) so that you can discuss them knowledgeably. For example, contributor Hamlet Batista wrote an excellent article on moving to HTTPS. And all sorts of tools, such as Google’s PageSpeed Insights, can recommend how to improve load times.
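PageSpeed Insights is also queryable programmatically, which helps you track improvements over time. A minimal sketch against the v5 API follows; verify the exact response shape against Google’s current documentation before relying on it.

```typescript
// Query Google's PageSpeed Insights API for a performance score.
// Endpoint and response fields are per the v5 API; check current docs.
async function pageSpeedScore(pageUrl: string): Promise<number> {
  const endpoint =
    "https://www.googleapis.com/pagespeedonline/v5/runPagespeed?url=" +
    encodeURIComponent(pageUrl);
  const res = await fetch(endpoint);
  const data = await res.json();
  // Lighthouse reports performance as a 0-1 score; scale to 0-100.
  return data.lighthouseResult.categories.performance.score * 100;
}

pageSpeedScore("https://www.example.com/").then((score) =>
  console.log(`Performance score: ${score}`)
);
```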


