Machine learning & neural networks: The real future of SEO


I grew up in the good old days of Search Engine Optimization (SEO), when the keyword tag still meant something and you could make it to the top of SERPs by using the same keyword over and over in the website’s title or keyword tag. That was back when exact-match results were the only ones returned for consumers’ searches, and search engines like Google spent 99 percent of their time crawling and indexing instead of cataloging and evaluating the content’s actual quality and relevance. Those days are (thankfully) over.

Those good old days were followed (for good or ill) by years of SEO practitioners focused on chasing the constantly evolving algorithms, which, in my personal opinion, we have been doing all wrong. Most SEOs start out with something like Search Engine Land’s periodic table of SEO ranking factors or similar guides. We use tools like DeepCrawl and Screaming Frog to help chase down broken links, then ask tools like Moz to tell us where to place keywords in the title relative to the overall title length, how our meta descriptions need to stay under X pixels, or how our body copy has to have X outgoing links…

WE SERIOUSLY NEED TO STOP IT!

Although these are all great practices to follow, they are really corrective actions taken after something has already been done wrong, and none of these tactics will give a brand position No. 1 for any high-value keyword. Instead, they should be considered from day one of the website build and implemented during site development, not tacked on as an afterthought once the site is launched.

We recently did a test against 150,000 different SERPs, and based on a simple scoring model, the majority of the top three results didn’t follow even half of the best-practice rules commonly found in ranking factor lists.

In this test, we extracted 83 features from each of the SERPs (page speed, content length, link scoring, content density, social signals and so on) and used different models in an attempt to reverse-engineer the algorithm. Even with 83 features, we did not get any meaningful results; we found that websites at the top of the SERPs were just as poorly optimized as those on page 2.
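
To give a sense of what that kind of test looks like, here is a rough sketch (with placeholder data, not our actual pipeline): build a feature matrix with one row per ranking URL and one column per extracted feature, fit a plain least-squares model against the observed SERP positions, and see how much of the ranking the features actually explain.

```python
# Illustrative sketch only, not the real test pipeline. X and y below are
# random placeholders standing in for extracted SERP features (page speed,
# content length, link scores...) and observed ranking positions.
import numpy as np

rng = np.random.default_rng(0)

n_results, n_features = 5000, 83                        # 83 features, as in the test above
X = rng.normal(size=(n_results, n_features))            # placeholder feature matrix
y = rng.integers(1, 11, size=n_results).astype(float)   # placeholder SERP positions 1-10

# Ordinary least squares: find weights w minimizing ||Xw - y||^2
X_b = np.hstack([X, np.ones((n_results, 1))])           # add an intercept column
w, *_ = np.linalg.lstsq(X_b, y, rcond=None)

# R^2 close to 0 means the features barely explain the ranking at all
pred = X_b @ w
ss_res = np.sum((y - pred) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
print("R^2:", 1 - ss_res / ss_tot)
```

With random placeholder data the R-squared will sit near zero, and the point of the real test was that even with genuine features, the fit was not meaningfully better.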

This clearly shows that while all those tactics are important for many reasons, following them exactly will not, by itself, move your rank from 10 to 1.

So here comes the disappointing part of this article: I also have no idea how to get you a guaranteed position 1 — NOBODY does. But what I can tell you is that there is no simple way to recreate the algorithm, no easy script you can run, no simple linear regressions that can solve for it. We have seen No. 1 rankings that literally did everything wrong, and position 60 rankings that did everything right!

Here’s the good news: About two years ago, we got a look under the hood and learned why it has become so much harder to “manipulate” rankings, and why no matter how large the test or sample, it is impossible to re-create the actual algorithm.

It was November 9, 2015, the day Google publicly released TensorFlow. TensorFlow is a now open-source software library for machine intelligence. It is, in fact, the library that powers much of Google’s technology, like Gmail, Photos, Voice and RankBrain.

TensorFlow was originally released by the Google Brain team as an evolution of “DistBelief,” Google’s internal neural network training framework. On the simplest level, TensorFlow enables the large-scale and parallel manipulation of “Tensors,” multi-dimensional arrays that carry vectorized data.
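
For the curious, here is a tiny example using the public TensorFlow 2 API (nothing to do with Google’s internal ranking systems) that shows what “manipulating tensors” means in practice:

```python
# Minimal illustration of the "tensor" idea described above.
import tensorflow as tf

# A tensor is just a multi-dimensional array of vectorized data.
scores = tf.constant([[0.2, 0.8, 0.5],
                      [0.9, 0.1, 0.4]])        # shape (2, 3)
weights = tf.constant([[0.3], [0.5], [0.2]])   # shape (3, 1)

# Operations like matmul run over whole tensors at once, which is what
# makes large-scale, parallel manipulation practical.
weighted = tf.matmul(scores, weights)          # shape (2, 1)
print(weighted.numpy())
```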

The latest releases of TensorFlow have improved its scalability, adding new APIs and support for deployment onto all types of devices.

TensorFlow & SEO

So what do machine learning and TensorFlow have to do with SEO, algorithms and reaching that coveted No. 1 spot in the SERPs?

As Google’s RankBrain gets smarter at understanding users and their intent, it’s also learning to better understand content and information, and whether that content will provide the right answer, not only to the query, but also to the individual user. With the algorithm now truly understanding query intent on a linguistic level, it can deliver new kinds of results that are correlated and weighted in a way a human brain can’t even begin to predict. This dramatically changes two aspects of SEO: technical SEO and content SEO.

As many have said before, technical SEO in the narrow sense of fixing links, optimizing title tags and ensuring correct markup is no longer a standalone SEO role, meaning no brand should be hiring a practitioner just for that purpose. This nuts-and-bolts work should be done from the beginning of the website build and audited by the web dev team on an ongoing basis.

Instead, the true technical SEOs of the future need to understand more than just HTML and XML; they need to understand how machine learning works, how TensorFlow handles data and weighs inputs, and how to build and train models. There will always be crawling and discovery, but the main focus is now more analytical and truly data-driven, with the SEO practitioner more a mathematician and software developer than web designer.
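
To make that concrete, here is a toy sketch of what “training a model” looks like in TensorFlow/Keras. The feature and label arrays are made up purely for illustration; the point is only the workflow of defining, compiling and fitting a model.

```python
# Hypothetical example: learn a mapping from made-up page features to a
# made-up relevance score. Nothing here reflects Google's real ranking models.
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(42)
page_features = rng.normal(size=(1000, 10)).astype("float32")   # made-up inputs
relevance = rng.uniform(size=(1000, 1)).astype("float32")       # made-up targets

model = tf.keras.Sequential([
    tf.keras.Input(shape=(10,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1)
])
model.compile(optimizer="adam", loss="mse")
model.fit(page_features, relevance, epochs=5, batch_size=32, verbose=0)

print(model.predict(page_features[:3]))
```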

The last few years have seen a convergence of SEO content and content marketing. We know we must now create contextually relevant content that is authoritative, not just keyword-stuffed. Now it’s time to look at more than minimum/maximum character counts and keyword density. We have to start using machine learning models and linguistic analysis to weigh and score our content to ensure it truly answers the consumer question, instead of just telling a brand story.
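
As a very rough illustration of scoring content against a consumer question, here is a simple bag-of-words cosine similarity. It is far cruder than the linguistic models this paragraph has in mind, but it shows the shift from counting characters to scoring how well content matches a query.

```python
# Rough sketch: score how well a piece of content overlaps with a query,
# using plain bag-of-words cosine similarity (no real linguistic analysis).
import math
from collections import Counter

def cosine_score(query: str, content: str) -> float:
    q, c = Counter(query.lower().split()), Counter(content.lower().split())
    shared = set(q) & set(c)
    dot = sum(q[t] * c[t] for t in shared)
    norm = (math.sqrt(sum(v * v for v in q.values()))
            * math.sqrt(sum(v * v for v in c.values())))
    return dot / norm if norm else 0.0

print(cosine_score("how do I fix a slow website",
                   "Steps to diagnose and fix a slow website: measure page speed first"))
```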

Personally, I am excited about the new frontiers of SEO and the evolution of the field, and I invite anyone out there who’s in doubt to just look at the growth of voice and conversational search. It’s all powered by machine learning and technologies like TensorFlow. The time is now.


Opinions expressed in this article are those of the guest author and not necessarily those of MarTech Today.


About The Author

Benjamin Spiegel is the CEO of MMI Agency, a Houston-based brand activation agency that has been serving Fortune 500 clients since 1986. A digital advertising veteran with extensive experience in advertising, media, data, and technology, Benjamin has developed highly successful marketing campaigns for numerous global brands. Prior to joining MMI Agency, he was the VP of Innovation at Catalyst/GroupM, a WPP agency, where he managed the P&G business.


