Disclaimer: All criticism of Google spokespeople contained herein is impersonal in nature. I know they are only representing the internal direction of the company and not acting independently. They do strive to be as helpful as they can.
When former head of web spam Matt Cutts was at Google, he spent a lot of time communicating with webmasters/site owners about updates. We knew what was coming, when it might be coming, and how severe it would possibly be.
If you woke up in the morning and your traffic had fallen off a proverbial cliff, you could go to Twitter and, based on what Cutts was posting, usually determine if Google had run an update. You could even tell how severe the rollout was, as Cutts would typically give you the percentage of queries affected.
Although some believe Cutts was more about misinformation than information, when it came to updates, most would agree he was on point.
So if a site fell off that cliff, you could learn from Cutts what happened, what the update was named, and what it affected. This gave you starting points for what to review so that you could fix the site and bring it back into line with Google’s guidelines.
Why the help?
Cutts seemed to understand that Google needs webmasters. After all, Google Search itself is not really the product; the sites it returns from that search are the product.
Without someone translating Google's guidelines to site owners, those sites would likely not meet them very well, and that would mean a poor experience for Google users. So that transfer of knowledge among Google, SEOs and site owners was important. Without it, Google would be hard-pressed to find a plethora of sites that meet its needs.
Then, things changed. Matt Cutts left to go to the US Digital Service — and with his departure, that type of communication from Google ended, for the most part.
While Google will still let webmasters know about really big changes, like the mobile-first index, it has stopped communicating much detail about smaller updates, and what communication remains is not in the easily consumable format of Cutts tweeting update metrics.
In fact, very little is said today about smaller updates. It has gotten to the point where Google has stopped naming all but a very few of these changes.
Google communication in 2017
Right now, the Google spokespeople who primarily communicate with SEOs/webmasters are Gary Illyes and John Mueller. This is not a critique of them, as they communicate in the way Google has asked them to communicate.
Indeed, they have been very helpful over the past few years. Mueller holds Webmaster Central Office Hours Hangouts to help answer questions in long form. Illyes answers similar questions in short form on Twitter and attends conferences, where he participates in various AMA (Ask Me Anything) sessions with interviewers.
All this is helpful and appreciated… but unfortunately, it is not the same.
Highly specific information is difficult to find, and questioners are often met with more vagueness than specifics, which can at times feel frustrating. Google has become opaque in how it communicates with digital marketers, and that opacity seems to be directed by internal company processes and policies.
This lack of algorithmic specificity and update confirmation is how we wound up with Phantom.
Welcome, Phantom
Google has many algorithms, as any SEO knows. Some, like Penguin and Panda, have been rolled into Google’s core algorithm and run in (quasi-) real time, while others, like the interstitial penalty, still run, well, when they run.
Big updates such as Penguin have always been set apart from the day-to-day changes at Google. Google makes potentially thousands of tweaks to its core algorithms every year, often rolling out multiple changes in a single day.
However, day-to-day changes affect sites much differently than massive algorithm updates like Panda, Penguin, Pigeon, Pirate, Layout, Mobilegeddon, Interstitial, and on and on. One is a quiet rain, the other a typhoon. One is rarely noticed, the other can be highly destructive.
Now, Google is correct that webmasters don't need to know about these day-to-day changes unless someone dials an algorithm up or down too much; you might not ever even notice them. However, there are other algorithm updates that cause enough disruption in rankings for webmasters to wonder, “Hey Google, what happened?”
This was true for an algorithm update that became known as Phantom.
Phantom?
There was a mysterious update in 2013 that SEO expert Glenn Gabe named “Phantom.” While it seemed to be focused on quality, it was not related to Panda or Penguin. This was new, and it affected a large number of sites.
When “Phantom” ran, it was not a minor tweak. Sites, and the tools that monitor them, showed large-scale ranking changes of the kind that only seems to happen when a major algorithm update is afoot.
Now, there was one occasion on which Google acknowledged that Phantom existed. Aside from that, however, Google has not named it, acknowledged it, or even denied it when SEOs believed it ran. Over time, this string of unknown quality updates collectively became known as Phantom.
The word “Phantom” came from the idea that we didn’t know what it was; we just knew that some update that was not Panda caused mass fluctuations and was related to quality.
Not Panda quality updates
The changes introduced by Phantom were not one set of changes like Panda or Penguin, which typically target the same items. However, the changes were not completely disparate and had the following in common:
- They were related to site quality.
- They were not Panda.
- They were all found in the Quality Raters Guide.
We don’t use the word “Phantom” anymore, but from 2013 to 2016, large-scale changes that were quality related and not Panda were commonly called Phantom. (It was easier than “that update no one admits exists, but all indicators tell us is there.”)
You can’t have so many sites shift that dramatically and tell SEOs the update does not exist. We all talk to each other. We know something happened. Not naming it just means we have to “make up” (educated guess) what we think it might be.
And from this mysterious Phantom, Fred was born.
‘Hello, Fred!’
In early March 2017, the SEO world was rocked by a seemingly significant algorithm update that appeared to target link quality. Google, however, would not confirm this update, deflecting questions by responding that Google makes updates to its core algorithm nearly every day.
When Search Engine Land’s Barry Schwartz asked Gary Illyes if he cared to name the unconfirmed update, Illyes responded jokingly that, from then on, every update would simply be called “Fred.”
‘Fred’ is more than a funny joke
Of course, Fred is not just a funny thing that happened on Twitter, nor is it merely the default name for all of Google’s future updates. In fact, it is not actually that funny when you break down what it really means. Fred is representative of something far deeper: Google’s long-standing, unacknowledged “black box.”
Now, Google does not use the term “black box,” but for all intents and purposes, that is exactly what “Fred” represents to webmasters and SEOs.
Meet Google’s black box
A black box is a system whose inputs and outputs (and their general relationships) are known, but where:
- internal structures are not well understood (or understood at all);
- understanding these structures is deemed unnecessary for users; and/or
- inner workings are not meant to be known, due to a need for confidentiality. (A toy sketch of the idea follows below.)
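To make the definition concrete, here is a toy sketch in Python. Everything in it is hypothetical, invented purely to illustrate the input/output relationship; it has nothing to do with Google's actual code:

```python
# Toy illustration of a black box: outsiders see inputs and outputs,
# but the internals are hidden and change without notice.
# Entirely hypothetical; this is not Google's actual algorithm.

def _hidden_ranking_logic(signals: dict) -> float:
    # Internal structure: unknown (and unknowable) to the outside world.
    return (0.5 * signals.get("content_quality", 0.0)
            + 0.3 * signals.get("link_quality", 0.0)
            + 0.2 * signals.get("page_speed", 0.0))

def rank(signals: dict) -> float:
    """The only observable surface: signals in, a score out."""
    return _hidden_ranking_logic(signals)

# All an outsider can do is probe input/output pairs and guess:
print(rank({"content_quality": 0.9, "link_quality": 0.4, "page_speed": 0.7}))
```

From the outside, all we ever observe are those input/output pairs; when the hidden weights shift, we see only the changed scores.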
Indeed, Google has communicated to SEOs through various channels that it is once again acting from a black box perspective, the way it did before Matt Cutts took over webmaster communications.
We have been told we don’t need to understand the algorithms. We have been told that this knowledge is not necessary to do the work. We have been told that all we need to do to be successful is be awesome. “Awesomeness” will get us where we need to be.
This all sounds good. It really does. Just be awesome. Just follow the Webmaster Guidelines. Just read the Google Quality Raters Guide. You will be set.
Of course, the devil is in the details.
What does ‘awesome’ mean?
Follow the Webmaster Guidelines. Read the Quality Raters Guide. Follow these rules for “awesomeness.”
While that advice can help an SEO become awesome on a basic level, it can’t tell you what to do when there is a complex problem. Have a schema implementation issue? How do you properly canonicalize pages during a site modification or move? Does being awesome tell you how to best populate ever-changing news sitemaps? What if you get a manual action for structured data markup because you did something wrong? What about load times?
There are a million smaller details that fall under “being awesome” that, unfortunately, the advice to “be awesome” does not cover.
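For instance, even a basic post-migration canonical check is something we are left to script for ourselves. Here is a minimal sketch, assuming the third-party requests and beautifulsoup4 packages and using hypothetical URLs:

```python
# Minimal sketch: after a site move, confirm each page's rel=canonical
# points at the new domain. URLs are hypothetical placeholders; assumes
# the third-party `requests` and `beautifulsoup4` packages are installed.
import requests
from bs4 import BeautifulSoup

NEW_DOMAIN = "https://www.example.com"      # hypothetical new domain
PAGES = ["/", "/products/", "/about/"]      # hypothetical sample of paths

for path in PAGES:
    html = requests.get(NEW_DOMAIN + path, timeout=10).text
    tag = BeautifulSoup(html, "html.parser").find("link", rel="canonical")
    canonical = tag.get("href") if tag else None
    if not canonical or not canonical.startswith(NEW_DOMAIN):
        print(f"Check {path}: canonical is {canonical!r}")
```

Nothing in the guidelines walks you through details like this; you either know to check, or you find out the hard way.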
This is where the black box becomes potentially detrimental and damaging. Where do you get information about site changes once you have passed the basics of the Webmaster Guidelines and Quality Raters Guide? You saw a change in your site traffic last week; how do you know if it is just your site or an algorithm update if Google won’t tell you?
Being awesome
Google no longer wants SEOs to worry about algorithms. I get it. Google wants you to just be awesome. I get that, too. Google does not want people manipulating its algorithms; the Webmaster Guidelines were first written to help stop spam. Google just wants you to make good sites.
The issue is that there still seems to be an unspoken assumption at Google that anyone who wants information about algorithm updates is just trying to find a way to manipulate results.
Of course, some do, but it should be noted most people who ask these questions of Google are just trying to make sure their clients and sites meet the guidelines. After all, there are multiple ways to create an “awesome” website, but some tactics can harm your SEO if done improperly.
Without any confirmations from Google, experienced SEOs can be pretty sure that their methods are fine — but “pretty sure” is not very comforting when you take your role as an SEO seriously.
So, while “being awesome” is a nice idea — and every site should strive to be awesome — it offers little practical help in the ever-changing world of SEO. And it offers no help when a site is having traffic or visibility issues.
So, why is this important?
The lack of transparency is important for several reasons. The first is that Google loses influence over the one part of its product it has never directly controlled: the websites it delivers in search results. This may not concern site owners, but the ability to actively steer sites toward its goals seems like something Google would value and encourage.
Google has added Developer Guides to make finding SEO/webmaster information easier, but these mostly help SEOs; site owners do not have time to learn how to write a title tag or code structured data. The guides are also, for the most part, very high-level: they communicate enough to answer basic questions, but not complex ones.
In the end, Google hurts itself by not communicating in greater detail with the people who help affect how the sites in their search results work.
If it is not communicated to me, I cannot communicate it to the client — and you can be assured they are not going to the Developers site to find out. I can also tell you it is much harder to get buy-in from those at the executive level when your reasoning for proposed changes and new initiatives is “because Google said to be awesome.”
If Google doesn’t tell us what it values, there’s little chance that site owners will make the sites Google wants.
Why else?
SEOs are not spammers. SEOs are marketers. SEOs are trying to help clients do their best while staying within what they know to be Google’s guidelines.
We work hard to keep up with the ever-changing landscape that is SEO. It is crucial to know whether a site was likely hit by an algorithm update and not, say, an error from that last code push. It takes a lot more time to determine this when Google is silent.
Google used to tell us when it rolled out these major algorithm updates, which gave us parameters to work within. Now, we have to make our best guess.
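These days, that best guess often starts with a rough cross-check like the one below: line up your analytics against community-reported (unconfirmed) update dates and your own deploy log. All dates and numbers here are hypothetical placeholders:

```python
# Rough diagnostic sketch: did a traffic cliff coincide with a suspected
# (community-reported, unconfirmed) update, or with our own code push?
# All dates and figures are hypothetical placeholders.
from datetime import date

daily_sessions = {
    date(2017, 3, 6): 10400,
    date(2017, 3, 7): 10150,
    date(2017, 3, 8): 6300,   # the proverbial cliff
    date(2017, 3, 9): 6100,
}
suspected_updates = [date(2017, 3, 8)]  # e.g., "Fred", per community chatter
our_deploys = [date(2017, 3, 2)]        # our last code push

days = sorted(daily_sessions)
for prev, cur in zip(days, days[1:]):
    change = daily_sessions[cur] / daily_sessions[prev] - 1
    if change < -0.20:  # a >20% day-over-day drop is worth investigating
        print(f"{cur}: sessions fell {change:.0%}")
        print("  suspected update that day?", cur in suspected_updates)
        print("  deploy within 3 days?",
              any(0 <= (cur - d).days <= 3 for d in our_deploys))
```

Even then, a match against community chatter is circumstantial evidence, not the confirmation Google used to provide.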
I think it would be eye-opening for Google to spend a week or so at different SEOs’ desks and see what we have to go through to diagnose an issue. Without clear communication from Google that something happened on its end, literally anything that happens on a website is in play. Anything! At least when Google told us about algorithmic fluctuations, we could home in on those.
Without that help, we’re flying blind.
Flying blind
Now, some of us are really experienced in figuring this out. But if you are not a diagnostician — if you do not have years of website development understanding, and if you are not an expert in algorithms and how their changes appear in the tools we use — then you could find yourself barking up a very wrong tree while a crippled site loses money.
Every experienced SEO has had a conversation with a desperate potential client who had no idea they were in violation of Google’s guidelines — and now has no money to get the help that they need because they lost enough search visibility to severely hamper their business.
And that leads me to the last but most important reason that this black box practice can be so damaging.
People
People’s livelihoods depend on our doing our job well. People’s businesses rely on our being able to properly diagnose and fix issues. People’s homes, mortgages and children’s tuition rely on our not messing this up.
We are not spammers. We are often the one bridge between a business making it and employees winding up on unemployment. It may sound hyperbolic, but it’s not. I often joke that 50 percent of my job is preventing site owners from hurting their sites (and themselves) unknowingly. During earlier versions of Penguin, the stories from those site owners who were affected were often heartbreaking.
Additionally, without input from Google, I have to convince site owners, with no documentation or confirmation to back me up, that a certain direction is the correct one. Can I do it? Sure. Would I prefer that Google not make the job of convincing others to build sites according to its rules that much harder? Yes.
Will Google change?
Unlikely, but we can hope. Google has lost sight of the very real consequences of not communicating clearly with SEOs. Without this communication, no one wins.
Some site owners will be lucky and can afford the best of the best of us, the SEOs who don’t need confirmations to figure out what needs to be done. But many site owners will not be able to afford the SEO services they need. When they cannot afford an audit to confirm that yes, a Google algorithm hurt their site, they will not survive.
Meanwhile, we as SEOs will have difficulties moving the needle internally when we cannot get buy-in from key players based on the idea of “being awesome.” Google will lose the ability to move those sites toward their aims. If we are not communicating Google’s needs to site owners, they will likely never hear about them. (There is a reason so many sites are still not mobile-ready!)
Is that black box worth it to Google? Perhaps. But is being opaque and lacking in transparency truly beneficial to anyone in the long run?
It seems there are better ways to handle this than to simply direct everyone to make “awesome” sites and to read the Webmaster Guidelines. We are professionals trying to help Google as much as we are asking them to help us. It is a partnership, not an adversarial relationship.
No one is asking for trade secrets — just confirmation that Google made a change (or not) and generally what they changed.
It is like feeling really sick and going to the doctor, only to be told, “Well, you have a Fred.”
You ask the doctor, “What can I do for a case of ‘Fred?’”
He looks at you and says, “Easy! Just be awesome!” And then he walks out the door.
Well, you think, at least I have WebMD.
In the meantime, here are some ideas for how you can work with Fred and Google’s black box.