SMX Overtime: Your questions answered about Google penalties and their impact on websites


Being an international conference speaker comes with a great advantage: one gets to discuss SEO with a lot of like-minded SEO professionals. In the many conversations I had at SMX London, some topics and questions came up more frequently than others. A lot of people are interested in issues around linking and wish to understand Google penalties better. Here are the questions asked during and after my Google Penalties presentation at SMX London, along with my recommendations based on seven years working at Google Search and seven years working as a technical SEO consultant.

Backlinks, penalties and the disavow tool

Do you think Google uses the disavow tool for anything besides addressing penalties? Is it worth spending time on?

If your website has built links in the past or naturally attracts a lot of low-quality links, it is definitely worth using the disavow tool for its primary purpose: disassociating a website from its backlink risks. Short of removing or nofollowing backlinks, which is the preferred yet more time-consuming and not always feasible method, it is currently the only way to mitigate backlink risks with search engines. Google keeps a closed lid on any alternative uses for the disavow data it keeps receiving. While there is no official confirmation, it does not take much imagination to consider what could be done in terms of algorithm training and machine learning with this tremendous, consistent stream of data if an engineer were to focus on, say, the 0.1% of patterns that are repeatedly disavowed globally. This is of course just a thought experiment, since Google has never confirmed any such initiative.

What is the optimal PBN ratio? 

Assuming that it’s private blog networks we’re talking about, the optimal ratio for link building purposes is zero. PBNs are a Google Webmaster Guidelines violation and, most importantly, easily detectable, which is why using them is likely to attract undesirable Google attention and ultimately trigger a penalty lowering the site’s rankings.

Have you ever seen an effect of using the disavow tool?

Yes, there’s no doubt that the disavow file is an effective tool to address backlink penalties and reduce backlink related risk levels.

Can a site be penalized beyond recovery?

Luckily, any website can be recovered. Occasionally it may take a lot of effort, especially when spammy techniques have been used excessively in the past. But there are no unrecoverable websites.

Why doesn’t Google tell us when they apply algorithmic penalties?

The reason Google does not highlight the impact algorithms have on what may feel like a site being penalized is that these are not algorithmic penalties. They are merely algorithms which are frequently updated and may result in changes in search visibility. Please read algorithms vs. manual penalties for more information on the differences between the two.

What could be a reason for Content Injection Security Issues even though I am sure that my site was not hacked at all?

Speaking from personal experience moderating the Google Webmaster Help Forums and working for Google Search, I haven’t seen many false-positive security alerts. If confronted with a content injection warning, assume it is accurate, even if you can’t initially corroborate the problem, and crawl the website to verify what is happening. If still in doubt, I recommend seeking the assistance of professionals who deal with similar cases on a daily basis.

Is a manual action a matter of being in or out of the index? Or are there cases of ranking drops by a number of positions?

Manual actions are more nuanced than that. Only the worst sites, those that do not add any value to the internet, are removed from search results. And even these can successfully apply for reconsideration. Manual actions (a.k.a. manual penalties) can be applied to the entire website, to merely one section such as a subdomain, or even more granularly. Depending on the type of violation and the severity of the transgression, the impact can range from loss of SERP real estate to a dramatic loss of rankings. For more specific information on the subject, please read the Comprehensive Guide on Penalties.

What does a penalty look like? How is it implemented? For example: is it a blacklist, or is it a rule to not show the page in the top 10 results?

The specific processes within Google that lead to a manual spam action are a closely guarded secret. What is true, though, is that it is a manual process with multiple safeguards in place.

Can you have a manual penalty without a notice in GSC?

If you don’t have a manual spam action, a.k.a. penalty, notice in your Google Search Console, then you don’t need to worry about how to recover from a penalty at that moment.

Google Search Console provides transparency about existing manual spam actions.

If one of my sites gets a penalty, does it affect my other properties in some way?

No, it does not, unless they also violate Google Webmaster Guidelines.

Have you recently seen an effect of disavowing links when you haven’t received a manual penalty?

Absolutely. The webmaster was finally able to get a good night’s rest instead of fretting about his link risks.

How do I deal with hacked content warnings that appear seemingly at random when there is no hacked content?

In fact, hacked content warnings are almost always accurate. However, tracing the reason for their appearance can be difficult. Crawling the website is an important step towards cleaning up the site. If in doubt, seek assistance from a professional.

What may trigger the quality raters to evaluate your link profile for a manual penalty?

Google Quality Raters are temporary hires who operate remotely from their homes and help to test proposed changes to the Google algorithms. They do not evaluate websites or their backlink profiles for webspam purposes. They do not apply manual spam actions. And they are not privy to any insights regarding the inner workings of Google Search.

While doing our monthly audit, should we be using the disavow tool on every link that seems spammy?

Monthly audits seem way too frequent, even for the most competitive niches. Audits, including crawling a representative part of the website, should be conducted once or twice a year. The frequency of backlink reviews depends on risk levels. Previously challenging backlink profiles may require a quarterly review. Most websites are well served with reviews conducted every six to twelve months. In every instance, newly identified spammy links must be disavowed on the domain level.
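For reference, here is a minimal sketch of what a disavow file can look like, using the documented domain: syntax (the domains and URL below are placeholders, not real examples):

    # Disavow every link from an entire domain (preferred for systematic spam)
    domain:spammy-directory.example
    domain:paid-links.example
    # Disavow a single URL only when a domain-level entry would be too broad
    https://blog.example/low-quality-guest-post.html

The file is a plain UTF-8 text document uploaded per property via the disavow tool; lines starting with # are treated as comments.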

Will affiliate links to your site add/reduce authority? Can they cause a manual action?

Yes, affiliate links to your website are frequently a backlink liability and can be the leading cause for a manual penalty. To avoid the risk, make sure the affiliate links to your site do not pass any authority. When in doubt, get a second opinion from a professional to evaluate your current setup.
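As a sketch of what “not passing authority” means in practice, affiliate partners can be asked to add a rel attribute to their links; the tracking parameter name aff_id below is a placeholder:

    <!-- Affiliate link template: the rel attribute tells search engines not to pass authority -->
    <a href="https://www.example.com/product?aff_id=12345" rel="sponsored nofollow">Example product</a>

Many affiliate setups achieve a similar effect by routing links through an intermediate tracking URL that is blocked in robots.txt.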

Should I also use the disavow file to prevent a manual penalty? The effort is huge to review thousands of links but the effect is not clear.


That depends on risk levels, which are individual for every site. Sites with a colorful past that have previously been link building are likely to attract more undesirable links even after stopping their intentional efforts. Most likely the answer is ‘yes’, however it does not have to be a huge, resource-intensive effort. Instead, periodic reviews and updates tend to be sufficient to mitigate backlink risks.

If a competitor buys spam links to my property and then reports me, does Google have a way to know that I shouldn’t be penalized?

That’s called negative SEO, and Google may or may not understand the context of the links in question. That is exactly one of the situations the disavow tool can be used for; it empowers the webmaster to maintain control of their own backlink profile.

Is disavow used for algorithmic evaluation or only manual?

Any disavow files uploaded by the website owner help search engines understand which links to ignore from an authority point of view. Search engines such as Google can then choose to use this in their algorithmic calculations when determining search visibility for a website. On the other hand, if a website is penalized, the disavow tool can be one of the tools used to recover from the penalty.

Can a website be penalized because of pages with very short content or pages with duplicate content?

Yes, it is possible. Ask yourself: does the page add enough value or original and compelling content? I recommend reading The unique selling proposition: A key element for SEO success for more on this topic.

Does requesting a disavow cause a manual review from the quality team or is it automated?

To clarify, one submits a disavow file: a plain-text (TXT) document listing the domains and specific links one wishes to disassociate from, and it is a tool for the website owner. The uploaded disavow file is processed automatically, and there is no human interaction in this process on Google’s side. The reconsideration request, on the other hand, is a manual one, in the course of which the case, including the rationale submitted, is reviewed by a member of the Google Search team. So no, submitting a disavow file does not result in a manual review.

Algorithms, crawling, indexing and Google Search Console

Does Google put weight on the traffic coming through links? Do you have data to support that the traffic through links is what matters to rankings?

No, Google does not consider the traffic a link may be producing as a ranking signal.

Do you have any advice for understanding the difference between the number of pages reported as indexed in Google Search Console and in a “site:” search?

The site: operator results, which are mostly used by SEOs and barely ever by regular users, tend to return a rough estimate of the volume of pages crawled, indexed and potentially ranking. The coverage data shown in Google Search Console is more accurate and useful for SEO purposes. Be sure to add your website as a domain property to Google Search Console to get the most inclusive data.

Is there a way to set notification about Security Issues in Google Search Console?

There’s no need to actively set notifications for such issues in Google Search Console, as Google does that automatically for the respective user. In addition to any notices shown in the Google Search Console Security Issues and Manual Actions overviews, Google also highlights the detected issue by sending an additional email to the addresses associated with the website’s owners in Google Search Console.

What metric is used to rate a page as slow/average/fast (speedscore/render time/…?) And do you think this metric is also used for ranking?

It is absolutely safe to assume that site performance is a huge ranking factor and that – when all other factors between websites are similar – the faster website always has a significant advantage. Google has published many speed-related studies and blog posts on how to improve your website’s performance. When testing the performance of your website, test for mobile users on 3G connections or try Google Lighthouse (built into Chromium/Google Chrome).
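For example, here is a minimal sketch of running a performance audit from the command line, assuming Node.js and the Lighthouse CLI are installed (the URL is a placeholder):

    npm install -g lighthouse
    lighthouse https://www.example.com --only-categories=performance --output=html --output-path=./report.html

By default the CLI emulates a mobile device with simulated network throttling, which approximates the slower connections mentioned above.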

What are the key things you recommend watching over regularly on search console that can give us an idea on how we are faring against competitors?

Monitoring competitors is tedious and provides precious little insight most of the time. Take, for instance, keeping an eye on competitors’ link building activities. It may or may not suggest that the Google Webmaster Guidelines are being bent or broken. But it does not provide any insight into a competitor’s disavow file and is therefore basically meaningless from an SEO perspective.

It seems like there is no Search Quality team in smaller countries looking at casino keywords. Do they only care about English SERPs?

During my time at Google Search, the team’s language capabilities were staggering. While English was the common tongue, everyone seemed to speak several languages, including some less common ones. I used to have a colleague sitting close to me who, among other languages, was fluent in Swahili. There is some merit to the perception that results in some languages return more questionable pages, and that is owed to the fact that smaller languages tend to also have smaller amounts of indexable, quality content.

Is SERP snippet CTR a ranking factor in Google or not?

Yes, this is one of several user signals and therefore of great importance for rankings. However, don’t just focus on optimizing the CTR of your snippets because of possible ranking signals; instead, focus on getting more and better-converting users from your current SERP impressions to your website.

Does Google link different domains together if they have the same owner registered?

Google has a lot of data available and can often quite easily see whether different domains belong to the same owner. However, owning multiple domains is not a ranking factor, nor is it a negative signal.

Can we copy a review from another site, place it on the company’s site and mark it up with schema?

While it is possible, it serves no purpose. Republishing already existing content without adding value is basically scraping and can negatively impact a site’s content signals.
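If you collect reviews on your own site, marking up those first-party reviews is a different matter; a minimal JSON-LD sketch (all values are placeholders) could look like this:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Review",
      "itemReviewed": { "@type": "Product", "name": "Example Product" },
      "reviewRating": { "@type": "Rating", "ratingValue": "5", "bestRating": "5" },
      "author": { "@type": "Person", "name": "Jane Doe" },
      "reviewBody": "A review collected on this site, not copied from elsewhere."
    }
    </script>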

If algorithm updates aren’t penalties but rather a site taking a wrong turn, why doesn’t Google share the map with affected sites?

It’s a misconception to assume that when an algorithm fails and a site is ranked lower than it may deserve, Google has to compensate the site owner in some way. Google cares about its Google Search users, and as long as users indicate with their behaviour that they are satisfied with the algorithmic updates, then so is Google.

What considerations (outside of disavowing dodgy links) do we need to look at when migrating another website into an existing one? (eg. combining two businesses)

Fundamentally, it makes sense for the sites to be on the same topic. While integrating one website into another, establishing correct, lasting 301 redirects is important. At times it may make sense to ensure that no legacy issues, such as outdated content, are migrated. It is also important to ensure proper internal linking to the migrated content, in order to avoid orphan pages. Lastly, be mindful that integrating two different sites into one can “reset” your rankings, as it may take search engines a while to figure out what you are doing. Get help from a technical SEO professional before you start moving, because there are often great opportunities to get rid of old signals that may have been holding your website back.
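As an illustration of lasting redirects, here is a minimal nginx sketch (hostnames are placeholders) that maps every legacy URL of the migrated site onto the new domain with a 301:

    server {
        server_name old-site.example;
        return 301 https://www.example.com$request_uri;
    }

Where the URL structures differ between the two sites, individual location or map rules are needed so that each legacy page points at its closest equivalent rather than the homepage.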


What are your thoughts with AMP pages and using them to rank a site?

AMP is a site speed experiment aimed at improving the user experience in certain demographics, niches and/or regions where site speed performance is often an issue. As such, AMP is favoured by Google; however, it is not a ranking tool. There are alternative technologies available that offer equally fast, or even faster, experiences while the website owner retains total control over content and server logs, which is partially lost with AMP. One additional danger with AMP is that the focus of website development, within dev and management teams, shifts from optimizing for fast site speed experiences to validating for AMP instead, which does not always result in the best decisions for the content or the experience of the website. That said, going AMP can be an instrumental tool for getting everyone within an organization aligned on budgeting for site performance. To sum up, while AMP is an interesting speed experiment and isn’t inherently bad for a website’s SEO, it certainly has its limitations, and its value depends heavily on the website, its target audience, the internal organizational structure and the current infrastructure.

Connecting GSC to analytics: can it hurt your site SEO?

No, it cannot.

Which are your top three features you would like to see migrated to the new Search Console?

1. URL Parameters and Structured Data overview on the domain property level. 

2. I would love to see the DNS error reports come back in the new Google Search Console, as there is currently no other place where we can see whether Googlebot is encountering DNS issues when accessing a website.

3. UX. Although the old interface was plain and simple in design and the new interface has much more data, what is really missing from the new Google Search Console compared to the old Search Console is a good user experience. Just to give you a few examples: 

  • The new UI is really slow compared to the old UI.
  • I can’t right click to open any result in a new tab (e.g. URL Inspection Tool). 
  • Opening multiple properties is cumbersome because there is no overview page of all properties added in the new Google Search Console. 
  • I have to use the back button to navigate to previously selected reports, or I end up on the homepage of the section and need to select everything from scratch again. 
  • It is hard to see which filters are applied to any sample set, because the filter form is always hidden.

The old Search Console may have been old fashioned, but at least it was way more user friendly in basic navigational design.

Does it make sense to speed up the site for Googlebot? For example, by fully caching the site, because for users we can’t do that due to personalization.

Always have a default version of your website, without the personalization, for bots and users alike. This is the version you can cache aggressively. Any personalization of the indexable content is an enhancement of the default version that allows you to improve the experience for your users, but your site should never be dependent on it.
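One common pattern, sketched below, is to serve the same cacheable HTML to bots and users and layer personalization on top client-side after the page loads; the /api/personalization endpoint and the element id are hypothetical placeholders:

    // Client-side enhancement sketch: the HTML itself is the cacheable default
    // served to Googlebot and users alike; personalization is layered on top.
    async function applyPersonalization(): Promise<void> {
      try {
        const response = await fetch("/api/personalization", { credentials: "include" });
        if (!response.ok) return; // fall back silently to the default content
        const data: { greeting?: string } = await response.json();
        const slot = document.getElementById("greeting-slot");
        if (slot && data.greeting) slot.textContent = data.greeting;
      } catch {
        // Network or API failure: the default, cacheable page still works.
      }
    }
    document.addEventListener("DOMContentLoaded", applyPersonalization);

Because the personalized data never changes the indexable markup generated on the server, the default version can be cached aggressively at the edge.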

Why does Google sometimes ignore rel=canonical?

Canonicals can be ignored when your website sends conflicting technical SEO signals to Googlebot, such as mixing non-self-referencing canonicals with noindex, not linking to the canonical URLs but to other variations, applying the wrong canonicals to pages (e.g. canonicalizing paginated pages to the first page or using relative paths), or having canonicals that do not match your sitemap URLs. There are many reasons why Google may ignore your canonicals, and a full technical SEO audit will tell you why.
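As a minimal sketch of consistent signals, the indexable page below (placeholder URL) carries a self-referencing canonical and no conflicting noindex, and the same URL should appear in the XML sitemap and in internal links:

    <!-- https://www.example.com/widgets/blue-widget -->
    <head>
      <link rel="canonical" href="https://www.example.com/widgets/blue-widget">
      <!-- Do not combine a canonical pointing elsewhere with noindex; that sends mixed signals -->
      <meta name="robots" content="index, follow">
    </head>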

My site took an algorithmic hit. Can I contact people at Google for advice?

No. At conferences it may be possible to approach a specialist, such as John Mueller, however they are unlikely to provide an actionable, on-the-spot answer. Google does not offer individual support for webmasters. There are the Google Webmaster Help Forums, a communication channel monitored by Google employees who may occasionally engage in the conversation. Alternatively, you can ask your question in one of the Google Webmaster video sessions. These are the best available options for reaching Google, if you like. However, most likely the problem is not on Google’s side but on yours, and the best alternative is to seek the advice of a skilled, third-party SEO professional to help you with an analysis of your website.

What would be a typical reason for “discovered – currently not crawled” and “crawled – currently not indexed” (if it is not related to robots.txt, status codes, noindex, redirects or canonicals)?

When landing pages are not crawled, or crawled but not indexed, it is always the result of conflicting SEO signals. These can be technical in nature, such as conflicting canonicals, or quality-related, such as weak content signals. Usually the reason for these issues can be identified with an in-depth SEO audit.

Why doesn’t Google share more specifics on their bigger algo updates if it’s “educational”?

Google does share a lot of educational resources through their many channels. However, it is important to keep in mind that most website owners are not interested in the finer details of algorithmic updates. While from an SEO industry perspective more detail may be desirable, even if Google did not have to worry about spammers trying to reverse engineer and game its algorithms, sharing specifics is simply not feasible in light of the multiple algorithmic releases and updates happening every single day.

Have more questions?

If you have not found the answer to a specific SEO question you are looking for, or maybe you just want to talk about Google Search and SEO, you can find me at the upcoming SMX East in New York, where I will be speaking on several SEO topics close to my heart and addressing SEO questions, myths and misconceptions. See you in November 2019 in New York City!


Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land. Staff authors are listed here.


About The Author

Fili is a renowned technical SEO expert, ex-Google engineer and former senior technical lead in the Google Search Quality team. At SearchBrothers he offers SEO consulting services, SEO audits and SEO workshops, and successfully recovers websites from Google penalties. Fili is also a frequent speaker at SMX and other online marketing events.


