
Let’s Say Goodbye to the Old Google Search Console

More than two years after the new version of Google Search Console came out, Google has finally removed access to the old version.

Google's Old & New Search Console

After over two years of testing the new Google Search Console, and about a year after bringing it out of beta, Google has announced it has shut down the old Google Search Console.

In their Webmaster Central Blog, Google said,
Today we are reaching another important milestone in our graduation journey, we are saying goodbye to many old Search Console reports, including the home and dashboard pages πŸ‘‹. Those pages are part of the history of the web, they were viewed over a billion times by webmasters from millions of websites. These pages helped site owners and webmasters to monitor and improve their performance on Google Search for over a decade.

From now on, if you try to access the old homepage or dashboard you’ll be redirected to the relevant Search Console pages. There are only a few reports that will still be available on the old interface for now - check the Legacy tools and reports in the Help Center. We're continuing to work on making the insights from these reports available in the new Search Console, so stay tuned!

The announcement was also shared on Google's Twitter account:

Interface Redirect: Google is redirecting all attempts to reach the old Google Search Console to the new interface. Several legacy reports have not yet been migrated or replaced in the new interface; those remain accessible via an option in the new Google Search Console named "Legacy tools and reports."

Google is officially sunsetting the old version of the Search Console. Users who try to access the old version will be redirected to relevant pages in the new Search Console.
There is still no 1-to-1 replacement for every old tool and report in Search Console, so Google will still be keeping these alive for now:
  1. Disavow Tool
  2. Remove URLs tool
  3. Crawl stats report
  4. Robots.txt testing tool
  5. URL Parameters tool
  6. International targeting
  7. Google Analytics association
  8. Data highlighter tool
  9. Messages report
  10. Crawl rate settings
  11. Email preferences
  12. Web tools

Google will continue to work on making the above tools and reports available in the new Search Console. However, Google recently stated that users shouldn’t expect all tools to be ported over to the new version.

Below are some tools and reports that Google is working to make permanent in the new Search Console, though not all of them will be ported over:
  • Structured Data Testing Tool
  • Structured Data Markup Helper
  • PageSpeed Insights

Currently, a few reports from the old Search Console are accessible in the new console under Legacy tools and reports, located below Security issues and above the Links tab.

Legacy Tools And Reports In New Search Console

In the new Search Console, when you click Legacy tools and reports, a drop-down menu displays the following items:
  • International targeting
  • Removals
  • Crawl stats
  • Messages
  • URL parameters
  • Web Tools

I'll explain how to use the new Search Console's Legacy tools and reports in a later post.

Google paid tribute to the occasion by gathering its team for a group photo in front of the old Search Console interface in their blog post and a tweet.

Why Webmasters and SEOs Should Care: This is a big change for a lot of SEOs who were familiar and comfortable with the old Google Webmaster Tools interface. Google removed features from the old interface slowly, but some SEOs continued to hang on to it.

By now, we should get used to the new interface and check out the new option to see some of the legacy reports. Google has also promised to add new tools to the new Search Console.

How to Remove 404 Errors in Google Search Console

Fix Crawl Errors in Google Search Console
If you are having issues with 404 errors in Google Search Console, this tutorial is for you! Follow these steps to improve your website's usability and organic SEO.

The situation is as follows: despite your efforts, Google lists 404 errors (and other types of HTTP errors) in the "Crawl Errors" report of your website's Search Console account.

Are 404 errors bad for my website's SEO?

How do you correct them and make HTTP errors disappear from Search Console?


Here are the explanations provided by Google:
In general, 404 errors do not adversely affect your site's performance in search results, but fixing them can help improve the user experience.
In the help documentation, you will find additional tips:

They often occur as a result of typos or configuration errors, for example in the case of links generated automatically by a content management system. They can also be the result of Google's growing ability to detect and crawl links in embedded content such as JavaScript.

This last sentence is not very clear, but basically we understand that Google sometimes tries to access URLs that simply do not exist, because its interpretation of JavaScript code leads it to believe that these URLs may exist.

Let me end with the latest advice from Google about 404 errors:
It is quite normal, even desirable in some cases, to encounter 404 error codes on the Web. You will probably never be able to control all links that redirect to your site or resolve any 404 errors that appear in the Search Console. Focus on the most important issues, solve the problems you can, then move on to another step.

So? The purpose of my article is precisely to help you understand the impact of 404 errors on organic SEO and how to fix them.

My Point of View:

  • If a user lands on a 404 page by clicking an organic search result, you should fix the link ASAP, because this error (404 or otherwise) will negatively impact your website's search ranking.
  • If a page returns a 404 because of a link coming from another website, you can do nothing and it does not degrade your SEO performance. But if the link comes from a quality site, you get no SEO advantage from it as long as it lands on a broken page, so you should correct the link.
  • If you have too many 404 errors in your internal links, they degrade the user experience, so you should fix them.

Steps to Fix 404 Errors


In your place, I would follow these steps.

Step 1- Fix 404 Errors Generated by Internal Links


If 404 errors are caused by internal links, they must be corrected because:

  • These errors degrade the user experience
  • These errors interfere with your SEO since a page does not receive the link you had planned to make
  • They are easy to detect and correct :-)
  • Fixing them gives a first cleanup of the 404 error list shown in Search Console
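
Before (or instead of) waiting for Search Console to surface them, you can detect broken internal links yourself. Here is a minimal sketch, assuming the requests and beautifulsoup4 packages and a hypothetical site URL: it fetches a page, keeps only the links pointing to the same host, and reports those that return a 404.

import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin, urlparse

SITE = "http://www.example.com/"  # hypothetical URL, replace with your own site

def broken_internal_links(page_url):
    # Fetch the page and collect every internal link that answers with a 404
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    broken = []
    for a in soup.find_all("a", href=True):
        url = urljoin(page_url, a["href"])
        if urlparse(url).netloc != urlparse(SITE).netloc:
            continue  # skip external links, we only audit our own pages
        status = requests.head(url, allow_redirects=True, timeout=10).status_code
        if status == 404:
            broken.append(url)
    return broken

for url in broken_internal_links(SITE):
    print("Broken internal link:", url)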

Step 2- Fix 404 Errors Generated by Sitemaps


Use an HTTP header checker tool to check that each and every URL in your Sitemap actually returns a 200 status code (which means everything is OK). There should be no redirection.

If you are lost in all these codes, see the list of HTTP codes.
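
If you don't have a header checker tool at hand, a short script can run the same check. A minimal sketch, assuming the requests package and a standard XML sitemap at a hypothetical URL: it reads every <loc> entry and flags anything that does not answer 200 directly (redirects included).

import requests
import xml.etree.ElementTree as ET

SITEMAP_URL = "http://www.example.com/sitemap.xml"  # hypothetical URL
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ET.fromstring(requests.get(SITEMAP_URL, timeout=10).content)
for loc in root.findall(".//sm:loc", NS):
    url = loc.text.strip()
    # allow_redirects=False: a sitemap URL should answer 200 itself,
    # not through a 301/302 hop
    status = requests.head(url, allow_redirects=False, timeout=10).status_code
    if status != 200:
        print(status, url)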

Step 3- Fix 404 Errors Generated by backlinks


Some 404 errors may be linked to backlinks, that is, links from other sites pointing to a wrong URL on your site. To identify them, use your favorite backlink analysis tool (Majestic, Ahrefs, or Moz) and retrieve the list of backlinks pointing to a 404 error. Majestic has published an article on this topic; it's up to you to apply it to your own site instead of a competitor's. If you can't manage it at all, and you ask me kindly, I may even do it for free ;-)

If whoever made the link got the URL slightly wrong, it's a pity and it must be corrected:

  • Contact the webmaster of the site that links to you in error and ask them to correct it. Approach them in a positive way by explaining that they have a broken link on their site ...
  • If they do not respond, set up a 301 redirect from the wrong URL to the right one (see the sketch below)
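
Once the redirect is in place in your .htaccess or server configuration, it's worth verifying that the wrong URL really answers with a 301 pointing at the right page. A minimal check, with hypothetical URLs:

import requests

WRONG_URL = "http://www.example.com/old-wrong-url"     # hypothetical
EXPECTED_TARGET = "http://www.example.com/right-page"  # hypothetical

resp = requests.head(WRONG_URL, allow_redirects=False, timeout=10)
# A permanent redirect must be a 301, and its Location header
# must point at the page the backlink was meant to reach
assert resp.status_code == 301, f"expected 301, got {resp.status_code}"
assert resp.headers.get("Location") == EXPECTED_TARGET, resp.headers.get("Location")
print("Redirect OK")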

Step 4- Fix other HTTP Errors


Once you have validated the previous steps, wait one or two weeks for Google to update your Search Console account. Next, open the Crawl Errors report and click the "Not found" tab in the "URL-level errors" sub-section.

Google lists errors in order of priority, so take advantage of that.


If it's easier for you to manage the list in Excel, simply download the table in CSV format (or open it in Google Docs). In the case of CSV, here are the columns you will retrieve:
  • URL
  • Response code: 404 for pages not found
  • Google News error: only applies to sites in Google News
  • Detected: date of first detection by Googlebot (Google's crawler)
  • Category: error type (here "not found")
  • Platform: the Googlebot version that encountered the error (desktop, smartphone, or feature phone)
  • Last crawled: date of the last crawl by Googlebot
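
If you prefer to process the export programmatically, a few lines are enough to isolate the 404s and sort them by first detection date. A minimal sketch using only the Python standard library; the column names below are assumptions based on the English export, so check your own file's header:

import csv

# Column names assumed from the English CSV export; adjust to your file
with open("crawl_errors.csv", newline="", encoding="utf-8") as f:
    errors = [row for row in csv.DictReader(f) if row.get("Response Code") == "404"]

# Oldest errors first: they have been broken the longest
errors.sort(key=lambda row: row.get("Detected", ""))
for row in errors:
    print(row["Detected"], row["URL"])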

Depending on the different cases that remain listed, you may need to repeat one of the previous steps. To keep track, you can check the box in front of each URL you've processed and click the "Mark as fixed" button.

Tip: If you have a lot of errors, you may hit the limit set by Google, which is 1,000 URLs. To work around it, simply declare a subpart of your site as a new property in Search Console. This technique only works if you have directories at the root, for example /blog/. In that case, you can declare http://www.example.com/blog/ as a new property; it will be validated immediately and you will be able to consult the list of 404 errors for that directory only. Convenient!

Redirecting to the Homepage: a Bad Solution!


I still regularly encounter badly configured sites that redirect to the homepage when a page is not found. Do not do that!

Indeed, Google's online help confirms that a web server must return a 404 code when a resource cannot be found:
It is quite normal, even desirable in some cases, to encounter 404 error codes on the Web.

You may have configured a custom 404 error page (that's fine). However, check that the HTTP status code returned is 404, not 302 (temporary redirect). Use an HTTP code test tool on a nonexistent URL on your site. If you get a 302 redirect, you may have specified a full URL in your .htaccess file for the custom error page.
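
Here is a minimal sketch of that check, probing a deliberately nonexistent (hypothetical) URL and reporting whether the server answers 404 as it should or slips in a 302 redirect:

import requests

# A URL that should not exist on your site (hypothetical example)
MISSING_URL = "http://www.example.com/this-page-does-not-exist-xyz"

resp = requests.get(MISSING_URL, allow_redirects=False, timeout=10)
if resp.status_code == 404:
    print("OK: the server correctly returns a 404")
elif resp.status_code in (301, 302):
    # Likely an ErrorDocument configured with a full URL in .htaccess
    print(f"Problem: {resp.status_code} redirect to", resp.headers.get("Location"))
else:
    print("Unexpected status:", resp.status_code)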

How often should you check for 404 errors?


You should be checking your 404s at least once a month, and on a bigger site, every week. It doesn't really depend on how many visitors you have but much more on how much content you have and create, and how much can go wrong because of that. The first time you start looking into and fixing your 404 error pages, you might find there are a lot of them, and it can take quite a bit of time. Try to make it a habit so you'll at least find the important ones quickly.

Need Help?


If you get stuck or something is not clear enough for you, feel free to ask in the comments.

The Ad Experience Report in Google Search Console

Google Ad Experience Report

Do you have too many ads on your site? Or ads degrading the user experience? Since June 2017, a report in your Search Console account tells you. Since these ads will be blocked starting in 2018, get ready!

Google Search Console Ad Experience Report


What is the purpose of this "Ad Experience" report?


As of January 1, 2018, certain ads will be automatically blocked in Google Chrome, without the user having to install anything (source). In other words, Chrome will include an ad blocker!

This report will help you identify pages that display advertisements that may be blocked by Chrome. If your site is affected, you will see screenshots and even videos of the problematic ads.

It's worth taking a look at, isn't it?

Where can I find the advertising experience report?


Go to https://www.google.com/webmasters/tools/ad-experience and choose the property you want to test. On desktop, the drop-down menu is at the top right; it lists the properties to which you have access.

You can also click on "Web Tools" in the left margin, once connected to your Search Console account, for the site of your choice.

Then click on "Computer" or "Mobile" to access the results.

Here is what it looks like:
Google Ad Experience Report


Note: it is possible that your site has not yet been analyzed; in that case you will only see the message "Status: Not reviewed". You will have to wait and consult the report another day ...

Definition of the Advertising Experience


The advertising experience is defined by several elements:
  • The presentation of the site
  • The behavior of the site
  • The content and ads your users are exposed to

It can be a direct result of trafficking (creating and serving) creatives on your pages (for example, video ads that play automatically with sound).

It may also depend on how ads are embedded in your site (e.g., high ad density on mobile devices).

Inventory of the Advertising Experience on your Site


Google conducts reviews to identify ad experiences that are likely to annoy your users.

If such experiences are found repeatedly, they will be listed in this report.

Reports are grouped by root domain, including all subdomains and directories. If your root domain is example.com, the report contains the ad experiences of www.example.com, actualites.example.com, example.com/finances, and so on.

To review the ad experiences on your site, Google takes into account your main serving region. This region corresponds to the geographical area from which the majority of your visitors come. Internet users in each country may have their own preferences for advertising experiences; however, the data suggest that there are groups of countries with similar preferences. Each major serving region covers one or more countries assumed to have similar preferences in terms of advertising experience. For now there are two regions:

  • Region A: United States and Canada
  • Region B: Europe

How do I know if my Ads will be Blocked by Chrome?


Look at the ad filtering status for your site; it can have the following values:

  1. Disabled: Chrome does not filter ads on your site.
  2. Enabled: Users browsing your pages with the Chrome browser do not see the violating ads, regardless of the region they are in. You must resolve the non-compliance issues and submit your site for review.
  3. Paused: Ad filtering in Chrome is paused while your site is being reviewed.
  4. Pending: Your site's status is "Failed" and ad filtering will begin in the future. Google will send an email to registered site owners at least 30 calendar days before ad filtering starts. To avoid this filtering, fix the non-compliance issues and submit your site for review.

Google Warns Before Blocking Ads?


Yes, fortunately!
  1. Initially, no ads are blocked.
  2. If Google identifies a bad advertising experience on your site, it will send you a warning. The ads are not blocked yet.
  3. If your site is not corrected within 30 days of the notification, Google considers it failing and Chrome will start blocking your ads.

How do I Request a Review of the Site?


If you think you have corrected the issue, visit Google's support page to request a review.

Which Ads Are Deemed Immediately Harmful?


Harmful advertising experiences are those that are particularly misleading or abusive.

Definition: An advertising experience is considered harmful if it meets one of the following conditions (source):

  1. It involves malicious or unwanted software that can be installed on the user's computer.
  2. It leads to a phishing attempt.
  3. It tricks the user into downloading software (malicious or not) or installing malicious or unwanted software.
  4. It causes an automatic redirect without any user interaction.
  5. It contains confusing components (such as a close button that does not close the ad but instead registers a click or leads to other content).
  6. It contains deceptive content intended to trap the user; for example, the ad may look like a system message, such as an update button.

Which Ads Annoy the most Users?


According to the Coalition for Better Ads study, respondents flagged the following three types of ads:

  • Ads that interrupt: When you arrive at a news article, an ad forces you to wait ten seconds before you can read it. Ads that disrupt the flow of information, especially on mobile devices, are generally considered the most annoying by consumers. The study shows that 74% of mobile users find ads that interrupt access to content (such as pop-ups) extremely or very annoying.
  • Ads that distract: It only takes a few seconds for people to decide whether your site is worth their time. Flashing animations and ads that automatically play sound distract people during these first critical seconds. In the end, these ads could cause them to abandon your site. These experiences are extremely disruptive on both desktop and mobile.
  • Ads that clutter: When a page is cluttered with ads, it takes more time to load, making it harder for people to find what they are looking for. On mobile, a high ad density slows the page down considerably, which degrades the user experience.

Advertisements to Avoid on Mobile


Mobile browsing is a matter of speed and convenience. To avoid annoying or even irritating mobile users, avoid anything that can hinder or distract from reading the content, especially since the screen is not that big!

Ads to Avoid on Desktop


On desktop, users like to control their experience, so obstacles that prevent them from consuming information at their own pace are unacceptable.

3 Golden Rules for a Better user Experience:


  1. Be immediate: People are more likely to engage when ads load quickly and do not slow down content. For example, using the GPA system, AMP ads offer a more effective way to build, serve, and measure responsive ads. With ads loading six times faster, Time Inc. measured 13% greater viewability and increased eCPM and CTR.
  2. Be immersive: Advertising experiences that blend smoothly into a user's content experience are less likely to bother them. Native advertising offers the ability to show ads that match the form and function of your site's content. Responsive native ads can even adapt to all types of devices and screens. The New York Times measured a sixfold increase in CTR and a fourfold increase in viewable impressions with native ads compared to comparable standard banners.
  3. Be relevant: Programmatic technology allows advertisers and publishers to serve more relevant ads based on consumer interests, which helps keep users more engaged with your site.

What's Your Opinion About Google Ad Experience Report?

Googlebot Crawl Budget Explained on Google Webmaster Central Blog

Recently, we've heard a number of definitions for "crawl budget", however we don't have a single term that would describe everything that "crawl budget" stands for externally. With this post we'll clarify what we actually have and what it means for Googlebot.

First, we'd like to emphasize that crawl budget, as described below, is not something most publishers have to worry about. If new pages tend to be crawled the same day they're published, crawl budget is not something webmasters need to focus on. Likewise, if a site has fewer than a few thousand URLs, most of the time it will be crawled efficiently.

Prioritizing what to crawl, when, and how much resource the server hosting the site can allocate to crawling is more important for bigger sites, or those that auto-generate pages based on URL parameters, for example.

Crawl Rate Limit

Googlebot is designed to be a good citizen of the web. Crawling is its main priority, while making sure it doesn't degrade the experience of users visiting the site. We call this the "crawl rate limit," which limits the maximum fetching rate for a given site.

Simply put, this represents the number of simultaneous parallel connections Googlebot may use to crawl the site, as well as the time it has to wait between the fetches. The crawl rate can go up and down based on a couple of factors:


  • Crawl health: if the site responds really quickly for a while, the limit goes up, meaning more connections can be used to crawl. If the site slows down or responds with server errors, the limit goes down and Googlebot crawls less.
  • Limit set in Search Console: website owners can reduce Googlebot's crawling of their site. Note that setting higher limits doesn't automatically increase crawling.
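
To make the mechanism concrete, here is a toy sketch in Python (my own illustration, not Google's actual code) of a fetcher that applies a crawl rate limit: it backs off when the server responds slowly or with 5xx errors, and speeds up again while the site stays healthy.

import time
import requests

delay = 1.0  # seconds between fetches; the "crawl rate limit" of this toy crawler

def fetch(url):
    """Fetch one URL, adjusting the delay based on crawl health."""
    global delay
    time.sleep(delay)
    try:
        resp = requests.get(url, timeout=10)
        if resp.status_code >= 500 or resp.elapsed.total_seconds() > 2.0:
            delay = min(delay * 2, 60)     # site struggling: back off
        else:
            delay = max(delay * 0.8, 0.1)  # site healthy: crawl a bit faster
        return resp
    except requests.RequestException:
        delay = min(delay * 2, 60)  # timeouts also slow the crawl down
        return None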


Crawl Demand

Even if the crawl rate limit isn't reached, if there's no demand from indexing, there will be low activity from Googlebot. The two factors that play a significant role in determining crawl demand are:


  • Popularity: URLs that are more popular on the Internet tend to be crawled more often to keep them fresher in our index.
  • Staleness: our systems attempt to prevent URLs from becoming stale in the index.


Additionally, site-wide events like site moves may trigger an increase in crawl demand in order to reindex the content under the new URLs. Taking crawl rate and crawl demand together we define crawl budget as the number of URLs Googlebot can and wants to crawl.

Factors Affecting Crawl Budget

According to our analysis, having many low-value-add URLs can negatively affect a site's crawling and indexing. We found that the low-value-add URLs fall into these categories, in order of significance:


  1. Faceted navigation and session identifiers
  2. On-site duplicate content
  3. Soft error pages
  4. Hacked pages
  5. Infinite spaces and proxies
  6. Low quality and spam content


Wasting server resources on pages like these will drain crawl activity from pages that do actually have value, which may cause a significant delay in discovering great content on a site.
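
As a concrete illustration of the first two categories, the sketch below (my own example, not a Google tool) normalizes a list of crawled URLs by stripping session identifiers and common tracking parameters; the shrinking count shows how many fetches were spent on duplicate content.

from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Parameters assumed to carry no distinct content (example list, adjust per site)
JUNK_PARAMS = {"sessionid", "sid", "utm_source", "utm_medium", "utm_campaign"}

def normalize(url):
    """Drop junk query parameters so duplicate URLs collapse together."""
    parts = urlparse(url)
    query = [(k, v) for k, v in parse_qsl(parts.query) if k.lower() not in JUNK_PARAMS]
    return urlunparse(parts._replace(query=urlencode(sorted(query))))

crawled = [
    "http://www.example.com/shoes?sessionid=abc123",
    "http://www.example.com/shoes?sessionid=def456",
    "http://www.example.com/shoes",
]
unique = {normalize(u) for u in crawled}
print(f"{len(crawled)} URLs crawled, {len(unique)} distinct page(s)")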

Top Questions

Crawling is the entry point for sites into Google's search results. Efficient crawling of a website helps with its indexing in Google Search.

Q: Does site speed affect my crawl budget? How about errors?

Answer: Making a site faster improves the users' experience while also increasing crawl rate. For Googlebot a speedy site is a sign of healthy servers, so it can get more content over the same number of connections. On the flip side, a significant number of 5xx errors or connection timeouts signal the opposite, and crawling slows down. We recommend paying attention to the Crawl Errors report in Search Console and keeping the number of server errors low.

Q: Is crawling a ranking factor?

Answer: An increased crawl rate will not necessarily lead to better positions in Search results. Google uses hundreds of signals to rank the results, and while crawling is necessary for being in the results, it's not a ranking signal.

Q: Do alternate URLs and embedded content count in the crawl budget?

Answer: Generally, any URL that Googlebot crawls will count towards a site's crawl budget. Alternate URLs, like AMP or hreflang, as well as embedded content, such as CSS and JavaScript, may have to be crawled and will consume a site's crawl budget. Similarly, long redirect chains may have a negative effect on crawling.

Q: Can I control Googlebot with the "crawl-delay" directive?

Answer: The non-standard "crawl-delay" robots.txt directive is not processed by Googlebot.

Q: Does the nofollow directive affect crawl budget?

Answer: It depends. Any URL that is crawled affects crawl budget, so even if your page marks a URL as nofollow it can still be crawled if another page on your site, or any page on the web, doesn't label the link as nofollow.

For information on how to optimize crawling of your site, take a look at our blogpost on optimizing crawling from 2009 that is still applicable. If you have questions, ask in the forums!

Google Drops Their Feature Phone Crawler & Error Report in Search Console

No Feature Phone Crawler & Error Report Available in Google Search Console
On November 29, 2016, Google announced goodbye to content keywords, and on Wednesday, November 30, 2016, wrote on the Webmaster Central Blog that it is dropping Google's feature-phone crawling and indexing in Search Console. Although there are probably a lot of feature phones around, I wonder how many people actually search with a feature phone.

No surprise there. This does not impact how Google crawls or indexes smartphone content, just feature phones. Feature phones are those old phones (like classic Nokia handsets) that let you access websites through a text-based interface.

Google said that “Limited mobile devices, "feature-phones", require a special form of markup or a transcoder for web content. Most websites don't provide feature-phone-compatible content in WAP/WML any more".
"We won't be using the feature-phone user-agents for crawling for search going forward."
This means Googlebot won't be using the feature-phone user-agents for crawling going forward, so you will no longer see those in Search Console logs.
Use "handheld" link annotations for dynamic serving of feature-phone content.
Some sites provide content for feature-phones through dynamic serving, based on the user's user-agent. To understand this configuration, make sure your desktop and smartphone pages have a self-referential alternate URL link for handheld (feature-phone) devices:  
<link rel="alternate" media="handheld" href="[current page URL]" /> 
 This is a change from our previous guidance of only using the "vary: user-agent" HTTP header. We've updated our documentation on making feature-phone pages accordingly. We hope adding this link element is possible on your side, and thank you for your help in this regard. We'll continue to show feature-phone URLs in search when we can recognize them, and when they're appropriate for users.
This means that if you do have feature-phone support on your website, you need to use "handheld" link annotations for dynamic serving of feature-phone content, as illustrated below.
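
As an illustration of this setup, here is a minimal sketch assuming the Flask package (the user-agent test is deliberately crude and the URLs hypothetical): it serves a lightweight version to feature-phone user-agents, sends the Vary: User-Agent header, and includes the self-referential handheld link element on the full version.

from flask import Flask, request

app = Flask(__name__)

@app.route("/page")
def page():
    ua = request.headers.get("User-Agent", "")
    # Crude feature-phone detection, for illustration only
    if "WAP" in ua or "MIDP" in ua:
        body = "<p>Lightweight feature-phone version</p>"
    else:
        # Self-referential handheld link, as recommended above
        body = ('<link rel="alternate" media="handheld" '
                'href="http://www.example.com/page" />'
                "<p>Full desktop/smartphone version</p>")
    resp = app.make_response(body)
    resp.headers["Vary"] = "User-Agent"  # tell caches the response depends on UA
    return resp

if __name__ == "__main__":
    app.run()  # serve locally for testing
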
We're retiring feature-phone tools in Search Console:
Without the feature-phone Googlebot, special sitemaps extensions for feature-phone, the Fetch as Google feature-phone options, and feature-phone crawl errors are no longer needed. We continue to support sitemaps and other sitemaps extensions (such as for videos or Google News), as well as the other Fetch as Google options in Search Console.
"We've worked to make these changes as minimal as possible. Most websites don't serve feature-phone content, and wouldn't be affected. If your site has been providing feature-phone content, we thank you for your help in bringing the Internet to feature-phone users worldwide!" Posted by John Mueller, Google Webmaster Trends Analyst.
This is a change from the previous feature-phone documentation, which recommended only the "Vary: User-Agent" HTTP header. The feature-phone tools will disappear from Search Console.