
With 2020 hanging above our heads much the same way that bricks don't, people start reflecting on what they achieved this year, what went wrong, and how they could improve. We're no different, but instead of choosing what went well or wrong ourselves, we picked the announcements on our @GoogleWMC Twitter account that users interacted with the most, and decided to reflect on those. 

We had launches that you appreciated a lot. For example, we announced at Google I/O that Googlebot is becoming evergreen, meaning that it's always going to use an up-to-date version of Chromium for rendering. We hope that this will make it easier for developers to create stunning, modern, and snappy JavaScript experiences, by tapping into the power of over 1000 new features and interfaces that are now supported.

Speaking of robots: together with the original author of the Robots Exclusion Protocol and other search engines, and with input from webmasters, we submitted an Internet Draft to the IETF in order to start standardizing the 25-year-old protocol. 



Like many Twitter users, we think it's an exciting project: it lays down the rules of crawling for good, even though it doesn't change anything for most sites.

But we didn't stop at one ancient protocol: we also rethought how we treat "nofollow" links to keep up with the evolution of the web. It was an announcement that seemed to be welcomed by most Twitter users, and for a good reason: having a "hint" model for rel="nofollow" may help us better reward those who create high-quality content, by serving even better results to our users.

One of the most tweeted – and also most humbling – moments this year was when we lost a part of our index, which caused Search Console to misbehave, and we also had rendering failures at roughly the same time. Since Google Search works like a well-oiled machine most of the time, we didn't have processes to quickly communicate issues to those who should know about them: webmasters. Lacking a proper process and channel to communicate these issues was a mistake, and we are still working hard to rectify it. One thing is clear: we need to do more on the critical communication side of things. 

We do like to communicate, in general: we shoot videos, we go to conferences, big and small, where we reach thousands of webmasters and SEOs, and in 2019 we extended our reach with the Webmaster Conference, which landed in 35 locations around the world in 12 languages. Not to mention the weather reports on our YouTube channel.

We hope you had a fantastic year and the new year will bring you even more success. If you need help with the latter, you can follow our blogs, @googlewmc on Twitter, or you could join us at a Webmaster Conference near you!

Posted by John Mueller, Cheese Connoisseur, and Gary the house elf 


Today we are announcing the launch of Publisher Center to help publishers more easily manage their content across news products at Google. Publisher Center merges two existing tools, Google News Producer and Google News Publisher Center, improving their user experience and functionality.
Publisher Center’s new features include a simpler way to manage your publication’s identity, like updating light and dark theme logos. It also provides an easier way for those that own multiple publications to organize and switch between them, particularly with improved permission settings that make it easier to collaborate with colleagues. Additionally, publishers can now point to the URLs for their website’s sections instead of RSS to configure sections in Google News. Content for News will now come directly from the web, just as it does for Search.
Publisher Center launches today in the existing four languages of the previous tools (English, Spanish, French, and German) and will expand to more languages soon. Learn more here.

People frequently come to Google Search looking to find information on the status of their packages. To make it easier to find, we have created a new package tracking feature that enables shipping companies to show people the status of their packages right on Search. Here’s an example of how this information may appear:
Image: package tracking search result



Through the package tracking Early Adopters Program, which is available in all countries, shipping companies can sign up to participate in this feature and give feedback on how to improve it. To take part in the program, a carrier will need to provide a RESTful JSON or XML API that returns the package tracking information. We can work with you to reuse an existing API or set up a new one.
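As a rough illustration only (the actual schema is shared with carriers during onboarding; the field names below are hypothetical), a carrier's JSON response might look something like this:

{
  "trackingNumber": "1Z999AA10123456784",
  "status": "IN_TRANSIT",
  "estimatedDeliveryDate": "2019-12-20",
  "lastEvent": {
    "timestamp": "2019-12-18T14:32:00Z",
    "location": "Oakland, CA, USA",
    "description": "Departed regional facility"
  }
}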

Interested in providing package tracking information to your customers? Please review the full eligibility requirements and fill out the interest form.

If you have any questions, let us know through the forum or on Twitter.

Search Console sends millions of messages every month. They’re our main way to let you know when your website has a new issue and to make you aware of updates and opportunities to improve your presence on Google.

To make working with messages more efficient in Search Console, we have been working to make messages an integral part of the product. As of today, messages will be available through a panel accessed easily by clicking the bell icon at the top of any page in Search Console. The main difference from the old interface is that now you’ll have access to your messages throughout the product - no need to leave your reports.

You’ll see a notification with the number of new messages on top of the bell icon, as shown below. Click it to see a panel with a list of messages for your site; you can mark one or more messages as read to clear the clutter.

Image: New Search Console Messages

We are also introducing a new way to make messages more actionable by categorizing them into several types, such as Coverage, Enhancement types, Performance, and others. This will make it easier for you to find information about a specific issue.

Image: Categories in Search Console messages

We are also introducing the capability for site owners to access all the messages ever sent to the site, regardless of when the owners were verified and when the messages were sent. Now, when users gain access to a new site, they can see the messages the site has received in the past, which should help new owners understand the context for that property.

The message panel shows messages only from May 23, 2019 onward. Messages sent before that date can be viewed only in your personal email or in the legacy message list. Your old messages will be available in the old interface for the time being; you can find them under the “Legacy tools & reports” section in the sidebar.

We hope this new feature improves your workflow and puts all the information you need at your fingertips to make better and faster decisions. If you have any feedback please let us know through the forum or via Twitter.

Posted by Maya Mamo, Haymi Shalom & Yuval Kurtser, Search Console engineering team

The ecosystem around and in Google Search has continued to evolve since the first episode of Google Search News. With this video series, we aim to give regular updates on Google Search, in particular for SEOs, publishers, webmasters -- essentially anyone who's creating content with the goal of making it discoverable in search. 

In this episode, we cover:

We hope you find these updates useful! Let us know in the video comments if there's something we can improve on.

We recently announced that in addition to schema.org markup, product data feeds submitted through Google Merchant Center and Google Manufacturer Center will be used to enrich Google Search shopping journeys. Today, we are announcing a new Search Appearance in the Search Console Performance report, which captures search stats for Product rich results on Google Search.

People come to Google to discover, research, and decide which products and brands to purchase. In order to provide helpful product information to shoppers, Google shows rich product data like product descriptions, price and availability within the search results.
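As a simplified illustration, this kind of rich product data generally comes from schema.org Product markup such as the following (all values here are made up):

<script type="application/ld+json">
{
  "@context": "https://schema.org/",
  "@type": "Product",
  "name": "Acme Trail Running Shoe",
  "description": "Lightweight trail running shoe with a breathable mesh upper.",
  "offers": {
    "@type": "Offer",
    "price": "79.99",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  }
}
</script>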

Image: Rich product information on Google Search results

Website owners need to understand the impact of these rich results. The Google Search Console Performance report provides key metrics like clicks and impressions to help webmasters understand and optimize the performance of their website results on Google Search. These metrics can further be segmented by device, geography and queries.

If your website is eligible to appear on Product search results, you’ll find a new Search Appearance type called “Product results”, with which you can segment your traffic to analyze your search performance.

Image: Search Appearance of “Product results”

The new ‘Product results’ search appearance (shown in the screenshot above) will help website owners understand their search performance for product rich results. For example, they’ll be able to answer the following questions:

  • How much traffic comes from experiences with rich data like price and availability?
  • How does shopping traffic change over time, and for what shopping search queries is the website shown?

If you have any questions on this (or other) Search Console reports, let us know through the forum or on Twitter.

Posted by Siddhartha Garg, Product Manager

A fast web experience has long been an important user experience factor that we have promoted and advocated for. To help site owners on this quest we showed a preview of the Speed report in Search Console at Google I/O 2019. Since then, we've been iterating on all the great feedback from the beta testers and, starting today, are excited to begin public rollout!

Discovering potential user experience problems can be hard, especially on sites with many pages. To help site owners, the Speed report automatically sorts groups of similar URLs into "Fast", "Moderate", and "Slow" buckets. The data is based on the Chrome User Experience Report, which provides user experience metrics for how real-world Chrome users experience popular destinations on the web.

Image: Speed report overview
The report classifies URLs by speed and the issue that causes any slowdowns. Drill down on a specific issue to see examples of slow URLs to help you prioritize performance improvements for them. To get a better sense of what type of optimization can be performed for a specific URL, the report links to the PageSpeed Insights tool, which provides information on specific optimization opportunities.

You should use this report both for monitoring your performance over time and for tracking fixes you made to your website. If you fix an issue, use the report to track whether users experienced a performance improvement when browsing the fixed version of your website.

To help you understand how your site is performing, you can also see what types of URLs are doing better by checking the moderate and fast buckets.

Image: URLs in the fast bucket

This new report is classified as "experimental" as we plan to continue to iterate and improve the metrics and recommendations for site owners. We will monitor feedback submitted directly through the report and in the user forum to ensure that it is useful and helpful, so please let us know how we are doing.

Posted By Sion Schori & Amir Rachum, Software Engineers

Today, we're announcing that Site Kit is available for everyone to install from the WordPress plugin directory. Site Kit is Google's official WordPress plugin — it gives you insights on how people find and use your site and how to improve and monetize your content, with data from multiple Google tools. Learn more about Site Kit on the official Site Kit website.

Who is Site Kit for?


Site Kit makes it easy for WordPress site owners to understand how their site is doing and what to do next. If you're a WordPress site owner, Site Kit offers you:
  • Easy to verify site ownership through Search Console — no need to paste a code snippet on your site to prove you're an owner.
  • Convenient and easy access to relevant stats directly in your WordPress dashboard. We save you time — see the key information right when you sign in.
  • Cross-product insights — Google Search Console, Google Analytics, PageSpeed Insights, AdSense — are combined into a single, intuitive dashboard. We help you make informed decisions, quickly.
  • No source code editing. This is the easiest way to install and manage Google products on your site in just a few clicks.
If you are a developer or an agency working on WordPress sites for clients, Site Kit can make your life easier:
  • You’ll get aggregated insights from Google products, distilled in a dashboard that your clients or other teams can easily access. No need to copy data from multiple products to compile reports for clients.
  • The site performance stats and improvement recommendations come directly from Google — your customers will be getting the latest best practices recommended by Google products.
  • Site Kit provides roles and permissions to help you manage access to the site’s data and make sure only the relevant people can see stats from Google products.
If you work on a plugin or at a hosting provider, Site Kit provides a scalable, easy way for your users to provision and access key Google tools and metrics:
  • Easy connection to official Google tools. Your customers don’t have to edit the source code to set up Google tools.
  • Added value for your platform. Your customers get important information about how their site is performing right in your own dashboard.
  • Flexible UI. You can pull stats from the plugin dashboard and display stats natively in your platform’s UI.
Site Kit’s main dashboard helps you see the most important info about how your site is doing at a glance: how people are finding your site (traffic sources), your most popular pages, and what people search for to find your site. For more, check out the Find your way around Site Kit guide in our docs.

Get started with Site Kit

To get started, install the plugin from the WordPress plugin directory. Here are some recommendations on how you can make the most of Site Kit:
  • Review the main dashboard on a weekly basis and check for any significant changes in how people are finding your site. Are people finding your site from a new place?
  • Keep track of how your recent posts are doing by checking the individual page reports.
  • Compare the top performing pages and how people found them. Is a particular topic or product attracting more visitors from social channels?
  • Set up new Google services to get more interesting insights — for example, if you enable PageSpeed Insights and Analytics, you’ll be able to see whether page load time affects bounce rate.
If you are attending WordCamp US in Saint Louis, stop by to see a demo of Site Kit and talk to the team. We're also happy to answer your questions in the Site Kit support forum.

We are happy to let you know that the Google Webmaster Conference is coming to Tel Aviv, Israel this winter!

If you are looking for an opportunity to socialize with the Search engineering team, or to hear from us what we’ve been working on, here is your chance! We’re planning lots of interesting content: John Mueller on Search, Andre Naumann on Trust & Safety, Daniel Waisberg on structured data, and many more speakers from the Search Console team.

The event will take place on December 4th, between 15:00 and 19:00 in the Google Tel Aviv offices. Learn more and apply for a spot at our website.



Get ready, and see you in Tel Aviv! And if you miss this one, don’t worry, we’ll have more events around the world next year. To be the first to learn about new locations, make sure to follow Google Webmasters on Twitter.

Posted by Daniel Waisberg, Search Advocate.

We are excited to introduce our newest video series: “Search For Beginners”! The series was created primarily to help new webmasters. It is also for anyone with an interest in Search or anyone who is still learning about the Web and how to manage their online presence.

We love to see the webmaster community grow! Every day, countless new webmasters take their first steps in learning how Search works and how to make their websites perform well and be discoverable on Search. We understand that it can sometimes be challenging or even overwhelming to start with our existing content without prior knowledge or a basic understanding of the web. We find that the basic videos on our YouTube channels are the ones with the most views. At the same time, advanced webmasters also see the need for content that can be sent to clients or stakeholders to help explain important concepts in managing an online presence.

We want to help all webmasters succeed, regardless of whether you have been managing websites for many years or you’ve just started out yesterday. We want to do more to help the new webmasters and this video series will hopefully help us achieve that.

Introduction to the series:

Episode 1:

The “Search For Beginners” video series covers basic online presence topics ranging from ‘Do you need a website?’ and ‘What are the goals for your website?’ to more organic search-related topics such as ‘How does Google Search work?’, ‘How to change description line’, or ‘How to change wrong address information on Google’. In fact, we get asked these questions frequently in forums, social channels, and at events around the world! The videos are fully animated and are in English, with subtitles available in Spanish, Portuguese, Korean, Chinese, Indonesian, Italian, Japanese, and English. We are working on more, so please stay tuned!

And if you consider yourself a more experienced user, please feel free to use these videos to support your pitches or to explain things to your clients. If you want to share any ideas or learnings, please leave them in the comment section of each video so that others can benefit from your knowledge and experience.

Follow us on Twitter and subscribe on YouTube for the upcoming videos! We will be adding new videos in this series to this playlist about every two weeks!

"The end of an era"... that's the title Microsoft used to announce it will stop supporting Flash in their web browsers. Flash is disabled by default in Chrome (starting in version 76), Microsoft Edge, and FireFox 69. Soon, we'll also move on from Flash in indexing for Google Search.

Flash was the answer to the boring static web, with rich animations, media, and actions. It was a prolific technology that inspired many new content creators on the web. It was everywhere. The Flash runtime, which plays Flash content, was installed 500 million times in the second half of 2013.

I still remember my son playing an endless number of Flash games until my wife yelled at him. It's time to go to bed, son. Hey Flash, it's your turn to go to bed.

Google Search will stop supporting Flash later this year. In web pages that contain Flash content, Google Search will ignore the Flash content. Google Search will also stop indexing standalone SWF files. Most users and websites won't see any impact from this change.

Flash, you inspired the web. Now, there are web standards like HTML5 to continue your legacy.

Jalgayo /tʃɑlˈgɑjɔ/ (goodbye in Korean), Flash.

Video is an important and growing medium used to consume information online, and we want to make it as easy as possible for people to find useful and interesting videos on Google. Today, we’re introducing two new tools to help you understand your videos’ performance in Search and identify opportunities to improve your video markup.

There are three main ways people can see videos on Google Search today: on the main Search page; on the videos Search tab; and in Discover:

Left to right: Videos on the main search page; video search; and Discover.


Video Enhancement Report

Structured data can help search engines understand when videos appear on a page, so they can be displayed with a rich visual treatment, including accurate information on a video’s duration, upload date, and other metadata, as well as previews. This in turn helps users better understand what they’ll find in your video before they click.

A new report for “Videos” is now available in Search Console for sites that use structured data to annotate videos. The report allows you to see any errors and warnings for markup implemented on your site. When you fix an issue, you can use the report to validate whether it was resolved by having your affected pages re-crawled. Learn more about the rich result status reports.


Video Appearances in Performance Report

The Search Console performance report already includes an option to see the performance of your video tab search results (type = video). We are excited to share that we’ve extended our support for videos, so you can now also see the performance of your videos in the main Search results tab (type = web) and in Discover using the new “Videos” appearance. Content can appear with the video appearance if your page uses VideoObject structured data, or if Google uses other signals to detect that there is a video on the page.
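For reference, a minimal VideoObject markup looks something like this (values made up; see the developer documentation for the full set of supported properties):

<script type="application/ld+json">
{
  "@context": "https://schema.org/",
  "@type": "VideoObject",
  "name": "How to tie a bowline knot",
  "description": "A short tutorial showing how to tie a bowline knot.",
  "thumbnailUrl": "https://example.com/knot-thumbnail.jpg",
  "uploadDate": "2019-07-01",
  "duration": "PT1M45S"
}
</script>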



These new tools should make it easier to understand how your videos perform on Search and to identify and fix video issues. We also recommend you follow these video best practices. If you have any questions, be sure to post in our forum.

Posted by Danielle Marshak, Product Manager

Googlebot uses a Chrome-based browser to render webpages, as we announced at Google I/O earlier this year. As part of this, in December 2019 we'll update Googlebot's user agent strings to reflect the new browser version, and periodically update the version numbers to match Chrome updates in Googlebot.

See Google crawlers (user agents) and Make sure Google can index JavaScript for background reading about user agent strings and rendering.
Googlebot user agents today
Mobile:
Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2272.96 Mobile Safari/537.36 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)

Desktop:
Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)

OR

Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko; compatible; Googlebot/2.1; +http://www.google.com/bot.html) Safari/537.36

The new evergreen Googlebot and its user agent
In December we'll start periodically updating the above user agent strings to reflect the version of Chrome used in Googlebot. In the following user agent strings, "W.X.Y.Z" will be substituted with the Chrome version we're using. For example, instead of W.X.Y.Z you'll see something similar to "76.0.3809.100". This version number will update on a regular basis.

Mobile:
Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/W.X.Y.Z Mobile Safari/537.36 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)

Desktop:
Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)

OR

Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko; compatible; Googlebot/2.1; +http://www.google.com/bot.html) Chrome/W.X.Y.Z Safari/537.36

How to test your site

We've run an evaluation and are confident that most websites will not be affected by the change.
Sites that follow our recommendations to use feature detection and progressive enhancement instead of user agent sniffing should continue to work without any changes.

If your site looks for a specific user agent, it may be affected. You should use feature detection instead of user agent sniffing. If you cannot use feature detection and need to detect Googlebot via the user agent, then look for "Googlebot" within the user agent.
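As a minimal sketch of that last point (shown here in Python; how you read the header depends on your web framework), a simple substring check is enough, and it keeps working as the Chrome version number changes:

def is_googlebot(user_agent: str) -> bool:
    # The "Googlebot" token appears in both the current and the new evergreen
    # user agent strings, so this check survives Chrome version updates,
    # unlike matching a full string such as "Chrome/41.0.2272.96".
    return "Googlebot" in user_agent

# Example with the new mobile user agent string:
ua = ("Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) "
      "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/76.0.3809.100 "
      "Mobile Safari/537.36 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)")
print(is_googlebot(ua))  # True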

Some common issues we saw while evaluating this change include:

  • Pages that present an error message instead of normal page contents. For example, a page may assume Googlebot is a user with an ad-blocker, and accidentally prevent it from accessing page contents.
  • Pages that redirect to a document that is blocked by robots.txt or marked noindex.

If you're not sure if your site is affected or not, you can try loading your webpage in your browser using the new Googlebot user agent. These instructions show how to override your user agent in Chrome.

If you have any questions, be sure to reach out to our webmaster help community, join our webmaster office hours on YouTube or follow us on Twitter.

Earlier this year we announced a series of Webmaster Conferences being held around the world to help website creators understand how to optimize their sites for Search. We’ve already held 22 of these events, with more planned through the end of the year. Building on the success of these events so far, we’re hosting a product summit version of this event at the Google Headquarters in Mountain View on Monday November 4th.


Photos from the Webmaster Conference in Kuala Lumpur, earlier this year.

This event is designed to facilitate an open dialog between the webmaster and SEO community and Search product teams. This one-day event will include talks from Search product managers, Q&A sessions, and a product fair giving attendees the opportunity to have direct conversations with product managers. Attendees will learn from the people building Search about how they think about the evolution of the platform, and have the opportunity to share feedback about the needs of the community.

We also realize that not everyone will be able to make this event in person, so we plan to share much of the content and feedback after the event.

If you’re interested and able to make it, we encourage you to apply today as space is limited. Complete details about the event and the application process can be found on the event registration site. And as always, you can check out our other upcoming events on the general Webmaster Conference site, the Google Webmasters event calendar, or follow our blogs and @googlewmc on Twitter!

Posted by John Mueller, Google Switzerland

The world of search is constantly evolving. New tools, opportunities, and features are regularly arriving, sometimes existing things change, and sometimes we say goodbye to some things to make way for the new. To help you stay on top of things, we've started a new YouTube series called Google Search News.

With Google Search News, we want to give you a regular & short summary of what's been happening around Google Search, specifically for SEOs, publishers, developers, and webmasters. The first episode is out now, so check it out. 

(The first episode, now on your screen)

In this first episode, we cover:

We plan to make these updates regularly, and adjust the format over time as needed. Let us know what you think in the video comments!

Google uses content previews, including text snippets and other media, to help people decide whether a result is relevant to their query. The type of preview shown depends on many factors, including the type of content a person is looking for and the kind of device they're viewing it on.

For instance, if you look for recipe results on Google, you may see thumbnail images and user ratings--things that may be more helpful than text snippets when it comes to deciding what you want to eat. Alternatively, perhaps you're looking for a concert nearby, and are able to check out the events directly in the search results. These are made possible by publishers marking up their pages with structured data.

Google automatically generates previews in a way intended to help a user understand why the results shown are relevant to their search and why the user would want to visit the linked pages. However, we recognize that site owners may wish to independently adjust the extent of their preview content in search results. To make it easier for individual websites to define how much or which text should be available for snippeting and the extent to which other media should be included in their previews, we're now introducing several new settings for webmasters. 

Letting Google know about your snippet and content preview preferences

Previously, it was only possible to allow a textual snippet or to not allow one. We're now introducing a set of methods that allow more fine-grained configuration of the preview content shown for your pages. This is done through two types of new settings: a set of robots meta tags and an HTML attribute. 

Using robots meta tags

The robots meta tag is added to an HTML page's <head>, or specified via the x-robots-tag HTTP header. The robots meta tags addressing the preview content for a page are:

  • "nosnippet"
    This is an existing option to specify that you don't want any textual snippet shown for this page. 
  • "max-snippet:[number]"
    New! Specify a maximum text-length, in characters, of a snippet for your page.
  • "max-video-preview:[number]"
    New! Specify a maximum duration in seconds of an animated video preview.
  • "max-image-preview:[setting]"
    New! Specify a maximum size of image preview to be shown for images on this page, using either "none", "standard", or "large".

They can be combined, for example:

<meta name="robots" content="max-snippet:50, max-image-preview:large">
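The same directives can also be served via the x-robots-tag HTTP header mentioned above, which is useful for non-HTML resources or for setting preferences at the server level; for example:

X-Robots-Tag: max-snippet:50, max-image-preview:large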

Preview settings from these meta tags will become effective in mid-to-late October 2019 and may take about a week for the global rollout to complete.

Using the new data-nosnippet HTML attribute

A new way to help limit which part of a page is eligible to be shown as a snippet is the "data-nosnippet" HTML attribute on span, div, and section elements. With this, you can prevent that part of an HTML page from being shown within the textual snippet in search results.

For example:

<p><span data-nosnippet>Harry Houdini</span> is undoubtedly the most famous magician ever to live.</p>

The data-nosnippet HTML attribute will start affecting presentation on Google products later this year. Learn more in our developer documentation for the robots meta tag, x-robots-tag, and data-nosnippet.

A note about rich results and featured snippets

Content in structured data is eligible for display as rich results in search. These kinds of results do not conform to limits declared in the above meta robots settings, but rather, can be addressed with much greater specificity by limiting or modifying the content provided in the structured data itself. For example, if a recipe is included in structured data, the contents of that structured data may be presented in a recipe carousel in the search results. Similarly, if an event is marked up with structured data, it may be presented as such in the search results. To limit that presentation, a publisher can limit the amount and type of content in the structured data. 

Some special features on Search depend on the availability of preview content, so limiting your previews may prevent your content from appearing in these areas. Featured snippets, for example, require a certain minimum number of characters to be displayed. This can vary by language, which is why there is no exact max-snippet length we can provide to ensure appearing in this feature. Those who do not wish to have content appear as featured snippets can experiment with lower max-snippet lengths. Those who want a guaranteed way to opt out of featured snippets should use nosnippet.

The AMP Format

The AMP format comes with certain benefits, including eligibility for more prominent presentation of thumbnail images in search results and in the Google Discover feed. These characteristics have been shown to drive more traffic to publishers’ articles. However, publishers who do not want Google to use larger thumbnail images when their AMP pages are presented in search and Discover can use the above meta robots settings to specify max-image-preview of “standard” or “none.”

These new options are available to content owners worldwide and will operate the same for results we display globally. We hope they make it easier for you to optimize the value you get from Search and achieve your business goals. For more information, check out our developer documentation on meta tags. Should you have any questions, feel free to reach out to us, or drop by our webmaster help forums.


We analyzed our user feedback, and today we would like to announce an improvement to the Search Console Performance report based on users' #1 feature request - improved data freshness!

The Performance report helps webmasters and site owners better understand how their site performs on Google search, and answer questions such as:
  • General stats: How much traffic did my site get from Search and Discover?
  • Search queries: What are my site’s top and trending search queries?
  • Top content: What are my site’s most successful pages on Google search? 
  • Site audience: Which countries do my visitors come from? Which devices do they use - is it mostly mobile?
  • Formats: What search result formats does my site get (AMP, recipes, etc.)?

With the new fresh data, users can now see data that is less than a day old - a significant improvement over the previous delay of a few days.

We hope this improved data freshness allows you to better monitor and track your site’s performance and addresses some important needs such as:
  • Seeing your weekend performance on Monday morning - no need to wait until Wednesday.
  • Checking on your site’s stats first thing in the morning after, or even during, important days such as holidays, global events, and shopping days.
  • Checking if your site's traffic rebounds soon after fixing an important technical issue.

Image: Fresh data in the Search Performance report

In addition, we updated the report to clearly communicate the data timezone (Pacific time zone). This is useful when you’d like to interpret the data compared to your local time zone or integrate it with other sources such as Google Analytics.

Image: Performance report date picker

Each fresh data point will be replaced with the final data point after a few days. It is expected that from time to time the fresh data might change a bit before being finalized.

The Search Analytics API does not support fresh data yet. In addition, fresh data is not available on the Discover performance report. As a result, properties that are eligible for Discover performance report will not see fresh data in their Overview report. We hope to address these items in the future.
Exporting performance data over time

We also heard your feedback about wanting a simple way to explore and export your performance over time. Starting today, this is possible. Simply choose ‘dates’ in the table below the graph, select the desired time frame, and explore the data in Search Console or export the chart. We hope that this new feature will help you further explore your performance trends and changes over time.

Image: Performance report now with ‘dates’ table

In conclusion

We hope that this fresh data will help you better monitor your site’s performance and identify trends, patterns, and interesting changes much closer to when they happen. In addition, we hope that the new date table dimension will assist you in exploring performance trends and changes over time. If you have any questions or concerns, please reach out on the Webmaster Help Forum or on Twitter.

Posted by Ziv Hodak, Search Console product manager

Search results that are enhanced by review rich results can be extremely helpful when searching for products or services (the scores and/or “stars” you sometimes see alongside search results).
Image: Review stars example in search results

To make them more helpful and meaningful, we are now introducing algorithmic updates to reviews in rich results. This also addresses some of the invalid or misleading implementations webmasters have flagged to us.

Focus on schema types that lend themselves to reviews

While, technically, you can attach review markup to any schema type, for many types displaying star reviews does not add much value for the user. With this change, we’re limiting the pool of schema types that can potentially trigger review rich results in search. Specifically, we’ll only display reviews with those types (and their respective subtypes):

Self-serving reviews aren't allowed for LocalBusiness and Organization

Reviews that can be perceived as “self-serving” aren't in the best interest of users. We call reviews “self-serving” when a review about entity A is placed on the website of entity A - either directly in their markup or via an embedded 3rd party widget. That’s why, with this change, we’re not going to display review rich results anymore for the schema types LocalBusiness and Organization (and their subtypes) in cases when the entity being reviewed controls the reviews themselves.
Updated on September 18, 2019: To explain more, in the past, an entity like a business or an organization could add review markup about themselves to their home page or another page and often cause a review snippet to show for that page. That markup could have been added directly by the entity or embedded through the use of a third-party widget.
We consider this “self-serving” because the entity itself has chosen to add the markup to its own pages, about its own business or organization.
Self-serving reviews are no longer displayed for businesses and organizations (the LocalBusiness and Organization schema types). For example, we will no longer display rich review snippets for how people have reviewed a business, if those reviews are considered self-serving.

Reviews are allowed and displayed for other schema types listed in the documentation. For example, a cooking site might use markup for recipes to summarize its visitor reviews. In turn, we might include this rich review markup for when those recipes appear in search.
FAQ

What if I'm using a third-party widget to display reviews about my business?
Google Search won't display review snippets for those pages. Embedding a third-party widget is seen as controlling the process of linking reviews.

Do I need to remove self-serving reviews from LocalBusiness or Organization?
No, you don't need to remove them. Google Search just won't display review snippets for those pages anymore.

Will I get a manual action for having self-serving reviews on my site?
You won’t get a manual action just for this. However, we recommend making sure that your structured data matches our guidelines.

Does this update affect my Google My Business listing/profile?
No, Google My Business is not affected, as this update only relates to organic Search.

Will sites that gather reviews about other organizations be affected?
No, that’s unchanged. Sites that gather reviews can show up with review snippets (for their reviews of other organizations) in search results.

Does this update apply to AggregateRating too?
Yes. It applies to Review and AggregateRating.

How do I report if a self-serving review is still appearing in search results?
We’re considering creating a special form for this, if needed. We're slowly rolling out this change, so you may still see some cases of review stars where they shouldn't be.

Add the name of the item that's being reviewed

With this update, the name property is now required, so you'll want to make sure that you specify the name of the item that's being reviewed.
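For illustration, here's what review markup including the required name property might look like on a recipe page like the cooking-site example above (all values made up):

<script type="application/ld+json">
{
  "@context": "https://schema.org/",
  "@type": "Recipe",
  "name": "Classic Banana Bread",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.7",
    "ratingCount": "213"
  }
}
</script>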
This update will help deliver a much more meaningful review experience for users, while requiring little to no changes on the part of most webmasters. You can find all those updates documented in our developer documentation. If you have any questions, feel free to come to our webmaster forums!

Nearly 15 years ago, the nofollow attribute was introduced as a means to help fight comment spam. It also quickly became one of Google’s recommended methods for flagging advertising-related or sponsored links. The web has evolved since nofollow was introduced in 2005 and it’s time for nofollow to evolve as well.
Today, we’re announcing two new link attributes that provide webmasters with additional ways to identify to Google Search the nature of particular links. These, along with nofollow, are summarized below:

rel="sponsored": Use the sponsored attribute to identify links on your site that were created as part of advertisements, sponsorships or other compensation agreements.

rel="ugc": UGC stands for User Generated Content, and the ugc attribute value is recommended for links within user generated content, such as comments and forum posts.

rel="nofollow": Use this attribute for cases where you want to link to a page but don’t want to imply any type of endorsement, including passing along ranking credit to another page.

When nofollow was introduced, Google would not count any link marked this way as a signal to use within our search algorithms. This has now changed. All the link attributes -- sponsored, UGC and nofollow -- are treated as hints about which links to consider or exclude within Search. We’ll use these hints -- along with other signals -- as a way to better understand how to appropriately analyze and use links within our systems.
Why not completely ignore such links, as had been the case with nofollow? Links contain valuable information that can help us improve search, such as how the words within links describe content they point at. Looking at all the links we encounter can also help us better understand unnatural linking patterns. By shifting to a hint model, we no longer lose this important information, while still allowing site owners to indicate that some links shouldn’t be given the weight of a first-party endorsement.
We know these new attributes will generate questions, so here’s a FAQ that we hope covers most of those.

Do I need to change my existing nofollows?
No. If you use nofollow now as a way to block sponsored links, or to signify that you don’t vouch for a page you link to, that will continue to be supported. There’s absolutely no need to change any nofollow links that you already have.

Can I use more than one rel value on a link?
Yes, you can use more than one rel value on a link. For example, rel="ugc sponsored" is a perfectly valid attribute which hints that the link came from user-generated content and is sponsored. It’s also valid to use nofollow with the new attributes -- such as rel="nofollow ugc" -- if you wish to be backwards-compatible with services that don’t support the new attributes.

If I use nofollow for ads or sponsored links, do I need to change those?
No. You can keep using nofollow as a method for flagging such links to avoid possible link scheme penalties. You don't need to change any existing markup. If you have systems that append this to new links, they can continue to do so. However, we recommend switching over to rel="sponsored" if or when it is convenient.

Do I still need to flag ad or sponsored links?
Yes. If you want to avoid a possible link scheme action, use rel="sponsored" or rel="nofollow" to flag these links. We prefer the use of "sponsored," but either is fine and will be treated the same, for this purpose.

What happens if I use the wrong attribute on a link?
There’s no wrong attribute except in the case of sponsored links. If you flag a UGC link or a non-ad link as “sponsored,” we’ll see that hint but the impact -- if any at all -- would be at most that we might not count the link as a credit for another page. In this regard, it’s no different than the status quo of many UGC and non-ad links already marked as nofollow.
It is an issue going the opposite way. Any link that is clearly an ad or sponsored should use “sponsored” or “nofollow,” as described above. Using “sponsored” is preferred, but “nofollow” is acceptable.

Why should I bother using any of these new attributes?
Using the new attributes allows us to better process links for analysis of the web. That can include your own content, if people who link to you make use of these attributes.

Won’t changing to a “hint” approach encourage link spam in comments and UGC content?
Many sites that allow third parties to contribute to content already deter link spam in a variety of ways, including moderation tools that can be integrated into many blogging platforms and human review. The link attributes of “ugc” and “nofollow” will continue to be a further deterrent. In most cases, the move to a hint model won’t change the nature of how we treat such links. We’ll generally treat them as we did with nofollow before and not consider them for ranking purposes. We will still continue to carefully assess how to use links within Search, just as we always have and as we’ve had to do for situations where no attributions were provided.

When do these attributes and changes go into effect?
All the link attributes, sponsored, ugc and nofollow, now work today as hints for us to incorporate for ranking purposes. For crawling and indexing purposes, nofollow will become a hint as of March 1, 2020. Those depending on nofollow solely to block a page from being indexed (which was never recommended) should use one of the much more robust mechanisms listed on our Learn how to block URLs from Google help page.
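For example, the robots meta tag (or the equivalent x-robots-tag HTTP header) reliably keeps a page out of the index, provided the page isn't also blocked from crawling in robots.txt:

<meta name="robots" content="noindex">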

Posted by Danny Sullivan and Gary

Today we are reaching another important milestone in our graduation journey: we are saying goodbye to many old Search Console reports, including the home and dashboard pages 👋. Those pages are part of the history of the web; they were viewed over a billion times by webmasters from millions of websites. These pages helped site owners and webmasters monitor and improve their performance on Google Search for over a decade.

From now on, if you try to access the old homepage or dashboard you’ll be redirected to the relevant Search Console pages. There are only a few reports that will still be available on the old interface for now - check the Legacy tools and reports in the Help Center. We're continuing to work on making the insights from these reports available in the new Search Console, so stay tuned!

Below is our last tribute to them, a picture of the team with the old Search Console in the background 😍. But we thought you might also have something to share, maybe some beautiful memories you have with the home and dashboard pages below (or any old Search Console page) - so we’ll be monitoring #SCmemories if you want to share your stories with us on Twitter.

Image: the team saying goodbye to the old Search Console

Image: old Search Console dashboard

Thank you for working together with us on making the web better - and see you at the new Search Console! If you have any feedback, let us know through the Webmasters community.

Posted by Hillel Maoz on behalf of the Search Console team.

Back in February, we announced domain-wide data in Search Console, to give site owners a comprehensive view of their site, removing the need to switch between different properties to get the full picture of your data.

We’ve seen lots of positive reactions from users who verified domain properties. Common feedback we heard is that, before moving to domain properties, users were underestimating their traffic, and the new method helped them understand their aggregated clicks and impressions data more effectively. When we asked Domain property users about their satisfaction with the feature, almost all of them reported being satisfied. Furthermore, most of these users reported that they find domain properties more useful than the traditional URL prefix properties.

However, changing a DNS record is not always trivial, especially for small and medium businesses. We heard that the main challenge preventing site owners from switching to Domain properties is getting their domain verified. To help with this challenge, we collaborated with various domain name registrars to automate part of the verification flow. The flow will guide you through the necessary steps needed to update your registrar configuration so that your DNS record includes the verification token we provide. This will make the verification process a lot easier.
How to use Auto-DNS verification

To verify your domain using the new flow, click ‘add property’ from the property selector (the drop-down at the top of the Search Console sidebar). Then, choose the ‘Domain’ option. The system will guide you through a series of steps, including a visit to the registrar site where you need to apply changes - there will be fewer and easier steps than before for you to go through. You can learn more about verifying your site at the Help Center.
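Once the flow completes, your domain's DNS configuration will include a TXT record carrying the verification token; conceptually, the record looks something like this (the token here is made up):

example.com.  IN  TXT  "google-site-verification=kq83mPBAqcLBp9nM2hDrGoqNYqnYq3vd"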


Image: Auto-DNS verification flow 

We hope you can use this new capability and gain ownership of your Domain property today. As always, please let us know if there is anything we can do to improve via the product feedback button, the Webmasters community or mention us on Twitter.

Posted by Ruty Mundel, Search Console engineering team

With the move to the new Search Console, we've decided to clean up some parts of the Search Console API as well. In the Search Analytics API, going forward we'll no longer support these Android app search appearance types:

  • Is Install
  • Is App Universal
  • Is Opened

Since these appearance types are no longer used in the UI, they haven't been populated with data recently. Going forward, we won't be showing these types at all through the API. 
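If you'd like to check which search appearance types your property still receives, you can group by the searchAppearance dimension in a Search Analytics query. Here's a minimal sketch using the Python client library; it assumes you've already built an authorized service object for the Webmasters API v3:

# Assumes `service` is an authorized googleapiclient discovery object
# for the Webmasters API v3 (google-api-python-client).
request_body = {
    "startDate": "2019-07-01",
    "endDate": "2019-07-31",
    # searchAppearance is queried on its own, not combined with other dimensions.
    "dimensions": ["searchAppearance"],
}
response = service.searchanalytics().query(
    siteUrl="https://www.example.com/", body=request_body).execute()
for row in response.get("rows", []):
    print(row["keys"][0], row["clicks"], row["impressions"])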

Additionally, for the Sitemaps API, we're no longer populating data on indexing status of submitted sitemap files in the "Indexed" field.

We're still committed to the Search Console API. In particular, we're working on updating the Search Console API to the new Search Console. We don't have any specific timeframes to share at the moment, but stay tuned to find out more!


We love to help folks make awesome websites. For a while now, we've been answering questions from developers, site owners, webmasters, and of course SEOs in our office hours hangouts, in the help forums, and at events. Recently, we've (re-)started answering your questions in a video series called #AskGoogleWebmasters on our YouTube channel.


(At Google, behind the scenes, during the recording of one of the episodes.)

When we started with the webmaster office-hours back in 2012, we thought we'd be able to get through all questions within a few months, or perhaps a year. Well ... the questions still haven't stopped -- it's great to see such engagement when it comes to making great websites! 

To help make it a bit easier to find answers, we've started producing shorter videos answering individual questions. Some of the questions may seem fairly trivial to you, others don't always have simple answers, but all of them are worth answering.

Curious about the first episodes? Check out the videos below and the playlist for all episodes!

To ask a question, just use the hashtag #AskGoogleWebmasters on Twitter. While we can't get to all submissions, we regularly pick up the questions there to use in future episodes. We pick questions primarily about websites & websearch, which are relevant to many sites. Want to stay in the loop? Make sure to subscribe to our channel. If you'd like to discuss the questions or other important webmaster topics, feel free to drop by our webmaster help forums and chat with the awesome experts there. 


Most of the time, our search engine runs properly. Our teams work hard to prevent technical issues that could affect our users who are searching the web, or webmasters whose sites we index and serve to users. Similarly, the underlying systems that we use to power the search engine also run as intended most of the time. When small disruptions happen, they are largely not visible to anyone except our teams who ensure that our products are up and running. However, like all complex systems, sometimes larger outages can occur, which may lead to disruptions for both users and website creators.

In the last few months, such a situation occurred with our indexing systems, which had a ripple effect on some other parts of our infrastructure. While we worked as quickly as possible to remedy the situation, we apologize for the disruption, as our goal is to continuously provide high-quality products to our users and to the web ecosystem.

Since then, we took a closer, careful look into the situation. In the process, we learned a few lessons that we'd like to share with you today. In this blog post, we will go into more details about what happened, clarify how we plan to communicate better if such things happen in the future, and remind website owners of the channels they can use to communicate with us.

So, what happened a few months ago?

In April, we had several issues related to our index. The Search index is the database that holds the hundreds of billions of web pages that we crawled on the web and that we think could answer some of our users’ queries. When a user enters a query in the Google search engine, our ranking algorithms sort through those pages in our Search index to find the most relevant, useful results in a fraction of a second. Here is more information on what happened.

1. The indexing issue

To start it off, we temporarily lost part of the Search index.
Wait... What? What do you mean “lost part of the index?” Is that even possible?

Basically, to keep the service fast when serving search results, a user's query only “travels” as far as the closest of our data centers supporting the Google Search product, from which the Search Engine Results Page (SERP) is generated. So when there are modifications to the composition of the index (some pages added or removed, documents merged, or other types of data modification), those modifications need to be reflected in all of those data centers. The result is that users all over the world are consistently served pages from the most recent version of the index.


Google owns and operates data centers (like the one pictured above) around the world, to keep our products running 24 hours a day, 7 days a week - source

Keeping the index unified across all those data centers is a non-trivial task. For large user-facing services, we may deploy updates by starting in one data center and expanding until all relevant data centers are updated. For sensitive pieces of infrastructure, we may extend a rollout over several days, interleaving them across instances in different geographic regions. source

So, as we pushed some planned changes to the Search index, on April 5th parts of the deployment system broke, on a Friday no less! More specifically: as we were updating the index over some of our data centers, a small number of documents ended up being accidentally dropped from the index. Hence: “we lost part of the index.”

Luckily, our on-call engineers caught the issue pretty quickly, at the same time as we started picking up chatter on social media (thanks to everyone who notified us over that weekend!). As a result, we were able to start reverting the Search index to its previous stable state in all data centers only a few hours after the issue was uncovered (we keep back-ups of our indexes just in case such events happen).

We communicated on Sunday, April 7th that we were aware of the issue, and that things were starting to get back to normal. As data centers were progressively reverting back to a stable index, we continued updating on Twitter (on April 8th, on April 9th), until we were confident that all data centers were fully back to a complete version of the index on April 11th.

2. The Search Console issue

Search Console is the set of tools and reports any webmaster can use to access data about their website’s performance in Search. For example, it shows how many impressions and clicks a website gets in the organic search results every day, or information on what pages of a website are included and excluded from the Search index.

As a consequence of the Search index having the issues we described above, Search Console started to also show inconsistencies. This is because some of the data that surfaces in Search Console originates from the Search index itself:

  • the Index Coverage report depends on the Search index being consistent across data centers.
  • when we store a page in the Search index, we can annotate the entry with key signals about the page, like the fact that the page contains rich results markup for example. Therefore, an issue with the Search index can have an impact on the Rich Results reports in Search Console.

Basically, many individual Search Console reports read data from a dedicated database. That database is partially built using information that comes from the Search index. As we had to revert back to a previous version of the Search index, we also had to pause the updating of the Search Console database. This resulted in plateauing data for some reports (and flakiness in others, like the URL Inspection tool).


Index coverage report for indexed pages, which shows an example of the data freshness issues in Search Console in April 2019, with a longer time between 2 updates than what is usually observed.

Because the whole Search index issue took several days to roll back (see explanation above), we had to delay fixing the Search Console database until a few days later, after the indexing issues were fixed. We communicated on April 15th - tweet - that Search Console was having trouble and that we were working on fixing it, and we completed our fixes on April 28th (the day on which the reports started gathering fresh data again, see graph above). We communicated on Twitter on April 30th that the issue was resolved - tweet.

3. Other issues unrelated to the main indexing bug

Google Search relies on a number of systems that work together. While some of those systems can be tightly linked to one another, in some cases different parts of the system experience unrelated problems around the same time.

In the present case for example, around the same time as the main indexing bug explained above, we also had brief problems gathering fresh Google News content. Additionally, while rendering pages, certain URLs started to redirect Googlebot to other unrelated pages. These issues were entirely unrelated to the indexing bug, and were quickly resolved (tweet 1 & tweet 2).

Our communication and how we intend on doing better

In addition to communicating on social media (as highlighted above) during those few weeks, we also gave webmasters more details in two other channels: Search Console, as well as the Search Console Help Center.

In the Search Console Help Center

We updated our “Data anomalies in Search Console” help page after the issue was fully identified. This page is used to communicate information about data disruptions to our Search Console service when the impact affects a large number of website owners.

In Search Console

Because we know that not all our users read social media or the external Help Center page, we also added annotations on Search Console reports, to notify users that the data might not be accurate (see image below). We added this information after the resolution of the bugs. Clicking on “see here for more details” sends users to the “Data Anomalies” page in the Help Center.


Index coverage report for indexed pages, which shows an example of the data annotations that we can include to notify users of specific issues.

Communications going forward

When things break at Google, we have a strong “postmortem” culture: creating a document to debrief on the breakage, and try to avoid it happening next time. The whole process is described in more detail at the Google Site Reliability Engineering website.

In the wake of the April indexing issues, we included in the postmortem how to better communicate with webmasters in case of large system failures. Our key decisions were:

  1. Explore ways to more quickly share information within Search Console itself about widespread bugs, and have that information serve as the main point of reference for webmasters to check, in case they are suspecting outages.
  2. More promptly post to the Search Console data anomalies page, when relevant (if the disturbance is going to be seen over the long term in Search Console data).
  3. Continue tweeting as quickly as we can about such issues to quickly reassure webmasters we’re aware and that the issue is on our end.

Those commitments should make potential future similar situations more transparent for webmasters as a whole.

Putting our resolutions into action: the “new URLs not indexed” case study

On May 22nd, we tested our new communications strategy, as we experienced another issue. Here’s what happened: while processing certain URLs, our duplicate management system ran out of memory after a planned infrastructure upgrade, which caused processing of all incoming URLs to stop.

Here is a timeline of how we thought about communications, following the 3 points highlighted just above:

  1. We noticed the issue (around 5.30am California time, May 22nd)
    We tweeted about the ongoing issue (around 6.40am California time, May 22nd)
    We tweeted about the resolution (around 10pm California time, May 22nd)
  2. We evaluated updating the “Data Anomalies” page in the Help Center, but decided against it since we did not expect any long-term impact on the majority of webmasters' Search Console data.
  3. The confusion that this issue created for many confirmed our earlier conclusions that we need a way to signal more clearly in the Search Console itself that there might be a disruption to one of our systems which could impact webmasters. Such a solution might take longer to implement. We will communicate on this topic in the future, as we have more news.

Last week, we also had another indexing issue. As on May 22nd, we tweeted to let people know there was an issue, that we were working to fix it, and when the issue was resolved.

How to debug and communicate with us

We hope that this post brings more clarity to how complex our systems are, how they can sometimes break, and how we communicate about these matters. But while this post focuses on a widespread breakage of our systems, it’s important to keep in mind that most website indexing issues are caused by an individual website’s configuration, which can create difficulties for Google Search to index that website properly. For those cases, all webmasters can debug issues using Search Console and our Help Center. After doing so, if you still think that an issue is not coming from your site, or you don’t know how to resolve it, come talk to us and our community; we always want to hear feedback from our users. Here is how to signal an issue to us:

  • Check our Webmaster Community, sometimes other webmasters have highlighted an issue that also impacts your site.
  • In person! We love contact, come and talk to us at events. Calendar.
  • Within our products! The Search Console feedback tool is very useful to our teams.
  • Twitter and YouTube!