
With the move to the new Search Console, we've decided to clean up some parts of the Search Console API as well. In the Search Analytics API, going forward we'll no longer support these Android app search appearance types:

  • Is Install
  • Is App Universal
  • Is Opened

Since these appearance types are no longer used in the UI, they haven't been populated with data recently, and from now on we won't show them at all through the API.

Additionally, for the Sitemaps API, we're no longer populating data on indexing status of submitted sitemap files in the "Indexed" field.

We're still committed to the Search Console API. In particular, we're working on updating the Search Console API to the new Search Console. We don't have any specific timeframes to share at the moment, but stay tuned to find out more!


Last year, we launched job search on Google to connect more people with jobs. When you provide Job Posting structured data, it helps drive more relevant traffic to your page by connecting job seekers with your content. To ensure that job seekers are getting the best possible experience, it's important to follow our Job Posting guidelines.

We've recently made some changes to our Job Posting guidelines to help improve the job seeker experience.

  • Remove expired jobs
  • Place structured data on the job's detail page
  • Make sure all job details are present in the job description

Remove expired jobs

When job seekers put in effort to find a job and apply, it can be very discouraging to discover that the job that they wanted is no longer available. Sometimes, job seekers only discover that the job posting is expired after deciding to apply for the job. Removing expired jobs from your site may drive more traffic because job seekers are more confident when jobs that they visit on your site are still open for application. For more information on how to remove a job posting, see Remove a job posting.


Place structured data on the job's detail page

Job seekers find it confusing when they land on a list of jobs instead of the specific job's detail page. To fix this, put structured data on the most detailed leaf page possible. Don't add structured data to pages intended to present a list of jobs (for example, search result pages) and only add it to the most specific page describing a single job with its relevant details.

Make sure all job details are present in the job description

We've also noticed that some sites include information in the JobPosting structured data that is not present anywhere in the job posting. Job seekers are confused when the job details they see in Google Search don't match the job description page. Make sure that the information in the JobPosting structured data always matches what's on the job posting page. Here are some examples:

  • If you add salary information to the structured data, then also add it to the job posting. Both salary figures should match.
  • The location in the structured data should match the location in the job posting (see the sketch after this list).
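
To make this concrete, here's a minimal JSON-LD sketch of JobPosting structured data; the company, dates, salary, and location values are placeholders, and whatever you put here must also appear on the visible job posting page:

<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "JobPosting",
  "title": "Software Engineer",
  "datePosted": "2018-06-01",
  "validThrough": "2018-07-01T00:00",
  "description": "<p>Full job description, matching the visible page text.</p>",
  "hiringOrganization": {"@type": "Organization", "name": "Example Corp"},
  "jobLocation": {
    "@type": "Place",
    "address": {
      "@type": "PostalAddress",
      "addressLocality": "Mountain View",
      "addressRegion": "CA",
      "addressCountry": "US"
    }
  },
  "baseSalary": {
    "@type": "MonetaryAmount",
    "currency": "USD",
    "value": {"@type": "QuantitativeValue", "value": 100000, "unitText": "YEAR"}
  }
}
</script>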

Providing structured data content that is consistent with the content of the job posting pages not only helps job seekers find the exact job that they were looking for, but may also drive more relevant traffic to your job postings and therefore increase the chances of finding the right candidates for your jobs.

If your site violates the Job Posting guidelines (including the guidelines in this blog post), we may take manual action against your site and it may not be eligible for display in the jobs experience on Google Search. You can submit a reconsideration request to let us know that you have fixed the problem(s) identified in the manual action notification. If your request is approved, the manual action will be removed from your site or page.

For more information, visit our Job Posting developer documentation and our JobPosting FAQ.

A few months ago we released a beta version of a new Search Console experience to a limited number of users. We are now starting to release this beta version to all users of Search Console, so that everyone can explore this simplified process of optimizing a website's presence on Google Search. The functionality will include Search performance, Index Coverage, AMP status, and Job posting reports. We will send a message once your site is ready in the new Search Console.
We started by adding some of the most popular functionality in the new Search Console (which can now be used in your day-to-day flow of addressing these topics). We are not done yet, so over the course of the year the new Search Console (beta) will continue to add functionality from the classic Search Console. Until the new Search Console is complete, both versions will live side-by-side and will be easily interconnected via links in the navigation bar, so you can use both.
The new Search Console was rebuilt from the ground up, surfacing the most actionable insights and creating an interaction model that guides you through the process of fixing any pending issues. We’ve also added the ability to share reports within your own organization in order to simplify internal collaboration.

Search Performance: with 16 months of data!



If you've been a fan of Search Analytics, you'll love the new Search Performance report. Over the years, users have been consistent in asking us for more data in Search Analytics. With the new report, you'll have 16 months of data, to make analyzing longer-term trends easier and enable year-over-year comparisons. In the near future, this data will also be available via the Search Console API.

Index Coverage: a comprehensive view of Google's indexing



The updated Index Coverage report gives you insight into the indexing of URLs from your website. It shows correctly indexed URLs, warnings about potential issues, and reasons why Google isn't indexing some URLs. The report is built on our new Issue tracking functionality that alerts you when new issues are detected and helps you monitor their fix.
So how does that work?
  1. When you drill down into a specific issue, you will see a sample of URLs from your site. Clicking on error URLs brings up the page details, with links to diagnostic tools that help you understand the source of the problem.
  2. Fixing Search issues often involves multiple teams within a company. Giving the right people access to information about the current status, and about any issues that have come up, is critical to improving an implementation quickly. Now, within most reports in the new Search Console, you can do that with the share button at the top of the report, which creates a shareable link to the report. Once things are resolved, you can disable sharing just as easily.
  3. The new Search Console can also help you confirm that you've resolved an issue, and help us to update our index accordingly. To do this, select a flagged issue, and click validate fix. Google will then crawl and reprocess the affected URLs with a higher priority, helping your site to get back on track faster than ever.
  4. The Index Coverage report works best for sites that submit sitemap files. Sitemap files are a great way to let search engines know about new and updated URLs. Once you've submitted a sitemap file, you can now use the sitemap filter over the Index Coverage data, so that you're able to focus on an exact list of URLs.

Search Enhancements: improve your AMP and Job Postings pages

The new Search Console is also aimed at helping you implement Search Enhancements such as AMP and Job Postings (more to come). These reports provide details on the specific errors and warnings that Google identified for these topics. In addition to the functionality described in the Index Coverage report, we augmented the reports with two extra features:
  • The first feature is aimed at providing faster feedback in the process of fixing an issue. This is achieved by running several instantaneous tests once you click the validate fix button. If your pages don’t pass this test, we provide you with an immediate notification; otherwise, we go ahead and reprocess the rest of the affected pages.
  • The second new feature is aimed at providing positive feedback during the fix process by expanding the validation log with a list of URLs that were identified as fixed (in addition to URLs that failed the validation or might still be pending).
Similar to the AMP report, the new Search Console provides a Job postings report. If you have job listings on your website, you may be eligible to have those shown directly through Google for Jobs (currently only available in certain locations).

Feedback welcome

We couldn’t have gotten so far without the ongoing feedback from our diligent trusted testers (we plan to share more on how their feedback helped us dramatically improve Search Console). However, your continued feedback is critical for us: if there's something you find confusing or wrong, or if there's something you really like, please let us know through the feedback feature in the sidebar. Also note that the mobile experience in the new Search Console is still a work in progress.
We want to end this post by sharing an encouraging response we got from a user who has been testing the new Search Console recently:

"The UX of new Search Console is clean and well laid out, everything is where we expect it to be. I can even kick-off validation of my fixes and get email notifications with the result. It’s been a massive help in fixing up some pesky AMP errors and warnings that were affecting pages on our site. On top of all this, the Search Analytics report now extends to 16 months of data which is a total game changer for us" - Noah Szubski, Chief Product Officer, DailyMail.com

Are there any other tools that would make your life as a webmaster easier? Let us know in the comments here, and feel free to jump into our webmaster help forums to discuss your ideas with others!

Webmaster level: intermediate-advanced

Submitting sitemaps can be an important part of optimizing websites. Sitemaps enable search engines to discover all pages on a site and to download them quickly when they change. This blog post explains which fields in sitemaps are important, when to use XML sitemaps and RSS/Atom feeds, and how to optimize them for Google.

Sitemaps and feeds

Sitemaps can be in XML sitemap, RSS, or Atom formats. The important difference between these formats is that XML sitemaps describe the whole set of URLs within a site, while RSS/Atom feeds describe recent changes. This has important implications:

  • XML sitemaps are usually large; RSS/Atom feeds are small, containing only the most recent updates to your site.
  • XML sitemaps are downloaded less frequently than RSS/Atom feeds.

For optimal crawling, we recommend using both XML sitemaps and RSS/Atom feeds. XML sitemaps will give Google information about all of the pages on your site. RSS/Atom feeds will provide all updates on your site, helping Google to keep your content fresher in its index. Note that submitting sitemaps or feeds does not guarantee the indexing of those URLs.

Example of an XML sitemap:

<?xml version="1.0" encoding="utf-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
 <url>
   <loc>http://example.com/mypage</loc>
   <lastmod>2011-06-27T19:34:00+01:00</lastmod>
   <!-- optional additional tags -->
 </url>
 <url>
   ...
 </url>
</urlset>

Example of an RSS feed:

<?xml version="1.0" encoding="utf-8"?>
<rss>
 <channel>
   <!-- other tags -->
   <item>
     <!-- other tags -->
     <link>http://example.com/mypage</link>
     <pubDate>Mon, 27 Jun 2011 19:34:00 +0100</pubDate>
   </item>
   <item>
     ...
   </item>
 </channel>
</rss>

Example of an Atom feed:

<?xml version="1.0" encoding="utf-8"?>
<feed xmlns="http://www.w3.org/2005/Atom">
 <!-- other tags -->
 <entry>
   <link href="http://example.com/mypage" />
   <updated>2011-06-27T19:34:00+01:00</updated>
   <!-- other tags -->
 </entry>
 <entry>
   ...
 </entry>
</feed>

“Other tags” refers to tags that are optional or required by the respective standards. We recommend that you specify the required tags for Atom/RSS, as they will help your content appear on other properties that might use these feeds, in addition to Google Search.

Best practices

Important fields

XML sitemaps and RSS/Atom feeds are, at their core, lists of URLs with metadata attached to them. The two most important pieces of information for Google are the URL itself and its last modification time:

URLs

URLs in XML sitemaps and RSS/Atom feeds should adhere to the following guidelines:

  • Only include URLs that can be fetched by Googlebot. A common mistake is including URLs disallowed by robots.txt (which cannot be fetched by Googlebot) or URLs of pages that don't exist.
  • Only include canonical URLs. A common mistake is to include URLs of duplicate pages. This increases the load on your server without improving indexing.
Last modification time

Specify a last modification time for each URL in an XML sitemap and RSS/Atom feed. The last modification time should be the last time the content of the page changed meaningfully. If a change is meant to be visible in the search results, then the last modification time should be the time of this change.

  • XML sitemap uses <lastmod>
  • RSS uses <pubDate>
  • Atom uses <updated>

Be sure to set or update last modification time correctly:

  • Specify the time in the correct format: W3C Datetime for XML sitemaps, RFC 3339 for Atom, and RFC 822 for RSS (see the sketch after this list).
  • Only update modification time when the content changed meaningfully.
  • Don’t set the last modification time to the current time whenever the sitemap or feed is served.
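
For instance, here's a minimal Python sketch (the date is arbitrary) that emits one timestamp in each of the required formats:

from datetime import datetime, timezone
from email.utils import format_datetime

last_modified = datetime(2011, 6, 27, 19, 34, tzinfo=timezone.utc)

# W3C Datetime (XML sitemap <lastmod>) / RFC 3339 (Atom <updated>)
print(last_modified.isoformat())       # 2011-06-27T19:34:00+00:00

# RFC 822-style date (RSS <pubDate>)
print(format_datetime(last_modified))  # Mon, 27 Jun 2011 19:34:00 +0000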

XML sitemaps

XML sitemaps should contain URLs of all pages on your site. They are often large and update infrequently. Follow these guidelines:

  • For a single XML sitemap: update it at least once a day (if your site changes regularly) and ping Google after you update it (see the example after this list).
  • For a set of XML sitemaps: maximize the number of URLs in each XML sitemap. The limit is 50,000 URLs or a maximum size of 10MB uncompressed, whichever is reached first. Ping Google for each updated XML sitemap (or once for the sitemap index, if that's used) every time it is updated. A common mistake is to put only a handful of URLs into each XML sitemap file, which usually makes it harder for Google to download all of these XML sitemaps in a reasonable time.
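
To ping Google, fetch the ping endpoint with your sitemap's URL-encoded location. A minimal Python sketch, where http://example.com/sitemap.xml is a placeholder for your own sitemap URL:

import urllib.parse
import urllib.request

sitemap_url = "http://example.com/sitemap.xml"
ping = "http://www.google.com/ping?sitemap=" + urllib.parse.quote_plus(sitemap_url)
with urllib.request.urlopen(ping) as response:
    print(response.getcode())  # 200 means the ping was received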

RSS/Atom

RSS/Atom feeds should convey recent updates of your site. They are usually small and updated frequently. For these feeds, we recommend:

  • When a new page is added or an existing page meaningfully changed, add the URL and the modification time to the feed.
  • In order for Google not to miss updates, the RSS/Atom feed should contain all updates since at least the last time Google downloaded it. The best way to achieve this is by using PubSubHubbub. The hub will propagate the content of your feed to all interested parties (RSS readers, search engines, etc.) in the fastest and most efficient way possible (a feed snippet follows this list).
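
For example, a feed advertises its hub with a link element. A minimal Atom sketch, where the hub shown is the public reference hub and the feed URL is a placeholder:

<feed xmlns="http://www.w3.org/2005/Atom">
 <!-- advertise the hub so subscribers know where updates are pushed -->
 <link rel="hub" href="http://pubsubhubbub.appspot.com/" />
 <link rel="self" href="http://example.com/feed.atom" />
 <!-- entries as shown earlier -->
</feed>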

Generating both XML sitemaps and Atom/RSS feeds is a great way to optimize crawling of a site for Google and other search engines. The key information in these files is the canonical URL and the time of the last modification of pages within the website. Setting these properly, and notifying Google and other search engines through sitemaps pings and PubSubHubbub, will allow your website to be crawled optimally, and represented accordingly in search results.

If you have any questions, feel free to post them here, or to join other webmasters in the webmaster help forum section on sitemaps.

Webmaster level: advanced

Over the summer the Webmaster Tools team has been cooking up an update to the Webmaster Tools API. The new API is consistent with other Google APIs, makes it easier to authenticate for apps or web-services, and provides access to some of the main features of Webmaster Tools.

If you've used other Google APIs, getting started with the new Webmaster Tools API will be easy! We have examples for Python, Java, as well as OACurl (for fans of command lines).

This API allows you to do the following (a short usage sketch follows the list):

  • list, add, or remove sites from your account (you can currently have up to 500 sites in your account)
  • list, add, or remove sitemaps for your websites
  • get warning, error, and indexed counts for individual sitemaps
  • get a time-series of all kinds of crawl errors for your site
  • list crawl error samples for specific types of errors
  • mark individual crawl errors as "fixed" (this doesn't change how they're processed, but can help simplify the UI for you)
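
For example, here's a minimal Python sketch using the Google APIs Client Library; it assumes you've already obtained OAuth 2.0 credentials ("creds") authorized for the Webmaster Tools scope, and the site URL is a placeholder:

from googleapiclient.discovery import build

# Build the service; "creds" are OAuth 2.0 credentials obtained beforehand.
service = build('webmasters', 'v3', credentials=creds)

# List the sites in your account.
for site in service.sites().list().execute().get('siteEntry', []):
    print(site['siteUrl'], site['permissionLevel'])

# List the sitemaps submitted for one of those sites.
response = service.sitemaps().list(siteUrl='http://www.example.com/').execute()
for entry in response.get('sitemap', []):
    print(entry['path'])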

We'd love to see what you're building with our APIs! Feel free to link to your projects in the comments below. Should you have any questions about the usage of the API, feel free to post in our help forum as well.


Webmaster Level: Advanced

In October, we announced guidelines for App Indexing for deep linking directly from Google Search results to your Android app. Thanks to all of you who have expressed interest. We’ve just enabled 20+ additional applications whose app deep links users will soon see in Search results, and starting today we’re making app deep links to English content available globally.


We’re continuing to onboard more publishers in all languages. If you haven’t added deep link support to your Android app or specified these links on your website or in your Sitemaps, please do so and then notify us by filling out this form.

Here are some best practices to consider when adding deep links to your sitemap or website (a sample Sitemap entry follows this list):
  • App deep links should only be included for canonical web URLs.
  • Remember to specify an app deep link for your homepage.
  • Not all website URLs in a Sitemap need to have a corresponding app deep link. Do not include app deep links that aren't supported by your app.
  • If you are a news site and use News Sitemaps, be sure to include your deep link annotations in the News Sitemaps, as well as your general Sitemaps.
  • Don't provide annotations for deep links that execute native ARM code. This enables app indexing to work for all platforms.
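
For illustration, here's a minimal sketch of a Sitemap entry carrying a deep link annotation; the package name com.example.android and the URLs are placeholders, and the developer guidelines have the authoritative format:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
 <url>
   <loc>http://example.com/gizmos</loc>
   <xhtml:link rel="alternate"
               href="android-app://com.example.android/http/example.com/gizmos" />
 </url>
</urlset>
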
When Google indexes content from your app, your app will need to make HTTP requests that it usually makes under normal operation. These requests will appear to your servers as originating from Googlebot. Therefore, your server's robots.txt file must be configured properly to allow these requests.

Finally, please make sure the back button behavior of your app leads directly back to the search results page.

For more details on implementation, visit our updated developer guidelines. And, as always, you can ask questions on the mobile section of our webmaster forum.

Webmaster Level: Advanced

Searchers on smartphones experience many speed bumps that can slow them down. For example, any time they need to change context from a web page to an app, or vice versa, users are likely to encounter redirects, pop-up dialogs, and extra swipes and taps. Wouldn't it be cool if you could give your users the choice of viewing your content either on the website or via your app, both straight from Google's search results?
Today, we’re happy to announce a new capability of Google Search, called app indexing, that uses the expertise of webmasters to help create a seamless user experience across websites and mobile apps.
Just like it crawls and indexes websites, Googlebot can now index content in your Android app. You'll be able to indicate which app content you'd like Google to index in the same way you do for webpages today: through your existing Sitemap file and through Webmaster Tools. If both the webpage and the app contents are successfully indexed, Google will then try to show deep links to your app straight in our search results when we think they’re relevant for the user’s query and if the user has the app installed. When users tap on these deep links, your app will launch and take them directly to the content they need. Here’s an example of a search for home listings in Mountain View:


We’re currently testing app indexing with an initial group of developers. Deep links for these applications will start to appear in Google search results for signed-in users on Android in the US in a few weeks. If you are interested in enabling indexing for your Android app, it’s easy to get started:
  1. Let us know that you’re interested. We’re working hard to bring this functionality to more websites and apps in the near future.
  2. Enable deep linking within your app.
  3. Provide information about alternate app URIs, either in the Sitemaps file or in a link element in pages of your site (see the example below).
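
For the link element route, here's a sketch of the annotation in a page's head; the package name and URI are placeholders, and the developer site has the authoritative format:

<html>
<head>
 <link rel="alternate"
       href="android-app://com.example.android/http/example.com/gizmos" />
 <!-- other head tags -->
</head>
<body> ... </body>
</html>
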
For more details on implementation and for information on how to sign up, visit our developer site. As always, if you have any questions, please ask in the mobile section of our webmaster forum.


Webmaster level: All

Sitemaps are a way to tell Google about pages on your site. Webmaster Tools’ Sitemaps feature gives you feedback on your submitted Sitemaps, such as how many Sitemap URLs have been indexed, or whether your Sitemaps have any errors. Recently, we’ve added even more information! Let’s check it out:


The Sitemaps page displays details based on content-type. Now statistics from Web, Videos, Images and News are featured prominently. This lets you see how many items of each type were submitted (if any), and for some content types, we also show how many items have been indexed. With these enhancements, the new Sitemaps page replaces the Video Sitemaps Labs feature, which will be retired.

Another improvement is the ability to test a Sitemap. Unlike an actual submission, testing does not submit your Sitemap to Google; it only checks it for errors. Testing requires a live fetch by Googlebot and usually takes a few seconds to complete. Note that the initial testing is not exhaustive and may not detect all issues; for example, errors that can only be identified once the URLs are downloaded will not be caught by the test.

In addition to on-the-spot testing, we’ve got a new way of displaying errors which better exposes what types of issues a Sitemap contains. Instead of repeating the same kind of error many times for one Sitemap, errors and warnings are now grouped, and a few examples are given. Likewise, for Sitemap index files, we’ve aggregated errors and warnings from the child Sitemaps that the Sitemap index encloses. No longer will you need to click through each child Sitemap one by one.

Finally, we’ve changed the way the “Delete” button works. Now, it removes the Sitemap from Webmaster Tools, both from your account and the accounts of the other owners of the site. Be aware that a Sitemap may still be read or processed by Google even if you delete it from Webmaster Tools. For example, if you reference a Sitemap in your robots.txt file, search engines may still attempt to process it. To truly prevent a Sitemap from being processed, remove the file from your server or block it via robots.txt.

For more information on Sitemaps in Webmaster Tools and how Sitemaps work, visit our Help Center. If you have any questions, go to Webmaster Help Forum.

Webmaster Level: All

If your website is the authoritative source for the video of a particular TV show, make sure we know about it! Hopefully, you already submit Video Sitemaps or mRSS feeds to inform us about video content on your website. We now support additional fields in both video Sitemaps and mRSS feeds where you can specify metadata specific to television or episodic content. This includes the series’ title, the season and episode numbers for the video in question, the premiere date, as well as other additional information. The metadata from your video feed helps us provide more detailed, relevant results to users wanting to view your show.

Here’s an example Video Sitemap entry that includes all the required and some optional TV metadata in the <video:tvshow> element:

<video:video>
  <video:title>The Sample Show, Season 1, Episode 2</video:title>
  <!-- other required root level video tags omitted -->
  <video:tvshow>
    <video:show_title>The Sample Show</video:show_title>
    <video:video_type>full</video:video_type>
    <video:episode_title>A Sample Episode Title</video:episode_title>
    <video:season_number>1</video:season_number>
    <video:episode_number>2</video:episode_number>
  </video:tvshow>
</video:video>


The full documentation for the tags for both mRSS and Video Sitemaps can be found in our Webmaster Tools Help Center. As always, if you have any questions about Video Sitemaps or mRSS feeds, feel free to reach out to us in the Sitemaps section of the Webmaster Help Forum.

Webmaster level: Intermediate
We’ve noticed a rise in the number of questions from webmasters about how best to structure a website for mobile phones and how websites can best interact with Googlebot-Mobile. In this post we’ll explain the current situation and give you specific recommendations you can implement now.

Some Background

Let’s start with a simple question: what do we mean by “mobile phone” when talking about mobile-friendly websites?
A good way to answer this question is to think about the capabilities of the mobile phone’s web browser, especially in relation to the capabilities of modern desktop browsers. To simplify matters, we can break mobile phones into a few classifications:
  1. Traditional mobile phones: Phones with browsers that cannot render normal desktop webpages. This includes browsers for cHTML (iMode), WML, WAP, and the like.
  2. Smartphones: Phones with browsers that render normal desktop pages, at least to some extent. This category includes a diversity of devices, such as Windows Phone 7, Blackberry devices, iPhones, and Android phones, as well as tablets and eBook readers. We can further break down this category by support for HTML5:
    • Devices with browsers that do not support HTML5
    • Devices with browsers that support HTML5
Once upon a time, mobile phones connected to the Internet using browsers with limited rendering capabilities; this is clearly changing with the fast rise of smartphones, whose browsers rival the full desktop experience. As such, it’s important to note that the distinction we are making here is based on the current situation as we see it and might change in the future.

Googlebot and Mobile Content

Google has two crawlers relevant to this topic: Googlebot and Googlebot-Mobile. Googlebot crawls desktop-browser webpages and the content embedded in them, while Googlebot-Mobile crawls mobile content. The questions we’re seeing more of can be summed up as follows:
Given the diversity of capabilities of mobile web browsers, what kind of content should I serve to Googlebot-Mobile?
The answer lies in the User-agent that Googlebot-Mobile supplies when crawling. There are several User-agent strings in use by Googlebot-Mobile, all of which use this format:
[Phone name(s)] (compatible; Googlebot-Mobile/2.1; +http://www.google.com/bot.html)
To decide which content to serve, assess which content your website has that best serves the phone(s) in the User-agent string. A full list of Googlebot-Mobile User-agents can be found here.
Notice that we currently do not crawl with Googlebot-Mobile using a smartphone User-agent string. Thus at the current time, a correctly-configured content serving system will serve Googlebot-Mobile content only for the traditional phones described above, because that’s what the User-agent strings in use today dictate. This may change in the future, and if so, it may mean there would be a new Googlebot-Mobile User-agent string.
For now, we expect smartphones to handle desktop experience content so there is no real need for mobile-specific effort from webmasters. However, for many websites it may still make sense for the content to be formatted differently for smartphones, and the decision to do so should be based on how you can best serve your users.

URL Structure for Mobile Content

The next set of questions ask about the URLs mobile content should be served from. Let’s look in detail at some common use cases.

Websites with only Desktop Experience Content

Most websites currently have only one version of their content, namely in HTML that is designed for desktop web browsers. This means all browsers access the content from the same URL.
These websites may not be serving traditional mobile phone users. The quality experienced by their smartphone users depends on the mobile browser they are using and it could be as good as browsing from the desktop.
If you serve only desktop experience content for all User Agents, you should do so for Googlebot-Mobile too; that is, treat Googlebot-Mobile as you treat all other or unknown User Agents. In these cases, Google may modify your webpages for an improved mobile experience.

Websites with Dedicated Mobile Content

Many websites have content specifically optimized for mobile users. The content could be simply reformatted for the typically smaller mobile displays, or it could be in a different format (e.g., served using WAP, etc.).
A very common question we see is: Does it matter if the different types of content are served from the same URL or from different URLs? For example, some websites have www.example.com as the URL desktop browsers are meant to access and have m.example.com or wap.example.com for the different mobile devices. Other websites serve all types of content from just one URL structure like www.example.com.
For Googlebot and Googlebot-Mobile, it does not matter what the URL structure is as long as it returns exactly what a user sees too. For example, if you redirect mobile users from www.example.com to m.example.com, that will be recognized by Googlebot-Mobile and both websites will be crawled and added to the correct index. In this case, use a 301 redirect for both users and Googlebot-Mobile.
If you serve all types of content from www.example.com, i.e. serving desktop-optimized content or mobile-optimized content from the same URL depending on the User-agent, this will also lead to correct crawling by Googlebot and Googlebot-Mobile. This is not considered cloaking by Google.
It is worth repeating that regardless of URL structure, you must correctly detect the User-agent as given by your users and Googlebot-Mobile, and serve both the same content. Don’t forget to keep the default content, the desktop-optimized content, for when an unknown User-agent requests it.
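
To illustrate the redirect case, here's a minimal sketch using Python and Flask, purely for illustration; the hostname m.example.com and the token list are placeholders, not an exhaustive detection method:

from flask import Flask, redirect, request

app = Flask(__name__)

# Illustrative tokens only -- real detection should use a maintained device list.
MOBILE_TOKENS = ('googlebot-mobile', 'midp', 'wap', 'up.browser')

@app.route('/page')
def page():
    ua = request.headers.get('User-Agent', '').lower()
    if any(token in ua for token in MOBILE_TOKENS):
        # The same 301 is served to mobile users and to Googlebot-Mobile.
        return redirect('http://m.example.com/page', code=301)
    # Unknown User-agents get the default, desktop-optimized content.
    return 'desktop-optimized content'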

Mobile Sitemaps in Webmaster Tools

Finally, we receive many questions about what URLs to put in Mobile Sitemaps. As explained in our Mobile Sitemaps Help Center articles, you should include only mobile content URLs in Mobile Sitemaps, even if these URLs also return non-mobile content when accessed by a non-mobile User-agent.
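
For reference, a Mobile Sitemap is a standard Sitemap that declares the mobile namespace and marks each URL with an empty <mobile:mobile/> tag. A minimal sketch with a placeholder URL:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:mobile="http://www.google.com/schemas/sitemap-mobile/1.0">
 <url>
   <loc>http://m.example.com/article.html</loc>
   <mobile:mobile/>
 </url>
</urlset>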

More Questions?

A good place to start is our Mobile Sites Help Center articles and the relevant sections in our Search Engine Optimization Starter Guide. We also created a thread in our forums for you to ask questions about this post.

Webmaster Level: Intermediate to Advanced

To the fabulous, savvy audience that attended our Video Sitemap webinar several months ago, please accept our re-gift: a summary of your questions from the Video Sitemaps Q&A!

To those who were unable to attend the webinar, please enjoy our gift of the summarized Q&A -- it’s like new!

Either way, happy holidays from all of us on the Webmaster Central Team. :)


Our entire webinar covers the basics of Video Sitemaps and best practices -- nearly everything you’d need to know when submitting a video feed.

  1. Can the source/content of the video (perhaps a third-party vendor) be hosted on another site? For example, can I host my videos on YouTube and still be eligible for Video Search traffic?

    Yes, you can use a third party to host videos. Only the play page--the URL within the <loc> tag--needs to be on your site. <video:content_loc> and <video:player_loc> can list URLs on a different site or subdomain.

    For example, here’s a snippet from a valid Video Sitemap that shows content hosted on a different subdomain from the play page:

    <url>
      <loc>http://www.example.com/videos/some_video_landing_page.html</loc>
        <video:video>
          <video:thumbnail_loc>http://www.example.com/thumbs/123.jpg</video:thumbnail_loc>
          <video:title>Grilling steaks for summer</video:title>
          <video:description>Alkis shows you how to get perfectly done steaks every time</video:description>
          <video:content_loc>http://video-hoster.example.com/video123.flv</video:content_loc>
          <video:player_loc allow_embed="yes" autoplay="ap=1">http://www.example.com/videoplayer.swf?video=123</video:player_loc>
        </video:video>
    </url>


  2. If I’m using YouTube to host my videos, can Google verify that I’m the legitimate owner?

    Currently, there is no functionality that allows you, as the uploader, to verify that you’re the owner of a video. The issue of authorship is a hard problem on the web, not just for videos, but for nearly all types of content.

  3. Because Google owns YouTube, should users who embed YouTube videos still submit Video Sitemaps or is it unnecessary?

    Google treats YouTube as just another source for video content -- though you don’t need to submit a Video Sitemap if you only want your YouTube-hosted videos indexed. If, however, you’re using YouTube as an online video platform (i.e., with play pages on your own site), then we do recommend Sitemap submission.

  4. How long does it take for Google to accept and verify a Video Sitemap?

    Video Sitemap submission is a two-part process:

    1. We fetch the Sitemap and parse it for syntax errors. This happens within minutes.

    2. We fetch the assets referenced in the Sitemap, perform checks, validate metadata, do more cool stuff, and last, index the video. This step can require varied amounts of time depending on your site and our system load.

  5. What tags and categories are most important in Video Sitemaps or mRSS? Should I create my own categories or is there a list that I should conform to?

    Currently, the most important metadata to include is title and description -- both are required. The category tag is optional, and there isn’t a list from which to select.

  6. Do I have to use HTML5 to use Video Sitemaps?
    Does HTML5 help with discovery?
    Or, if my site is HTML5 compliant, do I still need to submit a Video Sitemap?


    None of the Video Search principles change with HTML5. We still recommend using a Video Sitemap regardless of the markup on your site. HTML5 can be helpful, though, because tags like <video> make it easier for our systems to verify that video exists on the page.

  7. If I use an iframe rather than embedding my videos, can Google still find it?

    We do not recommend using iframes to embed video content on your pages.

  8. Can I have multiple videos on one URL?

    You can. We’ve found, however, that users may not consider it the best experience. When users click on a video search result, they most often don’t like being forced to locate the correct video among multiple videos on the resulting page.

  9. Do I need to specifically create a robots.txt file that allows Googlebot, or do I just need to make sure Googlebot isn’t blocked?

    Just make sure that Googlebot isn’t blocked.

  10. I provided a thumbnail, but it’s not being used. Does Google create their own thumbnails from my videos?

    We try to use the thumbnail you provide if it’s valid. If not, we’ll try to generate a thumbnail ourselves. We recommend that you provide thumbnails that are at least 120x90 pixels. We also accept many thumbnail formats, such as PNG and JPEG.

  11. Any video filesize limitations?

    At this time, there aren’t video filesize limitations on content submitted through Video Sitemaps.

  12. Is there any way to indicate a transcript or closed captioning for a video?

    Currently there isn’t, but perhaps down the road.

  13. What if I’m using Lightbox or a popup to display a video; can it still be indexed?

    Depends on the use case and how it’s rendered, but if indexing by search engines is important to you, it’s not the safest method. In the Webmaster Help Center, we explain that “When designing your site, it's important to configure your video pages without any overly complex JavaScript or Flash setup.” Most often, for bots, simpler is safer.
Have a safe and happy holiday!

Webmaster Level: Intermediate to Advanced

What are the benefits of submitting feeds like Video Sitemaps and mRSS vs. the benefits of Facebook Share and RDFa? Is one better than the other? Let’s start the discussion.

Functionality of feeds vs. on-page markup

Google accepts information from both video feeds, such as Video Sitemaps and mRSS, as well as on-page markup, such as Facebook Share and RDFa. We recommend that you use both!

If you have limited resources, however, here’s a chart explaining the pros and cons of each method. The key differentiators include:
  • While both feeds and on-page markup give search engines metadata, Video Sitemaps/mRSS also help with crawl discovery. We may find a new URL through your feed that we wouldn’t have easily discovered otherwise.

  • Using Video Sitemaps/mRSS requires that the search engine support these formats and not all engines do. Because on-page markup is just that -- on the page -- crawlers can gather the metadata through organic means as they index the URL. No feed support is required.

                                                            Feeds                    On-page markup
                                                            (Video Sitemaps & mRSS)  (Facebook Share & RDFa)
 Accepted by Google                                         Yes                      Yes
 Helps search engines discover new URLs with videos
 (improves discovery and coverage)                          Yes                      No
 Provides structured metadata
 (e.g. video title and description)                         Yes                      Yes
 Allows search engines without sitemap/mRSS support to
 still obtain metadata (organic gathering of metadata)      No                       Yes
 Incorporates additional metadata like “duration”           Yes                      No


If you’re further wondering about the benefits of specific feeds (Video Sitemaps vs. mRSS), we can help with clarification there, too. First of all, you can use either. We’re agnostic. :) One benefit of Video Sitemaps is that, because it’s a format we’re actively enhancing, we can quickly extend it to allow for more specifications.

All this said, if you’re going to start from scratch, Video Sitemaps is our recommended start.

                                                            Video Sitemaps  mRSS
 Accepted by Google                                         Yes             Yes
 Been around for a long, long time and pretty widely
 accepted                                                   No              Yes
 Extremely quick for the Google Video Search team
 to extend                                                  Yes             No


“Starving” to start a conversation about feeds or on-page markup? Join us in the Sitemaps section of the Webmaster discussion forum.

Webmaster Level: All

Often a website that hosts videos will have a common top-level page, known as a gallery, that groups conceptually related videos together. Such a page may be of interest to a user searching on that subject, and it can make it easier for users to find exactly what they're looking for. You can use a Sitemap to tell Google the URL of the gallery page on which each video appears.


You can specify the URL of the gallery-level page using the optional tag <video:gallery_loc> on a per-video basis. Note that only one gallery_loc is allowed per video.
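
As a sketch, a video entry pointing to its gallery page might look like this; the URLs and title are placeholders, and the optional title attribute is shown as we understand the spec:

<video:video>
  <video:title>Grilling steaks for summer</video:title>
  <!-- other video tags -->
  <video:gallery_loc title="Cooking videos">http://www.example.com/cooking/</video:gallery_loc>
</video:video>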

For more information on Google Videos, including Sitemap specifications, please visit our Help Center. To post questions and search for answers, check out our Help Forum.

Webmaster Level: All

You can now check your Video Sitemap for even more errors right in Webmaster Tools! It’s a new Labs feature to signal issues in your Video Sitemap such as:
  • URLs disallowed by robots.txt
  • Thumbnail size errors (160x120px is ideal. Anything smaller than 90x50 will be rejected.)



Video Sitemaps help us to better crawl and extract information about your videos, so we can appropriately feature them in search results.

Totally new to Video Sitemaps? Check out the Video Sitemaps center for more information. Otherwise, take a look at this new Labs feature in Webmaster Tools.

Webmaster Level: All

If you want to add video information to a Sitemap or mRSS feed, you must specify the location of the video by including one of two tags: video:player_loc or video:content_loc. The equivalent tags in an mRSS feed are media:player and media:content, respectively. We need this information to verify that there is actually a live video on your landing page and to extract metadata and signals from the video bytes for ranking. If neither tag is included, we will not be able to verify the video, and your Sitemap/mRSS feed will not be crawled. To reduce confusion, here is some more detail about these elements.

Video Locations Defined

Player Location/URL: the player (e.g., .swf) URL with corresponding arguments that load and play the actual video.

Content Location/URL: the actual raw video bytes (e.g., .flv, .avi) containing the video content.

The Requirements

One of either the player video:player_loc or content video:content_loc location is required. However, we strongly suggest you provide both, as they each serve distinct purposes: player location is primarily used to help verify that a video exists on the page, and content location helps us extract more signals and metadata to accurately rank your videos.

URL extensions at a glance:

 Sitemap tag            Equivalent mRSS tag               Contents
 <loc>                  <link>                            The play page URL
 <video:player_loc>     <media:player> (url attribute)    The SWF URL
 <video:content_loc>    <media:content> (url attribute)   The FLV or other raw video URL

NOTE: All URLs should be unique (every URL in your entire Video Sitemap and mRSS feed should be unique)

If you would like to better ensure that only Googlebot accesses your content, you can perform a reverse DNS lookup.
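
A minimal Python sketch of that two-step check -- reverse DNS on the requesting IP, then a forward lookup to confirm the hostname maps back to the same IP (the IP shown is just an example):

import socket

def is_googlebot(ip):
    try:
        host = socket.gethostbyaddr(ip)[0]  # e.g. crawl-66-249-66-1.googlebot.com
    except socket.herror:
        return False
    if not host.endswith(('.googlebot.com', '.google.com')):
        return False
    return ip in socket.gethostbyname_ex(host)[2]  # forward-confirm the name

print(is_googlebot('66.249.66.1'))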

For more information on Google Videos please visit our Help Center, and to post questions and search for answers check out our Help Forum.

Webmaster Level: All

We know that some of you, or your clients or colleagues, may be new to online video publishing. To make it easier for everyone to understand video indexing and Video Sitemaps, we’ve created a video -- narrated by Nelson Lee, Video Search Product Manager -- that explains everything in basic terms:



Also, last month we wrote about some best practices for getting video content indexed on Google. Today, to help beginners better understand the whys and hows of implementing a Video Sitemap, we added a starting page to the information on Video Sitemaps in the Webmaster Help Center. Please take a look and share your thoughts.

Webmaster Level: All

Have you ever wanted to submit your various content types (video, images, etc.) in one Sitemap? Now you can! If your site contains videos, images, mobile URLs, code or geo information, you can now create—and submit—a Sitemap with all the information.

Site owners have been leveraging Sitemaps to let Google know about their sites’ content since Sitemaps were first introduced in 2005. Since that time additional specialized Sitemap formats have been introduced to better accommodate video, images, mobile, code or geographic content. With the increasing number of specialized formats, we’d like to make it easier for you by supporting Sitemaps that can include multiple content types in the same file.

The structure of a Sitemap with multiple content types is similar to a standard Sitemap, with the additional ability to contain URLs referencing different content types. Here's an example of a Sitemap that contains a reference to a standard web page for Web search, image content for Image search and a video reference to be included in Video search:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
 xmlns:image="http://www.google.com/schemas/sitemap-image/1.1"
 xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
<url>
 <loc>http://example.com/foo.html</loc>
 <image:image>
   <image:loc>http://example.com/image.jpg</image:loc>
 </image:image>
 <video:video>
   <video:content_loc>http://example.com/videoABC.flv</video:content_loc>
   <video:title>Grilling tofu for summer</video:title>
 </video:video>
</url>
</urlset>

Here's an example of what you'll see in Webmaster Tools when a Sitemap containing multiple content types is submitted:



We hope the capability to include multiple content types in one Sitemap simplifies your Sitemap submission. The rest of the Sitemap rules, like 50,000 max URLs in one file and the 10MB uncompressed file size limit, still apply. If you have questions or other feedback, please visit the Webmaster Help Forum.

Webmaster Level: All

The single best way to make Google aware of all your videos on your website is to create and maintain a Video Sitemap. Video Sitemaps provide Google with essential information about your videos, including the URLs for the pages where the videos can be found, the titles of the videos, keywords, thumbnail images, durations, and other information. The Sitemap also allows you to define the period of time for which each video will be available. This is particularly useful for content that has explicit viewing windows, so that we can remove the content from our index when it expires.

Once your Sitemap is created, you can submit the URL of the Sitemap file in Google Webmaster Tools or through your robots.txt file.
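
For the robots.txt route, a single Sitemap line is all that's needed; the file name here is a placeholder:

# robots.txt -- the Sitemap directive can appear anywhere in the file
Sitemap: http://www.example.com/video-sitemap.xml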

Once we have indexed a video, it may appear in our web search results in what we call a Video Onebox (a cluster of videos related to the queried topic) and in our video search property, Google Videos. A video result is immediately recognizable by its thumbnail, duration, and description.

As an example, this is what a video result from CNN.com looks like on Google:


We encourage those of you with videos to submit Video Sitemaps and to keep them updated with your new content. Please also visit our recently updated Video Sitemap Help Center, and utilize our Sitemap Help Forum. If you've submitted a Video Sitemap file via Webmaster Tools and want to share your experiences or problems, you can do so here.

Webmaster Level: All

Sitemaps are an invaluable resource for search engines. They can highlight the important content on a site and allow crawlers to quickly discover it. Images are an important element of many sites and search engines could equally benefit from knowing which images you consider important. This is particularly true for images that are only accessible via JavaScript forms, or for pages that contain many images but only some of which are integral to the page content.

Now you can use a Sitemaps extension to provide Google with exactly this information. For each URL you list in your Sitemap, you can add additional information about important images that exist on that page. You don’t need to create a new Sitemap, you can just add information on images to the Sitemap you already use.

Adding images to your Sitemaps is easy. Simply follow the instructions in the Webmaster Tools Help Center or refer to the example below:

<?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
   xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>http://example.com/sample.html</loc>
    <image:image>
        <image:loc>http://example.com/image.jpg</image:loc>
    </image:image>
  </url>
</urlset>


We index billions of images and see hundreds of millions of image-related queries each day. To take advantage of that traffic most effectively, take a moment to update your Sitemap file with information on the images from your site. Let us know in the Sitemaps forum if you have any questions.

Webmaster Level: All

During my stint on the "How Google Works Tour: Seattle", I heard plenty of questions regarding News Search from esteemed members of the press, such as The Stranger, The Seattle Times and Seattle Weekly. After careful note-taking throughout our conversations, the News team and I compiled this presentation to provide background and FAQs for all publishers interested in Google News Search.



Along with the FAQs about News Sitemaps and PageRank in the video above, here's additional Q&A to get you started:

Would adding a city name to my paper—for example, changing our name from "The Times" to "The San Francisco Bay Area Times"—help me target my local audience in News Search?
No, this won't help News rankings. We extract geography and location information from the article itself (see video). Changing your name to include relevant keywords or adding a local address in your footer won't help you target a specific audience in our News rankings.
What happens if I accidentally include URLs in my News Sitemap that are older than 72 hours?
We want only the most recently added URLs in your News Sitemap, as it directs Googlebot to your breaking information. If you include older URLs, there's no penalty (unless you're perceived as maliciously spamming, which would be rare); we just won't include those URLs in our next News crawl.
To get the full scoop, check out the video!