
Google has a broad range of resources to help you better understand your website and improve its performance. This Webmaster Central Blog, the Help Center, the Webmaster forum, and the recently released Search Engine Optimization (SEO) Starter Guide are just a few.

We also have a YouTube channel with answers to your questions in video format. To provide short, to-the-point answers to specific questions, we've just launched a new series called SEO Snippets.

In this series of short videos, the Google team will be answering some of the webmaster and SEO questions that we regularly see on the Webmaster Central Help Forum. From 404 errors and how and when crawling works to a site's URL structure and duplicate content, we'll have something here for you.

Check out the links shared in the videos to get more helpful webmaster information, drop by our help forum and subscribe to our YouTube channel for more tips and insights!


Over the years, the different ways you can choose to highlight your website's content in search have grown dramatically. In the past, we've called these rich snippets, rich cards, or enriched results. Going forward, to simplify the terminology, our documentation will use the name "rich results" for all of them. Additionally, we're introducing a new rich results testing tool to make diagnosing your pages' structured data easier.

The new testing tool focuses on the structured data types that are eligible to be shown as rich results. It allows you to test all data sources on your pages, such as JSON-LD (which we recommend), Microdata, or RDFa. The new tool provides a more accurate reflection of the page's appearance on Search and includes improved handling for structured data found in dynamically loaded content. Recipes, Jobs, Movies, and Courses are currently supported, but this is just a first step; we plan to expand coverage over time.
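
For readers new to structured data, here is a rough sketch of what a JSON-LD block for one of the supported types might look like. The recipe values below are invented for illustration and are not taken from the official documentation, which lists the actual required and recommended properties:

```python
import json

# A minimal, hypothetical Recipe object using schema.org vocabulary;
# the field values here are made up for illustration, and the real
# eligibility requirements are in the structured data documentation.
recipe = {
    "@context": "https://schema.org",
    "@type": "Recipe",
    "name": "Simple Banana Bread",
    "author": {"@type": "Person", "name": "Jane Doe"},
    "recipeIngredient": ["3 ripe bananas", "2 cups flour"],
}

# Embedded in a page as a JSON-LD script block, ready to test in the tool:
snippet = '<script type="application/ld+json">%s</script>' % json.dumps(recipe)
print(snippet)
```

A page containing such a block can then be entered into the testing tool to see which rich results it's eligible for.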

Testing a page is easy: just open the testing tool, enter a URL, and review the output. If there are issues, the tool will highlight the invalid code in the page source. If you're working with others on this page, the share icon on the bottom right lets you share the test results quickly. You can also use the preview button to view all the different rich results the page is eligible for. And once you're happy with the result, use Submit To Google to fetch & index this page for search.

Want to get started with rich results? Check out our guides for marking up your content. Feel free to drop by our Webmaster Help forums should you have any questions or get stuck; the awesome experts there can often help resolve issues and give you tips in no time!


So far on #NoHacked, we have shared some tips on detection and prevention. Now that you're able to detect a hack, we'd like to introduce some common hacking techniques and guides on how to fix them!

  • Fixing the Cloaked Keywords and Links Hack
    The cloaked keywords and link hack automatically creates many pages with nonsensical sentences, links, and images. These pages sometimes contain basic template elements from the original site, so at first glance, the pages might look like normal parts of the target site until you read the content. In this type of attack, hackers usually use cloaking techniques to hide the malicious content and make the injected page appear as part of the original site or a 404 error page.
  • Fixing the Gibberish Hack
    The gibberish hack automatically creates many pages with nonsensical sentences filled with keywords on the target site. Hackers do this so the hacked pages show up in Google Search. Then, when people try to visit these pages, they'll be redirected to an unrelated page, like a porn site for example.
  • Fixing the Japanese Keywords Hack
    The Japanese keywords hack typically creates new pages with Japanese text on the target site in randomly generated directory names. These pages are monetized using affiliate links to stores selling fake brand merchandise and then shown in Google Search. Sometimes the hackers' accounts get added in Search Console as site owners.

Lastly, after you clean your site and fix the problem, make sure to file a reconsideration request to have our teams review your site.

If you have any questions, post them on our Webmaster Help Forums!

When we announced almost a year ago that we're experimenting with mobile-first indexing, we said we'd update publishers about our progress, something we've done over the past few months through office-hours Hangouts on Air and public talks at conferences like Pubcon.

To recap, currently our crawling, indexing, and ranking systems typically look at the desktop version of a page's content, which may cause issues for mobile searchers when that version is vastly different from the mobile version. Mobile-first indexing means that we'll use the mobile version of the content for indexing and ranking, to better help our – primarily mobile – users find what they're looking for. Webmasters will see significantly increased crawling by Smartphone Googlebot, and the snippets in the results, as well as the content on the Google cache pages, will be from the mobile version of the pages.

As we said, sites that make use of responsive web design and correctly implement dynamic serving (that include all of the desktop content and markup) generally don't have to do anything. Here are some extra tips that help ensure a site is ready for mobile-first indexing:
  • Make sure the mobile version of the site also has the important, high-quality content. This includes text, images (with alt-attributes), and videos - in the usual crawlable and indexable formats.
  • Structured data is important for indexing and search features that users love: it should be both on the mobile and desktop version of the site. Ensure URLs within the structured data are updated to the mobile version on the mobile pages.
  • Metadata should be present on both versions of the site. It provides hints about the content on a page for indexing and serving. For example, make sure that titles and meta descriptions are equivalent across both versions of all pages on the site.
  • No changes are necessary for interlinking with separate mobile URLs (m-dot sites). For sites using separate mobile URLs, keep the existing link rel=canonical and link rel=alternate elements between these versions.
  • Check hreflang links on separate mobile URLs. When using link rel=hreflang elements for internationalization, link between mobile and desktop URLs separately. Your mobile URLs' hreflang should point to the other language/region versions on other mobile URLs, and similarly link desktop with other desktop URLs using hreflang link elements there.
  • Ensure the servers hosting the site have enough capacity to handle potentially increased crawl rate. This doesn't affect sites that use responsive web design and dynamic serving, only sites where the mobile version is on a separate host, such as m.example.com.
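
As a rough illustration of the metadata-parity point above, a script can extract the title and meta description from both versions of a page and compare them. The regexes and sample HTML below are simplified assumptions, not an official tool; a real check should use a proper HTML parser:

```python
import re

def extract_metadata(html: str) -> dict:
    """Pull the title and meta description out of raw HTML.

    Regex-based sketch for illustration only; real pages should be
    checked with a proper HTML parser.
    """
    title = re.search(r"<title[^>]*>(.*?)</title>", html, re.I | re.S)
    desc = re.search(
        r'<meta\s+name=["\']description["\']\s+content=["\'](.*?)["\']',
        html, re.I)
    return {
        "title": title.group(1).strip() if title else None,
        "description": desc.group(1) if desc else None,
    }

# Hypothetical desktop and mobile versions of the same page:
desktop = '<title>Blue Widgets</title><meta name="description" content="All about widgets.">'
mobile = '<title>Blue Widgets</title><meta name="description" content="All about widgets.">'

# Mobile-first indexing expects equivalent metadata on both versions.
print(extract_metadata(desktop) == extract_metadata(mobile))
```
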
We will be evaluating sites independently on their readiness for mobile-first indexing based on the above criteria and transitioning them when ready. This process has already started for a handful of sites and is being closely monitored by the search team.

We continue to be cautious with rolling out mobile-first indexing. We believe taking this slowly will help webmasters get their sites ready for mobile users, and because of that, we currently don't have a timeline for when it's going to be completed. If you have any questions, drop by our Webmaster forums or our public events.

Posted by Gary

Last week on #NoHacked, we shared tips on hack detection and the reasons why you might get hacked. This week we focus on prevention, and here are some tips for you!
  • Be mindful of your sources! Be very careful with free premium themes and plugins!

    You've probably heard about free premium plugins! If you've ever stumbled upon a site offering, for free, plugins you'd normally have to purchase, be very careful. Many hackers lure you in by copying a popular plugin and then adding backdoors or malware that will allow them to access your site. Read more about a similar case on the Sucuri blog. Additionally, even legitimate, good-quality plugins and themes can become dangerous if:

    • you do not update them as soon as a new version becomes available
    • the developer of said theme or plugin stops updating them, and they become outdated over time

In any case, keeping all your site's software up to date is essential to keeping hackers out of your website.

  • Botnets in WordPress
    A botnet is a cluster of machines, devices, or websites under the control of a third party, often used to commit malicious acts such as operating spam campaigns, clickbots, or DDoS attacks. It's difficult to detect if your site has been infected by a botnet because there are often no specific changes to your site. However, your site's reputation, resources, and data are at risk if your site is in a botnet. Learn more about botnets, how to detect them, and how they can affect your site in our article on botnets in WordPress and Joomla.
As usual, if you have any questions, post on our Webmaster Help Forums for help from the friendly community, and see you next week!

There are lots of resources out there to create great websites. Website owners often ask Google what our recommended practices are to make sure great websites are search-engine-friendly. Traditionally, our resources for beginners were the SEO Starter Guide and the Webmaster Academy. To help webmasters create modern, search-engine-friendly websites, we’re announcing today the launch of a new, updated SEO Starter Guide.

The traditional SEO Starter Guide lists best practices that make it easier for search engines to crawl, index and understand content on websites. The Webmaster Academy has the information and tools to teach webmasters how to create a site and have it found in Google Search. Since these two resources have some overlapping purpose and content, and could be more exhaustive on some aspects of creating a user friendly and safe website, we’re deprecating the Webmaster Academy and removing the old SEO Starter Guide PDF.



The updated SEO Starter Guide will replace both the old Starter Guide and the Webmaster Academy. The updated version builds on top of the previously available document, and has additional sections on the need for search engine optimization, adding structured data markup and building mobile-friendly websites.
This new Guide is available in nine languages (English, German, Spanish, French, Italian, Japanese, Portuguese, Russian and Turkish) starting today, and we’ll be adding sixteen more languages very soon.

Go check out the new SEO Starter Guide, and let us know what you think about it.

For any questions, feel free to drop by our Webmaster Help Forums!

Posted by Abhas Tripathi, Search Quality Strategist

Last week, #NoHacked returned to our G+ and Twitter channels! #NoHacked is our social campaign that aims to raise awareness of hacking attacks and offer tips on how to keep your sites safe from hackers. This time, we'd like to start sharing content from the #NoHacked campaign on this blog in your local language!

Why do sites get hacked? Hackers have different motives for compromising a website, and hack attacks can be very different, so they are not always easily detected. Here are some tips to help you detect hacked sites!

  • Getting started:

    Start with our guide "How do I know if my site is hacked?" if you've received a security alert from Google or another party. This guide will walk you through basic steps to check for any signs of compromise on your site.

  • Understand the alert on Google Search:

    At Google, we have different processes to deal with hacking scenarios. Scanning tools will often detect malware, but they can miss some spamming hacks. A clean verdict from Safe Browsing does not mean that you haven't been hacked to distribute spam.

    • If you ever see "This site may be hacked", your site may have been hacked to display spam. Essentially, your site has been hijacked to serve some free advertising.
    • If you see "This site may harm your computer" beneath the site URL then we think the site you're about to visit might allow programs to install malicious software on your computer.
    • If you see a big red screen before your site, that can mean a variety of things:
      • If you see "The site ahead contains malware", Google has detected that your site distributes malware.
      • If you see "The site ahead contains harmful programs", then the site has been flagged for distributing unwanted software.
      • "Deceptive site ahead" warnings indicate that your site may be serving phishing or social engineering. Your site could have been hacked to do any of these things.
  • Malvertising vs. hacking:

    Malvertising happens when your site loads a bad ad. It may make it seem as though your site has been hacked, perhaps by redirecting your visitors, but in fact it's just an ad behaving badly.

  • Open redirects: check if your site is enabling open redirects

    Hackers might want to take advantage of a good site to mask their URLs. One way they do this is by using open redirects, which allow them to use your site to redirect users to any URL of their choice. You can read more here!
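
To make the idea concrete, here is a minimal sketch of the kind of check a redirect endpoint should perform before forwarding a user. The host allowlist and function name are illustrative assumptions, not a drop-in fix for any particular platform:

```python
from urllib.parse import urlparse

ALLOWED_HOSTS = {"example.com", "www.example.com"}  # hypothetical site hosts

def is_safe_redirect(target: str) -> bool:
    """Return True only if the redirect target stays on our own hosts.

    An open redirect endpoint skips this kind of check and forwards
    users to any URL an attacker supplies, e.g. /out?url=https://evil.example.
    """
    parsed = urlparse(target)
    # Relative paths ("/account") have no scheme or host and stay on-site.
    if not parsed.scheme and not parsed.netloc:
        return True
    return parsed.scheme in ("http", "https") and parsed.netloc in ALLOWED_HOSTS

assert is_safe_redirect("/account")
assert is_safe_redirect("https://example.com/page")
assert not is_safe_redirect("https://evil.example/phish")
```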

  • Mobile check: make sure to view your site from a mobile browser in incognito mode. Check for bad mobile ad networks.

    Sometimes bad content like ads or other third-party elements unknowingly redirect mobile users. This behavior can easily escape detection because it's only visible from certain browsers. Be sure to check that the mobile and desktop versions of your site show the same content.

  • Use Search Console and get messages:

    Search Console is a tool that Google uses to communicate with you about your website. It also includes many other tools that can help you improve and manage your website. Make sure you have your site verified in Search Console even if you aren't the primary developer of your site. The alerts and messages in Search Console will let you know if Google has detected any critical errors on your site.

If you're still unable to find any signs of a hack, ask a security expert or post on our Webmaster Help Forums for a second look.

The #NoHacked campaign will run for the next 3 weeks. Follow us on our G+ and Twitter channels, or look out for content on this blog, as we'll be posting a summary of each week right here at the beginning of the following week! Stay safe in the meantime!

The AJAX crawling scheme was introduced as a way of making JavaScript-based webpages accessible to Googlebot, and we've previously announced our plans to turn it down. Over time, Google engineers have significantly improved rendering of JavaScript for Googlebot. Given these advances, in the second quarter of 2018, we'll be switching to rendering these pages on Google's side, rather than requiring that sites do this themselves. In short, we'll no longer be using the AJAX crawling scheme.

As a reminder, the AJAX crawling scheme accepts pages with either a "#!" in the URL or a "fragment meta tag" on them, and then crawls them with an "?_escaped_fragment_=" in the URL. That escaped version needs to be a fully-rendered and/or equivalent version of the page, created by the website itself.
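
For illustration, the scheme's URL mapping can be sketched as a small function. This is a simplified reading of the scheme; in particular, the escaping of special characters in the fragment is abbreviated here, and the full rules are in the scheme's specification:

```python
def escaped_fragment_url(url: str) -> str:
    """Map an AJAX-crawling "#!" URL to its "?_escaped_fragment_=" form.

    A sketch of the scheme's URL mapping for illustration; with this
    change, Googlebot renders the "#!" URL directly instead of
    requesting the escaped form.
    """
    if "#!" not in url:
        return url
    base, fragment = url.split("#!", 1)
    # Simplified escaping: percent-encode a few special characters
    # ("%" first, so already-encoded text isn't double-encoded).
    for ch, enc in (("%", "%25"), ("#", "%23"), ("&", "%26"), ("+", "%2B")):
        fragment = fragment.replace(ch, enc)
    sep = "&" if "?" in base else "?"
    return base + sep + "_escaped_fragment_=" + fragment

print(escaped_fragment_url("https://example.com/page#!key=value"))
```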

With this change, Googlebot will render the #! URL directly, making it unnecessary for the website owner to provide a rendered version of the page. We'll continue to support these URLs in our search results.

We expect that most AJAX-crawling websites won't see significant changes with this update. Webmasters can double-check their pages as detailed below, and we'll be sending notifications to any sites with potential issues.

If your site is currently using either #! URLs or the fragment meta tag, we recommend:

  • Verify ownership of the website in Google Search Console to gain access to the tools there, and to allow Google to notify you of any issues that might be found.
  • Test with Search Console's Fetch & Render. Compare the results of the #! URL and the escaped URL to see any differences. Do this for any significantly different part of the website. Check our developer documentation for more information on supported APIs, and see our debugging guide when needed.
  • Use Chrome's Inspect Element to confirm that links use "a" HTML elements and include a rel=nofollow where appropriate (for example, in user-generated content).
  • Use Chrome's Inspect Element to check the page's title and description meta tag, any robots meta tag, and other meta data. Also check that any structured data is available on the rendered page.
  • Content in Flash, Silverlight, or other plugin-based technologies needs to be converted to either JavaScript or "normal" HTML, if their content should be indexed in search.
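
The link checks above can also be scripted. Here is a rough sketch using Python's standard-library HTML parser to separate crawlable "a" links (with their rel attribute) from anchors Googlebot can't follow; the sample markup is hypothetical:

```python
from html.parser import HTMLParser

class LinkAudit(HTMLParser):
    """Collect <a> elements, flagging ones a crawler can't follow.

    A rough sketch: links need an href to be crawlable; anchors that
    rely only on click handlers are invisible to the crawler.
    """
    def __init__(self):
        super().__init__()
        self.crawlable, self.not_crawlable = [], []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        if "href" in attrs:
            self.crawlable.append((attrs["href"], attrs.get("rel")))
        else:
            self.not_crawlable.append(attrs)

audit = LinkAudit()
audit.feed('<a href="/page">ok</a>'
           '<a href="/ugc" rel="nofollow">user link</a>'
           '<a onclick="go()">not crawlable</a>')
print(audit.crawlable)       # hrefs a crawler can follow, with rel noted
print(audit.not_crawlable)   # anchors with no href at all
```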

We hope that this change makes it a bit easier for your website, and reduces the need to render pages on your end. Should you have any questions or comments, feel free to drop by our webmaster help forums, or to join our JavaScript sites working group.