
Webmaster Level: All

As part of our efforts to make search results more useful to our users around the world, we’re announcing the international availability of rich snippets. If you’ve been following our blog posts, you already know that rich snippets let users see additional facts and data from your site in search results.

For example, we recently launched rich snippets for recipes, which, for certain sites, let users see quick recipe facts as part of the snippet, making it easier to determine whether the page has what they are looking for:


We’ve had a lot of questions on our blogs and forums about international support for rich snippets, and we know that many of you have already started marking up your content, so today’s announcement is very exciting for us.

In addition to adding support for rich snippets in any language, we have published documentation on how to mark up your sites for rich snippets in the following languages: Simplified Chinese, Traditional Chinese, Czech, Dutch, English, French, German, Hungarian, Italian, Japanese, Korean, Polish, Portuguese, Russian, Spanish, and Turkish. (You can change the Help language by scrolling to the bottom of the help page and selecting the language you want from the drop-down menu.)

We encourage you to read the documentation to take advantage of the different types of rich snippets currently supported: people profiles, reviews, videos, events and recipes. You can also use our testing tool (in English only, but useful to test markup in any language) and start validating your markup to make sure results show as you would expect.
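As a quick illustration of what this markup can look like, here's a minimal sketch of a review marked up with the hReview microformat. The product, rating, and reviewer are invented for the example; check the documentation above for the full set of supported fields:

<div class="hreview">
  <span class="item"><span class="fn">Blast 'Em Up</span></span>
  Rating: <span class="rating">8</span> out of 10.
  Reviewed by <span class="reviewer">Jane Smith</span>
  on <span class="dtreviewed">2010-03-01</span>.
  <span class="description">A fun space shooter with great replay value.</span>
</div>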

Finally, as you’ve probably heard by now (several times), we’re taking a gradual approach to surfacing rich snippets. This means that marking up your site doesn’t guarantee that we’ll show rich snippets for your pages. We’re doing this to ensure a good experience for our users, but rest assured we’re working hard to expand coverage and include more web pages.

Webmaster Level: Intermediate

Trailing slash or no trailing slash? That is the question we hear often. Onward to the answers! Historically, it’s common for URLs with a trailing slash to indicate a directory, and those without a trailing slash to denote a file:

http://example.com/foo/ (with trailing slash, conventionally a directory)
http://example.com/foo (without trailing slash, conventionally a file)

But they certainly don’t have to. Google treats each URL above separately (and equally), regardless of whether it’s a file or a directory, and regardless of whether it contains a trailing slash.

Different content on / and no-/ URLs okay for Google, often less ideal for users

From a technical, search engine standpoint, it’s certainly permissible for these two URL versions to contain different content. Your users, however, may find this configuration horribly confusing -- just imagine if www.google.com/webmasters and www.google.com/webmasters/ produced two separate experiences.

For this reason, trailing slash and non-trailing slash URLs often serve the same content. The most common case is when a site is configured with a directory structure:
http://example.com/parent-directory/child-directory/

Your site’s configuration and your options

You can do a quick check on your site to see how these two URLs behave:
  1. http://<your-domain-here>/<some-directory-here>/
    (with trailing slash)
  2. http://<your-domain-here>/<some-directory-here>
    (no trailing slash)
Ideally, they don’t both return a 200 response code; instead, one version redirects to the other.
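
For example, you can run this check from the command line with curl, whose -I flag fetches only the response headers (the URLs below are placeholders for your own):

curl -I http://example.com/foo/
curl -I http://example.com/foo

The first line of each response shows the status code, e.g. "HTTP/1.1 200 OK" or "HTTP/1.1 301 Moved Permanently".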
  • If only one version can be returned (i.e., the other redirects to it), that’s great! This behavior is beneficial because it reduces duplicate content. In the particular case of redirects to trailing slash URLs, our search results will likely show the version of the URL with the 200 response code (most often the trailing slash URL) -- regardless of whether the redirect was a 301 or 302.

  • If both slash and non-trailing-slash versions contain the same content and each returns 200, you can:
    • Consider changing this behavior (more info below) to reduce duplicate content and improve crawl efficiency.
    • Leave it as-is. Many sites have duplicate content. Our indexing process often handles this case for webmasters and users. While it’s not totally optimal behavior, it’s perfectly legitimate and a-okay. :)
    • Rest assured that for your root URL specifically, http://example.com is equivalent to http://example.com/ and can’t be redirected even if you’re Chuck Norris.
Steps for serving only one URL version

What if your site serves duplicate content on these two URLs:

http://<your-domain-here>/<some-directory-here>/
http://<your-domain-here>/<some-directory-here>

meaning that both URLs return 200 (neither has a redirect or contains rel="canonical"), and you want to change the situation?
  1. Choose one URL as the preferred version. If your site has a directory structure, it’s more conventional to use a trailing slash with your directory URLs (e.g., example.com/directory/ rather than example.com/directory), but you’re free to choose whichever you like.

  2. Be consistent with the preferred version. Use it in your internal links. If you have a Sitemap, include the preferred version (and don’t include the duplicate URL).

  3. Use a 301 redirect from the duplicate to the preferred version (see the sketch after this list). If that’s not possible, rel="canonical" is a strong option. rel="canonical" works similarly to a 301 for Google’s indexing purposes, and for other major search engines as well.

  4. Test your 301 configuration through Fetch as Googlebot in Webmaster Tools. Make sure your URLs:
    http://example.com/foo/
    http://example.com/foo
    are behaving as expected. The preferred version should return 200. The duplicate URL should 301 to the preferred URL.

  5. Check for Crawl errors in Webmaster Tools, and, if possible, your webserver logs as a sanity check that the 301s are implemented.

  6. Profit! (just kidding) But you can bask in the sunshine of your efficient server configuration, warmed by the knowledge that your site is better optimized.
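
As an illustration of step 3, here’s a minimal sketch of what the redirect could look like on an Apache server via an .htaccess file, assuming mod_rewrite is enabled and you’ve chosen the trailing slash version as preferred (other web servers have their own equivalents):

RewriteEngine On
# If the request isn't for an existing file and lacks a trailing slash,
# 301-redirect to the same path with a slash appended.
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(.*[^/])$ /$1/ [R=301,L]

And if a redirect isn’t possible, the rel="canonical" alternative is a single link element in the head section of the duplicate page, pointing at your preferred URL:

<link rel="canonical" href="http://example.com/foo/" />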

Webmaster Level: All

Welcome to the third episode of our URL removals series! In episodes one and two, we talked about expediting the removal of content that's under your control and requesting expedited cache removals. Today, we're covering how to use Google's public URL removal tool to request removal of content from Google’s search results when the content originates on a website not under your control.

Google offers two tools that provide a way to request expedited removal of content:

1. Verified URL removal tool: for requesting to remove content from Google’s search results when it’s published on a site of which you’re a verified owner in Webmaster Tools (like your blog or your company’s site)

2. Public URL removal tool: for requesting to remove content from Google’s search results when it’s published on a site for which you can’t verify ownership (like your friend’s blog)

Sometimes a situation arises where the information you want to remove originates from a site that you don't own or can't control. Since each individual webmaster controls their site and their site’s content, the best way to update or remove results from Google is for the owner of the site where the content is published to either block crawling of the URL, modify the content source, or remove the page altogether. If the content isn't changed, it will just reappear in our search results the next time we crawl it. So the first step to remove content that's hosted on a site you don't own is to contact the owner of the website and request that they remove or block the content in question.
  • Removed or blocked content

    If the website owner removes a page, requests for the removed page should return a "404 Not Found" response or a "410 Gone" response. If they choose to block the page from search engines, then the page should either be disallowed in the site's robots.txt file or contain a noindex meta tag. Once one of these requirements is met, you can submit a removal request using the "Webmaster has already blocked the page" option.



    Sometimes a website owner will claim that they’ve blocked or removed a page when they haven’t technically done so. If they claim a page has been blocked, you can double-check by looking at the site’s robots.txt file to see if the page is listed there as disallowed:
    User-agent: *
    Disallow: /blocked-page/
    Another place to check if a page has been blocked is within the page’s HTML source code itself. You can visit the page and choose “View Page Source” from your browser. Is there a meta noindex tag in the HTML “head” section?
    <html>
    <head>
    <title>blocked page</title>
    <meta name="robots" content="noindex">
    </head>
    ...
    If they inform you that the page has been removed, you can confirm this by using an HTTP response testing tool like the Live HTTP Headers add-on for the Firefox browser. With this add-on enabled, you can request any URL in Firefox to test that the HTTP response is actually 404 Not Found or 410 Gone.
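    If you prefer the command line, you can sketch the same check with curl; its -I flag requests only the response headers, so the first line of output shows the status code (the URL below is a placeholder):
    curl -I http://example.com/removed-page
    The expected result is "HTTP/1.1 404 Not Found" or "HTTP/1.1 410 Gone".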

  • Content removed from the page

    Once you've confirmed that the content you're seeking to remove is no longer present on the page, you can request a cache removal using the 'Content has been removed from the page' option. This type of removal -- usually called a "cache" removal -- ensures that Google's search results will not include the cached copy of the old page, or any snippets of text from the old version of the page. Only the current updated page (without the content that's been removed) will be accessible from Google's search results. However, the current updated page can potentially still rank for terms related to the old content as a result of inbound links that still exist on external sites. For cache removal requests you’ll be asked to enter a "term that has been removed from the page." Be sure to enter a word that is not found on the current live page, so that our automated process can confirm the page has changed -- otherwise the request will be denied. Cache removals are covered in more detail in part two of the "URL removal explained" series.


  • Removing inappropriate webpages or images that appear in our SafeSearch filtered results

    Google introduced the SafeSearch filter with the goal of providing search results that exclude potentially offensive content. For situations where you find content that you feel should have been filtered out by SafeSearch, you can request that this content be excluded from SafeSearch filtered results in the future. Submit a removal request using the 'Inappropriate content appears in our SafeSearch filtered results' option.

If you encounter any issues with the public URL removal tool or have questions not addressed here, please post them to the Webmaster Help Forum or consult the more detailed removal instructions in our Help Center. If you do post to the forum, remember to use a URL shortening service to share any links to content you want removed.

Edit: Read the rest of this series:
Part I: Removing URLs & directories
Part II: Removing & updating cached content
Part IV: Tracking requests, what not to remove
Companion post: Managing what information is available about you online

Webmaster Level: All

The single best way to make Google aware of all your videos on your website is to create and maintain a Video Sitemap. Video Sitemaps provide Google with essential information about your videos, including the URLs for the pages where the videos can be found, the titles of the videos, keywords, thumbnail images, durations, and other information. The Sitemap also allows you to define the period of time for which each video will be available. This is particularly useful for content that has explicit viewing windows, so that we can remove the content from our index when it expires.

Once your Sitemap is created, you can submit the URL of the Sitemap file in Google Webmaster Tools or through your robots.txt file.
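
As a minimal sketch, a Video Sitemap entry could look like the following; the URLs, title, description, duration, and dates are placeholders, and the Help Center documents the full set of required and optional tags:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <url>
    <loc>http://example.com/videos/grilling_steaks.html</loc>
    <video:video>
      <video:thumbnail_loc>http://example.com/thumbs/123.jpg</video:thumbnail_loc>
      <video:title>Grilling steaks for summer</video:title>
      <video:description>How to grill the perfect steak every time.</video:description>
      <video:content_loc>http://example.com/video123.flv</video:content_loc>
      <video:duration>600</video:duration>
      <video:expiration_date>2010-12-31T23:59:59+00:00</video:expiration_date>
    </video:video>
  </url>
</urlset>

To submit the file through robots.txt instead, a single line pointing at it is enough:

Sitemap: http://example.com/video-sitemap.xml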

Once we have indexed a video, it may appear in our web search results in what we call a Video Onebox (a cluster of videos related to the queried topic) and in our video search property, Google Videos. A video result is immediately recognizable by its thumbnail, duration, and a description.

As an example, this is what a video result from CNN.com looks like on Google:


We encourage those of you with videos to submit Video Sitemaps and to keep them updated with your new content. Please also visit our recently updated Video Sitemap Help Center, and utilize our Sitemap Help Forum. If you've submitted a Video Sitemap file via Webmaster Tools and want to share your experiences or problems, you can do so here.

Webmaster Level: All

A little over six months ago we released a new malware diagnostic tool in Webmaster Tools with the help of Lucas Ballard from the anti-malware team. This feature has been a great success; many of you were interested to know if Google had detected malicious software on your site, and you used the tool's information to find and remove that malware and to fix the vulnerabilities in your servers.

Well, a few days ago we promoted the malware diagnostics tool from Labs to a full Webmaster Tools feature. You can now find it under the Diagnostics menu. Not only that, we also added support for malware notifications. As you may already know, if your site has malware we may show a warning message in our search results indicating that the site is potentially harmful. If this is the case, you should remove any dangerous content as soon as possible and patch the vulnerabilities in your server. After you've done that, you can request a malware review in order to have the warning for your site removed. What's new in our latest release is that the form to request a review is now right there with the rest of the malware data:

Screenshot of the new malware feature in Webmaster Tools

We've also made several other improvements under the covers. Now the data is updated almost four times faster than before. And we've improved our algorithms for identifying injected content and can pinpoint exploits that were difficult to catch when the feature first launched.

On the Webmaster Tools dashboard you'll still see a warning message when you have malware on one of your sites. This message has a link that will take you directly to the malware tool. Here at Google we take malware very seriously, and we're working on several improvements to this feature so that we can tell you ASAP if we detect that your site is potentially infected. Stay tuned!

For more details, check out the Malware & Hacked sites help forum.


Webmaster Level: All

Today, as announced on the Official Google Blog, we’ve taken an additional step to improve access to Google webmaster services. Parallels, one of the leading providers of control panel software to hosting companies, has integrated Google Services for Websites into Parallels Plesk Panel, used by millions of website owners globally to manage their websites.

If you use Plesk for managing your hosting and website services, you can easily configure Webmaster Tools, Custom Search, Site Search, Web Elements and AdSense for your website right from within Plesk.

Since Plesk knows which domains you own, it automatically registers your domains with Webmaster Tools and allows you to automatically log in to the Webmaster Tools console to verify your sites, as shown below.



We’re always trying to make our tools easier to use and easier to access. Since you’re probably visiting your hosting control panel on a regular basis, we hope that you find this integration convenient. If you have feedback please let us know in the Webmaster Forum.

Webmaster Level: All

We've got good news for site owners who are frequent users of the Top search queries feature in Webmaster Tools: we’re now providing more detailed data for each individual search query. We previously reported only the average position at which your site’s pages appeared in the search results for a particular query. Now you can click on a given search query in the Top search queries report to see a breakdown of the number of impressions and clickthroughs for each position at which your site’s pages appeared in the search results for that query. Impressions are the number of times that your site’s pages appeared in the search results for the query. Clickthroughs are the number of times searchers clicked on that query’s search results to visit a page from your site. In addition to impression and clickthrough numbers, you’ll also see a list of your site's pages that were linked to from the search results for that search query. As we went about increasing the amount of data available, we also implemented measures to increase the detail of the data overall.



It used to be that you could only see Top search queries data for your site's top 100 queries. We’ve significantly increased the number of queries we show. Now if your site ranks for more than 100 queries, you’ll see new pagination buttons at the bottom of the Top Search Queries table allowing you to page through a much larger sampling of the queries that return your site in search results.



Previously, if you wanted to visualize your Top search queries data you could download your site's data and generate your own charts. To save you some time and effort, we're now generating a chart for you, and displaying it right within the page.



The Top search queries chart includes a date range selector similar to what Google Analytics offers. So now if you really want to see what your site's top search queries were for a particular week in the past, you can see the data for just that slice in time.



Finally, for sites that have numerous keywords that change frequently, we’ve added the ability to search through your site’s top search queries so that you can filter the data to exactly what you’re looking for in your query haystack.



We hope you enjoy these updates to the Top search queries feature and that it's even more useful for understanding how your site appears and performs in our search results. If you've got feedback or questions about the new Top search queries, please share your thoughts in our Webmaster Help Forum.

Webmaster Level: All

Anticipating the start of the season of barbecues and potlucks, we’ve added recipes as our newest rich snippets format. This means that for certain sites with recipe content, Google users will see quick facts when these recipe pages show up as part of the search results.

For example, if you're searching for an easy-to-make Thai mango salad, you can now see user ratings, preparation time, and a picture of the dish directly in search result snippets.


Recipes is the fifth format we support, following the introduction of reviews, people, video and, most recently, events.

If you have recipe content on your site, you can get started now by marking up your recipes with microdata, RDFa, or the hRecipe microformat. To learn more, read our documentation on how to mark up recipe information or our general help articles on rich snippets for a more complete overview.
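
As a taste of what this looks like, here's a minimal microdata sketch; the recipe details are invented for the example, and the property names follow the recipe markup documentation linked above, which covers the full vocabulary:

<div itemscope itemtype="http://data-vocabulary.org/Recipe">
  <h2 itemprop="name">Thai Mango Salad</h2>
  <img itemprop="photo" src="mango-salad.jpg" />
  <p itemprop="summary">A quick salad of green mango, lime, and chili.</p>
  <p>Prep time: <time itemprop="prepTime" datetime="PT20M">20 minutes</time></p>
</div>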

Please remember that to ensure a great user experience, we’re taking a gradual approach to surfacing rich snippets. This means that we can’t guarantee that marking up your site will result in a rich snippet when your page shows up in our search results. However, we encourage you to get started, and once you’re done you can test your pages with our rich snippets testing tool.

Webmaster Level: All

You may have heard that here at Google we're obsessed with speed, in our products and on the web. As part of that effort, today we're including a new signal in our search ranking algorithms: site speed. Site speed reflects how quickly a website responds to web requests.

Speeding up websites is important — not just to site owners, but to all Internet users. Faster sites create happy users and we've seen in our internal studies that when a site responds slowly, visitors spend less time there. But faster sites don't just improve user experience; recent data shows that improving site speed also reduces operating costs. Like us, our users place a lot of value in speed — that's why we've decided to take site speed into account in our search rankings. We use a variety of sources to determine the speed of a site relative to other sites.

If you are a site owner, webmaster, or web author, here are some free tools that you can use to evaluate the speed of your site:
  • Page Speed, an open source Firefox/Firebug add-on that evaluates the performance of web pages and gives suggestions for improvement.
  • YSlow, a free tool from Yahoo! that suggests ways to improve website speed.
  • WebPagetest shows a waterfall view of your pages' load performance plus an optimization checklist.
  • In Webmaster Tools, Labs > Site Performance shows the speed of your website as experienced by users around the world as in the chart below. We've also blogged about site performance.
While site speed is a new signal, it doesn't carry as much weight as the relevance of a page. Currently, fewer than 1% of search queries are affected by the site speed signal in our implementation, and the signal only applies for visitors searching in English on Google.com at this point. We launched this change a few weeks back after rigorous testing. If you haven't seen much change to your site's rankings, then this site speed change probably did not impact your site.

We encourage you to start looking at your site's speed (the tools above provide a great starting point) — not only to improve your ranking in search engines, but also to improve everyone's experience on the Internet.

Webmaster Level: All

"How do I find out if Google has detected malware on my site?" We’ve been hearing this question from webmasters for many years. That’s why we built features such as the Safe Browsing API, the malware review form, and our Malware details Labs feature.

As of today, once we notice your site is infected, we’ll do our best to send an e-mail to the address you have associated with your account in Webmaster Tools. We believe malware is such an important issue for site owners that being quickly informed is beneficial to you and your website’s visitors.

In addition, we’ve promoted our Malware details feature out of Labs and placed it under Diagnostics. The malware data is now updated four times faster than before, we’ve updated our algorithms for identifying injected content, and we’re now able to identify exploits which we were unable to catch earlier.



We hope this allows you to stay up-to-date with any malware issues we detect on your site, and to fix them quickly.

As always, please let us know if you have any feedback or questions about how to fix malware-related issues in our Webmaster Help Forum.


Webmaster Level: All

Sitemaps are an invaluable resource for search engines. They can highlight the important content on a site and allow crawlers to discover it quickly. Images are an important element of many sites, and search engines could equally benefit from knowing which images you consider important. This is particularly true for images that are only accessible via JavaScript forms, or for pages that contain many images, only some of which are integral to the page content.

Now you can use a Sitemaps extension to provide Google with exactly this information. For each URL you list in your Sitemap, you can add information about the important images that exist on that page. You don’t need to create a new Sitemap; you can simply add image information to the Sitemap you already use.

Adding images to your Sitemaps is easy. Simply follow the instructions in the Webmaster Tools Help Center or refer to the example below:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>http://example.com/sample.html</loc>
    <image:image>
      <image:loc>http://example.com/image.jpg</image:loc>
    </image:image>
  </url>
</urlset>
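
If a page contains several important images, you can repeat the image:image element for each of them. The extension also defines optional child tags; as a sketch (the values here are placeholders, and the Help Center has the authoritative list of supported tags):

<image:image>
  <image:loc>http://example.com/photo.jpg</image:loc>
  <image:caption>A short caption describing the photo</image:caption>
  <image:title>Photo title</image:title>
</image:image>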


We index billions of images and see hundreds of millions of image-related queries each day. To take advantage of that traffic most effectively, take a moment to update your Sitemap file with information on the images from your site. Let us know in the Sitemaps forum if you have any questions.

Webmaster Level: All

We try to communicate with webmasters in lots of different places. For example, when we send representatives to conferences we’re happy to participate in public site clinics where we share best practices on how to improve the crawlability and site architecture of websites suggested by the audience.

However, we also want to help users who can’t or don’t want to attend search conferences. To reach more people, we started doing free virtual site clinics in languages other than English. These site clinics help site owners build websites that are more easily crawled and indexed by search engines, which in turn helps webmasters gain more visibility on the web.

We did a series of free virtual site clinics in Spanish last year which spanned five blog posts. The clinics covered real problems on real sites, and we posted the results on the Spanish Webmaster Central blog. If you read Spanish, I recommend you go read the different posts, which cover everything from issues with framed sites to more technical domain setup.

In some countries we don’t have dedicated webmaster-focused blogs, but we still want to help webmasters in those countries. That means that you might occasionally see site clinic or webmaster-related posts on AdWords blogs such as the forthcoming ones on the Nordic AdWords blogs (which cover Danish, Finnish, Norwegian and Swedish). Recently when we posted some advice for webmasters on one of our AdWords blogs, we received questions about the relationship between Google’s search and advertising programs. We wanted to again reassure our users that the ranking of Google’s organic search results is entirely separate from our advertising programs. Furthermore, we do not give any preference to AdWords customers in our site clinics - everybody is welcome to participate. We’re simply posting this on local “AdWords” blogs because it’s the best way for us to reach webmasters in those communities and languages.