
Earlier this year we announced a series of Webmaster Conferences being held around the world to help website creators understand how to optimize their sites for Search. We've already held 22 of these events, with more planned through the end of the year. Building on the success of these events so far, we're hosting a product summit version of this event at Google Headquarters in Mountain View on Monday, November 4th.


Photos from the Webmaster Conference in Kuala Lumpur, earlier this year.

This event is designed to facilitate an open dialog between the webmaster and SEO community and Search product teams. This one-day event will include talks from Search product managers, Q&A sessions, and a product fair giving attendees the opportunity to have direct conversations with product managers. Attendees will learn from the people building Search about how they think about the evolution of the platform, and have the opportunity to share feedback about the needs of the community.

We also realize that not everyone will be able to make this event in person, so we plan to share much of the content and feedback after the event.

If you’re interested and able to make it, we encourage you to apply today as space is limited. Complete details about the event and the application process can be found on the event registration site. And as always, you can check out our other upcoming events on the general Webmaster Conference site, the Google Webmasters event calendar, or follow our blogs and @googlewmc on Twitter!

Posted by John Mueller, Google Switzerland

The world of search is constantly evolving. New tools, opportunities, and features are regularly arriving, sometimes existing things change, and sometimes we say goodbye to some things to make way for the new. To help you stay on top of things, we've started a new YouTube series called Google Search News.

With Google Search News, we want to give you a regular & short summary of what's been happening around Google Search, specifically for SEOs, publishers, developers, and webmasters. The first episode is out now, so check it out. 

Video: the first episode of Google Search News

In this first episode, we cover:

We plan to make these updates regularly, and adjust the format over time as needed. Let us know what you think in the video comments!

Google uses content previews, including text snippets and other media, to help people decide whether a result is relevant to their query. The type of preview shown depends on many factors, including the type of content a person is looking for and the kind of device they're viewing it on.

For instance, if you look for recipe results on Google, you may see thumbnail images and user ratings--things that may be more helpful than text snippets when it comes to deciding what you want to eat. Alternatively, if you're looking for a concert nearby, you can check out the events directly in the search results. These are made possible by publishers marking up their pages with structured data.
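As a rough illustration of what such markup can look like (this sketch is not taken from our documentation, and all names and values are purely illustrative), event details are commonly expressed as JSON-LD embedded in the page:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Event",
  "name": "Example Concert",
  "startDate": "2019-11-15T20:00",
  "location": {
    "@type": "Place",
    "name": "Example Hall",
    "address": "123 Example St, Mountain View, CA"
  }
}
</script>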

Google automatically generates previews in a way intended to help a user understand why the results shown are relevant to their search and why the user would want to visit the linked pages. However, we recognize that site owners may wish to independently adjust the extent of their preview content in search results. To make it easier for individual websites to define how much or which text should be available for snippeting and the extent to which other media should be included in their previews, we're now introducing several new settings for webmasters. 

Letting Google know about your snippet and content preview preferences

Previously, it was only possible to allow a textual snippet or to not allow one. We're now introducing a set of methods that allow more fine-grained configuration of the preview content shown for your pages. This is done through two types of new settings: a set of robots meta tags and an HTML attribute. 

Using robots meta tags

The robots meta tag is added to an HTML page's <head>, or specified via the x-robots-tag HTTP header. The robots meta tags addressing the preview content for a page are:

  • "nosnippet"
    This is an existing option to specify that you don't want any textual snippet shown for this page. 
  • "max-snippet:[number]"
    New! Specify a maximum text length, in characters, of a snippet for your page.
  • "max-video-preview:[number]"
    New! Specify a maximum duration in seconds of an animated video preview.
  • "max-image-preview:[setting]"
    New! Specify a maximum size of image preview to be shown for images on this page, using either "none", "standard", or "large".

They can be combined, for example:

<meta name="robots" content="max-snippet:50, max-image-preview:large">

Preview settings from these meta tags will become effective in mid-to-late October 2019 and may take about a week for the global rollout to complete.
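These directives can also be returned in the x-robots-tag HTTP response header mentioned above, which is useful when editing the page markup isn't practical; a minimal sketch (the directive values here are just examples) would be:

x-robots-tag: max-snippet:50, max-image-preview:large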

Using the new data-nosnippet HTML attribute

A new way to help limit which part of a page is eligible to be shown as a snippet is the "data-nosnippet" HTML attribute on span, div, and section elements. With this, you can prevent a specific part of an HTML page from being shown within the textual snippet for the page.

For example:

<p><span data-nosnippet>Harry Houdini</span> is undoubtedly the most famous magician ever to live.</p>

The data-nosnippet HTML attribute will start affecting presentation on Google products later this year. Learn more in our developer documentation for the robots meta tag, x-robots-tag, and data-nosnippet.

A note about rich results and featured snippets

Content in structured data is eligible for display as rich results in search. These kinds of results do not conform to limits declared in the above meta robots settings, but rather, can be addressed with much greater specificity by limiting or modifying the content provided in the structured data itself. For example, if a recipe is included in structured data, the contents of that structured data may be presented in a recipe carousel in the search results. Similarly, if an event is marked up with structured data, it may be presented as such in the search results. To limit that presentation, a publisher can limit the amount and type of content in the structured data. 

Some special features on Search depend on the availability of preview content, so limiting your previews may prevent your content from appearing in these areas. Featured snippets, for example, require a certain minimum number of characters to be displayed. This can vary by language, which is why there is no exact max-snippet length we can provide to ensure appearing in this feature. Those who do not wish to have content appear as featured snippets can experiment with lower max-snippet lengths. Those who want a guaranteed way to opt out of featured snippets should use nosnippet.
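For instance, a page that should never be shown with a textual snippet (and therefore can't appear as a featured snippet) could declare the existing nosnippet directive:

<meta name="robots" content="nosnippet">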

The AMP Format

The AMP format comes with certain benefits, including eligibility for more prominent presentation of thumbnail images in search results and in the Google Discover feed. These characteristics have been shown to drive more traffic to publishers’ articles. However, publishers who do not want Google to use larger thumbnail images when their AMP pages are presented in search and Discover can use the above meta robots settings to specify max-image-preview of “standard” or “none.”
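As a minimal sketch of the settings described above, an AMP page that should keep standard-sized thumbnail previews could include:

<meta name="robots" content="max-image-preview:standard">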

These new options are available to content owners worldwide and will operate the same for results we display globally. We hope they make it easier for you to optimize the value you get from Search and achieve your business goals. For more information, check out our developer documentation on meta tags. Should you have any questions, feel free to reach out to us, or drop by our webmaster help forums.


We analyzed our user feedback, and today we'd like to announce a new improvement to the Performance report based on users' #1 feature request - improved data freshness!

The Performance report helps webmasters and site owners better understand how their site performs on Google search, and answer questions such as:
  • General stats: How much traffic did my site get from Search and Discover?
  • Search queries: What are my site’s top and trending search queries?
  • Top content: What are my site’s most successful pages on Google search? 
  • Site’s audiences: From which countries? From which devices - is it mostly mobile?
  • Formats: In which search formats does my site appear (AMP, recipes, etc.)?

With the new fresh data, users can now see data that is less than a day old - a significant improvement compared to the previous delay of a few days.

We hope this improved data freshness allows you to better monitor and track your site’s performance and addresses some important needs such as:
  • Seeing your weekend performance on Monday morning - no need to wait until Wednesday.
  • Checking on your site’s stats first thing in the morning after, or even during, important days such as holidays, global events, and shopping days.
  • Checking if your site's traffic rebounds soon after fixing an important technical issue.

Fresh Data in Search Performance report

In addition, we updated the report to clearly communicate the data timezone (Pacific time zone). This is useful when you’d like to interpret the data compared to your local time zone or integrate it with other sources such as Google Analytics.

Performance report date picker

Each fresh data point will be replaced with the final data point after a few days. It is expected that from time to time the fresh data might change a bit before being finalized.

The Search Analytics API does not support fresh data yet. In addition, fresh data is not available on the Discover performance report. As a result, properties that are eligible for Discover performance report will not see fresh data in their Overview report. We hope to address these items in the future.
Exporting performance data over time

We also heard your feedback about wanting a simple way to explore and export your performance over time. Starting today, this is possible. Simply choose ‘dates’ in the table below the graph, select the desired time frame, and explore the data in Search Console or export the chart. We hope that this new feature will help you further explore your performance trends and changes over time.

Performance report now with ‘dates’ table

In conclusion

We hope that this new fresh data will help you better monitor your site's performance and identify trends, patterns, and interesting changes much closer to when they happen. In addition, we hope that the new date table dimension will assist you in exploring performance trends and changes over time. If you have any questions or concerns, please reach out on the Webmaster Help Forum or on Twitter.

Posted by Ziv Hodak, Search Console product manager

Search results that are enhanced by review rich results can be extremely helpful when searching for products or services (the scores and/or “stars” you sometimes see alongside search results).
Review stars example in search results

To make them more helpful and meaningful, we are now introducing algorithmic updates to reviews in rich results. This also addresses some of the invalid or misleading implementations webmasters have flagged to us.

Focus on schema types that lend themselves to reviews

While, technically, you can attach review markup to any schema type, for many types displaying star reviews does not add much value for the user. With this change, we're limiting the pool of schema types that can potentially trigger review rich results in search. Specifically, we'll only display reviews for the schema types listed in our developer documentation (and their respective subtypes).

Self-serving reviews aren't allowed for LocalBusiness and Organization

Reviews that can be perceived as “self-serving” aren't in the best interest of users. We call reviews “self-serving” when a review about entity A is placed on the website of entity A - either directly in their markup or via an embedded 3rd party widget. That’s why, with this change, we’re not going to display review rich results anymore for the schema types LocalBusiness and Organization (and their subtypes) in cases when the entity being reviewed controls the reviews themselves.
Updated on September 18, 2019: To explain more, in the past, an entity like a business or an organization could add review markup about themselves to their home page or another page and often cause a review snippet to show for that page. That markup could have been added directly by the entity or embedded through the use of a third-party widget.
We consider this “self-serving” because the entity itself has chosen to add the markup to its own pages, about its own business or organization.
Self-serving reviews are no longer displayed for businesses and organizations (the LocalBusiness and Organization schema types). For example, we will no longer display rich review snippets for how people have reviewed a business, if those reviews are considered self-serving.

Reviews are allowed and displayed for other schema types listed in the documentation. For example, a cooking site might use recipe markup to summarize its visitor reviews. In turn, we might show review snippets from that markup when those recipes appear in Search.
FAQ

What if I'm using a third-party widget to display reviews about my business?
Google Search won't display review snippets for those pages. Embedding a third-party widget is seen as controlling the process of linking reviews.

Do I need to remove self-serving reviews from LocalBusiness or Organization?
No, you don't need to remove them. Google Search just won't display review snippets for those pages anymore.

Will I get a manual action for having self-serving reviews on my site?
You won’t get a manual action just for this. However, we recommend making sure that your structured data matches our guidelines.

Does this update affect my Google My Business listing/profile?
No, Google My Business is not affected as this update only relates to organic Search.

Will sites that gather reviews about other organizations be affected?
No, that’s unchanged. Sites that gather reviews can show up with review snippets (for their reviews of other organizations) in search results.

Does this update apply to AggregateRating too?
Yes. It applies to Review and AggregateRating.

How do I report if a self-serving review is still appearing in search results?
We’re considering creating a special form for this, if needed. We're slowly rolling out this change, so you may still see some cases of review stars where they shouldn't be.

Add the name of the item that's being reviewed

With this update, the name property is now required, so you'll want to make sure that you specify the name of the item that's being reviewed.
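As a hedged sketch (the item name and rating values below are purely illustrative, not taken from the documentation), recipe markup that satisfies the name requirement might look like:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Recipe",
  "name": "Example Apple Pie",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "ratingCount": "120"
  }
}
</script>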
This update will help deliver a much more meaningful review experience for users, while requiring little to no changes on the part of most webmasters. You can find all those updates documented in our developer documentation. If you have any questions, feel free to come to our webmaster forums!

Nearly 15 years ago, the nofollow attribute was introduced as a means to help fight comment spam. It also quickly became one of Google’s recommended methods for flagging advertising-related or sponsored links. The web has evolved since nofollow was introduced in 2005 and it’s time for nofollow to evolve as well.
Today, we’re announcing two new link attributes that provide webmasters with additional ways to identify to Google Search the nature of particular links. These, along with nofollow, are summarized below:

rel="sponsored": Use the sponsored attribute to identify links on your site that were created as part of advertisements, sponsorships or other compensation agreements.

rel="ugc": UGC stands for User Generated Content, and the ugc attribute value is recommended for links within user generated content, such as comments and forum posts.

rel="nofollow": Use this attribute for cases where you want to link to a page but don’t want to imply any type of endorsement, including passing along ranking credit to another page.

When nofollow was introduced, Google would not count any link marked this way as a signal to use within our search algorithms. This has now changed. All the link attributes -- sponsored, UGC and nofollow -- are treated as hints about which links to consider or exclude within Search. We’ll use these hints -- along with other signals -- as a way to better understand how to appropriately analyze and use links within our systems.
Why not completely ignore such links, as had been the case with nofollow? Links contain valuable information that can help us improve search, such as how the words within links describe content they point at. Looking at all the links we encounter can also help us better understand unnatural linking patterns. By shifting to a hint model, we no longer lose this important information, while still allowing site owners to indicate that some links shouldn’t be given the weight of a first-party endorsement.
We know these new attributes will generate questions, so here’s a FAQ that we hope covers most of those.

Do I need to change my existing nofollows?
No. If you use nofollow now as a way to block sponsored links, or to signify that you don’t vouch for a page you link to, that will continue to be supported. There’s absolutely no need to change any nofollow links that you already have.

Can I use more than one rel value on a link?
Yes, you can use more than one rel value on a link. For example, rel="ugc sponsored" is a perfectly valid attribute which hints that the link came from user-generated content and is sponsored. It’s also valid to use nofollow with the new attributes -- such as rel="nofollow ugc" -- if you wish to be backwards-compatible with services that don’t support the new attributes.

If I use nofollow for ads or sponsored links, do I need to change those?
No. You can keep using nofollow as a method for flagging such links to avoid possible link scheme penalties. You don't need to change any existing markup. If you have systems that append this to new links, they can continue to do so. However, we recommend switching over to rel=”sponsored” if or when it is convenient.

Do I still need to flag ad or sponsored links?
Yes. If you want to avoid a possible link scheme action, use rel=“sponsored” or rel=“nofollow” to flag these links. We prefer the use of “sponsored,” but either is fine and will be treated the same, for this purpose.

What happens if I use the wrong attribute on a link?
There’s no wrong attribute except in the case of sponsored links. If you flag a UGC link or a non-ad link as “sponsored,” we’ll see that hint but the impact -- if any at all -- would be at most that we might not count the link as a credit for another page. In this regard, it’s no different than the status quo of many UGC and non-ad links already marked as nofollow.
Going the opposite way is an issue, however: any link that is clearly an ad or sponsored should use “sponsored” or “nofollow,” as described above. Using “sponsored” is preferred, but “nofollow” is acceptable.

Why should I bother using any of these new attributes?
Using the new attributes allows us to better process links for analysis of the web. That can include your own content, if people who link to you make use of these attributes.

Won’t changing to a “hint” approach encourage link spam in comments and UGC content?
Many sites that allow third-parties to contribute to content already deter link spam in a variety of ways, including moderation tools that can be integrated into many blogging platforms and human review. The link attributes of “ugc” and “nofollow” will continue to be a further deterrent. In most cases, the move to a hint model won’t change the nature of how we treat such links. We’ll generally treat them as we did with nofollow before and not consider them for ranking purposes. We will still continue to carefully assess how to use links within Search, just as we always have and as we’ve had to do for situations where no attributions were provided.

When do these attributes and changes go into effect?
All the link attributes, sponsored, ugc and nofollow, now work today as hints for us to incorporate for ranking purposes. For crawling and indexing purposes, nofollow will become a hint as of March 1, 2020. Those depending on nofollow solely to block a page from being indexed (which was never recommended) should use one of the much more robust mechanisms listed on our Learn how to block URLs from Google help page.
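For example, one widely used mechanism is a robots noindex directive on the page itself (a minimal sketch; see the help page above for the option that fits your case):

<meta name="robots" content="noindex">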

Posted by Danny Sullivan and Gary

Today we are reaching another important milestone in our graduation journey: we are saying goodbye to many old Search Console reports, including the home and dashboard pages 👋. Those pages are part of the history of the web; they were viewed over a billion times by webmasters from millions of websites. These pages helped site owners and webmasters monitor and improve their performance on Google Search for over a decade.

From now on, if you try to access the old homepage or dashboard you’ll be redirected to the relevant Search Console pages. There are only a few reports that will still be available on the old interface for now - check the Legacy tools and reports in the Help Center. We're continuing to work on making the insights from these reports available in the new Search Console, so stay tuned!

Below is our last tribute to them, a picture of the team with the old Search Console in the background 😍. But we thought you might also have something to share, maybe some beautiful memories you have with the home and dashboard pages below (or any old Search Console page) - so we’ll be monitoring #SCmemories if you want to share your stories with us on Twitter.

Image: the team saying goodbye to the old Search Console

Image: old Search Console dashboard

Thank you for working together with us on making the web better - and see you at the new Search Console! If you have any feedback, let us know through the Webmasters community.

Posted by Hillel Maoz on behalf of the Search Console team.

Back in February, we announced domain-wide data in Search Console to give site owners a comprehensive view of their site and remove the need to switch between different properties to get the full picture of their data.

We’ve seen lots of positive reactions from users who verified domain properties. A common piece of feedback we heard is that before moving to domain properties, users were underestimating their traffic, and the new method helped them understand their aggregated clicks and impressions data more effectively. When we asked Domain property users about their satisfaction with the feature, almost all of them reported being satisfied. Furthermore, most of these users find domain properties more useful than the traditional URL prefix properties.

However, changing a DNS record is not always trivial, especially for small and medium businesses. We heard that the main challenge preventing site owners from switching to Domain properties is getting their domain verified. To help with this challenge, we collaborated with various domain name registrars to automate part of the verification flow. The flow will guide you through the necessary steps needed to update your registrar configuration so that your DNS record includes the verification token we provide. This will make the verification process a lot easier.
How to use Auto-DNS verification

To verify your domain using the new flow, click ‘add property’ in the property selector (the drop-down at the top of the Search Console sidebar), then choose the ‘Domain’ option. The system will guide you through a series of steps, including a visit to the registrar site where you need to apply changes - there will be fewer and easier steps than before for you to go through. You can learn more about verifying your site at the Help Center.
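For reference, the outcome of the flow is a DNS record that contains the verification token we provide; as a rough sketch (the record type and token value here are assumptions for illustration), such a TXT record might look like:

example.com.  IN  TXT  "google-site-verification=abc123"  ; token value is illustrative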


Image: Auto-DNS verification flow 

We hope you can use this new capability and gain ownership of your Domain property today. As always, please let us know if there is anything we can do to improve, via the product feedback button, the Webmasters community, or by mentioning us on Twitter.

Posted by Ruty Mundel, Search Console engineering team