
Today we mark an important milestone in Search Console's history: we are graduating the new Search Console out of beta! With this graduation, we are also launching the Manual Actions report and adding a “Test Live” capability to the recently launched URL Inspection tool. These join the stream of reports and features we have launched in the new Search Console over the past few months.

Our journey to the new Search Console

We launched the new Search Console at the beginning of the year. Since then we have been busy listening to and responding to your feedback, adding new features such as the URL Inspection Tool, and migrating key reports and features. Here's what the new Search Console gives you:

More data:

  • Get an accurate view of your website content using the Index Coverage report.
  • Review your Search Analytics data going back 16 months in the Performance report.
  • See information on links pointing to your site and within your site using the Links report.
  • Retrieve crawling, indexing, and serving information for any URL directly from the Google index using the URL Inspection Tool.

Better alerting and new "fixed it" flows:

  • Get automatic alerts and see a listing of pages affected by Crawling, Indexing, AMP, Mobile Usability, Recipes, or Job posting issues.
  • Reports now show the HTML code where we think a fix is necessary (if applicable).
  • Share information quickly with the relevant people in your organization to drive the fix.
  • Notify Google when you've fixed an issue. We will review your pages, validate whether the issue is fixed, and return a detailed log of the validation findings.

Simplified sitemaps and account settings management:

Out of Beta

While the old Search Console still has some features that are not yet available in the new one, we believe that the most common use cases are supported, in an improved way, in the new Search Console. When an equivalent feature exists in both old and new Search Console, our messages will point users to the new version. We'll also add a reminder link in the old report. After a reasonable period, we will remove the old report.

Read more about how to migrate from the old to the new Search Console, including a list of improved reports and how to perform common tasks, in our help center.

Manual Actions and Security Issues alerts

To ensure that you don't miss any critical alerts for your site, active manual actions and security issues will be shown directly on the Overview page in the new console. In addition, the Manual Actions report has gotten a fresher look in the new Search Console. From there, you can review the details for any pending Manual Action and, if needed, file a reconsideration request.

URL Inspection - Live mode and request indexing

The URL Inspection tool that we launched a few months ago now enables you to run the inspection on the live version of a page. This is useful for debugging and fixing issues on a page, or for confirming whether a reported issue still exists. If the issue is fixed on the live version of the page, you can ask Google to recrawl and index it.

We're not finished yet!

Your feedback is important to us! As we evolve Search Console, your feedback helps us to tune our efforts. You can still switch between the old and new products easily, so any missing functionality you need is just a few clicks away. We will continue working on moving more reports and tools as well as adding exciting new capabilities to the new Search Console.


As part of our reinvention of Search Console, we have been rethinking how we facilitate cooperation and accountability for our users. We decided to redesign the product around cooperative team usage and transparency of action history. The new Search Console will gradually provide better history tracking to show who performed which significant property-affecting changes, such as changing a setting, validating an issue, or submitting a new sitemap. In that spirit, we also plan to enable all users to see critical site messages.

New features

  • User management is now an integral part of Search Console.
  • The new Search Console enables you to share a read-only view of many reports, including Index coverage, AMP, and Mobile Usability. Learn more.
  • A new user management interface enables all users to see and, if appropriate, manage user roles for all property users.

New Role definition

  • To provide a simpler permission model, we are planning to limit the "restricted" user role to read-only status. Read-only users will still be able to see all information, but they will no longer be able to perform any state-changing actions, including starting a fix validation or sharing an issue.

Best practices

As a reminder, here are some best practices for managing user permissions in Search Console:

User feedback

As part of our beta exploration, we made the user management interface visible to all user roles. Some users reached out to request more time to prepare for the updated user management model, including the ability of restricted and full users to easily see a list of other collaborators on the site. We’ve taken that feedback and will hold off on that part of the launch. Stay tuned for more updates relating to collaboration tools and changes to our permission models.

As always, we love to hear feedback from our users. Feel free to use the feedback form within Search Console, and we welcome your discussions in our help forums as well!


More features are coming to the new Search Console. This time we've focused on importing existing popular features from the old Search Console to the new product.

Links Report

Search Console users value the ability to see links to and within their site, as Google Search sees them. Today, we are rolling out the new Links report, which combines the functionality of the “Links to your site” and “Internal Links” reports on the old Search Console. We hope you find this useful!

Mobile Usability report

Mobile usability is an important priority for all site owners. To help site owners fix mobile usability issues, we launched the Mobile Usability report in the new Search Console. Issue names are the same as in the old report, but users can now submit a validation and reindexing request when an issue is fixed, similar to other reports in the new Search Console.

Site and user management

To make the new Search Console feel more like home, we’ve added the ability to add and verify new sites, and to manage your property's users and permissions, directly in the new Search Console using our newly added settings page.

Keep sending feedback

As always, we would love to get your feedback through the tools directly and our help forums so please share and let us know how we're doing.


Since launching the Google Assistant in 2016, we have seen users ask questions about everything from weather to recipes and news. In order to fulfill news queries with results people can count on, we collaborated on a new schema.org structured data specification called speakable for eligible publishers to mark up sections of a news article that are most relevant to be read aloud by the Google Assistant.

When people ask the Google Assistant, "Hey Google, what's the latest news on NASA?", the Google Assistant responds with an excerpt from a news article and the name of the news organization. It then asks if the user would like to hear another news article and also sends the relevant links to the user's mobile device.

As a news publisher, you can surface your content on the Google Assistant by implementing Speakable markup according to the developer documentation. This feature is now available for English language users in the US and we hope to launch in other languages and countries as soon as a sufficient number of publishers have implemented speakable. As this is a new feature, we are experimenting over time to refine the publisher and user experience.
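To make this concrete, here is a minimal, hypothetical example of what such markup can look like on a news article page. The headline, URL, and CSS selectors below are placeholders; the speakable property uses a schema.org SpeakableSpecification that points (via cssSelector, or alternatively xpath) at the sections best suited to be read aloud.

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "NewsArticle",
  "headline": "NASA announces a new Earth-observation mission",
  "url": "https://example.com/news/nasa-mission",
  "speakable": {
    "@type": "SpeakableSpecification",
    "cssSelector": [".article-headline", ".article-summary"]
  }
}
</script>

Sections selected for speakable work best when they are short and self-contained, since the Google Assistant reads them aloud as a spoken excerpt.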

If you have any questions, ask us in the Webmaster Help Forum. We look forward to hearing from you!

UPDATE: After testing and further consideration, we have determined that the best place to measure query and click traffic from Google Images is in the Search Console Performance Report. Accordingly, we will continue to use https://www.google.com (or the appropriate ccTLD) as the referrer URL for all traffic from Google Images, and will not be providing a Google Images specific referrer URL (images.google.com).

Every day, hundreds of millions of people use Google Images to visually discover and explore content on the web. Whether it's finding ideas for your next baking project or visual instructions on how to fix a flat tire, exploring image results can sometimes be much more helpful than exploring text.

Updating the referral source

For webmasters, it hasn't always been easy to understand the role Google Images plays in driving site traffic. To address this, we will roll out a new referrer URL specific to Google Images over the next few months. The referrer URL is part of the HTTP request header and indicates the last page the user was on when they clicked through to the destination webpage.
If you create software to track or analyze website traffic, we want you to be prepared for this change. Make sure that you are ingesting the new referrer URL and attributing that traffic to Google Images. The new referrer URL is: https://images.google.com.
If you use Google Analytics to track site data, the new referral URL will be ingested automatically and traffic will be attributed to Google Images appropriately. To be clear, this change will not affect Search Console; webmasters will continue to receive an aggregate list of top search queries that drive traffic to their site.

How this affects country-specific queries

The new referrer URL has the same country-code top-level domain (ccTLD) as the URL used for searching on Google Images. In practice, this means that most visitors worldwide will come from images.google.com, because last year we made google.com the default choice for searchers worldwide. However, some users may still choose to go directly to a country-specific service, such as google.co.uk for the UK. In that case, the referrer uses that country's TLD (for example, images.google.co.uk).
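For illustration, here is a small sketch (in Python, standard library only) of how a traffic-analysis pipeline might classify a visit based on this referrer. The function name and category labels are invented for this example, and, per the update at the top of this post, the dedicated images.google.com referrer was ultimately not rolled out, so the sketch also handles the regular google.com referrer.

from urllib.parse import urlparse

def classify_referrer(referer_header: str) -> str:
    # Hypothetical helper: map the HTTP Referer header to a traffic-source label.
    host = (urlparse(referer_header).hostname or "").lower()
    if host.startswith("images.google."):
        return "google images"   # covers ccTLDs such as images.google.co.uk
    if host.startswith("www.google."):
        return "google search"   # may include image traffic (see the update above)
    return "other"

print(classify_referrer("https://images.google.co.uk/"))  # -> google images
print(classify_referrer("https://www.google.com/"))       # -> google search
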
We hope this change will foster a healthy visual content ecosystem. If you're interested in learning how to optimize your pages for Google Images, please refer to the Google Image Publishing Guidelines. If you have questions, feedback or suggestions, please let us know through the Webmaster Tools Help Forum.

Note: The information in this post may be outdated. See our latest post about reporting spam.





We always want to make sure that when you use Google Search to find information, you get the highest quality results. But, we are aware of many bad actors who are trying to manipulate search ranking and profit from it, which is at odds with our core mission: to organize the world's information and make it universally accessible and useful. Over the years, we've devoted a huge effort toward combating abuse and spam on Search. Here's a look at how we fought abuse in 2017.


We call these various types of abuse that violate the webmaster guidelines "spam." Our evaluation indicates that for many years, less than 1 percent of the search results users visited have been spammy. In the last couple of years, we've managed to further reduce this by half.



Google webspam trends and how we fought webspam in 2017

As we continued to improve, spammers also evolved. One trend in 2017 was an increase in website hacking, both to spam search rankings and to spread malware. Hacked websites are a serious threat to users because hackers can take complete control of a site, deface homepages, erase relevant content, or insert malware and harmful code. They may also record keystrokes, stealing login credentials for online banking or financial transactions. In 2017 we focused on reducing this threat and were able to detect and remove more than 80 percent of these sites from search results. But hacking is not just a spam problem for search users; it affects the owners of websites as well. To help website owners keep their websites safe, we created a hands-on resource to help webmasters strengthen their websites’ security and revamped our help resources to help webmasters recover from a hacked website. The guides are available in 19 languages.

We also recognize the importance of robust content management systems (CMSs). A large percentage of websites run on one of several popular CMSs, and spammers exploit this by finding ways to abuse their provisions for user-generated content, such as posting spam content in comment sections or forums. We’re working closely with many providers of popular content management systems like WordPress and Joomla to help them fight spammers that abuse their forums, comment sections, and websites.


Another abuse vector is the manipulation of links, one of the foundational ranking signals for Search. In 2017 we doubled down on our efforts to remove unnatural links through ranking improvements and scalable manual actions, and we observed a year-over-year reduction in spam links of almost half.


Working with users and webmasters for a better web

We’re here to listen: Our automated systems are constantly working to detect and block spam. Still, we always welcome hearing from you when something seems … phishy. Last year, we were able to take action on nearly 90,000 user reports of search spam.


Reporting spam, malware and other issues you find helps us protect the site owner and other searchers from this abuse. You can file a spam report, a phishing report or a malware report. We very much appreciate these reports—a big THANK YOU to all of you who submitted them.


We also actively work with webmasters to maintain the health of the web ecosystem. Last year, we sent 45 million messages to registered website owners via Search Console letting them know about issues we identified with their websites. More than 6 million of these messages were related to manual actions, providing transparency to webmasters so they understand why their sites received manual actions and how to resolve the issues.

Last year, we released a beta version of a new Search Console to a limited number of users and afterwards, to all users of Search Console. We listened to what matters most to the users, and started with popular functionalities such as Search performance, Index Coverage and others. These can help webmasters optimize their websites' Google Search presence more easily.

Through enhanced Safe Browsing protections, we continue to protect more users from bad actors online. In the last year, we made significant improvements to our Safe Browsing protection, such as broadening our protection of macOS devices, enabling predictive phishing protection in Chrome, cracking down on unwanted mobile software, and significantly improving our ability to protect users from deceptive Chrome extension installation.


We have a multitude of channels to engage directly with webmasters. We have dedicated team members who meet with webmasters regularly, both online and in person. We conducted more than 250 online office hours, online events, and offline events around the world in more than 60 cities, reaching audiences totaling over 220,000 website owners, webmasters, and digital marketers. In addition, our official support forum answered a high volume of questions in many languages; last year the forum saw 63,000 threads generating over 280,000 contributing posts from 100+ Top Contributors globally. For more details, see this post. Apart from the forums, blogs, and the SEO starter guide, the Google Webmaster YouTube channel is another place to find tips and insights. We launched a new SEO Snippets video series with short, to-the-point answers to specific questions. Be sure to subscribe to the channel!


Despite all these improvements, we know we’re not yet done. We’re relentless in our pursuit of an abuse-free user experience, and we will keep improving our collaboration with the ecosystem to make it happen.



Posted by Cody Kwok, Principal Engineer

Last June we launched a job search experience that has since connected tens of millions of job seekers around the world with relevant job opportunities from third party providers across the web. Timely indexing of new job content is critical because many jobs are filled relatively quickly. Removal of expired postings is important because nothing's worse than finding a great job only to discover it's no longer accepting applications.

Today we're releasing the Indexing API to address this problem. The API allows any site owner to directly notify Google when job posting pages are added or removed, so Google can schedule those pages for a fresh crawl, which can lead to higher quality user traffic and job applicant satisfaction. Currently, the Indexing API can only be used for job posting pages that include job posting structured data.
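For reference, eligibility depends on JobPosting structured data being present on the page. A minimal, hypothetical JSON-LD example (all values below are placeholders) might look like this:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "JobPosting",
  "title": "Software Engineer",
  "description": "<p>Example Corp is looking for a software engineer to join our team.</p>",
  "datePosted": "2018-06-25",
  "validThrough": "2018-07-25T00:00:00Z",
  "hiringOrganization": {
    "@type": "Organization",
    "name": "Example Corp",
    "sameAs": "https://example.com"
  },
  "jobLocation": {
    "@type": "Place",
    "address": {
      "@type": "PostalAddress",
      "streetAddress": "123 Main St",
      "addressLocality": "Mountain View",
      "addressRegion": "CA",
      "postalCode": "94043",
      "addressCountry": "US"
    }
  }
}
</script>

The validThrough date is what makes expiration handling possible: once a posting closes, the page should be updated or removed and Google notified, as described below.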

For websites with many short-lived pages like job postings, the Indexing API keeps job postings fresh in Search results because it allows updates to be pushed individually. This API can be integrated into your job posting flow, allowing high quality job postings to be searchable quickly after publication. In addition, you can check the last time Google received each kind of notification for a given URL.
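As a rough sketch of how such an integration might look (this is not the official client library; the key file path and job URL are placeholders, and it assumes the requests and google-auth Python packages plus a service account that is an owner of the Search Console property):

# Minimal sketch of notifying Google about a changed job posting URL
# via the Indexing API publish endpoint.
import json

import requests
from google.oauth2 import service_account
from google.auth.transport.requests import Request

SCOPES = ["https://www.googleapis.com/auth/indexing"]
ENDPOINT = "https://indexing.googleapis.com/v3/urlNotifications:publish"

# Placeholder key file for an authorized service account.
credentials = service_account.Credentials.from_service_account_file(
    "service_account.json", scopes=SCOPES)
credentials.refresh(Request())  # obtain an OAuth2 access token

notification = {
    "url": "https://example.com/jobs/software-engineer-12345",  # placeholder URL
    "type": "URL_UPDATED",  # use "URL_DELETED" when the posting expires
}
response = requests.post(
    ENDPOINT,
    headers={
        "Authorization": f"Bearer {credentials.token}",
        "Content-Type": "application/json",
    },
    data=json.dumps(notification),
)
print(response.status_code, response.json())

Per the API documentation, the companion metadata endpoint (urlNotifications/metadata) returns the time Google last received each kind of notification for a given URL, which is the status check mentioned above.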

Follow the Quickstart guide to see how the Indexing API works. If you have any questions, ask us in the Webmaster Help Forum. We look forward to hearing from you!