Business management software development, website creation and search engine optimization, networking and maintenance, design








| DIRECTIVE | IMPACT | USE CASES |
| --- | --- | --- |
| Disallow | Tells a crawler not to crawl your site or specific paths on it. The robots.txt file itself still needs to be fetched to find this directive, but the disallowed pages will not be crawled. | 'No Crawl' pages on a site. In the default syntax, this directive prevents specific path(s) of a site from being crawled. |
| Allow | Tells a crawler which specific pages on your site you want crawled, so it can be used in combination with Disallow. | Particularly useful in conjunction with Disallow clauses, where a large section of a site is disallowed except for a small section within it. |
| $ Wildcard Support | Tells a crawler to match everything up to the end of a URL, covering a large number of URLs without listing each page. | 'No Crawl' files with specific patterns, for example files that always have a certain extension, such as .pdf. |
| * Wildcard Support | Tells a crawler to match any sequence of characters. | 'No Crawl' URLs with certain patterns, for example URLs with session IDs or other extraneous parameters. |
| Sitemaps Location | Tells a crawler where it can find your Sitemaps. | Point to other locations where feeds exist to help crawlers find URLs on a site. |
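The directives above can be combined in a single robots.txt file at the root of a site. A minimal sketch (the paths and sitemap URL are placeholders, not taken from the source):

```text
# Applies to all crawlers
User-agent: *

# Block an entire section, but allow one subdirectory inside it
Disallow: /private/
Allow: /private/public-docs/

# * matches any sequence of characters: block URLs carrying a session id
Disallow: /*?sessionid=

# $ anchors the match at the end of the URL: block all PDF files
Disallow: /*.pdf$

# Tell crawlers where the Sitemap lives
Sitemap: https://www.example.com/sitemap.xml
```

Note that Allow and Disallow apply per matched path, so the more specific Allow rule carves an exception out of the broader Disallow above it.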
| DIRECTIVE | IMPACT | USE CASES |
| --- | --- | --- |
| NOINDEX META Tag | Tells a crawler not to index a given page. | Don't index the page. This allows pages that are crawled to be kept out of the index. |
| NOFOLLOW META Tag | Tells a crawler not to follow the links to other content on a given page. | Prevent publicly writeable areas from being abused by spammers looking for link credit. By using NOFOLLOW you let the robot know that you are discounting all outgoing links from the page. |
| NOSNIPPET META Tag | Tells a crawler not to display snippets in the search results for a given page. | Present no snippet for the page in search results. |
| NOARCHIVE META Tag | Tells a search engine not to show a "cached" link for a given page. | Do not make a copy of the page from the search engine cache available to users. |
| NOODP META Tag | Tells a crawler not to use the title and snippet from the Open Directory Project for a given page. | Do not use the ODP (Open Directory Project) title and snippet for this page. |
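These directives go in the head of a page as robots META tags. A minimal sketch (the page title and content are placeholders):

```html
<!DOCTYPE html>
<html>
  <head>
    <title>Members-only archive</title>
    <!-- Keep this page out of the index and discount all outgoing links -->
    <meta name="robots" content="noindex, nofollow">
    <!-- Directives can also target a single crawler, e.g. Googlebot only -->
    <meta name="googlebot" content="nosnippet, noarchive">
  </head>
  <body>
    <p>Page content here.</p>
  </body>
</html>
```

Unlike robots.txt, which prevents a page from being crawled at all, these tags require the page to be crawled so the directives can be read; NOINDEX then keeps the crawled page out of the index.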
"Go Daddy is continually looking for ways to provide our customers the best user experience possible. That's the reason we partnered with Google on the 'Make the Web Faster' initiative. Go Daddy engineers are seeing a dramatic decrease in the load times of customers' websites using mod_pagespeed and the other technologies provided. We hope to provide the technology to our customers soon - not only for their benefit, but for their website visitors as well."

We're also working with Cotendo to integrate the core engine of mod_pagespeed as part of their Content Delivery Network (CDN) service.
Once you click “Submit to index” you’ll see a dialog box that allows you to choose whether you want to submit only the one URL, or that URL and all its linked pages.
When submitting individual URLs, there is a maximum of 50 submissions per week; when submitting URLs with all linked pages, the limit is 10 submissions per month. You can see how many submissions you have left on the Fetch as Googlebot page. Any URL submitted should point to content suitable for Google Web Search; if you're trying to submit images or videos, use Sitemaps instead.