
From the Google Webmaster Central blog: When we launched this blog in early August, we said goodbye to the Inside Google Sitemaps blog and started redirecting it here. The redirect makes the posts we did there a little difficult to get to. For those of you who started reading with this newer blog, here are links to some of the older posts that may be of interest.

Webmaster Tools Account questions

And you can always browse the blog's archives.
From the Google Webmaster Central blog:
Webmaster Level: All

We just launched a new feature that allows you as a verified site owner to grant limited access to your site's data and settings in Webmaster Tools. You've had the ability to grant full verified access to others for a couple of years. Since then we've heard lots of requests from site owners for the ability to grant limited permission for others to view a site's data in Webmaster Tools without being able to modify all the settings. Now you can do exactly that with our new User administration feature.

On the Home page, when you click the "Manage site" drop-down menu, you'll see that the menu option previously titled "Add or remove owners" is now "Add or remove users."


Selecting the "Add or remove users" menu item takes you to the new User administration page, where you can add or delete up to 100 users and specify each user's access as "Full" or "Restricted." Users added via the User administration page are tied to a specific site: if you become unverified for that site, any users you've added will lose their access to it in Webmaster Tools. Adding or removing verified site owners is still done on the owner verification page, which is linked from the User administration page.


Granting a user "Full" permission means that they will be able to view all data and take most actions, such as changing site settings or demoting sitelinks. Users with "Restricted" permission can view most data and take only a limited set of actions, such as using Fetch as Googlebot and configuring message forwarding for their own account. Restricted users will see a "Restricted Access" indicator at various locations within Webmaster Tools.



To see which features and actions are accessible for Restricted users, Full users and site owners, visit our Permissions Help Center article.

We hope the addition of Full and Restricted users makes management of your site in Webmaster Tools easier, since you can now grant access within a more limited scope to help prevent undesirable or unauthorized changes. If you have questions or feedback about the new User administration feature, please let us know in our Help Forum.

From the Google Webmaster Central blog:
Recently, Danny Sullivan brought up good questions about how search engines handle meta tags. Here are some answers about how we handle these tags at Google.

Multiple content values
We recommend that you place all content values in one meta tag. This keeps the meta tags easy to read and reduces the chance of conflicts. For instance:

<META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">

If the page contains multiple meta tags of the same type, we will aggregate the content values. For instance, we will interpret

<META NAME="ROBOTS" CONTENT="NOINDEX">
<META NAME="ROBOTS" CONTENT="NOFOLLOW">

the same way as:

<META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">

If content values conflict, we will use the most restrictive. So, if the page has these meta tags:

<META NAME="ROBOTS" CONTENT="NOINDEX">
<META NAME="ROBOTS" CONTENT="INDEX">

we will obey the NOINDEX value.

Unnecessary content values
By default, Googlebot will index a page and follow the links on it. So there's no need to tag pages with content values of INDEX or FOLLOW.
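For instance, a tag like the one below only restates that default behavior, so it can simply be left out:

<META NAME="ROBOTS" CONTENT="INDEX, FOLLOW">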

Directing a robots meta tag specifically at Googlebot
To provide instruction for all search engines, set the meta name to "ROBOTS". To provide instruction for only Googlebot, set the meta name to "GOOGLEBOT". If you want to provide different instructions for different search engines (for instance, if you want one search engine to index a page, but not another), it's best to use a specific meta tag for each search engine rather than use a generic robots meta tag combined with a specific one. You can find a list of bots at robotstxt.org.
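For example, this tag gives an instruction to all search engines:

<META NAME="ROBOTS" CONTENT="NOINDEX">

while this one gives the same instruction to Googlebot only, leaving other search engines unaffected:

<META NAME="GOOGLEBOT" CONTENT="NOINDEX">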

Casing and spacing
Googlebot understands any combination of lowercase and uppercase. So each of these meta tags is interpreted in exactly the same way:

<meta name="ROBOTS" content="NOODP">
<meta name="robots" content="noodp">
<meta name="Robots" content="NoOdp">

If you have multiple content values, you must place a comma between them, but it doesn't matter if you also include spaces. So the following meta tags are interpreted the same way:

<META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">
<META NAME="ROBOTS" CONTENT="NOINDEX,NOFOLLOW">

If you use both a robots.txt file and robots meta tags
If the robots.txt and meta tag instructions for a page conflict, Googlebot follows the most restrictive (see the example after this list). More specifically:
  • If you block a page with robots.txt, Googlebot will never crawl the page and will never read any meta tags on the page.
  • If you allow a page with robots.txt but block it from being indexed using a meta tag, Googlebot will access the page, read the meta tag, and subsequently not index it.
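For example, take a hypothetical page at /private.html. If robots.txt contains

User-agent: *
Disallow: /private.html

Googlebot will never fetch that page, so a meta tag such as

<META NAME="ROBOTS" CONTENT="NOINDEX">

on it would never be read. For the meta tag to take effect, the page must not be disallowed in robots.txt; Googlebot can then crawl it, read the tag, and keep it out of the index.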
Valid meta robots content values
Googlebot interprets the following robots meta tag values:
  • NOINDEX - prevents the page from being included in the index.
  • NOFOLLOW - prevents Googlebot from following any links on the page. (Note that this is different from the link-level NOFOLLOW attribute, which prevents Googlebot from following an individual link; see the example after this list.)
  • NOARCHIVE - prevents a cached copy of this page from being available in the search results.
  • NOSNIPPET - prevents a description from appearing below the page in the search results, as well as prevents caching of the page.
  • NOODP - blocks the Open Directory Project description of the page from being used in the description that appears below the page in the search results.
  • NONE - equivalent to "NOINDEX, NOFOLLOW".
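To illustrate the note about NOFOLLOW above: the page-level directive

<META NAME="ROBOTS" CONTENT="NOFOLLOW">

tells Googlebot not to follow any link on the page, whereas the link-level attribute applies to a single link, as in this hypothetical example:

<a href="http://www.example.com/page.html" rel="nofollow">Example link</a>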
A word about content value "NONE"
As defined by robotstxt.org, the following directive means NOINDEX, NOFOLLOW.

<META NAME="ROBOTS" CONTENT="NONE">

However, some webmasters use this tag to indicate no robots restrictions and inadvertently block all search engines from their content.

Update: For more information, please see our robots meta tag documentation.