
from web contents: Making harmonious use of Webmaster Tools and Analytics 2013

Written by Reid Yokoyama, Search Quality Team

Occasionally in the discussion group, webmasters ask, "Should I be using Google Webmaster Tools or Google Analytics?" Our answer is: use both! Here are three scenarios that really highlight the power of both tools.

1. Make the most of your impressions
One of my favorite features of Webmaster Tools is that it will show you the Top 20 search queries your site appeared for along with the Top 20 clicked queries. The data from the Top Search Queries allows you to quickly pinpoint what searches your site appears for and which of those searches are resulting in clicks. Let's look at last week's data for www.google.com/webmasters as an example.


As you can see, Google Webmaster Central is receiving a great number of impressions for the query [gadgets] but may not be fully capitalizing on these impressions with user clicks. Click on [gadgets] to see how your site appears in our search results. Do your title and snippet look appealing to users? As my colleague Michael recently wrote, it might be time to do some "housekeeping" on your website -- it's a great, low-to-no-cost way to catch the attention of your users. For example, we could work to improve our snippet from:

To something more readable such as "Use gadgets to easily add cool, dynamic content to your site..." by adding a meta description to the page.
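That description would be supplied with a meta tag in the page's <head>; here's a minimal sketch using the wording above:

   <head>
     <title>Google Webmaster Central</title>
     <meta name="description"
           content="Use gadgets to easily add cool, dynamic content to your site...">
   </head>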

And what are users doing when they visit your site? Are they browsing your content or bouncing off your site quickly? To find out, Google Analytics will calculate your site's "bounce rate," or the percentage of single-page visits (e.g. someone just visiting your homepage and then leaving). This can be a helpful measure of the quality of your site's landing page and the traffic your site receives. After all, once you've worked hard to get your users to visit your site, you want to keep them there! Check out the Analytics blog for further information about "bounce rate."
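As a quick sketch of the arithmetic behind that number (all figures here are invented):

   # Bounce rate: single-page visits as a share of all visits.
   single_page_visits = 400
   total_visits = 1000
   bounce_rate = 100.0 * single_page_visits / total_visits
   print("Bounce rate: %.1f%%" % bounce_rate)  # Bounce rate: 40.0%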

2. Perform smart geo-targeting
Let's imagine you have a .com domain that you want to target at the Japanese market. Webmaster Tools allows you to set a geographic target for your site; here you would probably pick Japan. But setting that target is not an immediate solution on its own. You can confirm the location of your visitors using the map overlay in Analytics, down to the city level. You can also discover what types of users are accessing your site - including their browser and connection speed. If users cannot access your website due to an incompatible browser or slower connection speeds, you may need to rethink your website's design. Doing so can go a long way toward achieving the level of relevant traffic you would like.

3. Control access to sensitive content
One day, you log into Analytics and look at your "Content by Title" data. You shockingly discover that users are visiting your /privatedata pages. Have no fear! Go into Webmaster Tools and use the URL removal tool to remove those pages from Google's search results. Modifying your robots.txt file will also block Googlebot from crawling that section of your site in the future.

For more tips and tricks on Analytics, check out the Analytics Help Center. If you have any more suggestions, feel free to comment below or in our Webmaster Help Group.

from web contents: Introducing the Structured Data Dashboard 2013

Webmaster level: All

Structured data is becoming an increasingly important part of the web ecosystem. Google makes use of structured data in a number of ways including rich snippets which allow websites to highlight specific types of content in search results. Websites participate by marking up their content using industry-standard formats and schemas.

To provide webmasters with greater visibility into the structured data that Google knows about for their website, today we're introducing a new feature in Webmaster Tools: the Structured Data Dashboard. The Structured Data Dashboard has three views: site, item type, and page-level.

Site-level view
At the top level, the Structured Data Dashboard, which is under Optimization, aggregates this data by root item type and vocabulary schema. A root item type is an item that is not an attribute of another item on the same page. For example, the site below has about 2 million schema.org annotations for Books ("http://schema.org/Book").
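As a rough illustration, a Book annotation like the ones counted here could be marked up with schema.org microdata along these lines (the title and author below are invented for the example):

   <div itemscope itemtype="http://schema.org/Book">
     <span itemprop="name">A Tale of Two Cities</span>
     by <span itemprop="author">Charles Dickens</span>
   </div>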


Itemtype-level view
It also provides per-page details for each item type, as seen below:


Google parses and stores a fixed number of pages for each site and item type, kept in decreasing order of crawl time (most recently crawled first). We also keep all of their structured data markup. For certain item types we also provide specialized preview columns, as seen in the example below (e.g. "Name" is specific to schema.org Product).


The default sort order puts the most recently added structured data first, making new markup easy to inspect.

Page-level view
Last but not least, we have a details page showing all attributes of every item type on the given page (as well as a link to the Rich Snippet testing tool for the page in question).


Webmasters can use the Structured Data Dashboard to verify that Google is picking up new markup, as well as to detect problems with existing markup, for example by monitoring potential changes in instance counts during site redesigns.


from web contents: Reorganizing internal vs. external backlinks 2013

Webmaster level: All

Today we’re making a change to the way we categorize link data in Webmaster Tools. As you know, Webmaster Tools lists links pointing to your site in two separate categories: links coming from other sites, and links from within your site. Today’s update won’t change your total number of links, but will hopefully present your backlinks in a way that more closely aligns with your idea of which links are actually from your site vs. from other sites.

You can manage many different types of sites in Webmaster Tools: a plain domain name (example.com), a subdomain (www.example.com or cats.example.com), or a domain with a subfolder path (www.example.com/cats/ or www.example.com/users/catlover/). Previously, only links that started with your site’s exact URL would be categorized as internal links: so if you entered www.example.com/users/catlover/ as your site, links from www.example.com/users/catlover/profile.html would be categorized as internal, but links from www.example.com/users/ or www.example.com would be categorized as external links. This also meant that if you entered www.example.com as your site, links from example.com would be considered external because they don’t start with the same URL as your site (they don’t contain www).

Most people think of example.com and www.example.com as the same site these days, so we’re changing it such that now, if you add either example.com or www.example.com as a site, links from both the www and non-www versions of the domain will be categorized as internal links. We’ve also extended this idea to include other subdomains, since many people who own a domain also own its subdomains—so links from cats.example.com or pets.example.com will also be categorized as internal links for www.example.com.

Links for www.google.com:

Previously categorized as...
   External links: www.example.com/, www.example.org/stuff.html, scholar.google.com/, sketchup.google.com/, google.com/
   Internal links: www.google.com/, www.google.com/stuff.html, www.google.com/support/webmasters/

Now categorized as...
   External links: www.example.com/, www.example.org/stuff.html
   Internal links: scholar.google.com/, sketchup.google.com/, google.com/, www.google.com/, www.google.com/stuff.html, www.google.com/support/webmasters/

If you own a site that's on a subdomain (such as cats.example.com) or in a subfolder (www.google.com/support/webmasters/) and don't own the root domain, you'll still only see links from URLs starting with that subdomain or subfolder in your internal links, and all others will be categorized as external links. We've made a few backend changes so that these numbers should be even more accurate for you.

Note that, if you own a root domain like example.com or www.example.com, your number of external links may appear to go down with this change; this is because, as described above, some of the URLs we were previously classifying as external links will have moved into the internal links report. Your total number of links (internal + external) should not be affected by this change.
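To make the new rules concrete, here is a rough Python sketch of the categorization logic described above. The function names are ours, and the domain parsing is deliberately simplified - a real implementation would consult the Public Suffix List:

   from urllib.parse import urlparse

   def registered_domain(host):
       """Crude approximation: the last two labels of the hostname."""
       return ".".join(host.split(".")[-2:])

   def is_internal(link_url, site_url, owns_root=True):
       """Categorize a backlink as internal (True) or external (False).

       If you registered a root domain (example.com or www.example.com),
       links from any subdomain of that domain now count as internal.
       If you only own a subdomain or subfolder, only links whose URL
       starts with your site's URL count as internal, as before.
       """
       link, site = urlparse(link_url), urlparse(site_url)
       if owns_root:
           return registered_domain(link.netloc) == registered_domain(site.netloc)
       return link.netloc == site.netloc and link.path.startswith(site.path)

   print(is_internal("http://cats.example.com/", "http://www.example.com/"))  # True
   print(is_internal("http://www.example.org/a", "http://www.example.com/"))  # False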

As always, drop us a comment or join our Webmaster Help Forum if you have questions!


from web contents: Better geographic choices for webmasters 2013

Written by Amanda Camp, Webmaster Tools and Trystan Upstill, International Search Quality Team

Starting today Google Webmaster Tools helps you better control the country association of your content on a per-domain, per-subdomain, or per-directory level. The information you give us will help us determine how your site appears in our country-specific search results, and also improves our search results for geographic queries.

We currently only allow you to associate your site with a single country and location. If your site is relevant to an even more specific area, such as a particular state or region, feel free to tell us that. Or let us know if your site isn't relevant to any particular geographic location at all. If no information is entered in Webmaster Tools, we'll continue to make geographic associations largely based on the top-level domain (e.g. .co.uk or .ca) and the IP address of the webserver from which the content was served.

For example, if we wanted to associate www.google.com with Hungary:


But say you don't want www.google.com/webmasters/tools associated with any country:


This feature isn't available for sites with a country code top-level domain, as we'll always associate such a site with its country's domain. (For example, google.ru will always be the version of Google associated with Russia.)


Note that in the same way that Google may show your business address if you register your brick-and-mortar business with the Google Local Business Center, we may show the information that you give us publicly.

This feature was largely initiated by your feedback, so thanks for the great suggestion. Google is always committed to helping more sites and users get better and more relevant results. This is a new step as we continue to think about how to improve searches around the world.

We encourage you to tell us what you think in the Webmaster Tools section of our discussion group.

from web contents: URL removal explained, Part I: URLs & directories 2013

Webmaster level: All

There's a lot of content on the Internet these days. At some point, something may turn up online that you would rather not have out there—anything from an inflammatory blog post you regret publishing, to confidential data that accidentally got exposed. In most cases, deleting or restricting access to this content will cause it to naturally drop out of search results after a while. However, if you urgently need to remove unwanted content that has gotten indexed by Google and you can't wait for it to naturally disappear, you can use our URL removal tool to expedite the removal of content from our search results as long as it meets certain criteria (which we'll discuss below).

We've got a series of blog posts lined up for you explaining how to successfully remove various types of content, and common mistakes to avoid. In this first post, I'm going to cover a few basic scenarios: removing a single URL, removing an entire directory or site, and reincluding removed content. I also strongly recommend our previous post on managing what information is available about you online.

Removing a single URL

In general, in order for your removal requests to be successful, the owner of the URL(s) in question—whether that's you, or someone else—must have indicated that it's okay to remove that content. For an individual URL, this can be indicated in any of three ways: by disallowing the URL in the site's robots.txt file, by adding a noindex meta tag to the page, or by having the page return a 404 or 410 status code.

Before submitting a removal request, you can check whether the URL is correctly blocked:
  • robots.txt: You can check whether the URL is correctly disallowed using either the Fetch as Googlebot or Test robots.txt features in Webmaster Tools.
  • noindex meta tag: You can use Fetch as Googlebot to make sure the meta tag appears somewhere between the <head> and </head> tags. If you want to check a page you can't verify in Webmaster Tools, you can open the URL in a browser, go to View > Page source, and make sure you see the meta tag between the <head> and </head> tags.
  • 404 / 410 status code: You can use Fetch as Googlebot, or tools like Live HTTP Headers or web-sniffer.net, to verify whether the URL is actually returning the correct code. Sometimes "deleted" pages may say "404" or "Not found" on the page, but actually return a 200 status code in the HTTP header; so it's good to use a proper header-checking tool to double-check (see the sketch after this list).
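Here is a minimal sketch of such a header check in Python, assuming the third-party requests library is installed (the URL is a placeholder):

   import requests

   # "Deleted" pages sometimes display "404" in the body but still return
   # HTTP 200, so check the status code from the response header itself.
   resp = requests.get("http://www.example.com/deleted-page.html",
                       allow_redirects=False, timeout=10)
   print(resp.status_code)  # should be 404 or 410 before requesting removal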
If unwanted content has been removed from a page but the page hasn't been blocked in any of the above ways, you will not be able to completely remove that URL from our search results. This is most common when you don't own the site that's hosting that content. We cover what to do in this situation in Part II of our removals series.

If a URL meets one of the above criteria, you can remove it by going to http://www.google.com/webmasters/tools/removals, entering the URL that you want to remove, and selecting the "Webmaster has already blocked the page" option. Note that you should enter the URL where the content was hosted, not the URL of the Google search where it's appearing. For example, enter
   http://www.example.com/embarrassing-stuff.html
not
   http://www.google.com/search?q=embarrassing+stuff

This article has more details about making sure you're entering the proper URL. Remember that if you don't tell us the exact URL that's troubling you, we won't be able to remove the content you had in mind.

Removing an entire directory or site

In order for a directory or site-wide removal to be successful, the directory or site must be disallowed in the site's robots.txt file. For example, in order to remove the http://www.example.com/secret/ directory, your robots.txt file would need to include:
   User-agent: *
   Disallow: /secret/

It isn't enough for the root of the directory to return a 404 status code, because it's possible for a directory to return a 404 but still serve out files underneath it. Using robots.txt to block a directory (or an entire site) ensures that all the URLs under that directory (or site) are blocked as well. You can test whether a directory has been blocked correctly using either the Fetch as Googlebot or Test robots.txt features in Webmaster Tools.

Only verified owners of a site can request removal of an entire site or directory in Webmaster Tools. To request removal of a directory or site, click on the site in question, then go to Site configuration > Crawler access > Remove URL. If you enter the root of your site as the URL you want to remove, you'll be asked to confirm that you want to remove the entire site. If you enter a subdirectory, select the "Remove directory" option from the drop-down menu.

Reincluding content

You can cancel removal requests for any site you own at any time, including those submitted by other people. In order to do so, you must be a verified owner of this site in Webmaster Tools. Once you've verified ownership, you can go to Site configuration > Crawler access > Remove URL > Removed URLs (or > Made by others) and click "Cancel" next to any requests you wish to cancel.

Still have questions? Stay tuned for the rest of our series on removing content from Google's search results. If you can't wait, much has already been written about URL removals, and troubleshooting individual cases, in our Help Forum. If you still have questions after reading others' experiences, feel free to ask. Note that, in most cases, it's hard to give relevant advice about a particular removal without knowing the site or URL in question. We recommend sharing your URL by using a URL shortening service so that the URL you're concerned about doesn't get indexed as part of your post; some shortening services will even let you disable the shortcut later on, once your question has been resolved.

Edit: Read the rest of this series:
Part II: Removing & updating cached content
Part III: Removing content you don't own
Part IV: Tracking requests, what not to remove

Companion post: Managing what information is available about you online


from web contents: Update on penalty notifications 2013


First, a brief recap: In late 2005, we started emailing webmasters to let them know that their site is violating our Webmaster Guidelines and that we have temporarily removed some of their pages from our index. A few months ago we put these emails on hold due to a number of spoofed messages being sent from outside Google, primarily to German webmasters. Then, in mid-July, we launched Message Center in our webmaster console, which allows us to send messages to verified site owners.

While Message Center is great for verified site owners, it doesn't allow us to notify webmasters who aren't registered in Google's Webmaster Tools. For this reason, we plan to resume sending emails in addition to the Message Center notifications. Please note that, as before, our emails will not include attachments. Currently, the Message Center won't keep messages waiting if you haven't previously registered, but we hope to add that feature in the next few months. We'll keep you posted as things change.

from web contents: Site Errors Breakdown 2013

Webmaster level: All

Today we're announcing more detailed Site Error information in Webmaster Tools. This information is useful when looking for the source of your Site Errors. For example, if your site suffers from server connectivity problems, your server may simply be misconfigured; then again, it could also be completely unavailable! Since each Site Error category (DNS, Server Connectivity, and Robots.txt Fetch) comprises several distinct issues, we've broken down each category into more specific errors to provide you with a better analysis of your site's health.

Site Errors will display statistics for each of your site-wide crawl errors from the past 90 days.  In addition, it will show the failure rates for any category-specific errors that have been affecting your site.




If you’re not sure what a particular error means, you can read a short description of it by hovering over its entry in the legend.  You can find more detailed information by following the “More info” link in the tooltip.


We hope that these changes will make Site Errors even more informative and helpful in keeping your site in tip-top shape.  If you have any questions or suggestions, please let us know through the Webmaster Tools Help Forum.

Written by Cesar Cuenca and Tiffany Wang, Webmaster Tools Interns

from web contents: Work smarter, not harder, with site health 2013

Webmaster level: All

We consistently hear from webmasters that they have to prioritize their time. Some manage dozens or hundreds of clients’ sites; others run their own business and may only have an hour to spend on website maintenance in between managing finances and inventory. To help you prioritize your efforts, Webmaster Tools is introducing the idea of “site health,” and we’ve redesigned the Webmaster Tools home page to highlight your sites with health problems. This should allow you to easily see what needs your attention the most, without having to click through all of the reports in Webmaster Tools for every site you manage.

Here’s what the new home page looks like:


You can see that sites with health problems are shown at the top of the list. (If you prefer, you can always switch back to listing your sites alphabetically.) To see the specific issues we detected on a site, click the site health icon or the “Check site health” link next to that site:


This new home page is currently only available if you have 100 or fewer sites in your Webmaster Tools account (either verified or unverified). We’re working on making it available to all accounts in the future. If you have more than 100 sites, you can see site health information at the top of the Dashboard for each of your sites.

Right now we include three issues in your site’s health check:
  1. Have we detected malware on the site?
  2. Have any important pages been removed via our URL removal tool?
  3. Are any of your important pages blocked from crawling in robots.txt?
You can click on any of these items to get more details about what we detected on your site. If the site health icon and the “Check site health” link don’t appear next to a site, it means that we didn’t detect any of these issues on that site (congratulations!).

A word about “important pages:” as you know, you can get a comprehensive list of all URLs that have been removed by going to Site configuration > Crawler access > Remove URL; and you can see all the URLs that we couldn’t crawl because of robots.txt by going to Diagnostics > Crawl errors > Restricted by robots.txt. But since webmasters often block or remove content on purpose, we only wanted to indicate a potential site health issue if we think you may have blocked or removed a page you didn’t mean to, which is why we’re focusing on “important pages.” Right now we’re looking at the number of clicks pages get (which you can see in Your site on the web > Search queries) to determine importance, and we may incorporate other factors in the future as our site health checks evolve.

Obviously these three issues—malware, removed URLs, and blocked URLs—aren’t the only things that can make a website “unhealthy;” in the future we’re hoping to expand the checks we use to determine a site’s health, and of course there’s no substitute for your own good judgment and knowledge of what’s going on with your site. But we hope that these changes make it easier for you to quickly spot major problems with your sites without having to dig down into all the data and reports.

After you’ve resolved any site health issues we’ve flagged, it will usually take several days for the warning to disappear from your Webmaster Tools account, since we have to recrawl the site, see the changes you’ve made, and then process that information through our Web Search and Webmaster Tools pipelines. If you continue to see a site health warning for that site after a week or so, the issue may not have been resolved. Feel free to ask for help tracking it down in our Webmaster Help Forum... and let us know what you think!


from web contents: Adding associates to manage your YouTube presence 2013

Webmaster level: All

Many organizations have multiple presences on the web. For example, Webmaster Tools lives at www.google.com/webmasters, but it also has a Twitter account and a YouTube channel. It's important that visitors to these other properties have confidence that they are actually associated with the Webmaster Tools site. However, to date it has been challenging for webmasters to manage which users can take actions on behalf of their site in different services.

Today we're happy to announce a new feature in Webmaster Tools that allows webmasters to add "associates" -- trusted users who can act on behalf of your site in other Google products. Unlike site owners and users, associates can't view site data or take any site actions in Webmaster Tools, but they are authorized to perform specific tasks in other products.

For this initial launch, members of YouTube's partner program who have created a YouTube channel for their site can now link the two together. By doing this, your YouTube channel will be displayed as the "official channel" for your website.


Management within Webmaster Tools

To add or change associates:

  1. On the Webmaster Tools home page, click the site you want.
  2. Under Configuration, click Associates.
  3. Click Add a new associate.
  4. In the text box, type the email address of the person you want to add.
  5. Select the type of association you want.
  6. Click Add.

Management within YouTube

It’s also possible for users to request association from a site’s webmaster.
  1. Log in to your YouTube partner account.
  2. Click on the user menu and choose Settings > Associated Website.
  3. Fill in the page you would like to associate your channel with.
  4. Click Add. If you’re a verified owner of the site, you’re done. But if someone else in your organization manages the website, the association will be marked Pending. The owner receives a notification with an option to approve or deny the request.
  5. After approval is granted, navigate back to this page and click Refresh to complete the association.
Through associates, webmasters can easily and safely allow others to associate their website with YouTube channels. We plan to support integration with additional Google products in the future.

If you have more questions, please see the Help with Associates article or visit our webmaster help forum.


from web contents: An update on spam reporting 2013

(Note: this post has been translated into English from our German blog.)

In 2006, one of our initiatives in the area of communication was to notify some webmasters when their sites violated our Webmaster Guidelines (e.g. by using "particularly search engine friendly" software that generates doorway pages as an extra). No small number of these good-will emails to webmasters were prompted by spam reports from our users.

We are proud of our users who alert us to potential abuses for the sake of the whole internet community. We appreciate this all the more because PageRank™ (and thus Google search) is based on a democratic principle: a webmaster gives another site a "vote" of approval by linking to it.

In 2007, as an extension and complement of this democratic principle, we want to further increase our users' awareness of webmaster practices that do or do not conform to Google's standards. Informed users can then take counter-action against webspam by filing spam reports, initiating a mutually beneficial process. Ultimately, not only will all Google users benefit from the best possible search quality, but spammy webmasters will also realize that their attempts to unfairly manipulate their sites' rankings pay off less and less.

Our spam report forms are provided in two different flavors: an authenticated form that requires registration in Webmaster Tools, and an unauthenticated form. Currently, we investigate every spam report from a registered user. Spam reports to the unauthenticated form are assessed in terms of impact, and a large fraction of those are reviewed as well.

So the next time you can't help thinking that a search result's ranking was not earned by virtue of its content and legitimate SEO, it's the perfect moment for a spam report. Each one can give us crucial information for the continual optimization of our search algorithms.

Interested in learning more? Then find below answers to the three most frequent questions.

FAQs concerning spam reports:

Q: What happens to an authenticated spam report at Google?
A: An authenticated spam report is analyzed and then used for evaluating new spam-detecting algorithms, as well as to identify trends in webspam. Our goal is to detect all the sites engaging in similar manipulation attempts automatically in the future and to make sure our algorithms rank those sites appropriately. We don't want to get into an inefficient game of cat and mouse with individual webmasters who have reached into the wrong bag of tricks.

Q: Why are there sometimes no immediately noticeable consequences of a spam report?
A: Google is always seeking to improve its algorithms for countering webspam, but we also take action on individual spam reports. Sometimes that action will not be immediately visible to an outside user, so there is no need to submit a site multiple times in order for Google to evaluate a URL. There are different reasons that might account for a user's false impression that a particular spam report went unnoticed. Here are a few of those reasons:

  • Sometimes, Google might already be handling the situation appropriately. For example, if you are reporting a site that seems to engage in excessive link exchanging, it could be the case that we are already discounting the weight of those unearned backlinks correctly, and the site is showing up for other reasons. Note that changes in how Google handles backlinks for a site are not immediately obvious to outside users. Or it may be the case that we already deal with a phenomenon such as keyword stuffing correctly in our scoring, and therefore we are not quite as concerned about something that might not look wonderful, but that isn't affecting rankings.
  • A complete exclusion from Google's SERPs is only one possible consequence of a spam report. Google might also choose to give a site a "yellow card" so that the site cannot be found in the index for a short time. However, if a webmaster ignores this signal, then a "red card" with a longer-lasting effect might follow. So it's possible that Google is already aware of an issue and communicating with the webmaster about that issue, or that we have taken action other than a removal on a spam report.
  • Sometimes, simple patience is the answer, because it takes time for algorithmic changes to be thoroughly checked out, or for the externally displayed PageRank to be updated.
  • It can also be the case that Google is working on solving the more general instance of an issue, and so we are reluctant to take action on an individual situation.
  • A spam report may also simply have been judged unjustified. For example, this may be true of a report whose sole motivation appears to be harming a better-ranked direct competitor.

Q: Can a user expect to receive feedback for a spam report?
A: This is a common request, and we know that our users might like verification of the reported URLs or simple confirmation that the spam report has been taken care of. Given the choice of how to spend our time, we have decided to invest our efforts in taking action on spam reports and improving our algorithms to be more robust. But we are open to considering how to scale communication with our users going forward.


from web contents: +1 reporting in Google Webmaster Tools and Google Analytics 2013

Webmaster level: All

It’s been a busy week for us here at the Googleplex. First we released +1 buttons to Google search sites globally, then we announced the beginning of the Google+ project.

The +1 button and the Google+ project are both about making it easier to connect with the people you trust online. For the +1 button, that means bringing advice from trusted friends and contacts right into Google search, letting the users who love your web content recommend it at the moment of decision.

But when you’re managing a website, it's usually not real until you can measure it. So we’re happy to say we’ve got one more announcement to make -- today we’re releasing reports that show you the value +1 buttons bring to your site.

First, +1 metrics in Google Webmaster Tools can show you how the +1 button affects the traffic coming to your pages:


  • The Search Impact report gives you an idea of how +1s affect your organic search traffic. You can find out if your clickthrough rate changes when personalized recommendations help your content stand out. Do this by comparing clicks and impressions on search results with and without +1 annotations; see the sketch after this list. We'll only show statistics on clickthrough rate changes when you have enough impressions for a meaningful comparison.
  • The Activity report shows you how many times your pages have been +1’d, from buttons both on your site and on other pages (such as Google search).
  • Finally, the Audience report shows you aggregate geographic and demographic information about the Google users who’ve +1’d your pages. To protect privacy, we’ll only show audience information when a significant number of users have +1’d pages from your site.
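For instance, the comparison behind the Search Impact report boils down to arithmetic like this (all numbers below are invented):

   # Compare clickthrough rate with and without +1 annotations (hypothetical data).
   clicks_plain, impressions_plain = 120, 6000
   clicks_plus1, impressions_plus1 = 45, 1500
   ctr_plain = 100.0 * clicks_plain / impressions_plain  # 2.0%
   ctr_plus1 = 100.0 * clicks_plus1 / impressions_plus1  # 3.0%
   print("CTR without +1: %.1f%%, with +1: %.1f%%" % (ctr_plain, ctr_plus1))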
Use the +1 Metrics menu on the side of the page to view your reports. If you haven’t yet verified your site on Google Webmaster Tools, you can follow these instructions to get access.

Finally, you can also see how users share your content using other buttons besides +1 by using Social Plugin Analytics in Google Analytics. Once you configure the JavaScript for Analytics, the Social Engagement reports help you compare the various types of sharing actions that occur on your pages.


  • The Social Engagement report lets you see how site behavior changes for visits that include clicks on +1 buttons or other social actions. This allows you to determine, for example, whether people who +1 your pages during a visit are likely to spend more time on your site than people who don’t.
  • The Social Actions report lets you track the number of social actions (+1 clicks, Tweets, etc) taken on your site, all in one place.
  • The Social Pages report allows you to compare the pages on your site to see which are driving the highest number of social actions.
If you’re using the default version of the latest Google Analytics tracking code, when you add +1 buttons to your site, we automatically enable Social Plugin Analytics for +1 in your account. You can enable analytics for other social plugins in just a few simple steps.

Social reporting is just getting started. As people continue to find new ways to interact across the web, we look forward to new reports that help business owners understand the value that social actions are providing to their business. So +1 to data!

UPDATE: 7/11/11 1:44pm PST, corrected references to the social plugin analytics feature.


from web contents: iGoogle Gadgets for Webmaster Tools 2013



Update: The described feature is no longer available.

When you plan to do something, are you a minimalist, or are you prepared for every potential scenario? For example, would you hike out into the Alaskan wilderness during inclement weather with only a wool overcoat and a sandwich in your pocket - like the naturalist John Muir (and you thought Steve McQueen was tough)?

Or are you more the type of person where even on a day hike, you bring a few changes of clothes, 3 dehydrated meals, a couple of kitchen appliances, a power inverter, and a foot-powered generator, because, well, you never know when the urge will arise to make toast?

The Webmaster Tools team strives to serve all types of webmasters, from the minimalist to those who use every tool they can find. If you're reading this blog, you've probably had the opportunity to use the current version of Webmaster Tools, which offers as many features as possible just shy of the kitchen sink. Now there's something for those of you who would prefer to access only the features of Webmaster Tools that you need: we've just released Webmaster Tools Gadgets for iGoogle.

Here's the simple process to start using these Gadgets right away. (Note: this assumes you've already got a Webmaster Tools account and have verified at least one site.)

1. Visit Webmaster Tools and select any site that you've validated from the dashboard.
2. Click on the Tools section.
3. Click on Gadgets sub-section.
4. Click on the big "Add an iGoogle Webmaster Tools homepage" button.
5. Click the "Add to Google" button on the following confirm page to add the new tab to iGoogle.
6. Now you're in iGoogle, where you should see your new Google Webmaster Tools tab with a number of Gadgets. Enjoy!

You'll notice that each Gadget has a drop down menu at the top which lets you select from all the sites you have validated to see that Gadget's information for the particular site you select. A few of the Gadgets that we're currently offering are:

Crawl errors - Does Googlebot encounter issues when crawling your site?



Top search queries - What are people searching for to find your site?



External links - What websites are linking to yours?




We plan to add more Gadgets in the future and improve their quality, so if there's a feature that you'd really like to see which is not included in one of the Gadgets currently available, let us know. As you can see, it's a cinch to get started.

It looks like rain clouds are forming over here in Seattle, so I'm off for a hike.

from web contents: Fresher query stats 2013

Query stats in Webmaster Tools provide information about the search queries that most often return your site in the results. You can view this information by a variety of search types (such as web search, mobile search, or image search) and countries. We show you the top search types and locations for your site. You can access these stats by selecting a verified site in your account and then choosing Query stats from the Statistics tab.


If you've checked your site's query stats lately, you may have noticed that they're changing more often than they used to. This is because we recently changed how frequently we calculate them. Previously, we showed data that was averaged over a period of three weeks. Now, we show data that is averaged over a period of one week. This results in fresher stats for you, as well as stats that more accurately reflect the current queries that return your site in the results. We update these stats every week, so if you'd like to keep a history of the top queries for your site week by week, you can simply download the data each week. We generally update this data each Monday.

How we calculate query stats
Some of you have asked how we calculate query stats.

These results are based on results that searchers see. For instance, say a search for [Britney Spears] brings up your site at position 21, which is on the third page of the results. And say 1000 people searched for [Britney Spears] during the course of a week (in reality, a few more people than that search for her name, but just go with me for this example). 600 of those people only looked at the first page of results and the other 400 browsed to at least the third page. That means that your site was seen by 400 searchers. Even though your site was at position 21 for all 1000 searchers, only 400 are counted for purposes of this calculation.

Both top search queries and top search query clicks are based on the total number of searches for each query. The stats we show are based on the queries that most often return your site in the results. For instance, going back to that familiar [Britney Spears] query -- 400 searchers saw your site in the results. Now, maybe your site isn't really about Britney Spears -- it's more about Buffy the Vampire Slayer. And say Google received 50 queries for [Buffy the Vampire Slayer] in the same week, and your site was returned in the results at position 2. So, all 50 searchers saw your site in the results. In this example, Britney Spears would show as a top search query above Buffy the Vampire Slayer (because your site was seen by 400 searchers for Britney but 50 searchers for Buffy).

The same is true of top search query clicks. If 100 of the Britney-seekers clicked on your site in the search results and all 50 of the Buffy-searchers click on your site in the search results, Britney would show as a top search query above Buffy.

At times, this may cause some of the query stats we show you to seem unusual. If your site is returned for a very high-traffic query, then even if a low percentage of searchers click on your site for that query, the total number of searchers who click on your site may still be higher for the query than for queries for which a much higher percentage of searchers click on your site in the results.
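In code form, both rankings order queries by absolute counts rather than percentages; a quick sketch using the numbers from the example above:

   # Searchers who saw the site, and searchers who clicked, per query.
   seen = {"britney spears": 400, "buffy the vampire slayer": 50}
   clicks = {"britney spears": 100, "buffy the vampire slayer": 50}
   top_search_queries = sorted(seen, key=seen.get, reverse=True)
   top_query_clicks = sorted(clicks, key=clicks.get, reverse=True)
   print(top_search_queries)  # Britney first: seen by more searchers (400 vs 50)
   print(top_query_clicks)    # Britney first: more total clicks (100 vs 50)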

The average top position for top search queries is the position of the page on your site that ranks most highly for the query. The average top position for top search query clicks is the position of the page on your site that searchers clicked on (even if a different page ranked more highly for the query). We show you the average position for this top page across all data centers over the course of the week.

A variety of download options are available. You can:
  • download individual tables of data by clicking the Download this table link.
  • download stats for all subfolders on your site (for all search types and locations) by clicking the Download all query stats for this site (including subfolders) link.
  • download all stats (including query stats) for all verified sites in your account by choosing Tools from the My Sites page, then choosing Download data for all sites and then Download statistics for all sites.

from web contents: Register non-English domain names with Webmaster Tools 2013



I'm happy to announce that Webmaster Tools is expanding support for webmasters outside of the English-speaking world, by supporting Internationalizing Domain Names in Applications (IDNA). IDNA provides a way for site owners to have domains that go beyond the domain name system's limitations of English letters and numbers. Prior to IDNA, Internet host names could only be in the 26 letters of the English alphabet, the numbers 0-9, and the hyphen character. With IDNA support, you'll now be able to add your sites that use other character sets, and organize them easily on your Webmaster Tools Dashboard.

Let's say you wanted to add http://北京大学.cn/ (Peking University) to your Webmaster Tools account before we launched IDNA support. If you typed that into the "Add Site" box, you'd get back an error message that looks like this:



Some webmasters discovered a workaround. Internally, IDNA converts nicely encoded http://北京大学.cn/ to a format called Punycode, which looks like http://xn--1lq90ic7fzpc.cn/. This allowed them to diagnose and view information about their site, but it looked pretty ugly. Also, if they had more than one IDNA site, you can imagine it would be pretty hard to tell them apart.
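You can reproduce this conversion with Python's built-in IDNA codec; a quick sketch (the output should match the Punycode form above):

   # Convert an internationalized hostname to its Punycode (ASCII) form.
   host = "北京大学.cn"
   print(host.encode("idna").decode("ascii"))  # xn--1lq90ic7fzpc.cn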



Since we now support IDNA throughout Webmaster Tools, all you need to do is type in the name of your site, and we will add it correctly. Here is what it looks like if you attempt to add http://北京大学.cn/ to your account:



If you are one of the webmasters who discovered the workaround previously (i.e., you have had sites listed in your account that look like http://xn--1lq90ic7fzpc.cn/), those sites will now automatically display correctly.

We'd love to hear your questions and feedback on this new feature; you can write a comment below or post in the Google Webmaster Tools section of our Webmaster Help Group. We'd also appreciate suggestions for other ways we can improve our international support.

from web contents: Speaking the language of robots 2013


We all know how friendly Googlebot is. And like all benevolent robots, he listens to us and respects our wishes about parts of our site that we don't want crawled. We can just give him a robots.txt file explaining what we want, and he'll happily comply. But what if you're intimidated by the idea of communicating directly with Googlebot? After all, not all of us are fluent in the language of robots.txt. This is why we're pleased to introduce you to your personal robot translator: the Robots.txt Generator in Webmaster Tools. It's designed to give you an easy and interactive way to build a robots.txt file. It can be as simple as entering the files and directories you don't want crawled by any robots.

Or, if you need to, you can create fine-grained rules for specific robots and areas of your site.
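For example, a generated file with per-robot rules might look like this (the paths here are hypothetical):

   User-agent: *
   Disallow: /cgi-bin/

   User-agent: Googlebot-Image
   Disallow: /photos/private/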
Once you're finished with the generator, feel free to test the effects of your new robots.txt file with our robots.txt analysis tool. When you're done, just save the generated file to the top level (root) directory of your site, and you're good to go. There are a couple of important things to keep in mind about robots.txt files:
  • Not every search engine will support every extension to robots.txt files
The Robots.txt Generator creates files that Googlebot will understand, and most other major robots will understand them too. But it's possible that some robots won't understand all of the robots.txt features that the generator uses.
  • Robots.txt is simply a request
Although a major search engine is highly unlikely to do so, some unscrupulous robots may ignore the contents of robots.txt and crawl blocked areas anyway. If you have sensitive content that you need to protect completely, you should put it behind password protection rather than relying on robots.txt.

We hope this new tool helps you communicate your wishes to Googlebot and other robots that visit your site. If you want to learn more about robots.txt files, check out our Help Center. And if you'd like to discuss robots.txt and robots with other webmasters, visit our Google Webmaster Help Group.

from web contents: Using Webmaster Tools like an SEO 2013

Webmaster Level: Beginner to Intermediate

We on the Webmaster Central team aren’t SEOs, but that doesn’t stop me from pretending to be one! In our latest video, I’ll talk about utilizing some features in Webmaster Tools as though I were the SEO for www.googlestore.com.


Just as a grandparent raves about their grandchild, I could have gone on for hours about (my baby!) Webmaster Tools. Thankfully I stopped myself -- but if you have tips to share or questions to ask, please comment below.


from web contents: New Message Center notifications for detecting an increase in Crawl Errors 2013

Webmaster Level: All

When Googlebot crawls your site, it’s expected that most URLs will return a 200 response code, some a 404 response, some will be disallowed by robots.txt, etc. Whenever we’re unable to reach your content, we show this information in the Crawl errors section of Webmaster Tools (even though it might be intentional and not actually an error). Continuing with our effort to provide useful and actionable information to webmasters, we're now sending SiteNotice messages when we detect a significant increase in the number of crawl errors impacting a specific site. These notifications are meant to alert you of potential crawl-related issues and provide a sample set of URLs for diagnosing and fixing them.

A SiteNotice for a spike in the number of unreachable URLs, for example, will look like this:


We hope you find SiteNotices helpful for discovering and dealing with issues that, if left unattended, could negatively affect your crawl coverage. You'll only receive these notifications if you've verified your site in Webmaster Tools and we detect significant changes to the number of crawl errors we encounter on your site. And if you don't want to miss out on any of these important messages, you can use the email forwarding feature to receive these alerts in your inbox.

If you have any questions, please post them in our Webmaster Help Forum or leave your comments below.


from web contents: Even more Top Search Queries data 2013

Webmaster level: All

We recently updated the Top Search Queries data to take into account the average top position, we enabled programmatic download, and we made sure you could still get all the queries that drive traffic to your site. Well, now it's time to give you more search queries data!

First, and most important, you can now see up to 90 days of historical data. If you click on the date picker in the top right of Search queries, you can go back three months instead of the previous 35 days.

And after you click:

In order to see 90 days, the option to view with changes will be disabled. If you want to see the changes with respect to the previous time period, the limit remains 30 days. Changes are disabled by default but you can switch them on and off with the button between the graph and the table. Top search queries data is normally available within 2 or 3 days.

Another big improvement in Webmaster Tools is that you can now see basic search query data as soon as you verify ownership of a site. No more waiting to see your information.

Finally, we're now collecting data for the top 2,000 queries for which your site gets clicks. You may see fewer than 2,000 if we didn't record any clicks for a particular query on a given day, or if your query data is spread out among many countries or languages. For example, a search for [flowers] on Google Canada is counted separately from a search for [flowers] on google.com. Nevertheless, with this change 98% of sites will have complete coverage. Let us know what you think. We hope the new data will be useful.


from web contents: Better badware notifications for webmasters 2013

In the fight against badware, protecting Google users by showing warnings before they visit dangerous sites is only a small piece of the puzzle. It's even more important to help webmasters protect their own users, and we've been working on this with StopBadware.org. A few months ago we took the first step and integrated malware notifications into webmaster tools. I'm pleased to announce that we are now including more detailed information in these notifications, and are also sending them to webmasters via email.

Webmaster tools notifications
Now instead of simply informing webmasters that their sites have been flagged and suggesting next steps, we're also showing example URLs that we've determined to be dangerous. This can be helpful when the malicious content is hard to find. For example, a common occurrence with compromised sites is the insertion of a 1-pixel iframe causing the automatic download of badware from another site. By providing example URLs, webmasters are one step closer to diagnosing the problem and ultimately re-securing their sites.

Email notifications
In addition to notifying webmaster tools users, we've also begun sending email notifications to some of the webmasters of sites that we flag for badware. We don't have a perfect process for determining a webmaster's email address, so for now we're sending the notifications to likely webmaster aliases for the domain in question (e.g., webmaster@, admin@, etc). We considered using whois records, but these often contain contact information for the hosting provider or registrar, and you can guess what might happen if a web host learned that one of its client sites was distributing badware. We're planning to allow webmasters to provide a preferred email address for notifications through webmaster tools, so look for this change in the future.

Update: For more information, please see our Help Center article on malware and hacked sites.

from web contents: What’s new with Sitemaps 2013

Webmaster level: All

Sitemaps are a way to tell Google about pages on your site. Webmaster Tools’ Sitemaps feature gives you feedback on your submitted Sitemaps, such as how many Sitemap URLs have been indexed, or whether your Sitemaps have any errors. Recently, we’ve added even more information! Let’s check it out:


The Sitemaps page displays details based on content-type. Now statistics from Web, Videos, Images and News are featured prominently. This lets you see how many items of each type were submitted (if any), and for some content types, we also show how many items have been indexed. With these enhancements, the new Sitemaps page replaces the Video Sitemaps Labs feature, which will be retired.
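As a refresher, a minimal Sitemap file for the Web content type looks roughly like this (the URL and date are placeholders):

   <?xml version="1.0" encoding="UTF-8"?>
   <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
     <url>
       <loc>http://www.example.com/</loc>
       <lastmod>2012-01-01</lastmod>
     </url>
   </urlset>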

Another improvement is the ability to test a Sitemap. Unlike an actual submission, testing does not submit your Sitemap to Google; it only checks it for errors. Testing requires a live fetch by Googlebot and usually takes a few seconds to complete. Note that the initial testing is not exhaustive and may not detect all issues; for example, errors that can only be identified once the URLs are downloaded are not caught by the test.

In addition to on-the-spot testing, we’ve got a new way of displaying errors which better exposes what types of issues a Sitemap contains. Instead of repeating the same kind of error many times for one Sitemap, errors and warnings are now grouped, and a few examples are given. Likewise, for Sitemap index files, we’ve aggregated errors and warnings from the child Sitemaps that the Sitemap index encloses. No longer will you need to click through each child Sitemap one by one.

Finally, we've changed the way the "Delete" button works. Now, it removes the Sitemap from Webmaster Tools, both from your account and the accounts of the other owners of the site. Be aware that a Sitemap may still be read or processed by Google even if you delete it from Webmaster Tools. For example, if you reference a Sitemap in your robots.txt file, search engines may still attempt to process it. To truly prevent a Sitemap from being processed, remove the file from your server or block it via robots.txt.
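Such a reference is a single line in robots.txt, for example:

   Sitemap: http://www.example.com/sitemap.xml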

For more information on Sitemaps in Webmaster Tools and how Sitemaps work, visit our Help Center. If you have any questions, ask in our Webmaster Help Forum.
