News and Tutorials from Votre Codeur | SEO | Website Creation | Software Development

Download to Google Spreadsheet from Webmaster Tools

Webmaster level: All

Webmaster Tools now has a new download option for exporting your data directly to a Google Spreadsheet. The download option is available for most of our data heavy features, such as Crawl errors, Search queries, and Links to your site. If you enjoy digging into the data from Webmaster Tools but don’t want to use Python scripts or the API, we’ve added new functionality just for you. Now when you click a download button from a Webmaster Tools feature like Search queries, you'll be presented with the "Select Download Format" option where you can choose to download the data as "CSV" or "Google Docs."


Choosing "CSV" initiates a download of the data in CSV format which has long been available in Webmaster Tools and can be imported into other spreadsheet tools like Excel. If you select the new “Google Docs” option then your data will be saved into a Google Spreadsheet and the newly created spreadsheet will be opened in a new browser tab.

We hope the ability to easily download your data to a Google Spreadsheet helps you to get crunching on your site's Webmaster Tools data even faster than you could before. Using only a web browser you can instantly dive right into slicing and dicing your data to create customized charts for detecting significant changes and tracking longer term trends impacting your site. If you've got questions or feedback please share it in the Webmaster Help Forum.


Your fast pass through security

Webmaster level: All

Security checks are nobody's cup of tea. We've never seen people go through airport baggage checks for fun. But while security measures are often necessary, that doesn't mean they have to be painful. In that spirit, we’ve implemented several major improvements to make the Google Site Verification process faster, more straightforward, and perhaps even a pleasure to use—so you can get on with the tasks that matter to you.

New verification method recommendations


You’ll quickly notice the changes we’ve made to the verification page, namely the new tabbed interface. These tabs allow us to give greater visibility to the verification method that we think will be most useful to you, which is listed in the Recommended Method tab.


Our recommendation is just an educated guess, but sometimes guesses can be wrong. It’s possible that the method we recommend might not work for you. If this is the case, simply click the "Alternate methods" tab to see the other verification methods that are available. Verifying with an alternate method is just as powerful as verifying with a recommended method.

Our recommendations are computed from statistical data taken from users with a similar configuration to yours. For example, we can guess which verification methods might be successful by looking at the public hosting information for your website. In the future we plan to add more signals so that we can provide additional customized instructions along with more relevant recommendations.

New Google Sites are automatically verified
For some of you, we’ve made the process even more effortless—Google Sites administrators are now automatically verified for all new sites that they create. When you create a new Google Site, it’ll appear verified in the details page. The same goes for adding or removing owners: when you edit the owners list in your Google Site's settings, the changes will automatically appear in Webmaster Tools.

One thing to note is that we’re unable to automatically verify preexisting Google Sites at this time. If you’d like to verify your older Google Sites, please continue to use the meta tag method already available.

We hope these enhancements help get you through security even faster. Should you get pulled over and have any questions, feel free to check out our Webmaster Help Forums.



Googlebot activity reports

The webmaster tools team has a very exciting mission: we dig into our logs, find as much useful information as possible, and pass it on to you, the webmasters. Our reward is that you more easily understand what Google sees, and why some pages don't make it to the index.

The latest batch of information that we've put together for you is the amount of traffic between Google and a given site. We show you the number of requests, number of kilobytes (yes, yes, I know that tech-savvy webmasters can usually dig this out, but our new charts make it really easy to see at a glance), and the average document download time. You can see this information in chart form, as well as in hard numbers (the maximum, minimum, and average).
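For the tech-savvy webmasters mentioned above, here is roughly what digging this out of your own logs looks like. A minimal sketch, assuming an Apache "combined" format access log; note that it matches on the user-agent string only, and a serious version should also verify that the requests really come from Googlebot.

import re
from collections import defaultdict
from datetime import datetime

# Matches an Apache "combined" log line; captures date, response size, user-agent.
LINE = re.compile(
    r'\[(\d{2}/\w{3}/\d{4}):[^\]]+\] "[^"]*" \d{3} (\d+|-) "[^"]*" "([^"]*)"')

requests, kilobytes = defaultdict(int), defaultdict(int)
with open("access.log") as log:                 # path is a placeholder
    for line in log:
        m = LINE.search(line)
        if m and "Googlebot" in m.group(3):
            day, size = m.group(1), m.group(2)
            requests[day] += 1
            kilobytes[day] += 0 if size == "-" else int(size) // 1024

for day in sorted(requests, key=lambda d: datetime.strptime(d, "%d/%b/%Y")):
    print(day, requests[day], "requests,", kilobytes[day], "KB")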

For instance, here's the number of pages Googlebot has crawled in the Webmaster Central blog over the last 90 days. The maximum number of pages Googlebot has crawled in one day is 24 and the minimum is 2. That makes sense, because the blog was launched less than 90 days ago, and the chart shows that the number of pages crawled per day has increased over time. The number of pages crawled is sometimes more than the total number of pages in the site -- especially if the same page can be accessed via several URLs. So http://www.matrixar.com/2006/10/learn-more-about-googlebots-crawl-of.html and http://www.matrixar.com/2006/10/learn-more-about-googlebots-crawl-of.html#links are different, but point to the same page (the second points to an anchor within the page).


And here's the average number of kilobytes downloaded from this blog each day. As you can see, as the site has grown over the last two and a half months, the number of average kilobytes downloaded has increased as well.


The first two reports can help you diagnose the impact that changes in your site may have on its coverage. If you overhaul your site and dramatically reduce the number of pages, you'll likely notice a drop in the number of pages that Googlebot accesses.

The average document download time can help pinpoint subtle networking problems. If the average time spikes, you might have network slowdowns or bottlenecks that you should investigate. Here's the report for this blog that shows that we did have a short spike in early September (the maximum time was 1057 ms), but it quickly went back to a normal level, so things now look OK.

In general, the load time of a page doesn't affect its ranking, but we wanted to give this info because it can help you spot problems. We hope you will find this data as useful as we do!

The Year in Review

Welcome to 2007! The webmaster central team is very excited about our plans for this year, but we thought we'd take a moment to reflect on 2006. We had a great year building communication with you, the webmaster community, and creating tools based on your feedback. Many on the team were able to come out to conferences and met some of you in person, and we're looking forward to meeting many more of you in 2007. We've also had great conversations and gotten valuable feedback in our discussion forum, and we hope this blog has been helpful in providing information to you.

We said goodbye to the Sitemaps blog and launched this broader blog in August. And after doing so, our number of unique monthly visitors more than doubled. Thanks! We got much of our non-Google traffic from other webmaster community blogs and forums, such as the Search Engine Watch blog, Google Blogoscoped, and WebmasterWorld. In December, seomoz.org and the new Searchengineland.com were our biggest non-Google referrers. Social networking sites such as digg.com, reddit.com, del.icio.us, and slashdot.org sent many visitors to webmaster tools, and a blog by somebody named Matt Cutts sent a lot of traffic our way as well. And these are the top Google queries that visitors clicked on:


Our most popular post was about the Googlebot activity reports and crawl rate control that we launched in October, followed by details about how to authenticate Googlebot. We have only slightly more Firefox users (46.28%) than Internet Explorer users (46.25%). 89% of you use Windows. After English, our readers most commonly speak French, German, Japanese, and Spanish. And after the United States, our readers primarily come from the UK, Canada, Germany, and France.

Here's some of what we did last year.

January
We expanded into Swedish, Danish, Norwegian, and Finnish.
You could hear Matt on webmaster radio.

February
We launched several new features, including:
  • robots.txt analysis tool
  • page with the highest PageRank by month
  • common words in your site's content and in anchor text to your site
We met many of you at the Google Sitemaps lunch at SES NY.
You could hear me on webmaster radio.

March
We launched a few more features, including:
  • showing the top position of your site for your top queries
  • top mobile queries
  • download options for Sitemaps data, stats, and errors

April
We got a whole new look and added yet more features, such as:
  • meta tag verification
  • notification of violations to the webmaster guidelines
  • reinclusion request form and spam reporting form
  • indexing information (can we crawl your home page? is your site indexed?)
We also added a comprehensive webmaster help center and expanded the webmaster guidelines from 10 languages to 18.
We met more of you at the Google Sitemaps lunch at Boston Pubcon.
Matt talked about the new caching proxy.
We talked to many of you at SES Toronto.

May
Matt introduced you to our new search evangelist, Adam Lasnik.
We hung out with some of you in our hometown at Search Engine Watch Live Seattle and over at SES London.

June

We launched user surveys, to learn more about how you interact with webmaster tools.
We expanded some of our features, such as:
  • increased the number of crawl errors shown to 100% within the last two weeks
  • increased the number of Sitemaps you can submit from 200 to 500
  • expanded query stats so you can see them per property and per country, and made them available for subdirectories
  • increased the number of common words in your site and in links to your site from 20 to 75
  • added Adsbot-Google to the robots.txt analysis tool
Yahoo! Stores incorporated Sitemaps for their merchants.

July
We expanded into Polish.
We began supporting the <meta name="robots" content="noodp"> tag to allow you to opt out of using Open Directory titles and descriptions for your site in the search results.
We had a great time talking to many of you about international issues at SES Latino in Miami.

August
August was an exciting month for us, as we launched webmaster central! As part of that, we renamed Google Sitemaps to webmaster tools, expanded our Google Group to include all types of webmaster topics, and expanded the help content in our webmaster help center. We also launched some new features, including:
  • Preferred domain control
  • Site verification management
  • Downloads of query stats for all subfolders
In addition, I took over the GoodKarma podcast on webmasterradio for two shows (one all about Buffy the Vampire Slayer!) and we met even more of you at the Google Webmaster Central lunch at SES San Jose.

September
We improved reporting of the cache date in search results.
We provided a way for you to authenticate Googlebot.
And we started updating query stats more often and for a shorter timeframe.

October
We launched several new features, such as:
  • Crawl rate control
  • Googlebot activity reports
  • Opting in to enhanced image search
  • Display of the number of URLs submitted via a Sitemap
And you could hear Matt being interviewed in a podcast.

November
We launched sitemaps.org, for joint support of the Sitemaps protocol between us, Yahoo!, and Microsoft.
We also started notifying you if we flagged your site for badware and if you're an English news publisher included in Google News, we made News Sitemaps available to you.
We partied with lots of you at "Safe Bets with Google" at Pubcon Las Vegas.
We introduced you to our new Sitemaps support engineer, Maile Ohye, and our first webmaster trends analyst, Jonathan Simon.

December
We met even more of you at the webmaster central lunch at SES Chicago.

Thanks for spending the year with us. We look forward to even more collaboration and communication in the coming year.

Updated malware feature in Webmaster Tools

Webmaster Level: All

A little over six months ago we released a new malware diagnostic tool in Webmaster Tools with the help of Lucas Ballard from the anti-malware team. This feature has been a great success; many of you were interested to know if Google had detected malicious software in your site, and you used the tool's information to find and remove that malware and to fix the vulnerabilities in your servers.

Well, a few days ago we promoted the malware diagnostics tool from Labs to a full Webmaster Tools feature. You can now find it under the Diagnostics menu. Not only that, we also added support for malware notifications. As you may already know, if your site has malware we may show a warning message in our search results indicating that the site is potentially harmful. If this is the case, you should remove any dangerous content as soon as possible and patch the vulnerabilities in your server. After you've done that, you can request a malware review in order to have the warning for your site removed. What's new in our latest release is that the form to request a review is now right there with the rest of the malware data:

Screenshot of the new malware feature in Webmaster Tools

We've also made several other improvements under the covers. Now the data is updated almost four times faster than before. And we've improved our algorithms for identifying injected content and can pinpoint exploits that were difficult to catch when the feature first launched.

On the Webmaster Tools dashboard you'll still see a warning message when you have malware on one of your sites. This message has a link that will take you directly to the malware tool. Here at Google we take malware very seriously, and we're working on several improvements to this feature so that we can tell you ASAP if we detect that your site is potentially infected. Stay tuned!

For more details, check out the Malware & Hacked sites help forum.



Blast from the past

Written by Sahala Swenson, Webmaster Tools Team

As you know, the queries used to find your website in search results can change over time. Your website content changes, as do the needs of all the busy searchers out there. Whether the queries associated with your site change subtly or dramatically, it's pretty useful to see how they transform over time.

Recognizing this, Top Search Queries in Webmaster Tools now presents historical data and other enhancements. Let's take a closer look:


Up to 6 months of historical data:
Previously we only showed query stats for the last 7 days. Now you can jump between 9 query stats snapshots ranging from now to 6 months ago. Note that the time interval for each of these snapshots is different. For the 7 day, 2 week, and 3 week snapshots, we report the top queries for the previous week. For the 1 to 6 month snapshots, we report statistics for the previous month. Some of you may also notice, when you log in, that you don't have query stats data going back a full 6 months. We hope to improve that experience in the future. :)

Top query percentages:
You might have noticed a new column in the top query listings. Previously we just ranked your query results and clicks. While useful, this didn't really tell you to what extent one query was ranked higher than another. Now we show what percentage each query result or click represents out of the top 20 queries. This should help you see how well the result or click volume is distributed in the top 20.
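In other words, the new column is share-of-total arithmetic over the top 20. A toy recomputation in Python, with made-up numbers:

# Hypothetical (query, clicks) pairs standing in for your top-20 list.
top20 = [("blue widgets", 120), ("widget store", 80), ("acme widgets", 40)]
total = sum(clicks for _, clicks in top20)
for query, clicks in top20:
    # Each query's percentage of the top-20 click total.
    print(f"{query}: {100 * clicks / total:.1f}%")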

Downloads:

Since we're now showing historical data on the Top Search Queries screen, we figured it would be rude to not let you download it all and play with the data yourself (spreadsheet masochists, I'm looking at you). We added a “Download data” link that lets you download all the stats in CSV format. Note that this exports all query stats historical data across all snapshots as well as search types and languages, so you can slice and dice to your satisfaction. The “Download all stats (including subfolders)” link, however, will still only show query stats for your site and sub-folders for the last 7 days.

Freshness:

We've improved data freshness in Webmaster Tools a couple of times in the past, and we've done it again with the new Top Search Queries. Statistics are now updated constantly. Top query results and clicks may visibly change rank a lot more often now, sometimes daily.


So enough talk. Sign in and play around with the new improvements for yourself. As always we welcome feedback (especially in the form of beer), so feel free to drop us a note in the Webmaster Help Group and let us know what you think.

Troubleshooting Instant Previews in Webmaster Tools

Webmaster level: All

In November, we launched Instant Previews to help users better understand if a particular result was relevant for their search query. Since launch, our Instant Previews team has been keeping an eye on common complaints and problems related to how pages are rendered for Instant Previews.

When we see issues with preview images, they are frequently due to:
  • Blocked resources due to a robots.txt entry
  • Cloaking: Erroneous content being served to the Googlebot user-agent
  • Poor alternative content when Flash is unavailable
To help webmasters diagnose these problems, we have a new Instant Preview tool in the Labs section of Webmaster Tools (in English only for now).
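The first cause on that list is also easy to pre-check yourself. Here is a small sketch using Python's standard-library robots.txt parser; the URLs are placeholders for your own site:

from urllib.robotparser import RobotFileParser

# Check whether Googlebot may fetch a page and its supporting resources.
rp = RobotFileParser("http://www.example.com/robots.txt")
rp.read()

for url in [
    "http://www.example.com/page.html",
    "http://www.example.com/css/style.css",  # blocked CSS/JS can break previews
    "http://www.example.com/js/app.js",
]:
    print(url, "->", "allowed" if rp.can_fetch("Googlebot", url) else "BLOCKED")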



Here, you can input the URL of any page on your site. We will then fetch the page from your site and try to render it both as it would display in Chrome and through our Instant Preview renderer. Please keep in mind that both of these renders are done using a recent build of Webkit which does not include plugins such as Flash or Silverlight, so it's important to consider the value of providing alternative content for these situations. Alternative content can be helpful to search engines, and visitors to your site without the plugin would benefit as well.

Below the renders, you’ll also see automated feedback on problems our system can detect such as missing or roboted resources. And, in the future, we plan to add more informative and timely feedback to help improve your Instant Previews!

Please direct your questions and feedback to the Webmaster Forum.


Verification time savers — Analytics included!

Webmaster Level: All

Nobody likes to duplicate effort. Unfortunately, sometimes it's a fact of life. If you want to use Google Analytics, you need to add a JavaScript tracking code to your pages. When you're ready to verify ownership of your site in other Google products (such as Webmaster Tools), you have to add a meta tag, HTML file or DNS record to your site. They're very similar tasks, but also completely independent. Until today.

You can now use a Google Analytics JavaScript snippet to verify ownership of your website. If you already have Google Analytics set up, verifying ownership is as simple as clicking a button.


This only works with the newer asynchronous Analytics JavaScript, so if you haven't migrated yet, now is a great time. If you haven't set up Google Analytics or verified yet, go ahead and set up Google Analytics first, then come verify ownership of your site. It'll save you a little time — who doesn't like that? Just as with all of our other verification methods, the Google Analytics JavaScript needs to stay in place on your site, or your verification will expire. You also need to remain an administrator on the Google Analytics account associated with the JavaScript snippet.

Don't forget that once you've verified ownership, you can add other verified owners quickly and easily through the Verification Details page. There's no need for each owner to manually verify ownership. More effort and time saved!


We've also introduced an improved interface for verification. The new verification page gives you more information about each verification method. In some cases, we can now provide detailed instructions about how to complete verification with your specific domain registrar or provider. If your provider is included, there's no need to dig through their documentation to figure out how to add a verification DNS record — we'll walk you through it.


The time you save using these new verification features might not be enough to let you take up a new hobby, but we hope it makes the verification process a little bit more pleasant. As always, please visit the Webmaster Help Forum if you have any questions.


Introducing Code Search Sitemaps


Update: Code Search Sitemaps are no longer supported. More information.


The Sitemaps team is continuing its trend of extending the Sitemap Protocol for specific products and content types. Our latest work with the Google Code Search team now enables you to create Sitemaps that contain information about public source code you host and would like to include in Code Search. There's more information about this new functionality on the Google Code blog. If you're eager to get going, take a look at our Help Center documentation, create a Code Search Sitemap, sign into Google Webmaster Tools, and submit a Sitemap for Code Search!

Webmasters can now provide feedback on Sitelinks



Sitelinks are extra links that appear below some search results in Google. They serve as shortcuts to help users quickly navigate to the important pages on your site.

Selecting pages to appear as sitelinks is a completely automated process. Our algorithms parse the structure and content of websites and identify pages that provide fast navigation and relevant information for the user's query. Since our algorithms consider several factors to generate sitelinks, not all websites have them.

Now, Webmaster Tools lets you view potential sitelinks for your site and block the ones you don't want to appear in Google search results. Because sitelinks are extremely useful in helping users navigate your site, we don't typically recommend blocking them. However, occasionally you might want to exclude a page from your sitelinks, for example: a page that has become outdated or unavailable, or a page that contains information you don't want emphasized to users. Once you block a page, it won't appear as a sitelink for 90 days unless you choose to unblock it sooner. It may take a week or so to remove a page from your sitelinks, but we are working on making this process faster.

To view and manage your sitelinks, go to the Webmaster Tools Dashboard and click the site you want. In the left menu click Links, then click Sitelinks.
Thanks for your feedback and stay tuned for more updates!



Update: the user-interface for this feature has changed. For more information, please see the Sitelinks Help Center article.

More insight into anchor text

Last month, we replaced the individual anchor text words that we showed for your site in webmaster tools with a list of full anchor phrases. This report shows you the top phrases that other sites use to link to the pages of your site. Now, we've enhanced the information we show you in the following ways:
  • We've expanded the number of phrases we show to 200.
  • You can now see the variations of each phrase (for instance, with different capitalization and punctuation).
  • More sites now have access to the anchor phrase report. So, if you didn't have this report before, you may have it now.
  • We've brought back the report showing the most common individual words in anchor text (you asked; we delivered!).
  • We've expanded the number of common words in anchor text and common words in your site that we show to 100 each.
To view this information, click the Page analysis link from the Statistics tab.


In addition, we've updated our robots.txt analysis tool to correctly interpret the new Sitemap instruction that we announced support for last week.
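For reference, that Sitemap instruction is a single autodiscovery line you add to your robots.txt file, for example:

Sitemap: http://www.example.com/sitemap.xml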

We hope this additional insight is helpful in learning how others view your site and keep your suggestions coming! We're listening.

Accessing search query data for your sites

Webmaster level: All

SSL encryption on the web has been growing by leaps and bounds. As part of our commitment to provide a more secure online experience, today we announced that SSL Search on https://www.google.com will become the default experience for signed in users on google.com. This change will be rolling out over the next few weeks.

What is the impact of this change for webmasters? Today, a web site accessed through organic search results on http://www.google.com (non-SSL) can see both that the user came from google.com and their search query. (Technically speaking, the user’s browser passes this information via the HTTP referrer field.) However, for organic search results on SSL search, a web site will only know that the user came from google.com.
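To make that concrete, here is a minimal sketch of the referrer parsing that a site's analytics code might do, showing the piece of data that goes away under SSL search. The referrer value is hypothetical:

from urllib.parse import urlparse, parse_qs

# Referrer from a non-SSL Google search result click (hypothetical value).
referrer = "http://www.google.com/search?q=blue+widgets"
parsed = urlparse(referrer)
if parsed.netloc.endswith("google.com"):
    # On non-SSL search the query arrives in the "q" parameter;
    # on SSL search the referrer carries no query at all.
    query = parse_qs(parsed.query).get("q", [""])[0]
    print("search query:", query or "(not available, e.g. SSL search)")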

Webmasters can still access a wealth of search query data for their sites via Webmaster Tools. For sites which have been added and verified in Webmaster Tools, webmasters can do the following:
  • View the top 1000 daily search queries and top 1000 daily landing pages for the past 30 days.
  • View the impressions, clicks, clickthrough rate (CTR), and average position in search results for each query, and compare this to the previous 30 day period.
  • Download this data in CSV format.
In addition, users of Google Analytics’ Search Engine Optimization reports have access to the same search query data available in Webmaster Tools and can take advantage of its rich reporting capabilities.

We will continue to look into further improvements to how search query data is surfaced through Webmaster Tools. If you have questions, feedback or suggestions, please let us know through the Webmaster Tools Help Forum.


Message Center: Let us communicate with you about your site

Today we're launching our Message Center, a new way for webmasters to receive personalized information from Google in our webmaster console. Should we need to contact you, you'll see a notification in your Webmaster Tools dashboard.


Initially the messages will refer to search quality issues, but over time we'll use the Message Center as a communication channel for more types of information. Here's an example: informing the site owner about hidden text, a violation of our webmaster guidelines.


For our webmasters outside the U.S., we’re also pleased to tell you that Message Center is capable of providing information in all supported Webmaster Tools languages (French, Italian, German, Spanish, Danish, Dutch, Swedish, Russian, Chinese-Simplified, Chinese-Traditional, Korean, Japanese, etc.), across all countries.

Right now the number of sites we’re contacting is small, but we hope to expand this program over time. We’re also really happy that the Message Center lets us communicate with webmasters in an authenticated way. As time goes on, we’ll keep looking for even more ways to improve communication with site owners, but right now, why not claim your site in our webmaster tools so that we can give you a heads-up of any issues that we see?

SES London Calling!


February is that time of the year: the Search Engine Strategies conference hits London! A few of us were there to meet webmasters and search engine representatives to talk about the latest trends and issues in the search engine world.

It was a three-day marathon full of interesting talks - and of course, we heard a lot of good questions in between the sessions! If you didn't get a chance to talk with us, fear not: we've pulled together some of the best questions we encountered. You can find a few of them below, and an additional set in our Webmaster Help Group. Please join the discussion!

Why should I upload a Sitemap to Google Webmaster Tools, if my site is crawled just fine?

All sites can benefit from submitting a Sitemap to Google Webmaster Tools. You may help us to do a better job of crawling and understanding your site, especially if it has dynamic content or a complicated architecture.

Besides, you will have access to more information about your site, for example the number of pages from your Sitemaps that are indexed by Google, any errors Google found with your Sitemap, as well as warnings about potential problems. Also, you can submit specialized Sitemaps for certain types of content including Video, Mobile, News and Code.
More information about the benefits of submitting a Sitemap to Google Webmaster Tools can be found here.

How do you detect paid links? If I want to stay on the safe side, should I use the "nofollow" attribute on all links?

We blogged about our position on paid links and the use of nofollow a few months ago. You may also find it interesting to read this thread in our Help Group about appropriate uses of the nofollow attribute.

How do I associate my site with a particular country/region using Google Webmaster Tools? Can I do this for a dynamic website?

The instructions in our Help Center explain that you can associate a country or region with an entire domain, individual subdomains or subdirectories. A quick tip: if, for instance, you are targeting the UK market, better ways of structuring your site would be example.co.uk, uk.example.com, or example.com/uk/. Google can geolocate all of those patterns.

If your domain name has no regional significance, such as www.example.com, you can still associate your website with a country or region. To do that you will need to verify the domain, or the subdomains and/or subdirectories one by one in your Webmaster Tools account and then associate each of them with a country/region. However, for the moment we don't support setting a geographical target for patterns that can't be verified such as, for example, www.example.com/?region=countrycode.

I have a news site and it is not entirely crawled. Why? Other crawlers had no problem crawling us...

First off, make sure that nothing prevents us from crawling your news site - the architecture of your site or the robots.txt file. Also, we suggest you sign up for Webmaster Tools and submit your content. We specifically have the News Sitemap protocol for sites offering this type of content. If you take advantage of this feature, we can give you more information on which URLs we had trouble with and why. It really rocks!

A quick note to conclude: the lively, international environment of SES is always incredible. I have had a lot of interesting conversations in English, as well as in Italian, French and Spanish. Fellow Googlers chatted with webmasters in English, Danish, Dutch, German and Hungarian. That's amazing - and a great opportunity to get to know each other better, in the language you speak! So next time you wonder how Google Universal Search works in English or you're concerned about Google News Search in German, don't hesitate; grab us for a chat or write to us!


Requesting removal of content from our index


Note: The user-interface of the described features has changed.

As a site owner, you control what content of your site is indexed in search engines. The easiest way to let search engines know what content you don't want indexed is to use a robots.txt file or robots meta tag. But sometimes, you want to remove content that's already been indexed. What's the best way to do that?

As always, the answer begins: it depends on the type of content that you want to remove. Our webmaster help center provides detailed information about each situation. In most cases, once you've blocked or removed the content and we recrawl the page, we'll remove it from our index automatically. But if you'd like to expedite the removal rather than wait for the next crawl, the way to do that has just gotten easier.

For sites that you've verified ownership for in your webmaster tools account, you'll now see a new option under the Diagnostic tab called URL Removals. To get started, simply click the URL Removals link, then New Removal Request. Choose the option that matches the type of removal you'd like.



Individual URLs
Choose this option if you'd like to remove a URL or image. In order for the URL to be eligible for removal, one of the following must be true:
  • the URL is blocked with a robots.txt file
  • the URL is blocked with a noindex robots meta tag
  • the page no longer exists and returns a 404 or 410 status code
Once the URL is ready for removal, enter the URL and indicate whether it appears in our web search results or image search results. Then click Add. You can add up to 100 URLs in a single request. Once you've added all the URLs you would like removed, click Submit Removal Request.

A directory
Choose this option if you'd like to remove all files and folders within a directory on your site. For instance, if you request removal of the following:

http://www.example.com/myfolder

this will remove all URLs that begin with that path, such as:

http://www.example.com/myfolder
http://www.example.com/myfolder/page1.html
http://www.example.com/myfolder/images/image.jpg

In order for a directory to be eligible for removal, you must block it using a robots.txt file. For instance, for the example above, http://www.example.com/robots.txt could include the following:

User-agent: Googlebot
Disallow: /myfolder

Your entire site
Choose this option only if you want to remove your entire site from the Google index. This option will remove all subdirectories and files. Do not use this option to remove the non-preferred version of your site's URLs from being indexed. For instance, if you want all of your URLs indexed using the www version, don't use this tool to request removal of the non-www version. Instead, specify the version you want indexed using the Preferred domain tool (and do a 301 redirect to the preferred version, if possible). To use this option, you must block the site using a robots.txt file.

Cached copies

Choose this option to remove cached copies of pages in our index. You have two options for making pages eligible for cache removal.

Using a meta noarchive tag and requesting expedited removal
If you don't want the page cached at all, you can add a meta noarchive tag to the page and then request expedited cache removal using this tool. By requesting removal using this tool, we'll remove the cached copy right away, and by adding the meta noarchive tag, we will never include the cached version. (If you change your mind later, you can remove the meta noarchive tag.)
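For reference, the tag looks like this:

<meta name="robots" content="noarchive">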

Changing the page content
If you want to remove the cached version of a page because it contained content that you've removed and don't want indexed, you can request the cache removal here. We'll check to see that the content on the live page is different from the cached version and if so, we'll remove the cached version. We'll automatically make the latest cached version of the page available again after six months (and at that point, we likely will have recrawled the page and the cached version will reflect the latest content) or, if you see that we've recrawled the page sooner than that, you can request that we reinclude the cached version sooner using this tool.

Checking the status of removal requests
Removal requests show as pending until they have been processed, at which point, the status changes to either Denied or Removed. Generally, a request is denied if it doesn't meet the eligibility criteria for removal.


To reinclude content
If a request is successful, it appears in the Removed Content tab and you can reinclude it any time simply by removing the robots.txt or robots meta tag block and clicking Reinclude. Otherwise, we'll exclude the content for six months. After that six month period, if the content is still blocked or returns a 404 or 410 status message and we've recrawled the page, it won't be reincluded in our index. However, if the page is available to our crawlers after this six month period, we'll once again include it in our index.

Requesting removal of content you don't own
But what if you want to request removal of content that's located on a site that you don't own? It's just gotten easier to do that as well. Our new Webpage removal request tool steps through the process for each type of removal request.

Since Google indexes the web and doesn't control the content on web pages, we generally can't remove results from our index unless the webmaster has blocked or modified the content or removed the page. If you would like content removed, you can work with the site owner to do so, and then use this tool to expedite the removal from our search results.

If you have found search results that contain specific types of personal information, you can request removal even if you've been unable to work with the site owner. For this type of removal, provide your email address so we can work with you directly.



If you have found search results that shouldn't be returned with SafeSearch enabled, you can let us know using this tool as well.

You can check on the status of pending requests, and as with the version available in webmaster tools, the status will change to Removed or Denied once it's been processed. Generally, the request is denied if it doesn't meet the eligibility criteria. For requests that involve personal information, you won't see the status available here, but will instead receive an email with more information about next steps.

What about the existing URL removal tool?
If you've made previous requests with this tool, you can still log in to check on the status of those requests. However, make any new requests with this new and improved version of the tool.

Learn more about Googlebot's crawl of your site and more!

We've added a few new features to webmaster tools and invite you to check them out.

Googlebot activity reports
Check out these cool charts! We show you the number of pages Googlebot's crawled from your site per day, the number of kilobytes of data Googlebot's downloaded per day, and the average time it took Googlebot to download pages. Webmaster tools show each of these for the last 90 days. Stay tuned for more information about this data and how you can use it to pinpoint issues with your site.

Crawl rate control
Googlebot uses sophisticated algorithms that determine how much to crawl each site. Our goal is to crawl as many pages from your site as we can on each visit without overwhelming your server's bandwidth.

We've been conducting a limited test of a new feature that enables you to provide us information about how we crawl your site. Today, we're making this tool available to everyone. You can access this tool from the Diagnostic tab. If you'd like Googlebot to slow down the crawl of your site, simply choose the Slower option.

If we feel your server could handle the additional bandwidth, and we can crawl your site more, we'll let you know and offer the option for a faster crawl.

If you request a changed crawl rate, this change will last for 90 days. If you'd like to keep the changed rate after that, you can simply return to webmaster tools and make the change again.


Enhanced image search
You can now opt into enhanced image search for the images on your site, which enables our tools such as Google Image Labeler to associate the images included in your site with labels that will improve indexing and search quality of those images. After you've opted in, you can opt out at any time.

Number of URLs submitted
Recently at SES San Jose, a webmaster asked me if we could show the number of URLs we find in a Sitemap. He said that he generates his Sitemaps automatically and he'd like confirmation that the number he thinks he generated is the same number we received. We thought this was a great idea. Simply access the Sitemaps tab to see the number of URLs we found in each Sitemap you've submitted.
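If you generate Sitemaps automatically, you can also count the URLs on your end before comparing against the number we report. A minimal sketch for a standard XML Sitemap; the file name is a placeholder:

import xml.etree.ElementTree as ET

# Standard Sitemap protocol namespace.
NS = {"s": "http://www.sitemaps.org/schemas/sitemap/0.9"}
tree = ET.parse("sitemap.xml")  # hypothetical local copy of your Sitemap
print(len(tree.findall(".//s:url", NS)), "URLs in this Sitemap")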

As always, we hope you find these updates useful and look forward to hearing what you think.

Easier URL removals for site owners

Webmaster Level: All

We recently made a change to the Remove URL tool in Webmaster Tools to eliminate the requirement that the webpage's URL must first be blocked by a site owner before the page can be removed from Google's search results. Because you've already verified ownership of the site, we can eliminate this requirement to make it easier for you, as the site owner, to remove unwanted pages (e.g. pages accidentally made public) from Google's search results.

Removals persist for at least 90 days
When a page’s URL is requested for removal, the request is temporary and persists for at least 90 days. We may continue to crawl the page during the 90-day period but we will not display it in the search results. You can still revoke the removal request at any time during those 90 days. After the 90-day period, the page can reappear in our search results, assuming you haven’t made any other changes that could impact the page’s availability.

Permanent removal
In order to permanently remove a URL, you must ensure that one of the following page blocking methods is implemented for the URL of the page that you want removed:
  • the page returns a 404 or 410 HTTP status code
  • the page is blocked with a robots.txt file
  • the page contains a noindex robots meta tag
This will ensure that the page is permanently removed from Google's search results for as long as the page is blocked. If at any time in the future you remove the previously implemented page blocking method, we may potentially re-crawl and index the page. For immediate and permanent removal, you can request that a page be removed using the Remove URL tool and then permanently block the page’s URL before the 90-day expiration of the removal request.



For more information about URL removals, see our “URL removal explained” blog series covering this topic. If you still have questions about this change or about URL removal requests in general, please post in our Webmaster Help Forum.


Webmaster Tools verification strategies

Webmaster level: all

Verifying ownership of your website is the first step towards using Google Webmaster Tools. To help you keep verification simple & reduce its maintenance to a minimum, especially when you have multiple people using Webmaster Tools, we’ve put together a small list of tips & tricks that we’d like to share with you:
  • The method that you choose for verification is up to you, and may depend on your CMS & hosting providers. If you want to be sure that changes on your side don’t result in an accidental loss of the verification status, you may even want to consider using two methods in parallel.
  • Back in 2009, we updated the format of the verification meta tag and file. If you’re still using the old format, we recommend moving to the newer version. The newer meta tag is called “google-site-verification”, and the newer file format contains just one line with the file name (see the example after this list). While we’re currently supporting ye olde format, using the newer one ensures that you’re good to go in the future.
  • When removing users’ access in Webmaster Tools, remember to remove any active associated verification tokens (file, meta tag, etc.). Leaving them on your server means that these users would be able to gain access again at any time. You can view the site owners list in Webmaster Tools under Configuration / Users.
  • If multiple people need to access the site, we recommend using the “add users” functionality in Webmaster Tools. This makes it easier for you to maintain the access control list without having to modify files or settings on your servers.
  • Also, if multiple people from your organization need to use Webmaster Tools, it can be a good policy to only allow users with email addresses from your domain. By doing that, you can verify at a glance that only users from your company have access. Additionally, when employees leave, access to Webmaster Tools is automatically taken care of when that account is disabled.
  • Consider using “restricted” (read-only) access where possible. Settings generally don’t need to be changed on a daily basis, and when they do need to be changed, it can be easier to document them if they have to go through a central account.
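As an illustration of the newer formats mentioned above (the tokens below are made-up placeholders; use the exact values Webmaster Tools generates for your site), the meta tag looks like this:

<meta name="google-site-verification" content="dBw5CvburAxi537Rp9qi5uG2174Vb6JwHwIRwPSLIK8" />

and the verification file, named e.g. google2718281828459045.html, contains the single line:

google-site-verification: google2718281828459045.html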

We hope these tips help you to simplify the situation around verification of your website in Webmaster Tools. For more questions about verification, feel free to drop by our Webmaster Help Forums.


To err is human, Video Sitemap feedback is divine!

Webmaster Level: All

You can now check your Video Sitemap for even more errors right in Webmaster Tools! It’s a new Labs feature to signal issues in your Video Sitemap such as:
  • URLs disallowed by robots.txt
  • Thumbnail size errors (160x120px is ideal. Anything smaller than 90x50 will be rejected.)
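You can also pre-check both of these issues locally before resubmitting. A rough sketch, assuming a standard Video Sitemap and the third-party Pillow imaging library; all URLs and file names are placeholders:

import urllib.request
import xml.etree.ElementTree as ET
from io import BytesIO
from urllib.robotparser import RobotFileParser
from PIL import Image  # third-party: pip install Pillow

# Sitemap protocol and Video Sitemap extension namespaces.
NS = {"s": "http://www.sitemaps.org/schemas/sitemap/0.9",
      "v": "http://www.google.com/schemas/sitemap-video/1.1"}

rp = RobotFileParser("http://www.example.com/robots.txt")  # placeholder site
rp.read()

tree = ET.parse("video-sitemap.xml")  # hypothetical local copy
for url in tree.findall(".//s:url", NS):
    loc = url.findtext("s:loc", namespaces=NS)
    if loc and not rp.can_fetch("Googlebot", loc):
        print("disallowed by robots.txt:", loc)
    thumb = url.findtext(".//v:thumbnail_loc", namespaces=NS)
    if thumb:
        with urllib.request.urlopen(thumb) as resp:
            width, height = Image.open(BytesIO(resp.read())).size
        if width < 90 or height < 50:
            print("thumbnail too small (%dx%d): %s" % (width, height, thumb))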



Video Sitemaps help us to better crawl and extract information about your videos, so we can appropriately feature them in search results.

Totally new to Video Sitemaps? Check out the Video Sitemaps center for more information. Otherwise, take a look at this new Labs feature in Webmaster Tools.


Webmaster Tools in 40 languages!

(Инструменти за уеб администратори, Eines per a administradors web de Google, Webmaster Tools, Googlen Verkkovastaavan työkalut, Εργαλεία για Webmasters, Alat WebMaster, Tīmekļa pārziņa rīki, Žiniatinkli valdytojo įrankiai, Ferramentas para o webmaster do Google, Алатке за вебмастере, Nástroje správcu webu, Orodja za spletne skrbnike, Інструменти для веб-майстра, Công cụ Quản trị Trang Web)

In our recent Webmaster Tools launch, we went live in 14 new languages, bringing our total language support count to 40! With the launch of Bulgarian, Catalan, Croatian, Filipino, Greek, Indonesian, Lithuanian, Latvian, Portuguese (Portugal), Slovak, Slovenian, Serbian, Ukrainian and Vietnamese, Webmaster Tools joins Google products such as Google.com, AdWords, Gmail and Toolbar to reach the 40 Language Initiative (Google's company-wide initiative to make sure Google products are available in the 40 languages read by more than 98% of Internet users).

Our team is very excited to reach so many of you by offering our tools in 40 languages. At the same time, both the Google Localization and Webmaster Tools teams know that there's more room for improvements in the features and quality of our service. We hope to hear your input in the comments below, especially on the linguistic quality of our new languages.
