News and Tutorials from Votre Codeur | SEO | Website Creation | Software Creation

This is a topic from the Google Webmaster Central blog:


I'm happy to announce that Webmaster Tools is expanding support for webmasters outside of the English-speaking world by supporting Internationalized Domain Names in Applications (IDNA). IDNA provides a way for site owners to have domains that go beyond the domain name system's limitation to English letters and numbers. Prior to IDNA, Internet host names could contain only the 26 letters of the English alphabet, the digits 0-9, and the hyphen character. With IDNA support, you'll now be able to add sites that use other character sets and organize them easily on your Webmaster Tools Dashboard.

Let's say you wanted to add http://北京大学.cn/ (Peking University) to your Webmaster Tools account before we launched IDNA support. If you typed that into the "Add Site" box, you'd get back an error message that looks like this:



Some webmasters discovered a workaround. Internally, IDNA converts the nicely encoded http://北京大学.cn/ to a format called Punycode, which looks like http://xn--1lq90ic7fzpc.cn/. This allowed them to diagnose and view information about their sites, but it looked pretty ugly. And if they had more than one IDNA site, you can imagine how hard it would be to tell them apart.
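The conversion those webmasters performed by hand can be reproduced with Python's built-in `idna` codec (our choice here for illustration; the blog doesn't say which converter they used):

```python
# Convert an internationalized hostname to its Punycode (ACE) form
# using Python's built-in "idna" codec (implements IDNA 2003).
def to_punycode(hostname: str) -> str:
    return hostname.encode("idna").decode("ascii")

print(to_punycode("北京大学.cn"))  # xn--1lq90ic7fzpc.cn
```

Each dot-separated label is processed independently: ASCII labels like "cn" pass through unchanged, while non-ASCII labels get the "xn--" ACE prefix plus their Punycode encoding.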



Since we now support IDNA throughout Webmaster Tools, all you need to do is type in the name of your site, and we will add it correctly. Here is what it looks like if you attempt to add http://北京大学.cn/ to your account:



If you are one of the webmasters who discovered the workaround previously (i.e., you have had sites listed in your account that look like http://xn--1lq90ic7fzpc.cn/), those sites will now automatically display correctly.
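Displaying those sites correctly is just the reverse of the Punycode conversion; a sketch using the same Python stdlib codec as an illustration:

```python
# Turn a Punycode (ACE) hostname back into its Unicode form,
# which is what a display layer does for IDNA sites.
def from_punycode(hostname: str) -> str:
    return hostname.encode("ascii").decode("idna")

print(from_punycode("xn--1lq90ic7fzpc.cn"))  # 北京大学.cn
```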

We'd love to hear your questions and feedback on this new feature; you can write a comment below or post in the Google Webmaster Tools section of our Webmaster Help Group. We'd also appreciate suggestions for other ways we can improve our international support.
This is a topic published in 2013.

We all know how friendly Googlebot is. And like all benevolent robots, he listens to us and respects our wishes about parts of our site that we don't want crawled. We can just give him a robots.txt file explaining what we want, and he'll happily comply. But what if you're intimidated by the idea of communicating directly with Googlebot? After all, not all of us are fluent in the language of robots.txt. This is why we're pleased to introduce you to your personal robot translator: the Robots.txt Generator in Webmaster Tools. It's designed to give you an easy and interactive way to build a robots.txt file. It can be as simple as entering the files and directories you don't want crawled by any robots.

Or, if you need to, you can create fine-grained rules for specific robots and areas of your site.
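For example, a generated file combining a blanket rule for all robots with a finer-grained rule for one specific robot might look like this (the user-agent names are real, but the paths are hypothetical):

```
User-agent: *
Disallow: /tmp/
Disallow: /cgi-bin/

User-agent: Googlebot-Image
Disallow: /photos/private/
```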
Once you're finished with the generator, feel free to test the effects of your new robots.txt file with our robots.txt analysis tool. When you're done, just save the generated file to the top level (root) directory of your site, and you're good to go. There are a couple of important things to keep in mind about robots.txt files:
  • Not every search engine will support every extension to robots.txt files
The Robots.txt Generator creates files that Googlebot will understand, and most other major robots will understand them too. But it's possible that some robots won't understand all of the robots.txt features that the generator uses.
  • Robots.txt is simply a request
Although it's highly unlikely from a major search engine, there are some unscrupulous robots that may ignore the contents of robots.txt and crawl blocked areas anyway. If you have sensitive content that you need to protect completely, you should put it behind password protection rather than relying on robots.txt.
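If you'd like to sanity-check your rules locally as well, Python's standard-library parser can play the role of a compliant robot reading your file (a quick sketch, not the Webmaster Tools analysis tool; example.com and the paths are placeholders):

```python
from urllib.robotparser import RobotFileParser

# A small robots.txt, as the generator might produce it.
rules = """\
User-agent: *
Disallow: /private/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# A compliant robot skips blocked areas...
print(parser.can_fetch("Googlebot", "http://example.com/private/report.html"))
# ...but may crawl everything else.
print(parser.can_fetch("Googlebot", "http://example.com/index.html"))
```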

We hope this new tool helps you communicate your wishes to Googlebot and other robots that visit your site. If you want to learn more about robots.txt files, check out our Help Center. And if you'd like to discuss robots.txt and robots with other webmasters, visit our Google Webmaster Help Group.
We've been tracking the growth of Sitemaps on the web. It's been just two years since Google, Yahoo!, and Microsoft co-announced the Sitemaps directive in robots.txt, and it is already supported by many millions of websites, including educational and government sites! At the WWW'09 conference in Madrid, Uri Schonfeld presented his summer internship work studying Sitemaps from a coverage and freshness perspective. If you're interested in how some popular websites are using Sitemaps, and how Sitemaps complement "classic" web crawling, take a look:
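For reference, the co-announced directive is a single robots.txt line such as `Sitemap: http://www.example.com/sitemap.xml` (URL hypothetical), pointing at a standard sitemaps.org XML file. A minimal one looks like this:

```
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
  </url>
</urlset>
```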


At Google, we care deeply about getting increased coverage and freshness of the content we index. We are excited about open standards that help webmasters open up their content automatically to search engines, so users can find relevant content for their searches.
