
Webmaster level: All

Sitemaps are a way to tell Google about pages on your site. Webmaster Tools’ Sitemaps feature gives you feedback on your submitted Sitemaps, such as how many Sitemap URLs have been indexed, or whether your Sitemaps have any errors. Recently, we’ve added even more information! Let’s check it out:
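For reference, a Sitemap is simply an XML file listing the URLs on your site. Here's a minimal sketch of generating one with Python's standard library; the example.com addresses and dates are placeholders:

```python
# A minimal sketch of the kind of Sitemap file this post is about. It writes
# a standard sitemap.xml using only the Python standard library; the URLs and
# last-modified dates below are made up.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(pages):
    ET.register_namespace("", SITEMAP_NS)
    urlset = ET.Element(f"{{{SITEMAP_NS}}}urlset")
    for loc, lastmod in pages:
        url = ET.SubElement(urlset, f"{{{SITEMAP_NS}}}url")
        ET.SubElement(url, f"{{{SITEMAP_NS}}}loc").text = loc
        ET.SubElement(url, f"{{{SITEMAP_NS}}}lastmod").text = lastmod
    return ET.ElementTree(urlset)

pages = [
    ("http://www.example.com/", "2013-01-01"),
    ("http://www.example.com/about.html", "2013-01-15"),
]
build_sitemap(pages).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```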


The Sitemaps page displays details by content type. Now statistics from Web, Videos, Images and News are featured prominently. This lets you see how many items of each type were submitted (if any), and for some content types, we also show how many items have been indexed. With these enhancements, the new Sitemaps page replaces the Video Sitemaps Labs feature, which will be retired.
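For reference, the per-content-type statistics follow from the Sitemap extensions you use. As a sketch, here's how an image might be attached to a page entry via the image Sitemap namespace so that it counts toward the Images figure; the page and image URLs are placeholders:

```python
# Sketch of declaring an image on a page entry using the image Sitemap
# extension namespace, so the entry contributes to the "Images" statistic.
# The page and image URLs are hypothetical.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
IMAGE_NS = "http://www.google.com/schemas/sitemap-image/1.1"

ET.register_namespace("", SITEMAP_NS)
ET.register_namespace("image", IMAGE_NS)

urlset = ET.Element(f"{{{SITEMAP_NS}}}urlset")
url = ET.SubElement(urlset, f"{{{SITEMAP_NS}}}url")
ET.SubElement(url, f"{{{SITEMAP_NS}}}loc").text = "http://www.example.com/gallery.html"
image = ET.SubElement(url, f"{{{IMAGE_NS}}}image")
ET.SubElement(image, f"{{{IMAGE_NS}}}loc").text = "http://www.example.com/photo.jpg"

ET.ElementTree(urlset).write("sitemap-images.xml", encoding="utf-8",
                             xml_declaration=True)
```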

Another improvement is the ability to test a Sitemap. Unlike an actual submission, testing does not submit your Sitemap to Google; it only checks it for errors. Testing requires a live fetch by Googlebot and usually takes a few seconds to complete. Note that the initial testing is not exhaustive and may not detect all issues; for example, errors that can only be identified once the listed URLs are downloaded will not be caught by the test.
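If you want to run a comparable quick check on your own before submitting, here's a rough sketch of the idea in Python. To be clear, this only approximates the kind of shallow check described above; it is not our actual validator, and like the test it cannot catch errors that only appear once the listed URLs are downloaded:

```python
# Rough local approximation of a Sitemap "test": fetch the file, confirm it
# parses as XML, and confirm every <url> entry carries a <loc>. The sitemap
# URL below is hypothetical.
import urllib.request
import xml.etree.ElementTree as ET

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def test_sitemap(sitemap_url):
    with urllib.request.urlopen(sitemap_url) as resp:
        data = resp.read()
    try:
        root = ET.fromstring(data)
    except ET.ParseError as e:
        return [f"XML parse error: {e}"]
    errors = []
    for i, url in enumerate(root.iter(NS + "url")):
        loc = url.find(NS + "loc")
        if loc is None or not (loc.text or "").strip():
            errors.append(f"<url> entry {i} is missing <loc>")
    return errors

print(test_sitemap("http://www.example.com/sitemap.xml") or "No errors found")
```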

In addition to on-the-spot testing, we’ve got a new way of displaying errors which better exposes what types of issues a Sitemap contains. Instead of repeating the same kind of error many times for one Sitemap, errors and warnings are now grouped, and a few examples are given. Likewise, for Sitemap index files, we’ve aggregated errors and warnings from the child Sitemaps that the Sitemap index encloses. No longer will you need to click through each child Sitemap one by one.
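For reference, a Sitemap index is itself an XML file whose <sitemap> entries point at the child Sitemaps, so grouping errors amounts to walking those entries and pooling the results. Here's a rough sketch of that idea; the URLs are placeholders and the per-child check is a trivial stand-in:

```python
# Sketch of the aggregation idea: enumerate the children of a Sitemap index,
# run a check on each, and report all errors grouped in one place rather than
# per file. The index URL is hypothetical.
import urllib.request
import xml.etree.ElementTree as ET

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def check_child(url):
    """Trivial stand-in check: does the child Sitemap parse at all?"""
    try:
        with urllib.request.urlopen(url) as resp:
            ET.fromstring(resp.read())
        return []
    except (OSError, ET.ParseError) as e:
        return [str(e)]

def test_sitemap_index(index_url):
    with urllib.request.urlopen(index_url) as resp:
        root = ET.fromstring(resp.read())
    grouped = {}
    for sm in root.iter(NS + "sitemap"):
        loc = sm.find(NS + "loc")
        if loc is not None and loc.text:
            child = loc.text.strip()
            grouped[child] = check_child(child)
    return grouped

for child, errs in test_sitemap_index("http://www.example.com/sitemap_index.xml").items():
    print(child, errs or "OK")
```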

Finally, we’ve changed the way the “Delete” button works. Now, it removes the Sitemap from Webmaster Tools, both from your account and the accounts of the other owners of the site. Be aware that a Sitemap may still be read or processed by Google even if you delete it from Webmaster Tools. For example, if you reference a Sitemap in your robots.txt file, search engines may still attempt to process it. To truly prevent a Sitemap from being processed, remove the file from your server or block it via robots.txt.
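If you go the robots.txt route, you can double-check that the block is in place with the standard library's robots.txt parser. This sketch assumes a hypothetical example.com whose robots.txt disallows the Sitemap's path (and remember to also drop any Sitemap: line referencing the file):

```python
# Sketch: confirm that a Sitemap URL is blocked by robots.txt before relying
# on that to stop crawlers from processing it. The site and file names are
# hypothetical; the server's robots.txt would contain something like:
#     User-agent: *
#     Disallow: /sitemap.xml
import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
rp.set_url("http://www.example.com/robots.txt")
rp.read()

sitemap_url = "http://www.example.com/sitemap.xml"
if rp.can_fetch("Googlebot", sitemap_url):
    print("Sitemap is still fetchable; crawlers may keep processing it.")
else:
    print("Sitemap is blocked by robots.txt.")
```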

For more information on Sitemaps in Webmaster Tools and how Sitemaps work, visit our Help Center. If you have any questions, go to the Webmaster Help Forum.


When Matt Cutts, who heads up Google's webspam team, dropped by our Kirkland offices a little while ago, we found ourselves with a video camera and an hour to spare. The result? We quickly put together a few videos we hope you'll find useful.

In our first video, Matt talks about the anatomy of a search result, and gives some useful tips on how you can help improve how your site appears in our results pages. This talk covers everything you'll see in a search result, including page title, page description, and sitelinks, and explains those other elements that can appear, such as stock quotes, cached page links, and more.



If you like the video format (and even if you don't), or have ideas for subjects you'd like covered in the future, let us know what you think in our Webmaster Help Group. And rest assured, we'll be working to improve the sound quality for our next batch of vids.
Webmaster Level: All

If you allow users to publish content on your website, from leaving comments to creating user profiles, you’ll likely see spammers attempt to take advantage of these mechanisms to generate traffic to their own sites. Having this spammy content on your site isn't fun for anyone. Users may be subjected to annoying advertisements directing them to low-quality or dangerous sites containing scams or malware. And you as a webmaster may be hosting content that violates a search engine's quality guidelines, which can harm your site's standing in search results.

There are ways to handle this abuse, such as moderating comments and reviewing new user accounts, but there is often so much spam created that it can become impossible to keep up with. Spam can easily get to this unmanageable level because most spam isn’t created manually by a human spammer. Instead, spammers use computer programs called “bots” to automatically fill out web forms to create spam, and these bots can generate spam much faster than a human can review it.

To level the playing field, you can take steps to make sure that only humans can interact with potentially spammable features of your website. One way to determine which of your visitors are human is to use a CAPTCHA, which stands for "completely automated public Turing test to tell computers and humans apart." A typical CAPTCHA contains an image of distorted letters that humans can read but computers cannot easily interpret. Here's an example:


You can easily take advantage of this technology on your own site by using reCAPTCHA, a free service owned by Google. One unique aspect of reCAPTCHA is that data collected from the service is used to improve the process of scanning text, such as from books or newspapers. By using reCAPTCHA, you're not only protecting your site from spammers; you're helping to digitize the world's books.

Luis von Ahn, one of reCAPTCHA's co-founders, gives more details about how the service works in the video below:


If you’d like to implement reCAPTCHA for free on your own site, you can sign up here. Plugins are available for easy installation on popular applications and programming environments such as WordPress and PHP.
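For illustration, here's a minimal Python sketch of the server-side half of the check. Note that it targets the current siteverify endpoint, which differs from the original reCAPTCHA API, and that the secret key and form field name are placeholders:

```python
# Minimal sketch of verifying a reCAPTCHA solution on the server, using the
# current "siteverify" endpoint (the API at the time of this post differed).
# YOUR_SECRET_KEY is a placeholder for the key you receive when signing up.
import json
import urllib.parse
import urllib.request

VERIFY_URL = "https://www.google.com/recaptcha/api/siteverify"

def verify_captcha(secret_key, captcha_response, remote_ip=None):
    """Return True if Google confirms the CAPTCHA was solved by a human."""
    params = {"secret": secret_key, "response": captcha_response}
    if remote_ip:
        params["remoteip"] = remote_ip
    data = urllib.parse.urlencode(params).encode("ascii")
    with urllib.request.urlopen(VERIFY_URL, data=data) as resp:
        result = json.load(resp)
    return bool(result.get("success"))

# In a web handler you would pass along the token posted by the widget, e.g.:
# if not verify_captcha("YOUR_SECRET_KEY", form["g-recaptcha-response"]):
#     reject_the_submission()
```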
