News and Tutorials from Votre Codeur | SEO | Website Creation | Software Creation

from web contents: Rich snippets go international 2013

Hello everyone; this is a topic from the Google Webmaster Central blog: Webmaster Level: All

As part of our efforts to make search results more useful to our users around the world, we’re announcing the international availability of rich snippets. If you’ve been following our blog posts, you already know that rich snippets let users see additional facts and data from your site in search results.

For example, we recently launched rich snippets for recipes which, for certain sites, lets users see quick recipe facts as part of the snippet and makes it easier to determine if the page has what they are looking for:


We’ve had a lot of questions on our blogs and forums about international support for rich snippets - and we know that many of you have already started marking up your content - so today’s announcement is very exciting for us.

In addition to adding support for rich snippets in any language, we have published documentation on how to mark up your sites for rich snippets in the following languages: simplified Chinese, traditional Chinese, Czech, Dutch, English, French, German, Hungarian, Italian, Japanese, Korean, Polish, Portuguese, Russian, Spanish, and Turkish. (You can change the Help language by scrolling to the bottom of the help page and selecting the language you want from the drop-down menu.)

We encourage you to read the documentation to take advantage of the different types of rich snippets currently supported: people profiles, reviews, videos, events and recipes. You can also use our testing tool (in English only, but useful to test markup in any language) and start validating your markup to make sure results show as you would expect.
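As one hypothetical illustration of the kind of markup involved, a review might be annotated with the hReview microformat roughly as follows (the item, reviewer, and rating values are all invented; the documentation has the authoritative property list):

```html
<!-- Hypothetical hReview markup; all values are invented for illustration. -->
<div class="hreview">
  <span class="item"><span class="fn">Blast 'Em Up</span></span>
  Reviewed by <span class="reviewer">Jane Smith</span> on
  <span class="dtreviewed">2013-01-15</span>.
  Rating: <span class="rating">4.5</span> out of 5.
  <p class="summary">A fun, fast-paced game.</p>
</div>
```

The testing tool mentioned above is the place to confirm that Google extracts from your markup what you expect.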

Finally, as you’ve probably heard by now (several times), we’re taking a gradual approach to surfacing rich snippets. This means that marking up your site doesn’t guarantee that we’ll show rich snippets for your pages. We’re doing this to ensure a good experience for our users, but rest assured we’re working hard to expand coverage and include more web pages.

this is a topic published in 2013... to get contents for your blog or your forum, just contact me at: devnasser@gmail.com

from web contents: New Message Center notifications for detecting an increase in Crawl Errors 2013

Webmaster Level: All

When Googlebot crawls your site, it’s expected that most URLs will return a 200 response code, some a 404 response, some will be disallowed by robots.txt, etc. Whenever we’re unable to reach your content, we show this information in the Crawl errors section of Webmaster Tools (even though it might be intentional and not actually an error). Continuing with our effort to provide useful and actionable information to webmasters, we're now sending SiteNotice messages when we detect a significant increase in the number of crawl errors impacting a specific site. These notifications are meant to alert you of potential crawl-related issues and provide a sample set of URLs for diagnosing and fixing them.

A SiteNotice for a spike in the number of unreachable URLs, for example, will look like this:


We hope you find SiteNotices helpful for discovering and dealing with issues that, if left unattended, could negatively affect your crawl coverage. You’ll only receive these notifications if you’ve verified your site in Webmaster Tools and we detect significant changes to the number of crawl errors we encounter on your site. And if you don't want to miss out on any of these important messages, you can use the email forwarding feature to receive these alerts in your inbox.

If you have any questions, please post them in our Webmaster Help Forum or leave your comments below.


from web contents: Help us make the web better: An update on Rich Snippets 2013

Webmaster Level: All

In May this year we announced Rich Snippets, which make it possible to show structured data from your pages in Google's search results.


We're convinced that structured data makes the web better, and we've worked hard to expand Rich Snippets to more search results and collect your feedback along the way. If you have review or people/social networking content on your site, it's easier than ever to mark up your content using microformats or RDFa so that Google can better understand it to generate useful Rich Snippets. Here are a few helpful improvements on our end to enable you to mark up your content:

Testing tool. See what Google is able to extract, and preview how microformats or RDFa marked-up pages would look on Google search results. Test your URLs on the Rich Snippets Testing Tool.


Google Custom Search users can also use the Rich Snippets Testing Tool to test markup usable in their Custom Search engine.

Better documentation. We've extended our documentation to include a new section containing Tips & Tricks and Frequently Asked Questions. Here we have responded to common points of confusion and provided instructions on how to maximize the chances of getting Rich Snippets for your site.

Extended RDFa support. In addition to the Person RDFa format, we have added support for the corresponding fields from the FOAF and vCard vocabularies for all those of you who asked for it.

Videos. If you have videos on your page, you can now mark up your content to help Google find those videos.
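As a sketch of the extended RDFa support described above, Person markup using the FOAF vocabulary might look like the following (the name and homepage are placeholders):

```html
<!-- Hypothetical Person markup with RDFa and the FOAF vocabulary;
     the name and URL are placeholders. -->
<div xmlns:foaf="http://xmlns.com/foaf/0.1/" typeof="foaf:Person">
  <span property="foaf:name">John Doe</span>
  <a rel="foaf:homepage" href="http://www.example.com/johndoe">John's homepage</a>
</div>
```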

As before, marking up your content does not guarantee that Rich Snippets will be shown for your site. We will continue to expand this feature gradually to ensure a great user experience whenever Rich Snippets are shown in search results.


from web contents: Even more Top Search Queries data 2013

Webmaster level: All

We recently updated the Top Search Queries data to take into account the average top position, enabled programmatic download, and made sure you could still get all the queries that drive traffic to your site. Well, now it’s time to give you more search queries data!

First, and most important, you can now see up to 90 days of historical data. If you click on the date picker in the top right of Search queries, you can go back three months instead of the previous 35 days.

And after you click:

In order to see 90 days, the option to view with changes will be disabled. If you want to see the changes with respect to the previous time period, the limit remains 30 days. Changes are disabled by default but you can switch them on and off with the button between the graph and the table. Top search queries data is normally available within 2 or 3 days.

Another big improvement in Webmaster Tools is that you can now see basic search query data as soon as you verify ownership of a site. No more waiting to see your information.

Finally, we're now collecting data for the top 2,000 queries for which your site gets clicks. You may see fewer than 2,000 if we didn’t record any clicks for a particular query on a given day, or if your query data is spread out among many countries or languages. For example, a search for [flowers] on Google Canada is counted separately from a search for [flowers] on google.com. Nevertheless, with this change 98% of sites will have complete coverage. Let us know what you think. We hope the new data will be useful.


from web contents: What’s new with Sitemaps 2013

Webmaster level: All

Sitemaps are a way to tell Google about pages on your site. Webmaster Tools’ Sitemaps feature gives you feedback on your submitted Sitemaps, such as how many Sitemap URLs have been indexed, or whether your Sitemaps have any errors. Recently, we’ve added even more information! Let’s check it out:


The Sitemaps page displays details based on content-type. Now statistics from Web, Videos, Images and News are featured prominently. This lets you see how many items of each type were submitted (if any), and for some content types, we also show how many items have been indexed. With these enhancements, the new Sitemaps page replaces the Video Sitemaps Labs feature, which will be retired.

Another improvement is the ability to test a Sitemap. Unlike an actual submission, testing does not submit your Sitemap to Google; it only checks it for errors. Testing requires a live fetch by Googlebot and usually takes a few seconds to complete. Note that the test is not exhaustive and may not detect all issues; for example, errors that can only be identified once the URLs are downloaded will not be caught by the test.

In addition to on-the-spot testing, we’ve got a new way of displaying errors which better exposes what types of issues a Sitemap contains. Instead of repeating the same kind of error many times for one Sitemap, errors and warnings are now grouped, and a few examples are given. Likewise, for Sitemap index files, we’ve aggregated errors and warnings from the child Sitemaps that the Sitemap index encloses. No longer will you need to click through each child Sitemap one by one.

Finally, we’ve changed the way the “Delete” button works. Now, it removes the Sitemap from Webmaster Tools, both from your account and the accounts of the other owners of the site. Be aware that a Sitemap may still be read or processed by Google even if you delete it from Webmaster Tools. For example, if you reference a Sitemap in your robots.txt file, search engines may still attempt to process it. To truly prevent a Sitemap from being processed, remove the file from your server or block it via robots.txt.
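For example, a hypothetical robots.txt that both advertises a current Sitemap and blocks crawlers from fetching a retired one might look like this (all paths are placeholders):

```
# Hypothetical robots.txt; paths are placeholders.
User-agent: *
Disallow: /old-sitemap.xml

Sitemap: http://www.example.com/current-sitemap.xml
```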

For more information on Sitemaps in Webmaster Tools and how Sitemaps work, visit our Help Center. If you have any questions, head to the Webmaster Help Forum.


from web contents: Protect your site from spammers with reCAPTCHA 2013

Webmaster Level: All

If you allow users to publish content on your website, from leaving comments to creating user profiles, you’ll likely see spammers attempt to take advantage of these mechanisms to generate traffic to their own sites. Having this spammy content on your site isn't fun for anyone. Users may be subjected to annoying advertisements directing them to low-quality or dangerous sites containing scams or malware. And you as a webmaster may be hosting content that violates a search engine's quality guidelines, which can harm your site's standing in search results.

There are ways to handle this abuse, such as moderating comments and reviewing new user accounts, but there is often so much spam created that it can become impossible to keep up with. Spam can easily get to this unmanageable level because most spam isn’t created manually by a human spammer. Instead, spammers use computer programs called “bots” to automatically fill out web forms to create spam, and these bots can generate spam much faster than a human can review it.

To level the playing field, you can take steps to make sure that only humans can interact with potentially spammable features of your website. One way to determine which of your visitors are human is by using a CAPTCHA, which stands for "completely automated public Turing test to tell computers and humans apart." A typical CAPTCHA contains an image of distorted letters that humans can read but computers cannot easily decipher. Here's an example:


You can easily take advantage of this technology on your own site by using reCAPTCHA, a free service owned by Google. One unique aspect of reCAPTCHA is that data collected from the service is used to improve the process of scanning text, such as from books or newspapers. By using reCAPTCHA, you're not only protecting your site from spammers; you're helping to digitize the world's books.

Luis von Ahn, one of reCAPTCHA's co-founders, gives more details about how the service works in the video below:


If you’d like to implement reCAPTCHA for free on your own site, you can sign up here. Plugins are available for easy installation on popular applications and programming environments such as WordPress and PHP.


from web contents: The +1 Button: Now Faster 2013

Webmaster level: All

One of the 10 things we hold to be true here at Google is that fast is better than slow. We keep speed in mind in all things that we do, and the +1 button is no exception. Since the button’s launch, we have been hard at work improving its load time. Today, we’re proud to announce two updates that will make both the +1 button and the page loading it faster.

First, we’ve begun to roll out a set of changes that will make the button render up to 3x faster on your site. No action is required on your part, so just sit back, relax, and watch as the button loads more quickly than before.

In addition to the improvements made to the button, we’re also introducing a new asynchronous snippet, allowing you to make the +1 experience even faster. The async snippet allows your web page to continue loading while your browser downloads the +1 JavaScript. By loading these elements in parallel, we’re ensuring the HTTP request to get the +1 button JavaScript doesn’t lead to an increase in your page load time. For those of you who have already implemented the button, you’ll need to update the code to the new async snippet, and then you should see an overall improvement in your page load time.

To generate the new async snippet, use our +1 Configuration Tool. Below, you’ll find an example of the code, which should be included below the last <g:plusone> tag on your page for best performance.
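For illustration, an asynchronous snippet of this kind looked roughly like the following; treat this as a sketch and generate the canonical version for your page with the Configuration Tool:

```html
<!-- A sketch of an async loader for the +1 JavaScript; verify against
     the output of the +1 Configuration Tool before using. -->
<script type="text/javascript">
  (function() {
    var po = document.createElement('script');
    po.type = 'text/javascript';
    po.async = true;
    po.src = 'https://apis.google.com/js/plusone.js';
    var s = document.getElementsByTagName('script')[0];
    s.parentNode.insertBefore(po, s);
  })();
</script>
```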


If you haven’t already implemented the +1 button on your site, we’re excited for your first experience to be a fast one. This is a great opportunity to allow your users to recommend your site to their friends, potentially bringing in more qualified traffic from Google search. To those that already have the button, we hope that you enjoy the improvements in speed. Our team will continue to work hard to enhance the +1 button experience as we know that “fast is better than slow” is as true today as it’s ever been.

If you have any questions, please join us in the Webmaster forum. To receive updates about the +1 button, please subscribe to the Google Publisher Buttons Announce Group. For advanced tips and tricks, check our Google Code site.


from web contents: Update to Top Search Queries data 2013

Webmaster level: All

Starting today, we’re updating our Top Search Queries feature to make it better match expectations about search engine rankings. Previously we reported the average position of all URLs from your site for a given query. As of today, we’ll instead average only the top position that a URL from your site appeared in.

An example
Let’s say Nick searched for [bacon] and URLs from your site appeared in positions 3, 6, and 12. Jane also searched for [bacon] and URLs from your site appeared in positions 5 and 9. Previously, we would have averaged all these positions together and shown an Average Position of 7. Going forward, we’ll only average the highest position your site appeared in for each search (3 for Nick’s search and 5 for Jane’s search), for an Average Position of 4.
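The two calculations can be sketched in a few lines of JavaScript, using the hypothetical Nick/Jane numbers from the example above:

```javascript
// Re-computing the Nick/Jane example. Each inner array holds the positions
// at which the site's URLs appeared for one individual search.
const searches = [
  [3, 6, 12], // Nick's search for [bacon]
  [5, 9],     // Jane's search for [bacon]
];

// Previous method: average every position across all searches.
const all = searches.flat();
const oldAverage = all.reduce((sum, p) => sum + p, 0) / all.length; // 7

// New method: keep only the top (lowest-numbered) position per search,
// then average those top positions.
const tops = searches.map((positions) => Math.min(...positions));
const newAverage = tops.reduce((sum, p) => sum + p, 0) / tops.length; // 4

console.log(oldAverage, newAverage);
```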

We anticipate that this new method of calculation will more accurately match your expectations about how a link's position in Google Search results should be reported.

How will this affect my Top Search Queries data?
This change will affect your Top Search Queries data going forward. Historical data will not change. Note that the change in calculation means that the Average Position metric will usually stay the same or decrease, as we will no longer be averaging in lower-ranking URLs.

Check out the updated Top Search Queries data in the Your site on the web section of Webmaster Tools. And remember, you can also download Top Search Queries data programmatically!

We look forward to providing you a more representative picture of your Google Search data. Let us know what you think in our Webmaster Forum.


from web contents: How to deal with planned site downtime 2013

Webmaster level: Intermediate to Advanced

Once in a while we get asked whether a site’s visibility in Google’s search results can be impacted in a negative way if it’s unavailable when Googlebot tries to crawl it. Sometimes downtime is unavoidable: a webmaster might decide to take a site down due to ongoing site maintenance, or legal or cultural requirements. Outages that are not clearly marked as such can negatively affect a site’s reputation. While we cannot guarantee any crawling, indexing or ranking, there are methods to deal with planned website downtime in a way that will generally not negatively affect your site’s visibility in the search results.

For example, instead of returning an HTTP result code 404 (Not Found) or showing an error page with the status code 200 (OK) when a page is requested, it’s better to return a 503 HTTP result code (Service Unavailable) which tells search engine crawlers that the downtime is temporary. Moreover, it allows webmasters to provide visitors and bots with an estimated time when the site will be up and running again. If known, the length of the downtime in seconds or the estimated date and time when the downtime will be complete can be specified in an optional Retry-After header, which Googlebot may use to determine when to recrawl the URL.

Returning a 503 HTTP result code can be a great solution for a number of other situations. We encounter a lot of problems with sites that return 200 (OK) result codes for server errors, downtime, bandwidth overruns, or temporary placeholder pages (“Under Construction”). The 503 HTTP result code is the webmaster’s solution of choice for all of these situations. As for planned server downtime like hardware maintenance, it’s a good idea to have a separate server available to actually return the 503 HTTP result code. It is important, however, not to treat 503 as a permanent solution: long-lasting 503s can eventually be seen as a sign that the server is permanently unavailable, and can result in us removing URLs from Google’s index.

If you set up a 503 (Service Unavailable) response using PHP, the header information might look like this:

header('HTTP/1.1 503 Service Temporarily Unavailable');
header('Retry-After: Sat, 8 Oct 2011 18:27:00 GMT');

Similar to how you can make 404 pages more useful to users, it’s also a good idea to provide a customized 503 message explaining the situation to users and letting them know when the site will be available again. For further information regarding HTTP result codes, please see RFC 2616.


from web contents: Helping webmasters from user to user 2013

You have to have some kind of super-powers to keep up with all of the issues posted in our Webmaster Help Forum—that's why we call our Top Contributors the "Bionic Posters." They're able to leap through tall questions in a single bound, providing helpful and solid information all around. We're thankful to the Bionics for tackling problems both hard and easy (well, easy if you know how). Our current Bionic Posters are: Webado (Christina), Phil Payne, Red Cardinal (Richard), Shades1 (Louis), Autocrat, Tim Abracadabra, Aaron, Cristina, Robbo, John, Becky Sharpe, Sasch, BbDeath, Beussery (Brian), Chibcha (Terry), Luzie (Herbert), 奥宁 (Andy), Ashley, Kaleh and Redleg!

With thousands of webmasters visiting the English Help Forum every day, some questions naturally pop up more often than others. To help catch these common issues, the Bionic Posters have also helped to create and maintain a comprehensive list of frequently asked questions and their answers. These FAQs cover everything from "Why isn't my site indexed?" to diagnosing difficult issues with the help of Google Webmaster Tools, often referring to our Webmaster Help Center for specific topics. Before you post in the forum, make sure you've read through these resources and do a quick search in the forum; chances are high that your question has been answered there already.

Besides the Bionic Posters, we're lucky to have a number of very active and helpful users in the forum, such as: squibble, Lysis, yasir, Steven Lockey, seo101, RickyD, MartinJ and many more. Thank you all for making this community so captivating and—most of the time—friendly.

Here are just a few (well, a little more than a few) of the many comments that we've seen posted in the forum:

  • "Thank you for this forum... Thank you to those that take the time to answer and care!"
  • "I've only posted one question here, but have received a wealth of knowledge by reading tons of posts and answers. The time you experts put into helping people with their problems is very inspiring and my hat's off to each of you. Anyway, I just wanted to let you know that your services aren't going unnoticed and I truly appreciate the lessons."
  • "Thank you very much cristina, what you told me has done the trick. I really appriciate the help as this has been bugging me for a while now and I didn't know what was wrong."
  • "thank you ssssssssssoooo much kaleh. "
  • "OK, Phil Payne big thanks to You! I have made changes and maybe people are starting to find me in G! Thanks to Ashley, I've started to make exclusive and relevant content for people."
  • "If anything, it has helped me reflect on the sites and projects of days gone by so as to see what I could have done better - so that I can deliver that much more and better results going forward. I've learned that some things I had done right, were spot on, and other issues could have been handled differently, as well as a host of technical information that I've stored away for future use. Bottom Line: this forum rocks and is incredibly helpful."
  • "I asked a handful of questions, got GREAT help while doing a whole lot of lurking, and now I've got a site that rocks!! (...) Huge thanks to all the Top Contributors, and a very special mention to WEBADO, who helped me a TON with my .htaccess file."
  • "Over the years of reading (and sometimes contributing) to this forum I think it has helped to remove many false assumptions and doubts over Google's ranking systems. Contrary to what many have said I verily believe Google can benefit small businesses. Keep up the good work. "
  • "The forum members are awesome and are a most impressive bunch. Their contribution is immeasurable as it is huge. Not only have they helped Google in their success as a profitable business entity, but also helped webmasters both aspiring and experienced. There is also an engender(ment) of "family" or "belonging" in the group that has transcended the best and worst of times (Current forum change still TBD :-) ). We can agree, disagree and agree to disagree but remain respectful and civil (Usually :-) )."
  • "Hi Redleg, Thank you very much for all of the information. Without your help, I don't think I would ever have known how to find the problem. "
  • "What an amazing board. Over the last few days I have asked 1 question and recieved a ton of advice mainly from Autocrat. "
  • "A big thank you to the forum and the contributors that helped me get my site on Google . After some hassle with my web hosters and their naff submission service, issues over adding pages Google can see, issues over Sitemaps, I can now say that when I put my site name into the search and when i put in [custom made watch box], for instance, my site now comes up."
  • "Thank you Autocrat! You are MAGNIFICENT! (...) I am your biggest fan today. : ) Imagine Joe Cocker singing With a Little Help from My Friends...that's my theme song today."
  • "I've done a lot of reading since then and I've learned more in the last year than I learned in the previous 10. When I stumbled into this forum I had no idea what I was getting into but finding this forum was a gift from God! Words cannot express the amount of gratitude I feel for the help you have given me and I wish I could repay you some how.... I don't mean to sound so mushy, but I write this with tears in my eyes and I am truly, truly grateful..."

Are you new to the Webmaster Help Forum? Tell us a little bit about yourself and then join us to learn more and help others!


from web contents: Another step to reward high-quality sites 2013

Webmaster level: All

Google has said before that search engine optimization, or SEO, can be positive and constructive—and we're not the only ones. Effective search engine optimization can make a site more crawlable and make individual pages more accessible and easier to find. Search engine optimization includes things as simple as keyword research to ensure that the right words are on the page, not just industry jargon that normal people will never type.

“White hat” search engine optimizers often improve the usability of a site, help create great content, or make sites faster, which is good for both users and search engines. Good search engine optimization can also mean good marketing: thinking about creative ways to make a site more compelling, which can help with search engines as well as social media. The net result of making a great site is often greater awareness of that site on the web, which can translate into more people linking to or visiting a site.

The opposite of “white hat” SEO is something called “black hat webspam” (we say “webspam” to distinguish it from email spam). In the pursuit of higher rankings or traffic, a few sites use techniques that don’t benefit users, where the intent is to look for shortcuts or loopholes that would rank pages higher than they deserve to be ranked. We see all sorts of webspam techniques every day, from keyword stuffing to link schemes that attempt to propel sites higher in rankings.

The goal of many of our ranking changes is to help searchers find sites that provide a great user experience and fulfill their information needs. We also want the “good guys” making great sites for users, not just algorithms, to see their effort rewarded. To that end we’ve launched Panda changes that successfully returned higher-quality sites in search results. And earlier this year we launched a page layout algorithm that reduces rankings for sites that don’t make much content available “above the fold.”

In the next few days, we’re launching an important algorithm change targeted at webspam. The change will decrease rankings for sites that we believe are violating Google’s existing quality guidelines. We’ve always targeted webspam in our rankings, and this algorithm represents another improvement in our efforts to reduce webspam and promote high quality content. While we can't divulge specific signals because we don't want to give people a way to game our search results and worsen the experience for users, our advice for webmasters is to focus on creating high quality sites that create a good user experience and employ white hat SEO methods instead of engaging in aggressive webspam tactics.

Here’s an example of a webspam tactic like keyword stuffing taken from a site that will be affected by this change:


Of course, most sites affected by this change aren’t so blatant. Here’s an example of a site with unusual linking patterns that is also affected by this change. Notice that if you try to read the text aloud you’ll discover that the outgoing links are completely unrelated to the actual content, and in fact the page text has been “spun” beyond recognition:


Sites affected by this change might not be easily recognizable as spamming without deep analysis or expertise, but the common thread is that these sites are doing much more than white hat SEO; we believe they are engaging in webspam tactics to manipulate search engine rankings.

The change will go live for all languages at the same time. For context, the initial Panda change affected about 12% of queries to a significant degree; this algorithm affects about 3.1% of queries in English to a degree that a regular user might notice. The change affects roughly 3% of queries in languages such as German, Chinese, and Arabic, but the impact is higher in more heavily-spammed languages. For example, 5% of Polish queries change to a degree that a regular user might notice.

We want people doing white hat search engine optimization (or even no search engine optimization at all) to be free to focus on creating amazing, compelling web sites. As always, we’ll keep our ears open for feedback on ways to iterate and improve our ranking algorithms toward that goal.


from web contents: Tag Your TV Shows! 2013

Webmaster Level: All

If your website is the authoritative source for the video of a particular TV show, make sure we know about it! Hopefully, you already submit Video Sitemaps or mRSS feeds to inform us about video content on your website. We now support additional fields in both video Sitemaps and mRSS feeds where you can specify metadata specific to television or episodic content. This includes the series’ title, the season and episode numbers for the video in question, the premiere date, as well as other additional information. The metadata from your video feed helps us provide more detailed, relevant results to users wanting to view your show.

Here’s an example Video Sitemap entry that includes all the required and some optional TV metadata in the <video:tvshow> element:

<video:video>
  <video:title>The Sample Show, Season 1, Episode 2</video:title>
  <!-- other required root level video tags omitted -->
  <video:tvshow>
    <video:show_title>The Sample Show</video:show_title>
    <video:video_type>full</video:video_type>
    <video:episode_title>A Sample Episode Title</video:episode_title>
    <video:season_number>1</video:season_number>
    <video:episode_number>2</video:episode_number>
  </video:tvshow>
</video:video>


The full documentation for the tags for both mRSS and Video Sitemaps can be found in our Webmaster Tools Help Center. As always, if you have any questions about Video Sitemaps or mRSS feeds, feel free to reach out to us in the Sitemaps section of the Webmaster Help Forum.


from web contents: Introducing Recipe View, based on rich snippets markup 2013

Webmaster level: All

Today, we’re happy to introduce Recipe View, a new way of finding recipes when searching on Google. Recipe View enables you to filter your regular web search results to show only recipes and to restrict results based on ingredients, cook time, or calorie preferences:

Read more about Recipe View on the Official Google Blog and be sure to check out our video of Google Chef Scott Giambastiani demonstrating how he uses Recipe View to find great recipes for Googlers:



Recipe View is based on data from recipe rich snippets markup. As a webmaster, to make sure your recipe content can show in Recipe View (currently rolling out in the US and Japan) as well as in regular search results with rich snippets (available globally), be sure to add structured data markup to your recipe pages. Rich snippets are also available for reviews, people, products, and events, and we’ll continue to expand this list of categories over time. You can always see the full list of supported types by referring to our rich snippets documentation and by watching for further updates here on the Webmaster Central Blog.

This marks an exciting milestone for us -- it’s the first time we’ve introduced search filters based on rich snippets markup from webmasters. Over time, we’ll continue exploring new ways to enhance the search experience using this data.


from web contents: Multilingual and multinational site annotations in Sitemaps 2013

Webmaster level: All

In December 2011 we announced annotations for sites that target users in many languages and, optionally, countries. These annotations define a cluster of equivalent pages that target users around the world, and were implemented using rel-alternate-hreflang link elements in the HTML of each page in the cluster.

Based on webmaster feedback and other considerations, today we’re adding support for specifying the rel-alternate-hreflang annotations in Sitemaps. Using Sitemaps instead of HTML link elements offers many advantages including smaller page size and easier deployment for some websites.

To see how this works, let's take a simple example: we wish to specify that for the URL http://www.example.com/en, targeting English language users, the equivalent URL targeting German language speakers is http://www.example.com/de. Until now, the only way to add such an annotation was to use link elements, either as HTTP headers or as HTML elements on both URLs, like this:

<link rel="alternate" hreflang="en" href="http://www.example.com/en" />
<link rel="alternate" hreflang="de" href="http://www.example.com/de" />

As of today, you can alternatively use the following equivalent markup in Sitemaps:

<url>
  <loc>http://www.example.com/en</loc>
  <xhtml:link rel="alternate" hreflang="de" href="http://www.example.com/de" />
  <xhtml:link rel="alternate" hreflang="en" href="http://www.example.com/en" />
</url>
<url>
  <loc>http://www.example.com/de</loc>
  <xhtml:link rel="alternate" hreflang="de" href="http://www.example.com/de" />
  <xhtml:link rel="alternate" hreflang="en" href="http://www.example.com/en" />
</url>

Briefly, the new xhtml:link elements in the Sitemap function in the same way as the HTML link tags, with both using the same attributes. The full technical details of how the annotations are implemented in Sitemaps, including how to declare the xhtml namespace for the link tag, are in our new Help Center article.

A more detailed example can be found in our new Help Center article, and if you need more help, please ask in our brand new internationalization help forum.


from web contents: A faster image search 2013

Webmaster level: All

People looking for images on Google often want to browse through many images, looking both at the images and their metadata (detailed information about the images). Based on feedback from both users and webmasters, we redesigned Google Images to provide a better search experience. In the next few days, you’ll see image results displayed in an inline panel so it’s faster, more beautiful, and more reliable. You will be able to quickly flip through a set of images by using the keyboard. If you want to go back to browsing other search results, just scroll down and pick up right where you left off.

Screenshot of new Google Images results using the query nasa earth as an example


Here’s what it means for webmasters:
  • We now display detailed information about the image (the metadata) right underneath the image in the search results, instead of redirecting users to a separate landing page.
  • We’re featuring some key information much more prominently next to the image: the title of the page hosting the image, the domain name it comes from, and the image size.
  • The domain name is now clickable, and we also added a new button to visit the page the image is hosted on. This means that there are now four clickable targets to the source page instead of just two. In our tests, we’ve seen a net increase in the average click-through rate to the hosting website.
  • The source page will no longer load up in an iframe in the background of the image detail view. This speeds up the experience for users, reduces the load on the source website’s servers, and improves the accuracy of webmaster metrics such as pageviews. As usual, image search query data is available in Top Search Queries in Webmaster Tools.
As always, please ask on our Webmaster Help forum if you have questions.


from web contents: Help us improve Google Search 2013

Webmaster level: Advanced

Yes, we're looking for help improving Google search—but this time we're not asking you to submit more spam reports. Although we still appreciate receiving quality spam reports, today we've got a different opportunity for you to improve Google Search: how about YOU join our team and do the webspam fighting yourself?

Interested? Here's what we're looking for: open-minded academic graduates willing to work in a multinational environment in our Dublin office. Looking at a site's source code should not scare you. You should be excited about search engines and the Internet. It’s also essential that you share our aversion to webspam and the drive to make high-quality content accessible. PlayStation or foosball skills are a plus.


This is an actual work environment photo taken at the Dublin Google office.


If you'd like to know more about the positions available, here's the full list of requirements and responsibilities. Great candidates should be able to email the recruiter directly.


from web contents: A reminder about selling links that pass PageRank 2013

Webmaster level: All

Google has said for years that selling links that pass PageRank violates our quality guidelines. We continue to reiterate that guidance periodically to help remind site owners and webmasters of that policy.

Please be wary if someone approaches you and wants to pay you for links or "advertorial" pages on your site that pass PageRank. Selling links (or entire advertorial pages with embedded links) that pass PageRank violates our quality guidelines, and Google does take action on such violations. The consequences for a link-selling site start with losing trust in Google's search results, as well as a reduction of the site's visible PageRank in the Google Toolbar. The consequences can also include lower rankings for that site in Google's search results.

If you receive a warning for selling links that pass PageRank in Google's Webmaster Tools, you'll see a notification message to look for "possibly artificial or unnatural links on your site pointing to other sites that could be intended to manipulate PageRank." That's an indication that your site has lost trust in Google's index.

To address the issue, make sure that any paid links on your site don't pass PageRank. You can remove any paid links or advertorial pages, or make sure that any paid hyperlinks have the rel="nofollow" attribute. After ensuring that no paid links on your site pass PageRank, you can submit a reconsideration request and if you had a manual webspam action on your site, someone at Google will review the request. After the request has been reviewed, you'll get a notification back about whether the reconsideration request was granted or not.
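As a concrete illustration (the URL and anchor text here are invented), the difference between a paid link that passes PageRank and one that does not is a single attribute:

```html
<!-- A paid link that passes PageRank (violates the guidelines): -->
<a href="http://www.example.com/sponsor">Sponsor Name</a>

<!-- The same link with rel="nofollow", which keeps it from passing PageRank: -->
<a href="http://www.example.com/sponsor" rel="nofollow">Sponsor Name</a>
```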

We do take this issue very seriously, so we recommend you avoid selling (and buying) links that pass PageRank in order to prevent loss of trust, lower PageRank in the Google Toolbar, lower rankings, or in an extreme case, removal from Google's search results.


from web contents: Download search queries data using Python 2013

Webmaster level: Advanced

For all the developers who have expressed interest in getting programmatic access to the search queries data for their sites in Webmaster Tools, we've got some good news. You can now get access to your search queries data in CSV format using an open source Python script from the webmaster-tools-downloads project. Search queries data is not currently available via the Webmaster Tools API; programmatic access has been a common API user request, and we're considering it for the next API update. For those of you who need access to search queries data right now, let's look at an example of how the search queries downloader Python script can be used to download your search queries data and upload it to a Google Spreadsheet in Google Docs.

Example usage of the search queries downloader Python script
1) If Python is not already installed on your machine, download and install Python.
2) Download and install the Google Data APIs Python Client Library.
3) Create a folder and add the downloader.py script to the newly created folder.
4) Copy the example-create-spreadsheet.py script to the same folder as downloader.py and edit it to replace the example values for “website,” “email” and “password” with valid values for your Webmaster Tools verified site.
5) Open a Terminal window and run the example-create-spreadsheet.py script by entering "python example-create-spreadsheet.py" at the Terminal window command line:
python example-create-spreadsheet.py
6) Visit Google Docs to see a new spreadsheet containing your search queries data.


If you just want to download your search queries data in a .csv file without uploading the data to a Google spreadsheet use example-simple-download.py instead of example-create-spreadsheet.py in the example above.

You could easily configure these scripts to be run daily or monthly to archive and view your search queries data across larger date ranges than the current one month of data that is available in Webmaster Tools, for example, by setting up a cron job or using Windows Task Scheduler.
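For example, a crontab entry for a daily run might look like the following sketch (the folder path is invented for illustration; adjust it to wherever you placed downloader.py and the example scripts):

```shell
# Run the search queries download every day at 06:00 from the folder
# containing downloader.py and the example scripts.
# The path and schedule below are illustrative only.
0 6 * * * cd /home/user/wmt-downloads && python example-simple-download.py
```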

An important point to note is that this script example includes user name and password credentials within the script itself. If you plan to run this in a production environment you should follow security best practices like using encrypted user credentials retrieved from a secure data storage source. The script itself uses HTTPS to communicate with the API to protect these credentials.

Take a look at the search queries downloader script and start using search queries data in your own scripts or tools. Let us know if you have questions or feedback in the Webmaster Help Forum.


from web contents: Introducing a new Rich Snippets format: Events 2013

Webmaster Level: All

Last year we introduced Rich Snippets, a new feature that makes it possible to surface structured data from your pages on Google's search results. So far, user reaction to Rich Snippets has been enthusiastic -- after all, Rich Snippets help people make more informed clicks and find what they need even faster.

We originally introduced Rich Snippets with two formats: reviews and people. Later in the year we added support for marking up video information which is used to improve Video Search. Today, we're excited to kick off the new year by adding support for events.

Events markup is based on the hCalendar microformat. Here's an example of what the new events Rich Snippets will look like:


The new format shows links to specific events on the page along with dates and locations. It provides a fast and convenient way for users to determine if a page has events they may be interested in.

If you have event listings on your site, we encourage you to review the events documentation we've prepared to help you get started. Please note, however, that marking up your content is not a guarantee that Rich Snippets will show for your site. Just as we did for previous formats, we will take a gradual approach to incorporating the new event snippets to ensure a great user experience along the way.
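As an illustration, a minimal hCalendar event embedded in a page could look like the sketch below. All names, dates, and URLs are invented; see the events documentation for the exact set of properties supported.

```html
<div class="vevent">
  <!-- summary and url identify the event and link to its page -->
  <a class="url summary" href="http://www.example.com/events/jazz-night">Jazz Night</a>
  <!-- the machine-readable start date goes in the title attribute (ISO 8601) -->
  <abbr class="dtstart" title="2010-02-15T20:00">February 15, 8pm</abbr>
  <span class="location">The Sample Club, Springfield</span>
</div>
```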

Stay tuned for more developments in Rich Snippets throughout the year!


from web contents: Finding Places on the Web: Rich Snippets for Local Search 2013

Webmaster Level: All
Cross-posted from the Lat Long Blog.

We’re sharing some news today that we hope webmasters will find exciting. As you know, we’re constantly working to organize the world’s information - be it textual, visual, geographic or any other type of useful data. From a local search perspective, part of this effort means looking for all the great web pages that reference a particular place. The Internet is teeming with useful information about local places and points of interest, and we do our best to deliver relevant search results that help shed light on locations all across the globe.

Today, we’re announcing that your use of Rich Snippets can help people find the web pages you’ve created that may reference a specific place or location. By using structured HTML formats like hCard to markup the business or organization described on your page, you make it easier for search engines like Google to properly classify your site, recognize and understand that its content is about a particular place, and make it discoverable to users on Place pages.

You can get started by reviewing these tips for using Rich Snippets for Local Search. Whether you’re creating a website for your own business, an article on a newly opened restaurant, or a guide to the best places in town, your precise markup helps associate your site with the search results for that particular place. Though this markup does not guarantee that your site will be shown in search results, we’re excited to expand support for making the web better organized around real world places.
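For instance, a minimal hCard describing a local business (all details here are invented for illustration) could be marked up as:

```html
<div class="vcard">
  <!-- fn is the formatted name; org marks it as an organization name -->
  <span class="fn org">The Sample Bakery</span>
  <div class="adr">
    <span class="street-address">123 Main Street</span>,
    <span class="locality">Springfield</span>,
    <span class="region">IL</span>
  </div>
  <span class="tel">(555) 123-4567</span>
</div>
```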
