
This topic is from the Google Webmaster Central blog. Webmaster level: All

Structured data is becoming an increasingly important part of the web ecosystem. Google makes use of structured data in a number of ways including rich snippets which allow websites to highlight specific types of content in search results. Websites participate by marking up their content using industry-standard formats and schemas.
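As an illustration of the kind of markup the dashboard reports on, here is a minimal sketch of a schema.org Book annotation using microdata; the title, author, and ISBN are made-up placeholders:

```html
<!-- Illustrative schema.org Book markup in microdata; all values are placeholders. -->
<div itemscope itemtype="http://schema.org/Book">
  <span itemprop="name">An Example Book Title</span>
  by <span itemprop="author">Jane Author</span>
  (ISBN: <span itemprop="isbn">000-0-00-000000-0</span>)
</div>
```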

To provide webmasters with greater visibility into the structured data that Google knows about for their website, today we’re introducing a new feature in Webmaster Tools: the Structured Data Dashboard. The Structured Data Dashboard has three views: site, item type, and page-level.

Site-level view
At the top level, the Structured Data Dashboard, found under Optimization, aggregates this data by root item type and vocabulary schema. A root item type is an item that is not an attribute of another item on the same page. For example, the site below has about 2 million Schema.org annotations for Books (“http://schema.org/Book”).


Itemtype-level view
It also provides per-page details for each item type, as seen below:


Google parses and stores a fixed number of pages for each site and item type, sorted in decreasing order of crawl time, and keeps all of their structured data markup. For certain item types we also provide specialized preview columns, as in the example below (e.g. “Name” is specific to schema.org Product).


The default sort order puts the most recently added structured data first, making it easy to inspect.

Page-level view
Last but not least, we have a details page showing all attributes of every item type on the given page (as well as a link to the Rich Snippet testing tool for the page in question).


Webmasters can use the Structured Data Dashboard to verify that Google is picking up new markup, and to detect problems with existing markup, such as unexpected changes in instance counts during a site redesign.

This topic was published in 2013.
By Peter Dickman, Engineering Manager

Google has supported the PubSubHubbub (PuSH) protocol since its introduction in 2009. Earlier this year we completely rewrote our PuSH hub implementation, both to make it more resilient and to considerably increase its capacity and throughput. Our improved PuSH hub means we can expose feeds more efficiently, coherently, and consistently, from a robust, secure access point. Using the PuSH protocol, servers can subscribe to an almost arbitrarily large number of feeds and receive updates as they occur.

In contrast, the Feed API allows you to download any specific public Atom or RSS feed using only JavaScript, enabling easy mashups of feeds with your own content and other APIs. We are planning some improvements to the Feed API, as part of our ongoing infrastructure work.

We encourage you to consider PuSH as a means of accessing feeds in bulk. To support that, we’re clarifying our practices around bots interacting with Google’s PuSH system: we encourage providers of feed systems and related tools to connect their automated systems for feed acquisition to our PuSH hub (or other hubs in the PuSH ecosystem). The PuSH hub is designed to be accessed by bots and it’s tuned for large-scale reading from the PuSH endpoints. We have safeguards against abuse, but legitimate users of the access points should see generous limits, with few restrictions, speed bumps or barriers. Similarly, we encourage publishers to submit their feeds to a public PuSH hub, if they don’t want to implement their own.
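As a sketch of how a subscriber interacts with a PuSH hub, the Python snippet below builds the form-encoded subscription request described by the protocol; the hub, callback, and topic URLs are hypothetical placeholders:

```python
# Sketch of a PubSubHubbub (PuSH) subscription request.
# All URLs below are hypothetical placeholders.
from urllib.parse import urlencode
from urllib.request import Request

HUB_URL = "https://pubsubhubbub.example.com/"              # hypothetical hub endpoint
CALLBACK = "https://example.com/push-callback"             # your server's callback URL
TOPIC = "https://blog.example.com/feeds/posts/default"     # feed to subscribe to

def build_subscription_request(hub, callback, topic):
    """Build the form-encoded POST a subscriber sends to a PuSH hub."""
    params = {
        "hub.mode": "subscribe",
        "hub.topic": topic,
        "hub.callback": callback,
    }
    body = urlencode(params).encode("ascii")
    return Request(hub, data=body,
                   headers={"Content-Type": "application/x-www-form-urlencoded"})

req = build_subscription_request(HUB_URL, CALLBACK, TOPIC)
# After accepting the request, the hub verifies the subscription by sending
# a GET with a hub.challenge parameter to the callback URL, then delivers
# feed updates to that callback as they occur.
```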

Google directly hosts many feed producers (e.g. Blogger is one of the largest feed sources on the web) and is a feed consumer too (e.g. many webmasters use feeds to tell our Search system about changes on their sites). Our PuSH hub offers easy access to hundreds of millions of Google-hosted feeds, as well as hundreds of millions of other feeds available via the PuSH ecosystem and through active polling.

The announcement of v0.4 of the PuSH specification advances our goal of strengthening infrastructure support for feed handling. We’ve worked with Superfeedr and others on the new specification and look forward to it being widely adopted.


Peter Dickman spends his days herding cats for the Search Infrastructure group in Zurich. He divides his spare time between helping government bodies understand cloud computing and systematically evaluating the products of Switzerland’s chocolatiers.

Posted by Scott Knaster, Editor
Webmaster Level: Intermediate to Advanced

The Chrome team is exploring a few changes to Chrome’s UA string. These changes are designed to provide additional details in the user-agent, remove redundancy, and increase compatibility with Internet Explorer. They’re also happening in conjunction with similar changes in Firefox 4.
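For sites that inspect the UA string, a small parsing sketch may help when checking compatibility; the UA string below is an illustrative Chrome 11-era example, and the helper function is hypothetical:

```python
import re

# An illustrative Chrome 11-era UA string (exact tokens vary by platform and build).
ua = ("Mozilla/5.0 (Windows NT 6.1) AppleWebKit/534.24 "
      "(KHTML, like Gecko) Chrome/11.0.696.16 Safari/534.24")

def chrome_major_version(user_agent):
    """Return Chrome's major version from a UA string, or None if absent."""
    match = re.search(r"Chrome/(\d+)\.", user_agent)
    return int(match.group(1)) if match else None

print(chrome_major_version(ua))  # prints 11
```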

We intend to ship Chrome 11 with these changes, assuming they don't cause major web compatibility problems. To test them out and ensure your website remains compatible with Chrome, we recommend trying the Chrome Dev and Beta channel builds. If you have any questions, please check out the blog post on the Chromium blog or drop us a line at our help forum.
