
By Patrick Copeland, Google.org

Cross-posted with the Google.org Blog

On June 1st and 2nd, thousands of developers from across the U.S. came together at nearly 100 different locations to participate in the first ever National Day of Civic Hacking. Using public data recently released by the government on topics like crime, health and the environment, developers built new applications that help address social challenges.


At the Googleplex in Mountain View, we hosted nearly 100 developers, statisticians, data scientists, and designers, who stayed long into the night hacking together prototypes that show how data on health and the environment can be used to enrich lives. Teams prototyped with Fusion Tables and Google App Engine, and relied on BigQuery as a workhorse to crunch the biggest datasets. Participants used Google+ Hangouts to connect with hackathons in other states and collaborated using Google Apps and other platforms.

Here are a few highlights from the hackathon that stood out as useful, visually stunning, and informative ways to use public data:
  • Eat Healthy for Less, the winner of our Mountain View hackathon, is a mobile web application that uses the Consumer Price Index to suggest healthy recipes that can be made on a budget.
  • Data+, a reimagining of how we access data, can make exploring public datasets more intuitive and easily understandable for everyone.
  • Detoxic.org is a web experience and Android app that shows you toxic sites and landfills nearby that you might not know about so that you can take civic action against toxic waste.
Many of the ideas have great potential, and we are encouraging participants to continue their work. We hope that the National Day of Civic Hacking will be a catalyst for innovation in this space, and encourage you to keep track of our tools for civic developers at g.co/civicdevelopers.


Congratulations and thanks to everyone who participated!


Patrick Copeland is director of engineering at Google.org, where he works to build systems that leverage Google's reach to help people around the world.

Posted by Scott Knaster, Editor
Webmaster level: Intermediate

Let's take a quick look at the individual sections of the Site Performance feature in Google Webmaster Tools:

Performance overview



The performance overview shows a graph of the aggregated speed numbers for the website, based on the pages that were most frequently accessed by visitors who use the Google Toolbar with the PageRank feature activated. By using data from Google Toolbar users, you don't have to worry about us testing your site from a location that your users do not use. For example, if your site is in Germany and all your users are in Germany, the chart will reflect the load time as seen in Germany. Similarly, if your users mostly use dial-up connections (or high-speed broadband), that would be reflected in these numbers as well. If only a few visitors of your site use the Google Toolbar, we may not be able to show this data in Webmaster Tools.

The line between the red and the green sections on the chart is the 20th percentile: only 20% of the sites we check are faster than this. This website is pretty close to that 20% mark; so which pages would we have to work on first?
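To make the percentile idea concrete, here's a tiny Python sketch. The load times are made up for illustration; this is not how Webmaster Tools computes its chart, just the general idea of a nearest-rank percentile:

    # Made-up load times (in seconds) for a sample of sites.
    load_times = sorted([1.2, 1.6, 1.9, 2.3, 2.8, 3.1, 3.5, 4.2, 5.0, 7.4])

    # Nearest-rank style cutoff: 20% of the samples fall below this value.
    threshold = load_times[int(0.2 * len(load_times))]

    print("20th percentile:", threshold)  # 1.9 for this sample
    # A site sits in the fast (green) region when its load time is below
    # the threshold, i.e. when it is faster than 80% of the sampled sites.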

Example pages with load times



In this section you can find some example pages along with the average, aggregated load times that users observed while they were on your website. These numbers may differ from what you see yourself, since they come from a variety of browsers, internet connections, and locations. This list can help you recognize pages that take longer than average to load: the pages that slow your users down.

As the page load times are based on actual accesses made by your users, they may include pages which are disallowed from crawling. While Googlebot will not be able to crawl disallowed pages, those pages may still be a significant part of your site's user experience.
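If you'd like to check whether one of the slow pages in this list happens to be blocked from crawling, Python's standard library includes a robots.txt parser. A minimal sketch, with example.com and the page URL as placeholders for your own site:

    from urllib.robotparser import RobotFileParser

    # Placeholder domain and URL; substitute your own site and slow page.
    parser = RobotFileParser()
    parser.set_url("http://example.com/robots.txt")
    parser.read()  # fetches and parses the live robots.txt

    slow_page = "http://example.com/members/dashboard"
    if parser.can_fetch("Googlebot", slow_page):
        print("Googlebot may crawl this page.")
    else:
        print("Disallowed for Googlebot, but it still counts for your users.")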

Keep in mind that you may see occasional spikes here, so it's recommended that you watch the load times over a period of time to see what's stable. If you consistently see very large load times, that probably means that most of your users are seeing very slow page loads (whether due to slow connections or otherwise), so it's something you should take seriously.
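One simple way to separate an occasional spike from a genuinely slow baseline is to compare the median against the mean over recent measurements; the median barely moves for a single outlier. An illustrative sketch with made-up numbers:

    from statistics import mean, median

    # Made-up daily average load times (seconds), with one spike.
    daily_load_times = [2.1, 2.0, 9.8, 2.2, 2.1, 2.3, 2.0]

    print("median:", median(daily_load_times))          # ~2.1, robust to the spike
    print("mean:  ", round(mean(daily_load_times), 2))  # ~3.21, pulled up by it

    # If the median itself is high, the slowness is consistent rather than
    # a one-off spike, and is worth taking seriously.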

Page Speed suggestions



These suggestions are based on the Page Speed Firefox / Firebug plugin. In order to find the details for these sample URLs, we fetch the page and all its embedded resources with Googlebot. If we are not able to fetch all of the embedded content with Googlebot, we may not be able to provide a complete analysis. Similarly, if the servers return slightly different content to Googlebot than they would to normal users, this may affect what is shown here. For example, some servers return uncompressed content for Googlebot, similar to what would be served to older browsers that do not support gzip-compressed embedded content (this is currently the case for Google Analytics' "ga.js").
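If you're curious whether your server compresses a particular resource, you can compare what it returns with and without gzip in the Accept-Encoding request header. A rough sketch using Python's standard library (the URL is a placeholder):

    import urllib.request

    # Placeholder URL; point this at a resource on your own site.
    url = "http://example.com/script.js"

    for encoding in ("identity", "gzip"):
        request = urllib.request.Request(url, headers={"Accept-Encoding": encoding})
        with urllib.request.urlopen(request) as response:
            served = response.headers.get("Content-Encoding", "none")
            size = len(response.read())  # gzip bodies arrive still compressed
            print(f"asked for {encoding}: Content-Encoding={served}, {size} bytes")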

When looking at flagged issues regarding common third-party code such as website analytics scripts, one factor that can play a role is how widespread these scripts are on the web. If they are common across the web, chances are that the average user's browser will have already cached the DNS lookup and the content of the script. While these scripts will still be flagged as separate DNS lookups, in practice they might not play a strong role in the actual load time.
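Out of curiosity, you can probe what a DNS lookup costs with a couple of lines of Python. Whether the second call is faster depends entirely on your operating system's resolver cache, so treat this as a rough probe, not a benchmark; the hostname is a placeholder and will not actually resolve:

    import socket
    import time

    # Placeholder hostname standing in for a third-party script host.
    host = "stats.example-analytics.com"

    for attempt in (1, 2):
        start = time.perf_counter()
        try:
            socket.getaddrinfo(host, 80)
        except socket.gaierror:
            print("lookup failed (placeholder host does not resolve)")
            break
        print(f"lookup {attempt}: {time.perf_counter() - start:.3f}s")

    # A cached second lookup can be near-instant; browsers cache similarly,
    # which is why common third-party hosts often cost users little.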

We offer these suggestions as a useful guideline regarding possible first performance improvement steps and recommend using the Page Speed plugin (or a similar tool) directly when working on your website. This allows you to better recognize the blocking issues and makes it easy to see how modifications on the server affect the total load time.


For questions about Webmaster Tools and this new feature, feel free to read the Help Center article, search and post in the Webmaster Help Forums or in the Page Speed discussion group. We hope this information helps you make your website even faster!

Webmaster level: All

Everyone who uses the web knows how frustrating it is to land on a page that sounds promising in the search results but ends up being useless when you visit it. We work hard to make sure Google’s algorithms catch as much as possible, but sometimes spammy sites still make it into search results. We appreciate the numerous spam reports sent in by users like you who find these issues; the reports help us improve our search results and make sure that great content is treated accordingly. Good spam reports are important to us. Here’s how to maximize the impact of any spam reports you submit:

Why report spam to Google?

Google’s search quality team uses spam reports as a basis for further improving the quality of the results that we show you, to provide a level playing field for webmasters, and to help with our scalable spam fighting efforts. With the release of new tools like our Chrome extension to report spam, we’ve seen people filing more spam reports, and we have to allocate appropriate resources to the spam reports that are most likely to be useful.

Spam reports are prioritized by looking at how much visibility a potentially spammy site has in our search results, in order to help us focus on high-impact sites in a timely manner. For instance, we’re likely to prioritize the investigation of a site that regularly ranks on the first or second page over that of a site that only gets a few search impressions per month. A spam report for a page that is almost never seen by users is less likely to be reviewed compared to higher-impact pages or sites. We generally use spam reports to help improve our algorithms so that we can not only recognize and handle this particular site, but also cover any similar sites. In a few cases, we may additionally choose to immediately remove or otherwise take action on a site.

Which sites should I report?

We love seeing reports about spammy sites that our algorithms have missed. That said, it’s a poor use of your time to report sites that are not spammy. Sites submitted through the spam report form are reviewed for spam content only. Sites that you think should be tackled for other reasons should be submitted to us through the appropriate channel:
  • content which you have removed from a site: use our URL removal tools
  • sites with malware: use the malware report form
  • paid links that you find on sites: use the paid links reporting form
  • spammy links pointing at a page: make sure that you read how to report linkspam first
  • someone copying your content: we have a different copyright process (see our official documentation pages for more info)
There’s generally no need to report sites with technical problems or parked domains because these are typically handled automatically.

The same applies to redirecting legitimate sites from one top-level domain to another, e.g. example.de redirecting to example.com/de. As long as the content presented is not spammy, the technique of redirecting one domain to another does not automatically violate the Google Webmaster Guidelines.
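If you're ever unsure whether such a redirect is a clean, permanent one, checking the status code and Location header takes only a few lines. A sketch with Python's standard library, reusing the example domains from above:

    import http.client

    # example.de / example.com are the placeholder domains from the text.
    connection = http.client.HTTPConnection("example.de")
    connection.request("HEAD", "/")
    response = connection.getresponse()

    # A clean domain move typically answers with a permanent redirect (301)
    # and a Location header pointing at the content's new home.
    print(response.status, response.getheader("Location"))
    connection.close()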


If you happen to come across a gibberish site, a page full of auto-generated nonsense text, it’s most likely spam.

The best way to submit a compelling spam report is to take a good look at the website in question and compare it against the Google Webmaster Guidelines. For instance, these would be good reasons to report a site through the spam report form:
  • the cached version contains significantly different (often keyword-rich) content from the live version
  • you’re redirected to a completely different domain with off-topic, commercial content
  • the site is filled with auto-generated or keyword-stuffed content that seems to make no sense
These are just a few examples of techniques that might be potentially spammy, and which we would appreciate seeing in the form of a spam report. When in doubt, please feel free to discuss your concerns on the Help Forum with other users and Google guides.
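One of the patterns above, serving different content to crawlers than to users, can sometimes be spotted by fetching a page under two User-Agent strings and comparing what comes back. A rough sketch in Python: the URL is a placeholder, and keep in mind that plenty of legitimate sites vary content by device, so a mismatch is a hint to investigate, not proof of cloaking:

    import urllib.request

    # Placeholder URL; the User-Agent strings are just examples.
    url = "http://example.com/suspicious-page"
    user_agents = {
        "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
        "crawler": "Mozilla/5.0 (compatible; Googlebot/2.1; "
                   "+http://www.google.com/bot.html)",
    }

    sizes = {}
    for name, agent in user_agents.items():
        request = urllib.request.Request(url, headers={"User-Agent": agent})
        with urllib.request.urlopen(request) as response:
            sizes[name] = len(response.read())

    # Wildly different sizes for the same URL can hint at cloaking.
    print(sizes)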

What should I include in a spam report?

Some spam reports are easier to understand than others; having a clear and easy-to-understand report makes it much easier for us to analyze the issue and take appropriate actions. Here are some things to keep in mind when submitting the spam report:
  • Submit the URLs of the pages where you see spam (not just the domain name). This makes it easy for us to verify the problem on those specific pages.
  • Try to specify the issue as clearly as possible using the checkboxes. Don’t just check every single box--such reports are less likely to be reviewed.
  • If only a part of the page uses spammy techniques, for example if it uses cloaking or has hidden text on an otherwise good page, provide a short explanation on how to look for the spam you’re seeing. If you’re reporting a site for spammy backlinks rather than on-page content, mention that.
By following these guidelines, your spam reports will be reproducible and clear, making them easier to analyze on our side.

What happens next?

After we review the feedback from these reports (we want to confirm that the reported sites are actually spammy, not just sites that someone didn’t like), it may take a bit of time before we update our algorithms and a change becomes visible in the search results. Keep in mind that sometimes our algorithms may already be treating those techniques appropriately; for instance, perhaps we’re already ignoring all the hidden text or the exchanged links that you have reported. Submitting the same spam report multiple times is not necessary. Rest assured that we actively review spam reports and take appropriate actions, even if the changes are not immediately visible to you.

With your help, we hope that we can improve the quality and fairness of our search results for everyone! Thank you for continuing to submit spam reports, and feel free to post here or in our Help Forum should you have any questions.
