
Webmaster level: All

Everyone who uses the web knows how frustrating it is to land on a page that sounds promising in the search results but ends up being useless when you visit it. We work hard to make sure Google’s algorithms catch as much as possible, but sometimes spammy sites still make it into search results. We appreciate the numerous spam reports sent in by users like you who find these issues; the reports help us improve our search results and make sure that great content is treated accordingly. Good spam reports are important to us. Here’s how to maximize the impact of any spam reports you submit:

Why report spam to Google?

Google’s search quality team uses spam reports as a basis for further improving the quality of the results that we show you, to provide a level playing field for webmasters, and to help with our scalable spam fighting efforts. With the release of new tools like our Chrome extension to report spam, we’ve seen people filing more spam reports, and we have to allocate appropriate resources to the spam reports that are most likely to be useful.

Spam reports are prioritized by looking at how much visibility a potentially spammy site has in our search results, in order to help us focus on high-impact sites in a timely manner. For instance, we’re likely to prioritize the investigation of a site that regularly ranks on the first or second page over that of a site that only gets a few search impressions per month. A spam report for a page that is almost never seen by users is less likely to be reviewed compared to higher-impact pages or sites. We generally use spam reports to help improve our algorithms so that we can not only recognize and handle this particular site, but also cover any similar sites. In a few cases, we may additionally choose to immediately remove or otherwise take action on a site.

Which sites should I report?

We love seeing reports about spammy sites that our algorithms have missed. That said, it’s a poor use of your time to report sites that are not spammy. Sites submitted through the spam report form are reviewed for spam content only. Sites that you think should be tackled for other reasons should be submitted to us through the appropriate channels:
  • for pages that contain content which you have removed, use our URL removal tools
  • for sites with malware, use the malware report form
  • for paid links that you find on sites, use the paid links reporting form
If you want to report spammy links for a page, make sure that you read how to report linkspam. If you have a complaint because someone is copying your content, we have a different copyright process--see our official documentation pages for more info. There’s generally no need to report sites with technical problems or parked domains because these are typically handled automatically.

The same applies to redirecting legitimate sites from one top level domain to another, e.g. example.de redirecting to example.com/de. As long as the content presented is not spammy, the technique of redirecting one domain to another does not automatically violate the Google Webmaster Guidelines.


If you happen to come across a gibberish site full of auto-generated, nonsensical text, it’s most likely spam.

The best way to submit a compelling spam report is to take a good look at the website in question and compare it against the Google Webmaster Guidelines. For instance, these would be good reasons to report a site through the spam report form:
  • the cached version contains significantly different (often keyword-rich) content from the live version
  • you’re redirected to a completely different domain with off-topic, commercial content
  • the site is filled with auto-generated or keyword-stuffed content that seems to make no sense
These are just a few examples of techniques that might be potentially spammy, and which we would appreciate seeing in the form of a spam report. When in doubt, please feel free to discuss your concerns on the Help Forum with other users and Google guides.

What should I include in a spam report?

Some spam reports are easier to understand than others; having a clear and easy-to-understand report makes it much easier for us to analyze the issue and take appropriate actions. Here are some things to keep in mind when submitting the spam report:
  • Submit the URLs of the pages where you see spam (not just the domain name). This makes it easy for us to verify the problem on those specific pages.
  • Try to specify the issue as clearly as possible using the checkboxes. Don’t just check every single box--such reports are less likely to be reviewed.
  • If only a part of the page uses spammy techniques, for example if it uses cloaking or has hidden text on an otherwise good page, provide a short explanation on how to look for the spam you’re seeing. If you’re reporting a site for spammy backlinks rather than on-page content, mention that.
By following these guidelines, your spam reports will be reproducible and clear, making them easier to analyze on our side.

What happens next?

After reviewing the feedback from these reports (we want to confirm that the reported sites are actually spammy, not just sites that someone didn’t like), it may take a bit of time before we update our algorithms and a change is visible in the search results. Keep in mind that sometimes our algorithms may already be treating those techniques appropriately; for instance, perhaps we’re already ignoring all the hidden text or the exchanged links that you have reported. Submitting the same spam report multiple times is not necessary. Rest assured that we actively review spam reports and take appropriate actions, even if the changes are not immediately visible to you.

With your help, we hope that we can improve the quality of and fairness in our search results for everyone! Thank you for continuing to submit spam reports and feel free to post here or in our Help Forum should you have any questions.

At Google, we focus constantly on speed; we believe that making our websites load and display faster improves users' experience and helps them become more productive. Today, we want to share with the web community some of the best practices we've used and developed over the years, by open-sourcing Page Speed.

Page Speed is a tool we've been using internally to improve the performance of our web pages -- it's a Firefox Add-on integrated with Firebug. When you run Page Speed, you get immediate suggestions on how you can change your web pages to improve their speed. For example, Page Speed automatically optimizes images for you, giving you a compressed image that you can use immediately on your web site. It also identifies issues such as JavaScript and CSS loaded by your page that aren't actually used to display it, which can help reduce the time your users spend waiting for the page to download and display.

Page Speed's suggestions are based on a set of commonly accepted best practices that we and other websites implement. To help you understand the suggestions and rules, we have created detailed documentation to describe the rationale behind each of the rules. We look forward to your feedback on the Webmaster Help Forum.

We hope you give Page Speed a try.


Written by a Webmaster Trends Analyst, Zürich

By writing and maintaining accurate meta tags (e.g., descriptive titles and robots information), you help Google to more accurately crawl, index and return your site in search results. Meta tags provide information to all sorts of clients, such as browsers and search engines. Just keep in mind that each client will likely only interpret the meta tags that it uses and ignore the rest (although they might be useful for other reasons).

Here's how Google would interpret meta tags of this sample HTML page:


<!DOCTYPE …>
<head>
<title>Traditional Swiss cheese fondue recipes</title>  <!-- utilized by Google, accuracy is valuable to webmasters -->
<meta name="description" content="Cheese fondue is …">  <!-- utilized by Google, can be shown in our search results -->
<meta name="revisit-after" content="14 days">  <!-- not utilized by Google or other major search engines -->
<META name="verify-v1" content="e8JG…Nw=" />  <!-- optional, for Google webmaster tools -->
<meta name="GoogleBot" content="noOdp">  <!-- optional -->
<meta …>
<meta …>
</head>

<meta name="description" content="A description of the page">
This tag provides a short description of the page. In some situations this description is used as a part of the snippet shown in the search results. For more information, please see our blog post "Improve snippets with a meta description makeover" and the Help Center article "How do I change my site's title and description?" While the use of a description meta tag is optional and will have no effect on your rankings, a good description can result in a better snippet, which in turn can help to improve the quality and quantity of visitors from our search results.
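To make that concrete, a page about fondue recipes might use something along these lines (the wording below is only an illustration, not taken from the original post):

<meta name="description" content="Step-by-step traditional Swiss cheese fondue recipes, with tips on choosing the right cheese and wine.">

A complete, human-readable sentence like this gives Google something useful to show as a snippet, whereas a bare list of keywords usually does not.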

<title>The title of the page</title>
While technically not a meta tag, this tag is often used together with the "description." The contents of this tag are generally shown as the title in search results (and of course in the user's browser when visiting the page or viewing bookmarks). Some additional information can be found in our blog post "Target visitors or search engines?", especially under "Make good use of page titles."

<meta name="robots" content="…, …">
<meta name="googlebot" content="…, …">
These meta tags control how search engines crawl and index the page. The "robots" meta tag specifies rules that apply to all search engines; the "googlebot" meta tag specifies rules that apply only to Google. Google understands the following values (when specifying multiple values, separate them with a comma):
  • noindex: prevents the page from being indexed
  • nofollow: prevents Googlebot from following links on the page
  • nosnippet: prevents a snippet from being shown in the search results
  • noodp: prevents the alternative description from the Open Directory Project (ODP/DMOZ) from being used
  • noarchive: prevents Google from showing the "Cached" link for the page
  • unavailable_after:[date]: lets you specify the exact date and time at which crawling and indexing of the page should stop

The default rule is "index, follow" -- this is used if you omit this tag entirely or if you specify content="all." Additional information about the "robots" meta tag can be found in "Using the robots meta tag." As a side-note, you can now also specify this information in the header of your pages using the "X-Robots-Tag" HTTP header directive. This is particularly useful if you wish to fine-tune crawling and indexing of non-HTML files like PDFs, images or other kinds of documents.
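As a sketch of what that looks like in practice, a server that doesn't want a PDF indexed or cached could include the header in its response (how you configure this depends on your server software; the values shown are just an example):

HTTP/1.1 200 OK
Content-Type: application/pdf
X-Robots-Tag: noindex, noarchive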

<meta name="google" content="notranslate">
When we recognize that the contents of a page are not in the language that the user is likely to want to read, we often provide a link in the search results to an automatic translation of your page. In general, this gives you the chance to provide your unique and compelling content to a much larger group of users. However, there may be situations where this is not desired. By using this meta tag, you can signal that you do not wish for Google to provide a link to a translation for this page. This meta tag generally does not influence the ranking of the page for any particular language. More information can be found in the "Google Translate FAQ".

<meta name="verify-v1" content="…">
This Google webmaster tools-specific meta tag is used on the top-level page of your site to verify ownership of a site in webmaster tools (alternatively you may upload an HTML file to do this). The content value you put into this tag is provided to you in your webmaster tools account. Please note that while the contents of this meta tag (including upper and lower case) must match exactly what is provided to you, it does not matter if you change the tag from XHTML to HTML or if the format of the tag matches the format of your page. For details, see "How do I verify my site by adding a meta tag to my site's home page?"

<meta http-equiv="Content-Type" content="…; charset=…">
This meta tag defines the content-type and character set of the page. When using this meta tag, make sure that you surround the value of the content attribute with quotes; otherwise the charset attribute may be interpreted incorrectly. If you decide to use this meta tag, it goes without saying that you should make sure that your content is actually in the specified character set. "Google Webauthoring Statistics" has interesting numbers on the use of this meta tag.
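For example, a page encoded in UTF-8 would declare (UTF-8 here is simply an illustration; use whatever character set your content is actually saved in):

<meta http-equiv="Content-Type" content="text/html; charset=UTF-8">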

<meta http-equiv="refresh" content="…;url=…">
This meta tag sends the user to a new URL after a certain amount of time, and is sometimes used as a simple form of redirection. This kind of redirect is not supported by all browsers and can be confusing to the user. If you need to change the URL of a page as it is shown in search engine results, we recommend that you use a server-side 301 redirect instead. Additionally, W3C's "Techniques and Failures for Web Content Accessibility Guidelines 2.0" lists it as being deprecated.
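For reference, a server-side 301 redirect is simply a response like the following, where the request for the old URL is answered with the new location (the URL is a placeholder; the exact configuration depends on your web server):

HTTP/1.1 301 Moved Permanently
Location: http://www.example.com/new-page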

(X)HTML and Capitalization
Google can read both HTML and XHTML-style meta tags (regardless of the code used on the page). In addition, upper or lower case is generally not important in meta tags -- we treat <TITLE> and <title> equally. The "verify-v1" meta tag is an exception: it's case-sensitive.

revisit-after vs. Sitemap lastmod and changefreq
Occasionally webmasters needlessly include "revisit-after" in an attempt to influence a search engine's crawl schedule; however, this meta tag is largely ignored. If you want to give search engines information about changes in your pages, create and submit an XML Sitemap. In this file you can specify the last-modified date and the change-frequency of the URLs on your site.
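A minimal Sitemap entry showing those two fields might look like this (the URL, date and frequency are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/fondue-recipes.html</loc>
    <lastmod>2013-05-01</lastmod>
    <changefreq>monthly</changefreq>
  </url>
</urlset>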

If you're interested in more examples or have questions about the meta tags mentioned above, jump into our Google Webmaster Help Group and join the discussion.


Update: In case you missed it, the other popular picks were answered in the Webmaster Help Group.