News and Tutorials from Votre Codeur | SEO | Website Creation | Software Development

This is a topic from the Google Webmaster Central blog.
Webmaster Level: Intermediate.

We hear lots of questions about site architecture issues and traffic drops, so it was a pleasure to talk about it in greater detail at SMX London and I'd like to highlight some key concepts from my presentation here. First off, let's gain a better understanding of drops in traffic, and then we'll take a look at site design and architecture issues.

Understanding drops in traffic

As you know, fluctuations in search results happen all the time; the web is constantly evolving and so is our index. Improvements in our ability to understand our users' interests and queries also often lead to differences in how our algorithms select and rank pages. We realize, however, that such changes might be confusing and sometimes foster misconceptions, so we'd like to address a couple of these myths head-on.

Myth number 1: Duplicate content causes drops in traffic!
Webmasters often wonder if the duplicates on their site can have a negative effect on their site's traffic. As mentioned in our guidelines, unless this duplication is intended to manipulate Google and/or users, the duplication is not a violation of our Webmaster Guidelines. The second part of my presentation illustrates in greater detail how to deal with duplicate content using canonicalization.

Myth number 2: Affiliate programs cause drops in traffic!
Original and compelling content is crucial for a good user experience. If your website participates in affiliate programs, it's essential to consider whether the same content is available in many other places on the web. Affiliate sites with little or no original and compelling content are not likely to rank well in Google search results, but including affiliate links within the context of original and compelling content isn't in itself the sort of thing that leads to traffic drops.

Having reviewed a few of the most common concerns, I'd like to highlight two important sections of the presentation. The first illustrates how malicious attacks -- such as an injection of hidden text and links -- might cause your site to be removed from Google's search results. On a happier note, it also covers how you can use the Google cache and Webmaster Tools to identify this issue. On a related note, if we've found a violation of the Webmaster Guidelines such as the use of hidden text or the presence of malware on your site, you will typically find a note regarding this in your Webmaster Tools Message center.
You may also find your site's traffic decreased if your users are being redirected to another site; for example, due to a hacker-applied server- or page-level redirection triggered by referrals from search engines. A similar scenario -- but with different results -- is the case in which a hacker has instituted a redirection for crawlers only. While this causes no immediate drop in traffic, since users and their visits are not affected, it can lead to a decrease in pages indexed over time.
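The two hacked-redirect patterns described above can be made concrete with a short sketch. This is an illustration of the attacker's conditional logic (so you know what to look for), not detection code; the spam URLs and user-agent strings are invented examples.

```python
def hacked_redirect(user_agent, referer):
    """Return a redirect target if this request would be hijacked, else None."""
    ua = user_agent.lower()
    # Pattern 1: redirect real visitors arriving from a search engine,
    # causing an immediate, visible drop in traffic.
    if "google." in referer or "bing." in referer:
        return "http://spam.example/landing"
    # Pattern 2: redirect crawlers only, leaving users unaffected --
    # no immediate traffic drop, but indexed pages decay over time.
    if "googlebot" in ua:
        return "http://spam.example/doorway"
    return None  # normal visitors see the real page

# A regular visitor coming from a search result gets hijacked:
print(hacked_redirect("Mozilla/5.0", "https://www.google.com/search?q=x"))
# Googlebot gets a doorway page:
print(hacked_redirect("Googlebot/2.1", ""))
# A direct visit looks completely normal:
print(hacked_redirect("Mozilla/5.0", ""))
```

Because pattern 2 never touches real users, comparing what your server returns to a crawler user-agent against what a browser sees is one practical way to spot it.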

Site design and architecture issues
Now that we've seen how malicious changes might affect your site and its traffic, let's examine some design and architecture issues. Specifically, you want to ensure that your site is able to be both effectively crawled and indexed, which is the prerequisite to being shown in our search results. What should you consider?

  • First off, check that your robots.txt file has the correct status code and is not returning an error.
  • Keep in mind some best practices when moving to a new site, and consider the new "Change of address" feature recently added to Webmaster Tools.
  • Review the settings of the robots.txt file to make sure no pages -- particularly those rewritten and/or dynamic -- are blocked inappropriately.
  • Finally, make good use of the rel="canonical" attribute to reduce the indexing of duplicate content on your domain. The example in the presentation shows how using this attribute helps Google understand that a duplicate can be clustered with the canonical and that the original, or canonical, page should be indexed.
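The robots.txt checks in the list above can be rehearsed offline with Python's standard urllib.robotparser. A minimal sketch; the rules, paths, and example.com domain are invented for illustration, not taken from any real site.

```python
import urllib.robotparser

# Suppose this is what your robots.txt currently serves
# (invented example rules).
ROBOTS_TXT = """\
User-agent: *
Disallow: /tmp/
Disallow: /search
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Verify that important pages -- particularly rewritten or dynamic
# URLs -- are not blocked inappropriately for Googlebot.
for path in ["/", "/products/shoes", "/tmp/cache.html"]:
    allowed = parser.can_fetch("Googlebot", "https://example.com" + path)
    print(path, "->", "allowed" if allowed else "BLOCKED")
```

In production you would also confirm the HTTP status of the live robots.txt file itself: a 200 (or a clean 404 when you have no robots.txt) is fine, while server errors on that URL can stall crawling.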


In conclusion, remember that fluctuations in search results are normal, but there are steps you can take to avoid malicious attacks or design and architecture issues that might cause your site to disappear or fluctuate unpredictably in search results. Start by learning more about attacks by hackers and spammers, make sure everything is running properly at the crawling and indexing level by double-checking the HTML suggestions in Webmaster Tools, and finally, test your robots.txt file in case you are accidentally blocking Googlebot. And don't forget about those "robots.txt unreachable" errors!

This topic was published in 2013.
The following is another topic from the Google Webmaster Central blog. Webmaster level: Advanced.

Two years ago we released the Page Speed browser extension and earlier this year the Page Speed Online API to provide developers with specific suggestions to make their web pages faster. Last year we released mod_pagespeed, an Apache module, to automatically rewrite web pages. To further simplify the life of webmasters and to avoid the hassles of installation, today we are releasing the latest addition to the Page Speed family: Page Speed Service.

Page Speed Service is an online service that automatically speeds up loading of your web pages. To use the service, you need to sign up and point your site’s DNS entry to Google. Page Speed Service fetches content from your servers, rewrites your pages by applying web performance best practices, and serves them to end users via Google's servers across the globe. Your users will continue to access your site just as they did before, only with faster load times. Now you don’t have to worry about concatenating CSS, compressing images, caching, gzipping resources or other web performance best practices.
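One of the best practices the service automates -- gzipping text resources -- can be illustrated in a few lines with Python's standard library. The CSS snippet is an invented example; real savings depend on how repetitive your resources are.

```python
import gzip

# A deliberately repetitive stylesheet (invented example) to show why
# gzipping text resources is such an effective best practice.
css = ("body { margin: 0; padding: 0; }\n" * 50).encode("utf-8")
compressed = gzip.compress(css)

print(len(css), "bytes ->", len(compressed), "bytes")
assert len(compressed) < len(css)  # repetitive text compresses very well
```

Page Speed Service applies this kind of optimization (along with image compression, CSS concatenation, and caching headers) automatically at its edge servers, so you don't have to configure it yourself.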

In our testing we have seen speed improvements of 25% to 60% on several sites. But we know you care most about the numbers for your site, so check out how much Page Speed Service can speed up your site. If you’re encouraged by the results, please sign up. If not, be sure to check back later. We are diligently working on adding more improvements to the service.

At this time, Page Speed Service is being offered to a limited set of webmasters free of charge. Pricing will be competitive and details will be made available later. You can request access to the service by filling out this web form.

Yes, after a long time without the Google search engine updating the PageRank of websites and blogs, Google has now updated PageRank (PR) again. I only learned about this from a fellow blogger's status on Facebook and, curious, I finally tested this SEO blog's PageRank on 27 June 2011.

Of course I don't know exactly when Google updated PageRank, since I hadn't blogged for a while either; perhaps the