
Greetings everyone, this is a topic from the Google Webmaster Central blog:

Webmaster level: Beginner - Intermediate

We recently hosted our second site clinic, this time at TechHub in London, UK. Like last year, here’s our summary of the topics that came up.

  • Title tags and meta description tags are easy ways to improve your site's visibility in Google search results, yet we still see webmasters not fully utilizing their potential. We have some help available about writing good page titles and descriptions, which you can read to brush up on the subject. That said, you can ignore the meta keywords tag, at least as far as Google is concerned.
  • One way Google's algorithms determine the context of content on a page is by looking at the page’s headings. The way semantic markup is used throughout a site, including h1, h2, and h3 tags, helps us to understand the priorities of a site’s content. One should not fret, though, about every single H tag. Using common sense is the way to go.
  • Just as we recommend structuring pages logically, it is similarly important to structure the whole website, particularly by linking to related documents within your site as necessary. This helps both users and search engine bots explore all the content you provide. To augment this, be sure to provide a regularly updated Sitemap, which can be conveniently linked to from your site’s robots.txt file for automatic discovery by Google and other search engines.
  • Duplicate content and canonicalization issues were discussed for many websites reviewed at the site clinic. Duplicate content within a website is generally not a problem, but it can make it more difficult for search engines to properly index your content and serve the right version to users. There are two common ways to signal which versions of your content you prefer: by using 301 redirects to point to your preferred versions, or by using the rel="canonical" link element. If you’re unsure whether to use the www or non-www form of your domain, we recommend you check out the preferred domain setting in Webmaster Tools.
  • Another commonly seen issue is that some sites have error pages which do not return an error HTTP result code, but instead return the HTTP success code 200. Only documents that are actually available should reply with the HTTP success result code 200. When a page no longer exists, it should return a 404 (Not found) response. The HTTP response headers of any URL can be checked using Fetch as Googlebot in Webmaster Tools, or with third-party tools such as the Live HTTP Headers Firefox add-on or web-sniffer.net.
  • Ranking for misspelled queries, e.g. local business names including typos, seems to be an area of concern. In some cases, Google’s automatic spelling correction gets the job done for users by suggesting the correct spelling. It isn’t a wise idea to stuff a site's content with every typo imaginable. It’s also not advisable to hide this or any other type of content using JavaScript, CSS or similar techniques. These methods are in violation of Google’s Webmaster Guidelines and we may take appropriate action against a site that employs them. If you’re not sure how Googlebot “sees” your pages, e.g. when using lots of JavaScript, you can get a better idea by looking at the text-only version of the cached copy in Google web search results.
  • Users love fast websites. That’s why webpage loading speed is an important consideration for your users. We offer a wide range of tools and recommendations to help webmasters understand the performance of their websites and how to improve them. The easiest way to get started is to use Page Speed Online, which is the web-based version of our popular Page Speed Chrome extension. Our Let's make the web faster page has a great list of resources from Google and elsewhere for improving website speed, which we recommend reading.
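For the Sitemap tip above, the robots.txt reference is a single line. A minimal sketch, assuming the sitemap lives at the site root of a hypothetical www.example.com:

```
# robots.txt at http://www.example.com/robots.txt
User-agent: *
Disallow:

# Lets Google and other search engines discover the Sitemap automatically
Sitemap: http://www.example.com/sitemap.xml
```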
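As an illustration of the 301 approach to canonicalization mentioned above, here is a sketch assuming an Apache server with mod_rewrite enabled and www.example.com as the hypothetical preferred host:

```
# .htaccess sketch: 301-redirect the non-www host to the preferred www host
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

Alternatively, a `<link rel="canonical" href="http://www.example.com/page.html">` element in a page's head signals the preferred URL without a redirect.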
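The error-code point above can also be checked programmatically. The following Python sketch (not from the original post) demonstrates a "soft 404" using a throwaway local server, along with a small helper that reports the status code a URL really returns, which is the same information Fetch as Googlebot or web-sniffer.net would show:

```python
# Demonstrates a "soft 404": a page that says "Not found" but replies 200.
import http.server
import threading
import urllib.request
import urllib.error

class Handler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        # A well-behaved server would send 404 for /missing; this one sends 200.
        self.send_response(200 if self.path == "/missing" else 404)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(b"<h1>Not found</h1>")

    def log_message(self, *args):  # keep output quiet
        pass

def status_of(url):
    """Return the HTTP status code for url, even for 4xx/5xx responses."""
    try:
        return urllib.request.urlopen(url, timeout=5).status
    except urllib.error.HTTPError as err:
        return err.code

server = http.server.HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = f"http://127.0.0.1:{server.server_address[1]}"

code_missing = status_of(base + "/missing")
code_other = status_of(base + "/anything-else")
print(code_missing, code_other)  # 200 404 -- /missing is a soft 404
server.shutdown()
```

If a URL that shows an error message reports 200 here, the server should be fixed to return a real 404.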

We’d like to thank the TechHub team, who helped us facilitate the event, and give a big thank you to all participants. We hope you found the presentation and Q&A session interesting. We've embedded the presentation below.

And as we mentioned in the site clinic, sign up at the Google Webmaster Help Forum to discuss any further questions you might have and keep an eye on our Webmaster Central Blog.



This is a topic originally published in 2013.
Webmaster Level: Intermediate/Advanced

Having your website hacked can be a frustrating experience and we want to do everything we can to help webmasters get their sites cleaned up and prevent compromises from happening again. With this post we wanted to outline two common types of attacks as well as provide clean-up steps and additional resources that webmasters may find helpful.

To best serve our users, it’s important that the pages we link to in our search results are safe to visit. Unfortunately, malicious third parties may take advantage of legitimate webmasters by hacking their sites to manipulate search engine results or distribute malicious content and spam. We alert users and webmasters alike by displaying a “This site may be compromised” warning in our search results for sites we’ve detected as hacked:



We want to give webmasters the necessary information to help them clean up their sites as quickly as possible. If you’ve verified your site in Webmaster Tools, we’ll also send you a message when we’ve identified that your site has been hacked, and when possible give you example URLs.

Occasionally, your site may become compromised to facilitate the distribution of malware. When we recognize that, we’ll identify the site in our search results with a label of “This site may harm your computer” and browsers such as Chrome may display a warning when users attempt to visit. In some cases, we may share more specific information in the Malware section of Webmaster Tools. We also have specific tips for preventing and removing malware from your site in our Help Center.

Two common ways malicious third parties may compromise your site are the following:

Injected Content


Hackers may attempt to influence search engines by injecting links leading to sites they own. These links are often hidden to make it difficult for a webmaster to detect this has occurred. The site may also be compromised in such a way that the content is only displayed when the site is visited by search engine crawlers.



Example of injected pharmaceutical content

If we’re able to detect this, we’ll send a message to your Webmaster Tools account with useful details. If you suspect your site has been compromised in this way, you can check the content your site returns to Google by using the Fetch as Google tool. A few good places to look for the source of such a compromise are .php files, template files, and CMS plugins.
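One way to hunt for injected content in those files is a simple pattern scan. The sketch below (illustrative, not from the original post) greps server-side files for markers that often accompany injected links, such as hidden-link styling; the specific spam keywords are hypothetical examples. Real infections vary widely, so treat hits as leads to inspect, not proof of compromise:

```python
# Scan .php/.html/.tpl/.js files for patterns common in injected spam links.
import pathlib
import re
import tempfile

SUSPICIOUS = [
    re.compile(r'display\s*:\s*none[^>]*>\s*<a\s', re.I),  # link hidden via CSS
    re.compile(r'<a[^>]+href=["\']https?://[^"\']*(?:viagra|casino)', re.I),
]

def scan(root):
    """Yield (path, line_no) for lines matching any suspicious pattern."""
    for path in pathlib.Path(root).rglob("*"):
        if path.suffix not in {".php", ".html", ".tpl", ".js"}:
            continue
        for no, line in enumerate(path.read_text(errors="ignore").splitlines(), 1):
            if any(p.search(line) for p in SUSPICIOUS):
                yield str(path), no

# Demo on a throwaway directory containing one injected hidden link.
with tempfile.TemporaryDirectory() as tmp:
    bad = pathlib.Path(tmp, "footer.php")
    bad.write_text('<div style="display:none"> <a href="http://spam.example/">x</a></div>\n')
    hits = list(scan(tmp))
    print(hits)  # one hit: footer.php, line 1
```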

Redirecting Users


Hackers might also try to redirect users to spammy or malicious sites. They may do it to all users or target specific users, such as those coming from search engines or those on mobile devices. If you’re able to access your site when visiting it directly but you experience unexpected redirects when coming from a search engine, it’s very likely your site has been compromised in this manner.

One of the ways hackers accomplish this is by modifying server configuration files (such as Apache’s .htaccess) to serve different content to different users, so it’s a good idea to check your server configuration files for any such modifications.
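For reference, here is a sketch (not from the original post) of the kind of injected rule to look for in a compromised .htaccess. It redirects only visitors arriving from search engines, so the site looks normal when you visit it directly; the destination host is a placeholder:

```
# Malicious pattern: redirect only search-engine traffic elsewhere
RewriteEngine On
RewriteCond %{HTTP_REFERER} (google|bing|yahoo) [NC]
RewriteRule ^(.*)$ http://malicious-site.example/ [R=302,L]
```

Any RewriteCond keyed on HTTP_REFERER or User-Agent that you did not add yourself deserves a close look.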



This malicious behavior can also be accomplished by injecting JavaScript into the source code of your site. The JavaScript may be designed to hide its purpose so it may help to look for terms like “eval”, “decode”, and “escape”.
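The term search suggested above can be automated with a few lines of Python. This is a hypothetical sketch: legitimate scripts use these functions too, so a hit means "read this file", not "this file is malicious":

```python
# Count calls to obfuscation-related helpers ("eval", "decode*", "escape")
# in a JavaScript source string, as a rough triage signal.
import re

OBFUSCATION_TERMS = ("eval", "decode", "escape")

def obfuscation_score(js_source):
    """Count occurrences of suspicious function calls in a JS string."""
    return sum(len(re.findall(rf"\b{t}\w*\s*\(", js_source))
               for t in OBFUSCATION_TERMS)

injected = 'var p="%3Cscript%3E";eval(unescape(p));document.write(decodeURI(p));'
print(obfuscation_score(injected))  # 2: eval( and decodeURI(
```

Files with unusually high scores, especially ones you do not recognize, are good candidates for manual review.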



Cleanup and Prevention


If your site has been compromised, it’s important to not only clean up the changes made to your site but to also address the vulnerability that allowed the compromise to occur. We have instructions for cleaning your site and preventing compromises; your hosting provider and our Malware and Hacked sites forum are also great resources if you need more specific advice.

Once you’ve cleaned up your site, you should submit a reconsideration request which, if successful, will remove the warning label in our search results.

As always, if you have any questions or feedback, please tell us in the Webmaster Help Forum.

Webmaster level: All

Since we announced Google’s recommendations for building smartphone-optimized websites, a common question we’ve heard from webmasters is how to best treat tablet devices. This is a similar question Android app developers face, and for that the Building Quality Tablet Apps guide is a great starting point.

Although we do not have specific recommendations for building search engine friendly tablet-optimized websites, there are some tips for building websites that serve smartphone and tablet users well.

When considering your site’s visitors using tablets, it’s important to think about both the devices and what users expect. Compared to smartphones, tablets have larger touch screens and are typically used on Wi-Fi connections. Tablets offer a browsing experience that can be as rich as any desktop or laptop machine, in a more mobile, lightweight, and generally more convenient package. This means that, unless you offer tablet-optimized content, users expect to see your desktop site rather than your smartphone site.

The NY Times mobile and tablet experience

Our recommendation for smartphone-optimized sites is to use responsive web design, which means you have one site to serve all devices. If your website uses responsive web design as recommended, be sure to test your website on a variety of tablets to make sure it serves them well too. Remember, just like for smartphones, there are a variety of device sizes and screen resolutions to test.

Another common configuration is to have separate sites for desktops and smartphones, and to redirect users to the relevant version. If you use this configuration, be careful not to inadvertently redirect tablet users to the smartphone-optimized site too.

Telling Android smartphones and tablets apart

For Android-based devices, it’s easy to distinguish between smartphones and tablets using the user-agent string supplied by browsers: Although both Android smartphones and tablets will include the word “Android” in the user-agent string, only the user-agent of smartphones will include the word “Mobile”.

In summary, any Android device that does not have the word “Mobile” in the user-agent is a tablet (or other large screen) device that is best served the desktop site.

For example, here’s the user-agent from Chrome on a Galaxy Nexus smartphone:

Mozilla/5.0 (Linux; Android 4.1.1; Galaxy Nexus Build/JRO03O) AppleWebKit/535.19 (KHTML, like Gecko) Chrome/18.0.1025.166 Mobile Safari/535.19

Or from Firefox on the Galaxy Nexus:

Mozilla/5.0 (Android; Mobile; rv:16.0) Gecko/16.0 Firefox/16.0

Compare those to the user-agent from Chrome on Nexus 7:

Mozilla/5.0 (Linux; Android 4.1.1; Nexus 7 Build/JRO03S) AppleWebKit/535.19 (KHTML, like Gecko) Chrome/18.0.1025.166 Safari/535.19

Or from Firefox on Nexus 7:

Mozilla/5.0 (Android; Tablet; rv:16.0) Gecko/16.0 Firefox/16.0

Because the Galaxy Nexus’s user-agent includes “Mobile”, it should be served your smartphone-optimized website, while the Nexus 7 should receive the full site.
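The rule above reduces to a simple substring check. Here is a minimal Python sketch (an illustration, not a complete user-agent parser; real-world sniffing has more edge cases) applied to the user-agent strings shown above:

```python
# An Android device is a smartphone only if its UA also contains "Mobile";
# Android UAs without "Mobile" are tablets (or other large-screen devices).
def is_android_mobile(user_agent):
    """True for Android smartphones; False for Android tablets and non-Android UAs."""
    return "Android" in user_agent and "Mobile" in user_agent

galaxy_nexus_chrome = ("Mozilla/5.0 (Linux; Android 4.1.1; Galaxy Nexus Build/JRO03O) "
                       "AppleWebKit/535.19 (KHTML, like Gecko) Chrome/18.0.1025.166 "
                       "Mobile Safari/535.19")
nexus7_chrome = ("Mozilla/5.0 (Linux; Android 4.1.1; Nexus 7 Build/JRO03S) "
                 "AppleWebKit/535.19 (KHTML, like Gecko) Chrome/18.0.1025.166 "
                 "Safari/535.19")
nexus7_firefox = "Mozilla/5.0 (Android; Tablet; rv:16.0) Gecko/16.0 Firefox/16.0"

print(is_android_mobile(galaxy_nexus_chrome))  # True  -> serve smartphone site
print(is_android_mobile(nexus7_chrome))        # False -> serve desktop site
print(is_android_mobile(nexus7_firefox))       # False -> serve desktop site
```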

We hope this helps you build better tablet-optimized websites. As always, please ask on our Webmaster Help forums if you have more questions.

